Sample records for budget optimization problems

  1. Minimal investment risk of a portfolio optimization problem with budget and investment concentration constraints

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2017-02-01

In the present paper, the minimal investment risk for a portfolio optimization problem with imposed budget and investment concentration constraints is considered using replica analysis. Since the minimal investment risk is influenced by the investment concentration constraint (as well as the budget constraint), it is intuitive that the minimal investment risk for the problem with an investment concentration constraint can be larger than that without the constraint (that is, with only the budget constraint). Moreover, a numerical experiment shows the effectiveness of our proposed analysis. In contrast, the standard operations research approach fails to accurately identify the minimal investment risk of the portfolio optimization problem.
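
    The constrained problem described above can be sketched numerically (this is a plain numerical optimization, not the paper's replica analysis; the covariance, N, and the concentration level tau are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, p = 8, 50
X = rng.standard_normal((p, N))          # p samples of N asset returns
C = X.T @ X / p                          # empirical (Wishart) covariance
tau = 1.5                                # concentration level, tau >= 1

def risk(w):                             # investment risk H(w) = w'Cw / 2N
    return w @ C @ w / (2 * N)

budget = {"type": "eq", "fun": lambda w: w.sum() - N}             # sum_i w_i = N
concentration = {"type": "eq", "fun": lambda w: w @ w - tau * N}  # sum_i w_i^2 = tau*N

# budget constraint only, then budget + investment concentration
w_b = minimize(risk, np.ones(N), constraints=[budget], method="SLSQP").x
w_bc = minimize(risk, np.ones(N) + 0.1 * rng.standard_normal(N),
                constraints=[budget, concentration], method="SLSQP").x
```

    Consistent with the abstract, risk(w_bc) can only be larger than or equal to risk(w_b), since the budget-only problem is a relaxation of the doubly constrained one.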

  2. Replica analysis for the duality of the portfolio optimization problem

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2016-11-01

In the present paper, the primal-dual problem consisting of the investment risk minimization problem and the expected return maximization problem in the mean-variance model is discussed using replica analysis. As a natural extension of the investment risk minimization problem under only a budget constraint that we analyzed in a previous study, we herein consider a primal-dual problem in which the investment risk minimization problem with budget and expected return constraints is regarded as the primal problem, and the expected return maximization problem with budget and investment risk constraints is regarded as the dual problem. With respect to these optimization problems, we analyze a quenched disordered system involving both of these optimization problems using the approach developed in statistical mechanical informatics and confirm that both optimal portfolios can possess the primal-dual structure. Finally, the results of numerical simulations are shown to validate the effectiveness of the proposed method.

  3. Replica analysis for the duality of the portfolio optimization problem.

    PubMed

    Shinzato, Takashi

    2016-11-01

In the present paper, the primal-dual problem consisting of the investment risk minimization problem and the expected return maximization problem in the mean-variance model is discussed using replica analysis. As a natural extension of the investment risk minimization problem under only a budget constraint that we analyzed in a previous study, we herein consider a primal-dual problem in which the investment risk minimization problem with budget and expected return constraints is regarded as the primal problem, and the expected return maximization problem with budget and investment risk constraints is regarded as the dual problem. With respect to these optimization problems, we analyze a quenched disordered system involving both of these optimization problems using the approach developed in statistical mechanical informatics and confirm that both optimal portfolios can possess the primal-dual structure. Finally, the results of numerical simulations are shown to validate the effectiveness of the proposed method.

  4. Statistical mechanics of budget-constrained auctions

    NASA Astrophysics Data System (ADS)

    Altarelli, F.; Braunstein, A.; Realpe-Gomez, J.; Zecchina, R.

    2009-07-01

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being in the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). On the basis of the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of solving efficiently random instances of the problem extracted from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.
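
    A minimal brute-force illustration of the off-line assignment problem (toy sizes only; the values and budgets are invented, and each advertiser simply pays the value of the slot it wins — the paper's message-passing algorithm is what makes realistic instances tractable):

```python
import itertools

value = [[3, 1], [2, 2], [1, 3]]   # value of ad slot i to advertiser j
budget = [4.0, 4.0]                # each advertiser's spending cap

best_val, best_assign = 0.0, None
# -1 means "slot left unsold"; brute force is fine for a 3-slot toy
for assign in itertools.product([-1, 0, 1], repeat=3):
    spend = [0.0, 0.0]
    total = 0.0
    for i, j in enumerate(assign):
        if j >= 0:
            spend[j] += value[i][j]
            total += value[i][j]
    if all(s <= b for s, b in zip(spend, budget)) and total > best_val:
        best_val, best_assign = total, assign
```

    Already in this tiny instance the budgets bind: the value-8 assignment that gives every slot to its favorite advertiser is infeasible, and the optimum drops to 6.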

  5. Random Matrix Approach for Primal-Dual Portfolio Optimization Problems

    NASA Astrophysics Data System (ADS)

    Tada, Daichi; Yamamoto, Hisashi; Shinzato, Takashi

    2017-12-01

    In this paper, we revisit the portfolio optimization problems of the minimization/maximization of investment risk under constraints of budget and investment concentration (primal problem) and the maximization/minimization of investment concentration under constraints of budget and investment risk (dual problem) for the case that the variances of the return rates of the assets are identical. We analyze both optimization problems by the Lagrange multiplier method and the random matrix approach. Thereafter, we compare the results obtained from our proposed approach with the results obtained in previous work. Moreover, we use numerical experiments to validate the results obtained from the replica approach and the random matrix approach as methods for analyzing both the primal and dual portfolio optimization problems.
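
    For the budget-only special case, the Lagrange multiplier method mentioned above gives a closed form that is easy to check numerically (a sketch with an arbitrary random covariance; not the paper's identical-variance random-matrix analysis):

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 6, 40
X = rng.standard_normal((p, N))
C = X.T @ X / p                  # empirical covariance
ones = np.ones(N)

# Stationarity of the Lagrangian for min w'Cw/2N s.t. 1'w = N:
#   C w = lambda * 1   =>   w* = N C^{-1} 1 / (1' C^{-1} 1)
Cinv1 = np.linalg.solve(C, ones)
w_star = N * Cinv1 / (ones @ Cinv1)
risk_star = w_star @ C @ w_star / (2 * N)   # equals N / (2 * 1' C^{-1} 1)
```

    The gradient condition C w* = lambda * 1 can be verified directly, and any other portfolio satisfying the budget constraint has at least this risk.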

  6. An Algorithm for the Mixed Transportation Network Design Problem

    PubMed Central

    Liu, Xinyu; Chen, Qun

    2016-01-01

This paper proposes an optimization algorithm, the dimension-down iterative algorithm (DDIA), for solving a mixed transportation network design problem (MNDP), which is generally expressed as a mathematical program with equilibrium constraints (MPEC). The upper level of the MNDP aims to optimize the network performance via both the expansion of existing links and the addition of new candidate links, whereas the lower level is a traditional Wardrop user equilibrium (UE) problem. The idea of the proposed solution algorithm (DDIA) is to reduce the dimensions of the problem. A group of variables (discrete/continuous) is fixed so that another group of variables (continuous/discrete) can be optimized, alternately; the problem is thus transformed into solving a series of CNDPs (continuous network design problems) and DNDPs (discrete network design problems) repeatedly until it converges to the optimal solution. The advantage of the proposed algorithm is that its solution process is very simple and easy to apply. Numerical examples show that for the MNDP without a budget constraint, the optimal solution can be found within a few iterations with DDIA. For the MNDP with a budget constraint, however, the result depends on the selection of initial values, which leads to different local optimal solutions. Some thoughts are given on how to derive meaningful initial values, such as by considering the budgets of new and reconstruction projects separately. PMID:27626803
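
    The DDIA idea — fix the discrete variables and solve a continuous subproblem, then fix the continuous variables and solve a discrete one, and repeat — can be sketched on a toy surrogate. The delay function, costs, and penalties below are invented stand-ins; the real lower level is a Wardrop UE assignment, which this sketch replaces with a separable delay term:

```python
import itertools
import numpy as np
from scipy.optimize import minimize

d = np.array([10.0, 6.0, 4.0])   # delay weights on existing links
e = np.array([3.0, 5.0])         # penalty while candidate link j is unbuilt
c = np.array([2.0, 4.0])         # construction cost of candidate link j
B = 8.0                          # total budget

def delay(x, y):                 # surrogate network performance
    return np.sum(d / (1 + x)) + np.sum(e * (1 - y))

def cndp(y):
    """Continuous step: optimal capacity expansion x for fixed builds y."""
    R = B - c @ y                # budget left after construction
    res = minimize(lambda x: np.sum(d / (1 + x)), np.zeros(3),
                   bounds=[(0, None)] * 3, method="SLSQP",
                   constraints=[{"type": "ineq", "fun": lambda x: R - x.sum()}])
    return res.x

def dndp(x):
    """Discrete step: best build decisions y for fixed expansion x."""
    feas = [np.array(y) for y in itertools.product([0, 1], repeat=2)
            if c @ np.array(y) + x.sum() <= B + 1e-6]
    return min(feas, key=lambda y: delay(x, y))

def ddia(y0):
    y = np.array(y0, float)
    for _ in range(20):          # alternate CNDP / DNDP to a fixed point
        x = cndp(y)
        y_new = dndp(x)
        if np.array_equal(y_new, y):
            break
        y = y_new
    return x, y
```

    Started from y = (1, 1) the sketch converges to building both links; started from y = (0, 0) it spends the whole budget on expansion and never builds, echoing the abstract's observation that the budget-constrained result depends on the initial values.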

  7. Optimal infrastructure maintenance scheduling problem under budget uncertainty.

    DOT National Transportation Integrated Search

    2010-05-01

This research addresses a general class of infrastructure asset management problems. Infrastructure agencies usually face budget uncertainties that will eventually lead to suboptimal planning if maintenance decisions are made without taking the u...

  8. Maximizing algebraic connectivity in interconnected networks.

    PubMed

    Shakeri, Heman; Albin, Nathan; Darabi Sahneh, Faryad; Poggi-Corradini, Pietro; Scoglio, Caterina

    2016-03-01

    Algebraic connectivity, the second eigenvalue of the Laplacian matrix, is a measure of node and link connectivity on networks. When studying interconnected networks it is useful to consider a multiplex model, where the component networks operate together with interlayer links among them. In order to have a well-connected multilayer structure, it is necessary to optimally design these interlayer links considering realistic constraints. In this work, we solve the problem of finding an optimal weight distribution for one-to-one interlayer links under budget constraint. We show that for the special multiplex configurations with identical layers, the uniform weight distribution is always optimal. On the other hand, when the two layers are arbitrary, increasing the budget reveals the existence of two different regimes. Up to a certain threshold budget, the second eigenvalue of the supra-Laplacian is simple, the optimal weight distribution is uniform, and the Fiedler vector is constant on each layer. Increasing the budget past the threshold, the optimal weight distribution can be nonuniform. The interesting consequence of this result is that there is no need to solve the optimization problem when the available budget is less than the threshold, which can be easily found analytically.
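
    For identical layers the supra-Laplacian block-diagonalizes into the symmetric subspace (eigenvalues of L) and the antisymmetric one (eigenvalues of L + 2W), so λ2 = min(λ2(L), λmin(L + 2W)), and below the threshold budget the uniform distribution attains the Rayleigh bound λmin(L + 2W) ≤ 2B/n. A small numerical check (the path-graph layer and budget are illustrative choices):

```python
import numpy as np

def path_laplacian(n):
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def supra_lambda2(L, w):
    """Algebraic connectivity of a 2-layer multiplex with identical
    layers L and one-to-one interlayer link weights w."""
    W = np.diag(w)
    S = np.block([[L + W, -W], [-W, L + W]])
    return np.linalg.eigvalsh(S)[1]      # second-smallest eigenvalue

n, budget = 5, 0.5                       # small budget: below the threshold
L = path_laplacian(n)
lam_uniform = supra_lambda2(L, np.full(n, budget / n))
```

    In this regime lam_uniform equals 2B/n exactly, and no other feasible weight distribution does better.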

  9. Optimal Budget Allocation for Sample Average Approximation

    DTIC Science & Technology

    2011-06-01

an optimization algorithm applied to the sample average problem. We examine the convergence rate of the estimator as the computing budget tends to... regime for the optimization algorithm. 1 Introduction Sample average approximation (SAA) is a frequently used approach to solving stochastic programs... appealing due to its simplicity and the fact that a large number of standard optimization algorithms are often available to optimize the resulting sample
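
    The SAA idea can be illustrated on a one-dimensional toy (my example, not from the report): for min_x E[(x − ξ)²] with ξ ~ N(1, 1), the sample average problem is solved exactly by the sample mean, and the optimal-value estimator converges as the computing budget (sample size) grows:

```python
import numpy as np

rng = np.random.default_rng(4)

def saa(n):
    """Solve the sample average approximation of min_x E[(x - xi)^2],
    xi ~ N(1, 1): the SAA minimizer is the sample mean."""
    xi = rng.normal(1.0, 1.0, size=n)
    x_hat = xi.mean()                       # argmin of the sample average
    v_hat = np.mean((x_hat - xi) ** 2)      # SAA optimal value
    return x_hat, v_hat

x_small, v_small = saa(10)
x_big, v_big = saa(100_000)
```

    The true minimizer is x* = 1 with optimal value 1; the SAA value estimate is biased downward (E[min] ≤ min E), which vanishes as the budget grows.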

  10. Optimal Computing Budget Allocation for Particle Swarm Optimization in Stochastic Optimization.

    PubMed

    Zhang, Si; Xu, Jie; Lee, Loo Hay; Chew, Ek Peng; Wong, Wai Peng; Chen, Chun-Hung

    2017-04-01

Particle Swarm Optimization (PSO) is a popular metaheuristic for deterministic optimization. Originating in interpretations of the movement of individuals in a bird flock or fish school, PSO introduces the concepts of personal best and global best to simulate the pattern of searching for food by flocking, successfully translating these natural phenomena into the optimization of complex functions. Many real-life applications of PSO cope with stochastic problems. To solve a stochastic problem using PSO, a straightforward approach is to allocate computational effort equally among all particles and obtain the same number of samples of fitness values. This is not an efficient use of the computational budget and leaves considerable room for improvement. This paper proposes a seamless integration of the concept of optimal computing budget allocation (OCBA) into PSO to improve the computational efficiency of PSO for stochastic optimization problems. We derive an asymptotically optimal allocation rule to intelligently determine the number of samples for all particles such that the PSO algorithm can efficiently select the personal best and global best when there is stochastic estimation noise in fitness values. We also propose an easy-to-implement sequential procedure. Numerical tests show that our new approach can obtain much better results using the same amount of computational effort.
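
    The OCBA allocation rule referenced here can be sketched on its own, outside PSO (this is the textbook rule of Chen et al. for selecting the single best design, not the paper's PSO-specific derivation): non-best designs receive samples in proportion to (σ_i/δ_i)², where δ_i is the gap to the best mean, and the best design receives σ_b √(Σ N_i²/σ_i²):

```python
import numpy as np

def ocba_fractions(means, stds):
    """Classic OCBA allocation fractions for selecting the
    smallest-mean design from noisy simulation output."""
    means = np.asarray(means, float)
    stds = np.asarray(stds, float)
    b = means.argmin()                      # current-best design
    delta = means - means[b]                # optimality gaps
    r = np.zeros_like(means)
    others = [i for i in range(len(means)) if i != b]
    for i in others:
        r[i] = (stds[i] / delta[i]) ** 2    # N_i proportional to (sigma/gap)^2
    r[b] = stds[b] * np.sqrt(np.sum((r[others] / stds[others]) ** 2))
    return r / r.sum()

frac = ocba_fractions([1.0, 1.5, 3.0], [1.0, 1.0, 1.0])
```

    With equal noise, the design whose mean is closest to the best gets far more budget than the clearly inferior one — the intuition OCBA brings to selecting personal and global bests under noise.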

  11. Optimal Computing Budget Allocation for Particle Swarm Optimization in Stochastic Optimization

    PubMed Central

    Zhang, Si; Xu, Jie; Lee, Loo Hay; Chew, Ek Peng; Chen, Chun-Hung

    2017-01-01

Particle Swarm Optimization (PSO) is a popular metaheuristic for deterministic optimization. Originating in interpretations of the movement of individuals in a bird flock or fish school, PSO introduces the concepts of personal best and global best to simulate the pattern of searching for food by flocking, successfully translating these natural phenomena into the optimization of complex functions. Many real-life applications of PSO cope with stochastic problems. To solve a stochastic problem using PSO, a straightforward approach is to allocate computational effort equally among all particles and obtain the same number of samples of fitness values. This is not an efficient use of the computational budget and leaves considerable room for improvement. This paper proposes a seamless integration of the concept of optimal computing budget allocation (OCBA) into PSO to improve the computational efficiency of PSO for stochastic optimization problems. We derive an asymptotically optimal allocation rule to intelligently determine the number of samples for all particles such that the PSO algorithm can efficiently select the personal best and global best when there is stochastic estimation noise in fitness values. We also propose an easy-to-implement sequential procedure. Numerical tests show that our new approach can obtain much better results using the same amount of computational effort. PMID:29170617

  12. Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem.

    PubMed

    Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk

    2017-01-01

    Randomized search heuristics are frequently applied to NP-hard combinatorial optimization problems. The runtime analysis of randomized search heuristics has contributed tremendously to our theoretical understanding. Recently, randomized search heuristics have been examined regarding their achievable progress within a fixed-time budget. We follow this approach and present a fixed-budget analysis for an NP-hard combinatorial optimization problem. We consider the well-known Traveling Salesperson Problem (TSP) and analyze the fitness increase that randomized search heuristics are able to achieve within a given fixed-time budget. In particular, we analyze Manhattan and Euclidean TSP instances and Randomized Local Search (RLS), (1+1) EA and (1+[Formula: see text]) EA algorithms for the TSP in a smoothed complexity setting, and derive the lower bounds of the expected fitness gain for a specified number of generations.
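
    A fixed-budget RLS run on a random Euclidean instance makes the setting concrete (a generic 2-opt RLS sketch; the instance size and budget are arbitrary, and this is not the paper's smoothed-complexity analysis):

```python
import math
import random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def rls_2opt(pts, budget, seed=0):
    """Randomized Local Search under a fixed evaluation budget:
    propose a random 2-opt move (segment reversal), keep it if it
    does not worsen the tour."""
    rnd = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    best = tour_length(tour, pts)
    for _ in range(budget):
        i, j = sorted(rnd.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        val = tour_length(cand, pts)
        if val <= best:
            tour, best = cand, val
    return tour, best

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(30)]
start = tour_length(list(range(30)), pts)
_, after = rls_2opt(pts, budget=2000)
```

    The fixed-budget question is precisely how much the gap start − after grows as the budget increases, rather than how long reaching an optimum takes.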

  13. Optimal control of information epidemics modeled as Maki Thompson rumors

    NASA Astrophysics Data System (ADS)

    Kandhway, Kundan; Kuri, Joy

    2014-12-01

We model the spread of information in a homogeneously mixed population using the Maki Thompson rumor model. We formulate an optimal control problem, from the perspective of a single campaigner, to maximize the spread of information when the campaign budget is fixed. Control signals, such as advertising in the mass media, attempt to convert ignorants and stiflers into spreaders. We show the existence of a solution to the optimal control problem when the campaigning incurs non-linear costs under the isoperimetric budget constraint. The solution employs Pontryagin's Minimum Principle and a modified version of the forward-backward sweep technique for numerical computation to accommodate the isoperimetric budget constraint. The techniques developed in this paper are general and can be applied to similar optimal control problems in other areas. We have allowed the spreading rate of the information epidemic to vary over the campaign duration to model practical situations when the interest level of the population in the subject of the campaign changes with time. The shape of the optimal control signal is studied for different model parameters and spreading rate profiles. We have also studied the variation of the optimal campaigning costs with respect to various model parameters. Results indicate that, for some model parameters, significant improvements can be achieved by the optimal strategy compared to the static control strategy. The static strategy respects the same budget constraint as the optimal strategy and has a constant value throughout the campaign horizon. This work finds application in election and social awareness campaigns, product advertising, movie promotion and crowdfunding campaigns.
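
    The model and the static (constant-control) baseline can be sketched with plain Euler integration. The rates, horizon, and budget below are invented, and the control here only converts ignorants; the paper's optimal control additionally requires Pontryagin's principle and a forward-backward sweep, which this sketch omits:

```python
def maki_thompson(k=5.0, u=0.0, T=5.0, dt=1e-3, y0=0.01):
    """Euler integration of the Maki-Thompson rumor model with a
    constant (static) control u converting ignorants to spreaders.
    x: ignorants, y: spreaders, z: stiflers."""
    x, y, z = 1.0 - y0, y0, 0.0
    for _ in range(int(T / dt)):
        dx = -k * x * y - u * x
        dy = k * x * y + u * x - k * y * (y + z)
        dz = k * y * (y + z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    return 1.0 - x          # fraction ever informed

B_campaign, T = 1.0, 5.0
# static strategy: spend the budget uniformly, u = B/T
reached_static = maki_thompson(u=B_campaign / T)
reached_none = maki_thompson(u=0.0)
```

    The static strategy is the baseline the abstract compares against: it respects the same isoperimetric budget (integral of u over the horizon equals B) but holds u constant.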

  14. Optimal resource allocation for defense of targets based on differing measures of attractiveness.

    PubMed

    Bier, Vicki M; Haphuriwat, Naraphorn; Menoyo, Jaime; Zimmerman, Rae; Culpen, Alison M

    2008-06-01

    This article describes the results of applying a rigorous computational model to the problem of the optimal defensive resource allocation among potential terrorist targets. In particular, our study explores how the optimal budget allocation depends on the cost effectiveness of security investments, the defender's valuations of the various targets, and the extent of the defender's uncertainty about the attacker's target valuations. We use expected property damage, expected fatalities, and two metrics of critical infrastructure (airports and bridges) as our measures of target attractiveness. Our results show that the cost effectiveness of security investment has a large impact on the optimal budget allocation. Also, different measures of target attractiveness yield different optimal budget allocations, emphasizing the importance of developing more realistic terrorist objective functions for use in budget allocation decisions for homeland security.

  15. Maximizing and minimizing investment concentration with constraints of budget and investment risk

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-01-01

    In this paper, as a first step in examining the properties of a feasible portfolio subset that is characterized by budget and risk constraints, we assess the maximum and minimum of the investment concentration using replica analysis. To do this, we apply an analytical approach of statistical mechanics. We note that the optimization problem considered in this paper is the dual problem of the portfolio optimization problem discussed in the literature, and we verify that these optimal solutions are also dual. We also present numerical experiments, in which we use the method of steepest descent that is based on Lagrange's method of undetermined multipliers, and we compare the numerical results to those obtained by replica analysis in order to assess the effectiveness of our proposed approach.
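
    The problem stated here — extremize the investment concentration q_w = w·w/N over portfolios with fixed budget and fixed risk — can be sketched with a generic constrained optimizer (a numerical stand-in for the steepest-descent/Lagrange procedure; the covariance and the risk level κ are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
N, p = 8, 60
X = rng.standard_normal((p, N))
C = X.T @ X / p
ones = np.ones(N)

w0 = ones + 0.3 * rng.standard_normal(N)
w0 *= N / w0.sum()                        # a feasible reference portfolio
kappa = w0 @ C @ w0 / (2 * N)             # fix the risk at its level

def qinv(w):                              # investment concentration q_w
    return w @ w / N

cons = [{"type": "eq", "fun": lambda w: w.sum() - N},
        {"type": "eq", "fun": lambda w: w @ C @ w / (2 * N) - kappa}]

w_min = minimize(qinv, w0, constraints=cons, method="SLSQP").x
w_max = minimize(lambda w: -qinv(w), w0, constraints=cons, method="SLSQP").x
```

    Any portfolio in the feasible subset, such as w0 itself, has its concentration bracketed between the two extrema.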

  16. Efficient Simulation Budget Allocation for Selecting an Optimal Subset

    NASA Technical Reports Server (NTRS)

    Chen, Chun-Hung; He, Donghai; Fu, Michael; Lee, Loo Hay

    2008-01-01

We consider a class of the subset selection problem in ranking and selection. The objective is to identify the top m out of k designs based on simulated output. Traditional procedures are conservative and inefficient. Using the optimal computing budget allocation framework, we formulate the problem as that of maximizing the probability of correctly selecting all of the top-m designs subject to a constraint on the total number of samples available. For an approximation of this correct selection probability, we derive an asymptotically optimal allocation and propose an easy-to-implement heuristic sequential allocation procedure. Numerical experiments indicate that the resulting allocations are superior to other methods in the literature that we tested, and the relative efficiency increases for larger problems. In addition, preliminary numerical results indicate that the proposed new procedure has the potential to enhance computational efficiency for simulation optimization.

  17. Constrained Optimization Methods in Health Services Research-An Introduction: Report 1 of the ISPOR Optimization Methods Emerging Good Practices Task Force.

    PubMed

    Crown, William; Buyukkaramikli, Nasuh; Thokala, Praveen; Morton, Alec; Sir, Mustafa Y; Marshall, Deborah A; Tosh, Jon; Padula, William V; Ijzerman, Maarten J; Wong, Peter K; Pasupathy, Kalyan S

    2017-03-01

    Providing health services with the greatest possible value to patients and society given the constraints imposed by patient characteristics, health care system characteristics, budgets, and so forth relies heavily on the design of structures and processes. Such problems are complex and require a rigorous and systematic approach to identify the best solution. Constrained optimization is a set of methods designed to identify efficiently and systematically the best solution (the optimal solution) to a problem characterized by a number of potential solutions in the presence of identified constraints. This report identifies 1) key concepts and the main steps in building an optimization model; 2) the types of problems for which optimal solutions can be determined in real-world health applications; and 3) the appropriate optimization methods for these problems. We first present a simple graphical model based on the treatment of "regular" and "severe" patients, which maximizes the overall health benefit subject to time and budget constraints. We then relate it back to how optimization is relevant in health services research for addressing present day challenges. We also explain how these mathematical optimization methods relate to simulation methods, to standard health economic analysis techniques, and to the emergent fields of analytics and machine learning. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
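
    The report's graphical example is a two-variable linear program. A sketch with invented per-patient numbers (the benefit units, clinician hours, and costs are my assumptions, not the report's figures):

```python
from scipy.optimize import linprog

# Hypothetical per-patient numbers: [regular, severe]
benefit = [1.0, 5.0]          # health benefit per patient treated
hours = [1.0, 4.0]            # clinician hours per patient
cost = [100.0, 600.0]         # cost per patient
H, M = 40.0, 5000.0           # available hours and budget

res = linprog(c=[-b for b in benefit],      # maximize => minimize -benefit
              A_ub=[hours, cost], b_ub=[H, M],
              bounds=[(0, None), (0, None)], method="highs")
x_regular, x_severe = res.x
```

    With these numbers both constraints bind at the optimum (20 regular and 5 severe patients, benefit 45), illustrating how the time and budget constraints jointly shape the best mix.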

  18. Multiple Choice Knapsack Problem: example of planning choice in transportation.

    PubMed

    Zhong, Tao; Young, Rhonda

    2010-05-01

    Transportation programming, a process of selecting projects for funding given budget and other constraints, is becoming more complex as a result of new federal laws, local planning regulations, and increased public involvement. This article describes the use of an integer programming tool, Multiple Choice Knapsack Problem (MCKP), to provide optimal solutions to transportation programming problems in cases where alternative versions of projects are under consideration. In this paper, optimization methods for use in the transportation programming process are compared and then the process of building and solving the optimization problems is discussed. The concepts about the use of MCKP are presented and a real-world transportation programming example at various budget levels is provided. This article illustrates how the use of MCKP addresses the modern complexities and provides timely solutions in transportation programming practice. While the article uses transportation programming as a case study, MCKP can be useful in other fields where a similar decision among a subset of the alternatives is required. Copyright 2009 Elsevier Ltd. All rights reserved.
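
    MCKP picks exactly one version from each project's group of alternatives (including a zero-cost "defer" version) so as to maximize benefit within the budget. A minimal dynamic-programming sketch with toy integer costs (the projects and values are invented):

```python
def mckp(groups, budget):
    """Multiple Choice Knapsack: pick exactly one (cost, benefit) item
    per group; maximize total benefit with total cost <= budget."""
    NEG = float("-inf")
    best = [NEG] * (budget + 1)      # best[b]: max benefit at exact cost b
    best[0] = 0.0
    for items in groups:
        new = [NEG] * (budget + 1)
        for b in range(budget + 1):
            if best[b] == NEG:
                continue
            for cost, val in items:
                if b + cost <= budget and best[b] + val > new[b + cost]:
                    new[b + cost] = best[b] + val
        best = new
    return max(best)

# each project offers a zero-cost "defer" version plus real alternatives
projects = [[(0, 0.0), (2, 3.0), (3, 4.0)],
            [(0, 0.0), (2, 2.0)]]
best_benefit = mckp(projects, budget=4)
```

    Rerunning the same model at different budget levels, as the article does, is just a matter of varying the budget argument.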

  19. Cost effective campaigning in social networks

    NASA Astrophysics Data System (ADS)

    Kotnis, Bhushan; Kuri, Joy

    2016-05-01

Campaigners are increasingly using online social networking platforms for promoting products, ideas and information. A popular method of promoting a product or even an idea is incentivizing individuals to evangelize the idea vigorously by providing them with referral rewards in the form of discounts, cash backs, or social recognition. Due to budget constraints on scarce resources such as money and manpower, it may not be possible to provide incentives for the entire population, and hence incentives need to be allocated judiciously to appropriate individuals for ensuring the highest possible outreach size. We aim to do the same by formulating and solving an optimization problem using percolation theory. In particular, we compute the set of individuals that are provided incentives for minimizing the expected cost while ensuring a given outreach size. We also solve the problem of computing the set of individuals to be incentivized for maximizing the outreach size for a given cost budget. The optimization problem turns out to be nontrivial; it involves quantities that need to be computed by numerically solving a fixed point equation. Our primary contribution is to show that, for a fairly general cost structure, the optimization problems can be solved by solving a simple linear program. We believe that our approach of using percolation theory to formulate an optimization problem is the first of its kind.

  20. Mean-Reverting Portfolio With Budget Constraint

    NASA Astrophysics Data System (ADS)

    Zhao, Ziping; Palomar, Daniel P.

    2018-05-01

    This paper considers the mean-reverting portfolio design problem arising from statistical arbitrage in the financial markets. We first propose a general problem formulation aimed at finding a portfolio of underlying component assets by optimizing a mean-reversion criterion characterizing the mean-reversion strength, taking into consideration the variance of the portfolio and an investment budget constraint. Then several specific problems are considered based on the general formulation, and efficient algorithms are proposed. Numerical results on both synthetic and market data show that our proposed mean-reverting portfolio design methods can generate consistent profits and outperform the traditional design methods and the benchmark methods in the literature.
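
    One classical instance of a mean-reversion criterion is Box-Tiao predictability, which is minimized by a generalized eigenvector — shown here as a sketch on simulated data (this is not the paper's specific formulations or algorithms, and the scale normalization merely stands in for the investment budget constraint):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(6)
T, n = 500, 4
s = np.cumsum(rng.standard_normal((T, n)), axis=0)   # random-walk "prices"

s0, s1 = s[:-1], s[1:]
s0c = s0 - s0.mean(axis=0)
s1c = s1 - s1.mean(axis=0)
B, *_ = np.linalg.lstsq(s0c, s1c, rcond=None)        # VAR(1): s1 ~ s0 B
G = np.cov(s0c.T)                                    # Gamma = Cov(s_t)
M = B.T @ G @ B                                      # Cov of predictable part

# Box-Tiao: minimize predictability w'Mw / w'Gw => smallest generalized
# eigenvector of the pencil (M, G)
vals, vecs = eigh(M, G)
w = vecs[:, 0] / np.abs(vecs[:, 0]).sum()            # scale-normalized weights
```

    By the Rayleigh-quotient characterization, no other portfolio has lower predictability than w on this sample, which is the sense in which it is "most mean-reverting" under this particular criterion.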

  1. Methodologies for optimal resource allocation to the national space program and new space utilizations. Volume 1: Technical description

    NASA Technical Reports Server (NTRS)

    1971-01-01

The optimal allocation of resources to the national space program over an extended time period requires the solution of a large combinatorial problem in which the program elements are interdependent. The computer model uses an accelerated search technique to solve this problem. The model contains a large number of options selectable by the user to provide flexible input and a broad range of output for use in sensitivity analyses of all entering elements. Examples of these options are budget smoothing under varied appropriation levels, entry of inflation and discount effects, and probabilistic output which provides quantified degrees of certainty that program costs will remain within the planned budget. Criteria and related analytic procedures were established for identifying potential new space program directions. Used in combination with the optimal resource allocation model, new space applications can be analyzed in realistic perspective, including the advantage gained from the existing space program plant and on-going programs such as the space transportation system.

  2. Statistical mechanics of influence maximization with thermal noise

    NASA Astrophysics Data System (ADS)

    Lynn, Christopher W.; Lee, Daniel D.

    2017-03-01

    The problem of optimally distributing a budget of influence among individuals in a social network, known as influence maximization, has typically been studied in the context of contagion models and deterministic processes, which fail to capture stochastic interactions inherent in real-world settings. Here, we show that by introducing thermal noise into influence models, the dynamics exactly resemble spins in a heterogeneous Ising system. In this way, influence maximization in the presence of thermal noise has a natural physical interpretation as maximizing the magnetization of an Ising system given a budget of external magnetic field. Using this statistical mechanical formulation, we demonstrate analytically that for small external-field budgets, the optimal influence solutions exhibit a highly non-trivial temperature dependence, focusing on high-degree hub nodes at high temperatures and on easily influenced peripheral nodes at low temperatures. For the general problem, we present a projected gradient ascent algorithm that uses the magnetic susceptibility to calculate locally optimal external-field distributions. We apply our algorithm to synthetic and real-world networks, demonstrating that our analytic results generalize qualitatively. Our work establishes a fruitful connection with statistical mechanics and demonstrates that influence maximization depends crucially on the temperature of the system, a fact that has not been appreciated by existing research.
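
    The high-temperature part of the claim — that the optimal field concentrates on hubs — can be checked by exact enumeration on a tiny star-shaped Ising system (my toy, with the whole budget placed on a single node; the paper's gradient-ascent algorithm handles general networks and arbitrary budget splits):

```python
import itertools
import numpy as np

def magnetization(J, h, beta):
    """Exact expected magnetization of a small Ising system with
    energy E(s) = -sum_ij J_ij s_i s_j / 2 - sum_i h_i s_i."""
    n = len(h)
    Z, M = 0.0, 0.0
    for state in itertools.product([-1, 1], repeat=n):
        s = np.array(state)
        w = np.exp(beta * (s @ J @ s / 2 + h @ s))   # Boltzmann weight
        Z += w
        M += w * s.sum()
    return M / Z

n = 5                                   # star graph: node 0 is the hub
J = np.zeros((n, n))
for leaf in range(1, n):
    J[0, leaf] = J[leaf, 0] = 1.0

budget, beta = 1.0, 0.2                 # small field budget, high temperature
h_hub = np.zeros(n); h_hub[0] = budget
h_leaf = np.zeros(n); h_leaf[1] = budget
m_hub = magnetization(J, h_hub, beta)
m_leaf = magnetization(J, h_leaf, beta)
```

    At this temperature, linear response gives dM/dh_j ≈ β(1 + β·deg_j), so targeting the degree-4 hub produces a visibly larger magnetization than targeting a leaf.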

  3. On portfolio risk diversification

    NASA Astrophysics Data System (ADS)

    Takada, Hellinton H.; Stern, Julio M.

    2017-06-01

The first portfolio risk diversification strategy was put into practice by the All Weather fund in 1996. The idea of risk diversification is related to the risk contribution of each available asset class or investment factor to the total portfolio risk. The maximum diversification, or risk parity, allocation is achieved when the set of risk contributions is given by a uniform distribution. Meucci (2009) introduced the maximization of the Rényi entropy as part of a leverage-constrained optimization problem to achieve such diversified risk contributions when dealing with uncorrelated investment factors. A generalization of risk parity is risk budgeting, in which there is a prior for the distribution of the risk contributions. Our contribution is a generalization of the existing optimization frameworks that solves the risk budgeting problem. In addition, our framework does not impose any leverage constraint.
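
    The risk-budgeting problem itself is easy to state numerically: find weights w whose fractional risk contributions w_i(Σw)_i / w'Σw match a prior b (risk parity is the uniform special case). A least-squares sketch with an invented covariance (not the entropy-based framework of this paper):

```python
import numpy as np
from scipy.optimize import minimize

Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])      # hypothetical asset covariance
b = np.array([0.5, 0.3, 0.2])               # prior risk budget (sums to 1)

def risk_contributions(w):
    total = w @ Sigma @ w
    return w * (Sigma @ w) / total          # fractions summing to 1

def objective(w):                           # match contributions to budget
    return np.sum((risk_contributions(w) - b) ** 2)

cons = [{"type": "eq", "fun": lambda w: w.sum() - 1}]
res = minimize(objective, np.full(3, 1 / 3), bounds=[(1e-6, 1)] * 3,
               constraints=cons, method="SLSQP")
w = res.x
```

    Setting b uniform recovers ordinary risk parity; a non-uniform b is exactly the prior on risk contributions the abstract refers to.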

  4. Optimizing Experimental Designs Relative to Costs and Effect Sizes.

    ERIC Educational Resources Information Center

    Headrick, Todd C.; Zumbo, Bruno D.

    A general model is derived for the purpose of efficiently allocating integral numbers of units in multi-level designs given prespecified power levels. The derivation of the model is based on a constrained optimization problem that maximizes a general form of a ratio of expected mean squares subject to a budget constraint. This model provides more…

  5. Intelligent Optimization of Modulation Indexes in Unified Tracking and Communication System

    NASA Astrophysics Data System (ADS)

    Yang, Wei-wei; Cong, Bo; Huang, Qiong; Zhu, Li-wei

    2016-02-01

In a unified tracking and communication system, the ranging signal and the telemetry and communication signals share the same channel. In the link budget, it is necessary to allocate the power reasonably, so as to ensure the performance of the system and reduce the cost. In this paper, this nonlinear optimization problem is studied using an intelligent optimization method. Simulation analysis results show that the proposed method is effective.

  6. Generating spatially optimized habitat in a trade-off between social optimality and budget efficiency.

    PubMed

    Drechsler, Martin

    2017-02-01

Auctions have been proposed as alternatives to payments for environmental services when spatial interactions and costs are better known to landowners than to the conservation agency (asymmetric information). Recently, an auction scheme was proposed that delivers optimal conservation in the sense that social welfare is maximized. I examined the social welfare and the budget efficiency delivered by this scheme, where social welfare represents the difference between the monetized ecological benefit and the conservation cost incurred to the landowners and budget efficiency is defined as maximizing the ecological benefit for a given conservation budget. For the analysis, I considered a stylized landscape with land patches that can be used for agriculture or conservation. The ecological benefit was measured by an objective function that increases with increasing number and spatial aggregation of conserved land patches. I compared the social welfare and the budget efficiency of the auction scheme with an agglomeration payment, a policy scheme that considers spatial interactions and that was proposed recently. The auction delivered a higher level of social welfare than the agglomeration payment. However, the agglomeration payment was more efficient budgetarily than the auction, so the comparative performances of the 2 schemes depended on the chosen policy criterion: social welfare or budget efficiency. Both policy criteria are relevant for conservation. Which one should be chosen depends on the problem at hand, for example, whether social preferences should be taken into account in the decision of how much money to invest in conservation or whether the available conservation budget is strictly limited. © 2016 Society for Conservation Biology.

  7. Optimization methods for activities selection problems

    NASA Astrophysics Data System (ADS)

    Mahad, Nor Faradilah; Alias, Suriana; Yaakop, Siti Zulaika; Arshad, Norul Amanina Mohd; Mazni, Elis Sofia

    2017-08-01

    Every student in Malaysia must join co-curriculum activities, and these activities bring many benefits to the students. By joining these activities, the students can learn about time management and develop many useful skills. This project focuses on the selection of co-curriculum activities in a secondary school using two optimization methods: the Analytic Hierarchy Process (AHP) and Zero-One Goal Programming (ZOGP). A secondary school in Negeri Sembilan, Malaysia was chosen as a case study. A set of questionnaires was distributed randomly to calculate the weight for each activity based on the 3 chosen criteria: soft skills, interesting activities, and performances. The weights were calculated using AHP, and the results showed that the most important criterion is soft skills. Then, the ZOGP model was analyzed using LINGO software version 15.0. There are two priorities to be considered. The first priority, minimizing the budget for the activities, is achieved, since the total budget can be reduced by RM233.00. Therefore, the total budget to implement the selected activities is RM11,195.00. The second priority, selecting the co-curriculum activities, is also achieved. The results showed that 9 out of 15 activities were selected. Thus, it can be concluded that the AHP and ZOGP approach can be used as an optimization method for activity selection problems.
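    The AHP weighting step described above can be sketched in a few lines. The pairwise comparison matrix below is hypothetical (the study's judgment values are not published here); it only illustrates how criterion weights and a consistency check are derived from the principal eigenvector.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the three criteria
# (soft skills, interesting activities, performances); the judgment
# values are illustrative, not taken from the study.
A = np.array([
    [1.0, 3.0, 5.0],   # soft skills vs. the others
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency ratio check (random index RI = 0.58 for n = 3 criteria).
n = A.shape[0]
lam_max = eigvals.real.max()
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58

print(weights, cr)  # soft skills gets the largest weight; CR << 0.1
```

A CR below 0.1 indicates acceptably consistent judgments, after which the weights can feed the ZOGP selection model.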

  8. BCDP: Budget Constrained and Delay-Bounded Placement for Hybrid Roadside Units in Vehicular Ad Hoc Networks

    PubMed Central

    Li, Peng; Huang, Chuanhe; Liu, Qin

    2014-01-01

    In vehicular ad hoc networks, roadside unit (RSU) placement has been proposed to improve the overall network performance in many ITS applications. This paper addresses the budget constrained and delay-bounded placement problem (BCDP) for roadside units in vehicular ad hoc networks. There are two types of RSUs: cable connected RSUs (c-RSUs) and wireless RSUs (w-RSUs). c-RSUs are interconnected through wired lines, and they form the backbone of VANETs, while w-RSUs connect to other RSUs through wireless communication and serve as an economical extension of the coverage of c-RSUs. The delay-bounded coverage range and deployment cost of these two cases are totally different. Given a budget constraint and a delay bound, the problem is to find the optimal candidate sites with the maximal delay-bounded coverage at which to place RSUs, such that a message from any c-RSU in the region can be disseminated to as many vehicles as possible within the given budget constraint and delay bound. We first prove that the BCDP problem is NP-hard. Then we propose several algorithms to solve the BCDP problem. Simulation results show that the proposed algorithms can significantly improve the coverage range and reduce the total deployment cost compared with other heuristic methods. PMID:25436656

  9. On the role of budget sufficiency, cost efficiency, and uncertainty in species management

    USGS Publications Warehouse

    van der Burg, Max Post; Bly, Bartholomew B.; Vercauteren, Tammy; Grand, James B.; Tyre, Andrew J.

    2014-01-01

    Many conservation planning frameworks rely on the assumption that one should prioritize locations for management actions based on the highest predicted conservation value (i.e., abundance, occupancy). This strategy may underperform relative to the expected outcome if one is working with a limited budget or the predicted responses are uncertain. Yet, cost and tolerance to uncertainty rarely become part of species management plans. We used field data and predictive models to simulate a decision problem involving western burrowing owls (Athene cunicularia hypugaea) using prairie dog colonies (Cynomys ludovicianus) in western Nebraska. We considered 2 species management strategies: one maximized abundance and the other maximized abundance in a cost-efficient way. We then used heuristic decision algorithms to compare the 2 strategies in terms of how well they met a hypothetical conservation objective. Finally, we performed an info-gap decision analysis to determine how these strategies performed under different budget constraints and uncertainty about owl response. Our results suggested that when budgets were sufficient to manage all sites, the maximizing strategy was optimal and suggested investing more in expensive actions. This pattern persisted for restricted budgets up to approximately 50% of the sufficient budget. Below this budget, the cost-efficient strategy was optimal and suggested investing in cheaper actions. When uncertainty in the expected responses was introduced, the strategy that maximized abundance remained robust under a sufficient budget. Reducing the budget induced a slight trade-off between expected performance and robustness, which suggested that the most robust strategy depended both on one's budget and tolerance to uncertainty. Our results suggest that wildlife managers should explicitly account for budget limitations and be realistic about their expected levels of performance.

  10. Influence of the economy crisis on project cost management

    NASA Astrophysics Data System (ADS)

    Simankina, Tatyana; Ćetković, Jasmina; Verstina, Natalia; Evseev, Evgeny

    2017-10-01

    An economic crisis primarily affects project cost management. The article considers the problems of project management in the field of housing under conditions of economic crisis. Project budgets are reduced, their mutual interference grows, and the framework of risks changes. Specific approaches therefore need to be developed to optimize expenses and guarantee project implementation within the approved budget. Domestic and foreign experience in project cost management involving BIM technologies is considered.

  11. Committee-Based Active Learning for Surrogate-Assisted Particle Swarm Optimization of Expensive Problems.

    PubMed

    Wang, Handing; Jin, Yaochu; Doherty, John

    2017-09-01

    Function evaluations (FEs) of many real-world optimization problems are time or resource consuming, posing a serious challenge to the application of evolutionary algorithms (EAs) to solve these problems. To address this challenge, the research on surrogate-assisted EAs has attracted increasing attention from both academia and industry over the past decades. However, most existing surrogate-assisted EAs (SAEAs) either still require thousands of expensive FEs to obtain acceptable solutions, or are only applied to very low-dimensional problems. In this paper, a novel surrogate-assisted particle swarm optimization (PSO) inspired by committee-based active learning (CAL) is proposed. In the proposed algorithm, a global model management strategy inspired by CAL is developed, which searches for the best and most uncertain solutions according to a surrogate ensemble using a PSO algorithm and evaluates these solutions using the expensive objective function. In addition, a local surrogate model is built around the best solution obtained so far. Then, a PSO algorithm searches on the local surrogate to find its optimum and evaluates it. The evolutionary search using the global model management strategy switches to the local search once no further improvement can be observed, and vice versa. This iterative search process continues until the computational budget is exhausted. Experimental results comparing the proposed algorithm with a few state-of-the-art SAEAs on both benchmark problems with up to 30 decision variables as well as an airfoil design problem demonstrate that the proposed algorithm is able to achieve better or competitive solutions with a limited budget of hundreds of exact FEs.
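    The committee-based query step can be illustrated with a small, self-contained sketch. The bootstrap polynomial surrogates, the stand-in objective, and the candidate grid below are all assumptions for illustration; the actual CAL-SAPSO algorithm searches with PSO over a surrogate ensemble and switches between global and local models.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_f(x):
    # Stand-in for an expensive objective function (illustrative only).
    return np.sin(3 * x) + 0.5 * x**2

# A few already-evaluated samples.
X = rng.uniform(-2, 2, 12)
y = expensive_f(X)

# Committee: polynomial surrogates fitted on bootstrap resamples.
committee = []
for _ in range(5):
    idx = rng.integers(0, len(X), len(X))
    committee.append(np.polyfit(X[idx], y[idx], deg=4))

# Candidate points (in CAL-SAPSO these would come from PSO runs).
cand = np.linspace(-2, 2, 201)
preds = np.array([np.polyval(c, cand) for c in committee])
mean, disagreement = preds.mean(axis=0), preds.std(axis=0)

best = cand[np.argmin(mean)]               # most promising candidate
uncertain = cand[np.argmax(disagreement)]  # most disputed candidate

# Both would receive the expensive evaluation and join the training set.
print(best, uncertain)
```

Evaluating both the ensemble's best prediction and its point of largest disagreement is what lets the method spend only hundreds of exact FEs.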

  12. Solving Connected Subgraph Problems in Wildlife Conservation

    NASA Astrophysics Data System (ADS)

    Dilkina, Bistra; Gomes, Carla P.

    We investigate mathematical formulations and solution techniques for a variant of the Connected Subgraph Problem. Given a connected graph with costs and profits associated with the nodes, the goal is to find a connected subgraph that contains a subset of distinguished vertices. In this work we focus on the budget-constrained version, where we maximize the total profit of the nodes in the subgraph subject to a budget constraint on the total cost. We propose several mixed-integer formulations for enforcing the subgraph connectivity requirement, which plays a key role in the combinatorial structure of the problem. We show that a new formulation based on subtour elimination constraints is more effective at capturing the combinatorial structure of the problem, providing significant advantages over the previously considered encoding which was based on a single commodity flow. We test our formulations on synthetic instances as well as on real-world instances of an important problem in environmental conservation concerning the design of wildlife corridors. Our encoding results in a much tighter LP relaxation, and more importantly, it results in finding better integer feasible solutions as well as much better upper bounds on the objective (often proving optimality or within less than 1% of optimality), both when considering the synthetic instances as well as the real-world wildlife corridor instances.
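    On instances small enough to enumerate, the budget-constrained variant can be stated without any MIP machinery, which makes the combinatorial structure easy to see. The node costs, profits, and edges below are hypothetical; realistic corridor-design instances require the mixed-integer formulations discussed above.

```python
from itertools import combinations

# Tiny hypothetical instance: node costs/profits and an edge list.
cost = {0: 2, 1: 1, 2: 3, 3: 2, 4: 1}
profit = {0: 5, 1: 2, 2: 8, 3: 4, 4: 1}
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (1, 3)}
BUDGET = 6

def connected(nodes):
    """Check connectivity of the induced subgraph by flood fill."""
    nodes = set(nodes)
    if not nodes:
        return False
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(v for v in nodes
                     if (u, v) in edges or (v, u) in edges)
    return seen == nodes

# Brute force over all node subsets: maximize profit subject to the
# budget and the connectivity requirement.
best = max(
    (sub for r in range(1, 6) for sub in combinations(range(5), r)
     if sum(cost[v] for v in sub) <= BUDGET and connected(sub)),
    key=lambda sub: sum(profit[v] for v in sub),
)
print(best)
```

The connectivity test is exactly the constraint that the subtour-elimination and single-commodity-flow encodings compete to express efficiently at scale.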

  13. Optimal control of diarrhea transmission in a flood evacuation zone

    NASA Astrophysics Data System (ADS)

    Erwina, N.; Aldila, D.; Soewono, E.

    2014-03-01

    Evacuation of residents and diarrhea outbreaks in evacuation zones have become serious problems that frequently occur during flood periods. Limited clean water supply and infrastructure in evacuation zones contribute to a critical spread of diarrhea. Transmission of diarrhea disease can be reduced by controlling clean water supply and treating diarrhea patients properly. These treatments require a significant amount of budget, which may not be available in the field. In this paper, transmission of diarrhea in an evacuation zone is described with an SIRS model and presented as an optimal control problem, with clean water supply and the rate of treated patients as input controls. Existence and stability of equilibrium points and sensitivity analysis are investigated analytically for constant input controls. The optimal clean water supply and treatment rate are found using optimal control techniques. Optimal results for transmission of diarrhea and the corresponding controls during the period of observation are simulated numerically. The results show that diarrhea transmission can be controlled with a proper combination of water supply and treatment rate within the allowable budget.

  14. A firefly algorithm for solving competitive location-design problem: a case study

    NASA Astrophysics Data System (ADS)

    Sadjadi, Seyed Jafar; Ashtiani, Milad Gorji; Ramezanian, Reza; Makui, Ahmad

    2016-12-01

    This paper aims at determining the optimal number of new facilities, as well as their optimal locations and design levels, under a budget constraint in a competitive environment, using a novel hybrid continuous and discrete firefly algorithm. A real-world application of locating new chain stores in the city of Tehran, Iran, is used and the results are analyzed. In addition, several examples have been solved to evaluate the efficiency of the proposed model and algorithm. The results demonstrate that the proposed method provides good-quality results for the test problems.

  15. Sample size calculation in cost-effectiveness cluster randomized trials: optimal and maximin approaches.

    PubMed

    Manju, Md Abu; Candel, Math J J M; Berger, Martijn P F

    2014-07-10

    In this paper, the optimal sample sizes at the cluster and person levels for each of two treatment arms are obtained for cluster randomized trials where the cost-effectiveness of treatments on a continuous scale is studied. The optimal sample sizes maximize the efficiency or power for a given budget or minimize the budget for a given efficiency or power. Optimal sample sizes require information on the intra-cluster correlations (ICCs) for effects and costs, the correlations between costs and effects at individual and cluster levels, the ratio of the variance of effects translated into costs to the variance of the costs (the variance ratio), sampling and measuring costs, and the budget. When planning a study, information on the model parameters is usually not available. To overcome this local optimality problem, the current paper also presents maximin sample sizes. The maximin sample sizes turn out to be rather robust against misspecifying the correlation between costs and effects at the cluster and individual levels but may lose much efficiency when misspecifying the variance ratio. The robustness of the maximin sample sizes against misspecifying the ICCs depends on the variance ratio. The maximin sample sizes are robust under misspecification of the ICC for costs for realistic values of the variance ratio greater than one but not robust under misspecification of the ICC for effects. Finally, we show how to calculate optimal or maximin sample sizes that yield sufficient power for a test on the cost-effectiveness of an intervention.
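    The flavor of the cost-efficient design calculation can be shown with the classic univariate formula for cluster randomized trials, in which the optimal cluster size depends only on the cost ratio and the ICC. This is a textbook sketch under assumed costs and ICC values, not the paper's bivariate cost-effectiveness derivation.

```python
import math

def optimal_cluster_design(budget, cost_cluster, cost_person, icc):
    """Textbook cost-efficient design for one arm of a cluster trial:
    minimize the variance of a treatment-arm mean for a fixed budget.
    A simplified univariate sketch; the paper derives optimal and
    maximin sizes for the bivariate (costs and effects) case."""
    # Optimal persons per cluster depends only on the cost ratio and ICC.
    m = max(1, round(math.sqrt((cost_cluster / cost_person)
                               * (1 - icc) / icc)))
    # Spend the budget on as many clusters of that size as possible.
    k = int(budget // (cost_cluster + m * cost_person))
    return k, m

# Hypothetical costs and ICC for illustration.
k, m = optimal_cluster_design(budget=50_000, cost_cluster=500,
                              cost_person=50, icc=0.05)
print(k, m)  # 41 clusters of 14 persons each
```

The maximin idea in the paper then asks how such a design degrades when the assumed ICC or variance ratio is misspecified.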

  16. Modelling night-time ecosystem respiration by a constrained source optimization method

    Treesearch

    Chun-Tai Lai; Gabriel Katul; John Butnor; David Ellsworth; Ram Oren

    2002-01-01

    One of the main challenges to quantifying ecosystem carbon budgets is properly quantifying the magnitude of night-time ecosystem respiration. Inverse Lagrangian dispersion analysis provides a promising approach to addressing such a problem when measured mean CO2 concentration profiles and nocturnal velocity statistics are available. An inverse...

  17. Outcome based state budget allocation for diabetes prevention programs using multi-criteria optimization with robust weights.

    PubMed

    Mehrotra, Sanjay; Kim, Kibaek

    2011-12-01

    We consider the problem of outcomes based budget allocations to chronic disease prevention programs across the United States (US) to achieve greater geographical healthcare equity. We use the Diabetes Prevention and Control Programs (DPCP) by the Centers for Disease Control and Prevention (CDC) as an example. We present a multi-criteria robust weighted sum model for such multi-criteria decision making in a group decision setting. Principal component analysis and an inverse linear programming technique are presented and used to study the actual 2009 budget allocation by the CDC. Our results show that the CDC budget allocation process for the DPCPs is likely not model based. In our empirical study, the relative weights for different prevalence and comorbidity factors and the corresponding budgets obtained under different weight regions are discussed. Parametric analysis suggests that money should be allocated to states to promote diabetes education and to increase patient-healthcare provider interactions to reduce disparity across the US.

  18. Efficient Computing Budget Allocation for Finding Simplest Good Designs

    PubMed Central

    Jia, Qing-Shan; Zhou, Enlu; Chen, Chun-Hung

    2012-01-01

    In many applications some designs are easier to implement, require less training data and shorter training time, and consume less storage than the others. Such designs are called simple designs, and are usually preferred over complex ones when they all have good performance. Despite the abundant existing studies on how to find good designs in simulation-based optimization (SBO), there exist few studies on finding simplest good designs. We consider this important problem in this paper, and make the following contributions. First, we provide lower bounds for the probabilities of correctly selecting the m simplest designs with top performance, and selecting the best m such simplest good designs, respectively. Second, we develop two efficient computing budget allocation methods to find m simplest good designs and to find the best m such designs, respectively; and show their asymptotic optimalities. Third, we compare the performance of the two methods with equal allocations over 6 academic examples and a smoke detection problem in wireless sensor networks. We hope that this work brings insight to finding the simplest good designs in general. PMID:23687404
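    The kind of allocation such methods build on can be illustrated with the classic OCBA ratios for selecting the single best design (here, the smallest mean); the means and standard deviations below are hypothetical. The paper's contribution layers simplicity preferences on top of budget allocations of this type.

```python
import numpy as np

def ocba_ratios(means, stds):
    """Classic OCBA allocation fractions for selecting the best design
    (smallest mean) from noisy simulation output. An illustrative
    baseline; the paper extends allocations like these to find the
    simplest designs among the good ones."""
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    b = np.argmin(means)                      # current best design
    delta = means - means[b]                  # optimality gaps
    ratios = np.ones_like(means)
    nb = np.arange(len(means)) != b
    ratios[nb] = (stds[nb] / delta[nb]) ** 2  # (sigma_i / delta_i)^2
    # Best design: sigma_b * sqrt(sum_i (N_i / sigma_i)^2), i != b.
    ratios[b] = stds[b] * np.sqrt(np.sum((ratios[nb] / stds[nb]) ** 2))
    return ratios / ratios.sum()              # fraction of budget each

# Hypothetical designs: close competitors get most of the budget.
r = ocba_ratios(means=[1.0, 1.2, 2.0, 3.0], stds=[0.5, 0.5, 0.5, 0.5])
print(r)
```

Most of the simulation budget goes to the best design and its closest competitor, while clearly inferior designs receive almost nothing.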

  19. An effective rumor-containing strategy

    NASA Astrophysics Data System (ADS)

    Pan, Cheng; Yang, Lu-Xing; Yang, Xiaofan; Wu, Yingbo; Tang, Yuan Yan

    2018-06-01

    False rumors can lead to huge economic losses and/or social instability. Hence, mitigating the impact of bogus rumors is of primary importance. This paper focuses on the problem of how to suppress a false rumor by use of the truth. Based on a set of rational hypotheses and a novel rumor-truth mixed spreading model, the effectiveness and cost of a rumor-containing strategy are quantified, respectively. On this basis, the original problem is modeled as a constrained optimization problem (the RC model), in which the independent variable and the objective function represent a rumor-containing strategy and the effectiveness of a rumor-containing strategy, respectively. The goal of the optimization problem is to find the most effective rumor-containing strategy subject to a limited rumor-containing budget. Some optimal rumor-containing strategies are given by solving their respective RC models. The influence of different factors on the highest cost effectiveness of an RC model is illuminated through computer experiments. The results obtained are instructive for developing effective rumor-containing strategies.

  20. Regularizing portfolio optimization

    NASA Astrophysics Data System (ADS)

    Still, Susanne; Kondor, Imre

    2010-07-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.
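    The L2-regularized budget-constrained problem has a closed form when risk is measured by variance, which makes the diversification effect easy to demonstrate. This is an illustrative minimum-variance sketch with simulated returns; the paper itself uses expected shortfall as the risk measure, where the connection is to support vector regression rather than to this closed form.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample covariance estimated from relatively few observations, which
# is exactly the regime where estimation error destabilizes weights.
N, T = 20, 30
R = rng.normal(size=(T, N))  # simulated return history (illustrative)
Sigma = np.cov(R, rowvar=False)

def min_variance(Sigma, lam):
    """Minimize w' Sigma w + lam * ||w||^2 subject to the budget
    constraint 1'w = 1. Closed form via the Lagrange condition:
    w is proportional to (Sigma + lam*I)^(-1) 1."""
    A = Sigma + lam * np.eye(len(Sigma))
    w = np.linalg.solve(A, np.ones(len(Sigma)))
    return w / w.sum()

w0 = min_variance(Sigma, lam=0.0)  # unregularized: unstable weights
w1 = min_variance(Sigma, lam=1.0)  # regularized: diversification pressure
print(np.linalg.norm(w0), np.linalg.norm(w1))
```

The L2 penalty provably shrinks the weight-vector norm toward the equal-weight portfolio, which is the "diversification pressure" described above.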

  1. Optimization of Turkish Air Force SAR Units Forward Deployment Points for a Central Based SAR Force Structure

    DTIC Science & Technology

    2015-03-26

    Turkish Airborne Early Warning and Control (AEW&C) aircraft in the combat arena. He examines three combat scenarios Turkey might encounter to cover and...to limited SAR assets, constrained budgets, logistic-maintenance problems, and high risk level of military flights. In recent years, the Turkish Air...model, Set Covering Location Problem (SCLP), defines the minimum number of SAR DPs to cover all fighter aircraft training areas (TAs). The second

  2. Macroscopic relationship in primal-dual portfolio optimization problem

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-02-01

    In the present paper, using a replica analysis, we examine the portfolio optimization problem handled in previous work and discuss the minimization of investment risk under constraints of budget and expected return for the case that the distributions of the hyperparameters of the mean and variance of the return rate of each asset are not limited to a specific probability family. Findings derived using our proposed method are compared with those in previous work to verify the effectiveness of our proposed method. Further, we derive a Pythagorean theorem of the Sharpe ratio and macroscopic relations of opportunity loss. Using numerical experiments, the effectiveness of our proposed method is demonstrated for a specific situation.

  3. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which significantly differ from the traditional all-chemical ones in orbit-raising, station-keeping, radiation damage protection, and power budget, etc. The design optimization task of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem poses major challenges in disciplinary modeling techniques and efficient optimization strategies. To address these challenges, we present a surrogate assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Considerable effort is then spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of our proposed framework in coping with all-electric GEO satellite system design optimization problems.
This proposed surrogate assisted MDO framework can also provide valuable references for other all-electric spacecraft system design.

  4. Optimal allocation of resources over health care programmes: dealing with decreasing marginal utility and uncertainty.

    PubMed

    Al, Maiwenn J; Feenstra, Talitha L; Hout, Ben A van

    2005-07-01

    This paper addresses the problem of how to value health care programmes with different ratios of costs to effects, specifically when taking into account that these costs and effects are uncertain. First, the traditional framework of maximising health effects with a given health care budget is extended to a flexible budget using a value function over money and health effects. Second, uncertainty surrounding costs and effects is included in the model using expected utility. Other approaches to uncertainty that do not specify a utility function are discussed and it is argued that these also include implicit notions about risk attitude.

  5. The feasibility of a Paleolithic diet for low-income consumers.

    PubMed

    Metzgar, Matthew; Rideout, Todd C; Fontes-Villalba, Maelan; Kuipers, Remko S

    2011-06-01

    Many low-income consumers face a limited budget for food purchases. The United States Department of Agriculture developed the Thrifty Food Plan to address this problem of consuming a healthy diet given a budget constraint. This dietary optimization program uses common food choices to build a suitable diet. In this article, the United States Department of Agriculture data sets are used to test the feasibility of consuming a Paleolithic diet given a limited budget. The Paleolithic diet is described as the diet that humans are genetically adapted to, containing only the preagricultural food groups of meat, seafood, fruits, vegetables, and nuts. Constraints were applied to the diet optimization model to restrict grains, dairy, and certain other food categories. Constraints were also applied for macronutrients, micronutrients, and long-chain polyunsaturated fatty acids. The results show that it is possible to consume a Paleolithic diet given the constraints. However, the diet does fall short of meeting the daily recommended intakes for certain micronutrients. A 9.3% increase in income is needed to consume a Paleolithic diet that meets all daily recommended intakes except for calcium. Copyright © 2011 Elsevier Inc. All rights reserved.
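    A toy version of such a dietary optimization can be written as a brute-force search over serving counts, minimizing cost subject to nutrient minimums. The foods, prices, and nutrient values below are invented for illustration; the study itself optimizes over the USDA data sets, and a realistic model would use linear programming rather than enumeration.

```python
from itertools import product

# Toy food data: (cost per serving in $, protein g, calcium mg).
# All numbers are illustrative, not from the USDA data sets.
foods = {
    "chicken": (1.50, 25, 10),
    "salmon":  (2.50, 22, 15),
    "spinach": (0.60,  3, 100),
    "almonds": (0.80,  6, 75),
    "apple":   (0.50,  0, 5),
}

NEED_PROTEIN, NEED_CALCIUM, MAX_SERVINGS = 60, 400, 6

names = list(foods)
best = None
# Exhaustive search over serving counts; the Thrifty Food Plan style
# of model would solve this as a linear program instead.
for counts in product(range(MAX_SERVINGS + 1), repeat=len(names)):
    cost = sum(c * foods[n][0] for c, n in zip(counts, names))
    protein = sum(c * foods[n][1] for c, n in zip(counts, names))
    calcium = sum(c * foods[n][2] for c, n in zip(counts, names))
    if protein >= NEED_PROTEIN and calcium >= NEED_CALCIUM:
        if best is None or cost < best[0]:
            best = (cost, counts)

print(best)  # cheapest feasible diet and its serving counts
```

The same pattern, with more foods and more nutrient constraints, is how one can test whether a restricted food-group diet remains feasible on a limited budget.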

  6. Effects of monetary reserves and rate of gain on human risky choice under budget constraints.

    PubMed

    Pietras, Cynthia J; Searcy, Gabriel D; Huitema, Brad E; Brandt, Andrew E

    2008-07-01

    The energy-budget rule is an optimal foraging model that predicts that choice should be risk averse when net gains plus reserves meet energy requirements (positive energy-budget conditions) and risk prone when net gains plus reserves fall below requirements (negative energy-budget conditions). Studies have shown that the energy-budget rule provides a good description of risky choice in humans when choice is studied under economic conditions (i.e., earnings budgets) that simulate positive and negative energy budgets. In previous human studies, earnings budgets were manipulated by varying earnings requirements, but in most nonhuman studies, energy budgets have been manipulated by varying reserves and/or mean rates of reinforcement. The present study therefore investigated choice in humans between certain and variable monetary outcomes when earnings budgets were manipulated by varying monetary reserves and mean rates of monetary gain. Consistent with the energy-budget rule, choice tended to be risk averse under positive-budget conditions and risk neutral or risk prone under negative-budget conditions. Sequential choices were also well described by a dynamic optimization model, especially when expected earnings for optimal choices were high. These results replicate and extend the results of prior experiments in showing that humans' choices are generally consistent with the predictions of the energy-budget rule when studied under conditions analogous to those used in nonhuman energy-budget studies, and that choice patterns tend to maximize reinforcement.
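    The qualitative prediction of the energy-budget rule can be reproduced with a tiny simulation: when the certain option meets the requirement, risk aversion is optimal; when it cannot, only the variable option offers any chance of success. The payoffs and requirements below are hypothetical, not the schedules used in the experiment.

```python
import random

random.seed(42)

def p_meets_requirement(trials, requirement, reserves, option):
    """Estimate, by simulation, the probability that total gains plus
    reserves reach the requirement. Hypothetical payoffs: 'certain'
    pays 2 every trial; 'variable' pays 0 or 4 with equal odds."""
    wins = 0
    for _ in range(20_000):
        total = reserves
        for _ in range(trials):
            total += 2 if option == "certain" else random.choice([0, 4])
        wins += total >= requirement
    return wins / 20_000

# Positive budget (requirement 10 over 5 trials): the certain option
# always succeeds, so risk aversion is optimal.
p_cert_pos = p_meets_requirement(5, 10, 0, "certain")
p_var_pos = p_meets_requirement(5, 10, 0, "variable")
# Negative budget (requirement 12): the certain option can never reach
# it, so only risk proneness gives any chance of success.
p_cert_neg = p_meets_requirement(5, 12, 0, "certain")
p_var_neg = p_meets_requirement(5, 12, 0, "variable")
print(p_cert_pos, p_var_pos, p_cert_neg, p_var_neg)
```

Raising reserves or the mean rate of gain flips a negative budget to a positive one, which is exactly the manipulation studied above.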

  7. Optimal Inspection of Imports to Prevent Invasive Pest Introduction.

    PubMed

    Chen, Cuicui; Epanchin-Niell, Rebecca S; Haight, Robert G

    2018-03-01

    The United States imports more than 1 billion live plants annually, an important and growing pathway for introduction of damaging nonnative invertebrates and pathogens. Inspection of imports is one safeguard for reducing pest introductions, but capacity constraints limit inspection effort. We develop an optimal sampling strategy to minimize the costs of pest introductions from trade by posing inspection as an acceptance sampling problem that incorporates key features of the decision context, including (i) simultaneous inspection of many heterogeneous lots, (ii) a lot-specific sampling effort, (iii) a budget constraint that limits total inspection effort, (iv) inspection error, and (v) an objective of minimizing cost from accepted defective units. We derive a formula for expected number of accepted infested units (expected slippage) given lot size, sample size, infestation rate, and detection rate, and we formulate and analyze the inspector's optimization problem of allocating a sampling budget among incoming lots to minimize the cost of slippage. We conduct an empirical analysis of live plant inspection, including estimation of plant infestation rates from historical data, and find that inspections optimally target the largest lots with the highest plant infestation rates, leaving some lots unsampled. We also consider that USDA-APHIS, which administers inspections, may want to continue inspecting all lots at a baseline level; we find that allocating any additional capacity, beyond a comprehensive baseline inspection, to the largest lots with the highest infestation rates allows inspectors to meet the dual goals of minimizing the costs of slippage and maintaining baseline sampling without substantial compromise. © 2017 Society for Risk Analysis.
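    A simple binomial approximation conveys how expected slippage depends on lot size, sample size, infestation rate, and detection rate. This is an illustrative model, assuming a lot is accepted unless at least one sampled unit is both infested and detected; it is not the exact formula derived in the paper.

```python
def expected_slippage(lot_size, sample_size, infestation_rate,
                      detection_rate):
    """Expected number of accepted infested units for one lot under a
    simple binomial acceptance-sampling model (an illustrative
    approximation, not the paper's exact derivation)."""
    # Probability a single sampled unit triggers rejection.
    p_detect_per_unit = infestation_rate * detection_rate
    # Lot accepted only if no sampled unit is detected as infested.
    p_accept = (1 - p_detect_per_unit) ** sample_size
    # Infested units remaining in the uninspected part of the lot.
    infested_unsampled = (lot_size - sample_size) * infestation_rate
    return p_accept * infested_unsampled

# With no inspection, all infested units slip through in expectation;
# a modest sample sharply reduces expected slippage.
s0 = expected_slippage(1000, 0, 0.02, 0.9)
s1 = expected_slippage(1000, 60, 0.02, 0.9)
print(s0, s1)
```

The inspector's allocation problem then distributes a fixed total sample budget across lots to minimize the summed (cost-weighted) slippage, which is why large, highly infested lots attract the most effort.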

  8. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the applications of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for optimization of the natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. 
The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their suitability. To illustrate the theoretical findings, the indicators are tested on a hypothetical example of five regions with different risk levels. Last but not least, political and societal aspects and limitations in the use of the risk-based optimization framework are discussed.
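    The role of a uniform minimum BCR or MC requirement can be illustrated with a greedy allocation over discrete protection upgrades: funding the globally best next upgrade until the budget runs out implicitly enforces a common marginal threshold across regions. The regional costs and risk reductions below are hypothetical.

```python
# Each region offers incremental protection upgrades (cost, risk
# reduction), in millions, with decreasing marginal returns. The
# numbers are hypothetical, for illustration only.
regions = {
    "A": [(2.0, 10.0), (3.0, 6.0), (4.0, 2.0)],
    "B": [(1.0, 8.0), (2.0, 5.0), (5.0, 3.0)],
    "C": [(3.0, 4.0), (3.0, 2.0)],
}

def allocate(regions, budget):
    """Greedy allocation by marginal benefit-cost ratio: repeatedly
    fund the affordable upgrade with the highest risk reduction per
    unit cost. With decreasing marginal returns, this approximates
    requiring a uniform minimum BCR/MC across regions."""
    levels = {r: 0 for r in regions}
    spent = avoided = 0.0
    while True:
        options = [
            (red / cost, r, cost, red)
            for r, ups in regions.items()
            for lvl, (cost, red) in enumerate(ups)
            if lvl == levels[r] and spent + cost <= budget
        ]
        if not options:
            return levels, spent, avoided
        _, r, cost, red = max(options)
        levels[r] += 1
        spent += cost
        avoided += red

levels, spent, avoided = allocate(regions, budget=9.0)
print(levels, spent, avoided)
```

With this budget, region C receives nothing: its best upgrade falls below the marginal threshold the budget imposes, which is the resource-allocation effect the study analyzes.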

  9. Multi-period project portfolio selection under risk considerations and stochastic income

    NASA Astrophysics Data System (ADS)

    Tofighian, Ali Asghar; Moezzi, Hamid; Khakzar Barfuei, Morteza; Shafiee, Mahmood

    2018-02-01

    This paper deals with the multi-period project portfolio selection problem. In this problem, the available budget is invested in the best portfolio of projects in each period such that the net profit is maximized. We also consider more realistic assumptions to cover a wider range of applications than those reported in previous studies. A novel mathematical model is presented to solve the problem, considering risks, stochastic incomes, and the possibility of investing extra budget in each time period. Due to the complexity of the problem, an effective meta-heuristic method hybridized with a local search procedure is presented to solve the problem. The algorithm is based on the genetic algorithm (GA), which is a prominent method for solving this type of problem. The GA is enhanced by a new solution representation and well-selected operators, and is hybridized with a local search mechanism to obtain better solutions in a shorter time. The performance of the proposed algorithm is then compared with well-known algorithms, such as the basic genetic algorithm (GA), particle swarm optimization (PSO), and the electromagnetism-like algorithm (EM-like), by means of some prominent indicators. The computational results show the superiority of the proposed algorithm in terms of accuracy, robustness, and computation time. Finally, the proposed algorithm is combined with PSO to considerably improve the computing time.
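    The single-period core of such a model is a 0/1 knapsack over candidate projects, which can be solved exactly by dynamic programming. The project costs and profits below are hypothetical, and the paper's full model adds multiple periods, risk terms, and stochastic incomes on top of selections like this one.

```python
def knapsack(projects, budget):
    """0/1 knapsack for one period: choose the project portfolio with
    maximum profit within the budget. A deterministic single-period
    sketch of the selection subproblem; projects is a list of
    (cost, profit) pairs with integer costs."""
    best = [0] * (budget + 1)
    choice = [[] for _ in range(budget + 1)]
    for i, (cost, profit) in enumerate(projects):
        # Iterate budgets downward so each project is used at most once.
        for b in range(budget, cost - 1, -1):
            if best[b - cost] + profit > best[b]:
                best[b] = best[b - cost] + profit
                choice[b] = choice[b - cost] + [i]
    return best[budget], choice[budget]

# Hypothetical (cost, profit) pairs for four candidate projects.
projects = [(3, 10), (4, 12), (2, 7), (5, 13)]
profit, selected = knapsack(projects, budget=9)
print(profit, selected)  # 29 [0, 1, 2]
```

For many periods with stochastic incomes and risk constraints, exact dynamic programming becomes impractical, which is what motivates the hybrid GA in the paper.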

  10. [Evaluation of budget design and execution, an instrument of performance-based budgeting: some experiences applied to health].

    PubMed

    Peñaloza-Vassallo, K; Gutiérrez-Aguado, A; Prado-Fernández, M

    2017-01-01

Since 2008, the evaluation of budget design and execution (EDEP, for its acronym in Spanish), one of the evaluation tools developed by the Peruvian Ministry of Economy and Finance (MEF) as part of the implementation of performance-based budgeting, has sought to provide reliable information about the design coherence and implementation progress of public interventions, in order to improve their management and support informed budget decisions. The EDEP methodology includes preparing an evaluation report and defining a matrix of commitments to improve performance. Commitments are defined based on the recommendations of the EDEP. The EDEP seeks to correct existing problems in public programs and optimize their results. The MEF tracks the fulfillment of these commitments and links the analysis of public budget requests to the progress of these commitments. Now, almost 10 years after its implementation, 57 EDEP have been carried out in different sectors, and seven of them are related to health interventions such as the comprehensive health system, vaccination services, normal births, and acute respiratory infections and acute diarrheal diseases, among others. Beyond the discrepancies in the application of this tool, the EDEP and its matrix of commitments have enabled the use of evaluation results and have become a mechanism for generating information useful for improving public services.

  11. Trade-offs and efficiencies in optimal budget-constrained multispecies corridor networks.

    PubMed

    Dilkina, Bistra; Houtman, Rachel; Gomes, Carla P; Montgomery, Claire A; McKelvey, Kevin S; Kendall, Katherine; Graves, Tabitha A; Bernstein, Richard; Schwartz, Michael K

    2017-02-01

    Conservation biologists recognize that a system of isolated protected areas will be necessary but insufficient to meet biodiversity objectives. Current approaches to connecting core conservation areas through corridors consider optimal corridor placement based on a single optimization goal: commonly, maximizing the movement for a target species across a network of protected areas. We show that designing corridors for single species based on purely ecological criteria leads to extremely expensive linkages that are suboptimal for multispecies connectivity objectives. Similarly, acquiring the least-expensive linkages leads to ecologically poor solutions. We developed algorithms for optimizing corridors for multispecies use given a specific budget. We applied our approach in western Montana to demonstrate how the solutions may be used to evaluate trade-offs in connectivity for 2 species with different habitat requirements, different core areas, and different conservation values under different budgets. We evaluated corridors that were optimal for each species individually and for both species jointly. Incorporating a budget constraint and jointly optimizing for both species resulted in corridors that were close to the individual species movement-potential optima but with substantial cost savings. Our approach produced corridors that were within 14% and 11% of the best possible corridor connectivity for grizzly bears (Ursus arctos) and wolverines (Gulo gulo), respectively, and saved 75% of the cost. Similarly, joint optimization under a combined budget resulted in improved connectivity for both species relative to splitting the budget in 2 to optimize for each species individually. Our results demonstrate economies of scale and complementarities conservation planners can achieve by optimizing corridor designs for financial costs and for multiple species connectivity jointly. 
We believe that our approach will facilitate corridor conservation by reducing acquisition costs and by allowing derived corridors to more closely reflect conservation priorities. © 2016 Society for Conservation Biology.

  12. Trade-offs and efficiencies in optimal budget-constrained multispecies corridor networks

    USGS Publications Warehouse

    Dilkina, Bistra; Houtman, Rachel; Gomes, Carla P.; Montgomery, Claire A.; McKelvey, Kevin; Kendall, Katherine; Graves, Tabitha A.; Bernstein, Richard; Schwartz, Michael K.

    2017-01-01

    Conservation biologists recognize that a system of isolated protected areas will be necessary but insufficient to meet biodiversity objectives. Current approaches to connecting core conservation areas through corridors consider optimal corridor placement based on a single optimization goal: commonly, maximizing the movement for a target species across a network of protected areas. We show that designing corridors for single species based on purely ecological criteria leads to extremely expensive linkages that are suboptimal for multispecies connectivity objectives. Similarly, acquiring the least-expensive linkages leads to ecologically poor solutions. We developed algorithms for optimizing corridors for multispecies use given a specific budget. We applied our approach in western Montana to demonstrate how the solutions may be used to evaluate trade-offs in connectivity for 2 species with different habitat requirements, different core areas, and different conservation values under different budgets. We evaluated corridors that were optimal for each species individually and for both species jointly. Incorporating a budget constraint and jointly optimizing for both species resulted in corridors that were close to the individual species movement-potential optima but with substantial cost savings. Our approach produced corridors that were within 14% and 11% of the best possible corridor connectivity for grizzly bears (Ursus arctos) and wolverines (Gulo gulo), respectively, and saved 75% of the cost. Similarly, joint optimization under a combined budget resulted in improved connectivity for both species relative to splitting the budget in 2 to optimize for each species individually. Our results demonstrate economies of scale and complementarities conservation planners can achieve by optimizing corridor designs for financial costs and for multiple species connectivity jointly. 
We believe that our approach will facilitate corridor conservation by reducing acquisition costs and by allowing derived corridors to more closely reflect conservation priorities.

  13. D-Optimal Experimental Design for Contaminant Source Identification

    NASA Astrophysics Data System (ADS)

    Sai Baba, A. K.; Alexanderian, A.

    2016-12-01

Contaminant source identification seeks to estimate the release history of a conservative solute given point concentration measurements at some time after the release. This can be mathematically expressed as an inverse problem, with a linear observation operator or parameter-to-observation map, which we tackle using a Bayesian approach. Acquisition of experimental data can be laborious and expensive. The goal is to control the experimental parameters - in our case, the sparsity of the sensors - to maximize the information gain subject to physical or budget constraints. This is known as optimal experimental design (OED). D-optimal experimental design seeks to maximize the expected information gain and has long been considered the gold standard in the statistics community. Our goal is to develop scalable methods for D-optimal experimental design involving large-scale PDE-constrained problems with high-dimensional parameter fields. A major challenge for OED is that a nonlinear optimization algorithm for the D-optimality criterion requires repeated evaluations of the objective function and gradient, each involving the determinant of large, dense matrices; this cost can be prohibitively expensive for applications of interest. We propose novel randomized matrix techniques that bring down the computational costs of the objective function and gradient evaluations by several orders of magnitude compared to the naive approach. The effect of randomized estimators on the accuracy and the convergence of the optimization solver will be discussed. The features and benefits of our new approach will be demonstrated on a challenging model problem from contaminant source identification involving the inference of the initial condition from spatio-temporal observations in a time-dependent advection-diffusion problem.
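For context, in the Bayesian linear-Gaussian setting sketched above, the D-optimality objective reduces to a log-determinant; a schematic form (notation ours, not taken from the abstract) is:

```latex
\phi_{\mathrm{D}}(w) \;=\; \tfrac{1}{2}\,\log\det\!\left( I \;+\; \Gamma_{\mathrm{prior}}^{1/2}\, F^{\top} W(w)\, \Gamma_{\mathrm{noise}}^{-1}\, W(w)\, F\, \Gamma_{\mathrm{prior}}^{1/2} \right)
```

where $F$ is the linear parameter-to-observable map, $W(w)$ encodes the sensor weights being optimized, and $\Gamma_{\mathrm{prior}}$, $\Gamma_{\mathrm{noise}}$ are the prior and noise covariances. It is the repeated evaluation of this log-determinant (and its gradient) that the randomized estimators accelerate.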

  14. Development of a funding, cost, and spending model for satellite projects

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse P.

    1989-01-01

The need for a predictive budget/funding model is clear. The models currently used by the Resource Analysis Office (RAO) predict the total costs of satellite projects. An effort was conducted to extend this modeling capability from total budget analysis to the analysis of both total budget and budget outlays over time. A statistically based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model describing the historical spending patterns. The raw data consisted of dollars spent in each specific year and their 1989-dollar equivalents; these data were converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The funding of large-scale projects over extended periods of time is described by life cycle cost models (LCCMs), so the data were analyzed to find a model in the generic LCCM form. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. To use this model, it is necessary to transform the problem from dollar/time space to percentage-of-total-budget/time space. This transformation is equivalent to moving to a probability space, and by the basic rules of probability, the validity of both the optimization and regression steps is ensured. The resulting statistically significant model is then integrated and inverted; the output represents a project schedule relating the amount of money spent to the percentage of project completion.
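The Weibull spending-curve idea can be sketched as follows; the grid-search fit merely stands in for the paper's nonlinear optimization/regression step, and the parameter ranges are illustrative assumptions:

```python
import math

def weibull_cdf(t, shape, scale):
    """Cumulative fraction of the total budget spent by time t."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def fit_weibull(times, fractions):
    """Grid-search fit of (shape, scale) minimizing squared error against
    observed cumulative spend fractions (a stand-in for the nonlinear
    regression step described in the abstract)."""
    best_err, best_params = float("inf"), None
    for shape in [k / 10.0 for k in range(5, 51)]:       # 0.5 .. 5.0
        for scale in [s / 10.0 for s in range(5, 101)]:  # 0.5 .. 10.0
            err = sum((weibull_cdf(t, shape, scale) - f) ** 2
                      for t, f in zip(times, fractions))
            if err < best_err:
                best_err, best_params = err, (shape, scale)
    return best_params
```

Fitting synthetic fractions generated with shape 2.0 and scale 3.0 recovers those parameters exactly, since they lie on the grid.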

  15. Solution of Stochastic Capital Budgeting Problems in a Multidivisional Firm.

    DTIC Science & Technology

    1980-06-01

    linear programming with simple recourse (see, for example, Dantzig (9) or Ziemba (35)) - 12 - and has been applied to capital budgeting problems with...New York, 1972 34. Weingartner, H.M., Mathematical Programming and Analysis of Capital Budgeting Problems, Markham Pub. Co., Chicago, 1967 35. Ziemba

  16. Combinatorial Optimization in Project Selection Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Dewi, Sari; Sawaluddin

    2018-01-01

This paper discusses the problem of project selection with two objective functions, maximizing profit and minimizing cost, under limited resource availability and limited time, so that resources must be allocated among the projects. These resources are human resources, machine resources, and raw material resources, and their allocation must not exceed the predetermined budget. The problem can thus be formulated mathematically as a multi-objective function with constraints to be satisfied. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for selecting the right projects. A multi-objective genetic algorithm is then described as one such combinatorial optimization method, simplifying the project selection process at large scale.
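As a toy illustration of the genetic-algorithm approach described above (the parameters, operators, and single-objective profit fitness are our simplifications; the paper treats multiple objectives):

```python
import random

def ga_select(costs, profits, budget, pop=40, gens=60, seed=1):
    """Toy GA for 0/1 project selection: bitstring genomes, tournament
    selection, uniform crossover, bit-flip mutation; genomes whose total
    cost exceeds the budget get zero fitness."""
    rng = random.Random(seed)
    n = len(costs)

    def fitness(g):
        cost = sum(c for c, bit in zip(costs, g) if bit)
        return sum(p for p, bit in zip(profits, g) if bit) if cost <= budget else 0

    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    best = max(popn, key=fitness)
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            p1 = max(rng.sample(popn, 2), key=fitness)   # tournament of 2
            p2 = max(rng.sample(popn, 2), key=fitness)
            child = [p1[i] if rng.random() < 0.5 else p2[i] for i in range(n)]
            if rng.random() < 0.2:                       # bit-flip mutation
                child[rng.randrange(n)] ^= 1
            nxt.append(child)
        popn = nxt
        best = max(popn + [best], key=fitness)           # keep global best
    return fitness(best), best
```

On a tiny instance (costs [3, 4, 5, 2], profits [4, 5, 7, 3], budget 9) the GA reliably finds a feasible, high-profit selection.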

  17. Optimal dynamic control of invasions: applying a systematic conservation approach.

    PubMed

    Adams, Vanessa M; Setterfield, Samantha A

    2015-06-01

The social, economic, and environmental impacts of invasive plants are well recognized. However, these variable impacts are rarely accounted for in the spatial prioritization of funding for weed management. We examine how current spatially explicit prioritization methods can be extended to identify optimal budget allocations to both eradication and control measures of invasive species to minimize the costs and likelihood of invasion. Our framework extends recent approaches to systematic prioritization of weed management to account for multiple values that are threatened by weed invasions with a multi-year dynamic prioritization approach. We apply our method to the northern portion of the Daly catchment in the Northern Territory, which has significant conservation values that are threatened by gamba grass (Andropogon gayanus), a highly invasive species recognized by the Australian government as a Weed of National Significance (WONS). We interface Marxan, a widely applied conservation planning tool, with a dynamic biophysical model of gamba grass to optimally allocate funds to eradication and control programs under two budget scenarios comparing maximizing gain (MaxGain) and minimizing loss (MinLoss) optimization approaches. The prioritizations support previous findings that a MinLoss approach is a better strategy when threats are more spatially variable than conservation values. Over a 10-year simulation period, we find that a MinLoss approach reduces future infestations by ~8% compared to MaxGain in the constrained budget scenarios and ~12% in the unlimited budget scenarios. We find that due to the extensive current invasion and rapid rate of spread, allocating the annual budget to control efforts is more efficient than funding eradication efforts when there is a constrained budget. Under a constrained budget, applying the most efficient optimization scenario (control, MinLoss) reduces spread by ~27% compared to no control. 
Conversely, if the budget is unlimited, it is more efficient to fund eradication efforts, which reduces spread by ~65% compared to no control.

  18. Replica approach to mean-variance portfolio optimization

    NASA Astrophysics Data System (ADS)

    Varga-Haszonits, Istvan; Caccioli, Fabio; Kondor, Imre

    2016-12-01

We consider the problem of mean-variance portfolio optimization for a generic covariance matrix subject to the budget constraint and the constraint for the expected return, with the application of the replica method borrowed from the statistical physics of disordered systems. We find that the replica symmetry of the solution does not need to be assumed; it emerges as the unique solution of the optimization problem. We also check the stability of this solution and find that the eigenvalues of the Hessian are positive for r = N/T < 1, where N is the dimension of the portfolio and T the length of the time series used to estimate the covariance matrix. At the critical point r = 1 a phase transition takes place. The out-of-sample estimation error blows up at this point as 1/(1 - r), independently of the covariance matrix or the expected return, displaying the universality not only of the critical exponent but also of the critical point. As a conspicuous illustration of the dangers of in-sample estimates, the optimal in-sample variance is found to vanish at the critical point, inversely proportionally to the divergent estimation error.
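In the abstract's notation (with $\mathcal{E}_{\text{out}}$ and $\sigma^{2}_{\text{in}}$ our shorthand for the out-of-sample estimation error and the optimal in-sample variance), the reported critical scaling can be summarized as:

```latex
r = \frac{N}{T}, \qquad
\mathcal{E}_{\text{out}} \sim \frac{1}{1-r}, \qquad
\sigma^{2}_{\text{in}} \sim 1-r, \qquad r \to 1^{-}
```

so the in-sample variance vanishes at exactly the rate at which the out-of-sample error diverges.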

  19. Optimal environmental management strategy and implementation for groundwater contamination prevention and restoration.

    PubMed

    Wang, Mingyu

    2006-04-01

An innovative management strategy is proposed for optimized, integrated environmental management of regional or national groundwater contamination prevention and restoration, with consideration of sustainable development. This management strategy accounts for the availability of limited resources, human health and ecological risks from groundwater contamination, costs for groundwater protection measures, beneficial uses and values from groundwater protection, and sustainable development. Six different categories of costs are identified with regard to groundwater prevention and restoration. In addition, different environmental impacts from groundwater contamination, including human health and ecological risks, are individually taken into account. System optimization principles are applied to decide the optimal allocation of the available resources or budgets across existing contaminated sites and projected contamination sites for maximal risk reduction. Established management constraints, such as budget limitations under the different categories of costs, are satisfied at the optimal solution. A stepwise optimization process is proposed in which the first step is to optimally select, from all the existing contaminated and projected contamination sites, a limited number of sites where remediation or prevention measures will be taken, based on a total regionally or nationally available budget over a certain time frame, such as 10 years. Subsequent optimization steps then determine the year-by-year distribution of the available yearly budgets among the selected sites. A hypothetical case study is presented to demonstrate a practical implementation of the management strategy. Several issues pertaining to groundwater contamination exposure and risk assessments and remediation cost evaluations are briefly discussed to aid understanding of how the management strategy is implemented.

  20. The effect of agency budgets on minimizing greenhouse gas emissions from road rehabilitation policies

    NASA Astrophysics Data System (ADS)

    Reger, Darren; Madanat, Samer; Horvath, Arpad

    2015-11-01

    Transportation agencies are being urged to reduce their greenhouse gas (GHG) emissions. One possible solution within their scope is to alter their pavement management system to include environmental impacts. Managing pavement assets is important because poor road conditions lead to increased fuel consumption of vehicles. Rehabilitation activities improve pavement condition, but require materials and construction equipment, which produce GHG emissions as well. The agency’s role is to decide when to rehabilitate the road segments in the network. In previous work, we sought to minimize total societal costs (user and agency costs combined) subject to an emissions constraint for a road network, and demonstrated that there exists a range of potentially optimal solutions (a Pareto frontier) with tradeoffs between costs and GHG emissions. However, we did not account for the case where the available financial budget to the agency is binding. This letter considers an agency whose main goal is to reduce its carbon footprint while operating under a constrained financial budget. A Lagrangian dual solution methodology is applied, which selects the optimal timing and optimal action from a set of alternatives for each segment. This formulation quantifies GHG emission savings per additional dollar of agency budget spent, which can be used in a cap-and-trade system or to make budget decisions. We discuss the importance of communication between agencies and their legislature that sets the financial budgets to implement sustainable policies. We show that for a case study of Californian roads, it is optimal to apply frequent, thin overlays as opposed to the less frequent, thick overlays recommended in the literature if the objective is to minimize GHG emissions. A promising new technology, warm-mix asphalt, will have a negligible effect on reducing GHG emissions for road resurfacing under constrained budgets.
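A minimal sketch of the Lagrangian dual idea described above (the per-segment alternatives and bisection bounds are hypothetical; the letter's actual model also optimizes rehabilitation timing over a planning horizon):

```python
def lagrangian_schedule(options, budget):
    """options[s]: list of (cost, emissions) alternatives for segment s.
    For a multiplier lam, each segment independently minimizes
    emissions + lam * cost; bisect lam until total cost fits the budget.
    lam itself estimates the GHG saving per additional budget dollar."""
    def pick(lam):
        # Tie-break toward the cheaper alternative at exact indifference.
        chosen = [min(alts, key=lambda a: (a[1] + lam * a[0], a[0]))
                  for alts in options]
        return chosen, sum(c for c, _ in chosen)

    lo, hi = 0.0, 1e6  # pick(hi) assumed feasible; pick(lo) may overspend
    for _ in range(100):
        mid = (lo + hi) / 2
        if pick(mid)[1] > budget:
            lo = mid
        else:
            hi = mid
    return pick(hi)[0]
```

With two segments offering a thick (expensive, low-emission) and a thin (cheap, higher-emission) overlay each, the dual picks the mix that exhausts but respects the budget.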

  1. Optimal allocation of conservation effort among subpopulations of a threatened species: how important is patch quality?

    PubMed

    Chauvenet, Aliénor L M; Baxter, Peter W J; McDonald-Madden, Eve; Possingham, Hugh P

    2010-04-01

Money is often a limiting factor in conservation, and attempting to conserve endangered species can be costly. Consequently, a framework for optimizing fiscally constrained conservation decisions for a single species is needed. In this paper we find the optimal budget allocation among isolated subpopulations of a threatened species to minimize local extinction probability. We solve the problem using stochastic dynamic programming, derive a useful and simple alternative guideline for allocating funds, and test its performance using forward simulation. The model considers subpopulations that persist in habitat patches of differing quality, which in our model is reflected in different relationships between money invested and extinction risk. We discover that, in most cases, subpopulations that are less efficient to manage should receive more money than those that are more efficient to manage, because greater investment is needed to reduce their extinction risk. Our simple investment guideline performs almost as well as the exact optimal strategy. We illustrate our approach with a case study of the management of the Sumatran tiger, Panthera tigris sumatrae, in Kerinci Seblat National Park (KSNP), Indonesia. We find that different budgets should be allocated to the separate tiger subpopulations in KSNP. The subpopulation that is not at risk of extinction does not require any management investment. Based on the combination of risks of extinction and habitat quality, the optimal allocation for these particular tiger subpopulations is an unusual case: subpopulations that occur in higher-quality habitat (more efficient to manage) should receive more funds than the remaining subpopulation that is in lower-quality habitat. Because the yearly budget allocated to the KSNP for tiger conservation is small, to guarantee the persistence of all the subpopulations that are currently under threat, we need to prioritize those that are easier to save. 
When allocating resources among subpopulations of a threatened species, the combined effects of differences in habitat quality, cost of action, and current subpopulation probability of extinction need to be integrated. We provide a useful guideline for allocating resources among isolated subpopulations of any threatened species.
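The stochastic dynamic program itself is model-specific, but the budget-allocation structure can be illustrated with a deterministic toy DP; the exponential risk model p0[i]*exp(-eff[i]*x) and the integer budget units are our illustrative assumptions, not the paper's:

```python
import math

def allocate(p0, eff, budget):
    """Split `budget` integer units among habitat patches to minimize the
    expected number of local extinctions, sum_i p0[i]*exp(-eff[i]*x_i),
    where p0 is the unmanaged extinction risk and eff the management
    efficiency of each patch."""
    def risk(i, x):
        return p0[i] * math.exp(-eff[i] * x)

    # cur[b] = (best objective over patches 0..i using b units, allocation)
    cur = [(risk(0, b), [b]) for b in range(budget + 1)]
    for i in range(1, len(p0)):
        cur = [min((cur[b - x][0] + risk(i, x), cur[b - x][1] + [x])
                   for x in range(b + 1))
               for b in range(budget + 1)]
    return cur[budget]
```

For two patches with p0 = [0.9, 0.5] and efficiencies [0.1, 1.0] and 3 budget units, the optimum gives the more efficient patch two units and the less efficient one unit.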

  2. The Fiscal Year 1993 Bush Budget: Still Not Tackling the Nation's Problems.

    ERIC Educational Resources Information Center

    Greenstein, Robert; Leonard, Paul A.

    On January 29, 1992, the Bush Administration unveiled its fiscal year 1993 budget. An examination of the budget reveals a substantial gap between the administration's rhetoric concerning the budget and what the budget actually contains. An analysis reveals a budget that continues to give priority to defense over domestic spending, one that favors…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.

When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.

  4. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
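One simple instance of this resource-allocation view (the activity list and the additive, independent risk-reduction model are hypothetical simplifications; the actual tool is far richer):

```python
def plan_assurance(activities, budget):
    """Greedy sketch: fund the assurance activity with the highest
    risk-reduction per unit cost first, until the budget is exhausted.
    activities: list of (name, cost, risk_reduction) tuples."""
    remaining = budget
    chosen = []
    for name, cost, reduction in sorted(
            activities, key=lambda a: a[2] / a[1], reverse=True):
        if cost <= remaining:
            chosen.append(name)
            remaining -= cost
    return chosen, budget - remaining
```

For instance, given unit tests (cost 3, reduction 9), code inspection (cost 4, reduction 8), and a design review (cost 5, reduction 5) under a budget of 8, the greedy plan funds unit tests and inspection, spending 7.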

  5. Minimization of socioeconomic disruption for displaced populations following disasters.

    PubMed

    El-Anwar, Omar; El-Rayes, Khaled; Elnashai, Amr

    2010-07-01

In the aftermath of catastrophic natural disasters such as hurricanes, tsunamis and earthquakes, emergency management agencies come under intense pressure to provide temporary housing to address the large-scale displacement of the vulnerable population. Temporary housing is essential to enable displaced families to reestablish their normal daily activities until permanent housing solutions can be provided. Temporary housing decisions, however, have often been criticized for their failure to fulfil the socioeconomic needs of the displaced families within acceptable budgets. This paper presents the development of (1) socioeconomic disruption metrics that are capable of quantifying the socioeconomic impacts of temporary housing decisions on displaced populations; and (2) a robust multi-objective optimization model for temporary housing that is capable of simultaneously minimizing socioeconomic disruptions and public expenditures in an effective and efficient manner. A large-scale application example is optimized to illustrate the use of the model and demonstrate its capabilities in generating optimal plans for realistic temporary housing problems.

  6. Maximizing algebraic connectivity in air transportation networks

    NASA Astrophysics Data System (ADS)

    Wei, Peng

In air transportation networks, robustness to node and link failures is a key design factor. An experiment based on a real air transportation network is performed to show that algebraic connectivity is a good measure of network robustness. Three optimization problems of algebraic connectivity maximization are then formulated in order to find the most robust network design under different constraints. The algebraic connectivity maximization problem with flight route addition or deletion is formulated first, and three methods to optimize and analyze the network algebraic connectivity are proposed. The Modified Greedy Perturbation Algorithm (MGP) provides a sub-optimal solution in a fast iterative manner. The Weighted Tabu Search (WTS) is designed to offer a near-optimal solution at longer running time. Relaxed semi-definite programming (SDP) is used to set a performance upper bound, and three rounding techniques are discussed to find feasible solutions. The simulation results present the trade-offs among the three methods. The case study on the two air transportation networks of Virgin America and Southwest Airlines shows that the developed methods can be applied to real-world, large-scale networks. The algebraic connectivity maximization problem is extended by adding a leg-number constraint, which models the traveler's tolerance for the total number of connecting stops. Binary Semi-Definite Programming (BSDP) with a cutting plane method provides the optimal solution, while tabu search and 2-opt search heuristics find the optimal solution in small-scale networks and near-optimal solutions in large-scale networks. The third algebraic connectivity maximization problem, with an operating cost constraint, is formulated last: when the total operating cost budget is given, the number of edges to be added is not fixed, and each edge weight needs to be calculated instead of being pre-determined. 
It is illustrated that edge addition and weight assignment cannot be studied separately for the problem with an operating cost constraint. A relaxed SDP method with golden section search is therefore developed to solve both at the same time, and cluster decomposition is utilized to handle large-scale networks.
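The algebraic connectivity is the second-smallest eigenvalue λ₂ of the graph Laplacian L = D − A. A small self-contained estimator via power iteration (the shift c and iteration count are ad hoc choices of ours, not from the dissertation):

```python
def algebraic_connectivity(n, edges, iters=1000):
    """Estimate the Fiedler value (algebraic connectivity) lambda_2 of an
    unweighted graph: power iteration on c*I - L, projecting out the
    all-ones eigenvector that belongs to lambda_1 = 0."""
    # Build the Laplacian L = D - A.
    L = [[0.0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1.0
        L[v][v] += 1.0
        L[u][v] -= 1.0
        L[v][u] -= 1.0
    c = 2.0 * max(L[i][i] for i in range(n))  # upper bound on eigenvalues of L
    x = [float(i + 1) for i in range(n)]
    for _ in range(iters):
        mean = sum(x) / n
        x = [xi - mean for xi in x]           # deflate the constant mode
        y = [c * x[i] - sum(L[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    Lx = [sum(L[i][j] * x[j] for j in range(n)) for i in range(n)]
    return sum(x[i] * Lx[i] for i in range(n))  # Rayleigh quotient = lambda_2
```

For the path graph 0-1-2 this returns λ₂ = 1, and adding the closing edge (0, 2) raises it to 3, illustrating why edge addition increases robustness under this measure.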

  7. Optimization of black-box models with uncertain climatic inputs—Application to sunflower ideotype design

    PubMed Central

    Picheny, Victor; Trépos, Ronan; Casadebaig, Pierre

    2017-01-01

Accounting for the interannual climatic variations is a well-known issue for simulation-based studies of environmental systems. It often requires intensive sampling (e.g., averaging the simulation outputs over many climatic series), which hinders many sequential processes, in particular optimization algorithms. We propose here an approach based on a subset selection in a large basis of climatic series, using an ad-hoc similarity function and clustering. A non-parametric reconstruction technique is introduced to accurately estimate the distribution of the output of interest using only the subset sampling. The proposed strategy is non-intrusive and generic (i.e., transposable to most models with climatic data inputs), and can be combined with most “off-the-shelf” optimization solvers. We apply our approach to sunflower ideotype design using the crop model SUNFLO. The underlying optimization problem is formulated as a multi-objective one to account for risk-aversion. Our approach achieves good performance even for limited computational budgets, significantly outperforming standard strategies. PMID:28542198

  8. Optimization methods for decision making in disease prevention and epidemic control.

    PubMed

    Deng, Yan; Shen, Siqian; Vorobeychik, Yevgeniy

    2013-11-01

This paper investigates problems of disease prevention and epidemic control (DPEC), in which we optimize two sets of decisions: (i) vaccinating individuals and (ii) closing locations, given respective budgets, with the goal of minimizing the expected number of infected individuals after intervention. The spread of diseases is inherently stochastic due to the uncertainty about disease transmission and human interaction. We use a bipartite graph to represent individuals' propensities to visit a set of locations, and formulate two integer nonlinear programming models to optimize the choices of individuals to vaccinate and locations to close. Our first model assumes that if a location is closed, its visitors stay in a safe location and will not visit other locations. Our second model incorporates compensatory behavior by assuming multiple behavioral groups, each always visiting the most preferred locations that remain open. The paper develops algorithms based on a greedy strategy, dynamic programming, and integer programming, and compares their computational efficiency and solution quality. We test problem instances derived from daily behavior patterns of 100 randomly chosen individuals (corresponding to 195 locations) in Portland, Oregon, and provide policy insights regarding the use of the two DPEC models. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. An approximation algorithm for the Noah's Ark problem with random feature loss.

    PubMed

    Hickey, Glenn; Blanchette, Mathieu; Carmi, Paz; Maheshwari, Anil; Zeh, Norbert

    2011-01-01

    The phylogenetic diversity (PD) of a set of species is a measure of their evolutionary distinctness based on a phylogenetic tree. PD is increasingly being adopted as an index of biodiversity in ecological conservation projects. The Noah's Ark Problem (NAP) is an NP-Hard optimization problem that abstracts a fundamental conservation challenge in asking to maximize the expected PD of a set of taxa given a fixed budget, where each taxon is associated with a cost of conservation and a probability of extinction. Only simplified instances of the problem, where one or more parameters are fixed as constants, have so far been addressed in the literature. Furthermore, it has been argued that PD is not an appropriate metric for models that allow information to be lost along paths in the tree. We therefore generalize the NAP to incorporate a proposed model of feature loss according to an exponential distribution and term this problem NAP with Loss (NAPL). In this paper, we present a pseudopolynomial time approximation scheme for NAPL.
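
The budget-constrained selection at the heart of the NAP resembles a 0/1 knapsack, which admits the classic pseudopolynomial dynamic program over integer costs. The sketch below is a deliberate simplification with made-up numbers: it treats each taxon's expected diversity contribution as independent, whereas the actual NAP objective is tree-based expected PD.

```python
# Pseudopolynomial 0/1 knapsack DP: best[b] holds the maximum total expected
# value attainable with budget exactly <= b, updated item by item.

def knapsack(items, budget):
    """items: list of (integer_cost, expected_value); returns max total value."""
    best = [0.0] * (budget + 1)
    for cost, value in items:
        # Iterate budgets downward so each item is used at most once.
        for b in range(budget, cost - 1, -1):
            best[b] = max(best[b], best[b - cost] + value)
    return best[budget]

# Hypothetical (cost, survival_probability * branch_length) for four taxa
items = [(3, 4.0), (4, 5.0), (2, 3.0), (5, 8.0)]
print(knapsack(items, budget=7))  # → 11.0 (taxa with costs 5 and 2)
```

The DP runs in O(n · budget) time, which is polynomial in the numeric value of the budget but not in its bit length, hence "pseudopolynomial".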

  10. Risk Reduction and Resource Pooling on a Cooperation Task

    ERIC Educational Resources Information Center

    Pietras, Cynthia J.; Cherek, Don R.; Lane, Scott D.; Tcheremissine, Oleg

    2006-01-01

    Two experiments investigated choice in adult humans on a simulated cooperation task to evaluate a risk-reduction account of sharing based on the energy-budget rule. The energy-budget rule is an optimal foraging model that predicts risk-averse choices when net energy gains exceed energy requirements (positive energy budget) and risk-prone choices…

  11. Defense Dollars and Sense: A Common Cause Guide to the Defense Budget Process.

    ERIC Educational Resources Information Center

    Rovner, Mark

    Designed to increase public awareness of military spending, this 5-part guide examines the process and problems of preparing the national defense budget. The publication begins with a brief overview of the 1984 defense budget. Major military programs, trends in budgeting, and key concerns in budget formulation are explored. Graphs and charts…

  12. A Position and Rate Control System: An Ingredient for Budget Planning.

    ERIC Educational Resources Information Center

    Gilbert, Linda L.

    A position and rate control system was undertaken at Florida State University in 1974 to alleviate the problems of the manual budgeting system. The budget master file was created biweekly by combining a subset of the current payroll/personnel data base with the updated budget information from the previous budget master file, keying on positional…

  13. Strategy And The Spreadsheet: Optimizing The Total Army To Satisfy Both

    DTIC Science & Technology

    2016-02-11

    historically reduces military end strength at the conclusion of major conflicts. The Budget Control Act of 2011 imposed sequestration spending limits on...the military that began the process of drawing down the military through fiscal year 2021. While the 2016 defense budget delays sequestration cuts... budget by a wide margin, has started repeating a historical cycle of budget driven defense cuts. The Army’s large force represents an attractive

  14. Backtracking search algorithm in CVRP models for efficient solid waste collection and route optimization.

    PubMed

    Akhtar, Mahmuda; Hannan, M A; Begum, R A; Basri, Hassan; Scavino, Edgar

    2017-03-01

    Waste collection is an important part of waste management that involves different issues, including environmental, economic, and social, among others. Waste collection optimization can reduce the waste collection budget and environmental emissions by reducing the collection route distance. This paper presents a modified Backtracking Search Algorithm (BSA) in capacitated vehicle routing problem (CVRP) models with the smart bin concept to find the best optimized waste collection route solutions. The objective function minimizes the sum of the waste collection route distances. The study introduces the concept of the threshold waste level (TWL) of waste bins to reduce the number of bins to be emptied by finding an optimal range, thus minimizing the distance. A scheduling model is also introduced to compare the feasibility of the proposed model with that of the conventional collection system in terms of travel distance, collected waste, fuel consumption, fuel cost, efficiency and CO2 emission. The optimal TWL was found to be between 70% and 75% of the fill level of waste collection nodes and had the maximum tightness value for different problem cases. The obtained results for four days show a 36.80% distance reduction for 91.40% of the total waste collection, which eventually increases the average waste collection efficiency by 36.78% and reduces the fuel consumption, fuel cost and CO2 emission by 50%, 47.77% and 44.68%, respectively. Thus, the proposed optimization model can be considered a viable tool for optimizing waste collection routes to reduce economic costs and environmental impacts. Copyright © 2017 Elsevier Ltd. All rights reserved.
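
The threshold-waste-level idea can be sketched in a few lines; the bins and coordinates below are hypothetical, and a simple nearest-neighbour tour stands in for the paper's BSA-optimized CVRP routing.

```python
from math import dist

# Only bins filled at or above the TWL are visited; the visit order is then
# built greedily from the depot (a stand-in for the optimized route).

def route(bins, depot=(0.0, 0.0), twl=0.70):
    """bins: dict name -> ((x, y), fill_fraction). Returns visit order."""
    todo = {name: pos for name, (pos, fill) in bins.items() if fill >= twl}
    order, here = [], depot
    while todo:
        nearest = min(todo, key=lambda n: dist(here, todo[n]))
        order.append(nearest)
        here = todo.pop(nearest)
    return order

bins = {
    "b1": ((1.0, 1.0), 0.90),
    "b2": ((5.0, 0.0), 0.40),   # below threshold: skipped entirely
    "b3": ((2.0, 2.0), 0.75),
    "b4": ((0.0, 3.0), 0.80),
}
print(route(bins))  # → ['b1', 'b3', 'b4']
```

Raising the TWL shrinks the set of nodes the CVRP must cover, which is precisely how the 70-75% range reduces route distance in the study.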

  15. Determining the Optimal Work Breakdown Structure for Defense Acquisition Contracts

    DTIC Science & Technology

    2016-03-24

    programs. Public utility corresponds with the generally understood concept that having more money is desirable, and having less money is not desirable...From this perspective, program completion on budget provides maximum utility , while being over budget reduces utility as there is less money for other...tree. Utility theory tools were applied using three utility perspectives, and optimal WBSs were identified. Results demonstrated that reporting at WBS

  16. Department of the Navy Supporting Data for Fiscal Year 1983 Budget Estimates Descriptive Summaries Submitted to Congress February 1982. Research, Development, Test & Evaluation, Navy. Book 1 of 3. Technology Base, Advanced Technology Development, Strategic Programs.

    DTIC Science & Technology

    1982-02-01

    optimization methods have been developed for problems in production and distribution modeling including design and evaluation of storage alternatives under...and winds using high-frequency, X-band doppler, pulse-limited, and Delta-K radars. Development of millimeter-wave radiometric imaging systems and...generic system design concept for a system capable of defending the Fleet from the high angle threat 1.4 The first model of the drive system for a

  17. Towards a hierarchical optimization modeling framework for ...

    EPA Pesticide Factsheets

    Background:Bilevel optimization has been recognized as a 2-player Stackelberg game where players are represented as leaders and followers and each pursue their own set of objectives. Hierarchical optimization problems, which are a generalization of bilevel, are especially difficult because the optimization is nested, meaning that the objectives of one level depend on solutions to the other levels. We introduce a hierarchical optimization framework for spatially targeting multiobjective green infrastructure (GI) incentive policies under uncertainties related to policy budget, compliance, and GI effectiveness. We demonstrate the utility of the framework using a hypothetical urban watershed, where the levels are characterized by multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities), and objectives include minimization of policy cost, implementation cost, and risk; reduction of combined sewer overflow (CSO) events; and improvement in environmental benefits such as reduced nutrient run-off and water availability. Conclusions: While computationally expensive, this hierarchical optimization framework explicitly simulates the interaction between multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities) and is especially useful for constructing and evaluating environmental and ecological policy. Using the framework with a hypothetical urba
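
The leader-follower structure described in the abstract can be illustrated with a tiny bilevel (Stackelberg) sketch. All payoffs and parameters below are hypothetical, and exhaustive search over a few incentive levels stands in for the framework's nested optimization.

```python
# Minimal bilevel sketch: the leader (policy maker) picks an incentive level;
# each follower (landowner) adopts green infrastructure only if the incentive
# covers their private cost; the leader minimizes policy cost plus a penalty
# for missing a runoff-reduction target.

adoption_costs = [2.0, 4.0, 7.0]   # hypothetical per-follower GI costs
runoff_per_adopter = 1.0           # runoff reduction per adopting follower
target = 2.0                       # required total runoff reduction
penalty = 100.0                    # leader penalty per unit of shortfall

def leader_cost(incentive):
    # Follower best response: adopt iff the incentive covers the private cost.
    adopters = sum(1 for c in adoption_costs if incentive >= c)
    shortfall = max(0.0, target - adopters * runoff_per_adopter)
    return incentive * adopters + penalty * shortfall

# Leader optimizes over candidate incentive levels, anticipating responses.
best = min((leader_cost(i), i) for i in [0.0, 2.0, 4.0, 7.0])
print(best)  # → (8.0, 4.0): pay 4.0 to each of two adopters, target met
```

The nesting is visible even at this scale: the leader's objective cannot be evaluated without first solving the followers' decision problem, which is what makes hierarchical formulations computationally expensive.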

  18. Mixed Transportation Network Design under a Sustainable Development Perspective

    PubMed Central

    Qin, Jin; Ni, Ling-lin; Shi, Feng

    2013-01-01

    A mixed transportation network design problem considering sustainable development was studied in this paper. Based on the discretization of continuous link-grade decision variables, a bilevel programming model was proposed to describe the problem, in which sustainability factors, including vehicle exhaust emissions, land-use scale, link load, and financial budget, are considered. The objective of the model is to minimize the total amount of resources exploited under the premise of meeting all the construction goals. A heuristic algorithm, which combined the simulated annealing and path-based gradient projection algorithm, was developed to solve the model. The numerical example shows that the transportation network optimized with the method above not only significantly alleviates the congestion on the link, but also reduces vehicle exhaust emissions within the network by up to 41.56%. PMID:23476142

  19. Mixed transportation network design under a sustainable development perspective.

    PubMed

    Qin, Jin; Ni, Ling-lin; Shi, Feng

    2013-01-01

    A mixed transportation network design problem considering sustainable development was studied in this paper. Based on the discretization of continuous link-grade decision variables, a bilevel programming model was proposed to describe the problem, in which sustainability factors, including vehicle exhaust emissions, land-use scale, link load, and financial budget, are considered. The objective of the model is to minimize the total amount of resources exploited under the premise of meeting all the construction goals. A heuristic algorithm, which combined the simulated annealing and path-based gradient projection algorithm, was developed to solve the model. The numerical example shows that the transportation network optimized with the method above not only significantly alleviates the congestion on the link, but also reduces vehicle exhaust emissions within the network by up to 41.56%.

  20. Optimization Technique With Sensitivity Analysis On Menu Scheduling For Boarding School Student Aged 13-18 Using “Sufahani-Ismail Algorithm”

    NASA Astrophysics Data System (ADS)

    Sudin, Azila M.; Sufahani, Suliadi

    2018-04-01

    Boarding school students aged 13-18 need to eat nutritious meals which contain proper calories, vitality and nutrients for appropriate development, with a specific end goal to repair and upkeep the body tissues. Furthermore, it averts undesired diseases and contamination. Serving healthier food is a noteworthy stride towards accomplishing that goal. However, arranging a nutritious and balanced menu manually is convoluted, inefficient and tedious. Therefore, the aim of this study is to develop a mathematical model with an optimization technique for menu scheduling that fulfills the whole nutrient prerequisite for boarding school students, reduces processing time, minimizes the budget and furthermore serves an assortment of food each day. It additionally gives the flexibility for the cook to choose any food to be considered at the beginning of the process and to change any preferred menu even after the ideal arrangement and optimal solution has been obtained. This is called sensitivity analysis. A recalculation procedure will be performed in light of the ideal arrangement, and a seven-day menu was produced. The data was gathered from the Malaysian Ministry of Education and school authorities. Menu planning is a known optimization problem; therefore Binary Programming alongside an optimization technique and the “Sufahani-Ismail Algorithm” were utilized to solve this problem. In future, this model can be applied to other menu problems, for example, for sports, chronic disease patients, militaries, colleges, hospitals and nursing homes.
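
A binary-programming menu problem of this kind can be sketched at toy scale; the foods, nutrient values and minimums below are made up, and brute-force enumeration over 0/1 choices stands in for the paper's solution method.

```python
from itertools import product

# Toy binary program: choose a subset of foods meeting calorie and protein
# minimums at least cost. Each food is either in the menu (1) or out (0).

foods = [  # (name, cost, calories, protein) -- hypothetical data
    ("rice",    1.0, 400, 5),
    ("chicken", 3.0, 300, 30),
    ("beans",   1.5, 250, 15),
    ("milk",    1.2, 150, 8),
]
MIN_CAL, MIN_PROT = 600, 30

best_cost, best_menu = float("inf"), None
for picks in product([0, 1], repeat=len(foods)):
    cal = sum(p * f[2] for p, f in zip(picks, foods))
    prot = sum(p * f[3] for p, f in zip(picks, foods))
    cost = sum(p * f[1] for p, f in zip(picks, foods))
    if cal >= MIN_CAL and prot >= MIN_PROT and cost < best_cost:
        best_cost, best_menu = cost, [f[0] for p, f in zip(picks, foods) if p]
print(best_cost, best_menu)  # → 4.0 ['rice', 'chicken']
```

Sensitivity analysis, as described in the abstract, would amount to fixing or excluding chosen foods (forcing some binary variables to 0 or 1) and re-solving from the previous optimum.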

  1. Field-Based Optimal Placement of Antennas for Body-Worn Wireless Sensors

    PubMed Central

    Januszkiewicz, Łukasz; Di Barba, Paolo; Hausman, Sławomir

    2016-01-01

    We investigate a case of automated energy-budget-aware optimization of the physical position of nodes (sensors) in a Wireless Body Area Network (WBAN). This problem has not been presented in the literature yet, as opposed to antenna and routing optimization, which are relatively well-addressed. In our research, which was inspired by a safety-critical application for firefighters, the sensor network consists of three nodes located on the human body. The nodes communicate over a radio link operating in the 2.4 GHz or 5.8 GHz ISM frequency band. Two sensors have a fixed location: one on the head (earlobe pulse oximetry) and one on the arm (with accelerometers, temperature and humidity sensors, and a GPS receiver), while the position of the third sensor can be adjusted within a predefined region on the wearer’s chest. The path loss between each node pair strongly depends on the location of the nodes and is difficult to predict without performing a full-wave electromagnetic simulation. Our optimization scheme employs evolutionary computing. The novelty of our approach lies not only in the formulation of the problem but also in linking a fully automated optimization procedure with an electromagnetic simulator and a simplified human body model. This combination turns out to be a computationally effective solution, which, depending on the initial placement, has a potential to improve performance of our example sensor network setup by up to about 20 dB with respect to the path loss between selected nodes. PMID:27196911

  2. U.S. Geological Survey Would Fare Well in Proposed Federal Budget

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2010-02-01

    The U.S. Geological Survey (USGS) is among the U.S. federal science agencies that would see significant funding increases if Congress approves the Obama administration's proposed budget for fiscal year (FY) 2011. The FY 2011 budget request would provide USGS with $1.13 billion, an increase of $21.6 million, or 1.9%, above the FY 2010 enacted level. “In a time of budget austerity, to have the budget for a science agency like the USGS actually be at a level above 2010—and 2010 was a pretty good budget year for the USGS—is indeed a very good sign,” USGS director Marcia McNutt said at a 1 February budget briefing. “What we are seeing in the USGS budget is the reflection from both the president and the secretary [of the Department of the Interior, of which USGS is part] of their commitment that the problems that the nation is facing right now are problems to which science can help us find an answer,” she said.

  3. Cost-effectiveness of the streamflow-gaging program in Wyoming

    USGS Publications Warehouse

    Druse, S.A.; Wahl, K.L.

    1988-01-01

    This report documents the results of a cost-effectiveness study of the streamflow-gaging program in Wyoming. Regression analysis or hydrologic flow-routing techniques were considered for 24 combinations of stations from a 139-station network operated in 1984 to investigate suitability of techniques for simulating streamflow records. Only one station was determined to have sufficient accuracy in the regression analysis to consider discontinuance of the gage. The evaluation of the gaging-station network, which included the use of associated uncertainty in streamflow records, is limited to the nonwinter operation of the 47 stations operated by the Riverton Field Office of the U.S. Geological Survey. The current (1987) travel routes and measurement frequencies require a budget of $264,000 and result in an average standard error in streamflow records of 13.2%. Changes in routes and station visits, using the same budget, could optimally reduce the standard error by 1.6%. Budgets evaluated ranged from $235,000 to $400,000. A $235,000 budget increased the optimal average standard error per station from 11.6 to 15.5%, and a $400,000 budget could reduce it to 6.6%. For all budgets considered, lost record accounts for about 40% of the average standard error. (USGS)

  4. 42 CFR 441.464 - State assurances.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Self-Directed Personal Assistance Services Program § 441.464 State assurances. A State must assure that... problems that might be associated with budget underutilization. (2) These safeguards may include the... that budget problems are identified on a timely basis so that corrective action may be taken, if...

  5. Getting it right when budgets are tight: Using optimal expansion pathways to prioritize responses to concentrated and mixed HIV epidemics.

    PubMed

    Stuart, Robyn M; Kerr, Cliff C; Haghparast-Bidgoli, Hassan; Estill, Janne; Grobicki, Laura; Baranczuk, Zofia; Prieto, Lorena; Montañez, Vilma; Reporter, Iyanoosh; Gray, Richard T; Skordis-Worrall, Jolene; Keiser, Olivia; Cheikh, Nejma; Boonto, Krittayawan; Osornprasop, Sutayut; Lavadenz, Fernando; Benedikt, Clemens J; Martin-Hughes, Rowan; Hussain, S Azfar; Kelly, Sherrie L; Kedziora, David J; Wilson, David P

    2017-01-01

    Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions associated with a range of budget envelopes. We constructed "investment staircases", a useful tool for understanding investment priorities. These were used to estimate the best attainable cost-effectiveness of the response at each investment level. We find that when budgets are very limited, the optimal HIV response consists of a smaller number of 'core' interventions. As budgets increase, those core interventions should first be scaled up, and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. It is important for HIV programming to respond effectively to the overall level of funding availability. The analytic tools presented here can help to guide program planners understand the most cost-effective HIV responses and plan for an uncertain future.

  6. On the use of genetic algorithm to optimize industrial assets lifecycle management under safety and budget constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lonchampt, J.; Fessart, K.

    2013-07-01

    The purpose of this paper is to describe the method and tool dedicated to optimizing investment planning for industrial assets. These investments may either be preventive maintenance tasks, asset enhancements or logistic investments such as spare parts purchases. The three methodological points to investigate in such an issue are: 1. The measure of the profitability of a portfolio of investments; 2. The selection and planning of an optimal set of investments; 3. The measure of the risk of a portfolio of investments. The measure of the profitability of a set of investments in the IPOP tool is synthesised in the Net Present Value indicator. The NPV is the sum of the differences of discounted cash flows (direct costs, forced outages...) between the situations with and without a given investment. These cash flows are calculated through a pseudo-Markov reliability model representing independently the components of the industrial asset and the spare parts inventories. The component model has been widely discussed over the years, but the spare part model is a new one based on some approximations that will be discussed. This model, referred to as the NPV function, takes an investment portfolio as input and returns its NPV. The second issue is to optimize the NPV. If all investments were independent, this optimization would be an easy calculation; unfortunately, there are two sources of dependency. The first is introduced by the spare part model: although components are independent in their reliability models, the fact that several components use the same inventory induces a dependency. The second dependency comes from economic, technical or logistic constraints, such as a global maintenance budget limit or a safety requirement limiting the residual risk of failure of a component or group of components, making the aggregation of individual optima not necessarily feasible. The algorithm used to solve such a difficult optimization problem is a genetic algorithm. After a description of the features of the software, a test case is presented showing the influence of the optimization algorithm parameters on its efficiency in finding an optimal investment planning. (authors)
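
A genetic algorithm for this kind of budget-constrained portfolio selection can be sketched as follows. The NPVs, costs and GA parameters are hypothetical, and a simple penalty replaces the paper's NPV function and constraint handling.

```python
import random

# Toy GA: binary chromosomes select investments; fitness is total NPV, with
# over-budget portfolios penalized proportionally to the budget overrun.

random.seed(0)
npv  = [12.0, 7.0, 9.0, 4.0, 11.0]   # hypothetical per-investment NPVs
cost = [5.0, 3.0, 4.0, 2.0, 6.0]     # hypothetical per-investment costs
BUDGET = 10.0

def fitness(bits):
    c = sum(b * x for b, x in zip(bits, cost))
    v = sum(b * x for b, x in zip(bits, npv))
    return v if c <= BUDGET else v - 1000.0 * (c - BUDGET)  # infeasibility penalty

def evolve(pop_size=30, gens=60):
    pop = [[random.randint(0, 1) for _ in npv] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(npv))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:             # bit-flip mutation
                i = random.randrange(len(npv))
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The penalty term is one common way to fold a budget or safety constraint into a GA; the dependency through shared spare-part inventories described in the abstract would make `fitness` a function of the whole portfolio rather than a sum of independent terms.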

  7. Optimized Solvent for Energy-Efficient, Environmentally-Friendly Capture of CO{sub 2} at Coal-Fired Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farthing, G. A.; Rimpf, L. M.

    The overall goal of this project, as originally proposed, was to optimize the formulation of a novel solvent as a critical enabler for the cost-effective, energy-efficient, environmentally-friendly capture of CO{sub 2} at coal-fired utility plants. Aqueous blends of concentrated piperazine (PZ) with other compounds had been shown to exhibit high rates of CO{sub 2} absorption, low regeneration energy, and other desirable performance characteristics during an earlier 5-year development program conducted by B&W. The specific objective of this project was to identify PZ-based solvent formulations that globally optimize the performance of coal-fired power plants equipped with CO{sub 2} scrubbing systems. While previous solvent development studies have tended to focus on energy consumption and absorber size, important issues to be sure, the current work seeks to explore, understand, and optimize solvent formulation across the full gamut of issues related to commercial application of the technology: capital and operating costs, operability, reliability, environmental, health and safety (EH&S), etc. Work on the project was intended to be performed under four budget periods. The objective of the work in the first budget period has been to identify several candidate formulations of a concentrated PZ-based solvent for detailed characterization and evaluation. Work in the second budget period would generate reliable and comprehensive property and performance data for the identified formulations. Work in the third budget period would quantify the expected performance of the selected formulations in a commercial CO{sub 2} scrubbing process. Finally, work in the fourth budget period would provide a final technology feasibility study and a preliminary technology EH&S assessment. Due to other business priorities, however, B&W has requested that this project be terminated at the end of the first budget period. This document therefore serves as the final report for this project. It is the first volume of the two-volume final report and summarizes Budget Period 1 accomplishments under Tasks 1-5 of the project, including the selection of four solvent formulations for further study.

  8. A Bayesian approach for incorporating economic factors in sample size design for clinical trials of individual drugs and portfolios of drugs.

    PubMed

    Patel, Nitin R; Ankolekar, Suresh

    2007-11-30

    Classical approaches to clinical trial design ignore economic factors that determine economic viability of a new drug. We address the choice of sample size in Phase III trials as a decision theory problem using a hybrid approach that takes a Bayesian view from the perspective of a drug company and a classical Neyman-Pearson view from the perspective of regulatory authorities. We incorporate relevant economic factors in the analysis to determine the optimal sample size to maximize the expected profit for the company. We extend the analysis to account for risk by using a 'satisficing' objective function that maximizes the chance of meeting a management-specified target level of profit. We extend the models for single drugs to a portfolio of clinical trials and optimize the sample sizes to maximize the expected profit subject to budget constraints. Further, we address the portfolio risk and optimize the sample sizes to maximize the probability of achieving a given target of expected profit.
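
The core idea of choosing a sample size to maximize expected profit can be illustrated with a stdlib-only sketch. The economics (revenue, per-patient cost), the fixed effect size, and the grid search are all hypothetical simplifications of the paper's Bayesian decision-theoretic model.

```python
from math import erf, sqrt

# Expected profit = P(trial succeeds) * revenue - trial cost, with the success
# probability taken from a normal-approximation power calculation for a
# two-arm trial at an assumed standardized effect size.

def power(n, effect=0.3, alpha_z=1.96):
    """Approximate power with n patients per arm: Phi(effect*sqrt(n/2) - z_alpha)."""
    z = effect * sqrt(n / 2.0) - alpha_z
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF via erf

def expected_profit(n, revenue=500.0, cost_per_patient=0.05):
    return power(n) * revenue - cost_per_patient * 2 * n   # cost for both arms

# Grid search over candidate sample sizes (per arm).
best_n = max(range(50, 1001, 50), key=expected_profit)
print(best_n, round(expected_profit(best_n), 1))
```

The trade-off is visible in the two terms: power (and hence expected revenue) saturates toward 1 while trial cost grows linearly in n, so the expected profit peaks at an interior sample size rather than at the largest affordable trial.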

  9. California's Budget Problems Leave Community Colleges Holding IOU's

    ERIC Educational Resources Information Center

    Keller, Josh

    2009-01-01

    When California approved its budget last month, the community-college system managed to escape the sharp budget cuts that befell most other agencies. But the state's fiscal troubles have nonetheless created a cash crisis for two-year colleges. As part of its plan to close a $41-billion budget deficit, California will delay providing $540-million…

  10. Cost versus life cycle assessment-based environmental impact optimization of drinking water production plants.

    PubMed

    Capitanescu, F; Rege, S; Marvuglia, A; Benetto, E; Ahmadi, A; Gutiérrez, T Navarrete; Tiruta-Barna, L

    2016-07-15

    Empowering decision makers with cost-effective solutions for reducing the environmental burden of industrial processes, at both the design and operation stages, is nowadays a major worldwide concern. The paper addresses this issue for the sector of drinking water production plants (DWPPs), seeking optimal solutions that trade off operating cost and life cycle assessment (LCA)-based environmental impact while satisfying outlet water quality criteria. This leads to a challenging bi-objective constrained optimization problem, which relies on a computationally expensive intricate process-modelling simulator of the DWPP and has to be solved with a limited computational budget. Since mathematical programming methods are unusable in this case, the paper examines the performances in tackling these challenges of six off-the-shelf state-of-the-art global meta-heuristic optimization algorithms, suitable for such simulation-based optimization, namely Strength Pareto Evolutionary Algorithm (SPEA2), Non-dominated Sorting Genetic Algorithm (NSGA-II), Indicator-based Evolutionary Algorithm (IBEA), Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), Differential Evolution (DE), and Particle Swarm Optimization (PSO). The results of optimization reveal that good reductions in both the operating cost and the environmental impact of the DWPP can be obtained. Furthermore, NSGA-II outperforms the other competing algorithms while MOEA/D and DE perform unexpectedly poorly. Copyright © 2016 Elsevier Ltd. All rights reserved.
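
A bi-objective trade-off like cost versus environmental impact is summarized by its Pareto front. The minimal non-dominated filter below (both objectives minimized) is the building block inside NSGA-II-style algorithms; the (cost, impact) points are hypothetical.

```python
# Keep the points not dominated by any other point, where q dominates p when
# q is no worse than p in both objectives (and q != p). Both are minimized.

def pareto_front(points):
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (operating cost, LCA impact) pairs for candidate operating points
solutions = [(3.0, 9.0), (4.0, 7.0), (6.0, 4.0), (5.0, 8.0), (7.0, 6.0)]
print(pareto_front(solutions))  # → [(3.0, 9.0), (4.0, 7.0), (6.0, 4.0)]
```

Algorithms such as SPEA2 and NSGA-II differ mainly in how they rank and diversify candidate solutions around this dominance relation; the comparison in the paper is over which algorithm approximates the true front best under a fixed simulation budget.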

  11. Crisis Intervention and the Military Family: A Model Installation Program.

    DTIC Science & Technology

    1988-03-07

    abuse to less serious problems like budget management and good parenting techniques. Family dysfunction is defined in two categories: High and low intensity conflict. The...system activated to deal with the problem through the Family Advocacy Case Management Team. It is an excellent system that protects, helps and

  12. Fast optimization of multipump Raman amplifiers based on a simplified wavelength and power budget heuristic

    NASA Astrophysics Data System (ADS)

    de O. Rocha, Helder R.; Castellani, Carlos E. S.; Silva, Jair A. L.; Pontes, Maria J.; Segatto, Marcelo E. V.

    2015-01-01

    We report a simple budget heuristic for a fast optimization of multipump Raman amplifiers based on the reallocation of the pump wavelengths and the optical powers. A set of different optical fibers are analyzed as the Raman gain medium, and a four-pump amplifier setup is optimized for each of them in order to achieve ripples close to 1 dB and gains up to 20 dB in the C band. Later, a comparison between our proposed heuristic and a multiobjective optimization based on a nondominated sorting genetic algorithm is made, highlighting the fact that our new approach can give similar solutions after at least an order of magnitude fewer iterations. The results shown in this paper can potentially pave the way for real-time optimization of multipump Raman amplifier systems.

  13. Mars Geoscience Orbiter and Lunar Geoscience Orbiter

    NASA Technical Reports Server (NTRS)

    Fuldner, W. V.; Kaskiewicz, P. F.

    1983-01-01

    The feasibility of using the AE/DE Earth orbiting spacecraft design for the LGO and/or MGO missions was determined. Configurations were developed and subsystems analysis was carried out to optimize the suitability of the spacecraft to the missions. The primary conclusion is that the basic AE/DE spacecraft can readily be applied to the LGO mission with relatively minor, low risk modifications. The MGO mission poses a somewhat more complex problem, primarily due to the overall maneuvering hydrazine budget and power requirements of the sensors and their desired duty cycle. These considerations dictate a modification (scaling up) of the structure to support mission requirements.

  14. Getting it right when budgets are tight: Using optimal expansion pathways to prioritize responses to concentrated and mixed HIV epidemics

    PubMed Central

    Kerr, Cliff C.; Haghparast-Bidgoli, Hassan; Estill, Janne; Grobicki, Laura; Baranczuk, Zofia; Prieto, Lorena; Montañez, Vilma; Reporter, Iyanoosh; Gray, Richard T.; Skordis-Worrall, Jolene; Keiser, Olivia; Cheikh, Nejma; Boonto, Krittayawan; Osornprasop, Sutayut; Lavadenz, Fernando; Benedikt, Clemens J.; Martin-Hughes, Rowan; Hussain, S. Azfar; Kelly, Sherrie L.; Kedziora, David J.; Wilson, David P.

    2017-01-01

    Background Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. Methods We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions associated with a range of budget envelopes. We constructed “investment staircases”, a useful tool for understanding investment priorities. These were used to estimate the best attainable cost-effectiveness of the response at each investment level. Findings We find that when budgets are very limited, the optimal HIV response consists of a smaller number of ‘core’ interventions. As budgets increase, those core interventions should first be scaled up, and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. Significance It is important for HIV programming to respond effectively to the overall level of funding availability. The analytic tools presented here can help to guide program planners understand the most cost-effective HIV responses and plan for an uncertain future. PMID:28972975

  15. The role of family carers in the use of personal budgets by people with mental health problems.

    PubMed

    Hamilton, Sarah; Szymczynska, Paulina; Clewett, Naomi; Manthorpe, Jill; Tew, Jerry; Larsen, John; Pinfold, Vanessa

    2017-01-01

    Personal budgets aim to increase choice and independence for people with social care needs but they remain underused by people with mental health problems compared to other disability groups. The use of personal budgets may impact on families in a variety of ways, both positive and negative. This paper draws on interviews, undertaken in 2012-2013 with 18 family carers and 12 mental health service users, that explored experiences of family involvement in accessing and managing personal budgets for a person with mental health-related social care needs. The sample was drawn from three sites across England, with additional carers being recruited via voluntary sector networks. Our findings show that for many people with severe mental health needs who lack motivation and confidence to negotiate access to personal budgets, carers may provide the necessary support to enable them to benefit from this form of social care support. We illustrate the role carers may play in initiating, pursuing and maximising the level of support available through personal budgets. However, some carers interviewed considered that personal budget funding was reduced because of practitioners' assumptions about carers' willingness and ability to provide support. We also report perceived tensions between family carers and practitioners around appropriate involvement in decision-making. The study findings have implications for local authorities, practitioners and family carers in supporting the involvement of family carers in support for people with severe mental health problems. © 2015 John Wiley & Sons Ltd.

  16. Software for Analyzing Laminar-to-Turbulent Flow Transitions

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01

Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software. This is achieved by combining two prior programs in an innovative manner.

  17. Baseline budgeting for continuous improvement.

    PubMed

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  18. Break-even Analysis: Tool for Budget Planning

    ERIC Educational Resources Information Center

    Lohmann, Roger A.

    1976-01-01

Multiple funding sources create special management problems for the administrator of a human service agency. This article presents a useful analytic technique adapted from business practice that can help the administrator draw up and balance a unified budget. Such a budget also affords a reliable overview of the agency's financial status. (Author)

  19. A mathematical model for maximizing the value of phase 3 drug development portfolios incorporating budget constraints and risk.

    PubMed

    Patel, Nitin R; Ankolekar, Suresh; Antonijevic, Zoran; Rajicic, Natasa

    2013-05-10

    We describe a value-driven approach to optimizing pharmaceutical portfolios. Our approach incorporates inputs from research and development and commercial functions by simultaneously addressing internal and external factors. This approach differentiates itself from current practices in that it recognizes the impact of study design parameters, sample size in particular, on the portfolio value. We develop an integer programming (IP) model as the basis for Bayesian decision analysis to optimize phase 3 development portfolios using expected net present value as the criterion. We show how this framework can be used to determine optimal sample sizes and trial schedules to maximize the value of a portfolio under budget constraints. We then illustrate the remarkable flexibility of the IP model to answer a variety of 'what-if' questions that reflect situations that arise in practice. We extend the IP model to a stochastic IP model to incorporate uncertainty in the availability of drugs from earlier development phases for phase 3 development in the future. We show how to use stochastic IP to re-optimize the portfolio development strategy over time as new information accumulates and budget changes occur. Copyright © 2013 John Wiley & Sons, Ltd.
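The knapsack structure at the core of such a model can be sketched in a few lines. This toy replaces the paper's full integer program (which also optimizes sample sizes and trial schedules) with a plain 0/1 selection of trials maximizing expected NPV under a budget; the trial names and figures are hypothetical.

```python
# 0/1 knapsack by dynamic programming: choose which phase 3 trials to fund to
# maximize total expected NPV within a fixed budget. A simplified stand-in for
# the paper's IP model; the trials and numbers are invented.

def best_portfolio(trials, budget):
    """trials: list of (name, integer cost, expected NPV). Returns (NPV, names)."""
    # dp[b] = (best NPV, chosen trial names) achievable with budget b
    dp = [(0.0, [])] * (budget + 1)
    for name, cost, npv in trials:
        for b in range(budget, cost - 1, -1):   # reverse order: each trial used once
            cand = dp[b - cost][0] + npv
            if cand > dp[b][0]:
                dp[b] = (cand, dp[b - cost][1] + [name])
    return dp[budget]

trials = [("A", 30, 50.0), ("B", 20, 40.0), ("C", 50, 70.0), ("D", 10, 15.0)]
print(best_portfolio(trials, 60))  # best expected NPV within a budget of 60
```

Re-running `best_portfolio` with updated costs or NPVs is the deterministic analogue of the paper's re-optimization as new information accumulates and budgets change.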

  20. Optimizing Constrained Single Period Problem under Random Fuzzy Demand

    NASA Astrophysics Data System (ADS)

    Taleizadeh, Ata Allah; Shavandi, Hassan; Riazi, Afshin

    2008-09-01

In this paper, we consider the multi-product multi-constraint newsboy problem with random fuzzy demands and total discount. Product demand is often stochastic in the real world, but the parameters of its distribution function may be estimated in a fuzzy manner, so a random fuzzy variable is an appropriate way to model product demand. The objective of the proposed model is to maximize the newsboy's expected profit. We consider constraints such as warehouse space, restrictions on the order quantities of products, and a restriction on budget, and we also consider batch sizes for product orders. Finally, we introduce a random fuzzy multi-product multi-constraint newsboy problem (RFM-PM-CNP) and transform it into a multi-objective mixed integer nonlinear programming model. Furthermore, a hybrid intelligent algorithm based on a genetic algorithm, Pareto ranking and TOPSIS is presented for the developed model. An illustrative example demonstrates the performance of the developed model and algorithm.
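For orientation, the crisp single-product special case of this problem has a closed-form solution, the critical-fractile rule; a minimal sketch (with invented prices and an invented discrete demand distribution) follows. The fuzzy-stochastic, multi-constraint version in the paper has no such closed form, which is why a hybrid metaheuristic is needed.

```python
# Classical newsvendor: order q to maximize expected profit under random demand.
# The optimal q satisfies P(D <= q) >= (price - cost) / (price - salvage).
# Prices and the demand distribution are made up for illustration.

price, cost, salvage = 10.0, 6.0, 2.0
prob = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.25, 4: 0.15}   # P(D = d)

critical_ratio = (price - cost) / (price - salvage)   # 0.5 here

def optimal_order():
    """Smallest q whose cumulative demand probability reaches the critical ratio."""
    cum = 0.0
    for d in sorted(prob):
        cum += prob[d]
        if cum >= critical_ratio - 1e-12:
            return d
    return max(prob)

def expected_profit(q):
    # revenue on units sold + salvage on leftovers - purchase cost of q units
    return sum(p * (price * min(d, q) + salvage * max(q - d, 0) - cost * q)
               for d, p in prob.items())

q = optimal_order()
print(q, expected_profit(q))
```

With these numbers the critical ratio is 0.5, so the optimal order is the smallest q with cumulative demand probability of at least one half.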

  1. Winds of Change: Toward a Realistic Vision for the Future of Cooperative Extension.

    ERIC Educational Resources Information Center

    Vitzthum, Edward F.

    The extension and research systems of land-grant institutions are in trouble. Six factors demonstrate the scope of the problem: significant cuts for agriculture in the Clinton administration budget; lawmakers opposed to extension research; federal budget deficit; state budget constraints; decreased power of agriculture in Congress; and…

  2. Program Budgeting and the School Administrator: A Review of Dissertations and Annotated Bibliography. Review Series, Number Two.

    ERIC Educational Resources Information Center

    Piele, Philip K.; Bunting, David G.

    This paper reviews the research findings of recent doctoral dissertations on program budgeting in education and describes the practical applications of these findings for school administration. Organized in nine chapters, the review discusses the problems and shortcomings associated with both traditional and program budgeting techniques, and…

  3. Setting National Priorities: The 1974 Budget.

    ERIC Educational Resources Information Center

    Fried, Edward R.; And Others

    This book attempts to make the problem of budgetary choice at the federal level more intelligible by classifying, analyzing, and projecting into the future the components of the budget in a way which makes it possible to put together several comprehensive alternative budgets, each illustrating from a different view how the Federal Government could…

  4. Scheduling for energy and reliability management on multiprocessor real-time systems

    NASA Astrophysics Data System (ADS)

    Qi, Xuan

Scheduling algorithms for multiprocessor real-time systems have been studied for years, with many well-recognized algorithms proposed. However, it is still an evolving research area, and many problems remain open due to their intrinsic complexities. With the emergence of multicore processors, it is necessary to re-investigate these scheduling problems and design/develop efficient algorithms for better system utilization, low scheduling overhead, high energy efficiency, and better system reliability. Focusing on cluster scheduling with optimal global schedulers, we study the utilization bound and scheduling overhead for a class of cluster-optimal schedulers. Then, taking energy/power consumption into consideration, we develop energy-efficient scheduling algorithms for real-time systems, especially for the proliferating embedded systems with limited energy budgets. As commonly deployed energy-saving techniques (e.g., dynamic voltage and frequency scaling (DVFS)) can significantly affect system reliability, we study schedulers that have intelligent mechanisms to recuperate system reliability to satisfy quality assurance requirements. Extensive simulations are conducted to evaluate the performance of the proposed algorithms on reduction of scheduling overhead, energy saving, and reliability improvement. The simulation results show that the proposed reliability-aware power management schemes can preserve system reliability while still achieving substantial energy savings.
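The energy/reliability tension the dissertation addresses can be illustrated with standard back-of-envelope models: for a fixed cycle count, dynamic energy falls roughly quadratically with frequency (power scales with roughly f³, time with 1/f), while transient-fault rates are commonly modeled as rising at lower frequency/voltage. The constants below are arbitrary; this is a sketch of the trade-off, not the dissertation's model.

```python
import math

# Normalized DVFS sketch (illustrative constants, not a validated model):
# energy to run w cycles at normalized frequency f is ~ w * f^2, while the
# transient-fault rate grows exponentially as frequency/voltage drop.

def energy(w, f):
    return w * f ** 2                      # energy at full speed (f = 1) is w

def fault_rate(f, lam0=1e-9, d=2.0, f_min=0.4):
    # Exponential fault-rate model: lower frequency => higher fault rate
    return lam0 * 10 ** (d * (1 - f) / (1 - f_min))

def reliability(w, f):
    # Probability the task (w cycles, running for time w/f) sees no fault
    return math.exp(-fault_rate(f) * w / f)

w = 1e6
for f in (1.0, 0.7, 0.5):
    print(f, energy(w, f), reliability(w, f))
```

Slowing down saves energy but both lengthens exposure time and raises the fault rate, which is exactly why reliability-aware power management adds recovery mechanisms.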

  5. Selecting a mix of prevention strategies against cervical cancer for maximum efficiency with an optimization program.

    PubMed

    Demarteau, Nadia; Breuer, Thomas; Standaert, Baudouin

    2012-04-01

Screening and vaccination against human papillomavirus (HPV) can protect against cervical cancer, but neither alone can provide 100% protection. This raises the important question of the most efficient combination of screening at specified time intervals and vaccination to prevent cervical cancer. Our objective was to identify the mix of cervical cancer prevention strategies (screening and/or vaccination against HPV) that achieves maximum reduction in cancer cases within a fixed budget. We assessed the optimal mix of strategies for the prevention of cervical cancer using an optimization program. The evaluation used two models. One was a Markov cohort model used as the evaluation model to estimate the costs and outcomes of 52 different prevention strategies. The other was an optimization model in which the results of each prevention strategy of the previous model were entered as input data. The latter model determined the combination of the different prevention options that minimizes cervical cancer under budget, screening coverage and vaccination coverage constraints. We applied the model in two countries with different healthcare organizations, epidemiology, screening practices, resource settings and treatment costs: the UK and Brazil. 100,000 women aged 12 years and above across the whole population over a 1-year period at steady state were included. The intervention was Papanicolaou (Pap) smear screening programmes and/or vaccination against HPV with the bivalent HPV 16/18 vaccine (Cervarix® [Cervarix is a registered trademark of the GlaxoSmithKline group of companies]). The main outcome measures were the optimal distribution of the population between different interventions (screening, vaccination, screening plus vaccination, and no screening or vaccination) with the resulting number of cervical cancer cases and associated costs.
In the base-case analysis (= same budget as today), the optimal prevention strategy would be, after introducing vaccination with a coverage rate of 80% in girls aged 12 years and retaining screening coverage at pre-vaccination levels (65% in the UK, 50% in Brazil), to increase the screening interval to 6 years (from 3) in the UK and to 5 years (from 3) in Brazil. This would result in a reduction of cervical cancer by 41% in the UK and by 54% in Brazil from pre-vaccination levels with no budget increase. Sensitivity analysis shows that vaccination alone at 80% coverage with no screening would achieve a cervical cancer reduction rate of 20% in the UK and 43% in Brazil compared with the pre-vaccination situation, with a budget reduction of 30% and 14%, respectively. In both countries, a sharp reduction in cervical cancer is seen when the vaccine coverage rate exceeds the maximum screening coverage rate, or when the screening coverage rate exceeds the maximum vaccine coverage rate, while maintaining the budget. As with any model, there are limitations to the value of predictions depending upon the assumptions made in each model. When spending the same budget that was used for screening and treatment of cervical cancer in the pre-vaccination era, the results of the optimization program show that it would be possible to substantially reduce the number of cases by implementing an optimal combination of HPV vaccination (80% coverage) and screening at pre-vaccination coverage (65% UK, 50% Brazil) while extending the screening interval to every 6 years in the UK and 5 years in Brazil.
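The kind of search such an optimization model performs can be mimicked by brute-force enumeration over population splits. The per-woman costs and case rates below are invented placeholders, not outputs of the paper's Markov model; the grid search stands in for the paper's formal optimization step.

```python
# Toy version of the optimization step: given per-person costs and cancer-case
# rates for each prevention option (numbers invented for illustration), find
# the population split that minimizes cases within a budget.

# option: (cost per woman, cancer cases per 1000 women under that option)
options = {
    "none":             (0.0,  30.0),
    "screening":        (50.0, 15.0),
    "vaccination":      (40.0, 12.0),
    "screen+vaccinate": (90.0,  6.0),
}

def best_mix(budget_per_1000, step=0.05):
    """Enumerate fractional splits of the population across options on a grid."""
    names = list(options)
    n = round(1 / step)
    best = None
    for a in range(n + 1):
        for b in range(n + 1 - a):
            for c in range(n + 1 - a - b):
                shares = [a, b, c, n - a - b - c]          # sums to n
                cost = sum(s / n * options[nm][0] * 1000
                           for s, nm in zip(shares, names))
                cases = sum(s / n * options[nm][1]
                            for s, nm in zip(shares, names))
                if cost <= budget_per_1000 and (best is None or cases < best[0]):
                    best = (cases, dict(zip(names, (s / n for s in shares))))
    return best

print(best_mix(50000))   # budget of 50 per woman, per 1000 women
```

With these toy numbers the optimizer spends the whole budget on vaccination and upgrades the affordable remainder to combined screening plus vaccination, mirroring the paper's finding that reallocating a fixed budget can cut cases substantially.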

  6. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure.

    PubMed

    Labschütz, Matthias; Bruckner, Stefan; Gröller, M Eduard; Hadwiger, Markus; Rautek, Peter

    2016-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem are hybrid data structures which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.
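The per-block decision that such hybrid structures automate can be caricatured in a few lines: for each block, pick whichever representation costs less. Real systems like JiTTree also weigh traversal performance and compile the chosen layout; the byte costs below are illustrative assumptions.

```python
# Minimal sketch of hybrid sparse-structure selection: per block, choose a
# dense array or a sparse (index, value) list by estimated memory cost.
# Byte costs are illustrative, not those of any real GPU data structure.

def pick_representation(block, value_bytes=4, index_bytes=4):
    n = len(block)
    nonzero = sum(1 for v in block if v != 0)
    dense_cost = n * value_bytes                       # store every value
    sparse_cost = nonzero * (value_bytes + index_bytes)  # store only nonzeros
    if dense_cost <= sparse_cost:
        return ("dense", dense_cost)
    return ("sparse", sparse_cost)

blocks = [
    [0, 0, 0, 0, 0, 0, 7, 0],   # very sparse: index/value list wins
    [1, 2, 3, 4, 0, 6, 7, 8],   # mostly full: dense array wins
]
for b in blocks:
    print(pick_representation(b))
```

Swapping the cost function (memory, expected traversal time, or a weighted mix) changes which structure wins per block, which is the "possible optimization criteria" point in the abstract.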

  7. Optimal allocation of HIV prevention funds for state health departments.

    PubMed

    Yaylali, Emine; Farnham, Paul G; Cohen, Stacy; Purcell, David W; Hauck, Heather; Sansom, Stephanie L

    2018-01-01

    To estimate the optimal allocation of Centers for Disease Control and Prevention (CDC) HIV prevention funds for health departments in 52 jurisdictions, incorporating Health Resources and Services Administration (HRSA) Ryan White HIV/AIDS Program funds, to improve outcomes along the HIV care continuum and prevent infections. Using surveillance data from 2010 to 2012 and budgetary data from 2012, we divided the 52 health departments into 5 groups varying by number of persons living with diagnosed HIV (PLWDH), median annual CDC HIV prevention budget, and median annual HRSA expenditures supporting linkage to care, retention in care, and adherence to antiretroviral therapy. Using an optimization and a Bernoulli process model, we solved for the optimal CDC prevention budget allocation for each health department group. The optimal allocation distributed the funds across prevention interventions and populations at risk for HIV to prevent the greatest number of new HIV cases annually. Both the HIV prevention interventions funded by the optimal allocation of CDC HIV prevention funds and the proportions of the budget allocated were similar across health department groups, particularly those representing the large majority of PLWDH. Consistently funded interventions included testing, partner services and linkage to care and interventions for men who have sex with men (MSM). Sensitivity analyses showed that the optimal allocation shifted when there were differences in transmission category proportions and progress along the HIV care continuum. The robustness of the results suggests that most health departments can use these analyses to guide the investment of CDC HIV prevention funds into strategies to prevent the most new cases of HIV.
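A Bernoulli process model of transmission, of the general kind used here, treats each risky contact as an independent chance of infection; a minimal sketch follows. The rates and efficacy values are invented for illustration and are not CDC parameter values.

```python
# Bernoulli-process transmission sketch: over n risky contacts, each with
# independent per-contact transmission probability t, the chance of infection
# is 1 - (1 - t)^n. Inputs below are illustrative only.

def p_infection(t, n):
    return 1.0 - (1.0 - t) ** n

def infections_averted(pop, t, n, efficacy):
    """Expected infections averted when an intervention scales t by (1 - efficacy)."""
    base = pop * p_infection(t, n)
    with_intervention = pop * p_infection(t * (1 - efficacy), n)
    return base - with_intervention

# e.g. 1000 susceptible people, 100 contacts each, 50%-efficacious intervention
print(infections_averted(1000, 0.01, 100, 0.5))
```

Embedding such per-group averted-infection estimates inside a budget-constrained optimizer is, in outline, how an allocation model ranks interventions across risk populations.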

  8. Optimal allocation of HIV prevention funds for state health departments

    PubMed Central

    Farnham, Paul G.; Cohen, Stacy; Purcell, David W.; Hauck, Heather; Sansom, Stephanie L.

    2018-01-01

    Objective To estimate the optimal allocation of Centers for Disease Control and Prevention (CDC) HIV prevention funds for health departments in 52 jurisdictions, incorporating Health Resources and Services Administration (HRSA) Ryan White HIV/AIDS Program funds, to improve outcomes along the HIV care continuum and prevent infections. Methods Using surveillance data from 2010 to 2012 and budgetary data from 2012, we divided the 52 health departments into 5 groups varying by number of persons living with diagnosed HIV (PLWDH), median annual CDC HIV prevention budget, and median annual HRSA expenditures supporting linkage to care, retention in care, and adherence to antiretroviral therapy. Using an optimization and a Bernoulli process model, we solved for the optimal CDC prevention budget allocation for each health department group. The optimal allocation distributed the funds across prevention interventions and populations at risk for HIV to prevent the greatest number of new HIV cases annually. Results Both the HIV prevention interventions funded by the optimal allocation of CDC HIV prevention funds and the proportions of the budget allocated were similar across health department groups, particularly those representing the large majority of PLWDH. Consistently funded interventions included testing, partner services and linkage to care and interventions for men who have sex with men (MSM). Sensitivity analyses showed that the optimal allocation shifted when there were differences in transmission category proportions and progress along the HIV care continuum. Conclusion The robustness of the results suggests that most health departments can use these analyses to guide the investment of CDC HIV prevention funds into strategies to prevent the most new cases of HIV. PMID:29768489

  9. Model-Based Thermal System Design Optimization for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-01-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
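The core loop of such automated validation is a search over model input parameters that minimizes a model-data discrepancy norm. The toy below grid-searches a two-parameter stand-in model; the real methodology uses formal optimization and automatically computed sensitivities rather than exhaustive search.

```python
# Automated calibration in miniature: search input parameters to minimize the
# squared discrepancy between model predictions and "test data". The model is
# a trivial linear-conductance stand-in, not the JWST thermal model.

def model(k, offset, loads):
    # predicted temperatures for a set of heat loads (toy physics)
    return [offset + q / k for q in loads]

def discrepancy(pred, data):
    return sum((p - d) ** 2 for p, d in zip(pred, data))

loads = [1.0, 2.0, 3.0]
data = [12.0, 14.0, 16.0]          # "test data" generated by k=0.5, offset=10

# Exhaustive grid search over (k, offset); a real pipeline would use a
# gradient-based or global optimizer driven by model sensitivities.
best = min(
    (discrepancy(model(k / 10, off, loads), data), k / 10, off)
    for k in range(1, 21) for off in range(0, 21)
)
print(best)    # smallest discrepancy and the parameters achieving it
```

The grid search recovers the generating parameters exactly here; on a real thermal model the same loop structure applies, but each evaluation is expensive, which is why sensitivities and smarter optimizers matter.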

  10. Model-based thermal system design optimization for the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-10-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.

  11. Two-step adaptive management for choosing between two management actions

    USGS Publications Warehouse

    Moore, Alana L.; Walker, Leila; Runge, Michael C.; McDonald-Madden, Eve; McCarthy, Michael A

    2017-01-01

    Adaptive management is widely advocated to improve environmental management. Derivations of optimal strategies for adaptive management, however, tend to be case specific and time consuming. In contrast, managers might seek relatively simple guidance, such as insight into when a new potential management action should be considered, and how much effort should be expended on trialing such an action. We constructed a two-time-step scenario where a manager is choosing between two possible management actions. The manager has a total budget that can be split between a learning phase and an implementation phase. We use this scenario to investigate when and how much a manager should invest in learning about the management actions available. The optimal investment in learning can be understood intuitively by accounting for the expected value of sample information, the benefits that accrue during learning, the direct costs of learning, and the opportunity costs of learning. We find that the optimal proportion of the budget to spend on learning is characterized by several critical thresholds that mark a jump from spending a large proportion of the budget on learning to spending nothing. For example, as sampling variance increases, it is optimal to spend a larger proportion of the budget on learning, up to a point: if the sampling variance passes a critical threshold, it is no longer beneficial to invest in learning. Similar thresholds are observed as a function of the total budget and the difference in the expected performance of the two actions. We illustrate how this model can be applied using a case study of choosing between alternative rearing diets for hihi, an endangered New Zealand passerine. Although the model presented is a simplified scenario, we believe it is relevant to many management situations. 
Managers often have relatively short time horizons for management, and might be reluctant to consider further investment in learning and monitoring beyond collecting data from a single time period.
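The learning-versus-implementation trade-off described above can be reproduced in a toy Monte Carlo: part of the budget buys noisy observations of each action, and the rest is spent implementing whichever action looked better. All payoffs, noise levels, and budgets below are invented for illustration.

```python
import random

# Monte Carlo sketch of splitting a fixed budget between a learning phase
# (trial observations of two actions) and an implementation phase (fund the
# better-looking action). Parameters are illustrative only.

def expected_value(learning_budget, total_budget=10, mu=(1.0, 2.0),
                   sigma=2.0, reps=4000, seed=1):
    """Average payoff; each observation costs one budget unit, half per action."""
    rng = random.Random(seed)
    obs = learning_budget // 2
    total = 0.0
    for _ in range(reps):
        if obs == 0:
            choice = rng.randrange(2)          # no learning: pick at random
        else:
            means = [sum(rng.gauss(m, sigma) for _ in range(obs)) / obs
                     for m in mu]
            choice = 0 if means[0] >= means[1] else 1
        # benefits accrue during learning too, then the rest is implemented
        learn_payoff = (learning_budget / 2) * (mu[0] + mu[1])
        total += learn_payoff + (total_budget - learning_budget) * mu[choice]
    return total / reps

values = {m: expected_value(m) for m in (0, 2, 4, 8)}
print(values)
```

With these settings a moderate learning investment beats both no learning and heavy learning, illustrating the interior optimum and the thresholds the paper characterizes.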

  12. Two-step adaptive management for choosing between two management actions.

    PubMed

    Moore, Alana L; Walker, Leila; Runge, Michael C; McDonald-Madden, Eve; McCarthy, Michael A

    2017-06-01

    Adaptive management is widely advocated to improve environmental management. Derivations of optimal strategies for adaptive management, however, tend to be case specific and time consuming. In contrast, managers might seek relatively simple guidance, such as insight into when a new potential management action should be considered, and how much effort should be expended on trialing such an action. We constructed a two-time-step scenario where a manager is choosing between two possible management actions. The manager has a total budget that can be split between a learning phase and an implementation phase. We use this scenario to investigate when and how much a manager should invest in learning about the management actions available. The optimal investment in learning can be understood intuitively by accounting for the expected value of sample information, the benefits that accrue during learning, the direct costs of learning, and the opportunity costs of learning. We find that the optimal proportion of the budget to spend on learning is characterized by several critical thresholds that mark a jump from spending a large proportion of the budget on learning to spending nothing. For example, as sampling variance increases, it is optimal to spend a larger proportion of the budget on learning, up to a point: if the sampling variance passes a critical threshold, it is no longer beneficial to invest in learning. Similar thresholds are observed as a function of the total budget and the difference in the expected performance of the two actions. We illustrate how this model can be applied using a case study of choosing between alternative rearing diets for hihi, an endangered New Zealand passerine. Although the model presented is a simplified scenario, we believe it is relevant to many management situations. 
Managers often have relatively short time horizons for management, and might be reluctant to consider further investment in learning and monitoring beyond collecting data from a single time period. © 2017 by the Ecological Society of America.

  13. Optimized blind gamma-ray pulsar searches at fixed computing budget

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pletsch, Holger J.; Clark, Colin J., E-mail: holger.pletsch@aei.mpg.de

The sensitivity of blind gamma-ray pulsar searches in multiple years worth of photon data, as from the Fermi LAT, is primarily limited by the finite computational resources available. Addressing this 'needle in a haystack' problem, here we present methods for optimizing blind searches to achieve the highest sensitivity at fixed computing cost. For both coherent and semicoherent methods, we consider their statistical properties and study their search sensitivity under computational constraints. The results validate a multistage strategy, where the first stage scans the entire parameter space using an efficient semicoherent method and promising candidates are then refined through a fully coherent analysis. We also find that for the first stage of a blind search incoherent harmonic summing of powers is not worthwhile at fixed computing cost for typical gamma-ray pulsars. Further enhancing sensitivity, we present efficiency-improved interpolation techniques for the semicoherent search stage. Via realistic simulations we demonstrate that overall these optimizations can significantly lower the minimum detectable pulsed fraction by almost 50% at the same computational expense.
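The multistage idea can be demonstrated on a toy signal: a cheap coarse grid locates a candidate frequency, and a fine grid around it refines the estimate. This caricature ignores photon weights, spin-down, and everything else that makes the real search hard; the signal model and grids are invented.

```python
import math

# Two-stage frequency search on a toy sinusoid: coarse scan, then refinement
# around the best coarse candidate. Illustrative only, unrelated to LAT data.

N = 400
true_f = 0.1234
signal = [math.sin(2 * math.pi * true_f * t) for t in range(N)]

def power(f):
    # coherent power at trial frequency f (squared Fourier amplitude)
    c = sum(s * math.cos(2 * math.pi * f * t) for t, s in enumerate(signal))
    d = sum(s * math.sin(2 * math.pi * f * t) for t, s in enumerate(signal))
    return c * c + d * d

# Stage 1: coarse scan (cheap, covers the whole band)
coarse = [0.05 + 0.005 * k for k in range(30)]        # 0.050 .. 0.195
best_coarse = max(coarse, key=power)

# Stage 2: fine refinement around the promising candidate (narrow, denser)
fine = [best_coarse - 0.005 + 0.0001 * k for k in range(100)]
best = max(fine, key=power)
print(best)
```

The budget question the paper optimizes is how to size the two grids (and their statistics) so total sensitivity is maximized at fixed total cost.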

  14. Model inversion via multi-fidelity Bayesian optimization: a new paradigm for parameter estimation in haemodynamics, and beyond.

    PubMed

    Perdikaris, Paris; Karniadakis, George Em

    2016-05-01

    We present a computational framework for model inversion based on multi-fidelity information fusion and Bayesian optimization. The proposed methodology targets the accurate construction of response surfaces in parameter space, and the efficient pursuit to identify global optima while keeping the number of expensive function evaluations at a minimum. We train families of correlated surrogates on available data using Gaussian processes and auto-regressive stochastic schemes, and exploit the resulting predictive posterior distributions within a Bayesian optimization setting. This enables a smart adaptive sampling procedure that uses the predictive posterior variance to balance the exploration versus exploitation trade-off, and is a key enabler for practical computations under limited budgets. The effectiveness of the proposed framework is tested on three parameter estimation problems. The first two involve the calibration of outflow boundary conditions of blood flow simulations in arterial bifurcations using multi-fidelity realizations of one- and three-dimensional models, whereas the last one aims to identify the forcing term that generated a particular solution to an elliptic partial differential equation. © 2016 The Author(s).
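Stripped of the Gaussian-process machinery, the budget logic is: rank many candidates with a cheap low-fidelity model, then spend the scarce high-fidelity evaluations on a shortlist. The algebraic stand-in models below are invented; a real implementation would use trained surrogates and an acquisition function rather than simple ranking.

```python
# Caricature of multi-fidelity model inversion: screen parameter candidates
# with a cheap, slightly biased low-fidelity misfit, then evaluate only the
# shortlist with the expensive high-fidelity misfit. Models are hypothetical.

def low_fidelity(x):     # cheap, biased approximation (minimum near x = 2.1)
    return (x - 2.1) ** 2

def high_fidelity(x):    # expensive, accurate misfit (true minimum at x = 2.0)
    return (x - 2.0) ** 2 + 0.1

candidates = [i / 10 for i in range(0, 51)]     # parameter grid 0.0 .. 5.0

# Stage 1: rank all candidates by the cheap model; keep the best few
shortlist = sorted(candidates, key=low_fidelity)[:5]

# Stage 2: spend the limited high-fidelity budget only on the shortlist
best = min(shortlist, key=high_fidelity)
print(best)
```

Only 5 of the 51 candidates ever touch the expensive model, yet the biased cheap model still funnels the search to the true optimum, which is the essence of the fidelity trade-off under a limited budget.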

  15. Model inversion via multi-fidelity Bayesian optimization: a new paradigm for parameter estimation in haemodynamics, and beyond

    PubMed Central

    Perdikaris, Paris; Karniadakis, George Em

    2016-01-01

    We present a computational framework for model inversion based on multi-fidelity information fusion and Bayesian optimization. The proposed methodology targets the accurate construction of response surfaces in parameter space, and the efficient pursuit to identify global optima while keeping the number of expensive function evaluations at a minimum. We train families of correlated surrogates on available data using Gaussian processes and auto-regressive stochastic schemes, and exploit the resulting predictive posterior distributions within a Bayesian optimization setting. This enables a smart adaptive sampling procedure that uses the predictive posterior variance to balance the exploration versus exploitation trade-off, and is a key enabler for practical computations under limited budgets. The effectiveness of the proposed framework is tested on three parameter estimation problems. The first two involve the calibration of outflow boundary conditions of blood flow simulations in arterial bifurcations using multi-fidelity realizations of one- and three-dimensional models, whereas the last one aims to identify the forcing term that generated a particular solution to an elliptic partial differential equation. PMID:27194481

  16. Large-scale budget applications of mathematical programming in the Forest Service

    Treesearch

    Malcolm Kirby

    1978-01-01

Mathematical programming applications in the Forest Service, U.S. Department of Agriculture, are growing. They are being used for widely varying problems: budgeting, land use planning, timber transport, road maintenance and timber harvest planning. Large-scale applications are being made in budgeting. The model that is described can be used by developing economies.

  17. Managing Campus Budgets in Trying Times: Did Practices Follow Principles?

    ERIC Educational Resources Information Center

    Burke, Joseph C.

    This study examined how closely 11 accepted principles for managing budgets were followed at 98 public college campuses during the first half of the 1990s, a period of budget problems. The colleges reviewed were in six states: California, Florida, Massachusetts, New York, Texas, and Wisconsin. The study found that (1) planning was not inclusive,…

  18. State Budgets and Federal Stimulus: A First Look

    ERIC Educational Resources Information Center

    Karolak, Eric

    2009-01-01

    The U.S. has slipped into a recession greater and deeper than anyone expected. At least 47 states are grappling with budget shortfalls, and 46 states already anticipate problems into 2010 and 2011. State legislatures have the daunting task of balancing budgets as revenues drop and the need for services increases. Whether one participates in a child…

  19. Influence of the Participatory Budgeting on the Infrastructural Development of the Territories in the Russian Federation

    ERIC Educational Resources Information Center

    Tsurkan, Marina V.; Sotskova, Svetlana I.; Aksinina, Olga S.; Lyubarskaya, Maria A.; Tkacheva, Oksana N.

    2016-01-01

    The relevance of the investigated problem is caused by the need for the advancing of participatory budgeting practice in the Russian Federation. Due to insufficient development of theoretical, scientific, and methodological aspects of the participatory budgeting, very few territories in the Russian Federation use this tool effectively. The most…

  20. Trade-offs and efficiencies in optimal budget-constrained multispecies corridor networks

    Treesearch

    Bistra Dilkina; Rachel Houtman; Carla P. Gomes; Claire A. Montgomery; Kevin S. McKelvey; Katherine Kendall; Tabitha A. Graves; Richard Bernstein; Michael K. Schwartz

    2016-01-01

    Conservation biologists recognize that a system of isolated protected areas will be necessary but insufficient to meet biodiversity objectives. Current approaches to connecting core conservation areas through corridors consider optimal corridor placement based on a single optimization goal: commonly, maximizing the movement for a target species across a...

  1. Energy budget and propagation of faults via shearing and opening using work optimization

    NASA Astrophysics Data System (ADS)

    Madden, Elizabeth H.; Cooke, Michele L.; McBeck, Jessica

    2017-08-01

    We present numerical models of faults propagating by work optimization in a homogeneous medium. These simulations allow quantification and comparison of the energy budgets of fault growth by shear versus tensile failure. The energy consumed by growth of a fault, Wgrow, propagating by in-line shearing is 76% of the total energy associated with that growth, while 24% is spent on frictional work during propagation. Wgrow for a fault propagating into intact rock by tensile failure, at an angle to the parent fault, consumes 60% of the work budget, while only 6% is consumed by frictional work associated with propagation. Following the conservation of energy, this leaves 34% of the energy budget available for other activities and suggests that out-of-plane propagation of faults in Earth's crust may release energy for other processes, such as permanent damage zone formation or rupture acceleration. Comparison of these estimates of Wgrow with estimates of the critical energy release rate and earthquake fracture energy at several scales underscores their theoretical similarities and their dependence on stress drop.

  2. Roy's safety-first portfolio principle in financial risk management of disastrous events.

    PubMed

    Chiu, Mei Choi; Wong, Hoi Ying; Li, Duan

    2012-11-01

    Roy pioneers the concept and practice of risk management of disastrous events via his safety-first principle for portfolio selection. More specifically, his safety-first principle advocates an optimal portfolio strategy generated from minimizing the disaster probability, while subject to the budget constraint and the mean constraint that the expected final wealth is not less than a preselected disaster level. This article studies the dynamic safety-first principle in continuous time and its application in asset and liability management. We reveal that the distortion resulting from dropping the mean constraint, as a common practice to approximate the original Roy's setting, either leads to a trivial case or changes the problem nature completely to a target-reaching problem, which produces a highly leveraged trading strategy. Recognizing the ill-posed nature of the corresponding Lagrangian method when retaining the mean constraint, we invoke a wisdom observed from a limited funding-level regulation of pension funds and modify the original safety-first formulation accordingly by imposing an upper bound on the funding level. This model revision enables us to solve completely the safety-first asset-liability problem by a martingale approach and to derive an optimal policy that follows faithfully the spirit of the safety-first principle and demonstrates a prominent nature of fighting for the best and preventing disaster from happening. © 2012 Society for Risk Analysis.
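    As a toy illustration of the safety-first principle described above, the one-period sketch below minimizes the disaster probability over the fraction held in a single risky asset, subject to a budget (fully invested) and a mean constraint, assuming normally distributed returns. All parameters are hypothetical, and the grid search stands in for the paper's continuous-time martingale analysis.

```python
from statistics import NormalDist

# One-period wealth: W = w0 * (1 + rf + x*(mu - rf) + x*sigma*Z), Z ~ N(0,1).
w0, rf, mu, sigma = 1.0, 0.02, 0.08, 0.20   # assumed market parameters
disaster, target = 0.90, 1.04               # disaster level, mean constraint

def ruin_prob(x):
    # Probability that final wealth falls below the disaster level.
    mean = w0 * (1 + rf + x * (mu - rf))
    std = w0 * x * sigma
    if std == 0:
        return 0.0 if mean >= disaster else 1.0
    return NormalDist(mean, std).cdf(disaster)

# Grid search over the risky fraction, keeping E[W] >= target.
feasible = [x / 1000 for x in range(1001)
            if w0 * (1 + rf + (x / 1000) * (mu - rf)) >= target]
best = min(feasible, key=ruin_prob)
print(best)
```

Consistent with the safety-first spirit, the optimizer takes only as much risk as the mean constraint forces on it: the minimizer sits at the smallest feasible risky fraction.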

  3. Primer Vector Optimization: Survey of Theory, new Analysis and Applications

    NASA Astrophysics Data System (ADS)

    Guzman

    This paper presents a preliminary study in developing a set of optimization tools for orbit rendezvous, transfer and station keeping. This work is part of a large-scale effort under way at NASA Goddard Space Flight Center and a.i. solutions, Inc. to build generic methods, which will enable missions with tight fuel budgets. Since no single optimization technique can solve efficiently all existing problems, a library of tools where the user could pick the method most suited for the particular mission is envisioned. The first trajectory optimization technique explored is Lawden's primer vector theory [Ref. 1]. Primer vector theory can be considered a byproduct of applying Calculus of Variations (COV) techniques to the problem of minimizing the fuel usage of impulsive trajectories. For an n-impulse trajectory, it involves the solution of n-1 two-point boundary value problems. In this paper, we look at some of the different formulations of the primer vector (dependent on the frame employed and on the force model). Also, the applicability of primer vector theory is examined in an effort to understand when and why the theory can fail. Specifically, since COV is based on "small variations", singularities in the linearized (variational) equations of motion along the arcs must be taken into account. These singularities are a recurring problem in analyses that employ "small variations" [Refs. 2, 3]. For example, singularities in the (2-body problem) variational equations along elliptic arcs occur when [Ref. 4]: 1) the difference between the initial and final times is a multiple of the reference orbit period, 2) the difference between the initial and final true anomalies is a multiple of π (kπ for k = 0, 1, 2, 3, ...), or 3) the time of flight is a minimum for the given difference in true anomaly. For the N-body problem, the situation is more complex and is still under investigation.
Several examples, such as the initialization of an orbit (ascent trajectory) and rotation of the line of apsides, are utilized as test cases. Recommendations, future work, and the possible addition of other optimization techniques are also discussed. References: [1] Lawden D.F., Optimal Trajectories for Space Navigation, Butterworths, London, 1963. [2] Wilson, R.S., Howell, K.C., and, Lo, M, "Optimization of Insertion Cost for Transfer Trajectories to Libration Point Orbits", AIAA/AAS Astrodynamics Specialist Conference, AAS 99-041, Girdwood, Alaska, August 16-19, 1999. [3] Goodson, T, "Monte-Carlo Maneuver Analysis for the Microwave Anisotropy Probe", AAS/AIAA Astrodynamics Specialist Conference, AAS 01-331, Quebec City, Canada, July 30 - August 2, 2001. [4] Stern, R.G., "Singularities in the Analytic Solution of the Linearized Variational Equations of Elliptical Motion", Report RE-8, May 1964, Experimental Astronomy Lab., Massachusetts Institute of Technology, Cambridge, Massachusetts.

  4. Application of the stepwise focusing method to optimize the cost-effectiveness of genome-wide association studies with limited research budgets for genotyping and phenotyping.

    PubMed

    Ohashi, J; Clark, A G

    2005-05-01

    The recent cataloguing of a large number of SNPs enables us to perform genome-wide association studies for detecting common genetic variants associated with disease. Such studies, however, generally have limited research budgets for genotyping and phenotyping. It is therefore necessary to optimize the study design by determining the most cost-effective numbers of SNPs and individuals to analyze. In this report we applied the stepwise focusing method, with a two-stage design, developed by Satagopan et al. (2002) and Saito & Kamatani (2002), to optimize the cost-effectiveness of a genome-wide direct association study using a transmission/disequilibrium test (TDT). The stepwise focusing method consists of two steps: a large number of SNPs are examined in the first focusing step, and then all the SNPs showing a significant P-value are tested again using a larger set of individuals in the second focusing step. In the framework of optimization, the numbers of SNPs and families and the significance levels in the first and second steps were regarded as variables to be considered. Our results showed that the stepwise focusing method achieves a distinct gain in power compared to a conventional method with the same research budget.
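    The two-stage trade-off can be mimicked with a back-of-the-envelope power calculation: under a fixed genotyping budget, screening all SNPs in a small first-stage sample and re-testing only the survivors frees resources for a much larger second stage. The effect size, costs, and sample sizes below are invented for illustration, and the power product is only a rough approximation (it ignores the correlation introduced by reusing stage-1 samples in the joint stage-2 test).

```python
from statistics import NormalDist

N = NormalDist()

def power(n, alpha, d=0.2):
    # One-sided z-test power with effect size d (normal approximation).
    return 1 - N.cdf(N.inv_cdf(1 - alpha) - d * n ** 0.5)

m, budget = 100_000, 10_000_000   # hypothetical SNP count and genotype budget
# One-stage design: spend the whole budget on a single pass over all SNPs.
n_one = budget / m
p_one = power(n_one, 0.05 / m)    # Bonferroni-corrected threshold
# Two-stage design: lenient first screen, stringent joint follow-up.
alpha1, n1 = 0.01, 60
n2 = (budget - m * n1) / (alpha1 * m)   # survivors expected under the null
p_two = power(n1, alpha1) * power(n1 + n2, 0.05 / m)
print(p_one < p_two)
```

Even this crude model reproduces the paper's qualitative result: for the same budget, the focused design has markedly higher power.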

  5. Underestimation of Project Costs

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, overoptimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck is systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Over optimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
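    The reference-class idea in the last paragraph — anchoring an estimate to the actual outcomes of a large group of similar past projects — can be sketched as taking a percentile of historical overrun ratios. The ratios and the percentile below are hypothetical.

```python
# Hypothetical overrun ratios (actual cost / estimated cost) for
# comparable past projects; real data would come from project archives.
history = [1.10, 1.25, 0.95, 1.60, 1.40, 1.05, 2.10, 1.30, 1.20, 1.80]

def reference_class_estimate(base_estimate, ratios, percentile=0.8):
    # Uplift the bottom-up estimate so that, judged against the reference
    # class, the budget would have covered `percentile` of past outcomes.
    s = sorted(ratios)
    k = min(len(s) - 1, int(percentile * len(s)))
    return base_estimate * s[k]

print(reference_class_estimate(10.0, history))
```

The uplift comes from history rather than from the advocates' own plan, which is exactly how the method sidesteps both over-optimism and biased advocacy.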

  6. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for Data Mining tools for use in designing and implementing an Engine Condition Monitoring System. From the total budget of $5,000, Tricia and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop Genetic Algorithm based search programs which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussion with NASA LERC personnel, we then prepared a proposal, which is being submitted to NASA for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA based multi-resolution optimal search. Wavelet processing is proposed to create a coarse resolution representation of the data, providing two advantages in GA based search: 1. We will have less data to begin with to form search sub-spaces. 2. It will have robustness against noise, because at every level of wavelet based decomposition we will be decomposing the signal into low-pass and high-pass bands.

  7. Outcomes from personal budgets in mental health: service users' experiences in three English local authorities.

    PubMed

    Larsen, John; Tew, Jerry; Hamilton, Sarah; Manthorpe, Jill; Pinfold, Vanessa; Szymczynska, Paulina; Clewett, Naomi

    2015-08-01

    In England, personal budgets are offered to eligible people with severe mental health problems to enable them to purchase what is helpful for their quality of life or recovery. However, in-depth insight into people's own perceptions of the outcomes is lacking. To investigate people's own reporting of outcomes from using personal budgets in relation to social care needs arising from severe mental health problems, a convenience sample of 47 individuals receiving personal budgets was recruited from three English local authorities. In-depth semi-structured interviews were subject to thematic framework analysis. Most participants identified positive outcomes across domains interconnected through individual life circumstances, with mental health and wellbeing, social participation and relationships, and confidence and skills most commonly reported. Some needed more support than others to identify goals and make use of the personal budget to take a more active part in society. Personal budgets can enable people to achieve outcomes that are relevant to them in the context of their lives, particularly through enhancing their wellbeing and social participation. Consideration should be given to distinguishing those individuals potentially requiring more support for engagement from those who can engage more independently to identify and pursue their goals.

  8. Integrating clinical and economic evidence in clinical guidelines: More needed than ever!

    PubMed

    Knies, Saskia; Severens, Johan L; Brouwer, Werner B F

    2018-04-26

    In recent years, several expensive new health technologies have been introduced. The availability of these technologies intensifies the discussion regarding their affordability at different decision-making levels. On the meso level, both hospitals and clinicians are facing budget constraints, resulting in a tension between different patients' interests. As such, it is crucial to make optimal use of the available resources. Different strategies are in place to deal with this problem, but decisions on a macro level on what to fund or not can limit the role and freedom of clinicians in their decisions on a micro level. At the same time, without central guidance regarding such decisions, micro level decisions may lead to inequities and undesirable treatment variation between clinicians and hospitals. The challenge is to find instruments that can balance both levels of decision making. Clinicians are becoming increasingly aware that their decisions to spend more resources (like time and budget) on one particular patient group reduce the resources available to other patients. Involving clinicians in thinking about the optimal use of limited resources, also in an attempt to bridge the world of economic reasoning and clinical practice, is therefore crucial. We argue that clinical guidelines may prove a clear vehicle for this by including both clinical and economic evidence to support the recommendations made. The development of such guidelines requires that clinicians and health economists cooperate with each other. The development of clinical guidelines which combine economic and clinical evidence should be stimulated, to balance central guidance and uniformity while maintaining necessary decentralized freedom. This is an opportunity to combine the reality of budgets and opportunity costs with clinical practice. Missing this opportunity risks either variation and inequity or central and necessarily crude measures. 
© 2018 The Authors Journal of Evaluation in Clinical Practice Published by John Wiley & Sons Ltd.

  9. A Closer look on Ineffectiveness in Riau Mainland Expenditure: Local Government Budget Case

    NASA Astrophysics Data System (ADS)

    Yandra, Alexsander; Roserdevi Nasution, Sri; Harsini; Wardi, Jeni

    2018-05-01

    This study discusses the ineffectiveness of expenditure by one Indonesian local government, the province of Riau, whose 2015 Local Government Budget (APBD) amounted to Rp. 10.7 trillion. According to data from the Financial Management Board and Regional Assets (BPKAD), Riau's APBD absorption stood at approximately 37.58% through October 2015; data from the Ministry of Home Affairs for January to December 2015 put it at 59.6%, the lowest in Indonesia. These percentages reflect weak budget implementation and suggest that the Riau government was less than optimal, and at times irrelevant, in spending its 2015 budget. Viewed through a theoretical approach to government spending, the implementation of public policies reveals an ineffectiveness of the budget that has implications for regional development, as the regional budget remains only a draft for achieving targets. Budget management in 2015 by the provincial administration through the Local Government Units (SKPD) shows a lack of synchronization between the Medium Term Development Plan and the SKPD work programs.

  10. Budgeting Based on Results: A Return-on-Investment Model Contributes to More Effective Annual Spending Choices

    ERIC Educational Resources Information Center

    Cooper, Kelt L.

    2011-01-01

    One major problem in developing school district budgets immune to cuts is the model administrators traditionally use--an expenditure model. The simplicity of this model is seductive: What were the revenues and expenditures last year? What are the expected revenues and expenditures this year? A few adjustments here and there and one has a budget.…

  11. Budgeting in health care systems.

    PubMed

    Maynard, A

    1984-01-01

    During the last decade there has been a recognition that all health care systems, public and private, are characterised by perverse incentives (especially moral hazard and third party pays) which generate inefficiency in the use of scarce economic resources. Inefficiency is unethical: doctors who use resources inefficiently deprive potential patients of care from which they could benefit. To eradicate unethical and inefficient practices two economic rules have to be followed: (i) no service should be provided if its total costs exceed its total benefits; (ii) if total benefits exceed total costs, the level of provision should be at that level at which the additional input cost (marginal cost) is equal to the additional benefits (marginal benefit). This efficiency test can be applied to health care systems, their component parts and the individuals (especially doctors) who control resource allocation within them. Unfortunately, all health care systems neither generate this relevant decision making data nor are they flexible enough to use it to affect health care decisions. There are two basic varieties of budgeting system: resource based and production targeted. The former generates obsession with cash limits and too little regard of the benefits, particularly at the margins, of alternative patterns of resource allocation. The latter generates undue attention to the production of processes of care and scant regard for costs, especially at the margins. Consequently, one set of budget rules may lead to cost containment regardless of benefits and the other set of budget rules may lead to output maximization regardless of costs. To close this circle of inefficiency it is necessary to evolve market-like structures. To do this a system of client group (defined broadly across all existing activities public and private) budgets is advocated with an identification of the budget holder who has the capacity to shift resources and seek out cost effective policies. 
Negotiated output targets with defined budgets and incentives for decision makers to economise in their use of resources are being incorporated into experiments in the health care systems of Western Europe and the United States. Undue optimism about the success of these experiments must be avoided because these problems have existed in the West and in the Soviet bloc for decades and efficient solutions are noticeable by their absence.
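    The two efficiency rules stated in the abstract can be checked with a toy provision schedule: expand the service only while marginal benefit covers marginal cost, then confirm total benefits exceed total costs at that level. The figures below are invented for illustration.

```python
# Illustrative service levels: marginal benefit falls as provision
# expands (diminishing returns), marginal cost is constant.
marginal_benefit = [100, 60, 35, 20, 10, 4]
marginal_cost = 25

# Rule (ii): provide up to the level where marginal benefit still
# equals or exceeds marginal cost.
level = sum(1 for mb in marginal_benefit if mb >= marginal_cost)
total_benefit = sum(marginal_benefit[:level])
total_cost = marginal_cost * level

# Rule (i): at that level, total benefits must exceed total costs.
print(level, total_benefit > total_cost)
```

Stopping at the third unit is exactly the marginal rule: the fourth unit would add 20 in benefit but 25 in cost.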

  12. Defer Maintenance, Invite Disaster

    ERIC Educational Resources Information Center

    Bowman, William W.

    1977-01-01

    An AGB- and NACUBO-sponsored survey showed that "wish lists" are accumulating overdue major maintenance projects because energy costs are consuming physical plant budgets. Problem areas are discussed: budget "guesstimation," preventive maintenance, deferred maintenance inventory, the APPA accounting format, resource allocation,…

  13. The NOx Budget Trading Program: A Collaborative, Innovative Approach to Solving a Regional Air Pollution Problem

    EPA Pesticide Factsheets

    This article examines the development and implementation of the NOx Budget Trading Program (NBP) and the lessons the Environmental Protection Agency has learned from this seasonal emissions cap-and-trade program.

  14. Initialization of Formation Flying Using Primer Vector Theory

    NASA Technical Reports Server (NTRS)

    Mailhe, Laurie; Schiff, Conrad; Folta, David

    2002-01-01

    In this paper, we extend primer vector analysis to formation flying. Optimization of the classical rendezvous or free-time transfer problem between two orbits using primer vector theory has been extensively studied for one spacecraft. However, an increasing number of missions are now considering flying a set of spacecraft in close formation. Missions such as the Magnetospheric MultiScale (MMS) and Leonardo-BRDF (Bidirectional Reflectance Distribution Function) need to determine strategies to transfer each spacecraft from the common launch orbit to their respective operational orbit. In addition, all the spacecraft must synchronize their states so that they achieve the same desired formation geometry over each orbit. This periodicity requirement imposes constraints on the boundary conditions that can be used for the primer vector algorithm. In this work we explore the impact of the periodicity requirement in optimizing each spacecraft transfer trajectory using primer vector theory. We first present our adaptation of primer vector theory to formation flying. Using this method, we then compute the ΔV budget for each spacecraft subject to different formation endpoint constraints.

  15. Optimization of Exposure Time Division for Multi-object Photometry

    NASA Astrophysics Data System (ADS)

    Popowicz, Adam; Kurek, Aleksander R.

    2017-09-01

    Optical observations of wide fields of view entail the problem of selecting the best exposure time. As many objects are usually observed simultaneously, the quality of photometry of the brightest ones is always better than that of the dimmer ones, even though all of them are frequently equally interesting for astronomers. Thus, measuring all objects with the highest possible precision is desirable. In this paper, we present a new optimization algorithm, dedicated to dividing exposure time into sub-exposures, which enables photometry with a more balanced noise budget. The proposed technique increases the photometric precision of dimmer objects at the expense of the measurement fidelity of the brightest ones. We have tested the method on real observations using two telescope setups, demonstrating its usefulness and good consistency with theoretical expectations. The main application of our approach is a wide range of sky surveys, including ones performed by space telescopes. The method can be used to plan virtually any photometric observation of objects that show a wide range of magnitudes.
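    The trade-off the abstract describes can be caricatured with a simple noise model: dividing a fixed total exposure into more sub-exposures keeps the brightest object below saturation but charges the dimmest one extra read noise. The fluxes, read noise, and full-well value below are assumptions, and the equal-length division is a simplification of the paper's optimized split.

```python
import math

T, RN = 100.0, 10.0            # total time [s], read noise [e-] (assumed)
F_bright, F_dim = 5e4, 50.0    # photon rates [e-/s] (assumed)
sat = 1e6                      # detector full-well limit [e-] (assumed)

def snr(flux, t, nsub):
    # Photon + read noise SNR for nsub co-added sub-exposures of length t.
    sig = flux * t * nsub
    return sig / math.sqrt(sig + nsub * RN ** 2)

best = None
for nsub in range(1, 101):     # equal division into nsub sub-exposures
    t = T / nsub
    if F_bright * t > sat:     # bright object saturates: discard split
        continue
    q = min(snr(F_bright, t, nsub), snr(F_dim, t, nsub))
    if best is None or q > best[0]:
        best = (q, nsub)
print(best[1])
```

The search settles on the coarsest division that still keeps the bright star unsaturated, since every extra sub-exposure adds read noise that mostly penalizes the dim star — the balance the paper's algorithm optimizes more carefully.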

  16. Study on thick film spin-on carbon hardmask

    NASA Astrophysics Data System (ADS)

    Kim, Taeho; Kim, Youngmin; Hwang, Sunmin; Lee, Hyunsoo; Han, Miyeon; Lim, Sanghak

    2017-03-01

    A thick spin-on carbon hardmask (SOH) material is designed to overcome inherent problems of the amorphous deposited carbon layer (ACL) and thick photoresist. For ACL in use in the semiconductor production process, especially when film thickness from sub-micrometer up to a few micrometers is required, not only does its inherent low transparency at long wavelength light often cause alignment problems with under layers, but considerable variation of film thickness within a wafer can also cause patterning problems. To avoid these issues, a thick SOH is designed with monomers of high transparency and good solubility at the same time. In comparison with photoresist, the SOH has good etch resistance and high thermal stability, and it provides a wide process window of decreased film thickness and increased thermal budget up to 400°C after processes such as high temperature deposition of SiON. In order to achieve high thickness along with a uniform film, many solvent factors were considered, such as solubility parameter, surface tension, vapor pressure, and others. By optimizing these solvent factors, we were able to develop a product with good coating performance.

  17. Small business, cash budgets and general practice.

    PubMed

    Jackson, A R

    1991-01-01

    In practice management, general practice falls into the category of small business with all its attendant generic problems. Disciplined planning and good financial management are not often seen in small business. These are required if general practitioners are to continue (or return to) the provision of high quality medical services. An effective budget process, especially cash-flow budgeting, is the key to successful planning and financial management. Budgeting will bring Control, Co-ordination, and Credibility to your practice. It will enable you to set goals and to achieve them.

  18. Optimization strategies for sediment reduction practices on roads in steep, forested terrain

    USGS Publications Warehouse

    Madej, Mary Ann; Eschenbach, E.A.; Diaz, C.; Teasley, R.; Baker, K.

    2006-01-01

    Many forested steeplands in the western United States display a legacy of disturbances due to timber harvest, mining or wildfires, for example. Such disturbances have caused accelerated hillslope erosion, leading to increased sedimentation in fish-bearing streams. Several restoration techniques have been implemented to address these problems in mountain catchments, many of which involve the removal of abandoned roads and re-establishing drainage networks across road prisms. With limited restoration funds to be applied across large catchments, land managers are faced with deciding which areas and problems should be treated first, and by which technique, in order to design the most effective and cost-effective sediment reduction strategy. Currently most restoration is conducted on a site-specific scale according to uniform treatment policies. To create catchment-scale policies for restoration, we developed two optimization models - dynamic programming and genetic algorithms - to determine the most cost-effective treatment level for roads and stream crossings in a pilot study basin with approximately 700 road segments and crossings. These models considered the trade-offs between the cost and effectiveness of different restoration strategies to minimize the predicted erosion from all forest roads within a catchment, while meeting a specified budget constraint. The optimal sediment reduction strategies developed by these models performed much better than two strategies of uniform erosion control which are commonly applied to road erosion problems by land managers, with sediment savings increased by an additional 48 to 80 per cent. These optimization models can be used to formulate the most cost-effective restoration policy for sediment reduction on a catchment scale. Thus, cost savings can be applied to further restoration work within the catchment. 
    Nevertheless, the models are based on erosion rates measured on past restoration sites, and need to be updated as additional monitoring studies evaluate long-term basin response to erosion control treatments. Copyright © 2006 John Wiley & Sons, Ltd.
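    The catchment-scale trade-off between cost and effectiveness can be posed as a multiple-choice knapsack: each road segment receives exactly one treatment level, and the budget caps total cost. The dynamic program below is a generic sketch with invented numbers, not the formulation or data of the study.

```python
# Each road segment offers mutually exclusive treatment levels:
# (cost in $1000s, sediment kept out of streams in m^3). Values are
# illustrative, not measured erosion rates.
segments = [
    [(0, 0), (4, 30), (9, 55)],
    [(0, 0), (3, 25), (7, 40)],
    [(0, 0), (5, 45), (12, 70)],
]
BUDGET = 15

def best_plan(segments, budget):
    # dp[b] = max sediment saved with at most b cost units spent,
    # choosing one treatment level per segment (multi-choice knapsack).
    dp = [0] * (budget + 1)
    for options in segments:
        nxt = [-1] * (budget + 1)
        for b in range(budget + 1):
            for cost, saved in options:
                if cost <= b and dp[b - cost] + saved > nxt[b]:
                    nxt[b] = dp[b - cost] + saved
        dp = nxt
    return max(dp)

print(best_plan(segments, BUDGET))
```

Here the optimum treats every segment at its middle level rather than fully treating a subset — the kind of non-uniform policy the paper finds outperforms uniform treatment rules.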

  19. Minimum variance optimal rate allocation for multiplexed H.264/AVC bitstreams.

    PubMed

    Tagliasacchi, Marco; Valenzise, Giuseppe; Tubaro, Stefano

    2008-07-01

    Consider the problem of transmitting multiple video streams to fulfill a constant bandwidth constraint. The available bit budget needs to be distributed across the sequences in order to meet some optimality criteria. For example, one might want to minimize the average distortion or, alternatively, minimize the distortion variance, in order to keep almost constant quality among the encoded sequences. By working in the rho-domain, we propose a low-delay rate allocation scheme that, at each time instant, provides a closed form solution for either of the aforementioned problems. We show that minimizing the distortion variance instead of the average distortion leads, for each of the multiplexed sequences, to a coding penalty of less than 0.5 dB in terms of average PSNR. In addition, our analysis provides an explicit relationship between model parameters and this loss. In order to smooth the distortion also along time, we accommodate a shared encoder buffer to compensate for rate fluctuations. Although the proposed scheme is general, and it can be adopted for any video and image coding standard, we provide experimental evidence by transcoding bitstreams encoded using the state-of-the-art H.264/AVC standard. The results of our simulations reveal that it is possible to achieve distortion smoothing both in time and across the sequences, without sacrificing coding efficiency.
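    The closed-form flavour of such an allocation can be seen in a simplified model: if each sequence followed D_i(R_i) = a_i * 2**(-R_i) (a stand-in for the paper's rho-domain model), then zero distortion variance means equal distortion across sequences, and the rates solve in closed form under the shared bit budget.

```python
import math

# Hypothetical per-sequence rate-distortion models D_i(R_i) = a_i * 2**(-R_i).
a = [4.0, 16.0, 64.0]   # per-sequence complexity parameters (assumed)
R_total = 12.0          # shared bit budget per time instant

# Equal distortion D for all i with sum(R_i) = R_total gives
#   log2(D) = (sum_i log2(a_i) - R_total) / N
N = len(a)
log2D = (sum(math.log2(x) for x in a) - R_total) / N
rates = [math.log2(x) - log2D for x in a]
dists = [x * 2 ** (-r) for x, r in zip(a, rates)]
print(rates, dists)
```

More complex sequences (larger a_i) receive proportionally more bits so that every stream ends at the same distortion, which is exactly the minimum-variance criterion.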

  20. Just how important is a good season start? Overall team performance and financial budget of elite soccer clubs.

    PubMed

    Lago-Peñas, Carlos; Sampaio, Jaime

    2015-01-01

    The aim of the current study was (i) to identify how important a good season start is to elite soccer teams' performance and (ii) to examine whether this impact is related to the clubs' financial budget. The match performances and annual budgets of all teams were collected from the English FA Premier League, French Ligue 1, Spanish La Liga, Italian Serie A and German Bundesliga for three consecutive seasons (2010-2011 to 2012-2013). A k-means cluster analysis classified the clubs according to their budget as High Range Budget Clubs, Upper-Mid Range Budget Clubs, Lower-Mid Range Budget Clubs and Low Range Budget Clubs. Data were examined through linear regression models. Overall, the results suggested that the better the team performance at the beginning of the season, the better the ranking at the end of the season. However, the size of the effect depended on the clubs' annual budget, with lower budgets being associated with a greater importance of having a good season start (P < 0.01). Moreover, there were differences in trends across the different leagues. These variables can be used to develop accurate models to estimate final rankings. Conversely, Lower-Mid and Low Range Budget Clubs can benefit from fine-tuning preseason planning in order to accelerate the acquisition of optimal performances.
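    The budget clustering step can be reproduced with a plain one-dimensional k-means (Lloyd's algorithm); the budget figures below are invented, not the leagues' data.

```python
import random

random.seed(0)
# Hypothetical annual budgets (million euro) for 20 clubs.
budgets = [520, 480, 410, 250, 230, 210, 190, 120, 110, 105,
           95, 80, 70, 60, 55, 50, 45, 40, 35, 30]

def kmeans_1d(data, k, iters=50):
    # Lloyd's algorithm on scalars: assign each value to its nearest
    # centre, then move each centre to the mean of its members.
    centres = random.sample(data, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in data:
            groups[min(range(k), key=lambda i: abs(x - centres[i]))].append(x)
        centres = [sum(g) / len(g) if g else centres[i]
                   for i, g in enumerate(groups)]
    return sorted(centres)

print(kmeans_1d(budgets, 4))
```

The four sorted centres correspond to the Low, Lower-Mid, Upper-Mid and High Range Budget clusters used in the study's regressions.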

  1. Planning additional drilling campaign using two-space genetic algorithm: A game theoretical approach

    NASA Astrophysics Data System (ADS)

    Kumral, Mustafa; Ozer, Umit

    2013-03-01

    Grade and tonnage are the most important technical uncertainties in mining ventures because of the use of estimations/simulations, which are mostly generated from drill data. Open pit mines are planned and designed on the basis of the blocks representing the entire orebody. Each block has a different estimation/simulation variance, reflecting uncertainty to some extent. The estimation/simulation realizations are submitted to the mine production scheduling process. However, the use of a block model with varying estimation/simulation variances will lead to serious risk in the scheduling. In the setting of multiple simulations, the dispersion variances of blocks can be used to account for technical uncertainties. However, the dispersion variance cannot handle the uncertainty associated with the varying estimation/simulation variances of blocks. This paper proposes an approach that generates the configuration of the best additional drilling campaign so as to produce more homogeneous estimation/simulation variances of blocks. In other words, the objective is to find the best drilling configuration that minimizes grade uncertainty under a budget constraint. The uncertainty measure of the optimization process in this paper is the interpolation variance, which considers data locations and grades. The problem is expressed as a min-max problem that seeks the best worst-case performance, i.e., minimizing the interpolation variance of the block generating the maximum interpolation variance. Since the optimization model requires computing the interpolation variances of the blocks being simulated/estimated in each iteration, the problem cannot be solved by standard optimization tools. This motivates the use of a two-space genetic algorithm (GA) to solve the problem. The technique has two spaces: feasible drill hole configurations (minimization of interpolation variance) and drill hole simulations (maximization of interpolation variance). The two spaces interact to find a min-max solution iteratively. A case study was conducted to demonstrate the performance of the approach. The findings showed that the approach could be used to plan a new drilling campaign.
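The alternating structure of such a min-max search can be sketched as a toy loop, with plain random search standing in for the genetic algorithm and a hypothetical one-dimensional "drilling" problem; none of the names below come from the paper:

```python
import random

def worst_case(config, simulations, variance):
    # Inner maximization: the simulation yielding the largest uncertainty.
    return max(variance(config, s) for s in simulations)

def two_space_search(variance, sample_config, simulations, iters=200, seed=1):
    """Toy alternating min-max search in the spirit of a two-space scheme:
    one space proposes configurations (the minimizer), the other holds
    adversarial simulations (the maximizer). Random search replaces the GA."""
    rng = random.Random(seed)
    best = sample_config(rng)
    best_val = worst_case(best, simulations, variance)
    for _ in range(iters):
        cand = sample_config(rng)
        val = worst_case(cand, simulations, variance)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val

# Hypothetical toy problem: place one extra hole x in [0, 1] to minimize
# the worst-case squared distance to any simulated high-uncertainty spot s.
sims = [0.2, 0.5, 0.9]
cfg, val = two_space_search(lambda x, s: (x - s) ** 2,
                            lambda rng: rng.random(), sims)
```

In the toy problem the min-max optimum sits near the midpoint of the two extreme spots (x = 0.55), which the random search approaches as the iteration budget grows.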

  2. Technology Assessment in Support of the Presidential Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William; Mrozinski, Joe; Hua, Hook; Merida, Sofia; Shelton, Kacie; Adumitroaie, Virgil; Derleth, Jason; Silberg, Robert

    2006-01-01

    This document is a viewgraph presentation that contains: (1) pictorial description of lunar context, (2) Definition of base case, (3) Optimization results, (4) Effects of cost uncertainties for base case and different assumed annual budget levels and (5) Effects of temporal optimization.

  3. Cost containment for the public health.

    PubMed

    Eastaugh, Steven R

    2006-01-01

    The U.S. health care system has major problems with respect to patient access and cost control. Trimming excess hospital expenses and expanding public health activities are cost-effective. By budgeting well, with global budgets set for the high-cost sectors, the United States might emerge with lower tax hikes, a healthier population, better facilities, and enhanced access to service. Nations with global budgets have better health statistics and lower costs compared to the United States. With global budgets, these countries employ 75 to 85 percent fewer employees in administration and regulation, yet patient satisfaction is almost double the rate in the United States. Implementing a global budget for health care, or substantially raising taxes, is the basic choice faced in this country. Key words: global budget, cost control, cost containment.

  4. How Do the Location, Size and Budget of Open Space Conservation Affect Land Values?

    Treesearch

    JunJie Wu; Wenchao Xu; Ralph J. Alig

    2016-01-01

    In this article we present a model to examine the optimal location, size, and budget of open space conservation and the resulting impact on land values and local fiscal conditions in an urban area. Results indicate that open space conservation can transform the defining features of an urban landscape. A well-designed open space conservation program can improve...

  5. Fiscal Law, Incremental Funding, and Conditional Contracts.

    DTIC Science & Technology

    1985-01-22

    Resolutions 6. Appropriations Lapses ... B. Problems Created In Agencies By Congressional Budget Process 1. Program Instability and Waste... created by the Budget and Accounting Act of 1921. To aid the President in his task of producing a single comprehensive federal budget, BOB had been... and borrows money to meet outlay payments, not to meet the level of new obligational authority created by Congress each year. Although Secretary of

  6. Techniques for computing regional radiant emittances of the earth-atmosphere system from observations by wide-angle satellite radiometers, phase 3

    NASA Technical Reports Server (NTRS)

    Pina, J. F.; House, F. B.

    1975-01-01

    Radiometers on earth-orbiting satellites measure the exchange of radiant energy between the earth-atmosphere (E-A) system and space at observation points external to the E-A system. Observations by wide-angle spherical and flat radiometers are analyzed and interpreted with regard to the general problem of the earth energy budget (EEB) and to the problem of determining the energy budget of regions smaller than the field of view (FOV) of these radiometers.

  7. Towards robust optimal design of storm water systems

    NASA Astrophysics Data System (ADS)

    Marquez Calvo, Oscar; Solomatine, Dimitri

    2015-04-01

    In this study the focus is on the design of a storm water or combined sewer system. Such a system should be able to handle most storms properly, minimizing the damage caused by flooding when the system lacks the capacity to cope with rain water at peak times. This is a multi-objective optimization problem: we have to take into account the minimization of construction costs, the minimization of damage costs due to flooding, and possibly other criteria. One of the most important factors influencing the design of storm water systems is the expected amount of water to deal with. It is common for this infrastructure to be developed with the capacity to cope with events that occur once in, say, 10 or 20 years - so-called design rainfall events. However, rainfall is a random variable, and such uncertainty is typically not taken into account explicitly in optimization. Design rainfall data are based on historical rainfall records, but these records often rest on unreliable measurements or on insufficient historical information; moreover, rainfall patterns are changing regardless of the historical record. There are also other sources of uncertainty influencing design, for example, leakages in the pipes and accumulation of sediments in pipes. In the context of storm water or combined sewer system design or rehabilitation, a robust optimization technique should be able to find the best design (or rehabilitation plan) within the available budget while taking into account uncertainty in the variables that were used to design the system. In this work we consider various approaches to robust optimization proposed by various authors (Gabrel, Murat, and Thiele 2014; Beyer and Sendhoff 2007) and test a novel method, ROPAR (Solomatine 2012), to analyze robustness. References: Beyer, H.G., & Sendhoff, B. (2007). Robust optimization - A comprehensive survey. Comput. Methods Appl. Mech. Engrg., 3190-3218. Gabrel, V., Murat, C., & Thiele, A. (2014). Recent advances in robust optimization: An overview. European Journal of Operational Research, 471-483. Solomatine, D.P. (2012). Robust Optimization and Probabilistic Analysis of Robustness (ROPAR). http://www.unesco-ihe.org/hi/sol/papers/ROPAR.pdf.
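The kind of scenario-based selection loop that robust design methods build on can be sketched generically (this is an illustration, not the ROPAR procedure; the cost and damage models and all names are hypothetical):

```python
import random

def robust_design(candidates, cost, damage, rainfall_samples, budget):
    """Among candidate designs within budget, pick the one with the
    lowest mean damage over sampled rainfall events (a minimal
    scenario-based robustness criterion)."""
    feasible = [d for d in candidates if cost(d) <= budget]
    def expected_damage(d):
        return sum(damage(d, r) for r in rainfall_samples) / len(rainfall_samples)
    return min(feasible, key=expected_damage)

# Hypothetical example: pipe capacity d, construction cost 10*d,
# damage proportional to unserved rain volume max(0, r - d).
rng = random.Random(0)
rain = [rng.gauss(5, 2) for _ in range(500)]
best = robust_design([2, 4, 6, 8], lambda d: 10 * d,
                     lambda d, r: max(0.0, r - d), rain, budget=70)
```

Swapping the mean for a max over scenarios turns the same loop into a worst-case (min-max) robust criterion, which is one of the design choices the surveyed literature contrasts.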

  8. Leaky Roof? Tight Budget? No Problem!

    ERIC Educational Resources Information Center

    Szcygiel, Tony L.

    1998-01-01

    Examines the piece-by-piece approach to school re-roofing that can help alleviate both maintenance and budget concerns. Addresses the question of whether an entire new roof is required and discusses funding and why a single-ply roof is a good choice for partial replacement. (GR)

  9. Evaluating data worth for ground-water management under uncertainty

    USGS Publications Warehouse

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) the optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with the greatest net economic benefit for ground-water management.
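The four-step loop can be written down generically. All callables below (`manage`, `design_network`, `project_uncertainty`, `sampling_cost`) are hypothetical stand-ins for the chance-constrained management model, the network design model, and the associated cost calculations; only the loop structure follows the abstract:

```python
def evaluate_data_worth(budgets, manage, design_network, project_uncertainty,
                        sampling_cost):
    """Sketch of the four-step data-worth loop:
    manage(uncertainty) -> management cost under chance constraints;
    design_network(budget) -> sampling plan; project_uncertainty(plan) ->
    projected post-sampling model uncertainty (None = no new data)."""
    base_cost = manage(project_uncertainty(None))   # step 1: current uncertainty
    results = []
    for b in budgets:                               # steps 2-4, repeated
        plan = design_network(b)                    # step 2: best plan for budget b
        new_cost = manage(project_uncertainty(plan))  # step 3: re-optimize management
        worth = (base_cost - new_cost) - sampling_cost(plan)  # step 4: net benefit
        results.append((b, plan, worth))
    return max(results, key=lambda t: t[2])         # best alternative

# Toy illustration: uncertainty shrinks with sampling effort, management
# cost scales with uncertainty, sampling costs 5 per unit of effort.
best = evaluate_data_worth(
    budgets=[1, 2, 3, 4],
    manage=lambda u: 100 * u,
    design_network=lambda b: b,  # plan = sampling effort = budget
    project_uncertainty=lambda plan: 1.0 if plan is None else 1.0 / (1 + plan),
    sampling_cost=lambda plan: 5 * plan,
)
```

The toy numbers show the characteristic shape of the trade-off: net benefit rises while data still buy large reductions in management cost, then flattens once sampling costs eat the gains.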

  10. Steps Toward Optimal Competitive Scheduling

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Crawford, James; Khatib, Lina; Brafman, Ronen

    2006-01-01

    This paper is concerned with the problem of allocating a unit-capacity resource to multiple users within a pre-defined time period. The resource is indivisible, so that at most one user can use it at each time instance. However, different users may use it at different times. The users have independent, selfish preferences for when and for how long they are allocated this resource. Thus, they value different resource access durations differently, and they value different time slots differently. We seek an optimal allocation schedule for this resource. This problem arises in many institutional settings where, e.g., different departments, agencies, or personnel compete for a single resource. We are particularly motivated by the problem of scheduling NASA's Deep Space Satellite Network (DSN) among different users within NASA. Access to the DSN is needed for transmitting data from various space missions to Earth. Each mission has different needs for DSN time, depending on satellite and planetary orbits. Typically, the DSN is over-subscribed, in that not all missions will be allocated as much time as they want. This leads to various inefficiencies - missions spend much time and resource lobbying for their time, often exaggerating their needs. NASA, on the other hand, would like to make optimal use of this resource, ensuring that the good for NASA is maximized. This raises the thorny problem of how to measure the utility to NASA of each allocation. In the typical case, it is difficult for the central agency, NASA in our case, to assess the value of each interval to each user - this is really only known to the users who understand their needs. Thus, our problem is more precisely formulated as follows: find an allocation schedule for the resource that maximizes the sum of users' preferences, when the preference values are private information of the users. We bypass this problem by making the assumption that one can assign money to customers. This assumption is reasonable; a committee is usually in charge of deciding the priority of each mission competing for access to the DSN within a time period while scheduling. Instead, we can assume that the committee assigns a budget to each mission.
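Budget-mediated allocation of this kind can be illustrated with a toy mechanism (an assumption for illustration only, not the mechanism developed in the paper): each mission bids for time slots out of its committee-assigned budget, and each slot goes to the highest bid the mission can still afford:

```python
def allocate_slots(slots, bids, budgets):
    """Toy budget-mediated slot allocation: every slot is awarded to the
    highest affordable bid, and the winner's budget is debited."""
    remaining = dict(budgets)
    schedule = {}
    for slot in slots:
        # Missions that can still pay their bid for this slot.
        offers = [(bids[m].get(slot, 0), m) for m in remaining
                  if bids[m].get(slot, 0) <= remaining[m]]
        price, winner = max(offers)
        if price > 0:
            schedule[slot] = winner
            remaining[winner] -= price
    return schedule

# Hypothetical missions A and B competing for two slots: A outbids B on t1
# but exhausts its budget doing so, leaving t2 to B.
sched = allocate_slots(
    ["t1", "t2"],
    {"A": {"t1": 5, "t2": 4}, "B": {"t1": 3, "t2": 6}},
    {"A": 5, "B": 6},
)
```

Because bids are paid from finite budgets, a mission that exaggerates its value for one slot forfeits purchasing power elsewhere, which is the incentive effect the budget assumption is meant to capture.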

  11. Affordability and cost-effectiveness: decision-making on the cost-effectiveness plane.

    PubMed

    Sendi, P P; Briggs, A H

    2001-10-01

    Much recent research interest has focused on handling uncertainty in cost-effectiveness analysis, and in particular on the calculation of confidence intervals for incremental cost-effectiveness ratios (ICERs). Problems of interpretation when ICERs are negative have led to two important and related developments: the use of the net-benefit statistic and the presentation of uncertainty in cost-effectiveness analysis using acceptability curves. However, neither of these developments directly addresses the problem that decision-makers are constrained by a fixed budget and may not be able to fund new, more expensive interventions, even if they have been shown to represent good value for money. In response to this limitation, the authors introduce the 'affordability curve', which reflects the probability that a programme is affordable over a wide range of threshold budgets. The authors argue that the joint probability that an intervention is affordable and cost-effective is more useful for decision-making, since it captures both dimensions of the decision problem faced by those responsible for health service budgets. Copyright 2001 John Wiley & Sons, Ltd.
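The joint criterion can be estimated directly from simulated or bootstrapped replicates of incremental costs and effects. A minimal sketch (hypothetical data; cost-effectiveness expressed in the net-monetary-benefit form the abstract alludes to):

```python
import random

def joint_prob(delta_costs, delta_effects, wtp, budget):
    """Fraction of replicates in which a programme is both cost-effective
    at willingness-to-pay `wtp` (non-negative net monetary benefit,
    wtp*dE - dC >= 0) and affordable within `budget` (dC <= budget)."""
    hits = sum(1 for dc, de in zip(delta_costs, delta_effects)
               if wtp * de - dc >= 0 and dc <= budget)
    return hits / len(delta_costs)

# Hypothetical replicates of incremental cost (currency units) and
# incremental effect (QALYs), as might come from a bootstrap.
rng = random.Random(42)
dc = [rng.gauss(1000, 300) for _ in range(10000)]
de = [rng.gauss(0.05, 0.02) for _ in range(10000)]
p = joint_prob(dc, de, wtp=30000, budget=1500)
```

Sweeping `budget` while holding `wtp` fixed traces out the affordability dimension, and sweeping `wtp` at a fixed budget recovers a budget-constrained acceptability curve.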

  12. Optimizing cost-efficiency in mean exposure assessment - cost functions reconsidered

    PubMed Central

    2011-01-01

    Background Reliable exposure data are a vital concern in medical epidemiology and intervention studies. The present study addresses the medical researcher's need to spend the monetary resources devoted to exposure assessment with optimal cost-efficiency, i.e. to obtain the best possible statistical performance at a specified budget. A few previous studies have suggested mathematical optimization procedures based on very simple cost models; this study extends the methodology to cover even non-linear cost scenarios. Methods Statistical performance, i.e. efficiency, was assessed in terms of the precision of an exposure mean value, as determined in a hierarchical, nested measurement model with three stages. Total costs were assessed using a corresponding three-stage cost model, allowing costs at each stage to vary non-linearly with the number of measurements according to a power function. Using these models, procedures for identifying the optimally cost-efficient allocation of measurements under a constrained budget were developed and applied to 225 scenarios combining different sizes of unit costs, cost function exponents, and exposure variance components. Results Explicit mathematical rules for identifying the optimal allocation could be developed when cost functions were linear, while non-linear cost functions implied that parts of, or the entire, optimization procedure had to be carried out using numerical methods. For many of the 225 scenarios, the optimal strategy consisted of measuring on only one occasion for each of as many subjects as allowed by the budget. Significant deviations from this principle occurred if costs for recruiting subjects were large compared to costs for setting up measurement occasions and, at the same time, the between-subjects to within-subject variance ratio was small. In these cases, non-linearities had a profound influence on the optimal allocation and on the eventual size of the exposure data set.
    Conclusions The analysis procedures developed in the present study can be used for the informed design of exposure assessment strategies, provided that data are available on exposure variability and the costs of collecting and processing data. The present shortage of empirical evidence on costs and appropriate cost functions, however, impedes general conclusions on optimal exposure measurement strategies in different epidemiologic scenarios. PMID:21600023
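For modest problem sizes the constrained allocation can be found by direct search. A simplified two-stage sketch (subjects and occasions only, rather than the paper's three stages; all parameter values are hypothetical): the variance of the exposure mean with n subjects measured on k occasions each is sb2/n + sw2/(n*k), and costs follow power functions of the measurement counts.

```python
def optimal_allocation(sb2, sw2, c1, c2, a1, a2, budget, n_max=200, k_max=50):
    """Brute-force search for the number of subjects n and occasions per
    subject k minimizing Var(mean) = sb2/n + sw2/(n*k) subject to the
    power-function cost constraint c1*n**a1 + c2*(n*k)**a2 <= budget
    (a simplified two-stage version of the three-stage model)."""
    best = None
    for n in range(1, n_max + 1):
        for k in range(1, k_max + 1):
            if c1 * n ** a1 + c2 * (n * k) ** a2 <= budget:
                var = sb2 / n + sw2 / (n * k)
                if best is None or var < best[0]:
                    best = (var, n, k)
    return best

# Hypothetical scenario: recruiting a subject costs 10, each measurement 1,
# linear costs (a1 = a2 = 1), between/within variances 1.0 and 0.5.
var, n, k = optimal_allocation(sb2=1.0, sw2=0.5, c1=10, c2=1, a1=1, a2=1,
                               budget=300)
```

With these numbers the optimum takes more than one occasion per subject (n = 25, k = 2) because recruitment is expensive relative to repeat measurements; cheap recruitment pushes the solution toward the one-occasion-per-subject strategy the abstract describes.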

  13. Optimizing cost-efficiency in mean exposure assessment--cost functions reconsidered.

    PubMed

    Mathiassen, Svend Erik; Bolin, Kristian

    2011-05-21

    Reliable exposure data are a vital concern in medical epidemiology and intervention studies. The present study addresses the medical researcher's need to spend the monetary resources devoted to exposure assessment with optimal cost-efficiency, i.e. to obtain the best possible statistical performance at a specified budget. A few previous studies have suggested mathematical optimization procedures based on very simple cost models; this study extends the methodology to cover even non-linear cost scenarios. Statistical performance, i.e. efficiency, was assessed in terms of the precision of an exposure mean value, as determined in a hierarchical, nested measurement model with three stages. Total costs were assessed using a corresponding three-stage cost model, allowing costs at each stage to vary non-linearly with the number of measurements according to a power function. Using these models, procedures for identifying the optimally cost-efficient allocation of measurements under a constrained budget were developed and applied to 225 scenarios combining different sizes of unit costs, cost function exponents, and exposure variance components. Explicit mathematical rules for identifying the optimal allocation could be developed when cost functions were linear, while non-linear cost functions implied that parts of, or the entire, optimization procedure had to be carried out using numerical methods. For many of the 225 scenarios, the optimal strategy consisted of measuring on only one occasion for each of as many subjects as allowed by the budget. Significant deviations from this principle occurred if costs for recruiting subjects were large compared to costs for setting up measurement occasions and, at the same time, the between-subjects to within-subject variance ratio was small. In these cases, non-linearities had a profound influence on the optimal allocation and on the eventual size of the exposure data set.
    The analysis procedures developed in the present study can be used for the informed design of exposure assessment strategies, provided that data are available on exposure variability and the costs of collecting and processing data. The present shortage of empirical evidence on costs and appropriate cost functions, however, impedes general conclusions on optimal exposure measurement strategies in different epidemiologic scenarios.

  14. Cost effectiveness of the stream-gaging program in North Dakota

    USGS Publications Warehouse

    Ryan, Gerald L.

    1989-01-01

    This report documents the results of a cost-effectiveness study of the stream-gaging program in North Dakota. It is part of a nationwide evaluation of the stream-gaging program of the U.S. Geological Survey. One phase of evaluating cost effectiveness is to identify less costly alternative methods of simulating streamflow records. Statistical or hydrologic flow-routing methods were used as alternative methods to simulate streamflow records for 21 combinations of gaging stations from the 94-gaging-station network. The accuracy of the alternative methods was sufficient to consider discontinuing only one gaging station. Operation of the gaging-station network was evaluated by using the associated uncertainty in streamflow records. The evaluation was limited to the nonwinter operation of 29 gaging stations in eastern North Dakota. The current (1987) travel routes and measurement frequencies require a budget of about $248,000 and result in an average equivalent Gaussian spread in streamflow records of 16.5 percent. Changes in routes and measurement frequencies could optimally reduce the average equivalent Gaussian spread to 14.7 percent. Budgets evaluated ranged from $235,000 to $400,000. A $235,000 budget would increase the optimal average equivalent Gaussian spread from 14.7 to 20.4 percent, and a $400,000 budget could decrease it to 5.8 percent.

  15. A Climate Data Record (CDR) for the global terrestrial water budget: 1984–2010

    DOE PAGES

    Zhang, Yu; Pan, Ming; Sheffield, Justin; ...

    2018-01-12

    Closing the terrestrial water budget is necessary to provide consistent estimates of budget components for understanding water resources and changes over time. Given the lack of in situ observations of budget components at anything but local scale, merging information from multiple data sources (e.g., in situ observation, satellite remote sensing, land surface model, and reanalysis) through data assimilation techniques that optimize the estimation of fluxes is a promising approach. Conditioned on the current limited data availability, a systematic method is developed to optimally combine multiple available data sources for precipitation (P), evapotranspiration (ET), runoff (R), and the total water storage change (TWSC) at 0.5° spatial resolution globally and to obtain water budget closure (i.e., to enforce P - ET - R - TWSC = 0) through a constrained Kalman filter (CKF) data assimilation technique under the assumption that the deviation from the ensemble mean of all data sources for the same budget variable is used as a proxy of the uncertainty in individual water budget variables. The resulting long-term (1984–2010), monthly 0.5° resolution global terrestrial water cycle Climate Data Record (CDR) data set is developed under the auspices of the National Aeronautics and Space Administration (NASA) Earth System Data Records (ESDRs) program. This data set serves to bridge the gap between sparsely gauged regions and the regions with sufficient in situ observations in investigating the temporal and spatial variability in the terrestrial hydrology at multiple scales. The CDR created in this study is validated against in situ measurements like river discharge from the Global Runoff Data Centre (GRDC) and the United States Geological Survey (USGS), and ET from FLUXNET.
The data set is shown to be reliable and can serve the scientific community in understanding historical climate variability in water cycle fluxes and stores, benchmarking the current climate, and validating models.

  16. A Climate Data Record (CDR) for the global terrestrial water budget: 1984–2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yu; Pan, Ming; Sheffield, Justin

    Closing the terrestrial water budget is necessary to provide consistent estimates of budget components for understanding water resources and changes over time. Given the lack of in situ observations of budget components at anything but local scale, merging information from multiple data sources (e.g., in situ observation, satellite remote sensing, land surface model, and reanalysis) through data assimilation techniques that optimize the estimation of fluxes is a promising approach. Conditioned on the current limited data availability, a systematic method is developed to optimally combine multiple available data sources for precipitation (P), evapotranspiration (ET), runoff (R), and the total water storage change (TWSC) at 0.5° spatial resolution globally and to obtain water budget closure (i.e., to enforce P - ET - R - TWSC = 0) through a constrained Kalman filter (CKF) data assimilation technique under the assumption that the deviation from the ensemble mean of all data sources for the same budget variable is used as a proxy of the uncertainty in individual water budget variables. The resulting long-term (1984–2010), monthly 0.5° resolution global terrestrial water cycle Climate Data Record (CDR) data set is developed under the auspices of the National Aeronautics and Space Administration (NASA) Earth System Data Records (ESDRs) program. This data set serves to bridge the gap between sparsely gauged regions and the regions with sufficient in situ observations in investigating the temporal and spatial variability in the terrestrial hydrology at multiple scales. The CDR created in this study is validated against in situ measurements like river discharge from the Global Runoff Data Centre (GRDC) and the United States Geological Survey (USGS), and ET from FLUXNET.
The data set is shown to be reliable and can serve the scientific community in understanding historical climate variability in water cycle fluxes and stores, benchmarking the current climate, and validating models.

  17. A Climate Data Record (CDR) for the global terrestrial water budget: 1984-2010

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Pan, Ming; Sheffield, Justin; Siemann, Amanda L.; Fisher, Colby K.; Liang, Miaoling; Beck, Hylke E.; Wanders, Niko; MacCracken, Rosalyn F.; Houser, Paul R.; Zhou, Tian; Lettenmaier, Dennis P.; Pinker, Rachel T.; Bytheway, Janice; Kummerow, Christian D.; Wood, Eric F.

    2018-01-01

    Closing the terrestrial water budget is necessary to provide consistent estimates of budget components for understanding water resources and changes over time. Given the lack of in situ observations of budget components at anything but local scale, merging information from multiple data sources (e.g., in situ observation, satellite remote sensing, land surface model, and reanalysis) through data assimilation techniques that optimize the estimation of fluxes is a promising approach. Conditioned on the current limited data availability, a systematic method is developed to optimally combine multiple available data sources for precipitation (P), evapotranspiration (ET), runoff (R), and the total water storage change (TWSC) at 0.5° spatial resolution globally and to obtain water budget closure (i.e., to enforce P - ET - R - TWSC = 0) through a constrained Kalman filter (CKF) data assimilation technique under the assumption that the deviation from the ensemble mean of all data sources for the same budget variable is used as a proxy of the uncertainty in individual water budget variables. The resulting long-term (1984-2010), monthly 0.5° resolution global terrestrial water cycle Climate Data Record (CDR) data set is developed under the auspices of the National Aeronautics and Space Administration (NASA) Earth System Data Records (ESDRs) program. This data set serves to bridge the gap between sparsely gauged regions and the regions with sufficient in situ observations in investigating the temporal and spatial variability in the terrestrial hydrology at multiple scales. The CDR created in this study is validated against in situ measurements like river discharge from the Global Runoff Data Centre (GRDC) and the United States Geological Survey (USGS), and ET from FLUXNET. 
The data set is shown to be reliable and can serve the scientific community in understanding historical climate variability in water cycle fluxes and stores, benchmarking the current climate, and validating models.
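For a single linear closure constraint and an (assumed) diagonal error covariance, a constrained-Kalman-filter update reduces to a variance-weighted projection onto the constraint surface. A minimal sketch with hypothetical flux values and variances:

```python
def close_budget(x, var):
    """Project water-budget estimates x = [P, ET, R, TWSC] onto the
    closure constraint P - ET - R - TWSC = 0, spreading the residual in
    proportion to each component's (assumed diagonal) error variance.
    This is the x' = x - C a^T (a C a^T)^-1 (a x) update for a single
    linear constraint a with diagonal covariance C."""
    a = [1.0, -1.0, -1.0, -1.0]               # constraint coefficients
    residual = sum(ai * xi for ai, xi in zip(a, x))
    gain_denom = sum(ai * ai * vi for ai, vi in zip(a, var))
    return [xi - vi * ai * residual / gain_denom
            for ai, xi, vi in zip(a, x, var)]

# Hypothetical monthly fluxes (mm): residual = 100 - 60 - 30 - 5 = 5,
# which gets absorbed mostly by the high-variance P and ET estimates.
closed = close_budget([100.0, 60.0, 30.0, 5.0], [4.0, 4.0, 1.0, 1.0])
```

After the update the adjusted components satisfy the closure identity exactly, and the components with the largest assumed uncertainty absorb the largest corrections, which is the intuition behind using ensemble spread as an uncertainty proxy.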

  18. 78 FR 57153 - Proposed Information Collection Request; Comment Request; NOX

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-17

    ... to reduce emissions of nitrogen oxides (NO X ) from power plants and other large combustion sources...), a pervasive air pollution problem in many areas of the eastern United States. The NO X Budget... Collection Request; Comment Request; NOX Budget Trading Program To Reduce the Regional Transport of Ozone...

  19. 76 FR 21781 - Agency Information Collection Activities; Submission for OMB Emergency Review: Comment Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... plant re-purposing from professionals in the communities that have already faced the problems associated...) titled, ``Repurposed Auto Manufacturing Facilities Study,'' to the Office of Management and Budget (OMB... Department of Labor, Office of the Secretary, Office of Management and Budget, Room 10235, Washington, DC...

  20. The Budgeting Mechanism in Development Companies

    ERIC Educational Resources Information Center

    Kovaleva, Tatiana M.; Khvostenko, Oleg A.; Glukhova, Alla G.; Nikeryasova, Veronica V.; Gavrilov, Denis E.

    2016-01-01

    The relevance of the research problem stems from the present need for a unique, generalized, theoretically and methodically elaborated budgeting mechanism that disaggregates strategic-level aims down to the level of the company's structural units. The aim of the article is to develop methodical provisions and practical…

  1. Streamlining Local School Budgeting in Ohio.

    ERIC Educational Resources Information Center

    Christman, Larry H.

    In order to make financial management of Ohio schools simpler and more understandable to the public, this Citizens' Council report recommends structural changes in school district budgeting that would necessitate changes in state law. The major problem areas identified are the districts' use of the calendar year as fiscal year, the requirement…

  2. Understanding Atmospheric Carbon Budgets: Teaching Students Conservation of Mass

    ERIC Educational Resources Information Center

    Reichert, Collin; Cervato, Cinzia; Niederhauser, Dale; Larsen, Michael D.

    2015-01-01

    In this paper we describe student use of a series of connected online problem-solving activities to remediate atmospheric carbon budget misconceptions held by undergraduate university students. In particular, activities were designed to address a common misconception about conservation of mass when students assume a simplistic, direct relationship…

  3. Long-Range Budget Planning in Private Colleges and Universities

    ERIC Educational Resources Information Center

    Hopkins, David S. P.; Massy, William F.

    1977-01-01

    Computer models have greatly assisted budget planners at privately financed institutions in identifying and analyzing major financial problems. The implementation of one such model at Stanford University is described; it considers student aid expenses, indirect cost recovery, endowments, price elasticity of enrollment, and student/faculty ratios…

  4. DECISION-MAKING IN THE SCHOOLS: AN OUTSIDER’S VIEW,

    DTIC Science & Technology

    (*DECISION MAKING, EDUCATION), (*EDUCATION, MANAGEMENT PLANNING AND CONTROL), (*MANAGEMENT PLANNING AND CONTROL, EDUCATION), BUDGETS, MANAGEMENT ENGINEERING, PERSONNEL MANAGEMENT, STUDENTS, LEARNING, OPTIMIZATION

  5. Budgeted Interactive Learning

    DTIC Science & Technology

    2017-06-15

    the methodology of reducing the online-algorithm-selecting problem as a contextual bandit problem, which is yet another interactive learning...KH2016a] Kuan-Hao Huang and Hsuan-Tien Lin. Linear upper confidence bound algorithm for contextual bandit problem with piled rewards. In Proceedings

  6. Global change and conservation triage on National Wildlife Refuges

    USGS Publications Warehouse

    Johnson, Fred A.; Eaton, Mitchell; McMahon, Gerard; Nilius, Raye; Bryant, Mike; Case, Dave; Martin, Julien; Wood, Nathan J.; Taylor, Laura

    2015-01-01

    National Wildlife Refuges (NWRs) in the United States play an important role in the adaptation of social-ecological systems to climate change, land-use change, and other global-change processes. Coastal refuges are already experiencing threats from sea-level rise and other change processes that are largely beyond their ability to influence, while at the same time facing tighter budgets and reduced staff. We engaged in workshops with NWR managers along the U.S. Atlantic coast to understand the problems they face from global-change processes and began a multidisciplinary collaboration to use decision science to help address them. We are applying a values-focused approach to base management decisions on the resource objectives of land managers, as well as those of stakeholders who may benefit from the goods and services produced by a refuge. Two insights that emerged from our workshops were a conspicuous mismatch between the scale at which management can influence outcomes and the scale of environmental processes, and the need to consider objectives related to ecosystem goods and services that traditionally have not been explicitly considered by refuges (e.g., protection from storm surge). The broadening of objectives complicates the decision-making process, but also provides opportunities for collaboration with stakeholders who may have agendas different from those of the refuge, as well as an opportunity for addressing problems across scales. From a practical perspective, we recognized the need to (1) efficiently allocate limited staff time and budgets for short-term management of existing programs and resources under the current refuge design and (2) develop long-term priorities for acquiring or protecting new land/habitat to supplement or replace the existing refuge footprint and thus sustain refuge values as the system evolves over time. 
Structuring the decision-making problem in this manner facilitated a better understanding of the issues of scale and suggested that a long-term solution will require a significant reassessment of objectives to better reflect the comprehensive values of refuges to society. We discuss some future considerations to integrate these two problems into a single framework by developing novel optimization approaches for dynamic problems that account for uncertainty in future conditions.

  7. History of satellite missions and measurements of the Earth Radiation Budget (1957-1984)

    NASA Technical Reports Server (NTRS)

    House, F. B.; Gruber, A.; Hunt, G. E.; Mecherikunnel, A. T.

    1986-01-01

    The history of satellite missions and their measurements of the earth radiation budget from the beginning of the space age until the present time is reviewed. The survey emphasizes the early struggle to develop instrument systems to monitor reflected shortwave and emitted longwave exitances from the earth, and the problems associated with the interpretation of these observations from space. In some instances, valuable data sets were developed from satellite measurements whose instruments were not specifically designed for earth radiation budget observations.

  8. Big Dam Era: A Legislative and Institutional History of the Pick-Sloan Missouri Basin Program

    DTIC Science & Technology

    1993-01-01

    other problems in the upper tributary basins." Bureau of the Budget Director Harold D. Smith preferred an alternative approach. (Note: By...executive order in 1970, the Bureau of the Budget became the Office of Management and Budget.) In February 1944, Smith responded to a letter from Secretary...President Roosevelt. Smith said he had informed Roosevelt of the Corps proposal and it was not what the President wanted, "at least at the present." The

  9. Risk-sensitive choice in humans as a function of an earnings budget.

    PubMed Central

    Pietras, C J; Hackenberg, T D

    2001-01-01

    Risky choice in 3 adult humans was investigated across procedural manipulations designed to model energy-budget manipulations conducted with nonhumans. Subjects were presented with repeated choices between a fixed and a variable number of points. An energy budget was simulated by use of an earnings budget, defined as the number of points needed within a block of trials for points to be exchanged for money. During positive earnings-budget conditions, exclusive preference for the fixed option met the earnings requirement. During negative earnings-budget conditions, exclusive preference for the certain option did not meet the earnings requirement, but choice for the variable option met the requirement probabilistically. Choice was generally risk averse (the fixed option was preferred) when the earnings budget was positive and risk prone (the variable option was preferred) when the earnings budget was negative. Furthermore, choice was most risk prone during negative earnings-budget conditions in which the earnings requirement was most stringent. Local choice patterns were also frequently consistent with the predictions of a dynamic optimization model, indicating that choice was simultaneously sensitive to short-term choice contingencies, current point earnings, and the earnings requirement. Overall, these results show that the patterns of risky choice generated by energy-budget variables can also be produced by choice contingencies that do not involve immediate survival, and that risky choice in humans may be similar to that shown in nonhumans when choice is studied under analogous experimental conditions. PMID:11516113

  10. A Model for Determining Optimal Governance Structure in DoD Acquisition Projects in a Performance-Based Environment

    DTIC Science & Technology

    2011-03-09

    task stability, technology application certainty, risk, and transaction-specific investments impact the selection of the optimal mode of governance. Our model views...U.S. Defense Industry. The 1990s were a perfect storm of technological change, consolidation, budget downturns, environmental uncertainty, and the

  11. A problem-oriented approach to journal selection for hospital libraries.

    PubMed Central

    Delman, B S

    1982-01-01

    This paper describes a problem-oriented approach to journal selection (PAJS), including general methodology, theoretical terms, and a brief description of results when the system was applied in three different hospitals. The PAJS system relates the objective information which the MEDLARS data base offers about the universe of biomedical literature to objective, problem-oriented information supplied by the hospital's medical records. The results were manipulated quantitatively to determine (1) the relevance of various journals to each of the hospital's defined significant information problems and (2) the overall utility of each journal to the institution as a whole. The utility information was plotted on a graph to identify the collection of journal titles which would be most useful to the given hospital. Attempts made to verify certain aspects of the whole process are also described. The results suggest that the methodology is generally able to provide an effective library response. The system optimizes resources vis-a-vis information and can be used for both budget allocation and justification. It offers an algorithm to which operations researchers can apply any one of a variety of mathematical programming methods. Although originally intended for librarians in the community hospital environment, the PAJS system is generalizable and has application potential in a variety of special library settings. PMID:6758893

  12. Design and optimization of interplanetary spacecraft trajectories

    NASA Astrophysics Data System (ADS)

    McConaghy, Thomas Troy

    Scientists involved in space exploration are always looking for ways to accomplish more with their limited budgets. Mission designers can decrease operational costs by crafting trajectories with low launch costs, short time-of-flight, or low propellant requirements. Gravity-assist maneuvers and low-thrust, high-efficiency ion propulsion can be of great help. This dissertation describes advances in methods to design and optimize interplanetary spacecraft trajectories, particularly for missions using gravity-assist maneuvers or low-thrust engines (or both). The first part of this dissertation describes a new, efficient, two-step methodology to design and optimize low-thrust gravity-assist trajectories. Models for the launch vehicle, solar arrays, and engines are introduced and several examples of optimized trajectories are presented. For example, a 3.7-year Earth-Venus-Earth-Mars-Jupiter flyby trajectory with maximized final mass is described. The way that the parameterization of the optimization problem affects convergence speed and reliability is also investigated. The choice of coordinate system is shown to make a significant difference. The second part of this dissertation describes a way to construct Earth-Mars cycler trajectories---periodic orbits that repeatedly encounter Earth and Mars, yet require little or no propellant. We find that well-known cyclers, such as the Aldrin cycler, are special cases of a much larger family of cyclers. In fact, so many new cyclers are found that a comprehensive naming system (nomenclature) is proposed. One particularly promising new cycler, the "ballistic S1L1 cycler", is analyzed in greater detail.

  13. Multiobjective optimization applied to structural sizing of low cost university-class microsatellite projects

    NASA Astrophysics Data System (ADS)

    Ravanbakhsh, Ali; Franchini, Sebastián

    2012-10-01

    In recent years, there has been continuing interest in the participation of university research groups in space technology studies by means of their own microsatellites. The involvement in such projects has some inherent challenges, such as limited budget and facilities. Also, because the main objective of these projects is educational, there are usually uncertainties regarding their in-orbit mission and scientific payloads at the early phases of the project. On the other hand, there are predetermined limitations on their mass and volume budgets, since most of them are launched as auxiliary payloads, which reduces launch costs considerably. The satellite structure subsystem is the one most affected by the launcher constraints. This can affect different aspects, including dimensions, strength and frequency requirements. In this paper, the main focus is on developing a structural design sizing tool containing not only the primary structures' properties as variables but also system-level variables such as payload mass budget and satellite total mass and dimensions. This approach enables the design team to obtain better insight into the design in an extended design envelope. The structural design sizing tool is based on analytical structural design formulas and appropriate assumptions, including both static and dynamic models of the satellite. Finally, a Genetic Algorithm (GA) multiobjective optimization is applied to the design space. The result is a Pareto-optimal front based on two objectives, minimum satellite total mass and maximum payload mass budget, which gives useful insight to the design team at the early phases of the design.
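    The core of any two-objective Pareto analysis like the one in this record is extracting the non-dominated set of candidate designs. A minimal sketch follows, with maximization of payload mass cast as minimization of its negative; the design points are invented for illustration, not taken from the paper.

    ```python
    def pareto_front(points):
        """Return the non-dominated subset, minimizing both objectives."""
        front = []
        for p in points:
            dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
            if not dominated:
                front.append(p)
        return front

    # (satellite total mass [kg], -payload mass budget [kg]); hypothetical designs
    designs = [(50, -8), (55, -12), (60, -11), (48, -5), (52, -12)]
    print(pareto_front(designs))  # trade-off curve between the two objectives
    ```

    A GA such as NSGA-II evolves the population toward this front; the filter above is only the dominance test applied once to a fixed set of points.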

  14. Robust allocation of a defensive budget considering an attacker's private information.

    PubMed

    Nikoofal, Mohammad E; Zhuang, Jun

    2012-05-01

    Attackers' private information is one of the main issues in defensive resource allocation games in homeland security. The outcome of a defense resource allocation decision critically depends on the accuracy of estimations about the attacker's attributes. However, terrorists' goals may be unknown to the defender, necessitating robust decisions by the defender. This article develops a robust-optimization game-theoretical model for identifying optimal defense resource allocation strategies for a rational defender facing a strategic attacker while the attacker's valuation of targets, being the most critical attribute of the attacker, is unknown but belongs to bounded distribution-free intervals. To the best of our knowledge, no previous research has applied robust optimization in homeland security resource allocation when uncertainty is defined in bounded distribution-free intervals. The key features of our model include (1) modeling uncertainty in attackers' attributes, where uncertainty is characterized by bounded intervals; (2) finding the robust-optimization equilibrium for the defender using concepts dealing with budget of uncertainty and price of robustness; and (3) applying the proposed model to real data. © 2011 Society for Risk Analysis.

  15. Chance-constrained multi-objective optimization of groundwater remediation design at DNAPLs-contaminated sites using a multi-algorithm genetically adaptive method

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan

    2017-05-01

    In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate modeling methods, support vector regression (SVR) and Kriging (KRG), were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability.

  16. Solving inverse problem for Markov chain model of customer lifetime value using flower pollination algorithm

    NASA Astrophysics Data System (ADS)

    Al-Ma'shumah, Fathimah; Permana, Dony; Sidarto, Kuntjoro Adji

    2015-12-01

    Customer Lifetime Value (CLV) is an important and useful concept in marketing. One of its benefits is to help a company budget marketing expenditure for customer acquisition and customer retention. Many mathematical models have been introduced to calculate CLV considering the customer retention/migration classification scheme. A fairly new class of these models, which will be described in this paper, uses Markov Chain Models (MCM). This class of models has the major advantage of being flexible enough to be modified to several different cases/classification schemes. In this model, the probabilities of customer retention and acquisition play an important role. From Pfeifer and Carraway (2000), the final formula for CLV obtained from an MCM usually contains a nonlinear form of the transition probability matrix. This nonlinearity makes the inverse problem of CLV difficult to solve. This paper aims to solve this inverse problem, yielding the approximate transition probabilities for the customers, by applying the Flower Pollination Algorithm, a metaheuristic optimization algorithm developed by Yang (2013). The major purpose of obtaining the transition probabilities is to set goals for marketing teams in keeping the relative frequencies of customer acquisition and customer retention.
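    The forward model that this inverse problem inverts is the Pfeifer-and-Carraway Markov-chain CLV: the discounted sum of expected per-period rewards, CLV = Σₜ [P/(1+d)]ᵗ R, which has the closed form (I - P/(1+d))⁻¹ R. A minimal sketch follows; the transition probabilities, rewards, and discount rate are invented for illustration.

    ```python
    import numpy as np

    # Two customer states: 0 = active, 1 = lapsed (hypothetical numbers).
    P = np.array([[0.7, 0.3],    # active -> stays active w.p. 0.7
                  [0.2, 0.8]])   # lapsed -> reacquired w.p. 0.2
    R = np.array([100.0, 0.0])   # expected per-period profit by state
    d = 0.10                     # per-period discount rate

    # CLV = sum_t [P/(1+d)]^t R  =  (I - P/(1+d))^{-1} R  (geometric series)
    M = P / (1 + d)
    clv = np.linalg.solve(np.eye(2) - M, R)
    print(clv)  # expected discounted lifetime value from each starting state
    ```

    The inverse problem in the paper runs the other way: given observed CLV values, search for the entries of P, which is nonlinear because P appears inside the matrix inverse.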

  17. Implications of optimization cost for balancing exploration and exploitation in global search and for experimental optimization

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Anirban

    Global optimization based on expensive, time-consuming simulations or experiments usually cannot be carried out to convergence, but must be stopped because of time constraints, or because the cost of additional function evaluations exceeds the benefit of improving the objective(s). This dissertation sets out to explore the implications of such budget and time constraints on the balance between exploration and exploitation and the decision of when to stop. Three different aspects are considered in terms of their effects on the balance between exploration and exploitation: 1) history of optimization, 2) fixed evaluation budget, and 3) cost as a part of the objective function. To this end, this research develops modifications to the surrogate-based optimization technique, the Efficient Global Optimization algorithm, that better control the balance between exploration and exploitation, along with stopping criteria facilitated by these modifications. Then the focus shifts to examining experimental optimization, which shares the issues of cost and time constraints. Through a study on optimization of thrust and power for a small flapping wing for micro air vehicles, important differences and similarities between experimental and simulation-based optimization are identified. The most important difference is that reduction of noise in experiments becomes a major time and cost issue, and a second difference is that parallelism as a way to cut cost is more challenging. The experimental optimization reveals the tendency of the surrogate to display optimistic bias near the surrogate optimum, and this tendency is then verified to also occur in simulation-based optimization.
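    In the Efficient Global Optimization algorithm mentioned above, the exploration-exploitation balance is mediated by the expected-improvement acquisition function: a point earns credit both for a promising surrogate mean (exploitation) and for high predictive uncertainty (exploration). The stdlib sketch below shows the standard criterion, not the dissertation's modified versions.

    ```python
    import math

    def expected_improvement(mu, sigma, y_best):
        """EI of sampling a point whose surrogate prediction is N(mu, sigma^2),
        given the current best (minimum) observed value y_best."""
        if sigma <= 0.0:
            return max(y_best - mu, 0.0)   # no uncertainty: deterministic gain
        z = (y_best - mu) / sigma
        cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # standard normal CDF
        pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
        return (y_best - mu) * cdf + sigma * pdf

    # Exploitation: mean well below the incumbent, low uncertainty...
    print(expected_improvement(mu=0.5, sigma=0.1, y_best=1.0))
    # ...exploration: mean slightly worse, but high uncertainty still earns credit.
    print(expected_improvement(mu=1.2, sigma=1.0, y_best=1.0))
    ```

    Reweighting the two terms of the last line is one common way to tilt the search toward exploration or exploitation under a fixed evaluation budget.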

  18. State Planning, Budgeting, and Accountability: Approaches for Higher Education. AAHE-ERIC/Higher Education Research Report No. 6, 1982.

    ERIC Educational Resources Information Center

    Floyd, Carol Everly

    Statewide planning for higher education and the approaches that states take to budgeting and accountability are reviewed in this monograph. Statewide planning involves identifying problems and collecting relevant data, analyzing interrelationships among variables, and choosing the most desirable alternatives to reach objectives. State-level higher…

  19. The Development of a Planning, Programming and Budgeting System. Technical Report.

    ERIC Educational Resources Information Center

    Appelquist, Claes-Goran; Zandren, S.

    In CERI's program on institutional management in higher education, eight universities were brought together to set up teams within their institutions to work on their respective pre-selected problem areas. The planning, programming and budgeting system (PPBS) was developed as a management tool which would improve effectiveness by increasing the…

  20. Pay for Play: Fees for Cocurricular Activities.

    ERIC Educational Resources Information Center

    Pepe, Thomas J.; Tufts, Alice L.

    1984-01-01

    As school budgets face serious problems, one area under examination is the cocurricular activities section of the school budget. Many districts are charging user fees to students participating in school sports, band, drama, and even elective courses. Since no direct reference is made to education in the United States Constitution, education is a…

  1. A Changing Federalism: The Condition of the States.

    ERIC Educational Resources Information Center

    Adams, E. Kathleen

    A majority of the 50 states are currently experiencing budget problems as a result of recent changes in the fiscal roles of federal, state, and local governments. Four major factors are responsible for the recent deterioration of state budgets: (1) reductions in federal aid to states and localities, (2) changes in the federal corporate and…

  2. Cost Accounting: Problems and Research Related to Cost Definitions and Collection of Data

    ERIC Educational Resources Information Center

    Lyons, John M.

    1978-01-01

    Recent evidence suggests that traditional cost analysis may not be the most appropriate way to justify educational budgets. This article suggests that using constructed cost models to develop operating budget requests can help ensure that the distinction between legitimate information needs and managerial autonomy is maintained. (LBH)

  3. Inventorying national forest resources...for planning-programing-budgeting system

    Treesearch

    Miles R. Hill; Elliot L. Amidon

    1968-01-01

    New systems for analyzing resource management problems, such as Planning-Programing-Budgeting, will require automated procedures to collect and assemble resource inventory data. A computer-oriented system called the Map Information Assembly and Display System, developed for this purpose, was tested on a National Forest in California. It provided information on eight forest...

  4. Employer-provided health insurance and the incidence of job lock: a literature review and empirical test.

    PubMed

    Rashad, Inas; Sarpong, Eric

    2008-12-01

    The incidence of 'job lock' in the health insurance context has long been viewed as a potential problem with employer-provided health insurance, a concept that was instrumental in the passage of the United States Consolidated Omnibus Budget Reconciliation Act of 1986, and later, the Health Insurance Portability and Accountability Act in 1996. Several recent developments in healthcare in the USA include declining healthcare coverage and a noticeable shift in the burden of medical care costs to employees. If these developments cause employees with employer-provided health insurance to feel locked into their jobs, optimal job matches in the labor force may not take place. A summary of the seminal papers in the current literature on the topic of job lock is given, followed by an empirical exercise using single individuals from the National Health Interview Survey (1997-2003) and the 1979 cohort of the National Longitudinal Survey of Youth (1989-2000). Econometric methods used include difference in differences, ordinary least squares and individual fixed effects models, in gauging the potential effect that employer-provided health insurance may have on job tenure and voluntary job departure. Our findings are consistent with recent assertions that there is some evidence of job lock. Individuals with employer-provided health insurance stay on the job 16% longer and are 60% less likely to voluntarily leave their jobs than those with insurance that is not provided by their employers. Productivity may not be optimal if incentives are altered owing to the existence of fringe benefits, such as health insurance. Further research in this area should determine whether legislation beyond the Consolidated Omnibus Budget Reconciliation Act and Health Insurance Portability and Accountability Act laws is needed.

  5. Stochastic control approaches for sensor management in search and exploitation

    NASA Astrophysics Data System (ADS)

    Hitchings, Darin Chester

    Recent improvements in the capabilities of autonomous vehicles have motivated their increased use in such applications as defense, homeland security, environmental monitoring, and surveillance. To enhance performance in these applications, new algorithms are required to control teams of robots autonomously and through limited interactions with human operators. In this dissertation we develop new algorithms for control of robots performing information-seeking missions in unknown environments. These missions require robots to control their sensors in order to discover the presence of objects, keep track of the objects, and learn what these objects are, given a fixed sensing budget. Initially, we investigate control of multiple sensors, with a finite set of sensing options and finite-valued measurements, to locate and classify objects given a limited resource budget. The control problem is formulated as a Partially Observed Markov Decision Problem (POMDP), but its exact solution requires excessive computation. Under the assumption that sensor error statistics are independent and time-invariant, we develop a class of algorithms using Lagrangian Relaxation techniques to obtain optimal mixed strategies using performance bounds developed in previous research. We investigate alternative Receding Horizon (RH) controllers to convert the mixed strategies to feasible adaptive-sensing strategies and evaluate the relative performance of these controllers in simulation. The resulting controllers provide superior performance to alternative algorithms proposed in the literature and obtain solutions to large-scale POMDP problems several orders of magnitude faster than optimal Dynamic Programming (DP) approaches with comparable performance quality. We extend our results for finite action, finite measurement sensor control to scenarios with moving objects. We use Hidden Markov Models (HMMs) for the evolution of objects, according to the dynamics of a birth-death process. 
We develop a new lower bound on the performance of adaptive controllers in these scenarios, develop algorithms for computing solutions to this lower bound, and use these algorithms as part of a RH controller for sensor allocation in the presence of moving objects. We also consider an adaptive search problem where sensing actions are continuous and the underlying measurement space is also continuous. We extend our previous hierarchical decomposition approach based on performance bounds to this problem and develop novel implementations of Stochastic Dynamic Programming (SDP) techniques to solve it. Our algorithms are nearly two orders of magnitude faster than previously proposed approaches and yield solutions of comparable quality. For supervisory control, we discuss how human operators can work with and augment robotic teams performing these tasks. Our focus is on how tasks are partitioned among teams of robots and how a human operator can make intelligent decisions for task partitioning. We explore these questions through the design of a game that involves robot automata controlled by our algorithms and a human supervisor who partitions tasks based on different levels of support information. This game can be used in human-subject experiments to explore the effect of information on the quality of supervisory control.
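    The Lagrangian Relaxation idea behind the performance bounds in this record can be illustrated on a generic budgeted-selection problem: dualize the resource constraint with a multiplier so the items decouple, and minimize the resulting dual over the multiplier. This is a toy sketch of the bounding technique, not the dissertation's POMDP formulation.

    ```python
    def lagrangian_bound(values, costs, budget, lam):
        """Upper bound on max sum(values[i]*x_i) s.t. sum(costs[i]*x_i) <= budget,
        obtained by pricing the budget constraint with multiplier lam >= 0."""
        # With the constraint moved into the objective, items decouple:
        # take every item whose value exceeds its priced cost.
        relaxed = sum(max(v - lam * c, 0.0) for v, c in zip(values, costs))
        return relaxed + lam * budget

    values, costs, budget = [6.0, 5.0, 4.0], [3.0, 2.0, 3.0], 4.0
    # Any lam >= 0 gives a valid bound; a coarse grid search tightens it.
    best = min(lagrangian_bound(values, costs, budget, lam)
               for lam in [x / 10 for x in range(0, 51)])
    print(best)  # tightest bound found over the multiplier grid
    ```

    The gap between this bound and the true constrained optimum (here 6.0, from taking the first item alone) is the price of relaxing the budget; RH controllers built on such bounds inherit that gap as a performance guarantee.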

  6. The price of conserving avian phylogenetic diversity: a global prioritization approach

    PubMed Central

    Nunes, Laura A.; Turvey, Samuel T.; Rosindell, James

    2015-01-01

    The combination of rapid biodiversity loss and limited funds available for conservation represents a major global concern. While there are many approaches for conservation prioritization, few are framed as financial optimization problems. We use recently published avian data to conduct a global analysis of the financial resources required to conserve different quantities of phylogenetic diversity (PD). We introduce a new prioritization metric (ADEPD) that After Downlisting a species gives the Expected Phylogenetic Diversity at some future time. Unlike other metrics, ADEPD considers the benefits to future PD associated with downlisting a species (e.g. moving from Endangered to Vulnerable in the International Union for Conservation of Nature Red List). Combining ADEPD scores with data on the financial cost of downlisting different species provides a cost–benefit prioritization approach for conservation. We find that under worst-case spending $3915 can save 1 year of PD, while under optimal spending $1 can preserve over 16.7 years of PD. We find that current conservation spending patterns are only expected to preserve one quarter of the PD that optimal spending could achieve with the same total budget. Maximizing PD is only one approach within the wider goal of biodiversity conservation, but our analysis highlights more generally the danger involved in uninformed spending of limited resources. PMID:25561665
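    In its simplest form, the cost-benefit prioritization this record describes amounts to ranking candidate downlisting actions by expected PD gained per dollar and spending the budget greedily. The sketch below is illustrative only; all species names, costs, and PD gains are invented, not taken from the avian data set.

    ```python
    # (species, cost in $, expected PD-years gained by downlisting) -- hypothetical
    actions = [
        ("sp_A", 2000.0, 40.0),
        ("sp_B", 500.0, 30.0),
        ("sp_C", 1500.0, 10.0),
        ("sp_D", 800.0, 35.0),
    ]

    def greedy_allocate(actions, budget):
        """Fund actions in decreasing order of PD gained per dollar."""
        chosen, total_pd = [], 0.0
        for name, cost, pd_gain in sorted(actions, key=lambda a: a[2] / a[1],
                                          reverse=True):
            if cost <= budget:
                budget -= cost
                chosen.append(name)
                total_pd += pd_gain
        return chosen, total_pd

    print(greedy_allocate(actions, budget=3000.0))
    ```

    The gap between informed and uninformed spending that the paper quantifies ($1 versus $3915 per PD-year) is exactly the gap between a ranking like this and arbitrary allocation of the same budget.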

  8. Optimizing reserve expansion for disjunct populations of San Joaquin kit fox

    Treesearch

    Robert G. Haight; Brian Cypher; Patrick A. Kelly; Scott Phillips; Katherine Ralls; Hugh P. Possingham

    2004-01-01

    Expanding habitat protection is a common strategy for species conservation. We present a model to optimize the expansion of reserves for disjunct populations of an endangered species. The objective is to maximize the expected number of surviving populations subject to budget and habitat constraints. The model accounts for benefits of reserve expansion in terms of...

  9. International Conference on Problems Related to the Stratosphere

    NASA Technical Reports Server (NTRS)

    Huntress, W., Jr.

    1977-01-01

    The conference focused on four main areas of investigation: laboratory studies and stratospheric chemistry and constituents, sources for and chemical budget of stratospheric halogen compounds, sources for and chemical budget of stratospheric nitrous oxide, and the dynamics of decision making on regulation of potential pollutants of the stratosphere. Abstracts of the scientific sessions of the conference as well as complete transcriptions of the panel discussions on sources for an atmospheric budget of halocarbons and nitrous oxide are included. The political, social and economic issues involving regulation of potential stratospheric pollutants were examined extensively.

  10. An Examination of the Decision-Making Processes Used by Superintendents in Reducing School District Budgets

    ERIC Educational Resources Information Center

    Slaven, Lori A.

    2014-01-01

    Purpose: The purpose of this study was to determine the degree of importance of Harvey et al.'s (1997) 13 problem-solving strategies for making retrenchment decisions on school district budgets as perceived by California superintendents of medium-sized school districts. Methodology: The subjects in the present study were 86 superintendents of…

  11. Changing Students' Perceptions of Inequality?: Combining Traditional Methods and a Budget Exercise to Facilitate a Sociological Perspective

    ERIC Educational Resources Information Center

    Garoutte, Lisa; Bobbitt-Zeher, Donna

    2011-01-01

    Budget exercises are frequently used in introductory and social problems courses to facilitate student understanding of income inequality. But do these exercises actually lead to greater sociological understanding? To explore this issue, the authors studied undergraduate students enrolled in introductory sociology courses during the 2008-2009…

  12. The NOₓ Budget Trading Program: a collaborative, innovative approach to solving a regional air pollution problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napolitano, Sam; Stevens, Gabrielle; Schreifels, Jeremy

    2007-11-15

    The NOₓ Budget Trading Program showed that regional cap-and-trade programs are adaptable to more than one pollutant, time period, and geographic scale, and can achieve compliance results similar to the Acid Rain Program. Here are 11 specific lessons that have emerged from the experience. (author)

  13. Their Budgets Slashed, Public Colleges Share the Pain with a Glut of Applicants

    ERIC Educational Resources Information Center

    Mangan, Katherine

    2008-01-01

    This article reports that as students flock to public two-year and four-year colleges amid an ailing economy, they find the colleges struggling with financial problems of their own. Midyear budget cuts are forcing many institutions to lay off faculty members, cut course sections, and freeze enrollment. Rising unemployment, slumping values of…

  14. Problems and Opportunities in School Financial Management: A Consultant's Perspective.

    ERIC Educational Resources Information Center

    Chabotar, Kent John

    1987-01-01

    Summarizes major problems in school financial management and suggests practical improvements to aid external reporting of financial data and internal management. Sections of the article describe these categories of problems: (1) budget presentation; (2) management control; (3) cost accounting; and (4) financial reporting. (PS)

  15. Strategic Gang Scheduling for Railroad Maintenance

    DOT National Transportation Integrated Search

    2012-08-14

    We address the railway track maintenance scheduling problem. The problem stems from the significant percentage of the annual budget invested by the railway industry for maintaining its railway tracks. The process requires consideration of human r...

  16. Modern Estimates of Global Water Cycle Fluxes

    NASA Astrophysics Data System (ADS)

    Rodell, M.; Beaudoing, H. K.; L'Ecuyer, T. S.; Olson, W. S.

    2014-12-01

    The goal of the first phase of the NASA Energy and Water Cycle Study (NEWS) Water and Energy Cycle Climatology project was to develop "state of the global water cycle" and "state of the global energy cycle" assessments based on data from modern ground and space based observing systems and data integrating models. Here we describe results of the water cycle assessment, including mean annual and monthly fluxes over continents and ocean basins during the first decade of the millennium. To the extent possible, the water flux estimates are based on (1) satellite measurements and (2) data-integrating models. A careful accounting of uncertainty in each flux was applied within a routine that enforced multiple water and energy budget constraints simultaneously in a variational framework, in order to produce objectively-determined, optimized estimates. Simultaneous closure of the water and energy budgets caused the ocean evaporation and precipitation terms to increase by about 10% and 5% relative to the original estimates, mainly because the energy budget required turbulent heat fluxes to be substantially larger in order to balance net radiation. In the majority of cases, the observed annual, surface and atmospheric water budgets over the continents and oceans close with much less than 10% residual. Observed residuals and optimized uncertainty estimates are considerably larger for monthly surface and atmospheric water budget closure, often nearing or exceeding 20% in North America, Eurasia, Australia and neighboring islands, and the Arctic and South Atlantic Oceans. The residuals in South America and Africa tend to be smaller, possibly because cold land processes are a non-issue. Fluxes are poorly observed over the Arctic Ocean, certain seas, Antarctica, and the Australasian and Indonesian Islands, leading to reliance on atmospheric analysis estimates. Other details of the study and future directions will be discussed.
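
    The variational closure step described above can be illustrated with a single linear constraint: each flux is adjusted in proportion to its stated uncertainty so that the budget closes exactly. A minimal single-constraint sketch with one surface water balance; the flux values and uncertainties are invented, and the full study enforces many water and energy constraints simultaneously:

```python
# Single-constraint variational budget closure: minimize the sum of
# variance-weighted squared adjustments sum(((x_i - y_i) / s_i)**2)
# subject to an exact balance sum(signs_i * x_i) == 0, solved with one
# Lagrange multiplier. Flux values and uncertainties are hypothetical.

def close_budget(estimates, sigmas, signs):
    residual = sum(a * y for a, y in zip(signs, estimates))
    denom = sum(a * a * s * s for a, s in zip(signs, sigmas))
    lam = residual / denom
    # Larger-uncertainty fluxes absorb more of the residual
    return [y - lam * a * s * s for y, s, a in zip(estimates, sigmas, signs)]

# Surface balance P - E - R - dS = 0 (mm/yr; illustrative values)
P, E, R, dS = 1000.0, 600.0, 350.0, 10.0          # raw residual: +40 mm/yr
adjusted = close_budget([P, E, R, dS],
                        sigmas=[50.0, 80.0, 30.0, 20.0],
                        signs=[+1, -1, -1, -1])
```

    The adjustment is largest for the flux with the largest stated uncertainty (here E), which is how a variational framework spreads a closure residual across terms.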

  17. Impact of a financial risk-sharing scheme on budget-impact estimations: a game-theoretic approach.

    PubMed

    Gavious, Arieh; Greenberg, Dan; Hammerman, Ariel; Segev, Ella

    2014-06-01

    As part of the process of updating the National List of Health Services in Israel, health plans (the 'payers') and manufacturers each provide estimates on the expected number of patients that will utilize a new drug. Currently, payers face major financial consequences when actual utilization is higher than the allocated budget. We suggest a risk-sharing model between the two stakeholders; if the actual number of patients exceeds the manufacturer's prediction, the manufacturer will reimburse the payers by a rebate rate of α from the deficit. In case of under-utilization, payers will refund the government at a rate of γ from the surplus budget. Our study objective was to identify the optimal early estimations of both 'players' prior to and after implementation of the risk-sharing scheme. Using a game-theoretic approach, in which both players' statements are considered simultaneously, we examined the impact of risk-sharing, within a given range of rebate proportions, on players' early budget estimations. When the manufacturer's rebate α is increased above 50%, manufacturers will announce a larger number, and health plans a lower number, of patients than they would without risk sharing, thus substantially decreasing the gap between their estimates. Increasing γ changes players' estimates only slightly. In reaction to applying a substantial risk-sharing rebate α on the manufacturer, both players are expected to adjust their budget estimates toward an optimal equilibrium. Increasing α is a better vehicle for reaching the desired equilibrium than increasing γ, as the manufacturer's rebate α substantially influences both players, whereas γ has little effect on the players' behavior.

  18. Automation--planning to implementation; the problems en route.

    PubMed Central

    Pizer, I H

    1976-01-01

    Once the major decision to automate library processes is made, there are a variety of problems which may be encountered before the planned system becomes operational. These include problems of personnel, budget, procurement of adjunct services, institutional priorities, and manufacturing uncertainties. Actual and potential difficulties are discussed. PMID:1247703

  19. Observations on Leadership, Problem Solving, and Preferred Futures of Universities

    ERIC Educational Resources Information Center

    Puncochar, Judith

    2013-01-01

    A focus on enrollments, rankings, uncertain budgets, and branding efforts to operate universities could have serious implications for discussions of sustainable solutions to complex problems and the decision-making processes of leaders. The Authentic Leadership Model for framing ill-defined problems in higher education is posited to improve the…

  20. Statistical and optimal learning with applications in business analytics

    NASA Astrophysics Data System (ADS)

    Han, Bin

    Statistical learning is widely used in business analytics to discover structure or exploit patterns from historical data, and build models that capture relationships between an outcome of interest and a set of variables. Optimal learning, on the other hand, solves the operational side of the problem by iterating between decision making and data acquisition/learning. All too often the two problems go hand in hand, exhibiting a feedback loop between statistics and optimization. We apply this statistical/optimal learning concept to a fundraising marketing campaign problem arising in many non-profit organizations. Many such organizations use direct-mail marketing to cultivate one-time donors and convert them into recurring contributors. Cultivated donors generate much more revenue than new donors, but also lapse with time, making it important to steadily draw in new cultivations. The direct-mail budget is limited, but better-designed mailings can improve success rates without increasing costs. We first apply statistical learning to analyze the effectiveness of several design approaches used in practice, based on a massive dataset covering 8.6 million direct-mail communications with donors to the American Red Cross during 2009-2011. We find evidence that mailed appeals are more effective when they emphasize disaster preparedness and training efforts over post-disaster cleanup. Including small cards that affirm donors' identity as Red Cross supporters is an effective strategy, while including gift items such as address labels is not. Finally, very recent acquisitions are more likely to respond to appeals that ask them to contribute an amount similar to their most recent donation, but this approach has an adverse effect on donors with a longer history. We show via simulation that a simple design strategy based on these insights has potential to improve success rates from 5.4% to 8.1%.
Given these findings, however, when a new scenario arises, new data must be acquired to update our model and decisions; this is studied within the optimal learning framework. The goal becomes discovering a sequential information collection strategy that learns the best campaign design alternative as quickly as possible. A regression structure is used to learn about a set of unknown parameters, alternating with optimization to design new data points. Such problems have been extensively studied in the ranking and selection (R&S) community, but traditional R&S procedures experience high computational costs when the decision space grows combinatorially. We present a value of information procedure for simultaneously learning unknown regression parameters and unknown sampling noise. We then develop an approximate version of the procedure, based on semi-definite programming relaxation, that retains good performance and scales better to large problems. We also prove the asymptotic consistency of the algorithm in the parametric model, a result that has not previously been available for even the known-variance case.

  1. WHAMII - An enumeration and insertion procedure with binomial bounds for the stochastic time-constrained traveling salesman problem

    NASA Technical Reports Server (NTRS)

    Dahl, Roy W.; Keating, Karen; Salamone, Daryl J.; Levy, Laurence; Nag, Barindra; Sanborn, Joan A.

    1987-01-01

    This paper presents an algorithm (WHAMII) designed to solve the Artificial Intelligence Design Challenge at the 1987 AIAA Guidance, Navigation and Control Conference. The problem under consideration is a stochastic generalization of the traveling salesman problem in which travel costs can incur a penalty with a given probability. The variability in travel costs leads to a probability constraint with respect to violating the budget allocation. Given the small size of the problem (eleven cities), an approach is considered that combines partial tour enumeration with a heuristic city insertion procedure. For computational efficiency during both the enumeration and insertion procedures, precalculated binomial probabilities are used to determine an upper bound on the actual probability of violating the budget constraint for each tour. The actual probability is calculated for the final best tour, and additional insertions are attempted until the actual probability exceeds the bound.
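
    The precalculated-bound idea can be sketched directly: if each of the n legs of a candidate tour independently incurs its penalty with probability p, and the remaining budget absorbs at most m penalties, the chance of violating the budget is a binomial upper tail. A toy version (parameters hypothetical; the paper's actual cost structure and bounding are richer):

```python
from math import comb

# Binomial screening sketch: probability that more than m of n independent
# legs incur a penalty, used as a quick feasibility check on a candidate
# tour against a chance constraint. Parameters here are hypothetical.

def violation_prob(n, p, m):
    """P(Binomial(n, p) > m)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m + 1, n + 1))

ALLOWED_RISK = 0.05
n_legs, p_penalty, max_penalties = 10, 0.1, 3
tour_ok = violation_prob(n_legs, p_penalty, max_penalties) <= ALLOWED_RISK
```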

  2. Deploying initial attack resources for wildfire suppression: spatial coordination, budget constraints, and capacity constraints

    Treesearch

    Yohan Lee; Jeremy S. Fried; Heidi J. Albers; Robert G. Haight

    2013-01-01

    We combine a scenario-based, standard-response optimization model with stochastic simulation to improve the efficiency of resource deployment for initial attack on wildland fires in three planning units in California. The optimization model minimizes the expected number of fires that do not receive a standard response--defined as the number of resources by type that...

  3. Radiation budget measurement/model interface

    NASA Technical Reports Server (NTRS)

    Vonderhaar, T. H.; Ciesielski, P.; Randel, D.; Stevens, D.

    1983-01-01

    This final report includes research results from the period February, 1981 through November, 1982. Two new results combine to form the final portion of this work. They are the work by Hanna (1982) and Stevens to successfully test and demonstrate a low-order spectral climate model and the work by Ciesielski et al. (1983) to combine and test the new radiation budget results from NIMBUS-7 with earlier satellite measurements. Together, the two related activities set the stage for future research on radiation budget measurement/model interfacing. Such combination of results will lead to new applications of satellite data to climate problems. The objectives of this research under the present contract are therefore satisfied. Additional research reported herein includes the compilation and documentation of the radiation budget data set at Colorado State University and the definition of climate-related experiments suggested after lengthy analysis of the satellite radiation budget experiments.

  4. A Primer on Responsibility Centre Budgeting and Responsibility Centre Management. Professional File, Winter 1999, Number 17.

    ERIC Educational Resources Information Center

    Lang, Daniel W.

    This monograph is a "how-to" manual on responsibility center budgeting (RCB) and responsibility center management (RCM) in the context of Canadian and U.S. institutions. It explains how RCB/RCM works in practice and discusses some of the problems encountered in implementing this strategy at a number of Canadian and U.S. universities. The…

  5. A Method for the Selection of Exploration Areas for Unconformity Uranium Deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, DeVerle P.; Zaluski, Gerard; Marlatt, James

    2009-06-15

    The method we propose employs two analyses: (1) exploration simulation and risk valuation and (2) portfolio optimization. The first analysis, implemented by the investment worth system (IWS), uses Monte Carlo simulation to integrate a wide spectrum of uncertain and varied components to a relative frequency histogram for net present value of the exploration investment, which is converted to a risk-adjusted value (RAV). Iterative rerunning of the IWS enables the mapping of the relationship of RAV to magnitude of exploration expenditure, X. The second major analysis uses RAV vs. X maps to identify that subset (portfolio) of areas that maximizes the RAV of the firm's multiyear exploration budget. The IWS, which is demonstrated numerically, consists of six components based on the geologic description of a hypothetical basin and project area (PA) and a mix of hypothetical and actual conditions of an unidentified area. The geology is quantified and processed by Bayesian belief networks to produce the geology-based inputs required by the IWS. An exploration investment of $60 M produced a highly skewed distribution of net present value (NPV), having mean and median values of $4,160 M and $139 M, respectively. For hypothetical mining firm Minex, the RAV of the exploration investment of $60 M is only $110.7 M. An RAV that is less than 3% of mean NPV reflects the aversion by Minex to risk as well as the magnitude of risk implicit to the highly skewed NPV distribution and the probability of 0.45 for capital loss. Potential benefits of initiating exploration of a portfolio of areas, as contrasted with one area, include increased marginal productivity of exploration as well as reduced probability for nondiscovery. For an exogenously determined multiyear exploration budget, a conceptual framework for portfolio optimization is developed based on marginal RAV exploration products for candidate PAs.
PORTFOLIO, software developed to implement the optimization, allocates exploration to PAs so that the RAV of the exploration budget is maximized. Moreover, PORTFOLIO provides a means to examine the impact of the magnitude of the budget on the composition of the exploration portfolio and the optimum allocation of exploration to the PAs that comprise the portfolio. Using fictitious data for five PAs, a numerical demonstration is provided of the use of PORTFOLIO to identify those PAs that comprise the optimum exploration portfolio and to optimally allocate the multiyear budget across portfolio PAs.
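
    The marginal-allocation logic described for PORTFOLIO can be imitated with a simple greedy loop: spend the budget one increment at a time, always in the project area whose RAV-versus-expenditure curve yields the largest marginal gain. The curves and figures below are invented for illustration and this is not the PORTFOLIO software itself:

```python
# Greedy marginal-allocation sketch: repeatedly give the next budget
# increment to the project area (PA) with the highest marginal RAV.
# With concave RAV curves this tends to equalize marginal products,
# mimicking an optimal allocation. Curves and figures are hypothetical.

def allocate(rav_curves, budget, step):
    """rav_curves: {area: f(expenditure) -> RAV}; greedy in fixed steps."""
    spend = {area: 0.0 for area in rav_curves}
    while budget >= step:
        gains = {a: f(spend[a] + step) - f(spend[a])
                 for a, f in rav_curves.items()}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break                       # no increment adds value anywhere
        spend[best] += step
        budget -= step
    return spend

curves = {"PA1": lambda x: 10 * x ** 0.5,   # diminishing marginal RAV
          "PA2": lambda x: 6 * x ** 0.5}
plan = allocate(curves, budget=100.0, step=10.0)
```

    Rerunning with different budgets shows how the portfolio's composition shifts with budget size, which is the kind of sensitivity analysis the record attributes to PORTFOLIO.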

  6. Use of mathematical decomposition to optimize investments in gas production and distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dougherty, E.L.; Lombardino, E.; Hutchinson, P.

    1986-01-01

    This paper presents an analytical approach based upon the decomposition method of mathematical programming for determining the optimal investment sequence in each year of a planning horizon for a group of reservoirs that produce gas and gas liquids through a trunk-line network and a gas processing plant. The paper describes the development of the simulation and investment planning system (SIPS) to perform the required calculations. Net present value (NPV) is maximized with the requirement that the incremental present value ratio (PWPI) of any investment in any reservoir be greater than a specified minimum value. A unique feature is a gas reservoir simulation model that aids SIPS in evaluating field development investments. The optimal solution supplies specified dry gas offtake requirements through time until the remaining reserves are insufficient to meet requirements economically. The sales value of recovered liquids contributes significantly to NPV, while the required spare gas-producing capacity reduces NPV. SIPS was used successfully for 4 years to generate annual investment plans and operating budgets, and to perform many special studies for a producing complex containing over 50 reservoirs. This experience is reviewed. In considering this large problem, SIPS converges to the optimal solution in 10 to 20 iterations. The primary factor that determines this number is how good the starting guess is. Although SIPS can generate a starting guess, beginning with a previous optimal solution ordinarily results in faster convergence. Computing time increases in proportion to the number of reservoirs because more than 90% of computing time is spent solving the reservoir subproblems.

  7. Access to anti-cancer drugs in India: is there a need to revise reimbursement policies?

    PubMed

    Haitsma, Gertruud; Patel, Himanshu; Gurumurthy, Parthasarathi; Postma, Maarten J

    2018-06-01

    The aim of this study was to examine the access of Indian cancer patients to optimum cancer care under selected government schemes by reviewing reimbursement schemes for cancer care in India. All cancer care reimbursement schemes in India were identified and three highly utilized schemes (VAS, RAS, CMCHS) were selected. Quality of breast, colorectal, lung, head & neck, and gastric cancer care was reviewed with respect to NCCN guidelines. Direct medical costs and shortage of budget in reimbursed amounts were calculated for each listed chemotherapy regimen. Medical oncology practice following the schemes' formularies is inferior to recommendations by the NCCN guidelines. Innovative treatment (targeted therapies) like trastuzumab, pertuzumab (breast), bevacizumab, cetuximab, panitumumab (colorectal), erlotinib, gefitinib, crizotinib, and nivolumab (lung) are either not reimbursed (VAS, CMCHS) or partially reimbursed (RAS). Average shortage of budget was found to be 43% (breast), 55% (colorectal), 74% (lung), 7% (head & neck), and 51% (gastric cancer). Policy makers should consider addition of newer treatments, exclusion of sub-optimal treatments, increments in per patient budget and optimization of supportive care, which may contribute to improvements in survival and quality of life for Indian cancer patients.

  8. Developing deterioration models for Nebraska bridges.

    DOT National Transportation Integrated Search

    2011-07-01

    Nebraska Bridge Management System (NBMS) was developed in 1999 to assist in optimizing budget allocation for the maintenance, rehabilitation and replacement needs of highway bridges. This requires the prediction of bridge deterioration to calcula...

  9. Chance-constrained multi-objective optimization of groundwater remediation design at DNAPLs-contaminated sites using a multi-algorithm genetically adaptive method.

    PubMed

    Ouyang, Qi; Lu, Wenxi; Hou, Zeyu; Zhang, Yu; Li, Shuai; Luo, Jiannan

    2017-05-01

    In this paper, a multi-algorithm genetically adaptive multi-objective (AMALGAM) method is proposed as a multi-objective optimization solver. It was implemented in the multi-objective optimization of a groundwater remediation design at sites contaminated by dense non-aqueous phase liquids. In this study, there were two objectives: minimization of the total remediation cost, and minimization of the remediation time. A non-dominated sorting genetic algorithm II (NSGA-II) was adopted to compare with the proposed method. For efficiency, the time-consuming surfactant-enhanced aquifer remediation simulation model was replaced by a surrogate model constructed by a multi-gene genetic programming (MGGP) technique. Similarly, two other surrogate-modeling methods, support vector regression (SVR) and Kriging (KRG), were employed to make comparisons with MGGP. In addition, the surrogate-modeling uncertainty was incorporated in the optimization model by chance-constrained programming (CCP). The results showed that, for the problem considered in this study, (1) the solutions obtained by AMALGAM incurred less remediation cost and required less time than those of NSGA-II, indicating that AMALGAM outperformed NSGA-II. It was additionally shown that (2) the MGGP surrogate model was more accurate than SVR and KRG; and (3) the remediation cost and time increased with the confidence level, which can enable decision makers to make a suitable choice by considering the given budget, remediation time, and reliability. Copyright © 2017 Elsevier B.V. All rights reserved.
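
    With two objectives to minimize (total remediation cost and remediation time), solvers such as AMALGAM and NSGA-II return a set of non-dominated designs rather than a single optimum. A minimal Pareto-front extraction sketch with hypothetical (cost, time) pairs:

```python
# Pareto-front extraction for two minimization objectives: a design is
# non-dominated if no other design is at least as good on both cost and
# time while differing from it. The (cost, time) values are hypothetical.

def pareto_front(designs):
    """designs: list of (cost, time) tuples, both to be minimized."""
    front = []
    for d in designs:
        dominated = any(o[0] <= d[0] and o[1] <= d[1] and o != d
                        for o in designs)
        if not dominated:
            front.append(d)
    return front

designs = [(100, 30), (80, 45), (120, 25), (90, 40), (110, 40)]
front = pareto_front(designs)
```

    A decision maker then picks one design from the front according to the available budget and acceptable remediation time, mirroring the confidence-level trade-off described in the abstract.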

  10. Balancing Detection and Eradication for Control of Epidemics: Sudden Oak Death in Mixed-Species Stands

    PubMed Central

    Ndeffo Mbah, Martial L.; Gilligan, Christopher A.

    2010-01-01

    Culling of infected individuals is a widely used measure for the control of several plant and animal pathogens but culling first requires detection of often cryptically-infected hosts. In this paper, we address the problem of how to allocate resources between detection and culling when the budget for disease management is limited. The results are generic but we motivate the problem for the control of a botanical epidemic in a natural ecosystem: sudden oak death in mixed evergreen forests in coastal California, in which species composition is generally dominated by a spreader species (bay laurel) and a second host species (coast live oak) that is an epidemiological dead-end in that it does not transmit infection but which is frequently a target for preservation. Using a combination of an epidemiological model for two host species with a common pathogen together with optimal control theory we address the problem of how to balance the allocation of resources for detection and epidemic control in order to preserve both host species in the ecosystem. Contrary to simple expectations our results show that an intermediate level of detection is optimal. Low levels of detection, characteristic of low effort expended on searching and detection of diseased trees, and high detection levels, exemplified by the deployment of large amounts of resources to identify diseased trees, fail to bring the epidemic under control. Importantly, we show that a slight change in the balance between the resources allocated to detection and those allocated to control may lead to drastic inefficiencies in control strategies. The results hold when quarantine is introduced to reduce the ingress of infected material into the region of interest. PMID:20856850

  11. A population-based model for priority setting across the care continuum and across modalities

    PubMed Central

    Segal, Leonie; Mortimer, Duncan

    2006-01-01

    Background The Health-sector Wide (HsW) priority setting model is designed to shift the focus of priority setting away from 'program budgets', which are typically defined by modality or disease-stage, and towards well-defined target populations with a particular disease/health problem. Methods The key features of the HsW model are i) a disease/health problem framework, ii) a sequential approach to covering the entire health sector, iii) comprehensiveness of scope in identifying intervention options and iv) the use of objective evidence. The HsW model redefines the unit of analysis over which priorities are set to include all mutually exclusive and complementary interventions for the prevention and treatment of each disease/health problem under consideration. The HsW model is therefore incompatible with the fragmented approach to priority setting across multiple program budgets that currently characterises allocation in many health systems. The HsW model employs standard cost-utility analyses and decision-rules with the aim of maximising QALYs contingent upon the global budget constraint for the set of diseases/health problems under consideration. It is recognised that the objective function may include non-health arguments that would imply a departure from simple QALY maximisation and that political constraints frequently limit degrees of freedom. In addressing these broader considerations, the HsW model can be modified to maximise value-weighted QALYs contingent upon the global budget constraint and any political constraints bearing upon allocation decisions. Results The HsW model has been applied in several contexts, most recently to osteoarthritis, which has demonstrated both its practical application and its capacity to derive clear evidence-based policy recommendations.
Conclusion Comparisons with other approaches to priority setting, such as Programme Budgeting and Marginal Analysis (PBMA) and modality-based cost-effectiveness comparisons, as typified by Australia's Pharmaceutical Benefits Advisory Committee process for the listing of pharmaceuticals for government funding, demonstrate the value added by the HsW model notably in its greater likelihood of contributing to allocative efficiency. PMID:16566841

  12. A Content Analysis of Defense Budget Rhetoric

    DTIC Science & Technology

    2011-06-01

    budget favored a programmatic and fiscal orientation (see Mayar, 1993; Art, 1985; Blackmon, 1975) Kanter examined the changes made to the President’s...views federal agencies. Shull examined the interaction between presidential support for agencies and Congress’ reaction to the President’s position on...micromanagement. “Micromanagement is best viewed as a problem of competition among political actors for policy control” (Mayar, 1993, p. 294). In

  13. Dropouts and Budgets: A Test of a Dropout Reduction Model among Students in Israeli Higher Education

    ERIC Educational Resources Information Center

    Bar-Am, Ran; Arar, Osama

    2017-01-01

    This article deals with the problem of student dropout during the first year in a higher education institution. To date, no low-cost model has been developed and tested to prevent dropout among engineering students. This case study was conducted among first-year students taking evening classes in two practical engineering colleges in Israel.…

  14. Parameter Estimation of Computationally Expensive Watershed Models Through Efficient Multi-objective Optimization and Interactive Decision Analytics

    NASA Astrophysics Data System (ADS)

    Akhtar, Taimoor; Shoemaker, Christine

    2016-04-01

    Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating the non-existence of a single optimal parameterization. Hence, many experts prefer a manual approach to calibration where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process, which include: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selection of one from numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address the above-mentioned challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric-based analytics framework is available as a decision support tool to choose a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit measure / metric based interactive framework for identification of a small subset (typically less than 10) of meaningful and diverse calibration alternatives from the numerous alternatives obtained in Stage 1.
Stage 3 uses an interactive visual analytics framework for decision support in selecting one parameter combination from the alternatives identified in Stage 2. HAMS is applied to calibrate the flow parameters of a SWAT (Soil and Water Assessment Tool) model designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric-based analytics can bridge the gap between the effective use of automatic and manual strategies for parameter estimation of computationally expensive watershed models.
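Stage 1's multi-objective search yields a set of non-dominated calibration alternatives. The underlying filtering idea can be sketched with a generic Pareto filter (this is not GOMORS itself; the alternative names and the two error objectives are invented for illustration):

```python
def pareto_front(solutions):
    """Return the non-dominated alternatives, treating every objective
    as an error to be minimized. solutions: {name: (obj1, obj2, ...)}."""
    def dominates(a, b):
        # a dominates b: no worse on all objectives, strictly better on one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return {n: v for n, v in solutions.items()
            if not any(dominates(w, v) for w in solutions.values())}

# Hypothetical calibration alternatives scored on
# (peak-flow error, low-flow error), both to be minimized.
alts = {"p1": (0.20, 0.90), "p2": (0.35, 0.40), "p3": (0.60, 0.35),
        "p4": (0.50, 0.50), "p5": (0.25, 0.80)}
print(sorted(pareto_front(alts)))  # p4 is dominated by p2 and drops out
```

A decision maker would then choose among the surviving trade-off alternatives, which is exactly the role Stages 2 and 3 play.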

  15. Challenging the Sacred Assumption: A Call for a Systemic Review of Army Aviation Maintenance

    DTIC Science & Technology

    2017-05-25

structure, training, equipping and sustainment. Each study intends to optimize the force structure to achieve a balance between the modernization and...operational budgets. Since 1994, Army Aviation force structures, training resources, available equipment and aircraft have changed significantly. Yet...and are focused on force structure, training, equipping and sustainment. Each study intends to optimize the force structure to achieve a balance

  16. Minding Your Business: How to Avoid the Seven Deadly Financial Pitfalls.

    ERIC Educational Resources Information Center

    Stephens, Keith

    1990-01-01

    Describes financial management problems typically encountered by child care center directors and owners. Offers suggestions for planning and management techniques to overcome problems of cash flow, budgeting, rising costs, underpricing, declining revenues, fee collection, and liquidity. (NH)

  17. Development of transportation asset management decision support tools : final report.

    DOT National Transportation Integrated Search

    2017-08-09

    This study developed a web-based prototype decision support platform to demonstrate the benefits of transportation asset management in monitoring asset performance, supporting asset funding decisions, planning budget tradeoffs, and optimizing resourc...

  18. Investigate plow blade optimization : executive summary.

    DOT National Transportation Integrated Search

    2015-08-01

Snow and ice management is the single largest expenditure in the maintenance budget for the Ohio Department of Transportation (ODOT), with an annual cost including labor, equipment, and materials reaching approximately $86 million. Given the curr...

  19. Optimizing automatic traffic recorders network in Minnesota.

    DOT National Transportation Integrated Search

    2016-01-01

Accurate traffic counts are important for budgeting, traffic planning, and roadway design. With thousands of centerline miles of roadways, it is not possible to install continuous counters at all locations of interest (e.g., intersections). There...

  20. Partial differential equations constrained combinatorial optimization on an adiabatic quantum computer

    NASA Astrophysics Data System (ADS)

    Chandra, Rishabh

Partial differential equation-constrained combinatorial optimization (PDECCO) problems are a mixture of continuous and discrete optimization problems. PDECCO problems have discrete controls, but since the partial differential equations (PDEs) are continuous, the optimization space is continuous as well. Such problems have several applications, such as gas/water network optimization, traffic optimization, and micro-chip cooling optimization. Currently, no efficient classical algorithm exists that guarantees a global minimum for PDECCO problems. A new mapping has been developed that transforms PDECCO problems, which have only linear PDEs as constraints, into quadratic unconstrained binary optimization (QUBO) problems that can be solved using an adiabatic quantum optimizer (AQO). The mapping is efficient: it scales polynomially with the size of the PDECCO problem, requires only one PDE solve to form the QUBO problem, and, if the QUBO problem is solved correctly and efficiently on an AQO, guarantees a globally optimal solution for the original PDECCO problem.
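The general idea of folding linear constraints into a QUBO is the quadratic penalty method, which can be sketched in miniature (an illustrative construction with a brute-force solver standing in for the annealer, not the paper's specific PDECCO mapping; the toy costs and constraint are invented):

```python
from itertools import product

def qubo_from_linear(c, A, b, penalty=10.0):
    """Build a QUBO energy for: min c.x subject to A x = b, x binary,
    by adding a quadratic penalty P * sum_j (A_j . x - b_j)^2."""
    def energy(x):
        obj = sum(ci * xi for ci, xi in zip(c, x))
        pen = sum((sum(aij * xi for aij, xi in zip(row, x)) - bj) ** 2
                  for row, bj in zip(A, b))
        return obj + penalty * pen
    return energy

def brute_force_minimum(energy, n):
    """Stand-in for the adiabatic optimizer: exhaustively minimize the QUBO."""
    return min(product((0, 1), repeat=n), key=energy)

# Toy instance: choose exactly two of four discrete controls, minimizing cost.
c = [3.0, 1.0, 4.0, 1.5]
A = [[1, 1, 1, 1]]        # x1 + x2 + x3 + x4 = 2
b = [2]
E = qubo_from_linear(c, A, b)
best = brute_force_minimum(E, 4)
print(best)  # cheapest feasible selection: (0, 1, 0, 1)
```

The penalty weight must be large enough that every infeasible assignment has higher energy than the feasible optimum; here any single constraint violation costs at least 10.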

  1. Optima Nutrition: an allocative efficiency tool to reduce childhood stunting by better targeting of nutrition-related interventions.

    PubMed

    Pearson, Ruth; Killedar, Madhura; Petravic, Janka; Kakietek, Jakub J; Scott, Nick; Grantham, Kelsey L; Stuart, Robyn M; Kedziora, David J; Kerr, Cliff C; Skordis-Worrall, Jolene; Shekar, Meera; Wilson, David P

    2018-03-20

Child stunting due to chronic malnutrition is a major problem in low- and middle-income countries due, in part, to inadequate nutrition-related practices and insufficient access to services. Limited budgets for nutritional interventions mean that available resources must be targeted in the most cost-effective manner to have the greatest impact. Quantitative tools can help guide budget allocation decisions. The Optima approach is an established framework to conduct resource allocation optimization analyses. We applied this approach to develop a new tool, 'Optima Nutrition', for conducting allocative efficiency analyses that address childhood stunting. At the core of the Optima approach is an epidemiological model for assessing the burden of disease; we use an adapted version of the Lives Saved Tool (LiST). Six nutritional interventions have been included in the first release of the tool: antenatal micronutrient supplementation, balanced energy-protein supplementation, exclusive breastfeeding promotion, promotion of improved infant and young child feeding (IYCF) practices, public provision of complementary foods, and vitamin A supplementation. To demonstrate the use of this tool, we applied it to evaluate the optimal allocation of resources in 7 districts in Bangladesh, using both publicly available data (such as through DHS) and data from a complementary costing study. Optima Nutrition can be used to estimate how to target resources to improve nutrition outcomes. Specifically, for the Bangladesh example, despite the limited nutrition-related funding available (an estimated $0.75 per person in need per year), better targeting of investments in nutrition programming, even without any extra resources, could increase the cumulative number of children living without stunting by 1.3 million (an extra 5%) by 2030 compared to the current resource allocation. 
To minimize stunting, priority interventions should include promotion of improved IYCF practices as well as vitamin A supplementation. Once these programs are adequately funded, the public provision of complementary foods should be funded as the next priority. Programmatic efforts should give greatest emphasis to the regions of Dhaka and Chittagong, which have the greatest number of stunted children. A resource optimization tool can provide important guidance for targeting nutrition investments to achieve greater impact.

  2. Interactive Genetic Algorithm - An Adaptive and Interactive Decision Support Framework for Design of Optimal Groundwater Monitoring Plans

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Minsker, B. S.

    2006-12-01

In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem-dependent criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization-simulation system is non-trivial. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective?, and b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm? 
The application of these frameworks on a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants, b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives that met the DM's preference criteria, therefore allowing the expert to select among several strong candidate designs depending on her/his LTM budget, c) two of the methodologies - Case-Based Micro Interactive Genetic Algorithm (CBMIGA) and Interactive Genetic Algorithm with Mixed Initiative Interaction (IGAMII) - were also able to assist in controlling human fatigue and adapt to the DM's learning process.
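The core loop of an interactive genetic algorithm can be sketched as an ordinary GA whose fitness blends a computed objective with a decision-maker rating; here the human feedback is simulated by a callback (a minimal sketch, not CBMIGA or IGAMII; the monitoring-design encoding and preference function are invented):

```python
import random

def interactive_ga(objective, dm_score, n_bits=8, pop=20, gens=30, seed=1):
    """Evolve bit-string designs whose fitness combines a computed
    objective with a (here simulated) decision-maker rating.
    dm_score stands in for the human feedback loop."""
    rng = random.Random(seed)
    def fitness(x):
        return objective(x) + dm_score(x)
    population = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]          # keep the top half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_bits)] ^= 1     # single-bit mutation
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

# Objective: monitor as few wells as possible; the simulated DM
# strongly prefers that well 0 stay in the design.
obj = lambda x: -sum(x)
dm = lambda x: 3.0 if x[0] == 1 else 0.0
print(interactive_ga(obj, dm))
```

In the real frameworks the `dm_score` call is the expensive, fatiguing step, which is why the paper's methods focus on limiting and adapting to human feedback.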

  3. Optimizing Eco-Efficiency Across the Procurement Portfolio.

    PubMed

    Pelton, Rylie E O; Li, Mo; Smith, Timothy M; Lyon, Thomas P

    2016-06-07

Manufacturing organizations' environmental impacts are often attributable to processes in the firm's upstream supply chain. Environmentally preferable procurement (EPP) and the establishment of environmental purchasing criteria can potentially reduce these indirect impacts. Life-cycle assessment (LCA) can help identify the purchasing criteria that are most effective in reducing environmental impacts. However, the high costs of LCA and the problems associated with the comparability of results have limited efforts to integrate procurement performance with quantitative organizational environmental performance targets. Moreover, environmental purchasing criteria, when implemented, are often established on a product-by-product basis without consideration of other products in the procurement portfolio. We develop an approach that utilizes streamlined LCA methods, together with linear programming, to determine optimal portfolios of product impact-reduction opportunities under budget constraints. The approach is illustrated through a simulated breakfast cereal manufacturing firm procuring grain, containerboard boxes, plastic packaging, electricity, and industrial cleaning solutions. Results suggest that extending EPP decisions and resources to the portfolio level, recently made feasible through the methods illustrated herein, can provide substantially greater CO2e and water-depletion reductions per dollar spent than a product-by-product approach, creating opportunities for procurement organizations to participate in firm-wide environmental impact reduction targets.

  4. How to Assess the Value of Medicines?

    PubMed Central

    Simoens, Steven

    2010-01-01

    This study aims to discuss approaches to assessing the value of medicines. Economic evaluation assesses value by means of the incremental cost-effectiveness ratio (ICER). Health is maximized by selecting medicines with increasing ICERs until the budget is exhausted. The budget size determines the value of the threshold ICER and vice versa. Alternatively, the threshold value can be inferred from pricing/reimbursement decisions, although such values vary between countries. Threshold values derived from the value-of-life literature depend on the technique used. The World Health Organization has proposed a threshold value tied to the national GDP. As decision makers may wish to consider multiple criteria, variable threshold values and weighted ICERs have been suggested. Other approaches (i.e., replacement approach, program budgeting and marginal analysis) have focused on improving resource allocation, rather than maximizing health subject to a budget constraint. Alternatively, the generalized optimization framework and multi-criteria decision analysis make it possible to consider other criteria in addition to value. PMID:21607066
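The league-table logic described above (fund medicines in order of increasing ICER until the budget is exhausted) can be sketched directly; the medicines, costs, and QALY figures are hypothetical:

```python
def fund_by_icer(medicines, budget):
    """League-table allocation: fund medicines in order of increasing
    ICER (cost per QALY gained) until the budget runs out.
    medicines: list of (name, total_cost, qalys_gained).
    Partial funding of a medicine is not modeled."""
    ranked = sorted(medicines, key=lambda m: m[1] / m[2])   # ICER = cost/QALY
    funded, qalys = [], 0.0
    for name, cost, gain in ranked:
        if cost <= budget:
            budget -= cost
            funded.append(name)
            qalys += gain
    return funded, qalys

meds = [("A", 200_000, 50), ("B", 100_000, 40), ("C", 150_000, 10)]
funded, qalys = fund_by_icer(meds, budget=320_000)
print(funded, qalys)  # B (ICER 2500) then A (ICER 4000) fit; C (15000) does not
```

The ICER of the last medicine funded plays the role of the threshold value the abstract discusses: a larger budget raises it, a smaller budget lowers it.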

  5. How to assess the value of medicines?

    PubMed

    Simoens, Steven

    2010-01-01

    This study aims to discuss approaches to assessing the value of medicines. Economic evaluation assesses value by means of the incremental cost-effectiveness ratio (ICER). Health is maximized by selecting medicines with increasing ICERs until the budget is exhausted. The budget size determines the value of the threshold ICER and vice versa. Alternatively, the threshold value can be inferred from pricing/reimbursement decisions, although such values vary between countries. Threshold values derived from the value-of-life literature depend on the technique used. The World Health Organization has proposed a threshold value tied to the national GDP. As decision makers may wish to consider multiple criteria, variable threshold values and weighted ICERs have been suggested. Other approaches (i.e., replacement approach, program budgeting and marginal analysis) have focused on improving resource allocation, rather than maximizing health subject to a budget constraint. Alternatively, the generalized optimization framework and multi-criteria decision analysis make it possible to consider other criteria in addition to value.

  6. Managing pay for performance: aligning social science research with budget predictability.

    PubMed

    Rosenau, Pauline Vaillancourt; Lal, Lincy S; Lako, Christiaan

    2012-01-01

    Managers and policymakers are seeking practical guidelines for assessing the outcomes of emerging pay-for-performance (P4P) programs. Evaluations of P4P programs published to date are mixed-some are confusing-and methodological problems with them are common. This article first identifies and summarizes obstacles to implementing effective P4P programs. Second, it describes results from social science research going back several decades to support evidence-based P4P best practices. Among the findings from this research, the zero-sum and "earn it back" P4P incentive systems have important drawbacks and may be counterproductive, neither reducing health system costs nor improving quality. The research suggests that punishing participants for low performance may further reduce individuals' performance, especially when involvement is required. We suggest that optimal P4P systems are those that reward all participants for performance improvements. Third, the article links P4P design to budgetary considerations. P4P program designs that provide incentives while improving quality and reducing costs are critical if budget neutrality is a priority for the organization and its resources are limited. In these types of P4P designs, cost calculations are straightforward: The greater the participation, the higher the savings. The article concludes by recommending an evidence-based P4P approach for practitioners that can be implemented without large upfront investment. More research on this topic is also advised.

  7. Optimal Maintenance Crew Composition and Enhancement of Crew Productivity

    DOT National Transportation Integrated Search

    2008-08-01

The South Carolina Department of Transportation dedicates a large portion of both its budget and other resources to the maintenance of the State's transportation infrastructure. In order to maximize the efficiency and productivity of the State...

  8. Technology Assessment in Support of the Presidential Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William; Mrozinski, Joe; Hua, Hook; Merida, Sofia; Shelton, Kacie; Adumitroaie, Virgil; Derleth, Jason; Silberg, Robert

    2006-01-01

This paper discusses the process and results of technology assessment in support of the United States Vision for Space Exploration of the Moon, Mars and Beyond. The paper begins by reviewing the Presidential Vision: a major endeavor in building systems of systems. It discusses why we wish to return to the Moon, and the exploration architecture for getting there safely, sustaining a presence, and safely returning. Next, a methodology for optimal technology investment is proposed, with discussion of inputs including a capability hierarchy, mission importance weightings, available resource profiles as a function of time, likelihoods of development success, and an objective function. A temporal optimization formulation is offered, and the investment recommendations are presented along with sensitivity analyses. Key questions addressed are the sensitivity of budget allocations to cost uncertainties, reductions in available budget levels, and the shifting of funding within constraints imposed by the mission timeline.

  9. Adaptive decision rules for the acquisition of nature reserves.

    PubMed

    Turner, Will R; Wilcove, David S

    2006-04-01

    Although reserve-design algorithms have shown promise for increasing the efficiency of conservation planning, recent work casts doubt on the usefulness of some of these approaches in practice. Using three data sets that vary widely in size and complexity, we compared various decision rules for acquiring reserve networks over multiyear periods. We explored three factors that are often important in real-world conservation efforts: uncertain availability of sites for acquisition, degradation of sites, and overall budget constraints. We evaluated the relative strengths and weaknesses of existing optimal and heuristic decision rules and developed a new set of adaptive decision rules that combine the strengths of existing optimal and heuristic approaches. All three of the new adaptive rules performed better than the existing rules we tested under virtually all scenarios of site availability, site degradation, and budget constraints. Moreover, the adaptive rules required no additional data beyond what was readily available and were relatively easy to compute.
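The multiyear setting with uncertain site availability can be sketched with a simple greedy acquisition rule under an annual budget (a hypothetical heuristic for illustration, not the paper's adaptive decision rules; site names, costs, and values are invented):

```python
import random

def greedy_multiyear_acquisition(sites, annual_budget, years,
                                 avail_prob=0.6, seed=0):
    """Each year, among sites that happen to be on the market, buy the
    best value-per-cost sites the annual budget allows.
    sites: {name: (cost, conservation_value)}."""
    rng = random.Random(seed)
    remaining = dict(sites)
    protected = {}
    for _ in range(years):
        budget = annual_budget
        # Uncertain availability: each unprotected site is for sale
        # this year only with probability avail_prob.
        market = [n for n in remaining if rng.random() < avail_prob]
        market.sort(key=lambda n: remaining[n][1] / remaining[n][0],
                    reverse=True)
        for n in market:
            cost, value = remaining[n]
            if cost <= budget:
                budget -= cost
                protected[n] = value
                del remaining[n]
    return protected

sites = {"wetland": (3, 9.0), "forest": (5, 10.0), "prairie": (2, 4.0)}
# With avail_prob=1.0 every site is always on the market (deterministic demo).
print(greedy_multiyear_acquisition(sites, annual_budget=5, years=3,
                                   avail_prob=1.0))
```

The paper's adaptive rules improve on exactly this kind of myopic heuristic by also anticipating future availability, degradation, and budget constraints.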

  10. Accuracy optimization with wavelength tunability in overlay imaging technology

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Kang, Yoonshik; Han, Sangjoon; Shim, Kyuchan; Hong, Minhyung; Kim, Seungyoung; Lee, Jieun; Lee, Dongyoung; Oh, Eungryong; Choi, Ahlin; Kim, Youngsik; Marciano, Tal; Klein, Dana; Hajaj, Eitan M.; Aharon, Sharon; Ben-Dov, Guy; Lilach, Saltoun; Serero, Dan; Golotsvan, Anna

    2018-03-01

    As semiconductor manufacturing technology progresses and the dimensions of integrated circuit elements shrink, overlay budget is accordingly being reduced. Overlay budget closely approaches the scale of measurement inaccuracies due to both optical imperfections of the measurement system and the interaction of light with geometrical asymmetries of the measured targets. Measurement inaccuracies can no longer be ignored due to their significant effect on the resulting device yield. In this paper we investigate a new approach for imaging based overlay (IBO) measurements by optimizing accuracy rather than contrast precision, including its effect over the total target performance, using wavelength tunable overlay imaging metrology. We present new accuracy metrics based on theoretical development and present their quality in identifying the measurement accuracy when compared to CD-SEM overlay measurements. The paper presents the theoretical considerations and simulation work, as well as measurement data, for which tunability combined with the new accuracy metrics is shown to improve accuracy performance.

  11. NOAA seeks healthy budget

    NASA Astrophysics Data System (ADS)

    Bush, Susan

    The small, crowded room of the House side of the U.S. Capitol building belied the large budget of $1,611,991,000 requested for Fiscal Year 1992 by the National Oceanic and Atmospheric Administration. John A. Knauss, Undersecretary for Oceans and Atmosphere, U.S. Department of Commerce, delivered his testimony on February 28 before the House Appropriations Subcommittee on Commerce, Justice, and State, the Judiciary and Related Agencies. He told the subcommittee that the budget “attempts to balance the two goals of maintaining NOAA's position as an important science agency and addressing the serious budget problems that the government continues to face.”Climate and global change, modernization of the National Weather Service, and the Coastal Ocean Science program are NOAA's three ongoing, high-priority initiatives that the budget addresses. Also, three additional initiatives—a NOAA-wide program to improve environmental data management, President Bush's multiagency Coastal America initiative, and a seafood safety program administered jointly by NOAA and the Food and Drug Administration—are addressed.

  12. Roadmap to control HBV and HDV epidemics in China

    DOE PAGES

    Goyal, Ashish; Murray, John M.

    2017-04-23

Hepatitis B virus (HBV) is endemic in China. Almost 10% of HBV infected individuals are also infected with hepatitis D virus (HDV), which has a 5–10 times higher mortality rate than HBV mono-infection. The aim of this manuscript is to devise strategies that can control not only HBV infections but also HDV infections in China under the current health care budget in an optimal manner. Using a mathematical model, an annual budget of 10 billion dollars was optimally allocated among five interventions, namely: testing and HBV adult vaccination, treatment for mono-infected and dually-infected individuals, second-line treatment for HBV mono-infections, and awareness programs. We determine that the optimal strategy is to test and treat both infections as early as possible while applying awareness programs at full intensity. Under this strategy, an additional 19.8 million HBV infections, 1.9 million HDV infections, and 0.25 million lives will be saved over the next 10 years, with cost savings of 79 billion dollars compared with performing no intervention. Introduction of second-line treatment does not add a significant economic burden, yet prevents 1.4 million new HBV infections and 15,000 new HDV infections. In conclusion, test and treatment programs are highly efficient in reducing HBV and HDV prevalence in the population. Under the current health budget in China, not only test-and-treat programs but also awareness programs and second-line treatment can be implemented, minimizing prevalence and mortality and maximizing economic benefits.
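The budget-allocation step can be sketched as a grid search over splits of the budget among interventions, each with diminishing returns (a toy stand-in for the paper's epidemiological model; the intervention names, saturation curves, and parameters are invented):

```python
import math
from itertools import product

def best_allocation(effects, budget, step):
    """Grid search over budget splits: each intervention averts
    a * (1 - exp(-spend / s)) infections (diminishing returns);
    return the split maximizing total infections averted.
    effects: {name: (a, s)}; budget and step in the same money units."""
    names = list(effects)
    levels = range(0, budget + 1, step)
    best = (0.0, None)
    for split in product(levels, repeat=len(names)):
        if sum(split) > budget:
            continue
        impact = sum(a * (1 - math.exp(-x / s))
                     for x, (a, s) in zip(split, effects.values()))
        if impact > best[0]:
            best = (impact, dict(zip(names, split)))
    return best

effects = {"test&treat": (50.0, 4.0), "awareness": (20.0, 2.0),
           "2nd-line":   (10.0, 3.0)}
print(best_allocation(effects, budget=10, step=2))
```

The concavity of each response curve is what makes spreading money across interventions beat putting everything into the single strongest one.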

  13. Roadmap to control HBV and HDV epidemics in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goyal, Ashish; Murray, John M.

Hepatitis B virus (HBV) is endemic in China. Almost 10% of HBV infected individuals are also infected with hepatitis D virus (HDV), which has a 5–10 times higher mortality rate than HBV mono-infection. The aim of this manuscript is to devise strategies that can control not only HBV infections but also HDV infections in China under the current health care budget in an optimal manner. Using a mathematical model, an annual budget of 10 billion dollars was optimally allocated among five interventions, namely: testing and HBV adult vaccination, treatment for mono-infected and dually-infected individuals, second-line treatment for HBV mono-infections, and awareness programs. We determine that the optimal strategy is to test and treat both infections as early as possible while applying awareness programs at full intensity. Under this strategy, an additional 19.8 million HBV infections, 1.9 million HDV infections, and 0.25 million lives will be saved over the next 10 years, with cost savings of 79 billion dollars compared with performing no intervention. Introduction of second-line treatment does not add a significant economic burden, yet prevents 1.4 million new HBV infections and 15,000 new HDV infections. In conclusion, test and treatment programs are highly efficient in reducing HBV and HDV prevalence in the population. Under the current health budget in China, not only test-and-treat programs but also awareness programs and second-line treatment can be implemented, minimizing prevalence and mortality and maximizing economic benefits.

  14. The value of flexibility in conservation financing.

    PubMed

    Lennox, Gareth D; Fargione, Joseph; Spector, Sacha; Williams, Gwyn; Armsworth, Paul R

    2017-06-01

Land-acquisition strategies employed by conservation organizations vary in their flexibility. Conservation-planning theory largely fails to reflect this by presenting models that are either extremely inflexible (parcel acquisitions are irreversible and budgets are fixed) or extremely flexible (previously acquired parcels can readily be sold). This latter approach, the selling of protected areas, is infeasible or problematic in many situations. We considered the value to conservation organizations of increasing the flexibility of their land-acquisition strategies through their approach to financing deals. Specifically, we modeled 2 acquisition-financing methods commonly used by conservation organizations: borrowing and budget carry-over. Using simulated data, we compared results from these models with those from an inflexible fixed-budget model and an extremely flexible selling model in which previous acquisitions could be sold to fund new acquisitions. We then examined 3 case studies of how conservation organizations use borrowing and budget carry-over in practice. Model comparisons showed that borrowing and budget carry-over always returned considerably higher rewards than the fixed-budget model. How they performed relative to the selling model depended on the relative conservation value of past acquisitions. Both the models and case studies showed that incorporating flexibility through borrowing or budget carry-over gives conservation organizations the ability to purchase parcels of higher conservation value than when budgets are fixed without the problems associated with the selling of protected areas. © 2016 Society for Conservation Biology.
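The fixed-budget versus carry-over contrast can be sketched with a toy sequential-acquisition simulation (an illustrative model, not the paper's; the yearly parcel offers and budgets are invented):

```python
def acquire(parcels, budgets, carry_over=False):
    """Sequential land acquisition: each year a budget arrives and the
    parcel on offer may be bought if affordable. With carry_over,
    unspent funds roll into later years; in the fixed-budget model
    they are lost. parcels: per-year offers as (cost, value)."""
    funds, total_value = 0.0, 0.0
    for (cost, value), budget in zip(parcels, budgets):
        funds = funds + budget if carry_over else budget
        if cost <= funds:
            funds -= cost
            total_value += value
    return total_value

offers = [(15, 5.0), (18, 20.0), (25, 30.0)]   # year-by-year parcel offers
budgets = [10, 10, 10]
print(acquire(offers, budgets, carry_over=False))  # no parcel ever affordable
print(acquire(offers, budgets, carry_over=True))   # saved funds buy year 2
```

Even this tiny example shows the paper's qualitative result: when valuable parcels cost more than any single year's budget, carry-over (or borrowing) strictly dominates the fixed-budget rule.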

  15. Implementation of equity in resource allocation for regional earthquake risk mitigation using two-stage stochastic programming.

    PubMed

    Zolfaghari, Mohammad R; Peyghaleh, Elnaz

    2015-03-01

    This article presents a new methodology to implement the concept of equity in regional earthquake risk mitigation programs using an optimization framework. It presents a framework that could be used by decisionmakers (government and authorities) to structure budget allocation strategy toward different seismic risk mitigation measures, i.e., structural retrofitting for different building structural types in different locations and planning horizons. A two-stage stochastic model is developed here to seek optimal mitigation measures based on minimizing mitigation expenditures, reconstruction expenditures, and especially large losses in highly seismically active countries. To consider fairness in the distribution of financial resources among different groups of people, the equity concept is incorporated using constraints in model formulation. These constraints limit inequity to the user-defined level to achieve the equity-efficiency tradeoff in the decision-making process. To present practical application of the proposed model, it is applied to a pilot area in Tehran, the capital city of Iran. Building stocks, structural vulnerability functions, and regional seismic hazard characteristics are incorporated to compile a probabilistic seismic risk model for the pilot area. Results illustrate the variation of mitigation expenditures by location and structural type for buildings. These expenditures are sensitive to the amount of available budget and equity consideration for the constant risk aversion. Most significantly, equity is more easily achieved if the budget is unlimited. Conversely, increasing equity where the budget is limited decreases the efficiency. The risk-return tradeoff, equity-reconstruction expenditures tradeoff, and variation of per-capita expected earthquake loss in different income classes are also presented. © 2015 Society for Risk Analysis.
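The two-stage structure (first-stage retrofit decisions, second-stage scenario losses) can be sketched by enumeration on a tiny instance; this omits the equity constraints and uses invented costs, scenarios, and a single loss-reduction factor:

```python
from itertools import product

def optimal_retrofit_plan(retrofit_costs, scenarios, budget):
    """Tiny two-stage stochastic sketch: binary retrofit decisions per
    building class (first stage); earthquake scenarios then realize
    losses, scaled down for retrofitted classes (second stage).
    Minimize retrofit cost + expected loss under a retrofit budget.
    scenarios: list of (probability, [loss per class], reduction_factor)."""
    n = len(retrofit_costs)
    best = None
    for plan in product((0, 1), repeat=n):
        spend = sum(c for c, x in zip(retrofit_costs, plan) if x)
        if spend > budget:
            continue
        expected_loss = sum(
            p * sum(l * (factor if x else 1.0) for l, x in zip(losses, plan))
            for p, losses, factor in scenarios)
        total = spend + expected_loss
        if best is None or total < best[0]:
            best = (total, plan)
    return best

costs = [4.0, 6.0]                        # retrofit cost per building class
scens = [(0.9, [1.0, 2.0], 0.3),          # frequent mild event
         (0.1, [40.0, 80.0], 0.3)]        # rare severe event
print(optimal_retrofit_plan(costs, scens, budget=10.0))
```

Real instances replace the enumeration with a mixed-integer solver, but the objective has the same shape: mitigation expenditures plus probability-weighted reconstruction expenditures.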

  16. The Observed State of the Water Cycle in the Early Twenty-First Century

    NASA Technical Reports Server (NTRS)

    Rodell, M.; Beaudoing, H. K.; L'Ecuyer, T. S.; Olson, W. S.; Famiglietti, J. S.; Houser, P. R.; Adler, R.; Bosilovich, M. G.; Clayson, C. A.; Chambers, D.; hide

    2015-01-01

This study quantifies mean annual and monthly fluxes of Earth's water cycle over continents and ocean basins during the first decade of the millennium. To the extent possible, the flux estimates are based on satellite measurements first and data-integrating models second. A careful accounting of uncertainty in the estimates is included. This accounting is applied within a routine that enforces multiple water and energy budget constraints simultaneously in a variational framework, in order to produce objectively determined, optimized flux estimates. In the majority of cases, the observed annual surface and atmospheric water budgets over the continents and oceans close with much less than 10% residual. Observed residuals and optimized uncertainty estimates are considerably larger for monthly surface and atmospheric water budget closure, often nearing or exceeding 20% in North America, Eurasia, Australia and neighboring islands, and the Arctic and South Atlantic Oceans. The residuals in South America and Africa tend to be smaller, possibly because cold land processes are negligible. Fluxes were poorly observed over the Arctic Ocean, certain seas, Antarctica, and the Australasian and Indonesian islands, leading to reliance on atmospheric analysis estimates. Many of the satellite systems that contributed data have been or will soon be lost or replaced. Models that integrate ground-based and remote observations will be critical for ameliorating gaps and discontinuities in the data records caused by these transitions. Continued development of such models is essential for maximizing the value of the observations. Next-generation observing systems are the best hope for significantly improving global water budget accounting.
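For a single budget constraint, the variational closure has a closed-form solution: nudge each flux estimate in proportion to its error variance so the budget residual vanishes. A one-constraint sketch (the study enforces many constraints jointly; the flux values and uncertainties below are invented):

```python
def close_water_budget(obs, sigmas, signs=(1, -1, -1, -1)):
    """Adjust flux estimates [P, ET, Q, dS] so that P - ET - Q - dS = 0,
    minimizing the uncertainty-weighted sum of squared adjustments
    sum_i ((x_i - obs_i) / sigma_i)^2. Lagrange-multiplier solution:
    x_i = obs_i - s_i * sigma_i^2 * r / sum_j sigma_j^2, r = budget residual."""
    r = sum(s * o for s, o in zip(signs, obs))           # non-closure residual
    w = sum(s * s * v * v for s, v in zip(signs, sigmas))
    return [o - s * v * v * r / w for o, s, v in zip(obs, signs, sigmas)]

# Monthly fluxes (mm) with a 5 mm non-closure; ET is the most uncertain,
# so it absorbs the largest share of the adjustment.
obs = [100.0, 60.0, 30.0, 5.0]        # P, ET, Q, dS
sig = [3.0, 6.0, 2.0, 2.0]
adj = close_water_budget(obs, sig)
closure = sum(s * a for s, a in zip((1, -1, -1, -1), adj))
print(adj, closure)
```

The multi-constraint version used in the study solves the analogous weighted least-squares problem with all water and energy balances imposed at once.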

  17. Elevating the role of finance at Mary Lanning Healthcare.

    PubMed

    Hoffman, Amanda; Spence, Jay

    2013-11-01

    To effectively partner with hospital operations leaders, healthcare finance leaders should: Streamline and align financial planning and budgeting functions across the organization; Ensure capital planning is regarded as a strategic process; Optimize performance monitoring across management levels.

  18. MDOT Pavement Management System : Prediction Models and Feedback System

    DOT National Transportation Integrated Search

    2000-10-01

As a primary component of a Pavement Management System (PMS), prediction models are crucial for one or more of the following analyses: maintenance planning, budgeting, life-cycle analysis, multi-year optimization of maintenance works program, and a...

  19. Achieving full connectivity of sites in the multiperiod reserve network design problem

    USGS Publications Warehouse

    Jafari, Nahid; Nuse, Bryan L.; Moore, Clinton; Dilkina, Bistra; Hepinstall-Cymerman, Jeffrey

    2017-01-01

    The conservation reserve design problem is a challenge to solve because of the spatial and temporal nature of the problem, uncertainties in the decision process, and the possibility of alternative conservation actions for any given land parcel. Conservation agencies tasked with reserve design may benefit from a dynamic decision system that provides tactical guidance for short-term decision opportunities while maintaining focus on a long-term objective of assembling the best set of protected areas possible. To plan cost-effective conservation over time under time-varying action costs and budget, we propose a multi-period mixed integer programming model for the budget-constrained selection of fully connected sites. The objective is to maximize a summed conservation value over all network parcels at the end of the planning horizon. The originality of this work is in achieving full spatial connectivity of the selected sites during the schedule of conservation actions.
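The single-period core of the problem (maximize summed conservation value under a budget, with the selected parcels fully connected) can be sketched by brute force on a small adjacency graph; the real model is a multi-period mixed-integer program, and the parcels below are invented:

```python
from itertools import combinations

def best_connected_reserve(parcels, adjacency, budget):
    """Choose the parcel set of maximum total conservation value whose
    cost fits the budget and whose induced graph is connected.
    parcels: {name: (cost, value)}; adjacency: {name: set of neighbors}."""
    def connected(sel):
        sel = set(sel)
        stack, seen = [next(iter(sel))], set()
        while stack:                       # DFS within the selected set
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            stack.extend(adjacency[n] & sel)
        return seen == sel
    best = (0.0, ())
    names = list(parcels)
    for k in range(1, len(names) + 1):
        for sel in combinations(names, k):
            cost = sum(parcels[n][0] for n in sel)
            value = sum(parcels[n][1] for n in sel)
            if cost <= budget and value > best[0] and connected(sel):
                best = (value, sel)
    return best

# Four parcels along a corridor: a - b - c - d
parcels = {"a": (2, 5.0), "b": (3, 6.0), "c": (2, 4.0), "d": (4, 9.0)}
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(best_connected_reserve(parcels, adj, budget=7))
```

Note how connectivity changes the answer: without it, the budget would go to the highest-value parcels regardless of whether they form a contiguous reserve.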

  20. Planification de la maintenance d'un parc de turbines-alternateurs par programmation mathematique

    NASA Astrophysics Data System (ADS)

    Aoudjit, Hakim

    A growing number of Hydro-Quebec's hydro generators are reaching the end of their useful life, and maintenance managers fear facing a number of overhauls exceeding what can be handled. Maintenance crews and budgets are limited, and these withdrawals may take up to a full year and mobilize significant resources, in addition to the loss of electricity production. Moreover, increased export sales forecasts and severe production patterns are expected to speed up wear, which can lead to halting many units at the same time. Currently, expert judgment is at the heart of withdrawal decisions, which rely primarily on periodic inspections and in-situ measurements; the results are sent to the maintenance planning team, who coordinate all withdrawal decisions. The degradation phenomena taking place are random in nature, and the capability to predict wear using inspections alone is limited to the short term at best. Managers seek a long-term plan of major overhauls for the sake of justifying and rationalizing budgets and resources. The maintenance managers are able to provide a huge amount of data, including the hourly production of each unit for several years, the repair history of each part of a unit, and major withdrawals since the 1950s. In this research, we tackle the problem of long-term maintenance planning for a fleet of 90 hydro generators at Hydro-Quebec over a 50-year planning horizon. We lay out a scientific and rational framework to support withdrawal decisions by using part of the available data and maintenance history while fulfilling a set of technical and economic constraints. We propose a planning approach based on a constrained optimization framework. We begin by decomposing and sorting hydro generator components to highlight the most influential parts. A failure rate model is developed to take into account the technical characteristics and unit utilization. Then, replacement and repair policies are evaluated for each of the components, and strategies are derived for the whole unit. Traditional univariate policies such as the age replacement policy and the minimal repair policy are calculated. These policies are extended to build an alternative bivariate maintenance policy as well as a repair strategy in which the state of a component after a repair is rejuvenated by a constant coefficient. These templates form the basis for calculating the objective function of the scheduling problem. On the one hand, the problem is treated as a nonlinear program whose objective is to minimize the average total maintenance cost per unit of time over an infinite horizon for the fleet, subject to technical and economic constraints. A formulation is also proposed for a finite time horizon. In the event of electricity production variation, and given that the usage profile is known, the influence of production scenarios is reflected on the units' components through their failure rates. In this context, prognoses on possible resource problems are made by studying the characteristics of the generated plans. On the other hand, the withdrawals are then subjected to two decision criteria: in addition to minimizing the average total maintenance cost per unit of time over an infinite horizon, the best achievable reliability of the remaining turbo generators is sought. This problem is treated as a biobjective nonlinear optimization problem. Finally, a series of problems describing multiple contexts is solved for planning the renovation of 90 turbo generator units, considering 3 major components in each unit and 2 types of maintenance policies for each component.
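
    The age replacement policy named in the abstract has a standard textbook form that can be computed directly: choose the replacement age T minimizing the long-run expected cost per unit time g(T) = [c_p R(T) + c_f F(T)] / E[min(lifetime, T)], where R is the survival function and c_p < c_f are the preventive and failure costs. The sketch below assumes a Weibull failure model with illustrative parameters; none of the figures are Hydro-Quebec data.

```python
# Hedged sketch of the classical age-replacement computation.
import math

def weibull_survival(t, shape, scale):
    """R(t) = exp(-(t/scale)^shape); shape > 1 models wear-out."""
    return math.exp(-((t / scale) ** shape))

def cost_rate(T, shape, scale, c_preventive, c_failure, steps=500):
    """g(T) = [c_p*R(T) + c_f*F(T)] / integral_0^T R(t) dt (trapezoid rule)."""
    dt = T / steps
    mean_cycle = sum(
        0.5 * dt * (weibull_survival(i * dt, shape, scale)
                    + weibull_survival((i + 1) * dt, shape, scale))
        for i in range(steps)
    )
    survive = weibull_survival(T, shape, scale)
    return (c_preventive * survive + c_failure * (1.0 - survive)) / mean_cycle

def optimal_age(shape, scale, c_preventive, c_failure, grid):
    """Grid search for the replacement age minimizing g(T)."""
    return min(grid, key=lambda T: cost_rate(T, shape, scale,
                                             c_preventive, c_failure))

grid = [0.5 * i for i in range(1, 61)]  # candidate ages: 0.5 .. 30 years
T_star = optimal_age(shape=2.5, scale=20.0, c_preventive=1.0,
                     c_failure=10.0, grid=grid)
print(round(T_star, 1))
```

    With a wear-out component (shape > 1) and failure ten times as costly as preventive overhaul, the optimum falls well below the mean lifetime, which is the economic rationale for scheduled withdrawals.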

  1. Impact of the macroeconomic factors on university budgeting in the US and Russia

    NASA Astrophysics Data System (ADS)

    Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly

    2017-10-01

    This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in the areas of data science and machine learning have made it possible to utilise automated techniques to address several problems of humankind, ranging from genetic engineering and particle physics to sociology and economics. This paper is the first step toward creating a robust toolkit that will help universities withstand macroeconomic challenges by utilising modern predictive analytics techniques.

  2. Oversight on the Federal Role in Education (Part 1). Hearing before the Committee on Education and Labor, House of Representatives, Ninety-Ninth Congress, First Session (New Orleans, LA, February 14, 1985).

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Education and Labor.

    This report of a regional hearing on education considers the Federal role in education and, specifically, how Federal budget cuts will affect both school programs and major national problems such as declining productivity, inflation, social tensions, budget deficits, and challenged world leadership. Testimony was heard from representatives of the…

  3. Agency problems of global budget system in Taiwan's National Health Insurance.

    PubMed

    Yan, Yu-Hua; Yang, Chen-Wei; Fang, Shih-Chieh

    2014-05-01

    The main purpose of this study was to investigate the agency problem presented by the global budget system followed by hospitals in Taiwan. In this study, we examine empirically the interaction between the principal (the Bureau of National Health Insurance, BNHI) and the agent (medical service providers, i.e., hospitals); we also describe actual medical service provider and hospital governance conditions from an agency theory perspective. This study identified a positive correlation between aversion to agency hazards (self-interest behavior, asymmetric information, and risk hedging) and agency problem risks (disregard of medical ethics, pursuit of extra-contract profit, disregard of professionalism, and cost orientation). Agency costs refer to BNHI auditing and monitoring expenditures used to prevent hospitals from deviating from NHI policy goals. This study also found that agency costs negatively moderate the relationship between agency hazards and agency problems. The main contribution of this study is its use of agency theory to clarify agency problems and several potential factors caused by the NHI system. This study also contributes to the field of health policy study by clarifying the nature and importance of agency problems in the health care sector. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. The Stability of Tidal Equilibrium for Hierarchical Star-Planet-Moon Systems

    NASA Astrophysics Data System (ADS)

    Adams, Fred C.

    2018-04-01

    Motivated by the current search for exomoons, this talk considers the stability of tidal equilibrium for hierarchical three-body systems containing a star, a planet, and a moon. In this treatment, the energy and angular momentum budgets include contributions from the planetary orbit, lunar orbit, stellar spin, planetary spin, and lunar spin. The goal is to determine the optimized energy state of the system subject to the constraint of constant angular momentum. Due to the lack of a closed form solution for the full three-body problem, however, we must use an approximate description of the orbits. We first consider the Keplerian limit and find that the critical energy states are saddle points, rather than minima, so that these hierarchical systems have no stable tidal equilibrium states. We then generalize the calculation so that the lunar orbit is described by a time-averaged version of the circular restricted three-body problem. In this latter case, the critical energy state is a shallow minimum, so that a tidal equilibrium state exists. In both cases, however, the lunar orbit for the critical point lies outside the boundary (roughly half the Hill radius) where previous numerical simulations indicate dynamical instability.

  5. Optimal allocation of land and water resources to achieve Water, Energy and Food Security in the upper Blue Nile basin

    NASA Astrophysics Data System (ADS)

    Allam, M.; Eltahir, E. A. B.

    2017-12-01

    Rapid population growth, hunger problems, increasing energy demands, persistent conflicts between the Nile basin riparian countries and the potential impacts of climate change highlight the urgent need for the conscious stewardship of the upper Blue Nile (UBN) basin resources. This study develops a framework for the optimal allocation of land and water resources to agriculture and hydropower production in the UBN basin. The framework consists of three optimization models that aim to: (a) provide accurate estimates of the basin water budget, (b) allocate land and water resources optimally to agriculture, and (c) allocate water to agriculture and hydropower production, and investigate trade-offs between them. First, a data assimilation procedure for data-scarce basins is proposed to deal with data limitations and produce estimates of the hydrologic components that are consistent with the principles of mass and energy conservation. Second, the most representative topography and soil properties datasets are objectively identified and used to delineate the agricultural potential in the basin. The agricultural potential is incorporated into a land-water allocation model that maximizes the net economic benefits from rain-fed agriculture while allowing for enhancing the soils from one suitability class to another to increase agricultural productivity in return for an investment in soil inputs. The optimal agricultural expansion is expected to reduce the basin flow by 7.6 cubic kilometres, impacting downstream countries. The optimization framework is expanded to include hydropower production. This study finds that allocating water to grow rain-fed teff in the basin is more profitable than allocating water for hydropower production. Optimal operation rules for the Grand Ethiopian Renaissance dam (GERD) are identified to maximize annual hydropower generation while achieving a relatively uniform monthly production rate. 
Trade-offs between agricultural expansion and hydropower generation are analysed in an attempt to define cooperation scenarios that would achieve win-win outcomes for all riparian countries.
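
    The core allocation step, directing a fixed water budget to the most profitable uses first, can be illustrated as a fractional-knapsack style computation. The suitability classes, benefit rates, capacities, and budget below are invented for illustration and are not the paper's UBN estimates.

```python
# Hedged sketch: greedy water allocation across uses by net benefit per km3.
def allocate_water(classes, water_budget_km3):
    """classes: (name, net_benefit_per_km3, capacity_km3) tuples.
    Greedily fill the highest-benefit uses first (fractional knapsack)."""
    plan, remaining = [], water_budget_km3
    for name, benefit, capacity in sorted(classes, key=lambda c: -c[1]):
        use = min(capacity, remaining)
        if use > 0:
            plan.append((name, use, benefit * use))
            remaining -= use
    return plan, sum(gain for _, _, gain in plan)

classes = [
    ("high-suitability teff", 3.0, 4.0),  # benefit per km3, capacity in km3
    ("enhanced-soil teff", 2.0, 5.0),
    ("hydropower", 1.5, 10.0),
]
plan, total_benefit = allocate_water(classes, water_budget_km3=7.0)
print(plan, total_benefit)
```

    In this toy ranking, agriculture outbids hydropower for scarce water, mirroring the abstract's finding that rain-fed teff is more profitable per unit of water than hydropower generation.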

  6. Towards resiliency with micro-grids: Portfolio optimization and investment under uncertainty

    NASA Astrophysics Data System (ADS)

    Gharieh, Kaveh

    Energy security and a sustained supply of power are critical for community welfare and economic growth. In the face of the increased frequency and intensity of extreme weather events, which can result in power grid outages, the value of micro-grids in improving communities' power reliability and resiliency is becoming more important. Micro-grids' capability to operate in islanded mode under stressed conditions dramatically decreases the economic loss to critical infrastructure during power shortages. More widespread participation of micro-grids in the wholesale energy market in the near future makes the development of new investment models necessary; market and price risks in the short and long term, along with the impacts of risk factors, must be taken into consideration in developing them. This work proposes a set of models and tools to address different problems associated with micro-grid assets, including optimal portfolio selection, investment, and financing, at both the community level and for a sample critical infrastructure (a wastewater treatment plant). The models account for short-term operational volatilities and long-term market uncertainties. A number of analytical methodologies and financial concepts have been adopted to develop the aforementioned models, as follows. (1) Capital budgeting planning and portfolio optimization models with Monte Carlo stochastic scenario generation are applied to derive the optimal investment decision for a portfolio of micro-grid assets, considering risk factors and multiple sources of uncertainty. (2) Real Option theory, Monte Carlo simulation, and stochastic optimization techniques are applied to obtain optimal modularized investment decisions for hydrogen tri-generation systems in wastewater treatment facilities, considering multiple sources of uncertainty. (3) The Public Private Partnership (PPP) financing concept, coupled with an investment horizon approach, is applied to estimate public and private parties' revenue shares from a community-level micro-grid project over the course of the assets' lifetime, considering their optimal operation under uncertainty.

  7. Financial Capability in Early Social Work Practice: Lessons for Today.

    PubMed

    Stuart, Paul H

    2016-10-01

    During the profession's first decades, social workers tried to improve their clients’ financial capability (FC). This article describes the methods used by early social workers who attempted to enhance the FC of their clients, based on contemporary descriptions of their practice. Social workers initially emphasized thrift, later adding more sophisticated consideration of the cost of foods, rent, and other necessities. Social work efforts were furthered by home economists, who served as specialists in nutrition, clothing, interior design, and other topics related to homemaking. Early home economists included specialists in nutrition and family budgeting; these specialists worked with social services agencies to provide a financial basis for family budgets and assisted clients with family budgeting. Some agencies engaged home economists as consultants and as direct providers of instruction on home budgets for clients. By the 1930s, however, social work interest in family budget problems focused on the psychological meaning of low income to the client, rather than on measures to increase client FC. Consequently, social workers’ active engagement with family budget issues—engagement that characterized earlier decades—faded. These early efforts can inform contemporary practice as social workers are once again concerned about improving their clients’ FC.

  8. FY 1992 Budget committed to R&D

    NASA Astrophysics Data System (ADS)

    Bush, Susan

    President Bush's Fiscal Year 1992 budget for research and development is clear proof of his commitment to R&D as a long-term investment for the next American century, according to D. Allan Bromley, Assistant to the President for Science and Technology and Director, Office of Science and Technology Policy. The FY 92 budget proposes to allocate $75.6 billion for research and development, an increase of $8.4 billion, or 13%, over the amount appropriated for FY 91. Calling it a “good budget,” Bromley revealed the specifics of research and development in the President's budget on February 4. Bromley believes that as a nation we are underinvesting in research and development, but sees the 1992 budget increases as concrete steps to address this problem. The newly organized and revitalized Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), an interagency forum of Cabinet secretaries, deputy secretaries, and the heads of independent agencies that reviews, coordinates, and helps implement federal science and technology policy, named three high-priority cross-cutting areas of R&D and organized special interagency programs in these areas. The areas are high-performance computing and communications, global change, and mathematics and science education.

  9. On the optimal two-impulse maneuver

    NASA Technical Reports Server (NTRS)

    Bonneau, F. G.; Diarra, C. M.

    1992-01-01

    An algorithm and a solution to the optimal two-impulse maneuver for the Ulysses spacecraft is presented. The option in which both maneuvers are designed using the critical plane strategy, which further reduces the propellant budget, is included. Results show savings of 12.4 and 16.7 m/s for full targeting and critical plane strategy, respectively. It is argued that this software will play an important role in the area of multimission maneuver design.

  10. Cost-effectiveness of the Federal stream-gaging program in Virginia

    USGS Publications Warehouse

    Carpenter, D.H.

    1985-01-01

    Data uses and funding sources were identified for the 77 continuous stream gages currently being operated in Virginia by the U.S. Geological Survey with a budget of $446,000. Two stream gages were identified as not being used sufficiently to warrant continuing their operation. Operation of these stations should be considered for discontinuation. Data collected at two other stations were identified as having uses primarily related to short-term studies; these stations should also be considered for discontinuation at the end of the data collection phases of the studies. The remaining 73 stations should be kept in the program for the foreseeable future. The current policy for operation of the 77-station program requires a budget of $446,000/yr. The average standard error of estimation of streamflow records is 10.1%. It was shown that this overall level of accuracy at the 77 sites could be maintained with a budget of $430,500 if resources were redistributed among the gages. A minimum budget of $428,500 is required to operate the 77-gage program; a smaller budget would not permit proper service and maintenance of the gages and recorders. At the minimum budget, with optimized operation, the average standard error would be 10.4%. The maximum budget analyzed was $650,000, which resulted in an average standard error of 5.5%. The study indicates that a major component of error is caused by lost or missing data. If perfect equipment were available, the standard error for the current program and budget could be reduced to 7.6%. This also can be interpreted to mean that the streamflow data have a standard error of this magnitude during times when the equipment is operating properly. (Author's abstract)

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaitsgory, Vladimir, E-mail: vladimir.gaitsgory@mq.edu.au; Rossomakhine, Sergey, E-mail: serguei.rossomakhine@flinders.edu.au

    The paper aims at the development of an apparatus for analysis and construction of near optimal solutions of singularly perturbed (SP) optimal control problems (that is, problems of optimal control of SP systems) considered on the infinite time horizon. We mostly focus on problems with time discounting criteria, but a possibility of the extension of results to periodic optimization problems is discussed as well. Our consideration is based on earlier results on averaging of SP control systems and on linear programming formulations of optimal control problems. The idea that we exploit is to first asymptotically approximate a given problem of optimal control of the SP system by a certain averaged optimal control problem, then reformulate this averaged problem as an infinite-dimensional linear programming (LP) problem, and then approximate the latter by semi-infinite LP problems. We show that the optimal solutions of these semi-infinite LP problems and their duals (which can be found with the help of a modification of available LP software) allow one to construct near optimal controls of the SP system. We demonstrate the construction with two numerical examples.

  12. The constrained inversion of Nimbus-7 wide field-of-view radiometer measurements for the Earth Radiation Budget

    NASA Technical Reports Server (NTRS)

    Hucek, Richard R.; Ardanuy, Philip; Kyle, H. Lee

    1990-01-01

    The results of a constrained, wide field-of-view radiometer measurement deconvolution are presented and compared against higher resolution results obtained from the Earth Radiation Budget instrument on the Nimbus-7 satellite and from the Earth Radiation Budget Experiment. The method is applicable to both longwave and shortwave observations and is specifically designed to treat the problem of anisotropic reflection and emission at the top of the atmosphere as well as low signal-to-noise ratios that arise regionally within a field. The procedure is reviewed, and the improvements in resolution obtained are examined. Some minor improvements in the albedo algorithm are also described.

  13. 42 CFR 483.136 - Evaluating whether an individual with mental retardation requires specialized services (PASARR/MR).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) The individual's medical problems; (2) The level of impact these problems have on the individual's...) Independent living development such as meal preparation, budgeting and personal finances, survival skills... the most personal care needs; (B) Understand simple commands; (C) Communicate basic needs and wants...

  14. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds

    PubMed Central

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources changes dynamically over time, provisioning resources in a way that a given budget is efficiently utilized while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction, of up to 70%, in the overall task slowdown rate by the proposed algorithms. PMID:27501046

  15. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds.

    PubMed

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources changes dynamically over time, provisioning resources in a way that a given budget is efficiently utilized while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction, of up to 70%, in the overall task slowdown rate by the proposed algorithms.
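
    As a toy illustration of the budget-constrained scheduling problem these two records describe, here is a greedy sketch: give the longest tasks the fastest VM tier the remaining budget can afford. The VM tiers, prices, and speedups are invented, and this is not the paper's algorithm, which couples capacity-aware provisioning with scheduling.

```python
# Hedged sketch: greedy task-to-VM assignment under a budget.
# Each VM type: (hourly_price, relative_speed). Faster tiers cost more per hour.
VM_TYPES = {"small": (0.10, 1.0), "medium": (0.22, 2.0), "large": (0.48, 4.0)}

def schedule(tasks_hours, budget):
    """Assign each task the fastest VM tier still affordable.

    tasks_hours: work per task, expressed in small-VM hours.
    Returns (assignments, total_cost, total_runtime)."""
    assignments, cost, runtime = [], 0.0, 0.0
    for work in sorted(tasks_hours, reverse=True):  # longest tasks first
        chosen = None
        for name, (price, speed) in sorted(VM_TYPES.items(),
                                           key=lambda kv: -kv[1][1]):
            task_cost = price * work / speed
            if cost + task_cost <= budget:
                chosen = (name, task_cost, work / speed)
                break
        if chosen is None:
            # Budget exhausted: run on the cheapest tier anyway (a real
            # scheduler would re-plan to stay strictly within budget).
            price, speed = VM_TYPES["small"]
            chosen = ("small", price * work, work)
        name, task_cost, task_time = chosen
        assignments.append((work, name))
        cost += task_cost
        runtime += task_time
    return assignments, cost, runtime

plan, total_cost, total_runtime = schedule([6, 3, 1], budget=2.0)
print(plan, round(total_cost, 2), round(total_runtime, 2))
```

    The greedy rule captures the paper's trade-off in miniature: a looser budget buys faster tiers and lower slowdown, while a tight budget forces tasks onto slow, cheap VMs.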

  16. Performance of Grey Wolf Optimizer on large scale problems

    NASA Astrophysics Data System (ADS)

    Gupta, Shubham; Deep, Kusum

    2017-01-01

    Numerous nature-inspired optimization techniques for solving nonlinear continuous optimization problems have been proposed in the literature; they can be applied to real-life problems where conventional techniques cannot be. Grey Wolf Optimizer is one such technique, and it has been gaining popularity over the last two years. The objective of this paper is to investigate the performance of the Grey Wolf Optimization Algorithm on large scale optimization problems. The algorithm is implemented on 5 common scalable problems appearing in the literature, namely the Sphere, Rosenbrock, Rastrigin, Ackley, and Griewank functions. The dimensions of these problems are varied from 50 to 1000. The results indicate that Grey Wolf Optimizer is a powerful nature-inspired optimization algorithm for large scale problems, except on Rosenbrock, which is a unimodal function.
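
    For reference, the standard Grey Wolf Optimizer update, in which every wolf moves toward a blend of the three best wolves (alpha, beta, delta) under an exploration parameter decaying from 2 to 0, can be sketched compactly. This minimal version runs on the Sphere function only; the population size, iteration count, and bounds are arbitrary choices, far smaller than the paper's 50-1000 dimensional experiments.

```python
# Hedged sketch: minimal Grey Wolf Optimizer on the Sphere benchmark.
import random

def sphere(x):
    """Sphere benchmark: f(x) = sum(x_i^2), global minimum 0 at the origin."""
    return sum(v * v for v in x)

def gwo(objective, dim, bounds, n_wolves=20, n_iters=100, seed=0):
    """Grey Wolf Optimizer with alpha/beta/delta leadership."""
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]
    for it in range(n_iters):
        leaders = sorted(wolves, key=objective)[:3]  # alpha, beta, delta
        a = 2.0 - 2.0 * it / n_iters  # exploration parameter decays 2 -> 0
        for i, wolf in enumerate(wolves):
            new_pos = []
            for d in range(dim):
                pull = 0.0
                for leader in leaders:
                    A = 2.0 * a * rng.random() - a   # A in [-a, a]
                    C = 2.0 * rng.random()
                    D = abs(C * leader[d] - wolf[d])  # distance to the leader
                    pull += leader[d] - A * D
                new_pos.append(min(hi, max(lo, pull / 3.0)))  # average, clamp
            wolves[i] = new_pos
    return min(wolves, key=objective)

best = gwo(sphere, dim=10, bounds=(-100.0, 100.0))
print(round(sphere(best), 6))
```

    Because the three leaders are re-ranked every iteration, the pack contracts around the best region found so far, which is what gives GWO its fast convergence on unimodal functions like Sphere.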

  17. Feed Forward Neural Network and Optimal Control Problem with Control and State Constraints

    NASA Astrophysics Data System (ADS)

    Kmet', Tibor; Kmet'ová, Mária

    2009-09-01

    A feed forward neural network based optimal control synthesis is presented for solving optimal control problems with control and state constraints. The paper extends the adaptive critic neural network architecture proposed in [5] to optimal control problems with control and state constraints. The optimal control problem is transcribed into a nonlinear programming problem, which is implemented with an adaptive critic neural network. The proposed simulation method is illustrated by the optimal control problem of a nitrogen transformation cycle model. Results show that the adaptive critic based systematic approach holds promise for obtaining optimal control with control and state constraints.

  18. Budget- and Priority-Setting Criteria at State Health Agencies in Times of Austerity: A Mixed-Methods Study

    PubMed Central

    Resnick, Beth; Kass, Nancy; Sellers, Katie; Young, Jessica; Bernet, Patrick; Jarris, Paul

    2014-01-01

    Objectives. We examined critical budget and priority criteria for state health agencies to identify likely decision-making factors, pressures, and opportunities in times of austerity. Methods. We have presented findings from a 2-stage, mixed-methods study with state public health leaders regarding public health budget- and priority-setting processes. In stage 1, we conducted hour-long interviews in 2011 with 45 health agency executive and division or bureau leaders from 6 states. Stage 2 was an online survey of 207 executive and division or bureau leaders from all state health agencies (66% response rate). Results. Respondents identified 5 key criteria: whether a program was viewed as “mission critical,” the seriousness of the consequences of not funding the program, financing considerations, external directives and mandates, and the magnitude of the problem the program addressed. Conclusions. We have presented empirical findings on criteria used in state health agency budgetary decision-making. These criteria suggested a focus and interest on core public health and the largest public health problems with the most serious ramifications. PMID:24825212

  19. Budget- and priority-setting criteria at state health agencies in times of austerity: a mixed-methods study.

    PubMed

    Leider, Jonathon P; Resnick, Beth; Kass, Nancy; Sellers, Katie; Young, Jessica; Bernet, Patrick; Jarris, Paul

    2014-06-01

    We examined critical budget and priority criteria for state health agencies to identify likely decision-making factors, pressures, and opportunities in times of austerity. We have presented findings from a 2-stage, mixed-methods study with state public health leaders regarding public health budget- and priority-setting processes. In stage 1, we conducted hour-long interviews in 2011 with 45 health agency executive and division or bureau leaders from 6 states. Stage 2 was an online survey of 207 executive and division or bureau leaders from all state health agencies (66% response rate). Respondents identified 5 key criteria: whether a program was viewed as "mission critical," the seriousness of the consequences of not funding the program, financing considerations, external directives and mandates, and the magnitude of the problem the program addressed. We have presented empirical findings on criteria used in state health agency budgetary decision-making. These criteria suggested a focus and interest on core public health and the largest public health problems with the most serious ramifications.

  20. O/MN Budget Execution at U.S. Naval Shore Activities: A Model for Improving Resource Allocation.

    DTIC Science & Technology

    1981-12-01

    ...of each requirement to be prioritized. Needs/desires of all concerned are discussed and classified. [Family Service Center Director] Under present...obligation of end year sweep up of funds. [Admin. Officer/Director Family Service Center] More knowledge of operational level problems. [Budget Analyst

  1. The Application of Mathematical Programming to the Productivity Investment Fund Program: A Capital Rationing Problem.

    DTIC Science & Technology

    1987-01-01

    survey research, Gitman and Forrester (1977) report that managers consider the most difficult and important stages of capital budgeting to be project...confirms this (Finn, 1973; Gitman and Forrester, 1977). In the public sector, hard rationing is probably more common, where compliance with budgeted...decisions (e.g., Klammer, 1972, Banda and Nolan, 1972; Osteryoung, 1973; Gitman and Forrester, 1977; Farragher, 1986). It is not known if the existence of

  2. Financial Crisis; Department of Defense Spending: How to Control Fraud, Waste, and Abuse in the Military

    DTIC Science & Technology

    2012-04-25

    problem that continues to cost billions of dollars in losses at a time when the United States Military and Federal Government can least afford it. It...to look like. The federal budget has become the leading issue affecting every Government Agency. The military’s budget is decreasing by more than...discuss the cost savings measures and initiatives currently being debated in Washington, and offer, for debate, several new recommendations that have

  3. A public health and budget impact analysis of vaccinating the elderly and at-risk adults with the 23-valent pneumococcal polysaccharide vaccine or 13-valent pneumococcal conjugate vaccine in the UK.

    PubMed

    Jiang, Yiling; Gauthier, Aline; Keeping, Sam; Carroll, Stuart

    2014-12-01

    Since the introduction of routine childhood immunization, a change in the epidemiology of pneumococcal disease has been seen in both children and adults. This study aimed to quantify the public health and budget impact of pneumococcal vaccination of the elderly and of those in at-risk groups in the UK. The model was adapted from a previous population-based Markov model. At-risk adults and the elderly were assumed to receive PPV23 or PCV13 vaccination or no vaccination. Over the study period (2012-2016), PPV23 vaccination led to a reduction in the number of invasive pneumococcal disease cases in most scenarios. The net budget impact ranged between £15 and £39 million (vs no vaccination) or between -£116 and -£93 million (vs PCV13). The PPV23 vaccination program remains the optimal strategy from public health and budgetary perspectives despite epidemiological changes. PCV13 is likely to impose a significant budget impact with limited health benefits.

  4. DoD Implementation of the Better Buying Power Initiatives

    DTIC Science & Technology

    2012-12-01

    statutory measures (Budget Control Act of 2011), Congress has been directed to face this problem and provide a workable solution or face the evils...Defense Support Program (DSP) to provide the capabilities that the SBIRS has had problems delivering (Richelson, 2007; Werner, 2011). The purpose of the...Defense David Packard took office. They were keen on addressing the problems plaguing defense acquisition: excessive centralization, inefficiencies

  5. A Constrained Maximization Model for inspecting the impact of leaf shape on optimal leaf size and stoma resistance

    NASA Astrophysics Data System (ADS)

    Ding, J.; Johnson, E. A.; Martin, Y. E.

    2017-12-01

    The leaf is the basic production unit of plants, and water is the most critical resource of plants: its availability controls primary productivity by affecting the leaf carbon budget. To avoid the damage of cavitation from the lowering of vein water potential caused by evapotranspiration, the leaf must increase its stomatal resistance to reduce the evapotranspiration rate. This comes at the cost of a reduced carbon fixing rate, as increasing stomatal resistance also slows the carbon intake rate. Studies suggest that stomata will operate at an optimal resistance to maximize the carbon gain with respect to water. Different plant species have different leaf shapes, a genetically determined trait. Further, leaf size on the same plant can vary many-fold, in relation to soil moisture, an indicator of water availability. According to metabolic scaling theory, increasing leaf size will increase the total xylem resistance of the vein network, which may also constrain the leaf carbon budget. We present a Constrained Maximization Model of the leaf (leaf CMM) that incorporates metabolic theory into the coupling of evapotranspiration and carbon fixation to examine how leaf size, stomatal resistance, and maximum net leaf primary productivity change with petiole xylem water potential. The model connects vein network structure to leaf shape and uses the difference between the petiole xylem water potential and the critical minor-vein cavitation-forming water potential as the budget. The CMM shows that both maximum net leaf primary production and optimal leaf size increase with petiole xylem water potential, while optimal stomatal resistance decreases. A narrow leaf has an overall smaller optimal leaf size, a lower maximum net leaf carbon gain, and a higher optimal stomatal resistance than a broad leaf, because with a small width-to-length ratio, total xylem resistance increases faster with leaf size, causing a higher average and marginal cost of xylem water potential with respect to net leaf carbon gain. For the same leaf area, the total xylem resistance of a narrow leaf is higher than that of a broad leaf; given the same stomatal resistance and petiole water potential, a narrow leaf will lose more xylem water potential than a broad leaf. Consequently, the narrow leaf has a smaller size and a higher stomatal resistance at its optimum.

  6. Optimization and Persistence

    DTIC Science & Technology

    1997-09-01

and Ronen 1987], capital budgeting [Brown, Clemence, Teufert, and Wood 1991], and global supply chain management [Arntzen, Brown, Harrison, and..."crew-pairing optimization at American Airlines," Interfaces, Vol. 21, No. 1 (January-February), pp. 62-74. Arntzen, B. C.; Brown, G. G.; Harrison, T

  7. COPS: Large-scale nonlinearly constrained optimization problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Bondarenko, A.S.; Bortz, D.M.; Moré, J.J.

    2000-02-10

    The authors have started the development of COPS, a collection of large-scale nonlinearly Constrained Optimization Problems. The primary purpose of this collection is to provide difficult test cases for optimization software. Problems in the current version of the collection come from fluid dynamics, population dynamics, optimal design, and optimal control. For each problem they provide a short description of the problem, notes on the formulation of the problem, and results of computational experiments with general optimization solvers. They currently have results for DONLP2, LANCELOT, MINOS, SNOPT, and LOQO.

  8. Global physician budgets as common-property resources: some implications for physicians and medical associations.

    PubMed Central

    Hurley, J; Card, R

    1996-01-01

Since 1990 payment for physician services in the fee-for-service sector has shifted from an open-ended system to fixed global budgets. This shift has created a new economic context for practising medicine in Canada. A global cap creates a conflict between physicians' individual economic self-interest and their collective interest in constraining total billings within the capped budget. These types of incentive problems occur in managing what are known in economics as "common-property resources." Analysts studying common-property resources have documented several management principles associated with successful, long-run use of such resources in the face of these conflicting incentives. These management principles include clearly defining the boundaries of the common-property resource, explicitly specifying rules for using the resource, developing collective decision-making arrangements and monitoring mechanisms, and creating low-cost conflict-resolution mechanisms. The authors argue that global physician budgets can usefully be viewed as common-property resources. They describe some of the key management principles and note some implications for physicians and the provincial and territorial medical associations as they adapt to global budgets. PMID:8612251

  9. Link removal for the control of stochastically evolving epidemics over networks: a comparison of approaches.

    PubMed

    Enns, Eva A; Brandeau, Margaret L

    2015-04-21

For many communicable diseases, knowledge of the underlying contact network through which the disease spreads is essential to determining appropriate control measures. When behavior change is the primary intervention for disease prevention, it is important to understand how to best modify network connectivity using the limited resources available to control disease spread. We describe and compare four algorithms for selecting a limited number of links to remove from a network: two "preventive" approaches (edge centrality, R0 minimization), where the decision of which links to remove is made prior to any disease outbreak and depends only on the network structure; and two "reactive" approaches (S-I edge centrality, optimal quarantining), where information about the initial disease states of the nodes is incorporated into the decision of which links to remove. We evaluate the performance of these algorithms in minimizing the total number of infections that occur over the course of an acute outbreak of disease. We consider different network structures, including both static and dynamic Erdős-Rényi random networks with varying levels of connectivity, a real-world network of residential hotels connected through injection drug use, and a network exhibiting community structure. We show that reactive approaches outperform preventive approaches in averting infections. Among reactive approaches, removing links in order of S-I edge centrality is favored when the link removal budget is small, while optimal quarantining performs best when the link removal budget is sufficiently large. The budget threshold above which optimal quarantining outperforms the S-I edge centrality algorithm is a function of both network structure (higher for unstructured Erdős-Rényi random networks compared to networks with community structure or the real-world network) and disease infectiousness (lower for highly infectious diseases). 
We conduct a value-of-information analysis of knowing which nodes are initially infected by comparing the performance improvement achieved by reactive over preventive strategies. We find that such information is most valuable for moderate budget levels, with increasing value as disease spread becomes more likely (due to either increased connectedness of the network or increased infectiousness of the disease). Copyright © 2015 Elsevier Ltd. All rights reserved.
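The budgeted link-removal structure common to all four algorithms can be sketched in a few lines. The centrality proxy below (product of endpoint degrees) is a simplification standing in for the paper's edge-centrality and quarantining methods, and the small graph is invented for illustration:

```python
# Minimal sketch of a "preventive" link-removal heuristic: score each edge by
# a cheap centrality proxy (product of endpoint degrees) and remove the top-k
# edges under a removal budget. The paper's methods (edge betweenness, R0
# minimization, S-I centrality, optimal quarantining) are more sophisticated;
# this only illustrates the budgeted-removal structure.

def remove_links(edges, budget):
    """Return the edge list after removing `budget` highest-scoring edges."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # Rank edges by degree product, a stand-in for edge centrality.
    ranked = sorted(edges, key=lambda e: deg[e[0]] * deg[e[1]], reverse=True)
    return ranked[budget:]

# A small hub network: node 0 bridges two clusters, so its edges score highest.
edges = [(0, 1), (0, 2), (0, 3), (0, 4), (1, 2), (3, 4)]
kept = remove_links(edges, budget=2)  # removes two hub edges first
```

Removing hub edges first mirrors the intuition that high-centrality links carry the most transmission paths; the reactive algorithms additionally condition on which nodes are already infected.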

  10. Link removal for the control of stochastically evolving epidemics over networks: A comparison of approaches

    PubMed Central

    Brandeau, Margaret L.

    2015-01-01

    For many communicable diseases, knowledge of the underlying contact network through which the disease spreads is essential to determining appropriate control measures. When behavior change is the primary intervention for disease prevention, it is important to understand how to best modify network connectivity using the limited resources available to control disease spread. We describe and compare four algorithms for selecting a limited number of links to remove from a network: two “preventive” approaches (edge centrality, R0 minimization), where the decision of which links to remove is made prior to any disease outbreak and depends only on the network structure; and two “reactive” approaches (S-I edge centrality, optimal quarantining), where information about the initial disease states of the nodes is incorporated into the decision of which links to remove. We evaluate the performance of these algorithms in minimizing the total number of infections that occur over the course of an acute outbreak of disease. We consider different network structures, including both static and dynamic Erdős-Rényi random networks with varying levels of connectivity, a real-world network of residential hotels connected through injection drug use, and a network exhibiting community structure. We show that reactive approaches outperform preventive approaches in averting infections. Among reactive approaches, removing links in order of S-I edge centrality is favored when the link removal budget is small, while optimal quarantining performs best when the link removal budget is sufficiently large. The budget threshold above which optimal quarantining outperforms the S-I edge centrality algorithm is a function of both network structure (higher for unstructured Erdős-Rényi random networks compared to networks with community structure or the real-world network) and disease infectiousness (lower for highly infectious diseases). 
We conduct a value-of-information analysis of knowing which nodes are initially infected by comparing the performance improvement achieved by reactive over preventive strategies. We find that such information is most valuable for moderate budget levels, with increasing value as disease spread becomes more likely (due to either increased connectedness of the network or increased infectiousness of the disease). PMID:25698229

  11. Reserve selection with land market feedbacks.

    PubMed

    Butsic, Van; Lewis, David J; Radeloff, Volker C

    2013-01-15

How to best site reserves is a leading question for conservation biologists. Recent reserve selection work has emphasized efficient conservation: maximizing conservation goals given the reality of limited conservation budgets. That work indicates that land markets can potentially undermine the conservation benefits of reserves by increasing property values and development probabilities near reserves. Here we propose a reserve selection methodology that optimizes conservation given both a budget constraint and land market feedbacks, using a combination of econometric models with stochastic dynamic programming. We show that amenity-based feedbacks can be accounted for in optimal reserve selection by choosing property price and land development models that exogenously estimate the effects of reserve establishment. In our empirical example, we use previously estimated models of land development and property prices to select parcels that maximize coarse woody debris along 16 lakes in Vilas County, WI, USA. Using each lake as an independent experiment, we find that including land market feedbacks in the reserve selection algorithm has only small effects on conservation efficacy. Likewise, we find that in our setting heuristic (minloss and maxgain) algorithms perform nearly as well as the optimal selection strategy. We emphasize that land market feedbacks can be included in optimal reserve selection; the extent to which this improves reserve placement will likely vary across landscapes. Copyright © 2012 Elsevier Ltd. All rights reserved.
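The budget-constrained selection at the core of this problem, and the gap between a "maxgain"-style greedy heuristic and the optimal subset, can be shown with a toy example. The parcel prices and benefits below are hypothetical, and the land-market feedbacks are omitted:

```python
from itertools import combinations

# Hypothetical parcels as (price, conservation benefit) pairs. The study
# optimizes coarse woody debris with land-market feedbacks; this sketch shows
# only the core budget-constrained subset selection and a greedy heuristic of
# the kind the authors compare against the optimal strategy.
parcels = [(4, 9), (3, 7), (2, 5), (3, 4)]
BUDGET = 6

def maxgain_greedy(parcels, budget):
    """Pick parcels in order of benefit per unit cost until the budget runs out."""
    spent, benefit = 0, 0
    for price, gain in sorted(parcels, key=lambda p: p[1] / p[0], reverse=True):
        if spent + price <= budget:
            spent += price
            benefit += gain
    return benefit

def optimal(parcels, budget):
    """Exhaustive search over all subsets (fine for a handful of parcels)."""
    best = 0
    for r in range(len(parcels) + 1):
        for combo in combinations(parcels, r):
            if sum(p for p, _ in combo) <= budget:
                best = max(best, sum(g for _, g in combo))
    return best
```

On this instance the greedy heuristic achieves a benefit of 12 while the optimum is 14, illustrating the paper's finding that heuristics can come close to, without matching, the optimal strategy.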

  12. Generalized bipartite quantum state discrimination problems with sequential measurements

    NASA Astrophysics Data System (ADS)

    Nakahira, Kenji; Kato, Kentaro; Usuda, Tsuyoshi Sasaki

    2018-02-01

    We investigate an optimization problem of finding quantum sequential measurements, which forms a wide class of state discrimination problems with the restriction that only local operations and one-way classical communication are allowed. Sequential measurements from Alice to Bob on a bipartite system are considered. Using the fact that the optimization problem can be formulated as a problem with only Alice's measurement and is convex programming, we derive its dual problem and necessary and sufficient conditions for an optimal solution. Our results are applicable to various practical optimization criteria, including the Bayes criterion, the Neyman-Pearson criterion, and the minimax criterion. In the setting of the problem of finding an optimal global measurement, its dual problem and necessary and sufficient conditions for an optimal solution have been widely used to obtain analytical and numerical expressions for optimal solutions. Similarly, our results are useful to obtain analytical and numerical expressions for optimal sequential measurements. Examples in which our results can be used to obtain an analytical expression for an optimal sequential measurement are provided.

  13. The effects of atmospheric chemistry on radiation budget in the Community Earth Systems Model

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Czader, B.; Diao, L.; Rodriguez, J.; Jeong, G.

    2013-12-01

The Community Earth Systems Model (CESM)-Whole Atmosphere Community Climate Model (WACCM) simulations were performed to study the impact of atmospheric chemistry on the radiation budget at the surface within a weather prediction time scale. A secondary goal was to obtain a simplified, optimized chemistry module for short time periods. Three different chemistry modules were utilized to represent tropospheric and stratospheric chemistry, which differ in how their reactions and species are represented: (1) simplified tropospheric and stratospheric chemistry (approximately 30 species), (2) simplified tropospheric chemistry and comprehensive stratospheric chemistry from the Model of Ozone and Related Chemical Tracers, version 3 (MOZART-3, approximately 60 species), and (3) comprehensive tropospheric and stratospheric chemistry (MOZART-4, approximately 120 species). Our results indicate that the differing levels of detail in the chemistry treatment of these model components affect the surface temperature and the radiation budget.

  14. Hospital non-price competition under the Global Budget Payment and Prospective Payment Systems.

    PubMed

    Chen, Wen-Yi; Lin, Yu-Hui

    2008-06-01

    This paper provides theoretical analyses of two alternative hospital payment systems for controlling medical cost: the Global Budget Payment System (GBPS) and the Prospective Payment System (PPS). The former method assigns a fixed total budget for all healthcare services over a given period with hospitals being paid on a fee-for-service basis. The latter method is usually connected with a fixed payment to hospitals within a Diagnosis-Related Group. Our results demonstrate that, given the same expenditure, the GBPS would approach optimal levels of quality and efficiency as well as the level of social welfare provided by the PPS, as long as market competition is sufficiently high; our results also demonstrate that the treadmill effect, modeling an inverse relationship between price and quantity under the GBPS, would be a quality-enhancing and efficiency-improving outcome due to market competition.

  15. A Cascade Optimization Strategy for Solution of Difficult Multidisciplinary Design Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A research project to comparatively evaluate 10 nonlinear optimization algorithms was recently completed. A conclusion was that no single optimizer could successfully solve all 40 problems in the test bed, even though most optimizers successfully solved at least one-third of the problems. We realized that improved search directions and step lengths, available in the 10 optimizers compared, were not likely to alleviate the convergence difficulties. For the solution of those difficult problems we have devised an alternative approach called cascade optimization strategy. The cascade strategy uses several optimizers, one followed by another in a specified sequence, to solve a problem. A pseudorandom scheme perturbs design variables between the optimizers. The cascade strategy has been tested successfully in the design of supersonic and subsonic aircraft configurations and air-breathing engines for high-speed civil transport applications. These problems could not be successfully solved by an individual optimizer. The cascade optimization strategy, however, generated feasible optimum solutions for both aircraft and engine problems. This paper presents the cascade strategy and solutions to a number of these problems.
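The cascade idea, chaining optimizers with a pseudorandom perturbation of the design variables between stages, can be sketched with toy local searches. The actual NASA strategy chains full nonlinear-programming codes; the two-variable test function below is invented for illustration:

```python
import random

# Minimal sketch of the cascade optimization strategy: run a local optimizer,
# perturb the design variables pseudorandomly, then hand the result to the
# next stage. Here one toy coordinate search plays every stage.

def f(x):  # convex but nonsmooth test function, minimum 0 at (1, 1)
    return (x[0] - 1) ** 2 + (x[1] - 1) ** 2 + 0.1 * abs(x[0] - x[1])

def local_search(obj, x, step, iters=200):
    """Greedy coordinate moves with a shrinking step size."""
    x = list(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if obj(trial) < obj(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5
    return x

def cascade(obj, x0, stages=3, seed=0):
    rng = random.Random(seed)
    x = x0
    for _ in range(stages):
        x = local_search(obj, x, step=0.5)
        # Pseudorandom perturbation between optimizers, as in the cascade strategy.
        x = [xi + rng.uniform(-0.05, 0.05) for xi in x]
    return local_search(obj, x, step=0.1)

best = cascade(f, [5.0, -5.0])
```

The perturbation step is what lets a later stage escape a point where an earlier optimizer stalled, which is the mechanism the paper credits for solving problems no single optimizer could.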

  16. MASM: a market architecture for sensor management in distributed sensor networks

    NASA Astrophysics Data System (ADS)

    Viswanath, Avasarala; Mullen, Tracy; Hall, David; Garga, Amulya

    2005-03-01

Rapid developments in sensor technology and its applications have energized research efforts towards devising a firm theoretical foundation for sensor management. Ubiquitous sensing, wide bandwidth communications, and distributed processing provide both opportunities and challenges for sensor and process control and optimization. Traditional optimization techniques do not have the ability to simultaneously consider the wildly non-commensurate measures involved in sensor management in a single optimization routine. Market-oriented programming provides a valuable and principled paradigm for designing systems to solve this dynamic and distributed resource allocation problem. We have modeled the sensor management scenario as a competitive market, wherein the sensor manager holds a combinatorial auction to sell the various items produced by the sensors and the communication channels. However, standard auction mechanisms have been found not to be directly applicable to the sensor management domain. For this purpose, we have developed a specialized market architecture, MASM (Market architecture for Sensor Management). In MASM, the mission manager is responsible for deciding task allocations to the consumers and their corresponding budgets, and the sensor manager is responsible for resource allocation to the various consumers. In addition to having a modified combinatorial winner determination algorithm, MASM has specialized sensor network modules that address commensurability issues between consumers and producers in the sensor network domain. A preliminary multi-sensor, multi-target simulation environment has been implemented to test the performance of the proposed system. MASM outperformed the information theoretic sensor manager in meeting the mission objectives in the simulation experiments.
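The winner-determination step of such a combinatorial auction can be sketched with a greedy pass that awards non-conflicting bundles in bid order. MASM's modified winner-determination algorithm is more sophisticated, and the bidders and items below are hypothetical:

```python
# Minimal combinatorial-auction winner determination sketch: each consumer
# bids a price for a bundle of sensor/channel items, and a greedy pass awards
# bundles in descending bid order, skipping any bundle that conflicts with
# items already sold. A greedy pass is only a simple stand-in for an exact
# winner-determination algorithm.

def greedy_winners(bids):
    """bids: list of (bidder, set_of_items, price). Returns winning bidders."""
    winners, taken = [], set()
    for bidder, items, price in sorted(bids, key=lambda b: b[2], reverse=True):
        if not (items & taken):  # bundle conflicts with nothing sold so far
            winners.append(bidder)
            taken |= items
    return winners

# Hypothetical tasks bidding for sensor dwells and communication channels.
bids = [
    ("track-A", {"radar1", "chan1"}, 10.0),
    ("track-B", {"radar1", "chan2"}, 8.0),
    ("id-C", {"ir1", "chan2"}, 6.0),
]
winners = greedy_winners(bids)
```

Here "track-B" loses because its bundle shares radar1 with the higher bid; this all-or-nothing bundle allocation is what distinguishes a combinatorial auction from selling items one at a time.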

  17. Staff Dismissal: Problems & Solutions. AASA Critical Issues Report.

    ERIC Educational Resources Information Center

    Neill, Shirley Boes; Custis, Jerry

    This report, addressed to administrators and board members, discusses teacher dismissals in light of such motivating factors as declining enrollment, teacher supply and demand, and budget problems. Divided into nine chapters, this how-to-do-it book discusses some of the following topics: facts and figures on the dismissal of teachers, alternatives…

  18. 78 FR 3433 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ... and by educating the public, especially young people, about tobacco products and the dangers their use... identified. When FDA receives tobacco-specific adverse event and product problem information, it will use the... quality problem, or product use error occurs. This risk identification process is the first necessary step...

  19. Targeting Political Communications: A Problem in Market Segmentation.

    ERIC Educational Resources Information Center

    Markwart, Richard I.

    Political campaigns are major, high-budget marketing efforts, but because they are usually managed by people with little training in either marketing or communications, they fail to persuade voters to vote in the desired way. Political targeting can be treated as a segmentation problem, one of identifying and responding to the specific qualities…

  20. The Lily-White University Presses.

    ERIC Educational Resources Information Center

    Shin, Annys

    1996-01-01

    Argues that the university presses are immune from racial change and discusses the problem of using location as an argument for not being able to lure blacks into university publishing. Howard University Press is used to illustrate the problem of budget cutting and the ability to boost black recruitment efforts or establish a united black press…

  1. Compromises.

    ERIC Educational Resources Information Center

    Sizer, Theodore R.

    1984-01-01

    Taking as examples the issues of improving students'"high order thinking skills" and arriving at more equitable teacher salaries and school budgets, the author discusses the need for compromise solutions to widespread problems. (JBM)

  2. Optimization models for prioritizing bus stop facility investments for riders with disabilities : March 2010.

    DOT National Transportation Integrated Search

    2010-03-01

    The Americans with Disabilities Act (ADA) of 1990 prescribes the minimum requirements for bus stop accessibility by riders with disabilities. Due to limited budgets, transit agencies can only select a limited number of bus stop locations for ADA impr...

  3. Reducing uncertainties in decadal variability of the global carbon budget with multiple datasets

    PubMed Central

    Li, Wei; Ciais, Philippe; Wang, Yilong; Peng, Shushi; Broquet, Grégoire; Ballantyne, Ashley P.; Canadell, Josep G.; Cooper, Leila; Friedlingstein, Pierre; Le Quéré, Corinne; Myneni, Ranga B.; Peters, Glen P.; Piao, Shilong; Pongratz, Julia

    2016-01-01

Conventional calculations of the global carbon budget infer the land sink as a residual between emissions, atmospheric accumulation, and the ocean sink. Thus, the land sink accumulates the errors from the other flux terms and bears the largest uncertainty. Here, we present a Bayesian fusion approach that combines multiple observations in different carbon reservoirs to optimize the land (B) and ocean (O) carbon sinks, land use change emissions (L), and indirectly fossil fuel emissions (F) from 1980 to 2014. Compared with the conventional approach, Bayesian optimization decreases the uncertainties in B by 41% and in O by 46%. The L uncertainty decreases by 47%, whereas F uncertainty is marginally improved through the knowledge of natural fluxes. Both ocean and net land uptake (B + L) rates have positive trends of 29 ± 8 and 37 ± 17 Tg C⋅y⁻² since 1980, respectively. Our Bayesian fusion of multiple observations reduces uncertainties, thereby allowing us to isolate important variability in global carbon cycle processes. PMID:27799533
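The variance reduction behind these uncertainty decreases comes from fusing independent estimates of the same quantity, which for Gaussian errors amounts to precision-weighted averaging. The flux values below are invented for illustration, not taken from the paper:

```python
# Sketch of the core statistical mechanism: fusing independent Gaussian
# estimates of the same quantity by precision (inverse-variance) weighting.
# The paper's Bayesian fusion spans multiple carbon reservoirs and decades;
# the numbers here are hypothetical.

def fuse(estimates):
    """Precision-weighted mean and standard deviation of independent
    Gaussian estimates, given as (mean, standard deviation) pairs."""
    precisions = [1.0 / sd ** 2 for _, sd in estimates]
    total = sum(precisions)
    mean = sum(m * p for (m, _), p in zip(estimates, precisions)) / total
    return mean, (1.0 / total) ** 0.5

# Two hypothetical land-sink estimates (Pg C/yr), e.g. a residual-budget
# value and an independent inventory-based value.
mean, sd = fuse([(2.5, 0.8), (3.1, 0.6)])
```

The fused standard deviation (0.48 here) is always below that of the best single estimate, which is exactly the kind of uncertainty reduction the paper reports for the land and ocean sinks.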

  4. The CPAT 2.0.2 Domain Model - How CPAT 2.0.2 "Thinks" From an Analyst Perspective.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waddell, Lucas; Muldoon, Frank; Melander, Darryl J.

    To help effectively plan the management and modernization of their large and diverse fleets of vehicles, the Program Executive Office Ground Combat Systems (PEO GCS) and the Program Executive Office Combat Support and Combat Service Support (PEO CS &CSS) commissioned the development of a large - scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This reportmore » contains a description of the organizational fleet structure and a thorough explanation of the business rules that the CPAT formulation follows involving performance, scheduling, production, and budgets. This report, which is an update to the original CPAT domain model published in 2015 (SAND2015 - 4009), covers important new CPAT features. This page intentionally left blank« less

  5. A Life Cycle Cost Management Primer for Use within the Aeronautical Systems Division.

    DTIC Science & Technology

    1983-03-01

ACKNOWLEDGEMENTS I would like to express my gratitude and appreciation to Major Tom...scrutiny over and decreased buying power of our nation's defense budget. The DOD, aware of these problems, realizes that it can no longer rely on...today because of the increased scrutiny over and decreased buying power of our nation's defense budget. The Air Force, like other DOD components, can

  6. Multi-Item Multiperiodic Inventory Control Problem with Variable Demand and Discounts: A Particle Swarm Optimization Algorithm

    PubMed Central

    Mousavi, Seyed Mohsen; Niaki, S. T. A.; Bahreininejad, Ardeshir; Musa, Siti Nurmaya

    2014-01-01

A multi-item multiperiod inventory control model is developed for known (deterministic) variable demands under a limited available budget. Assuming the order quantity exceeds the shortage quantity in each period, shortages are considered as a combination of backorders and lost sales. The orders are placed in batch sizes and the decision variables are assumed integer. Moreover, all-units discounts for a number of products and incremental quantity discounts for some other items are considered. While the objectives are to minimize both the total inventory cost and the required storage space, the model is formulated in a fuzzy multicriteria decision making (FMCDM) framework and is shown to be of mixed integer nonlinear programming type. To solve the model, a multiobjective particle swarm optimization (MOPSO) approach is applied. A set of compromise solutions, including optimal and near-optimal ones, is derived via MOPSO for a numerical illustration, where the results are compared with those obtained using a weighting approach. To assess the efficiency of the proposed MOPSO, the model is also solved using a multi-objective genetic algorithm (MOGA). A large number of numerical examples are generated at the end, where graphical and statistical comparisons show the greater efficiency of MOPSO compared with MOGA. PMID:25093195
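The update rule underlying particle swarm optimization can be shown with a single-objective swarm on a toy continuous function; the paper's multi-objective variant additionally maintains an archive of nondominated solutions and handles the integer and fuzzy aspects of the inventory model:

```python
import random

# Minimal single-objective PSO sketch: each particle's velocity is pulled
# toward its own best position (pbest) and the swarm's best (gbest), with an
# inertia term damping the motion. All parameter values are conventional
# choices, not the paper's settings.

def pso(f, dim, n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

# Sphere function: minimum 0 at the origin.
best = pso(lambda x: sum(xi * xi for xi in x), dim=2)
```

Because the only requirement on `f` is that it can be evaluated and compared, the same skeleton extends naturally to the paper's mixed-integer, multi-objective setting.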

  7. Cost- and reliability-oriented aggregation point association in long-term evolution and passive optical network hybrid access infrastructure for smart grid neighborhood area network

    NASA Astrophysics Data System (ADS)

    Cheng, Xiao; Feng, Lei; Zhou, Fanqin; Wei, Lei; Yu, Peng; Li, Wenjing

    2018-02-01

With the rapid development of the smart grid, the data aggregation point (AP) in the neighborhood area network (NAN) is becoming increasingly important for forwarding information between the home area network and the wide area network. Due to a limited budget, no single access technology can meet the ongoing requirements on AP coverage. This paper first introduces a wired and wireless hybrid access network integrating long-term evolution (LTE) and passive optical network (PON) systems for the NAN, which allows a good trade-off among cost, flexibility, and reliability. Then, based on the already existing wireless LTE network, an AP association optimization model is proposed to make the PON serve as many APs as possible, considering both economic efficiency and network reliability. Moreover, given the features of the constraints and variables of this NP-hard problem, a hybrid intelligent optimization algorithm is proposed that mixes genetic, ant colony, and dynamic greedy algorithms. By comparison with other published methods, simulation results verify the performance of the proposed method in improving AP coverage and the convergence behavior of the proposed algorithm.

  8. A Time-Regularized, Multiple Gravity-Assist Low-Thrust, Bounded-Impulse Model for Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Ellison, Donald H.; Englander, Jacob A.; Conway, Bruce A.

    2017-01-01

The multiple gravity assist low-thrust (MGALT) trajectory model combines the medium-fidelity Sims-Flanagan bounded-impulse transcription with a patched-conics flyby model and is an important tool for preliminary trajectory design. While this model features fast state propagation via Kepler's equation and provides a pleasingly accurate estimation of the total mass budget for the eventual flight-suitable integrated trajectory, it does suffer from one major drawback, namely its temporal spacing of the control nodes. We introduce a variant of the MGALT transcription that utilizes the generalized anomaly from the universal formulation of Kepler's equation as a decision variable in addition to the trajectory phase propagation time. This results in two improvements over the traditional model. The first is that the maneuver locations are equally spaced in generalized anomaly about the orbit rather than time. The second is that the Kepler propagator now has the generalized anomaly as its independent variable instead of time and thus becomes an iteration-free propagation method. The new algorithm is outlined, including the impact that this has on the computation of Jacobian entries for numerical optimization, and a motivating application problem is presented that illustrates the improvements that this model has over the traditional MGALT transcription.

  9. A Time-Regularized Multiple Gravity-Assist Low-Thrust Bounded-Impulse Model for Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Ellison, Donald H.; Englander, Jacob A.; Conway, Bruce A.

    2017-01-01

    The multiple gravity assist low-thrust (MGALT) trajectory model combines the medium-fidelity Sims-Flanagan bounded-impulse transcription with a patched-conics flyby model and is an important tool for preliminary trajectory design. While this model features fast state propagation via Kepler's equation and provides a pleasingly accurate estimation of the total mass budget for the eventual flight-suitable integrated trajectory it does suffer from one major drawback, namely its temporal spacing of the control nodes. We introduce a variant of the MGALT transcription that utilizes the generalized anomaly from the universal formulation of Kepler's equation as a decision variable in addition to the trajectory phase propagation time. This results in two improvements over the traditional model. The first is that the maneuver locations are equally spaced in generalized anomaly about the orbit rather than time. The second is that the Kepler propagator now has the generalized anomaly as its independent variable instead of time and thus becomes an iteration-free propagation method. The new algorithm is outlined, including the impact that this has on the computation of Jacobian entries for numerical optimization, and a motivating application problem is presented that illustrates the improvements that this model has over the traditional MGALT transcription.
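The iteration-free propagation claim can be illustrated with the classical elliptic form of Kepler's equation: going from time (mean anomaly) to eccentric anomaly requires Newton iteration, while the reverse direction is a closed-form evaluation. The MGALT variant works with the generalized anomaly of the universal formulation, but the elliptic case below shows the same asymmetry:

```python
import math

# With time as the independent variable, each node requires solving Kepler's
# equation M = E - e*sin(E) for E iteratively; with the anomaly as the
# decision variable, the corresponding mean anomaly (time) follows directly.

def eccentric_anomaly(M, e, tol=1e-12):
    """Solve Kepler's equation for E given mean anomaly M (Newton iteration)."""
    E = M  # reasonable starting guess for moderate eccentricity
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def mean_anomaly(E, e):
    """Inverse direction: mean anomaly (time) from E -- no iteration needed."""
    return E - e * math.sin(E)

e = 0.3
E = eccentric_anomaly(mean_anomaly(1.0, e), e)  # round-trips to E = 1.0
```

Making the anomaly the decision variable moves every propagation onto the cheap, non-iterative `mean_anomaly` direction, which is the second improvement the abstract describes.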

  10. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular, tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment, which allows sub-system performance specifications to be analyzed parametrically and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and nonlinear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. 
The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which directly links to science requirements.
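The 'root-sum-of-squares' error tree that the abstract calls insufficient combines independent error contributors as the square root of the sum of their squares. A minimal sketch, with illustrative numbers (not values from the paper):

```python
import math

def rss(terms):
    """Root-sum-of-squares combination of independent error contributions."""
    return math.sqrt(sum(t * t for t in terms))

# Hypothetical wavefront-error contributors, in nanometers RMS.
contributors = {"thermal drift": 40.0, "figure error": 55.0, "alignment": 30.0}
total = rss(contributors.values())
print(round(total, 1))  # combined RMS wavefront error budget
```

RSS is valid only when the contributors are statistically independent; correlated terms (common in tightly coupled opto-thermal-structural designs) are exactly what pushes a project toward the integrated simulation approach the abstract describes.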

  11. Light budget and optimization strategies for display applications of dichroic nematic droplet/polymer films

    NASA Astrophysics Data System (ADS)

    Drzaic, Paul S.

    1991-06-01

    A detailed light budget is performed on reflective displays made from nematic droplet/polymer (NCAP) films incorporating dichroic dyes. It is shown that the radiance of these displays can be modeled using a simple scheme involving the convolution of the transmission spectra of the various components of the display. Comparisons are made between the relative importance of these optical elements in the display. It is shown that first-surface reflection (glare) is an important factor in reducing the optical performance of these displays, but that the effect of glare can be minimized through the proper choice of dye concentration.

  12. Algorithms for bilevel optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Dennis, J. E., Jr.

    1994-01-01

    General multilevel nonlinear optimization problems arise in design of complex systems and can be used as a means of regularization for multi-criteria optimization problems. Here, for clarity in displaying our ideas, we restrict ourselves to general bi-level optimization problems, and we present two solution approaches. Both approaches use a trust-region globalization strategy, and they can be easily extended to handle the general multilevel problem. We make no convexity assumptions, but we do assume that the problem has a nondegenerate feasible set. We consider necessary optimality conditions for the bi-level problem formulations and discuss results that can be extended to obtain multilevel optimization formulations with constraints at each level.

  13. Near-Optimal Guidance Method for Maximizing the Reachable Domain of Gliding Aircraft

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Takeshi

    This paper proposes a guidance method for gliding aircraft by using onboard computers to calculate a near-optimal trajectory in real-time, and thereby expanding the reachable domain. The results are applicable to advanced aircraft and future space transportation systems that require high safety. The calculation load of the optimal control problem that is used to maximize the reachable domain is too large for current computers to calculate in real-time. Thus the optimal control problem is divided into two problems: a gliding distance maximization problem in which the aircraft motion is limited to a vertical plane, and an optimal turning flight problem in a horizontal direction. First, the former problem is solved using a shooting method. It can be solved easily because its scale is smaller than that of the original problem, and because some of the features of the optimal solution are obtained in the first part of this paper. Next, in the latter problem, the optimal bank angle is computed from the solution of the former; this is an analytical computation, rather than an iterative computation. Finally, the reachable domain obtained from the proposed near-optimal guidance method is compared with that obtained from the original optimal control problem.
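The shooting method used for the gliding-distance subproblem iterates on unknown initial conditions until a terminal boundary condition is met. A generic sketch on a toy two-point boundary value problem (not the paper's glider dynamics), assuming SciPy:

```python
# Minimal shooting-method sketch: find the initial slope s so that the
# trajectory of y'' = -y with y(0) = 0 hits the boundary condition y(1) = 1.
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

def residual(s):
    sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0.0, 1.0), [0.0, s],
                    rtol=1e-9, atol=1e-9)
    return sol.y[0, -1] - 1.0  # miss distance at the far boundary

slope = brentq(residual, 0.1, 5.0)  # root of the miss distance
print(round(slope, 4))              # analytic answer: 1/sin(1) ~ 1.1884
```

Each residual evaluation is one forward integration, which is why shooting stays cheap enough for the small, decoupled subproblems the paper solves onboard.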

  14. NASA/Howard University Large Space Structures Institute

    NASA Technical Reports Server (NTRS)

    Broome, T. H., Jr.

    1984-01-01

    Basic research on the engineering behavior of large space structures is presented. Methods of structural analysis, control, and optimization of large flexible systems are examined. Topics of investigation include the Load Correction Method (LCM) modeling technique, stabilization of flexible bodies by feedback control, mathematical refinement of analysis equations, optimization of the design of structural components, deployment dynamics, and the use of microprocessors in attitude and shape control of large space structures. Information on key personnel, budgeting, support plans and conferences is included.

  15. [Design of medical devices management system supporting full life-cycle process management].

    PubMed

    Su, Peng; Zhong, Jianping

    2014-03-01

Based on an analysis of the present status of medical device management, this paper optimizes the management process and describes a Web-based medical device management system. The system uses information technology to dynamically track the state of use of medical devices over their entire life cycle. Through closed-loop management with pre-event budgeting, mid-event control, and after-event analysis, it improves the precision of medical device management, optimizes asset allocation, and promotes the productive operation of devices.

  16. The Effect of Temperature on the Optimization of Photovoltaic Cells Using Silvaco ATLAS Modeling

    DTIC Science & Technology

    2010-09-01

    September 2010; Master's Thesis. The Effect of Temperature on the Optimization of... light source used in this thesis. The sun provides an average of 135 mW/cm2 of input power for objects in orbit around the Earth and as much as 100 mW

  17. A New Model of Transportation Service for Student with Disabilities

    ERIC Educational Resources Information Center

    Meslin, Pete

    2011-01-01

    With tighter education budgets for support service, some districts must consider other means of providing transportation service for students with disabilities. Some districts have used creative strategies, such as optimizing class locations, sharing service with other districts, using other modes of transportation, and consulting transportation…

  18. Direct Method Transcription for a Human-Class Translunar Injection Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Witzberger, Kevin E.; Zeiler, Tom

    2012-01-01

    This paper presents a new trajectory optimization software package developed in the framework of a low-to-high fidelity 3 degrees-of-freedom (DOF)/6-DOF vehicle simulation program named Mission Analysis Simulation Tool in Fortran (MASTIF), and its application to a translunar trajectory optimization problem. The functionality of the developed optimization package is implemented as a new "mode" in generalized settings to make it applicable to a general trajectory optimization problem. In doing so, a direct optimization method using collocation is employed for solving the problem. Trajectory optimization problems in MASTIF are transcribed to a constrained nonlinear programming (NLP) problem and solved with SNOPT, a commercially available NLP solver. A detailed description of the optimization software developed is provided, as well as the transcription specifics for the translunar injection (TLI) problem. The analysis includes a 3-DOF TLI trajectory optimization and a 3-DOF TLI vehicle simulation using closed-loop guidance.

  19. Optimizing Barrier Removal to Restore Connectivity in Utah's Weber Basin

    NASA Astrophysics Data System (ADS)

    Kraft, M.; Null, S. E.

    2016-12-01

    Instream barriers, such as dams, culverts, and diversions, are economically important for water supply but negatively affect river ecosystems and disrupt hydrologic processes. Removal of uneconomical and aging in-stream barriers is increasingly used to restore river connectivity and improve habitat. Most past barrier removal projects focused on individual barriers using a score-and-rank technique, ignoring the cumulative change from multiple, spatially connected barrier removals. Similarly, most water supply models optimize either human water use or aquatic connectivity, failing to holistically represent human and environmental benefits. In this study, a dual-objective optimization model identifies in-stream barriers that impede aquatic habitat connectivity for trout, using streamflow, temperature, and channel gradient as indicators of aquatic habitat suitability. Water scarcity costs are minimized using agricultural and urban economic penalty functions to incorporate water supply benefits, and a budget constraint monetizes the costs of removing small barriers like culverts and road crossings. The optimization model is applied to a case study in Utah's Weber basin to prioritize removal of the most environmentally harmful barriers while maintaining human water uses. The dual-objective solution quantifies and graphically visualizes tradeoffs between connected quality-weighted habitat for Bonneville cutthroat trout and economic water uses. Modeled results include a spectrum of barrier removal alternatives, based on budget and quality-weighted reconnected habitat, that can be communicated with local stakeholders. This research will help prioritize barrier removals and future restoration decisions. The modeling approach expands current barrier removal optimization methods by explicitly including economic and environmental water uses.

  20. Optimal Electrical Energy Slewing for Reaction Wheel Spacecraft

    NASA Astrophysics Data System (ADS)

    Marsh, Harleigh Christian

    The results contained in this dissertation contribute to a deeper understanding of the energy required to slew a spacecraft using reaction wheels. This work addresses the fundamental manner in which spacecraft are slewed (eigenaxis maneuvering), and demonstrates that this conventional maneuver can be dramatically improved upon with regard to energy, dissipative losses, and peak power. Energy is a fundamental resource that affects every asset, system, and subsystem upon a spacecraft, from the attitude control system which orients the spacecraft, to the communication subsystem that links with ground stations, to the payloads which collect scientific data. For a reaction wheel spacecraft, the attitude control system is a particularly heavy load on the power and energy resources of the spacecraft. The central focus of this dissertation is reducing the burden which the attitude control system places upon the spacecraft in terms of electrical energy, which is shown in this dissertation to be a challenging problem to solve and analyze computationally. Reducing power and energy demands can have a multitude of benefits, spanning from the initial design phase, to in-flight operations, to potentially extending the mission life of the spacecraft. This goal is approached from a practical standpoint appropriate to an industry flight setting. Metrics to measure electrical energy and power are developed which are in line with the cost associated with operating reaction wheel based attitude control systems. These metrics are incorporated into multiple families of practical high-dimensional constrained nonlinear optimal control problems to reduce the electrical energy, as well as the instantaneous power burdens, imposed by the attitude control system upon the spacecraft. Minimizing electrical energy is shown to be a problem in L1 optimal control which is nonsmooth with respect to the state variables as well as the control.
To overcome the challenge of nonsmoothness, a method is adopted in this dissertation to transform the nonsmooth minimum electrical energy problem into an equivalent smooth formulation, which then allows standard techniques in optimal control to solve and analyze the problem. Through numerically solving families of optimal control problems, the relationship between electrical energy and transfer time is identified and explored for both off- and on-eigenaxis maneuvering, under minimum dissipative losses as well as under minimum electrical energy. A trade space between on- and off-eigenaxis maneuvering is identified, from which it is shown that agile, near-time-optimal maneuvers exist within the energy budget associated with conventional eigenaxis maneuvering. Moreover, even for conventional eigenaxis maneuvering, energy requirements can be dramatically reduced by maneuvering off-eigenaxis. These results address one of the fundamental assumptions in the field of optimal path design versus conventional maneuver design. Two practical flight situations are addressed in this dissertation with regard to reducing energy and power: the case when the attitude of the spacecraft is predetermined, and the case where the reaction wheels cannot be directly controlled. For the setting where the attitude of the spacecraft is on a predefined trajectory, it is demonstrated that reduced-energy maneuvers are only attainable through the application of null motions, which requires control of the reaction wheels. A computationally light formulation is developed that minimizes the dissipative losses through the application of null motions. In the situation where the reaction wheels cannot be directly controlled, it is demonstrated that the energy consumption, dissipative losses, and peak-power loads of the reaction-wheel array can each be reduced substantially by controlling the input to the attitude control system through attitude steering.
It is demonstrated that the open loop trajectories correctly predict the closed loop response when tracked by an attitude control system which does not allow direct command of the reaction wheels.
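The transformation of a nonsmooth L1 objective into an equivalent smooth problem is commonly done with slack variables: minimizing Σ|u_i| subject to Au = b becomes minimizing Σs_i subject to -s ≤ u ≤ s. A generic sketch (not the dissertation's spacecraft formulation), assuming SciPy:

```python
# Slack-variable reformulation of an L1 objective as a smooth linear program:
# minimize sum |u_i|  s.t.  A u = b
# becomes
# minimize sum s_i    s.t.  A u = b,  u - s <= 0,  -u - s <= 0.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0, 1.0]])  # one linear constraint: u1 + u2 + u3 = 1
b = np.array([1.0])
n = A.shape[1]

# Decision vector is [u, s]; the objective touches only the slacks s.
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros_like(A)])
A_ub = np.block([[np.eye(n), -np.eye(n)],    # u - s <= 0
                 [-np.eye(n), -np.eye(n)]])  # -u - s <= 0
b_ub = np.zeros(2 * n)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * (2 * n))
print(round(res.fun, 6))  # minimal L1 norm satisfying the constraint: 1.0
```

At the optimum each s_i is driven down to |u_i|, so the smooth program's value equals the original L1 norm while every constraint and the objective stay linear.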

  1. Multidisciplinary Optimization of a Transport Aircraft Wing using Particle Swarm Optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Venter, Gerhard

    2002-01-01

    The purpose of this paper is to demonstrate the application of particle swarm optimization to a realistic multidisciplinary optimization test problem. The paper's new contributions to multidisciplinary optimization are the application of a new algorithm for dealing with the unique challenges associated with multidisciplinary optimization problems, and recommendations as to the utility of the algorithm in future multidisciplinary optimization applications. The selected example is a bi-level optimization problem that exhibits severe numerical noise and has a combination of continuous and truly discrete design variables. The use of traditional gradient-based optimization algorithms is thus not practical. The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented here. The algorithm is capable of dealing with the unique challenges posed by multidisciplinary optimization as well as the numerical noise and truly discrete variables present in the current example problem.
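A minimal particle swarm optimizer, shown here on the sphere function as a generic sketch (the paper's multidisciplinary implementation and parameter choices are not reproduced):

```python
# Minimal particle swarm optimization: each particle is pulled toward its own
# best position (pbest) and the swarm's best position (gbest); no gradients
# are used, which is why PSO tolerates noise and discrete variables.
import numpy as np

rng = np.random.default_rng(0)

def pso(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # positions
    v = np.zeros_like(x)                            # velocities
    pbest = x.copy()
    pbest_f = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best_x, best_f = pso(lambda z: float(np.sum(z ** 2)))
print(best_f)  # close to 0, the minimum of the sphere function at the origin
```

The inertia weight w and the cognitive/social coefficients c1, c2 used here are common textbook defaults, not values from the paper.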

  2. Real-Life Research: Project Runway Makeover Model

    ERIC Educational Resources Information Center

    Jaeger, Paige; Nesi, Olga M.

    2014-01-01

    Real-life research is incredibly varied. We research cars. We research lawn problems. We research child behavior problems, health issues, possible vacation destinations, and prices to stretch our budgets. No two scenarios are ever alike, and no two health issues should be assumed to be the same. That is reality, and that is a picture of what the…

  3. 77 FR 69506 - Agency Information Collection Activities: Submission for the Office of Management and Budget (OMB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-19

    ... conduct or sponsor, and that a person is not required to respond to, a collection of information unless it... Non-Routine Emergency Generic Problems. 3. Current OMB approval number: 3150-0012. 4. The form number... possible non-routine generic problems which would require prompt action from the NRC to preclude potential...

  4. Measuring forest evapotranspiration--theory and problems

    Treesearch

    Anthony C. Federer; Anthony C. Federer

    1970-01-01

    A satisfactory general method of measuring forest evapotranspiration has yet to be developed. Many procedures have been tried, but only the soil-water budget method and the micrometeorological methods offer any degree of success. This paper is a discussion of these procedures and the problems that arise in applying them. It is designed as a reference for scientists and...

  5. In Swimming Branch, Identification of Family Burden of Families of Mentally Disabled Athletes

    ERIC Educational Resources Information Center

    Nacar, Eyyup; Karahuseyinoglu, M. Fatih; Karatas, Baykal; Altungul, Oguzhan

    2017-01-01

    Families of individuals with Down syndrome, autism, or other mental disabilities who need special support experience physical strain, tiredness, and a restricted social life, and the difficulties of their children bring additional costs to the family budget from time to time. The aim of this study is to identify family burdens charged by kids with…

  6. Optimal recombination in genetic algorithms for flowshop scheduling problems

    NASA Astrophysics Data System (ADS)

    Kovalenko, Julia

    2016-10-01

    The optimal recombination problem consists in finding the best possible offspring that can result from a recombination operator in a genetic algorithm, given two parent solutions. We prove NP-hardness of optimal recombination for various variants of the flowshop scheduling problem with the makespan criterion and the criterion of maximum lateness. An algorithm for solving the optimal recombination problem for permutation flowshop problems is built, using enumeration of perfect matchings in a special bipartite graph. The algorithm is adapted for the classical flowshop scheduling problem and for the no-wait flowshop problem. It is shown that the optimal recombination problem for the permutation flowshop scheduling problem is solvable in polynomial time for almost all pairs of parent solutions as the number of jobs tends to infinity.
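The makespan criterion for a permutation flowshop can be evaluated with the standard completion-time recurrence C(j, m) = max(C(j-1, m), C(j, m-1)) + p(j, m). A small illustration with toy data (not from the paper):

```python
# Makespan of a permutation flowshop schedule: every job visits the machines
# in order, and a job starts on machine m only after it finishes machine m-1
# and the previous job has left machine m.
def makespan(perm, p):
    """p[j][m] = processing time of job j on machine m."""
    n_machines = len(p[0])
    c = [0.0] * n_machines  # completion time of the latest job on each machine
    for j in perm:
        for m in range(n_machines):
            prev = c[m - 1] if m > 0 else 0.0
            c[m] = max(c[m], prev) + p[j][m]
    return c[-1]

# Three jobs, two machines (toy data).
times = [[3, 2], [1, 4], [2, 2]]
print(makespan([1, 0, 2], times))  # -> 9
```

Optimal recombination would search, among offspring consistent with two parent permutations, for the one minimizing this value; the evaluation above is only the objective, not the recombination algorithm itself.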

  7. Toward the Library-Bookstore

    ERIC Educational Resources Information Center

    Severtson, Susan; Banks, George

    1971-01-01

    Close collaboration between library and bookstore (or a complete merger of the two) could do much to solve the immediate problems faced by many small colleges with small library collections and limited budgets. (MF)

  8. Mouth Problems and HIV

    MedlinePlus


  9. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    PubMed

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
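The "computationally expensive nested procedure" mentioned above solves the lower-level problem to optimality at every upper-level evaluation. A toy single-objective illustration (not one of the paper's test problems), assuming SciPy:

```python
# Nested bilevel solve in miniature: the leader optimizes x, and every
# evaluation of the leader's objective first solves the follower's problem.
from scipy.optimize import minimize_scalar

def lower_level(x):
    # Follower: y*(x) = argmin_y (y - x)**2 + y, i.e. y* = x - 0.5.
    return minimize_scalar(lambda y: (y - x) ** 2 + y).x

def upper_objective(x):
    y = lower_level(x)  # the follower reacts optimally to the leader's x
    return (x - 1.0) ** 2 + (y - 1.0) ** 2

res = minimize_scalar(upper_objective, bounds=(-5.0, 5.0), method="bounded")
print(round(res.x, 3), round(lower_level(res.x), 3))  # analytic: x=1.25, y=0.75
```

Each upper-level evaluation costs a full lower-level solve, which is exactly the expense that motivates the evolutionary and hybrid local-search alternatives studied in the paper.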

  10. Calibrated models as management tools for stream-aquifer systems: the case of central Kansas, USA

    NASA Astrophysics Data System (ADS)

    Sophocleous, Marios; Perkins, Samuel P.

    1993-12-01

    We address the problem of declining streamflows in interconnected stream-aquifer systems and explore possible management options to address the problem for two areas of central Kansas: the Arkansas River valley from Kinsley to Great Bend and the lower Rattlesnake Creek-Quivira National Wildlife Refuge area. The approach we followed implements, calibrates, and partially validates for the study areas a stream-aquifer numerical model combined with a parameter estimation package and sensitivity analysis. Hydrologic budgets for both predevelopment and developed conditions indicate significant differences in the hydrologic components of the study areas resulting from development. The predevelopment water budgets give an estimate of natural ground-water recharge, whereas the budgets for developed conditions give an estimate of induced recharge, indicating that major ground-water development changes the recharge-discharge regime of the model areas with time. Such stream-aquifer models serve to link proposed actions to hydrologic effects, as is clearly demonstrated by the effects of various management alternatives on the streamflows of the Arkansas River and Rattlesnake Creek. Thus we show that a possible means of restoring specified streamflows in the area is to implement protective stream corridors with restricted ground-water extraction.

  11. Calibrated models as management tools for stream-aquifer systems: the case of central Kansas, USA

    USGS Publications Warehouse

    Sophocleous, M.; Perkins, S.P.

    1993-01-01

    We address the problem of declining streamflows in interconnected stream-aquifer systems and explore possible management options to address the problem for two areas of central Kansas: the Arkansas River valley from Kinsley to Great Bend and the lower Rattlesnake Creek-Quivira National Wildlife Refuge area. The approach we followed implements, calibrates, and partially validates for the study areas a stream-aquifer numerical model combined with a parameter estimation package and sensitivity analysis. Hydrologic budgets for both predevelopment and developed conditions indicate significant differences in the hydrologic components of the study areas resulting from development. The predevelopment water budgets give an estimate of natural ground-water recharge, whereas the budgets for developed conditions give an estimate of induced recharge, indicating that major ground-water development changes the recharge-discharge regime of the model areas with time. Such stream-aquifer models serve to link proposed actions to hydrologic effects, as is clearly demonstrated by the effects of various management alternatives on the streamflows of the Arkansas River and Rattlesnake Creek. Thus we show that a possible means of restoring specified streamflows in the area is to implement protective stream corridors with restricted ground-water extraction. © 1993.

  12. Cost effectiveness of the US Geological Survey stream-gaging program in Alabama

    USGS Publications Warehouse

    Jeffcoat, H.H.

    1987-01-01

    A study of the cost effectiveness of the stream gaging program in Alabama identified data uses and funding sources for 72 surface water stations (including dam stations, slope stations, and continuous-velocity stations) operated by the U.S. Geological Survey in Alabama with a budget of $393,600. Of these, 58 gaging stations were used in all phases of the analysis at a funding level of $328,380. For the current policy of operation of the 58-station program, the average standard error of estimation of instantaneous discharge is 29.3%. This overall level of accuracy can be maintained with a budget of $319,800 by optimizing routes and implementing some policy changes. The maximum budget considered in the analysis was $361,200, which gave an average standard error of estimation of 20.6%. The minimum budget considered was $299,360, with an average standard error of estimation of 36.5%. The study indicates that a major source of error in the stream gaging records is lost or missing data that are the result of streamside equipment failure. If perfect equipment were available, the standard error in estimating instantaneous discharge under the current program and budget could be reduced to 18.6%. This can also be interpreted to mean that the streamflow data records have a standard error of this magnitude during times when the equipment is operating properly. (Author's abstract)

  13. Outlook 2011

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2011-01-01

    Many people view the coming of January 1 each year as an opportunity for a new beginning. The conditions that are facing school and university administrators and educators make it difficult to approach the coming year with optimism. The effects of the recession that crippled state and local budgets have yet to ease, and schools and universities…

  14. Ace Project as a Project Management Tool

    ERIC Educational Resources Information Center

    Cline, Melinda; Guynes, Carl S.; Simard, Karine

    2010-01-01

    The primary challenge of project management is to achieve the project goals and objectives while adhering to project constraints--usually scope, quality, time and budget. The secondary challenge is to optimize the allocation and integration of resources necessary to meet pre-defined objectives. Project management software provides an active…

  15. On the Allocation of Resources for Secondary Schools

    ERIC Educational Resources Information Center

    Haelermans, Carla; De Witte, Kristof; Blank, Jos L. T.

    2012-01-01

    This paper studies the optimal allocation of resources--in terms of school management, teachers, supporting employees and materials--in secondary schools. We use a flexible budget constrained output distance function model to estimate both technical and allocative efficiency scores for 448 Dutch secondary schools between 2002 and 2007. The results…

  16. Thriving through Recession: Higher Education in a down Economy

    ERIC Educational Resources Information Center

    Goodman, Roger

    2009-01-01

    The constant flow of alarming economic and business news, rapidly declining endowments and potential disruption to the student-loan industry have all beaten down optimism about higher education's financial and strategic outlook. Universities large and small have announced budget cuts, layoffs, salary freezes, capital spending slowdowns and other…

  17. On CALL: One Approach to Improving Services for Students with Low-Incidence Disabilities

    ERIC Educational Resources Information Center

    Bradley-Johnson, Sharon; Johnson, C. Merle; Drevon, Daniel D.

    2015-01-01

    Students with low-incidence disabilities frequently receive less than optimal psychoeducational services because the specialized tests and instructional materials required to meet their idiosyncratic needs often are unavailable due to budget constraints, inadequate training of school personnel, and the difficulty school personnel have keeping…

  18. Resource-Bounded Information Gathering for Correlation Clustering

    DTIC Science & Technology

    2007-01-01

    5], budgeted learning, [4], and active learning, for example, [3]. 3 Acknowledgments We thank Avrim Blum, Katrina Ligett, Chris Pal, Sridhar...2007 3. N. Roy, A. McCallum, Toward Optimal Active Learning through Sampling Estimation of Error Reduction, Proc. of 18th ICML, 2001 4. A. Kapoor, R

  19. Optimal Design for Two-Level Random Assignment and Regression Discontinuity Studies

    ERIC Educational Resources Information Center

    Rhoads, Christopher H.; Dye, Charles

    2016-01-01

    An important concern when planning research studies is to obtain maximum precision of an estimate of a treatment effect given a budget constraint. When research designs have a "multilevel" or "hierarchical" structure changes in sample size at different levels of the design will impact precision differently. Furthermore, there…

  20. Finite dimensional approximation of a class of constrained nonlinear optimal control problems

    NASA Technical Reports Server (NTRS)

    Gunzburger, Max D.; Hou, L. S.

    1994-01-01

    An abstract framework for the analysis and approximation of a class of nonlinear optimal control and optimization problems is constructed. Nonlinearities occur in both the objective functional and in the constraints. The framework includes an abstract nonlinear optimization problem posed on infinite dimensional spaces, and approximate problem posed on finite dimensional spaces, together with a number of hypotheses concerning the two problems. The framework is used to show that optimal solutions exist, to show that Lagrange multipliers may be used to enforce the constraints, to derive an optimality system from which optimal states and controls may be deduced, and to derive existence results and error estimates for solutions of the approximate problem. The abstract framework and the results derived from that framework are then applied to three concrete control or optimization problems and their approximation by finite element methods. The first involves the von Karman plate equations of nonlinear elasticity, the second, the Ginzburg-Landau equations of superconductivity, and the third, the Navier-Stokes equations for incompressible, viscous flows.

  1. Deferred Maintenance Strategies You Must Try.

    ERIC Educational Resources Information Center

    Sturgeon, Julie

    2000-01-01

    Discusses how some college administrators have found "over-the-counter" cures for deferred maintenance problems resulting from budgeting shortfalls. Catch-up approaches discussed include those involving planning, prioritizing, partnering, and gift giving. (GR)

  2. Optimal conservation resource allocation under variable economic and ecological time discounting rates in boreal forest.

    PubMed

    Mazziotta, Adriano; Pouzols, Federico Montesino; Mönkkönen, Mikko; Kotiaho, Janne S; Strandman, Harri; Moilanen, Atte

    2016-09-15

    Resource allocation to multiple alternative conservation actions is a complex task. A common trade-off occurs between protection of smaller, expensive, high-quality areas versus larger, cheaper, partially degraded areas. We investigate optimal allocation into three actions in boreal forest: current standard forest management rules, setting aside of mature stands, or setting aside of clear-cuts. We first estimated how habitat availability for focal indicator species and economic returns from timber harvesting develop through time as a function of forest type and action chosen. We then developed an optimal resource allocation by accounting for budget size and habitat availability of indicator species in different forest types. We also accounted for the perspective adopted towards sustainability, modeled via temporal preference and economic and ecological time discounting. Controversially, we found that in boreal forest set-aside followed by protection of clear-cuts can become a winning cost-effective strategy when accounting for habitat requirements of multiple species, long planning horizon, and limited budget. It is particularly effective when adopting a long-term sustainability perspective, and accounting for present revenues from timber harvesting. The present analysis assesses the cost-effective conditions to allocate resources into an inexpensive conservation strategy that nevertheless has potential to produce high ecological values in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Optimizing insecticide allocation strategies based on houses and livestock shelters for visceral leishmaniasis control in Bihar, India.

    PubMed

    Gorahava, Kaushik K; Rosenberger, Jay M; Mubayi, Anuj

    2015-07-01

    Visceral leishmaniasis (VL) is the most deadly form of the leishmaniasis family of diseases, which affects numerous developing countries. The Indian state of Bihar has the highest prevalence and mortality rate of VL in the world. Insecticide spraying is believed to be an effective vector control program for controlling the spread of VL in Bihar; however, it is expensive and less effective if not implemented systematically. This study develops and analyzes a novel optimization model for VL control in Bihar that identifies an optimal (best possible) allocation of a chosen insecticide (dichlorodiphenyltrichloroethane [DDT] or deltamethrin) based on the sizes of the human and cattle populations in the region. The model maximizes the insecticide-induced sandfly death rate in human and cattle dwellings while staying within the current state budget for VL vector control efforts. The model results suggest that deltamethrin might not be a good replacement for DDT, because insecticide-induced sandfly deaths are 3.72 times higher with DDT, even 90 days post spray. Different insecticide allocation strategies between the two types of sites (houses and cattle sheds) are suggested based on the state VL-control budget and have direct implications for VL elimination efforts in a resource-limited region. © The American Society of Tropical Medicine and Hygiene.
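    The allocation described above can be sketched as a tiny budget-constrained linear program: maximize the total insecticide-induced kill rate subject to a spraying budget and per-site caps. Because the toy objective and constraint are linear, a greedy fill by kill rate per unit cost solves this miniature version exactly. All coefficients below are invented for illustration and are not the paper's calibrated parameters.

```python
def allocate(budget, site_types):
    """Greedy allocation by kill rate per unit cost; optimal for this
    simple bounded linear model (invented numbers, not the paper's)."""
    plan = {}
    remaining = budget
    # Spray the most cost-effective site type first, best ratio first.
    for name, kill, cost, cap in sorted(site_types,
                                        key=lambda s: s[1] / s[2],
                                        reverse=True):
        units = min(cap, remaining / cost)
        plan[name] = units
        remaining -= units * cost
    return plan

plan = allocate(120.0, [
    # (site type, kill rate per unit, cost per unit, max units)
    ("house", 0.9, 2.0, 40),
    ("cattle_shed", 1.4, 3.0, 25),
])
```

    With these invented numbers the cattle sheds are filled to their cap first, and the leftover budget goes to houses.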

  4. LDRD Final Report: Global Optimization for Engineering Science Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HART,WILLIAM E.

    1999-12-01

    For a wide variety of scientific and engineering problems the desired solution corresponds to an optimal set of objective function parameters, where the objective function measures a solution's quality. The main goal of the LDRD "Global Optimization for Engineering Science Problems" was the development of new robust and efficient optimization algorithms that can be used to find globally optimal solutions to complex optimization problems. This SAND report summarizes the technical accomplishments of this LDRD, discusses lessons learned, and describes open research issues.

  5. A simple method for estimating gross carbon budgets for vegetation in forest ecosystems.

    PubMed

    Ryan, Michael G.

    1991-01-01

    Gross carbon budgets for vegetation in forest ecosystems are difficult to construct because of problems in scaling flux measurements made on small samples over short periods of time and in determining belowground carbon allocation. Recently, empirical relationships have been developed to estimate total belowground carbon allocation from litterfall, and maintenance respiration from tissue nitrogen content. I outline a method for estimating gross carbon budgets using these empirical relationships together with data readily available from ecosystem studies (aboveground wood and canopy production, aboveground wood and canopy biomass, litterfall, and tissue nitrogen contents). Estimates generated with this method are compared with annual carbon fixation estimates from the Forest-BGC model for a lodgepole pine (Pinus contorta Dougl.) and a Pacific silver fir (Abies amabilis Dougl.) chronosequence.

  6. Characterization of cirrus clouds and atmospheric state using a new hyper-spectral optimal estimation retrieval

    NASA Astrophysics Data System (ADS)

    Veglio, P.; Holz, R.

    2016-12-01

    The importance of cirrus clouds as regulators of Earth's climate and radiation budget has been widely demonstrated, but their characterization remains challenging. In order to derive cirrus properties, many retrieval techniques rely on prior assumptions about the atmospheric state or the ice microphysics, either because the computational cost is too high or because the measurements do not carry enough information, as in the case of broadband sensors. In this work we present a novel infrared hyper-spectral optimal estimation retrieval capable of simultaneously deriving cirrus cloud parameters (optical depth, effective radius, cloud top height) and atmospheric state (temperature and water vapor profiles) with their associated uncertainties by using a fast forward radiative transfer code. The use of hyperspectral data helps overcome the information-content problem, while the computational cost can be addressed with a fast radiative transfer model. The tradeoff of this choice is an increase in the complexity of the problem. Moreover, because a fast, approximate radiative transfer model is used, the uncertainties must be carefully evaluated in order to prevent or minimize any biases that could negatively affect the results. For this application, data from the HS3 field campaign are used, which provide high-quality hyper-spectral measurements from Scanning HIS along with CPL and possibly also dropsonde data and GDAS reanalysis to help validate the results. Future work will move from aircraft to satellite observations; the natural choice is AIRS and CALIOP, which offer a setup similar to the one used in this study.

  7. Research on NC laser combined cutting optimization model of sheet metal parts

    NASA Astrophysics Data System (ADS)

    Wu, Z. Y.; Zhang, Y. L.; Li, L.; Wu, L. H.; Liu, N. B.

    2017-09-01

    The optimization problem for NC laser combined cutting of sheet metal parts was taken as the research object in this paper. The problem included two contents: combined packing optimization and combined cutting path optimization. In the problem of combined packing optimization, the method of “genetic algorithm + gravity center NFP + geometric transformation” was used to optimize the packing of sheet metal parts. In the problem of combined cutting path optimization, the mathematical model of cutting path optimization was established based on the parts cutting constraint rules of internal contour priority and cross cutting. The model played an important role in the optimization calculation of NC laser combined cutting.
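    The "internal contour priority" constraint rule mentioned above can be illustrated with a toy path planner: within each part, every internal contour is scheduled before the external contour, and parts are visited nearest-first. The parts, coordinates, and the nearest-first tie-break are illustrative assumptions, not the paper's model.

```python
import math

def plan_path(parts, start=(0.0, 0.0)):
    """Order cutting operations under the internal-contour-priority rule:
    all internal contours of a part are cut before its external contour;
    parts are visited nearest-first from the current torch position."""
    order, pos = [], start
    remaining = dict(parts)
    while remaining:
        # Pick the part whose first internal contour is closest to the torch.
        name = min(remaining, key=lambda n: math.dist(pos, remaining[n][0]))
        contours = remaining.pop(name)
        # Internals first, external contour (listed last) cut last.
        order.extend((name, i) for i in range(len(contours)))
        pos = contours[-1]
    return order

parts = {  # part -> contour entry points; internal contours first, external last
    "A": [(1, 1), (2, 1), (3, 3)],
    "B": [(9, 9), (10, 10)],
}
path = plan_path(parts)
```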

  8. Quadratic constrained mixed discrete optimization with an adiabatic quantum optimizer

    NASA Astrophysics Data System (ADS)

    Chandra, Rishabh; Jacobson, N. Tobias; Moussa, Jonathan E.; Frankel, Steven H.; Kais, Sabre

    2014-07-01

    We extend the family of problems that may be implemented on an adiabatic quantum optimizer (AQO). When a quadratic optimization problem has at least one set of discrete controls and the constraints are linear, we call this a quadratic constrained mixed discrete optimization (QCMDO) problem. QCMDO problems are NP-hard, and no efficient classical algorithm for their solution is known. Included in the class of QCMDO problems are combinatorial optimization problems constrained by a linear partial differential equation (PDE) or system of linear PDEs. An essential complication commonly encountered in solving this type of problem is that the linear constraint may introduce many intermediate continuous variables into the optimization while the computational cost grows exponentially with problem size. We resolve this difficulty by developing a constructive mapping from QCMDO to quadratic unconstrained binary optimization (QUBO) such that the size of the QUBO problem depends only on the number of discrete control variables. With a suitable embedding, taking into account the physical constraints of the realizable coupling graph, the resulting QUBO problem can be implemented on an existing AQO. The mapping itself is efficient, scaling cubically with the number of continuous variables in the general case and linearly in the PDE case if an efficient preconditioner is available.
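    The reduction idea can be illustrated with the textbook penalty construction: a linear equality constraint is folded into the quadratic objective as rho*(a·x - b)^2, giving an unconstrained QUBO over the same binary variables. The 3-variable instance below is invented; the paper's constructive mapping for PDE-constrained problems is substantially more involved and keeps the QUBO size tied to the number of discrete controls.

```python
import itertools

# Invented instance: min x.Q.x subject to x0 + x1 + x2 == 2, x binary.
Q = [[1, -2, 0], [0, 3, -1], [0, 0, 2]]   # upper-triangular quadratic costs
a, b, rho = [1, 1, 1], 2, 10.0            # linear constraint and penalty weight

def qubo_energy(x):
    # Unconstrained QUBO energy: quadratic cost plus penalized constraint.
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(3) for j in range(3))
    penalty = rho * (sum(ai * xi for ai, xi in zip(a, x)) - b) ** 2
    return quad + penalty

# Brute-force over all binary assignments (an AQO would anneal instead).
best = min(itertools.product([0, 1], repeat=3), key=qubo_energy)
```

    For rho large enough, the QUBO minimizer coincides with the constrained optimum, here the feasible point (1, 1, 0).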

  9. An Enhanced Memetic Algorithm for Single-Objective Bilevel Optimization Problems.

    PubMed

    Islam, Md Monjurul; Singh, Hemant Kumar; Ray, Tapabrata; Sinha, Ankur

    2017-01-01

    Bilevel optimization, as the name reflects, deals with optimization at two interconnected hierarchical levels. The aim is to identify the optimum of an upper-level "leader" problem, subject to the optimality of a lower-level "follower" problem. Several problems from the domain of engineering, logistics, economics, and transportation have an inherent nested structure which requires them to be modeled as bilevel optimization problems. The increasing size and complexity of such problems has prompted active theoretical and practical interest in the design of efficient algorithms for bilevel optimization. Given the nested nature of bilevel problems, the computational effort (number of function evaluations) required to solve them is often quite high. In this article, we explore the use of a Memetic Algorithm (MA) to solve bilevel optimization problems. While MAs have been quite successful in solving single-level optimization problems, there have been relatively few studies exploring their potential for solving bilevel optimization problems. MAs essentially attempt to combine the advantages of global and local search strategies to identify optimum solutions with low computational cost (function evaluations). The approach introduced in this article is a nested Bilevel Memetic Algorithm (BLMA). At both upper and lower levels, either a global or a local search method is used during different phases of the search. The performance of BLMA is presented on twenty-five standard test problems and two real-life applications. The results are compared with other established algorithms to demonstrate the efficacy of the proposed approach.
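    The nested evaluation at the heart of any bilevel solver, BLMA included, can be sketched in a few lines: every leader trial is scored only after the follower problem has been solved to optimality. The quadratic objectives and the plain grid search below are illustrative stand-ins for the memetic global/local phases, not the paper's algorithm.

```python
def follower_opt(x):
    # Follower problem: min_y (y - x)^2, solved analytically: y*(x) = x.
    return x

def leader_obj(x):
    y = follower_opt(x)          # lower level solved to optimality first
    return (x - 3.0) ** 2 + (y - 1.0) ** 2

# Crude global pass over the leader variable; a memetic algorithm would
# interleave a population-based search with local refinement here.
candidates = [i / 10.0 for i in range(-50, 51)]
x_best = min(candidates, key=leader_obj)
```

    Because each leader evaluation embeds a full follower solve, function-evaluation counts blow up quickly, which is the cost BLMA tries to contain.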

  10. Optimal Price Decision Problem for Simultaneous Multi-article Auction and Its Optimal Price Searching Method by Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Masuda, Kazuaki; Aiyoshi, Eitaro

    We propose a method for solving optimal price decision problems for simultaneous multi-article auctions. An auction problem, originally formulated as a combinatorial problem, determines both whether or not each seller sells his/her article and which article(s) each buyer buys, so that the total utility of buyers and sellers is maximized. Using duality theory, we transform it equivalently into a dual problem in which the Lagrange multipliers are interpreted as the articles' transaction prices. As the dual problem is a continuous optimization problem with respect to the multipliers (i.e., the transaction prices), we propose a numerical method to solve it by applying heuristic global search methods. In this paper, Particle Swarm Optimization (PSO) is used to solve the dual problem, and experimental results are presented to show the validity of the proposed method.
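    A minimal sketch of the approach, on invented auction data: the Lagrangian dual of a one-article toy auction is a convex piecewise-linear function of the price, and a bare-bones PSO searches for its minimizer. The buyer/seller valuations and the particle-update constants are generic illustrative values, not the paper's settings.

```python
import random

random.seed(0)

def dual(p):
    # Lagrangian dual of a toy auction: two buyers value the article at 5
    # and 3, the seller's reservation value is 2; the multiplier p plays
    # the role of the transaction price. Minimum value 3 on [3, 5].
    buyers = max(0.0, 5 - p) + max(0.0, 3 - p)
    seller = max(0.0, p - 2)
    return buyers + seller

# Bare-bones PSO minimizing the continuous dual over the price.
n, iters = 20, 100
pos = [random.uniform(0, 10) for _ in range(n)]
vel = [0.0] * n
pbest = pos[:]
gbest = min(pos, key=dual)
for _ in range(iters):
    for i in range(n):
        vel[i] = (0.7 * vel[i]
                  + 1.5 * random.random() * (pbest[i] - pos[i])
                  + 1.5 * random.random() * (gbest - pos[i]))
        pos[i] += vel[i]
        if dual(pos[i]) < dual(pbest[i]):
            pbest[i] = pos[i]
        if dual(pos[i]) < dual(gbest):
            gbest = pos[i]
```

    The swarm settles on a price in the flat optimal region [3, 5], where the dual value (here 3) equals the maximal total utility of the primal problem.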

  11. Bi-objective approach for placing ground and air ambulance base and helipad locations in order to optimize EMS response.

    PubMed

    Shahriari, Milad; Bozorgi-Amiri, Ali; Tavakoli, Shayan; Yousefi-Babadi, Abolghasem

    2017-12-01

    Shortening the travel time of patient transfer has clinical implications for many conditions such as cardiac arrest, trauma, stroke and STEMI. As resources are often limited, precise calculations are needed. In this paper we consider the location problem for both ground and aerial emergency medical services. Given the uncertainty of when patients will need prompt medical attention, we treat these demand points as uncertain. We consider various ways in which ground and helicopter ambulances can work together to speed up the whole process. We develop a mathematical model that minimizes travel time and maximizes service level, and we use a compromise programming method to solve this bi-objective mathematical model. For numerical experiments we apply our model to a case study in Lorestan, Iran, using geographical and population data and the location of the actual hospital based in the capital of the province. Results show that low-accessibility locations are the main focus of the proposed problem and that, with mathematical modeling, access to a hospital is vastly improved. We also found that once the budget reaches the point at which the necessary ambulance bases can be built, further investment does not necessarily reduce travel time. Copyright © 2017 Elsevier Inc. All rights reserved.
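    The compromise-programming step used to scalarize the two objectives can be sketched as distance-to-ideal scoring: each candidate base layout is ranked by its normalized regret from the ideal point of travel time (minimized) and service level (maximized). The layouts, values, and equal weights below are invented for illustration.

```python
candidates = {            # layout -> (avg travel time [min], service level [%])
    "layout_A": (18.0, 70.0),
    "layout_B": (12.0, 85.0),
    "layout_C": (15.0, 90.0),
}

t_vals = [t for t, s in candidates.values()]
s_vals = [s for t, s in candidates.values()]
ideal = (min(t_vals), max(s_vals))        # best value of each objective
nadir = (max(t_vals), min(s_vals))        # worst value of each objective

def compromise(name, w=(0.5, 0.5)):
    # Weighted L1 distance to the ideal point, with both objectives
    # normalized by their ideal-to-nadir range.
    t, s = candidates[name]
    dt = (t - ideal[0]) / (nadir[0] - ideal[0])
    ds = (ideal[1] - s) / (ideal[1] - nadir[1])
    return w[0] * dt + w[1] * ds

best = min(candidates, key=compromise)
```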

  12. Child survival in England: Strengthening governance for health.

    PubMed

    Wolfe, Ingrid; Mandeville, Kate; Harrison, Katherine; Lingam, Raghu

    2017-11-01

    The United Kingdom, like all European countries, is struggling to strengthen health systems and improve conditions for child health and survival. Child mortality in the UK has failed to improve in line with other countries. Securing optimal conditions for child health requires a healthy society, strong health system, and effective health care. We examine inter-sectoral and intra-sectoral policy and governance for child health and survival in England. Literature reviews and universally applicable clinical scenarios were used to examine child health problems and English policy and governance responses for improving child health through integrating care and strengthening health systems, over the past 15 years. We applied the TAPIC framework for analysing policy governance: transparency, accountability, participation, integrity, and capacity. We identified strengths and weaknesses in child health governance in all the five domains. However there remain policy failures that are not fully explained by the TAPIC framework. Other problems with successfully translating policy to improved health that we identified include policy flux; policies insufficiently supported by delivery mechanisms, measurable targets, and sufficient budgets; and policies with unintended or contradictory aspects. We make recommendations for inter-sectoral and intra-sectoral child health governance, policy, and action to improve child health in England with relevant lessons for other countries. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  13. Starting From Ground Zero

    ERIC Educational Resources Information Center

    Fischer, William B.; Stauffer, Robert A.

    1978-01-01

    Erie County Community College (New York) has developed a zero-based program budgeting system to meet current fiscal problems and diminished resources. The system allocates resources on the basis of program effectiveness and market potential. (LH)

  14. Optimized, Budget-constrained Monitoring Well Placement Using DREAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yonkofski, Catherine M. R.; Davidson, Casie L.; Rodriguez, Luke R.

    Defining the ideal suite of monitoring technologies to be deployed at a carbon capture and storage (CCS) site presents a challenge to project developers, financiers, insurers, regulators and other stakeholders. The monitoring, verification, and accounting (MVA) toolkit offers a suite of technologies to monitor an extensive range of parameters across a wide span of spatial and temporal resolutions, each with their own degree of sensitivity to changes in the parameter being monitored. Understanding how best to optimize MVA budgets to minimize the time to leak detection could help to address issues around project risks, and in turn help support broad CCS deployment. This paper presents a case study demonstrating an application of the Designs for Risk Evaluation and Management (DREAM) tool using an ensemble of CO2 leakage scenarios taken from a previous study on leakage impacts to groundwater. Impacts were assessed and monitored as a function of pH, total dissolved solids (TDS), and trace metal concentrations of arsenic (As), cadmium (Cd), chromium (Cr), and lead (Pb). Using output from the previous study, DREAM was used to optimize monitoring system designs based on variable sampling locations and parameters. The algorithm requires the user to define a finite budget to limit the number of monitoring wells and technologies deployed, and then iterates well placement and sensor type and location until it converges on the configuration with the lowest time to first detection of the leak averaged across all scenarios. To facilitate an understanding of the optimal number of sampling wells, DREAM was used to assess the marginal utility of additional sampling locations. Based on assumptions about monitoring costs and replacement costs of degraded water, the incremental cost of each additional sampling well can be compared against its marginal value in terms of avoided aquifer degradation. Applying this method, DREAM identified the most cost-effective ensemble with 14 monitoring locations. While this preliminary study applied relatively simplistic cost and technology assumptions, it provides an exciting proof-of-concept for the application of DREAM to questions of cost-optimized MVA system design that are informed not only by site-specific costs and technology options, but also by reservoir simulation results developed during site characterization and operation.
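    DREAM's objective of minimizing the time to first detection averaged over an ensemble of leak scenarios can be caricatured with a greedy well placement over a tiny scenario table. The detection times and the simple greedy rule are illustrative assumptions, not DREAM's actual algorithm, which also iterates over sensor types and locations.

```python
NEVER = 1e9  # stands in for "leak never detected at this well"

detect = {  # scenario -> {well: detection time in years} (invented numbers)
    "s1": {"w1": 2.0, "w2": NEVER, "w3": 5.0},
    "s2": {"w1": NEVER, "w2": 3.0, "w3": 4.0},
    "s3": {"w1": 6.0, "w2": 1.0, "w3": NEVER},
}
wells = ["w1", "w2", "w3"]

def mean_ttd(chosen):
    # Time to first detection of each scenario, averaged over the ensemble.
    return sum(min(times[w] for w in chosen) for times in detect.values()) / len(detect)

def greedy_design(budget_wells):
    # Add, one at a time, the well with the best marginal reduction in
    # average time to first detection, until the well budget is spent.
    chosen = []
    for _ in range(budget_wells):
        w = min((w for w in wells if w not in chosen),
                key=lambda w: mean_ttd(chosen + [w]))
        chosen.append(w)
    return chosen

design = greedy_design(2)
```

    Comparing mean_ttd for successive budgets is exactly the marginal-utility question the study poses: at some point an extra well no longer buys a meaningful drop in detection time.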

  15. Optimized, Budget-constrained Monitoring Well Placement Using DREAM

    DOE PAGES

    Yonkofski, Catherine M. R.; Davidson, Casie L.; Rodriguez, Luke R.; ...

    2017-08-18

    Defining the ideal suite of monitoring technologies to be deployed at a carbon capture and storage (CCS) site presents a challenge to project developers, financiers, insurers, regulators and other stakeholders. The monitoring, verification, and accounting (MVA) toolkit offers a suite of technologies to monitor an extensive range of parameters across a wide span of spatial and temporal resolutions, each with their own degree of sensitivity to changes in the parameter being monitored. Understanding how best to optimize MVA budgets to minimize the time to leak detection could help to address issues around project risks, and in turn help support broad CCS deployment. This paper presents a case study demonstrating an application of the Designs for Risk Evaluation and Management (DREAM) tool using an ensemble of CO2 leakage scenarios taken from a previous study on leakage impacts to groundwater. Impacts were assessed and monitored as a function of pH, total dissolved solids (TDS), and trace metal concentrations of arsenic (As), cadmium (Cd), chromium (Cr), and lead (Pb). Using output from the previous study, DREAM was used to optimize monitoring system designs based on variable sampling locations and parameters. The algorithm requires the user to define a finite budget to limit the number of monitoring wells and technologies deployed, and then iterates well placement and sensor type and location until it converges on the configuration with the lowest time to first detection of the leak averaged across all scenarios. To facilitate an understanding of the optimal number of sampling wells, DREAM was used to assess the marginal utility of additional sampling locations. Based on assumptions about monitoring costs and replacement costs of degraded water, the incremental cost of each additional sampling well can be compared against its marginal value in terms of avoided aquifer degradation. Applying this method, DREAM identified the most cost-effective ensemble with 14 monitoring locations. While this preliminary study applied relatively simplistic cost and technology assumptions, it provides an exciting proof-of-concept for the application of DREAM to questions of cost-optimized MVA system design that are informed not only by site-specific costs and technology options, but also by reservoir simulation results developed during site characterization and operation.

  16. Techniques for shuttle trajectory optimization

    NASA Technical Reports Server (NTRS)

    Edge, E. R.; Shieh, C. J.; Powers, W. F.

    1973-01-01

    The application of recently developed function-space Davidon-type techniques to the shuttle ascent trajectory optimization problem is discussed along with an investigation of the recently developed PRAXIS algorithm for parameter optimization. At the outset of this analysis, the major deficiency of the function-space algorithms was their potential storage problems. Since most previous analyses of the methods were with relatively low-dimension problems, no storage problems were encountered. However, in shuttle trajectory optimization, storage is a problem, and this problem was handled efficiently. Topics discussed include: the shuttle ascent model and the development of the particular optimization equations; the function-space algorithms; the operation of the algorithm and typical simulations; variable final-time problem considerations; and a modification of Powell's algorithm.

  17. Titrating versus targeting home care services to frail elderly clients: an application of agency theory and cost-benefit analysis to home care policy.

    PubMed

    Weissert, William; Chernew, Michael; Hirth, Richard

    2003-02-01

    The article summarizes the shortcomings of current home care targeting policy, provides a conceptual framework for understanding the sources of its problems, and proposes an alternative resource allocation method. Methods required for different aspects of the study included synthesis of the published literature, regression analysis of risk predictors, and comparison of actual resource allocations with simulated budgets. Problems of imperfect agency ranging from unclear goals and inappropriate incentives to lack of information about the marginal effectiveness of home care could be mitigated with an improved budgeting method that combines client selection and resource allocation. No program can produce its best outcome performance when its goals are unclear and its technology is unstandardized. Titration of care would reallocate resources to maximize marginal benefit for marginal cost.

  18. Radiation measurements from polar and geosynchronous satellites

    NASA Technical Reports Server (NTRS)

    Vonderhaar, T. H.

    1973-01-01

    During the 1960s, radiation budget measurements from satellites allowed quantitative study of the global energetics of our atmosphere-ocean system. A continuing program is planned, including independent measurement of the solar constant. Thus far, the measurements returned from two basically different types of satellite experiments agree on the long-term global scales where they are most comparable. This fact, together with independent estimates of the accuracy of measurement from each system, shows that the energy exchange between earth and space is now measured better than it can be calculated. Examples of applications of the radiation budget data are shown. They can be related to the age-old problem of climate change, to the basic question of the thermal forcing of our circulation systems, and to the contemporary problems of local-area energetics and computer modeling of the atmosphere.

  19. On l^1 Optimal decentralized performance

    NASA Technical Reports Server (NTRS)

    Sourlas, Dennis; Manousiouthakis, Vasilios

    1993-01-01

    In this paper, the Manousiouthakis parametrization of all decentralized stabilizing controllers is employed in mathematically formulating the l^1 optimal decentralized controller synthesis problem. The resulting optimization problem is infinite dimensional and therefore not directly amenable to computations. It is shown that finite dimensional optimization problems that have value arbitrarily close to the infinite dimensional one can be constructed. Based on this result, an algorithm that solves the l^1 decentralized performance problem is presented. A global optimization approach to the solution of the infinite dimensional approximating problems is also discussed.

  20. Execution of Multidisciplinary Design Optimization Approaches on Common Test Problems

    NASA Technical Reports Server (NTRS)

    Balling, R. J.; Wilkinson, C. A.

    1997-01-01

    A class of synthetic problems for testing multidisciplinary design optimization (MDO) approaches is presented. These test problems are easy to reproduce because all functions are given as closed-form mathematical expressions. They are constructed in such a way that the optimal value of all variables and the objective is unity. The test problems involve three disciplines and allow the user to specify the number of design variables, state variables, coupling functions, design constraints, controlling design constraints, and the strength of coupling. Several MDO approaches were executed on two sample synthetic test problems. These approaches included single-level optimization approaches, collaborative optimization approaches, and concurrent subspace optimization approaches. Execution results are presented, and the robustness and efficiency of these approaches are evaluated for these sample problems.

  1. Time-domain finite elements in optimal control with application to launch-vehicle guidance. PhD. Thesis

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.

    1991-01-01

    A time-domain finite element method is developed for optimal control problems. The theory derived is general enough to handle a large class of problems including optimal control problems that are continuous in the states and controls, problems with discontinuities in the states and/or system equations, problems with control inequality constraints, problems with state inequality constraints, or problems involving any combination of the above. The theory is developed in such a way that no numerical quadrature is necessary regardless of the degree of nonlinearity in the equations. Also, the same shape functions may be employed for every problem because all strong boundary conditions are transformed into natural or weak boundary conditions. In addition, the resulting nonlinear algebraic equations are very sparse. Use of sparse matrix solvers allows for the rapid and accurate solution of very difficult optimization problems. The formulation is applied to launch-vehicle trajectory optimization problems, and results show that real-time optimal guidance is realizable with this method. Finally, a general problem solving environment is created for solving a large class of optimal control problems. The algorithm uses both FORTRAN and a symbolic computation program to solve problems with a minimum of user interaction. The use of symbolic computation eliminates the need for user-written subroutines which greatly reduces the setup time for solving problems.

  2. On Hardness of Pricing Items for Single-Minded Bidders

    NASA Astrophysics Data System (ADS)

    Khandekar, Rohit; Kimbrel, Tracy; Makarychev, Konstantin; Sviridenko, Maxim

    We consider the following item pricing problem which has received much attention recently. A seller has an infinite number of copies of n items. There are m buyers, each with a budget and an intention to buy a fixed subset of items. Given prices on the items, each buyer buys his subset of items, at the given prices, provided the total price of the subset is at most his budget. The objective of the seller is to determine the prices such that her total profit is maximized.
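    The seller's objective can be sketched directly: each single-minded buyer purchases his whole bundle if and only if its total price is within his budget, and profit is the sum of prices of bundles actually bought. The tiny two-item instance and the brute-force price grid below are invented for illustration; the paper is about the hardness of this problem, not about solving it by enumeration.

```python
buyers = [  # (desired bundle, budget) -- invented instance
    ({"a"}, 4.0),
    ({"a", "b"}, 7.0),
    ({"b"}, 5.0),
]

def profit(prices):
    total = 0.0
    for bundle, budget in buyers:
        cost = sum(prices[item] for item in bundle)
        if cost <= budget:   # single-minded: buy the whole bundle or nothing
            total += cost
    return total

# Brute-force search over a small integer price grid.
best = max(
    ({"a": pa, "b": pb} for pa in range(9) for pb in range(9)),
    key=profit,
)
```

    On this instance every optimal price vector prices the two-item bundle exactly at the middle buyer's budget, so all three buyers purchase.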

  3. The measurement of the earth's radiation budget as a problem in information theory - A tool for the rational design of earth observing systems

    NASA Technical Reports Server (NTRS)

    Barkstrom, B. R.

    1983-01-01

    The measurement of the earth's radiation budget has been chosen to illustrate the technique of objective system design. The measurement process is an approximately linear transformation of the original field of radiant exitances, so that linear statistical techniques may be employed. The combination of variability, measurement strategy, and error propagation is presently made with the help of information theory, as suggested by Kondratyev et al. (1975) and Peckham (1974). Covariance matrices furnish the quantitative statement of field variability.

  4. Dealing with problem number one--budget cuts: can you do more with less?

    PubMed

    2001-09-01

    To cope with current budget restraints and cutbacks, hospital security departments are increasingly integrating their manpower with technology in the form of access control, CCTV cameras, and alarm systems to supplement their services as well as becoming more dependent on computerized information technology systems and IT departments to track hospital activities and incidents. Security directors contacted for this report also emphasize that they are doing more with less by providing value-added services both to expand activities and to demonstrate the importance of their departments to top management.

  5. An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics

    NASA Technical Reports Server (NTRS)

    Baluja, Shumeet

    1995-01-01

    This report is a repository of the results obtained from a large scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes which are commonly explored in genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, binpacking, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. Descriptions of the algorithms tested and the encodings of the problems are described in detail for reproducibility.
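    In the report's spirit, even a minimal stochastic hill climber is a strong baseline on separable static functions. The sketch below solves OneMax (maximize the number of 1-bits, a standard test function in the GA literature) with accept-if-not-worse single-bit flips; the problem size and iteration budget are arbitrary choices for illustration.

```python
import random

random.seed(1)
N = 32

def onemax(bits):
    # Fitness: number of 1-bits in the string.
    return sum(bits)

best = [random.randint(0, 1) for _ in range(N)]
for _ in range(2000):
    cand = best[:]
    cand[random.randrange(N)] ^= 1      # flip one random bit
    if onemax(cand) >= onemax(best):    # accept if not worse
        best = cand
```

    Because a 0-to-1 flip is always accepted and a 1-to-0 flip never is, the climber monotonically reaches the all-ones optimum, which is the kind of result the report found simpler heuristics delivering against GAs on such functions.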

  6. Optimization of educational paths for higher education

    NASA Astrophysics Data System (ADS)

    Tarasyev, Alexandr A.; Agarkov, Gavriil; Medvedev, Aleksandr

    2017-11-01

    In our research, we combine the theory of economic behavior and the methodology of increasing the efficiency of human capital to estimate optimal educational paths. We provide an optimization model of the higher education process to analyze the possible educational paths of each rational individual. The preferences of each rational individual are compared to the best economically feasible educational path. The main factor in an individual's choice, which shapes the optimal educational path, is the higher salary level in the chosen economic sector after graduation. Another factor that influences economic profit is the reduction of educational costs or the availability of budget support for the student. The main outcome of this research is a correction of governmental policy on investment in human capital, based on the results of the optimal control of educational paths.

  7. High Energy Rainy Day Physical Education for Cheapskates: Rhythmic Newspaper Strips

    ERIC Educational Resources Information Center

    Walkwitz, Edward

    2005-01-01

    What to do during physical education when it is raining or the gym is not available is a problem that confronts many physical education teachers. The problem is compounded when teachers do not have the equipment needed to carry out instruction due to budget limitations. A related concern is making sure children with special needs, who attend PE…

  8. Nursing workforce planning: insights from seven Malaysian hospitals.

    PubMed

    Drake, Robert

    In 2010, the Royal College of Nursing asked: 'What is the optimal level and mix of nurses required to deliver quality care as cost-effectively as possible?' This question implies there is a relationship between staffing levels, quality of care and financial efficiency. This paper examines the relationship between the staff budget, the number of staff required to achieve a target level of care and the actual number of staff employed in seven hospitals in Malaysia. It seeks to critically evaluate local challenges arising from staff budgeting/planning procedures, identify general issues that apply beyond Malaysian healthcare institutions and, finally, to propose a model that combines finance, staffing and level of care.

  9. Study to investigate and evaluate means of optimizing the radar function for the space shuttle

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A detailed analysis of the spiral scan was performed for antenna sizes ranging from 20 inches to 36 inches in diameter and for search angles characteristic of both the radar and the communication acquisition modes. The power budgets for passive target radar detection were calculated for antenna diameters ranging from 20 to 36 inches. Dwell times commensurate with spiral scan were used for these budget calculations. The signal design for the candidate pulse Doppler system is summarized. Ground return analysis carried out for the passive target radar mode is examined, and the details are presented. A concluding description of the proposed candidate radar/communication system configuration is given.

  10. Cities and “budget-based” management of the energy-water-climate nexus: Case studies in transportation policy, infrastructure systems, and urban utility risk management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sperling, Joshua B.; Ramaswami, Anu

    This article reviews city case studies to inform a framework for developing urban infrastructure design standards and policy instruments that together aim to pursue energy efficiency and greenhouse gas mitigation through city carbon budgets, and water use efficiency and climate risk adaptation through city water budgets. The article also proposes combining carbon and water budgeting at the city scale to achieve successful coupled city carbon and water budget (CCCWB) programs. Under a CCCWB program, key actors including local governments, infrastructure designers/operators, and households would be assigned a GHG emissions and water 'budget' and be required by state or federal levels to keep within this budget through the use of flexibility mechanisms, incentive programs, and sanctions. Multiple incentives and cross-scale governance arrangements would be tied to energy-water systems integration, resource-efficient transportation and infrastructure development, and effective monitoring and management of energy use, emissions, climate risks to, and security of energy-water-transport-food and other critical systems. As a first step to promote strategies for CCCWB development, we systematically review the approaches and shortcomings of existing budget-based programs in the UK and US, and suggest improvements in three areas: measurement, modeling the effectiveness of interventions for staying within a budget, and governance. To date, the majority of climate action or sustainability plans by cities, while mentioning climate impacts as a premise for the plan, do not address these impacts in the plan. They focus primarily on GHG mitigation while ignoring resource depletion challenges and energy-climate-water linkages, whereby water supplies can begin to limit energy production and energy shifts to mitigate climate change can limit water availability. Coupled carbon-water budget plans, programs, and policies described in this study may address these concerns, as well as the emerging trends that will exacerbate these problems, including population growth, climatic changes, and uncoordinated policy choices.

  11. Cities and “budget-based” management of the energy-water-climate nexus: Case studies in transportation policy, infrastructure systems, and urban utility risk management

    DOE PAGES

    Sperling, Joshua B.; Ramaswami, Anu

    2017-11-03

    This article reviews city case studies to inform a framework for developing urban infrastructure design standards and policy instruments that together aim to pursue energy efficiency and greenhouse gas mitigation through city carbon budgets, and water use efficiency and climate risk adaptation through city water budgets. The article also proposes combining carbon and water budgeting at the city scale to achieve successful coupled city carbon and water budget (CCCWB) programs. Under a CCCWB program, key actors including local governments, infrastructure designers/operators, and households would be assigned a GHG emissions and water 'budget' and be required by state or federal levels to keep within this budget through the use of flexibility mechanisms, incentive programs, and sanctions. Multiple incentives and cross-scale governance arrangements would be tied to energy-water systems integration, resource-efficient transportation and infrastructure development, and effective monitoring and management of energy use, emissions, climate risks to, and security of energy-water-transport-food and other critical systems. As a first step to promote strategies for CCCWB development, we systematically review the approaches and shortcomings of existing budget-based programs in the UK and US, and suggest improvements in three areas: measurement, modeling the effectiveness of interventions for staying within a budget, and governance. To date, the majority of climate action or sustainability plans by cities, while mentioning climate impacts as a premise for the plan, do not address these impacts in the plan. They focus primarily on GHG mitigation while ignoring resource depletion challenges and energy-climate-water linkages, whereby water supplies can begin to limit energy production and energy shifts to mitigate climate change can limit water availability. Coupled carbon-water budget plans, programs, and policies described in this study may address these concerns, as well as the emerging trends that will exacerbate these problems, including population growth, climatic changes, and uncoordinated policy choices.

  12. Water and nutrient budgets for Vancouver Lake, Vancouver, Washington, October 2010-October 2012

    USGS Publications Warehouse

    Sheibley, Rich W.; Foreman, James R.; Marshall, Cameron A.; Welch, Wendy B.

    2014-01-01

    Vancouver Lake, a large shallow lake in Clark County, near Vancouver, Washington, has been undergoing water-quality problems for decades. Recently, the biggest concern for the lake has been the almost annual harmful cyanobacteria blooms that cause the lake to close for recreation for several weeks each summer. Despite decades of interest in improving the water quality of the lake, fundamental information on the timing and amount of water and nutrients entering and exiting the lake is lacking. In 2010, the U.S. Geological Survey conducted a 2-year field study to quantify water flows and nutrient loads in order to develop water and nutrient budgets for the lake. This report presents monthly and annual water and nutrient budgets from October 2010 through October 2012 to identify major sources and sinks of nutrients. Lake River, a tidally influenced tributary to the lake, flows into and out of the lake almost daily and composed the greatest proportion of both the water and nutrient budgets for the lake, often orders of magnitude greater than any other source. From the water budget, we identified precipitation, evaporation, and groundwater inflow as minor components of the lake hydrologic cycle, each contributing 1 percent or less of the total water budget. Nutrient budgets were compiled monthly and annually for total nitrogen, total phosphorus, and orthophosphate; nitrogen loads were generally an order of magnitude greater than phosphorus loads across all sources. For total nitrogen, flow from Lake River at Felida, Washington, made up 88 percent of all inputs into the lake. For total phosphorus and orthophosphate, Lake River at Felida flowing into the lake was 91 and 76 percent of total inputs, respectively. Nutrient loads from precipitation and groundwater inflow were 1 percent or less of the total budgets.
Nutrient inputs from Burnt Bridge Creek and Flushing Channel composed 12 percent of the total nitrogen budget, 8 percent of the total phosphorus budget, and 21 percent of the orthophosphate budget. We identified several data gaps and areas for future research, which include the need for better understanding nutrient inputs to the lake from sediment resuspension and better quantification of indirect nutrient inputs to the lake from Salmon Creek.

  13. Multiobjective optimization of temporal processes.

    PubMed

    Song, Zhe; Kusiak, Andrew

    2010-06-01

    This paper presents a dynamic predictive-optimization framework of a nonlinear temporal process. Data-mining (DM) and evolutionary strategy algorithms are integrated in the framework for solving the optimization model. DM algorithms learn dynamic equations from the process data. An evolutionary strategy algorithm is then applied to solve the optimization problem guided by the knowledge extracted by the DM algorithm. The concept presented in this paper is illustrated with the data from a power plant, where the goal is to maximize the boiler efficiency and minimize the limestone consumption. This multiobjective optimization problem can be either transformed into a single-objective optimization problem through preference aggregation approaches or into a Pareto-optimal optimization problem. The computational results have shown the effectiveness of the proposed optimization framework.
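
The preference-aggregation route mentioned in this abstract can be sketched in a few lines: two hypothetical quadratic objectives stand in for the negated boiler efficiency and the limestone consumption learned by the DM algorithms, and a weighted sum collapses them into a single objective. The functions, the weight w, and all numbers here are illustrative assumptions, not the paper's plant models.

```python
# Weighted-sum scalarization of a two-objective problem (pure-Python sketch).
def f1(x):  # hypothetical stand-in: negated boiler efficiency (minimize)
    return (x[0] - 1.0) ** 2 + 0.5 * x[1] ** 2

def f2(x):  # hypothetical stand-in: limestone consumption (minimize)
    return x[0] ** 2 + (x[1] - 2.0) ** 2

def grad_scalarized(x, w):
    # Gradient of w*f1 + (1-w)*f2, derived by hand for the quadratics above.
    g0 = w * 2.0 * (x[0] - 1.0) + (1 - w) * 2.0 * x[0]
    g1 = w * x[1] + (1 - w) * 2.0 * (x[1] - 2.0)
    return [g0, g1]

def minimize_weighted(w=0.7, lr=0.1, steps=500):
    # Plain gradient descent on the single aggregated objective.
    x = [0.0, 0.0]
    for _ in range(steps):
        g = grad_scalarized(x, w)
        x = [x[0] - lr * g[0], x[1] - lr * g[1]]
    return x

x_star = minimize_weighted()  # converges to (0.7, 12/13) for w = 0.7
```

Sweeping w from 0 to 1 and re-solving traces out points on the Pareto front for convex problems, which is the usual bridge between the single-objective and Pareto-optimal formulations the abstract mentions.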

  14. Defining Top-of-Atmosphere Flux Reference Level for Earth Radiation Budget Studies

    NASA Technical Reports Server (NTRS)

    Loeb, N. G.; Kato, S.; Wielicki, B. A.

    2002-01-01

    To estimate the earth's radiation budget at the top of the atmosphere (TOA) from satellite-measured radiances, it is necessary to account for the finite geometry of the earth and recognize that the earth is a solid body surrounded by a translucent atmosphere of finite thickness that attenuates solar radiation differently at different heights. As a result, in order to account for all of the reflected solar and emitted thermal radiation from the planet by direct integration of satellite-measured radiances, the measurement viewing geometry must be defined at a reference level well above the earth's surface (e.g., 100 km). This ensures that all radiation contributions, including radiation escaping the planet along slant paths above the earth's tangent point, are accounted for. By using a field-of-view (FOV) reference level that is too low (such as the surface reference level), TOA fluxes for most scene types are systematically underestimated by 1-2 W/sq m. In addition, since TOA flux represents a flow of radiant energy per unit area, and varies with distance from the earth according to the inverse-square law, a reference level is also needed to define satellite-based TOA fluxes. From theoretical radiative transfer calculations using a model that accounts for spherical geometry, the optimal reference level for defining TOA fluxes in radiation budget studies for the earth is estimated to be approximately 20 km. At this reference level, there is no need to explicitly account for horizontal transmission of solar radiation through the atmosphere in the earth radiation budget calculation. In this context, therefore, the 20-km reference level corresponds to the effective radiative top of atmosphere for the planet. Although the optimal flux reference level depends slightly on scene type due to differences in effective transmission of solar radiation with cloud height, the difference in flux caused by neglecting the scene-type dependence is less than 0.1%. 
If an inappropriate TOA flux reference level is used to define satellite TOA fluxes, and horizontal transmission of solar radiation through the planet is not accounted for in the radiation budget equation, systematic errors in net flux of up to 8 W/sq m can result. Since climate models generally use a plane-parallel model approximation to estimate TOA fluxes and the earth radiation budget, they implicitly assume zero horizontal transmission of solar radiation in the radiation budget equation, and do not need to specify a flux reference level. By defining satellite-based TOA flux estimates at a 20-km flux reference level, comparisons with plane-parallel climate model calculations are simplified since there is no need to explicitly correct plane-parallel climate model fluxes for horizontal transmission of solar radiation through a finite earth.
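
The inverse-square dependence described in this record is easy to make concrete. The sketch below rescales an illustrative 240 W/sq m flux from the 20-km reference level to the surface and to 100 km; the mean Earth radius and the flux value are assumptions for illustration. The resulting ~1.5 W/sq m difference at the surface is consistent in magnitude with the 1-2 W/sq m underestimate quoted in the abstract.

```python
# Inverse-square adjustment of a TOA flux between reference levels.
# R_EARTH_KM and the 240 W/sq m flux are illustrative assumptions.
R_EARTH_KM = 6371.0

def flux_at_level(flux_ref, h_ref_km, h_new_km):
    """Rescale a flux defined at h_ref_km to a new reference level."""
    r_ref = R_EARTH_KM + h_ref_km
    r_new = R_EARTH_KM + h_new_km
    return flux_ref * (r_ref / r_new) ** 2

# A 240 W/sq m flux defined at the 20-km reference level, expressed
# instead at the surface (0 km) and at a 100-km satellite reference:
f_surface = flux_at_level(240.0, 20.0, 0.0)   # slightly larger than 240
f_100km = flux_at_level(240.0, 20.0, 100.0)   # slightly smaller than 240
```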

  15. Pre-contract scoping processes value stream mapping.

    DOT National Transportation Integrated Search

    2016-10-01

    Three fundamental issues contribute to the scoping process problems:
    1. The scoping process is inefficient and inconsistent.
    2. Staffing for scoping is insufficient.
    3. The programmed project budgets are locked in based on early, uncertain ...

  16. 78 FR 36754 - Agency Information Collection Activities; Submission to the Office of Management and Budget for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-19

    ...-year-olds which focuses on assessing students' science, mathematics, and reading literacy. PISA was... test will also include computer-based assessments in reading, mathematics, and collaborative problem...

  17. Health Effects of Exposures to Mercury

    MedlinePlus

    ... risk assessment for mercuric chloride in EPA's IRIS database ...

  18. Optimal control of an invasive species using a reaction-diffusion model and linear programming

    USGS Publications Warehouse

    Bonneau, Mathieu; Johnson, Fred A.; Smith, Brian J.; Romagosa, Christina M.; Martin, Julien; Mazzotti, Frank J.

    2017-01-01

    Managing an invasive species is particularly challenging as little is generally known about the species’ biological characteristics in its new habitat. In practice, removal of individuals often starts before the species is studied to provide the information that will later improve control. Therefore, the locations and the amount of control have to be determined in the face of great uncertainty about the species characteristics and with a limited amount of resources. We propose framing spatial control as a linear programming optimization problem. This formulation, paired with a discrete reaction-diffusion model, permits calculation of an optimal control strategy that minimizes the remaining number of invaders for a fixed cost or that minimizes the control cost for containment or protecting specific areas from invasion. We propose computing the optimal strategy for a range of possible model parameters, representing current uncertainty on the possible invasion scenarios. Then, a best strategy can be identified depending on the risk attitude of the decision-maker. We use this framework to study the spatial control of the Argentine black and white tegus (Salvator merianae) in South Florida. There is uncertainty about tegu demography and we considered several combinations of model parameters, exhibiting various dynamics of invasion. For a fixed one-year budget, we show that the risk-averse strategy, which optimizes the worst-case scenario of tegus’ dynamics, and the risk-neutral strategy, which optimizes the expected scenario, both concentrated control close to the point of introduction. A risk-seeking strategy, which optimizes the best-case scenario, focuses more on models where eradication of the species in a cell is possible and consists of spreading control as much as possible. 
For the establishment of a containment area, assuming exponential growth, we show that with current control methods it might not be possible to implement such a strategy for some of the models that we considered. Including different possible models allows an examination of how the strategy is expected to perform in different scenarios. Then, a strategy that accounts for the risk attitude of the decision-maker can be designed.
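
To make the budget-constrained formulation concrete: with a linear objective (invaders removed per unit effort), a single linear budget constraint, and per-cell effort bounds, the linear program reduces to a fractional knapsack that can be solved greedily. The sketch below is a hypothetical illustration of that special case, not the authors' reaction-diffusion-coupled model; all names and numbers are invented.

```python
# Hypothetical sketch: allocate removal effort across cells to maximize
# invaders removed under a fixed budget. With one linear budget constraint
# and per-cell effort caps, the LP is a fractional knapsack: fund cells in
# order of invaders removed per unit cost.
def allocate_control(kill_per_unit, cost_per_unit, max_effort, budget):
    order = sorted(range(len(kill_per_unit)),
                   key=lambda i: kill_per_unit[i] / cost_per_unit[i],
                   reverse=True)
    effort = [0.0] * len(kill_per_unit)
    for i in order:
        if budget <= 0:
            break
        # Spend on the best remaining cell until its cap or the budget binds.
        spend = min(budget, cost_per_unit[i] * max_effort[i])
        effort[i] = spend / cost_per_unit[i]
        budget -= spend
    return effort

effort = allocate_control(
    kill_per_unit=[5.0, 4.0, 1.0],   # invaders removed per unit effort
    cost_per_unit=[1.0, 2.0, 1.0],   # cost per unit effort in each cell
    max_effort=[10.0, 10.0, 10.0],   # per-cell effort caps
    budget=15.0,
)
```

In the general case (containment constraints, multiple scenarios, reaction-diffusion dynamics), a full LP solver is needed, but the greedy special case shows why control tends to concentrate where the removal-to-cost ratio is highest.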

  19. Social Emotional Optimization Algorithm for Nonlinear Constrained Optimization Problems

    NASA Astrophysics Data System (ADS)

    Xu, Yuechun; Cui, Zhihua; Zeng, Jianchao

    Nonlinear programming is an important branch of operations research and has been successfully applied to various real-life problems. In this paper, a new approach called the social emotional optimization algorithm (SEOA) is used to solve this problem; it is a new swarm intelligence technique that simulates human behavior guided by emotion. Simulation results show that the social emotional optimization algorithm proposed in this paper is effective and efficient for nonlinear constrained programming problems.

  20. Evolutionary optimization methods for accelerator design

    NASA Astrophysics Data System (ADS)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as ease of implementation, modest requirements on the objective function, good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe in detail GATool, the evolutionary algorithm and software package used in this work. We then use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT. 
We assess REPROPT's performance on the standard constrained optimization test problems for EAs with a variety of different configurations and suggest optimal default parameter values based on the results. We then study the performance of the REPA method on the same set of test problems and compare the obtained results with those of several commonly used constrained optimization methods with EAs. Based on the obtained results, particularly on the outstanding performance of REPA on a test problem that presents significant difficulty for the other reviewed EAs, we conclude that the proposed method is useful and competitive. We discuss REPA parameter tuning for difficult problems and critically review some of the problems from the de facto standard test problem set for constrained optimization with EAs. In order to demonstrate the practical usefulness of the developed method, we study several problems of accelerator design and demonstrate how they can be solved with EAs. These problems include a simple accelerator design problem (design a quadrupole triplet to be stigmatically imaging, find all possible solutions), a complex real-life accelerator design problem (optimization of the front-end section for the future neutrino factory), and a problem of normal form defect function optimization, which is used to rigorously estimate the stability of beam dynamics in circular accelerators. The positive results we obtained suggest that the application of EAs to problems from accelerator theory can be very beneficial and has great potential. The developed optimization scenarios and tools can be used to approach similar problems.
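
The repair idea behind methods like REPA can be illustrated with a toy (1+1) evolution strategy in which infeasible offspring are projected back onto the constraint boundary before evaluation. This is a generic sketch on an invented problem (minimize x0^2 + x1^2 subject to x0 + x1 >= 1, with optimum 0.5 at (0.5, 0.5)); it is not the REFIND or REPROPT procedures themselves.

```python
import random

# Toy (1+1) evolution strategy with repair-based constraint handling.
def repair(x):
    # Project an infeasible point onto the boundary plane x0 + x1 = 1.
    slack = 1.0 - (x[0] + x[1])
    if slack > 0:
        x = [x[0] + slack / 2.0, x[1] + slack / 2.0]
    return x

def objective(x):
    return x[0] ** 2 + x[1] ** 2

def evolve(steps=2000, sigma=0.1, seed=1):
    random.seed(seed)
    parent = repair([2.0, 2.0])
    f_parent = objective(parent)
    for _ in range(steps):
        # Mutate, repair to feasibility, then apply elitist selection.
        child = repair([p + random.gauss(0.0, sigma) for p in parent])
        f_child = objective(child)
        if f_child <= f_parent:
            parent, f_parent = child, f_child
    return parent, f_parent

best, f_best = evolve()  # approaches the constrained optimum at (0.5, 0.5)
```

Because the repair operator guarantees every evaluated point is feasible, no penalty terms are needed, which is the main appeal of repairing over penalty-based constraint handling.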

  1. A weak Hamiltonian finite element method for optimal control problems

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Bless, Robert R.

    1989-01-01

    A temporal finite element method based on a mixed form of the Hamiltonian weak principle is developed for dynamics and optimal control problems. The mixed form of Hamilton's weak principle contains both displacements and momenta as primary variables that are expanded in terms of nodal values and simple polynomial shape functions. Unlike other forms of Hamilton's principle, however, time derivatives of the momenta and displacements do not appear therein; instead, only the virtual momenta and virtual displacements are differentiated with respect to time. Based on the duality that is observed to exist between the mixed form of Hamilton's weak principle and variational principles governing classical optimal control problems, a temporal finite element formulation of the latter can be developed in a rather straightforward manner. Several well-known problems in dynamics and optimal control are illustrated. The example dynamics problem involves a time-marching problem. As optimal control examples, elementary trajectory optimization problems are treated.

  2. A weak Hamiltonian finite element method for optimal control problems

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Bless, Robert R.

    1990-01-01

    A temporal finite element method based on a mixed form of the Hamiltonian weak principle is developed for dynamics and optimal control problems. The mixed form of Hamilton's weak principle contains both displacements and momenta as primary variables that are expanded in terms of nodal values and simple polynomial shape functions. Unlike other forms of Hamilton's principle, however, time derivatives of the momenta and displacements do not appear therein; instead, only the virtual momenta and virtual displacements are differentiated with respect to time. Based on the duality that is observed to exist between the mixed form of Hamilton's weak principle and variational principles governing classical optimal control problems, a temporal finite element formulation of the latter can be developed in a rather straightforward manner. Several well-known problems in dynamics and optimal control are illustrated. The example dynamics problem involves a time-marching problem. As optimal control examples, elementary trajectory optimization problems are treated.

  3. Weak Hamiltonian finite element method for optimal control problems

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Bless, Robert R.

    1991-01-01

    A temporal finite element method based on a mixed form of the Hamiltonian weak principle is developed for dynamics and optimal control problems. The mixed form of Hamilton's weak principle contains both displacements and momenta as primary variables that are expanded in terms of nodal values and simple polynomial shape functions. Unlike other forms of Hamilton's principle, however, time derivatives of the momenta and displacements do not appear therein; instead, only the virtual momenta and virtual displacements are differentiated with respect to time. Based on the duality that is observed to exist between the mixed form of Hamilton's weak principle and variational principles governing classical optimal control problems, a temporal finite element formulation of the latter can be developed in a rather straightforward manner. Several well-known problems in dynamics and optimal control are illustrated. The example dynamics problem involves a time-marching problem. As optimal control examples, elementary trajectory optimization problems are treated.

  4. Hybrid maize breeding with doubled haploids: I. One-stage versus two-stage selection for testcross performance.

    PubMed

    Longin, C Friedrich H; Utz, H Friedrich; Reif, Jochen C; Schipprack, Wolfgang; Melchinger, Albrecht E

    2006-03-01

    Optimum allocation of resources is of fundamental importance for the efficiency of breeding programs. The objectives of our study were to (1) determine the optimum allocation for the number of lines and test locations in hybrid maize breeding with doubled haploids (DHs) regarding two optimization criteria, the selection gain deltaG(k) and the probability P(k) of identifying superior genotypes, (2) compare both optimization criteria including their standard deviations (SDs), and (3) investigate the influence of production costs of DHs on the optimum allocation. For different budgets, number of finally selected lines, ratios of variance components, and production costs of DHs, the optimum allocation of test resources under one- and two-stage selection for testcross performance with a given tester was determined by using Monte Carlo simulations. In one-stage selection, lines are tested in field trials in a single year. In two-stage selection, optimum allocation of resources involves evaluation of (1) a large number of lines in a small number of test locations in the first year and (2) a small number of the selected superior lines in a large number of test locations in the second year, thereby maximizing both optimization criteria. Furthermore, to have a realistic chance of identifying a superior genotype, the probability P(k) of identifying superior genotypes should be greater than 75%. For budgets between 200 and 5,000 field plot equivalents, P(k) > 75% was reached only for genotypes belonging to the best 5% of the population. As the optimum allocation for P(k)(5%) was similar to that for deltaG(k), the choice of the optimization criterion was not crucial. The production costs of DHs had only a minor effect on the optimum number of locations and on values of the optimization criteria.

  5. Health sector operational planning and budgeting processes in Kenya—“never the twain shall meet”

    PubMed Central

    Molyneux, Sassy; Goodman, Catherine

    2015-01-01

    Summary Operational planning is considered an important tool for translating government policies and strategic objectives into day‐to‐day management activities. However, developing countries suffer from persistent misalignment between policy, planning and budgeting. The Medium Term Expenditure Framework (MTEF) was introduced to address this misalignment. Kenya adopted the MTEF in the early 2000s, and in 2005, the Ministry of Health adopted the Annual Operational Plan process to adapt the MTEF to the health sector. This study assessed the degree to which the health sector Annual Operational Plan process in Kenya has achieved alignment between planning and budgeting at the national level, using document reviews, participant observation and key informant interviews. We found that the Kenyan health sector was far from achieving planning and budgeting alignment. Several factors contributed to this problem including weak Ministry of Health stewardship and institutionalized separation between planning and budgeting processes; a rapidly changing planning and budgeting environment; lack of reliable data to inform target setting and poor participation by key stakeholders in the process including a top‐down approach to target setting. We conclude that alignment is unlikely to be achieved without consideration of the specific institutional contexts and the power relationships between stakeholders. In particular, there is a need for institutional integration of the planning and budgeting processes into a common cycle and framework with common reporting lines and for improved data and local‐level input to inform appropriate and realistic target setting. © 2015 The Authors. International Journal of Health Planning and Management published by John Wiley & Sons, Ltd. PMID:25783862

  6. How to apply the optimal estimation method to your lidar measurements for improved retrievals of temperature and composition

    NASA Astrophysics Data System (ADS)

    Sica, R. J.; Haefele, A.; Jalali, A.; Gamage, S.; Farhani, G.

    2018-04-01

    The optimal estimation method (OEM) has a long history of use in passive remote sensing, but has only recently been applied to active instruments like lidar. The OEM's advantages over traditional techniques include obtaining a full systematic and random uncertainty budget, plus the ability to work with the raw measurements without first applying instrument corrections. In our meeting presentation we will show you how to use the OEM for temperature and composition retrievals for Rayleigh-scatter, Raman-scatter, and DIAL lidars.
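
For readers unfamiliar with the OEM, the linear-Gaussian case fits in a few lines of numpy: the retrieved state is the maximum a posteriori combination of the measurements and the a priori, and the posterior covariance is the random-uncertainty budget the abstract refers to. The Jacobian, covariances, a priori, and "truth" below are toy assumptions, not lidar quantities.

```python
import numpy as np

# Linear-Gaussian optimal estimation sketch with invented toy numbers.
K = np.array([[1.0, 0.0],
              [0.5, 0.5],
              [0.0, 1.0]])               # forward-model Jacobian
Se = np.diag([0.1, 0.1, 0.1])           # measurement-noise covariance
Sa = np.diag([4.0, 4.0])                # a priori covariance (weak prior)
x_a = np.array([0.0, 0.0])              # a priori state
x_true = np.array([1.0, 2.0])
y = K @ x_true                          # noise-free synthetic measurements

# Maximum a posteriori retrieval; S_hat is the posterior covariance,
# i.e. the random-uncertainty budget of the retrieved state.
S_hat = np.linalg.inv(np.linalg.inv(Sa) + K.T @ np.linalg.inv(Se) @ K)
x_hat = x_a + S_hat @ K.T @ np.linalg.inv(Se) @ (y - K @ x_a)
```

With the weak prior chosen here, x_hat lands close to x_true, shrunk slightly toward the a priori; for nonlinear forward models the same update is iterated (Gauss-Newton), which is how the method extends to real retrievals.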

  7. Nitrogen supply and demand in short-rotation sweetgum plantations

    Treesearch

    D. Andrew Scott; James A. Burger; Donald J. Kaczmarek; Michael B. Kane

    2004-01-01

    Intensive management is crucial for optimizing hardwood plantation success, and nitrogen (N) nutrition management is one of the most important practices in intensive management. Because management of short-rotation woody crop plantations is a mixture of row-crop agriculture and plantation forestry, we tested the usefulness of an agronomic budget modified for deciduous...

  8. A Practical Solution to Optimizing the Reliability of Teaching Observation Measures under Budget Constraints

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Liu, Xiang; Mashburn, Andrew J.

    2014-01-01

    Researchers often use generalizability theory to estimate relative error variance and reliability in teaching observation measures. They also use it to plan future studies and design the best possible measurement procedures. However, designing the best possible measurement procedure comes at a cost, and researchers must stay within their budget…

  9. Staff Perceptions of E-Learning in a Community Health Care Organization

    ERIC Educational Resources Information Center

    Gabriel, Monica; Longman, Sandra

    2004-01-01

    How do organizations cope with the increased speed of technological change? How do leaders optimize resources with tightened budgets? How do staff and students acquire the necessary knowledge and skills in the midst of constant change? Electronic learning (e-learning) is one form of learning that utilizes technology to deliver, interact or…

  10. Relevant and Timely Learning for Busy Leaders

    ERIC Educational Resources Information Center

    Claxton, Julia; Gold, Jeff; Edwards, Claire; Coope, Gary

    2009-01-01

    Lord Leitch was commissioned by the Chancellor in 2004 with a remit to "identify the UK's optimal skills mix in 2020 to maximise economic growth, productivity and social justice and to consider the policy implications of achieving the level of change required." In the 2006 Budget, the Chancellor asked Lord Leitch to consider how to…

  11. Optimality conditions for the numerical solution of optimization problems with PDE constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro; Ridzal, Denis

    2014-03-01

    A theoretical framework for the numerical solution of partial differential equation (PDE) constrained optimization problems is presented in this report. This theoretical framework embodies the fundamental infrastructure required to efficiently implement and solve this class of problems. Detailed derivations of the optimality conditions required to accurately solve several parameter identification and optimal control problems are also provided in this report. This will allow the reader to further understand how the theoretical abstraction presented in this report translates into application.

  12. Unintended greenhouse gas consequences of lowering level of service in urban transit systems

    NASA Astrophysics Data System (ADS)

    Griswold, Julia B.; Cheng, Han; Madanat, Samer; Horvath, Arpad

    2014-12-01

    Public transit is often touted as a ‘green’ transportation option and a way for users to reduce their environmental footprint by avoiding automobile emissions, but that may not be the case when systems run well below passenger capacity. In previous work, we explored an approach to optimizing the design and operations of transit systems for both costs and emissions, using continuum approximation models and assuming fixed demand. In this letter, we expand upon our previous work to explore how the level of service for users impacts emissions. We incorporate travel time elasticities into the optimization to account for demand shifts from transit to cars, resulting from increases in transit travel time. We find that emissions reductions are moderated, but not eliminated, for relatively inelastic users. We consider two scenarios: the first is where only the agency faces an emissions budget; the second is where the entire city faces an emissions budget. In the latter scenario, the emissions reductions resulting from reductions in transit level of service are mitigated as users switch to automobile.

  13. Value for money? A contingent valuation study of the optimal size of the Swedish health care budget.

    PubMed

    Eckerlund, I; Johannesson, M; Johansson, P O; Tambour, M; Zethraeus, N

    1995-11-01

The contingent valuation method has been developed in the environmental field to measure the willingness to pay for environmental changes using survey methods. In this exploratory study the contingent valuation method was used to analyse how much individuals are willing to spend in total in the form of taxes for health care in Sweden, i.e. to analyse the optimal size of the 'health care budget' in Sweden. A binary contingent valuation question was included in a telephone survey of a random sample of 1260 households in Sweden. With a conservative interpretation of the data the result shows that 50% of the respondents would accept an increased tax payment to health care of about SEK 60 per month ($1 = SEK 8). It is concluded that the results indicate that the population overall thinks that the current spending on health care in Sweden is at a reasonable level. There seems to be a willingness to increase the tax payments somewhat, but major increases do not seem acceptable to a majority of the population.
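A minimal sketch of how binary contingent-valuation responses can yield a median willingness-to-pay estimate: find the bid at which the acceptance rate crosses 50%. The bid levels and yes/no responses below are hypothetical, and linear interpolation is our simplification, not the study's estimator.

```python
# responses maps a bid level (SEK per month, hypothetical) to a list of
# binary accept (1) / reject (0) answers from surveyed households.

def acceptance_rates(responses):
    return {bid: sum(r) / len(r) for bid, r in responses.items()}

def median_wtp(rates):
    # Median WTP = bid where acceptance crosses 50%, by interpolation
    pts = sorted(rates.items())      # ascending bids, decreasing rates
    for (b0, r0), (b1, r1) in zip(pts, pts[1:]):
        if r0 >= 0.5 >= r1:
            return b0 + (r0 - 0.5) * (b1 - b0) / (r0 - r1)
    return None                      # 50% crossing outside bid range

responses = {20: [1, 1, 1, 1, 0],
             60: [1, 1, 0, 1, 0],
             100: [0, 1, 0, 0, 0]}
```

With these illustrative numbers the acceptance rate falls from 60% at SEK 60 to 20% at SEK 100, so the interpolated median is SEK 70 per month.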

  14. FRANOPP: Framework for analysis and optimization problems user's guide

    NASA Technical Reports Server (NTRS)

    Riley, K. M.

    1981-01-01

Framework for analysis and optimization problems (FRANOPP) is a software aid for the study and solution of design (optimization) problems which provides the driving program and plotting capability for a user generated programming system. In addition to FRANOPP, the programming system also contains the optimization code CONMIN, and two user supplied codes, one for analysis and one for output. With FRANOPP the user is provided with five options for studying a design problem. Three of the options utilize the plot capability and present an in-depth study of the design problem. The study can be focused on a history of the optimization process or on the interaction of variables within the design problem.

  15. Fuzzy bi-objective preventive health care network design.

    PubMed

    Davari, Soheil; Kilic, Kemal; Ertek, Gurdal

    2015-09-01

Preventive health care is unlike health care for acute ailments, as people are less alert to their unknown medical problems. In order to motivate the public and attain desired participation levels for preventive programs, the attractiveness of the health care facility is a major concern. The health economics literature indicates that the attractiveness of a facility is significantly influenced by the proximity of clients to it. Hence attractiveness is generally modelled as a function of distance. However, abundant empirical evidence suggests that other qualitative factors, such as perceived quality, attractions nearby, and amenities, also influence attractiveness. Therefore, a realistic measure should incorporate the vagueness in the concept of attractiveness into the model. Public policy makers should also maintain equity among various neighborhoods, which is considered as a second objective. Finally, even though the general tendency in the literature is to focus on health benefits, cost effectiveness is still a factor that should be considered. In this paper, a fuzzy bi-objective model with budget constraints is developed. Later, by modelling attractiveness by means of triangular fuzzy numbers and treating the budget constraint as a soft constraint, a modified (and more realistic) version of the model is introduced. Two solution methodologies, namely fuzzy goal programming and fuzzy chance-constrained optimization, are proposed. Both the original and the modified models are solved within the framework of a case study in Istanbul, Turkey. In the case study, Microsoft Bing Maps is utilized in order to determine more accurate distance measures among the nodes.

  16. The potential impact of immunization campaign budget re-allocation on global eradication of paediatric infectious diseases

    PubMed Central

    2011-01-01

Background The potential benefits of coordinating infectious disease eradication programs that use campaigns such as supplementary immunization activities (SIAs) should not be overlooked. One example of a coordinated approach is an adaptive "sequential strategy": first, the entire annual SIA budget is dedicated to the eradication of a single infectious disease; once that disease is eradicated, the annual SIA budget is refocused on eradicating a second disease, and so on. Herd immunity suggests that a sequential strategy may eradicate several infectious diseases faster than a non-adaptive "simultaneous strategy" of dividing the annual budget equally among eradication programs for those diseases. However, mathematical modeling is required to understand the potential extent of this effect. Methods Our objective was to illustrate how budget allocation strategies can interact with the nonlinear nature of disease transmission to determine the time to eradication of several infectious diseases under different budget allocation strategies. Using a mathematical transmission model, we analyzed three hypothetical vaccine-preventable infectious diseases in three different countries. A central decision-maker can distribute funding among SIA programs for these three diseases according to either a sequential strategy or a simultaneous strategy. We explored the time to eradication under these two strategies under a range of scenarios. Results For a certain range of annual budgets, all three diseases can be eradicated relatively quickly under the sequential strategy, whereas eradication never occurs under the simultaneous strategy. However, moderate changes to the total SIA budget, SIA frequency, order of eradication, or funding disruptions can create disproportionately large differences in the time and budget required for eradication under the sequential strategy.
We find that the predicted time to eradication can be very sensitive to small differences in the rate of case importation between the countries. We also find that the time to eradication of all three diseases is not necessarily lowest when the least transmissible disease is targeted first. Conclusions Relatively modest differences in budget allocation strategies in the near-term can result in surprisingly large long-term differences in time required to eradicate, as a result of the amplifying effects of herd immunity and the nonlinearities of disease transmission. More sophisticated versions of such models may be useful to large international donors or other organizations as a planning or portfolio optimization tool, where choices must be made regarding how much funding to dedicate to different infectious disease eradication efforts. PMID:21955853

  17. The potential impact of immunization campaign budget re-allocation on global eradication of paediatric infectious diseases.

    PubMed

    Fitzpatrick, Tiffany; Bauch, Chris T

    2011-09-28

The potential benefits of coordinating infectious disease eradication programs that use campaigns such as supplementary immunization activities (SIAs) should not be overlooked. One example of a coordinated approach is an adaptive "sequential strategy": first, the entire annual SIA budget is dedicated to the eradication of a single infectious disease; once that disease is eradicated, the annual SIA budget is refocused on eradicating a second disease, and so on. Herd immunity suggests that a sequential strategy may eradicate several infectious diseases faster than a non-adaptive "simultaneous strategy" of dividing the annual budget equally among eradication programs for those diseases. However, mathematical modeling is required to understand the potential extent of this effect. Our objective was to illustrate how budget allocation strategies can interact with the nonlinear nature of disease transmission to determine the time to eradication of several infectious diseases under different budget allocation strategies. Using a mathematical transmission model, we analyzed three hypothetical vaccine-preventable infectious diseases in three different countries. A central decision-maker can distribute funding among SIA programs for these three diseases according to either a sequential strategy or a simultaneous strategy. We explored the time to eradication under these two strategies under a range of scenarios. For a certain range of annual budgets, all three diseases can be eradicated relatively quickly under the sequential strategy, whereas eradication never occurs under the simultaneous strategy. However, moderate changes to the total SIA budget, SIA frequency, order of eradication, or funding disruptions can create disproportionately large differences in the time and budget required for eradication under the sequential strategy. We find that the predicted time to eradication can be very sensitive to small differences in the rate of case importation between the countries.
We also find that the time to eradication of all three diseases is not necessarily lowest when the least transmissible disease is targeted first. Relatively modest differences in budget allocation strategies in the near-term can result in surprisingly large long-term differences in time required to eradicate, as a result of the amplifying effects of herd immunity and the nonlinearities of disease transmission. More sophisticated versions of such models may be useful to large international donors or other organizations as a planning or portfolio optimization tool, where choices must be made regarding how much funding to dedicate to different infectious disease eradication efforts.
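The sequential-versus-simultaneous contrast can be illustrated with a deliberately crude toy model (not the authors' transmission model): eradication progress accrues only when annual spending on a disease exceeds a maintenance threshold, a blunt stand-in for the herd-immunity nonlinearity. All numbers are illustrative.

```python
# effort[i]     : cumulative spending above threshold needed to eradicate i
# thresholds[i] : annual spend required just to hold prevalence steady

def years_to_eradicate(effort, thresholds, budget, sequential=True):
    remaining = list(effort)
    years = 0
    while any(r > 0 for r in remaining):
        years += 1
        if years > 1000:               # the equal split may never finish
            return None
        if sequential:                  # all budget to first active disease
            i = next(i for i, r in enumerate(remaining) if r > 0)
            remaining[i] -= max(0.0, budget - thresholds[i])
        else:                           # equal split across active diseases
            active = [i for i, r in enumerate(remaining) if r > 0]
            share = budget / len(active)
            for i in active:
                remaining[i] -= max(0.0, share - thresholds[i])
    return years

effort = [10.0, 10.0, 10.0]
thresholds = [2.0, 2.0, 2.0]
budget = 5.0
```

Here the sequential strategy eradicates all three diseases in 12 years, while the equal split (5/3 per disease, below the threshold of 2) never eradicates any, mirroring the qualitative result described above.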

  18. Evaluation of Genetic Algorithm Concepts Using Model Problems. Part 2; Multi-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

A genetic algorithm approach suitable for solving multi-objective optimization problems is described and evaluated using a series of simple model problems. Several new features, including a binning selection algorithm and a gene-space transformation procedure, are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all optimization problems attempted. The binning algorithm generally provides Pareto front quality enhancements and moderate convergence efficiency improvements for most of the model problems. The gene-space transformation procedure provides a large convergence efficiency enhancement for problems with non-convoluted Pareto fronts and a degradation in efficiency for problems with convoluted Pareto fronts. The most difficult problems (multi-mode search spaces with a large number of genes and convoluted Pareto fronts) require a large number of function evaluations for GA convergence, but always converge.
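The Pareto-dominance test at the heart of such multi-objective selection can be sketched in a few lines (minimization assumed in every objective; this is generic code, not the paper's implementation):

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly
    # better in at least one (both minimized)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep only points not dominated by any other point
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
```

For the sample points, (3, 4) is dominated by (2, 3) and (5, 5) by several others, leaving a three-point front.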

  19. Wireless Sensor Network Optimization: Multi-Objective Paradigm.

    PubMed

    Iqbal, Muhammad; Naeem, Muhammad; Anpalagan, Alagan; Ahmed, Ashfaq; Azam, Muhammad

    2015-07-20

Optimization problems relating to wireless sensor network planning, design, deployment and operation often give rise to multi-objective optimization formulations where multiple desirable objectives compete with each other and the decision maker has to select one of the tradeoff solutions. These multiple objectives may or may not conflict with each other. The type of optimization problem changes with the nature of the application, the sensing scenario and the input/output of the problem. To address the differing natures of optimization problems relating to wireless sensor network design, deployment, operation, planning and placement, a plethora of optimization solution types exists. We review and analyze different desirable objectives to show whether they conflict with each other, support each other or are design dependent. We also present a generic multi-objective optimization problem relating to wireless sensor networks which consists of input variables, required output, objectives and constraints. A list of constraints is also presented to give an overview of the different constraints which are considered while formulating optimization problems in wireless sensor networks. Given this article's multi-faceted coverage of multi-objective optimization, it should open up new avenues of research in the area of multi-objective optimization relating to wireless sensor networks.

  20. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.

  1. A Kind of Nonlinear Programming Problem Based on Mixed Fuzzy Relation Equations Constraints

    NASA Astrophysics Data System (ADS)

    Li, Jinquan; Feng, Shuang; Mi, Honghai

In this work, a kind of nonlinear programming problem with a non-differentiable objective function, under constraints expressed by a system of mixed fuzzy relation equations, is investigated. First, some properties of this kind of optimization problem are obtained. Then, a polynomial-time algorithm for this kind of optimization problem is proposed based on these properties. Furthermore, we show that this algorithm is optimal for the optimization problem considered in this paper. Finally, numerical examples are provided to illustrate our algorithm.

  2. Supporting local planning and budgeting for maternal, neonatal and child health in the Philippines

    PubMed Central

    2013-01-01

Background Responsibility for planning and delivery of health services in the Philippines is devolved to the local government level. Given the recognised need to strengthen capacity for local planning and budgeting, we implemented Investment Cases (IC) for Maternal, Neonatal and Child Health (MNCH) in three selected sub-national units: two poor, rural provinces and one highly-urbanised city. The IC combines structured problem-solving by local policymakers and planners to identify key health system constraints and strategies to scale up critical MNCH interventions with a decision-support model to estimate the cost and impact of different scaling-up scenarios. Methods We outline how the initiative was implemented, the aspects that worked well, and the key limitations identified in the sub-national application of this approach. Results Local officials found the structured analysis of health system constraints helpful to identify problems and select locally appropriate strategies. In particular the process was an improvement on standard approaches that focused only on supply-side issues. However, the lack of data available at the local level is a major impediment to planning. While the majority of the strategies recommended by the IC were incorporated into the 2011 plans and budgets in the three study sites, one key strategy in the participating city was subsequently reversed in 2012. Higher-level systemic issues are likely to have influenced use of evidence in plans and budgets and implementation of strategies. Conclusions Efforts should be made to improve locally-representative data through routine information systems for planning and monitoring purposes. Even with sound plans and budgets, evidence is only one factor influencing investments in health. Political considerations at a local level and issues related to decentralisation influence prioritisation and implementation of plans.
In addition to the strengthening of capacity at local level, a parallel process at a higher level of government to relieve fund channelling and coordination issues is critical for any evidence-based planning approach to have a significant impact on health service delivery. PMID:23343218

  3. Student Leadership: A Checklist for Success.

    ERIC Educational Resources Information Center

    Chmielewski, Tonya R.

    2000-01-01

    Presents a checklist of student-government advisers' responsibilities governing bylaws, organization, section or appointment guidelines, job descriptions, committees, meetings, attendance policies, decision making, problem solving, public relations, goals, communication methods, training, budgets, fund raising, advertising, and other matters. A…

  4. Fast Optimization for Aircraft Descent and Approach Trajectory

    NASA Technical Reports Server (NTRS)

    Luchinsky, Dmitry G.; Schuet, Stefan; Brenton, J.; Timucin, Dogan; Smith, David; Kaneshige, John

    2017-01-01

We address the problem of online scheduling of aircraft descent and approach trajectories. We formulate a general multiphase optimal control problem for optimization of the descent trajectory and review available methods for its solution. We develop a fast algorithm for the solution of this problem using two key components: (i) fast inference of the dynamical and control variables of the descending trajectory from low-dimensional flight profile data and (ii) efficient local search for the resulting reduced-dimensionality non-linear optimization problem. We compare the performance of the proposed algorithm with the numerical solution obtained using the General Pseudospectral Optimal Control Software optimal control toolbox. We present results of the solution of the scheduling problem for aircraft descent using the novel fast algorithm and discuss its future applications.

  5. Research on cutting path optimization of sheet metal parts based on ant colony algorithm

    NASA Astrophysics Data System (ADS)

    Wu, Z. Y.; Ling, H.; Li, L.; Wu, L. H.; Liu, N. B.

    2017-09-01

    In view of the disadvantages of the current cutting path optimization methods of sheet metal parts, a new method based on ant colony algorithm was proposed in this paper. The cutting path optimization problem of sheet metal parts was taken as the research object. The essence and optimization goal of the optimization problem were presented. The traditional serial cutting constraint rule was improved. The cutting constraint rule with cross cutting was proposed. The contour lines of parts were discretized and the mathematical model of cutting path optimization was established. Thus the problem was converted into the selection problem of contour lines of parts. Ant colony algorithm was used to solve the problem. The principle and steps of the algorithm were analyzed.
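A compact ant colony optimization sketch of the path-selection step, treating the cutting entry points as a travelling-salesman-style open tour. The parameters (alpha, beta, rho, ant count) and the collinear test points are illustrative defaults, not the paper's settings.

```python
import random

def tour_length(tour, dist):
    return sum(dist[a][b] for a, b in zip(tour, tour[1:]))

def ant_colony(dist, n_ants=10, n_iter=50, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour, unvisited = [0], set(range(1, n))
            while unvisited:
                i = tour[-1]
                cand = sorted(unvisited)
                # transition weights: pheromone^alpha * (1/distance)^beta
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in cand]
                j = rng.choices(cand, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            tours.append(tour)
        for row in tau:                           # evaporation
            for j in range(n):
                row[j] *= 1.0 - rho
        for tour in tours:                        # pheromone deposit
            length = tour_length(tour, dist)
            if length < best_len:
                best_tour, best_len = tour, length
            for a, b in zip(tour, tour[1:]):
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best_tour, best_len

# Four collinear cut entry points; the shortest open tour from point 0
# visits them in order.
pts = [0.0, 1.0, 2.0, 3.0]
dist = [[abs(p - q) for q in pts] for p in pts]
```

In the paper's setting the "cities" would be discretized contour points and the constraint rules would restrict admissible transitions; this sketch keeps only the core construction-and-reinforcement loop.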

  6. Cost-effectiveness of the U.S. Geological Survey stream-gaging program in Indiana

    USGS Publications Warehouse

    Stewart, J.A.; Miller, R.L.; Butch, G.K.

    1986-01-01

Analysis of the stream gaging program in Indiana was divided into three phases. The first phase involved collecting information concerning the data need and the funding source for each of the 173 surface water stations in Indiana. The second phase used alternate methods to produce streamflow records at selected sites. Statistical models were used to generate streamflow data for three gaging stations. In addition, flow routing models were used at two of the sites. Daily discharges produced from models did not meet the established accuracy criteria and, therefore, these methods should not replace stream gaging procedures at those gaging stations. The third phase of the study determined the uncertainty of the rating and the error at individual gaging stations, and optimized travel routes and frequency of visits to gaging stations. The annual budget, in 1983 dollars, for operating the stream gaging program in Indiana is $823,000. The average standard error of instantaneous discharge for all continuous record gaging stations is 25.3%. A budget of $800,000 could maintain this level of accuracy if stream gaging stations were visited according to phase III results. A minimum budget of $790,000 is required to operate the gaging network. At this budget, the average standard error of instantaneous discharge would be 27.7%. A maximum budget of $1,000,000 was simulated in the analysis and the average standard error of instantaneous discharge was reduced to 16.8%. (Author's abstract)
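The reported budget-accuracy trade-off can be interpolated between the three simulated points from the abstract; linear interpolation between reported points is our simplification, not the study's uncertainty analysis.

```python
# (budget in 1983 dollars, average standard error of instantaneous
# discharge in percent), as reported in the abstract above.
points = [(790_000, 27.7), (823_000, 25.3), (1_000_000, 16.8)]

def std_error_at(budget):
    # Piecewise-linear interpolation between the simulated budget levels
    for (b0, e0), (b1, e1) in zip(points, points[1:]):
        if b0 <= budget <= b1:
            return e0 + (budget - b0) * (e1 - e0) / (b1 - b0)
    raise ValueError("budget outside simulated range")
```

For example, a $900,000 budget interpolates to roughly a 21.6% average standard error.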

  7. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique; this problem-dependent code then embodies the definitions of design variables, the objective function and the design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  8. Application of the gravity search algorithm to multi-reservoir operation optimization

    NASA Astrophysics Data System (ADS)

    Bozorg-Haddad, Omid; Janbaz, Mahdieh; Loáiciga, Hugo A.

    2016-12-01

Complexities in river discharge, variable rainfall regime, and drought severity merit the use of advanced optimization tools in multi-reservoir operation. The gravity search algorithm (GSA) is an evolutionary optimization algorithm based on the law of gravity and mass interactions. This paper explores the GSA's efficacy for solving benchmark functions, single reservoir, and four-reservoir operation optimization problems. The GSA's solutions are compared with those of the well-known genetic algorithm (GA) in three optimization problems. The results show that the GSA's results are closer to the optimal solutions than the GA's results in minimizing the benchmark functions. The average values of the objective function equal 1.218 and 1.746 with the GSA and GA, respectively, in solving the single-reservoir hydropower operation problem. The global solution equals 1.213 for this same problem. The GSA converged to 99.97% of the global solution in its average-performing history, while the GA converged to 97% of the global solution of the four-reservoir problem. Requiring fewer parameters for algorithmic implementation and reaching the optimal solution in a smaller number of function evaluations are additional advantages of the GSA over the GA. The results of the three optimization problems demonstrate a superior performance of the GSA for optimizing general mathematical problems and the operation of reservoir systems.
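A simplified GSA sketch minimizing a test function: masses are derived from fitness, agents attract one another under a gravitational constant that decays over the run. Parameter values are illustrative, and the Kbest refinement of the original GSA is omitted.

```python
import math
import random

def gsa(f, dim, bounds, n_agents=20, n_iter=100, g0=100.0, decay=20.0, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_agents)]
    v = [[0.0] * dim for _ in range(n_agents)]
    best_x, best_f = None, float("inf")
    for t in range(n_iter):
        fit = [f(xi) for xi in x]
        for xi, fi in zip(x, fit):
            if fi < best_f:
                best_x, best_f = list(xi), fi
        worst, best = max(fit), min(fit)
        span = (worst - best) or 1.0
        m = [(worst - fi) / span for fi in fit]        # raw masses
        total = sum(m) or 1.0
        mass = [mi / total for mi in m]                # normalized masses
        g = g0 * math.exp(-decay * t / n_iter)         # decaying constant
        for i in range(n_agents):
            acc = [0.0] * dim
            for j in range(n_agents):
                if i == j:
                    continue
                r = math.dist(x[i], x[j]) + 1e-9
                for d in range(dim):
                    # the agent's own mass cancels, leaving mass[j]
                    acc[d] += rng.random() * g * mass[j] * (x[j][d] - x[i][d]) / r
            for d in range(dim):
                v[i][d] = rng.random() * v[i][d] + acc[d]
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
    return best_x, best_f
```

For example, minimizing the sphere function sum(c*c for c in z) over [-5, 5]^2 drives the best-found value steadily downward as iterations accumulate.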

  9. Discrete particle swarm optimization to solve multi-objective limited-wait hybrid flow shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Santosa, B.; Siswanto, N.; Fiqihesa

    2018-04-01

This paper proposes a discrete Particle Swarm Optimization (PSO) to solve the limited-wait hybrid flow shop scheduling problem with multiple objectives. Flow shop scheduling represents the condition when several machines are arranged in series and each job must be processed at each machine in the same sequence. The objective functions are minimizing completion time (makespan), total tardiness time, and total machine idle time. Flow shop scheduling models continue to evolve to represent real production systems accurately. Since flow shop scheduling is an NP-hard problem, the most suitable solution methods are metaheuristics. One metaheuristic algorithm is Particle Swarm Optimization (PSO), an algorithm based on the behavior of a swarm. Originally, PSO was intended to solve continuous optimization problems. Since flow shop scheduling is a discrete optimization problem, we need to modify PSO to fit the problem. The modification is done by using a probability transition matrix mechanism, while to handle the multi-objective problem, we use a Pareto-optimal approach (MPSO). The results of MPSO are better than those of PSO because the MPSO solution set has a higher probability of containing the optimal solution; in addition, the MPSO solution set is closer to the optimal solution.
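The makespan objective such a scheduler evaluates can be sketched with the standard flow shop recurrence: a job starts on a machine only when both the machine and the job's previous operation are free. Processing times below are illustrative, not the paper's instances.

```python
def makespan(sequence, proc):
    # proc[job][machine] = processing time; jobs visit machines in
    # series, in the same job sequence on every machine.
    n_machines = len(proc[0])
    done = [0.0] * n_machines          # completion time on each machine
    for job in sequence:
        for m in range(n_machines):
            start = max(done[m], done[m - 1] if m > 0 else 0.0)
            done[m] = start + proc[job][m]
    return done[-1]

proc = [[3, 2], [1, 4], [2, 1]]        # 3 jobs, 2 machines
```

A metaheuristic such as the discrete PSO described above searches over the job sequence; for this tiny instance, sequencing job 1 first cuts the makespan from 10 to the optimal 8.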

  10. Constraint Optimization Literature Review

    DTIC Science & Technology

    2015-11-01

This report surveys constraint satisfaction problems (CSPs) and constraint optimization problems (COPs), together with constraint optimization algorithms including brute-force search, constraint propagation, depth-first search, and local search. Subject terms: high-performance computing, mobile ad hoc network, optimization, constraint, satisfaction.

  11. Tax revenue and inflation rate predictions in Banda Aceh using Vector Error Correction Model (VECM)

    NASA Astrophysics Data System (ADS)

    Maulia, Eva; Miftahuddin; Sofyan, Hizir

    2018-05-01

A country has some important parameters for achieving economic welfare, such as tax revenues and inflation. One of the largest revenues of the state budget in Indonesia comes from the tax sector. In addition, the rate of inflation occurring in a country can be used as one measure of the economic problems that the country is facing. Given the importance of tax revenue and inflation rate control in achieving economic prosperity, it is necessary to analyze the relationship between, and to forecast, tax revenue and the inflation rate. VECM (Vector Error Correction Model) was chosen as the method used in this research because the data used are multivariate time series data. This study aims to produce a VECM model with optimal lag and to predict tax revenue and the inflation rate with the VECM model. The results show that the best model for the tax revenue and inflation rate data in Banda Aceh City is VECM with 3rd optimal lag, or VECM(3). Of the seven models formed, one is significant: the income tax revenue model. The predicted tax revenue and inflation rate in Banda Aceh City for the next 6, 12 and 24 periods (months) obtained using VECM(3) are considered valid, since they have minimum error values compared to the other models.

  12. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criteria with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
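For a one-parameter model the Fisher Information Matrix (FIM) comparison reduces to a scalar, which makes the design idea easy to sketch. The exponential-decay model below is illustrative, not one of the paper's three examples; with unit noise variance the FIM is the sum of squared parameter sensitivities over the sampling times.

```python
import math

def fim(theta, times):
    # Model y(t) = exp(-theta * t); sensitivity dy/dtheta = -t*exp(-theta*t).
    # Scalar FIM = sum of squared sensitivities (unit noise variance).
    return sum((t * math.exp(-theta * t)) ** 2 for t in times)

theta = 1.0
early = [0.2, 0.4, 0.6]     # samples bunched near t = 0
spread = [0.5, 1.0, 1.5]    # samples near t = 1/theta, where |dy/dtheta| peaks
```

A D-optimal design maximizes det(FIM) (here just the scalar), so the spread design, which samples where the sensitivity is large, is preferred; the corresponding asymptotic standard error scales as 1/sqrt(FIM).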

  13. How much is new information worth? Evaluating the financial benefit of resolving management uncertainty

    USGS Publications Warehouse

    Maxwell, Sean L.; Rhodes, Jonathan R.; Runge, Michael C.; Possingham, Hugh P.; Ng, Chooi Fei; McDonald Madden, Eve

    2015-01-01

Conservation decision-makers face a trade-off between spending limited funds on direct management action, or gaining new information in an attempt to improve management performance in the future. Value-of-information analysis can help to resolve this trade-off by evaluating how much management performance could improve if new information was gained. Value-of-information analysis has been used extensively in other disciplines, but there are only a few examples where it has informed conservation planning, none of which have used it to evaluate the financial value of gaining new information. We address this gap by applying value-of-information analysis to the management of a declining koala (Phascolarctos cinereus) population. Decision-makers responsible for managing this population face uncertainty about survival and fecundity rates, and how habitat cover affects mortality threats. The value of gaining new information about these uncertainties was calculated using a deterministic matrix model of the koala population to find the expected population growth rate if koala mortality threats were optimally managed under alternative model hypotheses, which represented the uncertainties faced by koala managers. Gaining new information about survival and fecundity rates and the effect of habitat cover on mortality threats will do little to improve koala management. Across a range of management budgets, no more than 1.7% of the budget should be spent on resolving these uncertainties. The value of information was low because optimal management decisions were not sensitive to the uncertainties we considered. Decisions were instead driven by a substantial difference in the cost efficiency of management actions. The value of information was up to forty times higher when the cost efficiencies of different koala management actions were similar. Synthesis and applications.
This study evaluates the ecological and financial benefits of gaining new information to inform a conservation problem. We also theoretically demonstrate that the value of reducing uncertainty is highest when it is not clear which management action is the most cost efficient. This study will help expand the use of value-of-information analyses in conservation by providing a cost efficiency metric by which to evaluate research or monitoring.

  14. Multiobjective optimization approach: thermal food processing.

    PubMed

    Abakarov, A; Sushkov, Y; Almonacid, S; Simpson, R

    2009-01-01

    The objective of this study was to utilize a multiobjective optimization technique for the thermal sterilization of packaged foods. The multiobjective optimization approach used in this study is based on the optimization of well-known aggregating functions by an adaptive random search algorithm. The applicability of the proposed approach was illustrated by solving widely used multiobjective test problems taken from the literature. The numerical results obtained for the multiobjective test problems and for the thermal processing problem show that the proposed approach can be effectively used for solving multiobjective optimization problems arising in the food engineering field.
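    The aggregating-function approach described above can be illustrated with a minimal sketch: a weighted sum collapses the objectives to a single scalar, which a simple adaptive random search then minimizes. The paper's own algorithm and aggregating functions are more elaborate; the step-adaptation rule and the toy objectives below are illustrative assumptions.

    ```python
    import random

    def weighted_sum(fs, weights):
        """Aggregate multiple objective values into one scalar (weighted-sum aggregation)."""
        return sum(w * f for w, f in zip(weights, fs))

    def random_search(objectives, weights, bounds, iters=20000, step=0.5, seed=0):
        """Minimize the aggregated objective by adaptive random search:
        perturb the incumbent, keep improvements, adapt the step size."""
        rng = random.Random(seed)
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        best = weighted_sum([f(x) for f in objectives], weights)
        for _ in range(iters):
            cand = [min(max(xi + rng.gauss(0.0, step), lo), hi)
                    for xi, (lo, hi) in zip(x, bounds)]
            val = weighted_sum([f(cand) for f in objectives], weights)
            if val < best:
                x, best = cand, val
                step *= 1.05                    # expand the step after a success
            else:
                step = max(step * 0.99, 1e-4)   # contract it after a failure
        return x, best

    # Toy bi-objective problem: minimize f1 = x^2 and f2 = (x - 2)^2 on [-5, 5].
    f1 = lambda x: x[0] ** 2
    f2 = lambda x: (x[0] - 2.0) ** 2
    x, val = random_search([f1, f2], weights=[0.5, 0.5], bounds=[(-5.0, 5.0)])
    ```

    With equal weights the two toy objectives are balanced, so the aggregated minimizer lies midway between their individual optima, near x = 1.
    
    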

  15. Pattern uniformity control in integrated structures

    NASA Astrophysics Data System (ADS)

    Kobayashi, Shinji; Okada, Soichiro; Shimura, Satoru; Nafus, Kathleen; Fonseca, Carlos; Biesemans, Serge; Enomoto, Masashi

    2017-03-01

    In our previous paper dealing with multi-patterning, we proposed a new indicator to quantify the quality of final wafer pattern transfer, called interactive pattern fidelity error (IPFE). It detects patterning failures resulting from any source of variation in creating integrated patterns. IPFE is a function of the overlay and edge placement error (EPE) of all layers comprising the final pattern (i.e. lower and upper layers). In this paper, we extend the use cases with Via in addition to the bridge case (Block on Spacer). We propose an IPFE budget and CD budget using simple geometric and statistical models with analysis of variance (ANOVA). In addition, we validate the model with experimental data. From the experimental results, improvements in overlay, local CDU (LCDU) of contact hole (CH) or pillar patterns (especially stochastic pattern noise (SPN)) and pitch walking are all critical to meeting budget requirements. We also provide a special note about the importance of the line length used in analyzing LWR. We find that the IPFE and CD budget requirements are consistent with the ITRS technical requirements table. Therefore, the IPFE concept can be adopted for a variety of integrated structures comprising digital logic circuits. Finally, we suggest how to use IPFE for yield management and optimization requirements for each process.

  16. The coral reefs optimization algorithm: a novel metaheuristic for efficiently solving optimization problems.

    PubMed

    Salcedo-Sanz, S; Del Ser, J; Landa-Torres, I; Gil-López, S; Portilla-Figueras, J A

    2014-01-01

    This paper presents a novel bioinspired algorithm to tackle complex optimization problems: the coral reefs optimization (CRO) algorithm. The CRO algorithm artificially simulates a coral reef, where different corals (namely, solutions to the optimization problem considered) grow and reproduce in coral colonies, fighting by choking out other corals for space in the reef. This fight for space, along with the specific characteristics of the corals' reproduction, produces a robust metaheuristic algorithm shown to be powerful for solving hard optimization problems. In this research the CRO algorithm is tested in several continuous and discrete benchmark problems, as well as in practical application scenarios (i.e., optimum mobile network deployment and off-shore wind farm design). The obtained results confirm the excellent performance of the proposed algorithm and open a line of research for further application of the algorithm to real-world problems.

  17. The Coral Reefs Optimization Algorithm: A Novel Metaheuristic for Efficiently Solving Optimization Problems

    PubMed Central

    Salcedo-Sanz, S.; Del Ser, J.; Landa-Torres, I.; Gil-López, S.; Portilla-Figueras, J. A.

    2014-01-01

    This paper presents a novel bioinspired algorithm to tackle complex optimization problems: the coral reefs optimization (CRO) algorithm. The CRO algorithm artificially simulates a coral reef, where different corals (namely, solutions to the optimization problem considered) grow and reproduce in coral colonies, fighting by choking out other corals for space in the reef. This fight for space, along with the specific characteristics of the corals' reproduction, produces a robust metaheuristic algorithm shown to be powerful for solving hard optimization problems. In this research the CRO algorithm is tested in several continuous and discrete benchmark problems, as well as in practical application scenarios (i.e., optimum mobile network deployment and off-shore wind farm design). The obtained results confirm the excellent performance of the proposed algorithm and open a line of research for further application of the algorithm to real-world problems. PMID:25147860
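    A heavily simplified sketch of the CRO scheme described above, assuming a real-valued minimization problem. The reef occupancy rate, spawning fraction, mutation scale and settle-attempt count are illustrative choices, and the algorithm's depredation phase is omitted; this is not the authors' reference implementation.

    ```python
    import random

    def cro_minimize(f, bounds, reef_size=40, rho=0.6, fb=0.8, generations=200, seed=1):
        """Simplified coral reefs optimization (CRO) sketch: a partially occupied
        reef of candidate solutions; new 'larvae' are produced by crossover
        (broadcast spawning) and mutation (brooding) and try to settle in the
        reef by displacing worse corals."""
        rng = random.Random(seed)
        dim = len(bounds)
        rand_coral = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
        # Occupy a fraction rho of the reef positions initially.
        reef = [rand_coral() if rng.random() < rho else None for _ in range(reef_size)]
        for _ in range(generations):
            corals = [c for c in reef if c is not None]
            larvae = []
            spawners = [c for c in corals if rng.random() < fb]
            rng.shuffle(spawners)
            for a, b in zip(spawners[::2], spawners[1::2]):   # broadcast spawning
                larvae.append([ai if rng.random() < 0.5 else bi
                               for ai, bi in zip(a, b)])
            for c in corals:
                if rng.random() >= fb:                        # brooding (mutation)
                    j = rng.randrange(dim)
                    m = list(c)
                    m[j] = min(max(m[j] + rng.gauss(0, 0.1), bounds[j][0]),
                               bounds[j][1])
                    larvae.append(m)
            for larva in larvae:                              # larvae settlement
                for _ in range(3):                            # a few settle attempts
                    i = rng.randrange(reef_size)
                    if reef[i] is None or f(larva) < f(reef[i]):
                        reef[i] = larva
                        break
        return min((c for c in reef if c is not None), key=f)

    # Minimize the 2-D sphere function; the optimum is at the origin.
    best = cro_minimize(lambda x: sum(v * v for v in x), bounds=[(-5, 5)] * 2)
    ```
    
    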

  18. 8-PSK Signaling over non-linear satellite channels

    NASA Technical Reports Server (NTRS)

    Horan, Sheila B.; Caballero, Ruben B. Eng.

    1996-01-01

    Space agencies are under pressure to use more bandwidth-efficient communication methods because the currently allocated frequency bands are becoming congested. Budget reductions are another problem that the space agencies must deal with; this constraint results in simpler spacecraft carrying fewer communication capabilities and in reduced staff to capture data at the earth stations. It is therefore imperative that the most bandwidth-efficient communication methods be utilized. This thesis presents a study of 8-ary phase shift keying (8PSK) modulation with respect to bandwidth, power efficiency, spurious emissions and interference susceptibility over a non-linear satellite channel.

  19. Quantum Heterogeneous Computing for Satellite Positioning Optimization

    NASA Astrophysics Data System (ADS)

    Bass, G.; Kumar, V.; Dulny, J., III

    2016-12-01

    Hard optimization problems occur in many fields of academic study and practical situations. We present results in which quantum heterogeneous computing is used to solve a real-world optimization problem: satellite positioning. Optimization problems like this can scale very rapidly with problem size, and become unsolvable with traditional brute-force methods. Typically, such problems have been approximately solved with heuristic approaches; however, these methods can take a long time to calculate and are not guaranteed to find optimal solutions. Quantum computing offers the possibility of producing significant speed-up and improved solution quality. There are now commercially available quantum annealing (QA) devices that are designed to solve difficult optimization problems. These devices have 1000+ quantum bits, but they have significant hardware size and connectivity limitations. We present a novel heterogeneous computing stack that combines QA and classical machine learning and allows the use of QA on problems larger than the quantum hardware could solve in isolation. We begin by analyzing the satellite positioning problem with a heuristic solver, the genetic algorithm. The classical computer's comparatively large available memory can explore the full problem space and converge to a solution relatively close to the true optimum. The QA device can then evolve directly to the optimal solution within this more limited space. Preliminary experiments, using the Quantum Monte Carlo (QMC) algorithm to simulate QA hardware, have produced promising results. Working with problem instances with known global minima, we find a solution within 8% in a matter of seconds, and within 5% in a few minutes. Future studies include replacing QMC with commercially available quantum hardware and exploring more problem sets and model parameters. 
Our results have important implications for how heterogeneous quantum computing can be used to solve difficult optimization problems in any field.

  20. Operations research applications in nuclear energy

    NASA Astrophysics Data System (ADS)

    Johnson, Benjamin Lloyd

    This dissertation consists of three papers; the first is published in Annals of Operations Research, the second is nearing submission to INFORMS Journal on Computing, and the third is the predecessor of a paper nearing submission to Progress in Nuclear Energy. We apply operations research techniques to nuclear waste disposal and nuclear safeguards. Although these fields are different, they allow us to showcase some benefits of using operations research techniques to enhance nuclear energy applications. The first paper, "Optimizing High-Level Nuclear Waste Disposal within a Deep Geologic Repository," presents a mixed-integer programming model that determines where to place high-level nuclear waste packages in a deep geologic repository to minimize heat load concentration. We develop a heuristic that increases the size of solvable model instances. The second paper, "Optimally Configuring a Measurement System to Detect Diversions from a Nuclear Fuel Cycle," introduces a simulation-optimization algorithm and an integer-programming model to find the best, or near-best, resource-limited nuclear fuel cycle measurement system with a high degree of confidence. Given location-dependent measurement method precisions, we (i) optimize the configuration of n methods at n locations of a hypothetical nuclear fuel cycle facility, (ii) find the most important location at which to improve method precision, and (iii) determine the effect of measurement frequency on near-optimal configurations and objective values. Our results correspond to existing outcomes but we obtain them at least an order of magnitude faster. The third paper, "Optimizing Nuclear Material Control and Accountability Measurement Systems," extends the integer program from the second paper to locate measurement methods in a larger, hypothetical nuclear fuel cycle scenario given fixed purchase and utilization budgets. 
This paper also presents two mixed-integer quadratic programming models to increase the precision of existing methods given a fixed improvement budget and to reduce the measurement uncertainty in the system while limiting improvement costs. We quickly obtain similar or better solutions compared to several intuitive analyses that take much longer to perform.

  1. Wireless Sensor Network Optimization: Multi-Objective Paradigm

    PubMed Central

    Iqbal, Muhammad; Naeem, Muhammad; Anpalagan, Alagan; Ahmed, Ashfaq; Azam, Muhammad

    2015-01-01

    Optimization problems relating to wireless sensor network planning, design, deployment and operation often give rise to multi-objective optimization formulations where multiple desirable objectives compete with each other and the decision maker has to select one of the tradeoff solutions. These multiple objectives may or may not conflict with each other. Depending on the nature of the application, the sensing scenario and the input/output of the problem, the type of optimization problem changes. To address the differing natures of optimization problems relating to wireless sensor network design, deployment, operation, planning and placement, a plethora of optimization solution types exist. We review and analyze different desirable objectives to show whether they conflict with each other, support each other or are design dependent. We also present a generic multi-objective optimization problem relating to wireless sensor networks which consists of input variables, required output, objectives and constraints. A list of constraints is also presented to give an overview of the different constraints which are considered while formulating optimization problems in wireless sensor networks. Given the multifaceted coverage of multi-objective optimization in this article, it should open up new avenues of research in the area of multi-objective optimization relating to wireless sensor networks. PMID:26205271

  2. Cost component analysis.

    PubMed

    Lörincz, András; Póczos, Barnabás

    2003-06-01

    In optimization, the dimension of the problem may severely, sometimes exponentially, increase optimization time. Parametric function approximators (FAPPs) have been suggested to overcome this problem. Here, a novel FAPP, cost component analysis (CCA), is described. In CCA, the search space is resampled according to the Boltzmann distribution generated by the energy landscape. That is, CCA converts the optimization problem to density estimation. The structure of the induced density is searched by independent component analysis (ICA). The advantage of CCA is that each independent ICA component can be optimized separately. In turn, (i) CCA intends to partition the original problem into subproblems and (ii) separating (partitioning) the original optimization problem into subproblems may serve interpretation. Most importantly, (iii) CCA may give rise to high gains in optimization time. Numerical simulations illustrate the working of the algorithm.

  3. The design of multirate digital control systems

    NASA Technical Reports Server (NTRS)

    Berg, M. C.

    1986-01-01

    The successive loop closures synthesis method is the only method for multirate (MR) synthesis in common use. A new method for MR synthesis is introduced which requires a gradient-search solution to a constrained optimization problem. Some advantages of this method are that the control laws for all control loops are synthesized simultaneously, taking full advantage of all cross-coupling effects, and that simple, low-order compensator structures are easily accommodated. The algorithm and associated computer program for solving the constrained optimization problem are described. The successive loop closures, optimal control, and constrained optimization synthesis methods are applied to two example design problems. A series of compensator pairs are synthesized for each example problem. The successive loop closures, optimal control, and constrained optimization synthesis methods are then compared in the context of the two design problems.

  4. Making a Splash.

    ERIC Educational Resources Information Center

    Maland, Jim

    1998-01-01

    Presents a 10-step process that allows swimming-pool owners to objectively scrutinize their existing facility's needs, construction, and operation and maintenance budgets before renovating a structurally deficient or costly pool. Four examples of problems involving pool-renovation decision making are highlighted. (GR)

  5. Education and Training: Springboard or Hurdle?

    ERIC Educational Resources Information Center

    Walsh, M.

    1987-01-01

    A survey of 19 British companies documented their use of education and training programs. Questions covered such areas as (1) expenditure rates, (2) strategy and policies, (3) appraisal and budgeting, and (4) accounting practice. Problems and potential changes were solicited. (CH)

  6. 48 CFR 9901.304 - Membership.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Administrator. (d) An individual who is particularly knowledgeable about cost accounting problems and systems... Section 9901.304 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET ADMINISTRATION RULES AND PROCEDURES 9901.304 Membership...

  7. 48 CFR 31.107 - Contracts with State, local, and federally recognized Indian tribal governments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... basis for a uniform approach to the problem of determining costs and to promote efficiency and better... requirements promulgated by the sponsoring Government agencies. (b) The Office of Management and Budget will...

  8. Mounting Debt.

    ERIC Educational Resources Information Center

    Fullerton, Jon

    2004-01-01

    Examines the politics of school budgeting. School managers operate with primitive accounting systems that can mask financial problems for years and are trapped by state, federal, and union mandates. Reforms include allowing districts flexibility to reallocate money; demanding both instructional leadership and financial expertise from…

  9. The Evil Twin of Agenor: More Evidence for Tectonic Convergence on Europa

    NASA Astrophysics Data System (ADS)

    Greenberg, R.; Hurford, T.

    2003-03-01

    Reconstruction along a lineament similar to Agenor, but located diametrically opposite, indicates it is a convergence site, confirming hypotheses that similar features elsewhere formed that way and helping solve the surface-area budget problem.

  10. Missions to the sun and to the earth. [planning of NASA Solar Terrestrial Program

    NASA Technical Reports Server (NTRS)

    Timothy, A. F.

    1978-01-01

    The program outlined in the present paper represents an optimized plan of solar terrestrial physics. It is constrained only in the sense that it involves not more than one new major mission per year for the Solar Terrestrial Division during the 1980-1985 period. However, the flight activity proposed, if accepted by the Agency and by Congress, would involve a growth in the existing Solar Terrestrial budget by more than a factor of 2. Thus, the program may be considered as somewhat optimistic when viewed in the broader context of the NASA goals and budget. The Agency's integrated FY 1980 Five Year Plan will show how many of the proposed missions will survive this planning process.

  11. The optimal location of piezoelectric actuators and sensors for vibration control of plates

    NASA Astrophysics Data System (ADS)

    Kumar, K. Ramesh; Narayanan, S.

    2007-12-01

    This paper considers the optimal placement of collocated piezoelectric actuator-sensor pairs on a thin plate using a model-based linear quadratic regulator (LQR) controller. LQR performance is taken as objective for finding the optimal location of sensor-actuator pairs. The problem is formulated using the finite element method (FEM) as multi-input-multi-output (MIMO) model control. The discrete optimal sensor and actuator location problem is formulated in the framework of a zero-one optimization problem. A genetic algorithm (GA) is used to solve the zero-one optimization problem. Different classical control strategies like direct proportional feedback, constant-gain negative velocity feedback and the LQR optimal control scheme are applied to study the control effectiveness.
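    The zero-one placement formulation solved by a GA, as described above, can be sketched compactly. The per-site `gains` used as the fitness below are a hypothetical stand-in for the paper's LQR performance metric, and the repair and truncation-selection operators are illustrative choices, not the authors' exact GA.

    ```python
    import random

    def ga_binary(score, n, k, pop=60, gens=150, pmut=0.02, seed=2):
        """Genetic algorithm over 0-1 strings: choose exactly k of n candidate
        sites (1 = place a collocated actuator-sensor pair) to maximize a
        placement score. Infeasible children are repaired to exactly k ones."""
        rng = random.Random(seed)

        def repair(bits):
            ones = [i for i, b in enumerate(bits) if b]
            zeros = [i for i, b in enumerate(bits) if not b]
            while len(ones) > k:                      # too many sites: drop some
                bits[ones.pop(rng.randrange(len(ones)))] = 0
            while len(ones) < k:                      # too few sites: add some
                i = zeros.pop(rng.randrange(len(zeros)))
                bits[i] = 1
                ones.append(i)
            return bits

        def child(a, b):
            cut = rng.randrange(1, n)
            bits = a[:cut] + b[cut:]                            # one-point crossover
            bits = [bit ^ (rng.random() < pmut) for bit in bits]  # bit-flip mutation
            return repair(bits)

        popn = [repair([0] * n) for _ in range(pop)]  # random feasible start
        for _ in range(gens):
            popn.sort(key=score, reverse=True)
            elite = popn[: pop // 2]                  # truncation selection
            popn = elite + [child(rng.choice(elite), rng.choice(elite))
                            for _ in range(pop - len(elite))]
        return max(popn, key=score)

    # Hypothetical per-site gains standing in for the LQR performance metric.
    gains = [0.3, 0.9, 0.1, 0.7, 0.2, 0.8, 0.4, 0.6, 0.5, 0.05]
    best = ga_binary(lambda b: sum(g for g, bit in zip(gains, b) if bit), n=10, k=3)
    ```

    On this separable toy fitness the GA should recover the three highest-gain sites; in a real LQR-based fitness the interactions between sites make the search nontrivial.
    
    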

  12. Exploring the quantum speed limit with computer games

    NASA Astrophysics Data System (ADS)

    Sørensen, Jens Jakob W. H.; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F.

    2016-04-01

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. ‘Gamification’—the application of game elements in a non-game context—is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  13. Exploring the quantum speed limit with computer games.

    PubMed

    Sørensen, Jens Jakob W H; Pedersen, Mads Kock; Munch, Michael; Haikka, Pinja; Jensen, Jesper Halkjær; Planke, Tilo; Andreasen, Morten Ginnerup; Gajdacz, Miroslav; Mølmer, Klaus; Lieberoth, Andreas; Sherson, Jacob F

    2016-04-14

    Humans routinely solve problems of immense computational complexity by intuitively forming simple, low-dimensional heuristic strategies. Citizen science (or crowd sourcing) is a way of exploiting this ability by presenting scientific research problems to non-experts. 'Gamification'--the application of game elements in a non-game context--is an effective tool with which to enable citizen scientists to provide solutions to research problems. The citizen science games Foldit, EteRNA and EyeWire have been used successfully to study protein and RNA folding and neuron mapping, but so far gamification has not been applied to problems in quantum physics. Here we report on Quantum Moves, an online platform gamifying optimization problems in quantum physics. We show that human players are able to find solutions to difficult problems associated with the task of quantum computing. Players succeed where purely numerical optimization fails, and analyses of their solutions provide insights into the problem of optimization of a more profound and general nature. Using player strategies, we have thus developed a few-parameter heuristic optimization method that efficiently outperforms the most prominent established numerical methods. The numerical complexity associated with time-optimal solutions increases for shorter process durations. To understand this better, we produced a low-dimensional rendering of the optimization landscape. This rendering reveals why traditional optimization methods fail near the quantum speed limit (that is, the shortest process duration with perfect fidelity). Combined analyses of optimization landscapes and heuristic solution strategies may benefit wider classes of optimization problems in quantum physics and beyond.

  14. Systems Engineering and Project Management for Product Development: Optimizing Their Working Interfaces

    DTIC Science & Technology

    2013-09-01


  15. Out of control: little-used clinical assets are draining healthcare budgets.

    PubMed

    Horblyuk, Ruslan; Kaneta, Kristopher; McMillen, Gary L; Mullins, Christopher; O'Brien, Thomas M; Roy, Ankita

    2012-07-01

    To improve utilization and reduce the cost of maintaining mobile clinical equipment, healthcare organization leaders should do the following: Select an initial asset group to target. Conduct a physical inventory. Evaluate the organization's asset "ecosystem." Optimize workflow processes. Phase in new processes, and phase out inventory. Devote time to change management. Develop a replacement strategy.

  16. Habitat acquisition strategies for grassland birds in an urbanizing landscape

    Treesearch

    Stephanie A. Snyder; James R. Miller; Adam M. Skibbe; Robert G. Haight

    2007-01-01

    Habitat protection for grassland birds is an important component of open space land acquisition in suburban Chicago. We use optimization decision models to develop recommendations for land protection and analyze tradeoffs between alternative goals. One goal is to acquire (and restore if necessary) as much grassland habitat as possible for a given budget. Because a...
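    The stated goal of acquiring as much habitat as possible for a given budget is, in its simplest form, a 0-1 knapsack problem. A minimal dynamic-programming sketch with hypothetical parcel data follows; the paper's own decision models are richer (restoration needs, tradeoffs between goals).

    ```python
    def select_parcels(costs, habitat, budget):
        """0-1 knapsack by dynamic programming: maximize protected habitat area
        subject to an acquisition budget (integer costs)."""
        n = len(costs)
        # best[b] = max habitat achievable with budget b; keep tracks choices.
        best = [0.0] * (budget + 1)
        keep = [[False] * (budget + 1) for _ in range(n)]
        for i in range(n):
            for b in range(budget, costs[i] - 1, -1):   # iterate budgets downward
                cand = best[b - costs[i]] + habitat[i]
                if cand > best[b]:
                    best[b] = cand
                    keep[i][b] = True
        chosen, b = [], budget                          # backtrack the choices
        for i in range(n - 1, -1, -1):
            if keep[i][b]:
                chosen.append(i)
                b -= costs[i]
        return best[budget], sorted(chosen)

    # Hypothetical parcels: cost (in $100k) and restorable grassland (in ha).
    costs = [4, 3, 2, 5, 3]
    areas = [50.0, 40.0, 18.0, 60.0, 35.0]
    area, parcels = select_parcels(costs, areas, budget=10)
    ```
    
    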

  17. Numerical optimization methods for controlled systems with parameters

    NASA Astrophysics Data System (ADS)

    Tyatyushkin, A. I.

    2017-10-01

    First- and second-order numerical methods for optimizing controlled dynamical systems with parameters are discussed. In unconstrained-parameter problems, the control parameters are optimized by applying the conjugate gradient method. A more accurate numerical solution in these problems is produced by Newton's method based on a second-order functional increment formula. Next, a general optimal control problem with state constraints and parameters involved on the right-hand sides of the controlled system and in the initial conditions is considered. This complicated problem is reduced to a mathematical programming one, followed by the search for optimal parameter values and control functions by applying a multimethod algorithm. The performance of the proposed technique is demonstrated by solving application problems.

  18. Data Understanding Applied to Optimization

    NASA Technical Reports Server (NTRS)

    Buntine, Wray; Shilman, Michael

    1998-01-01

    The goal of this research is to explore and develop software for supporting visualization and data analysis of search and optimization. Optimization is an ever-present problem in science. The theory of NP-completeness implies that such problems can only be resolved with increasingly smart, problem-specific knowledge, possibly for use in some general-purpose algorithms. Visualization and data analysis offer an opportunity to accelerate our understanding of key computational bottlenecks in optimization and to automatically tune aspects of the computation for specific problems. We will prototype systems to demonstrate how data understanding can be successfully applied to problems characteristic of NASA's key science optimization tasks, such as central tasks for parallel processing, spacecraft scheduling, and data transmission from a remote satellite.

  19. Multiobjective Optimization Using a Pareto Differential Evolution Approach

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Differential Evolution is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. In this paper, the Differential Evolution algorithm is extended to multiobjective optimization problems by using a Pareto-based approach. The algorithm performs well when applied to several test optimization problems from the literature.
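    The Pareto-based extension of Differential Evolution can be sketched with the simplest possible selection rule: a trial vector replaces its parent only when it Pareto-dominates it. The paper's actual selection and diversity-preservation mechanisms may differ; the toy bi-objective problem below is illustrative.

    ```python
    import random

    def dominates(f, g):
        """f Pareto-dominates g: no worse in every objective, better in at least one."""
        return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

    def pareto_de(objectives, bounds, pop=50, gens=100, F=0.5, CR=0.9, seed=3):
        """Differential evolution with Pareto-dominance replacement."""
        rng = random.Random(seed)
        dim = len(bounds)
        clip = lambda v, lo, hi: min(max(v, lo), hi)
        X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
        evals = [tuple(f(x) for f in objectives) for x in X]
        for _ in range(gens):
            for i in range(pop):
                a, b, c = rng.sample([j for j in range(pop) if j != i], 3)
                trial = list(X[i])
                jr = rng.randrange(dim)              # guaranteed crossover index
                for j in range(dim):
                    if j == jr or rng.random() < CR:
                        trial[j] = clip(X[a][j] + F * (X[b][j] - X[c][j]),
                                        *bounds[j])
                tf = tuple(f(trial) for f in objectives)
                if dominates(tf, evals[i]):          # replace parent only if dominated
                    X[i], evals[i] = trial, tf
        # Return the non-dominated members of the final population.
        return [x for x, fx in zip(X, evals)
                if not any(dominates(fy, fx) for fy in evals)]

    # Toy bi-objective (Schaffer-like) problem: f1 = x^2, f2 = (x - 2)^2 on [-5, 5].
    front = pareto_de([lambda x: x[0] ** 2, lambda x: (x[0] - 2) ** 2],
                      bounds=[(-5.0, 5.0)])
    ```

    For this toy problem the Pareto-optimal set is the interval [0, 2], so the returned front should cluster there.
    
    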

  20. On a distinctive feature of problems of calculating time-average characteristics of nuclear reactor optimal control sets

    NASA Astrophysics Data System (ADS)

    Trifonenkov, A. V.; Trifonenkov, V. P.

    2017-01-01

    This article deals with a feature of problems of calculating time-average characteristics of nuclear reactor optimal control sets. The operation of a nuclear reactor during a threatened period is considered, and the optimal control search problem is analysed. Xenon poisoning limits the variety of admissible statements of the problem of calculating time-average characteristics of a set of optimal reactor power-off controls, since the level of xenon poisoning is bounded. There is thus a problem of choosing an appropriate segment of the time axis to ensure that the optimal control problem is consistent. Two procedures for estimating the duration of this segment are considered, and the two estimates were plotted as functions of the xenon limitation. The boundaries of the interval of averaging are thereby defined more precisely.

  1. Fuzzy Multi-Objective Vendor Selection Problem with Modified S-CURVE Membership Function

    NASA Astrophysics Data System (ADS)

    Díaz-Madroñero, Manuel; Peidro, David; Vasant, Pandian

    2010-06-01

    In this paper, the S-curve membership function methodology is used in a vendor selection (VS) problem. An interactive method for solving multi-objective VS problems with fuzzy goals is developed. The proposed method attempts simultaneously to minimize the total order costs, the number of rejected items and the number of late-delivered items, subject to several constraints such as meeting buyers' demand, vendors' capacity, vendors' quota flexibility, vendors' allocated budget, etc. In an industrial case, we compare the performance of S-curve membership functions, which represent uncertain goals and constraints in VS problems, with that of linear membership functions.
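    An S-curve fuzzy membership can be sketched with a generic logistic form: membership near 1 at the aspiration level, decaying smoothly to 0 at the tolerance limit. The constants below are illustrative, not the modified S-curve parameters used in the paper.

    ```python
    import math

    def s_curve(x, lo, hi, alpha=13.8, B=1.0, C=0.001):
        """Generic S-shaped fuzzy membership for a minimization goal:
        ~1 at or below the aspiration level `lo`, decaying to 0 at the
        tolerance limit `hi` (logistic form; parameter values are illustrative)."""
        if x <= lo:
            return 1.0
        if x >= hi:
            return 0.0
        t = (x - lo) / (hi - lo)          # normalize position to [0, 1]
        return B / (1.0 + C * math.exp(alpha * t))

    # Membership of a total order cost between an aspired 100 and a ceiling 160.
    mu = [round(s_curve(c, 100.0, 160.0), 3) for c in (100.0, 130.0, 160.0)]
    ```

    The steepness parameter `alpha` controls how sharply satisfaction drops between the aspiration level and the tolerance limit.
    
    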

  2. Health sector operational planning and budgeting processes in Kenya-"never the twain shall meet".

    PubMed

    Tsofa, Benjamin; Molyneux, Sassy; Goodman, Catherine

    2016-07-01

    Operational planning is considered an important tool for translating government policies and strategic objectives into day-to-day management activities. However, developing countries suffer from persistent misalignment between policy, planning and budgeting. The Medium Term Expenditure Framework (MTEF) was introduced to address this misalignment. Kenya adopted the MTEF in the early 2000s, and in 2005, the Ministry of Health adopted the Annual Operational Plan process to adapt the MTEF to the health sector. This study assessed the degree to which the health sector Annual Operational Plan process in Kenya has achieved alignment between planning and budgeting at the national level, using document reviews, participant observation and key informant interviews. We found that the Kenyan health sector was far from achieving planning and budgeting alignment. Several factors contributed to this problem, including weak Ministry of Health stewardship and institutionalized separation between planning and budgeting processes; a rapidly changing planning and budgeting environment; lack of reliable data to inform target setting; and poor participation by key stakeholders in the process, including a top-down approach to target setting. We conclude that alignment is unlikely to be achieved without consideration of the specific institutional contexts and the power relationships between stakeholders. In particular, there is a need for institutional integration of the planning and budgeting processes into a common cycle and framework with common reporting lines, and for improved data and local-level input to inform appropriate and realistic target setting. © 2015 The Authors. International Journal of Health Planning and Management published by John Wiley & Sons, Ltd.

  3. Robust optimization modelling with applications to industry and environmental problems

    NASA Astrophysics Data System (ADS)

    Chaerani, Diah; Dewanto, Stanley P.; Lesmana, Eman

    2017-10-01

    Robust Optimization (RO) modeling is one of the existing methodologies for handling data uncertainty in optimization problems. The main challenge in the RO methodology is how and when we can reformulate the robust counterpart of an uncertain problem as a computationally tractable optimization problem, or at least approximate the robust counterpart by a tractable problem. By definition, the robust counterpart depends strongly on how the uncertainty set is chosen; as a consequence, we can meet this challenge only if this set is chosen in a suitable way. RO has developed rapidly: in 2004 a new approach called Adjustable Robust Optimization (ARO) was introduced to handle uncertain problems in which some decision variables must be decided as "wait and see" variables, in contrast to classic RO, which models decision variables as "here and now" decisions. In ARO, an uncertain problem can be considered as a multistage decision problem whose decision variables include wait-and-see variables. In this paper we present applications of both RO and ARO, and we briefly present all results to underline the importance of RO and ARO in many real-life problems.
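    The tractable-robust-counterpart idea can be illustrated for a single linear constraint under box (interval) uncertainty, where the worst case has a closed form: a·x <= b must hold for all a in [a_nom - a_dev, a_nom + a_dev], which is equivalent to a_nom·x + a_dev·|x| <= b. The numbers below are illustrative.

    ```python
    def robust_feasible(x, a_nom, a_dev, b):
        """Robust counterpart of the linear constraint  a . x <= b  for all
        a in the box [a_nom - a_dev, a_nom + a_dev]: the worst case is
        a_nom . x + a_dev . |x| <= b (tight, coefficient by coefficient)."""
        worst = sum(an * xi + ad * abs(xi) for an, ad, xi in zip(a_nom, a_dev, x))
        return worst <= b

    # Nominal constraint 2*x1 + 3*x2 <= 12 with coefficients known only to +/-0.5.
    a_nom, a_dev, b = [2.0, 3.0], [0.5, 0.5], 12.0
    x_nominal_ok = [3.0, 2.0]   # feasible for the nominal data (2*3 + 3*2 = 12)
    x_robust = [2.0, 2.0]       # worst case 2.5*2 + 3.5*2 = 12 <= 12
    ok_nominal = robust_feasible(x_nominal_ok, a_nom, a_dev, b)
    ok_robust = robust_feasible(x_robust, a_nom, a_dev, b)
    ```

    The nominally feasible point violates the constraint in the worst case (its worst-case value is 14.5), while the robust point remains feasible for every realization in the box; this is the price of robustness.
    
    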

  4. A Matrix-Free Algorithm for Multidisciplinary Design Optimization

    NASA Astrophysics Data System (ADS)

    Lambe, Andrew Borean

    Multidisciplinary design optimization (MDO) is an approach to engineering design that exploits the coupling between components or knowledge disciplines in a complex system to improve the final product. In aircraft design, MDO methods can be used to simultaneously design the outer shape of the aircraft and the internal structure, taking into account the complex interaction between the aerodynamic forces and the structural flexibility. Efficient strategies are needed to solve such design optimization problems and guarantee convergence to an optimal design. This work begins with a comprehensive review of MDO problem formulations and solution algorithms. First, a fundamental MDO problem formulation is defined from which other formulations may be obtained through simple transformations. Using these fundamental problem formulations, decomposition methods from the literature are reviewed and classified. All MDO methods are presented in a unified mathematical notation to facilitate greater understanding. In addition, a novel set of diagrams, called extended design structure matrices, are used to simultaneously visualize both data communication and process flow between the many software components of each method. For aerostructural design optimization, modern decomposition-based MDO methods cannot efficiently handle the tight coupling between the aerodynamic and structural states. This fact motivates the exploration of methods that can reduce the computational cost. A particular structure in the direct and adjoint methods for gradient computation motivates the idea of a matrix-free optimization method. A simple matrix-free optimizer is developed based on the augmented Lagrangian algorithm. This new matrix-free optimizer is tested on two structural optimization problems and one aerostructural optimization problem. 
The results indicate that the matrix-free optimizer is able to efficiently solve structural and multidisciplinary design problems with thousands of variables and constraints. On the aerostructural test problem formulated with thousands of constraints, the matrix-free optimizer is estimated to reduce the total computational time by up to 90% compared to conventional optimizers.

  6. A sequential linear optimization approach for controller design

    NASA Technical Reports Server (NTRS)

    Horta, L. G.; Juang, J.-N.; Junkins, J. L.

    1985-01-01

    A linear optimization approach with a simple real arithmetic algorithm is presented for reliable controller design and vibration suppression of flexible structures. Using first order sensitivity of the system eigenvalues with respect to the design parameters in conjunction with a continuation procedure, the method converts a nonlinear optimization problem into a maximization problem with linear inequality constraints. The method of linear programming is then applied to solve the converted linear optimization problem. The general efficiency of the linear programming approach allows the method to handle structural optimization problems with a large number of inequality constraints on the design vector. The method is demonstrated using a truss beam finite element model for the optimal sizing and placement of active/passive structural members for damping augmentation. Results using both the sequential linear optimization approach and nonlinear optimization are presented and compared. The insensitivity to initial conditions of the linear optimization approach is also demonstrated.

  7. Design and multi-physics optimization of rotary MRF brakes

    NASA Astrophysics Data System (ADS)

    Topcu, Okan; Taşcıoğlu, Yiğit; Konukseven, Erhan İlhan

    2018-03-01

    Particle swarm optimization (PSO) is a popular method for solving optimization problems. However, the calculations for each particle become excessive as the number of particles and the complexity of the problem increase, and the execution speed becomes too slow to reach the optimized solution. Thus, this paper proposes an automated design and optimization method for rotary MRF brakes and similar multi-physics problems. A modified PSO algorithm is developed for solving multi-physics engineering optimization problems. The difference between the proposed method and conventional PSO is that the original single population is split into several subpopulations according to a division of labor. The distribution of tasks and the transfer of information to the next party are inspired by the behavior of a hunting party. Simulation results show that the proposed modified PSO algorithm can overcome the heavy computational burden of multi-physics problems while improving accuracy. Wire type, MR fluid type, magnetic core material, and ideal current inputs have been determined by the optimization process. To the best of the authors' knowledge, this multi-physics approach is novel for optimizing rotary MRF brakes, and the developed PSO algorithm is capable of solving other multi-physics engineering optimization problems. The proposed method has shown better performance than conventional PSO and has produced small, lightweight, high-impedance rotary MRF brake designs.
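    The paper's modified multi-population PSO is not reproduced here, but the conventional PSO baseline it modifies can be sketched in a few lines; all parameter values below (inertia w, acceleration coefficients c1 and c2, swarm size) are generic textbook choices, not those of the paper.

```python
import random

def pso(f, dim=2, n_particles=30, iters=300, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal conventional (single-swarm) PSO minimizing f on the box [-5, 5]^dim."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                       # personal best positions
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]                # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity = inertia + cognitive pull + social pull
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest

sphere = lambda x: sum(v * v for v in x)
best_x, best_f = pso(sphere)
```

    Every particle evaluates f once per iteration, which is exactly the cost the abstract calls excessive when f is an expensive multi-physics simulation; the paper's subpopulation scheme targets that bottleneck.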

  8. Algorithmic Perspectives on Problem Formulations in MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    This work is concerned with an approach to formulating the multidisciplinary optimization (MDO) problem that reflects an algorithmic perspective on MDO problem solution. The algorithmic perspective focuses on formulating the problem in light of the abilities and inabilities of optimization algorithms, so that the resulting nonlinear programming problem can be solved reliably and efficiently by conventional optimization techniques. We propose a modular approach to formulating MDO problems that takes advantage of the problem structure, maximizes the autonomy of implementation, and allows for multiple easily interchangeable problem statements to be used depending on the available resources and the characteristics of the application problem.

  9. Can there be fair funding for fundholding practices?

    PubMed Central

    Dixon, J.

    1994-01-01

    Most regional health authorities set budgets for fundholding practices according to the amount of care used by the practice population. This article explains why this funding method can only lead to an inequitable allocation of resources between fundholding and non-fundholding practices. Using the experience of North West Thames region, the efforts made to make funding fairer are discussed. The steps that health authorities could take to investigate and reduce the problem are also outlined. In the absence of a capitation formula for funding fundholding practices, the paper suggests that health authorities should do much more to investigate the amount of money they spend on non-fundholding practices. Regions could develop and use other methods to set budgets rather than rely on activity recorded by practices. Regions and the Department of Health should urgently resolve whether, and how far, the budgets for fundholders should be compensated for increases in provider prices. PMID:8142835

  10. Library management in the tight budget seventies. Problems, challenges, and opportunities.

    PubMed

    White, H S

    1977-01-01

    This paper examines changes in the management of university, special, and medical libraries brought about by the budget curtailments that followed the more affluent funding period of the mid-1960s. Based on a study conducted for the National Science Foundation by the Indiana University Graduate Library School, this paper deals with misconceptions that have arisen in the relationship between publishers and librarians, and differentiates between the priority perceptions of academic and of special librarians in the allocation of progressively scarcer resources. It concludes that libraries must make strong efforts to reduce the growing erosion of materials acquisitions budgets because of growing labor costs as a percentage of all library expenditures; that they must make a working reality of the resource-sharing mechanisms established through consortia and networks; and that they must use advanced evaluative techniques in the determination of which services and programs to implement, expand, and retain, and which to curtail and abandon.

  11. A general optimality criteria algorithm for a class of engineering optimization problems

    NASA Astrophysics Data System (ADS)

    Belegundu, Ashok D.

    2015-05-01

    An optimality criteria (OC)-based algorithm for optimization of a general class of nonlinear programming (NLP) problems is presented. The algorithm is only applicable to problems where the objective and constraint functions satisfy certain monotonicity properties. For multiply constrained problems which satisfy these assumptions, the algorithm is attractive compared with existing NLP methods as well as prevalent OC methods, as the latter involve computationally expensive active set and step-size control strategies. The fixed point algorithm presented here is applicable not only to structural optimization problems but also to certain problems as occur in resource allocation and inventory models. Convergence aspects are discussed. The fixed point update or resizing formula is given physical significance, which brings out a strength and trim feature. The number of function evaluations remains independent of the number of variables, allowing the efficient solution of problems with a large number of variables.

  12. Managing the Budget: Stock-Flow Reasoning and the CO2 Accumulation Problem.

    PubMed

    Newell, Ben R; Kary, Arthur; Moore, Chris; Gonzalez, Cleotilde

    2016-01-01

    The majority of people show persistent poor performance in reasoning about "stock-flow problems" in the laboratory. An important example is the failure to understand the relationship between the "stock" of CO2 in the atmosphere, the "inflow" via anthropogenic CO2 emissions, and the "outflow" via natural CO2 absorption. This study addresses potential causes of reasoning failures in the CO2 accumulation problem and reports two experiments involving a simple re-framing of the task as managing an analogous financial (rather than CO2) budget. In Experiment 1 a financial version of the task that required participants to think in terms of controlling debt demonstrated significant improvements compared to a standard CO2 accumulation problem. Experiment 2, in which participants were invited to think about managing savings, suggested that this improvement was fortuitous and coincidental rather than due to a fundamental change in understanding the stock-flow relationships. The role of graphical information in aiding or abetting stock-flow reasoning was also explored in both experiments, with the results suggesting that graphs do not always assist understanding. The potential for leveraging the kind of reasoning exhibited in such tasks in an effort to change people's willingness to reduce CO2 emissions is briefly discussed. Copyright © 2015 Cognitive Science Society, Inc.
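    The stock-flow identity the task rests on can be made concrete in a few lines: the stock changes only through the difference between inflow and outflow, so holding emissions constant above absorption still grows the stock. The numbers below are illustrative, not real CO2 data.

```python
def accumulate(stock, inflows, outflows):
    """Iterate the stock-flow identity: stock += inflow - outflow."""
    trajectory = [stock]
    for inflow, outflow in zip(inflows, outflows):
        stock += inflow - outflow
        trajectory.append(stock)
    return trajectory

# Illustrative numbers only: emissions held constant at 9 units/yr against
# 5 units/yr of natural absorption.  Stabilizing the *flow* does not
# stabilize the *stock* -- the common reasoning error the study probes.
traj = accumulate(800, [9] * 10, [5] * 10)
print(traj[0], traj[-1])  # -> 800 840
```

    Stabilizing the stock itself would require bringing the inflow down to the outflow, which is the insight most participants fail to produce.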

  13. Agency problems in hospitals participating in self-management project under global budget system in Taiwan.

    PubMed

    Yan, Yu-Hua; Hsu, Shuofen; Yang, Chen-Wei; Fang, Shih-Chieh

    2010-02-01

    The main purposes of this study are to clarify the agency problems arising in hospitals participating in the self-management project within the context of the Global Budgeting Payment System regulated by the Taiwan government, and to provide suggestions to help hospital administrators and health policy makers reduce the waste of healthcare resources resulting from agency problems. To these ends, this study examines the relationships between two agency problems (ex ante moral hazard and ex post moral hazard) arising between the hospitals and the Bureau of National Health Insurance in Taiwan's health care sector. This study empirically tested the theoretical model at the organization level. The findings suggest that a hospital's ex ante moral hazard before participating in the self-management project has some influence on its ex post moral hazard after participating. This study concludes that goal conflict between the agents and the principal certainly exists: the principal tries hard to control expenditure escalation and keep the financial balance, while the agents have to subsist within limited healthcare resources. Therefore, agency costs occur due to the conflicts between the two parties. Based on the results of the research, some suggestions and related management concepts are proposed at the end of the paper.

  14. Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle-Pock algorithm

    PubMed Central

    Sidky, Emil Y.; Jørgensen, Jakob H.; Pan, Xiaochuan

    2012-01-01

    The primal-dual optimization algorithm developed in Chambolle and Pock (CP), 2011 is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems for the purpose of designing iterative image reconstruction algorithms for CT. The primal-dual algorithm is briefly summarized in the article, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application modeling breast CT with low-intensity X-ray illumination is presented. PMID:22538474
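    The CT application is beyond a short sketch, but the CP iteration itself for min_x F(Kx) + G(x) is compact. Below it is instantiated, as an assumed toy example rather than the paper's setup, with K the identity, F(y) = 0.5*||y - b||^2 and G(x) = lam*||x||_1, whose exact minimizer is soft-thresholding of b and therefore provides a check on the iterates.

```python
def soft(v, t):
    """Soft-thresholding, the proximal operator of t*||.||_1 (componentwise)."""
    return [max(abs(vi) - t, 0.0) * (1 if vi >= 0 else -1) for vi in v]

def chambolle_pock(b, lam, tau=0.5, sigma=0.5, theta=1.0, iters=2000):
    """CP iteration for min_x 0.5*||x - b||^2 + lam*||x||_1 (K = identity).

    The dual update uses prox of sigma*F* with F(y) = 0.5*||y - b||^2,
    which has the closed form (z - sigma*b) / (1 + sigma); the primal
    update is soft-thresholding.  tau*sigma*||K||^2 = 0.25 < 1 holds.
    """
    n = len(b)
    x, xbar, y = [0.0] * n, [0.0] * n, [0.0] * n
    for _ in range(iters):
        # dual ascent step: y <- prox_{sigma F*}(y + sigma * K xbar)
        y = [(yi + sigma * xb - sigma * bi) / (1 + sigma)
             for yi, xb, bi in zip(y, xbar, b)]
        # primal descent step: x <- prox_{tau G}(x - tau * K^T y)
        x_new = soft([xi - tau * yi for xi, yi in zip(x, y)], tau * lam)
        # over-relaxation step
        xbar = [xn + theta * (xn - xo) for xn, xo in zip(x_new, x)]
        x = x_new
    return x

b, lam = [3.0, -0.2, 1.5], 0.5
x = chambolle_pock(b, lam)
# For this instance the exact minimizer is soft(b, lam) = [2.5, 0.0, 1.0].
assert all(abs(xi - ei) < 1e-3 for xi, ei in zip(x, [2.5, 0.0, 1.0]))
```

    The prototyping appeal the abstract describes comes from this structure: switching to a different data term or regularizer only changes the two prox operators, not the iteration itself.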

  15. Wind Farm Turbine Type and Placement Optimization

    NASA Astrophysics Data System (ADS)

    Graf, Peter; Dykes, Katherine; Scott, George; Fields, Jason; Lunacek, Monte; Quick, Julian; Rethore, Pierre-Elouan

    2016-09-01

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. This document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  16. Wind farm turbine type and placement optimization

    DOE PAGES

    Graf, Peter; Dykes, Katherine; Scott, George; ...

    2016-10-03

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. This document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  17. Gravity inversion of a fault by Particle swarm optimization (PSO).

    PubMed

    Toushmalani, Reza

    2013-01-01

    Particle swarm optimization is a heuristic global optimization method based on swarm intelligence, originating from research on the movement behavior of bird flocks and fish schools. In this paper we introduce and use this method for the gravity inverse problem. We discuss the solution of the inverse problem of determining the shape of a fault whose gravity anomaly is known. Application of the proposed algorithm to this problem has proven its capability to deal with difficult optimization problems. The technique proved to work efficiently when tested on a number of models.

  18. Post-Optimality Analysis In Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Kroo, Ilan M.; Gage, Peter J.

    1993-01-01

    This analysis pertains to the applicability of optimal sensitivity information to aerospace vehicle design. An optimal sensitivity (or post-optimality) analysis refers to computations performed once the initial optimization problem is solved. These computations may be used to characterize the design space about the present solution and infer changes in this solution as a result of constraint or parameter variations, without reoptimizing the entire system. The present analysis demonstrates that post-optimality information generated through first-order computations can be used to accurately predict the effect of constraint and parameter perturbations on the optimal solution. This assessment is based on the solution of an aircraft design problem in which the post-optimality estimates are shown to be within a few percent of the true solution over the practical range of constraint and parameter variations. Through solution of a reusable, single-stage-to-orbit, launch vehicle design problem, this optimal sensitivity information is also shown to improve the efficiency of the design process. For a hierarchically decomposed problem, this computational efficiency is realized by estimating the main-problem objective gradient through optimal sensitivity calculations. By reducing the need for finite differentiation of a re-optimized subproblem, a significant decrease in the number of objective function evaluations required to reach the optimal solution is obtained.

  19. Analysis of a Two-Dimensional Thermal Cloaking Problem on the Basis of Optimization

    NASA Astrophysics Data System (ADS)

    Alekseev, G. V.

    2018-04-01

    For a two-dimensional model of thermal scattering, inverse problems arising in the development of tools for cloaking material bodies on the basis of a mixed thermal cloaking strategy are considered. By applying the optimization approach, these problems are reduced to optimization ones in which the role of controls is played by variable parameters of the medium occupying the cloaking shell and by the heat flux through a boundary segment of the basic domain. The solvability of the direct and optimization problems is proved, and an optimality system is derived. Based on its analysis, sufficient conditions on the input data are established that ensure the uniqueness and stability of optimal solutions.

  20. Solving mixed integer nonlinear programming problems using spiral dynamics optimization algorithm

    NASA Astrophysics Data System (ADS)

    Kania, Adhe; Sidarto, Kuntjoro Adji

    2016-02-01

    Many engineering and practical problems can be modeled as mixed integer nonlinear programs. This paper proposes to solve such problems with a modified version of the spiral dynamics inspired optimization method of Tamura and Yasuda. Four test cases have been examined, including problems in engineering and sport. The method succeeds in obtaining the optimal result in all test cases.
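    The paper's modified method is not reproduced here; what follows is a minimal sketch of the underlying spiral dynamics idea of Tamura and Yasuda, rotating and contracting candidate points around the current best (continuous variables only, 2-D, with illustrative parameters; the paper's mixed integer handling is omitted).

```python
import math, random

def spiral_optimize(f, n_points=30, iters=300, r=0.95, theta=math.pi / 4, seed=0):
    """Minimal 2-D spiral dynamics optimization in the spirit of Tamura and
    Yasuda: every point rotates by theta about the current best solution
    while contracting toward it by factor r; the best is updated greedily."""
    rng = random.Random(seed)
    pts = [[rng.uniform(-5, 5), rng.uniform(-5, 5)] for _ in range(n_points)]
    best = min(pts, key=f)[:]
    best_f = f(best)
    c, s = math.cos(theta), math.sin(theta)
    for _ in range(iters):
        for p in pts:
            dx, dy = p[0] - best[0], p[1] - best[1]
            # contract-and-rotate step about the current center
            p[0] = best[0] + r * (c * dx - s * dy)
            p[1] = best[1] + r * (s * dx + c * dy)
            fp = f(p)
            if fp < best_f:
                best, best_f = p[:], fp
    return best, best_f

sphere = lambda x: x[0] ** 2 + x[1] ** 2
best, best_f = spiral_optimize(sphere)
```

    The logarithmic spiral lets each point sweep a shrinking neighborhood of the incumbent, trading exploration (rotation) against exploitation (contraction) with only two parameters.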

  1. Nash equilibrium and multi criterion aerodynamic optimization

    NASA Astrophysics Data System (ADS)

    Tang, Zhili; Zhang, Lianhe

    2016-06-01

    Game theory, and in particular the Nash equilibrium (NE), has been gaining importance for solving multi-criterion optimization (MCO) problems in engineering over the past decade. The solution of an MCO problem can be viewed as an NE under the concept of competitive games. This paper surveys/proposes four efficient algorithms for calculating an NE of an MCO problem. Existence and equivalence of the solution are analyzed and proved in the paper based on a fixed point theorem. A specific virtual symmetric Nash game is also presented to set up an optimization strategy for single objective optimization problems. Two numerical examples are presented to verify the proposed algorithms: one is the optimization of mathematical functions, to illustrate the detailed numerical procedures of the algorithms; the other is aerodynamic drag reduction of a civil transport wing-fuselage configuration using the virtual game. The successful application validates the efficiency of the algorithms in solving complex aerodynamic optimization problems.

  2. Exact solution of large asymmetric traveling salesman problems.

    PubMed

    Miller, D L; Pekny, J F

    1991-02-15

    The traveling salesman problem is one of a class of difficult problems in combinatorial optimization that is representative of a large number of important scientific and engineering problems. A survey is given of recent applications and methods for solving large problems. In addition, an algorithm for the exact solution of the asymmetric traveling salesman problem is presented along with computational results for several classes of problems. The results show that the algorithm performs remarkably well for some classes of problems, determining an optimal solution even for problems with large numbers of cities, yet for other classes, even small problems thwart determination of a provably optimal solution.
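    The paper's branch-and-bound algorithm is not shown here, but exact solution of small asymmetric instances can be sketched with the classic Held-Karp dynamic program (O(n^2 * 2^n) time, so only practical for small n), verified against brute-force enumeration:

```python
import random
from itertools import permutations

def held_karp(dist):
    """Exact asymmetric TSP tour cost via Held-Karp dynamic programming.
    dist[i][j] is the (possibly asymmetric) cost of arc i -> j."""
    n = len(dist)
    # dp[(S, j)]: min cost of a path from city 0 through visited set S, ending at j
    dp = {(1 << j, j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for S in range(1 << n):
            if S & 1 or bin(S).count("1") != size:
                continue  # skip sets containing the fixed start or of wrong size
            for j in range(1, n):
                if not S & (1 << j):
                    continue
                prev = S ^ (1 << j)
                dp[(S, j)] = min(dp[(prev, k)] + dist[k][j]
                                 for k in range(1, n) if prev & (1 << k))
    full = (1 << n) - 2  # every city except the fixed start 0
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

def brute_force(dist):
    """Reference check: enumerate all tours starting at city 0."""
    n = len(dist)
    return min(sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))
               for t in permutations(range(n)) if t[0] == 0)

random.seed(3)
d = [[0 if i == j else random.randint(1, 99) for j in range(6)] for i in range(6)]
assert held_karp(d) == brute_force(d)
```

    The exponential state space is exactly why, as the abstract notes, provable optimality is easy for some instance classes and out of reach for others at scale.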

  3. Stretching Resources.

    ERIC Educational Resources Information Center

    Robinson, Bob, Sr.

    2002-01-01

    Discusses factors leading to the current problem of poorly maintained schools, such as budget and labor cuts and misdirected purchasing policies. Provides an overview of some solutions, such as innovations in cleaning supplies and equipment, and having an integrated system to clean the school district. (EV)

  4. 7 CFR 764.457 - Vendor requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Maintain and use a financial management information system to make financial decisions; (3) Understand and... budget; and (6) Use production records and other production information to identify problems, evaluate... general goal setting, risk management, and planning. (2) Financial management courses, covering all...

  5. Review: Optimization methods for groundwater modeling and management

    NASA Astrophysics Data System (ADS)

    Yeh, William W.-G.

    2015-09-01

    Optimization methods have been used in groundwater modeling as well as for the planning and management of groundwater systems. This paper reviews and evaluates the various optimization methods that have been used for solving the inverse problem of parameter identification (estimation), experimental design, and groundwater planning and management. Various model selection criteria are discussed, as well as criteria used for model discrimination. The inverse problem of parameter identification concerns the optimal determination of model parameters using water-level observations. In general, the optimal experimental design seeks to find sampling strategies for the purpose of estimating the unknown model parameters. A typical objective of optimal conjunctive-use planning of surface water and groundwater is to minimize the operational costs of meeting water demand. The optimization methods include mathematical programming techniques such as linear programming, quadratic programming, dynamic programming, stochastic programming, nonlinear programming, and the global search algorithms such as genetic algorithms, simulated annealing, and tabu search. Emphasis is placed on groundwater flow problems as opposed to contaminant transport problems. A typical two-dimensional groundwater flow problem is used to explain the basic formulations and algorithms that have been used to solve the formulated optimization problems.

  6. A new chaotic multi-verse optimization algorithm for solving engineering optimization problems

    NASA Astrophysics Data System (ADS)

    Sayed, Gehad Ismail; Darwish, Ashraf; Hassanien, Aboul Ella

    2018-03-01

    Multi-verse optimization (MVO) is one of the recent meta-heuristic optimization algorithms, inspired by multi-verse theory in physics. However, like most optimization algorithms, MVO suffers from a low convergence rate and entrapment in local optima. In this paper, a new chaotic multi-verse optimization algorithm (CMVO) is proposed to overcome these problems. The proposed CMVO is applied to 13 benchmark functions and 7 well-known design problems in the engineering and mechanical field, namely the three-bar truss, speed reducer design, pressure vessel problem, spring design, welded beam, rolling element bearing, and multiple disc clutch brake. In the current study, a modified feasibility-based mechanism is employed to handle constraints. In this mechanism, four rules are used to handle the specific constraint problem by maintaining a balance between feasible and infeasible solutions. Moreover, 10 well-known chaotic maps are used to improve the performance of MVO. The experimental results show that CMVO outperforms other meta-heuristic optimization algorithms on most of the optimization problems. The results also reveal that the sine chaotic map is the most appropriate map to significantly boost MVO's performance.
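    As a sketch of the chaotic-map component, one common form of the sine map (the exact variant used by the paper may differ; the parameter a = 4 is an assumption here) generates a deterministic but chaotic sequence in (0, 1] that can replace uniform random draws when initializing or perturbing candidate solutions:

```python
import math

def sine_map_sequence(x0, n, a=4.0):
    """Generate n iterates of the sine chaotic map x_{k+1} = (a/4)*sin(pi*x_k)."""
    seq = []
    x = x0
    for _ in range(n):
        x = (a / 4.0) * math.sin(math.pi * x)
        seq.append(x)
    return seq

# Chaotic sequences are deterministic yet non-repeating, giving a more
# structured spread of samples than pseudo-random draws.
seq = sine_map_sequence(0.7, 100)
assert all(0.0 < v <= 1.0 for v in seq)
```

    In a chaotic metaheuristic these values are typically rescaled to the search bounds of each decision variable.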

  7. Optimal control of a harmonic oscillator: Economic interpretations

    NASA Astrophysics Data System (ADS)

    Janová, Jitka; Hampel, David

    2013-10-01

    Optimal control is a popular technique for modelling and solving dynamic decision problems in economics. The standard interpretation of the criterion function and Lagrange multipliers in the profit maximization problem is well known. Using a particular example, we aim at a deeper understanding of the possible economic interpretations of further mathematical and solution features of the optimal control problem: we focus on the solution of the optimal control problem for a harmonic oscillator serving as a model of the Phillips business cycle. We discuss the economic interpretations of the arising mathematical objects with respect to the well-known reasoning for these objects in other problems.

  8. Singular perturbation analysis of AOTV-related trajectory optimization problems

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J.; Bae, Gyoung H.

    1990-01-01

    The problem of real time guidance and optimal control of Aeroassisted Orbit Transfer Vehicles (AOTV's) was addressed using singular perturbation theory as an underlying method of analysis. Trajectories were optimized with the objective of minimum energy expenditure in the atmospheric phase of the maneuver. Two major problem areas were addressed: optimal reentry, and synergetic plane change with aeroglide. For the reentry problem, several reduced order models were analyzed with the objective of optimal changes in heading with minimum energy loss. It was demonstrated that a further model order reduction to a single state model is possible through the application of singular perturbation theory. The optimal solution for the reduced problem defines an optimal altitude profile dependent on the current energy level of the vehicle. A separate boundary layer analysis is used to account for altitude and flight path angle dynamics, and to obtain lift and bank angle control solutions. By considering alternative approximations to solve the boundary layer problem, three guidance laws were derived, each having an analytic feedback form. The guidance laws were evaluated using a Maneuvering Reentry Research Vehicle model and all three laws were found to be near optimal. For the problem of synergetic plane change with aeroglide, a difficult terminal boundary layer control problem arises which to date is found to be analytically intractable. Thus a predictive/corrective solution was developed to satisfy the terminal constraints on altitude and flight path angle. A composite guidance solution was obtained by combining the optimal reentry solution with the predictive/corrective guidance method. Numerical comparisons with the corresponding optimal trajectory solutions show that the resulting performance is very close to optimal. An attempt was made to obtain numerically optimized trajectories for the case where heating rate is constrained. 
A first order state variable inequality constraint was imposed on the full order AOTV point mass equations of motion, using a simple aerodynamic heating rate model.

  9. Conceptual Comparison of Population Based Metaheuristics for Engineering Problems

    PubMed Central

    Green, Paul

    2015-01-01

    Metaheuristic algorithms are well-known optimization tools which have been employed for solving a wide range of optimization problems. Several extensions of differential evolution have been adopted in solving constrained and nonconstrained multiobjective optimization problems, but in this study, the third version of generalized differential evolution (GDE) is used for solving practical engineering problems. GDE3 metaheuristic modifies the selection process of the basic differential evolution and extends DE/rand/1/bin strategy in solving practical applications. The performance of the metaheuristic is investigated through engineering design optimization problems and the results are reported. The comparison of the numerical results with those of other metaheuristic techniques demonstrates the promising performance of the algorithm as a robust optimization tool for practical purposes. PMID:25874265

  11. Efficiency of quantum vs. classical annealing in nonconvex learning problems

    PubMed Central

    Zecchina, Riccardo

    2018-01-01

    Quantum annealers aim at solving nonconvex optimization problems by exploiting cooperative tunneling effects to escape local minima. The underlying idea consists of designing a classical energy function whose ground states are the sought optimal solutions of the original optimization problem and add a controllable quantum transverse field to generate tunneling processes. A key challenge is to identify classes of nonconvex optimization problems for which quantum annealing remains efficient while thermal annealing fails. We show that this happens for a wide class of problems which are central to machine learning. Their energy landscapes are dominated by local minima that cause exponential slowdown of classical thermal annealers while simulated quantum annealing converges efficiently to rare dense regions of optimal solutions. PMID:29382764

  12. Direct Multiple Shooting Optimization with Variable Problem Parameters

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan J.; Ocampo, Cesar A.

    2009-01-01

    Taking advantage of a novel approach to the design of the orbital transfer optimization problem and advanced non-linear programming algorithms, several optimal transfer trajectories are found for problems with and without known analytic solutions. This method treats the fixed known gravitational constants as optimization variables in order to reduce the need for an advanced initial guess. Complex periodic orbits are targeted with very simple guesses and the ability to find optimal transfers in spite of these bad guesses is successfully demonstrated. Impulsive transfers are considered for orbits in both the 2-body frame as well as the circular restricted three-body problem (CRTBP). The results with this new approach demonstrate the potential for increasing robustness for all types of orbit transfer problems.

  13. The pseudo-Boolean optimization approach to form the N-version software structure

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Kovalev, D. I.; Zelenkov, P. V.; Voroshilova, A. A.

    2015-10-01

    The problem of developing an optimal structure for an N-version software system is a very complex optimization problem, which makes deterministic optimization methods inappropriate for solving it. In this view, exploiting heuristic strategies is more rational. In the field of pseudo-Boolean optimization theory, the so-called method of varied probabilities (MVP) has been developed to solve problems of large dimensionality. Some additional modifications of MVP have been made to solve the problem of N-version system design. These algorithms take into account the discovered specific features of the objective function. Practical experiments have shown the advantage of these algorithm modifications, since they reduce the search space.

  14. Optimal control of LQR for discrete time-varying systems with input delays

    NASA Astrophysics Data System (ADS)

    Yin, Yue-Zhu; Yang, Zhong-Lian; Yin, Zhi-Xiang; Xu, Feng

    2018-04-01

    In this work, we consider the optimal control problem of linear quadratic regulation for discrete time-variant systems with a single input and multiple input delays. An innovative and simple method to derive the optimal controller is given. The studied problem is first equivalently converted into a problem subject to a constraint condition. Then, with the established duality, the problem is transformed into a static mathematical optimisation problem without input delays. The optimal control input minimising the performance index function is derived by solving this optimisation problem with two methods. A numerical simulation example is carried out, and its results show that both approaches are feasible and very effective.
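
    Once the delays are eliminated, the delay-free LQR problem can be solved by the standard backward Riccati recursion. A minimal scalar sketch of that recursion follows; the system (a, b) and weights (q, r, qf) are illustrative, and the paper's delay-elimination transformation is not reproduced here.

```python
# Finite-horizon discrete-time LQR for the scalar system x_{k+1} = a x_k + b u_k
# with cost sum_k (q x_k^2 + r u_k^2) + qf x_N^2, solved by backward recursion.
def lqr_gains(a, b, q, r, qf, N):
    P = qf
    gains = []
    for _ in range(N):
        K = (b * P * a) / (r + b * P * b)  # optimal feedback gain at this step
        P = q + a * P * a - a * P * b * K  # Riccati update (backward in time)
        gains.append(K)
    gains.reverse()                        # gains[k] applies at time step k
    return gains

gains = lqr_gains(a=1.1, b=1.0, q=1.0, r=1.0, qf=1.0, N=30)

# closed-loop simulation with u_k = -K_k x_k: the unstable plant is regulated
x = 5.0
for K in gains:
    x = 1.1 * x - 1.0 * K * x
```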

  15. Exact solution for an optimal impermeable parachute problem

    NASA Astrophysics Data System (ADS)

    Lupu, Mircea; Scheiber, Ernest

    2002-10-01

    In the paper, direct and inverse boundary problems are solved and analytical solutions are obtained for optimization problems involving some nonlinear integral operators. The model is the plane potential flow of an inviscid, incompressible, unbounded fluid jet, which encounters a symmetrical, curvilinear obstacle: the deflector of maximal drag. Singular integral equations are derived for the direct and inverse problems, and the motion in the auxiliary canonical half-plane is obtained. Next, the optimization problem is solved analytically. The optimal airfoil is designed and, finally, numerical computations concerning the drag coefficient and other geometrical and aerodynamical parameters are carried out. This model corresponds to the Helmholtz impermeable parachute problem.

  16. From nonlinear optimization to convex optimization through firefly algorithm and indirect approach with applications to CAD/CAM.

    PubMed

    Gálvez, Akemi; Iglesias, Andrés

    2013-01-01

    Fitting spline curves to data points is a very important issue in many applied fields. It is also challenging, because these curves typically depend on many continuous variables in a highly interrelated nonlinear way. In general, it is not possible to compute these parameters analytically, so the problem is formulated as a continuous nonlinear optimization problem, for which traditional optimization techniques usually fail. This paper presents a new bioinspired method to tackle this issue. In this method, optimization is performed through a combination of two techniques. Firstly, we apply the indirect approach to the knots, in which they are not initially the subject of optimization but precomputed with a coarse approximation scheme. Secondly, a powerful bioinspired metaheuristic technique, the firefly algorithm, is applied to optimization of data parameterization; then, the knot vector is refined by using De Boor's method, thus yielding a better approximation to the optimal knot vector. This scheme converts the original nonlinear continuous optimization problem into a convex optimization problem, solved by singular value decomposition. Our method is applied to some illustrative real-world examples from the CAD/CAM field. Our experimental results show that the proposed scheme can solve the original continuous nonlinear optimization problem very efficiently.
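
    The firefly step used here for data parameterization moves each candidate toward brighter (lower-cost) ones, with an attractiveness that decays with distance. A generic sketch of the metaheuristic on a toy function follows; all parameter values and the sphere objective are illustrative assumptions, not the settings used for the spline problem.

```python
import math, random

random.seed(1)  # make this sketch deterministic

def firefly_minimize(f, dim, n=20, iters=150, beta0=1.0, gamma=0.05, alpha=0.2):
    """Minimal firefly algorithm sketch: lower cost = brighter firefly."""
    xs = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        intensity = [f(x) for x in xs]
        for i in range(n):
            for j in range(n):
                if intensity[j] < intensity[i]:  # move firefly i toward brighter j
                    r2 = sum((xs[i][k] - xs[j][k]) ** 2 for k in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)  # distance-decayed pull
                    xs[i] = [xs[i][k] + beta * (xs[j][k] - xs[i][k])
                             + alpha * (random.random() - 0.5)
                             for k in range(dim)]
            intensity[i] = f(xs[i])
        alpha *= 0.95  # shrink the random step over time
    return min(xs, key=f)

best = firefly_minimize(lambda x: sum(v * v for v in x), dim=2)
```

    In the paper this update drives only the data parameterization; the knot vector is then refined separately with De Boor's method.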

  18. Analytical and numerical analysis of inverse optimization problems: conditions of uniqueness and computational methods

    PubMed Central

    Zatsiorsky, Vladimir M.

    2011-01-01

    One of the key problems of motor control is the redundancy problem, in particular how the central nervous system (CNS) chooses an action out of the infinitely many possible ones. A promising way to address this question is to assume that the choice is made based on optimization of a certain cost function. A number of cost functions have been proposed in the literature to explain performance in different motor tasks: from force sharing in grasping to path planning in walking. However, the problem of uniqueness of the cost function(s) was not addressed until recently. In this article, we analyze two methods of finding additive cost functions in inverse optimization problems with linear constraints, so-called linear-additive inverse optimization problems. These methods are based on the Uniqueness Theorem for inverse optimization problems that we proved recently (Terekhov et al., J Math Biol 61(3):423–453, 2010). Using synthetic data, we show that both methods allow for determining the cost function. We analyze the influence of noise on both methods. Finally, we show how a violation of the conditions of the Uniqueness Theorem may lead to incorrect solutions of the inverse optimization problem. PMID:21311907

  19. On Bipartite Graphs, Trees, and Their Partial Vertex Covers.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caskurlu, Bugra; Mkrtchyan, Vahan; Parekh, Ojas D.

    2015-03-01

    Graphs can be used to model risk management in various systems. Particularly, Caskurlu et al. in [7] have considered a system which has threats, vulnerabilities, and assets, and which essentially represents a tripartite graph. The goal in this model is to reduce the risk in the system below a predefined risk threshold level. One can either restrict the permissions of the users or encapsulate the system assets. These two strategies correspond to deleting a minimum number of elements corresponding to vulnerabilities and assets, such that the flow between threats and assets is reduced below the predefined threshold level. It can be shown that the main goal in this risk management system can be formulated as a Partial Vertex Cover problem on bipartite graphs. It is well known that the Vertex Cover problem is in P on bipartite graphs; however, the computational complexity of the Partial Vertex Cover problem on bipartite graphs has remained open. In this paper, we establish that the Partial Vertex Cover problem is NP-hard on bipartite graphs, which was also recently independently demonstrated [N. Apollonio and B. Simeone, Discrete Appl. Math., 165 (2014), pp. 37–48; G. Joret and A. Vetta, preprint, arXiv:1211.4853v1 [cs.DS], 2012]. We then identify interesting special cases of bipartite graphs for which the Partial Vertex Cover problem, the closely related Budgeted Maximum Coverage problem, and their weighted extensions can be solved in polynomial time. We also present an 8/9-approximation algorithm for the Budgeted Maximum Coverage problem in the class of bipartite graphs. We show that this matches and resolves the integrality gap of the natural LP relaxation of the problem and improves upon a recent 4/5-approximation.
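
    The Budgeted Maximum Coverage problem mentioned above asks for a family of sets, within a total cost budget, that covers as many elements as possible. The sketch below only illustrates the problem statement with a plain cost-effectiveness greedy; the paper's 8/9-approximation for bipartite graphs is LP-based and is not reproduced here. The instance data are made up.

```python
def greedy_budgeted_max_coverage(sets, costs, budget):
    """Repeatedly pick the affordable set with the best (new elements)/cost ratio."""
    covered, chosen, spent = set(), [], 0
    while True:
        best, best_ratio = None, 0.0
        for name, elems in sets.items():
            gain = len(elems - covered)  # elements this set would newly cover
            if name not in chosen and spent + costs[name] <= budget and gain > 0:
                ratio = gain / costs[name]
                if ratio > best_ratio:
                    best, best_ratio = name, ratio
        if best is None:          # nothing affordable adds coverage
            return chosen, covered
        chosen.append(best)
        covered |= sets[best]
        spent += costs[best]

sets = {"S1": {1, 2, 3}, "S2": {3, 4}, "S3": {5}}
costs = {"S1": 2, "S2": 1, "S3": 1}
chosen, covered = greedy_budgeted_max_coverage(sets, costs, budget=3)
```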

  20. Terascale Computing in Accelerator Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Kwok

    2002-08-21

    We have entered the age of "terascale" scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedented comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.

  1. Systems budgets architecture and development for the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Mignot, Shan; Flagey, Nicolas; Szeto, Kei; Murowinski, Rick; McConnachie, Alan

    2016-08-01

    The Maunakea Spectroscopic Explorer (MSE) project is an enterprise to upgrade the existing Canada-France-Hawaii observatory into a spectroscopic facility based on a 10 meter-class telescope. As such, the project relies on engineering requirements not limited to its instruments (the low, medium and high resolution spectrographs) but covering the whole observatory. The science requirements, the operations concept, the project management and the applicable regulations are the basis from which these requirements are initially derived, yet they do not form hierarchies, as each may serve several purposes, that is, pertain to several budgets. Completeness and consistency are hence the main systems engineering challenges for a project as large as MSE. Special attention is devoted to ensuring the traceability of requirements via parametric models, derivation documents, simulations, and finally maintaining KAOS diagrams and a database under IBM Rational DOORS linking them together. This paper presents the architecture of the main budgets under development and the associated processes, highlights those that are interrelated, and describes how the system as a whole is then optimized by modelling and analysis of the pertinent system parameters.

  2. A collaborative vendor-buyer production-inventory system with imperfect quality items, inspection errors, and stochastic demand under a budget capacity constraint: a Karush-Kuhn-Tucker conditions approach

    NASA Astrophysics Data System (ADS)

    Kurdhi, N. A.; Nurhayati, R. A.; Wiyono, S. B.; Handajani, S. S.; Martini, T. S.

    2017-01-01

    In this paper, we develop an integrated inventory model considering imperfect quality items, inspection errors, controllable lead time, and a budget capacity constraint. The imperfect items are uniformly distributed and detected during the screening process. Two types of inspection error are possible: type I (a non-defective item is classified as defective) and type II (a defective item is classified as non-defective). The demand during the lead time is unknown and follows a normal distribution. The lead time can be shortened by paying a crashing cost. Furthermore, the budget capacity constraint arises from the limited purchasing cost. The purposes of this research are to modify the integrated vendor-buyer inventory model, to establish the optimal solution using the Karush-Kuhn-Tucker conditions, and to apply the models. The application and the sensitivity analysis show that the integrated model attains a lower total inventory cost than separate optimization.

  3. Adaptive Constrained Optimal Control Design for Data-Based Nonlinear Discrete-Time Systems With Critic-Only Structure.

    PubMed

    Luo, Biao; Liu, Derong; Wu, Huai-Ning

    2018-06-01

    Reinforcement learning has proved to be a powerful tool to solve optimal control problems over the past few years. However, the data-based constrained optimal control problem of nonaffine nonlinear discrete-time systems has rarely been studied yet. To solve this problem, an adaptive optimal control approach is developed by using the value iteration-based Q-learning (VIQL) with the critic-only structure. Most of the existing constrained control methods require the use of a certain performance index and suit only linear or affine nonlinear systems, which is unreasonable in practice. To overcome this problem, the system transformation is first introduced with the general performance index. Then, the constrained optimal control problem is converted to an unconstrained optimal control problem. By introducing the action-state value function, i.e., Q-function, the VIQL algorithm is proposed to learn the optimal Q-function of the data-based unconstrained optimal control problem. The convergence results of the VIQL algorithm are established with an easy-to-realize initial condition. To implement the VIQL algorithm, the critic-only structure is developed, where only one neural network is required to approximate the Q-function. The converged Q-function obtained from the critic-only VIQL method is employed to design the adaptive constrained optimal controller based on the gradient descent scheme. Finally, the effectiveness of the developed adaptive control method is tested on three examples with computer simulation.
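
    The core of value iteration-based Q-learning is the update Q(x, u) ← cost + γ·min over u' of Q(x', u'), run on a fixed batch of recorded transitions rather than on the system itself. The paper uses a neural-network critic for continuous systems; the tabular toy below, on a made-up 4-state chain with unit stage cost and an absorbing goal state, only sketches the iteration itself.

```python
# Offline (data-based) value-iteration Q-learning on a tiny deterministic MDP:
# states 0..3, actions 0 (left) / 1 (right); state 3 is an absorbing zero-cost goal.
def viql(transitions, n_states, n_actions, gamma=0.9, sweeps=200):
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(sweeps):
        newQ = [row[:] for row in Q]
        for (x, u, cost, x_next) in transitions:
            # Bellman backup using only the recorded batch of transitions
            newQ[x][u] = cost + gamma * min(Q[x_next])
        Q = newQ
    return Q

# batch of transitions (state, action, stage cost, next state)
data = [(x, 1, 1.0, min(x + 1, 3)) for x in range(3)] + \
       [(x, 0, 1.0, max(x - 1, 0)) for x in range(3)] + \
       [(3, 0, 0.0, 3), (3, 1, 0.0, 3)]
Q = viql(data, n_states=4, n_actions=2)
policy = [min(range(2), key=lambda u: Q[x][u]) for x in range(4)]
```

    The greedy policy extracted from the converged Q-function moves right toward the goal from every transient state, as expected.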

  4. Topology optimization of unsteady flow problems using the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Nørgaard, Sebastian; Sigmund, Ole; Lazarov, Boyan

    2016-02-01

    This article demonstrates and discusses topology optimization for unsteady incompressible fluid flows. The fluid flows are simulated using the lattice Boltzmann method, and a partial bounceback model is implemented to model the transition between fluid and solid phases in the optimization problems. The optimization problem is solved with a gradient based method, and the design sensitivities are computed by solving the discrete adjoint problem. For moderate Reynolds number flows, it is demonstrated that topology optimization can successfully account for unsteady effects such as vortex shedding and time-varying boundary conditions. Such effects are relevant in several engineering applications, e.g. fluid pumps and control valves.

  5. Artificial bee colony algorithm for constrained possibilistic portfolio optimization problem

    NASA Astrophysics Data System (ADS)

    Chen, Wei

    2015-07-01

    In this paper, we discuss the portfolio optimization problem with real-world constraints under the assumption that the returns of risky assets are fuzzy numbers. A new possibilistic mean-semiabsolute deviation model is proposed, in which transaction costs, cardinality and quantity constraints are considered. Due to such constraints the proposed model becomes a mixed integer nonlinear programming problem and traditional optimization methods fail to find the optimal solution efficiently. Thus, a modified artificial bee colony (MABC) algorithm is developed to solve the corresponding optimization problem. Finally, a numerical example is given to illustrate the effectiveness of the proposed model and the corresponding algorithm.

  6. Multiobjective Aerodynamic Shape Optimization Using Pareto Differential Evolution and Generalized Response Surface Metamodels

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.

    2004-01-01

    Differential Evolution (DE) is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. The DE algorithm has recently been extended to multiobjective optimization problems by using a Pareto-based approach. In this paper, a Pareto DE algorithm is applied to multiobjective aerodynamic shape optimization problems that are characterized by computationally expensive objective function evaluations. To reduce the computational expense, the algorithm is coupled with generalized response surface meta-models based on artificial neural networks. Results are presented for some test optimization problems from the literature to demonstrate the capabilities of the method.

  7. Parameter meta-optimization of metaheuristics for solving a specific NP-hard facility location problem

    NASA Astrophysics Data System (ADS)

    Skakov, E. S.; Malysh, V. N.

    2018-03-01

    The aim of the work is to create an evolutionary method for optimizing the values of the control parameters of metaheuristics for solving the NP-hard facility location problem. A system analysis of the process of tuning optimization algorithm parameters is carried out. The problem of finding the parameters of a metaheuristic algorithm is formulated as a meta-optimization problem. An evolutionary metaheuristic has been chosen to perform the task of meta-optimization. Thus, the approach proposed in this work can be called a “meta-metaheuristic”. A computational experiment proving the effectiveness of the procedure of tuning the control parameters of metaheuristics has been performed.
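
    The meta-optimization idea, an outer optimizer searching over the control parameters of an inner metaheuristic, can be sketched as follows. The inner (1+1) hill climber, the outer random search, and all settings are illustrative stand-ins for the paper's evolutionary meta-metaheuristic and facility location problem.

```python
import random

def inner_hill_climber(step, f, dim=5, iters=200, seed=0):
    """Inner metaheuristic: (1+1) hill climber whose step size is the tuned parameter."""
    rng = random.Random(seed)
    x = [rng.uniform(-3, 3) for _ in range(dim)]
    fx = f(x)
    for _ in range(iters):
        y = [v + rng.gauss(0, step) for v in x]
        fy = f(y)
        if fy < fx:          # accept only improvements
            x, fx = y, fy
    return fx

sphere = lambda x: sum(v * v for v in x)
rng = random.Random(42)

# outer loop: the control parameter `step` is itself the optimization variable,
# scored by the average final cost of the inner algorithm over a few runs
best_step, best_score = None, float("inf")
for _ in range(20):
    step = rng.uniform(0.01, 1.0)
    score = sum(inner_hill_climber(step, sphere, seed=s) for s in range(3)) / 3
    if score < best_score:
        best_step, best_score = step, score
```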

  8. Optimal control problem for linear fractional-order systems, described by equations with Hadamard-type derivative

    NASA Astrophysics Data System (ADS)

    Postnov, Sergey

    2017-11-01

    Two kinds of optimal control problem are investigated for linear time-invariant fractional-order systems with lumped parameters whose dynamics are described by equations with a Hadamard-type derivative: the problem of control with minimal norm and the problem of control with minimal time under a given restriction on the control norm. The problem setting with nonlocal initial conditions is studied. Admissible controls are allowed to be p-integrable functions (p > 1) on a half-interval. The optimal control problem is studied by the moment method. The correctness and solvability conditions for the corresponding moment problem are derived. For several special cases, the stated optimal control problems are solved analytically. Some analogies are pointed out between the results obtained and results known for integer-order systems and for fractional-order systems described by equations with Caputo- and Riemann-Liouville-type derivatives.
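
    For reference, the left-sided Hadamard-type fractional derivative of order α (with n − 1 < α < n) that distinguishes these systems from the Caputo and Riemann-Liouville cases is commonly written as (a sketch of the standard definition, with lower terminal a > 0):

```latex
\left(\mathcal{D}^{\alpha}_{a+}x\right)(t)
  = \frac{1}{\Gamma(n-\alpha)}\left(t\,\frac{d}{dt}\right)^{n}
    \int_{a}^{t}\left(\ln\frac{t}{s}\right)^{n-\alpha-1} x(s)\,\frac{ds}{s},
  \qquad t > a .
```

    Its logarithmic kernel ln(t/s) and the operator t·d/dt replace the power kernel and ordinary derivative of the Riemann-Liouville form.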

  9. Multi-step optimization strategy for fuel-optimal orbital transfer of low-thrust spacecraft

    NASA Astrophysics Data System (ADS)

    Rasotto, M.; Armellin, R.; Di Lizia, P.

    2016-03-01

    An effective method for the design of fuel-optimal transfers in two- and three-body dynamics is presented. The optimal control problem is formulated using calculus of variation and primer vector theory. This leads to a multi-point boundary value problem (MPBVP), characterized by complex inner constraints and a discontinuous thrust profile. The first issue is addressed by embedding the MPBVP in a parametric optimization problem, thus allowing a simplification of the set of transversality constraints. The second problem is solved by representing the discontinuous control function by a smooth function depending on a continuation parameter. The resulting trajectory optimization method can deal with different intermediate conditions, and no a priori knowledge of the control structure is required. Test cases in both the two- and three-body dynamics show the capability of the method in solving complex trajectory design problems.
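
    The continuation device for the discontinuous thrust profile can be sketched with a sigmoid that sharpens as a continuation parameter ε → 0: the bang-bang law u = 1 when the switching function S < 0 (and u = 0 otherwise) is approximated by a smooth function of S. The particular sigmoid and the sample switching-function values below are illustrative, not the paper's exact smoothing.

```python
import math

def smooth_control(S, eps):
    """Smooth surrogate for bang-bang thrust: -> 1 for S << 0, -> 0 for S >> 0."""
    return 1.0 / (1.0 + math.exp(S / eps))

# as eps shrinks along the continuation, the profile approaches the discontinuous one
S_values = [-1.0, -0.1, 0.0, 0.1, 1.0]
profiles = {eps: [smooth_control(S, eps) for S in S_values]
            for eps in (1.0, 0.1, 0.01)}
```

    Solving a sequence of smooth problems while reducing ε lets a gradient-based solver approach the discontinuous optimal control without a priori knowledge of the switching structure.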

  10. Optimal perturbations for nonlinear systems using graph-based optimal transport

    NASA Astrophysics Data System (ADS)

    Grover, Piyush; Elamvazhuthi, Karthik

    2018-06-01

    We formulate and solve a class of finite-time transport and mixing problems in the set-oriented framework. The aim is to obtain optimal discrete-time perturbations in nonlinear dynamical systems to transport a specified initial measure on the phase space to a final measure in finite time. The measure is propagated under system dynamics in between the perturbations via the associated transfer operator. Each perturbation is described by a deterministic map in the measure space that implements a version of Monge-Kantorovich optimal transport with quadratic cost. Hence, the optimal solution minimizes a sum of quadratic costs on phase space transport due to the perturbations applied at specified times. The action of the transport map is approximated by a continuous pseudo-time flow on a graph, resulting in a tractable convex optimization problem. This problem is solved via state-of-the-art solvers to global optimality. We apply this algorithm to a problem of transport between measures supported on two disjoint almost-invariant sets in a chaotic fluid system, and to a finite-time optimal mixing problem by choosing the final measure to be uniform. In both cases, the optimal perturbations are found to exploit the phase space structures, such as lobe dynamics, leading to efficient global transport. As the time-horizon of the problem is increased, the optimal perturbations become increasingly localized. Hence, by combining the transfer operator approach with ideas from the theory of optimal mass transportation, we obtain a discrete-time graph-based algorithm for optimal transport and mixing in nonlinear systems.
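
    The Monge-Kantorovich building block used here can be illustrated on a tiny discrete case: for uniform measures on two equally sized point sets with quadratic cost, an optimal transport plan is a permutation, which a brute-force search can find. This toy is only a stand-in for the paper's graph-based convex formulation; the point sets are made up.

```python
import itertools

# uniform measures on two 3-point sets on the line; ground cost is squared distance
src = [0.0, 1.0, 2.0]
dst = [0.5, 1.5, 3.0]

def plan_cost(perm):
    """Total quadratic cost of sending src[i] to dst[perm[i]]."""
    return sum((src[i] - dst[perm[i]]) ** 2 for i in range(len(src)))

# brute force over all permutations (feasible only for tiny instances)
best_perm = min(itertools.permutations(range(len(dst))), key=plan_cost)
```

    In one dimension with quadratic cost the optimal plan is monotone (sorted-to-sorted), which the brute-force search recovers; at realistic sizes this becomes the convex program solved on the graph in the paper.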

  11. Comparative Evaluation of Different Optimization Algorithms for Structural Design Applications

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Guptill, James D.; Hopkins, Dale A.

    1996-01-01

    Non-linear programming algorithms play an important role in structural design optimization. Fortunately, several algorithms with computer codes are available. At NASA Lewis Research Center, a project was initiated to assess the performance of eight different optimizers through the development of a computer code, CometBoards. This paper summarizes the conclusions of that research. CometBoards was employed to solve sets of small, medium and large structural problems, using the eight different optimizers on a Cray-YMP8E/8128 computer. The reliability and efficiency of the optimizers were determined from their performance on these problems. For small problems, the performance of most of the optimizers could be considered adequate. For large problems, however, three optimizers (two sequential quadratic programming routines, DNCONG of IMSL and SQP of IDESIGN, along with the Sequential Unconstrained Minimization Technique, SUMT) outperformed the others. At optimum, most optimizers captured an identical number of active displacement and frequency constraints, but the number of active stress constraints differed among the optimizers. This discrepancy can be attributed to singularity conditions in the optimization, and its alleviation can improve the efficiency of the optimizers.

  12. Performance Trend of Different Algorithms for Structural Design Optimization

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Guptill, James D.; Hopkins, Dale A.

    1996-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Fortunately, several algorithms with computer codes are available. At NASA Lewis Research Center, a project was initiated to assess the performance of different optimizers through the development of a computer code, CometBoards. This paper summarizes the conclusions of that research. CometBoards was employed to solve sets of small, medium and large structural problems, using different optimizers on a Cray-YMP8E/8128 computer. The reliability and efficiency of the optimizers were determined from their performance on these problems. For small problems, the performance of most of the optimizers could be considered adequate. For large problems, however, three optimizers (two sequential quadratic programming routines, DNCONG of IMSL and SQP of IDESIGN, along with the sequential unconstrained minimization technique SUMT) outperformed the others. At optimum, most optimizers captured an identical number of active displacement and frequency constraints, but the number of active stress constraints differed among the optimizers. This discrepancy can be attributed to singularity conditions in the optimization, and its alleviation can improve the efficiency of the optimizers.

  13. Evolutionary Optimization of a Geometrically Refined Truss

    NASA Technical Reports Server (NTRS)

    Hull, P. V.; Tinker, M. L.; Dozier, G. V.

    2007-01-01

    Structural optimization is a field of research that has experienced noteworthy growth for many years. Researchers in this area have developed optimization tools to successfully design and model structures, typically minimizing mass while maintaining certain deflection and stress constraints. Numerous optimization studies have been performed to minimize mass, deflection, and stress on a benchmark cantilever truss problem. Predominantly, traditional optimization theory is applied to this problem. The cross-sectional area of each member is optimized to minimize the aforementioned objectives. This Technical Publication (TP) presents a structural optimization technique that has been previously applied to compliant mechanism design. This technique demonstrates a method that combines topology optimization, geometric refinement, finite element analysis, and two forms of evolutionary computation (genetic algorithms and differential evolution) to successfully optimize a benchmark structural optimization problem. A nontraditional solution to the benchmark problem is presented in this TP, specifically a geometrically refined topological solution. The design process begins with an alternate control mesh formulation, multilevel geometric smoothing operation, and an elastostatic structural analysis. The design process is wrapped in an evolutionary computing optimization toolset.

  14. The expanded invasive weed optimization metaheuristic for solving continuous and discrete optimization problems.

    PubMed

    Josiński, Henryk; Kostrzewa, Daniel; Michalczuk, Agnieszka; Switoński, Adam

    2014-01-01

    This paper introduces an expanded version of the Invasive Weed Optimization algorithm (exIWO) distinguished by the hybrid strategy of the search space exploration proposed by the authors. The algorithm is evaluated by solving three well-known optimization problems: minimization of numerical functions, feature selection, and the Mona Lisa TSP Challenge as one of the instances of the traveling salesman problem. The achieved results are compared with analogous outcomes produced by other optimization methods reported in the literature.
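
    The Invasive Weed Optimization scheme that exIWO extends lets fitter weeds sow more seeds while the seed-dispersal radius shrinks over the iterations, followed by competitive exclusion. A minimal sketch follows; the parameter names and values, and the sphere objective, are illustrative and do not reflect the authors' hybrid exploration strategy.

```python
import random

random.seed(3)  # make this sketch deterministic

def iwo_minimize(f, dim=2, pop_max=20, iters=100,
                 seeds_min=1, seeds_max=5, sigma_init=1.0, sigma_final=0.01):
    """Minimal Invasive Weed Optimization sketch."""
    pop = [[random.uniform(-4, 4) for _ in range(dim)] for _ in range(5)]
    for it in range(iters):
        # nonlinearly decreasing dispersal radius
        sigma = sigma_final + (sigma_init - sigma_final) * ((iters - it) / iters) ** 2
        costs = [f(x) for x in pop]
        worst, best = max(costs), min(costs)
        offspring = []
        for x, c in zip(pop, costs):
            # linear seed count: the best weed sows seeds_max, the worst seeds_min
            ratio = (worst - c) / (worst - best) if worst > best else 1.0
            n_seeds = int(seeds_min + ratio * (seeds_max - seeds_min))
            for _ in range(n_seeds):
                offspring.append([v + random.gauss(0, sigma) for v in x])
        # competitive exclusion: only the pop_max fittest weeds survive
        pop = sorted(pop + offspring, key=f)[:pop_max]
    return pop[0]

best = iwo_minimize(lambda x: sum(v * v for v in x))
```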

  15. People with dementia and their carers do not receive personal care budgets, claims charity.

    PubMed

    2016-11-30

    Fewer than one third of people who receive social care support for problems with memory and cognition are also allocated cash by their local council to pay for care or support, according to the Alzheimer's Society.

  16. Managing Academic Libraries with Fewer Resources.

    ERIC Educational Resources Information Center

    Riggs, Donald E.

    1992-01-01

    A discussion of academic library management during retrenchment looks at a variety of issues, including staffing needs in the labor-intensive library environment, acquisitions budgeting, interlibrary cooperation (ownership vs. access to resources), entrepreneurship and strategic planning for problem solving, and use of total quality management…

  17. Federal Library Programs for Acquisition of Foreign Materials.

    ERIC Educational Resources Information Center

    Cylke, Frank Kurt

    Sixteen libraries representing those agencies holding membership on the Federal Library Committee were surveyed to determine library foreign language or imprint holdings, acquisitions techniques, procedures and/or problems. Specific questions, relating to holdings, staff, budget and the acquisition, processing, reference and translation services…

  18. Guidelines for the Formulation of Collection Development Policies

    ERIC Educational Resources Information Center

    Library Resources and Technical Services, 1977

    1977-01-01

    Guidelines are presented for library collection development activities which include: budgeting and allocation, the formulation of collection development policies, the development of review programs to assist in the solution of space problems, and the description and evaluation of library collections. (Author/AP)

  19. Mississippi Basin Carbon Project science plan

    USGS Publications Warehouse

    Sundquist, E.T.; Stallard, R.F.; Bliss, N.B.; Markewich, H.W.; Harden, J.W.; Pavich, M.J.; Dean, M.D.

    1998-01-01

    Understanding the carbon cycle is one of the most difficult challenges facing scientists who study the global environment. Lack of understanding of global carbon cycling is perhaps best illustrated by our inability to balance the present-day global CO2 budget. The amount of CO2 produced by burning fossil fuels and by deforestation appears to exceed the amount accumulating in the atmosphere and oceans. The carbon needed to balance the CO2 budget (the so-called "missing" carbon) is probably absorbed by land plants and ultimately deposited in soils and sediments. Increasing evidence points toward the importance of these terrestrial processes in northern temperate latitudes. Thus, efforts to balance the global CO2 budget focus particular attention on terrestrial carbon uptake in our own North American "backyard."The USGS Mississippi Basin Carbon Project conducts research on the carbon budget in soils and sediments of the Mississippi River basin. The project focuses on the effects of land-use change on carbon storage and transport, nutrient cycles, and erosion and sedimentation throughout the Mississippi River Basin. Particular emphasis is placed on understanding the interactions among changes in erosion, sedimentation, and soil dynamics. The project includes spatial analysis of a wide variety of geographic data sets, estimation of whole-basin and sub-basin carbon and sediment budgets, development and implementation of terrestrial carbon-cycle models, and site-specific field studies of relevant processes. The USGS views this project as a "flagship" effort to demonstrate its capabilities to address the importance of the land surface to biogeochemical problems such as the global carbon budget.
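
    The budget arithmetic behind the "missing" carbon can be illustrated with approximate 1990s-era global fluxes (round, illustrative numbers in GtC/yr, not project results): the carbon emitted but not found in the atmosphere or ocean must be taken up somewhere on land.

```python
# Illustrative 1990s-era global carbon fluxes, GtC/yr (approximate values,
# not Mississippi Basin Carbon Project results).
fossil_fuels  = 6.4   # emissions from fossil fuel burning
deforestation = 1.6   # net land-use-change emissions
atmosphere    = 3.3   # measured atmospheric accumulation
ocean_uptake  = 2.0   # estimated ocean absorption

# Whatever is emitted but not accumulating in the atmosphere or ocean
# must be absorbed on land: the inferred terrestrial ("missing") sink.
missing = (fossil_fuels + deforestation) - (atmosphere + ocean_uptake)
```

    With these figures the inferred terrestrial sink is roughly 2.7 GtC/yr, comparable in size to the ocean sink, which is why terrestrial soil and sediment processes draw so much attention.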

  20. Particle swarm optimization - Genetic algorithm (PSOGA) on linear transportation problem

    NASA Astrophysics Data System (ADS)

    Rahmalia, Dinita

    2017-08-01

    The Linear Transportation Problem (LTP) is a case of constrained optimization in which we want to minimize cost subject to a balance between total supply and total demand. Exact methods such as the northwest corner, Vogel, Russell, and minimum cost methods have been applied to approach the optimal solution. In this paper, we use a heuristic, Particle Swarm Optimization (PSO), to solve the linear transportation problem for any number of decision variables. In addition, we combine the mutation operator of the Genetic Algorithm (GA) with PSO to improve the optimal solution. This method is called Particle Swarm Optimization - Genetic Algorithm (PSOGA). The simulations show that PSOGA can improve the optimal solution produced by PSO.
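
    The PSO-plus-mutation idea can be sketched as below (a generic continuous-minimization demo, not the paper's transportation-problem encoding; all parameters are illustrative): after the usual velocity and position update, each coordinate is randomly reset with some probability, a GA-style mutation that helps particles escape local attractors.

```python
import random

def psoga(f, dim=4, n=20, iters=150, w=0.7, c1=1.5, c2=1.5, pm=0.1, seed=1):
    """PSO with a GA-style mutation operator (illustrative sketch)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    gbest = min(pbest, key=f)[:]                # global best position
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                if rng.random() < pm:           # GA mutation operator
                    pos[i][d] = rng.uniform(-10, 10)
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

quad = lambda x: sum((v - 3) ** 2 for v in x)
sol = psoga(quad)
```

    Because personal and global bests are never mutated, the mutation adds exploration without losing the best solutions found so far.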

  1. Structure analysis of tax revenue and inflation rate in Banda Aceh using vector error correction model with multiple alpha

    NASA Astrophysics Data System (ADS)

    Sofyan, Hizir; Maulia, Eva; Miftahuddin

    2017-11-01

    A country has several important parameters for achieving economic prosperity, such as tax revenue and the inflation rate. One of the largest revenues of the State Budget in Indonesia comes from the tax sector, while the rate of inflation occurring in a country can be used as an indicator of the economic problems the country faces. Given the importance of tax revenue and inflation rate control in achieving economic prosperity, it is necessary to analyze the structure of the relationship between tax revenue and the inflation rate. This study aims to produce the best VECM (Vector Error Correction Model) with optimal lag using various alpha levels and to perform structural analysis using the Impulse Response Function (IRF) of the VECM models to examine the relationship of tax revenue and inflation in Banda Aceh. The results showed that the best model for the tax revenue and inflation rate data in Banda Aceh City using alpha 0.01 is VECM with optimal lag 2, while the best model using alpha 0.05 and 0.1 is VECM with optimal lag 3. The VECM model with alpha 0.01 yielded four significant models: the income tax model and the models for the overall, health, and education inflation rates in Banda Aceh, whereas the VECM model with alpha 0.05 and 0.1 yielded one significant model, the income tax model. Based on the VECM models, two structural IRF analyses are formed to examine the relationship of tax revenue and inflation in Banda Aceh: the IRF with VECM(2) and the IRF with VECM(3).

  2. Hydropower and Environmental Resource Assessment (HERA): a computational tool for the assessment of the hydropower potential of watersheds considering engineering and socio-environmental aspects.

    NASA Astrophysics Data System (ADS)

    Martins, T. M.; Kelman, R.; Metello, M.; Ciarlini, A.; Granville, A. C.; Hespanhol, P.; Castro, T. L.; Gottin, V. M.; Pereira, M. V. F.

    2015-12-01

    The hydroelectric potential of a river is proportional to its head and water flows. Selecting the best development alternative for greenfield watershed projects is a difficult task, since it must balance demands for infrastructure, especially in the developing world where a large potential remains unexplored, with environmental conservation. Discussions usually diverge into antagonistic views, as in recent projects in the Amazon forest, for example. This motivates the construction of a computational tool that will support a more qualified debate regarding development/conservation options. HERA provides the optimal head-division partition of a river considering technical, economic, and environmental aspects. HERA has three main components: (i) GIS pre-processing of topographic and hydrologic data; (ii) automatic engineering and equipment design and budget estimation for candidate projects; and (iii) translation of the division-partition problem into a mathematical programming model. By integrating automatic calculation with geoprocessing tools, cloud computation, and optimization techniques, HERA makes it possible for countless head-partition alternatives to be compared intrinsically, a great advantage with respect to traditional field surveys followed by engineering design methods. Based on optimization techniques, HERA determines which hydro plants should be built, including location, technical data (e.g., water head, reservoir area and volume), engineering design (dam, spillways, etc.), and costs. The results can be visualized in the HERA interface and exported to GIS software, Google Earth, or CAD systems. HERA has a global scope of application since the main input data are a Digital Terrain Model and water inflows at gauging stations. The objective is to contribute to an increased rationality of decisions by presenting to the stakeholders a clear and quantitative view of the alternatives, their opportunities and threats.

  3. Using linear programming to minimize the cost of nurse personnel.

    PubMed

    Matthews, Charles H

    2005-01-01

    Nursing personnel costs make up a major portion of most hospital budgets. This report evaluates and optimizes the utility of the nurse personnel at the Internal Medicine Outpatient Clinic of Wake Forest University Baptist Medical Center. Linear programming (LP) was employed to determine the effective combination of nurses that would allow all weekly clinic tasks to be covered at the lowest possible cost to the department. Linear programming is a standard application of spreadsheet software that allows the operator to establish the variables to be optimized and then enter a series of constraints that each have an impact on the ultimate outcome. The application is therefore able to quantify and stratify the nurses necessary to execute the tasks. A sensitivity analysis can then be performed to assess just how sensitive the outcome is to adding or deleting a nurse to or from the payroll. The nurse employee cost structure in this study consisted of five certified nurse assistants (CNA), three licensed practical nurses (LPN), and five registered nurses (RN). The LP revealed that the outpatient clinic should staff four RNs, three LPNs, and four CNAs with 95 percent confidence of covering nurse demand on the floor. This combination of nurses would enable the clinic to: 1. Reduce annual staffing costs by 16 percent; 2. Make each level of nurse optimally productive by focusing on tasks specific to their expertise; 3. Assign accountability more efficiently as the nurses adhere to their specific duties; and 4. Ultimately provide a competitive advantage to the clinic as it relates to nurse employee and patient satisfaction. Linear programming can be used to solve capacity problems for just about any staffing situation, provided the model is indeed linear.
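
    A staffing LP of this kind can be sketched with off-the-shelf tools; the costs, task categories, and coverage requirements below are invented for illustration and are not the clinic's figures.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative cost-minimizing staffing LP (made-up numbers): choose how
# many RNs, LPNs, and CNAs to staff so weekly task-hour demands are met.
cost = [1400, 1000, 700]          # weekly cost per RN, LPN, CNA

# Each nurse type contributes hours toward three task categories.
# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so the coverage
# constraints (A @ x >= b) are written with negated signs.
A = [[-20, -10,   0],             # skilled-care hours:  20r + 10l       >= 120
     [-10, -20, -10],             # routine-care hours:  10r + 20l + 10c >= 200
     [  0, -10, -30]]             # support hours:             10l + 30c >= 180
b = [-120, -200, -180]

res = linprog(cost, A_ub=A, b_ub=b, bounds=[(0, None)] * 3, method="highs")
staff = res.x                     # fractional staffing levels; round up in practice
```

    A sensitivity analysis like the one in the report corresponds to re-solving with a staffing level fixed one unit higher or lower and comparing the optimal costs.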

  4. Optimizing the location of fuel treatments over time at landscape scales

    Treesearch

    Greg Jones; Woodam Chung

    2011-01-01

    Fuel treatments are a vital part of forest management - but when faced with limited budgets, narrow burning windows, and air quality restrictions, it can be challenging to prioritize where, when, and how fuel treatments should be applied across the landscape to achieve the most benefit. To help ease this process, land managers can turn to various standalone models,...

  5. [The equivalence and interchangeability of medical articles].

    PubMed

    Antonov, V S

    2013-11-01

    The information concerning the interchangeability of medical articles is highly valuable because it makes it possible to correlate medical articles most precisely with medical technologies and medical care standards and to optimize budget costs under public purchasing. The proposed procedure for determining interchangeability is based on criteria of equivalence of prescriptions, of functional, technical, and technological characteristics, and of the effectiveness of functioning of medical articles.

  6. Associations of Subjective Immersion, Immersion Subfactors, and Learning Outcomes in the Revised Game Engagement Model

    ERIC Educational Resources Information Center

    Barclay, Paul A.; Bowers, Clint

    2018-01-01

    Serious Educational Video Games (SEGs) play a large role in education for both children and adults. However, the budget for SEGs is typically lower than traditional entertainment video games, bringing with it the need to optimize the learning experience. This article looks at the role game immersion plays in improving learning outcomes, using the…

  7. The Robustness of Designs for Trials with Nested Data against Incorrect Initial Intracluster Correlation Coefficient Estimates

    ERIC Educational Resources Information Center

    Korendijk, Elly J. H.; Moerbeek, Mirjam; Maas, Cora J. M.

    2010-01-01

    In the case of trials with nested data, the optimal allocation of units depends on the budget, the costs, and the intracluster correlation coefficient. In general, the intracluster correlation coefficient is unknown in advance and an initial guess has to be made based on published values or subject matter knowledge. This initial estimate is likely…

  8. Analytical and Computational Properties of Distributed Approaches to MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Historical evolution of engineering disciplines and the complexity of the MDO problem suggest that disciplinary autonomy is a desirable goal in formulating and solving MDO problems. We examine the notion of disciplinary autonomy and discuss the analytical properties of three approaches to formulating and solving MDO problems that achieve varying degrees of autonomy by distributing the problem along disciplinary lines. Two of the approaches-Optimization by Linear Decomposition and Collaborative Optimization-are based on bi-level optimization and reflect what we call a structural perspective. The third approach, Distributed Analysis Optimization, is a single-level approach that arises from what we call an algorithmic perspective. The main conclusion of the paper is that disciplinary autonomy may come at a price: in the bi-level approaches, the system-level constraints introduced to relax the interdisciplinary coupling and enable disciplinary autonomy can cause analytical and computational difficulties for optimization algorithms. The single-level alternative we discuss affords a more limited degree of autonomy than that of the bi-level approaches, but without the computational difficulties of the bi-level methods. Key Words: Autonomy, bi-level optimization, distributed optimization, multidisciplinary optimization, multilevel optimization, nonlinear programming, problem integration, system synthesis

  9. The NEWS Water Cycle Climatology

    NASA Astrophysics Data System (ADS)

    Rodell, M.; Beaudoing, H. K.; L'Ecuyer, T.; Olson, W. S.

    2012-12-01

    NASA's Energy and Water Cycle Study (NEWS) program fosters collaborative research towards improved quantification and prediction of water and energy cycle consequences of climate change. In order to measure change, it is first necessary to describe current conditions. The goal of the first phase of the NEWS Water and Energy Cycle Climatology project was to develop "state of the global water cycle" and "state of the global energy cycle" assessments based on data from modern ground and space based observing systems and data integrating models. The project was a multi-institutional collaboration with more than 20 active contributors. This presentation will describe the results of the water cycle component of the first phase of the project, which include seasonal (monthly) climatologies of water fluxes over land, ocean, and atmosphere at continental and ocean basin scales. The requirement of closure of the water budget (i.e., mass conservation) at various scales was exploited to constrain the flux estimates via an optimization approach that will also be described. Further, error assessments were included with the input datasets, and we examine these in relation to inferred uncertainty in the optimized flux estimates in order to gauge our current ability to close the water budget within an expected uncertainty range.
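
    The closure-constrained optimization described above can be sketched as a variance-weighted adjustment (an illustrative sketch, not the NEWS implementation; the fluxes and error variances are made-up numbers): given flux estimates with uncertainties, find the smallest chi-square adjustment that exactly balances the budget.

```python
import numpy as np

def close_budget(x0, var, A, b):
    """Variance-weighted budget closure sketch: minimize
    sum((x - x0)**2 / var) subject to A @ x = b, via the
    standard Lagrange-multiplier solution."""
    W = np.diag(var)                      # more uncertain fluxes adjust more
    K = W @ A.T @ np.linalg.inv(A @ W @ A.T)
    return x0 - K @ (A @ x0 - b)

# Illustrative land water budget: precipitation, evapotranspiration,
# runoff (mm/yr), with P - E - R = 0 assumed at steady state.
x0 = np.array([800.0, 500.0, 250.0])      # raw estimates leave a 50 mm residual
var = np.array([40.0, 60.0, 20.0]) ** 2   # larger variance -> larger correction
A = np.array([[1.0, -1.0, -1.0]])         # closure constraint P - E - R = 0
x = close_budget(x0, var, A, np.zeros(1))
```

    After adjustment the residual vanishes, and the correction is distributed in proportion to each flux's error variance, which is the sense in which the error assessments constrain the optimized estimates.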

  10. The NEWS Water Cycle Climatology

    NASA Technical Reports Server (NTRS)

    Rodell, Matthew; Beaudoing, Hiroko Kato; L'Ecuyer, Tristan; Olson, William

    2012-01-01

    NASA's Energy and Water Cycle Study (NEWS) program fosters collaborative research towards improved quantification and prediction of water and energy cycle consequences of climate change. In order to measure change, it is first necessary to describe current conditions. The goal of the first phase of the NEWS Water and Energy Cycle Climatology project was to develop "state of the global water cycle" and "state of the global energy cycle" assessments based on data from modern ground and space based observing systems and data integrating models. The project was a multi-institutional collaboration with more than 20 active contributors. This presentation will describe the results of the water cycle component of the first phase of the project, which include seasonal (monthly) climatologies of water fluxes over land, ocean, and atmosphere at continental and ocean basin scales. The requirement of closure of the water budget (i.e., mass conservation) at various scales was exploited to constrain the flux estimates via an optimization approach that will also be described. Further, error assessments were included with the input datasets, and we examine these in relation to inferred uncertainty in the optimized flux estimates in order to gauge our current ability to close the water budget within an expected uncertainty range.

  11. Simultaneous optimization of loading pattern and burnable poison placement for PWRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alim, F.; Ivanov, K.; Yilmaz, S.

    2006-07-01

    To solve the in-core fuel management optimization problem, GARCO-PSU (Genetic Algorithm Reactor Core Optimization - Pennsylvania State Univ.) is developed. This code is applicable to all types and geometries of PWR core structures with an unlimited number of fuel assembly (FA) types in the inventory. For this purpose, an innovative genetic algorithm is developed by modifying the classical representation of the genotype. In-core fuel management heuristic rules are introduced into GARCO. The core reload design optimization has two parts: loading pattern (LP) optimization and burnable poison (BP) placement optimization. These parts depend on each other, but it is difficult to solve the combined problem due to its large size. Separating the problem into two parts provides a practical way to solve it; however, the result of this method does not reflect the true optimal solution. GARCO-PSU solves LP optimization and BP placement optimization simultaneously in an efficient manner. (authors)

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Kuo -Ling; Mehrotra, Sanjay

    We present a homogeneous algorithm equipped with a modified potential function for the monotone complementarity problem. We show that this potential function is reduced by at least a constant amount if a scaled Lipschitz condition (SLC) is satisfied. A practical algorithm based on this potential function is implemented in a software package named iOptimize. The implementation in iOptimize maintains global linear and polynomial time convergence properties, while achieving practical performance. It either successfully solves the problem, or concludes that the SLC is not satisfied. When compared with the mature software package MOSEK (barrier solver version 6.0.0.106), iOptimize solves convex quadratic programming problems, convex quadratically constrained quadratic programming problems, and general convex programming problems in fewer iterations. Moreover, several problems for which MOSEK fails are solved to optimality. In addition, we also find that iOptimize detects infeasibility more reliably than the general nonlinear solvers Ipopt (version 3.9.2) and Knitro (version 8.0).

  13. A noisy chaotic neural network for solving combinatorial optimization problems: stochastic chaotic simulated annealing.

    PubMed

    Wang, Lipo; Li, Sa; Tian, Fuyu; Fu, Xiuju

    2004-10-01

    Recently Chen and Aihara have demonstrated both experimentally and mathematically that their chaotic simulated annealing (CSA) has better search ability for solving combinatorial optimization problems compared to both the Hopfield-Tank approach and stochastic simulated annealing (SSA). However, CSA may not find a globally optimal solution no matter how slowly annealing is carried out, because the chaotic dynamics are completely deterministic. In contrast, SSA tends to settle down to a global optimum if the temperature is reduced sufficiently slowly. Here we combine the best features of both SSA and CSA, thereby proposing a new approach for solving optimization problems, i.e., stochastic chaotic simulated annealing, by using a noisy chaotic neural network. We show the effectiveness of this new approach with two difficult combinatorial optimization problems, i.e., a traveling salesman problem and a channel assignment problem for cellular mobile communications.
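
    The stochastic simulated annealing (SSA) baseline that the paper builds on can be sketched for a small TSP instance (a minimal illustrative sketch, not the noisy chaotic neural network itself; instance and parameters are invented): propose two-city swaps and accept uphill moves with Boltzmann probability under a slow geometric cooling schedule.

```python
import math, random

def sa_tsp(dist, iters=20000, t0=1.0, alpha=0.9995, seed=2):
    """Plain stochastic simulated annealing for the TSP (SSA baseline)."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))

    def length(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    cur = length(tour)
    best, best_cost = tour[:], cur
    T = t0
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        tour[i], tour[j] = tour[j], tour[i]          # propose a city swap
        new = length(tour)
        if new <= cur or rng.random() < math.exp((cur - new) / T):
            cur = new                                # accept the move
            if cur < best_cost:
                best, best_cost = tour[:], cur       # track best-so-far
        else:
            tour[i], tour[j] = tour[j], tour[i]      # reject: undo the swap
        T *= alpha                                   # slow geometric cooling
    return best, best_cost

# Six cities on a ring; the optimal tour follows the ring and has length 6.
ring = [[min(abs(i - j), 6 - abs(i - j)) for j in range(6)] for i in range(6)]
tour, cost = sa_tsp(ring)
```

    The chaotic variants discussed in the paper replace the purely random acceptance noise with (noisy) chaotic neural dynamics to improve the search.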

  14. Optimal Control Problems with Switching Points. Ph.D. Thesis, 1990 Final Report

    NASA Technical Reports Server (NTRS)

    Seywald, Hans

    1991-01-01

    The main idea of this report is to give an overview of the problems and difficulties that arise in solving optimal control problems with switching points. A brief discussion of existing optimality conditions is given and a numerical approach for solving the multipoint boundary value problems associated with the first-order necessary conditions of optimal control is presented. Two real-life aerospace optimization problems are treated explicitly. These are altitude maximization for a sounding rocket (Goddard Problem) in the presence of a dynamic pressure limit, and range maximization for a supersonic aircraft flying in the vertical plane, also in the presence of a dynamic pressure limit. In the second problem singular control appears along arcs with active dynamic pressure limit, which, in the context of optimal control, represents a first-order state inequality constraint. An extension of the Generalized Legendre-Clebsch Condition to the case of singular control along state/control constrained arcs is presented and is applied to the aircraft range maximization problem stated above. A contribution to the field of Jacobi Necessary Conditions is made by giving a new proof for the non-optimality of conjugate paths in the Accessory Minimum Problem. Because of its simple and explicit character, the new proof may provide the basis for an extension of Jacobi's Necessary Condition to the case of trajectories with interior point constraints. Finally, the result that touch points cannot occur for first-order state inequality constraints is extended to the case of vector valued control functions.

  15. Nonexpansiveness of a linearized augmented Lagrangian operator for hierarchical convex optimization

    NASA Astrophysics Data System (ADS)

    Yamagishi, Masao; Yamada, Isao

    2017-04-01

    Hierarchical convex optimization concerns two-stage optimization problems: the first stage problem is a convex optimization; the second stage problem is the minimization of a convex function over the solution set of the first stage problem. For the hierarchical convex optimization, the hybrid steepest descent method (HSDM) can be applied, where the solution set of the first stage problem must be expressed as the fixed point set of a certain nonexpansive operator. In this paper, we propose a nonexpansive operator that yields a computationally efficient update when it is plugged into the HSDM. The proposed operator is inspired by the update of the linearized augmented Lagrangian method. It is applicable to characterize the solution set of recent sophisticated convex optimization problems found in the context of inverse problems, where the sum of multiple proximable convex functions involving linear operators must be minimized to incorporate preferable properties into the minimizers. For such a problem formulation, there has not yet been reported any nonexpansive operator that yields an update free from the inversions of linear operators in cases where it is utilized in the HSDM. Unlike previously known nonexpansive operators, the proposed operator yields an inversion-free update in such cases. As an application of the proposed operator plugged into the HSDM, we also present, in the context of the so-called superiorization, an algorithmic solution to a convex optimization problem over the generalized convex feasible set where the intersection of the hard constraints is not necessarily simple.
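
    The paper's operator is built from a linearized augmented Lagrangian update; as a much simpler illustration of the nonexpansiveness property at stake, the sketch below checks numerically that a classical proximity operator (soft-thresholding, the prox of the scaled l1 norm) never expands distances, i.e. ||T(x) - T(y)|| <= ||x - y||.

```python
import numpy as np

def soft_threshold(x, lam):
    """Prox of lam * ||.||_1: shrink each entry toward zero by lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Empirically verify nonexpansiveness on random pairs of points.
rng = np.random.default_rng(0)
ok = True
for _ in range(1000):
    x, y = rng.normal(size=5), rng.normal(size=5)
    lhs = np.linalg.norm(soft_threshold(x, 0.5) - soft_threshold(y, 0.5))
    ok = ok and lhs <= np.linalg.norm(x - y) + 1e-12
```

    Proximity operators of convex functions are in fact firmly nonexpansive, which is exactly the kind of property that makes an operator usable inside schemes such as the HSDM.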

  16. TARCMO: Theory and Algorithms for Robust, Combinatorial, Multicriteria Optimization

    DTIC Science & Technology

    2016-11-28

    Report sections include "On the Recoverable Robust Traveling Salesman Problem" and "A Bicriteria Approach to Robust Optimization." The traveling salesman problem (TSP) is a well-known combinatorial optimization problem; an iterative procedure for the recoverable robust traveling salesman problem is presented. While this iterative algorithm results in an optimal solution to the robust TSP, computation...

  17. A Mathematical Optimization Problem in Bioinformatics

    ERIC Educational Resources Information Center

    Heyer, Laurie J.

    2008-01-01

    This article describes the sequence alignment problem in bioinformatics. Through examples, we formulate sequence alignment as an optimization problem and show how to compute the optimal alignment with dynamic programming. The examples and sample exercises have been used by the author in a specialized course in bioinformatics, but could be adapted…
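
    The dynamic-programming formulation described above can be sketched as a minimal Needleman-Wunsch global-alignment scorer (the scoring values are chosen for illustration, not taken from the article): F[i][j] holds the best score for aligning the first i characters of one sequence with the first j of the other.

```python
def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    """Global alignment score via Needleman-Wunsch dynamic programming."""
    F = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):        # aligning a prefix against
        F[i][0] = i * gap                 # the empty string costs gaps
    for j in range(1, len(b) + 1):
        F[0][j] = j * gap
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                          F[i - 1][j] + gap,     # gap in b
                          F[i][j - 1] + gap)     # gap in a
    return F[len(a)][len(b)]

score = nw_score("GATTACA", "GATCA")
```

    The optimal alignment itself can be recovered by tracing back through the table from F[len(a)][len(b)], recording which of the three cases produced each cell's maximum.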

  18. A Problem on Optimal Transportation

    ERIC Educational Resources Information Center

    Cechlarova, Katarina

    2005-01-01

    Mathematical optimization problems are not typical in the classical curriculum of mathematics. In this paper we show how several generalizations of an easy problem on optimal transportation were solved by gifted secondary school pupils in a correspondence mathematical seminar, how they can be used in university courses of linear programming and…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Peter; Dykes, Katherine; Scott, George

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. Furthermore, this document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  20. A framework for modeling and optimizing dynamic systems under uncertainty

    DOE PAGES

    Nicholson, Bethany; Siirola, John

    2017-11-11

    Algebraic modeling languages (AMLs) have drastically simplified the implementation of algebraic optimization problems. However, there are still many classes of optimization problems that are not easily represented in most AMLs. These classes of problems are typically reformulated before implementation, which requires significant effort and time from the modeler and obscures the original problem structure or context. In this work we demonstrate how the Pyomo AML can be used to represent complex optimization problems using high-level modeling constructs. We focus on the operation of dynamic systems under uncertainty and demonstrate the combination of Pyomo extensions for dynamic optimization and stochastic programming. We use a dynamic semibatch reactor model and a large-scale bubbling fluidized bed adsorber model as test cases.
