Science.gov

Sample records for risk based optimization

  1. PTV-based IMPT optimization incorporating planning risk volumes vs robust optimization

    SciTech Connect

    Liu, Wei; Li, Xiaoqiang; Zhu, Ron X.; Mohan, Radhe; Frank, Steven J.; Li, Yupeng

    2013-02-15

    Purpose: Robust optimization leads to intensity-modulated proton therapy (IMPT) plans that are less sensitive to uncertainties and superior in terms of organs-at-risk (OARs) sparing, target dose coverage, and homogeneity compared to planning target volume (PTV)-based optimized plans. Robust optimization incorporates setup and range uncertainties, which implicitly adds margins to both targets and OARs and is also able to compensate for perturbations in dose distributions within targets and OARs caused by uncertainties. In contrast, the traditional PTV-based optimization considers only setup uncertainties and adds a margin only to targets but no margins to the OARs. It also ignores range uncertainty. The purpose of this work is to determine if robustly optimized plans are superior to PTV-based plans simply because the latter do not assign margins to OARs during optimization. Methods: The authors retrospectively selected from their institutional database five patients with head and neck (H and N) cancer and one with prostate cancer for this analysis. Using their original images and prescriptions, the authors created new IMPT plans using three methods: PTV-based optimization, optimization based on the PTV and planning risk volumes (PRVs) (i.e., 'PTV+PRV-based optimization'), and robust optimization using the 'worst-case' dose distribution. The PRVs were generated by uniformly expanding OARs by 3 mm for the H and N cases and 5 mm for the prostate case. The dose-volume histograms (DVHs) from the worst-case dose distributions were used to assess and compare plan quality. Families of DVHs for each uncertainty for all structures of interest were plotted along with the nominal DVHs. The width of the 'bands' of DVHs was used to quantify the plan sensitivity to uncertainty. Results: Compared with conventional PTV-based and PTV+PRV-based planning, robust optimization led to a smaller bandwidth for the targets in the face of uncertainties {clinical target volume [CTV
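
    As a rough illustration of the 'worst-case' composite objective described above, the sketch below evaluates a plan against several uncertainty scenarios by taking, per voxel, the lowest dose for the target and the highest dose for the OAR. The scenario doses, prescription, dose limit, and penalty weights are illustrative assumptions, not values from the study.

      import numpy as np

      def worst_case_objective(target_doses, oar_doses, prescription, oar_limit,
                               w_target=1.0, w_oar=1.0):
          """Composite worst-case objective over uncertainty scenarios (sketch).

          target_doses, oar_doses: arrays of shape (n_scenarios, n_voxels) with the
          dose each voxel receives under each setup/range scenario.
          """
          target_wc = target_doses.min(axis=0)   # coldest a target voxel can get
          oar_wc = oar_doses.max(axis=0)         # hottest an OAR voxel can get
          underdose = np.clip(prescription - target_wc, 0.0, None)
          overdose = np.clip(oar_wc - oar_limit, 0.0, None)
          return w_target * np.mean(underdose ** 2) + w_oar * np.mean(overdose ** 2)

      # Hypothetical doses (Gy) for 3 uncertainty scenarios and 5 voxels per structure.
      rng = np.random.default_rng(0)
      ctv = 60.0 + rng.normal(0.0, 1.5, size=(3, 5))
      oar = 20.0 + rng.normal(0.0, 3.0, size=(3, 5))
      print(worst_case_objective(ctv, oar, prescription=60.0, oar_limit=25.0))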

  2. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and make rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate uncertainty in contaminant location from the actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing do we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice in sampling strategies, we compare the trade-off in monitoring versus the delineation costs by accounting for ill

  3. A Risk-Based Multi-Objective Optimization Concept for Early-Warning Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Bode, F.; Loschko, M.; Nowak, W.

    2014-12-01

    Groundwater is a resource for drinking water and hence needs to be protected from contamination. However, many well catchments include an inventory of known and unknown risk sources which cannot be eliminated, especially in urban regions. As a matter of risk control, all these risk sources should be monitored. A one-to-one monitoring situation for each risk source would lead to a cost explosion and is even impossible for unknown risk sources. However, smart optimization concepts could help to find promising low-cost monitoring network designs. In this work we develop a concept to plan monitoring networks using multi-objective optimization. Our considered objectives are to maximize the probability of detecting all contaminations and the early warning time and to minimize the installation and operating costs of the monitoring network. A qualitative risk ranking is used to prioritize the known risk sources for monitoring. The unknown risk sources can neither be located nor ranked. Instead, we represent them by a virtual line of risk sources surrounding the production well. We classify risk sources into four different categories: severe, medium and tolerable for known risk sources and an extra category for the unknown ones. With that, early warning time and detection probability become individual objectives for each risk class. Thus, decision makers can identify monitoring networks which are valid for controlling the top risk sources, and evaluate the capabilities (or search for a least-cost upgrade) to also cover moderate, tolerable and unknown risk sources. Monitoring networks which are valid for the remaining risk also cover all other risk sources, but the early-warning time suffers. The data provided for the optimization algorithm are calculated in a preprocessing step by a flow and transport model. Uncertainties due to hydro(geo)logical phenomena are taken into account by Monte-Carlo simulations. To avoid numerical dispersion during the transport simulations we use the

  4. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    PubMed

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risk and nearly maximum benefit. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental-economic optimization scheme in integrated watershed management. PMID:24191144

  5. A Risk Explicit Interval Linear Programming Model for Uncertainty-Based Environmental Economic Optimization in the Lake Fuxian Watershed, China

    PubMed Central

    Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of “low risk and high return efficiency” in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risk and nearly maximum benefit. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental-economic optimization scheme in integrated watershed management. PMID:24191144

  6. Risk based approach for design and optimization of stomach specific delivery of rifampicin.

    PubMed

    Vora, Chintan; Patadia, Riddhish; Mittal, Karan; Mashru, Rajashree

    2013-10-15

    The research focuses on a risk management approach for better recognizing risks, mitigating them, and proposing a control strategy for the development of rifampicin gastroretentive tablets. Risk assessment using failure mode and effects analysis (FMEA) was done to depict the effects of specific failure modes related to the respective formulation/process variables. A Box-Behnken design was used to investigate the effect of the amounts of sodium bicarbonate (X1), pore former HPMC (X2), and glyceryl behenate (X3) on percent drug release at the 1st hour (Q1), 4th hour (Q4), and 8th hour (Q8) and on floating lag time (min). Main effects and interaction plots were generated to study the effects of the variables. Selection of the optimized formulation was done using the desirability function and overlay contour plots. The optimized formulation exhibited Q1 of 20.9%, Q4 of 59.1%, Q8 of 94.8% and floating lag time of 4.0 min. The Akaike information criterion and model selection criterion revealed that drug release was best described by the Korsmeyer-Peppas power law. The residual plots demonstrated no existence of non-normality, skewness or outliers. The composite desirability for the optimized formulation computed using equations and software were 0.84 and 0.86, respectively. FTIR, DSC and PXRD studies ruled out drug-polymer interaction due to thermal treatment. PMID:23916823
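
    The Korsmeyer-Peppas power law referenced above relates the fraction of drug released F(t) to time as F(t) = k·t^n, with n indicating the release mechanism. A minimal sketch of estimating k and n by log-linear least squares is given below; the time points and release fractions are hypothetical, not the study's data.

      import numpy as np

      # Hypothetical cumulative release fractions at the sampling times (hours).
      t = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
      released = np.array([0.21, 0.34, 0.55, 0.75, 0.93])

      # Korsmeyer-Peppas: F = k * t**n  =>  ln F = ln k + n * ln t
      n_exp, ln_k = np.polyfit(np.log(t), np.log(released), 1)
      print(f"k = {np.exp(ln_k):.3f}, n = {n_exp:.3f}")
      # n near 0.5 suggests Fickian diffusion; n near 1.0 suggests case-II transport.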

  7. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem

    PubMed Central

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means the proposed model reduces the computational complexity. PMID:26180842

  8. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means the proposed model reduces the computational complexity. PMID:26180842
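
    The weighted biobjective 0-1 formulation described in these two records can be illustrated with a toy example: binary ground-hold decisions scored by a weighted sum of delay cost and operational risk, solved here by exhaustive search (a real instance would use an integer-programming solver). All flight data, the capacity limit, and the weight below are assumed for illustration.

      from itertools import product

      # Hypothetical per-flight ground-hold decisions: 1 = hold, 0 = release.
      delay_cost = [3.0, 5.0, 2.0, 4.0]        # cost incurred if the flight is held
      risk_if_released = [0.7, 0.2, 0.9, 0.4]  # operational risk if released into a
                                               # possibly capacity-reduced sector
      capacity = 2                             # at most 2 flights may be released
      w = 0.5                                  # weight trading off the two objectives

      best = None
      for x in product([0, 1], repeat=len(delay_cost)):      # exhaustive 0-1 search
          released = [i for i, xi in enumerate(x) if xi == 0]
          if len(released) > capacity:
              continue                                       # infeasible schedule
          cost = sum(c for c, xi in zip(delay_cost, x) if xi == 1)
          risk = sum(risk_if_released[i] for i in released)
          score = w * cost + (1.0 - w) * risk                # weighted biobjective
          if best is None or score < best[0]:
              best = (score, x)

      print("best schedule:", best)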

  9. An optimization-based approach for facility energy management with uncertainties, and, Power portfolio optimization in deregulated electricity markets with risk management

    NASA Astrophysics Data System (ADS)

    Xu, Jun

    Topic 1. An Optimization-Based Approach for Facility Energy Management with Uncertainties. Effective energy management for facilities is becoming increasingly important in view of the rising energy costs, the government mandate on the reduction of energy consumption, and the human comfort requirements. This part of the dissertation presents a daily energy management formulation and the corresponding solution methodology for HVAC systems. The problem is to minimize the energy and demand costs through the control of HVAC units while satisfying human comfort, system dynamics, load limit constraints, and other requirements. The problem is difficult in view of the fact that the system is nonlinear, time-varying, building-dependent, and uncertain; and that the direct control of a large number of HVAC components is difficult. In this work, HVAC setpoints are the control variables developed on top of a Direct Digital Control (DDC) system. A method that combines Lagrangian relaxation, neural networks, stochastic dynamic programming, and heuristics is developed to predict the system dynamics and uncontrollable load, and to optimize the setpoints. Numerical testing and prototype implementation results show that our method can effectively reduce total costs, manage uncertainties, and shed load, and is computationally efficient. Furthermore, it is significantly better than existing methods. Topic 2. Power Portfolio Optimization in Deregulated Electricity Markets with Risk Management. In a deregulated electric power system, multiple markets of different time scales exist with various power supply instruments. A load serving entity (LSE) has multiple choices from these instruments to meet its load obligations. In view of the large amount of power involved, the complex market structure, risks in such volatile markets, stringent constraints to be satisfied, and the long time horizon, a power portfolio optimization problem is of critical importance but difficult for an LSE to serve the

  10. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize the investment risk. The objective of the mean-variance model is to minimize the portfolio risk while achieving the target rate of return. Variance is used as the risk measure in the mean-variance model. The purpose of this study is to compare the portfolio composition and performance of the mean-variance optimal portfolio and the equally weighted portfolio. In the equally weighted portfolio, the proportions invested in each asset are equal. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different. In addition, the mean-variance optimal portfolio gives better performance because it achieves a higher performance ratio than the equally weighted portfolio.
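
    A compact sketch of the mean-variance model described above: minimize portfolio variance w'Σw subject to a target return w'μ = r and full investment w'1 = 1 (closed-form solution, short positions allowed), compared against the equally weighted portfolio. The expected returns, covariance matrix, and target return below are hypothetical.

      import numpy as np

      # Hypothetical expected returns and covariance for three assets.
      mu = np.array([0.08, 0.12, 0.10])
      cov = np.array([[0.040, 0.006, 0.010],
                      [0.006, 0.090, 0.012],
                      [0.010, 0.012, 0.060]])
      target = 0.10                      # required portfolio return

      inv = np.linalg.inv(cov)
      ones = np.ones(len(mu))
      A = ones @ inv @ ones
      B = ones @ inv @ mu
      C = mu @ inv @ mu
      D = A * C - B ** 2

      # Closed-form mean-variance weights: minimize w'Σw s.t. w'μ = target, w'1 = 1.
      lam = (target * A - B) / D
      gam = (C - target * B) / D
      w_mv = lam * inv @ mu + gam * inv @ ones

      w_eq = ones / len(mu)              # equally weighted benchmark
      for name, w in [("mean-variance", w_mv), ("equal weight", w_eq)]:
          ret, var = w @ mu, w @ cov @ w
          print(f"{name:14s} weights={np.round(w, 3)} return={ret:.3f} "
                f"std={np.sqrt(var):.3f} ratio={ret / np.sqrt(var):.2f}")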

  11. Application of risk-based methods to optimize inspection planning for regulatory activities at nuclear power plants

    SciTech Connect

    Wong, S.M.; Higgins, J.C.; Martinez-Guridi, G.

    1995-07-01

    As part of regulatory oversight requirements, the U.S. Nuclear Regulatory Commission (USNRC) staff conducts inspection activities to assess operational safety performance in nuclear power plants. Currently, guidance in these inspections is provided by procedures in the NRC Inspection Manual and issuance of Temporary Instructions defining the objectives and scope of the inspection effort. In several studies sponsored by the USNRC over the last few years, Brookhaven National Laboratory (BNL) has developed and applied methodologies for providing risk-based inspection guidance for the safety assessments of nuclear power plant systems. One recent methodology integrates insights from existing Probabilistic Risk Assessment (PRA) studies and Individual Plant Evaluations (IPE) with information from operating experience reviews for consideration in inspection planning for either multi-disciplinary team inspections or individual inspections. In recent studies at BNL, a risk-based methodology was developed to optimize inspection planning for regulatory activities at nuclear power plants. This methodology integrates risk-based insights from the plant configuration risk profile and risk information found in existing PRA/IPE studies.

  12. A risk-based coverage model for video surveillance camera control optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua

    2015-12-01

    Visual surveillance systems for law enforcement or police case investigation differ from traditional applications in that they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about the monitored targets and events, and risk entropy is introduced to model the requirements of police surveillance tasks on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.

  13. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    … The algorithm to jointly optimize sensor schedules against search, track, and classify is based on recent work by Papageorgiou and Raykin on risk-based sensor management. It uses a risk-based objective function and attempts to minimize and balance the risks of misclassifying and losing track of an object. It supports the requirement to generate tasking for metric and feature data concurrently and synergistically, and to account for both tracking accuracy and object characterization, jointly, in computing the reward and cost for optimizing tasking decisions.

  14. Optimization of the fractionated irradiation scheme considering physical doses to tumor and organ at risk based on dose–volume histograms

    SciTech Connect

    Sugano, Yasutaka; Mizuta, Masahiro; Takao, Seishin; Shirato, Hiroki; Sutherland, Kenneth L.; Date, Hiroyuki

    2015-11-15

    Purpose: Radiotherapy of solid tumors has been performed with various fractionation regimens such as multi- and hypofractionation. However, the ability to optimize the fractionation regimen considering the physical dose distribution remains insufficient. This study aims to optimize the fractionation regimen, in which the authors propose a graphical method for selecting the optimal number of fractions (n) and dose per fraction (d) based on dose–volume histograms for tumor and normal tissues of organs around the tumor. Methods: Modified linear-quadratic models were employed to estimate the radiation effects on the tumor and an organ at risk (OAR), where the repopulation of the tumor cells and the linearity of the dose-response curve in the high dose range of the surviving fraction were considered. The minimization problem for the damage effect on the OAR was solved under the constraint that the radiation effect on the tumor is fixed by a graphical method. Here, the damage effect on the OAR was estimated based on the dose–volume histogram. Results: It was found that the optimization of the fractionation scheme incorporating the dose–volume histogram is possible by employing appropriate cell survival models. The graphical method considering the repopulation of tumor cells and a rectilinear response in the high dose range enables one to derive the optimal number of fractions and dose per fraction. For example, in the treatment of prostate cancer, the optimal fractionation was suggested to lie in the range of 8–32 fractions with a daily dose of 2.2–6.3 Gy. Conclusions: It is possible to optimize the number of fractions and dose per fraction based on the physical dose distribution (i.e., dose–volume histogram) by the graphical method considering the effects on tumor and OARs around the tumor. This method may provide a new guideline to optimize the fractionation regimen for physics-guided fractionation.
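
    A minimal numerical sketch of the kind of trade-off described above, using the plain linear-quadratic model: for each candidate number of fractions n, the dose per fraction d is chosen so the tumor BED (with a simple repopulation penalty after a kick-off time) stays fixed, and the OAR BED is then compared across n. All parameter values (α/β ratios, required BED, OAR sparing factor, repopulation rate) are assumed for illustration and this is not the authors' modified-LQ or DVH-based formulation.

      import numpy as np

      # Illustrative LQ parameters; not the study's fitted values.
      ab_tumor, ab_oar = 10.0, 3.0   # alpha/beta ratios (Gy)
      target_bed = 72.0              # required tumor BED (Gy_10) before repopulation
      sparing = 0.7                  # OAR receives this fraction of the tumor dose
      repop_rate = 0.6               # Gy_10 of tumor BED lost per day after kick-off
      kickoff = 28.0                 # days before accelerated repopulation starts

      best = None
      for n in range(1, 41):                       # candidate numbers of fractions
          t_overall = 1.4 * (n - 1)                # ~5 fractions per week
          required = target_bed + repop_rate * max(0.0, t_overall - kickoff)
          # Solve n*d*(1 + d/ab_tumor) = required for the dose per fraction d.
          a, b, c = n / ab_tumor, float(n), -required
          d = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
          oar_bed = n * sparing * d * (1.0 + sparing * d / ab_oar)
          if best is None or oar_bed < best[2]:
              best = (n, d, oar_bed)

      n, d, oar_bed = best
      print(f"optimum: n = {n} fractions, d = {d:.2f} Gy, OAR BED = {oar_bed:.1f} Gy_3")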

  15. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  16. Risk Assessment: Evidence Base

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2007-01-01

    Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. Human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.

  17. RNA based evolutionary optimization

    NASA Astrophysics Data System (ADS)

    Schuster, Peter

    1993-12-01

    … Evolutionary optimization of two-letter sequences is thus more difficult than optimization in the world of natural RNA sequences with four bases. This fact might explain the usage of four bases in the genetic language of nature. Finally, we study the mapping from RNA sequences into secondary structures and explore the topology of RNA shape space. We find that ‘neutral paths’ connecting neighbouring sequences with identical structures go very frequently through entire sequence space. Sequences folding into common structures are found everywhere in sequence space. Hence, evolution can migrate to almost every part of sequence space without ‘hill climbing’ and only small fractions of the entire number of sequences have to be searched in order to find suitable structures.

  18. Potential for dose-escalation and reduction of risk in pancreatic cancer using IMRT optimization with lexicographic ordering and gEUD-based cost functions

    SciTech Connect

    Spalding, Aaron C.; Jee, Kyung-Wook; Vineberg, Karen; Jablonowski, Marla; Fraass, Benedick A.; Pan, Charlie C.; Lawrence, Theodore S.; Ten Haken, Randall K.; Ben-Josef, Edgar

    2007-02-15

    Radiotherapy for pancreatic cancer is limited by the tolerance of local organs at risk (OARs) and frequent overlap of the planning target volume (PTV) and OAR volumes. Using lexicographic ordering (LO), a hierarchical optimization technique, with generalized equivalent uniform dose (gEUD) cost functions, we studied the potential of intensity modulated radiation therapy (IMRT) to increase the dose to pancreatic tumors and to areas of vascular involvement that preclude surgical resection [surgical boost volume (SBV)]. We compared 15 forward planned three-dimensional conformal (3DCRT) and IMRT treatment plans for locally advanced unresectable pancreatic cancer. We created IMRT plans optimized using LO with gEUD-based cost functions that account for the contribution of each part of the resulting inhomogeneous dose distribution. LO-IMRT plans allowed substantial PTV dose escalation compared with 3DCRT; median increase from 52 Gy to 66 Gy (a=-5,p<0.005) and median increase from 50 Gy to 59 Gy (a=-15,p<0.005). LO-IMRT also allowed increases to 85 Gy in the SBV, regardless of a value, along with significant dose reductions in OARs. We conclude that LO-IMRT with gEUD cost functions could allow dose escalation in pancreas tumors with concomitant reduction in doses to organs at risk as compared with traditional 3DCRT.

  19. Potential for dose-escalation and reduction of risk in pancreatic cancer using IMRT optimization with lexicographic ordering and gEUD-based cost functions.

    PubMed

    Spalding, Aaron C; Jee, Kyung-Wook; Vineberg, Karen; Jablonowski, Marla; Fraass, Benedick A; Pan, Charlie C; Lawrence, Theodore S; Haken, Randall K Ten; Ben-Josef, Edgar

    2007-02-01

    Radiotherapy for pancreatic cancer is limited by the tolerance of local organs at risk (OARs) and frequent overlap of the planning target volume (PTV) and OAR volumes. Using lexicographic ordering (LO), a hierarchical optimization technique, with generalized equivalent uniform dose (gEUD) cost functions, we studied the potential of intensity modulated radiation therapy (IMRT) to increase the dose to pancreatic tumors and to areas of vascular involvement that preclude surgical resection [surgical boost volume (SBV)]. We compared 15 forward planned three-dimensional conformal (3DCRT) and IMRT treatment plans for locally advanced unresectable pancreatic cancer. We created IMRT plans optimized using LO with gEUD-based cost functions that account for the contribution of each part of the resulting inhomogeneous dose distribution. LO-IMRT plans allowed substantial PTV dose escalation compared with 3DCRT; median increase from 52 Gy to 66 Gy (a=-5,p<0.005) and median increase from 50 Gy to 59 Gy (a=-15,p<0.005). LO-IMRT also allowed increases to 85 Gy in the SBV, regardless of a value, along with significant dose reductions in OARs. We conclude that LO-IMRT with gEUD cost functions could allow dose escalation in pancreas tumors with concomitant reduction in doses to organs at risk as compared with traditional 3DCRT. PMID:17388169
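
    The gEUD cost function used in the two records above reduces a structure's dose distribution to a single number, gEUD = (mean of D_i^a)^(1/a); strongly negative a emphasizes cold spots (useful for targets) and large positive a approaches the maximum dose (useful for serial OARs). The voxel doses below are hypothetical.

      import numpy as np

      def geud(doses, a):
          """Generalized equivalent uniform dose of a structure's voxel doses."""
          doses = np.asarray(doses, dtype=float)
          return np.mean(doses ** a) ** (1.0 / a)

      # Hypothetical voxel doses (Gy) for a target and an OAR.
      ptv = np.array([52.0, 55.0, 60.0, 61.0, 58.0])
      oar = np.array([10.0, 25.0, 40.0, 5.0, 18.0])

      print(geud(ptv, a=-5))   # negative a -> sensitive to cold spots (targets)
      print(geud(oar, a=15))   # large positive a -> approaches the maximum dose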

  20. Medical Device Risk Management For Performance Assurance Optimization and Prioritization.

    PubMed

    Gaamangwe, Tidimogo; Babbar, Vishvek; Krivoy, Agustina; Moore, Michael; Kresta, Petr

    2015-01-01

    Performance assurance (PA) is an integral component of clinical engineering medical device risk management. For that reason, the clinical engineering (CE) community has made concerted efforts to define appropriate risk factors and develop quantitative risk models for efficient data processing and improved PA program operational decision making. However, a common framework that relates the various processes of a quantitative risk system does not exist. This article provides a perspective that focuses on medical device quality and risk-based elements of the PA program, which include device inclusion/exclusion, schedule optimization, and inspection prioritization. A PA risk management framework is provided, and previous quantitative models that have contributed to the advancement of PA risk management are examined. A general model for quantitative risk systems is proposed, and further perspective on possible future directions in the area of PA technology is also provided. PMID:26618842

  1. Search-based optimization

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    The problem of determining the minimum cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete (Wang and Jiang, 1994). Traditionally, point estimations of hypothetical ancestral sequences have been used to gain heuristic, upper bounds on cladogram cost. These include procedures with such diverse approaches as non-additive optimization of multiple sequence alignment, direct optimization (Wheeler, 1996), and fixed-state character optimization (Wheeler, 1999). A method is proposed here which, by extending fixed-state character optimization, replaces the estimation process with a search. This form of optimization examines a diversity of potential state solutions for cost-efficient hypothetical ancestral sequences and can result in greatly more parsimonious cladograms. Additionally, such an approach can be applied to other NP-complete phylogenetic optimization problems such as genomic break-point analysis. © 2003 The Willi Hennig Society. Published by Elsevier Science (USA). All rights reserved.

  2. Search-based optimization.

    PubMed

    Wheeler, Ward C

    2003-08-01

    The problem of determining the minimum cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete (Wang and Jiang, 1994). Traditionally, point estimations of hypothetical ancestral sequences have been used to gain heuristic, upper bounds on cladogram cost. These include procedures with such diverse approaches as non-additive optimization of multiple sequence alignment, direct optimization (Wheeler, 1996), and fixed-state character optimization (Wheeler, 1999). A method is proposed here which, by extending fixed-state character optimization, replaces the estimation process with a search. This form of optimization examines a diversity of potential state solutions for cost-efficient hypothetical ancestral sequences and can result in greatly more parsimonious cladograms. Additionally, such an approach can be applied to other NP-complete phylogenetic optimization problems such as genomic break-point analysis. PMID:14531408

  3. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and use risk as a tradeable resource to make robust and reliable decisions, in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early-phase design.

  4. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  5. Differential effects of trait anger on optimism and risk behaviour.

    PubMed

    Pietruska, Karin; Armony, Jorge L

    2013-01-01

    It has been proposed that angry people exhibit optimistic risk estimates about future events and, consequently, are biased towards making risk-seeking choices. The goal of this study was to directly test the hypothesised effect of trait anger on optimism and risk-taking behaviour. One hundred healthy volunteers completed questionnaires about personality traits, optimism and risk behaviour. In addition their risk tendency was assessed with the Balloon Analogue Risk Task (BART), which provides an online measure of risk behaviour. Our results partly confirmed the relation between trait anger and outcome expectations of future life events, but suggest that this optimism does not necessarily translate into actual risk-seeking behaviour. PMID:22780446

  6. Cost tradeoffs in consequence management at nuclear power plants: A risk based approach to setting optimal long-term interdiction limits for regulatory analyses

    SciTech Connect

    Mubayi, V.

    1995-05-01

    The consequences of severe accidents at nuclear power plants can be limited by various protective actions, including emergency responses and long-term measures, to reduce exposures of affected populations. Each of these protective actions involves costs to society. The costs of the long-term protective actions depend on the criterion adopted for the allowable level of long-term exposure. This criterion, called the "long-term interdiction limit," is expressed in terms of the projected dose to an individual over a certain time period from the long-term exposure pathways. The two measures of offsite consequences, latent cancers and costs, are inversely related and the choice of an interdiction limit is, in effect, a trade-off between these two measures. By monetizing the health effects (through ascribing a monetary value to life lost), the costs of the two consequence measures vary with the interdiction limit, the health effect costs increasing as the limit is relaxed and the protective action costs decreasing. The minimum of the total cost curve can be used to calculate an optimal long-term interdiction limit. The calculation of such an optimal limit is presented for each of five US nuclear power plants which were analyzed for severe accident risk in the NUREG-1150 program by the Nuclear Regulatory Commission.
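
    A schematic version of the cost trade-off described above: monetized health-effect cost rises as the interdiction limit is relaxed while protective-action cost falls, and the optimal limit minimizes their sum. The functional forms and dollar values below are assumptions for illustration only, not NUREG-1150 results.

      import numpy as np

      # Candidate interdiction limits L (allowed long-term dose, rem over the period).
      limits = np.linspace(0.5, 10.0, 200)

      value_per_cancer = 4.0e6             # assumed monetized value ($ per latent cancer)
      cancers = 0.02 * limits              # latent cancers rise as the limit is relaxed
      health_cost = value_per_cancer * cancers
      protective_cost = 3.0e5 / limits     # interdiction/relocation cost falls as the
                                           # limit is relaxed (assumed 1/L shape)

      total = health_cost + protective_cost
      best = limits[np.argmin(total)]
      print(f"cost-minimizing interdiction limit ~ {best:.2f} rem")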

  7. Research on optimization-based design

    NASA Technical Reports Server (NTRS)

    Balling, R. J.; Parkinson, A. R.; Free, J. C.

    1989-01-01

    Research on optimization-based design is discussed. Illustrative examples are given for cases involving continuous optimization with discrete variables and optimization with tolerances. Approximation of computationally expensive and noisy functions, electromechanical actuator/control system design using decomposition and application of knowledge-based systems and optimization for the design of a valve anti-cavitation device are among the topics covered.

  8. Optimal trading from minimizing the period of bankruptcy risk

    NASA Astrophysics Data System (ADS)

    Liehr, S.; Pawelzik, K.

    2001-04-01

    Assuming that financial markets behave similarly to random-walk processes, we derive a trading strategy with variable investment which is based on the equivalence of the period of bankruptcy risk and the risk to profit ratio. We define a state-dependent predictability measure which can be attributed to the deterministic and stochastic components of the price dynamics. The influence of predictability variations and especially of short-term inefficiency structures on the optimal amount of investment is analyzed in the given context and a method for adaptation of a trading system to the proposed objective function is presented. Finally, we show the performance of our trading strategy on the DAX and S&P 500 as examples of real-world data using different types of prediction models in comparison.

  9. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    SciTech Connect

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  10. Risk-optimized proton therapy to minimize radiogenic second cancers

    NASA Astrophysics Data System (ADS)

    Rechner, Laura A.; Eley, John G.; Howell, Rebecca M.; Zhang, Rui; Mirkovic, Dragan; Newhauser, Wayne D.

    2015-05-01

    Proton therapy confers substantially lower predicted risk of second cancer compared with photon therapy. However, no previous studies have used an algorithmic approach to optimize beam angle or fluence-modulation for proton therapy to minimize those risks. The objectives of this study were to demonstrate the feasibility of risk-optimized proton therapy and to determine the combination of beam angles and fluence weights that minimizes the risk of second cancer in the bladder and rectum for a prostate cancer patient. We used 6 risk models to predict excess relative risk of second cancer. Treatment planning utilized a combination of a commercial treatment planning system and an in-house risk-optimization algorithm. When normal-tissue dose constraints were incorporated in treatment planning, the risk model that incorporated the effects of fractionation, initiation, inactivation, repopulation and promotion selected a combination of anterior and lateral beams, which lowered the relative risk by 21% for the bladder and 30% for the rectum compared to the lateral-opposed beam arrangement. Other results were found for other risk models.

  11. Risk-optimized proton therapy to minimize radiogenic second cancers

    PubMed Central

    Rechner, Laura A.; Eley, John G.; Howell, Rebecca M.; Zhang, Rui; Mirkovic, Dragan; Newhauser, Wayne D.

    2015-01-01

    Proton therapy confers substantially lower predicted risk of second cancer compared with photon therapy. However, no previous studies have used an algorithmic approach to optimize beam angle or fluence-modulation for proton therapy to minimize those risks. The objectives of this study were to demonstrate the feasibility of risk-optimized proton therapy and to determine the combination of beam angles and fluence weights that minimize the risk of second cancer in the bladder and rectum for a prostate cancer patient. We used 6 risk models to predict excess relative risk of second cancer. Treatment planning utilized a combination of a commercial treatment planning system and an in-house risk-optimization algorithm. When normal-tissue dose constraints were incorporated in treatment planning, the risk model that incorporated the effects of fractionation, initiation, inactivation, and repopulation selected a combination of anterior and lateral beams, which lowered the relative risk by 21% for the bladder and 30% for the rectum compared to the lateral-opposed beam arrangement. Other results were found for other risk models. PMID:25919133

  12. Development of insurance terms based on risk assessment

    NASA Astrophysics Data System (ADS)

    Kosarev, Alexey; Nepp, Alexander; Nikonov, Oleg; Rushitskaja, Olga

    2013-10-01

    The present work presents a technique for forming insurance conditions based on risk assessment for industrial companies. The authors examine the problem of determining the optimal insurance coverage based on risk assessment. The assessment is based on calculating the Value-at-Risk (VaR) of accidents. VaR indicators could help to determine optimal levels of deductibles and insurance rates. The paper presents the results of practical testing of the method.
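
    A small sketch of the VaR-based reasoning described above: simulate (or otherwise obtain) an annual-loss distribution, read off a VaR quantile, and derive a deductible and pure premium from it. The lognormal loss model and the chosen quantiles are assumptions for illustration, not the authors' calibration.

      import numpy as np

      # Hypothetical annual accident losses for an industrial site (simulated).
      rng = np.random.default_rng(42)
      losses = rng.lognormal(mean=11.0, sigma=1.2, size=100_000)   # $ per year

      var_95 = np.quantile(losses, 0.95)       # 95% Value-at-Risk of annual loss
      expected_loss = losses.mean()

      deductible = np.quantile(losses, 0.50)   # e.g. retain losses up to the median
      insured_part = np.clip(losses - deductible, 0.0, None)
      pure_premium = insured_part.mean()       # expected insured loss (no loadings)

      print(f"95% VaR: {var_95:,.0f}  expected loss: {expected_loss:,.0f}")
      print(f"deductible: {deductible:,.0f}  pure premium: {pure_premium:,.0f}")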

  13. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

    The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
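
    The risk measure defined above (consequence times likelihood) directly supports ranking piping circuits so inspection effort concentrates on the few items carrying most of the risk; a toy ranking with invented likelihoods and consequences is shown below.

      # Hypothetical piping circuits with likelihood (failures/yr) and consequence ($).
      circuits = {
          "overhead line":  {"likelihood": 1e-3, "consequence": 5.0e6},
          "feed line":      {"likelihood": 5e-4, "consequence": 2.0e6},
          "utility header": {"likelihood": 2e-3, "consequence": 1.0e5},
          "flare line":     {"likelihood": 1e-4, "consequence": 8.0e6},
      }

      # Risk = likelihood x consequence; rank to focus inspection on the top items.
      ranked = sorted(circuits.items(),
                      key=lambda kv: kv[1]["likelihood"] * kv[1]["consequence"],
                      reverse=True)
      for name, c in ranked:
          print(f"{name:15s} risk = {c['likelihood'] * c['consequence']:10,.0f} $/yr")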

  14. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, explore the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  15. Cancer risk assessment: Optimizing human health through linear dose-response models.

    PubMed

    Calabrese, Edward J; Shamoun, Dima Yazji; Hanekamp, Jaap C

    2015-07-01

    This paper proposes that generic cancer risk assessments be based on the integration of the Linear Non-Threshold (LNT) and hormetic dose-responses since optimal hormetic beneficial responses are estimated to occur at the dose associated with a 10(-4) risk level based on the use of a LNT model as applied to animal cancer studies. The adoption of the 10(-4) risk estimate provides a theoretical and practical integration of two competing risk assessment models whose predictions cannot be validated in human population studies or with standard chronic animal bioassay data. This model-integration reveals both substantial protection of the population from cancer effects (i.e. functional utility of the LNT model) while offering the possibility of significant reductions in cancer incidence should the hormetic dose-response model predictions be correct. The dose yielding the 10(-4) cancer risk therefore yields the optimized toxicologically based "regulatory sweet spot". PMID:25916915

  16. Optimal linear and nonlinear feature extraction based on the minimization of the increased risk of misclassification. [Bayes theorem - statistical analysis/data processing

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R. J. P.

    1974-01-01

    General classes of nonlinear and linear transformations were investigated for the reduction of the dimensionality of the classification (feature) space so that, for a prescribed dimension m of this space, the increase of the misclassification risk is minimized.

  17. The Integration of LNT and Hormesis for Cancer Risk Assessment Optimizes Public Health Protection.

    PubMed

    Calabrese, Edward J; Shamoun, Dima Yazji; Hanekamp, Jaap C

    2016-03-01

    This paper proposes a new cancer risk assessment strategy and methodology that optimizes population-based responses by yielding the lowest disease/tumor incidence across the entire dose continuum. The authors argue that the optimization can be achieved by integrating two seemingly conflicting models; i.e., the linear no-threshold (LNT) and hormetic dose-response models. The integration would yield the optimized response at a risk of 10(-4) with the LNT model. The integrative functionality of the LNT and hormetic dose response models provides an improved estimation of tumor incidence through model uncertainty analysis and major reductions in cancer incidence via hormetic model estimates. This novel approach to cancer risk assessment offers significant improvements over current risk assessment approaches by revealing a regulatory sweet spot that maximizes public health benefits while incorporating practical approaches for model validation. PMID:26808876

  18. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.
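
    A minimal worked example of the weighting scheme described above: the expected degree of mission success is the weighted average of the requirements' satisfaction levels (after the risk elements' effects have been applied). The requirement names, weights, and satisfaction values are hypothetical.

      # Hypothetical requirements with relative weights and degree of satisfaction,
      # after applying the effects of the relevant risk elements.
      requirements = [
          {"name": "pointing accuracy", "weight": 0.4, "satisfaction": 0.90},
          {"name": "data downlink",     "weight": 0.3, "satisfaction": 0.75},
          {"name": "thermal margin",    "weight": 0.3, "satisfaction": 0.95},
      ]

      total_weight = sum(r["weight"] for r in requirements)
      expected_success = sum(r["weight"] * r["satisfaction"]
                             for r in requirements) / total_weight
      print(f"expected degree of mission success: {expected_success:.2f}")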

  19. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  20. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
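
    The gap between independent and coordinated levee design can be illustrated with a tiny discrete game: each land owner picks a levee height, raising one's own levee lowers one's own flood probability but raises the neighbour's, and the pure-strategy Nash equilibrium is compared with the social planner's least-total-cost design. The cost model and parameter values are assumptions for illustration, not the study's hydraulic or economic model.

      import itertools

      heights = [0, 1, 2, 3]          # candidate levee heights (m), discretized
      build_cost = 0.7                # annualized construction cost per metre
      damage = 10.0                   # damage if a bank floods

      def flood_prob(own, other):
          # Assumed toy model: your risk falls with your own levee height but rises
          # when the opposite levee pushes more flow onto your side.
          return max(0.02, 0.30 - 0.10 * own + 0.05 * other)

      def cost(own, other):
          return build_cost * own + damage * flood_prob(own, other)

      # Pure-strategy Nash equilibria: neither land owner can cut cost unilaterally.
      nash = [(a, b) for a, b in itertools.product(heights, repeat=2)
              if cost(a, b) <= min(cost(x, b) for x in heights)
              and cost(b, a) <= min(cost(y, a) for y in heights)]

      # Social planner: minimize the combined cost of both banks.
      social = min(itertools.product(heights, repeat=2),
                   key=lambda hb: cost(hb[0], hb[1]) + cost(hb[1], hb[0]))

      print("Nash equilibria:", nash, "social optimum:", social)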

  1. Optimization-Based Models of Muscle Coordination

    PubMed Central

    Prilutsky, Boris I.; Zatsiorsky, Vladimir M.

    2010-01-01

    Optimization-based models may provide reasonably accurate estimates of activation and force patterns of individual muscles in selected well-learned tasks with submaximal efforts. Such optimization criteria as minimum energy expenditure, minimum muscle fatigue, and minimum sense of effort seem most promising. PMID:11800497

  2. Optimization-based models of muscle coordination.

    PubMed

    Prilutsky, Boris I; Zatsiorsky, Vladimir M

    2002-01-01

    Optimization-based models may provide reasonably accurate estimates of activation and force patterns of individual muscles in selected well-learned tasks with submaximal efforts. Such optimization criteria as minimum energy expenditure, minimum muscle fatigue, and minimum sense of effort seem most promising. PMID:11800497

  3. Algorithmic Differentiation for Calculus-based Optimization

    NASA Astrophysics Data System (ADS)

    Walther, Andrea

    2010-10-01

    For numerous applications, the computation and provision of exact derivative information plays an important role for optimizing the considered system but quite often also for its simulation. This presentation introduces the technique of Algorithmic Differentiation (AD), a method to compute derivatives of arbitrary order within working precision. Quite often an additional structure exploitation is indispensable for a successful coupling of these derivatives with state-of-the-art optimization algorithms. The talk will discuss two important situations where the problem-inherent structure allows a calculus-based optimization. Examples from aerodynamics and nano optics illustrate these advanced optimization approaches.
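
    Forward-mode Algorithmic Differentiation can be sketched in a few lines with dual numbers: every arithmetic operation propagates both the value and its derivative, so the derivative comes out to working precision rather than from finite differences. The toy polynomial below is just an example, not the aerodynamics or nano-optics applications mentioned in the talk.

      class Dual:
          """Minimal forward-mode AD value: tracks f and df/dx simultaneously."""
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv
          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)
          __radd__ = __add__
          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value * other.value,
                          self.deriv * other.value + self.value * other.deriv)
          __rmul__ = __mul__

      def f(x):
          return 3 * x * x + 2 * x + 1      # f'(x) = 6x + 2

      x = Dual(2.0, 1.0)                    # seed dx/dx = 1
      y = f(x)
      print(y.value, y.deriv)               # 17.0 and 14.0, to working precision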

  4. Optimal Combination Treatment and Vascular Outcomes in Recent Ischemic Stroke Patients by Premorbid Risk Level

    PubMed Central

    Park, Jong-Ho; Ovbiagele, Bruce

    2015-01-01

    Background Optimal combination of secondary stroke prevention treatment including antihypertensives, antithrombotic agents, and lipid modifiers is associated with reduced recurrent vascular risk including stroke. It is unclear whether optimal combination treatment has a differential impact on stroke patients based on level of vascular risk. Methods We analyzed a clinical trial dataset comprising 3680 recent non-cardioembolic stroke patients aged ≥35 years and followed for 2 years. Patients were categorized by appropriateness level 0 to III depending on the number of the drugs prescribed divided by the number of drugs potentially indicated for each patient (0=none of the indicated medications prescribed and III=all indicated medications prescribed [optimal combination treatment]). High-risk was defined as having a history of stroke or coronary heart disease (CHD) prior to the index stroke event. Independent associations of medication appropriateness level with a major vascular event (stroke, CHD, or vascular death), ischemic stroke, and all-cause death were analyzed. Results Compared with level 0, for major vascular events, the HR of level III in the low-risk group was 0.51 (95% CI: 0.20–1.28) and 0.32 (0.14–0.70) in the high-risk group; for stroke, the HR of level III in the low-risk group was 0.54 (0.16–1.77) and 0.25 (0.08–0.85) in the high-risk group; and for all-cause death, the HR of level III in the low-risk group was 0.66 (0.09–5.00) and 0.22 (0.06–0.78) in the high-risk group. Conclusion Optimal combination treatment is related to a significantly lower risk of future vascular events and death among high-risk patients after a recent non-cardioembolic stroke. PMID:26044963

  5. Optimal CO2 mitigation under damage risk valuation

    NASA Astrophysics Data System (ADS)

    Crost, Benjamin; Traeger, Christian P.

    2014-07-01

    The current generation has to set mitigation policy under uncertainty about the economic consequences of climate change. This uncertainty governs both the level of damages for a given level of warming, and the steepness of the increase in damage per warming degree. Our model of climate and the economy is a stochastic version of a model employed in assessing the US Social Cost of Carbon (DICE). We compute the optimal carbon taxes and CO2 abatement levels that maximize welfare from economic consumption over time under different risk states. In accordance with recent developments in finance, we separate preferences about time and risk to improve the model's calibration of welfare to observed market interest. We show that introducing the modern asset pricing framework doubles optimal abatement and carbon taxation. Uncertainty over the level of damages at a given temperature increase can result in a slight increase of optimal emissions as compared to using expected damages. In contrast, uncertainty governing the steepness of the damage increase in temperature results in a substantially higher level of optimal mitigation.

  6. Vehicle Shield Optimization and Risk Assessment of Future NEO Missions

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem, N.; Kim, Myung-Hee; Cucinotta, Francis A.

    2011-01-01

    Future human space missions target distant destinations such as Near Earth Objects (NEO) or Mars that require extended stays in hostile radiation environments in deep space. The continuous assessment of exploration vehicles is needed to iteratively optimize the designs for shielding protection and to calculate the risks associated with such long missions. We use a predictive software capability that calculates the risks to humans inside a spacecraft. The software uses the CAD software Pro/Engineer and Fishbowl tool kit to quantify the radiation shielding properties of the spacecraft geometry by calculating the areal density seen at a certain point, the dose point, inside the spacecraft. The shielding results are used by NASA-developed software, BRYNTRN, to quantify the organ doses received in a human body located in the vehicle in possible solar particle events (SPEs) during such prolonged space missions. The organ doses are used to quantify the risks posed to the astronauts' health and life using NASA Space Cancer Model software. An illustration of the shielding optimization and risk calculation on an exploration vehicle design suitable for a NEO mission is provided in this study. The vehicle capsule is made of an aluminum shell and airlock with hydrogen-rich carbon composite end caps. The capsule contains sets of racks that surround a working and living area. A water shelter is provided in the middle of the vehicle to enhance the shielding in case of an SPE. The mass distribution is optimized to minimize radiation hotspots, and an assessment of the risks associated with a NEO mission is calculated.

  7. Optimization of multi-constrained structures based on optimality criteria

    NASA Technical Reports Server (NTRS)

    Rizzi, P.

    1976-01-01

    A weight-reduction algorithm is developed for the optimal design of structures subject to several multibehavioral inequality constraints. The structural weight is considered to depend linearly on the design variables. The algorithm incorporates a simple recursion formula derived from the Kuhn-Tucker necessary conditions for optimality, associated with a procedure to delete nonactive constraints based on the Gauss-Seidel iterative method for linear systems. A number of example problems are studied, including typical truss structures and simplified wings subject to static loads and with constraints imposed on stresses and displacements. For one of the latter structures, constraints on the fundamental natural frequency and flutter speed are also imposed. The results obtained show that the method is fast, efficient, and general when compared to other competing techniques. Extension of the method to include equality constraints and nonlinear merit functions is discussed.

  8. MORT (Management Oversight and Risk Tree) based risk management

    SciTech Connect

    Briscoe, G.J.

    1990-02-01

    Risk management is the optimization of safety programs. This requires a formal systems approach to hazards identification, risk quantification, and resource allocation/risk acceptance as opposed to case-by-case decisions. The Management Oversight and Risk Tree (MORT) has gained wide acceptance as a comprehensive formal systems approach covering all aspects of risk management. MORT is a comprehensive analytical procedure that provides a disciplined method for determining the causes and contributing factors of major accidents. Alternatively, it serves as a tool to evaluate the quality of an existing safety system. While similar in many respects to fault tree analysis, MORT is more generalized and presents over 1500 specific elements of an ideal "universal" management program for optimizing occupational safety.

  9. Optimal quad-tree-based motion estimator

    NASA Astrophysics Data System (ADS)

    Schuster, Guido M.; Katsaggelos, Aggelos K.

    1996-09-01

    In this paper we propose an optimal quad-tree (QT)-based motion estimator for video compression. It is optimal in the sense that for a given bit budget for encoding the displacement vector field (DVF) and the QT segmentation, the scheme finds a DVF and a QT segmentation which minimize the energy of the resulting displaced frame difference (DFD). We find the optimal QT decomposition and the optimal DVF jointly using the Lagrangian multiplier method and a multilevel dynamic program. The resulting DVF is spatially inhomogeneous since large blocks are used in areas with simple motion and small blocks in areas with complex motion. We present results with the proposed QT-based motion estimator which show that for the same DFD energy the proposed estimator uses about 30% fewer bits than the commonly used block matching algorithm.
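
    A minimal sketch of the bottom-up Lagrangian pruning idea described above, assuming a hypothetical leaf_cost(block) that returns the (distortion, rate) of coding a block with a single displacement vector; this illustrates the technique and is not the authors' implementation:

```python
def split(block):
    # Hypothetical block representation: (x, y, size) -> its four quadrants.
    x, y, s = block
    h = s // 2
    return [(x, y, h), (x + h, y, h), (x, y + h, h), (x + h, y + h, h)]

def qt_prune(block, lam, leaf_cost, min_size):
    """Return (J, tree) minimizing the Lagrangian J = distortion + lam * rate."""
    d, r = leaf_cost(block)                 # cost of coding the block as one leaf
    cost_leaf = d + lam * r
    if block[2] <= min_size:                # smallest allowed block: must be a leaf
        return cost_leaf, ("leaf", block)
    children = [qt_prune(q, lam, leaf_cost, min_size) for q in split(block)]
    cost_split = sum(c for c, _ in children)
    if cost_split < cost_leaf:              # the finer segmentation pays off
        return cost_split, ("split", [t for _, t in children])
    return cost_leaf, ("leaf", block)

# Demo with a fabricated leaf cost (in practice: DFD energy and DVF bits of the block).
demo_cost = lambda blk: (blk[2] ** 2.5, 12.0)     # (distortion, rate) placeholder
print(qt_prune((0, 0, 16), lam=1.0, leaf_cost=demo_cost, min_size=4)[0])
```

    Sweeping the multiplier lam and picking the value whose optimal tree meets the bit budget reproduces the usual rate-distortion search; the dynamic-programming structure is the bottom-up comparison of each node against its four children.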

  10. DSP code optimization based on cache

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Li, Chengcheng; Tang, Bin

    2013-03-01

    A DSP program often runs less efficiently on the target board than in software simulation during program development, which mainly results from the user's improper use and incomplete understanding of the cache-based memory system. This paper takes the TI TMS320C6455 DSP as an example, analyzes its two-level internal cache, and summarizes methods of code optimization. The processor can achieve its best performance when these code optimization methods are applied. Finally, a specific algorithm application in radar signal processing is presented. Experimental results show that these optimizations are effective.

  11. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. The impact of corrosion damage can cause the HRSG, and hence the power plant, to stop operating. Furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for risk analysis of the HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study relating to risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of standard API 581 place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
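
    Risk categories such as the "4C" ratings above come from placing each item on a probability-versus-consequence matrix; the sketch below shows such a lookup with illustrative category boundaries that do not reproduce the actual API 581 matrix:

```python
# Illustrative 5x5 risk matrix: probability category 1-5, consequence category A-E.
# The qualitative labels and boundaries are placeholders, not the API 581 values.
RISK_MATRIX = {
    (p, c): ("low"         if p + "ABCDE".index(c) <= 3 else
             "medium"      if p + "ABCDE".index(c) <= 5 else
             "medium-high" if p + "ABCDE".index(c) <= 7 else
             "high")
    for p in range(1, 6) for c in "ABCDE"
}

def risk_category(prob_cat, cons_cat):
    """Map a (probability, consequence) pair, e.g. (4, 'C'), to a risk level."""
    return f"{prob_cat}{cons_cat}", RISK_MATRIX[(prob_cat, cons_cat)]

print(risk_category(4, "C"))   # -> ('4C', 'medium-high'), cf. the HP superheater above
print(risk_category(3, "C"))   # -> ('3C', 'medium'), cf. the HP economizer above
```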

  12. Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Kuwata, Yoshiaki

    2013-01-01

    A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework. It was demonstrated by a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states. Such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach is called risk allocation, which decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint was reformulated into a constraint on an expectation of a sum of indicator functions, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem can be turned into an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.
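
    A hedged sketch of the risk-allocation and dualization steps described above: the joint chance constraint over per-stage failure events F_i is conservatively decomposed with the union bound and then folded into the objective,

```latex
\Pr\!\Big(\bigcup_i F_i\Big) \;\le\; \sum_i \Pr(F_i) \;\le\; \sum_i \delta_i \;\le\; \Delta,
\qquad
L(\lambda) \;=\; \mathbb{E}\Big[\,\mathrm{cost} + \lambda\Big(\sum_i \mathbf{1}_{F_i} - \Delta\Big)\Big],
```

    so each stage only has to respect its own risk budget delta_i, and minimizing the Lagrangian L(lambda) over policies is an unconstrained problem that standard dynamic programming can handle; the multiplier lambda is then adjusted so the overall risk bound Delta is met.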

  13. Optimal Information-based Classification

    NASA Astrophysics Data System (ADS)

    Hyun, Baro

    Classification is the allocation of an object to an existing category among several based on uncertain measurements. Since information is used to quantify uncertainty, it is natural to consider classification and information as complementary subjects. This dissertation touches upon several topics that relate to the problem of classification, such as information, classification, and team classification. Motivated by the U.S. Air Force Intelligence, Surveillance, and Reconnaissance missions, we investigate the aforementioned topics for classifiers that follow two models: classifiers with workload-independent and workload-dependent performance. We adopt workload-independence and dependence as "first-order" models to capture the features of machines and humans, respectively. We first investigate the relationship between information in the sense of Shannon and classification performance, which is defined as the probability of misclassification. We show that while there is a predominant congruence between them, there are cases when such congruence is violated. We show the phenomenon for both workload-independent and workload-dependent classifiers and investigate the cause of such phenomena analytically. One way of making classification decisions is by setting a threshold on a measured quantity. For instance, if a measurement falls on one side of the threshold, the object that provided the measurement is classified as one type, otherwise, it is of another type. Exploiting thresholding, we formalize a classifier with dichotomous decisions (i.e., with two options, such as true or false) given a single variable measurement. We further extend the formalization to classifiers with trichotomy (i.e., with three options, such as true, false or unknown) and with multivariate measurements. When a team of classifiers is considered, issues on how to exploit redundant numbers of classifiers arise. We analyze these classifiers under different architectures, such as parallel or nested
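
    A minimal sketch of the thresholded dichotomous and trichotomous decision rules described above (threshold values are placeholders):

```python
def dichotomous(x, tau):
    """Two-option classifier: declare 'true' if the measurement exceeds tau."""
    return "true" if x > tau else "false"

def trichotomous(x, tau_low, tau_high):
    """Three-option classifier: measurements in the middle band are 'unknown'."""
    if x > tau_high:
        return "true"
    if x < tau_low:
        return "false"
    return "unknown"

# Example use with placeholder thresholds
print(dichotomous(0.7, tau=0.5))                        # -> 'true'
print(trichotomous(0.45, tau_low=0.4, tau_high=0.6))    # -> 'unknown'
```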

  14. Estimation of the Optimal Statistical Quality Control Sampling Time Intervals Using a Residual Risk Measure

    PubMed Central

    Hatjimihail, Aristides T.

    2009-01-01

    Background An open problem in clinical chemistry is the estimation of the optimal sampling time intervals for the application of statistical quality control (QC) procedures that are based on the measurement of control materials. This is a probabilistic risk assessment problem that requires reliability analysis of the analytical system and estimation of the risk caused by the measurement error. Methodology/Principal Findings Assuming that the states of the analytical system are the reliability state, the maintenance state, the critical-failure modes and their combinations, we can define risk functions based on the mean time of the states, their measurement error and the medically acceptable measurement error. Consequently, a residual risk measure rr can be defined for each sampling time interval. The rr depends on the state probability vectors of the analytical system, the state transition probability matrices before and after each application of the QC procedure and the state mean time matrices. The optimal sampling time intervals can be defined as those that minimize a QC-related cost measure while keeping the rr acceptable. I developed an algorithm that estimates the rr for any QC sampling time interval of a QC procedure applied to analytical systems with an arbitrary number of critical-failure modes, assuming any failure time and measurement error probability density function for each mode. Furthermore, given the acceptable rr, it can estimate the optimal QC sampling time intervals. Conclusions/Significance It is possible to rationally estimate the optimal QC sampling time intervals of an analytical system to sustain an acceptable residual risk with the minimum QC-related cost. For the optimization, the reliability analysis of the analytical system and the risk analysis of the measurement error are needed. PMID:19513124

  15. Inspection-Repair based Availability Optimization of Distribution Systems using Teaching Learning based Optimization

    NASA Astrophysics Data System (ADS)

    Tiwary, Aditya; Arya, L. D.; Arya, Rajesh; Choube, S. C.

    2015-03-01

    This paper describes a technique for optimizing inspection- and repair-based availability of distribution systems. The optimum duration between two inspections has been obtained for each feeder section with respect to a cost function and subject to satisfaction of availability at each load point. Teaching-learning-based optimization has been used for availability optimization. The developed algorithm has been implemented on radial and meshed distribution systems. The results obtained have been compared with those obtained with differential evolution.
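
    A compact sketch of the teaching-learning-based optimization (TLBO) loop named above, written for a generic box-constrained continuous problem; the cost function and bounds here are placeholders, not the paper's availability model:

```python
import numpy as np

def tlbo(cost, lower, upper, pop_size=20, iters=100, seed=0):
    """Teaching-Learning-Based Optimization on a box-constrained problem."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    pop = rng.uniform(lower, upper, size=(pop_size, lower.size))
    fit = np.array([cost(x) for x in pop])

    for _ in range(iters):
        # Teacher phase: pull the class towards the best solution, away from the mean.
        teacher, mean = pop[fit.argmin()], pop.mean(axis=0)
        tf = rng.integers(1, 3)                                # teaching factor: 1 or 2
        cand = np.clip(pop + rng.random(pop.shape) * (teacher - tf * mean), lower, upper)
        cand_fit = np.array([cost(x) for x in cand])
        better = cand_fit < fit
        pop[better], fit[better] = cand[better], cand_fit[better]

        # Learner phase: each learner moves relative to a randomly chosen partner.
        for i in range(pop_size):
            j = int(rng.integers(pop_size))
            if j == i:
                continue
            step = pop[i] - pop[j] if fit[i] < fit[j] else pop[j] - pop[i]
            cand_i = np.clip(pop[i] + rng.random(lower.size) * step, lower, upper)
            f = cost(cand_i)
            if f < fit[i]:
                pop[i], fit[i] = cand_i, f

    best = int(fit.argmin())
    return pop[best], fit[best]

# Example: minimize the sphere function in three dimensions.
print(tlbo(lambda x: float(np.sum(x ** 2)), lower=[-5] * 3, upper=[5] * 3))
```

    TLBO is attractive for this kind of problem because, beyond population size and iteration count, it has no algorithm-specific tuning parameters.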

  16. Shape optimization of pulsatile ventricular assist devices using FSI to minimize thrombotic risk

    NASA Astrophysics Data System (ADS)

    Long, C. C.; Marsden, A. L.; Bazilevs, Y.

    2014-10-01

    In this paper we perform shape optimization of a pediatric pulsatile ventricular assist device (PVAD). The device simulation is carried out using fluid-structure interaction (FSI) modeling techniques within a computational framework that combines FEM for fluid mechanics and isogeometric analysis for structural mechanics modeling. The PVAD FSI simulations are performed under realistic conditions (i.e., flow speeds, pressure levels, boundary conditions, etc.), and account for the interaction of air, blood, and a thin structural membrane separating the two fluid subdomains. The shape optimization study is designed to reduce thrombotic risk, a major clinical problem in PVADs. Thrombotic risk is quantified in terms of particle residence time in the device blood chamber. Methods to compute particle residence time in the context of moving spatial domains are presented in a companion paper published in the same issue (Comput Mech, doi: 10.1007/s00466-013-0931-y, 2013). The surrogate management framework, a derivative-free pattern search optimization method that relies on surrogates for increased efficiency, is employed in this work. For the optimization study shown here, particle residence time is used to define a suitable cost or objective function, while four adjustable design optimization parameters are used to define the device geometry. The FSI-based optimization framework is implemented in a parallel computing environment, and deployed with minimal user intervention. Using five SEARCH/POLL steps the optimization scheme identifies a PVAD design with significantly better throughput efficiency than the original device.

  17. Optimal dividends in the Brownian motion risk model with interest

    NASA Astrophysics Data System (ADS)

    Fang, Ying; Wu, Rong

    2009-07-01

    In this paper, we consider a Brownian motion risk model, and in addition, the surplus earns investment income at a constant force of interest. The objective is to find a dividend policy so as to maximize the expected discounted value of dividend payments. It is well known that optimality is achieved by using a barrier strategy for unrestricted dividend rate. However, ultimate ruin of the company is certain if a barrier strategy is applied. In many circumstances this is not desirable. This consideration leads us to impose a restriction on the dividend stream. We assume that dividends are paid to the shareholders according to admissible strategies whose dividend rate is bounded by a constant. Under this additional constraint, we show that the optimal dividend strategy is formed by a threshold strategy.
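
    A hedged statement of the threshold strategy referred to above: with the dividend rate bounded by a constant M and surplus process X_t, dividends are paid at the maximal admissible rate only while the surplus exceeds a threshold b,

```latex
l_t \;=\; M\,\mathbf{1}\{X_t > b\},
```

    and nothing is paid below the threshold; the optimal barrier level b is the one maximizing the expected discounted dividend payments (the precise characterization, including the role of the interest force, is given in the paper).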

  18. Risk-based system refinement

    SciTech Connect

    Winter, V.L.; Berg, R.S.; Dalton, L.J.

    1998-06-01

    When designing a high consequence system, considerable care should be taken to ensure that the system cannot easily be placed into a high consequence failure state. A formal system design process should include a model that explicitly shows the complete state space of the system (including failure states) as well as those events (e.g., abnormal environmental conditions, component failures, etc.) that can cause a system to enter a failure state. In this paper the authors present such a model and formally develop a notion of risk-based refinement with respect to the model.

  19. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
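
    A minimal sketch of the optimization described above, with hypothetical functions expected_annual_damage(h, w) and annualized_cost(h, w) standing in for the study's hydrologic, levee fragility, and cost models:

```python
import itertools

def optimize_levee(expected_annual_damage, annualized_cost, heights, crown_widths):
    """Pick the (height, crown width) minimizing annual expected total cost."""
    best = None
    for h, w in itertools.product(heights, crown_widths):
        total = expected_annual_damage(h, w) + annualized_cost(h, w)
        if best is None or total < best[0]:
            best = (total, h, w)
    return best   # (annual expected total cost, height, crown width)

# Illustrative placeholder models, not the paper's:
ead = lambda h, w: 1000.0 / (1.0 + 0.8 * h + 0.3 * w)   # residual damage falls as the levee grows
cost = lambda h, w: 12.0 * h + 5.0 * w                    # annualized construction cost rises
print(optimize_levee(ead, cost, heights=range(1, 11), crown_widths=range(1, 11)))
```

    In the study, the expected annual damage term integrates the flood frequency distribution against both overtopping and through-seepage fragility; the grid search above is only a schematic stand-in for that calculation.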

  20. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
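
    A minimal sketch of the surrogate-based loop described above, using a least-squares quadratic response surface as the surrogate; this is a generic one-variable illustration, not the multi-objective injector workflow of the paper:

```python
import numpy as np

def expensive_model(x):
    # Placeholder for a high-fidelity simulation evaluated at design point x.
    return (x - 0.3) ** 2 + 0.1 * np.sin(8 * x)

# Design of experiments: sample the expensive model at a handful of points.
x_doe = np.linspace(0.0, 1.0, 7)
y_doe = expensive_model(x_doe)

# Surrogate construction: least-squares fit of a quadratic polynomial.
surrogate = np.poly1d(np.polyfit(x_doe, y_doe, deg=2))

# Optimization and sensitivity studies run on the cheap surrogate, not the model.
x_grid = np.linspace(0.0, 1.0, 1001)
x_star = x_grid[np.argmin(surrogate(x_grid))]
print(x_star, expensive_model(x_star))   # candidate design, then one verification run
```

    In practice, the choice of surrogate type (polynomial, kriging, radial basis functions, etc.), the design of experiments, and the validation strategy are the issues the paper discusses at length.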

  1. Estimating vegetation dryness to optimize fire risk assessment with SPOT VEGETATION satellite data in savanna ecosystems

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Somers, B.; Lhermitte, S.; van Aardt, J.; Jonckheere, I.; Coppin, P.

    2005-10-01

    The lack of information on vegetation dryness prior to the use of fire as a management tool often leads to a significant deterioration of the savanna ecosystem. This paper therefore evaluated the capacity of SPOT VEGETATION time-series to monitor the vegetation dryness (i.e., vegetation moisture content per vegetation amount) in order to optimize fire risk assessment in the savanna ecosystem of Kruger National Park in South Africa. The integrated Relative Vegetation Index approach (iRVI) to quantify the amount of herbaceous biomass at the end of the rain season and the Accumulated Relative Normalized Difference vegetation index decrement (ARND) related to vegetation moisture content were selected. The iRVI and ARND related to vegetation amount and moisture content, respectively, were combined in order to monitor vegetation dryness and optimize fire risk assessment in the savanna ecosystems. In situ fire activity data was used to evaluate the significance of the iRVI and ARND to monitor vegetation dryness for fire risk assessment. Results from the binary logistic regression analysis confirmed that the assessment of fire risk was optimized by integration of both the vegetation quantity (iRVI) and vegetation moisture content (ARND) as statistically significant explanatory variables. Consequently, the integrated use of both iRVI and ARND to monitor vegetation dryness provides a more suitable tool for fire management and suppression compared to other traditional satellite-based fire risk assessment methods, only related to vegetation moisture content.

  2. Local, Optimization-based Simplicial Mesh Smoothing

    Energy Science and Technology Software Center (ESTSC)

    1999-12-09

    OPT-MS is a C software package for the improvement and untangling of simplicial meshes (triangles in 2D, tetrahedra in 3D). Overall mesh quality is improved by iterating over the mesh vertices and adjusting their position to optimize some measure of mesh quality, such as element angle or aspect ratio. Several solution techniques (including Laplacian smoothing, "Smart" Laplacian smoothing, optimization-based smoothing and several combinations thereof) and objective functions (for example, element angle, sin (angle), and aspect ratio) are available to the user for both two and three-dimensional meshes. If the mesh contains invalid elements (those with negative area) a different optimization algorithm for mesh untangling is provided.
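
    A minimal sketch of plain Laplacian smoothing, the simplest of the solution techniques listed above; the "Smart" and optimization-based variants in OPT-MS additionally evaluate an element-quality measure before accepting or choosing a move, which is not reproduced here:

```python
import numpy as np

def laplacian_smooth(vertices, triangles, fixed, iters=10):
    """Move each free vertex to the centroid of its neighbouring vertices."""
    verts = np.array(vertices, dtype=float)
    # Build vertex adjacency from the triangle connectivity.
    neighbors = {i: set() for i in range(len(verts))}
    for a, b, c in triangles:
        neighbors[a] |= {b, c}
        neighbors[b] |= {a, c}
        neighbors[c] |= {a, b}
    for _ in range(iters):
        for i, nbrs in neighbors.items():
            if i in fixed or not nbrs:
                continue                      # boundary/fixed vertices stay put
            verts[i] = verts[list(nbrs)].mean(axis=0)
    return verts

# Tiny example: one interior vertex (index 4) surrounded by four fixed corners.
v = [(0, 0), (1, 0), (1, 1), (0, 1), (0.9, 0.9)]
t = [(0, 1, 4), (1, 2, 4), (2, 3, 4), (3, 0, 4)]
print(laplacian_smooth(v, t, fixed={0, 1, 2, 3}))   # vertex 4 moves to (0.5, 0.5)
```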

  3. Assessment of Medical Risks and Optimization of their Management using Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Madurai, Siram; Butler, Doug; Kerstman, Eric; Risin, Diana

    2008-01-01

    The Integrated Medical Model (IMM) Project is a software-based technique that will identify and quantify the medical needs and health risks of exploration crew members during space flight and evaluate the effectiveness of potential mitigation strategies. The IMM Project employs an evidence-based approach that will quantify the probability and consequences of defined in-flight medical risks, mitigation strategies, and tactics to optimize crew member health. Using stochastic techniques, the IMM will ultimately inform decision makers at both programmatic and institutional levels and will enable objective assessment of crew health and optimization of mission success using data from relevant cohort populations and from the astronaut population. The objectives of the project include: 1) identification and documentation of conditions that may occur during exploration missions (Baseline Medical Conditions List [BMCL]), 2) assessment of the likelihood of conditions in the BMCL occurring during exploration missions (incidence rate), 3) determination of the risk associated with these conditions, quantified in terms of end states (Loss of Crew, Loss of Mission, Evacuation), 4) optimization of in-flight hardware mass, volume, power, bandwidth and cost for a given level of risk or uncertainty, and 5) validation of the methodologies used.

  4. A risk-reduction approach for optimal software release time determination with the delay incurred cost

    NASA Astrophysics Data System (ADS)

    Peng, Rui; Li, Yan-Fu; Zhang, Jun-Guang; Li, Xiang

    2015-07-01

    Most existing research on software release time determination assumes that parameters of the software reliability model (SRM) are deterministic and the reliability estimate is accurate. In practice, however, there exists a risk that the reliability requirement cannot be guaranteed due to the parameter uncertainties in the SRM, and such risk can be as high as 50% when the mean value is used. It is necessary for the software project managers to reduce the risk to a lower level by delaying the software release, which inevitably increases the software testing costs. In order to incorporate the managers' preferences over these two factors, a decision model based on multi-attribute utility theory (MAUT) is developed for the determination of optimal risk-reduction release time.

  5. LP based approach to optimal stable matchings

    SciTech Connect

    Teo, Chung-Piaw; Sethuraman, J.

    1997-06-01

    We study the classical stable marriage and stable roommates problems using a polyhedral approach. We propose a new LP formulation for the stable roommates problem. This formulation is non-empty if and only if the underlying roommates problem has a stable matching. Furthermore, for certain special weight functions on the edges, we construct a 2-approximation algorithm for the optimal stable roommates problem. Our technique uses a crucial geometry of the fractional solutions in this formulation. For the stable marriage problem, we show that a related geometry allows us to express any fractional solution in the stable marriage polytope as convex combination of stable marriage solutions. This leads to a genuinely simple proof of the integrality of the stable marriage polytope. Based on these ideas, we devise a heuristic to solve the optimal stable roommates problem. The heuristic combines the power of rounding and cutting-plane methods. We present some computational results based on preliminary implementations of this heuristic.

  6. EUD-based biological optimization for carbon ion therapy

    SciTech Connect

    Brüningk, Sarah C.; Kamp, Florian; Wilkens, Jan J.

    2015-11-15

    therapy, the optimization by biological objective functions resulted in slightly superior treatment plans in terms of final EUD for the organs at risk (OARs) compared to voxel-based optimization approaches. This observation was made independent of the underlying objective function metric. An absolute gain in OAR sparing was observed for quadratic objective functions, whereas intersecting DVHs were found for logistic approaches. Even for considerable under- or overestimations of the used effect- or dose–volume parameters during the optimization, treatment plans were obtained that were of similar quality to the results of a voxel-based optimization. Conclusions: EUD-based optimization with either of the presented concepts can successfully be applied to treatment plan optimization. This makes EUD-based optimization for carbon ion therapy a useful tool to optimize more specifically in the sense of biological outcome while voxel-to-voxel variations of the biological effectiveness are still properly accounted for. This may be advantageous in terms of computational cost during treatment plan optimization but also enables a straightforward comparison of different fractionation schemes or treatment modalities.
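
    For reference, the generalized equivalent uniform dose underlying the objective functions above is commonly written in Niemierko's form (the paper's effect-based variant may differ in detail):

```latex
\mathrm{EUD} = \left(\frac{1}{N}\sum_{i=1}^{N} d_i^{\,a}\right)^{1/a},
```

    where d_i are the voxel doses (or voxel effects in an effect-based formulation), N is the number of voxels in the structure, and a is a tissue-specific parameter: large positive a emphasizes hot spots in serial organs at risk, while a = 1 reduces to the mean dose.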

  7. Base distance optimization for SQUID gradiometers

    SciTech Connect

    Garachtchenko, A.; Matlashov, A.; Kraus, R.

    1998-12-31

    The measurement of magnetic fields generated by weak nearby biomagnetic sources is affected by ambient noise generated by distant sources both internal and external to the subject under study. External ambient noise results from sources with numerous origins, many of which are unpredictable in nature. Internal noise sources are biomagnetic in nature and result from muscle activity (such as the heart, eye blinks, respiration, etc.), pulsation associated with blood flow, surgical implants, etc. Any magnetic noise will interfere with measurements of magnetic sources of interest, such as magnetoencephalography (MEG), in various ways. One of the most effective methods of reducing the magnetic noise measured by the SQUID sensor is to use properly designed superconducting gradiometers. Here, the authors optimized the baseline length of SQUID-based symmetric axial gradiometers using computer simulation. The signal-to-noise ratio (SNR) was used as the optimization criterion. They found that in most cases the optimal baseline is not equal to the depth of the primary source, rather it has a more complex dependence on the gradiometer balance and the ambient magnetic noise. They studied both first and second order gradiometers in simulated shielded environments and only second order gradiometers in a simulated unshielded environment. The noise source was simulated as a distant dipolar source for the shielded cases. They present optimal gradiometer baseline lengths for the various simulated situations below.

  8. Optimal network solution for proactive risk assessment and emergency response

    NASA Astrophysics Data System (ADS)

    Cai, Tianxing

    Coupled with continuous development in the field of industrial operation management, the requirement for operation optimization in large-scale manufacturing networks has provoked growing interest in engineering research. Compared with the traditional approach of taking remedial measures after an emergency event or abnormal situation has occurred, current operation control calls for more proactive risk assessment to set up early warning systems and comprehensive emergency response planning. Among all industries, the chemical and energy industries are more likely to face abnormal and emergency situations due to their characteristics. Therefore, the purpose of this study is to develop methodologies to aid emergency response planning and proactive risk assessment in these two industries. The efficacy of the developed methodologies is demonstrated via two real industrial problems. The first case handles energy network dispatch optimization under the emergency of a local energy shortage caused by extreme conditions such as earthquakes, tsunamis, and hurricanes, which may cause local areas to suffer from delayed rescues, widespread power outages, tremendous economic losses, and even public safety threats. In such urgent events of local energy shortage, agile energy dispatching through an effective energy transportation network, targeting the minimum energy recovery time, should be a top priority. The second case is a scheduling methodology to coordinate multiple chemical plants' start-ups in order to minimize regional air quality impacts under extreme meteorological conditions. The objective is to reschedule the multi-plant start-up sequence to achieve the minimum sum of delay time compared to the expected start-up time of each plant. All these approaches can provide quantitative decision support for multiple stakeholders, including government and environment agencies, chemical industry, energy industry and local

  9. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced, with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces the risk influenced by uncertainty and sensitivity, among other factors. The third, the hazardous risk coefficient, accounts for anticipated hazards that may occur in the future; its risk is deduced from criteria for consequences on safety, environment, maintenance and economics, with corresponding costs for those consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence, random number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and then the final rankings of the critical items are estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  10. Optimizing footwear for older people at risk of falls.

    PubMed

    Menant, Jasmine C; Steele, Julie R; Menz, Hylton B; Munro, Bridget J; Lord, Stephen R

    2008-01-01

    Footwear influences balance and the subsequent risk of slips, trips, and falls by altering somatosensory feedback to the foot and ankle and modifying frictional conditions at the shoe/floor interface. Walking indoors barefoot or in socks and walking indoors or outdoors in high-heel shoes have been shown to increase the risk of falls in older people. Other footwear characteristics such as heel collar height, sole hardness, and tread and heel geometry also influence measures of balance and gait. Because many older people wear suboptimal shoes, maximizing safe shoe use may offer an effective fall prevention strategy. Based on findings of a systematic literature review, older people should wear shoes with low heels and firm slip-resistant soles both inside and outside the home. Future research should investigate the potential benefits of tread sole shoes for preventing slips and whether shoes with high collars or flared soles can enhance balance when challenging tasks are undertaken. PMID:19235118

  11. Optimizing the plant-based diet.

    PubMed

    Mann, J I

    2000-09-01

    Any attempt to optimize a plant-based diet necessitates an identification of the features of the diet which confer benefit as well as any which may be associated with detrimental effects. The former task is more difficult than might be assumed as there is no doubt that some of the apparent health benefits observed amongst vegetarians are a consequence of environmental determinants of health which characterize groups of people who choose vegetarian diets, rather than dietary practices. This review will consider the major health benefits of plant-based diets, the specific foods or nutrients which confer the benefits as far as can be ascertained from present knowledge, potential nutrient deficiencies associated with a plant-based diet and nutritional strategies that can be employed to prevent any such deficiencies. PMID:24398280

  12. Optimal halftoning for network-based imaging

    NASA Astrophysics Data System (ADS)

    Ostromoukhov, Victor

    2000-12-01

    In this contribution, we introduce a multiple depth progressive representation for network-based still and moving images. A simple quantization algorithm associated with this representation provides optimal image quality. By optimum, we mean the best possible visual quality for a given value of information under real-life constraints such as physical, psychological, or legal constraints. A special variant of the algorithm, multi-depth coherent error diffusion, addresses a specific problem of temporal coherence between frames in moving images. The output produced with our algorithm is visually pleasant because its Fourier spectrum is close to the 'blue noise'.

  13. GPU-based ultrafast IMRT plan optimization.

    PubMed

    Men, Chunhua; Gu, Xuejun; Choi, Dongju; Majumdar, Amitava; Zheng, Ziyi; Mueller, Klaus; Jiang, Steve B

    2009-11-01

    The widespread adoption of on-board volumetric imaging in cancer radiotherapy has stimulated research efforts to develop online adaptive radiotherapy techniques to handle the inter-fraction variation of the patient's geometry. Such efforts face major technical challenges to perform treatment planning in real time. To overcome this challenge, we are developing a supercomputing online re-planning environment (SCORE) at the University of California, San Diego (UCSD). As part of the SCORE project, this paper presents our work on the implementation of an intensity-modulated radiation therapy (IMRT) optimization algorithm on graphics processing units (GPUs). We adopt a penalty-based quadratic optimization model, which is solved by using a gradient projection method with Armijo's line search rule. Our optimization algorithm has been implemented in CUDA for parallel GPU computing as well as in C for serial CPU computing for comparison purpose. A prostate IMRT case with various beamlet and voxel sizes was used to evaluate our implementation. On an NVIDIA Tesla C1060 GPU card, we have achieved speedup factors of 20-40 without losing accuracy, compared to the results from an Intel Xeon 2.27 GHz CPU. For a specific nine-field prostate IMRT case with 5 x 5 mm(2) beamlet size and 2.5 x 2.5 x 2.5 mm(3) voxel size, our GPU implementation takes only 2.8 s to generate an optimal IMRT plan. Our work has therefore solved a major problem in developing online re-planning technologies for adaptive radiotherapy. PMID:19826201
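
    A minimal sketch of the optimization scheme named above, a gradient projection method with Armijo backtracking applied to a penalty-based quadratic objective, written here for plain NumPy on the CPU; the paper's contribution is the CUDA implementation of this type of solver, which is not reproduced here:

```python
import numpy as np

def projected_gradient(D, d_presc, w, x0, iters=200, beta=0.5, sigma=1e-4):
    """Minimize f(x) = 0.5 * sum_v w_v * ((D x - d_presc)_v)^2 subject to x >= 0."""
    x = np.maximum(np.asarray(x0, float), 0.0)
    for _ in range(iters):
        r = D @ x - d_presc                       # per-voxel dose residual
        f = 0.5 * np.sum(w * r ** 2)
        grad = D.T @ (w * r)
        step = 1.0
        while True:                               # Armijo backtracking line search
            x_new = np.maximum(x - step * grad, 0.0)      # gradient step + projection
            r_new = D @ x_new - d_presc
            f_new = 0.5 * np.sum(w * r_new ** 2)
            if f_new <= f + sigma * grad @ (x_new - x) or step < 1e-12:
                break
            step *= beta
        x = x_new
    return x

# Toy example: 3 voxels, 2 beamlet weights, unit penalty weights, unit prescription.
D = np.array([[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]])
print(projected_gradient(D, d_presc=np.ones(3), w=np.ones(3), x0=np.zeros(2)))
```

    The per-iteration work is dominated by the dose matrix products D x and D^T r, which is exactly the part that maps well onto a GPU.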

  14. GPU-based ultrafast IMRT plan optimization

    NASA Astrophysics Data System (ADS)

    Men, Chunhua; Gu, Xuejun; Choi, Dongju; Majumdar, Amitava; Zheng, Ziyi; Mueller, Klaus; Jiang, Steve B.

    2009-11-01

    The widespread adoption of on-board volumetric imaging in cancer radiotherapy has stimulated research efforts to develop online adaptive radiotherapy techniques to handle the inter-fraction variation of the patient's geometry. Such efforts face major technical challenges to perform treatment planning in real time. To overcome this challenge, we are developing a supercomputing online re-planning environment (SCORE) at the University of California, San Diego (UCSD). As part of the SCORE project, this paper presents our work on the implementation of an intensity-modulated radiation therapy (IMRT) optimization algorithm on graphics processing units (GPUs). We adopt a penalty-based quadratic optimization model, which is solved by using a gradient projection method with Armijo's line search rule. Our optimization algorithm has been implemented in CUDA for parallel GPU computing as well as in C for serial CPU computing for comparison purpose. A prostate IMRT case with various beamlet and voxel sizes was used to evaluate our implementation. On an NVIDIA Tesla C1060 GPU card, we have achieved speedup factors of 20-40 without losing accuracy, compared to the results from an Intel Xeon 2.27 GHz CPU. For a specific nine-field prostate IMRT case with 5 × 5 mm2 beamlet size and 2.5 × 2.5 × 2.5 mm3 voxel size, our GPU implementation takes only 2.8 s to generate an optimal IMRT plan. Our work has therefore solved a major problem in developing online re-planning technologies for adaptive radiotherapy.

  15. Cytogenetic bases for risk inference

    SciTech Connect

    Bender, M A

    1980-01-01

    Various environmental pollutants are suspected of being capable of causing cancers or genetic defects even at low levels of exposure. In order to estimate risk from exposure to these pollutants, it would be useful to have some indicator of exposure. It is suggested that chromosomes are ideally suited for this purpose. Through the phenomena of chromosome aberrations and sister chromatid exchanges (SCE), chromosomes respond to virtually all carcinogens and mutagens. Aberrations and SCE are discussed in the context of their use as indicators of increased risk to health by chemical pollutants. (ACR)

  16. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.

  17. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. PMID:26033352
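
    A minimal sketch of the mean-variance idea behind the allocation problem described above, comparing closed-form minimum-variance weights with the equal-allocation heuristic; the covariance values are illustrative, and the paper's point is precisely that estimation error in such inputs can make the "optimal" weights unreliable:

```python
import numpy as np

def min_variance_weights(cov):
    """Fully-allocated minimum-variance weights: w = C^-1 1 / (1' C^-1 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Illustrative covariance of sampling "risk" across three producer groups.
cov = np.array([[0.040, 0.006, 0.004],
                [0.006, 0.090, 0.010],
                [0.004, 0.010, 0.160]])
for name, w in [("optimized", min_variance_weights(cov)), ("equal", np.full(3, 1 / 3))]:
    print(name, np.round(w, 3), "portfolio variance:", round(float(w @ cov @ w), 4))
```

    With a short estimation window the covariance estimate itself is noisy, and the out-of-sample risk of the "optimized" weights can exceed that of the naive equal split, which is the behavior the abstract highlights.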

  18. Optimal caching algorithm based on dynamic programming

    NASA Astrophysics Data System (ADS)

    Guo, Changjie; Xiang, Zhe; Zhong, Yuzhuo; Long, Jidong

    2001-07-01

    With the dramatic growth of multimedia streams, the efficient distribution of stored videos has become a major concern. There are two basic caching strategies: the whole caching strategy and the caching strategy based on layered encoded video, the latter can satisfy the requirement of the highly heterogeneous access to the Internet. Conventional caching strategies assign each object a cache gain by calculating popularity or density popularity, and determine which videos and which layers should be cached. In this paper, we first investigate the delivery model of stored video based on proxy, and propose two novel caching algorithms, DPLayer (for layered encoded caching scheme) and DPWhole (for whole caching scheme) for multimedia proxy caching. The two algorithms are based on the resource allocation model of dynamic programming to select the optimal subset of objects to be cached in proxy. Simulation proved that our algorithms achieve better performance than other existing schemes. We also analyze the computational complexity and space complexity of the algorithms, and introduce a regulative parameter to compress the states space of the dynamic programming problem and reduce the complexity of algorithms.
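
    A minimal sketch of the dynamic-programming resource-allocation idea described above, cast as a 0/1 knapsack over whole objects; sizes, gains, and proxy capacity are illustrative, and the paper's DPLayer variant additionally handles the layered case:

```python
def optimal_cache(sizes, gains, capacity):
    """Choose the subset of objects maximizing total cache gain within capacity."""
    n = len(sizes)
    dp = [0] * (capacity + 1)                       # dp[c]: best gain with capacity c
    keep = [[False] * (capacity + 1) for _ in range(n)]
    for i in range(n):
        for c in range(capacity, sizes[i] - 1, -1): # descend so each object is used once
            if dp[c - sizes[i]] + gains[i] > dp[c]:
                dp[c] = dp[c - sizes[i]] + gains[i]
                keep[i][c] = True
    chosen, c = [], capacity                        # recover the selected objects
    for i in range(n - 1, -1, -1):
        if keep[i][c]:
            chosen.append(i)
            c -= sizes[i]
    return dp[capacity], sorted(chosen)

# Example: object sizes (cache units), cache-gain values, proxy capacity of 10 units.
print(optimal_cache(sizes=[4, 3, 5, 2], gains=[10, 7, 12, 4], capacity=10))  # (23, [1, 2, 3])
```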

  19. Risk-sensitive optimal feedback control accounts for sensorimotor behavior under uncertainty.

    PubMed

    Nagengast, Arne J; Braun, Daniel A; Wolpert, Daniel M

    2010-01-01

    Many aspects of human motor behavior can be understood using optimality principles such as optimal feedback control. However, these proposed optimal control models are risk-neutral; that is, they are indifferent to the variability of the movement cost. Here, we propose the use of a risk-sensitive optimal controller that incorporates movement cost variance either as an added cost (risk-averse controller) or as an added value (risk-seeking controller) to model human motor behavior in the face of uncertainty. We use a sensorimotor task to test the hypothesis that subjects are risk-sensitive. Subjects controlled a virtual ball undergoing Brownian motion towards a target. Subjects were required to minimize an explicit cost, in points, that was a combination of the final positional error of the ball and the integrated control cost. By testing subjects on different levels of Brownian motion noise and relative weighting of the position and control cost, we could distinguish between risk-sensitive and risk-neutral control. We show that subjects change their movement strategy pessimistically in the face of increased uncertainty in accord with the predictions of a risk-averse optimal controller. Our results suggest that risk-sensitivity is a fundamental attribute that needs to be incorporated into optimal feedback control models. PMID:20657657
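
    A hedged sketch of the standard exponential form of risk-sensitive control referred to above (the paper's task-specific formulation is not reproduced): the controller optimizes

```latex
J_\theta \;=\; \frac{2}{\theta}\,\log \mathbb{E}\!\left[\exp\!\Big(\tfrac{\theta}{2}\,C\Big)\right]
\;\approx\; \mathbb{E}[C] \;+\; \frac{\theta}{4}\,\operatorname{Var}[C],
```

    where C is the accumulated movement cost; theta > 0 penalizes cost variance (risk-averse), theta < 0 rewards it (risk-seeking), and theta -> 0 recovers the risk-neutral expected cost used by classical optimal feedback control.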

  20. Risk-Sensitive Optimal Feedback Control Accounts for Sensorimotor Behavior under Uncertainty

    PubMed Central

    Nagengast, Arne J.; Braun, Daniel A.; Wolpert, Daniel M.

    2010-01-01

    Many aspects of human motor behavior can be understood using optimality principles such as optimal feedback control. However, these proposed optimal control models are risk-neutral; that is, they are indifferent to the variability of the movement cost. Here, we propose the use of a risk-sensitive optimal controller that incorporates movement cost variance either as an added cost (risk-averse controller) or as an added value (risk-seeking controller) to model human motor behavior in the face of uncertainty. We use a sensorimotor task to test the hypothesis that subjects are risk-sensitive. Subjects controlled a virtual ball undergoing Brownian motion towards a target. Subjects were required to minimize an explicit cost, in points, that was a combination of the final positional error of the ball and the integrated control cost. By testing subjects on different levels of Brownian motion noise and relative weighting of the position and control cost, we could distinguish between risk-sensitive and risk-neutral control. We show that subjects change their movement strategy pessimistically in the face of increased uncertainty in accord with the predictions of a risk-averse optimal controller. Our results suggest that risk-sensitivity is a fundamental attribute that needs to be incorporated into optimal feedback control models. PMID:20657657

  1. Risk based inspection for atmospheric storage tank

    NASA Astrophysics Data System (ADS)

    Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is an attack that occurs on a metallic material as a result of a reaction with its environment. Thus, it causes atmospheric storage tank leakage, material loss, environmental pollution, equipment failure, affects the age of process equipment and, finally, causes financial damage. Corrosion risk measurement becomes a vital part of asset management at the plant for operating any aging asset. This paper provides six case studies dealing with high speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries was discussed prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion on the equipment, in terms of its likelihood and its consequences, was discussed. The outcome of the corrosion risk analysis was used to formulate a Risk Based Inspection (RBI) method that should be a part of the atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on equipment in the `High Risk' and `Medium Risk' categories and less on the `Low Risk' shell. Risk categories of the evaluated equipment were illustrated through the case study analysis outcome.

  2. Multi-Point Combinatorial Optimization Method with Distance Based Interaction

    NASA Astrophysics Data System (ADS)

    Yasuda, Keiichiro; Jinnai, Hiroyuki; Ishigame, Atsushi

    This paper proposes a multi-point combinatorial optimization method based on the Proximate Optimality Principle (POP), a method that has several advantages for solving large-scale combinatorial optimization problems. The proposed algorithm uses not only the distance between search points but also the interaction among search points in order to exploit POP in several types of combinatorial optimization problems. The proposed algorithm is applied to several typical combinatorial optimization problems, a knapsack problem, a traveling salesman problem, and a flow shop scheduling problem, in order to verify its performance. The simulation results indicate that the proposed method achieves higher solution quality than conventional combinatorial optimization methods.

  3. Time-response-based evolutionary optimization

    NASA Astrophysics Data System (ADS)

    Avigad, Gideon; Goldvard, Alex; Salomon, Shaul

    2015-04-01

    Solutions to engineering problems are often evaluated by considering their time responses; thus, each solution is associated with a function. To avoid optimizing the functions, such optimization is usually carried out by setting auxiliary objectives (e.g. minimal overshoot). Therefore, in order to find different optimal solutions, alternative auxiliary optimization objectives may have to be defined prior to optimization. In the current study, a new approach is suggested that avoids the need to define auxiliary objectives. An algorithm is suggested that enables the optimization of solutions according to their transient behaviours. For this optimization, the functions are sampled and the problem is posed as a multi-objective problem. The recently introduced algorithm NSGA-II-PSA is adopted and tailored to solve it. Mathematical as well as engineering problems are utilized to explain and demonstrate the approach and its applicability to real life problems. The results highlight the advantages of avoiding the definition of artificial objectives.

  4. Community based intervention to optimize osteoporosis management: randomized controlled trial

    PubMed Central

    2010-01-01

    Background Osteoporosis-related fractures are a significant public health concern. Interventions that increase detection and treatment of osteoporosis are underutilized. This pragmatic randomised study was done to evaluate the impact of a multifaceted community-based care program aimed at optimizing evidence-based management in patients at risk for osteoporosis and fractures. Methods This was a 12-month randomized trial performed in Ontario, Canada. Eligible patients were community-dwelling, aged ≥55 years, and identified to be at risk for osteoporosis-related fractures. Two hundred and one patients were allocated to the intervention group or to usual care. Components of the intervention were directed towards primary care physicians and patients and included facilitated bone mineral density testing, patient education and patient-specific recommendations for osteoporosis treatment. The primary outcome was the implementation of appropriate osteoporosis management. Results 101 patients were allocated to intervention and 100 to control. Mean age of participants was 71.9 ± 7.2 years and 94% were women. Pharmacological treatment (alendronate, risedronate, or raloxifene) for osteoporosis was increased by 29% compared to usual care (56% [29/52] vs. 27% [16/60]; relative risk [RR] 2.09, 95% confidence interval [CI] 1.29 to 3.40). More individuals in the intervention group were taking calcium (54% [54/101] vs. 20% [20/100]; RR 2.67, 95% CI 1.74 to 4.12) and vitamin D (33% [33/101] vs. 20% [20/100]; RR 1.63, 95% CI 1.01 to 2.65). Conclusions A multi-faceted community-based intervention improved management of osteoporosis in high risk patients compared with usual care. Trial Registration This trial has been registered with clinicaltrials.gov (ID: NCT00465387) PMID:20799973

  5. CFD based draft tube hydraulic design optimization

    NASA Astrophysics Data System (ADS)

    McNabb, J.; Devals, C.; Kyriacou, S. A.; Murry, N.; Mullins, B. F.

    2014-03-01

    The draft tube design of a hydraulic turbine, particularly in low to medium head applications, plays an important role in determining the efficiency and power characteristics of the overall machine, since an important proportion of the available energy, being in kinetic form leaving the runner, needs to be recovered by the draft tube into static head. For large units, these efficiency and power characteristics can equate to large sums of money when considering the anticipated selling price of the energy produced over the machine's life-cycle. This same draft tube design is also a key factor in determining the overall civil costs of the powerhouse, primarily in excavation and concreting, which can amount to similar orders of magnitude as the price of the energy produced. Therefore, there is a need to find the optimum compromise between these two conflicting requirements. In this paper, an elaborate approach is described for dealing with this optimization problem. First, the draft tube's detailed geometry is defined as a function of a comprehensive set of design parameters (about 20 of which a subset is allowed to vary during the optimization process) and are then used in a non-uniform rational B-spline based geometric modeller to fully define the wetted surfaces geometry. Since the performance of the draft tube is largely governed by 3D viscous effects, such as boundary layer separation from the walls and swirling flow characteristics, which in turn governs the portion of the available kinetic energy which will be converted into pressure, a full 3D meshing and Navier-Stokes analysis is performed for each design. What makes this even more challenging is the fact that the inlet velocity distribution to the draft tube is governed by the runner at each of the various operating conditions that are of interest for the exploitation of the powerhouse. In order to determine these inlet conditions, a combined steady-state runner and an initial draft tube analysis, using a

  6. Probability-Based Software for Grid Optimization: Improved Power System Operations Using Advanced Stochastic Optimization

    SciTech Connect

    2012-02-24

    GENI Project: Sandia National Laboratories is working with several commercial and university partners to develop software for market management systems (MMSs) that enable greater use of renewable energy sources throughout the grid. MMSs are used to securely and optimally determine which energy resources should be used to service energy demand across the country. Contributions of electricity to the grid from renewable energy sources such as wind and solar are intermittent, introducing complications for MMSs, which have trouble accommodating the multiple sources of price and supply uncertainties associated with bringing these new types of energy into the grid. Sandia’s software will bring a new, probability-based formulation to account for these uncertainties. By factoring in various probability scenarios for electricity production from renewable energy sources in real time, Sandia’s formula can reduce the risk of inefficient electricity transmission, save ratepayers money, conserve power, and support the future use of renewable energy.

  7. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plan construction and retrofit expenses, and potential tax relief, expressed probabilistically as the net present value distributions over various forecast horizons.

  8. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.

  9. Science, science policy, and risk-based management

    SciTech Connect

    Midgley, L.P.

    1997-09-01

    Recent national awareness of the economic infeasibility of remediating hazardous waste sites to background levels has sparked increased interest in the role of science policy in the environmental risk assessment and risk management process. As individual states develop guidelines for addressing environmental risks at hazardous waste sites, the role of science policy decisions and uncertainty must be carefully evaluated to achieve long-term environmental goals and solutions that are economically feasible and optimally beneficial to all stakeholders. Amendment to Oregon Revised Statute 465.315 establishes policy and Utah Cleanup Action and Risk-Based Closure Standards sets requirements for risk-based cleanup and closure at sites where remediation or removal of hazardous constituents to background levels will not be achieved. This paper discusses the difficulties in effectively integrating potential current and future impacts on human health and the environment, technical feasibility, economic considerations, and political realities into environmental policy and standards, using these references as models. This paper considers the role of both objective and subjective criteria in the risk-based closure and management processes and makes suggestions for improving the system by which these sites may be reclaimed.

  10. 12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Risk-Based Capital Guidelines; Market Risk B... ADEQUACY STANDARDS Pt. 3, App. B Appendix B to Part 3—Risk-Based Capital Guidelines; Market Risk Section... Application of the Market Risk Capital Rule Section 4Adjustments to the Risk-Based Capital Ratio...

  11. Risk Classification and Risk-based Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as Class D minus are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  12. Bell-Curve Based Evolutionary Optimization Algorithm

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Laba, K.; Kincaid, R.

    1998-01-01

    The paper presents an optimization algorithm that falls in the category of genetic, or evolutionary, algorithms. While bit exchange is the basis of most Genetic Algorithms (GA) in research and applications in America, some alternatives, also in the category of evolutionary algorithms but using a direct, geometrical approach, have gained popularity in Europe and Asia. The Bell-Curve Based Evolutionary Algorithm (BCB) is in this alternative category and is distinguished by the use of a combination of n-dimensional geometry and the normal distribution, the bell-curve, in the generation of the offspring. The tool for creating a child is a geometrical construct comprising a line connecting two parents and a weighted point on that line. The point that defines the child deviates from the weighted point in two directions: parallel and orthogonal to the connecting line, the deviation in each direction obeying a probabilistic distribution. Tests showed satisfactory performance of BCB. The principal advantage of BCB is its controllability via the normal distribution parameters and the geometrical construct variables.
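    A minimal sketch of the geometric offspring construction described above; the blending weight, the standard deviations, and their scaling by the parent distance are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def bcb_child(p1, p2, sigma_par=0.1, sigma_orth=0.1):
    """Generate one child from two parents with the bell-curve construct.

    A weighted point on the line joining the parents is perturbed along the
    line (parallel) and perpendicular to it (orthogonal); each deviation is
    drawn from a normal distribution. sigma_par / sigma_orth are illustrative.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    w = rng.uniform()                          # weight of the point on the connecting line
    base = (1 - w) * p1 + w * p2
    d = p2 - p1
    dist = np.linalg.norm(d)
    u = d / dist if dist > 0 else rng.standard_normal(p1.size)
    r = rng.standard_normal(p1.size)           # random direction orthogonal to the line
    r -= r.dot(u) * u
    r /= np.linalg.norm(r)
    return base + rng.normal(0, sigma_par) * dist * u + rng.normal(0, sigma_orth) * dist * r

child = bcb_child([0.0, 0.0, 0.0], [1.0, 2.0, 0.5])
```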

  13. Temporal variation of optimal UV exposure time over Korea: risks and benefits of surface UV radiation

    NASA Astrophysics Data System (ADS)

    Lee, Y. G.; Koo, J. H.

    2015-12-01

    Solar UV radiation in the wavelength range between 280 and 400 nm has both positive and negative influences on the human body. Surface UV radiation is the main natural source of vitamin D, promoting bone and musculoskeletal health and reducing the risk of a number of cancers and other medical conditions. However, overexposure to surface UV radiation is strongly associated with the majority of skin cancers, as well as other negative health effects such as sunburn, skin aging, and some forms of eye cataracts. Therefore, it is important to estimate the optimal UV exposure time, representing a balance between reducing negative health effects and maximizing sufficient vitamin D production. Previous studies calculated erythemal UV and vitamin-D UV from the measured and modelled spectral irradiances, respectively, by weighting with the CIE erythema and vitamin D3 generation functions (Kazantzidis et al., 2009; Fioletov et al., 2010). In particular, McKenzie et al. (2009) suggested an algorithm to estimate vitamin-D production UV from erythemal UV (or UV index) and determined optimum UV exposure conditions for skin type II according to Fitzpatrick (1988). Recently, there has been growing demand for information on the risks and benefits of surface UV radiation for public health in Korea, so it is necessary to estimate an optimal UV exposure time suited to the skin types of East Asians. This study examined the relationship between erythemally weighted UV (UVEry) and vitamin D weighted UV (UVVitD) over Korea during 2004-2012. The temporal variations of the ratio (UVVitD/UVEry) were also analyzed, and the ratio as a function of UV index was applied in estimating the optimal UV exposure time. In summer, with high surface UV radiation, a short exposure time led to sufficient vitamin D as well as erythema, and vice versa in winter; thus, the balancing time in winter was long enough to maximize UV benefits while minimizing UV risks.
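    As a rough illustration of how an exposure-time estimate can be derived from the UV index, the sketch below uses the standard conversions 1 UVI = 25 mW m⁻² of erythemally weighted irradiance and 1 SED = 100 J m⁻²; the dose budget is a hypothetical placeholder, not the study's skin-type-specific threshold, and the vitamin-D side would enter through the UVVitD/UVEry ratio derived in the paper.

```python
def minutes_for_erythemal_dose(uv_index, dose_sed=1.0):
    """Minutes needed to accumulate `dose_sed` standard erythemal doses (SED).

    Uses 1 UVI = 0.025 W/m^2 erythemally weighted and 1 SED = 100 J/m^2.
    The 1-SED budget is an illustrative placeholder, not a study value.
    """
    ery_irradiance = 0.025 * uv_index          # W/m^2, erythemally weighted
    if ery_irradiance <= 0:
        return float("inf")
    return dose_sed * 100.0 / ery_irradiance / 60.0

# The vitamin-D weighted dose accumulated over the same interval would scale by
# the (study-derived) UVVitD/UVEry ratio for the given UV index and season.
print(f"{minutes_for_erythemal_dose(uv_index=8):.1f} min for 1 SED at UVI 8")
```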

  14. Optimizing Assurance: The Risk Regulation System in Relationships

    ERIC Educational Resources Information Center

    Murray, Sandra L.; Holmes, John G.; Collins, Nancy L.

    2006-01-01

    A model of risk regulation is proposed to explain how people balance the goal of seeking closeness to a romantic partner against the opposing goal of minimizing the likelihood and pain of rejection. The central premise is that confidence in a partner's positive regard and caring allows people to risk seeking dependence and connectedness. The risk…

  15. Model-based optimization of ultrasonic transducers.

    PubMed

    Heikkola, Erkki; Laitinen, Mika

    2005-01-01

    Numerical simulation and automated optimization of Langevin-type ultrasonic transducers are investigated. These kinds of transducers are standard components in various applications of high-power ultrasonics, such as ultrasonic cleaning and chemical processing. Vibration of the transducer is simulated numerically by the standard finite element method, and the dimensions and shape parameters of a transducer are optimized with respect to different criteria. The novelty of this work is the combination of the simulation model and the optimization problem through efficient automatic differentiation techniques. The capabilities of this approach are demonstrated with practical test cases in which various aspects of the operation of a transducer are improved. PMID:15474952

  16. Risk-based and deterministic regulation

    SciTech Connect

    Fischer, L.E.; Brown, N.W.

    1995-07-01

    Both risk-based and deterministic methods are used for regulating the nuclear industry to protect the public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is a need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk-based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose.

  17. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  18. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  19. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  20. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  1. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  2. Impact of an integrated treatment algorithm based on platelet function testing and clinical risk assessment: results of the TRIAGE Patients Undergoing Percutaneous Coronary Interventions To Improve Clinical Outcomes Through Optimal Platelet Inhibition study.

    PubMed

    Chandrasekhar, Jaya; Baber, Usman; Mehran, Roxana; Aquino, Melissa; Sartori, Samantha; Yu, Jennifer; Kini, Annapoorna; Sharma, Samin; Skurk, Carsten; Shlofmitz, Richard A; Witzenbichler, Bernhard; Dangas, George

    2016-08-01

    Assessment of platelet reactivity alone for thienopyridine selection with percutaneous coronary intervention (PCI) has not been associated with improved outcomes. In TRIAGE, a prospective multicenter observational pilot study, we sought to evaluate the benefit of an integrated algorithm combining clinical risk and platelet function testing to select the type of thienopyridine in patients undergoing PCI. Patients on chronic clopidogrel therapy underwent platelet function testing prior to PCI using the VerifyNow assay to determine high on-treatment platelet reactivity (HTPR, ≥230 P2Y12 reactivity units or PRU). Based on both PRU and clinical (ischemic and bleeding) risks, patients were switched to prasugrel or continued on clopidogrel per the study algorithm. The primary endpoints were (i) 1-year major adverse cardiovascular events (MACE), a composite of death, non-fatal myocardial infarction, or definite or probable stent thrombosis; and (ii) major bleeding, Bleeding Academic Research Consortium type 2, 3 or 5. Out of 318 clopidogrel-treated patients with a mean age of 65.9 ± 9.8 years, HTPR was noted in 33.3%. Ninety (28.0%) patients overall were switched to prasugrel and 228 (72.0%) continued clopidogrel. The prasugrel group had fewer smokers and more patients with heart failure. At 1 year, MACE occurred in 4.4% of the predominantly HTPR patients on prasugrel versus 3.5% of the primarily non-HTPR patients on clopidogrel (p = 0.7). Major bleeding (5.6 vs 7.9%, p = 0.47) was numerically higher with clopidogrel compared with prasugrel. Use of the study clinical risk algorithm for choice and intensity of thienopyridine prescription following PCI resulted in similar ischemic outcomes in HTPR patients receiving prasugrel and primarily non-HTPR patients on clopidogrel, without an untoward increase in bleeding with prasugrel. However, the study was prematurely terminated and these findings are therefore hypothesis generating. PMID:27100112
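    The published abstract does not give the full decision table, so the following is only a schematic of the kind of logic such an algorithm applies: the 230 PRU cutoff is taken from the abstract, while the way clinical ischemic and bleeding risk is combined here is a hypothetical simplification, not the actual TRIAGE algorithm.

```python
def select_thienopyridine(pru, high_ischemic_risk, high_bleeding_risk):
    """Schematic selection combining platelet function and clinical risk.

    pru >= 230 defines high on-treatment platelet reactivity (HTPR), per the
    VerifyNow cutoff quoted in the abstract. The two boolean clinical-risk flags
    and the branching below are hypothetical stand-ins for the study algorithm.
    """
    htpr = pru >= 230
    if htpr and not high_bleeding_risk:
        return "switch to prasugrel"
    if htpr and high_bleeding_risk:
        return "continue clopidogrel (bleeding risk outweighs HTPR)"
    if not htpr and high_ischemic_risk and not high_bleeding_risk:
        return "consider prasugrel"
    return "continue clopidogrel"

print(select_thienopyridine(pru=260, high_ischemic_risk=True, high_bleeding_risk=False))
```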

  3. Optimal Hedge for Nodal Price Risk using FTR

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroaki; Makino, Michiko; Ichida, Yoshio; Akiyoshi, Masanori

    As the deregulation of the electricity business proceeds, each company needs to construct a risk hedging system. So far, many companies have not addressed this sufficiently. In this paper, we address the nodal price hedging issue. Most companies bear risk from nodal prices, which tend to be highly volatile. There is little doubt that such a company needs hedge products to stabilize its profits. We suggest the use of FTRs (financial transmission rights) for this purpose. First, we briefly describe the mechanisms of nodal pricing in the PJM market and of FTRs, and present the mathematical formulations. Then we show some numerical examples and discuss our findings.
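    The abstract omits the formulations, but the basic FTR hedge mechanics are standard: an obligation FTR from source to sink pays its holder the nodal price difference times the contracted MW, offsetting the congestion charge on the same path. A minimal sketch with illustrative prices (not PJM data):

```python
def congestion_charge(lmp_source, lmp_sink, mw):
    """Hourly congestion cost of delivering `mw` from source to sink."""
    return (lmp_sink - lmp_source) * mw

def ftr_payoff(lmp_source, lmp_sink, mw_ftr):
    """Hourly payoff of an obligation FTR from source to sink for `mw_ftr`."""
    return (lmp_sink - lmp_source) * mw_ftr

# Illustrative hour in which the sink price spikes relative to the source.
charge = congestion_charge(lmp_source=30.0, lmp_sink=55.0, mw=100)
hedge = ftr_payoff(lmp_source=30.0, lmp_sink=55.0, mw_ftr=100)
print(charge - hedge)   # fully hedged path: net congestion exposure is 0
```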

  4. Optimal separable bases and molecular collisions

    SciTech Connect

    Poirier, L W

    1997-12-01

    A new methodology is proposed for the efficient determination of Green's functions and eigenstates for quantum systems of two or more dimensions. For a given Hamiltonian, the best possible separable approximation is obtained from the set of all Hilbert space operators. It is shown that this determination itself, as well as the solution of the resultant approximation, are problems of reduced dimensionality for most systems of physical interest. Moreover, the approximate eigenstates constitute the optimal separable basis, in the sense of self-consistent field theory. These distorted waves give rise to a Born series with optimized convergence properties. Analytical results are presented for an application of the method to the two-dimensional shifted harmonic oscillator system. The primary interest, however, is quantum reactive scattering in molecular systems. For numerical calculations, the use of distorted waves corresponds to numerical preconditioning. The new methodology therefore gives rise to an optimized preconditioning scheme for the efficient calculation of reactive and inelastic scattering amplitudes, especially at intermediate energies. This scheme is particularly suited to discrete variable representations (DVRs) and iterative sparse matrix methods commonly employed in such calculations. State-to-state and cumulative reactive scattering results obtained via the optimized preconditioner are presented for the two-dimensional collinear H + H₂ → H₂ + H system. Computational time and memory requirements for this system are drastically reduced in comparison with other methods, and results are obtained for previously prohibitive energy regimes.

  5. Optimization of agricultural field workability predictions for improved risk management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Risks introduced by weather variability are key considerations in agricultural production. The sensitivity of agriculture to weather variability is of special concern in the face of climate change. In particular, the availability of workable days is an important consideration in agricultural practic...

  6. Nonlinear Inertia Weighted Teaching-Learning-Based Optimization for Solving Global Optimization Problem

    PubMed Central

    Wu, Zong-Sheng; Fu, Wei-Ping; Xue, Ru

    2015-01-01

    The teaching-learning-based optimization (TLBO) algorithm, proposed in recent years, simulates the teaching-learning phenomenon of a classroom to effectively solve global optimization of multidimensional, linear, and nonlinear problems over continuous spaces. In this paper, an improved teaching-learning-based optimization algorithm is presented, called nonlinear inertia weighted teaching-learning-based optimization (NIWTLBO). This algorithm introduces a nonlinear inertia weighted factor into the basic TLBO to control the memory rate of learners and uses a dynamic inertia weighted factor to replace the original random number in the teacher and learner phases. The proposed algorithm is tested on a number of benchmark functions, and performance comparisons are provided against the basic TLBO and some other well-known optimization algorithms. The experimental results show that the proposed algorithm has a faster convergence rate and better performance than the basic TLBO and some other algorithms as well. PMID:26421005

  7. Optimal Predator Risk Assessment by the Sonar-Jamming Arctiine Moth Bertholdia trigona

    PubMed Central

    Corcoran, Aaron J.; Wagner, Ryan D.; Conner, William E.

    2013-01-01

    Nearly all animals face a tradeoff between seeking food and mates and avoiding predation. Optimal escape theory holds that an animal confronted with a predator should only flee when benefits of flight (increased survival) outweigh the costs (energetic costs, lost foraging time, etc.). We propose a model for prey risk assessment based on the predator's stage of attack. Risk level should increase rapidly from when the predator detects the prey to when it commits to the attack. We tested this hypothesis using a predator – the echolocating bat – whose active biosonar reveals its stage of attack. We used a prey defense – clicking used for sonar jamming by the tiger moth Bertholdia trigona– that can be readily studied in the field and laboratory and is enacted simultaneously with evasive flight. We predicted that prey employ defenses soon after being detected and targeted, and that prey defensive thresholds discriminate between legitimate predatory threats and false threats where a nearby prey is attacked. Laboratory and field experiments using playbacks of ultrasound signals and naturally behaving bats, respectively, confirmed our predictions. Moths clicked soon after bats detected and targeted them. Also, B. trigona clicking thresholds closely matched predicted optimal thresholds for discriminating legitimate and false predator threats for bats using search and approach phase echolocation – the period when bats are searching for and assessing prey. To our knowledge, this is the first quantitative study to correlate the sensory stimuli that trigger defensive behaviors with measurements of signals provided by predators during natural attacks in the field. We propose theoretical models for explaining prey risk assessment depending on the availability of cues that reveal a predator's stage of attack. PMID:23671686

  8. Optimal predator risk assessment by the sonar-jamming arctiine moth Bertholdia trigona.

    PubMed

    Corcoran, Aaron J; Wagner, Ryan D; Conner, William E

    2013-01-01

    Nearly all animals face a tradeoff between seeking food and mates and avoiding predation. Optimal escape theory holds that an animal confronted with a predator should only flee when benefits of flight (increased survival) outweigh the costs (energetic costs, lost foraging time, etc.). We propose a model for prey risk assessment based on the predator's stage of attack. Risk level should increase rapidly from when the predator detects the prey to when it commits to the attack. We tested this hypothesis using a predator--the echolocating bat--whose active biosonar reveals its stage of attack. We used a prey defense--clicking used for sonar jamming by the tiger moth Bertholdia trigona--that can be readily studied in the field and laboratory and is enacted simultaneously with evasive flight. We predicted that prey employ defenses soon after being detected and targeted, and that prey defensive thresholds discriminate between legitimate predatory threats and false threats where a nearby prey is attacked. Laboratory and field experiments using playbacks of ultrasound signals and naturally behaving bats, respectively, confirmed our predictions. Moths clicked soon after bats detected and targeted them. Also, B. trigona clicking thresholds closely matched predicted optimal thresholds for discriminating legitimate and false predator threats for bats using search and approach phase echolocation--the period when bats are searching for and assessing prey. To our knowledge, this is the first quantitative study to correlate the sensory stimuli that trigger defensive behaviors with measurements of signals provided by predators during natural attacks in the field. We propose theoretical models for explaining prey risk assessment depending on the availability of cues that reveal a predator's stage of attack. PMID:23671686

  9. Mice can count and optimize count-based decisions.

    PubMed

    Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-06-01

    Previous studies showed that rats and pigeons can count their responses, and the resultant count-based judgments exhibit the scalar property (also known as Weber's Law), a psychophysical property that also characterizes interval-timing behavior. Animals were found to take a nearly normative account of these well-established endogenous uncertainty characteristics in their time-based decision-making. On the other hand, no study has yet tested the implications of the scalar property of numerosity representations for reward-rate maximization in count-based decision-making. The current study tested mice on a task that required them to press one lever a minimum number of times before pressing a second lever to collect the armed reward (fixed consecutive number schedule, FCN). Emitting fewer than the required number of responses reset the response count without reinforcement, whereas emitting at least the minimum number of responses reset the counter with reinforcement. Each mouse was tested with three different FCN schedules (FCN10, FCN20, FCN40). The number of responses emitted on the first lever before pressing the second lever constituted the main unit of analysis. Our findings show for the first time that mice count their responses with the scalar property. We then defined the reward-rate-maximizing numerical decision strategies in this task based on subject-based estimates of the endogenous counting uncertainty. Our results showed that mice learn to maximize the reward rate by incorporating the uncertainty in their numerosity judgments into their count-based decisions. Our findings extend the scope of optimal temporal risk assessment to the domain of count-based decision-making. PMID:26463617
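    A rough sketch of the reward-rate logic: with scalar (Weber-law) counting noise, the standard deviation of the emitted count grows in proportion to the target, so the reward-maximizing strategy overshoots the FCN minimum by an amount that depends on the coefficient of variation. The CV and the time costs below are illustrative placeholders, not the fitted mouse parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

def reward_rate(target, fcn_min=20, cv=0.16, n_trials=20_000, press_cost=1.0, reset_cost=10.0):
    """Simulated reward rate when aiming for `target` presses under scalar noise.

    The emitted count is drawn with SD proportional to the target (Weber's law).
    press_cost / reset_cost are illustrative time costs per press and per failed trial.
    """
    emitted = np.maximum(1, np.round(rng.normal(target, cv * target, n_trials)))
    rewarded = emitted >= fcn_min
    time_spent = emitted * press_cost + (~rewarded) * reset_cost
    return rewarded.sum() / time_spent.sum()

targets = np.arange(20, 41)
best = targets[np.argmax([reward_rate(t) for t in targets])]
print("reward-maximizing target count:", best)   # typically above the FCN-20 minimum
```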

  10. An approximation based global optimization strategy for structural synthesis

    NASA Technical Reports Server (NTRS)

    Sepulveda, A. E.; Schmit, L. A.

    1991-01-01

    A global optimization strategy for structural synthesis based on approximation concepts is presented. The methodology involves the solution of a sequence of highly accurate approximate problems using a global optimization algorithm. The global optimization algorithm implemented consists of a branch and bound strategy based on the interval evaluation of the objective function and constraint functions, combined with a local feasible directions algorithm. The approximate design optimization problems are constructed using first order approximations of selected intermediate response quantities in terms of intermediate design variables. Some numerical results for example problems are presented to illustrate the efficacy of the design procedure set forth.

  11. CFD Optimization on Network-Based Parallel Computer System

    NASA Technical Reports Server (NTRS)

    Cheung, Samson H.; Holst, Terry L. (Technical Monitor)

    1994-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows the application of aerodynamic optimization with advanced computational fluid dynamics codes, which is computationally expensive on a mainframe supercomputer. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computer, running on software called Parallel Virtual Machine. The paper describes the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package has been applied to reduce the wave drag of a body of revolution and a wing/body configuration, with results of 5% to 6% drag reduction.

  12. Defining a region of optimization based on engine usage data

    DOEpatents

    Jiang, Li; Lee, Donghoon; Yilmaz, Hakan; Stefanopoulou, Anna

    2015-08-04

    Methods and systems for engine control optimization are provided. One or more operating conditions of a vehicle engine are detected. A value for each of a plurality of engine control parameters is determined based on the detected one or more operating conditions of the vehicle engine. A range of the most commonly detected operating conditions of the vehicle engine is identified and a region of optimization is defined based on the range of the most commonly detected operating conditions of the vehicle engine. The engine control optimization routine is initiated when the one or more operating conditions of the vehicle engine are within the defined region of optimization.
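    A minimal sketch of the general idea under assumed data structures: bin the logged operating conditions (for example, speed and load), take the most frequently visited bins as the region of optimization, and run the calibration routine only while the engine operates inside that region. The names, bin counts, and coverage threshold are hypothetical, not taken from the patent.

```python
import numpy as np

def define_region(speed_log, load_log, bins=10, coverage=0.8):
    """Return the set of (speed, load) bins covering `coverage` of logged operation."""
    hist, s_edges, l_edges = np.histogram2d(speed_log, load_log, bins=bins)
    order = np.argsort(hist, axis=None)[::-1]              # most-visited bins first
    cum = np.cumsum(hist.flat[order]) / hist.sum()
    k = int(np.searchsorted(cum, coverage)) + 1            # bins needed to reach the coverage target
    region = set(zip(*np.unravel_index(order[:k], hist.shape)))
    return region, s_edges, l_edges

def in_region(speed, load, region, s_edges, l_edges):
    """True when the current operating point falls inside the defined region."""
    i = min(max(np.searchsorted(s_edges, speed) - 1, 0), len(s_edges) - 2)
    j = min(max(np.searchsorted(l_edges, load) - 1, 0), len(l_edges) - 2)
    return (i, j) in region

# The engine-control optimization routine would be initiated only while
# in_region(current_speed, current_load, ...) is True.
```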

  13. A seismic risk for the lunar base

    NASA Technical Reports Server (NTRS)

    Oberst, Juergen; Nakamura, Yosio

    1992-01-01

    Shallow moonquakes, which were discovered during observations following the Apollo lunar landing missions, may pose a threat to lunar surface operations. The nature of these moonquakes is similar to that of intraplate earthquakes, which include infrequent but destructive events. Therefore, there is a need for detailed study to assess the possible seismic risk before establishing a lunar base.

  14. Probability-based least square support vector regression metamodeling technique for crashworthiness optimization problems

    NASA Astrophysics Data System (ADS)

    Wang, Hu; Li, Enying; Li, G. Y.

    2011-03-01

    This paper presents a crashworthiness design optimization method based on a metamodeling technique. Crashworthiness optimization is a highly nonlinear and large-scale problem, composed of various nonlinearities such as geometry, material, and contact, and it requires a large number of expensive evaluations. In order to obtain a robust approximation efficiently, a probability-based least squares support vector regression is suggested for constructing metamodels by considering structural risk minimization. Further, to save computational cost, an intelligent sampling strategy is applied to generate sample points at the design of experiments (DOE) stage. In this paper, a cylinder and a full-vehicle frontal collision are involved. The results demonstrate that the proposed metamodel-based optimization is efficient and effective in solving crashworthiness design optimization problems.
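    A minimal least-squares support vector regression sketch in the standard Suykens formulation with an RBF kernel; it shows the linear system that replaces the quadratic program of ordinary SVR, not the authors' probability-based variant or their intelligent DOE sampling.

```python
import numpy as np

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    """Fit LS-SVR by solving [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = X.shape[0]
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))                      # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:], X, sigma                        # bias, alphas, training data

def lssvr_predict(model, Xq):
    b, alpha, Xtr, sigma = model
    sq = np.sum((Xq[:, None, :] - Xtr[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * sigma ** 2)) @ alpha + b

X = np.random.default_rng(3).uniform(-2, 2, (40, 1))
y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(4).standard_normal(40)
model = lssvr_fit(X, y)
print(lssvr_predict(model, np.array([[0.5]])))              # roughly sin(0.5)
```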

  15. Optimal trajectories based on linear equations

    NASA Technical Reports Server (NTRS)

    Carter, Thomas E.

    1990-01-01

    The principal results of a recent theory of fuel-optimal space trajectories for linear differential equations are presented. Both impulsive and bounded-thrust problems are treated. A new form of the Lawden primer vector is found that is identical for both problems. For this reason, starting iterates from the solution of the impulsive problem are highly effective in the solution of the two-point boundary-value problem associated with bounded thrust. These results were applied to the problem of fuel-optimal maneuvers of a spacecraft near a satellite in circular orbit using the Clohessy-Wiltshire equations. For this case, two-point boundary-value problems were solved using a microcomputer, and optimal trajectory shapes were displayed. The results of this theory can also be applied if the satellite is in an arbitrary Keplerian orbit through the use of the Tschauner-Hempel equations. A new form of the solution of these equations has been found that is identical for elliptical, parabolic, and hyperbolic orbits except in the way that a certain integral is evaluated. For elliptical orbits this integral is evaluated through the use of the eccentric anomaly. An analogous evaluation is performed for hyperbolic orbits.
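    For reference, the Clohessy-Wiltshire (Hill) equations used in the circular-orbit application are, in the common convention with mean motion n, radial coordinate x, along-track y, and cross-track z (quoted from standard usage rather than from the paper itself):

```latex
\begin{aligned}
\ddot{x} - 2n\dot{y} - 3n^{2}x &= a_{x},\\
\ddot{y} + 2n\dot{x} &= a_{y},\\
\ddot{z} + n^{2}z &= a_{z},
\end{aligned}
```

    where a_x, a_y, a_z are the control (thrust) accelerations; the impulsive problem corresponds to the limit of instantaneous velocity changes.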

  16. Laboratory quality control based on risk management.

    PubMed

    Nichols, James H

    2011-01-01

    Risk management is the systematic application of management policies, procedures, and practices to the tasks of analyzing, evaluating, controlling and monitoring risk (the effect of uncertainty on objectives). Clinical laboratories conduct a number of activities that could be considered risk management including verification of performance of new tests, troubleshooting instrument problems and responding to physician complaints. Development of a quality control plan for a laboratory test requires a process map of the testing process with consideration for weak steps in the preanalytic, analytic and postanalytic phases of testing where there is an increased probability of errors. Control processes that either prevent or improve the detection of errors can be implemented at these weak points in the testing process to enhance the overall quality of the test result. This manuscript is based on a presentation at the 2nd International Symposium on Point of Care Testing held at King Faisal Specialist Hospital in Riyadh, Saudi Arabia on October 12-13, 2010. Risk management principles will be reviewed and progress towards adopting a new Clinical and Laboratory Standards Institute Guideline for developing laboratory quality control plans based on risk management will be discussed. PMID:21623049

  17. An Optimization Method for Condition Based Maintenance of Aircraft Fleet Considering Prognostics Uncertainty

    PubMed Central

    Chen, Yiran; Sun, Bo; Li, Songjie

    2014-01-01

    An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the aircraft fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into a combinatorial optimization problem over single-aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method for mission costs and risks based on the health status matrix and the maintenance matrix is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method can realize optimization and control of the aircraft fleet oriented to mission success. PMID:24892046

  18. Pediatric appendectomy: optimal surgical timing and risk assessment.

    PubMed

    Burjonrappa, Sathyaprasad; Rachel, Dana

    2014-05-01

    Appendicitis is one of the most common pediatric surgical problems. In the older surgical paradigm, appendectomy was considered to be an emergent procedure; however, with changes to resident work hours and other economic factors, the operation has evolved into an urgent and deliberately planned intervention. This paradigm shift in care has not necessarily seen universal buy-in by all stakeholders. Skeptics worry about the higher incidence of complications, particularly intra-abdominal abscess (IAA), associated with the delay to appendectomy with this strategy. Development of IAA after pediatric appendectomy greatly burdens the healthcare system, incapacitates patients, and limits family functionality. The risk factors that influence the development of IAA after appendectomy were evaluated in 220 children admitted to a large urban teaching hospital over a recent 1.5-year period. Preoperative risk factors included in the study were age, sex, weight, ethnicity, duration and nature of symptoms, white cell count, and ultrasound or computed tomography scan findings (appendicolith, peritoneal fluid, abscess, phlegmon), failed nonoperative management, antibiotics administered, and timing. Intraoperative factors included were timing of appendectomy, surgical and pathological findings of perforation, open or laparoscopic procedure, and use of staple or Endoloop to ligate the appendix. Postoperative factors included were duration and type of antibiotic therapy. There were 94 (43%) perforated and 126 (57%) nonperforated cases of appendicitis during the study period. The incidence of postoperative IAA was 4.5 per cent (nine of 220). Children operated on after overnight antibiotics and resuscitation had a significantly lower risk of IAA as compared with children managed by other strategies (P < 0.0003). Of the preoperative factors, only the presence of a fever in the emergency department (P < 0.001) and identification of complicated appendicitis on imaging (P < 0.0001) were significant.

  19. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  20. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  1. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  2. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  3. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk-Based Lender Oversight Supervision § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based...

  4. Ant colony optimization-based firewall anomaly mitigation engine.

    PubMed

    Penmatsa, Ravi Kiran Varma; Vatsavayi, Valli Kumari; Samayamantula, Srinivas Kumar

    2016-01-01

    A firewall is the most essential component of network perimeter security. Due to human error and the involvement of multiple administrators in configuring firewall rules, there exist common anomalies in firewall rulesets such as Shadowing, Generalization, Correlation, and Redundancy. There is a need for research on efficient ways of resolving such anomalies. The challenge is also to see that the reordered or resolved ruleset conforms to the organization's framed security policy. This study proposes an ant colony optimization (ACO)-based anomaly resolution and reordering of firewall rules called ACO-based firewall anomaly mitigation engine. Modified strategies are also introduced to automatically detect these anomalies and to minimize manual intervention of the administrator. Furthermore, an adaptive reordering strategy is proposed to aid faster reordering when a new rule is appended. The proposed approach was tested with different firewall policy sets. The results were found to be promising in terms of the number of conflicts resolved, with minimal availability loss and marginal security risk. This work demonstrated the application of a metaheuristic search technique, ACO, in improving the performance of a packet-filter firewall with respect to mitigating anomalies in the rules, and at the same time demonstrated conformance to the security policy. PMID:27441151
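    A minimal sketch of one of the anomaly checks such an engine has to perform (shadowing in the usual sense: an earlier rule matches a superset of a later rule's traffic but takes a different action); the simplified rule representation is an assumption for illustration, and the ACO-based reordering itself is not shown.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    src: range      # simplified address/port ranges, for illustration only
    dst: range
    action: str     # "accept" or "deny"

def covers(outer: range, inner: range) -> bool:
    return outer.start <= inner.start and inner.stop <= outer.stop

def shadowed(rules):
    """Yield (i, j) pairs where earlier rule i shadows later rule j."""
    for j, rj in enumerate(rules):
        for i in range(j):
            ri = rules[i]
            if covers(ri.src, rj.src) and covers(ri.dst, rj.dst) and ri.action != rj.action:
                yield i, j
                break

policy = [
    Rule(range(0, 256), range(80, 81), "deny"),
    Rule(range(10, 20), range(80, 81), "accept"),   # shadowed by the first rule
]
print(list(shadowed(policy)))   # [(0, 1)]
```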

  5. Optimized entanglement purification schemes for modular based quantum computers

    NASA Astrophysics Data System (ADS)

    Krastanov, Stefan; Jiang, Liang

    The choice of entanglement purification scheme strongly depends on the fidelities of quantum gates and measurements, as well as the imperfection of initial entanglement. For instance, the purification scheme optimal at low gate fidelities may not necessarily be the optimal scheme at higher gate fidelities. We employ an evolutionary algorithm that efficiently optimizes the entanglement purification circuit for given system parameters. Such optimized purification schemes will boost the performance of entanglement purification, and consequently enhance the fidelity of teleportation-based non-local coupling gates, which is an indispensable building block for modular-based quantum computers. In addition, we study how these optimized purification schemes affect the resource overhead caused by error correction in modular based quantum computers.

  6. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial-statement-oriented evaluation results based on logit and probit approaches as benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. Surprisingly, we find that the opinions extracted from both posts and commentaries surpass analysts' opinions in terms of credit risk prediction. PMID:26739372

  7. Optimal fractional order PID design via Tabu Search based algorithm.

    PubMed

    Ateş, Abdullah; Yeroglu, Celaleddin

    2016-01-01

    This paper presents an optimization method based on the Tabu Search Algorithm (TSA) to design a Fractional-Order Proportional-Integral-Derivative (FOPID) controller. All parameter computations of the FOPID employ random initial conditions, using the proposed optimization method. Illustrative examples demonstrate the performance of the proposed FOPID controller design method. PMID:26652128
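    The controller being tuned has the standard fractional-order PID (PI^λD^μ) transfer function, whose five parameters are what the Tabu search adjusts; this is the common definition rather than anything specific to this paper:

```latex
C(s) = K_{p} + \frac{K_{i}}{s^{\lambda}} + K_{d}\, s^{\mu}, \qquad \lambda,\ \mu > 0,
```

    which reduces to the ordinary integer-order PID controller when λ = μ = 1.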

  8. Optimization Research of Generation Investment Based on Linear Programming Model

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Ge, Xueqian

    Linear programming is an important branch of operational research and a mathematical method to assist people in carrying out scientific management. GAMS is an advanced simulation and optimization modeling language that combines complex mathematical programming formulations, such as linear programming (LP), nonlinear programming (NLP), and mixed-integer programming (MIP), with system simulation. In this paper, based on a linear programming model, optimized generation investment decision-making is simulated and analyzed. Finally, the optimal installed capacity of the power plants and the total cost are obtained, which provides a rational basis for optimized investment decisions.
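    A minimal capacity-investment LP sketch of the kind GAMS would express, written here with scipy for self-containment; the two technologies, their costs, the build limits, and the demand figure are illustrative placeholders, not the paper's data.

```python
from scipy.optimize import linprog

# Choose installed capacity (MW) of two hypothetical plant types to meet peak
# demand at minimum annualized cost:  min c^T x  s.t.  x1 + x2 >= demand, x >= 0.
annualized_cost = [90.0, 60.0]        # cost per MW-year, illustrative
demand = 1200.0                       # MW peak demand, illustrative
res = linprog(
    c=annualized_cost,
    A_ub=[[-1.0, -1.0]],              # -(x1 + x2) <= -demand  <=>  x1 + x2 >= demand
    b_ub=[-demand],
    bounds=[(0, 800), (0, 1000)],     # per-technology build limits, illustrative
    method="highs",
)
print(res.x, res.fun)                 # optimal capacities and total cost
```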

  9. Optimization and analysis of a CFJ-airfoil using adaptive meta-model based design optimization

    NASA Astrophysics Data System (ADS)

    Whitlock, Michael D.

    Although strong potential for the Co-Flow Jet (CFJ) flow separation control system has been demonstrated in the existing literature, little effort has been applied towards optimizing the design for a given application. The high-dimensional design space makes any optimization computationally intensive. This work presents the optimization of a CFJ airfoil as applied to a low Reynolds number regime using meta-model based design optimization (MBDO). The approach consists of computational fluid dynamics (CFD) analysis coupled with a surrogate model derived using Kriging. A genetic algorithm (GA) is then used to perform optimization on the efficient surrogate model. MBDO was shown to be an effective and efficient approach to solving the CFJ design problem. The final solution set was found to decrease drag by 100% while increasing lift by 42%. When validated, the final solution was found to be within one standard deviation of the CFD model it was representing.

  10. RISK AND RISK ASSESSMENT IN WATER-BASED RECREATION

    EPA Science Inventory

    The great number of individuals using recreational water resources presents a challenge with regard to protecting the health of these recreationists. Risk assessment provides a framework for characterizing the risk associated with exposure to microbial hazards and for managing r...

  11. Breast Cancer-Related Arm Lymphedema: Incidence Rates, Diagnostic Techniques, Optimal Management and Risk Reduction Strategies

    SciTech Connect

    Shah, Chirag; Vicini, Frank A.

    2011-11-15

    As more women survive breast cancer, long-term toxicities affecting their quality of life, such as lymphedema (LE) of the arm, gain importance. Although numerous studies have attempted to determine incidence rates, identify optimal diagnostic tests, enumerate efficacious treatment strategies and outline risk reduction guidelines for breast cancer-related lymphedema (BCRL), few groups have consistently agreed on any of these issues. As a result, standardized recommendations are still lacking. This review will summarize the latest data addressing all of these concerns in order to provide patients and health care providers with optimal, contemporary recommendations. Published incidence rates for BCRL vary substantially with a range of 2-65% based on surgical technique, axillary sampling method, radiation therapy fields treated, and the use of chemotherapy. Newer clinical assessment tools can potentially identify BCRL in patients with subclinical disease with prospective data suggesting that early diagnosis and management with noninvasive therapy can lead to excellent outcomes. Multiple therapies exist with treatments defined by the severity of BCRL present. Currently, the standard of care for BCRL in patients with significant LE is complex decongestive physiotherapy (CDP). Contemporary data also suggest that a multidisciplinary approach to the management of BCRL should begin prior to definitive treatment for breast cancer employing patient-specific surgical, radiation therapy, and chemotherapy paradigms that limit risks. Further, prospective clinical assessments before and after treatment should be employed to diagnose subclinical disease. In those patients who require aggressive locoregional management, prophylactic therapies and the use of CDP can help reduce the long-term sequelae of BCRL.

  12. Practice management based on risk assessment.

    PubMed

    Sandberg, Hans

    2004-01-01

    The management of a dental practice is most often focused on what clinicians do (production of items), and not so much on what is achieved in terms of oral health. The main reason for this is probably that it is easier to measure production and more difficult to measure health outcomes. This paper presents a model based on individual risk assessment that aims to achieve a financially sound economy and good oral health. The close-to-the-clinic management tool, the HIDEP Model (Health Improvement in a DEntal Practice), was pioneered in Sweden at the end of the 1980s. The experience over a 15-year period with different elements of the model is presented, including: the basis of examination and risk assessment; motivation; task delegation and leadership issues; health-finance evaluations; and quality development within a dental clinic. DentiGroupXL, a software program designed to support work based on the model, is also described. PMID:15646588

  13. Risk-based Classification of Incidents

    NASA Technical Reports Server (NTRS)

    Greenwell, William S.; Knight, John C.; Strunk, Elisabeth A.

    2003-01-01

    As the penetration of software into safety-critical systems progresses, accidents and incidents involving software will inevitably become more frequent. Identifying lessons from these occurrences and applying them to existing and future systems is essential if recurrences are to be prevented. Unfortunately, investigative agencies do not have the resources to fully investigate every incident under their jurisdictions and domains of expertise and thus must prioritize certain occurrences when allocating investigative resources. In the aviation community, most investigative agencies prioritize occurrences based on the severity of their associated losses, allocating more resources to accidents resulting in injury to passengers or extensive aircraft damage. We argue that this scheme is inappropriate because it undervalues incidents whose recurrence could have a high potential for loss while overvaluing fairly straightforward accidents involving accepted risks. We then suggest a new strategy for prioritizing occurrences based on the risk arising from incident recurrence.

  14. Performance optimization of web-based medical simulation.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2013-01-01

    This paper presents a technique for performance optimization of multimodal interactive web-based medical simulation. A web-based simulation framework is promising for easy access and wide dissemination of medical simulation. However, the real-time performance of the simulation depends heavily on the hardware capability on the client side. Providing consistent simulation across different hardware is critical for reliable medical simulation. This paper proposes a nonlinear mixed integer programming model to optimize the performance of visualization and physics computation while considering hardware capability and application-specific constraints. The optimization model identifies and parameterizes the rendering and computing capabilities of the client hardware using an exploratory proxy code. The parameters are used to determine the optimized simulation conditions, including texture sizes, mesh sizes, and canvas resolution. The test results show that the optimization model not only achieves the desired frame rate but also resolves visual artifacts caused by low-performance hardware. PMID:23400151

  15. A Risk-Based Sensor Placement Methodology

    SciTech Connect

    Lee, Ronald W; Kulesz, James J

    2006-08-01

    A sensor placement methodology is proposed to solve the problem of optimally locating sensors or detectors to protect a population against exposure to, and the effects of, known and/or postulated chemical, biological, and/or radiological threats. Historical meteorological data are used to characterize weather conditions as wind speed and direction pairs, with the percentage of occurrence of each pair over the historical period. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate the population at risk against standard exposure levels. Sensor locations are determined via a dynamic programming algorithm in which threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. Moreover, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum fraction of the total risk value represented by all threats.
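    A greedy sketch of the marginal-utility idea (the report uses a staged dynamic programming formulation over wind-direction/speed scenarios; this simplification just picks, at each stage, the candidate location that removes the most remaining population risk, using made-up coverage data):

```python
def place_sensors(candidates, threats, coverage, budget=3, min_marginal=0.0):
    """Greedy placement. `coverage[loc]` is the set of threat scenarios a sensor at
    `loc` detects; `threats[t]` is the population risk of scenario t (illustrative)."""
    remaining = dict(threats)
    chosen = []
    for _ in range(budget):
        best_loc, best_gain = None, min_marginal
        for loc in candidates:
            gain = sum(remaining.get(t, 0.0) for t in coverage[loc])
            if loc not in chosen and gain > best_gain:
                best_loc, best_gain = loc, gain
        if best_loc is None:
            break                                   # marginal utility no longer worthwhile
        chosen.append(best_loc)
        for t in coverage[best_loc]:
            remaining.pop(t, None)                  # detected threats drop out of later stages
    return chosen

threats = {"NW-5mps": 400.0, "SW-3mps": 250.0, "SE-7mps": 120.0}
coverage = {"A": {"NW-5mps"}, "B": {"SW-3mps", "SE-7mps"}, "C": {"SE-7mps"}}
print(place_sensors(["A", "B", "C"], threats, coverage))   # ['A', 'B']
```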

  16. 78 FR 76521 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ...\\ 78 FR 43829 (July 22, 2013). The Board's current market risk rule is at 12 CFR parts 208 and 225...) published a final rule on August 30, 2012 to revise their respective market risk rules (77 FR 53059 (August... proposed. \\5\\ 78 FR 62017 (October 11, 2013). II. Description of the Final Market Risk Rule A....

  17. 76 FR 1889 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are requesting comment on a proposal to revise their market risk capital rules to modify their scope to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality in market risk capital......

  18. 77 FR 53059 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are revising their market risk capital rules to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality; enhance the rules' sensitivity to risks that are not adequately captured under......

  19. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.

    PubMed

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming

    2016-08-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, hybridizing the well-known NSGA-II algorithm with an adaptive population-based simulated annealing (APBSA) method, is developed to solve systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies show that the proposed algorithm is an effective approach for systems reliability and risk management. PMID:25622333

  20. Optimization of a photovoltaic pumping system based on the optimal control theory

    SciTech Connect

    Betka, A.; Attali, A.

    2010-07-15

    This paper shows how optimal operation of a photovoltaic pumping system based on an induction motor driving a centrifugal pump can be realized. The optimization problem consists in maximizing the daily pumped water quantity via the optimization of motor efficiency at every operating point. The proposed structure allows, at the same time, minimization of the machine losses, field-oriented control, and maximum power tracking of the photovoltaic array. This is attained based on multi-input multi-output optimal regulator theory. The effectiveness of the proposed algorithm is demonstrated by simulation, and the obtained results are compared to those of a system working with a constant air gap flux. (author)

  1. Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems

    NASA Astrophysics Data System (ADS)

    Rao, R. V.; Savsani, V. J.; Balic, J.

    2012-12-01

    An efficient optimization algorithm called teaching-learning-based optimization (TLBO) is proposed in this article to solve continuous unconstrained and constrained optimization problems. The proposed method is based on the effect of the influence of a teacher on the output of learners in a class. The basic philosophy of the method is explained in detail. The algorithm is tested on 25 different unconstrained benchmark functions and 35 constrained benchmark functions with different characteristics. For the constrained benchmark functions, TLBO is tested with different constraint handling techniques such as superiority of feasible solutions, self-adaptive penalty, ɛ-constraint, stochastic ranking and ensemble of constraints. The performance of the TLBO algorithm is compared with that of other optimization algorithms and the results show the better performance of the proposed algorithm.
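    A minimal sketch of the two TLBO phases on a toy objective; the population size, iteration count, and test function are illustrative, and the constraint-handling variants discussed in the article are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)

def tlbo(f, bounds, pop_size=20, iters=100):
    """Basic TLBO: the teacher phase pulls learners toward the best solution,
    the learner phase lets random pairs of learners learn from each other."""
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, (pop_size, lo.size))
    fx = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        # teacher phase: move toward the teacher, away from the (scaled) class mean
        teacher = X[np.argmin(fx)]
        TF = rng.integers(1, 3)                                   # teaching factor, 1 or 2
        Xn = np.clip(X + rng.random(X.shape) * (teacher - TF * X.mean(axis=0)), lo, hi)
        fn = np.apply_along_axis(f, 1, Xn)
        better = fn < fx
        X[better], fx[better] = Xn[better], fn[better]
        # learner phase: each learner interacts with a randomly chosen partner
        partners = rng.permutation(pop_size)
        step = np.where((fx < fx[partners])[:, None], X - X[partners], X[partners] - X)
        Xn = np.clip(X + rng.random(X.shape) * step, lo, hi)
        fn = np.apply_along_axis(f, 1, Xn)
        better = fn < fx
        X[better], fx[better] = Xn[better], fn[better]
    return X[np.argmin(fx)], fx.min()

best_x, best_f = tlbo(lambda x: np.sum(x ** 2), bounds=[(-5, 5)] * 3)
print(best_x, best_f)
```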

  2. Game theory and risk-based leveed river system planning with noncooperation

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Lund, Jay R.; Madani, Kaveh

    2016-01-01

    Optimal risk-based levee designs are usually developed for economic efficiency. However, in river systems with multiple levees, the planning and maintenance of different levees are controlled by different agencies or groups. For example, along many rivers, levees on opposite riverbanks constitute a simple leveed river system with each levee designed and controlled separately. Collaborative planning of the two levees can be economically optimal for the whole system. Independent and self-interested landholders on opposite riversides are often willing to determine their individual optimal levee plans separately, resulting in a less efficient leveed river system from an overall society-wide perspective (the tragedy of the commons). We apply game theory to simple leveed river system planning where landholders on each riverside independently determine their optimal risk-based levee plans. Outcomes from noncooperative games are analyzed and compared with the overall economically optimal outcome, which minimizes net flood cost system-wide. The system-wide economically optimal solution generally transfers residual flood risk to the lower-valued side of the river, but is often impractical without compensation for the flood risk transfer to improve outcomes for all individuals involved. Such compensation can be determined and implemented with the landholders' agreement to collaborate on an economically optimal plan. Examination of iterative multiple-shot noncooperative games with reversible and irreversible decisions shows the costs of myopia in levee planning decisions and the significance of considering the externalities and evolution path of dynamic water resource problems to improve decision-making.

  3. Performance- and risk-based regulation

    SciTech Connect

    Sauter, G.D.

    1994-12-31

    Risk-based regulation (RBR) and performance-based regulation (PBR) are two relatively new concepts for the regulation of nuclear reactor power plants by the U.S. Nuclear Regulatory Commission (NRC). Although RBR and PBR are often considered to be somewhat equivalent, they, in fact, address two fundamentally different regulatory questions. To fruitfully discuss these two concepts, it is important to recognize what each entails. This paper identifies those two fundamental questions and discusses how they are addressed by RBR and PBR.

  4. Optimal hip fracture management in high-risk frail older adults.

    PubMed

    McNicoll, Lynn; Fitzgibbons, Peter G

    2009-07-01

    Management of high-risk hip fracture patients is complicated. The optimal surgical decision must be individualized and made promptly, with the assistance of all important team members, including primary care doctors, patient, family, and the orthopedic team. The risks of delaying surgery are significant and should be avoided if possible. Strategies for improving outcomes in these patients include collaborations with medicine and delirium prevention protocols, especially with early ambulation. PMID:19685643

  5. Optimizing ring-based CSR sources

    SciTech Connect

    Byrd, J.M.; De Santis, S.; Hao, Z.; Martin, M.C.; Munson, D.V.; Li, D.; Nishimura, H.; Robin, D.S.; Sannibale, F.; Schlueter, R.D.; Schoenlein, R.; Jung, J.Y.; Venturini, M.; Wan, W.; Zholents, A.A.; Zolotorev, M.

    2004-01-01

    Coherent synchrotron radiation (CSR) is a fascinating phenomenon recently observed in electron storage rings and shows tremendous promise as a high power source of radiation at terahertz frequencies. However, because of the properties of the radiation and the electron beams needed to produce it, there are a number of interesting features of the storage ring that can be optimized for CSR. Furthermore, CSR has been observed in three distinct forms: as steady pulses from short bunches, bursts from growth of spontaneous modulations in high current bunches, and from micro modulations imposed on a bunch from laser slicing. These processes have their relative merits as sources and can be improved via the ring design. The terahertz (THz) and sub-THz region of the electromagnetic spectrum lies between the infrared and the microwave regions. This boundary region is beyond the normal reach of optical and electronic measurement techniques and sources associated with these better-known neighbors. Recent research has demonstrated a relatively high power source of THz radiation from electron storage rings: coherent synchrotron radiation (CSR). Besides offering high power, CSR enables broadband optical techniques to be extended to nearly the microwave region, and has inherently sub-picosecond pulses. As a result, new opportunities for scientific research and applications are enabled across a diverse array of disciplines: condensed matter physics, medicine, manufacturing, and space and defense industries. CSR will have a strong impact on THz imaging, spectroscopy, femtosecond dynamics, and driving novel non-linear processes. CSR is emitted by bunches of accelerated charged particles when the bunch length is shorter than the wavelength being emitted. When this criterion is met, all the particles emit in phase, and a single-cycle electromagnetic pulse results with an intensity proportional to the square of the number of particles in the bunch. It is this quadratic dependence that can

  6. Data mining and tree-based optimization

    SciTech Connect

    Grossman, R.; Bodek, H.; Northcutt, D.; Poor, V.

    1996-12-31

    Consider a large collection of objects, each of which has a large number of attributes of several different sorts. We assume that there are data attributes representing data, attributes which are to be statistically estimated or predicted from these, and attributes which can be controlled or set. A motivating example is to assign a credit score to a credit card prospect indicating the likelihood that the prospect will make credit card payments and then to set a credit limit for each prospect in such a way as to maximize the overall expected revenue from the entire collection of prospects. In the terminology above, the credit score is called a predictive attribute and the credit limit a control attribute. The methodology we describe in the paper uses data mining to provide more accurate estimates of the predictive attributes and to provide improved settings of the control attributes. We briefly describe how to parallelize these computations. We also briefly comment on some of the data management issues which arise for these types of problems in practice. We propose using object warehouses to provide low overhead, high performance access to large collections of objects as an underlying foundation for our data mining algorithms.
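    A hedged sketch of the predictive/control-attribute idea in the abstract: a predictive attribute (default probability derived from a credit score) is estimated, and a control attribute (the credit limit) is then set to maximize expected revenue. The default and revenue models, rates, and score values below are assumptions for illustration only, not the authors' models.

        def default_probability(score, limit):
            # assumed mapping: worse scores and larger limits raise the default risk
            base = max(0.01, min(0.95, 1.4 - score / 600.0))
            return min(0.99, base * (1.0 + limit / 40000.0))

        def expected_revenue(limit, score, rate=0.18, loss_fraction=0.6):
            # assumed model: interest earned if the prospect pays, partial loss if not
            p = default_probability(score, limit)
            return (1.0 - p) * rate * limit - p * loss_fraction * limit

        def best_limit(score, candidates=range(0, 20001, 250)):
            # set the control attribute (credit limit) to maximize expected revenue
            return max(candidates, key=lambda L: expected_revenue(L, score))

        portfolio = [520, 640, 720, 780]                 # assumed credit scores
        plan = {s: best_limit(s) for s in portfolio}
        total = sum(expected_revenue(plan[s], s) for s in portfolio)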

  7. 12 CFR 167.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Risk-based capital credit risk-weight categories. 167.6 Section 167.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY CAPITAL Regulatory Capital Requirements § 167.6 Risk-based capital credit risk-weight categories. (a) Risk-weighted assets. Risk-weighted assets...

  8. 12 CFR 167.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Risk-based capital credit risk-weight categories. 167.6 Section 167.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY CAPITAL Regulatory Capital Requirements § 167.6 Risk-based capital credit risk-weight categories. (a) Risk-weighted assets. Risk-weighted assets...

  9. 12 CFR 567.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 6 2013-01-01 2012-01-01 true Risk-based capital credit risk-weight categories. 567.6 Section 567.6 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY CAPITAL Regulatory Capital Requirements § 567.6 Risk-based capital credit risk-weight categories. (a) Risk-weighted assets. Risk-weighted assets...

  10. Anatomy-Based Inverse Planning Simulated Annealing Optimization in High-Dose-Rate Prostate Brachytherapy: Significant Dosimetric Advantage Over Other Optimization Techniques

    SciTech Connect

    Jacob, Dayee; Raben, Adam; Sarkar, Abhirup; Grimm, Jimm; Simpson, Larry

    2008-11-01

    Purpose: To perform an independent validation of an anatomy-based inverse planning simulated annealing (IPSA) algorithm in obtaining superior target coverage and reducing the dose to the organs at risk. Method and Materials: In a recent prostate high-dose-rate brachytherapy protocol study by the Radiation Therapy Oncology Group (0321), our institution treated 20 patients between June 1, 2005 and November 30, 2006. These patients had received a high-dose-rate boost dose of 19 Gy to the prostate, in addition to an external beam radiotherapy dose of 45 Gy with intensity-modulated radiotherapy. Three-dimensional dosimetry was obtained for the following optimization schemes in the Plato Brachytherapy Planning System, version 14.3.2, using the same dose constraints for all the patients treated during this period: anatomy-based IPSA optimization, geometric optimization, and dose point optimization. Dose-volume histograms were generated for the planning target volume and organs at risk for each optimization method, from which the volume receiving at least 75% of the dose (V75%) for the rectum and bladder, volume receiving at least 125% of the dose (V125%) for the urethra, and total volume receiving the reference dose (V100%) and volume receiving 150% of the dose (V150%) for the planning target volume were determined. The dose homogeneity index and conformal index for the planning target volume for each optimization technique were compared. Results: Despite suboptimal needle position in some implants, the IPSA algorithm was able to comply with the tight Radiation Therapy Oncology Group dose constraints for 90% of the patients in this study. In contrast, the compliance was only 30% for dose point optimization and only 5% for geometric optimization. Conclusions: Anatomy-based IPSA optimization proved to be the superior technique and also the fastest for reducing the dose to the organs at risk without compromising the target coverage.
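    The dose-volume metrics quoted above can be computed directly from sampled dose values for a structure; the short sketch below shows one way to do so. The dose samples, the 19 Gy reference dose, and the homogeneity-index definition (which varies between reports) are assumptions for illustration.

        import numpy as np

        def v_at_least(dose, threshold):
            # Fraction of the structure volume receiving at least `threshold` dose.
            return float(np.mean(dose >= threshold))

        rx = 19.0                                                       # reference dose, Gy (assumed)
        ptv_dose = np.random.normal(loc=21.0, scale=2.0, size=10000)    # assumed dose samples
        rectum_dose = np.random.normal(loc=10.0, scale=3.0, size=10000)

        metrics = {
            "PTV V100%": v_at_least(ptv_dose, 1.00 * rx),
            "PTV V150%": v_at_least(ptv_dose, 1.50 * rx),
            "Rectum V75%": v_at_least(rectum_dose, 0.75 * rx),
        }
        # One common HDR homogeneity index: DHI = (V100% - V150%) / V100%.
        dhi = (metrics["PTV V100%"] - metrics["PTV V150%"]) / metrics["PTV V100%"]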

  11. Optimal policy for value-based decision-making.

    PubMed

    Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre

    2016-01-01

    For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down. PMID:27535638
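    As a concrete illustration of the mechanism the paper analyzes, the sketch below simulates a drift-diffusion decision with boundaries that collapse over time, the feature identified as necessary for optimal value-based choices. The drift, noise level, and boundary schedule are assumed values, not parameters from the paper.

        import random, math

        def ddm_trial(drift=0.4, noise=1.0, dt=0.01, b0=1.5, collapse=0.5, t_max=5.0):
            x, t = 0.0, 0.0
            while t < t_max:
                bound = b0 / (1.0 + collapse * t)      # assumed collapsing-bound schedule
                if x >= bound:
                    return +1, t                       # choose option 1
                if x <= -bound:
                    return -1, t                       # choose option 2
                x += drift * dt + noise * math.sqrt(dt) * random.gauss(0.0, 1.0)
                t += dt
            return (1 if x >= 0 else -1), t            # forced choice at the deadline

        choices = [ddm_trial()[0] for _ in range(2000)]
        p_choose_higher_value = choices.count(1) / len(choices)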

  12. Optimal policy for value-based decision-making

    PubMed Central

    Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre

    2016-01-01

    For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down. PMID:27535638

  13. Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT

    PubMed Central

    2009-01-01

    Background The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, density overwrite one phase static CT, average CT) of the same patient. Both 6 and 15 MV beam energies were used. The resulting treatment plans were compared by how well they fulfilled the prescribed optimization constraints both for the dose distributions calculated on the static patient models and for the accumulated dose, recalculated with MC on each of 8 CTs of a 4DCT set. Results In the phantom measurements, the MC dose engine showed discrepancies < 2%, while the fsPB dose engine showed discrepancies of up to 8% in the presence of lateral electron disequilibrium in the target. In the patient plan optimization, this translates into violations of organ at risk constraints and unpredictable target doses for the fsPB optimized plans. For the 4D MC recalculated dose distribution, MC optimized plans always underestimate the target doses, but the organ at risk doses were comparable. The results depend on the static patient model, and the smallest discrepancy was found for the MC optimized plan on the density overwrite one phase static CT model. Conclusions It is feasible to employ the MC dose engine for optimization of lung IMSRT and the plans are superior to fsPB. Use of static patient models introduces a bias in the MC dose distribution compared to the 4D MC recalculated dose, but this bias is predictable and therefore MC based optimization on static patient models is considered safe. PMID:20003380

  14. Two-level optimization of composite wing structures based on panel genetic optimization

    NASA Astrophysics Data System (ADS)

    Liu, Boyang

    load. The resulting response surface is used for wing-level optimization. In general, complex composite structures consist of several laminates. A common problem in the design of such structures is that some plies in the adjacent laminates terminate in the boundary between the laminates. These discontinuities may cause stress concentrations and may increase manufacturing difficulty and cost. We developed measures of continuity of two adjacent laminates. We studied tradeoffs between weight and continuity through a simple composite wing design. Finally, we compared the two-level optimization to a single-level optimization based on flexural lamination parameters. The single-level optimization is efficient and feasible for a wing consisting of unstiffened panels.

  15. Joint global optimization of tomographic data based on particle swarm optimization and decision theory

    NASA Astrophysics Data System (ADS)

    Paasche, H.; Tronicke, J.

    2012-04-01

    In many near-surface geophysical applications, multiple tomographic data sets are routinely acquired to explore subsurface structures and parameters. Linking the model generation process of multi-method geophysical data sets can significantly reduce ambiguities in geophysical data analysis and model interpretation. Most geophysical inversion approaches rely on local search optimization methods used to find an optimal model in the vicinity of a user-given starting model. The final solution may critically depend on the initial model. Alternatively, global optimization (GO) methods have been used to invert geophysical data. They explore the solution space in more detail and determine the optimal model independently from the starting model. Additionally, they can be used to find sets of optimal models allowing a further analysis of model parameter uncertainties. Here we employ particle swarm optimization (PSO) to realize the global optimization of tomographic data. PSO is an emergent method based on swarm intelligence characterized by fast and robust convergence towards optimal solutions. The fundamental principle of PSO is inspired by nature, since the algorithm mimics the behavior of a flock of birds searching for food in a search space. In PSO, a number of particles cruise a multi-dimensional solution space striving to find optimal model solutions explaining the acquired data. The particles communicate their positions and success and direct their movement according to the position of the currently most successful particle of the swarm. The success of a particle, i.e. the quality of the model currently found by a particle, must be uniquely quantifiable to identify the swarm leader. When jointly inverting disparate data sets, the optimization solution has to satisfy multiple optimization objectives, at least one for each data set. Unique determination of the most successful particle currently leading the swarm is not possible. Instead, only statements about the Pareto
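    For reference, a compact single-objective PSO loop of the kind described above is sketched below; the multi-objective Pareto bookkeeping discussed in the abstract is omitted. The swarm size, inertia weight, and acceleration constants are conventional assumed values, and the misfit function stands in for the tomographic data misfit.

        import random

        def pso(misfit, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            lo, hi = bounds
            pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]                 # personal best positions
            pbest_val = [misfit(p) for p in pos]
            g = pbest_val.index(min(pbest_val))
            gbest, gbest_val = pbest[g][:], pbest_val[g] # swarm leader
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                    val = misfit(pos[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = pos[i][:], val
                        if val < gbest_val:
                            gbest, gbest_val = pos[i][:], val
            return gbest, gbest_val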

  16. Response-time optimization of rule-based expert systems

    NASA Astrophysics Data System (ADS)

    Zupan, Blaz; Cheng, Albert M. K.

    1994-03-01

    Real-time rule-based decision systems are embedded AI systems and must make critical decisions within stringent timing constraints. In the case where the response time of the rule-based system is not acceptable, it has to be optimized to meet both timing and integrity constraints. This paper describes a novel approach to reduce the response time of rule-based expert systems. Our optimization method is twofold: the first phase constructs the reduced cycle-free finite state transition system corresponding to the input rule-based system, and the second phase further refines the constructed transition system using the simulated annealing approach. The method makes use of rule-base system decomposition, concurrency, and state-equivalency. The new and optimized system is synthesized from the derived transition system. Compared with the original system, the synthesized system requires fewer rule firings to reach the fixed point, is inherently stable, and has no redundant rules.

  17. A new efficient optimal path planner for mobile robot based on Invasive Weed Optimization algorithm

    NASA Astrophysics Data System (ADS)

    Mohanty, Prases K.; Parhi, Dayal R.

    2014-12-01

    Planning of the shortest/optimal route is essential for efficient operation of an autonomous mobile robot or vehicle. In this paper, Invasive Weed Optimization (IWO), a new meta-heuristic algorithm, has been implemented for solving the path planning problem of a mobile robot in partially or totally unknown environments. This meta-heuristic optimization is based on the colonizing property of weeds. First we have framed an objective function that satisfies the conditions of obstacle avoidance and target-seeking behavior of the robot in partially or completely unknown environments. Depending upon the value of the objective function of each weed in the colony, the robot avoids obstacles and proceeds towards the destination. The optimal trajectory is generated with this navigational algorithm when the robot reaches its destination. The effectiveness, feasibility, and robustness of the proposed algorithm have been demonstrated through a series of simulation and experimental results. Finally, it has been found that the developed path planning algorithm can be effectively applied to a wide range of complex situations.
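    The abstract's objective function is not given explicitly; the sketch below shows one plausible form that combines a target-seeking term with an obstacle-avoidance penalty, which any optimizer (IWO included) could minimize over candidate way-points. The weights and safety radius are assumptions, and the IWO machinery itself (seed dispersal, colony competition) is omitted.

        import math

        def objective(point, goal, obstacles, w_goal=1.0, w_obs=50.0, safe_radius=0.5):
            px, py = point
            gx, gy = goal
            cost = w_goal * math.hypot(gx - px, gy - py)          # target-seeking term
            for ox, oy in obstacles:
                d = math.hypot(ox - px, oy - py)
                if d < safe_radius:                               # obstacle-avoidance penalty
                    cost += w_obs * (safe_radius - d)
            return cost

        # Example: an optimizer would minimize this cost over the robot's next way-point.
        print(objective((1.0, 1.0), goal=(5.0, 5.0), obstacles=[(2.0, 1.2), (3.5, 4.0)]))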

  18. A multiobjective memetic algorithm based on particle swarm optimization.

    PubMed

    Liu, Dasheng; Tan, K C; Goh, C K; Ho, W K

    2007-02-01

    In this paper, a new memetic algorithm (MA) for multiobjective (MO) optimization is proposed, which combines the global search ability of particle swarm optimization with a synchronous local search heuristic for directed local fine-tuning. A new particle updating strategy is proposed based upon the concept of fuzzy global-best to deal with the problem of premature convergence and diversity maintenance within the swarm. The proposed features are examined to show their individual and combined effects in MO optimization. The comparative study shows the effectiveness of the proposed MA, which produces solution sets that are highly competitive in terms of convergence, diversity, and distribution. PMID:17278557

  19. Genetic Algorithm Based Neural Networks for Nonlinear Optimization

    Energy Science and Technology Software Center (ESTSC)

    1994-09-28

    This software develops a novel approach to nonlinear optimization using genetic algorithm based neural networks. To the best of our knowledge, this approach represents the first attempt at applying both neural network and genetic algorithm techniques to solve a nonlinear optimization problem. The approach constructs a neural network structure and an appropriately shaped energy surface whose minima correspond to optimal solutions of the problem. A genetic algorithm is employed to perform a parallel and powerful search of the energy surface.

  20. Genetic-evolution-based optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.

    1990-01-01

    This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.
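    A minimal genetic-algorithm sketch showing the reproduction (selection), crossover, and mutation operators referred to above, applied here to a simple quadratic test objective. The population size, rates, and test function are assumptions; the structural objectives and discrete-variable handling of the paper are not reproduced.

        import random

        def ga(f, dim, bounds, pop_size=30, gens=100, cx_rate=0.8, mut_rate=0.1):
            lo, hi = bounds
            pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(gens):
                # Reproduction: binary tournament selection (lower f is better).
                def select():
                    a, b = random.sample(pop, 2)
                    return a if f(a) < f(b) else b
                children = []
                while len(children) < pop_size:
                    p1, p2 = select(), select()
                    if random.random() < cx_rate:        # single-point crossover
                        cut = random.randint(1, dim - 1)
                        child = p1[:cut] + p2[cut:]
                    else:
                        child = p1[:]
                    for d in range(dim):                 # mutation
                        if random.random() < mut_rate:
                            child[d] = random.uniform(lo, hi)
                    children.append(child)
                pop = children
            return min(pop, key=f)

        best = ga(lambda x: sum(v * v for v in x), dim=4, bounds=(-5, 5))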

  1. Knowledge-Based Optimization of Molecular Geometries Using Crystal Structures.

    PubMed

    Cole, Jason C; Groom, Colin R; Korb, Oliver; McCabe, Patrick; Shields, Gregory P

    2016-04-25

    This paper describes a novel way to use the structural information contained in the Cambridge Structural Database (CSD) to drive geometry optimization of organic molecules. We describe how CSD structural information is transformed into objective functions for gradient-based optimization to provide good quality geometries for a large variety of organic molecules. Performance is assessed by minimizing different sets of organic molecules reporting RMSD movements for bond lengths, valence angles, torsion angles, and heavy atom positions. PMID:26977906

  2. Trading risk and performance for engineering design optimization using multifidelity analyses

    NASA Astrophysics Data System (ADS)

    Rajnarayan, Dev Gorur

    Computers pervade our lives today: from communication to calculation, their influence percolates many spheres of our existence. With continuing advances in computing, simulations are becoming increasingly complex and accurate. Powerful high-fidelity simulations mimic and predict a variety of real-life scenarios, with applications ranging from entertainment to engineering. The most accurate of such engineering simulations come at a high cost in terms of computing resources and time. Engineers use such simulations to predict the real-world performance of products they design; that is, they use them for analysis. Needless to say, the emphasis is on accuracy of the prediction. For such analysis, one would like to use the most accurate simulation available, and such a simulation is likely to be at the limits of available computing power, quite independently of advances in computing. In engineering design, however, the goal is somewhat different. Engineering design is generally posed as an optimization problem, where the goal is to tweak a set of available inputs or parameters, called design variables, to create a design that is optimal in some way, and meets some preset requirements. In other words, we would like to modify the design variables in order to optimize some figure of merit, called an objective function, subject to a set of constraints, typically formulated as equations or inequalities to be satisfied. Typically, a complex engineering system such as an aircraft is described by thousands of design variables, all of which are optimized during the design process. Nevertheless, do we always need to use the highest-fidelity simulations as the objective function and constraints for engineering design? Or can we afford to use lower-fidelity simulations with appropriate corrections? In this thesis, we present a new methodology for surrogate-based optimization. Existing methods combine the possibly erroneous predictions of the low-fidelity surrogate with estimates of

  3. Optimal ''image-based'' weighting for energy-resolved CT

    SciTech Connect

    Schmidt, Taly Gilat

    2009-07-15

    This paper investigates a method of reconstructing images from energy-resolved CT data with negligible beam-hardening artifacts and improved contrast-to-noise ratio (CNR) compared to conventional energy-weighting methods. Conceptually, the investigated method first reconstructs separate images from each energy bin. The final image is a linear combination of the energy-bin images, with the weights chosen to maximize the CNR in the final image. The optimal weight of a particular energy-bin image is derived to be proportional to the contrast-to-noise-variance ratio in that image. The investigated weighting method is referred to as ''image-based'' weighting, although, as will be described, the weights can be calculated and the energy-bin data combined prior to reconstruction. The performance of optimal image-based energy weighting with respect to CNR and beam-hardening artifacts was investigated through simulations and compared to that of energy integrating, photon counting, and previously studied optimal ''projection-based'' energy weighting. Two acquisitions were simulated: dedicated breast CT and a conventional thorax scan. The energy-resolving detector was simulated with five energy bins. Four methods of estimating the optimal weights were investigated, including task-specific and task-independent methods and methods that require a single reconstruction versus multiple reconstructions. Results demonstrated that optimal image-based weighting improved the CNR compared to energy-integrating weighting by factors of 1.15-1.6 depending on the task. Compared to photon-counting weighting, the CNR improvement ranged from 1.0 to 1.3. The CNR improvement factors were comparable to those of projection-based optimal energy weighting. The beam-hardening cupping artifact increased from 5.2% for energy-integrating weighting to 12.8% for optimal projection-based weighting, while optimal image-based weighting reduced the cupping to 0.6%. Overall, optimal image-based energy weighting
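    A hedged sketch of the image-based weighting rule described above: each energy-bin image receives a weight proportional to its contrast-to-noise-variance ratio, and the final image is the weighted sum. The bin images and the region-of-interest masks used to estimate contrast and noise are assumed inputs, and the normalization is a convention choice.

        import numpy as np

        def optimal_image_based_weights(bin_images, signal_mask, background_mask):
            weights = []
            for img in bin_images:
                contrast = img[signal_mask].mean() - img[background_mask].mean()
                noise_var = img[background_mask].var()
                weights.append(contrast / noise_var)     # contrast-to-noise-variance ratio
            w = np.asarray(weights)
            return w / np.abs(w).sum()                   # normalization is a convention choice

        def combine(bin_images, weights):
            # final image: weighted linear combination of the energy-bin images
            return sum(w * img for w, img in zip(weights, bin_images))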

  4. Stackelberg Game of Buyback Policy in Supply Chain with a Risk-Averse Retailer and a Risk-Averse Supplier Based on CVaR

    PubMed Central

    Zhou, Yanju; Chen, Qian; Chen, Xiaohong; Wang, Zongrun

    2014-01-01

    This paper considers a decentralized supply chain in which a single supplier sells a perishable product to a single retailer facing uncertain demand. We assume that the supplier and the retailer are both risk averse and utilize Conditional Value at Risk (CVaR), a risk measure popularized in financial risk management, to characterize their risk attitude. We establish a buyback policy model based on Stackelberg game theory while considering the supply chain members' risk preferences, and derive expressions for the supplier's optimal repurchase price and the retailer's optimal order quantity, which are compared with those in the risk-neutral case. Finally, a numerical example is presented to simulate the model and verify the related conclusions. PMID:25247605
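    A short numerical sketch of the CVaR risk measure the model relies on: for a sample of losses, CVaR at level alpha is the expected loss in the worst (1 - alpha) fraction of outcomes. The newsvendor-style profit model, demand distribution, and price parameters below are assumptions for illustration, not the paper's model.

        import numpy as np

        def cvar(losses, alpha=0.95):
            var = np.quantile(losses, alpha)              # Value at Risk at level alpha
            tail = losses[losses >= var]
            return tail.mean()                            # expected loss beyond VaR

        rng = np.random.default_rng(0)
        demand = rng.normal(100, 30, 100000).clip(min=0)  # assumed uncertain demand
        price, wholesale, buyback, order = 10.0, 6.0, 3.0, 110
        sales = np.minimum(demand, order)
        profit = price * sales + buyback * (order - sales) - wholesale * order
        print("retailer CVaR_0.95 of loss:", cvar(-profit))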

  5. Stackelberg game of buyback policy in supply chain with a risk-averse retailer and a risk-averse supplier based on CVaR.

    PubMed

    Zhou, Yanju; Chen, Qian; Chen, Xiaohong; Wang, Zongrun

    2014-01-01

    This paper considers a decentralized supply chain in which a single supplier sells a perishable product to a single retailer facing uncertain demand. We assume that the supplier and the retailer are both risk averse and utilize Conditional Value at Risk (CVaR), a risk measure popularized in financial risk management, to characterize their risk attitude. We establish a buyback policy model based on Stackelberg game theory while considering the supply chain members' risk preferences, and derive expressions for the supplier's optimal repurchase price and the retailer's optimal order quantity, which are compared with those in the risk-neutral case. Finally, a numerical example is presented to simulate the model and verify the related conclusions. PMID:25247605

  6. Demonstrating the benefits of template-based design-technology co-optimization

    NASA Astrophysics Data System (ADS)

    Liebmann, Lars; Hibbeler, Jason; Hieter, Nathaniel; Pileggi, Larry; Jhaveri, Tejas; Moe, Matthew; Rovner, Vyacheslav

    2010-03-01

    The concept of template-based design-technology co-optimization as a means of curbing escalating design complexity and increasing technology qualification risk is described. Data is presented highlighting the design efficacy of this proposal in terms of power, performance, and area benefits, quantifying the specific contributions of complex logic gates in this design optimization. Experimental results from 32nm technology node bulk CMOS wafers are presented to quantify the variability and design-margin reductions as well as yield and manufacturability improvements achievable with the proposed template-based design-technology co-optimization technique. The paper closes with data showing the predictable composability of individual templates, demonstrating a fundamental requirement of this proposal.

  7. Trust regions in Kriging-based optimization with expected improvement

    NASA Astrophysics Data System (ADS)

    Regis, Rommel G.

    2016-06-01

    The Kriging-based Efficient Global Optimization (EGO) method works well on many expensive black-box optimization problems. However, it does not seem to perform well on problems with steep and narrow global minimum basins and on high-dimensional problems. This article develops a new Kriging-based optimization method called TRIKE (Trust Region Implementation in Kriging-based optimization with Expected improvement) that implements a trust-region-like approach where each iterate is obtained by maximizing an Expected Improvement (EI) function within some trust region. This trust region is adjusted depending on the ratio of the actual improvement to the EI. This article also develops the Kriging-based CYCLONE (CYClic Local search in OptimizatioN using Expected improvement) method that uses a cyclic pattern to determine the search regions where the EI is maximized. TRIKE and CYCLONE are compared with EGO on 28 test problems with up to 32 dimensions and on a 36-dimensional groundwater bioremediation application in appendices supplied as an online supplement available at http://dx.doi.org/10.1080/0305215X.2015.1082350. The results show that both algorithms yield substantial improvements over EGO and they are competitive with a radial basis function method.
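    For reference, the sketch below shows the expected-improvement criterion for minimization and a trust-region update driven by the ratio of actual to expected improvement, in the spirit of TRIKE. The Kriging predictor is represented abstractly by a mean and standard deviation at a candidate point, and the shrink/expand factors and thresholds are assumed values.

        import math

        def expected_improvement(mu, sigma, f_best):
            # EI for minimization given the surrogate mean mu and std sigma at a point.
            if sigma <= 0.0:
                return 0.0
            z = (f_best - mu) / sigma
            phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)     # standard normal pdf
            Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))            # standard normal cdf
            return (f_best - mu) * Phi + sigma * phi

        def trust_region_update(radius, actual_improvement, ei,
                                shrink=0.5, expand=2.0, low=0.25, high=0.75):
            # Adjust the region by the ratio of actual improvement to expected improvement.
            rho = actual_improvement / ei if ei > 0 else 0.0
            if rho < low:
                return radius * shrink
            if rho > high:
                return radius * expand
            return radius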

  8. Fatigue reliability based optimal design of planar compliant micropositioning stages

    NASA Astrophysics Data System (ADS)

    Wang, Qiliang; Zhang, Xianmin

    2015-10-01

    Conventional compliant micropositioning stages are usually developed based on static strength and deterministic methods, which may lead to either unsafe or excessive designs. This paper presents a fatigue reliability analysis and optimal design of a three-degree-of-freedom (3 DOF) flexure-based micropositioning stage. Kinematic, modal, static, and fatigue stress modelling of the stage were conducted using the finite element method. The maximum equivalent fatigue stress in the hinges was derived using sequential quadratic programming. The fatigue strength of the hinges was obtained by considering various influencing factors. On this basis, the fatigue reliability of the hinges was analysed using the stress-strength interference method. Fatigue-reliability-based optimal design of the stage was then conducted using the genetic algorithm and MATLAB. To make fatigue life testing easier, a 1 DOF stage was then optimized and manufactured. Experimental results demonstrate the validity of the approach.
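    A hedged sketch of the stress-strength interference calculation mentioned above: with normally distributed fatigue stress and fatigue strength, reliability is the probability that strength exceeds stress. The mean and standard-deviation values in the example are assumptions, not results from the paper.

        import math

        def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
            # R = P(strength > stress) for independent normal strength and stress.
            beta = (mu_strength - mu_stress) / math.sqrt(sd_strength**2 + sd_stress**2)
            return 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))   # Phi(beta)

        # Example: assumed 300 MPa mean fatigue strength vs. 220 MPa mean equivalent stress.
        R = interference_reliability(300.0, 25.0, 220.0, 20.0)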

  9. Support Vector Machine Based on Adaptive Acceleration Particle Swarm Optimization

    PubMed Central

    Abdulameer, Mohammed Hasan; Othman, Zulaiha Ali

    2014-01-01

    Existing face recognition methods utilize particle swarm optimizer (PSO) and opposition based particle swarm optimizer (OPSO) to optimize the parameters of SVM. However, the utilization of random values in the velocity calculation decreases the performance of these techniques; that is, during the velocity computation, we normally use random values for the acceleration coefficients and this creates randomness in the solution. To address this problem, an adaptive acceleration particle swarm optimization (AAPSO) technique is proposed. To evaluate our proposed method, we employ both face and iris recognition based on AAPSO with SVM (AAPSO-SVM). In the face and iris recognition systems, performance is evaluated using two human face databases, YALE and CASIA, and the UBiris dataset. In this method, we initially perform feature extraction and then recognition on the extracted features. In the recognition process, the extracted features are used for SVM training and testing. During the training and testing, the SVM parameters are optimized with the AAPSO technique, and in AAPSO, the acceleration coefficients are computed using the particle fitness values. The parameters in SVM, which are optimized by AAPSO, perform efficiently for both face and iris recognition. A comparative analysis between our proposed AAPSO-SVM and the PSO-SVM technique is presented. PMID:24790584

  10. Support vector machine based on adaptive acceleration particle swarm optimization.

    PubMed

    Abdulameer, Mohammed Hasan; Sheikh Abdullah, Siti Norul Huda; Othman, Zulaiha Ali

    2014-01-01

    Existing face recognition methods utilize particle swarm optimizer (PSO) and opposition based particle swarm optimizer (OPSO) to optimize the parameters of SVM. However, the utilization of random values in the velocity calculation decreases the performance of these techniques; that is, during the velocity computation, we normally use random values for the acceleration coefficients and this creates randomness in the solution. To address this problem, an adaptive acceleration particle swarm optimization (AAPSO) technique is proposed. To evaluate our proposed method, we employ both face and iris recognition based on AAPSO with SVM (AAPSO-SVM). In the face and iris recognition systems, performance is evaluated using two human face databases, YALE and CASIA, and the UBiris dataset. In this method, we initially perform feature extraction and then recognition on the extracted features. In the recognition process, the extracted features are used for SVM training and testing. During the training and testing, the SVM parameters are optimized with the AAPSO technique, and in AAPSO, the acceleration coefficients are computed using the particle fitness values. The parameters in SVM, which are optimized by AAPSO, perform efficiently for both face and iris recognition. A comparative analysis between our proposed AAPSO-SVM and the PSO-SVM technique is presented. PMID:24790584

  11. Optimizing medical data quality based on multiagent web service framework.

    PubMed

    Wu, Ching-Seh; Khoury, Ibrahim; Shah, Hemant

    2012-07-01

    One of the most important issues in e-healthcare information systems is to optimize the medical data quality extracted from distributed and heterogeneous environments, which can greatly improve diagnostic and treatment decision making. This paper proposes a multiagent web service framework based on service-oriented architecture for the optimization of medical data quality in the e-healthcare information system. Based on the design of the multiagent web service framework, an evolutionary algorithm (EA) for the dynamic optimization of the medical data quality is proposed. The framework consists of two main components: first, an EA will be used to dynamically optimize the composition of medical processes into an optimal task sequence according to specific quality attributes. Second, a multiagent framework will be proposed to discover, monitor, and report any inconsistency between the optimized task sequence and the actual medical records. To demonstrate the proposed framework, experimental results for a breast cancer case study are provided. Furthermore, to show the unique performance of our algorithm, a comparison with other works in the literature is presented. PMID:22614723

  12. Efficiency Improvements to the Displacement Based Multilevel Structural Optimization Algorithm

    NASA Technical Reports Server (NTRS)

    Plunkett, C. L.; Striz, A. G.; Sobieszczanski-Sobieski, J.

    2001-01-01

    Multilevel Structural Optimization (MSO) continues to be an area of research interest in engineering optimization. In the present project, the weight optimization of beams and trusses using Displacement based Multilevel Structural Optimization (DMSO), a member of the MSO set of methodologies, is investigated. In the DMSO approach, the optimization task is subdivided into a single system and multiple subsystems level optimizations. The system level optimization minimizes the load unbalance resulting from the use of displacement functions to approximate the structural displacements. The function coefficients are then the design variables. Alternately, the system level optimization can be solved using the displacements themselves as design variables, as was shown in previous research. Both approaches ensure that the calculated loads match the applied loads. In the subsystems level, the weight of the structure is minimized using the element dimensions as design variables. The approach is expected to be very efficient for large structures, since parallel computing can be utilized in the different levels of the problem. In this paper, the method is applied to a one-dimensional beam and a large three-dimensional truss. The beam was tested to study possible simplifications to the system level optimization. In previous research, polynomials were used to approximate the global nodal displacements. The number of coefficients of the polynomials equally matched the number of degrees of freedom of the problem. Here it was desired to see if it is possible to only match a subset of the degrees of freedom in the system level. This would lead to a simplification of the system level, with a resulting increase in overall efficiency. However, the methods tested for this type of system level simplification did not yield positive results. The large truss was utilized to test further improvements in the efficiency of DMSO. In previous work, parallel processing was applied to the

  13. An Optimization-based Atomistic-to-Continuum Coupling Method

    DOE PAGESBeta

    Olson, Derek; Bochev, Pavel B.; Luskin, Mitchell; Shapeev, Alexander V.

    2014-08-21

    In this paper, we present a new optimization-based method for atomistic-to-continuum (AtC) coupling. The main idea is to cast the latter as a constrained optimization problem with virtual Dirichlet controls on the interfaces between the atomistic and continuum subdomains. The optimization objective is to minimize the error between the atomistic and continuum solutions on the overlap between the two subdomains, while the atomistic and continuum force balance equations provide the constraints. Separation, rather than blending, of the atomistic and continuum problems, and their subsequent use as constraints in the optimization problem distinguishes our approach from the existing AtC formulations. Finally, we present and analyze the method in the context of a one-dimensional chain of atoms modeled using a linearized two-body potential with next-nearest neighbor interactions.

  14. Optimal weight based on energy imbalance and utility maximization

    NASA Astrophysics Data System (ADS)

    Sun, Ruoyan

    2016-01-01

    This paper investigates the optimal weight for both males and females using energy imbalance and utility maximization. Based on the difference between energy intake and expenditure, we develop a state equation that reveals the weight gain from this energy gap. We construct an objective function considering food consumption, eating habits and survival rate to measure utility. By applying tools from optimal control and the qualitative theory of differential equations, we obtain the following results. For both males and females, the optimal weight is larger than the physiologically optimal weight calculated by the Body Mass Index (BMI). We also study the corresponding trajectories toward the steady-state weight. Depending on the value of a few parameters, the steady state can either be a saddle point with a monotonic trajectory or a focus with damped oscillations.

  15. Inversion method based on stochastic optimization for particle sizing.

    PubMed

    Sánchez-Escobar, Juan Jaime; Barbosa-Santillán, Liliana Ibeth; Vargas-Ubera, Javier; Aguilar-Valdés, Félix

    2016-08-01

    A stochastic inverse method is presented based on a hybrid evolutionary optimization algorithm (HEOA) to retrieve a monomodal particle-size distribution (PSD) from the angular distribution of scattered light. By solving an optimization problem, the HEOA (with the Fraunhofer approximation) retrieves the PSD from an intensity pattern generated by Mie theory. The analyzed light-scattering pattern can be attributed to unimodal normal, gamma, or lognormal distribution of spherical particles covering the interval of modal size parameters 46≤α≤150. The HEOA ensures convergence to the near-optimal solution during the optimization of a real-valued objective function by combining the advantages of a multimember evolution strategy and locally weighted linear regression. The numerical results show that our HEOA can be satisfactorily applied to solve the inverse light-scattering problem. PMID:27505357

  16. Measurement matrix optimization method based on matrix orthogonal similarity transformation

    NASA Astrophysics Data System (ADS)

    Pan, Jinfeng

    2016-05-01

    Optimization of the measurement matrix is one of the important research aspects of compressive sensing theory. A measurement matrix optimization method is presented based on the orthogonal similarity transformation of the information operator's Gram matrix. In terms of the fact that the information operator's Gram matrix is a singular symmetric matrix, a simplified orthogonal similarity transformation is deduced, and thus the simplified diagonal matrix that is orthogonally similar to it is obtained. Then an approximation of the Gram matrix is obtained by letting all the nonzero diagonal entries of the simplified diagonal matrix equal their average value. Thus an optimized measurement matrix can be acquired according to its relationship with the information operator. Results of experiments show that the optimized measurement matrix compared to the random measurement matrix is less coherent with dictionaries. The relative signal recovery error also declines when the proposed measurement matrix is utilized.
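    One plausible reading of the procedure described above is sketched below: form the Gram matrix of the information operator, diagonalize it (an orthogonal similarity transform, since it is symmetric), replace its nonzero eigenvalues by their mean, and recover an updated measurement matrix. The final recovery step via a pseudo-inverse of the dictionary is an assumption, as are the random test matrices.

        import numpy as np

        def optimize_measurement_matrix(Phi, Psi, tol=1e-10):
            D = Phi @ Psi                              # information operator
            G = D.T @ D                                # singular symmetric Gram matrix
            eigvals, V = np.linalg.eigh(G)             # orthogonal similarity transform
            nonzero = eigvals > tol
            eigvals_new = np.where(nonzero, eigvals[nonzero].mean(), 0.0)
            # A factor D_new with D_new.T @ D_new equal to the averaged Gram matrix.
            D_new = np.diag(np.sqrt(eigvals_new)) @ V.T
            D_new = D_new[nonzero][:Phi.shape[0], :]   # keep the rows carrying nonzero energy
            return D_new @ np.linalg.pinv(Psi)         # assumed recovery of the measurement matrix

        rng = np.random.default_rng(1)
        Phi = rng.standard_normal((20, 64))                      # assumed random measurement matrix
        Psi = np.linalg.qr(rng.standard_normal((64, 64)))[0]     # assumed orthonormal dictionary
        Phi_opt = optimize_measurement_matrix(Phi, Psi)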

  17. An Optimization-based Atomistic-to-Continuum Coupling Method

    SciTech Connect

    Olson, Derek; Bochev, Pavel B.; Luskin, Mitchell; Shapeev, Alexander V.

    2014-08-21

    In this paper, we present a new optimization-based method for atomistic-to-continuum (AtC) coupling. The main idea is to cast the latter as a constrained optimization problem with virtual Dirichlet controls on the interfaces between the atomistic and continuum subdomains. The optimization objective is to minimize the error between the atomistic and continuum solutions on the overlap between the two subdomains, while the atomistic and continuum force balance equations provide the constraints. Separation, rather than blending, of the atomistic and continuum problems, and their subsequent use as constraints in the optimization problem distinguishes our approach from the existing AtC formulations. Finally, we present and analyze the method in the context of a one-dimensional chain of atoms modeled using a linearized two-body potential with next-nearest neighbor interactions.

  18. Optimization algorithm based characterization scheme for tunable semiconductor lasers.

    PubMed

    Chen, Quanan; Liu, Gonghai; Lu, Qiaoyin; Guo, Weihua

    2016-09-01

    In this paper, an optimization algorithm based characterization scheme for tunable semiconductor lasers is proposed and demonstrated. In the process of optimization, the ratio between the power at the desired frequency and the power outside the desired frequency is used as the figure of merit, which approximately represents the side-mode suppression ratio. In practice, we use tunable optical band-pass and band-stop filters to obtain the power at the desired frequency and the power outside it separately. With the assistance of optimization algorithms, such as the particle swarm optimization (PSO) algorithm, we can obtain stable operating conditions for tunable lasers at designated frequencies directly and efficiently. PMID:27607701

  19. Aerodynamic Shape Optimization Based on Free-form Deformation

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2004-01-01

    This paper presents a free-form deformation technique suitable for aerodynamic shape optimization. Because the proposed technique is independent of grid topology, we can treat structured and unstructured computational fluid dynamics grids in the same manner. The proposed technique is an alternative shape parameterization technique to a trivariate volume technique. It retains the flexibility and freedom of trivariate volumes for CFD shape optimization, but it uses a bivariate surface representation. This reduces the number of design variables by an order of magnitude, and it provides much better control over surface shape changes. The proposed technique is simple, compact, and efficient. The analytical sensitivity derivatives are independent of the design variables and are easily computed for use in a gradient-based optimization. The paper includes the complete formulation and aerodynamic shape optimization results.

  20. Optimization of Designs for Nanotube-based Scanning Probes

    NASA Technical Reports Server (NTRS)

    Harik, V. M.; Gates, T. S.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Optimization of designs for nanotube-based scanning probes, which may be used for high-resolution characterization of nanostructured materials, is examined. Continuum models to analyze the nanotube deformations are proposed to help guide selection of the optimum probe. The limitations on the use of these models that must be accounted for before applying to any design problem are presented. These limitations stem from the underlying assumptions and the expected range of nanotube loading, end conditions, and geometry. Once the limitations are accounted for, the key model parameters along with the appropriate classification of nanotube structures may serve as a basis for the design optimization of nanotube-based probe tips.

  1. Adjoint-based airfoil shape optimization in transonic flow

    NASA Astrophysics Data System (ADS)

    Gramanzini, Joe-Ray

    The primary focus of this work is efficient aerodynamic shape optimization in transonic flow. Adjoint-based optimization techniques are employed on airfoil sections and evaluated in terms of computational accuracy as well as efficiency. This study examines two test cases proposed by the AIAA Aerodynamic Design Optimization Discussion Group. The first is a two-dimensional, transonic, inviscid, non-lifting optimization of a Modified-NACA 0012 airfoil. The second is a two-dimensional, transonic, viscous optimization problem using a RAE 2822 airfoil. The FUN3D CFD code of NASA Langley Research Center is used as the flow solver for the gradient-based optimization cases. Two shape parameterization techniques are employed to study the effect of the parameterization and the number of design variables on the final optimized shape: Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD) and the BandAids free-form deformation technique. For the two airfoil cases, angle of attack is treated as a global design variable. The thickness and camber distributions are the local design variables for MASSOUD, and selected airfoil surface grid points are the local design variables for BandAids. Using the MASSOUD technique, a drag reduction of 72.14% is achieved for the NACA 0012 case, reducing the total number of drag counts from 473.91 to 130.59. Employing the BandAids technique yields a 78.67% drag reduction, from 473.91 to 99.98. The RAE 2822 case exhibited a drag reduction from 217.79 to 132.79 counts, a 39.05% decrease using BandAids.

  2. Entropy-based optimization of wavelet spatial filters.

    PubMed

    Farina, Darino; Kamavuako, Ernest Nlandu; Wu, Jian; Naddeo, Francesco

    2008-03-01

    A new class of spatial filters for surface electromyographic (EMG) signal detection is proposed. These filters are based on the 2-D spatial wavelet decomposition of the surface EMG recorded with a grid of electrodes and inverse transformation after zeroing a subset of the transformation coefficients. The filter transfer function depends on the selected mother wavelet in the two spatial directions. Wavelet parameterization is proposed with the aim of signal-based optimization of the transfer function of the spatial filter. The optimization criterion was the minimization of the entropy of the time samples of the output signal. The optimized spatial filter is linear and space invariant. In simulated and experimental recordings, the optimized wavelet filter showed increased selectivity with respect to previously proposed filters. For example, in simulation, the ratio between the peak-to-peak amplitude of action potentials generated by motor units 20 degrees apart in the transversal direction was 8.58% (with monopolar recording), 2.47% (double differential), 2.59% (normal double differential), and 0.47% (optimized wavelet filter). In experimental recordings, the duration of the detected action potentials decreased from (mean +/- SD) 6.9 +/- 0.3 ms (monopolar recording), to 4.5 +/- 0.2 ms (normal double differential), 3.7 +/- 0.2 (double differential), and 3.0 +/- 0.1 ms (optimized wavelet filter). In conclusion, the new class of spatial filters with the proposed signal-based optimization of the transfer function allows better discrimination of individual motor unit activities in surface EMG recordings than it was previously possible. PMID:18334382

  3. Reliability-based design optimization under stationary stochastic process loads

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Du, Xiaoping

    2016-08-01

    Time-dependent reliability-based design ensures the satisfaction of reliability requirements for a given period of time, but with a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and reliability target does not exist, this work defines the concept of equivalent MPP, which is identified by the extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops: deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.

  4. TRUST-TECH based Methods for Optimization and Learning

    NASA Astrophysics Data System (ADS)

    Reddy, Chandan K.

    2007-12-01

    Many problems that arise in machine learning domain deal with nonlinearity and quite often demand users to obtain global optimal solutions rather than local optimal ones. Optimization problems are inherent in machine learning algorithms and hence many methods in machine learning were inherited from the optimization literature. Popularly known as the initialization problem, the ideal set of parameters required will significantly depend on the given initialization values. The recently developed TRUST-TECH (TRansformation Under STability-reTaining Equilibria CHaracterization) methodology systematically explores the subspace of the parameters to obtain a complete set of local optimal solutions. In this thesis work, we propose TRUST-TECH based methods for solving several optimization and machine learning problems. Two stages namely, the local stage and the neighborhood-search stage, are repeated alternatively in the solution space to achieve improvements in the quality of the solutions. Our methods were tested on both synthetic and real datasets and the advantages of using this novel framework are clearly manifested. This framework not only reduces the sensitivity to initialization, but also allows the flexibility for the practitioners to use various global and local methods that work well for a particular problem of interest. Other hierarchical stochastic algorithms like evolutionary algorithms and smoothing algorithms are also studied and frameworks for combining these methods with TRUST-TECH have been proposed and evaluated on several test systems.

  5. 78 FR 43829 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    .... Frierson, Secretary, Board of Governors of the Federal Reserve System, 20th Street and Constitution Avenue... transparency through enhanced disclosures. \\1\\ 77 FR 53060 (August 30, 2012). The agencies' market risk rules... additional detail on this history in the preamble to the August 2012 final rule. See, 77 FR 53060,...

  6. 12 CFR 390.466 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Risk-based capital credit risk-weight categories. 390.466 Section 390.466 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY REGULATIONS TRANSFERRED FROM THE OFFICE OF THRIFT SUPERVISION Capital § 390.466 Risk-based capital credit...

  7. Optimal Test Design with Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.

    2013-01-01

    Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…

  8. Information fusion based optimal control for large civil aircraft system.

    PubMed

    Zhen, Ziyang; Jiang, Ju; Wang, Xinhua; Gao, Chen

    2015-03-01

    Wind disturbance has a great influence on the landing safety of Large Civil Aircraft. Through simulation research and engineering experience, it can be found that PID control is not good enough to solve the problem of restraining the wind disturbance. This paper focuses on anti-wind attitude control for Large Civil Aircraft in the landing phase. In order to improve ride comfort and flight safety, an information fusion based optimal control strategy is presented to restrain the wind disturbance in the landing phase while maintaining attitudes and airspeed. Data for the Boeing 707 are used to establish a full-variable nonlinear model of a Large Civil Aircraft, and then two linear models are obtained which are divided into longitudinal and lateral equations. Based on engineering experience, the longitudinal channel adopts PID control and C inner control to keep longitudinal attitude constant, and applies an autothrottle system to keep airspeed constant, while an information fusion based optimal regulator in the lateral control channel is designed to achieve lateral attitude holding. According to information fusion estimation, by fusing the hard constraint information of the system dynamic equations and the soft constraint information of the performance index function, an optimal estimate of the control sequence is derived. Based on this, an information fusion state regulator is deduced for a discrete-time linear system with disturbance. Simulation results with the nonlinear aircraft model indicate that the information fusion optimal control outperforms traditional PID control, LQR control, and LQR control with integral action in wind-disturbance rejection during the landing phase. PMID:25440950

  9. Electrochemical model based charge optimization for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Pramanik, Sourav; Anwar, Sohel

    2016-05-01

    In this paper, we propose the design of a novel optimal strategy for charging the lithium-ion battery based on an electrochemical battery model that is aimed at improved performance. A performance index that aims at minimizing the charging effort along with a minimum deviation from the rated maximum thresholds for cell temperature and charging current has been defined. The method proposed in this paper aims at achieving a faster charging rate while maintaining safe limits for various battery parameters. Safe operation of the battery is achieved by including the battery bulk temperature as a control component in the performance index, which is of critical importance for electric vehicles. Another important aspect of the proposed performance objective is the efficiency of the algorithm, which would allow higher charging rates without compromising the internal electrochemical kinetics of the battery, thereby preventing abusive conditions and improving long-term durability. A more realistic model, based on battery electrochemistry, has been used for the design of the optimal algorithm as opposed to the conventional equivalent circuit models. To solve the optimization problem, Pontryagin's principle has been used, which is very effective for constrained optimization problems with both state and input constraints. Simulation results show that the proposed optimal charging algorithm is capable of shortening the charging time of a lithium-ion cell while maintaining the temperature constraint when compared with standard constant-current charging. The designed method also maintains the internal states within limits that can avoid abusive operating conditions.

  10. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  11. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  12. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  13. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  14. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  15. Pixel-based ant colony algorithm for source mask optimization

    NASA Astrophysics Data System (ADS)

    Kuo, Hung-Fei; Wu, Wei-Chen; Li, Frederick

    2015-03-01

    Source mask optimization (SMO) was considered to be one of the key resolution enhancement techniques for node technology below 20 nm prior to the availability of extreme-ultraviolet tools. SMO has been shown to enlarge the process margins for the critical layer in SRAM and memory cells. In this study, a new illumination shape optimization approach was developed on the basis of the ant colony optimization (ACO) principle. The use of this heuristic pixel-based ACO method in the SMO process provides an advantage over extant SMO methods that rely on the gradient of the cost function, owing to the rapid and stable searching capability of the proposed method. This study was conducted to provide lithographic engineers with references for the quick determination of the optimal illumination shape for complex mask patterns. The test pattern used in this study was a contact layer for an SRAM design, with a critical dimension and a minimum pitch of 55 and 110 nm, respectively. The optimized freeform source shape obtained using the ACO method was numerically verified by performing an aerial image investigation, and the result showed that the optimized freeform source shape generated an aerial image profile different from the nominal image profile, with an overall error rate of 9.64%. Furthermore, the overall average critical shape difference was determined to be 1.41, which was lower than that for the other off-axis illumination exposures. The process window results showed an improvement in exposure latitude (EL) and depth of focus (DOF) for the ACO-based freeform source shape compared with those of the Quasar source shape. The maximum EL of the ACO-based freeform source shape reached 7.4% and the DOF was 56 nm at an EL of 5%.

  16. Pragmatic fluid optimization in high-risk surgery patients: when pragmatism dilutes the benefits.

    PubMed

    Reuter, Daniel A

    2012-01-01

    There is increasing evidence that hemodynamic optimization by fluid loading, particularly when performed in the early phase of surgery, is beneficial in high-risk surgery patients: it leads to a reduction in postoperative complications and even to improved long-term outcome. However, it is also true that goal-directed strategies of fluid optimization focusing on cardiac output optimization have not been applied in the clinical routine of many institutions. Reasons are manifold: disbelief in the level of evidence and in the accuracy and practicability of the required monitoring systems, and economics. The FOCCUS trial examined perioperative fluid optimization with a very basic approach: a standardized volume load of 25 ml/kg crystalloids over 6 hours immediately prior to scheduled surgery in high-risk patients. The hypothesis was that this intervention would compensate for the preoperative fluid deficit caused by overnight fasting and would result in improved perioperative fluid homeostasis with fewer postoperative complications and earlier hospital discharge. However, the primary study endpoints did not improve significantly. This observation points towards the facts that: firstly, the differentiation between interstitial fluid deficit caused by fasting and intravascular volume loss due to acute blood loss must be recognized in treatment strategies; secondly, the type of fluid replacement may play an important role; and thirdly, protocolized treatment strategies should always be tailored to suit the patient's individual needs in every clinical situation. PMID:22410167

  17. Parallel Harmony Search Based Distributed Energy Resource Optimization

    SciTech Connect

    Ceylan, Oguzhan; Liu, Guodong; Tomsovic, Kevin

    2015-01-01

    This paper presents a harmony search based parallel optimization algorithm to minimize voltage deviations in three-phase unbalanced electrical distribution systems and to maximize the active power outputs of distributed energy resources (DR). The main contribution is to reduce the adverse impacts on the voltage profile as photovoltaic (PV) output or electric vehicle (EV) charging changes throughout the day. The IEEE 123-bus distribution test system is modified by adding DRs and EVs under different load profiles. The simulation results show that, by using parallel computing techniques, heuristic methods may be used as an alternative optimization tool in electrical power distribution system operation.
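
    A minimal serial harmony search loop (without the parallelization or the distribution-system power-flow model used in the paper) might look as follows; the quadratic objective standing in for the voltage-deviation measure and all parameter values are assumptions.

```python
import numpy as np

def harmony_search(f, dim, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05,
                   iters=2000, seed=0):
    """Basic harmony search: improvise a new harmony from memory (HMCR),
    pitch-adjust it (PAR, bandwidth bw), and replace the worst memory member."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    memory = rng.uniform(lo, hi, size=(hms, dim))
    costs = np.array([f(h) for h in memory])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                     # pick value from memory
                new[j] = memory[rng.integers(hms), j]
                if rng.random() < par:                  # pitch adjustment
                    new[j] += bw * rng.uniform(-1, 1) * (hi - lo)
            else:                                       # random consideration
                new[j] = rng.uniform(lo, hi)
        new = np.clip(new, lo, hi)
        c = f(new)
        worst = np.argmax(costs)
        if c < costs[worst]:
            memory[worst], costs[worst] = new, c
    best = np.argmin(costs)
    return memory[best], costs[best]

# Example: minimize a simple quadratic standing in for a voltage-deviation objective.
sol, cost = harmony_search(lambda x: np.sum((x - 1.0) ** 2), dim=5, bounds=(0.0, 2.0))
print(sol, cost)
```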

  18. Optimization of Polarimetric Contrast Enhancement Based on Fisher Criterion

    NASA Astrophysics Data System (ADS)

    Deng, Qiming; Chen, Jiong; Yang, Jian

    The optimization of polarimetric contrast enhancement (OPCE) is a widely used method for maximizing the received power ratio of a desired target versus an undesired target (clutter). In this letter, a new model of the OPCE is proposed based on the Fisher criterion. By introducing the well known two-class problem of linear discriminant analysis (LDA), the proposed model is to enlarge the normalized distance of mean value between the target and the clutter. In addition, a cross-iterative numerical method is proposed for solving the optimization with a quadratic constraint. Experimental results with the polarimetric SAR (POLSAR) data demonstrate the effectiveness of the proposed method.

  19. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to the stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to a 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, which corresponds to unity for reliability p (or p = 1). Weight can be reduced to a small value for the most failure-prone design with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas the reliability can be relaxed somewhat for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code was the deterministic analysis tool, (2) the fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  20. Improving Discrete-Sensitivity-Based Approach for Practical Design Optimization

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Cordero, Yvette; Pandya, Mohagna J.

    1997-01-01

    In developing automated methodologies for simulation-based optimal shape design, their accuracy, efficiency and practicality are the defining factors to their success. To that end, four recent improvements to the building blocks of such a methodology, intended for more practical design optimization, have been reported. First, in addition to a polynomial-based parameterization, a partial differential equation (PDE) based parameterization was shown to be a practical tool for a number of reasons. Second, an alternative has been incorporated for one of the tedious phases of developing such a methodology, namely, the automatic differentiation of the computer code for the flow analysis in order to generate the sensitivities. Third, by extending the methodology to thin-layer Navier-Stokes (TLNS) based flow simulations, more accurate flow physics was made available. However, the computer storage requirement for a shape optimization of a practical configuration with the higher-fidelity simulations (TLNS and dense-grid based simulations) required substantial computational resources. Therefore, the final improvement reported herein responded to this point by including the alternating-direction-implicit (ADI) based system solver as an alternative to the preconditioned biconjugate gradient (PbCG) and other direct solvers.

  1. SADA: Ecological Risk Based Decision Support System for Selective Remediation

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is freeware that implements terrestrial ecological risk assessment and yields a selective remediation design using its integral geographical information system, based on ecological and risk assessment inputs. Selective remediation ...

  2. Optimization of image-based aberration metrology for EUV lithography

    NASA Astrophysics Data System (ADS)

    Levinson, Zac; Fenger, Germain; Burbine, Andrew; Schepis, Anthony R.; Smith, Bruce W.

    2014-04-01

    EUV lithography is likely more sensitive to drift from thermal and degradation effects than optical counterparts. We have developed an automated approach to photoresist image-based aberration metrology. The approach uses binary or phase mask targets and iterative simulation based solutions to retrieve an aberrated pupil function. It is well known that a partially coherent source both allows for the diffraction information of smaller features to be collected by the condenser system, and introduces pupil averaging. In general, smaller features are more sensitive to aberrations than larger features, so there is a trade-off between target sensitivity and printability. Therefore, metrology targets using this technique must be optimized for maximum sensitivity with each illumination system. This study examines aberration metrology target optimization and suggests an optimization scheme for use with any source. Interrogation of both low and high order aberrations is considered. High order aberration terms are interrogated using two separate fitting algorithms. While the optimized targets do show the lowest RMS error under the test conditions, a desirable RMS error is not achieved by either high order interrogation scheme. The implementation of a previously developed algorithm for image-based aberration metrology is used to support this work.

  3. Bare-Bones Teaching-Learning-Based Optimization

    PubMed Central

    Zou, Feng; Wang, Lei; Hei, Xinhong; Chen, Debao; Jiang, Qiaoyong; Li, Hongye

    2014-01-01

    Teaching-learning-based optimization (TLBO), which simulates the teaching-learning process of the classroom, is one of the recently proposed swarm intelligence (SI) algorithms. In this paper, a new TLBO variant called bare-bones teaching-learning-based optimization (BBTLBO) is presented to solve global optimization problems. In this method, each learner in the teacher phase employs an interactive learning strategy, which is a hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning based on neighborhood search, and each learner in the learner phase employs either the learning strategy of the learner phase in the standard TLBO or the new neighborhood search strategy. To verify the performance of our approaches, 20 benchmark functions and two real-world problems are utilized. From the conducted experiments it can be observed that BBTLBO performs significantly better than, or at least comparably to, TLBO and some existing bare-bones algorithms. The results indicate that the proposed algorithm is competitive with some other optimization algorithms. PMID:25013844
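
    A simplified sketch of the teaching-learning loop is given below: the teacher phase randomly mixes the standard TLBO move with bare-bones Gaussian sampling around the teacher, and the learner phase uses the standard pairwise TLBO interaction. The neighborhood-search and interactive-learning details of BBTLBO are omitted, and the sphere objective and parameter values are illustrative.

```python
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def bbtlbo_sketch(f, dim=10, pop=30, iters=200, lo=-5.0, hi=5.0, seed=0):
    """Simplified teaching-learning loop with a bare-bones Gaussian option."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(pop, dim))
    fit = np.array([f(x) for x in X])
    for _ in range(iters):
        teacher = X[np.argmin(fit)]
        mean = X.mean(axis=0)
        for i in range(pop):                            # teacher phase
            if rng.random() < 0.5:                      # standard TLBO teacher move
                tf = rng.integers(1, 3)                 # teaching factor 1 or 2
                new = X[i] + rng.random(dim) * (teacher - tf * mean)
            else:                                       # bare-bones Gaussian sampling
                new = rng.normal((teacher + X[i]) / 2, np.abs(teacher - X[i]) + 1e-12)
            new = np.clip(new, lo, hi)
            if f(new) < fit[i]:
                X[i], fit[i] = new, f(new)
        for i in range(pop):                            # learner phase
            j = rng.integers(pop)
            if j == i:
                continue
            step = (X[i] - X[j]) if fit[i] < fit[j] else (X[j] - X[i])
            new = np.clip(X[i] + rng.random(dim) * step, lo, hi)
            if f(new) < fit[i]:
                X[i], fit[i] = new, f(new)
    best = np.argmin(fit)
    return X[best], fit[best]

print(bbtlbo_sketch(sphere))
```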

  4. Reliability-based analysis and design optimization for durability

    NASA Astrophysics Data System (ADS)

    Choi, Kyung K.; Youn, Byeng D.; Tang, Jun; Hardee, Edward

    2005-05-01

    In Army mechanical systems, fatigue caused by external and inertial transient loads over the service life often leads to structural failure due to accumulated damage. Structural durability analysis, which predicts the fatigue life of mechanical components subject to dynamic stresses and strains, is a compute-intensive multidisciplinary simulation process, since it requires the integration of several computer-aided engineering tools and considerable data communication and computation. Uncertainties in geometric dimensions due to manufacturing tolerances cause the indeterministic nature of the fatigue life of a mechanical component. Because uncertainty propagation to structural fatigue under transient dynamic loading is not only numerically complicated but also extremely computationally expensive, it is a challenging task to develop a structural durability-based design optimization process and a reliability analysis to ascertain whether the optimal design is reliable. The objective of this paper is the demonstration of an integrated CAD-based computer-aided engineering process to effectively carry out design optimization for structural durability, yielding a durable and cost-effectively manufacturable product. This paper shows preliminary results of reliability-based durability design optimization for the Army Stryker A-Arm.

  5. Multiobjective inverse planning for intensity modulated radiotherapy with constraint-free gradient-based optimization algorithms

    NASA Astrophysics Data System (ADS)

    Lahanas, Michael; Schreibmann, Eduard; Baltas, Dimos

    2003-09-01

    We consider the behaviour of the limited memory L-BFGS algorithm as a representative constraint-free gradient-based algorithm which is used for multiobjective (MO) dose optimization for intensity modulated radiotherapy (IMRT). Using a parameter transformation, the positivity constraint problem of negative beam fluences is entirely eliminated: a feature which to date has not been fully understood by all investigators. We analyse the global convergence properties of L-BFGS by searching for the existence and the influence of possible local minima. With a fast simulated annealing (FSA) algorithm we examine whether the L-BFGS solutions are globally Pareto optimal. The three examples used in our analysis are a brain tumour, a prostate tumour and a test case with a C-shaped PTV. In 1% of the optimizations global convergence is violated. A simple mechanism practically eliminates the influence of this failure and the obtained solutions are globally optimal. A single-objective dose optimization requires less than 4 s for 5400 parameters and 40 000 sampling points. The elimination of the problem of negative beam fluences and the high computational speed permit constraint-free gradient-based optimization algorithms to be used for MO dose optimization. In this situation, a representative spectrum of possible solutions is obtained which contains information such as the trade-off between the objectives and range of dose values. Using simple decision making tools the best of all the possible solutions can be chosen. We perform an MO dose optimization for the three examples and compare the spectra of solutions, firstly using recommended critical dose values for the organs at risk and secondly, setting these dose values to zero.
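
    The elimination of the positivity constraint mentioned above can be illustrated with a toy example: optimize over unconstrained variables w and define the fluences as x = w², so an unconstrained L-BFGS run can never produce negative fluences. The small quadratic "dose" model below is an assumed stand-in for the IMRT objective, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy "dose" model: dose = D @ fluence, with a prescribed target dose d_target.
rng = np.random.default_rng(1)
D = rng.uniform(0.0, 1.0, size=(40, 10))          # toy dose-deposition matrix
d_target = np.full(40, 2.0)

def objective_in_w(w):
    # Parameter transformation: fluence = w**2 >= 0 by construction,
    # so the positivity constraint disappears from the optimization.
    fluence = w ** 2
    r = D @ fluence - d_target
    value = np.sum(r ** 2)
    grad_fluence = 2.0 * D.T @ r                   # d(value)/d(fluence)
    grad_w = grad_fluence * 2.0 * w                # chain rule through fluence = w**2
    return value, grad_w

w0 = np.ones(10)
res = minimize(objective_in_w, w0, jac=True, method="L-BFGS-B")
print("nonnegative fluences:", res.x ** 2)
```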

  6. The Integrated Medical Model - Optimizing In-flight Space Medical Systems to Reduce Crew Health Risk and Mission Impacts

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Walton, Marlei; Minard, Charles; Saile, Lynn; Myers, Jerry; Butler, Doug; Lyengar, Sriram; Fitts, Mary; Johnson-Throop, Kathy

    2009-01-01

    The Integrated Medical Model (IMM) is a decision support tool used by medical system planners and designers as they prepare for exploration planning activities of the Constellation program (CxP). IMM provides an evidence-based approach to help optimize the allocation of in-flight medical resources for a specified level of risk within spacecraft operational constraints. Eighty medical conditions and associated resources are represented in IMM. Nine conditions are due to Space Adaptation Syndrome. The IMM helps answer fundamental medical mission planning questions such as "What medical conditions can be expected?", "What type and quantity of medical resources are most likely to be used?", and "What is the probability of crew death or evacuation due to medical events?" For a specified mission and crew profile, the IMM effectively characterizes the sequence of events that could potentially occur should a medical condition happen. The mathematical relationships among mission and crew attributes, medical conditions and incidence data, in-flight medical resources, and potential clinical and crew health end states are established to generate end state probabilities. A Monte Carlo computational method is used to determine the probable outcomes and requires up to 25,000 mission trials to reach convergence. For each mission trial, the pharmaceuticals and supplies required to diagnose and treat prevalent medical conditions are tracked and decremented. The uncertainty of patient response to treatment is bounded via a best-case, worst-case, untreated-case algorithm. A Crew Health Index (CHI) metric, developed to account for functional impairment due to a medical condition, provides a quantified measure of risk and enables risk comparisons across mission scenarios. The use of historical in-flight medical data, terrestrial surrogate data as appropriate, and space medicine subject matter expertise has enabled the development of a probabilistic, stochastic decision support tool capable of
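
    The Monte Carlo mission-trial idea (draw medical events, decrement the kit, count end states) can be sketched as follows. The condition table, incidence rates, kit size and evacuation probabilities are entirely hypothetical placeholders, not IMM data.

```python
import numpy as np

# Hypothetical condition table: yearly incidence per crew member, resource units
# needed to treat, and evacuation probability if untreated. Purely illustrative.
conditions = [
    {"name": "back pain",   "incidence": 0.30, "units": 2, "p_evac_untreated": 0.01},
    {"name": "infection",   "incidence": 0.10, "units": 4, "p_evac_untreated": 0.05},
    {"name": "renal stone", "incidence": 0.02, "units": 6, "p_evac_untreated": 0.30},
]

def run_trials(n_trials=25_000, crew=4, mission_years=0.5, kit_units=20, seed=0):
    """Monte Carlo mission trials: draw condition occurrences, decrement the
    medical kit, and count evacuations when a condition cannot be treated."""
    rng = np.random.default_rng(seed)
    evacuations = 0
    for _ in range(n_trials):
        units_left, evac = kit_units, False
        for c in conditions:
            n_cases = rng.poisson(c["incidence"] * crew * mission_years)
            for _ in range(n_cases):
                if units_left >= c["units"]:
                    units_left -= c["units"]           # treated: consume supplies
                elif rng.random() < c["p_evac_untreated"]:
                    evac = True                        # untreated case escalates
        evacuations += evac
    return evacuations / n_trials

print(f"estimated P(evacuation) = {run_trials():.4f}")
```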

  7. Similarity-based global optimization of buildings in urban scene

    NASA Astrophysics Data System (ADS)

    Zhu, Quansheng; Zhang, Jing; Jiang, Wanshou

    2013-10-01

    In this paper, an approach for the similarity-based global optimization of buildings in urban scene is presented. In the past, most researches concentrated on single building reconstruction, making it difficult to reconstruct reliable models from noisy or incomplete point clouds. To obtain a better result, a new trend is to utilize the similarity among the buildings. Therefore, a new similarity detection and global optimization strategy is adopted to modify local-fitting geometric errors. Firstly, the hierarchical structure that consists of geometric, topological and semantic features is constructed to represent complex roof models. Secondly, similar roof models can be detected by combining primitive structure and connection similarities. At last, the global optimization strategy is applied to preserve the consistency and precision of similar roof structures. Moreover, non-local consolidation is adapted to detect small roof parts. The experiments reveal that the proposed method can obtain convincing roof models and promote the reconstruction quality of 3D buildings in urban scene.

  8. Level set based structural topology optimization for minimizing frequency response

    NASA Astrophysics Data System (ADS)

    Shu, Lei; Wang, Michael Yu; Fang, Zongde; Ma, Zhengdong; Wei, Peng

    2011-11-01

    For the purpose of structure vibration reduction, a structural topology optimization for minimizing frequency response is proposed based on the level set method. The objective of the present study is to minimize the frequency response at the specified points or surfaces on the structure with an excitation frequency or a frequency range, subject to the given amount of the material over the admissible design domain. The sensitivity analysis with respect to the structural boundaries is carried out, while the Extended finite element method (X-FEM) is employed for solving the state equation and the adjoint equation. The optimal structure with smooth boundaries is obtained by the level set evolution with advection velocity, derived from the sensitivity analysis and the optimization algorithm. A number of numerical examples, in the frameworks of two-dimension (2D) and three-dimension (3D), are presented to demonstrate the feasibility and effectiveness of the proposed approach.

  9. Voronoi Diagram Based Optimization of Dynamic Reactive Power Sources

    SciTech Connect

    Huang, Weihong; Sun, Kai; Qi, Junjian; Xu, Yan

    2015-01-01

    Dynamic var sources can effectively mitigate fault-induced delayed voltage recovery (FIDVR) issues or even voltage collapse. This paper proposes a new approach to optimization of the sizes of dynamic var sources at candidate locations by a Voronoi diagram based algorithm. It first disperses sample points of potential solutions in a searching space, evaluates a cost function at each point by barycentric interpolation for the subspaces around the point, and then constructs a Voronoi diagram about cost function values over the entire space. Accordingly, the final optimal solution can be obtained. Case studies on the WSCC 9-bus system and NPCC 140-bus system have validated that the new approach can quickly identify the boundary of feasible solutions in searching space and converge to the global optimal solution.

  10. Vision-based stereo ranging as an optimal control problem

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Sridhar, B.; Chatterji, G. B.

    1992-01-01

    The recent interest in the use of machine vision for flight vehicle guidance is motivated by the need to automate the nap-of-the-earth flight regime of helicopters. Vision-based stereo ranging problem is cast as an optimal control problem in this paper. A quadratic performance index consisting of the integral of the error between observed image irradiances and those predicted by a Pade approximation of the correspondence hypothesis is then used to define an optimization problem. The necessary conditions for optimality yield a set of linear two-point boundary-value problems. These two-point boundary-value problems are solved in feedback form using a version of the backward sweep method. Application of the ranging algorithm is illustrated using a laboratory image pair.

  11. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms exhibit a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation, so the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions that meet the required accuracies within the specified number of function evaluations. PMID:23483853

  12. Reentry trajectory optimization based on a multistage pseudospectral method.

    PubMed

    Zhao, Jiang; Zhou, Rui; Jin, Xuelian

    2014-01-01

    Of the many direct numerical methods, the pseudospectral method serves as an effective tool to solve the reentry trajectory optimization for hypersonic vehicles. However, the traditional pseudospectral method is time-consuming due to the large number of discretization points. For the purpose of autonomous and adaptive reentry guidance, the research herein presents a multistage trajectory control strategy based on the pseudospectral method, capable of dealing with unexpected situations in reentry flight. The strategy typically includes two subproblems: trajectory estimation and trajectory refining. In each processing stage, the proposed method generates a specified range of the trajectory with the transition of the flight state. The full glide trajectory consists of several optimal trajectory sequences. The newly focused geographic constraints in actual flight are discussed thereafter. Numerical examples of free-space flight, target transition flight, and threat avoidance flight are used to show the feasible application of the multistage pseudospectral method in reentry trajectory optimization. PMID:24574929

  13. Reentry Trajectory Optimization Based on a Multistage Pseudospectral Method

    PubMed Central

    Zhou, Rui; Jin, Xuelian

    2014-01-01

    Of the many direct numerical methods, the pseudospectral method serves as an effective tool to solve the reentry trajectory optimization for hypersonic vehicles. However, the traditional pseudospectral method is time-consuming due to the large number of discretization points. For the purpose of autonomous and adaptive reentry guidance, the research herein presents a multistage trajectory control strategy based on the pseudospectral method, capable of dealing with unexpected situations in reentry flight. The strategy typically includes two subproblems: trajectory estimation and trajectory refining. In each processing stage, the proposed method generates a specified range of the trajectory with the transition of the flight state. The full glide trajectory consists of several optimal trajectory sequences. The newly focused geographic constraints in actual flight are discussed thereafter. Numerical examples of free-space flight, target transition flight, and threat avoidance flight are used to show the feasible application of the multistage pseudospectral method in reentry trajectory optimization. PMID:24574929

  14. On the Integration of Risk Aversion and Average-Performance Optimization in Reservoir Control

    NASA Astrophysics Data System (ADS)

    Nardini, Andrea; Piccardi, Carlo; Soncini-Sessa, Rodolfo

    1992-02-01

    The real-time operation of a reservoir is a matter of trade-off between the two criteria of risk aversion (to avoid dramatic failures) and average-performance optimization (to yield the best long-term average performance). A methodology taking into account both criteria is presented in this paper to derive "off-line" infinite-horizon control policies for a single multipurpose reservoir, where the management goals are water supply and flood control. According to this methodology, the reservoir control policy is derived in two steps: First, a (min-max) risk aversion problem is formulated, whose solution is not unique, but rather a whole set of policies, all equivalent from the point of view of the risk-aversion objectives. Second, a stochastic average-performance optimization problem is solved, to select from the set previously obtained the best policy from the point of view of the average-performance objectives. The methodology has several interesting features: the min-max (or "guaranteed performance") approach, which is particularly suited whenever "weak" users are affected by the consequences of the decision-making process; the flexible definition of a "risk aversion degree," by the selection of those inflow sequences which are particularly feared; and the two-objective analysis which provides the manager with a whole set of alternatives among which he (she) will select the one that yields the desired trade-off between the management goals.

  15. Optimal high speed CMOS inverter design using craziness based Particle Swarm Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    De, Bishnu P.; Kar, Rajib; Mandal, Durbadal; Ghoshal, Sakti P.

    2015-07-01

    The inverter is the most fundamental logic gate that performs a Boolean operation on a single input variable. In this paper, an optimal design of a CMOS inverter using an improved version of the particle swarm optimization technique called Craziness based Particle Swarm Optimization (CRPSO) is proposed. CRPSO is very simple in concept, easy to implement and computationally efficient, with two main advantages: it has fast, near-global convergence, and it uses nearly robust control parameters. The performance of PSO depends on its control parameters and may be influenced by premature convergence and stagnation problems. To overcome these problems the PSO algorithm has been modified to CRPSO in this paper and is used for CMOS inverter design. In birds' flocking or fish schooling, a bird or a fish often changes direction suddenly. In the proposed technique, the sudden change of velocity is modelled by a direction reversal factor associated with the previous velocity and a "craziness" velocity factor associated with another direction reversal factor. The second condition is introduced depending on a predefined craziness probability to maintain the diversity of particles. The performance of CRPSO is compared with the real-coded genetic algorithm (RGA) and conventional PSO reported in the recent literature. CRPSO based design results are also compared with PSPICE based results. The simulation results show that CRPSO is superior to the other algorithms for the examples considered and can be efficiently used for CMOS inverter design.
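
    A generic sketch of the craziness-based velocity update is shown below; the direction-reversal and craziness terms follow the description above, but the exact constants, the craziness probability and the CMOS-inverter objective are not taken from the paper (a sphere function stands in for the circuit performance measure).

```python
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def crpso_sketch(f, dim=5, pop=25, iters=300, lo=-10, hi=10,
                 c1=2.05, c2=2.05, p_craz=0.3, v_craz=1e-4, seed=0):
    """Generic craziness-based PSO: the velocity keeps a randomly sign-reversed
    memory term, and with probability p_craz a small 'craziness' velocity with
    a random sign is added to preserve swarm diversity."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (pop, dim))
    v = np.zeros((pop, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pval)]
    for _ in range(iters):
        r1, r2 = rng.random((pop, 1)), rng.random((pop, 1))
        sign = np.where(rng.random((pop, 1)) < 0.5, -1.0, 1.0)   # direction reversal
        v = (r2 * sign * v
             + (1 - r2) * c1 * r1 * (pbest - x)
             + (1 - r2) * c2 * (1 - r1) * (gbest - x))
        crazy = (rng.random((pop, 1)) < p_craz).astype(float)    # craziness trigger
        v += crazy * np.where(rng.random((pop, 1)) < 0.5, -1.0, 1.0) * v_craz
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[np.argmin(pval)]
    return gbest, pval.min()

print(crpso_sketch(sphere))
```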

  16. Optimal management of asymptomatic workers at high risk of bladder cancer.

    PubMed

    Schulte, P A; Ringen, K; Hemstreet, G P

    1986-01-01

    Many cohorts of industrial workers at increased risk of occupationally induced bladder cancer are still in the preclinical disease stage. A large proportion of workers in these populations have been exposed to aromatic amines, but have not yet experienced the average latent period for bladder cancer. A need exists for definition of what constitutes optimal management for asymptomatic workers in these cohorts. Promising advances in the epidemiology, pathology, detection, and treatment of bladder cancer pressure for a reassessment of current practices and the application of the most current scientific knowledge. Some of these apparent advances, however, have not yet been rigorously evaluated. The time has come to evaluate these advances so that their application can occur while high risk cohorts are still amenable to and likely to benefit from intervention. This commentary calls for such an evaluation leading to a comprehensive approach to managing cohorts at high risk of bladder cancer. PMID:3950777

  17. Modification of species-based differential evolution for multimodal optimization

    NASA Astrophysics Data System (ADS)

    Idrus, Said Iskandar Al; Syahputra, Hermawan; Firdaus, Muliawan

    2015-12-01

    Optimization currently plays an important role in various fields, among them operational research, industry, finance and management. An optimization problem is the problem of maximizing or minimizing a function of one variable or many variables, and the functions involved include unimodal and multimodal functions. Differential Evolution (DE) is a random search technique that uses vectors as candidate solutions in the search for the optimum. To locate all local maxima and minima of a multimodal function, the function can be divided into several fitness domains using a niching method. The species-based niching method is one such method; it builds sub-populations, or species, within the function domain. This paper describes a modification of the previously published species-based method to reduce the computational complexity and run more efficiently. Results on the test functions show that the modified species-based method is able to locate all the local optima in a single run of the program.
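
    A compact sketch of species-based DE follows: the population is partitioned into species around the fittest seeds within a fixed radius, and a standard DE/rand/1 mutation with binomial crossover runs inside each species. The species-merging details and the paper's specific modification are omitted; the test function, radius and DE parameters are illustrative assumptions.

```python
import numpy as np

def multimodal(x):
    # 1-D test function with five equal maxima in [0, 1] (maximized here).
    return float(np.sum(np.sin(5 * np.pi * x) ** 6))

def form_species(pop, fit, radius):
    """Scan individuals in descending fitness order; attach each one to the
    first species seed within `radius`, otherwise make it a new seed."""
    seeds, members = [], {}
    for idx in np.argsort(-fit):
        for s in seeds:
            if np.linalg.norm(pop[idx] - pop[s]) <= radius:
                members[s].append(idx)
                break
        else:
            seeds.append(idx)
            members[idx] = [idx]
    return members

def species_de(f, dim=1, pop_size=60, iters=200, lo=0.0, hi=1.0,
               radius=0.1, F=0.5, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(iters):
        for idxs in form_species(pop, fit, radius).values():
            base = np.array(idxs) if len(idxs) >= 4 else np.arange(pop_size)
            for i in idxs:
                pool = base[base != i]
                a, b, c = rng.choice(pool, size=3, replace=False)
                mutant = pop[a] + F * (pop[b] - pop[c])        # DE/rand/1 mutation
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True                # at least one gene crosses
                trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
                trial_fit = f(trial)
                if trial_fit >= fit[i]:                        # greedy selection (maximizing)
                    pop[i], fit[i] = trial, trial_fit
    seeds = list(form_species(pop, fit, radius).keys())
    return pop[seeds], fit[seeds]

peaks, values = species_de(multimodal)
print(np.round(peaks.ravel(), 3), np.round(values, 3))
```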

  18. Optimization Model for Web Based Multimodal Interactive Simulations

    PubMed Central

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-01-01

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach. PMID:26085713

  19. Nozzle Mounting Method Optimization Based on Robot Kinematic Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Chaoyue; Liao, Hanlin; Montavon, Ghislain; Deng, Sihao

    2016-08-01

    Nowadays, the application of industrial robots in thermal spray is gaining more and more importance. A desired coating quality depends on factors such as a balanced robot performance, a uniform scanning trajectory and stable parameters (e.g. nozzle speed, scanning step, spray angle, standoff distance). These factors also affect the mass and heat transfer as well as the coating formation. Thus, the kinematic optimization of all these aspects plays a key role in obtaining an optimal coating quality. In this study, the robot performance was optimized from the aspect of nozzle mounting on the robot. An optimized nozzle mounting for a type F4 nozzle was designed, based on the conventional mounting method, from the point of view of robot kinematics, and validated on a virtual robot. Robot kinematic parameters were obtained from the simulation by offline programming software and analyzed by statistical methods. The energy consumptions of different nozzle mounting methods were also compared. The results showed that it was possible to reasonably assign the amount of robot motion to each axis during the process, thus achieving a constant nozzle speed. In this way, it is possible to optimize robot performance and to economize robot energy.

  20. CACONET: Ant Colony Optimization (ACO) Based Clustering Algorithm for VANET

    PubMed Central

    Bajwa, Khalid Bashir; Khan, Salabat; Chaudary, Nadeem Majeed; Akram, Adeel

    2016-01-01

    A vehicular ad hoc network (VANET) is a wirelessly connected network of vehicular nodes. A number of techniques, such as message ferrying, data aggregation, and vehicular node clustering, aim to improve communication efficiency in VANETs. Cluster heads (CHs), selected in the process of clustering, manage inter-cluster and intra-cluster communication. The lifetime of clusters and the number of CHs determine the efficiency of the network. In this paper, a clustering algorithm based on Ant Colony Optimization (ACO) for VANETs (CACONET) is proposed. CACONET forms optimized clusters for robust communication. CACONET is compared empirically with state-of-the-art baseline techniques like Multi-Objective Particle Swarm Optimization (MOPSO) and Comprehensive Learning Particle Swarm Optimization (CLPSO). Experiments varying the grid size of the network, the transmission range of nodes, and the number of nodes in the network were performed to evaluate the comparative effectiveness of these algorithms. For optimized clustering, the parameters considered are the transmission range, direction and speed of the nodes. The results indicate that CACONET significantly outperforms MOPSO and CLPSO. PMID:27149517

  1. Nozzle Mounting Method Optimization Based on Robot Kinematic Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Chaoyue; Liao, Hanlin; Montavon, Ghislain; Deng, Sihao

    2016-07-01

    Nowadays, the application of industrial robots in thermal spray is gaining more and more importance. A desired coating quality depends on factors such as a balanced robot performance, a uniform scanning trajectory and stable parameters (e.g. nozzle speed, scanning step, spray angle, standoff distance). These factors also affect the mass and heat transfer as well as the coating formation. Thus, the kinematic optimization of all these aspects plays a key role in obtaining an optimal coating quality. In this study, the robot performance was optimized from the aspect of nozzle mounting on the robot. An optimized nozzle mounting for a type F4 nozzle was designed, based on the conventional mounting method, from the point of view of robot kinematics, and validated on a virtual robot. Robot kinematic parameters were obtained from the simulation by offline programming software and analyzed by statistical methods. The energy consumptions of different nozzle mounting methods were also compared. The results showed that it was possible to reasonably assign the amount of robot motion to each axis during the process, thus achieving a constant nozzle speed. In this way, it is possible to optimize robot performance and to economize robot energy.

  2. Component-based integration of chemistry and optimization software.

    PubMed

    Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L

    2004-11-15

    Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform. PMID:15362128

  3. Bicriteria Network Optimization Problem using Priority-based Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Lin, Lin; Cheng, Runwei

    Network optimization is an increasingly important and fundamental issue in fields such as engineering, computer science, operations research, transportation, telecommunication, decision support systems, manufacturing, and airline scheduling. In many applications, however, there are several criteria associated with traversing each edge of a network. For example, cost and flow measures are both important in networks. As a result, there has been recent interest in solving the Bicriteria Network Optimization Problem. The Bicriteria Network Optimization Problem is known to be NP-hard. The efficient set of paths may be very large, possibly exponential in size; thus the computational effort required to solve it can increase exponentially with the problem size in the worst case. In this paper, we propose a genetic algorithm (GA) approach using a priority-based chromosome for solving the bicriteria network optimization problem, including the maximum flow (MXF) model and the minimum cost flow (MCF) model. The objective is to find the set of Pareto optimal solutions that give the possible maximum flow with minimum cost. This paper also combines the Adaptive Weight Approach (AWA), which utilizes some useful information from the current population to readjust weights for obtaining a search pressure toward a positive ideal point. Computer simulations on several difficult-to-solve network design problems show the effectiveness of the proposed method.
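
    The priority-based encoding can be illustrated with a small decoder that grows a source-to-sink path by always moving to the unvisited neighbor whose priority gene is highest. The example graph and priority vector are assumptions, and the full bicriteria GA with the Adaptive Weight Approach is not reproduced.

```python
# Priority-based decoding of a chromosome into a source-to-sink path:
# each node carries a priority gene, and the path repeatedly moves to the
# eligible (unvisited) neighbor with the highest priority.

def decode_path(priorities, adjacency, source, sink):
    path, node, visited = [source], source, {source}
    while node != sink:
        candidates = [n for n in adjacency[node] if n not in visited]
        if not candidates:
            return None                      # dead end: infeasible chromosome
        node = max(candidates, key=lambda n: priorities[n])
        path.append(node)
        visited.add(node)
    return path

# Illustrative 6-node network (node: list of neighbors) and one chromosome.
adjacency = {0: [1, 2], 1: [3, 4], 2: [4], 3: [5], 4: [5], 5: []}
priorities = [0.9, 0.2, 0.7, 0.5, 0.8, 1.0]   # one priority gene per node

print(decode_path(priorities, adjacency, source=0, sink=5))   # [0, 2, 4, 5]
```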

  4. CACONET: Ant Colony Optimization (ACO) Based Clustering Algorithm for VANET.

    PubMed

    Aadil, Farhan; Bajwa, Khalid Bashir; Khan, Salabat; Chaudary, Nadeem Majeed; Akram, Adeel

    2016-01-01

    A vehicular ad hoc network (VANET) is a wirelessly connected network of vehicular nodes. A number of techniques, such as message ferrying, data aggregation, and vehicular node clustering aim to improve communication efficiency in VANETs. Cluster heads (CHs), selected in the process of clustering, manage inter-cluster and intra-cluster communication. The lifetime of clusters and number of CHs determines the efficiency of network. In this paper a Clustering algorithm based on Ant Colony Optimization (ACO) for VANETs (CACONET) is proposed. CACONET forms optimized clusters for robust communication. CACONET is compared empirically with state-of-the-art baseline techniques like Multi-Objective Particle Swarm Optimization (MOPSO) and Comprehensive Learning Particle Swarm Optimization (CLPSO). Experiments varying the grid size of the network, the transmission range of nodes, and number of nodes in the network were performed to evaluate the comparative effectiveness of these algorithms. For optimized clustering, the parameters considered are the transmission range, direction and speed of the nodes. The results indicate that CACONET significantly outperforms MOPSO and CLPSO. PMID:27149517

  5. Optimal assignment methods for ligand-based virtual screening

    PubMed Central

    2009-01-01

    Background Ligand-based virtual screening experiments are an important task in the early drug discovery stage. An ambitious aim in each experiment is to disclose active structures based on new scaffolds. To perform these "scaffold-hoppings" for individual problems and targets, a plethora of different similarity methods based on diverse techniques were published in the last years. The optimal assignment approach on molecular graphs, a successful method in the field of quantitative structure-activity relationships, has not been tested as a ligand-based virtual screening method so far. Results We evaluated two already published and two new optimal assignment methods on various data sets. To emphasize the "scaffold-hopping" ability, we used the information of chemotype clustering analyses in our evaluation metrics. Comparisons with literature results show an improved early recognition performance and comparable results over the complete data set. A new method based on two different assignment steps shows an increased "scaffold-hopping" behavior together with a good early recognition performance. Conclusion The presented methods show a good combination of chemotype discovery and enrichment of active structures. Additionally, the optimal assignment on molecular graphs has the advantage to investigate and interpret the mappings, allowing precise modifications of internal parameters of the similarity measure for specific targets. All methods have low computation times which make them applicable to screen large data sets. PMID:20150995

  6. Risk perception, risk evaluation and human values: cognitive bases of acceptability of a radioactive waste repository

    SciTech Connect

    Earle, T.C.; Lindell, M.K.; Rankin, W.L.

    1981-07-01

    Public acceptance of radioactive waste management alternatives depends in part on public perception of the associated risks. Three aspects of those perceived risks were explored in this study: (1) synthetic measures of risk perception based on judgments of probability and consequences; (2) acceptability of hypothetical radioactive waste policies, and (3) effects of human values on risk perception. Both the work on synthetic measures of risk perception and on the acceptability of hypothetical policies included investigations of three categories of risk: (1) Short-term public risk (affecting persons living when the wastes are created), (2) Long-term public risk (affecting persons living after the time the wastes were created), and (3) Occupational risk (affecting persons working with the radioactive wastes). The human values work related to public risk perception in general, across categories of persons affected. Respondents were selected according to a purposive sampling strategy.

  7. Data-based robust multiobjective optimization of interconnected processes: energy efficiency case study in papermaking.

    PubMed

    Afshar, Puya; Brown, Martin; Maciejowski, Jan; Wang, Hong

    2011-12-01

    Reducing energy consumption is a major challenge for "energy-intensive" industries such as papermaking. A commercially viable energy saving solution is to employ data-based optimization techniques to obtain a set of "optimized" operational settings that satisfy certain performance indices. The difficulties of this are: 1) the problems of this type are inherently multicriteria in the sense that improving one performance index might result in compromising the other important measures; 2) practical systems often exhibit unknown complex dynamics and several interconnections which make the modeling task difficult; and 3) as the models are acquired from the existing historical data, they are valid only locally and extrapolations incorporate risk of increasing process variability. To overcome these difficulties, this paper presents a new decision support system for robust multiobjective optimization of interconnected processes. The plant is first divided into serially connected units to model the process, product quality, energy consumption, and corresponding uncertainty measures. Then multiobjective gradient descent algorithm is used to solve the problem in line with user's preference information. Finally, the optimization results are visualized for analysis and decision making. In practice, if further iterations of the optimization algorithm are considered, validity of the local models must be checked prior to proceeding to further iterations. The method is implemented by a MATLAB-based interactive tool DataExplorer supporting a range of data analysis, modeling, and multiobjective optimization techniques. The proposed approach was tested in two U.K.-based commercial paper mills where the aim was reducing steam consumption and increasing productivity while maintaining the product quality by optimization of vacuum pressures in forming and press sections. The experimental results demonstrate the effectiveness of the method. PMID:22147299

  8. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    PubMed Central

    Li, Meng; Yi, Liangzhong; Pei, Zheng; Gao, Zhisheng

    2015-01-01

    This paper puts forward a prediction model based on membrane computing optimization algorithm for chaos time series; the model optimizes simultaneously the parameters of phase space reconstruction (τ, m) and least squares support vector machine (LS-SVM) (γ, σ) by using membrane computing optimization algorithm. It is an important basis for spectrum management to predict accurately the change trend of parameters in the electromagnetic environment, which can help decision makers to adopt an optimal action. Then, the model presented in this paper is used to forecast band occupancy rate of frequency modulation (FM) broadcasting band and interphone band. To show the applicability and superiority of the proposed model, this paper will compare the forecast model presented in it with conventional similar models. The experimental results show that whether single-step prediction or multistep prediction, the proposed model performs best based on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). PMID:25874249
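
    The phase-space reconstruction step controlled by the parameters (τ, m) can be sketched as a plain time-delay embedding. The LS-SVM fit and the membrane-computing optimization of (τ, m, γ, σ) are not reproduced, and the logistic-map series below is an assumed stand-in for a band-occupancy series.

```python
import numpy as np

def delay_embed(series, m, tau):
    """Time-delay embedding: rows are [x[t], x[t+tau], ..., x[t+(m-1)tau]],
    paired with the next value x[t+(m-1)tau+1] as the prediction target."""
    n = len(series) - (m - 1) * tau - 1
    X = np.column_stack([series[i * tau: i * tau + n] for i in range(m)])
    y = series[(m - 1) * tau + 1: (m - 1) * tau + 1 + n]
    return X, y

# Illustrative chaotic series (logistic map) standing in for band-occupancy data.
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

X, y = delay_embed(x, m=3, tau=2)
print(X.shape, y.shape)        # (495, 3) (495,)
```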

  9. Computer Based Porosity Design by Multi Phase Topology Optimization

    NASA Astrophysics Data System (ADS)

    Burblies, Andreas; Busse, Matthias

    2008-02-01

    A numerical simulation technique called Multi Phase Topology Optimization (MPTO) based on finite element method has been developed and refined by Fraunhofer IFAM during the last five years. MPTO is able to determine the optimum distribution of two or more different materials in components under thermal and mechanical loads. The objective of optimization is to minimize the component's elastic energy. Conventional topology optimization methods which simulate adaptive bone mineralization have got the disadvantage that there is a continuous change of mass by growth processes. MPTO keeps all initial material concentrations and uses methods adapted from molecular dynamics to find energy minimum. Applying MPTO to mechanically loaded components with a high number of different material densities, the optimization results show graded and sometimes anisotropic porosity distributions which are very similar to natural bone structures. Now it is possible to design the macro- and microstructure of a mechanical component in one step. Computer based porosity design structures can be manufactured by new Rapid Prototyping technologies. Fraunhofer IFAM has applied successfully 3D-Printing and Selective Laser Sintering methods in order to produce very stiff light weight components with graded porosities calculated by MPTO.

  10. Mars Mission Optimization Based on Collocation of Resources

    NASA Technical Reports Server (NTRS)

    Chamitoff, G. E.; James, G. H.; Barker, D. C.; Dershowitz, A. L.

    2003-01-01

    This paper presents a powerful approach for analyzing Martian data and for optimizing mission site selection based on resource collocation. This approach is implemented in a program called PROMT (Planetary Resource Optimization and Mapping Tool), which provides a wide range of analysis and display functions that can be applied to raw data or imagery. Thresholds, contours, custom algorithms, and graphical editing are some of the various methods that can be used to process data. Output maps can be created to identify surface regions on Mars that meet any specific criteria. The use of this tool for analyzing data, generating maps, and collocating features is demonstrated using data from the Mars Global Surveyor and the Odyssey spacecraft. The overall mission design objective is to maximize a combination of scientific return and self-sufficiency based on utilization of local materials. Landing site optimization involves maximizing accessibility to collocated science and resource features within a given mission radius. Mission types are categorized according to duration, energy resources, and in-situ resource utilization. Optimization results are shown for a number of mission scenarios.

  11. The optimal community detection of software based on complex networks

    NASA Astrophysics Data System (ADS)

    Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong

    2016-02-01

    The community structure is important for software in terms of understanding design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed based on the dependency relationships among software functions. First, by analyzing the information of multiple execution traces of one piece of software, we construct the Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by the measured results. Third, we select the top K (K=1,2,…) nodes as the cores of the primal communities (each community has only one core node). By comparing the dependency relationships between each node and the K communities, we put the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity with different initial K to obtain the optimal division. Experiments verify that the OPSN method is efficient in detecting the optimal community structure in various software systems.
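
    The final model-selection step, comparing candidate divisions by modularity and keeping the best one, can be sketched with NetworkX; the small call graph and the two candidate partitions are illustrative, and the FA-based core selection of OPSN is not reproduced.

```python
import networkx as nx
from networkx.algorithms import community

# Illustrative software execution dependency graph (function-call edges).
G = nx.Graph([("main", "parse"), ("main", "run"), ("run", "step"),
              ("run", "log"), ("parse", "tokenize"), ("parse", "validate"),
              ("step", "update"), ("step", "log")])

# Compare candidate partitions by modularity and keep the best one,
# mirroring the "try different K, keep the division with highest modularity" idea.
candidates = [
    [{"main", "parse", "tokenize", "validate"}, {"run", "step", "update", "log"}],
    [{"main"}, {"parse", "tokenize", "validate"}, {"run", "step", "update", "log"}],
]
best = max(candidates, key=lambda p: community.modularity(G, p))
print(community.modularity(G, best), best)
```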

  12. Chaos time series prediction based on membrane optimization algorithms.

    PubMed

    Li, Meng; Yi, Liangzhong; Pei, Zheng; Gao, Zhisheng; Peng, Hong

    2015-01-01

    This paper puts forward a prediction model for chaotic time series based on a membrane computing optimization algorithm; the model simultaneously optimizes the parameters of the phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) using the membrane computing optimization algorithm. Accurately predicting trends of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt optimal actions. The model presented in this paper is used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, it is compared with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best according to three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). PMID:25874249
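
    A minimal sketch of the prediction pipeline is shown below. A plain grid search and kernel ridge regression with an RBF kernel stand in for the membrane-computing optimizer and the LS-SVM of the paper, so the parameter names (τ, m, γ, σ) map only approximately onto the scikit-learn arguments.

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.metrics import mean_squared_error

        def embed(series, tau, m):
            """Phase-space reconstruction: row i is [x(i), x(i+tau), ..., x(i+(m-1)tau)],
            and the target is the next sample x(i+(m-1)tau+1)."""
            n = len(series) - (m - 1) * tau - 1
            X = np.array([[series[i + j * tau] for j in range(m)] for i in range(n)])
            y = np.asarray(series[(m - 1) * tau + 1:(m - 1) * tau + 1 + n])
            return X, y

        def fit_best(series, taus, ms, gammas, sigmas):
            best_params, best_nmse = None, np.inf
            for tau in taus:
                for m in ms:
                    X, y = embed(series, tau, m)
                    split = int(0.8 * len(y))          # simple train/test split
                    for gam in gammas:                 # regularization strength
                        for sig in sigmas:             # RBF kernel width
                            model = KernelRidge(alpha=1.0 / gam, kernel="rbf",
                                                gamma=1.0 / (2.0 * sig ** 2))
                            model.fit(X[:split], y[:split])
                            pred = model.predict(X[split:])
                            nmse = mean_squared_error(y[split:], pred) / np.var(y[split:])
                            if nmse < best_nmse:
                                best_params, best_nmse = (tau, m, gam, sig), nmse
            return best_params, best_nmse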

  13. Mesh Optimization for Monte Carlo-Based Optical Tomography

    PubMed Central

    Edmans, Andrew; Intes, Xavier

    2015-01-01

    Mesh-based Monte Carlo techniques for optical imaging allow for accurate modeling of light propagation in complex biological tissues. Recently, they have been developed within an efficient computational framework to be used as a forward model in optical tomography. However, commonly employed adaptive mesh discretization techniques have not yet been implemented for Monte Carlo based tomography. Herein, we propose a methodology to optimize the mesh discretization and analytically rescale the associated Jacobian based on the characteristics of the forward model. We demonstrate that this method maintains the accuracy of the forward model even in the case of temporal data sets while allowing for significant coarsening or refinement of the mesh. PMID:26566523

  14. [Optimized Spectral Indices Based Estimation of Forage Grass Biomass].

    PubMed

    An, Hai-bo; Li, Fei; Zhao, Meng-li; Liu, Ya-jun

    2015-11-01

    As an important indicator of forage production, aboveground biomass directly reflects the growth of forage grass. Therefore, real-time monitoring of forage grass biomass plays a crucial role in suitable grazing and management of artificial and natural grassland. However, traditional sampling and measuring are time-consuming and labor-intensive. Recently, the development of hyperspectral remote sensing has made it feasible to derive forage grass biomass in a timely and nondestructive way. In the present study, the main objective was to explore the robustness of published and optimized spectral indices for estimating the biomass of forage grass in natural and artificial pasture. A natural pasture with four grazing densities (control, light grazing, moderate grazing, and high grazing) was established in desert steppe, and trials with different forage cultivars and different N rates were conducted in artificial forage fields in Inner Mongolia. The canopy reflectance and biomass in each plot were measured during critical stages. The results showed that, owing to differences in canopy structure and biomass, canopy reflectance differed greatly among forage grass types. The best performing spectral index varied among forage grass species and treatments (R² = 0.00-0.69). The predictive ability of spectral indices decreased under the low biomass of desert steppe, while red-band based spectral indices lost sensitivity under the moderate-to-high biomass of forage maize. When the band combinations of simple ratio and normalized difference spectral indices were optimized on the combined datasets of natural and artificial grassland, the optimized spectral indices significantly increased the predictive ability, and the model between biomass and the optimized spectral indices had the highest R² (R² = 0.72) compared to published spectral indices. Sensitivity analysis further confirmed that the optimized index had the lowest noise equivalent and was the best performing index in
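
    The band-optimization step can be sketched as an exhaustive search over all two-band normalized difference indices, keeping the pair most strongly correlated with measured biomass. The reflectance matrix and biomass vector are hypothetical inputs, and a simple linear R² replaces the regression models used in the study.

        import numpy as np

        def optimize_ndsi(reflectance, biomass):
            """reflectance : (n_samples, n_bands) canopy reflectance spectra
            biomass     : (n_samples,) measured aboveground biomass
            Returns (band_i, band_j, best_R2) for NDSI = (Ri - Rj) / (Ri + Rj)."""
            n_bands = reflectance.shape[1]
            best = (0, 0, -np.inf)
            for i in range(n_bands):
                for j in range(i + 1, n_bands):
                    ndsi = (reflectance[:, i] - reflectance[:, j]) / \
                           (reflectance[:, i] + reflectance[:, j] + 1e-12)
                    r2 = np.corrcoef(ndsi, biomass)[0, 1] ** 2
                    if r2 > best[2]:
                        best = (i, j, r2)
            return best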

  15. Parameter optimization in differential geometry based solvation models.

    PubMed

    Wang, Bao; Wei, G W

    2015-10-01

    Differential geometry (DG) based solvation models are a new class of variational implicit solvent approaches that are able to avoid unphysical solvent-solute boundary definitions and associated geometric singularities, and to dynamically couple polar and non-polar interactions in a self-consistent framework. Our earlier study indicates that the DG based non-polar solvation model outperforms other methods in non-polar solvation energy predictions. However, the DG based full solvation model has not yet shown its superiority in solvation analysis, owing to the difficulty of its parametrization, which must ensure the stability of the solution of the strongly coupled nonlinear Laplace-Beltrami and Poisson-Boltzmann equations. In this work, we introduce new parameter learning algorithms based on perturbation and convex optimization theories to stabilize the numerical solution and thus achieve an optimal parametrization of the DG based solvation models. An interesting feature of the present DG based solvation model is that it provides accurate solvation free energy predictions for both polar and non-polar molecules in a unified formulation. Extensive numerical experiments demonstrate that the present DG based solvation model delivers some of the most accurate predictions of solvation free energies for a large number of molecules. PMID:26450304

  16. On combining Laplacian and optimization-based mesh smoothing techniques

    SciTech Connect

    Freitag, L.A.

    1997-07-01

    Local mesh smoothing algorithms have been shown to be effective in repairing distorted elements in automatically generated meshes. The simplest such algorithm is Laplacian smoothing, which moves grid points to the geometric center of incident vertices. Unfortunately, this method operates heuristically and can create invalid meshes or elements of worse quality than those contained in the original mesh. In contrast, optimization-based methods are designed to maximize some measure of mesh quality and are very effective at eliminating extremal angles in the mesh. These improvements come at a higher computational cost, however. In this article the author proposes three smoothing techniques that combine a smart variant of Laplacian smoothing with an optimization-based approach. Several numerical experiments are performed that compare the mesh quality and computational cost for each of the methods in two and three dimensions. The author finds that the combined approaches are very cost effective and yield high-quality meshes.
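
    The "smart" variant of Laplacian smoothing referred to above accepts the centroid move only when it does not degrade local element quality; in the combined schemes, vertices it cannot improve are handed to the optimization-based smoother. A minimal 2-D sketch follows, with a standard area-to-edge-length triangle quality measure chosen here for illustration.

        import numpy as np

        def tri_quality(p0, p1, p2):
            """Scaled ratio of area to summed squared edge lengths:
            1 for an equilateral triangle, smaller for distorted ones."""
            area = 0.5 * abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                             - (p1[1] - p0[1]) * (p2[0] - p0[0]))
            edges = (np.sum((p1 - p0) ** 2) + np.sum((p2 - p1) ** 2)
                     + np.sum((p0 - p2) ** 2))
            return 4.0 * np.sqrt(3.0) * area / edges

        def smart_laplacian_step(coords, triangles, vertex, neighbors):
            """Move 'vertex' to the centroid of its neighbors only if the worst
            quality of its incident triangles does not get worse."""
            incident = [t for t in triangles if vertex in t]
            old_q = min(tri_quality(*(coords[v] for v in t)) for t in incident)
            trial = coords.copy()
            trial[vertex] = coords[list(neighbors)].mean(axis=0)   # Laplacian move
            new_q = min(tri_quality(*(trial[v] for v in t)) for t in incident)
            if new_q >= old_q:                                     # accept only if not worse
                coords[vertex] = trial[vertex]
            return coords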

  17. Bell-Curve Based Evolutionary Strategies for Structural Optimization

    NASA Technical Reports Server (NTRS)

    Kincaid, Rex K.

    2000-01-01

    Evolutionary methods are exceedingly popular with practitioners of many fields; more so than perhaps any optimization tool in existence. Historically, Genetic Algorithms (GAs) led the way in practitioner popularity (Reeves 1997). However, in the last ten years Evolutionary Strategies (ESs) and Evolutionary Programs (EPs) have gained a significant foothold (Glover 1998). One partial explanation for this shift is the interest in using GAs to solve continuous optimization problems. The typical GA relies upon a cumbersome binary representation of the design variables. An ES or EP, however, works directly with the real-valued design variables. For detailed references on evolutionary methods in general, and ES or EP in particular, see Back (1996) and Dasgupta and Michalewicz (1997). We call our evolutionary algorithm BCB (bell curve based) since it is based upon two normal distributions.
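
    The contrast with a binary-encoded GA is easiest to see in code. Below is a minimal real-valued (mu + lambda) evolution strategy with Gaussian mutation; the specific BCB sampling built from two normal distributions is not detailed in the abstract, so only the generic ES skeleton is shown.

        import numpy as np

        def evolution_strategy(objective, x0, sigma=0.3, mu=10, lam=40,
                               generations=200, seed=0):
            """Minimize 'objective' over real-valued design variables."""
            rng = np.random.default_rng(seed)
            dim = len(x0)
            parents = x0 + sigma * rng.standard_normal((mu, dim))
            for _ in range(generations):
                # each offspring is a Gaussian perturbation of a random parent
                idx = rng.integers(0, mu, size=lam)
                offspring = parents[idx] + sigma * rng.standard_normal((lam, dim))
                pool = np.vstack([parents, offspring])
                fitness = np.apply_along_axis(objective, 1, pool)
                parents = pool[np.argsort(fitness)[:mu]]       # keep the mu best
            return parents[0]

        # Example: minimize a quadratic bowl in five real design variables
        best = evolution_strategy(lambda x: float(np.sum(x ** 2)), x0=np.ones(5))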

  18. Bell-Curve Based Evolutionary Strategies for Structural Optimization

    NASA Technical Reports Server (NTRS)

    Kincaid, Rex K.

    2001-01-01

    Evolutionary methods are exceedingly popular with practitioners of many fields; more so than perhaps any optimization tool in existence. Historically, Genetic Algorithms (GAs) led the way in practitioner popularity. However, in the last ten years Evolutionary Strategies (ESs) and Evolutionary Programs (EPs) have gained a significant foothold. One partial explanation for this shift is the interest in using GAs to solve continuous optimization problems. The typical GA relies upon a cumbersome binary representation of the design variables. An ES or EP, however, works directly with the real-valued design variables. For detailed references on evolutionary methods in general, and ES or EP in particular, see Back and Dasgupta and Michalewicz. We call our evolutionary algorithm BCB (bell curve based) since it is based upon two normal distributions.

  19. Finite Element Based HWB Centerbody Structural Optimization and Weight Prediction

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper describes a scalable structural model suitable for Hybrid Wing Body (HWB) centerbody analysis and optimization. The geometry of the centerbody and primary wing structure is based on a Vehicle Sketch Pad (VSP) surface model of the aircraft and a FLOPS-compatible parameterization of the centerbody. Structural analysis, optimization, and weight calculation are based on a Nastran finite element model of the primary HWB structural components, featuring the centerbody, mid section, and outboard wing. Different centerbody designs, such as single-bay or multi-bay options, are analyzed, and the weight calculations are compared to current FLOPS results. For proper structural sizing and weight estimation, internal pressure and maneuver flight loads are applied. Results are presented for aerodynamic loads, deformations, and centerbody weight.

  20. GA-Based Image Restoration by Isophote Constraint Optimization

    NASA Astrophysics Data System (ADS)

    Kim, Jong Bae; Kim, Hang Joon

    2003-12-01

    We propose an efficient technique for image restoration based on a genetic algorithm (GA) with an isophote constraint. In our technique, the image restoration problem is modeled as an optimization problem which, in our case, is solved by minimizing a cost function with an isophote constraint using a GA. We consider an image to be decomposed into isophotes based on connected components of constant intensity. The technique creates an optimal connection of all pairs of isophotes disconnected by a caption in the frame. For connecting the disconnected isophotes, we estimate the smoothness value given by the best chromosomes of the GA and project this value in the isophote direction. Experimental results show strong potential for automatic restoration of caption regions in advertisement scenes.

  1. Process optimization electrospinning fibrous material based on polyhydroxybutyrate

    NASA Astrophysics Data System (ADS)

    Olkhov, A. A.; Tyubaeva, P. M.; Staroverova, O. V.; Mastalygina, E. E.; Popov, A. A.; Ischenko, A. A.; Iordanskii, A. L.

    2016-05-01

    The article analyzes the influence of the main technological parameters of electrostatic spinning on the morphology and properties of ultrathin fibers based on polyhydroxybutyrate (PHB). It is found that the electrical conductivity and viscosity of the spinning solution affect the formation of the fiber macrostructure. Controlling the viscosity and conductivity of the spinning solution therefore makes it possible to control and optimize the geometry of PHB-based fiber materials. The resulting fibers have found use in medicine, particularly in structural elements for the musculoskeletal system.

  2. An Optimality-Based Fully-Distributed Watershed Ecohydrological Model

    NASA Astrophysics Data System (ADS)

    Chen, L., Jr.

    2015-12-01

    Watershed ecohydrological models are essential tools for assessing the impact of climate change and human activities on hydrological and ecological processes for watershed management. Existing models can be classified as empirically based, quasi-mechanistic, and mechanistic models. The empirically based and quasi-mechanistic models usually adopt empirical or quasi-empirical equations, which may be incapable of capturing the non-stationary dynamics of the target processes. Mechanistic models that are designed to represent process feedbacks may capture vegetation dynamics, but often have more demanding spatial and temporal parameterization requirements to represent vegetation physiological variables. In recent years, optimality based ecohydrological models have been proposed, which have the advantage of reducing the need for model calibration by assuming critical aspects of system behavior. However, this work to date has been limited to the plot scale, considering only one-dimensional exchange of soil moisture, carbon, and nutrients in vegetation parameterization without lateral hydrological transport. Conceptual isolation of individual ecosystem patches from upslope and downslope flow paths compromises the ability to represent and test the relationships between hydrology and vegetation in mountainous and hilly terrain. This work presents an optimality-based watershed ecohydrological model, which incorporates the influence of lateral hydrological processes on the hydrological flow-path patterns that emerge from the optimality assumption. The model has been tested in the Walnut Gulch watershed and shows good agreement with observed temporal and spatial patterns of evapotranspiration (ET) and gross primary productivity (GPP). The spatial variability of ET and GPP produced by the model matches the spatial distributions of TWI, SCA, and slope well over the area. Compared with the one-dimensional vegetation optimality model (VOM), we find that the distributed VOM (DisVOM) produces more reasonable spatial

  3. Study of risk based on web software testing

    NASA Astrophysics Data System (ADS)

    Wang, Xin

    2013-03-01

    Testing Web-based systems involves particular difficulties and challenges. This article points out the security risks of Web application systems and, through an analysis of the implementation issues involved in Web-based testing, proposes a workflow for Web testing. It adds a detailed study of how risks are selected within this process and discusses security, performance, accuracy, compatibility, reliability, and other risk factors in detail. These risks need to be accounted for when a Web application testing program is established, in order to produce a better Web-based test plan.

  4. A Power Grid Optimization Algorithm by Observing Timing Error Risk by IR Drop

    NASA Astrophysics Data System (ADS)

    Kawakami, Yoshiyuki; Terao, Makoto; Fukui, Masahiro; Tsukiyama, Shuji

    With the advent of the deep submicron age, circuit performance is strongly impacted by process variations, and the sensitivity of circuit delay to the power-supply voltage keeps increasing due to CMOS feature-size shrinkage. Power grid optimization that considers the timing error risk caused by these variations and IR drop therefore becomes very important for stable, high-speed operation of a system-on-chip. Conventionally, many power grid optimization algorithms have been proposed, and most of them use IR drop in their objective functions. However, IR drop is an indirect metric, and we suspect that it is too vague a metric for the real goal of LSI design. In this paper, we first propose an approach that uses the “timing error risk caused by IR drop” as a direct objective function. Second, a critical path map is introduced to express the existence of critical paths distributed over the entire chip. The timing error risk is decreased by using the critical path map and the new objective function. Experimental results show the effectiveness of the approach.

  5. Mode-tracking based stationary-point optimization.

    PubMed

    Bergeler, Maike; Herrmann, Carmen; Reiher, Markus

    2015-07-15

    In this work, we present a transition-state optimization protocol based on the Mode-Tracking algorithm [Reiher and Neugebauer, J. Chem. Phys., 2003, 118, 1634]. By calculating only the eigenvector of interest instead of diagonalizing the full Hessian matrix, and performing an eigenvector-following search based on the selectively calculated vector, we can efficiently optimize transition-state structures. The initial guess structures and eigenvectors are either chosen from a linear interpolation between the reactant and product structures, from a nudged-elastic band search, from a constrained-optimization scan, or from the minimum-energy structures. Alternatively, initial guess vectors based on chemical intuition may be defined. We then iteratively refine the selected vectors by the Davidson subspace iteration technique. This procedure accelerates the search for transition states in large molecules of a few hundred atoms. It is also beneficial in cases where the starting structure is very different from the transition-state structure or where the desired vector to follow is not the one with the lowest eigenvalue. Explorative studies of reaction pathways are feasible by following manually constructed molecular distortions. PMID:26073318
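
    The subspace refinement at the heart of the method can be illustrated with a minimal Davidson iteration for the lowest eigenpair of a symmetric matrix. Mode-Tracking additionally targets a user-selected mode and obtains Hessian-vector products without building the full Hessian, which is not shown in this sketch.

        import numpy as np

        def davidson_lowest(A, guess, tol=1e-8, max_iter=50):
            """Lowest eigenvalue/eigenvector of a symmetric matrix A, refined
            iteratively from an initial guess vector."""
            n = A.shape[0]
            V = (guess / np.linalg.norm(guess)).reshape(n, 1)
            diag = np.diag(A)                        # diagonal preconditioner
            for _ in range(max_iter):
                H = V.T @ A @ V                      # small projected matrix
                vals, vecs = np.linalg.eigh(H)
                theta, s = vals[0], vecs[:, 0]
                x = V @ s                            # current Ritz vector
                r = A @ x - theta * x                # residual
                if np.linalg.norm(r) < tol:
                    break
                t = r / (theta - diag + 1e-12)       # Davidson correction vector
                t -= V @ (V.T @ t)                   # orthogonalize against subspace
                V = np.hstack([V, (t / np.linalg.norm(t)).reshape(n, 1)])
            return theta, x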

  6. Biological Based Risk Assessment for Space Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Exposures from galactic cosmic rays (GCR) - made up of high-energy protons and high-energy and charge (HZE) nuclei - and solar particle events (SPEs) - composed largely of low- to medium-energy protons - are the primary health concern for astronauts on long-term space missions. Experimental studies have shown that HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, making risk assessments for cancer and degenerative risks, such as central nervous system effects and heart disease, highly uncertain. The goal of space radiation protection at NASA is to reduce the uncertainties in risk assessments for Mars exploration to be small enough to ensure that acceptable levels of risk are not exceeded and to adequately assess the efficacy of mitigation measures such as shielding or biological countermeasures. We review the recent BEIR VII and UNSCEAR-2006 models of cancer risks and their uncertainties. These models are shown to have an inherent 2-fold uncertainty, defined as the ratio of the 95% confidence level to the mean projection, even before radiation quality is considered. In order to overcome the uncertainties in these models, new approaches to risk assessment are warranted. We consider new computational biology approaches to modeling cancer risks. A basic program of research is described, ranging from stochastic descriptions of the physics and chemistry of radiation tracks and the biochemistry of metabolic pathways to emerging biological understanding of the cellular and tissue modifications leading to cancer.

  7. Genetic Algorithm (GA)-Based Inclinometer Layout Optimization

    PubMed Central

    Liang, Weijie; Zhang, Ping; Chen, Xianping; Cai, Miao; Yang, Daoguo

    2015-01-01

    This paper presents numerical simulation results for an airflow inclinometer, including sensitivity studies and thermal optimization of its printed circuit board (PCB) layout based on a genetic algorithm (GA). Due to the working principle of the gas sensor, changes in ambient temperature may cause dramatic voltage drifts of the sensors. Therefore, eliminating the influence of the external environment on the airflow is essential for the performance and reliability of an airflow inclinometer. In this paper, the mechanism of an airflow inclinometer and the influence of different ambient temperatures on the sensitivity of the inclinometer are examined using the ANSYS-FLOTRAN CFD program. The results show that the sensitivity of the airflow inclinometer is inversely proportional to the ambient temperature at the sensing element, decreasing as the ambient temperature increases. A GA is used to optimize the PCB thermal layout of the inclinometer. The finite-element simulation method (ANSYS) is introduced to simulate and verify the results of our optimal thermal layout, and the results indicate that the optimal PCB layout greatly improves (by more than 50%) the sensitivity of the inclinometer. The study may be useful in the design of PCB layouts related to sensitivity improvement of gas sensors. PMID:25897500

  8. Optimal network topology for structural robustness based on natural connectivity

    NASA Astrophysics Data System (ADS)

    Peng, Guan-sheng; Wu, Jun

    2016-02-01

    The structural robustness of the infrastructure of various real-life systems, which can be represented by networks, is of great importance. We have therefore proposed a tabu search algorithm to optimize the structural robustness of a given network by rewiring the links while keeping the node degrees fixed. The objective of our algorithm is to maximize a new structural robustness measure, natural connectivity, which provides a sensitive and reliable measure of the structural robustness of complex networks and has lower computational complexity. We initially applied this method to several networks with different degree distributions for comparative analysis and investigated the basic properties of the optimal network. We discovered that the optimal network based on the power-law degree distribution exhibits a roughly "eggplant-like" topology, with a cluster of high-degree nodes at the head and the other low-degree nodes scattered across the body of the "eggplant". Additionally, the cost of rewiring links in practical applications is considered; we therefore refined the method by employing an assortative rewiring strategy and validated its efficiency.
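
    A minimal sketch of the measure and the rewiring loop is given below; a plain hill-climb over degree-preserving double edge swaps stands in for the tabu search of the paper.

        import numpy as np
        import networkx as nx

        def natural_connectivity(G):
            """ln of the average of exp(eigenvalue) over the adjacency spectrum."""
            eig = np.linalg.eigvalsh(nx.to_numpy_array(G))
            return float(np.log(np.mean(np.exp(eig))))

        def optimize_robustness(G, steps=1000, seed=0):
            rng = np.random.default_rng(seed)
            current = natural_connectivity(G)
            for _ in range(steps):
                trial = G.copy()
                # degree-preserving rewiring of one randomly chosen link pair
                nx.double_edge_swap(trial, nswap=1, max_tries=100,
                                    seed=int(rng.integers(1 << 30)))
                value = natural_connectivity(trial)
                if value > current:                  # keep improving rewirings only
                    G, current = trial, value
            return G, current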

  9. A global optimization paradigm based on change of measures.

    PubMed

    Sarkar, Saikat; Roy, Debasish; Vasu, Ram Mohan

    2015-07-01

    A global optimization framework, COMBEO (Change Of Measure Based Evolutionary Optimization), is proposed. An important aspect in the development is a set of derivative-free additive directional terms, obtainable through a change of measures en route to the imposition of any stipulated conditions aimed at driving the realized design variables (particles) to the global optimum. The generalized setting offered by the new approach also enables several basic ideas, used with other global search methods such as the particle swarm or the differential evolution, to be rationally incorporated in the proposed set-up via a change of measures. The global search may be further aided by imparting to the directional update terms additional layers of random perturbations such as 'scrambling' and 'selection'. Depending on the precise choice of the optimality conditions and the extent of random perturbation, the search can be readily rendered either greedy or more exploratory. As numerically demonstrated, the new proposal appears to provide for a more rational, more accurate and, in some cases, a faster alternative to many available evolutionary optimization schemes. PMID:26587268

  10. Nanodosimetry-Based Plan Optimization for Particle Therapy

    PubMed Central

    Casiraghi, Margherita; Schulte, Reinhard W.

    2015-01-01

    Treatment planning for particle therapy is currently an active field of research due to uncertainty in how the physical dose should be modified in order to create a uniform biological dose response in the target. A novel treatment plan optimization strategy based on measurable nanodosimetric quantities rather than biophysical models is proposed in this work. Simplified proton and carbon treatment plans were simulated in a water phantom to investigate the feasibility of the optimization. Track structures of the mixed radiation field produced at different depths in the target volume were simulated with Geant4-DNA, and nanodosimetric descriptors were calculated. The fluences of the treatment-field pencil beams were optimized in order to create a mixed field with equal nanodosimetric descriptors at each of multiple positions in spread-out particle Bragg peaks. For both proton and carbon ion plans, a uniform spatial distribution of nanodosimetric descriptors could be obtained by optimizing opposing-field but not single-field plans. The results indicate that uniform nanodosimetrically weighted plans, which may also be radiobiologically uniform, can be obtained with this approach. Future investigations need to demonstrate that this approach is also feasible for more complicated beam arrangements and that it leads to a biologically uniform response in tumor cells and tissues. PMID:26167202
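
    The fluence-optimization step can be sketched as a non-negative least-squares fit that flattens a chosen nanodosimetric descriptor across the target; the influence matrix below is a hypothetical stand-in for the per-pencil-beam descriptor contributions computed with Geant4-DNA.

        import numpy as np
        from scipy.optimize import nnls

        def optimize_fluence(D, target_value):
            """D[i, j]: contribution of pencil beam j to the descriptor at
            position i. Returns non-negative beam weights w minimizing
            || D @ w - target_value ||."""
            n_positions = D.shape[0]
            w, residual = nnls(D, np.full(n_positions, float(target_value)))
            return w, residual

        # D = np.random.rand(20, 8)                  # placeholder influence matrix
        # weights, res = optimize_fluence(D, target_value=1.0)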