Science.gov

Sample records for risk based optimization

  1. PTV-based IMPT optimization incorporating planning risk volumes vs robust optimization

    SciTech Connect

    Liu Wei; Li Xiaoqiang; Zhu, Ron. X.; Mohan, Radhe; Frank, Steven J.; Li Yupeng

    2013-02-15

    Purpose: Robust optimization leads to intensity-modulated proton therapy (IMPT) plans that are less sensitive to uncertainties and superior in terms of organs-at-risk (OARs) sparing, target dose coverage, and homogeneity compared to planning target volume (PTV)-based optimized plans. Robust optimization incorporates setup and range uncertainties, which implicitly adds margins to both targets and OARs and is also able to compensate for perturbations in dose distributions within targets and OARs caused by uncertainties. In contrast, the traditional PTV-based optimization considers only setup uncertainties and adds a margin only to targets but no margins to the OARs. It also ignores range uncertainty. The purpose of this work is to determine if robustly optimized plans are superior to PTV-based plans simply because the latter do not assign margins to OARs during optimization. Methods: The authors retrospectively selected from their institutional database five patients with head and neck (H and N) cancer and one with prostate cancer for this analysis. Using their original images and prescriptions, the authors created new IMPT plans using three methods: PTV-based optimization, optimization based on the PTV and planning risk volumes (PRVs) (i.e., 'PTV+PRV-based optimization'), and robust optimization using the 'worst-case' dose distribution. The PRVs were generated by uniformly expanding OARs by 3 mm for the H and N cases and 5 mm for the prostate case. The dose-volume histograms (DVHs) from the worst-case dose distributions were used to assess and compare plan quality. Families of DVHs for each uncertainty for all structures of interest were plotted along with the nominal DVHs. The width of the 'bands' of DVHs was used to quantify the plan sensitivity to uncertainty. Results: Compared with conventional PTV-based and PTV+PRV-based planning, robust optimization led to a smaller bandwidth for the targets in the face of uncertainties {clinical target volume [CTV

  2. On the optimal risk based design of highway drainage structures

    NASA Astrophysics Data System (ADS)

    Tung, Y.-K.; Bao, Y.

    1990-12-01

    For a proposed highway bridge or culvert, the total cost to the public during its expected service life includes capital investment in the structures, regular operation and maintenance costs, and various flood-related costs. The flood-related damage costs include items such as replacement and repair costs of the highway bridge or culvert, flood plain property damage costs, user costs from traffic interruptions and detours, and others. As the design discharge increases, the required capital investment increases but the corresponding flood-related damage costs decrease. Hydraulic design of a bridge or culvert using a risk-based approach chooses, among the alternatives, the one associated with the least total expected cost. In this paper, the risk-based design procedure is applied to pipe culvert design, and the effect of hydrologic uncertainties, such as sample size and the type of flood distribution model, on the optimal culvert design parameters, including design return period and total expected cost, is examined.
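
    As a rough illustration of the least-total-expected-cost principle described above, the sketch below enumerates candidate design return periods and selects the one that minimizes capital cost plus expected flood-related damage over the service life. The cost functions and damage figure are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of least-total-expected-cost (risk-based) culvert sizing.
# The cost functions and damage figure below are hypothetical placeholders.
import numpy as np

return_periods = np.array([2, 5, 10, 25, 50, 100])       # candidate design return periods (yr)
service_life = 50                                          # years
capital_cost = 40_000 * np.log(return_periods) + 60_000    # assumed: grows with design discharge
damage_if_exceeded = 250_000                               # assumed average damage per exceedance event

# Expected annual damage ~ exceedance probability (1/T) times damage per event
annual_risk_cost = damage_if_exceeded / return_periods
total_expected_cost = capital_cost + service_life * annual_risk_cost

best = np.argmin(total_expected_cost)
print(f"Optimal design return period: {return_periods[best]} yr, "
      f"total expected cost: {total_expected_cost[best]:,.0f}")
```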

  3. Risk-based Multiobjective Optimization Model for Bridge Maintenance Planning

    SciTech Connect

    Yang, I-T.; Hsu, Y.-S.

    2010-05-21

    Determining the optimal maintenance plan is essential for successful bridge management. The optimization objectives are defined as minimizing life-cycle cost and maximizing performance indicators. Previous bridge maintenance models assumed that the process of bridge deterioration and the estimate of maintenance cost are deterministic, i.e., known with certainty. This assumption, however, is invalid, especially for estimates over the long time horizon of a bridge's life. In this study, we consider the risks associated with bridge deterioration and maintenance cost in determining the optimal maintenance plan. The decision variables include the strategic choice of essential maintenance (such as silane treatment and cathodic protection) and the intervals between periodic maintenance. An epsilon-constrained Particle Swarm Optimization algorithm is used to approximate the tradeoff between life-cycle cost and performance indicators. During the stochastic search for optimal solutions, Monte-Carlo simulation is used to evaluate the impact of risks on the objective values at an acceptable level of reliability. The proposed model can help decision makers select a compromise maintenance plan from a group of alternative choices, each of which leads to a different level of performance and life-cycle cost. A numerical example is used to illustrate the proposed model.
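
    A minimal sketch of the epsilon-constraint idea with Monte-Carlo evaluation is given below: maintenance intervals whose simulated mean condition meets the epsilon threshold are kept, and the cheapest of them is selected. A plain grid search stands in for the particle swarm optimizer, and the deterioration and cost models are invented for illustration.

```python
# Epsilon-constraint search with Monte-Carlo evaluation (grid search stands in for PSO).
# Deterioration and cost models are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
horizon = 60                          # years of bridge life considered
candidate_intervals = range(2, 21)    # periodic maintenance interval (years)
epsilon = 70.0                        # required mean end-of-life condition index (0-100)

def simulate(interval, n=2000):
    """Return (mean life-cycle cost, mean final condition) under random deterioration."""
    rate = rng.normal(1.5, 0.4, size=n).clip(min=0.1)      # uncertain condition loss per year
    n_actions = horizon // interval
    condition = 100 - rate * horizon + 8.0 * n_actions      # each action recovers ~8 points (assumed)
    condition = condition.clip(0, 100)
    cost = 50_000 * n_actions + rng.normal(0, 5_000, size=n)   # maintenance cost with noise
    return cost.mean(), condition.mean()

feasible = []
for t in candidate_intervals:
    cost, cond = simulate(t)
    if cond >= epsilon:                                      # epsilon-constraint on performance
        feasible.append((cost, t, cond))

cost, t, cond = min(feasible)                                # cheapest feasible interval
print(f"Best interval: {t} yr, mean cost {cost:,.0f}, mean condition {cond:.1f}")
```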

  4. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and make rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate the uncertainty of contaminant location from the actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing do we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice in sampling strategies, we compare the trade-off in monitoring versus the delineation costs by accounting for ill

  5. A Risk-Based Multi-Objective Optimization Concept for Early-Warning Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Bode, F.; Loschko, M.; Nowak, W.

    2014-12-01

    Groundwater is a resource for drinking water and hence needs to be protected from contamination. However, many well catchments include an inventory of known and unknown risk sources which cannot be eliminated, especially in urban regions. As a matter of risk control, all these risk sources should be monitored. A one-to-one monitoring situation for each risk source would lead to a cost explosion and is even impossible for unknown risk sources. However, smart optimization concepts could help to find promising low-cost monitoring network designs. In this work we develop a concept to plan monitoring networks using multi-objective optimization. Our considered objectives are to maximize the probability of detecting all contaminations and the early warning time, and to minimize the installation and operating costs of the monitoring network. A qualitative risk ranking is used to prioritize the known risk sources for monitoring. The unknown risk sources can neither be located nor ranked. Instead, we represent them by a virtual line of risk sources surrounding the production well. We classify risk sources into four different categories: severe, medium and tolerable for known risk sources, and an extra category for the unknown ones. With that, early warning time and detection probability become individual objectives for each risk class. Thus, decision makers can identify monitoring networks which are valid for controlling the top risk sources, and evaluate the capabilities (or search for least-cost upgrades) to also cover moderate, tolerable and unknown risk sources. Monitoring networks which are valid for the remaining risk also cover all other risk sources, but the early-warning time suffers. The data provided for the optimization algorithm are calculated in a preprocessing step by a flow and transport model. Uncertainties due to hydro(geo)logical phenomena are taken into account by Monte-Carlo simulations. To avoid numerical dispersion during the transport simulations we use the

  6. Optimal Temporal Risk Assessment

    PubMed Central

    Balci, Fuat; Freestone, David; Simen, Patrick; deSouza, Laura; Cohen, Jonathan D.; Holmes, Philip

    2011-01-01

    Time is an essential feature of most decisions, because the reward earned from decisions frequently depends on the temporal statistics of the environment (e.g., on whether decisions must be made under deadlines). Accordingly, evolution appears to have favored a mechanism that predicts intervals in the seconds to minutes range with high accuracy on average, but significant variability from trial to trial. Importantly, the subjective sense of time that results is sufficiently imprecise that maximizing rewards in decision-making can require substantial behavioral adjustments (e.g., accumulating less evidence for a decision in order to beat a deadline). Reward maximization in many daily decisions therefore requires optimal temporal risk assessment. Here, we review the temporal decision-making literature, conduct secondary analyses of relevant published datasets, and analyze the results of a new experiment. The paper is organized in three parts. In the first part, we review literature and analyze existing data suggesting that animals take account of their inherent behavioral variability (their “endogenous timing uncertainty”) in temporal decision-making. In the second part, we review literature that quantitatively demonstrates nearly optimal temporal risk assessment with sub-second and supra-second intervals using perceptual tasks (with humans and mice) and motor timing tasks (with humans). We supplement this section with original research that tested human and rat performance on a task that requires finding the optimal balance between two time-dependent quantities for reward maximization. This optimal balance in turn depends on the level of timing uncertainty. Corroborating the reviewed literature, humans and rats exhibited nearly optimal temporal risk assessment in this task. In the third section, we discuss the role of timing uncertainty in reward maximization in two-choice perceptual decision-making tasks and review literature that implicates timing uncertainty

  7. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    PubMed

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model for three periods under two scenarios. By balancing the optimal system returns and the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risk and nearly maximum benefit. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  8. Risk based approach for design and optimization of stomach specific delivery of rifampicin.

    PubMed

    Vora, Chintan; Patadia, Riddhish; Mittal, Karan; Mashru, Rajashree

    2013-10-15

    The research envisaged focuses on a risk management approach for better recognizing risks, identifying ways to mitigate them, and proposing a control strategy for the development of rifampicin gastroretentive tablets. Risk assessment using failure mode and effects analysis (FMEA) was done to depict the effects of specific failure modes related to the respective formulation/process variables. A Box-Behnken design was used to investigate the effect of the amount of sodium bicarbonate (X1), pore former HPMC (X2) and glyceryl behenate (X3) on percent drug release at the 1st hour (Q1), 4th hour (Q4), 8th hour (Q8) and on floating lag time (min). Main effects and interaction plots were generated to study the effects of the variables. Selection of the optimized formulation was done using the desirability function and overlay contour plots. The optimized formulation exhibited Q1 of 20.9%, Q4 of 59.1%, Q8 of 94.8% and a floating lag time of 4.0 min. The Akaike information criterion and model selection criterion revealed that the model was best described by the Korsmeyer-Peppas power law. The residual plots demonstrated no existence of non-normality, skewness or outliers. The composite desirability for the optimized formulation, computed using equations and software, was 0.84 and 0.86, respectively. FTIR, DSC and PXRD studies ruled out drug-polymer interaction due to thermal treatment.
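
    The sketch below shows the kind of composite (Derringer-type) desirability calculation used to select an optimized formulation. The target windows are assumptions for illustration; only the response values (Q1, Q4, Q8, lag time) are taken from the abstract, so the resulting number will not reproduce the reported 0.84-0.86.

```python
# Minimal sketch of a Derringer-Suich composite desirability calculation.
# Target windows are illustrative; response values come from the abstract.
import numpy as np

def d_target(y, low, target, high):
    """Desirability for a 'target is best' response (linear ramps)."""
    if y <= low or y >= high:
        return 0.0
    return (y - low) / (target - low) if y < target else (high - y) / (high - target)

def d_smaller(y, target, high):
    """Desirability for a 'smaller is better' response such as floating lag time."""
    if y <= target:
        return 1.0
    return max(0.0, (high - y) / (high - target))

responses = {
    "Q1":  d_target(20.9, 10, 20, 30),     # % release at 1 h, assumed target window
    "Q4":  d_target(59.1, 45, 60, 75),     # % release at 4 h
    "Q8":  d_target(94.8, 85, 95, 105),    # % release at 8 h
    "lag": d_smaller(4.0, 2.0, 10.0),      # floating lag time (min)
}

# Composite desirability = geometric mean of individual desirabilities
composite = np.prod(list(responses.values())) ** (1 / len(responses))
print(f"Composite desirability: {composite:.2f}")
```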

  9. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem

    PubMed Central

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data together with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity. PMID:26180842

  10. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data together with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity.

  11. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize investment risk. The objective of the mean-variance model is to minimize portfolio risk while achieving a target rate of return, with variance used as the risk measure. The purpose of this study is to compare the composition and performance of the mean-variance optimal portfolio with those of an equally weighted portfolio, in which the proportions invested in each asset are equal. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different. Moreover, the mean-variance optimal portfolio performs better because it achieves a higher performance ratio than the equally weighted portfolio.
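
    A minimal mean-variance sketch, assuming illustrative expected returns and covariances: minimize portfolio variance subject to a target return and full investment, then compare with the equally weighted portfolio.

```python
# Mean-variance optimization vs an equally weighted portfolio (illustrative inputs).
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.10])                 # assumed expected returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.06]])              # assumed covariance matrix
target = 0.10                                     # target rate of return

def variance(w):
    return w @ cov @ w

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},     # fully invested
               {"type": "eq", "fun": lambda w: w @ mu - target}]   # hit target return
bounds = [(0.0, 1.0)] * len(mu)                                    # long-only weights

res = minimize(variance, x0=np.ones(3) / 3, bounds=bounds, constraints=constraints)
w_opt, w_eq = res.x, np.ones(3) / 3

for name, w in [("mean-variance", w_opt), ("equally weighted", w_eq)]:
    print(f"{name:17s} weights={np.round(w, 3)} "
          f"return={w @ mu:.3f} risk(std)={np.sqrt(variance(w)):.3f}")
```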

  12. Safety optimization through risk management

    NASA Astrophysics Data System (ADS)

    Wright, K.; Peltonen, P.

    The paper discusses the overall process of system safety optimization in the space program environment and addresses in particular methods that enhance the efficiency of this activity. Effective system safety optimization is achieved by concentrating the available engineering and safety assurance resources on the main risk contributors. Qualitative risk contributor identification by means of hazard analyses and FMECA constitutes the basis of the system safety process. The risk contributors are ranked first on a qualitative basis according to consequence severity. This ranking is then refined by mishap propagation/recovery time considerations and by probabilistic means (PRA). Finally, in order to broaden and extend the use of risk contributor ranking as a managerial tool in project resource assignment, critical characteristics related to quality, manufacturing and operations, i.e. risk influencing factors, are identified for managerial visibility.

  13. A risk-based coverage model for video surveillance camera control optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua

    2015-12-01

    Visual surveillance systems for law enforcement or police case investigation differ from traditional applications in that they are designed to monitor pedestrians, vehicles or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about monitored targets and events, and risk entropy is introduced to model the requirements of the police surveillance task on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.

  14. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    NASA Astrophysics Data System (ADS)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    The algorithm to jointly optimize sensor schedules against search, track, and classify tasks is based on recent work by Papageorgiou and Raykin on risk-based sensor management. It uses a risk-based objective function and attempts to minimize and balance the risks of misclassifying and losing track of an object. It supports the requirement to generate tasking for metric and feature data concurrently and synergistically, and to account for both tracking accuracy and object characterization, jointly, in computing reward and cost for optimizing tasking decisions.

  15. Drought early warning based on optimal risk forecasts in regulated river systems: Application to the Jucar River Basin (Spain)

    NASA Astrophysics Data System (ADS)

    Haro-Monteagudo, David; Solera, Abel; Andreu, Joaquín

    2017-01-01

    Droughts are a major threat to water resources systems management. Timely anticipation is crucial for defining strategies and measures to minimise their effects. Water managers make use of monitoring systems to characterise and assess drought risk by means of indices and indicators. However, few systems currently in operation are capable of providing early warning of the occurrence of a drought episode. This paper proposes a novel methodology to support and complement drought monitoring and early warning in regulated water resources systems. It is based on the combined use of two models, a water resources optimization model and a stochastic streamflow generation model, to generate a series of results that allow evaluation of the future state of the system. The results for the period 1998-2009 in the Jucar River Basin (Spain) show that accounting for scenario change risk can benefit basin managers by providing them with information on the current and future drought situation at any given moment. Our results show that the combination of scenario change probabilities with the current drought monitoring system can represent a major advance towards improved drought management in the future, and add significant value to the existing national State Index (SI) approach for early warning purposes.

  16. Comparison of new conditional value-at-risk-based management models for optimal allocation of uncertain water supplies

    NASA Astrophysics Data System (ADS)

    Yamout, Ghina M.; Hatfield, Kirk; Romeijn, H. Edwin

    2007-07-01

    The paper studies the effect of incorporating the conditional value-at-risk (CVaRα) in analyzing a water allocation problem versus using the frequently used expected value, two-stage modeling, scenario analysis, and linear optimization tools. Five models are developed to examine water resource allocation when available supplies are uncertain: (1) a deterministic expected value model, (2) a scenario analysis model, (3) a two-stage stochastic model with recourse, (4) a CVaRα objective function model, and (5) a CVaRα constraint model. The models are applied over a region of east central Florida. Results show the deterministic expected value model underestimates system costs and water shortage. Furthermore, the expected value model produces identical cost estimates for water supply distributions that have different standard deviations but identical means. The scenario analysis model again demonstrates that the expected value of results taken from many scenarios underestimates costs and water shortages. Using a two-stage stochastic mixed integer formulation with recourse permits an improved representation of uncertainties and real-life decision making, which in turn predicts higher costs. The inclusion of a CVaRα objective function in the latter provides for the optimization and control of high-risk events. Minimizing CVaRα does not, however, permit control of lower-risk events. Constraining CVaRα while minimizing cost, on the other hand, allows for the control of high-risk events while minimizing the costs of all events. Results show CVaRα exhibits continuous and consistent behavior with respect to the confidence level α, when compared to value-at-risk (VaRα).
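
    The sketch below shows the standard Rockafellar-Uryasev linear-programming form of a CVaRα objective applied to a toy single-source water order with uncertain supply; the demand, costs, and supply scenarios are invented, and the paper's fuller two-stage structure is not reproduced.

```python
# Rockafellar-Uryasev LP for minimizing CVaR_alpha of total cost in a toy water
# allocation problem. Costs, demand, and supply scenarios are illustrative.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
alpha = 0.95
S = 200                                            # number of supply scenarios
supply = rng.normal(80.0, 25.0, S).clip(min=0)     # uncertain available supply
demand, c_order, c_short = 100.0, 1.0, 5.0         # demand, unit order cost, unit shortage cost

# Decision vector: [x, t, z_1..z_S, sh_1..sh_S]
n = 2 + 2 * S
c = np.zeros(n)
c[1] = 1.0                                         # t (VaR estimate); x enters via the z constraints
c[2:2 + S] = 1.0 / ((1 - alpha) * S)               # CVaR excess terms z_s

A_ub, b_ub = [], []
for s in range(S):
    # z_s >= c_order*x + c_short*sh_s - t   (scenario loss minus t)
    row = np.zeros(n); row[0] = c_order; row[1] = -1.0
    row[2 + s] = -1.0; row[2 + S + s] = c_short
    A_ub.append(row); b_ub.append(0.0)
    # sh_s >= demand - x - supply_s          (shortage definition)
    row = np.zeros(n); row[0] = -1.0; row[2 + S + s] = -1.0
    A_ub.append(row); b_ub.append(supply[s] - demand)

bounds = [(0, demand), (None, None)] + [(0, None)] * (2 * S)
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
x, t = res.x[0], res.x[1]
print(f"order quantity x = {x:.1f}, VaR ~ {t:.1f}, CVaR_95 of cost = {res.fun:.1f}")
```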

  17. Optimization of the fractionated irradiation scheme considering physical doses to tumor and organ at risk based on dose–volume histograms

    SciTech Connect

    Sugano, Yasutaka; Mizuta, Masahiro; Takao, Seishin; Shirato, Hiroki; Sutherland, Kenneth L.; Date, Hiroyuki

    2015-11-15

    Purpose: Radiotherapy of solid tumors has been performed with various fractionation regimens such as multi- and hypofractionation. However, the ability to optimize the fractionation regimen considering the physical dose distribution remains insufficient. This study aims to optimize the fractionation regimen, in which the authors propose a graphical method for selecting the optimal number of fractions (n) and dose per fraction (d) based on dose–volume histograms for the tumor and the normal tissues of organs around the tumor. Methods: Modified linear-quadratic models were employed to estimate the radiation effects on the tumor and an organ at risk (OAR), where the repopulation of the tumor cells and the linearity of the dose-response curve in the high dose range of the surviving fraction were considered. The minimization problem for the damage effect on the OAR was solved by a graphical method under the constraint that the radiation effect on the tumor is fixed. Here, the damage effect on the OAR was estimated based on the dose–volume histogram. Results: It was found that optimization of the fractionation scheme incorporating the dose–volume histogram is possible by employing appropriate cell survival models. The graphical method, considering the repopulation of tumor cells and a rectilinear response in the high dose range, enables the derivation of the optimal number of fractions and dose per fraction. For example, in the treatment of prostate cancer, the optimal fractionation was suggested to lie in the range of 8–32 fractions with a daily dose of 2.2–6.3 Gy. Conclusions: It is possible to optimize the number of fractions and dose per fraction based on the physical dose distribution (i.e., dose–volume histogram) by the graphical method considering the effects on the tumor and the OARs around it. This method may provide a new guideline for optimizing the fractionation regimen for physics-guided fractionation.
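
    A minimal sketch of the underlying idea, assuming a standard biologically effective dose (BED) form of the linear-quadratic model with a simple per-fraction repopulation term and an OAR dose-sparing factor: hold the tumor effect fixed, solve for d at each candidate n, and keep the scheme with the lowest OAR effect. Parameter values are illustrative, and the dose–volume histogram and high-dose linearity refinements of the paper are omitted.

```python
# Choose (n, d) with a simple LQ/BED model: fixed tumor BED, minimize OAR BED.
# Alpha/beta ratios, sparing factor, and repopulation rate are illustrative.
import numpy as np

ab_tumor, ab_oar = 10.0, 3.0         # alpha/beta ratios (Gy)
sparing = 0.7                         # fraction of prescription dose received by the OAR
repop = 0.6                           # tumor BED lost per fraction beyond the first (Gy)
target_bed_tumor = 80.0               # required tumor BED (Gy)

best = None
for n in range(1, 41):
    # Solve n*d*(1 + d/ab_tumor) - repop*(n - 1) = target_bed_tumor for d > 0
    a, b, c = n / ab_tumor, n, -(target_bed_tumor + repop * (n - 1))
    d = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    bed_oar = n * sparing * d * (1 + sparing * d / ab_oar)
    if best is None or bed_oar < best[0]:
        best = (bed_oar, n, d)

bed_oar, n, d = best
print(f"optimal scheme: n = {n}, d = {d:.2f} Gy/fraction, OAR BED = {bed_oar:.1f} Gy")
```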

  18. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  19. Adaptation for Planting and Irrigation Decisions to Changing Monsoon Regime in Northeast India: Risk-based Hydro-economic Optimization

    NASA Astrophysics Data System (ADS)

    Zhu, T.; Cai, X.

    2013-12-01

    Delays in the onset of the Indian summer monsoon are becoming increasingly frequent. A delayed monsoon and occasional monsoon failures seriously affect agricultural production in the northeast as well as other parts of India. In the Vaishali district of Bihar State, monsoon rainfall is very skewed and erratic, often concentrated in shorter durations. Farmers in Vaishali reported that a delayed monsoon affected paddy planting and consequently delayed the cropping cycle, putting crops at risk of 'terminal heat.' The canal system in the district does not function due to lack of maintenance; irrigation relies almost entirely on groundwater. Many small farmers choose not to irrigate when monsoon onset is delayed because of high diesel prices, leading to reduced production or even crop failure. Some farmers adapt to a delayed onset of the monsoon by planting short-duration rice, which gives flexibility for planting the next season's crops. Other sporadic autonomous adaptation activities were observed as well, with various levels of success. Adaptation recommendations and effective policy interventions are much needed. To explore robust options for adapting to the changing monsoon regime, we build a stochastic programming model to optimize the revenues of farmer groups categorized by landholding size, subject to stochastic monsoon onset and rainfall amount. An imperfect probabilistic long-range forecast is used to inform the model's onset and rainfall amount probabilities; the 'skill' of the forecast is measured using probabilities of correctly predicting past events, derived through hindcasting. Crop production functions are determined using a self-calibrating Positive Mathematical Programming approach. The stochastic programming model aims to emulate the decision-making behavior of representative farmer agents through choices in adaptation, including crop mix, planting dates, irrigation, and use of weather information. A set of technological and policy intervention scenarios are tested

  20. Risk Assessment: Evidence Base

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.

    2007-01-01

    Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.

  1. Medical Device Risk Management For Performance Assurance Optimization and Prioritization.

    PubMed

    Gaamangwe, Tidimogo; Babbar, Vishvek; Krivoy, Agustina; Moore, Michael; Kresta, Petr

    2015-01-01

    Performance assurance (PA) is an integral component of clinical engineering medical device risk management. For that reason, the clinical engineering (CE) community has made concerted efforts to define appropriate risk factors and develop quantitative risk models for efficient data processing and improved PA program operational decision making. However, a common framework that relates the various processes of a quantitative risk system does not exist. This article provides a perspective that focuses on medical device quality and risk-based elements of the PA program, which include device inclusion/exclusion, schedule optimization, and inspection prioritization. A PA risk management framework is provided, and previous quantitative models that have contributed to the advancement of PA risk management are examined. A general model for quantitative risk systems is proposed, and further perspective on possible future directions in the area of PA technology is also provided.

  2. Optimal security investments and extreme risk.

    PubMed

    Mohtadi, Hamid; Agiwal, Swati

    2012-08-01

    In the aftermath of 9/11, concern over security increased dramatically in both the public and the private sector. Yet, no clear algorithm exists to inform firms on the amount and the timing of security investments to mitigate the impact of catastrophic risks. The goal of this article is to devise an optimum investment strategy for firms to mitigate exposure to catastrophic risks, focusing on how much to invest and when to invest. The latter question addresses the issue of whether postponing a risk mitigating decision is an optimal strategy or not. Accordingly, we develop and estimate both a one-period model and a multiperiod model within the framework of extreme value theory (EVT). We calibrate these models using probability measures for catastrophic terrorism risks associated with attacks on the food sector. We then compare our findings with the purchase of catastrophic risk insurance.

  3. Potential for dose-escalation and reduction of risk in pancreatic cancer using IMRT optimization with lexicographic ordering and gEUD-based cost functions

    SciTech Connect

    Spalding, Aaron C.; Jee, Kyung-Wook; Vineberg, Karen; Jablonowski, Marla; Fraass, Benedick A.; Pan, Charlie C.; Lawrence, Theodore S.; Ten Haken, Randall K.; Ben-Josef, Edgar

    2007-02-15

    Radiotherapy for pancreatic cancer is limited by the tolerance of local organs at risk (OARs) and frequent overlap of the planning target volume (PTV) and OAR volumes. Using lexicographic ordering (LO), a hierarchical optimization technique, with generalized equivalent uniform dose (gEUD) cost functions, we studied the potential of intensity modulated radiation therapy (IMRT) to increase the dose to pancreatic tumors and to areas of vascular involvement that preclude surgical resection [surgical boost volume (SBV)]. We compared 15 forward planned three-dimensional conformal (3DCRT) and IMRT treatment plans for locally advanced unresectable pancreatic cancer. We created IMRT plans optimized using LO with gEUD-based cost functions that account for the contribution of each part of the resulting inhomogeneous dose distribution. LO-IMRT plans allowed substantial PTV dose escalation compared with 3DCRT; median increase from 52 Gy to 66 Gy (a=-5,p<0.005) and median increase from 50 Gy to 59 Gy (a=-15,p<0.005). LO-IMRT also allowed increases to 85 Gy in the SBV, regardless of a value, along with significant dose reductions in OARs. We conclude that LO-IMRT with gEUD cost functions could allow dose escalation in pancreas tumors with concomitant reduction in doses to organs at risk as compared with traditional 3DCRT.
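
    For reference, a gEUD cost term of the kind used above can be sketched as follows; the voxel doses are toy values, and the choice of the parameter a (negative for targets, large and positive for serial organs at risk) follows common practice rather than the paper's exact settings.

```python
# Generalized equivalent uniform dose (gEUD) as a cost term. Doses and 'a' values are illustrative.
import numpy as np

def geud(doses, a):
    """gEUD = (mean of d_i^a)^(1/a); a < 0 emphasizes cold spots (targets),
    large a > 0 emphasizes hot spots (serial organs at risk)."""
    doses = np.asarray(doses, dtype=float)
    return np.mean(doses ** a) ** (1.0 / a)

ptv_doses = np.array([50.0, 52.0, 54.0, 49.0, 55.0])   # voxel doses in Gy (toy values)
oar_doses = np.array([10.0, 25.0, 30.0, 12.0, 18.0])

print(f"PTV gEUD (a = -5):  {geud(ptv_doses, -5):.1f} Gy")    # penalizes underdosed target voxels
print(f"OAR gEUD (a = +15): {geud(oar_doses, 15):.1f} Gy")    # penalizes OAR hot spots
```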

  4. Risk perception, fuzzy representations and comparative optimism.

    PubMed

    Brown, Stephen L; Morley, Andy M

    2007-11-01

    Rather than a unitary value, individuals may represent health risk as a fuzzy entity that permits them to make a number of specific possible estimates. Comparative optimism might be explained by people flexibly using such a set to derive optimistic risk estimates. Student participants were asked to rate the likelihood of eight harmful alcohol-related outcomes occurring to themselves and to an average student. Participants made either unitary estimates or estimates representing the upper and lower bounds of a set denoting 'realistic probability' estimates. Personal risk estimates were lower when made as unitary estimates than when calculated from the mid-points of the bounded estimates. Unitary estimates of personal risk made after the bounded estimates were lower than initial unitary estimates. There were no effects for estimates made with regard to the average student. Risk may be internally represented as a fuzzy set, and comparative optimism may exist partly because this set allows people the opportunity to make optimistic unitary estimates for personal risk within what they see as realistic parameters.

  5. RNA based evolutionary optimization

    NASA Astrophysics Data System (ADS)

    Schuster, Peter

    1993-12-01

    Evolutionary optimization of two-letter sequences is thus more difficult than optimization in the world of natural RNA sequences with four bases. This fact might explain the usage of four bases in the genetic language of nature. Finally, we study the mapping from RNA sequences into secondary structures and explore the topology of RNA shape space. We find that ‘neutral paths’ connecting neighbouring sequences with identical structures very frequently traverse the entire sequence space. Sequences folding into common structures are found everywhere in sequence space. Hence, evolution can migrate to almost every part of sequence space without ‘hill climbing’, and only small fractions of the entire number of sequences have to be searched in order to find suitable structures.

  6. Search-based optimization.

    PubMed

    Wheeler, Ward C

    2003-08-01

    The problem of determining the minimum cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete (Wang and Jiang, 1994). Traditionally, point estimations of hypothetical ancestral sequences have been used to gain heuristic, upper bounds on cladogram cost. These include procedures with such diverse approaches as non-additive optimization of multiple sequence alignment, direct optimization (Wheeler, 1996), and fixed-state character optimization (Wheeler, 1999). A method is proposed here which, by extending fixed-state character optimization, replaces the estimation process with a search. This form of optimization examines a diversity of potential state solutions for cost-efficient hypothetical ancestral sequences and can result in greatly more parsimonious cladograms. Additionally, such an approach can be applied to other NP-complete phylogenetic optimization problems such as genomic break-point analysis.

  7. Search-based optimization

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    The problem of determining the minimum cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete (Wang and Jiang, 1994). Traditionally, point estimations of hypothetical ancestral sequences have been used to gain heuristic, upper bounds on cladogram cost. These include procedures with such diverse approaches as non-additive optimization of multiple sequence alignment, direct optimization (Wheeler, 1996), and fixed-state character optimization (Wheeler, 1999). A method is proposed here which, by extending fixed-state character optimization, replaces the estimation process with a search. This form of optimization examines a diversity of potential state solutions for cost-efficient hypothetical ancestral sequences and can result in greatly more parsimonious cladograms. Additionally, such an approach can be applied to other NP-complete phylogenetic optimization problems such as genomic break-point analysis.

  8. C-21 Fleet: Base Optimization

    DTIC Science & Technology

    assigned to the operational support airlift mission, located at Andrews Air Force Base, Maryland and Scott Air Force Base, Illinois. The missions flown... Scott and Andrews AFB is the optimal assignment. If nine total assets were optimized, five would be assigned to Scott AFB and four to Andrews AFB

  9. Testing an optimized community-based human immunodeficiency virus (HIV) risk reduction and antiretroviral adherence intervention for HIV-infected injection drug users.

    PubMed

    Copenhaver, Michael M; Lee, I-Ching; Margolin, Arthur; Bruce, Robert D; Altice, Frederick L

    2011-01-01

    The authors conducted a preliminary study of the 4-session Holistic Health for HIV (3H+), which was adapted from a 12-session evidence-based risk reduction and antiretroviral adherence intervention. Improvements were found in the behavioral skills required to properly adhere to HIV medication regimens. Enhancements were found in all measured aspects of sex-risk reduction outcomes, including HIV knowledge, motivation to reduce sex-risk behavior, behavioral skills related to engaging in reduced sexual risk, and reduced risk behavior. Improvements in drug use outcomes included enhancements in risk reduction skills as well as reduced heroin and cocaine use. Intervention effects also showed durability from post-intervention to the follow-up assessment point. Females responded particularly well in terms of improvements in risk reduction skills and risk behavior. This study suggests that an evidence-based behavioral intervention may be successfully adapted for use in community-based clinical settings where HIV-infected drug users can be more efficiently reached.

  10. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource to make robust and reliable decisions in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.

  11. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  12. Optimal Allocation for the Estimation of Attributable Risk,

    DTIC Science & Technology

    This paper derives an expression for the optimum sampling allocation under the minimum variance criterion of the estimated attributable risk for case-control studies. Various optimal strategies are examined using alternative exposure-specific disease rates. Odds Ratio, Relative Risk and Attributable Risk.

  13. Cost tradeoffs in consequence management at nuclear power plants: A risk based approach to setting optimal long-term interdiction limits for regulatory analyses

    SciTech Connect

    Mubayi, V.

    1995-05-01

    The consequences of severe accidents at nuclear power plants can be limited by various protective actions, including emergency responses and long-term measures, to reduce exposures of affected populations. Each of these protective actions involves costs to society. The costs of the long-term protective actions depend on the criterion adopted for the allowable level of long-term exposure. This criterion, called the "long-term interdiction limit," is expressed in terms of the projected dose to an individual over a certain time period from the long-term exposure pathways. The two measures of offsite consequences, latent cancers and costs, are inversely related, and the choice of an interdiction limit is, in effect, a trade-off between these two measures. By monetizing the health effects (through ascribing a monetary value to life lost), the costs of the two consequence measures vary with the interdiction limit: the health effect costs increase as the limit is relaxed and the protective action costs decrease. The minimum of the total cost curve can be used to calculate an optimal long-term interdiction limit. The calculation of such an optimal limit is presented for each of five US nuclear power plants which were analyzed for severe accident risk in the NUREG-1150 program by the Nuclear Regulatory Commission.
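
    The cost trade-off described above can be sketched numerically: one curve rising with the interdiction limit (monetized health effects) and one falling (protective-action costs), with the optimum at the minimum of their sum. Both curves below are hypothetical.

```python
# Locate the cost-minimizing long-term interdiction limit. Cost curves are hypothetical.
import numpy as np

limit = np.linspace(0.5, 10.0, 200)            # projected individual dose limit over the period (rem)
health_cost = 4.0e6 * limit                    # assumed: monetized latent-cancer cost grows with the limit
protective_cost = 2.0e8 / (1.0 + limit)        # assumed: interdiction cost falls as the limit is relaxed
total = health_cost + protective_cost

i = np.argmin(total)
print(f"optimal interdiction limit ~ {limit[i]:.1f} rem, total cost ~ {total[i] / 1e6:.0f} M$")
```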

  14. Research on optimization-based design

    NASA Technical Reports Server (NTRS)

    Balling, R. J.; Parkinson, A. R.; Free, J. C.

    1989-01-01

    Research on optimization-based design is discussed. Illustrative examples are given for cases involving continuous optimization with discrete variables and optimization with tolerances. Approximation of computationally expensive and noisy functions, electromechanical actuator/control system design using decomposition and application of knowledge-based systems and optimization for the design of a valve anti-cavitation device are among the topics covered.

  15. Risk-optimized proton therapy to minimize radiogenic second cancers

    NASA Astrophysics Data System (ADS)

    Rechner, Laura A.; Eley, John G.; Howell, Rebecca M.; Zhang, Rui; Mirkovic, Dragan; Newhauser, Wayne D.

    2015-05-01

    Proton therapy confers substantially lower predicted risk of second cancer compared with photon therapy. However, no previous studies have used an algorithmic approach to optimize beam angle or fluence-modulation for proton therapy to minimize those risks. The objectives of this study were to demonstrate the feasibility of risk-optimized proton therapy and to determine the combination of beam angles and fluence weights that minimizes the risk of second cancer in the bladder and rectum for a prostate cancer patient. We used 6 risk models to predict excess relative risk of second cancer. Treatment planning utilized a combination of a commercial treatment planning system and an in-house risk-optimization algorithm. When normal-tissue dose constraints were incorporated in treatment planning, the risk model that incorporated the effects of fractionation, initiation, inactivation, repopulation and promotion selected a combination of anterior and lateral beams, which lowered the relative risk by 21% for the bladder and 30% for the rectum compared to the lateral-opposed beam arrangement. Other results were found for other risk models.
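
    A minimal sketch of the risk-optimization step, assuming a simple linear excess-relative-risk surrogate and made-up per-beam dose contributions: choose nonnegative beam weights that hold the target at prescription while minimizing the weighted organ risks. This is only illustrative; the paper's six risk models and dose constraints are not reproduced, and a toy linear program of this form tends to load a single beam.

```python
# Beam-weight selection minimizing a linear excess-relative-risk (ERR) surrogate.
# Dose contributions, risk coefficients, and prescription are illustrative.
import numpy as np
from scipy.optimize import linprog

# dose to [target, bladder, rectum] per unit weight for three candidate beams (Gy)
beams = np.array([[1.0, 0.30, 0.10],    # anterior
                  [1.0, 0.15, 0.35],    # left lateral
                  [1.0, 0.15, 0.35]])   # right lateral
beta = np.array([0.0, 0.05, 0.08])      # assumed ERR per Gy for each structure
prescription = 76.0                     # target dose (Gy)

c = beams @ beta                        # ERR contributed per unit weight of each beam
A_eq = beams[:, [0]].T                  # target-dose equality constraint
res = linprog(c, A_eq=A_eq, b_eq=[prescription], bounds=[(0, None)] * 3)
print("beam weights:", np.round(res.x, 1), " total ERR surrogate:", round(res.fun, 2))
```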

  16. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    SciTech Connect

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  17. Constraint programming based biomarker optimization.

    PubMed

    Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng

    2015-01-01

    Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists. Interactive visualization and exploration of big data has proven to be one of the successful solutions. Most of the existing feature selection algorithms do not allow interactive input from users during the feature selection optimization process. This study addresses this question by fixing a few user-input features in the finally selected feature subset and formulating these user-input features as constraints for a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs similarly to or much better than the existing feature selection algorithms, even with the constraints from both the literature and the existing algorithms. An fsCoP biomarker may be intriguing for further wet lab validation, since it satisfies both the classification optimization function and the biomedical knowledge. fsCoP may also be used for the interactive exploration of bio-OMIC big data by interactively adding user-defined constraints for modeling.

  18. Optimal inverse functions created via population-based optimization.

    PubMed

    Jennings, Alan L; Ordóñez, Raúl

    2014-06-01

    Finding optimal inputs for a multiple-input, single-output system is taxing for a system operator. Population-based optimization is used to create sets of functions that produce a locally optimal input based on a desired output. An operator or higher level planner could use one of the functions in real time. For the optimization, each agent in the population uses the cost and output gradients to take steps lowering the cost while maintaining their current output. When an agent reaches an optimal input for its current output, additional agents are generated in the output gradient directions. The new agents then settle to the local optima for the new output values. The set of associated optimal points forms an inverse function, via spline interpolation, from a desired output to an optimal input. In this manner, multiple locally optimal functions can be created. These functions are naturally clustered in input and output spaces allowing for a continuous inverse function. The operator selects the best cluster over the anticipated range of desired outputs and adjusts the set point (desired output) while maintaining optimality. This reduces the demand from controlling multiple inputs, to controlling a single set point with no loss in performance. Results are demonstrated on a sample set of functions and on a robot control problem.

  19. Quantifying fatigue risk in model-based fatigue risk management.

    PubMed

    Rangan, Suresh; Van Dongen, Hans P A

    2013-02-01

    The question of what is a maximally acceptable level of fatigue risk is hotly debated in model-based fatigue risk management in commercial aviation and other transportation modes. A quantitative approach to addressing this issue, referred to by the Federal Aviation Administration with regard to its final rule for commercial aviation "Flightcrew Member Duty and Rest Requirements," is to compare predictions from a mathematical fatigue model against a fatigue threshold. While this accounts for duty time spent at elevated fatigue risk, it does not account for the degree of fatigue risk and may, therefore, result in misleading schedule assessments. We propose an alternative approach based on the first-order approximation that fatigue risk is proportional to both the duty time spent below the fatigue threshold and the distance of the fatigue predictions to the threshold--that is, the area under the curve (AUC). The AUC approach is straightforward to implement for schedule assessments in commercial aviation and also provides a useful fatigue metric for evaluating thousands of scheduling options in industrial schedule optimization tools.
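
    A minimal sketch of the proposed AUC metric, assuming an invented fatigue-model performance trace and threshold: integrate the deficit below the threshold over duty time so that both the duration and the severity of elevated fatigue risk contribute.

```python
# Area-under-the-curve (AUC) fatigue metric: deficit below threshold integrated over duty time.
# The prediction trace and threshold are illustrative.
import numpy as np

hours = np.linspace(0, 12, 121)                          # duty time (h)
predicted = 100 - 2.5 * hours - 6 * np.sin(hours / 2)    # assumed fatigue-model performance output
threshold = 80.0                                         # fatigue threshold on the same scale

deficit = np.clip(threshold - predicted, 0, None)        # distance below threshold (0 when above)
auc = np.trapz(deficit, hours)                           # performance-points x hours below threshold

time_below = np.trapz((deficit > 0).astype(float), hours)
print(f"time below threshold: {time_below:.1f} h, AUC fatigue risk: {auc:.1f} point-hours")
```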

  20. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

    The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
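
    The risk ranking described above reduces to likelihood times consequence per piping circuit; the sketch below ranks a few invented circuits and reports the cumulative share of total risk, illustrating how most of the risk concentrates in a few segments.

```python
# Rank piping circuits by risk = likelihood x consequence (all values invented).
piping = [
    # (circuit, annual failure likelihood, consequence in $)
    ("overhead vapor line",  1e-3, 5_000_000),
    ("crude feed line",      5e-4, 2_000_000),
    ("cooling water header", 2e-3,   100_000),
    ("flare header",         1e-4, 8_000_000),
]

ranked = sorted(piping, key=lambda p: p[1] * p[2], reverse=True)
total = sum(p[1] * p[2] for p in piping)
running = 0.0
for name, likelihood, consequence in ranked:
    risk = likelihood * consequence
    running += risk
    print(f"{name:22s} risk = {risk:8.0f} $/yr  cumulative share = {running / total:5.1%}")
```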

  1. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, this session explores the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  2. Anticoagulation in the older adult: optimizing benefit and reducing risk.

    PubMed

    Ko, Darae; Hylek, Elaine M

    2014-09-01

    The risk for both arterial and venous thrombosis increases with age. Despite the increasing burden of strokes related to atrial fibrillation (AF) and venous thromboembolism (VTE) among older adults, the use of anticoagulant therapy is limited in this population due to the parallel increase in risk of serious hemorrhage. Understanding the risks and their underlying mechanisms would help to mitigate adverse events and improve persistence with these life-saving therapies. The objectives of this review are to: (1) elucidate the age-related physiologic changes that render this high risk subgroup susceptible to hemorrhage, (2) identify mutable risk factors and hazards contributing to an increased bleeding risk in older individuals, and (3) discuss interventions to optimize anticoagulation therapy in this population.

  3. Spatial Optimization of Future Urban Development with Regards to Climate Risk and Sustainability Objectives.

    PubMed

    Caparros-Midwood, Daniel; Barr, Stuart; Dawson, Richard

    2017-02-23

    Future development in cities needs to manage increasing populations, climate-related risks, and sustainable development objectives such as reducing greenhouse gas emissions. Planners therefore face a challenge of multidimensional, spatial optimization in order to balance potential tradeoffs and maximize synergies between risks and other objectives. To address this, a spatial optimization framework has been developed. This uses a spatially implemented genetic algorithm to generate a set of Pareto-optimal results that provide planners with the best set of trade-off spatial plans for six risk and sustainability objectives: (i) minimize heat risks, (ii) minimize flooding risks, (iii) minimize transport travel costs to minimize associated emissions, (iv) maximize brownfield development, (v) minimize urban sprawl, and (vi) prevent development of greenspace. The framework is applied to Greater London (U.K.) and shown to generate spatial development strategies that are optimal for specific objectives and differ significantly from the existing development strategies. In addition, the analysis reveals tradeoffs between different risks as well as between risk and sustainability objectives. While increases in heat or flood risk can be avoided, there are no strategies that do not increase at least one of these. Tradeoffs between risk and other sustainability objectives can be more severe, for example, minimizing heat risk is only possible if future development is allowed to sprawl significantly. The results highlight the importance of spatial structure in modulating risks and other sustainability objectives. However, not all planning objectives are suited to quantified optimization and so the results should form part of an evidence base to improve the delivery of risk and sustainability management in future urban development.
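
    The core of such a framework is keeping only non-dominated plans. The sketch below applies a simple Pareto-dominance filter to randomly scored candidate plans (all objectives minimized); the scores are invented and the genetic-algorithm search itself is not shown.

```python
# Extract the Pareto-optimal (non-dominated) set from candidate plans. Scores are invented.
import numpy as np

rng = np.random.default_rng(2)
# columns: heat risk, flood risk, travel cost, sprawl (all to be minimized)
scores = rng.random((50, 4))

def pareto_front(points):
    """Return indices of points not dominated by any other point (all objectives minimized)."""
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points <= p, axis=1) & np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(scores)
print(f"{len(front)} non-dominated plans out of {len(scores)} candidates")
```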

  4. Risk-Reliability Programming for Optimal Water Quality Control

    NASA Astrophysics Data System (ADS)

    Simonovic, Slobodan P.; Orlob, Gerald T.

    1984-06-01

    A risk-reliability programming approach is developed for optimal allocation of releases for control of water quality downstream of a multipurpose reservoir. Additionally, the approach allows the evaluation of optimal risk/reliability values. Risk is defined as a probability of not satisfying constraints given in probabilistic form, e.g., encroachment of water quality reservation on that for flood control. The objective function includes agricultural production losses that are functions of water quality, and risk-losses associated with encroachment of the water quality control functions on reservations for flood control, fisheries, and irrigation. The approach is demonstrated using data from New Melones Reservoir on the Stanislaus River in California. Results indicate that an optimum water quality reservation exists for a given set of quality targets and loss functions. Additional analysis is presented to determine the sensitivity of optimization results to agricultural production loss functions and the influence of statistically different river flows on the optimal reservoir storage for water quality control. Results indicate the dependence of an optimum water quality reservation on agricultural production losses and hydrologic conditions.

  5. Optimal linear and nonlinear feature extraction based on the minimization of the increased risk of misclassification. [Bayes theorem - statistical analysis/data processing

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R. J. P.

    1974-01-01

    General classes of nonlinear and linear transformations were investigated for the reduction of the dimensionality of the classification (feature) space so that, for a prescribed dimension m of this space, the increase of the misclassification risk is minimized.

  6. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed

  7. 78 FR 76521 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... RIN 7100 AD-98 Risk-Based Capital Guidelines; Market Risk AGENCY: Board of Governors of the Federal...) is adopting a final rule that revises its market risk capital rule (market risk rule) to address... Cooperation and Development (OECD), which are referenced in the Board's market risk rule; to clarify...

  8. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.

  9. 76 FR 1889 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... and 225 Federal Deposit Insurance Corporation 12 CFR Part 325 Risk-Based Capital Guidelines: Market... CORPORATION 12 CFR Part 325 RIN 3064-AD70 Risk-Based Capital Guidelines: Market Risk AGENCY: Office of the... proposal to revise their market risk capital rules to modify their scope to better capture positions...

  10. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  11. Optimal Antiviral Switching to Minimize Resistance Risk in HIV Therapy

    PubMed Central

    Luo, Rutao; Piovoso, Michael J.; Martinez-Picado, Javier; Zurakowski, Ryan

    2011-01-01

    The development of resistant strains of HIV is the most significant barrier to effective long-term treatment of HIV infection. The most common causes of resistance development are patient noncompliance and pre-existence of resistant strains. In this paper, methods of antiviral regimen switching are developed that minimize the risk of pre-existing resistant virus emerging during therapy switches necessitated by virological failure. Two distinct cases are considered: a single previous virological failure and multiple virological failures. These methods use optimal control approaches on experimentally verified mathematical models of HIV strain competition and statistical models of resistance risk. It is shown that, theoretically, an order-of-magnitude reduction in risk can be achieved, and multiple previous virological failures enable greater success of these methods in reducing the risk of subsequent treatment failures. PMID:22073250

  12. Optimal Combination Treatment and Vascular Outcomes in Recent Ischemic Stroke Patients by Premorbid Risk Level

    PubMed Central

    Park, Jong-Ho; Ovbiagele, Bruce

    2015-01-01

    Background Optimal combination of secondary stroke prevention treatment including antihypertensives, antithrombotic agents, and lipid modifiers is associated with reduced recurrent vascular risk including stroke. It is unclear whether optimal combination treatment has a differential impact on stroke patients based on level of vascular risk. Methods We analyzed a clinical trial dataset comprising 3680 recent non-cardioembolic stroke patients aged ≥35 years and followed for 2 years. Patients were categorized by appropriateness level 0 to III depending on the number of the drugs prescribed divided by the number of drugs potentially indicated for each patient (0=none of the indicated medications prescribed and III=all indicated medications prescribed [optimal combination treatment]). High-risk was defined as having a history of stroke or coronary heart disease (CHD) prior to the index stroke event. Independent associations of medication appropriateness level with a major vascular event (stroke, CHD, or vascular death), ischemic stroke, and all-cause death were analyzed. Results Compared with level 0, for major vascular events, the HR of level III in the low-risk group was 0.51 (95% CI: 0.20–1.28) and 0.32 (0.14–0.70) in the high-risk group; for stroke, the HR of level III in the low-risk group was 0.54 (0.16–1.77) and 0.25 (0.08–0.85) in the high-risk group; and for all-cause death, the HR of level III in the low-risk group was 0.66 (0.09–5.00) and 0.22 (0.06–0.78) in the high-risk group. Conclusion Optimal combination treatment is related to a significantly lower risk of future vascular events and death among high-risk patients after a recent non-cardioembolic stroke. PMID:26044963

  13. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision-making.
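
    A minimal toy model (assumed numbers and functional forms, not the study's) can illustrate the gap the abstract describes between independent, self-interested levee design and the social planner's jointly optimal design:

        # Toy sketch of Nash vs. social-planner levee heights (all values assumed).
        # Annualized cost per bank = construction cost + expected flood damage;
        # a bank's flood probability falls with its own levee height but rises with
        # the neighbour's height (flood risk transfer across the river).
        import itertools

        HEIGHTS = [h / 2 for h in range(2, 13)]     # candidate levee heights, 1.0-6.0 m
        DAMAGE = {"A": 100.0, "B": 60.0}            # assumed damage potential per bank
        BUILD = 2.0                                 # assumed construction cost per metre

        def flood_prob(own, other):
            # assumed form: your levee protects you, the neighbour's levee pushes water your way
            return max(0.0, min(1.0, 0.5 - 0.08 * own + 0.04 * other))

        def cost(bank, own, other):
            return BUILD * own + DAMAGE[bank] * flood_prob(own, other)

        def best_response(bank, other):
            return min(HEIGHTS, key=lambda h: cost(bank, h, other))

        # iterated best response approximates the non-cooperative Nash equilibrium
        hA, hB = HEIGHTS[0], HEIGHTS[0]
        for _ in range(50):
            hA, hB = best_response("A", hB), best_response("B", hA)

        # the social planner minimizes total system cost over both heights jointly
        hA_s, hB_s = min(itertools.product(HEIGHTS, HEIGHTS),
                         key=lambda p: cost("A", p[0], p[1]) + cost("B", p[1], p[0]))
        print("Nash:", (hA, hB), "Planner:", (hA_s, hB_s))   # planner shifts risk to the low-damage bank

    In this toy model the planner's solution leaves the lower-value bank more exposed, which is exactly why a compensation mechanism is needed before such a design could be implemented, as the abstract notes.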

  14. Reduction of radiation risks in patients undergoing some X-ray examinations by using optimal projections: A Monte Carlo program-based mathematical calculation

    PubMed Central

    Chaparian, A.; Kanani, A.; Baghbanian, M.

    2014-01-01

    The objectives of this paper were calculation and comparison of the effective doses, the risks of exposure-induced cancer, and dose reduction in the gonads for male and female patients in different projections of some X-ray examinations. Radiographies of the lumbar spine [in the eight projections of anteroposterior (AP), posteroanterior (PA), right lateral (RLAT), left lateral (LLAT), right anterior-posterior oblique (RAO), left anterior-posterior oblique (LAO), right posterior-anterior oblique (RPO), and left posterior-anterior oblique (LPO)], abdomen (in the two projections of AP and PA), and pelvis (in the two projections of AP and PA) were investigated. A solid-state dosimeter was used for measuring the entrance skin exposure. A Monte Carlo program was used for calculation of effective doses, the risks of radiation-induced cancer, and doses to the gonads related to the different projections. Results of this study showed that the PA projection of abdomen, lumbar spine, and pelvis radiographies caused 50%-57% lower effective doses than the AP projection and a 50%-60% reduction in radiation risks. Also, use of the LAO projection of the lumbar spine X-ray examination caused a 53% lower effective dose than the RPO projection and 56% and 63% reductions in radiation risk for males and females, respectively, and the RAO projection caused a 28% lower effective dose than the LPO projection and 52% and 39% reductions in radiation risk for males and females, respectively. Regarding dose reduction in the gonads, use of the PA position rather than AP in radiographies of the abdomen, lumbar spine, and pelvis can reduce the ovarian doses in women by 38%, 31%, and 25%, respectively, and the testicular doses in males by 76%, 86%, and 94%, respectively. Also, for oblique projections of the lumbar spine X-ray examination, employment of LAO rather than RPO and RAO rather than LPO demonstrated 22% and 13% reductions in the ovarian doses and 66% and 54% reductions in the testicular doses

  15. Optimal CO2 mitigation under damage risk valuation

    NASA Astrophysics Data System (ADS)

    Crost, Benjamin; Traeger, Christian P.

    2014-07-01

    The current generation has to set mitigation policy under uncertainty about the economic consequences of climate change. This uncertainty governs both the level of damages for a given level of warming, and the steepness of the increase in damage per warming degree. Our model of climate and the economy is a stochastic version of a model employed in assessing the US Social Cost of Carbon (DICE). We compute the optimal carbon taxes and CO2 abatement levels that maximize welfare from economic consumption over time under different risk states. In accordance with recent developments in finance, we separate preferences about time and risk to improve the model's calibration of welfare to observed market interest. We show that introducing the modern asset pricing framework doubles optimal abatement and carbon taxation. Uncertainty over the level of damages at a given temperature increase can result in a slight increase of optimal emissions as compared to using expected damages. In contrast, uncertainty governing the steepness of the damage increase in temperature results in a substantially higher level of optimal mitigation.

  16. Vehicle Shield Optimization and Risk Assessment of Future NEO Missions

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem, N.; Kim, Myung-Hee; Cucinotta, Francis A.

    2011-01-01

    Future human space missions target far destinations such as Near Earth Objects (NEO) or Mars that require extended stays in hostile radiation environments in deep space. The continuous assessment of exploration vehicles is needed to iteratively optimize the designs for shielding protection and to calculate the risks associated with such long missions. We use a predictive software capability that calculates the risks to humans inside a spacecraft. The software uses the CAD software Pro/Engineer and the Fishbowl tool kit to quantify the radiation shielding properties of the spacecraft geometry by calculating the areal density seen at a certain point, the dose point, inside the spacecraft. The shielding results are used by NASA-developed software, BRYNTRN, to quantify the organ doses received in a human body located in the vehicle in possible solar particle events (SPE) during such prolonged space missions. The organ doses are used to quantify the risks posed to the astronauts' health and life using the NASA Space Cancer Model software. An illustration of the shielding optimization and risk calculation on an exploration vehicle design suitable for a NEO mission is provided in this study. The vehicle capsule is made of an aluminum shell and an airlock with hydrogen-rich carbon composite end caps. The capsule contains sets of racks that surround a working and living area. A water shelter is provided in the middle of the vehicle to enhance the shielding in case of an SPE. The mass distribution is optimized to minimize radiation hotspots, and an assessment of the risks associated with a NEO mission is calculated.

  17. Particle swarm optimization based space debris surveillance network scheduling

    NASA Astrophysics Data System (ADS)

    Jiang, Hai; Liu, Jing; Cheng, Hao-Wen; Zhang, Yao

    2017-02-01

    The increasing number of space debris has created an orbital debris environment that poses increasing impact risks to existing space systems and human space flights. For the safety of in-orbit spacecrafts, we should optimally schedule surveillance tasks for the existing facilities to allocate resources in a manner that most significantly improves the ability to predict and detect events involving affected spacecrafts. This paper analyzes two criteria that mainly affect the performance of a scheduling scheme and introduces an artificial intelligence algorithm into the scheduling of tasks of the space debris surveillance network. A new scheduling algorithm based on the particle swarm optimization algorithm is proposed, which can be implemented in two different ways: individual optimization and joint optimization. Numerical experiments with multiple facilities and objects are conducted based on the proposed algorithm, and simulation results have demonstrated the effectiveness of the proposed algorithm.
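
    The velocity and position updates at the core of particle swarm optimization can be sketched generically as follows (an illustrative toy on a continuous test function; the paper's scheduler encodes surveillance tasks and facility constraints rather than a sphere function):

        # Generic particle swarm optimization sketch (illustrative only).
        import random

        def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]                      # personal best positions
            pbest_val = [objective(p) for p in pos]
            gbest = min(pbest, key=objective)[:]             # global best position
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] += vel[i][d]
                    val = objective(pos[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = pos[i][:], val
                        if val < objective(gbest):
                            gbest = pos[i][:]
            return gbest

        # Example: minimize a simple sphere function in three dimensions
        print(pso(lambda x: sum(v * v for v in x), dim=3))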

  18. 78 FR 43829 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... CFR Parts 208 and 225 RIN 7100 AD-98 Risk-Based Capital Guidelines; Market Risk AGENCY: Board of... Governors of the Federal Reserve System (Board) proposes to revise its market risk capital rule (market risk... Organization for Economic Cooperation and Development (OECD), which are referenced in the Board's market...

  19. Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Kuwata, Yoshiaki

    2013-01-01

    A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework. It was demonstrated by a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision-making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states. Such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach is called risk allocation, which decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint was reformulated into a constraint on the expectation of a sum of indicator functions, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem can be turned into an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.
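
    In the notation assumed here for illustration (the paper's symbols may differ), the risk-allocation idea conservatively replaces the joint chance constraint with per-stage constraints whose allocated risks sum to the overall bound, via Boole's inequality:

        % Risk allocation (notation assumed for illustration): a joint chance
        % constraint over N stages is conservatively decomposed, via Boole's
        % inequality, into individual per-stage chance constraints.
        \Pr\Big[\,\bigcup_{i=1}^{N} \{x_i \notin \mathcal{S}_i\}\Big] \le \Delta
        \quad\Longleftarrow\quad
        \Pr\big[x_i \notin \mathcal{S}_i\big] \le \delta_i \;\;(i = 1,\dots,N),
        \qquad \sum_{i=1}^{N} \delta_i \le \Delta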

  20. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating, and it could also threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of API 581 places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism most prominent throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.

  1. Shape optimization of pulsatile ventricular assist devices using FSI to minimize thrombotic risk

    NASA Astrophysics Data System (ADS)

    Long, C. C.; Marsden, A. L.; Bazilevs, Y.

    2014-10-01

    In this paper we perform shape optimization of a pediatric pulsatile ventricular assist device (PVAD). The device simulation is carried out using fluid-structure interaction (FSI) modeling techniques within a computational framework that combines FEM for fluid mechanics and isogeometric analysis for structural mechanics modeling. The PVAD FSI simulations are performed under realistic conditions (i.e., flow speeds, pressure levels, boundary conditions, etc.), and account for the interaction of air, blood, and a thin structural membrane separating the two fluid subdomains. The shape optimization study is designed to reduce thrombotic risk, a major clinical problem in PVADs. Thrombotic risk is quantified in terms of particle residence time in the device blood chamber. Methods to compute particle residence time in the context of moving spatial domains are presented in a companion paper published in the same issue (Comput Mech, doi: 10.1007/s00466-013-0931-y, 2013). The surrogate management framework, a derivative-free pattern search optimization method that relies on surrogates for increased efficiency, is employed in this work. For the optimization study shown here, particle residence time is used to define a suitable cost or objective function, while four adjustable design optimization parameters are used to define the device geometry. The FSI-based optimization framework is implemented in a parallel computing environment, and deployed with minimal user intervention. Using five SEARCH/POLL steps, the optimization scheme identifies a PVAD design with significantly better throughput efficiency than the original device.

  2. Optimal Hops-Based Adaptive Clustering Algorithm

    NASA Astrophysics Data System (ADS)

    Xuan, Xin; Chen, Jian; Zhen, Shanshan; Kuo, Yonghong

    This paper proposes an optimal hops-based adaptive clustering algorithm (OHACA). The algorithm sets an energy selection threshold before the cluster forms so that the nodes with less energy are more likely to go to sleep immediately. In the setup phase, OHACA introduces an adaptive mechanism to adjust cluster heads and balance load. The optimal distance theory is then applied to discover the practical optimal routing path that minimizes the total transmission energy. Simulation results show that OHACA prolongs the life of the network, improves the utilization rate, and transmits more data because of energy balance.

  3. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and for a major levee on a large river protecting a more valuable urban area. Optimized results show a higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
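
    A toy version of the expected-annual-total-cost objective described above, with assumed fragility and cost models standing in for the study's, might look like:

        # Toy sketch (assumed numbers, not the study's): total annual cost of a levee
        # = annualized construction cost + expected annual damage from overtopping
        # plus intermediate (through-seepage) failure, minimized over height and crown width.
        import math, itertools

        DAMAGE = 200.0            # assumed flood damage if the levee fails ($ million)
        CRF = 0.05                # assumed capital recovery factor for annualizing construction

        def overtop_prob(height):
            # assumed annual probability that the water level exceeds the levee height
            return math.exp(-height)

        def seepage_prob(width):
            # assumed fragility: a wider crown lowers the through-seepage failure probability
            return 0.2 * math.exp(-width / 5.0)

        def build_cost(height, width):
            return 20.0 * height + 5.0 * width      # assumed construction cost model

        def annual_total_cost(height, width):
            p_fail = overtop_prob(height) + (1 - overtop_prob(height)) * seepage_prob(width)
            return CRF * build_cost(height, width) + p_fail * DAMAGE

        best = min(itertools.product([h / 2 for h in range(2, 21)],   # heights 1-10 m
                                     list(range(2, 21))),             # crown widths 2-20 m
                   key=lambda hw: annual_total_cost(*hw))
        print("optimal (height, crown width):", best)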

  4. Decision making in flood risk based storm sewer network design.

    PubMed

    Sun, S A; Djordjević, S; Khu, S T

    2011-01-01

    It is widely recognised that flood risk needs to be taken into account when designing a storm sewer network. Flood risk is generally a combination of flood consequences and flood probabilities. This paper aims to explore the decision making in flood risk based storm sewer network design. A multiobjective optimization is proposed to find the Pareto front of optimal designs in terms of low construction cost and low flood risk. The decision making process then follows this multi-objective optimization to select a best design from the Pareto front. The traditional way of designing a storm sewer system based on a predefined design storm is used as one of the decision making criteria. Additionally, three commonly used risk based criteria, i.e., the expected flood risk based criterion, the Hurwicz criterion and the stochastic dominance based criterion, are investigated and applied in this paper. Different decisions are made according to different criteria as a result of different concerns represented by the criteria. The proposed procedure is applied to a simple storm sewer network design to demonstrate its effectiveness and the different criteria are compared.
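
    As a hedged sketch of how two of the cited decision criteria can rank the same Pareto-optimal designs differently (the designs, probabilities, and damages below are invented):

        # Illustrative selection of a sewer design from a cost/flood-risk Pareto front
        # using the expected-risk and Hurwicz criteria; all numbers are assumed.

        # Each design: (construction cost, {scenario: (probability, flood damage)})
        designs = {
            "D1": (100, {"wet": (0.1, 300), "dry": (0.9, 5)}),
            "D2": (150, {"wet": (0.1, 60),  "dry": (0.9, 2)}),
            "D3": (200, {"wet": (0.1, 10),  "dry": (0.9, 1)}),
        }

        def expected_total(cost, scenarios):
            return cost + sum(p * dmg for p, dmg in scenarios.values())

        def hurwicz_total(cost, scenarios, alpha=0.6):
            # alpha weights the worst case against the best case (pessimism index)
            damages = [dmg for _, dmg in scenarios.values()]
            return cost + alpha * max(damages) + (1 - alpha) * min(damages)

        best_expected = min(designs, key=lambda d: expected_total(*designs[d]))
        best_hurwicz = min(designs, key=lambda d: hurwicz_total(*designs[d]))
        print(best_expected, best_hurwicz)   # the risk-averse criterion picks a safer design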

  5. Young drivers' optimism bias for accident risk and driving skill: Accountability and insight experience manipulations.

    PubMed

    White, Melanie J; Cunningham, Lauren C; Titchener, Kirsteen

    2011-07-01

    This study aimed to determine whether two brief, low cost interventions would reduce young drivers' optimism bias for their driving skills and accident risk perceptions. This tendency for such drivers to perceive themselves as more skillful and less prone to driving accidents than their peers may lead to less engagement in precautionary driving behaviours and a greater engagement in more dangerous driving behaviour. 243 young drivers (aged 17-25 years) were randomly allocated to one of three groups: accountability, insight or control. All participants provided both overall and specific situation ratings of their driving skills and accident risk relative to a typical young driver. Prior to completing the questionnaire, those in the accountability condition were first advised that their driving skills and accident risk would be later assessed via a driving simulator. Those in the insight condition first underwent a difficult computer-based hazard perception task designed to provide participants with insight into their potential limitations when responding to hazards in difficult and unpredictable driving situations. Participants in the control condition completed only the questionnaire. Results showed that the accountability manipulation was effective in reducing optimism bias in terms of participants' comparative ratings of their accident risk in specific situations, though only for less experienced drivers. In contrast, among more experienced males, participants in the insight condition showed greater optimism bias for overall accident risk than their counterparts in the accountability or control groups. There were no effects of the manipulations on drivers' skills ratings. The differential effects of the two types of manipulations on optimism bias relating to one's accident risk in different subgroups of the young driver sample highlight the importance of targeting interventions for different levels of experience. Accountability interventions may be beneficial for

  6. A risk-reduction approach for optimal software release time determination with the delay incurred cost

    NASA Astrophysics Data System (ADS)

    Peng, Rui; Li, Yan-Fu; Zhang, Jun-Guang; Li, Xiang

    2015-07-01

    Most existing research on software release time determination assumes that parameters of the software reliability model (SRM) are deterministic and the reliability estimate is accurate. In practice, however, there exists a risk that the reliability requirement cannot be guaranteed due to the parameter uncertainties in the SRM, and such risk can be as high as 50% when the mean value is used. It is necessary for the software project managers to reduce the risk to a lower level by delaying the software release, which inevitably increases the software testing costs. In order to incorporate the managers' preferences over these two factors, a decision model based on multi-attribute utility theory (MAUT) is developed for the determination of optimal risk-reduction release time.
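
    A minimal Monte Carlo sketch of the risk being traded against delay, assuming a Goel-Okumoto NHPP as the SRM and placeholder parameter distributions (neither is taken from the paper):

        # Sketch (assumed model): risk that the reliability target is missed at release
        # time T when the SRM parameters are uncertain, using a Goel-Okumoto mean value
        # function m(t) = a(1 - exp(-b t)) with placeholder parameter distributions.
        import math, random

        def risk_of_missing_target(T, target_intensity=0.05, n_samples=20000):
            misses = 0
            for _ in range(n_samples):
                a = max(1e-6, random.gauss(100.0, 10.0))   # assumed uncertain fault content
                b = max(1e-6, random.gauss(0.05, 0.01))    # assumed uncertain detection rate
                if a * b * math.exp(-b * T) > target_intensity:   # failure intensity at release
                    misses += 1
            return misses / n_samples

        for T in (60, 80, 100, 120):
            print(T, round(risk_of_missing_target(T), 3))   # delaying release lowers the risk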

  7. Optimization-based Dynamic Human Lifting Prediction

    DTIC Science & Technology

    2008-06-01

    Anith Mathai, Steve Beck, Timothy Marler, Jingzhou Yang, Jasbir S. Arora, Karim Abdel-Malek, Virtual Soldier Research Program, Center for Computer Aided...Rahmatalla, S., Kim, J., Marler, T., Beck, S., Yang, J., busek, J., Arora, J.S., and Abdel-Malek, K. Optimization-based dynamic human walking prediction

  8. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
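
    A bare-bones illustration of the surrogate-based loop (using a one-dimensional quadratic surrogate as a stand-in; the paper's injector study uses far richer surrogates and designs of experiments):

        # Minimal surrogate-based optimization sketch (illustrative only): fit a cheap
        # quadratic surrogate to a few expensive samples, optimize the surrogate, then
        # evaluate the true function at the surrogate optimum and refit.
        import numpy as np

        def expensive(x):                      # stand-in for a high-fidelity simulation
            return (x - 2.0) ** 2 + 0.3 * np.sin(5 * x)

        X = np.linspace(0.0, 4.0, 5)           # initial design of experiments
        y = expensive(X)

        for _ in range(5):
            coeffs = np.polyfit(X, y, 2)       # quadratic surrogate of the objective
            grid = np.linspace(0.0, 4.0, 401)
            x_new = grid[np.argmin(np.polyval(coeffs, grid))]   # optimize the surrogate
            X = np.append(X, x_new)            # evaluate the truth at the surrogate optimum
            y = np.append(y, expensive(x_new))

        print("best sampled design:", X[np.argmin(y)], "objective:", y.min())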

  9. Parameter optimization toward optimal microneedle-based dermal vaccination.

    PubMed

    van der Maaden, Koen; Varypataki, Eleni Maria; Yu, Huixin; Romeijn, Stefan; Jiskoot, Wim; Bouwstra, Joke

    2014-11-20

    Microneedle-based vaccination has several advantages over vaccination by using conventional hypodermic needles. Microneedles are used to deliver a drug into the skin in a minimally-invasive and potentially pain free manner. Besides, the skin is a potent immune organ that is highly suitable for vaccination. However, there are several factors that influence the penetration ability of the skin by microneedles and the immune responses upon microneedle-based immunization. In this study we assessed several different microneedle arrays for their ability to penetrate ex vivo human skin by using trypan blue and (fluorescently or radioactively labeled) ovalbumin. Next, these different microneedles and several factors, including the dose of ovalbumin, the effect of using an impact-insertion applicator, skin location of microneedle application, and the area of microneedle application, were tested in vivo in mice. The penetration ability and the dose of ovalbumin that is delivered into the skin were shown to be dependent on the use of an applicator and on the microneedle geometry and size of the array. Besides microneedle penetration, the above described factors influenced the immune responses upon microneedle-based vaccination in vivo. It was shown that the ovalbumin-specific antibody responses upon microneedle-based vaccination could be increased up to 12-fold when an impact-insertion applicator was used, up to 8-fold when microneedles were applied over a larger surface area, and up to 36-fold dependent on the location of microneedle application. Therefore, these influencing factors should be considered to optimize microneedle-based dermal immunization technologies.

  10. Optimal network solution for proactive risk assessment and emergency response

    NASA Astrophysics Data System (ADS)

    Cai, Tianxing

    Coupled with continuous development in the field of industrial operation management, the requirement for operation optimization in large-scale manufacturing networks has provoked more interest in engineering research. Compared with the traditional approach of taking remedial measures after the occurrence of an emergency event or abnormal situation, current operation control calls for more proactive risk assessment to set up early warning systems and comprehensive emergency response planning. Among all industries, the chemical and energy industries are more likely to face abnormal and emergency situations due to their own industry characteristics. Therefore, the purpose of this study is to develop methodologies to aid emergency response planning and proactive risk assessment in the above two industries. The efficacy of the developed methodologies is demonstrated via two real industrial problems. The first case handles energy network dispatch optimization under an emergency of local energy shortage caused by extreme conditions such as earthquakes, tsunamis, and hurricanes, which may cause local areas to suffer from delayed rescues, widespread power outages, tremendous economic losses, and even public safety threats. In such urgent events of local energy shortage, agile energy dispatching through an effective energy transportation network, targeting the minimum energy recovery time, should be a top priority. The second case is a scheduling methodology to coordinate multiple chemical plants' start-ups in order to minimize regional air quality impacts under extreme meteorological conditions. The objective is to reschedule the multi-plant start-up sequence to achieve the minimum sum of delay time compared to the expected start-up time of each plant. All these approaches can provide quantitative decision support for multiple stakeholders, including government and environment agencies, chemical industry, energy industry and local

  11. Risk-based maintenance of ethylene oxide production facilities.

    PubMed

    Khan, Faisal I; Haddara, Mahmoud R

    2004-05-20

    This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of many likely failure scenarios, the ones which are most probable are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Out of the five most hazardous units considered, the pipeline used for the transportation of the ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal risk factors and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model as well as the error in the distribution parameters on the maintenance interval.
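
    The core RBM calculation, risk as failure probability times consequence compared against an acceptable threshold, can be sketched with assumed placeholder numbers (not the facility's data):

        # Sketch of the risk-based maintenance idea (all parameters assumed): pick the
        # longest inspection interval whose risk (failure probability within the
        # interval x consequence of failure) stays below the acceptable level.
        import math

        MU, SIGMA = 2.3, 0.6        # assumed lognormal time-to-failure parameters (ln years)
        CONSEQUENCE = 5.0e6         # assumed consequence of a failure ($)
        ACCEPTABLE_RISK = 1.0e5     # assumed acceptable risk per inspection interval ($)

        def lognormal_cdf(t, mu, sigma):
            return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

        def interval_risk(years):
            # risk = probability of failing within the interval x consequence of failure
            return lognormal_cdf(years, MU, SIGMA) * CONSEQUENCE

        interval = 1.0
        while interval_risk(interval + 1.0) <= ACCEPTABLE_RISK:
            interval += 1.0
        print("longest acceptable inspection interval (years):", interval)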

  12. Isotretinoin Oil-Based Capsule Formulation Optimization

    PubMed Central

    Tsai, Pi-Ju; Huang, Chi-Te; Lee, Chen-Chou; Li, Chi-Lin; Huang, Yaw-Bin; Tsai, Yi-Hung; Wu, Pao-Chu

    2013-01-01

    The purpose of this study was to develop and optimize an isotretinoin oil-based capsule with specific dissolution pattern. A three-factor-constrained mixture design was used to prepare the systemic model formulations. The independent factors were the components of oil-based capsule including beeswax (X1), hydrogenated coconut oil (X2), and soybean oil (X3). The drug release percentages at 10, 30, 60, and 90 min were selected as responses. The effect of formulation factors including that on responses was inspected by using response surface methodology (RSM). Multiple-response optimization was performed to search for the appropriate formulation with specific release pattern. It was found that the interaction effect of these formulation factors (X1X2, X1X3, and X2X3) showed more potential influence than that of the main factors (X1, X2, and X3). An optimal predicted formulation with Y10 min, Y30 min, Y60 min, and Y90 min release values of 12.3%, 36.7%, 73.6%, and 92.7% at X1, X2, and X3 of 5.75, 15.37, and 78.88, respectively, was developed. The new formulation was prepared and performed by the dissolution test. The similarity factor f2 was 54.8, indicating that the dissolution pattern of the new optimized formulation showed equivalence to the predicted profile. PMID:24068886

  13. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, a risk and reliability centered maintenance (RRCM) framework was introduced with a shift in the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper the authors explain a novel methodology for risk quantification and ranking the critical items for prioritizing the maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of RPN values of items of a system, and the maintenance significant precipitating factors (MSPF) of items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity besides other factors. The third risk coefficient is called the hazardous risk coefficient; it is due to anticipated hazards that may occur in the future, and the risk is deduced from criteria of consequences on safety, environment, maintenance, and economic risks with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence, 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and then the final rankings of critical items are estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  14. Optimizing footwear for older people at risk of falls.

    PubMed

    Menant, Jasmine C; Steele, Julie R; Menz, Hylton B; Munro, Bridget J; Lord, Stephen R

    2008-01-01

    Footwear influences balance and the subsequent risk of slips, trips, and falls by altering somatosensory feedback to the foot and ankle and modifying frictional conditions at the shoe/floor interface. Walking indoors barefoot or in socks and walking indoors or outdoors in high-heel shoes have been shown to increase the risk of falls in older people. Other footwear characteristics such as heel collar height, sole hardness, and tread and heel geometry also influence measures of balance and gait. Because many older people wear suboptimal shoes, maximizing safe shoe use may offer an effective fall prevention strategy. Based on findings of a systematic literature review, older people should wear shoes with low heels and firm slip-resistant soles both inside and outside the home. Future research should investigate the potential benefits of tread sole shoes for preventing slips and whether shoes with high collars or flared soles can enhance balance when challenging tasks are undertaken.

  15. LP based approach to optimal stable matchings

    SciTech Connect

    Teo, Chung-Piaw; Sethuraman, J.

    1997-06-01

    We study the classical stable marriage and stable roommates problems using a polyhedral approach. We propose a new LP formulation for the stable roommates problem. This formulation is non-empty if and only if the underlying roommates problem has a stable matching. Furthermore, for certain special weight functions on the edges, we construct a 2-approximation algorithm for the optimal stable roommates problem. Our technique uses a crucial geometry of the fractional solutions in this formulation. For the stable marriage problem, we show that a related geometry allows us to express any fractional solution in the stable marriage polytope as a convex combination of stable marriage solutions. This leads to a genuinely simple proof of the integrality of the stable marriage polytope. Based on these ideas, we devise a heuristic to solve the optimal stable roommates problem. The heuristic combines the power of rounding and cutting-plane methods. We present some computational results based on preliminary implementations of this heuristic.
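
    For reference, a commonly cited LP relaxation of the stable marriage problem (shown here for illustration; the paper's roommates formulation is related but not identical) combines assignment constraints with a stability constraint for every acceptable pair:

        % A standard LP relaxation of stable marriage, shown for illustration only.
        % x_{ij} is the fractional assignment of man i to woman j; "\succ_i" means
        % "strictly preferred by i to".
        \sum_{j} x_{ij} \le 1 \;\;\forall i, \qquad
        \sum_{i} x_{ij} \le 1 \;\;\forall j, \qquad x_{ij} \ge 0,
        \qquad
        x_{ij} + \sum_{j' \succ_i j} x_{ij'} + \sum_{i' \succ_j i} x_{i'j} \ge 1
        \;\;\text{for every acceptable pair } (i,j).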

  16. EUD-based biological optimization for carbon ion therapy

    SciTech Connect

    Brüningk, Sarah C. Kamp, Florian; Wilkens, Jan J.

    2015-11-15

    therapy, the optimization by biological objective functions resulted in slightly superior treatment plans in terms of final EUD for the organs at risk (OARs) compared to voxel-based optimization approaches. This observation was made independent of the underlying objective function metric. An absolute gain in OAR sparing was observed for quadratic objective functions, whereas intersecting DVHs were found for logistic approaches. Even for considerable under- or overestimations of the used effect- or dose–volume parameters during the optimization, treatment plans were obtained that were similar in quality to the results of a voxel-based optimization. Conclusions: EUD-based optimization with either of the presented concepts can successfully be applied to treatment plan optimization. This makes EUD-based optimization for carbon ion therapy a useful tool to optimize more specifically in the sense of biological outcome while voxel-to-voxel variations of the biological effectiveness are still properly accounted for. This may be advantageous in terms of computational cost during treatment plan optimization but also enables a straightforward comparison of different fractionation schemes or treatment modalities.

  17. Optimization-based controller design for rotorcraft

    NASA Technical Reports Server (NTRS)

    Tsing, N.-K.; Fan, M. K. H.; Barlow, J.; Tits, A. L.; Tischler, M. B.

    1993-01-01

    An optimization-based methodology for linear control system design is outlined by considering the design of a controller for a UH-60 rotorcraft in hover. A wide range of design specifications is taken into account: internal stability, decoupling between longitudinal and lateral motions, handling qualities, and rejection of windgusts. These specifications are investigated while taking into account physical limitations in the swashplate displacements and rates of displacement. The methodology crucially relies on user-machine interaction for tradeoff exploration.

  18. Base distance optimization for SQUID gradiometers

    SciTech Connect

    Garachtchenko, A.; Matlashov, A.; Kraus, R.

    1998-12-31

    The measurement of magnetic fields generated by weak nearby biomagnetic sources is affected by ambient noise generated by distant sources both internal and external to the subject under study. External ambient noise results from sources with numerous origins, many of which are unpredictable in nature. Internal noise sources are biomagnetic in nature and result from muscle activity (such as the heart, eye blinks, respiration, etc.), pulsation associated with blood flow, surgical implants, etc. Any magnetic noise will interfere with measurements of magnetic sources of interest, such as magnetoencephalography (MEG), in various ways. One of the most effective methods of reducing the magnetic noise measured by the SQUID sensor is to use properly designed superconducting gradiometers. Here, the authors optimized the baseline length of SQUID-based symmetric axial gradiometers using computer simulation. The signal-to-noise ratio (SNR) was used as the optimization criteria. They found that in most cases the optimal baseline is not equal to the depth of the primary source, rather it has a more complex dependence on the gradiometer balance and the ambient magnetic noise. They studied both first and second order gradiometers in simulated shielded environments and only second order gradiometers in a simulated unshielded environment. The noise source was simulated as a distant dipolar source for the shielded cases. They present optimal gradiometer baseline lengths for the various simulated situations below.

  19. Optimal interference code based on machine learning

    NASA Astrophysics Data System (ADS)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, taking the m-sequence as a case study. Based on coding theory, we introduce the jamming methods and simulate the interference effect and probability model in MATLAB. According to the length of decoding time the adversary spends, we find the optimal formula and optimal coefficients based on machine learning, and then obtain the new optimal interference code. First, in the recognition phase, this study judges the effect of interference by simulating the length of time over the decoding period of the laser seeker. Next, we use laser active deception jamming to simulate the interference process in the tracking phase; this is the jamming method chosen in this study. In order to improve the performance of the interference, this paper simulates the model in MATLAB. We find the least number of pulse intervals that must be received, from which we can determine the precise interval number of the laser pointer for m-sequence encoding. In order to find the shortest spacing, we choose the greatest common divisor method. Then, combining this with the coding regularity found earlier, we restore the pulse interval of the pseudo-random code that has already been received. Finally, we can control the time period of laser interference, obtain the optimal interference code, and also increase the probability of interference.
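
    The greatest-common-divisor step mentioned above can be illustrated with an invented pulse-time sequence: the base pulse interval is recovered as the GCD of the spacings between received pulses:

        # Illustrative sketch of the GCD step (the pulse times below are invented):
        # the basic pulse interval of a pseudo-random (m-sequence) pulse train is
        # recovered as the greatest common divisor of the observed pulse spacings.
        from math import gcd
        from functools import reduce

        pulse_times = [0, 12, 20, 36, 44, 60]                   # assumed arrival times (microseconds)
        spacings = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
        base_interval = reduce(gcd, spacings)
        print("spacings:", spacings, "-> recovered base pulse interval:", base_interval)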

  20. Smell Detection Agent Based Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Vinod Chandra, S. S.

    2016-09-01

    In this paper, a novel nature-inspired optimization algorithm has been employed and the trained behaviour of dogs in detecting smell trails is adapted into computational agents for problem solving. The algorithm involves creation of a surface with smell trails and subsequent iteration of the agents in resolving a path. This algorithm can be applied in different computational constraints that incorporate path-based problems. Implementation of the algorithm can be treated as a shortest path problem for a variety of datasets. The simulated agents have been used to evolve the shortest path between two nodes in a graph. This algorithm is useful to solve NP-hard problems that are related to path discovery. This algorithm is also useful to solve many practical optimization problems. The extensive derivation of the algorithm can be enabled to solve shortest path problems.

  1. Nuclear insurance risk assessment using risk-based methodology

    SciTech Connect

    Wendland, W.G. )

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance.

  2. Pixel-based OPC optimization based on conjugate gradients.

    PubMed

    Ma, Xu; Arce, Gonzalo R

    2011-01-31

    Optical proximity correction (OPC) methods are resolution enhancement techniques (RET) used extensively in the semiconductor industry to improve the resolution and pattern fidelity of optical lithography. In pixel-based OPC (PBOPC), the mask is divided into small pixels, each of which is modified during the optimization process. Two critical issues in PBOPC are the required computational complexity of the optimization process, and the manufacturability of the optimized mask. Most current OPC optimization methods apply the steepest descent (SD) algorithm to improve image fidelity augmented by regularization penalties to reduce the complexity of the mask. Although simple to implement, the SD algorithm converges slowly. The existing regularization penalties, however, fall short in meeting the mask rule check (MRC) requirements often used in semiconductor manufacturing. This paper focuses on developing OPC optimization algorithms based on the conjugate gradient (CG) method which exhibits much faster convergence than the SD algorithm. The imaging formation process is represented by the Fourier series expansion model which approximates the partially coherent system as a sum of coherent systems. In order to obtain more desirable manufacturability properties of the mask pattern, a MRC penalty is proposed to enlarge the linear size of the sub-resolution assistant features (SRAFs), as well as the distances between the SRAFs and the main body of the mask. Finally, a projection method is developed to further reduce the complexity of the optimized mask pattern.
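
    A minimal nonlinear conjugate-gradient (Fletcher-Reeves) iteration on a toy quadratic illustrates why CG converges faster than steepest descent; the actual PBOPC cost combines the partially coherent imaging model with regularization and MRC penalties, not this toy function:

        # Minimal nonlinear conjugate-gradient (Fletcher-Reeves) sketch on a toy
        # ill-conditioned quadratic (illustrative only; not the paper's cost function).
        import numpy as np

        def cost(x):
            return 0.5 * (x[0] ** 2 + 50.0 * x[1] ** 2)

        def grad(x):
            return np.array([x[0], 50.0 * x[1]])

        x = np.array([5.0, 5.0])
        g = grad(x)
        d = -g
        for _ in range(50):
            if np.linalg.norm(g) < 1e-12:
                break
            # exact line search along d is available for a quadratic cost
            alpha = -(g @ d) / (d[0] * d[0] + 50.0 * d[1] * d[1])
            x = x + alpha * d
            g_new = grad(x)
            beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves update
            d = -g_new + beta * d
            g = g_new
        print("x* ~", x, "cost:", cost(x))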

  3. Wave based optimization of distributed vibration absorbers

    NASA Astrophysics Data System (ADS)

    Johnson, Marty; Batton, Brad

    2005-09-01

    The concept of distributed vibration absorbers or DVAs has been investigated in recent years as a method of vibration control and sound radiation control for large flexible structures. These devices are comprised of a distributed compliant layer with a distributed mass layer. When such a device is placed onto a structure it forms a sandwich panel configuration with a very soft core. With this configuration the main effect of the DVA is to create forces normal to the surface of the structure; it can be used at low frequencies either to add damping, where constrained-layer damping treatments are not very effective, or to pin the structure over a narrow frequency bandwidth (i.e., the large input impedance/vibration absorber approach). This paper analyses the behavior of these devices using a wave based approach and finds an optimal damping level for the control of broadband disturbances in panels. The optimal design is calculated by solving the differential equations for waves propagating in coupled plates. It is shown that the optimal damping calculated using the infinite case acts as a good "rule of thumb" for designing DVAs to control the vibration of finite panels. This is borne out in both numerical simulations and experiments.

  4. Risk-based SMA for Cubesats

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse

    2016-01-01

    This presentation conveys an approach for risk-based safety and mission assurance applied to cubesats. This presentation accompanies a NASA Goddard standard in development that provides guidance for building a mission success plan for cubesats based on the risk tolerance and resources available.

  5. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the

  6. Adaptive control based on retrospective cost optimization

    NASA Astrophysics Data System (ADS)

    Santillo, Mario A.

    This dissertation studies adaptive control of multi-input, multi-output, linear, time-invariant, discrete-time systems that are possibly unstable and nonminimum phase. We consider both gradient-based adaptive control as well as retrospective-cost-based adaptive control. Retrospective cost optimization is a measure of performance at the current time based on a past window of data and without assumptions about the command or disturbance signals. In particular, retrospective cost optimization acts as an inner loop to the adaptive control algorithm by modifying the performance variables based on the difference between the actual past control inputs and the recomputed past control inputs based on the current control law. We develop adaptive control algorithms that are effective for systems that are nonminimum phase. We consider discrete-time adaptive control since these control laws can be implemented directly in embedded code without requiring an intermediate discretization step. Furthermore, the adaptive controllers in this dissertation are developed under minimal modeling assumptions. In particular, the adaptive controllers require knowledge of the sign of the high-frequency gain and a sufficient number of Markov parameters to approximate the nonminimum-phase zeros (if any). No additional modeling information is necessary. The adaptive controllers presented in this dissertation are developed for full-state-feedback stabilization, static-output-feedback stabilization, as well as dynamic compensation for stabilization, command following, disturbance rejection, and model reference adaptive control. Lyapunov-based stability and convergence proofs are provided for special cases. We present numerical examples to illustrate the algorithms' effectiveness in handling systems that are unstable and/or nonminimum phase and to provide insight into the modeling information required for controller implementation.

  7. Cytogenetic bases for risk inference

    SciTech Connect

    Bender, M A

    1980-01-01

    Various environmental pollutants are suspected of being capable of causing cancers or genetic defects even at low levels of exposure. In order to estimate risk from exposure to these pollutants, it would be useful to have some indicator of exposure. It is suggested that chromosomes are ideally suited for this purpose. Through the phenomena of chromosome aberrations and sister chromatid exchanges (SCE), chromosomes respond to virtually all carcinogens and mutagens. Aberrations and SCE are discussed in the context of their use as indicators of increased risk to health by chemical pollutants. (ACR)

  8. Dispositional Optimism and Perceived Risk Interact to Predict Intentions to Learn Genome Sequencing Results

    PubMed Central

    Taber, Jennifer M.; Klein, William M. P.; Ferrer, Rebecca A.; Lewis, Katie L.; Biesecker, Leslie G.; Biesecker, Barbara B.

    2015-01-01

    Objective Dispositional optimism and risk perceptions are each associated with health-related behaviors and decisions and other outcomes, but little research has examined how these constructs interact, particularly in consequential health contexts. The predictive validity of risk perceptions for health-related information seeking and intentions may be improved by examining dispositional optimism as a moderator, and by testing alternate types of risk perceptions, such as comparative and experiential risk. Method Participants (n = 496) had their genomes sequenced as part of a National Institutes of Health pilot cohort study (ClinSeq®). Participants completed a cross-sectional baseline survey of various types of risk perceptions and intentions to learn genome sequencing results for differing disease risks (e.g., medically actionable, nonmedically actionable, carrier status) and to use this information to change their lifestyle/health behaviors. Results Risk perceptions (absolute, comparative, and experiential) were largely unassociated with intentions to learn sequencing results. Dispositional optimism and comparative risk perceptions interacted, however, such that individuals higher in optimism reported greater intentions to learn all 3 types of sequencing results when comparative risk was perceived to be higher than when it was perceived to be lower. This interaction was inconsistent for experiential risk and absent for absolute risk. Independent of perceived risk, participants high in dispositional optimism reported greater interest in learning risks for nonmedically actionable disease and carrier status, and greater intentions to use genome information to change their lifestyle/health behaviors. Conclusions The relationship between risk perceptions and intentions may depend on how risk perceptions are assessed and on degree of optimism. PMID:25313897

  9. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application, probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.

  10. Risk-based cleanup standards

    SciTech Connect

    Kennedy, W.E. Jr.

    1992-06-01

    The problems encountered during facility or land cleanup operations will provide challenges both to technology and regulatory agencies. Inevitably, the decisions of the federal agencies regulating cleanup activities have been controversial. The major dilemma facing government and industry is how to accomplish cleanup in a cost-effective manner while minimizing the risks to workers and the public.

  11. Risk-Sensitive Optimal Feedback Control Accounts for Sensorimotor Behavior under Uncertainty

    PubMed Central

    Nagengast, Arne J.; Braun, Daniel A.; Wolpert, Daniel M.

    2010-01-01

    Many aspects of human motor behavior can be understood using optimality principles such as optimal feedback control. However, these proposed optimal control models are risk-neutral; that is, they are indifferent to the variability of the movement cost. Here, we propose the use of a risk-sensitive optimal controller that incorporates movement cost variance either as an added cost (risk-averse controller) or as an added value (risk-seeking controller) to model human motor behavior in the face of uncertainty. We use a sensorimotor task to test the hypothesis that subjects are risk-sensitive. Subjects controlled a virtual ball undergoing Brownian motion towards a target. Subjects were required to minimize an explicit cost, in points, that was a combination of the final positional error of the ball and the integrated control cost. By testing subjects on different levels of Brownian motion noise and relative weighting of the position and control cost, we could distinguish between risk-sensitive and risk-neutral control. We show that subjects change their movement strategy pessimistically in the face of increased uncertainty in accord with the predictions of a risk-averse optimal controller. Our results suggest that risk-sensitivity is a fundamental attribute that needs to be incorporated into optimal feedback control models. PMID:20657657
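
    As a minimal illustration of the risk-sensitive objective described above (the mean-plus-weighted-variance form and the parameter name theta are assumptions for illustration, not the authors' exact model), a movement strategy can be scored on its expected cost plus a weighted cost variance, with the sign of the weight switching between risk-averse and risk-seeking behavior:

        import numpy as np

        def risk_sensitive_score(costs, theta):
            """Score a control strategy from sampled movement costs.

            costs : samples of the movement cost under motor/sensor noise
            theta : risk sensitivity; theta > 0 penalizes variance (risk-averse),
                    theta < 0 rewards variance (risk-seeking), theta = 0 is risk-neutral.
            """
            costs = np.asarray(costs, dtype=float)
            return costs.mean() + theta * costs.var()

        # Example: two strategies with equal mean cost but different variability.
        rng = np.random.default_rng(0)
        safe  = rng.normal(10.0, 1.0, 10000)   # low-variance strategy
        risky = rng.normal(10.0, 4.0, 10000)   # high-variance strategy
        print(risk_sensitive_score(safe, 0.1), risk_sensitive_score(risky, 0.1))
        # A risk-averse scorer (theta > 0) prefers the low-variance strategy.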

  12. Doing our best: optimization and the management of risk.

    PubMed

    Ben-Haim, Yakov

    2012-08-01

    Tools and concepts of optimization are widespread in decision-making, design, and planning. There is a moral imperative to "do our best." Optimization underlies theories in physics and biology, and economic theories often presume that economic agents are optimizers. We argue that in decisions under uncertainty, what should be optimized is robustness rather than performance. We discuss the equity premium puzzle from financial economics, and explain that the puzzle can be resolved by using the strategy of satisficing rather than optimizing. We discuss design of critical technological infrastructure, showing that satisficing of performance requirements--rather than optimizing them--is a preferable design concept. We explore the need for disaster recovery capability and its methodological dilemma. The disparate domains--economics and engineering--illuminate different aspects of the challenge of uncertainty and of the significance of robust-satisficing.
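
    A toy sketch of the robust-satisficing idea discussed above (the uncertainty model, performance functions, and numbers are invented for illustration): instead of maximizing nominal performance, one asks how large a deviation from the nominal model can be tolerated while a performance requirement is still met, and prefers the design with the largest such tolerance.

        import numpy as np

        def robustness(design_perf, nominal, requirement, horizons):
            """Largest uncertainty horizon h such that the worst-case performance
            over all parameter values within h of the nominal still meets the
            requirement.  design_perf(param) returns the performance."""
            best_h = 0.0
            for h in horizons:                       # horizons assumed sorted ascending
                params = np.linspace(nominal - h, nominal + h, 201)
                worst = min(design_perf(p) for p in params)
                if worst >= requirement:
                    best_h = h
                else:
                    break
            return best_h

        # Two hypothetical designs: A performs best nominally, B degrades more slowly.
        perf_A = lambda p: 10.0 - 4.0 * abs(p - 1.0)
        perf_B = lambda p: 8.0 - 1.0 * abs(p - 1.0)
        hs = np.linspace(0.0, 3.0, 31)
        print(robustness(perf_A, 1.0, 6.0, hs))   # nominally optimal, small robustness
        print(robustness(perf_B, 1.0, 6.0, hs))   # satisficing design, larger robustness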

  13. Optimizing mesenchymal stem cell-based therapeutics.

    PubMed

    Wagner, Joseph; Kean, Thomas; Young, Randell; Dennis, James E; Caplan, Arnold I

    2009-10-01

    Mesenchymal stem cell (MSC)-based therapeutics are showing significant benefit in multiple clinical trials conducted by both academic and commercial organizations, but obstacles remain for their large-scale commercial implementation. Recent studies have attempted to optimize MSC-based therapeutics by either enhancing their potency or increasing their delivery to target tissues. Overexpression of trophic factors or in vitro exposure to potency-enhancing factors are two approaches that are demonstrating success in preclinical animal models. Delivery enhancement strategies involving tissue-specific cytokine pathways or binding sites are also showing promise. Each of these strategies has its own set of distinct advantages and disadvantages when viewed with a mindset of ultimate commercialization and clinical utility.

  14. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plant construction and retrofit expenses, and potential tax relief, expressed probabilistically as net present value distributions over various forecast horizons.

  15. Temporal variation of optimal UV exposure time over Korea: risks and benefits of surface UV radiation

    NASA Astrophysics Data System (ADS)

    Lee, Y. G.; Koo, J. H.

    2015-12-01

    Solar UV radiation in the wavelength range between 280 and 400 nm has both positive and negative influences on the human body. Surface UV radiation is the main natural source of vitamin D, promoting bone and musculoskeletal health and reducing the risk of a number of cancers and other medical conditions. However, overexposure to surface UV radiation is significantly related to the majority of skin cancers, in addition to other negative health effects such as sunburn, skin aging, and some forms of eye cataracts. Therefore, it is important to estimate the optimal UV exposure time, representing a balance between reducing negative health effects and ensuring sufficient vitamin D production. Previous studies calculated erythemal UV and vitamin-D UV from the measured and modelled spectral irradiances, respectively, by weighting CIE Erythema and Vitamin D3 generation functions (Kazantzidis et al., 2009; Fioletov et al., 2010). In particular, McKenzie et al. (2009) suggested an algorithm to estimate vitamin-D production UV from erythemal UV (or UV index) and determined the optimum conditions of UV exposure based on skin type II according to Fitzpatrick (1988). Recently, there has been growing demand for information on the risks and benefits of surface UV radiation for public health in Korea; thus it is necessary to estimate an optimal UV exposure time suited to the skin type of East Asians. This study examined the relationship between erythemally weighted UV (UVEry) and vitamin D weighted UV (UVVitD) over Korea during 2004-2012. The temporal variations of the ratio (UVVitD/UVEry) were also analyzed, and the ratio as a function of UV index was applied in estimating the optimal UV exposure time. In summer, with high surface UV radiation, a short exposure time led to sufficient vitamin D production as well as erythema, and vice versa in winter. Thus, in winter a longer exposure time was needed to maximize UV benefits while minimizing UV risks.

  16. Optimizing Assurance: The Risk Regulation System in Relationships

    ERIC Educational Resources Information Center

    Murray, Sandra L.; Holmes, John G.; Collins, Nancy L.

    2006-01-01

    A model of risk regulation is proposed to explain how people balance the goal of seeking closeness to a romantic partner against the opposing goal of minimizing the likelihood and pain of rejection. The central premise is that confidence in a partner's positive regard and caring allows people to risk seeking dependence and connectedness. The risk…

  17. Risk preferences: consequences for test and treatment thresholds and optimal cutoffs.

    PubMed

    Felder, Stefan; Mayrhofer, Thomas

    2014-01-01

    Risk attitudes include risk aversion as well as higher-order risk preferences such as prudence and temperance. This article analyzes the effects of such preferences on medical test and treatment decisions, represented either by test and treatment thresholds or-when the test result is not given-by optimal cutoff values for diagnostic tests. For a risk-averse decision maker, effective treatment is a risk-reducing strategy since it prevents the low health outcome of forgoing treatment in the sick state. Compared with risk neutrality, risk aversion thus lowers both the test and the treatment threshold and decreases the optimal test cutoff value. Risk vulnerability, which combines risk aversion, prudence, and temperance, is relevant if there is a comorbidity risk: thresholds and optimal cutoff values decrease even more. Since common utility functions imply risk vulnerability, our findings suggest that diagnostics in low prevalence settings (e.g., screening) may be considered more beneficial when risk preferences are taken into account.
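
    A small sketch of how risk aversion shifts the treatment threshold discussed above (the health outcome values and the power-utility form are assumptions for illustration, not the article's model): the threshold is the disease probability at which the expected utilities of treating and not treating are equal, and a more concave utility lowers it.

        def treatment_threshold(h_healthy, h_sick, benefit, harm, utility):
            """Disease probability at which treating and not treating have equal
            expected utility.  'benefit' is the health gain of treating the sick,
            'harm' the health loss of treating the healthy."""
            U = utility
            num = U(h_healthy) - U(h_healthy - harm)          # cost of treating the healthy
            den = num + (U(h_sick + benefit) - U(h_sick))     # plus gain of treating the sick
            return num / den

        def crra(gamma):
            """Power utility; gamma > 0 means risk aversion, gamma = 0 is risk neutral."""
            if gamma == 0.0:
                return lambda h: h
            return lambda h: (h ** (1.0 - gamma)) / (1.0 - gamma)

        # Hypothetical health outcomes on a 0-100 scale.
        for gamma in (0.0, 0.5, 1.5):
            t = treatment_threshold(h_healthy=90, h_sick=40, benefit=30, harm=5,
                                    utility=crra(gamma))
            print(gamma, round(t, 3))
        # The treatment threshold decreases as risk aversion (gamma) increases.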

  18. Probability-Based Software for Grid Optimization: Improved Power System Operations Using Advanced Stochastic Optimization

    SciTech Connect

    2012-02-24

    GENI Project: Sandia National Laboratories is working with several commercial and university partners to develop software for market management systems (MMSs) that enable greater use of renewable energy sources throughout the grid. MMSs are used to securely and optimally determine which energy resources should be used to service energy demand across the country. Contributions of electricity to the grid from renewable energy sources such as wind and solar are intermittent, introducing complications for MMSs, which have trouble accommodating the multiple sources of price and supply uncertainties associated with bringing these new types of energy into the grid. Sandia’s software will bring a new, probability-based formulation to account for these uncertainties. By factoring in various probability scenarios for electricity production from renewable energy sources in real time, Sandia’s formula can reduce the risk of inefficient electricity transmission, save ratepayers money, conserve power, and support the future use of renewable energy.
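
    A toy example of the probability-based formulation described above (the generator data, wind scenarios, and prices are invented and are not Sandia's formulation): dispatch a conventional unit against several wind-output scenarios so that expected cost is minimized while demand is met in every scenario.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical data: one conventional unit, three wind scenarios.
        demand = 100.0
        wind   = np.array([60.0, 40.0, 10.0])     # possible wind output (MW)
        prob   = np.array([0.5, 0.3, 0.2])        # scenario probabilities
        c_gen  = 30.0                             # $/MWh, scheduled conventional generation
        c_buy  = 80.0                             # $/MWh, last-minute balancing energy

        # Decision vector x = [g, b_1, b_2, b_3]: day-ahead dispatch g plus
        # per-scenario balancing purchases b_s.
        c = np.concatenate(([c_gen], prob * c_buy))        # expected cost coefficients
        # Demand must be met in every scenario: g + wind_s + b_s >= demand.
        A_ub = np.zeros((3, 4))
        A_ub[:, 0] = -1.0
        A_ub[np.arange(3), 1 + np.arange(3)] = -1.0
        b_ub = wind - demand
        bounds = [(0, 120)] + [(0, None)] * 3
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print("day-ahead dispatch:", res.x[0], "expected cost:", res.fun)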

  19. Prioritized Reliability-Risk$ Optimization for a Hydro-Thermal System

    NASA Astrophysics Data System (ADS)

    Howard, C. D.; Howard, J. C.

    2010-12-01

    This paper provides a real-world example of a hydro-economic model for risk estimation and impact assessment. Ensembles of forecasted reservoir inflows were used to drive an integrated hydro-thermal two-reservoir stochastic long-term economic optimization model. The optimization objective was to minimize long-term thermal energy purchases by recommending the best current operating decisions within policies on long-term Risk (probabilistic cost). Risk was determined as a function of Reliability to meet forecasted energy load. (Figure titles: Reservoir End of Month Rule Curves; Reliability; Thermal Risk and Hydro Revenue.)

  20. Minimal investment risk of a portfolio optimization problem with budget and investment concentration constraints

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2017-02-01

    In the present paper, the minimal investment risk for a portfolio optimization problem with imposed budget and investment concentration constraints is considered using replica analysis. Since the minimal investment risk is influenced by the investment concentration constraint (as well as the budget constraint), it is intuitive that the minimal investment risk for the problem with an investment concentration constraint can be larger than that without the constraint (that is, with only the budget constraint). Moreover, a numerical experiment shows the effectiveness of our proposed analysis. In contrast, the standard operations research approach failed to identify accurately the minimal investment risk of the portfolio optimization problem.
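
    A small numerical counterpart to the problem described above (the covariance data and the concentration level are invented, and this is a direct constrained solve rather than the paper's replica analysis): minimize portfolio risk subject to a budget constraint on the sum of weights and an investment-concentration constraint on the sum of squared weights.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        N = 20
        G = rng.normal(size=(N, N))
        C = G @ G.T / N + np.eye(N)           # synthetic return covariance matrix

        budget = N                             # budget constraint: sum of weights fixed
        tau = 1.5                              # concentration level: sum of squares = tau * N

        risk = lambda w: 0.5 * w @ C @ w
        cons = (
            {"type": "eq", "fun": lambda w: w.sum() - budget},
            {"type": "eq", "fun": lambda w: (w ** 2).sum() - tau * N},
        )
        # Start slightly off the equal-weight point so the concentration
        # constraint (sum of squares = tau * N > N) can be satisfied.
        w0 = np.ones(N) + 0.5 * rng.normal(size=N)
        res = minimize(risk, w0, constraints=cons, method="SLSQP")
        print("minimal investment risk per asset:", res.fun / N)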

  1. Risk Classification and Risk-based Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as Class D minus are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate for their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  2. CFD based draft tube hydraulic design optimization

    NASA Astrophysics Data System (ADS)

    McNabb, J.; Devals, C.; Kyriacou, S. A.; Murry, N.; Mullins, B. F.

    2014-03-01

    The draft tube design of a hydraulic turbine, particularly in low to medium head applications, plays an important role in determining the efficiency and power characteristics of the overall machine, since an important proportion of the available energy, being in kinetic form leaving the runner, needs to be recovered by the draft tube into static head. For large units, these efficiency and power characteristics can equate to large sums of money when considering the anticipated selling price of the energy produced over the machine's life-cycle. This same draft tube design is also a key factor in determining the overall civil costs of the powerhouse, primarily in excavation and concreting, which can amount to similar orders of magnitude as the price of the energy produced. Therefore, there is a need to find the optimum compromise between these two conflicting requirements. In this paper, an elaborate approach is described for dealing with this optimization problem. First, the draft tube's detailed geometry is defined as a function of a comprehensive set of design parameters (about 20, of which a subset is allowed to vary during the optimization process), which are then used in a non-uniform rational B-spline based geometric modeller to fully define the wetted surface geometry. Since the performance of the draft tube is largely governed by 3D viscous effects, such as boundary layer separation from the walls and swirling flow characteristics, which in turn govern the portion of the available kinetic energy that will be converted into pressure, a full 3D meshing and Navier-Stokes analysis is performed for each design. What makes this even more challenging is the fact that the inlet velocity distribution to the draft tube is governed by the runner at each of the various operating conditions that are of interest for the exploitation of the powerhouse. In order to determine these inlet conditions, a combined steady-state runner and an initial draft tube analysis, using a

  3. TU-EF-204-09: A Preliminary Method of Risk-Informed Optimization of Tube Current Modulation for Dose Reduction in CT

    SciTech Connect

    Gao, Y; Liu, B; Kalra, M; Caracappa, P; Liu, T; Li, X; Xu, X

    2015-06-15

    Purpose: X-rays from CT scans can increase cancer risk to patients. The Lifetime Attributable Risk of Cancer Incidence for adult patients has been investigated and shown to decrease as patients age. However, a new risk model shows an increasing risk trend for several radiosensitive organs in middle-aged patients. This study investigates the feasibility of a general method for optimizing tube current modulation (TCM) functions to minimize risk by reducing radiation dose to radiosensitive organs of patients. Methods: Organ-based TCM has been investigated in the literature for eye lens dose and breast dose. Adopting the concept of organ-based TCM, this study seeks to find an optimized tube current for minimal total risk to breasts and lungs by reducing dose to these organs. The contributions of each CT view to organ dose are determined through view-by-view simulations of the CT scan using a GPU-based fast Monte Carlo code, ARCHER. A linear programming problem is established for tube current optimization, with the Monte Carlo results as weighting factors at each view. A pre-determined dose is used as the upper dose boundary, and the tube current of each view is optimized to minimize the total risk. Results: An optimized tube current is found to minimize the total risk to lungs and breasts: compared to fixed current, the risk is reduced by 13%, with breast dose reduced by 38% and lung dose reduced by 7%. The average tube current is maintained during optimization to maintain image quality. In addition, dose to other organs in the chest region is only slightly affected, with relative changes in dose smaller than 10%. Conclusion: Optimized tube current plans can be generated to minimize cancer risk to lungs and breasts while maintaining image quality. In the future, various risk models and greater numbers of projections per rotation will be simulated on phantoms of different gender and age. National Institutes of Health R01EB015478.
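
    A schematic version of the linear program described above (the view-by-view dose coefficients, risk weights, and current limits are made-up placeholders, not the Monte Carlo results from ARCHER): choose per-view tube currents that minimize a risk-weighted organ dose while holding the average current, a proxy for image quality, fixed.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(2)
        n_views = 120
        # Hypothetical per-view dose contributions per unit tube current.
        d_breast = 1.0 + 0.8 * np.cos(np.linspace(0, 2 * np.pi, n_views))  # anterior views dose breasts more
        d_lung   = 1.0 + 0.1 * rng.random(n_views)
        w_breast, w_lung = 2.0, 1.0            # hypothetical relative risk weights

        c = w_breast * d_breast + w_lung * d_lung    # risk per unit current, per view
        I_ref = 100.0                                 # fixed-current reference (mA)
        # Keep the average current equal to the reference and every view within limits.
        A_eq = np.ones((1, n_views)) / n_views
        b_eq = np.array([I_ref])
        bounds = [(20.0, 300.0)] * n_views
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        risk_opt = c @ res.x
        risk_fix = c @ np.full(n_views, I_ref)
        print("relative risk reduction:", 1 - risk_opt / risk_fix)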

  4. Uncertainty in environmental risk assessment: implications for risk-based management of river basins.

    PubMed

    Ragas, Ad M J; Huijbregts, Mark A J; Henning-de Jong, Irmgard; Leuven, Rob S E W

    2009-01-01

    Environmental risk assessment is typically uncertain due to different perceptions of the risk problem and limited knowledge about the physical, chemical, and biological processes underlying the risk. The present paper provides a systematic overview of the implications of different types of uncertainty for risk management, with a focus on risk-based management of river basins. Three different types of uncertainty are distinguished: 1) problem definition uncertainty, 2) true uncertainty, and 3) variability. Methods to quantify and describe these types of uncertainty are discussed and illustrated in 4 case studies. The case studies demonstrate that explicit regulation of uncertainty can improve risk management (e.g., by identification of the most effective risk reduction measures, optimization of the use of resources, and improvement of the decision-making process). It is concluded that the involvement of nongovernmental actors as prescribed by the European Union Water Framework Directive (WFD) provides challenging opportunities to address problem definition uncertainty and those forms of true uncertainty that are difficult to quantify. However, the WFD guidelines for derivation and application of environmental quality standards could be improved by the introduction of a probabilistic approach to deal with true uncertainty and a better scientific basis for regulation of variability.

  5. Cost-Benefit Analysis for Optimization of Risk Protection Under Budget Constraints.

    PubMed

    Špačková, Olga; Straub, Daniel

    2015-05-01

    Cost-benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit-cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., central government agency, top management), whereas the decisions on the specific measures are made at the subsystem level (e.g., local communities, company division). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications only include a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.
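
    A compact sketch of allocating a fixed budget across subsystems by the marginal cost criterion discussed above (the subsystems, upgrade steps, and numbers are invented; for discrete measures this greedy ranking is a heuristic reading of the criterion, not the article's full formulation): within each subsystem the candidate measures are ordered, and budget is spent wherever an additional unit of cost buys the largest risk reduction.

        # Each subsystem offers an ordered list of incremental protection upgrades:
        # (additional cost, additional risk reduction).  Values are hypothetical.
        subsystems = {
            "levee_A": [(10, 50), (10, 20), (10, 5)],
            "levee_B": [(15, 45), (15, 30), (15, 10)],
            "warning": [(5, 12), (5, 8)],
        }
        budget = 55.0

        # Greedy selection by marginal risk reduction per unit cost (MC criterion).
        next_step = {name: 0 for name in subsystems}
        spent, achieved = 0.0, 0.0
        while True:
            candidates = []
            for name, steps in subsystems.items():
                i = next_step[name]
                if i < len(steps) and spent + steps[i][0] <= budget:
                    cost, dr = steps[i]
                    candidates.append((dr / cost, name, cost, dr))
            if not candidates:
                break
            _, name, cost, dr = max(candidates)   # best marginal ratio still affordable
            next_step[name] += 1
            spent += cost
            achieved += dr
        print("spent:", spent, "risk reduction:", achieved, "levels:", next_step)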

  6. 12 CFR 567.6 - Risk-based capital credit risk-weight categories.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Risk-based capital credit risk-weight... CAPITAL Regulatory Capital Requirements § 567.6 Risk-based capital credit risk-weight categories. (a) Risk...)(2) of this section), plus risk-weighted recourse obligations, direct credit substitutes, and...

  7. Bell-Curve Based Evolutionary Optimization Algorithm

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Laba, K.; Kincaid, R.

    1998-01-01

    The paper presents an optimization algorithm that falls in the category of genetic, or evolutionary, algorithms. While bit exchange is the basis of most Genetic Algorithms (GA) in research and applications in America, some alternatives, also in the category of evolutionary algorithms but using a direct, geometrical approach, have gained popularity in Europe and Asia. The Bell-Curve Based Evolutionary Algorithm (BCB) is in this alternative category and is distinguished by the use of a combination of n-dimensional geometry and the normal distribution, the bell curve, in the generation of the offspring. The tool for creating a child is a geometrical construct comprising a line connecting two parents and a weighted point on that line. The point that defines the child deviates from the weighted point in two directions: parallel and orthogonal to the connecting line, the deviation in each direction obeying a probabilistic distribution. Tests showed satisfactory performance of BCB. The principal advantage of BCB is its controllability via the normal distribution parameters and the geometrical construct variables.
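
    A brief sketch of the child-generation step described above (parameter names and default values are assumptions for illustration): a child is placed at a weighted point on the line connecting two parents and then perturbed along that line and orthogonally to it, each deviation drawn from a normal distribution.

        import numpy as np

        def bcb_child(parent_a, parent_b, weight=0.5, sigma_par=0.1, sigma_ort=0.1,
                      rng=np.random.default_rng()):
            """Generate one offspring from two parents via the bell-curve construct."""
            a, b = np.asarray(parent_a, float), np.asarray(parent_b, float)
            d = b - a
            length = np.linalg.norm(d)
            if length == 0.0:                  # identical parents: nothing to interpolate
                return a.copy()
            u = d / length                     # unit vector along the connecting line
            base = a + weight * d              # weighted point between the parents
            # Deviation parallel to the connecting line.
            child = base + rng.normal(0.0, sigma_par * length) * u
            # Deviation orthogonal to the line: random direction with its parallel part removed.
            r = rng.normal(size=a.shape)
            r -= (r @ u) * u
            norm_r = np.linalg.norm(r)
            if norm_r > 0:
                child += rng.normal(0.0, sigma_ort * length) * r / norm_r
            return child

        print(bcb_child([0.0, 0.0], [1.0, 1.0]))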

  8. Optimizing solubility: kinetic versus thermodynamic solubility temptations and risks.

    PubMed

    Saal, Christoph; Petereit, Anna Christine

    2012-10-09

    The aim of this study was to assess the usefulness of kinetic and thermodynamic solubility data in guiding medicinal chemistry during lead optimization. The solubility of 465 research compounds was measured using a kinetic and a thermodynamic solubility assay. In the thermodynamic assay, polarized-light microscopy was used to investigate whether the result referred to the crystalline or to the amorphous compound. From the comparison of kinetic and thermodynamic solubility data it was noted that kinetic solubility measurements frequently yielded results which show considerably higher solubility compared to thermodynamic solubility. This observation is ascribed to the fact that a kinetic solubility assay typically delivers results which refer to the amorphous compound. In contrast, results from thermodynamic solubility determinations more frequently refer to a crystalline phase. Accordingly, thermodynamic solubility data--especially when used together with an assessment of the solid state form--are deemed to be more useful in guiding solubility optimization for research compounds.

  9. Optimization of agricultural field workability predictions for improved risk management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Risks introduced by weather variability are key considerations in agricultural production. The sensitivity of agriculture to weather variability is of special concern in the face of climate change. In particular, the availability of workable days is an important consideration in agricultural practic...

  10. Optimal Predator Risk Assessment by the Sonar-Jamming Arctiine Moth Bertholdia trigona

    PubMed Central

    Corcoran, Aaron J.; Wagner, Ryan D.; Conner, William E.

    2013-01-01

    Nearly all animals face a tradeoff between seeking food and mates and avoiding predation. Optimal escape theory holds that an animal confronted with a predator should only flee when benefits of flight (increased survival) outweigh the costs (energetic costs, lost foraging time, etc.). We propose a model for prey risk assessment based on the predator's stage of attack. Risk level should increase rapidly from when the predator detects the prey to when it commits to the attack. We tested this hypothesis using a predator – the echolocating bat – whose active biosonar reveals its stage of attack. We used a prey defense – clicking used for sonar jamming by the tiger moth Bertholdia trigona– that can be readily studied in the field and laboratory and is enacted simultaneously with evasive flight. We predicted that prey employ defenses soon after being detected and targeted, and that prey defensive thresholds discriminate between legitimate predatory threats and false threats where a nearby prey is attacked. Laboratory and field experiments using playbacks of ultrasound signals and naturally behaving bats, respectively, confirmed our predictions. Moths clicked soon after bats detected and targeted them. Also, B. trigona clicking thresholds closely matched predicted optimal thresholds for discriminating legitimate and false predator threats for bats using search and approach phase echolocation – the period when bats are searching for and assessing prey. To our knowledge, this is the first quantitative study to correlate the sensory stimuli that trigger defensive behaviors with measurements of signals provided by predators during natural attacks in the field. We propose theoretical models for explaining prey risk assessment depending on the availability of cues that reveal a predator's stage of attack. PMID:23671686

  11. Optimal reliability-based planning of experiments for POD curves

    SciTech Connect

    Soerensen, J.D.; Faber, M.H.; Kroon, I.B.

    1995-12-31

    Optimal planning of crack detection tests is considered. The tests are used to update the information on the reliability of inspection techniques modeled by probability of detection (P.O.D.) curves. It is shown how cost-optimal and reliability-based test plans can be obtained using First Order Reliability Methods in combination with life-cycle cost-optimal inspection and maintenance planning. The methodology is based on preposterior analyses from Bayesian decision theory. An illustrative example is shown.

  12. Research on particle swarm optimization algorithm based on optimal movement probability

    NASA Astrophysics Data System (ADS)

    Ma, Jianhong; Zhang, Han; He, Baofeng

    2017-01-01

    Particle swarm optimization (PSO) can improve control precision and has great application value in training neural networks, fuzzy system control, and related fields. When the traditional particle swarm algorithm is used to train feed-forward neural networks, its search efficiency is low and it easily falls into local convergence. An improved particle swarm optimization algorithm is therefore proposed based on error back-propagation gradient descent. The particles are ranked by fitness so that the optimization problem is considered as a whole, and error back-propagation gradient descent is used to train the BP neural network. Each particle updates its velocity and position according to its individual best and the global best; the update is biased so that particles learn more from the social (global) optimum and less from their individual optimum, which helps them avoid falling into local optima, while the gradient information accelerates the local search and improves search efficiency. Simulation results show that in the initial stage the algorithm converges rapidly toward the global optimal solution and remains close to it, and that it achieves faster convergence and better search performance within the same running time, particularly in the later stages of the search.
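
    For reference, a bare-bones particle swarm update of the kind this abstract builds on (coefficient values are typical defaults, not the paper's; the gradient-descent hybridization and the biased social learning are omitted):

        import numpy as np

        def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
                         bounds=(-5.0, 5.0), seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            x = rng.uniform(lo, hi, size=(n_particles, dim))       # positions
            v = np.zeros_like(x)                                    # velocities
            pbest = x.copy()                                        # individual bests
            pbest_val = np.apply_along_axis(f, 1, x)
            g = pbest[pbest_val.argmin()].copy()                    # global best
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                # Velocity update: inertia + cognitive (individual best) + social (global best).
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                vals = np.apply_along_axis(f, 1, x)
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], vals[improved]
                g = pbest[pbest_val.argmin()].copy()
            return g, pbest_val.min()

        sphere = lambda z: float(np.sum(z ** 2))
        print(pso_minimize(sphere, dim=5))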

  13. A seismic risk for the lunar base

    NASA Technical Reports Server (NTRS)

    Oberst, Juergen; Nakamura, Yosio

    1992-01-01

    Shallow moonquakes, which were discovered during observations following the Apollo lunar landing missions, may pose a threat to lunar surface operations. The nature of these moonquakes is similar to that of intraplate earthquakes, which include infrequent but destructive events. Therefore, there is a need for detailed study to assess the possible seismic risk before establishing a lunar base.

  14. Using Quantile and Asymmetric Least Squares Regression for Optimal Risk Adjustment.

    PubMed

    Lorenz, Normann

    2016-06-13

    In this paper, we analyze optimal risk adjustment for direct risk selection (DRS). Integrating insurers' activities for risk selection into a discrete choice model of individuals' health insurance choice shows that DRS has the structure of a contest. For the contest success function (csf) used in most of the contest literature (the Tullock-csf), optimal transfers for a risk adjustment scheme have to be determined by means of a restricted quantile regression, irrespective of whether insurers are primarily engaged in positive DRS (attracting low risks) or negative DRS (repelling high risks). This is at odds with the common practice of determining transfers by means of a least squares regression. However, this common practice can be rationalized for a new csf, but only if positive and negative DRSs are equally important; if they are not, optimal transfers have to be calculated by means of a restricted asymmetric least squares regression. Using data from German and Swiss health insurers, we find considerable differences between the three types of regressions. Optimal transfers therefore critically depend on which csf represents insurers' incentives for DRS and, if it is not the Tullock-csf, whether insurers are primarily engaged in positive or negative DRS. Copyright © 2016 John Wiley & Sons, Ltd.
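
    A minimal sketch of the asymmetric least squares idea referred to above (synthetic data; the restriction structure of the actual risk-adjustment scheme is not modeled): residuals above and below the fit are weighted differently, which reduces to ordinary least squares when the asymmetry parameter is 0.5.

        import numpy as np
        from scipy.optimize import minimize

        def asymmetric_least_squares(X, y, tau):
            """Expectile regression: weight tau on positive residuals and
            (1 - tau) on negative residuals; tau = 0.5 gives ordinary least squares."""
            X = np.column_stack([np.ones(len(y)), X])     # add intercept

            def loss(beta):
                r = y - X @ beta
                w = np.where(r >= 0, tau, 1.0 - tau)
                return np.sum(w * r ** 2)

            beta0 = np.zeros(X.shape[1])
            return minimize(loss, beta0).x

        rng = np.random.default_rng(3)
        x = rng.normal(size=(200, 1))
        y = 1.0 + 2.0 * x[:, 0] + rng.normal(scale=0.5, size=200)
        print(asymmetric_least_squares(x, y, tau=0.5))   # close to OLS coefficients
        print(asymmetric_least_squares(x, y, tau=0.9))   # asymmetric fit emphasizing underprediction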

  15. Optimal guidance law for cooperative attack of multiple missiles based on optimal control theory

    NASA Astrophysics Data System (ADS)

    Sun, Xiao; Xia, Yuanqing

    2012-08-01

    This article considers the problem of optimal guidance laws for cooperative attack of multiple missiles based on the optimal control theory. New guidance laws are presented such that multiple missiles attack a single target simultaneously. Simulation results show the effectiveness of the proposed algorithms.

  16. Optimal separable bases and molecular collisions

    SciTech Connect

    Poirier, Lionel W.

    1997-12-01

    A new methodology is proposed for the efficient determination of Green's functions and eigenstates for quantum systems of two or more dimensions. For a given Hamiltonian, the best possible separable approximation is obtained from the set of all Hilbert space operators. It is shown that this determination itself, as well as the solution of the resultant approximation, are problems of reduced dimensionality for most systems of physical interest. Moreover, the approximate eigenstates constitute the optimal separable basis, in the sense of self-consistent field theory. These distorted waves give rise to a Born series with optimized convergence properties. Analytical results are presented for an application of the method to the two-dimensional shifted harmonic oscillator system. The primary interest, however, is quantum reactive scattering in molecular systems. For numerical calculations, the use of distorted waves corresponds to numerical preconditioning. The new methodology therefore gives rise to an optimized preconditioning scheme for the efficient calculation of reactive and inelastic scattering amplitudes, especially at intermediate energies. This scheme is particularly suited to discrete variable representations (DVRs) and iterative sparse matrix methods commonly employed in such calculations. State-to-state and cumulative reactive scattering results obtained via the optimized preconditioner are presented for the two-dimensional collinear H + H2 → H2 + H system. Computational time and memory requirements for this system are drastically reduced in comparison with other methods, and results are obtained for previously prohibitive energy regimes.

  17. CFD Optimization on Network-Based Parallel Computer System

    NASA Technical Reports Server (NTRS)

    Cheung, Samson H.; Holst, Terry L. (Technical Monitor)

    1994-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows application of aerodynamic optimization with advanced computational fluid dynamics codes, which is computationally expensive on a mainframe supercomputer. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computer, running on software called Parallel Virtual Machine. This paper will introduce the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package has been applied to reduce the wave drag of a body of revolution and a wing/body configuration with results of 5% to 6% drag reduction.

  18. An approximation based global optimization strategy for structural synthesis

    NASA Technical Reports Server (NTRS)

    Sepulveda, A. E.; Schmit, L. A.

    1991-01-01

    A global optimization strategy for structural synthesis based on approximation concepts is presented. The methodology involves the solution of a sequence of highly accurate approximate problems using a global optimization algorithm. The global optimization algorithm implemented consists of a branch and bound strategy based on the interval evaluation of the objective function and constraint functions, combined with a local feasible directions algorithm. The approximate design optimization problems are constructed using first order approximations of selected intermediate response quantities in terms of intermediate design variables. Some numerical results for example problems are presented to illustrate the efficacy of the design procedure set forth.

  19. Defining a region of optimization based on engine usage data

    DOEpatents

    Jiang, Li; Lee, Donghoon; Yilmaz, Hakan; Stefanopoulou, Anna

    2015-08-04

    Methods and systems for engine control optimization are provided. One or more operating conditions of a vehicle engine are detected. A value for each of a plurality of engine control parameters is determined based on the detected one or more operating conditions of the vehicle engine. A range of the most commonly detected operating conditions of the vehicle engine is identified and a region of optimization is defined based on the range of the most commonly detected operating conditions of the vehicle engine. The engine control optimization routine is initiated when the one or more operating conditions of the vehicle engine are within the defined region of optimization.

  20. Risk-based regulation: A utility's perspective

    SciTech Connect

    Chapman, J.R. )

    1993-01-01

    Yankee Atomic Electric Company (YAEC) has supported the operation of several plants under the premise that regulations and corresponding implementation strategies are intended to be "risk based." During the past 15 yr, these efforts have changed from essentially qualitative to a blend of qualitative and quantitative. Our observation is that implementation of regulatory requirements has often not addressed the risk significance of the underlying intent of regulations on a proportionate basis. It has caused our resource allocation to be skewed, to the point that our cost-competitiveness has eroded, but more importantly we have missed opportunities for increases in safety.

  1. The Tool for Designing Engineering Systems Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, these methods have initial condition dependence and a risk of falling into local solutions. In this paper, we propose a new optimization method based on the concept of the path integral method used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require techniques based on experience. We applied the new optimization method to the design of a hang glider. In this problem, not only the hang glider design but also its flight trajectory were optimized. The numerical calculation results showed that the method has sufficient performance.
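
    A toy illustration of obtaining a solution as a stochastic average in the spirit of the method above (the uniform sampling scheme, the Boltzmann-like weighting, and the test function are assumptions for illustration, not the authors' path-integral formulation):

        import numpy as np

        def stochastic_average_minimizer(f, dim, n_samples=20000, beta=5.0,
                                         bounds=(-5.0, 5.0), seed=0):
            """Estimate the minimizer as an expected value under a Boltzmann-like
            weight exp(-beta * f(x)) over randomly sampled candidate points."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            x = rng.uniform(lo, hi, size=(n_samples, dim))
            fx = np.apply_along_axis(f, 1, x)
            w = np.exp(-beta * (fx - fx.min()))          # shift for numerical stability
            w /= w.sum()
            return (w[:, None] * x).sum(axis=0)          # stochastic average

        # Multimodal test function with its global minimum near (1, 1).
        f = lambda z: float(np.sum((z - 1.0) ** 2) + 0.5 * np.sum(np.sin(3.0 * z) ** 2))
        print(stochastic_average_minimizer(f, dim=2))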

  2. Design Tool Using a New Optimization Method Based on a Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    Conventional optimization methods are based on a deterministic approach since their purpose is to find out an exact solution. However, such methods have initial condition dependence and the risk of falling into local solution. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require techniques based on experiences. We applied the new optimization method to a hang glider design. In this problem, both the hang glider design and its flight trajectory were optimized. The numerical calculation results prove that performance of the method is sufficient for practical use.

  3. Shape optimization for contact problems based on isogeometric analysis

    NASA Astrophysics Data System (ADS)

    Horn, Benjamin; Ulbrich, Stefan

    2016-08-01

    We consider the shape optimization for mechanical connectors. To avoid the gap between the representation in CAD systems and the finite element simulation used by mathematical optimization, we choose an isogeometric approach for the solution of the contact problem within the optimization method. This leads to a shape optimization problem governed by an elastic contact problem. We handle the contact conditions using the mortar method and solve the resulting contact problem with a semismooth Newton method. The optimization problem is nonconvex and nonsmooth due to the contact conditions. To reduce the number of simulations, we use a derivative based optimization method. With the adjoint approach the design derivatives can be calculated efficiently. The resulting optimization problem is solved with a modified Bundle Trust Region algorithm.

  4. An Optimization Method for Condition Based Maintenance of Aircraft Fleet Considering Prognostics Uncertainty

    PubMed Central

    Chen, Yiran; Sun, Bo; Li, Songjie

    2014-01-01

    An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the aircraft fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into the combinatorial optimization problem of single-aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method for mission costs and risks based on the health status matrix and maintenance matrix is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method achieves optimization and control of the aircraft fleet oriented to mission success. PMID:24892046

  5. Breast Cancer-Related Arm Lymphedema: Incidence Rates, Diagnostic Techniques, Optimal Management and Risk Reduction Strategies

    SciTech Connect

    Shah, Chirag; Vicini, Frank A.

    2011-11-15

    As more women survive breast cancer, long-term toxicities affecting their quality of life, such as lymphedema (LE) of the arm, gain importance. Although numerous studies have attempted to determine incidence rates, identify optimal diagnostic tests, enumerate efficacious treatment strategies and outline risk reduction guidelines for breast cancer-related lymphedema (BCRL), few groups have consistently agreed on any of these issues. As a result, standardized recommendations are still lacking. This review will summarize the latest data addressing all of these concerns in order to provide patients and health care providers with optimal, contemporary recommendations. Published incidence rates for BCRL vary substantially with a range of 2-65% based on surgical technique, axillary sampling method, radiation therapy fields treated, and the use of chemotherapy. Newer clinical assessment tools can potentially identify BCRL in patients with subclinical disease with prospective data suggesting that early diagnosis and management with noninvasive therapy can lead to excellent outcomes. Multiple therapies exist with treatments defined by the severity of BCRL present. Currently, the standard of care for BCRL in patients with significant LE is complex decongestive physiotherapy (CDP). Contemporary data also suggest that a multidisciplinary approach to the management of BCRL should begin prior to definitive treatment for breast cancer employing patient-specific surgical, radiation therapy, and chemotherapy paradigms that limit risks. Further, prospective clinical assessments before and after treatment should be employed to diagnose subclinical disease. In those patients who require aggressive locoregional management, prophylactic therapies and the use of CDP can help reduce the long-term sequelae of BCRL.

  6. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial statements oriented evaluation results based on logit and probit approaches as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. We surprisingly find that the opinions extracted from both posts and commentaries surpass opinions of analysts in terms of credit risk prediction.

  7. Risk assessment of soybean-based phytoestrogens.

    PubMed

    Kwack, Seung Jun; Kim, Kyu-Bong; Kim, Hyung Sik; Yoon, Kyung Sil; Lee, Byung Mu

    2009-01-01

    Koreans generally consume high quantities of soybean-based foods that contain a variety of phytoestrogens, such as daidzein, genistein, and biochanin A. However, phytoestrogens are considered to be potential endocrine-disrupting chemicals (EDC), which interfere with the normal function of the hormonal and reproductive systems. Therefore, dietary exposure to soybean-based phytoestrogens is of concern for Koreans, and comparative dietary risk assessments are required relative to Japanese (high consumers) and Americans (low consumers). In this study, a relative risk assessment was conducted based upon daily intake levels of soybean-based foods and phytoestrogens in a Korean cohort, and the risks of phytoestrogens were compared with those posed by estradiol and other EDC. Koreans approximately 30-49 yr of age consume on average a total of 135.2 g/d of soy-based foods including soybean, soybean sauce, soybean paste, and soybean oil, and 0.51 mg/kg body weight (bw)/d of phytoestrogens such as daidzein and genistein. Using estimated daily intakes (EDI) and estrogenic potencies (EP), margins of safety (MOS) were calculated, where 0.05 is for estradiol (MOS value <1, considered to exert a positive estrogenic effect); thus, MOS values of 1.89 for Japanese, 1.96 for Koreans, and 5.55 for Americans indicate that consumption of soybean-based foods exerted no apparent estrogenic effects, as all MOS values were higher than 1. For other synthetic EDC used as reference values, MOS values were dieldrin 27, nonylphenol 250, butyl benzyl phthalate 321, bisphenol A 1000, biochanin A 2203, and coumesterol 2898. These results suggest that dietary exposure to phytoestrogens, such as daidzein and genistein, poses a relatively higher health risk for humans than synthetic EDC, although MOS values were all greater than 1.

  8. Optimal trajectories based on linear equations

    NASA Technical Reports Server (NTRS)

    Carter, Thomas E.

    1990-01-01

    The principal results of a recent theory of fuel-optimal space trajectories for linear differential equations are presented. Both impulsive and bounded-thrust problems are treated. A new form of the Lawden primer vector is found that is identical for both problems. For this reason, starting iterates from the solution of the impulsive problem are highly effective in the solution of the two-point boundary-value problem associated with bounded thrust. These results were applied to the problem of fuel-optimal maneuvers of a spacecraft near a satellite in circular orbit using the Clohessy-Wiltshire equations. For this case two-point boundary-value problems were solved using a microcomputer, and optimal trajectory shapes displayed. The results of this theory can also be applied if the satellite is in an arbitrary Keplerian orbit through the use of the Tschauner-Hempel equations. A new form of the solution of these equations has been found that is identical for elliptical, parabolic, and hyperbolic orbits except in the way that a certain integral is evaluated. For elliptical orbits this integral is evaluated through the use of the eccentric anomaly. An analogous evaluation is performed for hyperbolic orbits.
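
    For reference, the Clohessy-Wiltshire equations that govern the circular-orbit case mentioned above, in their standard form (with n the mean motion of the reference orbit, x radial, y along-track, z cross-track, and a the applied thrust acceleration; the notation is ours, not the paper's):

        \ddot{x} - 2n\,\dot{y} - 3n^{2}x = a_{x}
        \ddot{y} + 2n\,\dot{x} = a_{y}
        \ddot{z} + n^{2}z = a_{z}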

  9. Ant colony optimization-based firewall anomaly mitigation engine.

    PubMed

    Penmatsa, Ravi Kiran Varma; Vatsavayi, Valli Kumari; Samayamantula, Srinivas Kumar

    2016-01-01

    A firewall is the most essential component of network perimeter security. Due to human error and the involvement of multiple administrators in configuring firewall rules, there exist common anomalies in firewall rulesets such as Shadowing, Generalization, Correlation, and Redundancy. There is a need for research on efficient ways of resolving such anomalies. The challenge is also to see that the reordered or resolved ruleset conforms to the organization's framed security policy. This study proposes an ant colony optimization (ACO)-based anomaly resolution and reordering of firewall rules called ACO-based firewall anomaly mitigation engine. Modified strategies are also introduced to automatically detect these anomalies and to minimize manual intervention of the administrator. Furthermore, an adaptive reordering strategy is proposed to aid faster reordering when a new rule is appended. The proposed approach was tested with different firewall policy sets. The results were found to be promising in terms of the number of conflicts resolved, with minimal availability loss and marginal security risk. This work demonstrated the application of a metaheuristic search technique, ACO, in improving the performance of a packet-filter firewall with respect to mitigating anomalies in the rules, and at the same time demonstrated conformance to the security policy.

  10. 77 FR 53059 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are revising their market risk capital rules to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality; enhance the rules' sensitivity to risks that are not adequately captured under...

  11. Geometry Optimization of a Segmented Thermoelectric Generator Based on Multi-parameter and Nonlinear Optimization Method

    NASA Astrophysics Data System (ADS)

    Cai, Lanlan; Li, Peng; Luo, Qi; Zhai, Pengcheng; Zhang, Qingjie

    2017-01-01

    As no single thermoelectric material has presented a high figure-of-merit (ZT) over a very wide temperature range, segmented thermoelectric generators (STEGs), where the p- and n-legs are formed of different thermoelectric material segments joined in series, have been developed to improve the performance of thermoelectric generators. A crucial but difficult problem in a STEG design is to determine the optimal values of the geometrical parameters, like the relative lengths of each segment and the cross-sectional area ratio of the n- and p-legs. Herein, a multi-parameter and nonlinear optimization method, based on the Improved Powell Algorithm in conjunction with the discrete numerical model, was implemented to solve the STEG's geometrical optimization problem. The multi-parameter optimal results were validated by comparison with the optimal outcomes obtained from the single-parameter optimization method. Finally, the effect of the hot- and cold-junction temperatures on the geometry optimization was investigated. Results show that the optimal geometry parameters for maximizing the specific output power of a STEG are different from those for maximizing the conversion efficiency. Data also suggest that the optimal geometry parameters and the interfacial temperatures of the adjacent segments optimized for maximum specific output power or conversion efficiency vary with changing hot- and cold-junction temperatures. Through the geometry optimization, the CoSb3/Bi2Te3-based STEG can obtain a maximum specific output power up to 1725.3 W/kg and a maximum efficiency of 13.4% when operating at a hot-junction temperature of 823 K and a cold-junction temperature of 298 K.

  12. Geometry Optimization of a Segmented Thermoelectric Generator Based on Multi-parameter and Nonlinear Optimization Method

    NASA Astrophysics Data System (ADS)

    Cai, Lanlan; Li, Peng; Luo, Qi; Zhai, Pengcheng; Zhang, Qingjie

    2017-03-01

    As no single thermoelectric material has presented a high figure-of-merit (ZT) over a very wide temperature range, segmented thermoelectric generators (STEGs), where the p- and n-legs are formed of different thermoelectric material segments joined in series, have been developed to improve the performance of thermoelectric generators. A crucial but difficult problem in a STEG design is to determine the optimal values of the geometrical parameters, like the relative lengths of each segment and the cross-sectional area ratio of the n- and p-legs. Herein, a multi-parameter and nonlinear optimization method, based on the Improved Powell Algorithm in conjunction with the discrete numerical model, was implemented to solve the STEG's geometrical optimization problem. The multi-parameter optimal results were validated by comparison with the optimal outcomes obtained from the single-parameter optimization method. Finally, the effect of the hot- and cold-junction temperatures on the geometry optimization was investigated. Results show that the optimal geometry parameters for maximizing the specific output power of a STEG are different from those for maximizing the conversion efficiency. Data also suggest that the optimal geometry parameters and the interfacial temperatures of the adjacent segments optimized for maximum specific output power or conversion efficiency vary with changing hot- and cold-junction temperatures. Through the geometry optimization, the CoSb3/Bi2Te3-based STEG can obtain a maximum specific output power up to 1725.3 W/kg and a maximum efficiency of 13.4% when operating at a hot-junction temperature of 823 K and a cold-junction temperature of 298 K.

  13. Performance investigation of multigrid optimization for DNS-based optimal control problems

    NASA Astrophysics Data System (ADS)

    Nita, Cornelia; Vandewalle, Stefan; Meyers, Johan

    2016-11-01

    Optimal control theory in Direct Numerical Simulation (DNS) or Large-Eddy Simulation (LES) of turbulent flow involves large computational cost and memory overhead for the optimization of the controls. In this context, the minimization of the cost functional is typically achieved by employing gradient-based iterative methods such as quasi-Newton, truncated Newton or non-linear conjugate gradient. In the current work, we investigate the multigrid optimization strategy (MGOpt) in order to speed up the convergence of the damped L-BFGS algorithm for DNS-based optimal control problems. The method consists in a hierarchy of optimization problems defined on different representation levels aiming to reduce the computational resources associated with the cost functional improvement on the finest level. We examine the MGOpt efficiency for the optimization of an internal volume force distribution with the goal of reducing the turbulent kinetic energy or increasing the energy extraction in a turbulent wall-bounded flow; problems that are respectively related to drag reduction in boundary layers, or energy extraction in large wind farms. Results indicate that in some cases the multigrid optimization method requires up to a factor two less DNS and adjoint DNS than single-grid damped L-BFGS. The authors acknowledge support from OPTEC (OPTimization in Engineering Center of Excellence, KU Leuven, Grant No PFV/10/002).

  14. Optimal fractional order PID design via Tabu Search based algorithm.

    PubMed

    Ateş, Abdullah; Yeroglu, Celaleddin

    2016-01-01

    This paper presents an optimization method based on the Tabu Search Algorithm (TSA) to design a Fractional-Order Proportional-Integral-Derivative (FOPID) controller. All parameter computations of the FOPID employ random initial conditions, using the proposed optimization method. Illustrative examples demonstrate the performance of the proposed FOPID controller design method.
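
    Illustrative sketch (editorial addition): a minimal tabu search over the five FOPID parameters (Kp, Ki, Kd, lambda, mu) started from random initial conditions, as the abstract describes. The cost function is a synthetic placeholder; in practice it would be a closed-loop performance index of the fractional-order system.

    ```python
    # Minimal tabu-search sketch for tuning FOPID gains; the cost is a placeholder.
    import random

    TARGET = (2.0, 1.0, 0.5, 0.9, 0.8)   # hypothetical "good" gains, for illustration only

    def cost(p):
        return sum((a - b) ** 2 for a, b in zip(p, TARGET))

    def neighbours(p, step=0.1, k=20):
        return [tuple(x + random.uniform(-step, step) for x in p) for _ in range(k)]

    def key(p):                           # coarse rounding so the tabu list is meaningful
        return tuple(round(x, 1) for x in p)

    def tabu_search(iters=300, tabu_len=25):
        current = tuple(random.uniform(0.0, 3.0) for _ in range(5))  # random init
        best, tabu = current, [key(current)]
        for _ in range(iters):
            cands = [c for c in neighbours(current) if key(c) not in tabu]
            if not cands:
                continue
            current = min(cands, key=cost)        # best admissible move
            tabu.append(key(current))             # forbid revisiting recent points
            if len(tabu) > tabu_len:
                tabu.pop(0)
            if cost(current) < cost(best):
                best = current
        return best

    print("tuned (Kp, Ki, Kd, lambda, mu):", [round(v, 3) for v in tabu_search()])
    ```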

  15. Risk-based Classification of Incidents

    NASA Technical Reports Server (NTRS)

    Greenwell, William S.; Knight, John C.; Strunk, Elisabeth A.

    2003-01-01

    As the penetration of software into safety-critical systems progresses, accidents and incidents involving software will inevitably become more frequent. Identifying lessons from these occurrences and applying them to existing and future systems is essential if recurrences are to be prevented. Unfortunately, investigative agencies do not have the resources to fully investigate every incident under their jurisdictions and domains of expertise and thus must prioritize certain occurrences when allocating investigative resources. In the aviation community, most investigative agencies prioritize occurrences based on the severity of their associated losses, allocating more resources to accidents resulting in injury to passengers or extensive aircraft damage. We argue that this scheme is inappropriate because it undervalues incidents whose recurrence could have a high potential for loss while overvaluing fairly straightforward accidents involving accepted risks. We then suggest a new strategy for prioritizing occurrences based on the risk arising from incident recurrence.

  16. Optimization-Based Management of Energy Systems

    DTIC Science & Technology

    2011-05-11

    [Extraction residue: fragments of a table of minimum-initial-cost resource mixes under renewable usage constraints for sites NC, CO, OK, NY and TX (grid purchases unlimited at every site; solar PV of 35 MW at NC, none at CO/OK/NY, 20 MW at TX; wind turbines of 65/70/65/55/50 MW; CHP via microturbines and absorption chillers, partially truncated), together with axis labels from an annual-cost figure (Total Cost, Grid Energy Cost, Grid Demand Cost, Heating Cost, CHP Natural Gas Cost, Diesel Cost).]

  17. Optimization-based Dynamic Human Walking Prediction

    DTIC Science & Technology

    2007-01-01

    [Extraction residue: fragments of the report's reference list citing Robotica papers on biped robot walking, e.g., Chevallereau and Aousin (2001) on optimal reference trajectories for walking and running of a biped robot, Mu and Wu on synthesis of a complete sagittal gait cycle for a five-link biped robot, and Sardain and Bessonnet on forces acting on a biped robot.]

  18. A Risk-Based Sensor Placement Methodology

    SciTech Connect

    Lee, Ronald W; Kulesz, James J

    2006-08-01

    A sensor placement methodology is proposed to solve the problem of optimal location of sensors or detectors to protect population against the exposure to and effects of known and/or postulated chemical, biological, and/or radiological threats. Historical meteorological data are used to characterize weather conditions as wind speed and direction pairs with the percentage of occurrence of the pairs over the historical period. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate population at risk against standard exposure levels. Sensor locations are determined via a dynamic programming algorithm where threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. Moreover, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum factor of the total risk value represented by all threats.
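
    Illustrative sketch (editorial addition): a staged placement loop in the spirit of the methodology above, where each stage places the sensor capturing the most remaining risk, captured threats are removed from consideration, and the marginal utility of each added sensor is recorded. The coverage matrix and risk values are random placeholders, and a simple greedy pass stands in for the full dynamic programming formulation.

    ```python
    # Staged sensor placement sketch with purely illustrative data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_threats, n_sites = 200, 15
    risk = rng.uniform(1.0, 10.0, n_threats)          # risk value of each threat scenario
    covers = rng.random((n_sites, n_threats)) < 0.15  # covers[s, t]: site s detects threat t

    remaining = np.ones(n_threats, dtype=bool)
    placements = []
    while remaining.any():
        gains = covers[:, remaining].astype(float) @ risk[remaining]
        best = int(np.argmax(gains))
        if gains[best] <= 0:
            break
        placements.append((best, gains[best]))        # (site, marginal utility)
        remaining &= ~covers[best]                     # drop threats captured at this stage
        if len(placements) >= 5:                       # e.g., only 5 detectors available
            break

    for site, utility in placements:
        print(f"place sensor at site {site}, marginal risk captured = {utility:.1f}")
    ```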

  19. Adjoint-based optimization of fish swimming gaits

    NASA Astrophysics Data System (ADS)

    Floryan, Daniel; Rowley, Clarence W.; Smits, Alexander J.

    2016-11-01

    We study a simplified model of fish swimming, namely a flat plate periodically pitching about its leading edge. Using gradient-based optimization, we seek periodic gaits that are optimal in regards to a particular objective (e.g. maximal thrust). The two-dimensional immersed boundary projection method is used to investigate the flow states, and its adjoint formulation is used to efficiently calculate the gradient of the objective function needed for optimization. The adjoint method also provides sensitivity information, which may be used to elucidate the physics responsible for optimality. Supported under ONR MURI Grants N00014-14-1-0533, Program Manager Bob Brizzolara.

  20. Game theory and risk-based leveed river system planning with noncooperation

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Lund, Jay R.; Madani, Kaveh

    2016-01-01

    Optimal risk-based levee designs are usually developed for economic efficiency. However, in river systems with multiple levees, the planning and maintenance of different levees are controlled by different agencies or groups. For example, along many rivers, levees on opposite riverbanks constitute a simple leveed river system with each levee designed and controlled separately. Collaborative planning of the two levees can be economically optimal for the whole system. Independent and self-interested landholders on opposite riversides often are willing to separately determine their individual optimal levee plans, resulting in a less efficient leveed river system from an overall society-wide perspective (the tragedy of commons). We apply game theory to simple leveed river system planning where landholders on each riverside independently determine their optimal risk-based levee plans. Outcomes from noncooperative games are analyzed and compared with the overall economically optimal outcome, which minimizes net flood cost system-wide. The system-wide economically optimal solution generally transfers residual flood risk to the lower-valued side of the river, but is often impractical without compensating for flood risk transfer to improve outcomes for all individuals involved. Such compensation can be determined and implemented with landholders' agreements on collaboration to develop an economically optimal plan. By examining iterative multiple-shot noncooperative games with reversible and irreversible decisions, the costs of myopia for the future in making levee planning decisions show the significance of considering the externalities and evolution path of dynamic water resource problems to improve decision-making.

  1. Performance optimization of web-based medical simulation.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2013-01-01

    This paper presents a technique for performance optimization of multimodal interactive web-based medical simulation. A web-based simulation framework is promising for easy access and wide dissemination of medical simulation. However, the real-time performance of the simulation highly depends on hardware capability on the client side. Providing consistent simulation in different hardware is critical for reliable medical simulation. This paper proposes a non-linear mixed integer programming model to optimize the performance of visualization and physics computation while considering hardware capability and application specific constraints. The optimization model identifies and parameterizes the rendering and computing capabilities of the client hardware using an exploratory proxy code. The parameters are utilized to determine the optimized simulation conditions including texture sizes, mesh sizes and canvas resolution. The test results show that the optimization model not only achieves a desired frame per second but also resolves visual artifacts due to low performance hardware.

  2. Stochastic structural and reliability based optimization of tuned mass damper

    NASA Astrophysics Data System (ADS)

    Mrabet, E.; Guedri, M.; Ichchou, M. N.; Ghanmi, S.

    2015-08-01

    The purpose of the current work is to present and discuss a technique for optimizing the parameters of a vibration absorber in the presence of uncertain bounded structural parameters. The technique used in the optimization is an interval extension based on a Taylor expansion of the objective function. The technique permits the transformation of the problem, initially non-deterministic, into two independents deterministic sub-problems. Two optimization strategies are considered: the Stochastic Structural Optimization (SSO) and the Reliability Based Optimization (RBO). It has been demonstrated through two different structures that the technique is valid for the SSO problem, even for high levels of uncertainties and it is less suitable for the RBO problem, especially when considering high levels of uncertainties.

  3. Perceptions of risk in motorcyclists: unrealistic optimism, relative realism and predictions of behaviour.

    PubMed

    Rutter, D R; Quine, L; Albery, I P

    1998-11-01

    In the first phase of a prospective investigation, a national sample of motorcyclists completed a postal questionnaire about their perceptions of risk, their behaviour on the roads and their history of accidents and spills. In the second phase a year later, they reported on their accident history and behaviour over the preceding 12 months. A total of 723 respondents completed both questionnaires. Four sets of findings are reported. First, the group as a whole showed unrealistic optimism: on average, respondents believed themselves to be less at risk than other motorcyclists of an accident needing hospital treatment in the next year. Second, optimism was tempered by 'relative realism', in that respondents who were young and inexperienced saw themselves as more at risk than other motorcyclists, as did riders who reported risky behaviours on the road. Third, there was some evidence of debiasing by personal history, in that having a friend or a relative who had been killed or injured on the roads was associated with perceptions of absolute risk of injury or death--though there were no effects on comparative risk and no effects on any of the judgments of a history of accidents of one's own. Finally, there was good evidence that perceptions of risk predicted subsequent behaviour, though generally in the direction not of precaution adoption but of precaution abandonment: the greater the perceived risk at time 1, the more frequent the risky behaviour at time 2. The implications of the findings are discussed, and possible interpretations are suggested.

  4. Generalized optimal risk allocation: foraging and antipredator behavior in a fluctuating environment.

    PubMed

    Higginson, Andrew D; Fawcett, Tim W; Trimmer, Pete C; McNamara, John M; Houston, Alasdair I

    2012-11-01

    Animals live in complex environments in which predation risk and food availability change over time. To deal with this variability and maximize their survival, animals should take into account how long current conditions may persist and the possible future conditions they may encounter. This should affect their foraging activity, and with it their vulnerability to predation across periods of good and bad conditions. Here we develop a comprehensive theory of optimal risk allocation that allows for environmental persistence and for fluctuations in food availability as well as predation risk. We show that it is the duration of good and bad periods, independent of each other, rather than the overall proportion of time exposed to each that is the most important factor affecting behavior. Risk allocation is most pronounced when conditions change frequently, and optimal foraging activity can either increase or decrease with increasing exposure to bad conditions. When food availability fluctuates rapidly, animals should forage more when food is abundant, whereas when food availability fluctuates slowly, they should forage more when food is scarce. We also show that survival can increase as variability in predation risk increases. Our work reveals that environmental persistence should profoundly influence behavior. Empirical studies of risk allocation should therefore carefully control the duration of both good and bad periods and consider manipulating food availability as well as predation risk.

  5. Trajectory optimization based on differential inclusion

    NASA Technical Reports Server (NTRS)

    Seywald, Hans

    1993-01-01

    A method for generating finite-dimensional approximations to the solutions of optimal control problems is introduced. By employing a description of the dynamical system in terms of its attainable sets in favor of using differential equations, the controls are completely eliminated from the system model. Besides reducing the dimensionality of the discretized problem compared to state-of-the-art collocation methods, this approach also alleviates the search for initial guesses from where standard gradient search methods are able to converge. The mechanics of the new method are illustrated on a simple double integrator problem. The performance of the new algorithm is demonstrated on a 1-D rocket ascent problem ('Goddard Problem') in presence of a dynamic pressure constraint.

  6. Image quality optimization using an x-ray spectra model-based optimization method

    NASA Astrophysics Data System (ADS)

    Gordon, Clarence L., III

    2000-04-01

    Several x-ray parameters must be optimized to deliver exceptional fluoroscopic and radiographic x-ray Image Quality (IQ) for the large variety of clinical procedures and patient sizes performed on a cardiac/vascular x-ray system. The optimal choice varies as a function of the objective of the medical exam, the patient size, local regulatory requirements, and the operational range of the system. As a result, many distinct combinations are required to successfully operate the x-ray system and meet the clinical imaging requirements. Presented here, is a new, configurable and automatic method to perform x-ray technique and IQ optimization using an x-ray spectral model based simulation of the x-ray generation and detection system. This method incorporates many aspects/requirements of the clinical environment, and a complete description of the specific x-ray system. First, the algorithm requires specific inputs: clinically relevant performance objectives, system hardware configuration, and system operational range. Second, the optimization is performed for a Primary Optimization Strategy versus patient thickness, e.g. maximum contrast. Finally, in the case where there are multiple operating points, which meet the Primary Optimization Strategy, a Secondary Optimization Strategy, e.g. to minimize patient dose, is utilized to determine the final set of optimal x-ray techniques.

  7. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Excerpt (12 CFR 932.3, Banks and Banking, Federal Housing Finance Board, Federal Home Loan Bank Risk Management, 2010 CFR): a Bank's risk-based capital requirement is defined in terms of its credit risk capital requirement, its market risk capital requirement, and its operations risk capital requirement.

  8. Optimization of a photovoltaic pumping system based on the optimal control theory

    SciTech Connect

    Betka, A.; Attali, A.

    2010-07-15

    This paper suggests how an optimal operation of a photovoltaic pumping system based on an induction motor driving a centrifugal pump can be realized. The optimization problem consists in maximizing the daily pumped water quantity via the optimization of the motor efficiency for every operation point. The proposed structure allows at the same time the minimization the machine losses, the field oriented control and the maximum power tracking of the photovoltaic array. This will be attained based on multi-input and multi-output optimal regulator theory. The effectiveness of the proposed algorithm is described by simulation and the obtained results are compared to those of a system working with a constant air gap flux. (author)

  9. Mean-variance portfolio optimization by using time series approaches based on logarithmic utility function

    NASA Astrophysics Data System (ADS)

    Soeryana, E.; Fadhlina, N.; Sukono; Rusyaman, E.; Supian, S.

    2017-01-01

    Investors in stocks also face risk, because daily stock prices fluctuate. To minimize this risk, investors usually form an investment portfolio; a portfolio composed of several stocks is intended to achieve an optimal composition of the investment. This paper discusses mean-variance optimization of a stock portfolio when the mean and volatility are not constant, based on a logarithmic utility function. The non-constant mean is analysed using Autoregressive Moving Average (ARMA) models, while the non-constant volatility is analysed using Generalized Autoregressive Conditional Heteroscedastic (GARCH) models. The optimization is performed using the Lagrangian multiplier technique. As a numerical illustration, the method is applied to several Islamic stocks in Indonesia, yielding the proportion of investment in each stock analysed.
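
    Illustrative sketch (editorial addition): the closed-form mean-variance weights under a full-investment constraint, obtained with a Lagrange multiplier as in the optimization step described above. The expected returns and covariance matrix are assumed numbers; in the paper they would come from the ARMA and GARCH fits.

    ```python
    # Maximize w'mu - (gamma/2) w'Sigma w subject to sum(w) = 1, solved in closed
    # form with a Lagrange multiplier.  mu and Sigma are assumed values.
    import numpy as np

    mu = np.array([0.010, 0.012, 0.008])                 # expected returns (assumed)
    Sigma = np.array([[0.040, 0.006, 0.004],
                      [0.006, 0.050, 0.005],
                      [0.004, 0.005, 0.030]])            # covariance matrix (assumed)
    gamma = 3.0                                          # risk-aversion coefficient

    ones = np.ones(len(mu))
    Sinv = np.linalg.inv(Sigma)
    # Stationarity: gamma*Sigma*w = mu - lam*1  =>  w = Sinv(mu - lam*1)/gamma,
    # with lam chosen so that 1'w = 1.
    lam = (ones @ Sinv @ mu - gamma) / (ones @ Sinv @ ones)
    w = Sinv @ (mu - lam * ones) / gamma
    print("weights:", w, "sum =", w.sum())
    ```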

  10. Anatomy-Based Inverse Planning Simulated Annealing Optimization in High-Dose-Rate Prostate Brachytherapy: Significant Dosimetric Advantage Over Other Optimization Techniques

    SciTech Connect

    Jacob, Dayee; Raben, Adam; Sarkar, Abhirup; Grimm, Jimm; Simpson, Larry

    2008-11-01

    Purpose: To perform an independent validation of an anatomy-based inverse planning simulated annealing (IPSA) algorithm in obtaining superior target coverage and reducing the dose to the organs at risk. Method and Materials: In a recent prostate high-dose-rate brachytherapy protocol study by the Radiation Therapy Oncology Group (0321), our institution treated 20 patients between June 1, 2005 and November 30, 2006. These patients had received a high-dose-rate boost dose of 19 Gy to the prostate, in addition to an external beam radiotherapy dose of 45 Gy with intensity-modulated radiotherapy. Three-dimensional dosimetry was obtained for the following optimization schemes in the Plato Brachytherapy Planning System, version 14.3.2, using the same dose constraints for all the patients treated during this period: anatomy-based IPSA optimization, geometric optimization, and dose point optimization. Dose-volume histograms were generated for the planning target volume and organs at risk for each optimization method, from which the volume receiving at least 75% of the dose (V75%) for the rectum and bladder, volume receiving at least 125% of the dose (V125%) for the urethra, and total volume receiving the reference dose (V100%) and volume receiving 150% of the dose (V150%) for the planning target volume were determined. The dose homogeneity index and conformal index for the planning target volume for each optimization technique were compared. Results: Despite suboptimal needle position in some implants, the IPSA algorithm was able to comply with the tight Radiation Therapy Oncology Group dose constraints for 90% of the patients in this study. In contrast, the compliance was only 30% for dose point optimization and only 5% for geometric optimization. Conclusions: Anatomy-based IPSA optimization proved to be the superior technique and also the fastest for reducing the dose to the organs at risk without compromising the target coverage.

  11. Trading risk and performance for engineering design optimization using multifidelity analyses

    NASA Astrophysics Data System (ADS)

    Rajnarayan, Dev Gorur

    Computers pervade our lives today: from communication to calculation, their influence percolates many spheres of our existence. With continuing advances in computing, simulations are becoming increasingly complex and accurate. Powerful high-fidelity simulations mimic and predict a variety of real-life scenarios, with applications ranging from entertainment to engineering. The most accurate of such engineering simulations come at a high cost in terms of computing resources and time. Engineers use such simulations to predict the real-world performance of products they design; that is, they use them for analysis. Needless to say, the emphasis is on accuracy of the prediction. For such analysis, one would like to use the most accurate simulation available, and such a simulation is likely to be at the limits of available computing power, quite independently of advances in computing. In engineering design, however, the goal is somewhat different. Engineering design is generally posed as an optimization problem, where the goal is to tweak a set of available inputs or parameters, called design variables, to create a design that is optimal in some way, and meets some preset requirements. In other words, we would like to modify the design variables in order to optimize some figure of merit, called an objective function, subject to a set of constraints, typically formulated as equations or inequalities to be satisfied. Typically, a complex engineering system such as an aircraft is described by thousands of design variables, all of which are optimized during the design process. Nevertheless, do we always need to use the highest-fidelity simulations as the objective function and constraints for engineering design? Or can we afford to use lower-fidelity simulations with appropriate corrections? In this thesis, we present a new methodology for surrogate-based optimization. Existing methods combine the possibly erroneous predictions of the low-fidelity surrogate with estimates of

  12. Quantum-based algorithm for optimizing artificial neural networks.

    PubMed

    Tzyy-Chyang Lu; Gwo-Ruey Yu; Jyh-Ching Juang

    2013-08-01

    This paper presents a quantum-based algorithm for evolving artificial neural networks (ANNs). The aim is to design an ANN with few connections and high classification performance by simultaneously optimizing the network structure and the connection weights. Unlike most previous studies, the proposed algorithm uses quantum bit representation to codify the network. As a result, the connectivity bits do not indicate the actual links but the probability of the existence of the connections, thus alleviating mapping problems and reducing the risk of throwing away a potential candidate. In addition, in the proposed model, each weight space is decomposed into subspaces in terms of quantum bits. Thus, the algorithm performs a region by region exploration, and evolves gradually to find promising subspaces for further exploitation. This is helpful to provide a set of appropriate weights when evolving the network structure and to alleviate the noisy fitness evaluation problem. The proposed model is tested on four benchmark problems, namely breast cancer and iris, heart, and diabetes problems. The experimental results show that the proposed algorithm can produce compact ANN structures with good generalization ability compared to other algorithms.

  13. [Physical process based risk assessment of groundwater pollution in the mining area].

    PubMed

    Sun, Fa-Sheng; Cheng, Pin; Zhang, Bo

    2014-04-01

    Case studies of groundwater pollution risk assessment at home and abroad generally start from groundwater vulnerability, without giving much consideration to the influence of characteristic pollutants on the consequences of pollution. Vulnerability is the natural sensitivity of the environment to pollutants, whereas risk assessment of groundwater pollution should reflect the movement and distribution of pollutants in groundwater. In order to improve the theory and methods of groundwater pollution risk assessment, a physical-process-based risk assessment methodology was proposed for a mining area. According to the sensitivity of the economic and social conditions and the possible future distribution of pollutants, the spatial distribution of risk levels in the aquifer was ranked beforehand, and the pollutant source intensity corresponding to each risk level was deduced accordingly. Taking this as the criterion for classifying groundwater pollution risk, the groundwater pollution risk in the mining area was evaluated by simulating the migration of pollutants in the vadose zone and the aquifer. The results show that the physical-process-based risk assessment method can give the spatial and temporal distribution of pollutant concentrations and risk levels. For a single point-source polluted area, it gives a detailed risk characterization, which is better than risk assessment methods based on an aquifer intrinsic vulnerability index, and it is applicable to the risk assessment of existing polluted sites, to the optimization of future sites, and to providing design parameters for site construction.

  14. SU-E-T-436: Fluence-Based Trajectory Optimization for Non-Coplanar VMAT

    SciTech Connect

    Smyth, G; Bamber, JC; Bedford, JL; Evans, PM; Saran, FH; Mandeville, HC

    2015-06-15

    Purpose: To investigate a fluence-based trajectory optimization technique for non-coplanar VMAT for brain cancer. Methods: Single-arc non-coplanar VMAT trajectories were determined using a heuristic technique for five patients. Organ at risk (OAR) volume intersected during raytracing was minimized for two cases: absolute volume and the sum of relative volumes weighted by OAR importance. These trajectories and coplanar VMAT formed starting points for the fluence-based optimization method. Iterative least squares optimization was performed on control points 24° apart in gantry rotation. Optimization minimized the root-mean-square (RMS) deviation of PTV dose from the prescription (relative importance 100), maximum dose to the brainstem (10), optic chiasm (5), globes (5) and optic nerves (5), plus mean dose to the lenses (5), hippocampi (3), temporal lobes (2), cochleae (1) and brain excluding other regions of interest (1). Control point couch rotations were varied in steps of up to 10° and accepted if the cost function improved. Final treatment plans were optimized with the same objectives in an in-house planning system and evaluated using a composite metric - the sum of optimization metrics weighted by importance. Results: The composite metric decreased with fluence-based optimization in 14 of the 15 plans. In the remaining case its overall value, and the PTV and OAR components, were unchanged but the balance of OAR sparing differed. PTV RMS deviation was improved in 13 cases and unchanged in two. The OAR component was reduced in 13 plans. In one case the OAR component increased but the composite metric decreased - a 4 Gy increase in OAR metrics was balanced by a reduction in PTV RMS deviation from 2.8% to 2.6%. Conclusion: Fluence-based trajectory optimization improved plan quality as defined by the composite metric. While dose differences were case specific, fluence-based optimization improved both PTV and OAR dosimetry in 80% of cases.

  15. Optimizing ring-based CSR sources

    SciTech Connect

    Byrd, J.M.; De Santis, S.; Hao, Z.; Martin, M.C.; Munson, D.V.; Li, D.; Nishimura, H.; Robin, D.S.; Sannibale, F.; Schlueter, R.D.; Schoenlein, R.; Jung, J.Y.; Venturini, M.; Wan, W.; Zholents, A.A.; Zolotorev, M.

    2004-01-01

    Coherent synchrotron radiation (CSR) is a fascinating phenomenon recently observed in electron storage rings and shows tremendous promise as a high power source of radiation at terahertz frequencies. However, because of the properties of the radiation and the electron beams needed to produce it, there are a number of interesting features of the storage ring that can be optimized for CSR. Furthermore, CSR has been observed in three distinct forms: as steady pulses from short bunches, bursts from growth of spontaneous modulations in high current bunches, and from micro modulations imposed on a bunch from laser slicing. These processes have their relative merits as sources and can be improved via the ring design. The terahertz (THz) and sub-THz region of the electromagnetic spectrum lies between the infrared and the microwave . This boundary region is beyond the normal reach of optical and electronic measurement techniques and sources associated with these better-known neighbors. Recent research has demonstrated a relatively high power source of THz radiation from electron storage rings: coherent synchrotron radiation (CSR). Besides offering high power, CSR enables broadband optical techniques to be extended to nearly the microwave region, and has inherently sub-picosecond pulses. As a result, new opportunities for scientific research and applications are enabled across a diverse array of disciplines: condensed matter physics, medicine, manufacturing, and space and defense industries. CSR will have a strong impact on THz imaging, spectroscopy, femtosecond dynamics, and driving novel non-linear processes. CSR is emitted by bunches of accelerated charged particles when the bunch length is shorter than the wavelength being emitted. When this criterion is met, all the particles emit in phase, and a single-cycle electromagnetic pulse results with an intensity proportional to the square of the number of particles in the bunch. It is this quadratic dependence that can

  16. Experimental Eavesdropping Based on Optimal Quantum Cloning

    NASA Astrophysics Data System (ADS)

    Bartkiewicz, Karol; Lemr, Karel; Černoch, Antonín; Soubusta, Jan; Miranowicz, Adam

    2013-04-01

    The security of quantum cryptography is guaranteed by the no-cloning theorem, which implies that an eavesdropper copying transmitted qubits in unknown states causes their disturbance. Nevertheless, in real cryptographic systems some level of disturbance has to be allowed to cover, e.g., transmission losses. An eavesdropper can attack such systems by replacing a noisy channel by a better one and by performing approximate cloning of transmitted qubits, which disturbs them but below the noise level assumed by legitimate users. We experimentally demonstrate such symmetric individual eavesdropping on the quantum key distribution protocols of Bennett and Brassard (BB84) and the trine-state spherical code of Renes (R04) with two-level probes prepared using a recently developed photonic multifunctional quantum cloner [Lemr et al., Phys. Rev. A 85, 050307(R) (2012)]. We demonstrated that our optimal cloning device with a high success rate makes the eavesdropping possible by hiding it in the usual transmission losses. We believe that this experiment can stimulate the quest for other operational applications of quantum cloning.

  17. Optimal diabatic states based on solvation parameters

    NASA Astrophysics Data System (ADS)

    Alguire, Ethan; Subotnik, Joseph E.

    2012-11-01

    A new method for obtaining diabatic electronic states of a molecular system in a condensed environment is proposed and evaluated. This technique, which we denote as Edmiston-Ruedenberg (ER)-ɛ diabatization, forms diabatic states as a linear combination of adiabatic states by minimizing an approximation to the total coupling between states in a medium with temperature T and with a characteristic Pekar factor C. ER-ɛ diabatization represents an improvement upon previous localized diabatization methods for two reasons: first, it is sensitive to the energy separation between adiabatic states, thus accounting for fluctuations in energy and effectively preventing over-mixing. Second, it responds to the strength of system-solvent interactions via parameters for the dielectric constant and temperature of the medium, which is physically reasonable. Here, we apply the ER-ɛ technique to both intramolecular and intermolecular excitation energy transfer systems. We find that ER-ɛ diabatic states satisfy three important properties: (1) they have small derivative couplings everywhere; (2) they have small diabatic couplings at avoided crossings, and (3) they have negligible diabatic couplings everywhere else. As such, ER-ɛ states are good candidates for so-called "optimal diabatic states."

  18. Probabilistic-based approach to optimal filtering

    PubMed

    Hannachi

    2000-04-01

    The signal-to-noise ratio maximizing approach in optimal filtering provides a robust tool to detect signals in the presence of colored noise. The method fails, however, when the data present a regimelike behavior. An approach is developed in this manuscript to recover local (in phase space) behavior in an intermittent regimelike behaving system. The method is first formulated in its general form within a Gaussian framework, given an estimate of the noise covariance, and demands that the signal corresponds to minimizing the noise probability distribution for any given value, i.e., on isosurfaces, of the data probability distribution. The extension to the non-Gaussian case is provided through the use of finite mixture models for data that show regimelike behavior. The method yields the correct signal when applied in a simplified manner to synthetic time series with and without regimes, compared to the signal-to-noise ratio approach, and helps identify the right frequency of the oscillation spells in the classical and variants of the Lorenz system.

  19. Two-level optimization of composite wing structures based on panel genetic optimization

    NASA Astrophysics Data System (ADS)

    Liu, Boyang

    load. The resulting response surface is used for wing-level optimization. In general, complex composite structures consist of several laminates. A common problem in the design of such structures is that some plies in the adjacent laminates terminate in the boundary between the laminates. These discontinuities may cause stress concentrations and may increase manufacturing difficulty and cost. We developed measures of continuity of two adjacent laminates. We studied tradeoffs between weight and continuity through a simple composite wing design. Finally, we compared the two-level optimization to a single-level optimization based on flexural lamination parameters. The single-level optimization is efficient and feasible for a wing consisting of unstiffened panels.

  20. Joint global optimization of tomographic data based on particle swarm optimization and decision theory

    NASA Astrophysics Data System (ADS)

    Paasche, H.; Tronicke, J.

    2012-04-01

    In many near surface geophysical applications multiple tomographic data sets are routinely acquired to explore subsurface structures and parameters. Linking the model generation process of multi-method geophysical data sets can significantly reduce ambiguities in geophysical data analysis and model interpretation. Most geophysical inversion approaches rely on local search optimization methods used to find an optimal model in the vicinity of a user-given starting model. The final solution may critically depend on the initial model. Alternatively, global optimization (GO) methods have been used to invert geophysical data. They explore the solution space in more detail and determine the optimal model independently of the starting model. Additionally, they can be used to find sets of optimal models allowing a further analysis of model parameter uncertainties. Here we employ particle swarm optimization (PSO) to realize the global optimization of tomographic data. PSO is an emergent method based on swarm intelligence characterized by fast and robust convergence towards optimal solutions. The fundamental principle of PSO is inspired by nature, since the algorithm mimics the behavior of a flock of birds searching for food in a search space. In PSO, a number of particles cruise a multi-dimensional solution space striving to find optimal model solutions explaining the acquired data. The particles communicate their positions and success and direct their movement according to the position of the currently most successful particle of the swarm. The success of a particle, i.e. the quality of the model currently found by a particle, must be uniquely quantifiable to identify the swarm leader. When jointly inverting disparate data sets, the optimization solution has to satisfy multiple optimization objectives, at least one for each data set. Unique determination of the most successful particle currently leading the swarm is not possible. Instead, only statements about the Pareto
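
    Illustrative sketch (editorial addition): the core single-objective PSO update in which particles are steered by their personal best and the current swarm leader. The objective is a generic test function standing in for a tomographic data misfit, and the multi-objective Pareto handling discussed in the abstract is not included.

    ```python
    # Bare-bones particle swarm optimizer illustrating the update rule described above.
    import numpy as np

    def objective(x):                      # sphere function as a stand-in misfit
        return np.sum(x ** 2, axis=-1)

    rng = np.random.default_rng(1)
    n_particles, dim = 30, 5
    w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients

    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), objective(x)
    gbest = pbest[np.argmin(pbest_val)]

    for _ in range(200):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = x + v
        val = objective(x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)]                          # swarm leader

    print("best model found:", gbest, "misfit:", objective(gbest))
    ```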

  1. Stackelberg game of buyback policy in supply chain with a risk-averse retailer and a risk-averse supplier based on CVaR.

    PubMed

    Zhou, Yanju; Chen, Qian; Chen, Xiaohong; Wang, Zongrun

    2014-01-01

    This paper considers a decentralized supply chain in which a single supplier sells a perishable product to a single retailer facing uncertain demand. We assume that the supplier and the retailer are both risk averse and use Conditional Value at Risk (CVaR), a risk measure popularized in financial risk management, to characterize their risk attitudes. We establish a buyback policy model based on Stackelberg game theory that accounts for the supply chain members' risk preferences, and derive expressions for the supplier's optimal repurchase price and the retailer's optimal order quantity, which are compared with those in the risk-neutral case. Finally, a numerical example is used to simulate the model and verify the related conclusions.
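
    Illustrative sketch (editorial addition): a sample-based estimate of Value at Risk and Conditional Value at Risk for a loss distribution, the risk measure both supply chain members use in the model above. The loss samples are simulated placeholders.

    ```python
    # Estimate VaR and CVaR at level alpha from simulated loss samples.
    import numpy as np

    rng = np.random.default_rng(42)
    losses = rng.normal(loc=100.0, scale=30.0, size=100_000)  # illustrative losses
    alpha = 0.95

    var = np.quantile(losses, alpha)            # Value at Risk at level alpha
    cvar = losses[losses >= var].mean()         # mean loss in the worst (1 - alpha) tail

    print(f"VaR_{alpha:.2f}  = {var:.2f}")
    print(f"CVaR_{alpha:.2f} = {cvar:.2f}")
    ```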

  2. Equipment management risk rating system based on engineering endpoints.

    PubMed

    James, P J

    1999-01-01

    The equipment management risk ratings system outlined here offers two significant departures from current practice: risk classifications are based on intrinsic device risks, and the risk rating system is based on engineering endpoints. Intrinsic device risks are categorized as physical, clinical and technical, and these flow from the incoming equipment assessment process. Engineering risk management is based on verification of engineering endpoints such as clinical measurements or energy delivery. This practice eliminates the ambiguity associated with ranking risk in terms of physiologic and higher-level outcome endpoints such as no significant hazards, low significance, injury, or mortality.

  3. A new efficient optimal path planner for mobile robot based on Invasive Weed Optimization algorithm

    NASA Astrophysics Data System (ADS)

    Mohanty, Prases K.; Parhi, Dayal R.

    2014-12-01

    Planning of the shortest/optimal route is essential for the efficient operation of an autonomous mobile robot or vehicle. In this paper Invasive Weed Optimization (IWO), a new meta-heuristic algorithm, has been implemented for solving the path planning problem of a mobile robot in partially or totally unknown environments. This meta-heuristic optimization is based on the colonizing property of weeds. First, we frame an objective function that satisfies the conditions of obstacle avoidance and the target-seeking behavior of the robot in partially or completely unknown environments. Depending upon the value of the objective function of each weed in the colony, the robot avoids obstacles and proceeds towards the destination. The optimal trajectory is generated with this navigational algorithm when the robot reaches its destination. The effectiveness, feasibility, and robustness of the proposed algorithm have been demonstrated through a series of simulation and experimental results. Finally, it has been found that the developed path planning algorithm can be effectively applied to any kind of complex situation.

  4. Solid-perforated panel layout optimization by topology optimization based on unified transfer matrix.

    PubMed

    Kim, Yoon Jae; Kim, Yoon Young

    2010-10-01

    This paper presents a numerical method for optimizing the sequencing of solid panels, perforated panels and air gaps, and their respective thicknesses, for maximizing sound transmission loss and/or absorption. For the optimization, a method based on the topology optimization formulation is proposed. It is difficult to employ only the commonly-used material interpolation technique because the involved layers exhibit fundamentally different acoustic behavior. Thus, a new optimization formulation using a so-called unified transfer matrix is proposed. The key idea is to form the elements of the transfer matrix such that the elements interpolated by the layer design variables can be those of air, perforated-panel, and solid-panel layers. The problem related to the interpolation is addressed, and benchmark-type problems such as sound transmission or absorption maximization problems are solved to check the efficiency of the developed method.

  5. Optimizing Screening and Risk Assessment for Suicide Risk in the U.S. Military

    DTIC Science & Technology

    2015-03-01

    [Extraction residue: fragments of the report text and reference list, noting that people, including those in military service, may attempt to deny, suppress, or control anxiety about death (Lewis, Espe-Pfeifer, & Blair, 2000), that some groups show significantly higher than average rates of attrition from the military (Hoge et al., 2002), and that legal problems and misconduct are also relevant indicators.]

  6. Optimizing Screening and Risk Assessment for Suicide Risk in the U.S. Military

    DTIC Science & Technology

    2014-09-01

    [Extraction residue: fragments of the report text and reference list, touching on denial and control of death anxiety (Lewis, Espe-Pfeifer, & Blair, 2000), combat training as a potential source of acquired capability, attrition from the military (Hoge et al., 2002), and legal problems, misconduct, unauthorized absences, and substance use as risk indicators.]

  7. Genetic-evolution-based optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.

    1990-01-01

    This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.
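
    Illustrative sketch (editorial addition): a toy genetic algorithm with the reproduction (selection), crossover, and mutation operators the abstract lists, applied to a placeholder continuous-variable fitness function rather than a structural analysis.

    ```python
    # Toy genetic algorithm: tournament selection, single-point crossover, mutation.
    import random

    def fitness(x):                              # placeholder: maximum at x = [1, ..., 1]
        return -sum((xi - 1.0) ** 2 for xi in x)

    def select(pop):                             # binary tournament selection
        a, b = random.sample(pop, 2)
        return a if fitness(a) > fitness(b) else b

    def crossover(p1, p2):                       # single-point crossover
        cut = random.randrange(1, len(p1))
        return p1[:cut] + p2[cut:]

    def mutate(x, rate=0.1, step=0.2):
        return [xi + random.uniform(-step, step) if random.random() < rate else xi
                for xi in x]

    pop = [[random.uniform(-5, 5) for _ in range(6)] for _ in range(40)]
    for _ in range(100):
        pop = [mutate(crossover(select(pop), select(pop))) for _ in range(len(pop))]

    best = max(pop, key=fitness)
    print("best design variables:", [round(v, 3) for v in best])
    ```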

  8. Sequential ensemble-based optimal design for parameter estimation

    NASA Astrophysics Data System (ADS)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
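
    Illustrative sketch (editorial addition): one of the information metrics mentioned above, the Shannon entropy difference, computed for Gaussian-approximated ensembles as half the log-determinant ratio of prior and posterior covariances. The ensembles are random placeholders standing in for EnKF parameter ensembles under a candidate sampling design.

    ```python
    # Shannon entropy difference for Gaussian ensembles:
    # SD = 0.5 * ln( det(C_prior) / det(C_posterior) ).
    import numpy as np

    rng = np.random.default_rng(7)
    prior = rng.normal(size=(200, 4))                                    # 200 members, 4 parameters
    posterior = prior * 0.5 + rng.normal(scale=0.1, size=prior.shape)    # tighter ensemble

    def entropy_difference(prior_ens, post_ens):
        c0 = np.cov(prior_ens, rowvar=False)
        c1 = np.cov(post_ens, rowvar=False)
        _, logdet0 = np.linalg.slogdet(c0)
        _, logdet1 = np.linalg.slogdet(c1)
        return 0.5 * (logdet0 - logdet1)

    print("expected information gain (SD):", entropy_difference(prior, posterior))
    ```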

  9. Adaptive Flood Risk Management Under Climate Change Uncertainty Using Real Options and Optimization.

    PubMed

    Woodward, Michelle; Kapelan, Zoran; Gouldby, Ben

    2014-01-01

    It is well recognized that adaptive and flexible flood risk strategies are required to account for future uncertainties. Development of such strategies is, however, a challenge. Climate change alone is a significant complication, but, in addition, complexities exist trying to identify the most appropriate set of mitigation measures, or interventions. There are a range of economic and environmental performance measures that require consideration, and the spatial and temporal aspects of evaluating the performance of these is complex. All these elements pose severe difficulties to decisionmakers. This article describes a decision support methodology that has the capability to assess the most appropriate set of interventions to make in a flood system and the opportune time to make these interventions, given the future uncertainties. The flood risk strategies have been explicitly designed to allow for flexible adaptive measures by capturing the concepts of real options and multiobjective optimization to evaluate potential flood risk management opportunities. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated to each strategy over future points in time and a multiobjective genetic algorithm is utilized to search for the optimal adaptive strategies. The modeling system has been applied to a reach on the Thames Estuary (London, England), and initial results show the inclusion of flexibility is advantageous, while the outputs provide decisionmakers with supplementary knowledge that previously has not been considered.

  10. PLS-optimal: a stepwise D-optimal design based on latent variables.

    PubMed

    Brandmaier, Stefan; Sahlin, Ullrika; Tetko, Igor V; Öberg, Tomas

    2012-04-23

    Several applications, such as risk assessment within REACH or drug discovery, require reliable methods for the design of experiments and efficient testing strategies. Keeping the number of experiments as low as possible is important from both a financial and an ethical point of view, as exhaustive testing of compounds requires significant financial resources and animal lives. With a large initial set of compounds, experimental design techniques can be used to select a representative subset for testing. Once measured, these compounds can be used to develop quantitative structure-activity relationship models to predict properties of the remaining compounds. This reduces the required resources and time. D-Optimal design is frequently used to select an optimal set of compounds by analyzing data variance. We developed a new sequential approach to apply a D-Optimal design to latent variables derived from a partial least squares (PLS) model instead of principal components. The stepwise procedure selects a new set of molecules to be measured after each previous measurement cycle. We show that application of the D-Optimal selection generates models with a significantly improved performance on four different data sets with end points relevant for REACH. Compared to those derived from principal components, PLS models derived from the selection on latent variables had a lower root-mean-square error and a higher Q2 and R2. This improvement is statistically significant, especially for the small number of compounds selected.
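
    Illustrative sketch (editorial addition): a greedy D-optimal selection operating on a latent-variable score matrix, repeatedly adding the candidate compound that most increases the determinant of the information matrix. In the paper the scores come from a PLS model refit after each measurement cycle; here they are random placeholders and only a single selection pass is shown.

    ```python
    # Greedy D-optimal subset selection on latent-variable scores.
    import numpy as np

    rng = np.random.default_rng(3)
    T = rng.normal(size=(100, 3))            # 100 candidate compounds, 3 latent variables
    n_select = 10

    selected = []
    for _ in range(n_select):
        best_i, best_det = None, -np.inf
        for i in range(len(T)):
            if i in selected:
                continue
            sub = T[selected + [i]]
            # log-det of the information matrix; a tiny ridge keeps it defined early on
            _, logdet = np.linalg.slogdet(sub.T @ sub + 1e-8 * np.eye(T.shape[1]))
            if logdet > best_det:
                best_i, best_det = i, logdet
        selected.append(best_i)

    print("compounds selected for testing:", selected)
    ```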

  11. Demonstrating the benefits of template-based design-technology co-optimization

    NASA Astrophysics Data System (ADS)

    Liebmann, Lars; Hibbeler, Jason; Hieter, Nathaniel; Pileggi, Larry; Jhaveri, Tejas; Moe, Matthew; Rovner, Vyacheslav

    2010-03-01

    The concept of template-based design-technology co-optimization as a means of curbing escalating design complexity and increasing technology qualification risk is described. Data is presented highlighting the design efficacy of this proposal in terms of power, performance, and area benefits, quantifying the specific contributions of complex logic gates in this design optimization. Experimental results from 32nm technology node bulk CMOS wafers are presented to quantify the variability and design-margin reductions as well as yield and manufacturability improvements achievable with the proposed template-based design-technology co-optimization technique. The paper closes with data showing the predictable composability of individual templates, demonstrating a fundamental requirement of this proposal.

  12. Segment-Based Predominant Learning Swarm Optimizer for Large-Scale Optimization.

    PubMed

    Yang, Qiang; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Deng, Jeremiah D; Li, Yun; Zhang, Jun

    2016-10-24

    Large-scale optimization has become a significant yet challenging area in evolutionary computation. To solve this problem, this paper proposes a novel segment-based predominant learning swarm optimizer (SPLSO) in which several predominant particles guide the learning of a particle. First, a segment-based learning strategy is proposed to randomly divide the whole set of dimensions into segments. During the update, variables in different segments are evolved by learning from different exemplars while the ones in the same segment are evolved by the same exemplar. Second, to accelerate search speed and enhance search diversity, a predominant learning strategy is also proposed, which lets several predominant particles guide the update of a particle, with each predominant particle responsible for one segment of dimensions. By combining these two learning strategies together, SPLSO evolves all dimensions simultaneously and possesses competitive exploration and exploitation abilities. Extensive experiments are conducted on two large-scale benchmark function sets to investigate the influence of each algorithmic component, and comparisons with several state-of-the-art meta-heuristic algorithms dealing with large-scale problems demonstrate the competitive efficiency and effectiveness of the proposed optimizer. Further, the scalability of the optimizer to solve problems with dimensionality up to 2000 is also verified.

  13. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2016-06-20

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.

  14. Trust regions in Kriging-based optimization with expected improvement

    NASA Astrophysics Data System (ADS)

    Regis, Rommel G.

    2016-06-01

    The Kriging-based Efficient Global Optimization (EGO) method works well on many expensive black-box optimization problems. However, it does not seem to perform well on problems with steep and narrow global minimum basins and on high-dimensional problems. This article develops a new Kriging-based optimization method called TRIKE (Trust Region Implementation in Kriging-based optimization with Expected improvement) that implements a trust-region-like approach where each iterate is obtained by maximizing an Expected Improvement (EI) function within some trust region. This trust region is adjusted depending on the ratio of the actual improvement to the EI. This article also develops the Kriging-based CYCLONE (CYClic Local search in OptimizatioN using Expected improvement) method that uses a cyclic pattern to determine the search regions where the EI is maximized. TRIKE and CYCLONE are compared with EGO on 28 test problems with up to 32 dimensions and on a 36-dimensional groundwater bioremediation application in appendices supplied as an online supplement available at http://dx.doi.org/10.1080/0305215X.2015.1082350. The results show that both algorithms yield substantial improvements over EGO and they are competitive with a radial basis function method.
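
    Illustrative sketch (editorial addition): the standard Expected Improvement acquisition that EGO-type methods such as TRIKE and CYCLONE maximize, written for minimization. The Kriging predictive mean and standard deviation at the candidate points are taken as given inputs rather than computed from a fitted model.

    ```python
    # Expected Improvement (EI) for minimization given Kriging predictions.
    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, f_best):
        """EI at points with predictive mean mu and std sigma; f_best is the incumbent."""
        sigma = np.maximum(sigma, 1e-12)        # avoid division by zero
        z = (f_best - mu) / sigma
        return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Illustrative values at three candidate points (not from a fitted model):
    print(expected_improvement(np.array([1.0, 0.8, 1.2]),
                               np.array([0.3, 0.05, 0.6]),
                               f_best=0.9))
    ```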

  15. Discrete Biogeography Based Optimization for Feature Selection in Molecular Signatures.

    PubMed

    Liu, Bo; Tian, Meihong; Zhang, Chunhua; Li, Xiangtao

    2015-04-01

    Biomarker discovery from high-dimensional data is a complex task in the development of efficient cancer diagnosis and classification. However, these data are usually redundant and noisy, and only a subset of them present distinct profiles for different classes of samples. Thus, selecting highly discriminative genes from gene expression data has become increasingly interesting in the field of bioinformatics. In this paper, a discrete biogeography based optimization is proposed to select a good subset of informative genes relevant to the classification. In the proposed algorithm, firstly, the Fisher-Markov selector is used to choose a fixed number of genes. Secondly, to make biogeography based optimization suitable for the feature selection problem, a discrete migration model and a discrete mutation model are proposed to balance the exploration and exploitation abilities. Then, discrete biogeography based optimization, called DBBO, is formed by integrating the discrete migration model and the discrete mutation model. Finally, the DBBO method is used for feature selection, and three classifiers are evaluated with 10-fold cross-validation. In order to show the effectiveness and efficiency of the algorithm, it is tested on four breast cancer dataset benchmarks. In comparison with the genetic algorithm, particle swarm optimization, the differential evolution algorithm, and hybrid biogeography based optimization, experimental results demonstrate that the proposed method is better than, or at least comparable with, previous methods from the literature when considering the quality of the solutions obtained.

  16. Fatigue reliability based optimal design of planar compliant micropositioning stages

    NASA Astrophysics Data System (ADS)

    Wang, Qiliang; Zhang, Xianmin

    2015-10-01

    Conventional compliant micropositioning stages are usually developed based on static strength and deterministic methods, which may lead to either unsafe or excessive designs. This paper presents a fatigue reliability analysis and optimal design of a three-degree-of-freedom (3 DOF) flexure-based micropositioning stage. Kinematic, modal, static, and fatigue stress modelling of the stage was conducted using the finite element method. The maximum equivalent fatigue stress in the hinges was derived using sequential quadratic programming. The fatigue strength of the hinges was obtained by considering various influencing factors. On this basis, the fatigue reliability of the hinges was analysed using the stress-strength interference method. Fatigue-reliability-based optimal design of the stage was then conducted using the genetic algorithm and MATLAB. To make fatigue life testing easier, a 1 DOF stage was then optimized and manufactured. Experimental results demonstrate the validity of the approach.
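
    The stress-strength interference step reduces to a closed form when both fatigue stress and fatigue strength are modeled as independent normal variables, which is the assumption behind the sketch below; the numerical values are illustrative only and are not taken from the paper.

        from scipy.stats import norm

        def ssi_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
            # Stress-strength interference with normal strength R and normal stress S:
            # reliability = P(R - S > 0) = Phi((mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2)).
            z = (mu_strength - mu_stress) / (sd_strength**2 + sd_stress**2) ** 0.5
            return norm.cdf(z)

        # illustrative numbers only (MPa)
        print(ssi_reliability(mu_strength=420.0, sd_strength=30.0,
                              mu_stress=300.0, sd_stress=25.0))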

  17. [The elderly driver's perception of risk: do older drivers still express comparative optimism?].

    PubMed

    Spitzenstetter, Florence; Moessinger, Michelle

    2008-01-01

    People frequently express comparative optimism; that is, they believe they are less likely than average to experience negative events. The aim of the present study is, first, to observe whether people over 65 years of age are still optimistic when they evaluate driving-related risks, and second, to test the assumption that older drivers show less optimism when they compare themselves with average-age drivers than when they compare themselves with same-age drivers. Our results reveal that drivers over 65 do indeed express comparative optimism but, contrary to our expectation, the age of the comparison target appears to have an effect in only a limited number of cases. These results are discussed in particular in terms of self-image enhancement.

  18. Self-enhancement, crash-risk optimism and the impact of safety advertisements on young drivers.

    PubMed

    Harré, Niki; Foster, Susan; O'neill, Maree

    2005-05-01

    In Study 1, young drivers (aged between 16 and 29 years, N = 314) rated their driving attributes relative to their peers. They also rated their likelihood of being involved in a crash relative to their peers (crash-risk optimism), their crash history, stereotype of the young driver, and concern over another health issue. A self-enhancement bias was found for all items in which self/other comparisons were made. These items formed two major factors, perceived relative driving ability and perceived relative driving caution. These factors and perceived luck relative to peers in avoiding crashes significantly predicted crash-risk optimism. In Study 2, an experimental group of young drivers (N = 173) watched safety advertisements that showed drinking and dangerous driving resulting in a crash, and a control group (N = 193) watched advertisements showing people choosing not to drive after drinking. Each group then completed the self/other comparisons used in Study 1. The same factors were found, but only driving caution significantly predicted crash-risk optimism. The experimental group showed more self-enhancement on driving ability than the control group. In both studies, men showed substantially more self-enhancement than women about their driving ability. Implications for safety interventions are discussed.

  19. Optimizing medical data quality based on multiagent web service framework.

    PubMed

    Wu, Ching-Seh; Khoury, Ibrahim; Shah, Hemant

    2012-07-01

    One of the most important issues in e-healthcare information systems is to optimize the quality of medical data extracted from distributed and heterogeneous environments, which can greatly improve diagnostic and treatment decision making. This paper proposes a multiagent web service framework based on service-oriented architecture for the optimization of medical data quality in the e-healthcare information system. Based on the design of the multiagent web service framework, an evolutionary algorithm (EA) for the dynamic optimization of medical data quality is proposed. The framework consists of two main components: first, an EA is used to dynamically optimize the composition of medical processes into an optimal task sequence according to specific quality attributes; second, a multiagent framework discovers, monitors, and reports any inconsistency between the optimized task sequence and the actual medical records. To demonstrate the proposed framework, experimental results for a breast cancer case study are provided. Furthermore, to show the unique performance of our algorithm, a comparison with other works from the literature is presented.

  20. Efficiency Improvements to the Displacement Based Multilevel Structural Optimization Algorithm

    NASA Technical Reports Server (NTRS)

    Plunkett, C. L.; Striz, A. G.; Sobieszczanski-Sobieski, J.

    2001-01-01

    Multilevel Structural Optimization (MSO) continues to be an area of research interest in engineering optimization. In the present project, the weight optimization of beams and trusses using Displacement based Multilevel Structural Optimization (DMSO), a member of the MSO set of methodologies, is investigated. In the DMSO approach, the optimization task is subdivided into a single system and multiple subsystems level optimizations. The system level optimization minimizes the load unbalance resulting from the use of displacement functions to approximate the structural displacements. The function coefficients are then the design variables. Alternately, the system level optimization can be solved using the displacements themselves as design variables, as was shown in previous research. Both approaches ensure that the calculated loads match the applied loads. In the subsystems level, the weight of the structure is minimized using the element dimensions as design variables. The approach is expected to be very efficient for large structures, since parallel computing can be utilized in the different levels of the problem. In this paper, the method is applied to a one-dimensional beam and a large three-dimensional truss. The beam was tested to study possible simplifications to the system level optimization. In previous research, polynomials were used to approximate the global nodal displacements. The number of coefficients of the polynomials equally matched the number of degrees of freedom of the problem. Here it was desired to see if it is possible to only match a subset of the degrees of freedom in the system level. This would lead to a simplification of the system level, with a resulting increase in overall efficiency. However, the methods tested for this type of system level simplification did not yield positive results. The large truss was utilized to test further improvements in the efficiency of DMSO. In previous work, parallel processing was applied to the

  1. Aerodynamic Shape Optimization Based on Free-form Deformation

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2004-01-01

    This paper presents a free-form deformation technique suitable for aerodynamic shape optimization. Because the proposed technique is independent of grid topology, we can treat structured and unstructured computational fluid dynamics grids in the same manner. The proposed technique is an alternative shape parameterization technique to a trivariate volume technique. It retains the flexibility and freedom of trivariate volumes for CFD shape optimization, but it uses a bivariate surface representation. This reduces the number of design variables by an order of magnitude, and it provides much better control for surface shape changes. The proposed technique is simple, compact, and efficient. The analytical sensitivity derivatives are independent of the design variables and are easily computed for use in a gradient-based optimization. The paper includes the complete formulation and aerodynamic shape optimization results.

  2. An Optimization-based Atomistic-to-Continuum Coupling Method

    SciTech Connect

    Olson, Derek; Bochev, Pavel B.; Luskin, Mitchell; Shapeev, Alexander V.

    2014-08-21

    In this paper, we present a new optimization-based method for atomistic-to-continuum (AtC) coupling. The main idea is to cast the latter as a constrained optimization problem with virtual Dirichlet controls on the interfaces between the atomistic and continuum subdomains. The optimization objective is to minimize the error between the atomistic and continuum solutions on the overlap between the two subdomains, while the atomistic and continuum force balance equations provide the constraints. Separation, rather than blending, of the atomistic and continuum problems, and their subsequent use as constraints in the optimization problem, distinguishes our approach from the existing AtC formulations. Finally, we present and analyze the method in the context of a one-dimensional chain of atoms modeled using a linearized two-body potential with next-nearest neighbor interactions.

  3. Inversion method based on stochastic optimization for particle sizing.

    PubMed

    Sánchez-Escobar, Juan Jaime; Barbosa-Santillán, Liliana Ibeth; Vargas-Ubera, Javier; Aguilar-Valdés, Félix

    2016-08-01

    A stochastic inverse method is presented based on a hybrid evolutionary optimization algorithm (HEOA) to retrieve a monomodal particle-size distribution (PSD) from the angular distribution of scattered light. By solving an optimization problem, the HEOA (with the Fraunhofer approximation) retrieves the PSD from an intensity pattern generated by Mie theory. The analyzed light-scattering pattern can be attributed to unimodal normal, gamma, or lognormal distribution of spherical particles covering the interval of modal size parameters 46≤α≤150. The HEOA ensures convergence to the near-optimal solution during the optimization of a real-valued objective function by combining the advantages of a multimember evolution strategy and locally weighted linear regression. The numerical results show that our HEOA can be satisfactorily applied to solve the inverse light-scattering problem.

  4. Optimal weight based on energy imbalance and utility maximization

    NASA Astrophysics Data System (ADS)

    Sun, Ruoyan

    2016-01-01

    This paper investigates the optimal weight for both males and females using energy imbalance and utility maximization. Based on the difference between energy intake and expenditure, we develop a state equation that reveals the weight gain from this energy gap. We construct an objective function considering food consumption, eating habits and survival rate to measure utility. Through applying mathematical tools from optimal control methods and the qualitative theory of differential equations, we obtain some results. For both males and females, the optimal weight is larger than the physiologically optimal weight calculated by the Body Mass Index (BMI). We also study the corresponding trajectories to the steady state weight respectively. Depending on the value of a few parameters, the steady state can either be a saddle point with a monotonic trajectory or a focus with dampened oscillations.

  5. An Optimization-based Atomistic-to-Continuum Coupling Method

    DOE PAGES

    Olson, Derek; Bochev, Pavel B.; Luskin, Mitchell; ...

    2014-08-21

    In this paper, we present a new optimization-based method for atomistic-to-continuum (AtC) coupling. The main idea is to cast the latter as a constrained optimization problem with virtual Dirichlet controls on the interfaces between the atomistic and continuum subdomains. The optimization objective is to minimize the error between the atomistic and continuum solutions on the overlap between the two subdomains, while the atomistic and continuum force balance equations provide the constraints. Separation, rather than blending, of the atomistic and continuum problems, and their subsequent use as constraints in the optimization problem, distinguishes our approach from the existing AtC formulations. Finally, we present and analyze the method in the context of a one-dimensional chain of atoms modeled using a linearized two-body potential with next-nearest neighbor interactions.

  6. Optimization of Designs for Nanotube-based Scanning Probes

    NASA Technical Reports Server (NTRS)

    Harik, V. M.; Gates, T. S.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Optimization of designs for nanotube-based scanning probes, which may be used for high-resolution characterization of nanostructured materials, is examined. Continuum models to analyze the nanotube deformations are proposed to help guide selection of the optimum probe. The limitations on the use of these models that must be accounted for before applying to any design problem are presented. These limitations stem from the underlying assumptions and the expected range of nanotube loading, end conditions, and geometry. Once the limitations are accounted for, the key model parameters along with the appropriate classification of nanotube structures may serve as a basis for the design optimization of nanotube-based probe tips.

  7. Adjoint-based airfoil shape optimization in transonic flow

    NASA Astrophysics Data System (ADS)

    Gramanzini, Joe-Ray

    The primary focus of this work is efficient aerodynamic shape optimization in transonic flow. Adjoint-based optimization techniques are employed on airfoil sections and evaluated in terms of computational accuracy as well as efficiency. This study examines two test cases proposed by the AIAA Aerodynamic Design Optimization Discussion Group. The first is a two-dimensional, transonic, inviscid, non-lifting optimization of a Modified-NACA 0012 airfoil. The second is a two-dimensional, transonic, viscous optimization problem using a RAE 2822 airfoil. The FUN3D CFD code of NASA Langley Research Center is used as the flow solver for the gradient-based optimization cases. Two shape parameterization techniques are employed to study the effect of the parameterization and the number of design variables on the final optimized shape: Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD) and the BandAids free-form deformation technique. For the two airfoil cases, angle of attack is treated as a global design variable. The thickness and camber distributions are the local design variables for MASSOUD, and selected airfoil surface grid points are the local design variables for BandAids. Using the MASSOUD technique, a drag reduction of 72.14% is achieved for the NACA 0012 case, reducing the total number of drag counts from 473.91 to 130.59. Employing the BandAids technique yields a 78.67% drag reduction, from 473.91 to 99.98. The RAE 2822 case exhibited a drag reduction from 217.79 to 132.79 counts, a 39.05% decrease using BandAids.

  8. Photon Optimizer (PO) prevails over Progressive Resolution Optimizer (PRO) for VMAT planning with or without knowledge-based solution.

    PubMed

    Jiang, Fan; Wu, Hao; Yue, Haizhen; Jia, Fei; Zhang, Yibao

    2017-03-01

    The enhanced dosimetric performance of knowledge-based volumetric modulated arc therapy (VMAT) planning might be jointly contributed by the patient-specific optimization objectives, as estimated by the RapidPlan model, and by the Photon Optimizer (PO) algorithm, which is potentially improved over the previous Progressive Resolution Optimizer (PRO) engine. As PO is mandatory for RapidPlan estimation but optional for conventional manual planning, comparing the two optimizers may provide practical guidelines for algorithm selection, because knowledge-based planning may not completely replace the current method in the short run. Using a previously validated dose-volume histogram (DVH) estimation model, which can automatically produce clinically acceptable plans for rectal cancer patients without interactive manual adjustment, this study reoptimized 30 historically approved plans (referred to as clinical plans, created manually with PRO) with the RapidPlan solution (PO plans). The PRO algorithm was then used to optimize the plans again using the same dose-volume constraints as the PO plans, where the line objectives were automatically converted into a series of point objectives (PRO plans). On the basis of comparable target dose coverage, the combined application of the new objectives and the PO algorithm significantly reduced the organs-at-risk (OAR) exposure by 23.49-32.72% compared with the clinical plans. These differences were largely preserved after substituting PRO for PO, indicating that the dosimetric improvements were mostly attributable to the refined objectives. Therefore, Eclipse users of earlier versions may benefit immediately from adopting model-generated objectives from other RapidPlan-equipped centers, even with the PRO algorithm. However, the additional contribution of PO relative to PRO accounted for 1.54-3.74%, suggesting that PO should be selected with priority whenever available, with or without the RapidPlan solution as a purchasable package. Significantly

  9. The Integrated Medical Model - Optimizing In-flight Space Medical Systems to Reduce Crew Health Risk and Mission Impacts

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Walton, Marlei; Minard, Charles; Saile, Lynn; Myers, Jerry; Butler, Doug; Lyengar, Sriram; Fitts, Mary; Johnson-Throop, Kathy

    2009-01-01

    The Integrated Medical Model (IMM) is a decision support tool used by medical system planners and designers as they prepare for exploration planning activities of the Constellation program (CxP). IMM provides an evidence-based approach to help optimize the allocation of in-flight medical resources for a specified level of risk within spacecraft operational constraints. Eighty medical conditions and associated resources are represented in IMM. Nine conditions are due to Space Adaptation Syndrome. The IMM helps answer fundamental medical mission planning questions such as "What medical conditions can be expected?", "What type and quantity of medical resources are most likely to be used?", and "What is the probability of crew death or evacuation due to medical events?" For a specified mission and crew profile, the IMM effectively characterizes the sequence of events that could potentially occur should a medical condition happen. The mathematical relationships among mission and crew attributes, medical conditions and incidence data, in-flight medical resources, and potential clinical and crew health end states are established to generate end state probabilities. A Monte Carlo computational method is used to determine the probable outcomes and requires up to 25,000 mission trials to reach convergence. For each mission trial, the pharmaceuticals and supplies required to diagnose and treat prevalent medical conditions are tracked and decremented. The uncertainty of patient response to treatment is bounded via a best-case, worst-case, untreated-case algorithm. A Crew Health Index (CHI) metric, developed to account for functional impairment due to a medical condition, provides a quantified measure of risk and enables risk comparisons across mission scenarios. The use of historical in-flight medical data, terrestrial surrogate data as appropriate, and space medicine subject matter expertise has enabled the development of a probabilistic, stochastic decision support tool capable of
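
    A highly simplified sketch of this kind of Monte Carlo resource simulation is shown below; the condition list, probabilities, supply counts, and the single "untreated" outcome are invented for illustration and are not the IMM's eighty-condition model or its best-case/worst-case/untreated algorithm.

        import random

        def simulate_mission(conditions, kit, trials=25000, seed=0):
            # conditions: list of (name, per-mission probability, units of supplies needed)
            # kit: dict of starting supply units per condition name. This is a toy model;
            # the real IMM tracks pharmaceuticals/supplies per condition and multiple
            # clinical end states.
            rng = random.Random(seed)
            untreated = 0
            for _ in range(trials):
                stock = dict(kit)
                for name, p, need in conditions:
                    if rng.random() < p:
                        if stock.get(name, 0) >= need:
                            stock[name] -= need
                        else:
                            untreated += 1
                            break
            return untreated / trials

        conds = [("nausea", 0.30, 2), ("laceration", 0.10, 1), ("infection", 0.05, 3)]
        print(simulate_mission(conds, kit={"nausea": 4, "laceration": 2, "infection": 3}))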

  10. Cultural Effects on Cancer Prevention Behaviors: Fatalistic Cancer Beliefs and Risk Optimism Among Asians in Singapore.

    PubMed

    Kim, Hye Kyung; Lwin, May O

    2016-09-09

    Although culture is acknowledged as an important factor that influences health, little is known about cultural differences pertaining to cancer-related beliefs and prevention behaviors. This study examines two culturally influenced beliefs, fatalistic beliefs about cancer prevention and optimistic beliefs about cancer risk, to identify reasons for cultural disparity in the engagement of cancer prevention behaviors. We utilized data from national surveys of European Americans in the United States (Health Information National Trends Survey 4, Cycle 3; N = 1,139) and Asians in Singapore (N = 1,200) to make cultural comparisons. The odds of an Asian adhering to prevention recommendations were less than half the odds of a European American, with the exception of smoking avoidance. Compared to European Americans, Asians were more optimistic about their cancer risk both in an absolute and a comparative sense, and held stronger fatalistic beliefs about cancer prevention. Mediation analyses revealed that fatalistic beliefs and absolute risk optimism among Asians partially explain their lower engagement in prevention behaviors, whereas comparative risk optimism increases their likelihood of adhering to prevention behaviors. Our findings underscore the need for developing culturally targeted interventions in communicating cancer causes and prevention.

  11. On the Integration of Risk Aversion and Average-Performance Optimization in Reservoir Control

    NASA Astrophysics Data System (ADS)

    Nardini, Andrea; Piccardi, Carlo; Soncini-Sessa, Rodolfo

    1992-02-01

    The real-time operation of a reservoir is a matter of trade-off between the two criteria of risk aversion (to avoid dramatic failures) and average-performance optimization (to yield the best long-term average performance). A methodology taking into account both criteria is presented in this paper to derive "off-line" infinite-horizon control policies for a single multipurpose reservoir, where the management goals are water supply and flood control. According to this methodology, the reservoir control policy is derived in two steps: first, a (min-max) risk aversion problem is formulated, whose solution is not unique, but rather a whole set of policies, all equivalent from the point of view of the risk-aversion objectives. Second, a stochastic average-performance optimization problem is solved, to select from the set previously obtained the best policy from the point of view of the average-performance objectives. The methodology has several interesting features: the min-max (or "guaranteed performance") approach, which is particularly suited whenever "weak" users are affected by the consequences of the decision-making process; the flexible definition of a "risk aversion degree," by the selection of those inflow sequences which are particularly feared; and the two-objective analysis which provides the manager with a whole set of alternatives among which he (she) will select the one that yields the desired trade-off between the management goals.

  12. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    12 CFR 652.70 (Banks and Banking; Farm Credit Administration; Farm Credit System; Federal Agricultural...; 2010-01-01): ... risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk. ...

  13. Approximate dynamic programming based solutions for fixed-final-time optimal control and optimal switching

    NASA Astrophysics Data System (ADS)

    Heydari, Ali

    Optimal solutions with neural networks (NN) based on an approximate dynamic programming (ADP) framework for new classes of engineering and non-engineering problems and associated difficulties and challenges are investigated in this dissertation. In the enclosed eight papers, the ADP framework is utilized for solving fixed-final-time problems (also called terminal control problems) and problems with switching nature. An ADP based algorithm is proposed in Paper 1 for solving fixed-final-time problems with soft terminal constraint, in which a single neural network with a single set of weights is utilized. Paper 2 investigates fixed-final-time problems with hard terminal constraints. The optimality analysis of the ADP based algorithm for fixed-final-time problems is the subject of Paper 3, in which it is shown that the proposed algorithm leads to the global optimal solution provided certain conditions hold. Afterwards, the developments in Papers 1 to 3 are used to tackle a more challenging class of problems, namely, optimal control of switching systems. This class of problems is divided into problems with fixed mode sequence (Papers 4 and 5) and problems with free mode sequence (Papers 6 and 7). Each of these two classes is further divided into problems with autonomous subsystems (Papers 4 and 6) and problems with controlled subsystems (Papers 5 and 7). Different ADP-based algorithms are developed and proofs of convergence of the proposed iterative algorithms are presented. Moreover, an extension to the developments is provided for online learning of the optimal switching solution for problems with modeling uncertainty in Paper 8. Each of the theoretical developments is numerically analyzed using different real-world or benchmark problems.

  14. Breast Cancer Screening in the Precision Medicine Era: Risk-Based Screening in a Population-Based Trial.

    PubMed

    Shieh, Yiwey; Eklund, Martin; Madlensky, Lisa; Sawyer, Sarah D; Thompson, Carlie K; Stover Fiscalini, Allison; Ziv, Elad; Van't Veer, Laura J; Esserman, Laura J; Tice, Jeffrey A

    2017-01-01

    Ongoing controversy over the optimal approach to breast cancer screening has led to discordant professional society recommendations, particularly in women age 40 to 49 years. One potential solution is risk-based screening, where decisions around the starting age, stopping age, frequency, and modality of screening are based on individual risk to maximize the early detection of aggressive cancers and minimize the harms of screening through optimal resource utilization. We present a novel approach to risk-based screening that integrates clinical risk factors, breast density, a polygenic risk score representing the cumulative effects of genetic variants, and sequencing for moderate- and high-penetrance germline mutations. We demonstrate how thresholds of absolute risk estimates generated by our prediction tools can be used to stratify women into different screening strategies (biennial mammography, annual mammography, annual mammography with adjunctive magnetic resonance imaging, defer screening at this time) while informing the starting age of screening for women age 40 to 49 years. Our risk thresholds and corresponding screening strategies are based on current evidence but need to be tested in clinical trials. The Women Informed to Screen Depending On Measures of risk (WISDOM) Study, a pragmatic, preference-tolerant randomized controlled trial of annual vs personalized screening, will study our proposed approach. WISDOM will evaluate the efficacy, safety, and acceptability of risk-based screening beginning in the fall of 2016. The adaptive design of this trial allows continued refinement of our risk thresholds as the trial progresses, and we discuss areas where we anticipate emerging evidence will impact our approach.

  15. ODVBA: Optimally-Discriminative Voxel-Based Analysis

    PubMed Central

    Davatzikos, Christos

    2012-01-01

    Gaussian smoothing of images prior to applying voxel-based statistics is an important step in Voxel-Based Analysis and Statistical Parametric Mapping (VBA-SPM), and is used to account for registration errors, to Gaussianize the data, and to integrate imaging signals from a region around each voxel. However, it has also become a limitation of VBA-SPM based methods, since it is often chosen empirically and lacks spatial adaptivity to the shape and spatial extent of the region of interest, such as a region of atrophy or functional activity. In this paper, we propose a new framework, named Optimally-Discriminative Voxel-Based Analysis (ODVBA), for determining the optimal spatially adaptive smoothing of images, followed by applying voxel-based group analysis. In ODVBA, Nonnegative Discriminative Projection is applied regionally to get the direction that best discriminates between two groups, e.g., patients and controls; this direction is equivalent to local filtering by an optimal kernel whose coefficients define the optimally discriminative direction. By considering all the neighborhoods that contain a given voxel, we then compose this information to produce the statistic for each voxel. Finally, permutation tests are used to obtain a statistical parametric map of group differences. ODVBA has been evaluated using simulated data in which the ground truth is known and with data from an Alzheimer’s disease (AD) study. The experimental results have shown that the proposed ODVBA can precisely describe the shape and location of structural abnormality. PMID:21324774

  16. Constraint-based Hybrid Cellular Automaton Topology Optimization for Advanced Lightweight Blast Resistant Structure Development

    DTIC Science & Technology

    2011-11-01

    [Only report front matter is available for this record: table-of-contents and figure-list fragments covering Section 1.3 Hybrid Cellular Automata (HCA), Figure 1 (hybrid cellular automata based topology optimization example), Figure 2 (topometry optimization), and Figure 4 (hybrid cellular automata based topology optimization flowchart).]

  17. Neural network based decomposition in optimal structural synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, P.; Berke, L.

    1992-01-01

    The present paper describes potential applications of neural networks in the multilevel decomposition based optimal design of structural systems. The generic structural optimization problem of interest, if handled as a single problem, results in a large dimensionality problem. Decomposition strategies allow for this problem to be represented by a set of smaller, decoupled problems, for which solutions may either be obtained with greater ease or may be obtained in parallel. Neural network models derived through supervised training, are used in two distinct modes in this work. The first uses neural networks to make available efficient analysis models for use in repetitive function evaluations as required by the optimization algorithm. In the second mode, neural networks are used to represent the coupling that exists between the decomposed subproblems. The approach is illustrated by application to the multilevel decomposition-based synthesis of representative truss and frame structures.

  18. SADA: Ecological Risk Based Decision Support System for Selective Remediation

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is freeware that implements terrestrial ecological risk assessment and yields a selective remediation design using its integral geographical information system, based on ecological and risk assessment inputs. Selective remediation ...

  19. Risk perception, risk evaluation and human values: cognitive bases of acceptability of a radioactive waste repository

    SciTech Connect

    Earle, T.C.; Lindell, M.K.; Rankin, W.L.

    1981-07-01

    Public acceptance of radioactive waste management alternatives depends in part on public perception of the associated risks. Three aspects of those perceived risks were explored in this study: (1) synthetic measures of risk perception based on judgments of probability and consequences; (2) acceptability of hypothetical radioactive waste policies; and (3) effects of human values on risk perception. Both the work on synthetic measures of risk perception and on the acceptability of hypothetical policies included investigations of three categories of risk: (1) short-term public risk (affecting persons living when the wastes are created), (2) long-term public risk (affecting persons living after the time the wastes were created), and (3) occupational risk (affecting persons working with the radioactive wastes). The human values work related to public risk perception in general, across categories of persons affected. Respondents were selected according to a purposive sampling strategy.

  20. Risk perception, risk evaluation and human values: Cognitive bases acceptability of a radioactive waste repository

    NASA Astrophysics Data System (ADS)

    Earle, T. C.; Lindell, M. K.; Rankin, W. L.; Nealey, S. M.

    1981-07-01

    Public acceptance of radioactive waste management alternatives depends in part on public perception of the associated risks. Three aspects of those perceived risks were explored: (1) synthetic measures of risk perception based on judgments of probability and consequences; (2) acceptability of hypothetical radioactive waste policies; and (3) effects of human values on risk perception. Both the work on synthetic measures of risk perception and on the acceptability of hypothetical policies included investigations of three categories of risk: short term public risk (affecting persons living when the wastes are created), long term public risk (affecting persons living after the time the wastes were created), and occupational risk (affecting persons working with the radioactive wastes). The human values work related to public risk perception in general, across categories of persons affected. Respondents were selected according to a purposive sampling strategy.

  1. Information fusion based optimal control for large civil aircraft system.

    PubMed

    Zhen, Ziyang; Jiang, Ju; Wang, Xinhua; Gao, Chen

    2015-03-01

    Wind disturbance has a great influence on the landing safety of Large Civil Aircraft. Through simulation research and engineering experience, it can be found that PID control is not good enough to solve the problem of restraining the wind disturbance. This paper focuses on anti-wind attitude control for Large Civil Aircraft in the landing phase. In order to improve ride comfort and flight safety, an information fusion based optimal control strategy is presented to restrain the wind in the landing phase while maintaining attitudes and airspeed. Data from the Boeing 707 are used to establish a full-variable nonlinear model of the Large Civil Aircraft, from which two linear models are obtained, divided into longitudinal and lateral equations. Based on engineering experience, the longitudinal channel adopts PID control and C inner control to keep the longitudinal attitude constant and applies an autothrottle system to keep the airspeed constant, while an information fusion based optimal regulator in the lateral control channel is designed to achieve lateral attitude holding. According to information fusion estimation, by fusing the hard constraint information of the system dynamic equations and the soft constraint information of the performance index function, an optimal estimate of the control sequence is derived. Based on this, an information fusion state regulator is deduced for discrete-time linear systems with disturbance. The simulation results for the nonlinear aircraft model indicate that the information fusion optimal control is better than traditional PID control, LQR control, and LQR control with integral action in anti-wind disturbance performance in the landing phase.
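
    As a point of reference for the LQR baselines mentioned in the comparison, the sketch below computes finite-horizon state-feedback gains by a backward Riccati recursion; the toy lateral model and weights are illustrative assumptions, and the paper's information fusion derivation is not reproduced here.

        import numpy as np

        def finite_horizon_lqr(A, B, Q, R, N):
            # Backward Riccati recursion for x_{k+1} = A x_k + B u_k minimizing
            # sum_k (x'Qx + u'Ru); returns the time-varying feedback gains K_0..K_{N-1}.
            P = Q.copy()
            gains = []
            for _ in range(N):
                K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
                P = Q + A.T @ P @ (A - B @ K)
                gains.append(K)
            return gains[::-1]

        # toy 2-state lateral model, illustrative numbers only
        A = np.array([[1.0, 0.1], [0.0, 0.98]])
        B = np.array([[0.0], [0.05]])
        K = finite_horizon_lqr(A, B, Q=np.eye(2), R=np.array([[0.1]]), N=50)
        print(K[0])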

  2. Optimal Test Design with Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.

    2013-01-01

    Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…

  3. Optimized Color Filter Arrays for Sparse Representation Based Demosaicking.

    PubMed

    Li, Jia; Bai, Chenyan; Lin, Zhouchen; Yu, Jian

    2017-03-08

    Demosaicking is the problem of reconstructing a color image from the raw image captured by a digital color camera that covers its only imaging sensor with a color filter array (CFA). Sparse representation based demosaicking has been shown to produce superior reconstruction quality. However, almost all existing algorithms in this category use the CFAs which are not specifically optimized for the algorithms. In this paper, we consider optimally designing CFAs for sparse representation based demosaicking, where the dictionary is well-chosen. The fact that CFAs correspond to the projection matrices used in compressed sensing inspires us to optimize CFAs via minimizing the mutual coherence. This is more challenging than that for traditional projection matrices because CFAs have physical realizability constraints. However, most of the existing methods for minimizing the mutual coherence require that the projection matrices should be unconstrained, making them inapplicable for designing CFAs. We consider directly minimizing the mutual coherence with the CFA's physical realizability constraints as a generalized fractional programming problem, which needs to find sufficiently accurate solutions to a sequence of nonconvex nonsmooth minimization problems. We adapt the redistributed proximal bundle method to address this issue. Experiments on benchmark images testify to the superiority of the proposed method. In particular, we show that a simple sparse representation based demosaicking algorithm with our specifically optimized CFA can outperform LSSC [1]. To the best of our knowledge, it is the first sparse representation based demosaicking algorithm that beats LSSC in terms of CPSNR.
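
    The mutual coherence being minimized can be computed as the largest absolute inner product between distinct normalized columns of the product of the projection matrix and the dictionary, as sketched below; the random matrices stand in for a CFA-derived projection and a learned dictionary, and the physical realizability constraints discussed in the paper are omitted.

        import numpy as np

        def mutual_coherence(P, D):
            # Columns of M = P @ D are normalized; coherence is the largest off-diagonal
            # absolute inner product (the quantity minimized when designing the projection).
            M = P @ D
            M = M / np.linalg.norm(M, axis=0, keepdims=True)
            G = np.abs(M.T @ M)
            np.fill_diagonal(G, 0.0)
            return G.max()

        rng = np.random.default_rng(0)
        P = rng.random((16, 48))           # stand-in for a CFA-derived projection (unconstrained here)
        D = rng.standard_normal((48, 96))  # stand-in dictionary
        print(mutual_coherence(P, D))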

  4. Surrogate based wind farm layout optimization using manifold mapping

    NASA Astrophysics Data System (ADS)

    Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester

    2016-09-01

    The high computational cost associated with high-fidelity wake models such as RANS or LES is a primary bottleneck to performing direct high-fidelity wind farm layout optimization (WFLO) using accurate CFD based wake models. Therefore, a surrogate based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using an SBO method referred to as manifold mapping (MM). As a verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate based methodology, and the performance was compared with that of direct optimization using the high fidelity model. A significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optimum as direct high fidelity optimization. The similarity between the responses of the models and the number and position of the mapping points strongly influence the computational efficiency of the proposed method. As a proof of concept, a realistic WFLO of a small 7-turbine wind farm is performed using the proposed surrogate based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optimum as the fine model with far fewer fine-model simulations.

  5. Comparative Pessimism or Optimism: Depressed Mood, Risk-Taking, Social Utility and Desirability.

    PubMed

    Milhabet, Isabelle; Le Barbenchon, Emmanuelle; Cambon, Laurent; Molina, Guylaine

    2015-03-05

    Comparative optimism can be defined as a self-serving, asymmetric judgment of the future. It is often thought to be beneficial and socially accepted, whereas comparative pessimism is correlated with depression and socially rejected. Our goal was to examine the social acceptance of comparative optimism and the social rejection of comparative pessimism in two dimensions of social judgment, social desirability and social utility, considering the attributions of dysphoria and risk-taking potential (studies 2 and 3) to outlooks on the future. In three experiments, the participants assessed either one (study 1) or several (studies 2 and 3) fictional targets in two dimensions, social utility and social desirability. Targets exhibiting comparatively optimistic or pessimistic outlooks on the future were presented as non-depressed, depressed, or neither (control condition) (study 1); non-depressed or depressed (study 2); and non-depressed or in the control condition (study 3). Two significant results were obtained: (1) comparative pessimism was socially rejected on the social desirability dimension, which can be explained by its depressive feature; and (2) comparative optimism was socially accepted on the social utility dimension, which can be explained by the perception that comparatively optimistic individuals are potential risk-takers.

  6. Parallel Harmony Search Based Distributed Energy Resource Optimization

    SciTech Connect

    Ceylan, Oguzhan; Liu, Guodong; Tomsovic, Kevin

    2015-01-01

    This paper presents a harmony search based parallel optimization algorithm to minimize voltage deviations in three-phase unbalanced electrical distribution systems and to maximize the active power outputs of distributed energy resources (DR). The main contribution is to reduce the adverse impacts on the voltage profile as photovoltaic (PV) output or electric vehicle (EV) charging changes throughout a day. The IEEE 123-bus distribution test system is modified by adding DRs and EVs under different load profiles. The simulation results show that by using parallel computing techniques, heuristic methods may be used as an alternative optimization tool in electrical power distribution systems operation.
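
    A minimal serial harmony search loop, with memory consideration, pitch adjustment, and random selection, is sketched below; the parameter values and the toy objective standing in for the voltage-deviation criterion are assumptions, and the parallelization and three-phase distribution model of the paper are not reproduced.

        import numpy as np

        def harmony_search(obj, lb, ub, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=500, seed=0):
            rng = np.random.default_rng(seed)
            dim = len(lb)
            hm = rng.uniform(lb, ub, (hms, dim))              # harmony memory
            fit = np.array([obj(h) for h in hm])
            for _ in range(iters):
                new = np.empty(dim)
                for j in range(dim):
                    if rng.random() < hmcr:                   # memory consideration
                        new[j] = hm[rng.integers(hms), j]
                        if rng.random() < par:                # pitch adjustment
                            new[j] += bw * (ub[j] - lb[j]) * rng.uniform(-1, 1)
                    else:                                     # random selection
                        new[j] = rng.uniform(lb[j], ub[j])
                new = np.clip(new, lb, ub)
                f = obj(new)
                worst = np.argmax(fit)
                if f < fit[worst]:                            # replace the worst harmony
                    hm[worst], fit[worst] = new, f
            return hm[np.argmin(fit)], fit.min()

        # toy objective standing in for voltage-deviation minimization
        x, f = harmony_search(lambda v: np.sum((v - 0.5) ** 2),
                              lb=np.zeros(5), ub=np.ones(5))
        print(f)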

  7. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to the stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traced out an inverted-S-shaped graph. The center of the inverted-S graph corresponded to 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponds to unity for reliability p (or p = 1). Weight can be reduced to a small value for the most failure-prone design with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas it can be reduced to a small extent for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code was the deterministic analysis tool, (2) the fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  8. The effect of target group size on risk judgments and comparative optimism: the more, the riskier.

    PubMed

    Price, Paul C; Smith, Andrew R; Lench, Heather C

    2006-03-01

    In 5 experiments, college students exhibited a group size effect on risk judgments. As the number of individuals in a target group increased, so did participants' judgments of the risk of the average member of the group for a variety of negative life events. This happened regardless of whether the stimuli consisted of photographs of real peers or stick-figure representations of peers. As a result, the degree to which participants exhibited comparative optimism (i.e., judged themselves to be at lower risk than their peers) also increased as the size of the comparison group increased. These results suggest that the typical comparative optimism effect reported so often in the literature might be, at least in part, a group size effect. Additional results include a group size effect on judgments of the likelihood that the average group member will experience positive and neutral events and a group size effect on perceptual judgments of the heights of stick figures. These latter results, in particular, support the existence of a simple, general cognitive mechanism that integrates stimulus numerosity into quantitative judgments about that stimulus.

  9. Model-based optimal planning of hepatic radiofrequency ablation.

    PubMed

    Chen, Qiyong; Müftü, Sinan; Meral, Faik Can; Tuncali, Kemal; Akçakaya, Murat

    2016-07-19

    This article presents a model-based pre-treatment optimal planning framework for hepatic tumour radiofrequency (RF) ablation. Conventional hepatic RF ablation methods rely on pre-specified input voltage and treatment length based on the tumour size. Using these experimentally obtained pre-specified treatment parameters in RF ablation is not optimal for achieving the expected level of cell death and usually results in more healthy tissue damage than desired. In this study we present a pre-treatment planning framework that provides tools to control the levels of both healthy tissue preservation and tumour cell death. Over the geometry of the tumour and surrounding tissue, we formulate RF ablation planning as a constrained optimization problem. With specific constraints over the temperature profile (TP) in pre-determined areas of the target geometry, we consider two different cost functions based on the history of the TP and the Arrhenius index (AI) of the target location, respectively. We optimally compute the input voltage variation to minimize the damage to the healthy tissue while ensuring complete cell death in the tumour and the immediate area covering the tumour. As an example, we use a simulation of a 1D symmetric target geometry mimicking the application of a single-electrode RF probe. Results demonstrate that compared to the conventional methods both cost functions improve healthy tissue preservation.
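
    The Arrhenius index used as one of the cost functions is the standard thermal damage integral, Omega(t) = integral of A*exp(-Ea/(R*T(tau))) dtau; the sketch below evaluates it numerically for a toy temperature history, with frequency factor and activation energy values that are commonly quoted for liver tissue but serve only as illustrative defaults here rather than the paper's coefficients.

        import numpy as np

        def arrhenius_index(T_kelvin, dt, A=7.39e39, Ea=2.577e5, R=8.314):
            # Omega(t) = sum of A * exp(-Ea / (R * T(tau))) * dt over the heating history;
            # Omega >= 1 is a common threshold for complete cell death. A and Ea are
            # frequently quoted liver values, used only as illustrative defaults.
            return np.sum(A * np.exp(-Ea / (R * np.asarray(T_kelvin))) * dt)

        # toy temperature history: 60 s held at 328.15 K (55 degC), sampled every second
        temps = np.full(60, 328.15)
        print(arrhenius_index(temps, dt=1.0))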

  10. Reliability-based analysis and design optimization for durability

    NASA Astrophysics Data System (ADS)

    Choi, Kyung K.; Youn, Byeng D.; Tang, Jun; Hardee, Edward

    2005-05-01

    In the Army, mechanical fatigue subject to external and inertial transient loads during the service life of mechanical systems often leads to structural failure due to accumulated damage. Structural durability analysis, which predicts the fatigue life of mechanical components subject to dynamic stresses and strains, is a compute-intensive multidisciplinary simulation process, since it requires the integration of several computer-aided engineering tools and considerable data communication and computation. Uncertainties in geometric dimensions due to manufacturing tolerances make the fatigue life of a mechanical component indeterministic. Because uncertainty propagation to structural fatigue under transient dynamic loading is not only numerically complicated but also extremely computationally expensive, it is a challenging task to develop a structural durability-based design optimization process and a reliability analysis to ascertain whether the optimal design is reliable. The objective of this paper is to demonstrate an integrated CAD-based computer-aided engineering process to effectively carry out design optimization for structural durability, yielding a durable and cost-effectively manufacturable product. This paper shows preliminary results of reliability-based durability design optimization for the Army Stryker A-Arm.

  11. Bare-Bones Teaching-Learning-Based Optimization

    PubMed Central

    Zou, Feng; Wang, Lei; Hei, Xinhong; Chen, Debao; Jiang, Qiaoyong; Li, Hongye

    2014-01-01

    The teaching-learning-based optimization (TLBO) algorithm, which simulates the teaching-learning process of the classroom, is one of the recently proposed swarm intelligence (SI) algorithms. In this paper, a new TLBO variant called bare-bones teaching-learning-based optimization (BBTLBO) is presented to solve global optimization problems. In this method, each learner in the teacher phase employs an interactive learning strategy, which is a hybridization of the teacher-phase learning strategy of the standard TLBO and Gaussian sampling learning based on neighborhood search, and each learner in the learner phase employs either the learner-phase learning strategy of the standard TLBO or the new neighborhood search strategy. To verify the performance of our approaches, 20 benchmark functions and two real-world problems are utilized. The conducted experiments show that BBTLBO performs significantly better than, or at least comparably to, TLBO and some existing bare-bones algorithms. The results indicate that the proposed algorithm is competitive with some other optimization algorithms. PMID:25013844
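
    For orientation, the sketch below shows the teacher and learner phases of the standard TLBO that BBTLBO modifies; the Gaussian sampling and neighborhood search strategies introduced by BBTLBO are not reproduced, and the test objective and parameters are illustrative assumptions.

        import numpy as np

        def tlbo(obj, lb, ub, pop_size=20, iters=100, seed=0):
            rng = np.random.default_rng(seed)
            dim = len(lb)
            X = rng.uniform(lb, ub, (pop_size, dim))
            f = np.array([obj(x) for x in X])
            for _ in range(iters):
                # Teacher phase: move the class toward the best learner (the "teacher").
                teacher = X[np.argmin(f)]
                TF = rng.integers(1, 3)                       # teaching factor in {1, 2}
                X_new = np.clip(X + rng.random((pop_size, dim)) * (teacher - TF * X.mean(axis=0)), lb, ub)
                f_new = np.array([obj(x) for x in X_new])
                better = f_new < f
                X[better], f[better] = X_new[better], f_new[better]
                # Learner phase: each learner interacts with a randomly chosen peer.
                for i in range(pop_size):
                    j = rng.integers(pop_size)
                    if j == i:
                        continue
                    step = (X[i] - X[j]) if f[i] < f[j] else (X[j] - X[i])
                    cand = np.clip(X[i] + rng.random(dim) * step, lb, ub)
                    fc = obj(cand)
                    if fc < f[i]:
                        X[i], f[i] = cand, fc
            return X[np.argmin(f)], f.min()

        x, fx = tlbo(lambda v: np.sum(v ** 2), lb=-5 * np.ones(10), ub=5 * np.ones(10))
        print(fx)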

  12. Bare-bones teaching-learning-based optimization.

    PubMed

    Zou, Feng; Wang, Lei; Hei, Xinhong; Chen, Debao; Jiang, Qiaoyong; Li, Hongye

    2014-01-01

    The teaching-learning-based optimization (TLBO) algorithm, which simulates the teaching-learning process of the classroom, is one of the recently proposed swarm intelligence (SI) algorithms. In this paper, a new TLBO variant called bare-bones teaching-learning-based optimization (BBTLBO) is presented to solve global optimization problems. In this method, each learner in the teacher phase employs an interactive learning strategy, which is a hybridization of the teacher-phase learning strategy of the standard TLBO and Gaussian sampling learning based on neighborhood search, and each learner in the learner phase employs either the learner-phase learning strategy of the standard TLBO or the new neighborhood search strategy. To verify the performance of our approaches, 20 benchmark functions and two real-world problems are utilized. The conducted experiments show that BBTLBO performs significantly better than, or at least comparably to, TLBO and some existing bare-bones algorithms. The results indicate that the proposed algorithm is competitive with some other optimization algorithms.

  13. Innovative practice model to optimize resource utilization and improve access to care for high-risk and BRCA+ patients

    PubMed Central

    Head, Linden; Nessim, Carolyn; Boyd, Kirsty Usher

    2017-01-01

    Background Bilateral prophylactic mastectomy (BPM) has shown breast cancer risk reduction in high-risk/BRCA+ patients. However, priority of active cancers coupled with inefficient use of operating room (OR) resources presents challenges in offering BPM in a timely manner. To address these challenges, a rapid access prophylactic mastectomy and immediate reconstruction (RAPMIR) program was innovated. The purpose of this study was to evaluate RAPMIR with regards to access to care and efficiency. Methods We retrospectively reviewed the cases of all high-risk/BRCA+ patients having had BPM between September 2012 and August 2014. Patients were divided into 2 groups: those managed through the traditional model and those managed through the RAPMIR model. RAPMIR leverages 2 concurrently running ORs with surgical oncology and plastic surgery moving between rooms to complete 3 combined BPMs with immediate reconstruction in addition to 1–2 independent cases each operative day. RAPMIR eligibility criteria included high-risk/BRCA+ status; BPM with immediate, implant-based reconstruction; and day surgery candidacy. Wait times, case volumes and patient throughput were measured and compared. Results There were 16 traditional patients and 13 RAPMIR patients. Mean wait time (days from referral to surgery) for RAPMIR was significantly shorter than for the traditional model (165.4 v. 309.2 d, p = 0.027). Daily patient throughput (4.3 v. 2.8), plastic surgery case volume (3.7 v. 1.6) and surgical oncology case volume (3.0 v. 2.2) were significantly greater in the RAPMIR model than the traditional model (p = 0.003, p < 0.001 and p = 0.015, respectively). Conclusion A multidisciplinary model with optimized scheduling has the potential to improve access to care and optimize resource utilization. PMID:28234588

  14. Methodology and application for health risk classification of chemicals in foods based on risk matrix.

    PubMed

    Zhou, Ping Ping; Liu, Zhao Ping; Zhang, Lei; Liu, Ai Dong; Song, Yan; Yong, Ling; Li, Ning

    2014-11-01

    The method has been developed to accurately identify the magnitude of health risks and to provide scientific evidence for the implementation of risk management in food safety. It combines two parameters, the consequence and the likelihood of adverse effects, based on a risk matrix. Score definitions and classifications for the consequence and the likelihood of adverse effects are proposed. The risk score at the intersection of consequence and likelihood in the risk matrix represents the health risk level, indicated with different colors: 'low', 'medium', and 'high'. Its use in an actual case is shown.
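
    A minimal sketch of a consequence-by-likelihood risk matrix lookup with the three levels named in the abstract follows; the 1-5 scoring and the band boundaries are invented for illustration and are not the score definitions proposed in the paper.

        def classify_risk(consequence, likelihood):
            # consequence and likelihood scored 1 (lowest) to 5 (highest); their product is
            # mapped to the three levels used in the abstract. Band boundaries are illustrative.
            score = consequence * likelihood
            if score <= 6:
                return "low"
            if score <= 14:
                return "medium"
            return "high"

        print(classify_risk(consequence=4, likelihood=2))   # -> "medium"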

  15. Data-based robust multiobjective optimization of interconnected processes: energy efficiency case study in papermaking.

    PubMed

    Afshar, Puya; Brown, Martin; Maciejowski, Jan; Wang, Hong

    2011-12-01

    Reducing energy consumption is a major challenge for "energy-intensive" industries such as papermaking. A commercially viable energy saving solution is to employ data-based optimization techniques to obtain a set of "optimized" operational settings that satisfy certain performance indices. The difficulties of this are: 1) the problems of this type are inherently multicriteria in the sense that improving one performance index might result in compromising the other important measures; 2) practical systems often exhibit unknown complex dynamics and several interconnections which make the modeling task difficult; and 3) as the models are acquired from the existing historical data, they are valid only locally and extrapolations incorporate risk of increasing process variability. To overcome these difficulties, this paper presents a new decision support system for robust multiobjective optimization of interconnected processes. The plant is first divided into serially connected units to model the process, product quality, energy consumption, and corresponding uncertainty measures. Then multiobjective gradient descent algorithm is used to solve the problem in line with user's preference information. Finally, the optimization results are visualized for analysis and decision making. In practice, if further iterations of the optimization algorithm are considered, validity of the local models must be checked prior to proceeding to further iterations. The method is implemented by a MATLAB-based interactive tool DataExplorer supporting a range of data analysis, modeling, and multiobjective optimization techniques. The proposed approach was tested in two U.K.-based commercial paper mills where the aim was reducing steam consumption and increasing productivity while maintaining the product quality by optimization of vacuum pressures in forming and press sections. The experimental results demonstrate the effectiveness of the method.

  16. A Danger-Theory-Based Immune Network Optimization Algorithm

    PubMed Central

    Li, Tao; Xiao, Xin; Shi, Yuanquan

    2013-01-01

    Existing artificial immune optimization algorithms reflect a number of shortcomings, such as premature convergence and poor local search ability. This paper proposes a danger-theory-based immune network optimization algorithm, named dt-aiNet. The danger theory emphasizes that danger signals generated from changes of environments will guide different levels of immune responses, and the areas around danger signals are called danger zones. By defining the danger zone to calculate danger signals for each antibody, the algorithm adjusts antibodies' concentrations through its own danger signals and then triggers immune responses of self-regulation. So the population diversity can be maintained. Experimental results show that the algorithm has more advantages in the solution quality and diversity of the population. Compared with influential optimization algorithms, CLONALG, opt-aiNet, and dopt-aiNet, the algorithm has smaller error values and higher success rates and can find solutions to meet the accuracies within the specified function evaluation times. PMID:23483853

  17. PCNN document segmentation method based on bacterial foraging optimization algorithm

    NASA Astrophysics Data System (ADS)

    Liao, Yanping; Zhang, Peng; Guo, Qiang; Wan, Jian

    2014-04-01

    The Pulse Coupled Neural Network (PCNN) is widely used in the field of image processing, but defining its parameters properly is a difficult task in applications of PCNN; so far, determining the model parameters has required extensive experimentation. To deal with this problem, a document segmentation method based on an improved PCNN is proposed. It uses the maximum entropy function as the fitness function of a bacterial foraging optimization algorithm, adopts that algorithm to search for the optimal parameters, and thus eliminates the need to set the experimental parameters manually. Experimental results show that the proposed algorithm can effectively perform document segmentation, and the segmentation results are better than those of the comparison algorithms.
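
    The sketch below shows one common form of the maximum entropy criterion that can serve as the fitness function mentioned above, assuming a simple global gray-level threshold stands in for the PCNN firing map; the bacterial foraging search itself is omitted and the details may differ from the paper.

    ```python
    # Hedged sketch of a Kapur-style maximum entropy fitness for a candidate threshold.
    import numpy as np

    def max_entropy_fitness(image: np.ndarray, threshold: int) -> float:
        """Sum of background and foreground entropies of the binarized image."""
        hist, _ = np.histogram(image, bins=256, range=(0, 256), density=True)

        def entropy(p):
            s = p.sum()
            if s == 0:
                return 0.0
            q = p[p > 0] / s
            return float(-np.sum(q * np.log(q)))

        return entropy(hist[:threshold]) + entropy(hist[threshold:])
    ```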

  18. Reentry trajectory optimization based on a multistage pseudospectral method.

    PubMed

    Zhao, Jiang; Zhou, Rui; Jin, Xuelian

    2014-01-01

    Of the many direct numerical methods, the pseudospectral method serves as an effective tool to solve the reentry trajectory optimization for hypersonic vehicles. However, the traditional pseudospectral method is time-consuming due to the large number of discretization points. For the purpose of autonomous and adaptive reentry guidance, the research herein presents a multistage trajectory control strategy based on the pseudospectral method, capable of dealing with unexpected situations in reentry flight. The strategy typically includes two subproblems: the trajectory estimation and trajectory refining. In each processing stage, the proposed method generates a specified range of trajectory with the transition of the flight state. The full glide trajectory consists of several optimal trajectory sequences. The newly focused geographic constraints in actual flight are discussed thereafter. Numerical examples of free-space flight, target transition flight, and threat avoidance flight are used to show the feasible application of multistage pseudospectral method in reentry trajectory optimization.

  19. Reentry Trajectory Optimization Based on a Multistage Pseudospectral Method

    PubMed Central

    Zhou, Rui; Jin, Xuelian

    2014-01-01

    Of the many direct numerical methods, the pseudospectral method serves as an effective tool to solve the reentry trajectory optimization for hypersonic vehicles. However, the traditional pseudospectral method is time-consuming due to the large number of discretization points. For the purpose of autonomous and adaptive reentry guidance, the research herein presents a multistage trajectory control strategy based on the pseudospectral method, capable of dealing with unexpected situations in reentry flight. The strategy typically includes two subproblems: the trajectory estimation and trajectory refining. In each processing stage, the proposed method generates a specified range of trajectory with the transition of the flight state. The full glide trajectory consists of several optimal trajectory sequences. The newly focused geographic constraints in actual flight are discussed thereafter. Numerical examples of free-space flight, target transition flight, and threat avoidance flight are used to show the feasible application of multistage pseudospectral method in reentry trajectory optimization. PMID:24574929

  20. Voronoi Diagram Based Optimization of Dynamic Reactive Power Sources

    SciTech Connect

    Huang, Weihong; Sun, Kai; Qi, Junjian; Xu, Yan

    2015-01-01

    Dynamic var sources can effectively mitigate fault-induced delayed voltage recovery (FIDVR) issues or even voltage collapse. This paper proposes a new approach to optimization of the sizes of dynamic var sources at candidate locations by a Voronoi diagram based algorithm. It first disperses sample points of potential solutions in a searching space, evaluates a cost function at each point by barycentric interpolation for the subspaces around the point, and then constructs a Voronoi diagram about cost function values over the entire space. Accordingly, the final optimal solution can be obtained. Case studies on the WSCC 9-bus system and NPCC 140-bus system have validated that the new approach can quickly identify the boundary of feasible solutions in searching space and converge to the global optimal solution.

  1. A correlation consistency based multivariate alarm thresholds optimization approach.

    PubMed

    Gao, Huihui; Liu, Feifei; Zhu, Qunxiong

    2016-11-01

    Different alarm thresholds could generate different alarm data, resulting in different correlations. A new multivariate alarm thresholds optimization methodology based on the correlation consistency between process data and alarm data is proposed in this paper. Interpretative structural modeling is adopted to select the key variables. For the key variables, the correlation coefficients of process data are calculated by Pearson correlation analysis, while the correlation coefficients of alarm data are calculated by kernel density estimation. To ensure correlation consistency, the objective function is established as the sum of the absolute differences between these two types of correlations. The optimal thresholds are obtained using a particle swarm optimization algorithm. A case study of the Tennessee Eastman process is given to demonstrate the effectiveness of the proposed method.
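
    A minimal sketch of the correlation-consistency objective is given below. It computes Pearson correlations for the process data and for the binary alarm data generated by a candidate threshold vector, and returns the sum of absolute differences to be minimized. The paper uses kernel density estimation for the alarm-data correlations and particle swarm optimization for the search; both are simplified away here.

    ```python
    # Hedged sketch of the objective only; the PSO search over thresholds is omitted.
    import numpy as np

    def alarm_series(process: np.ndarray, thresholds: np.ndarray) -> np.ndarray:
        """Binary alarm data: 1 where a variable exceeds its threshold."""
        return (process > thresholds).astype(float)

    def correlation_consistency(process: np.ndarray, thresholds: np.ndarray) -> float:
        """Sum of |corr(process) - corr(alarms)| over all variable pairs."""
        alarms = alarm_series(process, thresholds)
        c_proc = np.corrcoef(process, rowvar=False)
        c_alarm = np.corrcoef(alarms, rowvar=False)
        return float(np.nansum(np.abs(c_proc - c_alarm)))
    ```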

  2. Vision-based stereo ranging as an optimal control problem

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Sridhar, B.; Chatterji, G. B.

    1992-01-01

    The recent interest in the use of machine vision for flight vehicle guidance is motivated by the need to automate the nap-of-the-earth flight regime of helicopters. The vision-based stereo ranging problem is cast as an optimal control problem in this paper. A quadratic performance index consisting of the integral of the error between observed image irradiances and those predicted by a Pade approximation of the correspondence hypothesis is then used to define an optimization problem. The necessary conditions for optimality yield a set of linear two-point boundary-value problems. These two-point boundary-value problems are solved in feedback form using a version of the backward sweep method. Application of the ranging algorithm is illustrated using a laboratory image pair.

  3. De-risking pharmaceutical tablet manufacture through process understanding, latent variable modeling, and optimization technologies.

    PubMed

    Muteki, Koji; Swaminathan, Vidya; Sekulic, Sonja S; Reid, George L

    2011-12-01

    In pharmaceutical tablet manufacturing processes, a major source of disturbance affecting drug product quality is the (lot-to-lot) variability of the incoming raw materials. A novel modeling and process optimization strategy that compensates for raw material variability is presented. The approach involves building partial least squares models that combine raw material attributes and tablet process parameters and relate these to final tablet attributes. The resulting models are used in an optimization framework to then find optimal process parameters which can satisfy all the desired requirements for the final tablet attributes, subject to the incoming raw material lots. In order to de-risk the potential impact of (lot-to-lot) raw material variability on drug product quality, the effect of raw material lot variability on the final tablet attributes was investigated using a raw material database containing a large number of lots. In this way, the raw material variability, optimal process parameter space and tablet attributes are correlated with each other and offer the opportunity of simulating a variety of changes in silico without actually performing experiments. The connectivity obtained between the three sources of variability (materials, parameters, attributes) can be considered a design space consistent with Quality by Design principles, which is defined by the ICH-Q8 guidance (US FDA 2006). The effectiveness of the methodologies is illustrated through a common industrial tablet manufacturing case study.
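
    For illustration, the sketch below shows the modeling step in its simplest form: a partial least squares model that maps raw-material attributes and process parameters to tablet attributes. The array shapes, variable names, and data are assumptions for the example, not the authors' dataset or model.

    ```python
    # Hedged sketch of the PLS modeling step using scikit-learn.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X_raw  = rng.normal(size=(50, 4))   # raw material attributes per lot (assumed)
    X_proc = rng.normal(size=(50, 3))   # process parameters per batch (assumed)
    Y      = rng.normal(size=(50, 2))   # tablet attributes, e.g. hardness, dissolution

    pls = PLSRegression(n_components=3)
    pls.fit(np.hstack([X_raw, X_proc]), Y)

    # For a new raw-material lot, process parameters could then be searched so that
    # the predicted tablet attributes stay within specification.
    y_pred = pls.predict(np.hstack([X_raw[:1], X_proc[:1]]))
    ```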

  4. Constraint Web Service Composition Based on Discrete Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Fang, Xianwen; Fan, Xiaoqin; Yin, Zhixiang

    Web service composition provides an open, standards-based approach for connecting web services together to create higher-level business processes. The standards are designed to reduce the complexity required to compose web services, hence reducing time and costs and increasing overall efficiency in businesses. This paper presents optimization methods for web service composition with independent global constraints, based on Discrete Particle Swarm Optimization (DPSO) and an associate Petri net (APN). Exploiting the properties of the APN, an efficient DPSO algorithm is presented that searches for a legal firing sequence in the APN model. Restricting the search to legal firing sequences of the Petri net greatly shrinks the solution space explored by the DPSO. Finally, a simulation experiment is presented to compare our methods with approximation methods. Theoretical analysis and experimental results indicate that this method achieves both lower computation cost and a higher success ratio of service composition.

  5. Nozzle Mounting Method Optimization Based on Robot Kinematic Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Chaoyue; Liao, Hanlin; Montavon, Ghislain; Deng, Sihao

    2016-08-01

    Nowadays, the application of industrial robots in thermal spray is gaining more and more importance. A desired coating quality depends on factors such as a balanced robot performance, a uniform scanning trajectory and stable parameters (e.g. nozzle speed, scanning step, spray angle, standoff distance). These factors also affect the mass and heat transfer as well as the coating formation. Thus, the kinematic optimization of all these aspects plays a key role in obtaining an optimal coating quality. In this study, the robot performance was optimized from the aspect of nozzle mounting on the robot. An optimized mounting for a type F4 nozzle was designed, starting from the conventional mounting method, from the point of view of robot kinematics, and validated on a virtual robot. Robot kinematic parameters were obtained from the simulation by offline programming software and analyzed by statistical methods. The energy consumptions of different nozzle mounting methods were also compared. The results showed that it was possible to distribute the robot motion reasonably across the axes during the process, thereby achieving a constant nozzle speed. Thus, it is possible to optimize robot performance and to economize robot energy.

  6. Component-based integration of chemistry and optimization software.

    PubMed

    Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L

    2004-11-15

    Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.

  7. Optimization Model for Web Based Multimodal Interactive Simulations.

    PubMed

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure the best possible computational resource allocation. The optimum solution is used to set rendering parameters (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.

  8. Optimization Model for Web Based Multimodal Interactive Simulations

    PubMed Central

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-01-01

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure the best possible computational resource allocation. The optimum solution is used to set rendering parameters (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach. PMID:26085713

  9. CACONET: Ant Colony Optimization (ACO) Based Clustering Algorithm for VANET

    PubMed Central

    Bajwa, Khalid Bashir; Khan, Salabat; Chaudary, Nadeem Majeed; Akram, Adeel

    2016-01-01

    A vehicular ad hoc network (VANET) is a wirelessly connected network of vehicular nodes. A number of techniques, such as message ferrying, data aggregation, and vehicular node clustering, aim to improve communication efficiency in VANETs. Cluster heads (CHs), selected in the process of clustering, manage inter-cluster and intra-cluster communication. The lifetime of clusters and the number of CHs determine the efficiency of the network. In this paper, a clustering algorithm based on Ant Colony Optimization (ACO) for VANETs (CACONET) is proposed. CACONET forms optimized clusters for robust communication. CACONET is compared empirically with state-of-the-art baseline techniques like Multi-Objective Particle Swarm Optimization (MOPSO) and Comprehensive Learning Particle Swarm Optimization (CLPSO). Experiments varying the grid size of the network, the transmission range of nodes, and the number of nodes in the network were performed to evaluate the comparative effectiveness of these algorithms. For optimized clustering, the parameters considered are the transmission range, direction and speed of the nodes. The results indicate that CACONET significantly outperforms MOPSO and CLPSO. PMID:27149517

  10. CACONET: Ant Colony Optimization (ACO) Based Clustering Algorithm for VANET.

    PubMed

    Aadil, Farhan; Bajwa, Khalid Bashir; Khan, Salabat; Chaudary, Nadeem Majeed; Akram, Adeel

    2016-01-01

    A vehicular ad hoc network (VANET) is a wirelessly connected network of vehicular nodes. A number of techniques, such as message ferrying, data aggregation, and vehicular node clustering, aim to improve communication efficiency in VANETs. Cluster heads (CHs), selected in the process of clustering, manage inter-cluster and intra-cluster communication. The lifetime of clusters and the number of CHs determine the efficiency of the network. In this paper, a clustering algorithm based on Ant Colony Optimization (ACO) for VANETs (CACONET) is proposed. CACONET forms optimized clusters for robust communication. CACONET is compared empirically with state-of-the-art baseline techniques like Multi-Objective Particle Swarm Optimization (MOPSO) and Comprehensive Learning Particle Swarm Optimization (CLPSO). Experiments varying the grid size of the network, the transmission range of nodes, and the number of nodes in the network were performed to evaluate the comparative effectiveness of these algorithms. For optimized clustering, the parameters considered are the transmission range, direction and speed of the nodes. The results indicate that CACONET significantly outperforms MOPSO and CLPSO.

  11. Bounded Error Approximation Algorithms for Risk-Based Intrusion Response

    DTIC Science & Technology

    2015-09-17

    AFRL-AFOSR-VA-TR-2015-0324: Bounded Error Approximation Algorithms for Risk-Based Intrusion Response. K. Subramani, West Virginia University Research, 2015. Contract FA9550-12-1-0199. Distribution A: approved for public release.

  12. Probabilistic risk assessment techniques help in identifying optimal equipment design for in-situ vitrification

    SciTech Connect

    Lucero, V.; Meale, B.M.; Purser, F.E.

    1990-01-01

    The analysis discussed in this paper was performed as part of the buried waste remediation efforts at the Idaho National Engineering Laboratory (INEL). The specific type of remediation discussed herein involves a thermal treatment process for converting contaminated soil and waste into a stable, chemically inert form. Models of the proposed process were developed using probabilistic risk assessment (PRA) fault tree and event tree modeling techniques. The models were used to determine the appropriateness of the conceptual design by identifying potential hazards of system operations. Additional models were developed to represent the reliability aspects of the system components. By performing various sensitivity studies with the models, optimal design modifications are being identified to substantiate an integrated, cost-effective design representing minimal risk to the environment and/or public with maximum component reliability. 4 figs.
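
    As a toy illustration of the fault tree arithmetic underlying such PRA models, the sketch below combines independent basic-event probabilities through OR and AND gates to obtain a top-event probability. The event names and probabilities are invented and are not taken from the INEL study.

    ```python
    # Hedged sketch of fault tree gate arithmetic for independent basic events.
    def p_or(*probs):
        """Probability of the union of independent events."""
        q = 1.0
        for p in probs:
            q *= (1.0 - p)
        return 1.0 - q

    def p_and(*probs):
        """Probability of the intersection of independent events."""
        q = 1.0
        for p in probs:
            q *= p
        return q

    p_power_loss, p_backup_fails, p_seal_leak = 1e-3, 5e-2, 2e-4   # assumed values
    # Top event: release = seal leak OR (power loss AND backup fails)
    p_top = p_or(p_seal_leak, p_and(p_power_loss, p_backup_fails))
    print(f"top event probability ~ {p_top:.2e}")
    ```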

  13. Surrogate-Based Optimization of Biogeochemical Transport Models

    NASA Astrophysics Data System (ADS)

    Prieß, Malte; Slawig, Thomas

    2010-09-01

    First approaches towards a surrogate-based optimization method for a one-dimensional marine biogeochemical model of NPZD type are presented. The model, developed by Oschlies and Garcon [1], simulates the distribution of nitrogen, phytoplankton, zooplankton and detritus in a water column and is driven by ocean circulation data. A key issue is to minimize the misfit between the model output and given observational data. Our aim is to reduce the overall optimization cost by avoiding expensive function and derivative evaluations, using a surrogate model in place of the high-fidelity model in focus. This becomes particularly important for more complex three-dimensional models. We analyse a coarsening of the discretization of the model equations as one way to create such a surrogate. Here the numerical stability crucially depends upon the discrete stepsize in time and space and the biochemical terms. We show that for given model parameters the level of grid coarsening can be chosen accordingly, yielding a stable and satisfactory surrogate. As one example of a surrogate-based optimization method we present results of the Aggressive Space Mapping technique (developed by John W. Bandler [2, 3]) applied to the optimization of this one-dimensional biogeochemical transport model.

  14. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    PubMed Central

    Li, Meng; Yi, Liangzhong; Pei, Zheng; Gao, Zhisheng

    2015-01-01

    This paper puts forward a prediction model for chaotic time series based on a membrane computing optimization algorithm; the model simultaneously optimizes the parameters of phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) using the membrane computing optimization algorithm. Accurately predicting the trend of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt an optimal action. The model is then used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, it is compared with similar conventional models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE). PMID:25874249
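
    The sketch below illustrates the phase space reconstruction step whose parameters (τ, m) the membrane computing algorithm optimizes jointly with the LS-SVM hyperparameters; the time series is synthetic and the LS-SVM itself is omitted.

    ```python
    # Hedged sketch of time-delay embedding for one candidate (tau, m).
    import numpy as np

    def delay_embed(series: np.ndarray, tau: int, m: int):
        """Return (X, y): m-dimensional delay vectors and one-step-ahead targets."""
        n = len(series) - (m - 1) * tau - 1
        X = np.column_stack([series[i * tau : i * tau + n] for i in range(m)])
        y = series[(m - 1) * tau + 1 : (m - 1) * tau + 1 + n]
        return X, y

    rng = np.random.default_rng(1)
    series = np.sin(0.3 * np.arange(500)) + 0.05 * rng.normal(size=500)
    X, y = delay_embed(series, tau=4, m=3)   # one candidate evaluated by the optimizer
    ```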

  15. Computer Based Porosity Design by Multi Phase Topology Optimization

    NASA Astrophysics Data System (ADS)

    Burblies, Andreas; Busse, Matthias

    2008-02-01

    A numerical simulation technique called Multi Phase Topology Optimization (MPTO), based on the finite element method, has been developed and refined by Fraunhofer IFAM during the last five years. MPTO is able to determine the optimum distribution of two or more different materials in components under thermal and mechanical loads. The objective of optimization is to minimize the component's elastic energy. Conventional topology optimization methods which simulate adaptive bone mineralization have the disadvantage that there is a continuous change of mass by growth processes. MPTO keeps all initial material concentrations and uses methods adapted from molecular dynamics to find the energy minimum. When MPTO is applied to mechanically loaded components with a high number of different material densities, the optimization results show graded and sometimes anisotropic porosity distributions which are very similar to natural bone structures. Now it is possible to design the macro- and microstructure of a mechanical component in one step. Computer based porosity design structures can be manufactured by new Rapid Prototyping technologies. Fraunhofer IFAM has successfully applied 3D-Printing and Selective Laser Sintering methods in order to produce very stiff lightweight components with graded porosities calculated by MPTO.

  16. Reliability-based design optimization using efficient global reliability analysis.

    SciTech Connect

    Bichon, Barron J.; Mahadevan, Sankaran; Eldred, Michael Scott

    2010-05-01

    Finding the optimal (lightest, least expensive, etc.) design for an engineered component that meets or exceeds a specified level of reliability is a problem of obvious interest across a wide spectrum of engineering fields. Various methods for this reliability-based design optimization problem have been proposed. Unfortunately, this problem is rarely solved in practice because, regardless of the method used, solving the problem is too expensive or the final solution is too inaccurate to ensure that the reliability constraint is actually satisfied. This is especially true for engineering applications involving expensive, implicit, and possibly nonlinear performance functions (such as large finite element models). The Efficient Global Reliability Analysis method was recently introduced to improve both the accuracy and efficiency of reliability analysis for this type of performance function. This paper explores how this new reliability analysis method can be used in a design optimization context to create a method of sufficient accuracy and efficiency to enable the use of reliability-based design optimization as a practical design tool.

  17. The optimal community detection of software based on complex networks

    NASA Astrophysics Data System (ADS)

    Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong

    2016-02-01

    The community structure is important for software in terms of understanding design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed based on the dependency relationships among software functions. First, by analyzing multiple execution traces of one piece of software, we construct a Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by this measure. Third, we select the top K (K=1,2,…) nodes as the cores of the initial communities (each containing only one core node). By comparing the dependency relationships between each remaining node and the K communities, we assign the node to the existing community with which it has the closest relationship. Finally, we calculate the modularity for different initial K to obtain the optimal division. Experiments verify that OPSN is effective in detecting the optimal community structure in various software systems.
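
    A simplified sketch of the final step is shown below: partitions grown from K = 1, 2, ... core nodes are compared by modularity and the best K is kept. A standard test graph and a degree-based ranking stand in for the SEDN and the Fault Accumulation ordering, so this only illustrates the selection logic, not the paper's full method.

    ```python
    # Hedged sketch: choose the number of communities by maximizing modularity.
    import networkx as nx
    from networkx.algorithms.community import modularity

    G = nx.karate_club_graph()
    ranked = sorted(G.nodes, key=G.degree, reverse=True)   # stand-in for FA ranking

    def grow_partition(k):
        cores = ranked[:k]
        communities = [{c} for c in cores]
        for v in G.nodes:
            if v in cores:
                continue
            # attach each node to the community it shares the most edges with
            best = max(range(k),
                       key=lambda i: sum(1 for u in communities[i] if G.has_edge(v, u)))
            communities[best].add(v)
        return communities

    best_k = max(range(1, 6), key=lambda k: modularity(G, grow_partition(k)))
    print("optimal number of communities:", best_k)
    ```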

  18. Risk-based maintenance--techniques and applications.

    PubMed

    Arunraj, N S; Maiti, J

    2007-04-11

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to use knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more rigorously, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making that reduces both the probability of equipment failure and its consequences. In this paper, risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions.

  19. Cooperation under predation risk: a data-based ESS analysis

    PubMed Central

    Parker, G. A.; Milinski, M.

    1997-01-01

    Two fish that jointly approach a predator in order to inspect it share the deadly risk of capture depending on the distance between them. Models are developed that seek ESS inspection distances of both single prey and pairs, based on experimental data of the risk that prey (sticklebacks) incur when they approach a predator (pike) to varying distances. Our analysis suggests that an optimal inspection distance can exist for a single fish, and for two equal fish behaving entirely cooperatively so as to maximize the fitness of the pair. Two equal fish inspecting cooperatively should inspect at an equal distance from the predator. The optimal distance is much closer to the predator for cooperative pairs than for single inspectors. However, optimal inspection for two equal fish behaving cooperatively operates across a rather narrow band of conditions relating to the benefits of cooperation. Evolutionarily stable inspection can also exist for two equal fish behaving non-cooperatively such that each acts to make a best reply (in terms of its personal fitness) to its opponent's strategy. Non-cooperative pairs should also inspect at equal distance from the pike. Unlike the 'single fish' and 'cooperative' optima, which are unique inspection distances, there exists a range of ESS inspection distances. If either fish chooses to move to any point in this zone, the best reply of its opponent is to match it (move exactly alongside). Unilateral forward movement in the 'match zone' may not be possible without some cooperation, but if the pair can 'agree' to move forward synchronously, maintaining equal distance, inspection will occur at the nearest point in this zone to the predator. This 'near threshold' is an ESS and is closer to the predator than the single fish optimum: pairs behaving almost selfishly can thus attain greater benefits from inspection by the protection gained from Hamilton's dilution effect. That pairs should inspect more closely than single fish conforms with

  20. Modified risk graph method using fuzzy rule-based approach.

    PubMed

    Nait-Said, R; Zidani, F; Ouzraoui, N

    2009-05-30

    The risk graph is one of the most popular methods used to determine the safety integrity level for safety instrumented functions. However, conventional risk graph as described in the IEC 61508 standard is subjective and suffers from an interpretation problem of risk parameters. Thus, it can lead to inconsistent outcomes that may result in conservative SILs. To overcome this difficulty, a modified risk graph using fuzzy rule-based system is proposed. This novel version of risk graph uses fuzzy scales to assess risk parameters and calibration may be made by varying risk parameter values. Furthermore, the outcomes which are numerical values of risk reduction factor (the inverse of the probability of failure on demand) can be compared directly with those given by quantitative and semi-quantitative methods such as fault tree analysis (FTA), quantitative risk assessment (QRA) and layers of protection analysis (LOPA).

  1. Block-based mask optimization for optical lithography.

    PubMed

    Ma, Xu; Song, Zhiyang; Li, Yanqiu; Arce, Gonzalo R

    2013-05-10

    Pixel-based optical proximity correction (PBOPC) methods have been developed as a leading-edge resolution enhancement technique (RET) for integrated circuit fabrication. PBOPC independently modulates each pixel on the reticle, which tremendously increases the mask's complexity and, at the same time, deteriorates its manufacturability. Most current PBOPC algorithms resort to regularization methods or a mask manufacturing rule check (MRC) to improve the mask manufacturability. Typically, these approaches either fail to satisfy manufacturing constraints on the practical product line, or lead to suboptimal mask patterns that may degrade the lithographic performance. This paper develops a block-based optical proximity correction (BBOPC) algorithm to pursue the optimal masks with manufacturability compliance, where the mask is shaped by a set of overlapped basis blocks rather than pixels. BBOPC optimization is formulated based on a vector imaging model, which is adequate for both dry lithography with lower numerical aperture (NA), and immersion lithography with hyper-NA. The BBOPC algorithm successively optimizes the main features (MF) and subresolution assist features (SRAF) based on a modified conjugate gradient method. It is effective at smoothing any unmanufacturable jogs along edges. A weight matrix is introduced in the cost function to preserve the edge fidelity of the printed images. Simulations show that the BBOPC algorithm can improve lithographic imaging performance while maintaining mask manufacturing constraints.

  2. Biological Based Risk Assessment for Space Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Exposures from galactic cosmic rays (GCR) - made up of high-energy protons and high-energy and charge (HZE) nuclei, and solar particle events (SPEs) - comprised largely of low- to medium-energy protons are the primary health concern for astronauts for long-term space missions. Experimental studies have shown that HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, making risk assessments for cancer and degenerative risks, such as central nervous system effects and heart disease, highly uncertain. The goal for space radiation protection at NASA is to be able to reduce the uncertainties in risk assessments for Mars exploration to be small enough to ensure acceptable levels of risks are not exceeded and to adequately assess the efficacy of mitigation measures such as shielding or biological countermeasures. We review the recent BEIR VII and UNSCEAR-2006 models of cancer risks and their uncertainties. These models are shown to have an inherent 2-fold uncertainty as defined by ratio of the 95% percent confidence level to the mean projection, even before radiation quality is considered. In order to overcome the uncertainties in these models, new approaches to risk assessment are warranted. We consider new computational biology approaches to modeling cancer risks. A basic program of research that includes stochastic descriptions of the physics and chemistry of radiation tracks and biochemistry of metabolic pathways, to emerging biological understanding of cellular and tissue modifications leading to cancer is described.

  3. [Optimized Spectral Indices Based Estimation of Forage Grass Biomass].

    PubMed

    An, Hai-bo; Li, Fei; Zhao, Meng-li; Liu, Ya-jun

    2015-11-01

    As an important indicator of forage production, aboveground biomass directly reflects the growth of forage grass. Therefore, real-time monitoring of forage grass biomass plays a crucial role in suitable grazing and management of artificial and natural grassland. However, traditional sampling and measurement are time-consuming and labor-intensive. Recently, the development of hyperspectral remote sensing has made it feasible to derive forage grass biomass in a timely and nondestructive way. In the present study, the main objectives were to explore the robustness of published and optimized spectral indices in estimating biomass of forage grass in natural and artificial pasture. A natural pasture with four grazing densities (control, light grazing, moderate grazing and high grazing) was designed in desert steppe, and different forage cultivars with different N rates were grown in artificial forage fields in Inner Mongolia. The canopy reflectance and biomass in each plot were measured during critical stages. The results showed that, due to differences in canopy structure and biomass, the canopy reflectance differs greatly among types of forage grass. The best performing spectral index varied across forage grass species and treatments (R² = 0.00-0.69). The predictive ability of spectral indices decreased under the low biomass of desert steppe, while red-band-based spectral indices lost sensitivity under the moderate-high biomass of forage maize. When band combinations of simple ratio and normalized difference spectral indices were optimized on combined datasets of natural and artificial grassland, the optimized spectral indices significantly increased predictive ability, and the model between biomass and the optimized spectral indices had the highest R² (R² = 0.72) compared to published spectral indices. Sensitivity analysis further confirmed that the optimized index had the lowest noise equivalent and was the best performing index in
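
    The sketch below illustrates the band-combination optimization in its simplest form: exhaustively search all band pairs of a normalized-difference index and keep the pair whose linear fit to biomass has the highest R². The reflectance and biomass data are synthetic stand-ins for the field measurements.

    ```python
    # Hedged sketch of brute-force optimization of a normalized-difference index.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_bands = 80, 50
    reflectance = rng.uniform(0.05, 0.6, size=(n_samples, n_bands))
    biomass = (2.0 * reflectance[:, 40] - 1.5 * reflectance[:, 10]
               + rng.normal(0, 0.05, n_samples))             # synthetic "measurements"

    def r_squared(x, y):
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        return 1.0 - resid.var() / y.var()

    def nd_index(i, j):
        return (reflectance[:, i] - reflectance[:, j]) / (reflectance[:, i] + reflectance[:, j])

    best = max(((i, j) for i in range(n_bands) for j in range(i + 1, n_bands)),
               key=lambda ij: r_squared(nd_index(*ij), biomass))
    print("optimized band pair:", best)
    ```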

  4. Interdependency between Risk Assessments for Self and Other in the Field of Comparative Optimism: The Contribution of Response Times

    ERIC Educational Resources Information Center

    Spitzenstetter, Florence; Schimchowitsch, Sarah

    2012-01-01

    By introducing a response-time measure in the field of comparative optimism, this study was designed to explore how people estimate risk to self and others depending on the evaluation order (self/other or other/self). Our results show the interdependency between self and other answers. Indeed, while response time for risk assessment for the self…

  5. Parameter optimization in differential geometry based solvation models

    PubMed Central

    Wang, Bao; Wei, G. W.

    2015-01-01

    Differential geometry (DG) based solvation models are a new class of variational implicit solvent approaches that are able to avoid unphysical solvent-solute boundary definitions and associated geometric singularities, and dynamically couple polar and non-polar interactions in a self-consistent framework. Our earlier study indicates that the DG based non-polar solvation model outperforms other methods in non-polar solvation energy predictions. However, the DG based full solvation model has not shown its superiority in solvation analysis, due to its difficulty in parametrization, which must ensure the stability of the solution of strongly coupled nonlinear Laplace-Beltrami and Poisson-Boltzmann equations. In this work, we introduce new parameter learning algorithms based on perturbation and convex optimization theories to stabilize the numerical solution and thus achieve an optimal parametrization of the DG based solvation models. An interesting feature of the present DG based solvation model is that it provides accurate solvation free energy predictions for both polar and non-polar molecules in a unified formulation. Extensive numerical experiments demonstrate that the present DG based solvation model delivers some of the most accurate predictions of the solvation free energies for a large number of molecules. PMID:26450304

  6. Multiresolution subspace-based optimization method for inverse scattering problems.

    PubMed

    Oliveri, Giacomo; Zhong, Yu; Chen, Xudong; Massa, Andrea

    2011-10-01

    This paper investigates an approach to inverse scattering problems based on the integration of the subspace-based optimization method (SOM) within a multifocusing scheme in the framework of the contrast source formulation. The scattering equations are solved by a nested three-step procedure composed of (a) an outer multiresolution loop dealing with the identification of the regions of interest within the investigation domain through an iterative information-acquisition process, (b) a spectrum analysis step devoted to the reconstruction of the deterministic components of the contrast sources, and (c) an inner optimization loop aimed at retrieving the ambiguous components of the contrast sources through a conjugate gradient minimization of a suitable objective function. A set of representative reconstruction results is discussed to provide numerical evidence of the effectiveness of the proposed algorithmic approach as well as to assess the features and potentialities of the multifocusing integration in comparison with the state-of-the-art SOM implementation.

  7. Model updating based on an affine scaling interior optimization algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Y. X.; Jia, C. X.; Li, Jian; Spencer, B. F.

    2013-11-01

    Finite element model updating is usually considered as an optimization process. Affine scaling interior algorithms are powerful optimization algorithms that have been developed over the past few years. A new finite element model updating method based on an affine scaling interior algorithm and a minimization of modal residuals is proposed in this article, and a general finite element model updating program is developed based on the proposed method. The performance of the proposed method is studied through numerical simulation and experimental investigation using the developed program. The results of the numerical simulation verified the validity of the method. Subsequently, the natural frequencies obtained experimentally from a three-dimensional truss model were used to update a finite element model using the developed program. After updating, the natural frequencies of the truss and finite element model matched well.
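
    The sketch below casts model updating as bounded minimization of modal (frequency) residuals. A two-degree-of-freedom spring-mass model stands in for the finite element model, and SciPy's trust-constr interior-type method stands in for the affine scaling interior algorithm used in the article; the target frequencies are invented.

    ```python
    # Hedged sketch of model updating by minimizing modal residuals.
    import numpy as np
    from scipy.linalg import eigh
    from scipy.optimize import minimize

    M = np.diag([1.0, 1.0])                      # mass matrix (assumed)
    f_measured = np.array([0.8, 2.2])            # "measured" natural frequencies in Hz (assumed)

    def frequencies(k):
        K = np.array([[k[0] + k[1], -k[1]], [-k[1], k[1]]])
        w2 = eigh(K, M, eigvals_only=True)       # generalized eigenvalue problem K x = w^2 M x
        return np.sqrt(w2) / (2 * np.pi)

    def modal_residual(k):
        return float(np.sum((frequencies(k) - f_measured) ** 2))

    res = minimize(modal_residual, x0=[50.0, 50.0], method="trust-constr",
                   bounds=[(1.0, 500.0), (1.0, 500.0)])
    print("updated stiffness parameters:", res.x)
    ```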

  8. Finite Element Based HWB Centerbody Structural Optimization and Weight Prediction

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper describes a scalable structural model suitable for Hybrid Wing Body (HWB) centerbody analysis and optimization. The geometry of the centerbody and primary wing structure is based on a Vehicle Sketch Pad (VSP) surface model of the aircraft and a FLOPS compatible parameterization of the centerbody. Structural analysis, optimization, and weight calculation are based on a Nastran finite element model of the primary HWB structural components, featuring centerbody, mid section, and outboard wing. Different centerbody designs like single bay or multi-bay options are analyzed and weight calculations are compared to current FLOPS results. For proper structural sizing and weight estimation, internal pressure and maneuver flight loads are applied. Results are presented for aerodynamic loads, deformations, and centerbody weight.

  9. Optimization of positrons generation based on laser wakefield electron acceleration

    NASA Astrophysics Data System (ADS)

    Wu, Yuchi; Han, Dan; Zhang, Tiankui; Dong, Kegong; Zhu, Bin; Yan, Yonghong; Gu, Yuqiu

    2016-08-01

    Laser-based positron sources represent a new type of particle source with short pulse duration and high charge density. Positron production based on laser wakefield electron acceleration (LWFA) is investigated theoretically in this paper. Analytical expressions for positron spectra and yield have been obtained through a combination of LWFA and cascade shower theories. The maximum positron yield and the corresponding converter thickness have been optimized as a function of the driving laser power. Under the optimal condition, a high-energy (>100 MeV) positron yield of up to 5 × 10^11 can be produced by high power femtosecond lasers at ELI-NP. The fraction of positrons shows that a quasineutral electron-positron jet can be generated by setting the converter thickness to more than 5 radiation lengths.

  10. Bell-Curve Based Evolutionary Strategies for Structural Optimization

    NASA Technical Reports Server (NTRS)

    Kincaid, Rex K.

    2001-01-01

    Evolutionary methods are exceedingly popular with practitioners of many fields; more so than perhaps any optimization tool in existence. Historically Genetic Algorithms (GAs) led the way in practitioner popularity. However, in the last ten years Evolutionary Strategies (ESs) and Evolutionary Programs (EPS) have gained a significant foothold. One partial explanation for this shift is the interest in using GAs to solve continuous optimization problems. The typical GA relies upon a cumbersome binary representation of the design variables. An ES or EP, however, works directly with the real-valued design variables. For detailed references on evolutionary methods in general and ES or EP in particular see Back and Dasgupta and Michalesicz. We call our evolutionary algorithm BCB (bell curve based) since it is based upon two normal distributions.
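
    For readers unfamiliar with the class of methods contrasted with GAs here, the sketch below is a generic real-valued (mu + lambda) evolution strategy with Gaussian mutation. It only illustrates that class of algorithms and is not the BCB operator itself, whose specific use of two normal distributions is not reproduced here.

    ```python
    # Hedged sketch of a plain (mu + lambda) evolution strategy with Gaussian mutation.
    import numpy as np

    def es_minimize(f, dim, mu=10, lam=40, sigma=0.3, generations=200, seed=0):
        rng = np.random.default_rng(seed)
        parents = rng.normal(0.0, 1.0, size=(mu, dim))
        for _ in range(generations):
            # each offspring perturbs a random parent with normally distributed noise
            idx = rng.integers(0, mu, size=lam)
            offspring = parents[idx] + rng.normal(0.0, sigma, size=(lam, dim))
            pool = np.vstack([parents, offspring])
            parents = pool[np.argsort([f(x) for x in pool])[:mu]]   # keep the mu best
        return parents[0]

    best = es_minimize(lambda x: np.sum((x - 1.0) ** 2), dim=5)
    print(best)
    ```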

  11. Bell-Curve Based Evolutionary Strategies for Structural Optimization

    NASA Technical Reports Server (NTRS)

    Kincaid, Rex K.

    2000-01-01

    Evolutionary methods are exceedingly popular with practitioners of many fields; more so than perhaps any optimization tool in existence. Historically Genetic Algorithms (GAs) led the way in practitioner popularity (Reeves 1997). However, in the last ten years Evolutionary Strategies (ESs) and Evolutionary Programs (EPS) have gained a significant foothold (Glover 1998). One partial explanation for this shift is the interest in using GAs to solve continuous optimization problems. The typical GA relies upon a cumbersome binary representation of the design variables. An ES or EP, however, works directly with the real-valued design variables. For detailed references on evolutionary methods in general and ES or EP in particular see Back (1996) and Dasgupta and Michalesicz (1997). We call our evolutionary algorithm BCB (bell curve based) since it is based upon two normal distributions.

  12. An Optimality-Based Fully-Distributed Watershed Ecohydrological Model

    NASA Astrophysics Data System (ADS)

    Chen, L., Jr.

    2015-12-01

    Watershed ecohydrological models are essential tools to assess the impact of climate change and human activities on hydrological and ecological processes for watershed management. Existing models can be classified as empirically based, quasi-mechanistic, and mechanistic models. The empirically based and quasi-mechanistic models usually adopt empirical or quasi-empirical equations, which may be incapable of capturing non-stationary dynamics of target processes. Mechanistic models that are designed to represent process feedbacks may capture vegetation dynamics, but often have more demanding spatial and temporal parameterization requirements to represent vegetation physiological variables. In recent years, optimality based ecohydrological models have been proposed which have the advantage of reducing the need for model calibration by assuming critical aspects of system behavior. However, this work to date has been limited to the plot scale, considering only one-dimensional exchange of soil moisture, carbon and nutrients in vegetation parameterization without lateral hydrological transport. Conceptual isolation of individual ecosystem patches from upslope and downslope flow paths compromises the ability to represent and test the relationships between hydrology and vegetation in mountainous and hilly terrain. This work presents an optimality-based watershed ecohydrological model, which incorporates the influence of lateral hydrological processes on the flow-path patterns that emerge from the optimality assumption. The model has been tested in the Walnut Gulch watershed and shows good agreement with observed temporal and spatial patterns of evapotranspiration (ET) and gross primary productivity (GPP). The spatial variability of ET and GPP produced by the model matches the spatial distributions of TWI, SCA, and slope well over the area. Compared with the one dimensional vegetation optimality model (VOM), we find that the distributed VOM (DisVOM) produces more reasonable spatial

  13. Process optimization electrospinning fibrous material based on polyhydroxybutyrate

    NASA Astrophysics Data System (ADS)

    Olkhov, A. A.; Tyubaeva, P. M.; Staroverova, O. V.; Mastalygina, E. E.; Popov, A. A.; Ischenko, A. A.; Iordanskii, A. L.

    2016-05-01

    The article analyzes the influence of the main technological parameters of electrostatic spinning on the morphology and properties of ultrathin fibers based on polyhydroxybutyrate (PHB). It is found that the electric conductivity and viscosity of the spinning solution affect the formation of the fiber macrostructure. Controlling the viscosity and conductivity of the spinning solution makes it possible to control and optimize the geometry of the PHB-based fiber materials. The resulting fibers have found use in medicine, particularly in structural elements for the musculoskeletal system.

  14. Air Quality Monitoring: Risk-Based Choices

    NASA Technical Reports Server (NTRS)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality. Air quality monitoring requires us to target the credible residual risks. Constraints on monitoring devices are severe. We must transition from archival to real-time, on-board monitoring, and we must provide data to the crew in a way that they can interpret the findings. Dust management and monitoring may be a major concern for exploration class missions.

  15. Genetic algorithm for design and manufacture optimization based on numerical simulations applied to aeronautic composite parts

    SciTech Connect

    Mouton, S.; Ledoux, Y.; Teissandier, D.; Sebastian, P.

    2010-06-15

    A key challenge for the future is to drastically reduce the human impact on the environment. In the aeronautic field, this challenge translates into optimizing the design of the aircraft to decrease its overall mass, which leads to the optimization of every constituent part of the plane. This task is even more delicate when the material used is a composite: in that case, a compromise must be found between the strength, the mass and the manufacturing cost of the component. Because of these different kinds of design constraints, engineers need decision support systems to determine feasible solutions. In this paper, an approach is proposed based on coupling the key characteristics of the design process with consideration of the failure risk of the component. The originality of this work is that the manufacturing deviations due to the RTM process are integrated in the simulation of the assembly process. Two kinds of deviations are identified: volume impregnation (injection phase of the RTM process) and geometrical deviations (curing and cooling phases). The quantification of these deviations and the related failure risk calculation are based on finite element simulations (Pam RTM® and Samcef® software). A genetic algorithm is used to estimate the impact of design choices and their consequences on the failure risk of the component. The main focus of the paper is the optimization of tool design. In the framework of decision support systems, the failure risk calculation is used to compare possible industrialization alternatives. The method is applied to a particular part of the airplane structure: a spar unit made of carbon fiber/epoxy composite.

  16. Credibility theory based dynamic control bound optimization for reservoir flood limited water level

    NASA Astrophysics Data System (ADS)

    Jiang, Zhiqiang; Sun, Ping; Ji, Changming; Zhou, Jianzhong

    2015-10-01

    The dynamic control operation of the reservoir flood limited water level (FLWL) can resolve the contradiction between reservoir flood control and beneficial operation, and it is an important measure for ensuring flood control safety while realizing flood utilization. The dynamic control bound of the FLWL is a fundamental element for implementing reservoir dynamic control operation. In order to optimize the dynamic control bound of the FLWL while considering flood forecasting error, this paper treats the forecasting error as a fuzzy variable and describes it with credibility theory, which has emerged in recent years. By combining this with a flood forecasting error quantification model, a credibility-based fuzzy chance constrained model for optimizing the dynamic control bound is proposed, and fuzzy simulation technology is used to solve the model. The FENGTAN reservoir in China was selected as a case study, and the results show that, compared with the original operation water level, the initial operation water level (IOWL) of the FENGTAN reservoir can be raised by 4 m, 2 m and 5.5 m respectively in the three division stages of the flood season without increasing flood control risk. In addition, the rationality and feasibility of the proposed forecasting error quantification model and credibility-based dynamic control bound optimization model are verified by calculations based on extreme risk theory.

  17. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    PubMed

    Vilaprinyo, Ester; Forné, Carles; Carles, Misericordia; Sala, Maria; Pla, Roger; Castells, Xavier; Domingo, Laia; Rue, Montserrat

    2014-01-01

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.
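
    The sketch below shows how the two ratios used to rank strategies are computed: an incremental cost-effectiveness ratio comparing a risk-based strategy with a uniform one, and a harm-benefit ratio relating false positives and overdiagnoses to deaths averted. All numbers are invented for illustration and are not the study's estimates.

    ```python
    # Hedged sketch of incremental cost-effectiveness and harm-benefit ratios.
    def icer(cost_ref, effect_ref, cost_new, effect_new):
        """Incremental cost per unit of effect gained, new strategy vs reference."""
        return (cost_new - cost_ref) / (effect_new - effect_ref)

    def harm_benefit(false_positives, overdiagnoses, deaths_averted):
        """Harms incurred per breast cancer death averted."""
        return (false_positives + overdiagnoses) / deaths_averted

    uniform    = dict(cost=1_000_000, qalys=120.0)                              # assumed values
    risk_based = dict(cost=910_000, qalys=123.0, fp=3_100, od=23, averted=10.5)  # assumed values

    print("ICER (risk-based vs uniform):",
          icer(uniform["cost"], uniform["qalys"], risk_based["cost"], risk_based["qalys"]))
    print("harms per death averted:",
          harm_benefit(risk_based["fp"], risk_based["od"], risk_based["averted"]))
    ```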

  18. Optimal control and optimal trajectories of regional macroeconomic dynamics based on the Pontryagin maximum principle

    NASA Astrophysics Data System (ADS)

    Bulgakov, V. K.; Strigunov, V. V.

    2009-05-01

    The Pontryagin maximum principle is used to prove a theorem concerning optimal control in regional macroeconomics. A boundary value problem for optimal trajectories of the state and adjoint variables is formulated, and optimal curves are analyzed. An algorithm is proposed for solving the boundary value problem of optimal control. The performance of the algorithm is demonstrated by computing an optimal control and the corresponding optimal trajectories.
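
    For reference, a standard statement of the first-order conditions from the Pontryagin maximum principle that give rise to such a boundary value problem is sketched below, in generic notation rather than the article's specific regional macroeconomic model (here the initial state is fixed and the terminal state is free).

    ```latex
    % Maximize J = \int_0^T f_0(x,u,t)\,dt subject to \dot{x} = f(x,u,t), x(0) = x_0.
    \begin{align*}
      H(x,u,\psi,t) &= f_0(x,u,t) + \psi^{\top} f(x,u,t), \\
      \dot{x} &= \frac{\partial H}{\partial \psi} = f(x,u,t), \qquad x(0) = x_0, \\
      \dot{\psi} &= -\frac{\partial H}{\partial x}, \qquad \psi(T) = 0 \quad \text{(transversality)}, \\
      u^{*}(t) &= \arg\max_{u \in U} H\bigl(x^{*}(t), u, \psi(t), t\bigr).
    \end{align*}
    ```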

  19. Genetic Algorithm (GA)-Based Inclinometer Layout Optimization.

    PubMed

    Liang, Weijie; Zhang, Ping; Chen, Xianping; Cai, Miao; Yang, Daoguo

    2015-04-17

    This paper presents numerical simulation results of an airflow inclinometer with sensitivity studies and thermal optimization of the printed circuit board (PCB) layout for an airflow inclinometer based on a genetic algorithm (GA). Due to the working principle of the gas sensor, the changes of the ambient temperature may cause dramatic voltage drifts of sensors. Therefore, eliminating the influence of the external environment for the airflow is essential for the performance and reliability of an airflow inclinometer. In this paper, the mechanism of an airflow inclinometer and the influence of different ambient temperatures on the sensitivity of the inclinometer will be examined by the ANSYS-FLOTRAN CFD program. The results show that with changes of the ambient temperature on the sensing element, the sensitivity of the airflow inclinometer is inversely proportional to the ambient temperature and decreases when the ambient temperature increases. GA is used to optimize the PCB thermal layout of the inclinometer. The finite-element simulation method (ANSYS) is introduced to simulate and verify the results of our optimal thermal layout, and the results indicate that the optimal PCB layout greatly improves (by more than 50%) the sensitivity of the inclinometer. The study may be useful in the design of PCB layouts that are related to sensitivity improvement of gas sensors.

  20. A global optimization paradigm based on change of measures

    PubMed Central

    Sarkar, Saikat; Roy, Debasish; Vasu, Ram Mohan

    2015-01-01

    A global optimization framework, COMBEO (Change Of Measure Based Evolutionary Optimization), is proposed. An important aspect in the development is a set of derivative-free additive directional terms, obtainable through a change of measures en route to the imposition of any stipulated conditions aimed at driving the realized design variables (particles) to the global optimum. The generalized setting offered by the new approach also enables several basic ideas, used with other global search methods such as the particle swarm or the differential evolution, to be rationally incorporated in the proposed set-up via a change of measures. The global search may be further aided by imparting to the directional update terms additional layers of random perturbations such as ‘scrambling’ and ‘selection’. Depending on the precise choice of the optimality conditions and the extent of random perturbation, the search can be readily rendered either greedy or more exploratory. As numerically demonstrated, the new proposal appears to provide for a more rational, more accurate and, in some cases, a faster alternative to many available evolutionary optimization schemes. PMID:26587268

  1. Genetic Algorithm (GA)-Based Inclinometer Layout Optimization

    PubMed Central

    Liang, Weijie; Zhang, Ping; Chen, Xianping; Cai, Miao; Yang, Daoguo

    2015-01-01

    This paper presents numerical simulation results of an airflow inclinometer with sensitivity studies and thermal optimization of the printed circuit board (PCB) layout for an airflow inclinometer based on a genetic algorithm (GA). Due to the working principle of the gas sensor, changes in the ambient temperature may cause dramatic voltage drifts in the sensors. Therefore, eliminating the influence of the external environment on the airflow is essential for the performance and reliability of an airflow inclinometer. In this paper, the mechanism of an airflow inclinometer and the influence of different ambient temperatures on the sensitivity of the inclinometer are examined with the ANSYS-FLOTRAN CFD program. The results show that the sensitivity of the airflow inclinometer is inversely proportional to the ambient temperature at the sensing element and decreases as the ambient temperature increases. A GA is used to optimize the PCB thermal layout of the inclinometer. The finite-element simulation method (ANSYS) is introduced to simulate and verify the results of our optimal thermal layout, and the results indicate that the optimal PCB layout greatly improves (by more than 50%) the sensitivity of the inclinometer. The study may be useful in the design of PCB layouts aimed at improving the sensitivity of gas sensors. PMID:25897500

  2. Nanodosimetry-Based Plan Optimization for Particle Therapy

    PubMed Central

    Casiraghi, Margherita; Schulte, Reinhard W.

    2015-01-01

    Treatment planning for particle therapy is currently an active field of research due to uncertainty in how to modify physical dose in order to create a uniform biological dose response in the target. A novel treatment plan optimization strategy based on measurable nanodosimetric quantities rather than biophysical models is proposed in this work. Simplified proton and carbon treatment plans were simulated in a water phantom to investigate the optimization feasibility. Track structures of the mixed radiation field produced at different depths in the target volume were simulated with Geant4-DNA and nanodosimetric descriptors were calculated. The fluences of the treatment field pencil beams were optimized in order to create a mixed field with equal nanodosimetric descriptors at each of the multiple positions in spread-out particle Bragg peaks. For both proton and carbon ion plans, a uniform spatial distribution of nanodosimetric descriptors could be obtained by optimizing opposing-field but not single-field plans. The results obtained indicate that uniform nanodosimetrically weighted plans, which may also be radiobiologically uniform, can be obtained with this approach. Future investigations need to demonstrate that this approach is also feasible for more complicated beam arrangements and that it leads to biologically uniform response in tumor cells and tissues. PMID:26167202

  3. Optimal network topology for structural robustness based on natural connectivity

    NASA Astrophysics Data System (ADS)

    Peng, Guan-sheng; Wu, Jun

    2016-02-01

    The structural robustness of the infrastructure of various real-life systems, which can be represented by networks, is of great importance. We therefore propose a tabu search algorithm to optimize the structural robustness of a given network by rewiring the links while keeping the node degrees fixed. The objective of our algorithm is to maximize a new structural robustness measure, natural connectivity, which provides a sensitive and reliable measure of the structural robustness of complex networks and has low computational complexity. We initially applied this method to several networks with different degree distributions for contrast analysis and investigated the basic properties of the optimal network. We discovered that the optimal network based on the power-law degree distribution exhibits a roughly "eggplant-like" topology, where there is a cluster of high-degree nodes at the head and other low-degree nodes scattered across the body of the "eggplant". Additionally, the cost to rewire links in practical applications is considered; therefore, we refined this method by employing an assortative rewiring strategy and validated its efficiency.
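
    A minimal sketch of the objective used here, natural connectivity, assuming an undirected network given by a symmetric 0/1 adjacency matrix (the tabu-list bookkeeping and degree-preserving edge swaps of the full algorithm are omitted):

      import numpy as np

      def natural_connectivity(adjacency):
          """Natural connectivity: the logarithm of the average eigenvalue
          exponential (a scaled Estrada index) of the adjacency matrix."""
          eigenvalues = np.linalg.eigvalsh(adjacency)  # symmetric matrix -> real spectrum
          return float(np.log(np.mean(np.exp(eigenvalues))))

    A rewiring-based optimizer would repeatedly swap the endpoints of two randomly chosen edges (which preserves all node degrees) and keep the swap only if this measure increases and the move is not tabu.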

  4. The biopharmaceutics risk assessment roadmap for optimizing clinical drug product performance.

    PubMed

    Selen, Arzu; Dickinson, Paul A; Müllertz, Anette; Crison, John R; Mistry, Hitesh B; Cruañes, Maria T; Martinez, Marilyn N; Lennernäs, Hans; Wigal, Tim L; Swinney, David C; Polli, James E; Serajuddin, Abu T M; Cook, Jack A; Dressman, Jennifer B

    2014-11-01

    The biopharmaceutics risk assessment roadmap (BioRAM) optimizes drug product development and performance by using therapy-driven target drug delivery profiles as a framework to achieve the desired therapeutic outcome. Hence, clinical relevance is directly built into early formulation development. Biopharmaceutics tools are used to identify and address potential challenges to optimize the drug product for patient benefit. For illustration, BioRAM is applied to four relatively common therapy-driven drug delivery scenarios: rapid therapeutic onset, multiphasic delivery, delayed therapeutic onset, and maintenance of target exposure. BioRAM considers the therapeutic target with the drug substance characteristics and enables collection of critical knowledge for development of a dosage form that can perform consistently for meeting the patient's needs. Accordingly, the key factors are identified and in vitro, in vivo, and in silico modeling and simulation techniques are used to elucidate the optimal drug delivery rate and pattern. BioRAM enables (1) feasibility assessment for the dosage form, (2) development and conduct of appropriate "learning and confirming" studies, (3) transparency in decision-making, (4) assurance of drug product quality during lifecycle management, and (5) development of robust linkages between the desired clinical outcome and the necessary product quality attributes for inclusion in the quality target product profile.

  5. Risk-based versus deterministic explosives safety criteria

    SciTech Connect

    Wright, R.E.

    1996-12-01

    The Department of Defense Explosives Safety Board (DDESB) is actively considering ways to apply risk-based approaches in its decision-making processes. As such, an understanding of the impact of converting to risk-based criteria is required. The objectives of this project are to examine the benefits and drawbacks of risk-based criteria and to define the impact of converting from deterministic to risk-based criteria. Conclusions will be couched in terms that allow meaningful comparisons of deterministic and risk-based approaches. To this end, direct comparisons of the consequences and impacts of both deterministic and risk-based criteria at selected military installations are made. Deterministic criteria used in this report are those in DoD 6055.9-STD, "DoD Ammunition and Explosives Safety Standard." Risk-based criteria selected for comparison are those used by the government of Switzerland, "Technical Requirements for the Storage of Ammunition (TLM 75)." The risk-based criteria used in Switzerland were selected because they have been successfully applied for over twenty-five years.

  6. Vehicle Shield Optimization and Risk Assessment for Future Human Space Missions

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem N.; Kim, Myung-Hee; Cucinotta, Francis A.

    2011-01-01

    As the focus of future human space missions shifts to destinations beyond low Earth orbit, such as Near Earth Objects (NEO), the moon, or Mars, risks associated with extended stays in a hostile radiation environment need to be well understood and assessed. Since future spacecraft designs and shapes are evolving, continuous assessments of shielding and radiation risks are needed. In this study, we use a predictive software capability that calculates risks to humans inside a spacecraft prototype that builds on previous designs. The software uses the CAD software Pro/Engineer and the Fishbowl tool kit to quantify the radiation shielding provided by the spacecraft geometry by calculating the areal density seen at a certain point (dose point) inside the spacecraft. Shielding results are used by NASA-developed software, BRYNTRN, to quantify organ doses received in a human body located in the vehicle in case of a solar particle event (SPE) during such prolonged space missions. Organ doses are used to quantify risks to astronauts' health and life using the NASA Space Cancer Model. The software can also locate shielding weak points (hotspots) on the spacecraft's outer surface. This capability is used to reinforce weak areas in the design. Results of shielding optimization and risk calculation on an exploration vehicle design for missions of 6 months and 30 months are provided in this study. The vehicle capsule is an aluminum shell that includes the main cabin and an airlock. The capsule contains 5 sets of racks that surround the working and living areas. A water shelter is provided in the main cabin of the vehicle to enhance shielding in case of an SPE.

  7. Basin structure of optimization based state and parameter estimation

    NASA Astrophysics Data System (ADS)

    Schumann-Bischoff, Jan; Parlitz, Ulrich; Abarbanel, Henry D. I.; Kostuk, Mark; Rey, Daniel; Eldridge, Michael; Luther, Stefan

    2015-05-01

    Most data based state and parameter estimation methods require suitable initial values or guesses to achieve convergence to the desired solution, which typically is a global minimum of some cost function. Unfortunately, however, other stable solutions (e.g., local minima) may exist and provide suboptimal or even wrong estimates. Here, we demonstrate for a 9-dimensional Lorenz-96 model how to characterize the basin size of the global minimum when applying some particular optimization based estimation algorithm. We compare three different strategies for generating suitable initial guesses, and we investigate the dependence of the solution on the given trajectory segment (underlying the measured time series). To address the question of how many state variables have to be measured for optimal performance, different types of multivariate time series are considered consisting of 1, 2, or 3 variables. Based on these time series, the local observability of state variables and parameters of the Lorenz-96 model is investigated and confirmed using delay coordinates. This result is in good agreement with the observation that correct state and parameter estimation results are obtained if the optimization algorithm is initialized with initial guesses close to the true solution. In contrast, initialization with other exact solutions of the model equations (different from the true solution used to generate the time series) typically fails, i.e., the optimization procedure ends up in local minima different from the true solution. Initialization using random values in a box around the attractor exhibits success rates depending on the number of observables and the available time series (trajectory segment).

  8. Basin structure of optimization based state and parameter estimation.

    PubMed

    Schumann-Bischoff, Jan; Parlitz, Ulrich; Abarbanel, Henry D I; Kostuk, Mark; Rey, Daniel; Eldridge, Michael; Luther, Stefan

    2015-05-01

    Most data based state and parameter estimation methods require suitable initial values or guesses to achieve convergence to the desired solution, which typically is a global minimum of some cost function. Unfortunately, however, other stable solutions (e.g., local minima) may exist and provide suboptimal or even wrong estimates. Here, we demonstrate for a 9-dimensional Lorenz-96 model how to characterize the basin size of the global minimum when applying some particular optimization based estimation algorithm. We compare three different strategies for generating suitable initial guesses, and we investigate the dependence of the solution on the given trajectory segment (underlying the measured time series). To address the question of how many state variables have to be measured for optimal performance, different types of multivariate time series are considered consisting of 1, 2, or 3 variables. Based on these time series, the local observability of state variables and parameters of the Lorenz-96 model is investigated and confirmed using delay coordinates. This result is in good agreement with the observation that correct state and parameter estimation results are obtained if the optimization algorithm is initialized with initial guesses close to the true solution. In contrast, initialization with other exact solutions of the model equations (different from the true solution used to generate the time series) typically fails, i.e., the optimization procedure ends up in local minima different from the true solution. Initialization using random values in a box around the attractor exhibits success rates depending on the number of observables and the available time series (trajectory segment).

  9. Risk-based decisionmaking in the DOE: Challenges and status

    SciTech Connect

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision-making in its budget process and in the implementation of its program. The challenges presented in using a risk-based decision-making process are to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgements in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  10. Optimal alignment of mirror based pentaprisms for scanning deflectometric devices

    SciTech Connect

    Barber, Samuel K.; Geckeler, Ralf D.; Yashchuk, Valeriy V.; Gubarev, Mikhail V.; Buchheim, Jana; Siewert, Frank; Zeschke, Thomas

    2011-03-04

    In the recent work [Proc. of SPIE 7801, 7801-2/1-12 (2010), Opt. Eng. 50(5) (2011), in press], we have reported on improvement of the Developmental Long Trace Profiler (DLTP), a slope measuring profiler available at the Advanced Light Source Optical Metrology Laboratory, achieved by replacing the bulk pentaprism with a mirror based pentaprism (MBPP). An original experimental procedure for optimal mutual alignment of the MBPP mirrors has been suggested and verified with numerical ray tracing simulations. It has been experimentally shown that the optimally aligned MBPP allows the elimination of systematic errors introduced by inhomogeneity of the optical material and fabrication imperfections of the bulk pentaprism. In the present article, we provide the analytical derivation and verification of easily executed optimal alignment algorithms for two different designs of mirror based pentaprisms. We also provide an analytical description for the mechanism for reduction of the systematic errors introduced by a typical high quality bulk pentaprism. It is also shown that residual misalignments of an MBPP introduce entirely negligible systematic errors in surface slope measurements with scanning deflectometric devices.

  11. Model-based optimization of tapered free-electron lasers

    NASA Astrophysics Data System (ADS)

    Mak, Alan; Curbis, Francesca; Werin, Sverker

    2015-04-01

    The energy extraction efficiency is a figure of merit for a free-electron laser (FEL). It can be enhanced by the technique of undulator tapering, which enables the sustained growth of radiation power beyond the initial saturation point. In the development of a single-pass x-ray FEL, it is important to exploit the full potential of this technique and optimize the taper profile a_w(z). Our approach to the optimization is based on the theoretical model by Kroll, Morton, and Rosenbluth, whereby the taper profile a_w(z) is not a predetermined function (such as linear or exponential) but is determined by the physics of a resonant particle. For further enhancement of the energy extraction efficiency, we propose a modification to the model, which involves manipulations of the resonant particle's phase. Using the numerical simulation code GENESIS, we apply our model-based optimization methods to a case of the future FEL at the MAX IV Laboratory (Lund, Sweden), as well as a case of the LCLS-II facility (Stanford, USA).
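
    As background for the taper profile a_w(z), the Kroll-Morton-Rosenbluth picture keeps a resonant particle at a fixed ponderomotive phase while its energy \gamma_r(z) m c^2 decreases; with a_w taken as the rms undulator parameter (a convention assumed here, not stated in the record), resonance at the radiation wavelength \lambda_r requires

      \lambda_r = \frac{\lambda_u}{2\gamma_r^2(z)}\left(1 + a_w^2(z)\right)
      \quad\Longrightarrow\quad
      a_w^2(z) = \frac{2\gamma_r^2(z)\,\lambda_r}{\lambda_u} - 1,

    so the taper profile follows from the energy extracted from the resonant particle rather than from a prescribed functional form.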

  12. Optimization bias in energy-based structure prediction.

    PubMed

    Petrella, Robert J

    2013-12-01

    Physics-based computational approaches to predicting the structure of macromolecules such as proteins are gaining increased use, but there are remaining challenges. In the current work, it is demonstrated that in energy-based prediction methods, the degree of optimization of the sampled structures can influence the prediction results. In particular, discrepancies in the degree of local sampling can bias the predictions in favor of the oversampled structures by shifting the local probability distributions of the minimum sampled energies. In simple systems, it is shown that the magnitude of the errors can be calculated from the energy surface, and for certain model systems, derived analytically. Further, it is shown that for energy wells whose forms differ only by a randomly assigned energy shift, the optimal accuracy of prediction is achieved when the sampling around each structure is equal. Energy correction terms can be used in cases of unequal sampling to reproduce the total probabilities that would occur under equal sampling, but optimal corrections only partially restore the prediction accuracy lost to unequal sampling. For multiwell systems, the determination of the correction terms is a multibody problem; it is shown that the involved cross-correlation multiple integrals can be reduced to simpler integrals. The possible implications of the current analysis for macromolecular structure prediction are discussed.

  13. Optimizing legacy molecular dynamics software with directive-based offload

    SciTech Connect

    Michael Brown, W.; Carrillo, Jan-Michael Y.; Gavhane, Nitin; Thakkar, Foram M.; Plimpton, Steven J.

    2015-05-14

    The directive-based programming models are one solution for exploiting many-core coprocessors to increase simulation rates in molecular dynamics. They offer the potential to reduce code complexity with offload models that can selectively target computations to run on the CPU, the coprocessor, or both. In our paper, we describe modifications to the LAMMPS molecular dynamics code to enable concurrent calculations on a CPU and coprocessor. We also demonstrate that standard molecular dynamics algorithms can run efficiently on both the CPU and an x86-based coprocessor using the same subroutines. As a consequence, we demonstrate that code optimizations for the coprocessor also result in speedups on the CPU; in extreme cases up to 4.7X. We provide results for LAMMPS benchmarks and for production molecular dynamics simulations using the Stampede hybrid supercomputer with both Intel (R) Xeon Phi (TM) coprocessors and NVIDIA GPUs. The optimizations presented have increased simulation rates by over 2X for organic molecules and over 7X for liquid crystals on Stampede. The optimizations are available as part of the "Intel package" supplied with LAMMPS. (C) 2015 Elsevier B.V. All rights reserved.

  14. Optimizing legacy molecular dynamics software with directive-based offload

    DOE PAGES

    Michael Brown, W.; Carrillo, Jan-Michael Y.; Gavhane, Nitin; ...

    2015-05-14

    The directive-based programming models are one solution for exploiting many-core coprocessors to increase simulation rates in molecular dynamics. They offer the potential to reduce code complexity with offload models that can selectively target computations to run on the CPU, the coprocessor, or both. In our paper, we describe modifications to the LAMMPS molecular dynamics code to enable concurrent calculations on a CPU and coprocessor. We also demonstrate that standard molecular dynamics algorithms can run efficiently on both the CPU and an x86-based coprocessor using the same subroutines. As a consequence, we demonstrate that code optimizations for the coprocessor also result in speedups on the CPU; in extreme cases up to 4.7X. We provide results for LAMMPS benchmarks and for production molecular dynamics simulations using the Stampede hybrid supercomputer with both Intel (R) Xeon Phi (TM) coprocessors and NVIDIA GPUs. The optimizations presented have increased simulation rates by over 2X for organic molecules and over 7X for liquid crystals on Stampede. The optimizations are available as part of the "Intel package" supplied with LAMMPS. (C) 2015 Elsevier B.V. All rights reserved.

  15. Neural network based optimal control of HVAC&R systems

    NASA Astrophysics Data System (ADS)

    Ning, Min

    Heating, Ventilation, Air-Conditioning and Refrigeration (HVAC&R) systems have wide applications in providing a desired indoor environment for different types of buildings. It is well acknowledged that 30%-40% of the total energy generated is consumed by buildings, and HVAC&R systems alone account for more than 50% of building energy consumption. Low operational efficiency, especially under partial-load conditions, and poor control are among the reasons for such high energy consumption. To improve energy efficiency, HVAC&R systems should be properly operated to maintain a comfortable and healthy indoor environment under dynamic ambient and indoor conditions with the least energy consumption. This research focuses on the optimal operation of HVAC&R systems. The optimization problem is formulated and solved to find the optimal set points for the chilled water supply temperature, discharge air temperature and AHU (air handling unit) fan static pressure such that the indoor environment is maintained with the least chiller and fan energy consumption. To achieve this objective, a dynamic system model is developed first to simulate the system behavior under different control schemes and operating conditions. The system model is modular in structure, which includes a water-cooled vapor compression chiller model and a two-zone VAV system model. A fuzzy-set based extended transformation approach is then applied to investigate the uncertainties of this model caused by uncertain parameters and the sensitivities of the control inputs with respect to the model outputs of interest. A multi-layer feed-forward neural network is constructed and trained in unsupervised mode to minimize the cost function, which comprises the overall energy cost and a penalty cost incurred when one or more constraints are violated. After training, the network is implemented as a supervisory controller to compute the optimal settings for the system. In order to implement the optimal set points predicted by the

  16. Adaptive Estimation of Intravascular Shear Rate Based on Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Nitta, Naotaka; Takeda, Naoto

    2008-05-01

    The relationships between the intravascular wall shear stress, controlled by flow dynamics, and the progression of arteriosclerotic plaque have been clarified by various studies. Since the shear stress is determined by the viscosity coefficient and the shear rate, both factors must be estimated accurately. In this paper, an adaptive method for improving the accuracy of quantitative shear rate estimation was investigated. First, the parameter dependence of the estimated shear rate was investigated in terms of the differential window width and the number of averaged velocity profiles, based on simulation and experimental data, and then the shear rate calculation was optimized. The optimized result revealed that the proposed adaptive method of shear rate estimation was effective for improving the accuracy of the shear rate calculation.
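
    For context, the wall shear stress is \tau = \mu \dot{\gamma}, and the shear rate \dot{\gamma} is the radial gradient of the axial velocity, which the record estimates with a differential window applied to averaged velocity profiles. The window width and the number of averaged profiles are exactly the parameters the paper optimizes; the sketch below only illustrates the finite-difference estimate (function and argument names are illustrative):

      import numpy as np

      def shear_rate_profile(velocity_profiles, dr, window=3):
          """Estimate the shear-rate profile from repeated velocity profiles.
          velocity_profiles: array (n_profiles, n_depths) of axial velocities [m/s]
          dr: radial sample spacing [m]; window: differential window width in samples."""
          profiles = np.asarray(velocity_profiles, dtype=float)
          mean_profile = profiles.mean(axis=0)          # average the N profiles
          half = window // 2
          n = mean_profile.size
          shear = np.empty(n)
          for i in range(n):
              lo, hi = max(i - half, 0), min(i + half, n - 1)
              shear[i] = (mean_profile[hi] - mean_profile[lo]) / (max(hi - lo, 1) * dr)
          return shear  # multiply by the viscosity coefficient to obtain shear stress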

  17. Optimization-based design of a heat flux concentrator

    PubMed Central

    Peralta, Ignacio; Fachinotti, Víctor D.; Ciarbonetti, Ángel A.

    2017-01-01

    To gain control over the diffusive heat flux in a given domain, one needs to engineer a thermal metamaterial with a specific distribution of the generally anisotropic thermal conductivity throughout the domain. Until now, the appropriate conductivity distribution was usually determined using transformation thermodynamics. In this way, only a few particular cases of heat flux control in simple domains with simple boundary conditions have been studied. Thermal metamaterials designed with optimization algorithms provide superior properties compared with those obtained using the previous methods. As a more general approach, we propose to define the heat control problem as an optimization problem in which we minimize the error in guiding the heat flux in a given way, taking as design variables the parameters that define the variable microstructure of the metamaterial. In the present study we numerically demonstrate the ability to manipulate heat flux by designing a device that concentrates the thermal energy at its center without disturbing the temperature profile outside it. PMID:28084451

  18. Physics-Based Prognostics for Optimizing Plant Operation

    SciTech Connect

    Leonard J. Bond; Don B. Jarrell

    2005-03-01

    Scientists at the Pacific Northwest National Laboratory (PNNL) have examined the necessity of optimizing energy plant operation using DSOM® (Decision Support Operation and Maintenance), which has been deployed at several sites. This approach has been expanded to include a prognostics component and tested on a pilot-scale service water system, modeled on the design employed in a nuclear power plant. A key element in plant optimization is understanding and controlling the aging process of safety-specific nuclear plant components. This paper reports the development and demonstration of a physics-based approach to prognostic analysis that combines distributed computing, RF data links, the measurement of aging precursor metrics, and their correlation with degradation rate and projected machine failure.

  19. Optimization-based design of a heat flux concentrator.

    PubMed

    Peralta, Ignacio; Fachinotti, Víctor D; Ciarbonetti, Ángel A

    2017-01-13

    To gain control over the diffusive heat flux in a given domain, one needs to engineer a thermal metamaterial with a specific distribution of the generally anisotropic thermal conductivity throughout the domain. Until now, the appropriate conductivity distribution was usually determined using transformation thermodynamics. In this way, only a few particular cases of heat flux control in simple domains with simple boundary conditions have been studied. Thermal metamaterials designed with optimization algorithms provide superior properties compared with those obtained using the previous methods. As a more general approach, we propose to define the heat control problem as an optimization problem in which we minimize the error in guiding the heat flux in a given way, taking as design variables the parameters that define the variable microstructure of the metamaterial. In the present study we numerically demonstrate the ability to manipulate heat flux by designing a device that concentrates the thermal energy at its center without disturbing the temperature profile outside it.

  20. Optimization of integer wavelet transforms based on difference correlation structures.

    PubMed

    Li, Hongliang; Liu, Guizhong; Zhang, Zhongwei

    2005-11-01

    In this paper, a novel lifting integer wavelet transform based on difference correlation structure (DCCS-LIWT) is proposed. First, we establish a relationship between the performance of a linear predictor and the difference correlations of an image. The obtained results provide a theoretical foundation for the following construction of the optimal lifting filters. Then, the optimal prediction lifting coefficients in the sense of least-square prediction error are derived. DCCS-LIWT puts heavy emphasis on image inherent dependence. A distinct feature of this method is the use of the variance-normalized autocorrelation function of the difference image to construct a linear predictor and adapt the predictor to varying image sources. The proposed scheme also allows respective calculations of the lifting filters for the horizontal and vertical orientations. Experimental evaluation shows that the proposed method produces better results than the other well-known integer transforms for the lossless image compression.
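
    To make the lifting structure concrete, here is the standard reversible 5/3 integer lifting step (the JPEG 2000 filter) as a point of reference; DCCS-LIWT replaces the fixed prediction weights below with least-squares coefficients derived from the difference correlation structure of the image, which this sketch does not reproduce. The code assumes an even-length 1-D signal with periodic boundary extension:

      import numpy as np

      def lifting_53_forward(x):
          """One level of the integer 5/3 lifting wavelet transform."""
          x = np.asarray(x, dtype=np.int64)
          even, odd = x[0::2], x[1::2]
          # predict: detail = odd sample minus prediction from its even neighbours
          d = odd - np.floor((even + np.roll(even, -1)) / 2).astype(np.int64)
          # update: adjust the even samples so that local averages are preserved
          s = even + np.floor((np.roll(d, 1) + d + 2) / 4).astype(np.int64)
          return s, d   # approximation and detail subbands (both integer-valued)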

  1. Optimization-based design of a heat flux concentrator

    NASA Astrophysics Data System (ADS)

    Peralta, Ignacio; Fachinotti, Víctor D.; Ciarbonetti, Ángel A.

    2017-01-01

    To gain control over the diffusive heat flux in a given domain, one needs to engineer a thermal metamaterial with a specific distribution of the generally anisotropic thermal conductivity throughout the domain. Until now, the appropriate conductivity distribution was usually determined using transformation thermodynamics. In this way, only a few particular cases of heat flux control in simple domains with simple boundary conditions have been studied. Thermal metamaterials designed with optimization algorithms provide superior properties compared with those obtained using the previous methods. As a more general approach, we propose to define the heat control problem as an optimization problem in which we minimize the error in guiding the heat flux in a given way, taking as design variables the parameters that define the variable microstructure of the metamaterial. In the present study we numerically demonstrate the ability to manipulate heat flux by designing a device that concentrates the thermal energy at its center without disturbing the temperature profile outside it.

  2. R2-Based Multi/Many-Objective Particle Swarm Optimization

    PubMed Central

    Toscano, Gregorio; Barron-Zambrano, Jose Hugo; Tello-Leal, Edgar

    2016-01-01

    We propose to couple the R2 performance measure and Particle Swarm Optimization in order to handle multi/many-objective problems. Our proposal shows that, through a well-designed interaction process, the metaheuristic can be kept almost unaltered, and that, thanks to the R2 performance measure, neither an external archive nor Pareto dominance is needed to guide the search. The proposed approach is validated using several test problems and performance measures commonly adopted in the specialized literature. Results indicate that the proposed algorithm produces results that are competitive with respect to those obtained by four well-known MOEAs. Additionally, we validate our proposal on many-objective optimization problems. In these problems, our approach showed its main strength, since it could outperform another well-known indicator-based MOEA. PMID:27656200
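
    For context, the unary R2 indicator commonly used in this line of work scores an approximation set A against a set of weight vectors \Lambda and a utopian point z^* via weighted Tchebycheff utilities (this is the usual formulation; the abstract does not spell out the exact variant used):

      R2(A;\Lambda,z^*) = \frac{1}{|\Lambda|} \sum_{\lambda \in \Lambda}\;
      \min_{a \in A}\; \max_{j}\; \lambda_j \,\lvert z_j^* - a_j \rvert ,

    so smaller values indicate better convergence and spread of A, which is why the measure can replace both the external archive and Pareto dominance when guiding the swarm.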

  3. Optimizing Dendritic Cell-Based Approaches for Cancer Immunotherapy

    PubMed Central

    Datta, Jashodeep; Terhune, Julia H.; Lowenfeld, Lea; Cintolo, Jessica A.; Xu, Shuwen; Roses, Robert E.; Czerniecki, Brian J.

    2014-01-01

    Dendritic cells (DC) are professional antigen-presenting cells uniquely suited for cancer immunotherapy. They induce primary immune responses, potentiate the effector functions of previously primed T-lymphocytes, and orchestrate communication between innate and adaptive immunity. The remarkable diversity of cytokine activation regimens, DC maturation states, and antigen-loading strategies employed in current DC-based vaccine design reflects an evolving, but incomplete, understanding of optimal DC immunobiology. In the clinical realm, existing DC-based cancer immunotherapy efforts have yielded encouraging but inconsistent results. Despite recent U.S. Food and Drug Administration (FDA) approval of DC-based sipuleucel-T for metastatic castration-resistant prostate cancer, clinically effective DC immunotherapy as monotherapy for a majority of tumors remains a distant goal. Recent work has identified strategies that may allow for more potent “next-generation” DC vaccines. Additionally, multimodality approaches incorporating DC-based immunotherapy may improve clinical outcomes. PMID:25506283

  4. Optimization and determination of polycyclic aromatic hydrocarbons in biochar-based fertilizers.

    PubMed

    Chen, Ping; Zhou, Hui; Gan, Jay; Sun, Mingxing; Shang, Guofeng; Liu, Liang; Shen, Guoqing

    2015-03-01

    The agronomic benefit of biochar has attracted widespread attention to biochar-based fertilizers. However, the inevitable presence of polycyclic aromatic hydrocarbons in biochar is a matter of concern because of the health and ecological risks of these compounds. The strong adsorption of polycyclic aromatic hydrocarbons to biochar complicates their analysis and extraction from biochar-based fertilizers. In this study, we optimized and validated a method for determining the 16 priority polycyclic aromatic hydrocarbons in biochar-based fertilizers. Results showed that accelerated solvent extraction exhibited high extraction efficiency. Based on a Box-Behnken design with a triplicate central point, accelerated solvent extraction was used under the following optimal operational conditions: extraction temperature of 78°C, extraction time of 17 min, and two static cycles. The optimized method was validated by assessing the linearity of analysis, limit of detection, limit of quantification, recovery, and application to real samples. The results showed that the 16 polycyclic aromatic hydrocarbons exhibited good linearity, with a correlation coefficient of 0.996. The limits of detection varied between 0.001 (phenanthrene) and 0.021 mg/g (benzo[ghi]perylene), and the limits of quantification varied between 0.004 (phenanthrene) and 0.069 mg/g (benzo[ghi]perylene). The relative recoveries of the 16 polycyclic aromatic hydrocarbons were 70.26-102.99%.

  5. Model based optimization of wind erosion control by tree shelterbelt for suitable land management

    NASA Astrophysics Data System (ADS)

    Bartus, M.; Farsang, A.; Szatmári, J.; Barta, K.

    2012-04-01

    The degradation of soil by wind erosion causes huge problems in many parts of the world. The wind erodes the upper, nutrient-rich part of the soil; therefore, erosion causes soil productivity loss. The length of tree shelterbelts was significantly reduced by collectivisation (1960-1989), and the area affected by wind erosion expanded in Hungary. The tree shelterbelt is more than just a tool of wind erosion control; with good planning it can increase the yield. The tree shelterbelt reduces the wind speed and changes the microclimate, providing better conditions for plant growth. The aim of our work is to estimate wind erosion risk and to find ways to reduce it with tree shelterbelts. A GIS-based model was created to calculate the risk, and the optimal windbreak position was defined to reduce the wind erosion risk to a minimum. The model is based on the German standard DIN 19706 (Ermittlung der Erosionsgefährdung von Böden durch Wind; Estimation of Wind Erosion Risk). The model uses five inputs: the structure and carbon content of the soil, the average yearly wind speed at 10 m height, the cultivated plants, and the height and position of the windbreak. The study field (16 km2) was chosen near Szeged (SE Hungary). In our investigation, the cultivated plant species and the position and height of the windbreaks were varied. Different scenarios were constructed using land management data from the last few years. The best-case scenario (zero wind erosion) and the worst-case scenario (with no tree shelterbelt and the worst land use) were used to find the optimal windbreak position. Finally, the research proved that tree shelterbelts can provide proper protection against wind erosion, but for optimal land management the cultivated plant types should also be controlled. As a result of the research, a land management plan was defined to reduce the wind erosion risk on the study field, containing the positions for planting new tree shelterbelts and the optimal cultivation.

  6. Estimation of the effects of normal tissue sparing using equivalent uniform dose-based optimization

    PubMed Central

    Senthilkumar, K.; Maria Das, K. J.; Balasubramanian, K.; Deka, A. C.; Patil, B. R.

    2016-01-01

    In this study, we estimate the differences in normal tissue sparing between intensity-modulated radiotherapy (IMRT) treatment plans generated with and without a dose-volume (DV)-based physical cost function using equivalent uniform dose (EUD). Twenty prostate cancer patients were retrospectively selected for this study. For each patient, two IMRT plans were generated: (i) EUD-based optimization with a DV-based physical cost function to control inhomogeneity (EUD-with-DV) and (ii) EUD-based optimization without a DV-based physical cost function to allow inhomogeneity (EUD-without-DV). The generated plans were prescribed a dose of 72 Gy in 36 fractions to the planning target volume (PTV). Mean dose, D30%, and D5% were evaluated for all organs at risk (OARs). Normal tissue complication probability was also calculated for all OARs using the BioSuite software. The average PTV volume for all patients was 103.02 ± 27 cm3. The PTV mean dose for EUD-with-DV plans was 73.67 ± 1.7 Gy, whereas that for EUD-without-DV plans was 80.42 ± 2.7 Gy. It was found that the PTV volume receiving more than 115% of the prescription dose was negligible in EUD-with-DV plans, whereas it was 28% in EUD-without-DV plans. In almost all dosimetric parameters evaluated, the dose to OARs in EUD-with-DV plans was higher than in EUD-without-DV plans. Allowing an inhomogeneous dose (EUD-without-DV) inside the target would achieve better normal tissue sparing compared to a homogeneous dose distribution (EUD-with-DV). Hence, this inhomogeneous dose could be intentionally dumped on the high-risk volume to achieve high local control. Therefore, it was concluded that EUD-optimized plans offer the added advantage of lower OAR dose as well as selectively boosting the dose to the gross tumor volume. PMID:27217624
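
    The equivalent uniform dose referred to throughout this record is usually computed with Niemierko's generalized formula (assumed here; the abstract does not restate it):

      \mathrm{EUD} = \left( \sum_i v_i \, D_i^{\,a} \right)^{1/a},

    where v_i is the fractional volume of the structure receiving dose D_i and a is a tissue-specific parameter (a \to 1 recovers the mean dose, large positive a approaches the maximum dose for serial organs, and negative a emphasizes cold spots in targets).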

  7. Risk management and statistical multivariate analysis approach for design and optimization of satranidazole nanoparticles.

    PubMed

    Dhat, Shalaka; Pund, Swati; Kokare, Chandrakant; Sharma, Pankaj; Shrivastava, Birendra

    2017-01-01

    The rapidly evolving technical and regulatory landscapes of pharmaceutical product development necessitate risk management with the application of multivariate analysis using Process Analytical Technology (PAT) and Quality by Design (QbD). The poorly soluble, high-dose drug satranidazole was optimally nanoprecipitated (SAT-NP) employing principles of Formulation by Design (FbD). The potential risk factors influencing the critical quality attributes (CQAs) of SAT-NP were identified using an Ishikawa diagram. A Plackett-Burman screening design was adopted to screen the eight critical formulation and process parameters influencing the mean particle size, zeta potential, and dissolution efficiency at 30 min in pH 7.4 dissolution medium. Pareto charts (individual and cumulative) revealed the three most critical factors influencing the CQAs of SAT-NP, viz. the aqueous stabilizer (polyvinyl alcohol), the release modifier (Eudragit® S 100), and the volume of the aqueous phase. The levels of these three critical formulation attributes were optimized by FbD within the established design space to minimize the mean particle size and polydispersity index and to maximize the encapsulation efficiency of SAT-NP. Lenth's and Bayesian analyses, along with mathematical modeling of the results, allowed identification and quantification of the critical formulation attributes significantly active on the selected CQAs. The optimized SAT-NP exhibited a mean particle size of 216 nm, a polydispersity index of 0.250, a zeta potential of -3.75 mV, and an encapsulation efficiency of 78.3%. The product was lyophilized using mannitol to form a readily redispersible powder. X-ray diffraction analysis confirmed the conversion of crystalline SAT to the amorphous form. In vitro release of SAT-NP in gradually pH-changing media showed <20% release at pH 1.2 and pH 6.8 in 5 h, while complete release (>95%) occurred at pH 7.4 within the next 3 h, indicative of burst release after a lag time. This investigation demonstrated effective application of risk management and QbD tools in developing site-specific release

  8. Risk management for optimal land use planning integrating ecosystem services values: A case study in Changsha, Middle China.

    PubMed

    Liang, Jie; Zhong, Minzhou; Zeng, Guangming; Chen, Gaojie; Hua, Shanshan; Li, Xiaodong; Yuan, Yujie; Wu, Haipeng; Gao, Xiang

    2017-02-01

    Land-use change has a direct impact on ecosystem services and alters ecosystem services values (ESVs). Ecosystem services analysis is beneficial for land management and decisions. However, the application of ESVs to land use decision-making is scarce. In this paper, a method integrating ESVs to balance future ecosystem-service benefit and risk is developed to optimize investment in land for ecological conservation in land use planning. Using ecological conservation in land use planning in Changsha as an example, the ESV is regarded as the expected ecosystem-service benefit, and the uncertainty of land use change is regarded as the risk. This method can optimize the allocation of investment in land to improve the ecological benefit. The results show that investment should be weighted toward Liuyang City to obtain a higher benefit. The investment should also be shifted from Liuyang City to other regions to reduce risk. In practice, lower and upper limits on the weight distribution, which affect the optimal outcome and the selection of the investment allocation, should be set. This method can reveal the optimal spatial allocation of investment to maximize the expected ecosystem-service benefit at a given level of risk or minimize risk at a given level of expected ecosystem-service benefit. The results of our optimization analyses highlight tradeoffs between future ecosystem-service benefit and uncertainty of land use change in land use decisions.

  9. Risk assessment and hierarchical risk management of enterprises in chemical industrial parks based on catastrophe theory.

    PubMed

    Chen, Yu; Song, Guobao; Yang, Fenglin; Zhang, Shushen; Zhang, Yun; Liu, Zhenyu

    2012-12-03

    According to risk systems theory and the characteristics of the chemical industry, an index system was established for risk assessment of enterprises in chemical industrial parks (CIPs) based on the inherent risk of the source, effectiveness of the prevention and control mechanism, and vulnerability of the receptor. A comprehensive risk assessment method based on catastrophe theory was then proposed and used to analyze the risk levels of ten major chemical enterprises in the Songmu Island CIP, China. According to the principle of equal distribution function, the chemical enterprise risk level was divided into the following five levels: 1.0 (very safe), 0.8 (safe), 0.6 (generally recognized as safe, GRAS), 0.4 (unsafe), 0.2 (very unsafe). The results revealed five enterprises (50%) with an unsafe risk level, and another five enterprises (50%) at the generally recognized as safe risk level. This method solves the multi-objective evaluation and decision-making problem. Additionally, this method involves simple calculations and provides an effective technique for risk assessment and hierarchical risk management of enterprises in CIPs.

  10. Human biomonitoring to optimize fish consumption advice: reducing uncertainty when evaluating benefits and risks.

    PubMed

    Arnold, Scott M; Lynn, Tracey V; Verbrugge, Lori A; Middaugh, John P

    2005-03-01

    National fish consumption advisories that are based solely on assessment of risk of exposure to contaminants without consideration of consumption benefits result in overly restrictive advice that discourages eating fish even in areas where such advice is unwarranted. In fact, generic fish advisories may have adverse public health consequences because of decreased fish consumption and substitution of foods that are less healthy. Public health is on the threshold of a new era for determining actual exposures to environmental contaminants, owing to technological advances in analytical chemistry. It is now possible to target fish consumption advice to specific at-risk populations by evaluating individual contaminant exposures and health risk factors. Because of the current epidemic of nutritionally linked disease, such as obesity, diabetes, and cardiovascular disease, general recommendations for limiting fish consumption are ill conceived and potentially dangerous.

  11. Tools for Risk-Based UXO Remediation

    DTIC Science & Technology

    2014-01-01

    we (i) performed a probabilistic risk assessment using polarizabilities and ground truth information from Camp San Luis Obispo, Camp Butner, and... actual depth distribution of the UXO recovered at San Luis Obispo and results of the synthetic seed study, we conclude that all of the UXO, at least... same detection scheme, for burial depths of up to 0.77 m. Thus, the detection process applied to ESTCP’s Classification Study at San Luis Obispo, CA

  12. Optimization-based mesh correction with volume and convexity constraints

    DOE PAGES

    D'Elia, Marta; Ridzal, Denis; Peterson, Kara J.; ...

    2016-02-24

    Here, we consider the problem of finding a mesh such that 1) it is the closest, with respect to a suitable metric, to a given source mesh having the same connectivity, and 2) the volumes of its cells match a set of prescribed positive values that are not necessarily equal to the cell volumes in the source mesh. Also, this volume correction problem arises in important simulation contexts, such as satisfying a discrete geometric conservation law and solving transport equations by incremental remapping or similar semi-Lagrangian transport schemes. In this paper we formulate volume correction as a constrained optimization problem in which the distance to the source mesh defines an optimization objective, while the prescribed cell volumes, mesh validity and/or cell convexity specify the constraints. We solve this problem numerically using a sequential quadratic programming (SQP) method whose performance scales with the mesh size. To achieve scalable performance we develop a specialized multigrid-based preconditioner for optimality systems that arise in the application of the SQP method to the volume correction problem. Numerical examples illustrate the importance of volume correction, and showcase the accuracy, robustness and scalability of our approach.

  13. Optimization-based mesh correction with volume and convexity constraints

    SciTech Connect

    D'Elia, Marta; Ridzal, Denis; Peterson, Kara J.; Bochev, Pavel; Shashkov, Mikhail

    2016-02-24

    Here, we consider the problem of finding a mesh such that 1) it is the closest, with respect to a suitable metric, to a given source mesh having the same connectivity, and 2) the volumes of its cells match a set of prescribed positive values that are not necessarily equal to the cell volumes in the source mesh. Also, this volume correction problem arises in important simulation contexts, such as satisfying a discrete geometric conservation law and solving transport equations by incremental remapping or similar semi-Lagrangian transport schemes. In this paper we formulate volume correction as a constrained optimization problem in which the distance to the source mesh defines an optimization objective, while the prescribed cell volumes, mesh validity and/or cell convexity specify the constraints. We solve this problem numerically using a sequential quadratic programming (SQP) method whose performance scales with the mesh size. To achieve scalable performance we develop a specialized multigrid-based preconditioner for optimality systems that arise in the application of the SQP method to the volume correction problem. Numerical examples illustrate the importance of volume correction, and showcase the accuracy, robustness and scalability of our approach.
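
    In generic notation (symbols illustrative, not taken from the paper), the constrained optimization problem described in this record has the form

      \min_{x} \; \tfrac{1}{2}\,\lVert x - x_s \rVert^2
      \quad \text{subject to} \quad
      V_c(x) = \bar{v}_c \;\; \forall c, \qquad g(x) \ge 0,

    where x are the node coordinates of the corrected mesh, x_s those of the source mesh, V_c(x) the volume of cell c, \bar{v}_c its prescribed value, and g(x) \ge 0 collects the mesh-validity and cell-convexity inequality constraints; the SQP iteration then solves a sequence of quadratic programs whose optimality systems are preconditioned with the multigrid scheme mentioned in the abstract.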

  14. Robustness-Based Design Optimization Under Data Uncertainty

    NASA Technical Reports Server (NTRS)

    Zaman, Kais; McDonald, Mark; Mahadevan, Sankaran; Green, Lawrence

    2010-01-01

    This paper proposes formulations and algorithms for design optimization under both aleatory (i.e., natural or physical variability) and epistemic uncertainty (i.e., imprecise probabilistic information), from the perspective of system robustness. The proposed formulations deal with epistemic uncertainty arising from both sparse and interval data without any assumption about the probability distributions of the random variables. A decoupled approach is proposed in this paper to un-nest the robustness-based design from the analysis of the non-design epistemic variables to achieve computational efficiency. The proposed methods are illustrated for the upper stage design problem of a two-stage-to-orbit (TSTO) vehicle, where the information on the random design inputs is only available as sparse point and/or interval data. As collecting more data reduces uncertainty but increases cost, the effect of sample size on the optimality and robustness of the solution is also studied. A method is developed to determine the optimal sample size for sparse point data that leads to solutions of the design problem that are least sensitive to variations in the input random variables.
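
    A common way to cast robustness-based design, and one reasonable reading of the formulation sketched above (the weights and normalizers here are illustrative, not the paper's), is to trade off the mean and spread of the performance function:

      \min_{d} \;\; w_1 \,\frac{\mu_f(d)}{\mu^*} + w_2 \,\frac{\sigma_f(d)}{\sigma^*}
      \quad \text{subject to feasibility constraints,}

    where \mu_f and \sigma_f are the mean and standard deviation of the objective under the aleatory and epistemic inputs; with sparse point or interval data these moments are themselves uncertain, which is what the decoupled treatment of the non-design epistemic variables addresses.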

  15. Weather forecast-based optimization of integrated energy systems.

    SciTech Connect

    Zavala, V. M.; Constantinescu, E. M.; Krause, T.; Anitescu, M.

    2009-03-01

    In this work, we establish an on-line optimization framework to exploit detailed weather forecast information in the operation of integrated energy systems, such as buildings and photovoltaic/wind hybrid systems. We first discuss how the use of traditional reactive operation strategies that neglect the future evolution of the ambient conditions can translate into high operating costs. To overcome this problem, we propose the use of a supervisory dynamic optimization strategy that can lead to more proactive and cost-effective operations. The strategy is based on the solution of a receding-horizon stochastic dynamic optimization problem. This permits the direct incorporation of economic objectives, statistical forecast information, and operational constraints. To obtain the weather forecast information, we employ a state-of-the-art forecasting model initialized with real meteorological data. The statistical ambient information is obtained from a set of realizations generated by the weather model executed in an operational setting. We present proof-of-concept simulation studies to demonstrate that the proposed framework can lead to significant savings (more than an 18% reduction) in operating costs.
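
    The supervisory strategy described above is, in generic notation (symbols illustrative), a receding-horizon stochastic program re-solved as new forecasts arrive:

      \min_{u_0,\dots,u_{N-1}} \;
      \mathbb{E}_{\omega}\!\left[ \sum_{k=0}^{N-1} c\big(x_k, u_k, \omega_k\big) \right]
      \quad \text{s.t.} \quad x_{k+1} = f(x_k, u_k, \omega_k),\;\; x_k \in \mathcal{X},\;\; u_k \in \mathcal{U},

    where the disturbance samples \omega_k are drawn from the ensemble of weather-forecast realizations; only the first control u_0 is applied before the horizon is shifted and the problem is solved again.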

  16. Optimal sensor placement using FRFs-based clustering method

    NASA Astrophysics Data System (ADS)

    Li, Shiqi; Zhang, Heng; Liu, Shiping; Zhang, Zhe

    2016-12-01

    The purpose of this work is to develop an optimal sensor placement method by selecting the most relevant degrees of freedom as the actual measurement positions. Based on the observation matrix of a structure's frequency response, two optimality criteria are used to avoid information redundancy among the candidate degrees of freedom. By using principal component analysis, the frequency response matrix can be decomposed into principal directions and their corresponding singular values. A relatively small number of principal directions will retain a system's dominant response information. According to the dynamic similarity of each degree of freedom, the k-means clustering algorithm is designed to classify the degrees of freedom, and the effective independence method deletes the redundant sensors within each cluster. Finally, two numerical examples and a modal test are included to demonstrate the efficiency of the derived method. It is shown that the proposed method provides a way to extract sub-optimal sensor sets and that the selected sensors are well distributed over the whole structure.
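
    A compact sketch of the selection pipeline described above, assuming the FRF observation matrix has one row per candidate degree of freedom; FRF magnitudes are used so the features are real-valued, and the in-cluster effective-independence pruning is simplified to picking the dominant contributor (function and variable names are illustrative):

      import numpy as np
      from sklearn.cluster import KMeans

      def select_sensors(frf, n_sensors, n_directions):
          """Pick one representative DOF per cluster of dynamically similar DOFs."""
          frf = np.abs(np.asarray(frf))                        # real-valued FRF magnitudes
          u, s, _ = np.linalg.svd(frf, full_matrices=False)    # principal directions
          features = u[:, :n_directions] * s[:n_directions]    # dominant response content per DOF
          labels = KMeans(n_clusters=n_sensors, n_init=10).fit_predict(features)
          chosen = []
          for c in range(n_sensors):
              members = np.where(labels == c)[0]
              contribution = np.sum(features[members] ** 2, axis=1)
              chosen.append(int(members[np.argmax(contribution)]))
          return sorted(chosen)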

  17. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
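
    The core allocation problem described above can be sketched as a small linear program; the land-use categories, coefficients, and zoning bound below are purely hypothetical numbers chosen to illustrate the structure, not values from the paper:

      import numpy as np
      from scipy.optimize import linprog

      # land uses: residential, commercial, flood channel (areas in hectares)
      net_benefit = np.array([120.0, 200.0, 10.0])    # development benefit per ha
      expected_damage = np.array([40.0, 90.0, 0.0])   # probability-weighted flood loss per ha
      c = -(net_benefit - expected_damage)            # linprog minimizes, so negate net value

      A_ub = np.array([[1.0, 1.0, 1.0]])              # total floodplain area constraint
      b_ub = np.array([1000.0])
      bounds = [(0, None), (0, None), (100.0, None)]  # zoning rule: keep at least 100 ha of channel

      result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
      print(result.x)   # optimal hectares allocated to each land use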

  18. Antihypertensive combination therapy: optimizing blood pressure control and cardiovascular risk reduction.

    PubMed

    Nesbitt, Shawna D

    2007-11-01

    Treating hypertension reduces the rates of myocardial infarction, stroke, and renal disease; however, clinical trial experience suggests that monotherapy is not likely to be successful for achieving goal blood pressure (BP) levels in many hypertensive patients. In multiple recent clinical trials including various subsets of hypertensive patients, the achievement of BP goal has typically required the combination of 2 or more medications, particularly in patients with BP levels >160/100 mm Hg. When initiating combination therapy for hypertension, careful consideration must be given to the choice of medication. Clinical trial evidence has shown the efficacy of various combinations of angiotensin-converting enzyme inhibitors, angiotensin II receptor blockers, calcium channel blockers, and diuretics in reducing BP and cardiovascular risk. Ongoing trials should provide additional guidance on the optimal choice of combination regimens in specific clinical settings.

  19. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.

  20. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.

  1. Optimal Constellation Design for Satellite Based Augmentation System

    NASA Astrophysics Data System (ADS)

    Kawano, Isao

    Global Positioning System (GPS) is widely utilized in daily life, for instance in car navigation. The Wide Area Augmentation System (WAAS) and the Local Area Augmentation System (LAAS) have been proposed to provide GPS with better navigation accuracy and integrity capability. A Satellite Based Augmentation System (SBAS) is a kind of WAAS, and the Multi-functional Transportation Satellite (MTSAT) has been developed in Japan. To improve navigation accuracy most efficiently, augmentation satellites should be placed so as to minimize the Geometric Dilution of Precision (GDOP) of the constellation. In this paper, the result of an optimal constellation design for SBAS is presented.
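
    Placing an augmentation satellite to minimize GDOP reduces, for a fixed user location, to evaluating the geometry matrix built from the unit line-of-sight vectors and a receiver clock column. The sketch below evaluates GDOP for a candidate geostationary slot added to a small, purely illustrative GPS-like geometry; the satellite positions are assumptions and visibility constraints are ignored.

```python
import numpy as np

def gdop(sat_ecef, user_ecef):
    """Geometric dilution of precision for a set of satellite positions (ECEF, metres)."""
    rows = []
    for s in sat_ecef:
        los = s - user_ecef
        rows.append(np.append(-los / np.linalg.norm(los), 1.0))   # unit LOS + clock column
    G = np.array(rows)
    return float(np.sqrt(np.trace(np.linalg.inv(G.T @ G))))

# Purely illustrative geometry (visibility constraints ignored):
# four GPS-like satellites plus one candidate geostationary augmentation slot.
R_GPS, R_GEO = 26_560e3, 42_164e3
gps = np.array([[R_GPS, 0, 0], [0, R_GPS, 0],
                [-R_GPS, 0, R_GPS / 2], [0, -R_GPS, R_GPS / 2]], float)
user = np.array([6_371e3, 0.0, 0.0])

def with_geo(lon_deg):
    geo = R_GEO * np.array([np.cos(np.radians(lon_deg)), np.sin(np.radians(lon_deg)), 0.0])
    return gdop(np.vstack([gps, geo]), user)

best_lon = min(range(-180, 181, 5), key=with_geo)
print(f"baseline GDOP: {gdop(gps, user):.2f}, "
      f"best GEO longitude: {best_lon} deg (GDOP {with_geo(best_lon):.2f})")
```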

  2. Research on Topology Optimization of Truss Structures Based on the Improved Group Search Optimizer

    NASA Astrophysics Data System (ADS)

    Haobin, Xie; Feng, Liu; Lijuan, Li; Chun, Wang

    2010-05-01

    In this paper, a novel optimization algorithm, the group search optimizer (GSO), is applied to truss structure topology optimization. The group search optimizer is improved in two aspects: the use of harmony memory and adherence to the boundary. Two topology methods, heuristic topology and discretization of topology variables, are incorporated into GSO to ensure that the topology optimization works well. At the end of the paper, two numerical examples are used to test the improved GSO. Calculation results show that the improved GSO is feasible and robust for truss topology optimization.

  3. Optimism

    PubMed Central

    Carver, Charles S.; Scheier, Michael F.; Segerstrom, Suzanne C.

    2010-01-01

    Optimism is an individual difference variable that reflects the extent to which people hold generalized favorable expectancies for their future. Higher levels of optimism have been related prospectively to better subjective well-being in times of adversity or difficulty (i.e., controlling for previous well-being). Consistent with such findings, optimism has been linked to higher levels of engagement coping and lower levels of avoidance, or disengagement, coping. There is evidence that optimism is associated with taking proactive steps to protect one's health, whereas pessimism is associated with health-damaging behaviors. Consistent with such findings, optimism is also related to indicators of better physical health. The energetic, task-focused approach that optimists take to goals also relates to benefits in the socioeconomic world. Some evidence suggests that optimism relates to more persistence in educational efforts and to higher later income. Optimists also appear to fare better than pessimists in relationships. Although there are instances in which optimism fails to convey an advantage, and instances in which it may convey a disadvantage, those instances are relatively rare. In sum, the behavioral patterns of optimists appear to provide models of living for others to learn from. PMID:20170998

  4. An Improved Teaching-Learning-Based Optimization with the Social Character of PSO for Global Optimization

    PubMed Central

    Zou, Feng; Chen, Debao; Wang, Jiangtao

    2016-01-01

    An improved teaching-learning-based optimization combined with the social character of PSO (TLBO-PSO), which considers the teacher's influence on the students and the mean grade of the class, is proposed in this paper to find global solutions of function optimization problems. In this method, the teacher phase of TLBO is modified: the new position of an individual is determined by its old position, the mean position, and the best position of the current generation. The method overcomes the disadvantage that evolution in the original TLBO may stall when the mean position of the students equals the position of the teacher. To decrease the computational cost of the algorithm, the duplicate-removal step of the original TLBO is not adopted in the improved algorithm. Moreover, the probability of local convergence of the improved method is decreased by a mutation operator. The effectiveness of the proposed method is tested on some benchmark functions, and the results are competitive with respect to some other methods. PMID:27057157
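
    The modified teacher phase described above, in which a learner's new position depends on its old position, the class mean, and the teacher (current best individual), can be sketched as follows. The update weighting, the greedy selection step, and the benchmark function are illustrative choices, not the exact formulation of the paper.

```python
import numpy as np

def teacher_phase(pop, fitness, rng):
    """One TLBO-PSO-style teacher phase for minimization.

    Each learner moves using its old position, the class mean, and the
    teacher (current best), so evolution does not stall when the mean
    coincides with the teacher.  The exact weighting is an illustrative choice.
    """
    teacher = pop[np.argmin(fitness)]
    mean = pop.mean(axis=0)
    TF = rng.integers(1, 3)                      # teaching factor in {1, 2}
    r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
    return pop + r1 * (teacher - TF * mean) + r2 * (teacher - pop)

def sphere(x):
    return np.sum(x ** 2, axis=1)

rng = np.random.default_rng(1)
pop = rng.uniform(-5, 5, size=(30, 10))
for _ in range(200):
    cand = teacher_phase(pop, sphere(pop), rng)
    improved = sphere(cand) < sphere(pop)        # greedy selection, as in TLBO
    pop[improved] = cand[improved]
print("best sphere value:", sphere(pop).min())
```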

  5. An Improved Teaching-Learning-Based Optimization with the Social Character of PSO for Global Optimization.

    PubMed

    Zou, Feng; Chen, Debao; Wang, Jiangtao

    2016-01-01

    An improved teaching-learning-based optimization combined with the social character of PSO (TLBO-PSO), which considers the teacher's influence on the students and the mean grade of the class, is proposed in this paper to find global solutions of function optimization problems. In this method, the teacher phase of TLBO is modified: the new position of an individual is determined by its old position, the mean position, and the best position of the current generation. The method overcomes the disadvantage that evolution in the original TLBO may stall when the mean position of the students equals the position of the teacher. To decrease the computational cost of the algorithm, the duplicate-removal step of the original TLBO is not adopted in the improved algorithm. Moreover, the probability of local convergence of the improved method is decreased by a mutation operator. The effectiveness of the proposed method is tested on some benchmark functions, and the results are competitive with respect to some other methods.

  6. Optimal multi-floor plant layout based on the mathematical programming and particle swarm optimization.

    PubMed

    Lee, Chang Jun

    2015-01-01

    In research on plant layout optimization, the main goal is to minimize the costs of pipelines and pumping between connected equipment under various constraints. What previous studies have lacked, however, is the transformation of various heuristics and safety regulations into mathematical equations. For example, proper safety distances between pieces of equipment must be maintained to prevent dangerous accidents in a complex plant. Moreover, most studies have dealt with single-floor plants, whereas many multi-floor plants have been constructed over the last decade. Therefore, an algorithm that handles various regulations and multi-floor plants needs to be developed. In this study, a Mixed Integer Non-Linear Programming (MINLP) problem including safety distances, maintenance spaces, etc., is formulated from mathematical equations. The objective function is the sum of pipeline and pumping costs, and various safety and maintenance issues are transformed into inequality or equality constraints. Because the complex nonlinear constraints make this problem very hard to solve, conventional MINLP solvers that rely on derivatives of the equations cannot be used; instead, the Particle Swarm Optimization (PSO) technique is employed. An ethylene oxide plant is used to illustrate the efficacy of this study.
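
    A continuous particle swarm optimizer with a penalty term for violated safety distances gives the flavor of the approach. The toy layout below uses a single floor, a made-up separation requirement, and a piping cost equal to the sum of pairwise distances, none of which are taken from the cited study.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, dims = 5, 2                    # five equipment items on one floor (illustrative)
min_sep = 3.0                           # required safety distance between any two items

def cost(x):
    """Piping cost (sum of pairwise distances) plus a penalty for safety-distance violations."""
    pts = x.reshape(n_units, dims)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    iu = np.triu_indices(n_units, k=1)
    piping = d[iu].sum()
    penalty = 1e3 * np.clip(min_sep - d[iu], 0, None).sum()
    return piping + penalty

# standard global-best PSO over the flattened coordinates
n_particles, n_iter = 40, 300
pos = rng.uniform(0, 20, (n_particles, n_units * dims))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([cost(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print("best layout cost:", pbest_f.min())
```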

  7. Bioassay-based risk assessment of complex mixtures

    SciTech Connect

    Donnelly, K.C.; Huebner, H.J.

    1996-12-31

    The baseline risk assessment often plays an integral role in various decision-making processes at Superfund sites. The present study reports on risk characterizations prepared for seven complex mixtures using biological and chemical analysis. Three of the samples (A, B, and C) were complex mixtures of polycyclic aromatic hydrocarbons (PAHs) extracted from coal tar, while four samples extracted from munitions-contaminated soil contained primarily nitroaromatic hydrocarbons. The chemical-based risk assessment ranked sample C as least toxic, while the risk associated with samples A and B was approximately equal. The microbial bioassay was in general agreement for the coal tar samples. The weighted activity of the coal tar extracts in Salmonella was 4,960 for sample C, and 162,000 and 206,000 for samples A and B, respectively. The bacterial mutagenicity of 2,4,6-trinitrotoluene-contaminated soils exhibited an indirect correlation with the chemical-based risk assessment. The aqueous extract of sample 004 induced 1,292 net revertants in Salmonella, while the estimated risk from ingestion and dermal absorption was 2E-9. The data indicate that the chemical-based risk assessment accurately predicted the genotoxicity of the PAHs, while the accuracy of the risk assessment for munitions-contaminated soils was limited due to the presence of metabolites of TNT degradation. The biological tests used in this research provide a valuable complement to chemical analysis for characterizing the genotoxic risk of complex mixtures.

  8. Research needs for risk-informed, performance-based regulations

    SciTech Connect

    Thadani, A.C.

    1997-01-01

    This article summarizes the activities of the Office of Research of the NRC, both from a historical perspective and as they apply to risk-based decision making. The office has been actively involved in understanding the risks of core accidents, the aging of reactor components and materials over years of service, and the analysis of severe accidents. In addition, new policy statements regarding the role of risk assessment in regulatory applications have focused attention on the need for further work. The NRC has used risk assessment in regulatory questions in the past, but in a fairly ad hoc manner. The new policies will clearly require a better-defined application of risk assessment, as well as guidance for reviewers in judging applications when a component of them is based on risk-based decision making. To address this, standard review plans are being prepared to serve as guides for such questions. In addition, with regulatory decisions being allowed to rest on risk-based arguments, it is necessary to have an adequate database, prepared and made publicly available, to support such a position.

  9. A school-based intervention for diabetes risk reduction

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We examined the effects of a multicomponent, school-based program, addressing risk factors for diabetes among children whose race, or ethnic group and socioeconomic status placed them at high risk for obesity and type 2 diabetes. Using a cluster design, we randomly assigned 42 schools to either a mu...

  10. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk... Oversight. SBA supervises, examines, and regulates, and enforces laws against, SBA Supervised Lenders...

  11. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  12. Developing a risk-based air quality health index

    NASA Astrophysics Data System (ADS)

    Wong, Tze Wai; Tam, Wilson Wai San; Yu, Ignatius Tak Sun; Lau, Alexis Kai Hon; Pang, Sik Wing; Wong, Andromeda H. S.

    2013-09-01

    We developed a risk-based, multi-pollutant air quality health index (AQHI) reporting system in Hong Kong, based on the Canadian approach. We performed time series studies to obtain the relative risks of hospital admissions for respiratory and cardiovascular diseases associated with four air pollutants: sulphur dioxide, nitrogen dioxide, ozone, and particulate matter with an aerodynamic diameter less than 10 μm (PM10). We then calculated the sum of excess risks of the hospital admissions associated with these air pollutants. The cut-off points of the summed excess risk, for the issuance of different health warnings, were based on the concentrations of these pollutants recommended as short-term Air Quality Guidelines by the World Health Organization. The excess risks were adjusted downwards for young children and the elderly. Health risk was grouped into five categories and sub-divided into eleven bands, with equal increments in excess risk from band 1 up to band 10 (the 11th band is 'band 10+'). We developed health warning messages for the general public, including at-risk groups: young children, the elderly, and people with pre-existing cardiac or respiratory diseases. The new system addressed two major shortcomings of the current standard-based system; namely, the time lag between a sudden rise in air pollutant concentrations and the issue of a health warning, and the reliance on one dominant pollutant to calculate the index. Hence, the AQHI represents an improvement over Hong Kong's existing air pollution index.
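
    The core calculation, summing pollutant-specific excess risks and mapping the total onto ten equal-increment bands, can be written compactly. The beta coefficients, pollutant set, and band ceiling below are placeholders rather than the published Hong Kong values.

```python
import math

# Illustrative AQHI-style calculation: sum pollutant-specific excess risks of
# hospital admissions and map the total onto equal-increment bands.
# Beta coefficients (per ug/m3) and the band ceiling are placeholders,
# not the published Hong Kong values.
BETA = {"SO2": 0.0004, "NO2": 0.0007, "O3": 0.0005, "PM10": 0.0006}
MAX_EXCESS_RISK = 0.12          # summed excess risk assigned to the top of band 10

def aqhi_band(conc_ugm3: dict) -> int:
    """Return the AQHI band (1-10; 11 stands for '10+') from mean concentrations."""
    excess = sum(math.exp(BETA[p] * conc_ugm3[p]) - 1.0 for p in BETA)
    band = math.ceil(10 * excess / MAX_EXCESS_RISK)
    return min(max(band, 1), 11)

print(aqhi_band({"SO2": 20, "NO2": 80, "O3": 60, "PM10": 50}))
```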

  13. Biological Bases of Space Radiation Risk

    NASA Technical Reports Server (NTRS)

    1997-01-01

    In this session, Session JP4, the discussion focuses on the following topics: Hematopoiesis Dynamics in Irradiated Mammals, Mathematical Modeling; Estimating Health Risks in Space from Galactic Cosmic Rays; Failure of Heavy Ions to Affect Physiological Integrity of the Corneal Endothelial Monolayer; Application of an Unbiased Two-Gel CDNA Library Screening Method to Expression Monitoring of Genes in Irradiated Versus Control Cells; Detection of Radiation-Induced DNA Strand Breaks in Mammalian Cells By Enzymatic Post-Labeling; Evaluation of Bleomycin-Induced Chromosome Aberrations Under Microgravity Conditions in Human Lymphocytes, Using "Fish" Techniques; Technical Description of the Space Exposure Biology Assembly Seba on ISS; and Cytogenetic Research in Biological Dosimetry.

  14. A universal optimization strategy for ant colony optimization algorithms based on the Physarum-inspired mathematical model.

    PubMed

    Zhang, Zili; Gao, Chao; Liu, Yuxin; Qian, Tao

    2014-09-01

    Ant colony optimization (ACO) algorithms often fall into local optimal solutions and have low search efficiency when solving the travelling salesman problem (TSP). To address these shortcomings, this paper proposes a universal optimization strategy for updating the pheromone matrix in ACO algorithms. The new optimization strategy takes advantage of the critical paths preserved during the evolution of the adaptive networks of the Physarum-inspired mathematical model (PMM). The optimized algorithms, denoted PMACO algorithms, enhance the amount of pheromone on the critical paths and promote the exploitation of the optimal solution. Experimental results on synthetic and real networks show that the PMACO algorithms are more efficient and robust than the traditional ACO algorithms and are applicable to the TSP with single or multiple objectives. Meanwhile, we further analyse the influence of parameters on the performance of the PMACO algorithms. Based on these analyses, the best values of these parameters are worked out for the TSP.
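
    The essence of the PMACO strategy is to reinforce pheromone on edges that the Physarum model identifies as critical. A hedged sketch of such a modified deposition step is shown below; the coupling factor, the conductivity matrix, and the exact form of the reinforcement are illustrative assumptions rather than the update rule of the paper.

```python
import numpy as np

def update_pheromone(tau, conductivity, ant_tours, tour_lengths,
                     rho=0.5, epsilon=0.3):
    """PMACO-style pheromone update (illustrative form).

    tau          : (n, n) pheromone matrix
    conductivity : (n, n) Physarum tube conductivities; large values mark
                   edges lying on critical paths of the adaptive network
    epsilon      : coupling factor between the Physarum term and classic ACO
    """
    deposit = np.zeros_like(tau)
    for tour, L in zip(ant_tours, tour_lengths):
        for i, j in zip(tour, tour[1:] + tour[:1]):     # edges of a closed TSP tour
            deposit[i, j] += 1.0 / L
            deposit[j, i] += 1.0 / L
    # classic evaporation + deposit, with extra reinforcement on critical edges
    physarum_boost = epsilon * conductivity / conductivity.max()
    return (1 - rho) * tau + deposit * (1.0 + physarum_boost)

# tiny demonstration on a 5-city instance
n = 5
tau = np.ones((n, n))
conductivity = np.random.default_rng(0).random((n, n))   # stand-in for PMM output
tour = [0, 2, 4, 1, 3]
print(update_pheromone(tau, conductivity, [tour], [10.0]).round(3))
```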

  15. Optimal pattern distributions in Rete-based production systems

    NASA Technical Reports Server (NTRS)

    Scott, Stephen L.

    1994-01-01

    Since its introduction into the AI community in the early 1980's, the Rete algorithm has been widely used. This algorithm has formed the basis for many AI tools, including NASA's CLIPS. One drawback of Rete-based implementations, however, is that the network structures used internally by the Rete algorithm make it sensitive to the arrangement of individual patterns within rules. Thus, while rules may be more or less arbitrarily placed within source files, the distribution of individual patterns within these rules can significantly affect the overall system performance. Some heuristics have been proposed to optimize pattern placement; however, these suggestions can conflict. This paper describes a systematic effort to measure the effect of pattern distribution on production system performance. An overview of the Rete algorithm is presented to provide context. A description of the methods used to explore the pattern-ordering problem is presented, using internal production system metrics such as the number of partial matches, and coarse-grained operating system data such as memory usage and time. The results of this study should be of interest to those developing and optimizing software for Rete-based production systems.

  16. Multi-objective reliability-based optimization with stochastic metamodels.

    PubMed

    Coelho, Rajan Filomeno; Bouillard, Philippe

    2011-01-01

    This paper addresses continuous optimization problems with multiple objectives and parameter uncertainty defined by probability distributions. First, a reliability-based formulation is proposed, defining the nondeterministic Pareto set as the minimal solutions such that user-defined probabilities of nondominance and constraint satisfaction are guaranteed. The formulation can be incorporated with minor modifications in a multiobjective evolutionary algorithm (here: the nondominated sorting genetic algorithm-II). Then, with a view to applying the method to large-scale structural engineering problems--for which the computational effort devoted to the optimization algorithm itself is negligible in comparison with the simulation--the second part of the study is concerned with the need to reduce the number of function evaluations while avoiding modification of the simulation code. Therefore, nonintrusive stochastic metamodels are developed in two steps. First, for a given sampling of the deterministic variables, a preliminary decomposition of the random responses (objectives and constraints) is performed through polynomial chaos expansion (PCE), allowing a representation of the responses by a limited set of coefficients. Then, the metamodel is built by kriging interpolation of the PCE coefficients with respect to the deterministic variables. The method has been tested successfully on seven analytical test cases and on the 10-bar truss benchmark, demonstrating the potential of the proposed approach to provide reliability-based Pareto solutions at a reasonable computational cost.

  17. A Localization Method for Multistatic SAR Based on Convex Optimization.

    PubMed

    Zhong, Xuqi; Wu, Junjie; Yang, Jianyu; Sun, Zhichao; Huang, Yuling; Li, Zhongyu

    2015-01-01

    In traditional localization methods for Synthetic Aperture Radar (SAR), bistatic range sum (BRS) estimation and Doppler centroid estimation (DCE) are needed to calculate the target localization. However, the DCE error greatly influences the localization accuracy. In this paper, a localization method for multistatic SAR based on convex optimization without DCE is investigated, and the influence of the BRS estimation error on localization accuracy is analysed. Firstly, using the information of each transmitter and receiver (T/R) pair and the target in the SAR image, the model functions of the T/R pairs are constructed; each model function attains its maximum on the iso-range ellipse of its T/R pair. Secondly, the target function, whose maximum is located at the position of the target, is obtained by adding all the model functions. Thirdly, the target function is optimized using the gradient descent method to obtain the position of the target. During the iteration process, principal component analysis is implemented to guarantee the accuracy of the method and improve the computational efficiency. The proposed method only utilizes the BRSs of a target in several focused images from multistatic SAR. Therefore, compared with traditional localization methods for SAR, the proposed method greatly improves the localization accuracy. The effectiveness of the localization approach is validated by simulation experiments.
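
    A minimal two-dimensional sketch of the described procedure is given below: each transmitter/receiver pair contributes a model function peaking on its iso-range ellipse, the model functions are summed, and the negated target function is minimized by gradient descent. The Gaussian form of the model function, the geometry, and the initialization near a coarse estimate are assumptions made for the sketch.

```python
import numpy as np

# Illustrative 2-D geometry: three transmitter/receiver pairs and a true target.
tx = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0]])
rx = np.array([[1000.0, 0.0], [500.0, 800.0], [900.0, 500.0]])
target_true = np.array([420.0, 310.0])
sigma = 50.0                              # width of the assumed Gaussian model function

def brs(p, t, r):
    """Bistatic range sum of point p for one transmitter/receiver pair."""
    return np.linalg.norm(p - t) + np.linalg.norm(p - r)

measured = np.array([brs(target_true, t, r) for t, r in zip(tx, rx)])

def neg_target_function(p):
    """Negative sum of the per-pair model functions (minimized by gradient descent).
    Each model function peaks where p lies on the pair's iso-range ellipse."""
    resid = np.array([brs(p, t, r) for t, r in zip(tx, rx)]) - measured
    return -np.sum(np.exp(-resid ** 2 / (2 * sigma ** 2)))

def grad(f, p, h=1e-3):
    """Central finite-difference gradient."""
    g = np.zeros_like(p)
    for k in range(p.size):
        e = np.zeros_like(p)
        e[k] = h
        g[k] = (f(p + e) - f(p - e)) / (2 * h)
    return g

p = np.array([450.0, 350.0])              # assumed coarse initial estimate
for _ in range(1000):
    p -= 100.0 * grad(neg_target_function, p)
print("estimate:", p.round(1), "  true:", target_true)
```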

  18. A Localization Method for Multistatic SAR Based on Convex Optimization

    PubMed Central

    2015-01-01

    In traditional localization methods for Synthetic Aperture Radar (SAR), bistatic range sum (BRS) estimation and Doppler centroid estimation (DCE) are needed to calculate the target localization. However, the DCE error greatly influences the localization accuracy. In this paper, a localization method for multistatic SAR based on convex optimization without DCE is investigated, and the influence of the BRS estimation error on localization accuracy is analysed. Firstly, using the information of each transmitter and receiver (T/R) pair and the target in the SAR image, the model functions of the T/R pairs are constructed; each model function attains its maximum on the iso-range ellipse of its T/R pair. Secondly, the target function, whose maximum is located at the position of the target, is obtained by adding all the model functions. Thirdly, the target function is optimized using the gradient descent method to obtain the position of the target. During the iteration process, principal component analysis is implemented to guarantee the accuracy of the method and improve the computational efficiency. The proposed method only utilizes the BRSs of a target in several focused images from multistatic SAR. Therefore, compared with traditional localization methods for SAR, the proposed method greatly improves the localization accuracy. The effectiveness of the localization approach is validated by simulation experiments. PMID:26566031

  19. [Study on the land use optimization based on PPI].

    PubMed

    Wu, Xiao-Feng; Li, Ting

    2012-03-01

    Land use type and management method, which are greatly influenced by human activities, are among the most important factors in non-point pollution. Based on the collection and analysis of non-point pollution control methods and the concept of the three ecological fronts, nine optimized land use scenarios were designed according to a rationality analysis of the current land use situation in three typical small watersheds in the Miyun reservoir basin. The Caojialu watershed was taken as an example to analyze and compare the environmental influence of the different scenarios based on the potential pollution index (PPI) and the river-section potential pollution index (R-PPI), and the best combination scenario was identified. Designing and comparing land use scenarios on the basis of PPI and R-PPI can help to find the best combination of land use type and management method, to optimize the spatial distribution and management of land use in the basin, to reduce soil erosion, and to provide strong support for the formulation of land use planning and pollution control projects.

  20. Optimization-based multiple-point geostatistics: A sparse way

    NASA Astrophysics Data System (ADS)

    Kalantari, Sadegh; Abdollahifard, Mohammad Javad

    2016-10-01

    In multiple-point simulation the image should be synthesized consistent with the given training image and hard conditioning data. Existing sequential simulation methods usually lead to error accumulation which is hardly manageable in future steps. Optimization-based methods are capable of handling inconsistencies by iteratively refining the simulation grid. In this paper, the multiple-point stochastic simulation problem is formulated in an optimization-based framework using a sparse model. The sparse model allows each patch to be constructed as a superposition of a few atoms of a dictionary formed from training patterns, leading to a significant increase in the variability of the patches. To control the creativity of the model, a local histogram matching method is proposed. Furthermore, effective solutions are proposed for different issues arising in multiple-point simulation. In order to handle hard conditioning data, a weighted matching pursuit method is developed in this paper. Moreover, a simple and efficient thresholding method is developed which allows working with categorical variables. The experiments show that the proposed method produces acceptable realizations in terms of pattern reproduction, increases the variability of the realizations, and properly handles numerous conditioning data.

  1. CFD-Based Design Optimization Tool Developed for Subsonic Inlet

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The traditional approach to the design of engine inlets for commercial transport aircraft is a tedious process that ends with a less-than-optimum design. With the advent of high-speed computers and the availability of more accurate and reliable computational fluid dynamics (CFD) solvers, numerical optimization processes can effectively be used to design an aerodynamic inlet lip that enhances engine performance. The designers' experience at Boeing Corporation showed that for a peak Mach number on the inlet surface beyond some upper limit, the performance of the engine degrades excessively. Thus, our objective was to optimize efficiency (minimize the peak Mach number) at maximum cruise without compromising performance at other operating conditions. Using the CFD code NPARC and the numerical optimization code ADS, the NASA Lewis Research Center, in collaboration with Boeing, developed an integrated procedure at Lewis to find the optimum shape of a subsonic inlet lip. We used a GRAPE-based three-dimensional grid generator to help automate the optimization procedure. The inlet lip shape at the crown and the keel was described as a superellipse, and the superellipse exponents and radii ratios were considered as design variables. Three operating conditions, cruise, takeoff, and rolling takeoff, were considered in this study. Three-dimensional Euler computations were carried out to obtain the flow field. At the initial design, the peak Mach numbers for maximum cruise, takeoff, and rolling takeoff conditions were 0.88, 1.772, and 1.61, respectively. The acceptable upper limits on the takeoff and rolling takeoff Mach numbers were 1.55 and 1.45. Since the initial design provided by Boeing was found to be optimum with respect to the maximum cruise condition, the sum of the peak Mach numbers at takeoff and rolling takeoff was minimized in the current study while the maximum cruise Mach number was constrained to be close to that at the existing design. With this objective, the

  2. A Triangle Mesh Standardization Method Based on Particle Swarm Optimization

    PubMed Central

    Duan, Liming; Bai, Yang; Wang, Haoyu; Shao, Hui; Zhong, Siyang

    2016-01-01

    To enhance the triangle quality of a reconstructed triangle mesh, a novel triangle mesh standardization method based on particle swarm optimization (PSO) is proposed. First, each vertex of the mesh and its first-order vertices are fitted to a cubic curved surface using the least squares method. Then, with the locally fitted surface as the PSO search region and the best average quality of the local triangles as the objective, the vertex positions of the mesh are adjusted. Finally, a threshold on the normal angle between the original vertex and the adjusted vertex is used to determine whether the vertex should be moved, so that the detailed features of the mesh are preserved. Compared with existing methods, experimental results show that the proposed method can effectively improve the triangle quality of the mesh while preserving the geometric features and details of the original mesh. PMID:27509129

  3. Vision-based coaching: optimizing resources for leader development.

    PubMed

    Passarelli, Angela M

    2015-01-01

    Leaders develop in the direction of their dreams, not in the direction of their deficits. Yet many coaching interactions intended to promote a leader's development fail to leverage the benefits of the individual's personal vision. Drawing on intentional change theory, this article postulates that coaching interactions that emphasize a leader's personal vision (future aspirations and core identity) evoke a psychophysiological state characterized by positive emotions, cognitive openness, and optimal neurobiological functioning for complex goal pursuit. Vision-based coaching, via this psychophysiological state, generates a host of relational and motivational resources critical to the developmental process. These resources include: formation of a positive coaching relationship, expansion of the leader's identity, increased vitality, activation of learning goals, and a promotion-orientation. Organizational outcomes as well as limitations to vision-based coaching are discussed.

  4. Vision-based coaching: optimizing resources for leader development

    PubMed Central

    Passarelli, Angela M.

    2015-01-01

    Leaders develop in the direction of their dreams, not in the direction of their deficits. Yet many coaching interactions intended to promote a leader’s development fail to leverage the benefits of the individual’s personal vision. Drawing on intentional change theory, this article postulates that coaching interactions that emphasize a leader’s personal vision (future aspirations and core identity) evoke a psychophysiological state characterized by positive emotions, cognitive openness, and optimal neurobiological functioning for complex goal pursuit. Vision-based coaching, via this psychophysiological state, generates a host of relational and motivational resources critical to the developmental process. These resources include: formation of a positive coaching relationship, expansion of the leader’s identity, increased vitality, activation of learning goals, and a promotion–orientation. Organizational outcomes as well as limitations to vision-based coaching are discussed. PMID:25926803

  5. Optimizing bulk milk dioxin monitoring based on costs and effectiveness.

    PubMed

    Lascano-Alcoser, V H; Velthuis, A G J; van der Fels-Klerx, H J; Hoogenboom, L A P; Oude Lansink, A G J M

    2013-07-01

    concentration equal to the EC maximum level. This study shows that the effectiveness of finding an incident depends not only on the ratio at which collected truck samples are mixed into a pooled sample for testing (aiming to detect a certain concentration), but also on the number of collected truck samples. In conclusion, the optimal, cost-effective monitoring strategy depends on the number of contaminated farms and the concentration aimed at detection. The models and study results offer quantitative support to risk managers of food industries and food safety authorities.

  6. Chemical Mixture Risk Assessment Additivity-Based Approaches

    EPA Science Inventory

    Powerpoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.

  7. Genetics algorithm optimization of DWT-DCT based image Watermarking

    NASA Astrophysics Data System (ADS)

    Budiman, Gelar; Novamizanti, Ledya; Iwut, Iwan

    2017-01-01

    Hiding data in image content is essential for establishing the ownership of the image. A two-dimensional discrete wavelet transform (DWT) and a discrete cosine transform (DCT) are proposed as the transform methods in this paper. First, the host image in RGB color space is converted to a selected color space, and the layer in which the watermark is embedded can also be selected. Next, the 2D-DWT transforms the selected layer into four subbands, of which only one is selected. A block-based 2D-DCT then transforms the selected subband. A binary watermark is embedded in the AC coefficients of each block after zigzag scanning and range-based coefficient selection. A delta parameter replacing the values in each range represents the embedded bit: +delta represents bit "1" and -delta represents bit "0". The parameters to be optimized by the Genetic Algorithm (GA) are the selected color space, the layer, the selected subband of the DWT decomposition, the block size, the embedding range, and delta. Simulation results show that the GA is able to determine parameters that obtain optimum imperceptibility and robustness for any watermarked image condition, whether attacked or not. The DWT step in DCT-based image watermarking, optimized by the GA, has improved watermarking performance. Under five attacks (JPEG 50%, resize 50%, histogram equalization, salt-and-pepper noise, and additive noise with variance 0.01), the proposed method achieves perfect watermark recovery with BER = 0, and the watermarked image quality, measured by PSNR, is also about 5 dB higher than that of the previous method.
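
    The embedding chain (DWT of a selected layer, block DCT of one subband, and signing a selected AC coefficient with +delta or -delta per watermark bit) can be sketched as follows using the PyWavelets and SciPy packages. The chosen subband, coefficient position, block size, and delta are fixed here for illustration; in the paper these are exactly the parameters tuned by the genetic algorithm.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed_bits(image, bits, wavelet="haar", block=8, delta=12.0):
    """Embed one bit per 8x8 block of the LL subband by signing a mid-band
    AC coefficient with +delta (bit 1) or -delta (bit 0).
    Subband, coefficient position, block size and delta are illustrative choices."""
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), wavelet)
    marked = LL.copy()
    k = 0
    for r in range(0, LL.shape[0] - block + 1, block):
        for c in range(0, LL.shape[1] - block + 1, block):
            if k >= len(bits):
                break
            B = dctn(marked[r:r + block, c:c + block], norm="ortho")
            B[3, 4] = delta if bits[k] else -delta     # mid-frequency AC coefficient
            marked[r:r + block, c:c + block] = idctn(B, norm="ortho")
            k += 1
    return pywt.idwt2((marked, (LH, HL, HH)), wavelet)

rng = np.random.default_rng(0)
host = rng.integers(0, 256, (128, 128))        # stand-in for one color layer
bits = rng.integers(0, 2, 20)                  # watermark bits
watermarked = embed_bits(host, bits)
print(watermarked.shape)
```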

  8. Risk based tiered approach (RBTASM) for pollution prevention.

    PubMed

    Elves, R G; Sweeney, L M; Tomljanovic, C

    1997-11-01

    Effective management of human health and ecological hazards in the manufacturing and maintenance environment can be achieved by focusing on the risks associated with these operations. The NDCEE Industrial Health Risk Assessment (IHRA) Program is developing a comprehensive approach to risk analysis applied to existing processes and used to evaluate alternatives. The IHRA Risk-Based Tiered Approach (RBTASM) builds on the American Society for Testing and Materials (ASTM) Risk-Based Corrective Action (RBCA) effort to remediate underground storage tanks. Using readily available information, a semi-quantitative ranking of alternatives based on environmental, safety, and occupational health criteria was produced. A Rapid Screening Assessment of alternative corrosion protection products was performed on behalf of the Joint Group on Acquisition Pollution Prevention (JG-APP). Using the RBTASM in pollution prevention alternative selection required higher tiered analysis and more detailed assessment of human health risks under site-specific conditions. This example illustrates the RBTASM for an organic finishing line using three different products (one conventional spray and two alternative powder coats). The human health risk information developed using the RBTASM is considered along with product performance, regulatory, and cost information by risk managers downselecting alternatives for implementation or further analysis.

  9. Quantifying the risks of unexploded ordnance at closed military bases.

    PubMed

    MacDonald, Jacqueline A; Small, Mitchell J; Morgan, M Granger

    2009-01-15

    Some 1,976 sites at closed military bases in the United States are contaminated with unexploded ordnance (UXO) left over from live-fire weapons training. These sites present risks to civilians who might come into contact with the UXO and cause it to explode. This paper presents the first systems analysis model for assessing the explosion risks of UXO at former military training ranges. We develop a stochastic model for estimating the probability of exposure to and explosion of UXO, before and after site cleanup. An application of the model to a 310-acre parcel at Fort Ord, California, shows that substantial risk can remain even after a site is declared clean. We estimate that the risk to individual construction workers of encountering UXO that explodes would range from 4 x 10^-4 to 5 x 10^-2, depending on model assumptions, well above typical Occupational Safety and Health Administration (OSHA) and U.S. Environmental Protection Agency (EPA) target risk levels of 10^-4 to 10^-6. In contrast, a qualitative UXO risk assessment method, the Munitions and Explosives of Concern Hazard Assessment (MEC HA), developed by an interagency work group led by the EPA, indicates that the explosion risk at the case study site is low and "compatible with current and determined or reasonably anticipated future risk." We argue that a quantitative approach, like that illustrated in this paper, is necessary to provide a more complete picture of risks and the opportunities for risk reduction.

  10. Does the Defense Industrial Base Environment Create Strategic Risk?

    DTIC Science & Technology

    2013-03-01

    required of them to meet future DoD requirements. If an industry is struggling financially, money wasted on designing capabilities that DoD does not... by Lieutenant Colonel Brandon L. Grubbs, United States...

  11. Risk-based decision making for terrorism applications.

    PubMed

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.

  12. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2016-11-28

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
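
    The contrast between the expected-utility and prospect-theory decision rules for a single household can be illustrated with a toy investment choice. The flood probability, damages, measure cost, and prospect-theory parameters below are invented for illustration; with small probabilities overweighted and losses amplified, the prospect-theory household invests in mitigation while the risk-neutral expected-value household does not.

```python
# Toy household decision: invest in a loss-reducing measure or not.
# Flood probability, damages, cost and prospect-theory parameters are illustrative.
p_flood, damage, damage_mitigated, cost = 0.01, 100_000.0, 40_000.0, 800.0

def expected_value(invest: bool) -> float:
    """Risk-neutral expected outcome (negative numbers are expected losses)."""
    loss = damage_mitigated if invest else damage
    return -(cost if invest else 0.0) - p_flood * loss

def prospect_value(invest: bool, alpha=0.88, lam=2.25, gamma=0.69) -> float:
    """Prospect-theory evaluation with loss aversion and probability weighting."""
    def v(x):          # value function for a loss of size x >= 0
        return -lam * x ** alpha
    def w(p):          # inverse-S probability weighting function
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    loss = damage_mitigated if invest else damage
    sure_cost = cost if invest else 0.0
    return v(sure_cost) + w(p_flood) * v(loss)

for name, f in [("expected value", expected_value), ("prospect theory", prospect_value)]:
    choice = "invest" if f(True) > f(False) else "do not invest"
    print(f"{name:15s}: {choice}")
```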

  13. Requirements based system level risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements.
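
    A minimal reading of this formulation is a weighted sum of requirement-satisfaction scores; the requirement names, weights, and scores in the sketch below are purely illustrative.

```python
# Minimal sketch: mission success estimated as the weighted degree to which
# each requirement is satisfied (requirements, weights and scores illustrative).
requirements = {            # requirement: (relative weight, degree satisfied in [0, 1])
    "pointing accuracy":  (0.5, 0.9),
    "data downlink rate": (0.3, 0.7),
    "thermal margin":     (0.2, 1.0),
}
expected_success = sum(w * s for w, s in requirements.values())
print(f"expected degree of mission success: {expected_success:.2f}")   # 0.86
```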

  14. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of the overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.

  15. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; because of the limited capability of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for embattling optimization of a ground-based radar surveillance network is to simulate the detection performance of all possible stations against cataloged data, make a comprehensive comparative analysis of the simulation results with a combinational method, and then select an optimal result as the station layout scheme. This method is time consuming for a single simulation and has high computational complexity for the combinational analysis; when the number of stations increases, the complexity of the optimization problem grows exponentially and cannot be handled with the traditional method, and no better way to solve the problem has been available until now. In this paper, the target detection procedure is simplified. Firstly, the space coverage of a ground-based radar is simplified and a projection model of the coverage of radar facilities at different orbital altitudes is built; then a simplified model of objects crossing the radar coverage is established according to the characteristics of space object orbital motion. After these two simplifications, the computational complexity of target detection is greatly reduced, and simulation results show the correctness of the simplified results. In addition, the detection areas of a ground-based radar network can be easily computed with the

  16. CFD-Based Design Optimization for Single Element Rocket Injector

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar; Tucker, Kevin; Papila, Nilay; Shyy, Wei

    2003-01-01

    To develop future Reusable Launch Vehicle concepts, we have conducted design optimization for a single element rocket injector, with overall goals of improving reliability and performance while reducing cost. Computational solutions based on the Navier-Stokes equations, finite rate chemistry, and the k-E turbulence closure are generated with design of experiment techniques, and the response surface method is employed as the optimization tool. The design considerations are guided by four design objectives motivated by the consideration in both performance and life, namely, the maximum temperature on the oxidizer post tip, the maximum temperature on the injector face, the adiabatic wall temperature, and the length of the combustion zone. Four design variables are selected, namely, H2 flow angle, H2 and O2 flow areas with fixed flow rates, and O2 post tip thickness. In addition to establishing optimum designs by varying emphasis on the individual objectives, better insight into the interplay between design variables and their impact on the design objectives is gained. The investigation indicates that improvement in performance or life comes at the cost of the other. Best compromise is obtained when improvements in both performance and life are given equal importance.

  17. Tree-Based Visualization and Optimization for Image Collection.

    PubMed

    Han, Xintong; Zhang, Chongyang; Lin, Weiyao; Xu, Mingliang; Sheng, Bin; Mei, Tao

    2016-06-01

    The visualization of an image collection is the process of displaying a collection of images on a screen under some specific layout requirements. This paper focuses on an important problem that is not well addressed by the previous methods: visualizing image collections into arbitrary layout shapes while arranging images according to user-defined semantic or visual correlations (e.g., color or object category). To this end, we first propose a property-based tree construction scheme to organize images of a collection into a tree structure according to user-defined properties. In this way, images can be adaptively placed with the desired semantic or visual correlations in the final visualization layout. Then, we design a two-step visualization optimization scheme to further optimize image layouts. As a result, multiple layout effects including layout shape and image overlap ratio can be effectively controlled to guarantee a satisfactory visualization. Finally, we also propose a tree-transfer scheme such that visualization layouts can be adaptively changed when users select different "images of interest." We demonstrate the effectiveness of our proposed approach through the comparisons with state-of-the-art visualization techniques.

  18. Source mask optimization study based on latest Nikon immersion scanner

    NASA Astrophysics Data System (ADS)

    Zhu, Jun; Wei, Fang; Chen, Lijun; Zhang, Chenming; Zhang, Wei; Nishinaga, Hisashi; El-Sewefy, Omar; Gao, Gen-Sheng; Lafferty, Neal; Meiring, Jason; Zhang, Recoo; Zhu, Cynthia

    2016-03-01

    The 2x nm logic foundry node has many challenges since critical levels are pushed close to the limits of low k1 ArF water immersion lithography. For these levels, improvements in lithographic performance can translate to decreased rework and increased yield. Source Mask Optimization (SMO) is one such route to realize these image fidelity improvements. During SMO, critical layout constructs are intensively optimized in both the mask and source domain, resulting in a solution for maximum lithographic entitlement. From the hardware side, advances in source technology have enabled free-form illumination. The approach allows highly customized illumination, enabling the practical application of SMO sources. The customized illumination sources can be adjusted for maximum versatility. In this paper, we present a study on a critical layer of an advanced foundry logic node using the latest ILT based SMO software, paired with state-of-the-art scanner hardware and intelligent illuminator. Performance of the layer's existing POR source is compared with the ideal SMO result and the installed source as realized on the intelligent illuminator of an NSR-S630D scanner. Both simulation and on-silicon measurements are used to confirm that the performance of the studied layer meets established specifications.

  19. [Optimal allocation of irrigation water resources based on systematical strategy].

    PubMed

    Cheng, Shuai; Zhang, Shu-qing

    2015-01-01

    With the development of society and the economy, as well as the rapid increase in population, more and more water is needed by humans, which intensifies the shortage of water resources. The scarcity of water resources and the growing competition for water among different water use sectors reduce water availability for irrigation, so it is important to plan and manage irrigation water resources scientifically and reasonably to improve water use efficiency (WUE) and ensure food security. Many investigations indicate that WUE can be increased by optimizing water use. However, existing studies have focused primarily on a particular aspect or scale and lack a systematic analysis of the irrigation water allocation problem. By summarizing previous related studies, especially those based on intelligent algorithms, this article proposes a multi-level, multi-scale framework for allocating irrigation water and illustrates the basic theory of each component of the framework. A systematic strategy for optimal irrigation water allocation can not only control the total volume of irrigation water over time but also reduce water loss in space, and it can provide a scientific basis and technical support for improving irrigation water management and ensuring food security.

  20. Development and use of risk-based inspection guides

    SciTech Connect

    Taylor, J.H.; Fresco, A.; Higgins, J.; Usher, J.; Long, S.M.

    1989-06-01

    Risk-based system inspection guides, for nuclear power plants which have been subjected to a probabilistic risk assessment (PRA), have been developed to provide guidance to NRC inspectors in prioritizing their inspection activities. Systems are prioritized, and then dominant component failure modes and human errors within those systems are identified for the above-stated purposes. Examples of applications to specific types of NRC inspection activities are also presented. Thus, the report provides guidance for both the development and use of risk-based system inspection guides. Work is proceeding to develop a methodology for risk-based guidance for nuclear power plants not subject to a PRA. 18 refs., 1 fig.

  1. Composition optimization and stability testing of a parenteral antifungal solution based on a ternary solvent system.

    PubMed

    Kovács, Kristóf; Antal, István; Stampf, György; Klebovich, Imre; Ludányi, Krisztina

    2010-03-01

    An intravenous solution is a dosage form intended for administration into the bloodstream. This route is the most rapid and the most bioavailable method of getting drugs into the systemic circulation, and therefore it is also the most liable to cause adverse effects. In order to reduce the possibility of side effects and to ensure an adequate clinical dosage of the formulation, the initially formulated composition should be optimized. It is also important that the composition retain its therapeutic effectiveness and safety throughout the shelf-life of the product. This paper focuses on the optimization and stability testing of a parenteral solution containing miconazole and ketoconazole as model drugs, solubilized with a ternary solvent system. Optimization of the solvent system was performed by assessing the risk/benefit ratio of the composition and its properties upon dilution. Stability tests were conducted based on the EMEA (European Medicines Agency) "guideline on stability testing: stability testing of existing active substances and related finished products". Experiments show that the amounts of both the co-solvent and the surface-active agent in the solvent system could be substantially reduced while still maintaining adequate solubilizing power. It is also shown that the choice of container affects the stability of the compositions. It was concluded that, by assessing the risk/benefit ratio of solubilizing power versus toxicity, the concentration of excipients could be considerably decreased while still providing a powerful solubilizing effect. It was also shown that a pharmaceutically acceptable shelf-life could be assigned to the composition, indicating good long-term stability.

  2. Communication: Optimal parameters for basin-hopping global optimization based on Tsallis statistics

    SciTech Connect

    Shang, C.; Wales, D. J.

    2014-08-21

    A fundamental problem associated with global optimization is the large free energy barrier for the corresponding solid-solid phase transitions for systems with multi-funnel energy landscapes. To address this issue we consider the Tsallis weight instead of the Boltzmann weight to define the acceptance ratio for basin-hopping global optimization. Benchmarks for atomic clusters show that using the optimal Tsallis weight can improve the efficiency by roughly a factor of two. We present a theory that connects the optimal parameters for the Tsallis weighting, and demonstrate that the predictions are verified for each of the test cases.
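
    The change amounts to replacing the Boltzmann factor in the Metropolis acceptance test of basin-hopping with the generalized Tsallis weight, which reduces to the Boltzmann rule as q approaches 1. The sketch below shows both acceptance rules; the temperature, energy difference, and q value are illustrative.

```python
import numpy as np

def boltzmann_accept(dE, T, rng):
    """Standard Metropolis acceptance used in basin-hopping."""
    return dE <= 0 or rng.random() < np.exp(-dE / T)

def tsallis_accept(dE, T, q, rng):
    """Tsallis-weight acceptance: reduces to the Boltzmann rule as q -> 1.
    The generalized weight is [1 - (1 - q) dE / T]^(1 / (1 - q)) where positive."""
    if dE <= 0:
        return True
    base = 1.0 - (1.0 - q) * dE / T
    weight = base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
    return rng.random() < weight

# compare the two acceptance weights for an uphill step (illustrative values)
rng = np.random.default_rng(0)
dE, T, q = 1.0, 0.8, 1.5
print("Boltzmann weight      :", np.exp(-dE / T))
print("Tsallis weight (q=1.5):", (1 - (1 - q) * dE / T) ** (1 / (1 - q)))
print("accepted this step?   :", tsallis_accept(dE, T, q, rng))
```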

  3. An optimization-based iterative algorithm for recovering fluorophore location

    NASA Astrophysics Data System (ADS)

    Yi, Huangjian; Peng, Jinye; Jin, Chen; He, Xiaowei

    2015-10-01

    Fluorescence molecular tomography (FMT) is a non-invasive technique that allows three-dimensional visualization of fluorophores in vivo in small animals. In practical applications of FMT, however, image reconstruction is challenging because it is a highly ill-posed problem, due to the diffusive behaviour of light transport in tissue and the limited measurement data. In this paper, we present an iterative algorithm, based on an optimization problem, for three-dimensional reconstruction of fluorescent targets. This method alternates the weighted algebraic reconstruction technique (WART) with the steepest descent method (SDM) for image reconstruction. Numerical simulation experiments and a physical phantom experiment are performed to validate our method. Furthermore, compared with the conjugate gradient method, the proposed method provides better three-dimensional (3D) localization of the fluorescent target.

  4. k-Nearest neighbors optimization-based outlier removal.

    PubMed

    Yosipof, Abraham; Senderowitz, Hanoch

    2015-03-30

    Datasets of molecular compounds often contain outliers, that is, compounds which are different from the rest of the dataset. Outliers, while often interesting, may affect data interpretation, model generation, and decision making, and should therefore be removed from the dataset prior to modeling efforts. Here, we describe a new method for the iterative identification and removal of outliers based on a k-nearest neighbors optimization algorithm. We demonstrate for three different datasets that the removal of outliers using the new algorithm provides filtered datasets which are better than those provided by four alternative outlier removal procedures as well as by random compound removal in two important aspects: (1) they better maintain the diversity of the parent datasets; (2) they give rise to quantitative structure activity relationship (QSAR) models with much better prediction statistics. The new algorithm is, therefore, suitable for the pretreatment of datasets prior to QSAR modeling.
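
    As a rough illustration of the k-nearest-neighbors idea (not the published optimization criterion), the sketch below iteratively removes the sample with the largest mean distance to its k nearest neighbours; the dataset, k, and the number of removals are arbitrary stand-ins.

        import numpy as np

        def knn_outlier_removal(X, k=5, n_remove=10):
            """Iteratively drop the sample with the largest mean distance to its
            k nearest neighbours. Returns indices of the retained samples."""
            keep = list(range(len(X)))
            for _ in range(n_remove):
                Xk = X[keep]
                D = np.linalg.norm(Xk[:, None, :] - Xk[None, :, :], axis=-1)
                np.fill_diagonal(D, np.inf)
                # Mean distance to the k nearest neighbours of each remaining sample.
                knn_mean = np.sort(D, axis=1)[:, :k].mean(axis=1)
                keep.pop(int(np.argmax(knn_mean)))
            return keep

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (95, 4)), rng.normal(8, 1, (5, 4))])  # 5 planted outliers
        removed = sorted(set(range(100)) - set(knn_outlier_removal(X, k=5, n_remove=5)))
        print("removed indices:", removed)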

  5. Optimization-based interactive segmentation interface for multiregion problems.

    PubMed

    Baxter, John S H; Rajchl, Martin; Peters, Terry M; Chen, Elvis C S

    2016-04-01

    Interactive segmentation is becoming of increasing interest to the medical imaging community in that it combines the positive aspects of both manual and automated segmentation. However, general-purpose tools have been lacking in terms of segmenting multiple regions simultaneously with a high degree of coupling between groups of labels. Hierarchical max-flow segmentation has taken advantage of this coupling for individual applications, but until recently, these algorithms were constrained to a particular hierarchy and could not be considered general-purpose. In a generalized form, the hierarchy for any given segmentation problem is specified in run-time, allowing different hierarchies to be quickly explored. We present an interactive segmentation interface, which uses generalized hierarchical max-flow for optimization-based multiregion segmentation guided by user-defined seeds. Applications in cardiac and neonatal brain segmentation are given as example applications of its generality.

  6. Patch-based near-optimal image denoising.

    PubMed

    Chatterjee, Priyam; Milanfar, Peyman

    2012-04-01

    In this paper, we propose a denoising method motivated by our previous analysis of the performance bounds for image denoising. Insights from that study are used here to derive a high-performance practical denoising algorithm. We propose a patch-based Wiener filter that exploits patch redundancy for image denoising. Our framework uses both geometrically and photometrically similar patches to estimate the different filter parameters. We describe how these parameters can be accurately estimated directly from the input noisy image. Our denoising approach, designed for near-optimal performance (in the mean-squared error sense), has a sound statistical foundation that is analyzed in detail. The performance of our approach is experimentally verified on a variety of images and noise levels. The results presented here demonstrate that our proposed method is on par or exceeding the current state of the art, both visually and quantitatively.
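
    The following toy sketch shows only the basic Wiener-shrinkage step on non-overlapping patches, assuming the noise variance is known; it omits the geometric/photometric patch grouping and parameter estimation the paper actually relies on, so treat it purely as an illustration of the filtering principle.

        import numpy as np

        def patch_wiener_denoise(img, noise_var, patch=8):
            """Toy patchwise Wiener shrinkage: within each non-overlapping patch,
            deviations from the patch mean are scaled by var_signal / (var_signal + noise_var)."""
            out = img.astype(float).copy()
            H, W = img.shape
            for i in range(0, H - patch + 1, patch):
                for j in range(0, W - patch + 1, patch):
                    block = out[i:i + patch, j:j + patch]
                    mu = block.mean()
                    var_signal = max(block.var() - noise_var, 0.0)
                    gain = var_signal / (var_signal + noise_var)
                    out[i:i + patch, j:j + patch] = mu + gain * (block - mu)
            return out

        rng = np.random.default_rng(2)
        clean = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
        noisy = clean + rng.normal(0, 0.1, clean.shape)
        den = patch_wiener_denoise(noisy, noise_var=0.01)
        print(f"noisy MSE {np.mean((noisy - clean)**2):.4f}  denoised MSE {np.mean((den - clean)**2):.4f}")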

  7. Efficacy of Code Optimization on Cache-Based Processors

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    In this paper a number of techniques for improving the cache performance of a representative piece of numerical software are presented. Target machines are popular processors from several vendors: MIPS R5000 (SGI Indy), MIPS R8000 (SGI PowerChallenge), MIPS R10000 (SGI Origin), DEC Alpha EV4 + EV5 (Cray T3D & T3E), IBM RS6000 (SP Wide-node), Intel PentiumPro (Ames' Whitney), Sun UltraSparc (NERSC's NOW). The optimizations all attempt to increase the locality of memory accesses, but they meet with rather varied and often counterintuitive success on the different computing platforms. We conclude that it may be genuinely impossible to obtain portable performance on the current generation of cache-based machines. At the least, it appears that the performance of modern commodity processors cannot be described with parameters defining the cache alone.

  8. Density-based penalty parameter optimization on C-SVM.

    PubMed

    Liu, Yun; Lian, Jie; Bartolacci, Michael R; Zeng, Qing-An

    2014-01-01

    The support vector machine (SVM) is one of the most widely used approaches for data classification and regression. SVM achieves the largest distance between the positive and negative support vectors, which neglects the remote instances away from the SVM interface. In order to avoid a position change of the SVM interface as the result of a system outlier error, C-SVM was implemented to decrease the influence of the system's outliers. Traditional C-SVM holds a uniform parameter C for both positive and negative instances; however, according to the different number proportions and the data distribution, positive and negative instances should be set with different weights for the penalty parameter of the error terms. Therefore, in this paper, we propose density-based penalty parameter optimization of C-SVM. The experimental results indicated that our proposed algorithm has outstanding performance with respect to both precision and recall.
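
    The density-based weighting scheme itself is not reproduced here, but the underlying idea of class-dependent penalty parameters can be illustrated with scikit-learn's class_weight option, which scales C per class; the weights below are chosen by hand purely for demonstration.

        from sklearn.svm import SVC
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import precision_score, recall_score

        # Imbalanced two-class problem: the minority class gets a larger effective penalty.
        X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        # class_weight scales the penalty per class, i.e. C_k = C * w_k; here the weights
        # are hand-picked, whereas the paper derives them from the data density.
        clf = SVC(C=1.0, kernel="rbf", class_weight={0: 1.0, 1: 5.0}).fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        print("precision:", precision_score(y_te, pred), "recall:", recall_score(y_te, pred))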

  9. Density-Based Penalty Parameter Optimization on C-SVM

    PubMed Central

    Liu, Yun; Lian, Jie; Bartolacci, Michael R.; Zeng, Qing-An

    2014-01-01

    The support vector machine (SVM) is one of the most widely used approaches for data classification and regression. SVM achieves the largest distance between the positive and negative support vectors, which neglects the remote instances away from the SVM interface. In order to avoid a position change of the SVM interface as the result of a system outlier error, C-SVM was implemented to decrease the influence of the system's outliers. Traditional C-SVM holds a uniform parameter C for both positive and negative instances; however, according to the different number proportions and the data distribution, positive and negative instances should be set with different weights for the penalty parameter of the error terms. Therefore, in this paper, we propose density-based penalty parameter optimization of C-SVM. The experimental results indicated that our proposed algorithm has outstanding performance with respect to both precision and recall. PMID:25114978

  10. SVM-based glioma grading: Optimization by feature reduction analysis.

    PubMed

    Zöllner, Frank G; Emblem, Kyrre E; Schad, Lothar R

    2012-09-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. The best classification accuracy was found using PCA at 85% (sensitivity=89%, specificity=84%) when reducing the feature vector from 101 dimensions (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided similar classification accuracy to literature values (∼87%) while reducing the number of features by up to 98%.
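
    A minimal sketch of the PCA variant of such a pipeline (standardize, project to 3 components, classify with an SVM) is shown below; the data are random stand-ins for the 101-dimensional feature vectors, so the reported cross-validation score is meaningless except as a usage example.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Stand-in for 101 patients x (100-bin rCBV histogram + age); labels stand in for tumor grade.
        X = rng.random((101, 101))
        y = rng.integers(0, 2, 101)

        model = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
        print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())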

  11. Lymphatic Filariasis Transmission Risk Map of India, Based on a Geo-Environmental Risk Model

    PubMed Central

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-01-01

    The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM developed on geographic information system (GIS) platform is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas. PMID:23808973

  12. Optimizing timing performance of silicon photomultiplier-based scintillation detectors

    PubMed Central

    Yeom, Jung Yeol; Vinke, Ruud

    2013-01-01

    Precise timing resolution is crucial for applications requiring photon time-of-flight (ToF) information such as ToF positron emission tomography (PET). Silicon photomultipliers (SiPM) for PET, with their high output capacitance, are known to require custom preamplifiers to optimize timing performance. In this paper, we describe simple alternative front-end electronics based on a commercial low-noise RF preamplifier and methods that have been implemented to achieve excellent timing resolution. Two radiation detectors with L(Y)SO scintillators coupled to Hamamatsu SiPMs (MPPC S10362–33-050C) and front-end electronics based on an RF amplifier (MAR-3SM+), typically used for wireless applications that require minimal additional circuitry, have been fabricated. These detectors were used to detect annihilation photons from a Ge-68 source and the output signals were subsequently digitized by a high speed oscilloscope for offline processing. A coincident resolving time (CRT) of 147 ± 3 ps FWHM and 186 ± 3 ps FWHM with 3 × 3 × 5 mm3 and with 3 × 3 × 20 mm3 LYSO crystal elements were measured, respectively. With smaller 2 × 2 × 3 mm3 LSO crystals, a CRT of 125 ± 2 ps FWHM was achieved with slight improvement to 121 ± 3 ps at a lower temperature (15°C). Finally, with the 20 mm length crystals, a degradation of timing resolution was observed for annihilation photon interactions that occur close to the photosensor compared to shallow depth-of-interaction (DOI). We conclude that commercial RF amplifiers optimized for noise, besides their ease of use, can produce excellent timing resolution comparable to best reported values acquired with custom readout electronics. On the other hand, as timing performance degrades with increasing photon DOI, a head-on detector configuration will produce better CRT than a side-irradiated setup for longer crystals. PMID:23369872

  13. A controller based on Optimal Type-2 Fuzzy Logic: systematic design, optimization and real-time implementation.

    PubMed

    Fayek, H M; Elamvazuthi, I; Perumal, N; Venkatesh, B

    2014-09-01

    A computationally efficient systematic procedure to design an Optimal Type-2 Fuzzy Logic Controller (OT2FLC) is proposed. The main scheme is to optimize the gains of the controller using Particle Swarm Optimization (PSO), then optimize only two parameters per type-2 membership function using a Genetic Algorithm (GA). The proposed OT2FLC was implemented in real-time to control the position of a DC servomotor, which is part of a robotic arm. The performance judgments were carried out based on the Integral Absolute Error (IAE), as well as the computational cost. Various type-2 defuzzification methods were investigated in real-time. A comparative analysis with an Optimal Type-1 Fuzzy Logic Controller (OT1FLC) and a PI controller demonstrated OT2FLC's superiority, which is evident in handling uncertainty and imprecision induced in the system by means of noise and disturbances.

  14. A risk-based framework for biomedical data sharing.

    PubMed

    Dankar, Fida K; Badji, Radja

    2017-02-01

    The problem of biomedical data sharing is a form of gambling; on one hand it incurs the risk of privacy violations and on the other it stands to profit from knowledge discovery. In general, the risk of granting data access to a user depends heavily upon the data requested, the purpose for the access, the user requesting the data (user motives) and the security of the user's environment. While traditional manual biomedical data sharing processes (based on institutional review boards) are lengthy and demanding, the automated ones (known as honest broker systems) disregard the individualities of different requests and offer "one-size-fits-all" solutions to all data requestors. In this manuscript, we propose a conceptual risk-aware data sharing system; the system brings the concept of risk, from all contextual information surrounding a data request, into the data disclosure decision module. The decision module, in turn, imposes mitigation measures to counter the calculated risk.

  15. Swarm Optimization-Based Magnetometer Calibration for Personal Handheld Devices

    PubMed Central

    Ali, Abdelrahman; Siddharth, Siddharth; Syed, Zainab; El-Sheimy, Naser

    2012-01-01

    Inertial Navigation Systems (INS) consist of accelerometers, gyroscopes and a processor that generates position and orientation solutions by integrating the specific forces and rotation rates. In addition to the accelerometers and gyroscopes, magnetometers can be used to derive the user heading based on Earth's magnetic field. Unfortunately, the measurements of the magnetic field obtained with low-cost sensors are usually corrupted by several errors, including manufacturing defects and external electromagnetic fields. Consequently, proper calibration of the magnetometer is required to achieve high-accuracy heading measurements. In this paper, a Particle Swarm Optimization (PSO)-based calibration algorithm is presented to estimate the values of the bias and scale factor of low-cost magnetometers. The main advantage of this technique is the use of artificial intelligence, which does not require any error modeling or awareness of the nonlinearity. Furthermore, the proposed algorithm can help in the development of Pedestrian Navigation Devices (PNDs) when combined with inertial sensors and GPS/Wi-Fi for indoor navigation and Location Based Services (LBS) applications.
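
    A minimal sketch of the idea, under simplifying assumptions, is given below: a small global-best PSO estimates per-axis bias and scale factors so that the corrected measurement magnitudes match a constant reference field. The swarm parameters, the error model, and the synthetic data are illustrative choices, not the authors' formulation.

        import numpy as np

        def pso(cost, dim, n_particles=40, iters=200, bounds=(-2.0, 2.0), seed=0):
            """Minimal particle swarm optimizer (global-best topology)."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            x = rng.uniform(lo, hi, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
            g = pbest[np.argmin(pbest_f)].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
                x = x + v
                f = np.array([cost(p) for p in x])
                improved = f < pbest_f
                pbest[improved], pbest_f[improved] = x[improved], f[improved]
                g = pbest[np.argmin(pbest_f)].copy()
            return g

        # Synthetic magnetometer readings: true field magnitude 1.0, unknown per-axis bias/scale.
        rng = np.random.default_rng(1)
        true_bias, true_scale = np.array([0.2, -0.1, 0.05]), np.array([1.1, 0.9, 1.05])
        dirs = rng.normal(size=(300, 3)); dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        meas = dirs / true_scale + true_bias   # corrupted measurements

        def cost(p):
            bias, scale = p[:3], p[3:]
            mag = np.linalg.norm(scale * (meas - bias), axis=1)
            return np.mean((mag - 1.0) ** 2)

        est = pso(cost, dim=6)
        # The magnitude cost is invariant to the sign of each scale factor, so report absolute values.
        print("bias:", np.round(est[:3], 2), "scale:", np.round(np.abs(est[3:]), 2))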

  16. Parallel performance optimizations on unstructured mesh-based simulations

    DOE PAGES

    Sarje, Abhinav; Song, Sukhyun; Jacobsen, Douglas; ...

    2015-06-01

    This paper addresses two key parallelization challenges in the unstructured mesh-based ocean modeling code MPAS-Ocean, which uses a mesh based on Voronoi tessellations: (1) load imbalance across processes, and (2) unstructured data access patterns that inhibit intra- and inter-node performance. Our work analyzes the load imbalance due to naive partitioning of the mesh, and develops methods to generate mesh partitionings with better load balance and reduced communication. Furthermore, we present methods that minimize both inter- and intra-node data movement and maximize data reuse. Our techniques include predictive ordering of data elements for higher cache efficiency, as well as communication reduction approaches. We present detailed performance data when running on thousands of cores using the Cray XC30 supercomputer and show that our optimization strategies can exceed the original performance by over 2×. Additionally, many of these solutions can be broadly applied to a wide variety of unstructured grid-based computations.

  17. Development and optimization of biofilm based algal cultivation

    NASA Astrophysics Data System (ADS)

    Gross, Martin Anthony

    This dissertation describes research done on biofilm-based algal cultivation systems. The system developed in this work is the revolving algal biofilm cultivation system (RAB). A raceway-retrofit and a trough-based pilot-scale RAB system were developed and investigated. Each of the systems significantly outperformed a control raceway pond in side-by-side tests. Furthermore, the RAB system was found to require significantly less water than the raceway pond based cultivation system. Lastly, a TEA/LCA analysis was conducted to evaluate the economics and life cycle of the RAB cultivation system in comparison to a raceway pond. It was found that the RAB system was able to grow algae at a lower cost and was shown to be profitable at a smaller scale than the raceway pond style of algal cultivation. Additionally, the RAB system was projected to have lower GHG emissions, and better energy and water use efficiencies in comparison to a raceway pond system. Furthermore, fundamental research was conducted to identify the optimal material for algae to attach to. A total of 28 materials with a smooth surface were tested for initial cell colonization, and it was found that the tetradecane contact angle of the materials had a good correlation with cell attachment. The effects of surface texture were evaluated using mesh materials (nylon, polypropylene, high density polyethylene, polyester, aluminum, and stainless steel) with openings ranging from 0.05 to 6.40 mm. It was found that both surface texture and material composition influence algal attachment.

  18. Combining antihypertensive and antihyperlipidemic agents – optimizing cardiovascular risk factor management

    PubMed Central

    Zamorano, José; Edwards, Jonathan

    2011-01-01

    Clinical guidelines now recognize the importance of a multifactorial approach to managing cardiovascular (CV) risk. This idea was taken a step further with the concept of the Polypill™. There are, however, considerable patent, pharmacokinetic, pharmacodynamic, registration, and cost implications that will need to be overcome before the Polypill™ or other single-pill combinations of CV medications become widely available. However, a medication targeting blood pressure (BP) and lipids provides much of the proposed benefits of the Polypill™. A single-pill combination of the antihypertensive amlodipine besylate and the lipid-lowering medication atorvastatin calcium (SPAA) is currently available in many parts of the world. This review describes the rationale for this combination therapy and the clinical trials that have demonstrated that these two agents can be combined without the loss of efficacy for either agent or an increase in the incidence of adverse events. The recently completed Cluster Randomized Usual Care vs Caduet Investigation Assessing Long-term-risk (CRUCIAL trial) is discussed in detail. CRUCIAL was a 12-month, international, multicenter, prospective, open-label, parallel design, cluster-randomized trial, which demonstrated that a proactive intervention strategy based on SPAA in addition to usual care (UC) had substantial benefits on estimated CV risk, BP, and lipids over continued UC alone. Adherence with antihypertensive and lipid-lowering therapies outside of the controlled environment of clinical trials is very low (~30%–40% at 12 months). Observational studies have demonstrated that improving adherence to lipid-lowering and antihypertensive medications may reduce CV events. One means of improving adherence is the use of single-pill combinations. Real-world observational studies have demonstrated that patients are more adherent to SPAA than co-administered antihypertensive and lipid-lowering therapy, and this improved adherence translated to

  19. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus--evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  20. A Third-Generation Evidence Base for Human Spaceflight Risks

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Lumpkins, Sarah; Steil, Jennifer; Pellis, Neal; Charles, John

    2014-01-01

    NASA's Human Research Program (HRP) seeks to understand and mitigate risks to crew health and performance in exploration missions. HRP's evidence base consists of an Evidence Report for each HRP risk. Three generations of Evidence Reports have been used: (1) review articles, which offered good content but had limited authorship and infrequent updates; (2) Wikipedia articles, which were viewed often and very open to contributions, but summarized the reviews and attracted very few contributions; and (3) HRP-controlled wiki articles, which allow incremental additions to the review articles with editorial control.

  1. An Improved Ensemble of Random Vector Functional Link Networks Based on Particle Swarm Optimization with Double Optimization Strategy.

    PubMed

    Ling, Qing-Hua; Song, Yu-Qing; Han, Fei; Yang, Dan; Huang, De-Shuang

    2016-01-01

    For ensemble learning, how to select and combine the candidate classifiers are two key issues which dramatically influence the performance of the ensemble system. The random vector functional link network (RVFL) without direct input-to-output links is a suitable base classifier for ensemble systems because of its fast learning speed, simple structure and good generalization performance. In this paper, to obtain a more compact ensemble system with improved convergence performance, an improved ensemble of RVFLs based on attractive and repulsive particle swarm optimization (ARPSO) with a double optimization strategy is proposed. In the proposed method, ARPSO is applied to select and combine the candidate RVFLs. When using ARPSO to select the optimal base RVFLs, ARPSO considers both the convergence accuracy on the validation data and the diversity of the candidate ensemble system to build the RVFL ensembles. In the process of combining the RVFLs, the ensemble weights corresponding to the base RVFLs are initialized by the minimum-norm least-squares method and then further optimized by ARPSO. Finally, a few redundant RVFLs are pruned, and thus a more compact ensemble of RVFLs is obtained. Moreover, in this paper, theoretical analysis and justification on how to prune the base classifiers for classification problems are presented, and a simple and practically feasible strategy for pruning redundant base classifiers on both classification and regression problems is proposed. Since the double optimization is performed on the basis of the single optimization, the ensemble of RVFLs built by the proposed method outperforms those built by some single-optimization methods. Experimental results on function approximation and classification problems verify that the proposed method improves convergence accuracy as well as reducing the complexity of the ensemble system.
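
    The ARPSO selection and weighting stage is not reproduced here; the sketch below only shows what a single RVFL-style base classifier without direct input-to-output links can look like (fixed random hidden layer, ridge least-squares output weights), with all sizes and data being arbitrary stand-ins.

        import numpy as np

        class TinyRVFL:
            """RVFL-style classifier without direct input-to-output links:
            fixed random hidden layer, output weights from a ridge least-squares solve."""
            def __init__(self, n_hidden=100, ridge=1e-3, seed=0):
                self.n_hidden, self.ridge, self.seed = n_hidden, ridge, seed

            def fit(self, X, y):
                rng = np.random.default_rng(self.seed)
                self.W = rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = rng.normal(size=self.n_hidden)
                H = np.tanh(X @ self.W + self.b)
                T = np.eye(y.max() + 1)[y]                      # one-hot targets
                A = H.T @ H + self.ridge * np.eye(self.n_hidden)
                self.beta = np.linalg.solve(A, H.T @ T)         # ridge-regularized LS output weights
                return self

            def predict(self, X):
                return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

        rng = np.random.default_rng(3)
        X = rng.normal(size=(400, 10)); y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)
        clf = TinyRVFL().fit(X[:300], y[:300])
        print("test accuracy:", (clf.predict(X[300:]) == y[300:]).mean())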

  2. An Improved Ensemble of Random Vector Functional Link Networks Based on Particle Swarm Optimization with Double Optimization Strategy

    PubMed Central

    Ling, Qing-Hua; Song, Yu-Qing; Han, Fei; Yang, Dan; Huang, De-Shuang

    2016-01-01

    For ensemble learning, how to select and combine the candidate classifiers are two key issues which dramatically influence the performance of the ensemble system. The random vector functional link network (RVFL) without direct input-to-output links is a suitable base classifier for ensemble systems because of its fast learning speed, simple structure and good generalization performance. In this paper, to obtain a more compact ensemble system with improved convergence performance, an improved ensemble of RVFLs based on attractive and repulsive particle swarm optimization (ARPSO) with a double optimization strategy is proposed. In the proposed method, ARPSO is applied to select and combine the candidate RVFLs. When using ARPSO to select the optimal base RVFLs, ARPSO considers both the convergence accuracy on the validation data and the diversity of the candidate ensemble system to build the RVFL ensembles. In the process of combining the RVFLs, the ensemble weights corresponding to the base RVFLs are initialized by the minimum-norm least-squares method and then further optimized by ARPSO. Finally, a few redundant RVFLs are pruned, and thus a more compact ensemble of RVFLs is obtained. Moreover, in this paper, theoretical analysis and justification on how to prune the base classifiers for classification problems are presented, and a simple and practically feasible strategy for pruning redundant base classifiers on both classification and regression problems is proposed. Since the double optimization is performed on the basis of the single optimization, the ensemble of RVFLs built by the proposed method outperforms those built by some single-optimization methods. Experimental results on function approximation and classification problems verify that the proposed method improves convergence accuracy as well as reducing the complexity of the ensemble system. PMID:27835638

  3. Efficacy of Code Optimization on Cache-based Processors

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The current common wisdom in the U.S. is that the powerful, cost-effective supercomputers of tomorrow will be based on commodity (RISC) micro-processors with cache memories. Already, most distributed systems in the world use such hardware as building blocks. This shift away from vector supercomputers and towards cache-based systems has brought about a change in programming paradigm, even when ignoring issues of parallelism. Vector machines require inner-loop independence and regular, non-pathological memory strides (usually this means: non-power-of-two strides) to allow efficient vectorization of array operations. Cache-based systems require spatial and temporal locality of data, so that data once read from main memory and stored in high-speed cache memory is used optimally before being written back to main memory. This means that the most cache-friendly array operations are those that feature zero or unit stride, so that each unit of data read from main memory (a cache line) contains information for the next iteration in the loop. Moreover, loops ought to be 'fat', meaning that as many operations as possible are performed on cache data, provided instruction caches do not overflow and enough registers are available. If unit stride is not possible, for example because of some data dependency, then care must be taken to avoid pathological strides, just as on vector computers. For cache-based systems the issues are more complex, due to the effects of associativity and of non-unit block (cache line) size. But there is more to the story. Most modern micro-processors are superscalar, which means that they can issue several (arithmetic) instructions per clock cycle, provided that there are enough independent instructions in the loop body. This is another argument for providing fat loop bodies. With these restrictions, it appears fairly straightforward to produce code that will run efficiently on any cache-based system. It can be argued that although some of the important
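
    Even from Python, the stride/locality point described above can be made visible: with a C-ordered NumPy array, summing along contiguous rows (unit stride) is typically noticeably faster than striding down columns. The array size and timing harness below are arbitrary, and the magnitude of the gap depends on the machine.

        import time
        import numpy as np

        a = np.zeros((4096, 4096))          # C-ordered: rows are contiguous in memory

        def row_major_sum(a):
            """Unit-stride access: each row is a contiguous, cache-friendly block."""
            s = 0.0
            for i in range(a.shape[0]):
                s += a[i, :].sum()
            return s

        def column_major_sum(a):
            """Large-stride access: consecutive elements are a full row apart in memory."""
            s = 0.0
            for j in range(a.shape[1]):
                s += a[:, j].sum()
            return s

        for fn in (row_major_sum, column_major_sum):
            t0 = time.perf_counter(); fn(a)
            print(fn.__name__, f"{time.perf_counter() - t0:.3f} s")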

  4. Cloud-based large-scale air traffic flow optimization

    NASA Astrophysics Data System (ADS)

    Cao, Yi

    The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of historical flight records, and the mode is estimated by using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem such that the subproblems are solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to a linear runtime increase when the problem size increases. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show the decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability. This framework is an "off-the-shelf" parallel computing model
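
    The traversal-time model can be illustrated with a short sketch: given historical traversal times for one origin-destination route, a kernel density estimate is evaluated on a grid and its argmax is taken as the modal traversal time. scipy's gaussian_kde is assumed available, and the data are synthetic stand-ins.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        # Stand-in for historical traversal times (minutes) of one origin-destination route.
        times = np.concatenate([rng.normal(210, 8, 400), rng.normal(245, 15, 100)])

        kde = gaussian_kde(times)
        grid = np.linspace(times.min(), times.max(), 2000)
        mode_estimate = grid[np.argmax(kde(grid))]   # mode of the estimated density
        print(f"estimated modal traversal time: {mode_estimate:.1f} min")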

  5. Risk-based testing of imported animals: A case study for bovine tuberculosis in The Netherlands.

    PubMed

    de Vos, Clazien J; van der Goot, Jeanet A; van Zijderveld, Fred G; Swanenburg, Manon; Elbers, Armin R W

    2015-09-01

    In intra-EU trade, the health status of animals is warranted by issuing a health certificate after clinical inspection in the exporting country. This certificate cannot provide guarantee of absence of infection, especially not for diseases with a long incubation period and no overt clinical signs such as bovine tuberculosis (bTB). The Netherlands are officially free from bTB since 1999. However, frequent reintroductions occurred in the past 15 years through importation of infected cattle. Additional testing (AT) of imported cattle could enhance the probability of detecting an imported bTB infection in an early stage. The goal of this study was to evaluate the effectiveness of risk-based AT for bTB in cattle imported into The Netherlands. A generic stochastic import risk model was developed that simulates introduction of infection into an importing country through importation of live animals. Main output parameters are the number of infected animals that is imported (Ninf), the number of infected animals that is detected by testing (Ndet), and the economic losses incurred by importing infected animals (loss). The model was parameterized for bTB. Model calculations were optimized to either maximize Ndet or to minimize loss. Model results indicate that the risk of bTB introduction into The Netherlands is very high. For the current situation in which Dutch health checks on imported cattle are limited to a clinical inspection of a random sample of 5-10% of imported animals, the calculated annual Ninf=99 (median value). Random AT of 8% of all imported cattle results in Ndet=7 (median value), while the median Ndet=75 if the sampling strategy for AT is optimized to maximize Ndet. However, in the latter scenario, loss is more than twice as large as in the current situation, because only calves are tested for which cost of detection is higher than the expected gain of preventing a possible outbreak. When optimizing the sampling strategy for AT to minimize loss, only breeding

  6. Traffic Aware Planner for Cockpit-Based Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Woods, Sharon E.; Vivona, Robert A.; Henderson, Jeffrey; Wing, David J.; Burke, Kelly A.

    2016-01-01

    The Traffic Aware Planner (TAP) software application is a cockpit-based advisory tool designed to be hosted on an Electronic Flight Bag and to enable and test the NASA concept of Traffic Aware Strategic Aircrew Requests (TASAR). The TASAR concept provides pilots with optimized route changes (including altitude) that reduce fuel burn and/or flight time, avoid interactions with known traffic, weather and restricted airspace, and may be used by the pilots to request a route and/or altitude change from Air Traffic Control. Developed using an iterative process, TAP's latest improvements include human-machine interface design upgrades and added functionality based on the results of human-in-the-loop simulation experiments and flight trials. Architectural improvements have been implemented to prepare the system for operational-use trials with partner commercial airlines. Future iterations will enhance coordination with airline dispatch and add functionality to improve the acceptability of TAP-generated route-change requests to pilots, dispatchers, and air traffic controllers.

  7. Task-based optimization of image reconstruction in breast CT

    NASA Astrophysics Data System (ADS)

    Sanchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2014-03-01

    We demonstrate a task-based assessment of image quality in dedicated breast CT in order to optimize the number of projection views acquired. The methodology we employ is based on the Hotelling Observer (HO) and its associated metrics. We consider two tasks: the Rayleigh task of discerning between two resolvable objects and a single larger object, and the signal detection task of classifying an image as belonging to either a signal-present or signal-absent hypothesis. HO SNR values are computed for 50, 100, 200, 500, and 1000 projection view images, with the total imaging radiation dose held constant. We use the conventional fan-beam FBP algorithm and investigate the effect of varying the width of a Hanning window used in the reconstruction, since this affects both the noise properties of the image and the under-sampling artifacts which can arise in the case of sparse-view acquisitions. Our results demonstrate that fewer projection views should be used in order to increase HO performance, which in this case constitutes an upper bound on human observer performance. However, the impact on HO SNR of using fewer projection views, each with a higher dose, is not as significant as the impact of employing regularization in the FBP reconstruction through a Hanning filter.
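
    The HO figure of merit follows the standard form SNR^2 = dg' K^{-1} dg, with dg the difference of the class-mean images and K the average intra-class covariance. The sketch below estimates it from samples of signal-absent and signal-present images; the toy data and ROI size are arbitrary stand-ins, not the breast CT data of the record.

        import numpy as np

        def hotelling_snr(imgs_absent, imgs_present):
            """Hotelling observer SNR: SNR^2 = dg^T K^{-1} dg, with dg the difference of
            class means and K the average intra-class covariance, both estimated from samples."""
            dg = imgs_present.mean(axis=0) - imgs_absent.mean(axis=0)
            K = 0.5 * (np.cov(imgs_absent, rowvar=False) + np.cov(imgs_present, rowvar=False))
            w = np.linalg.solve(K, dg)          # Hotelling template
            return float(np.sqrt(dg @ w))

        rng = np.random.default_rng(0)
        n_pix = 64                               # flattened ROI pixels
        signal = np.zeros(n_pix); signal[30:34] = 0.5
        absent = rng.normal(0, 1, (500, n_pix))
        present = rng.normal(0, 1, (500, n_pix)) + signal
        print("HO SNR:", round(hotelling_snr(absent, present), 2))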

  8. Partially observable Markov decision processes for risk-based screening

    NASA Astrophysics Data System (ADS)

    Mrozack, Alex; Liao, Xuejun; Skatter, Sondre; Carin, Lawrence

    2016-05-01

    A long-term goal for checked baggage screening in airports has been to include passenger information, or at least a predetermined passenger risk level, in the screening process. One method for including that information could be treating the checked baggage screening process as a system-of-systems. This would allow for an optimized policy builder, such as one trained using the methodology of partially observable Markov decision processes (POMDP), to navigate the different sensors available for screening. In this paper we describe the necessary steps to tailor a POMDP for baggage screening, as well as results of simulations for specific screening scenarios.

  9. Optimal control of switched linear systems based on Migrant Particle Swarm Optimization algorithm

    NASA Astrophysics Data System (ADS)

    Xie, Fuqiang; Wang, Yongji; Zheng, Zongzhun; Li, Chuanfeng

    2009-10-01

    The optimal control problem for switched linear systems with internally forced switching has more constraints than that with externally forced switching. Heavy computation and slow convergence in solving this problem are major obstacles. In this paper we describe a new approach for solving this problem, called Migrant Particle Swarm Optimization (Migrant PSO). Imitating the behavior of a flock of migrant birds, the Migrant PSO applies naturally to both continuous and discrete spaces, in which a deterministic optimization algorithm and a stochastic search method are combined. The efficacy of the proposed algorithm is illustrated via a numerical example.

  10. Tradeoff-based optimization criteria in clinical trials with multiple objectives and adaptive designs.

    PubMed

    Dmitrienko, Alex; Paux, Gautier; Pulkstenis, Erik; Zhang, Jianliang

    2016-01-01

    The article discusses clinical trial optimization problems in the context of mid- to late-stage drug development. Using the Clinical Scenario Evaluation approach, main objectives of clinical trial optimization are formulated, including selection of clinically relevant optimization criteria, identification of sets of optimal and nearly optimal values of the parameters of interest, and sensitivity assessments. The paper focuses on a class of optimization criteria arising in clinical trials with several competing goals, termed tradeoff-based optimization criteria, and discusses key considerations in constructing and applying tradeoff-based criteria. The clinical trial optimization framework considered in the paper is illustrated using two case studies based on a clinical trial with multiple objectives and a two-stage clinical trial which utilizes adaptive decision rules.

  11. A nonparametric stochastic optimizer for TDMA-based neuronal signaling.

    PubMed

    Suzuki, Junichi; Phan, Dũng H; Budiman, Harry

    2014-09-01

    This paper considers neurons as a physical communication medium for intrabody networks of nano/micro-scale machines and formulates a noisy multiobjective optimization problem for a Time Division Multiple Access (TDMA) communication protocol atop the physical layer. The problem is to find the Pareto-optimal TDMA configurations that maximize communication performance (e.g., latency) by multiplexing a given neuronal network to parallelize signal transmissions while maximizing communication robustness (i.e., unlikeliness of signal interference) against noise in neuronal signaling. Using a nonparametric significance test, the proposed stochastic optimizer is designed to statistically determine the superior-inferior relationship between given two solution candidates and seek the optimal trade-offs among communication performance and robustness objectives. Simulation results show that the proposed optimizer efficiently obtains quality TDMA configurations in noisy environments and outperforms existing noise-aware stochastic optimizers.
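
    One way such a nonparametric superior-inferior test can be set up (not necessarily the test used in the paper) is a per-objective Mann-Whitney U comparison over repeated noisy evaluations, combined into a statistical dominance check; the sketch below assumes two minimization objectives and synthetic evaluation samples.

        import numpy as np
        from scipy.stats import mannwhitneyu

        def statistically_dominates(samples_a, samples_b, alpha=0.05):
            """Candidate A dominates B if, for every objective (to be minimized), A is
            significantly better or statistically indistinguishable, and it is
            significantly better in at least one objective."""
            better_somewhere = False
            for obj_a, obj_b in zip(samples_a, samples_b):
                if mannwhitneyu(obj_a, obj_b, alternative="less").pvalue < alpha:
                    better_somewhere = True
                elif mannwhitneyu(obj_b, obj_a, alternative="less").pvalue < alpha:
                    return False                  # significantly worse in this objective
            return better_somewhere

        rng = np.random.default_rng(0)
        # Two objectives (e.g., latency, interference likelihood), 30 noisy evaluations each.
        a = [rng.normal(1.0, 0.1, 30), rng.normal(0.20, 0.05, 30)]
        b = [rng.normal(1.3, 0.1, 30), rng.normal(0.22, 0.05, 30)]
        print("A dominates B:", statistically_dominates(a, b))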

  12. A simple data base for identification of risk profiles

    SciTech Connect

    Munganahalli, D.

    1996-12-31

    Sedco Forex is a drilling contractor that operates approximately 80 rigs on land and offshore worldwide. The HSE management system developed by Sedco Forex is an effort to prevent accidents and minimize losses. An integral part of the HSE management system is establishing risk profiles and thereby minimizing risk and reducing loss exposures. Risk profiles are established based on accident reports, potential accident reports and other risk identification reports (RIR) like the Du Pont STOP system. A rig could fill in as many as 30 accident reports, 30 potential accident reports and 500 STOP cards each year. Statistics are important for an HSE management system, since they are indicators of success or failure of HSE systems. It is however difficult to establish risk profiles based on statistical information, unless tools are available at the rig site to aid with the analysis. Risk profiles are then used to identify important areas in the operation that may require specific attention to minimize the loss exposure. Programs to address the loss exposure can then be identified and implemented with either a local or corporate approach. In January 1995, Sedco Forex implemented a uniform HSE Database on all the rigs worldwide. In one year companywide, the HSE database would contain information on approximately 500 accident and potential accident reports, and 10,000 STOP cards. This paper demonstrates the salient features of the database and describes how it has helped in establishing key risk profiles. It also shows a recent example of how risk profiles have been established at the corporate level and used to identify the key contributing factors to hands and finger injuries. Based on this information, a campaign was launched to minimize the frequency of occurrence and associated loss attributed to hands and fingers accidents.

  13. Risk-based approach to petroleum hydrocarbon remediation. Research study

    SciTech Connect

    Miller, R.N.; Haas, P.; Faile, M.; Taffinder, S.

    1994-12-31

    The risk-based approach utilizes tools developed under the BTEX, Intrinsic Remediation (natural attenuation), Bioslurper, and Bioventing Initiatives of the Air Force Center for Environmental Excellence Technology Transfer Division (AFCEE/ERT) to construct a risk-based, cost-effective approach to the cleanup of petroleum-contaminated sites. The AFCEE Remediation Matrix (Enclosure 1) identifies natural attenuation as the first remediation alternative for soil and ground water contaminated with petroleum hydrocarbons. The intrinsic remediation (natural attenuation) alternative requires a scientifically defensible risk assessment based on contaminant sources, pathways, and receptors. For fuel-contaminated sites, the first step is to determine the contaminants of interest. For the ground water pathway (usually considered most important by regulators), these will normally be the most soluble, mobile, and toxic compounds, namely benzene, toluene, ethylbenzene, and o-, m-, and p-xylene (BTEX).

  14. Coupling risk-based remediation with innovative technology

    SciTech Connect

    Goodheart, G.F.; Teaf, C.M. |; Manning, M.J.

    1998-05-01

    Tiered risk-based cleanup approaches have been effectively used at petroleum sites, pesticide sites and other commercial/industrial facilities. For example, the Illinois Environmental Protection Agency (IEPA) has promulgated guidance for a Tiered Approach to Corrective Action Objectives (TACO) to establish site-specific remediation goals for contaminated soil and groundwater. As in the case of many other state programs, TACO is designed to provide for adequate protection of human health and the environment based on potential risks posed by site conditions. It also incorporates site-related information that may allow more cost-effective remediation. IEPA developed TACO to provide flexibility to site owners/operators when formulating site-specific remediation activities, as well as to hasten property redevelopment to return sites to more productive use. Where appropriate, risk-based cleanup objectives as set by TACO-type programs may be coupled with innovative remediation technologies such as air sparging, bioremediation and soil washing.

  15. PERSPECTIVE: Technical fixes and climate change: optimizing for risks and consequences

    NASA Astrophysics Data System (ADS)

    Rasch, Philip J.

    2010-09-01

    Scientists and society in general are becoming increasingly concerned about the risks of climate change from the emission of greenhouse gases (IPCC 2007). Yet emissions continue to increase (Raupach et al 2007), and achieving reductions soon enough to avoid large and undesirable impacts requires a near-revolutionary global transformation of energy and transportation systems (Hoffert et al 1998). The size of the transformation and lack of an effective societal response have motivated some to explore other quite controversial strategies to mitigate some of the planetary consequences of these emissions. These strategies have come to be known as geoengineering: 'the deliberate manipulation of the planetary environment to counteract anthropogenic climate change' (Keith 2000). Concern about society's inability to reduce emissions has driven a resurgence in interest in geoengineering, particularly following the call for more research in Crutzen (2006). Two classes of geoengineering solutions have developed: (1) methods to draw CO2 out of the atmosphere and sequester it in a relatively benign form; and (2) methods that change the energy flux entering or leaving the planet without modifying CO2 concentrations by, for example, changing the planetary albedo. Only the latter methods are considered here. Summaries of many of the methods, scientific questions, and issues of testing and implementation are discussed in Launder and Thompson (2009) and Royal Society (2009). The increased attention indicates that geoengineering is not a panacea and all strategies considered will have risks and consequences (e.g. Robock 2008, Trenberth and Dai 2007). Recent studies involving comprehensive Earth system models can provide insight into subtle interactions between components of the climate system. For example Rasch et al (2009) found that geoengineering by changing boundary clouds will not simultaneously 'correct' global averaged surface temperature, precipitation, and sea ice to present

  16. Comparative assessment of absolute cardiovascular disease risk characterization from non-laboratory-based risk assessment in South African populations

    PubMed Central

    2013-01-01

    Background All rigorous primary cardiovascular disease (CVD) prevention guidelines recommend absolute CVD risk scores to identify high- and low-risk patients, but laboratory testing can be impractical in low- and middle-income countries. The purpose of this study was to compare the ranking performance of a simple, non-laboratory-based risk score to laboratory-based scores in various South African populations. Methods We calculated and compared 10-year CVD (or coronary heart disease (CHD)) risk for 14,772 adults from thirteen cross-sectional South African populations (data collected from 1987 to 2009). Risk characterization performance for the non-laboratory-based score was assessed by comparing rankings of risk with six laboratory-based scores (three versions of Framingham risk, SCORE for high- and low-risk countries, and CUORE) using Spearman rank correlation and percent of population equivalently characterized as ‘high’ or ‘low’ risk. Total 10-year non-laboratory-based risk of CVD death was also calculated for a representative cross-section from the 1998 South African Demographic Health Survey (DHS, n = 9,379) to estimate the national burden of CVD mortality risk. Results Spearman correlation coefficients for the non-laboratory-based score with the laboratory-based scores ranged from 0.88 to 0.986. Using conventional thresholds for CVD risk (10% to 20% 10-year CVD risk), 90% to 92% of men and 94% to 97% of women were equivalently characterized as ‘high’ or ‘low’ risk using the non-laboratory-based and Framingham (2008) CVD risk score. These results were robust across the six risk scores evaluated and the thirteen cross-sectional datasets, with few exceptions (lower agreement between the non-laboratory-based and Framingham (1991) CHD risk scores). Approximately 18% of adults in the DHS population were characterized as ‘high CVD risk’ (10-year CVD death risk >20%) using the non-laboratory-based score. Conclusions We found a high level of

  17. Relationship of optimism and suicidal ideation in three groups of patients at varying levels of suicide risk.

    PubMed

    Huffman, Jeff C; Boehm, Julia K; Beach, Scott R; Beale, Eleanor E; DuBois, Christina M; Healy, Brian C

    2016-06-01

    Optimism has been associated with reduced suicidal ideation, but there have been few studies in patients at high suicide risk. We analyzed data from three study populations (total N = 319) with elevated risk of suicide: (1) patients with a recent acute cardiovascular event, (2) patients hospitalized for heart disease who had depression or an anxiety disorder, and (3) patients psychiatrically hospitalized for suicidal ideation or following a suicide attempt. For each study we analyzed the association between optimism (measured by the Life Orientation Test-Revised) and suicidal ideation, and then completed an exploratory random-effects meta-analysis of the findings to synthesize these data. The meta-analysis of the three studies showed that higher levels of self-reported optimism were associated with a lower likelihood of suicidal ideation (odds ratio [OR] = .89, 95% confidence interval [CI] = .85-.95, z = 3.94, p < .001), independent of age, gender, and depressive symptoms. This association held when using the subscales of the Life Orientation Test-Revised that measured higher optimism (OR = .84, 95% CI = .76-.92, z = 3.57, p < .001) and lower pessimism (OR = .83, 95% CI = .75-.92, z = 3.61, p < .001). These results also held when suicidal ideation was analyzed as an ordinal variable. Our findings suggest that optimism may be associated with a lower risk of suicidal ideation, above and beyond the effects of depressive symptoms, for a wide range of patients with clinical conditions that place them at elevated risk for suicide.

  18. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently.

  19. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  20. Comparison of Genetic Algorithm, Particle Swarm Optimization and Biogeography-based Optimization for Feature Selection to Classify Clusters of Microcalcifications

    NASA Astrophysics Data System (ADS)

    Khehra, Baljit Singh; Pharwaha, Amar Partap Singh

    2016-06-01

    Ductal carcinoma in situ (DCIS) is one type of breast cancer. Clusters of microcalcifications (MCCs) are symptoms of DCIS that are recognized by mammography. Selection of a robust feature vector is the process of selecting an optimal subset of features from a large number of available features in a given problem domain, after feature extraction and before any classification scheme. Feature selection reduces the feature space, which improves the performance of the classifier and decreases the computational burden imposed by using many features. Selecting an optimal subset of features from a large number of available features is a difficult search problem: for n features, the total number of possible subsets is 2^n, so the problem belongs to the category of NP-hard problems. In this paper, an attempt is made to find the optimal subset of MCC features from all possible subsets of features using a genetic algorithm (GA), particle swarm optimization (PSO) and biogeography-based optimization (BBO). For simulation, a total of 380 benign and malignant MCC samples have been selected from mammogram images of the DDSM database. A total of 50 features extracted from the benign and malignant MCC samples are used in this study. In these algorithms, the fitness function is the correct classification rate of the classifier. A support vector machine is used as the classifier. From the experimental results, it is observed that the performance of the PSO-based and BBO-based algorithms in selecting an optimal subset of features for classifying MCCs as benign or malignant is better than that of the GA-based algorithm.
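
    A minimal binary-GA sketch of this kind of setup is given below: individuals are feature masks, the fitness is the cross-validated correct classification rate of an SVM on the selected features, and selection, crossover and mutation are the simplest textbook variants. The dataset, population size, and rates are stand-ins, not the DDSM features or the paper's parameter choices.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score
        from sklearn.datasets import make_classification

        X, y = make_classification(n_samples=380, n_features=50, n_informative=8, random_state=0)
        rng = np.random.default_rng(0)

        def fitness(mask):
            # Correct classification rate of an SVM restricted to the selected features.
            if not mask.any():
                return 0.0
            return cross_val_score(SVC(kernel="rbf"), X[:, mask], y, cv=3).mean()

        pop = rng.random((20, 50)) < 0.5                      # initial random feature subsets
        for gen in range(15):
            scores = np.array([fitness(m) for m in pop])
            order = np.argsort(scores)[::-1]
            parents = pop[order[:10]]                         # truncation selection
            children = []
            for _ in range(10):
                p1, p2 = parents[rng.integers(0, 10, 2)]
                cut = rng.integers(1, 49)
                child = np.concatenate([p1[:cut], p2[cut:]])  # one-point crossover
                flip = rng.random(50) < 0.02                  # bit-flip mutation
                children.append(child ^ flip)
            pop = np.vstack([parents, children])

        best = pop[np.argmax([fitness(m) for m in pop])]
        print("selected features:", int(best.sum()), "accuracy:", round(fitness(best), 3))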

  1. Prototype Biology-Based Radiation Risk Module Project

    NASA Technical Reports Server (NTRS)

    Terrier, Douglas; Clayton, Ronald G.; Patel, Zarana; Hu, Shaowen; Huff, Janice

    2015-01-01

    Biological effects of space radiation and risk mitigation are strategic knowledge gaps for the Evolvable Mars Campaign. The current epidemiology-based NASA Space Cancer Risk (NSCR) model contains large uncertainties (HAT #6.5a) due to lack of information on the radiobiology of galactic cosmic rays (GCR) and lack of human data. The use of experimental models that most accurately replicate the response of human tissues is critical for precision in risk projections. Our proposed study will compare DNA damage, histological, and cell kinetic parameters after irradiation in normal 2D human cells versus 3D tissue models, and it will use a multi-scale computational model (CHASTE) to investigate various biological processes that may contribute to carcinogenesis, including radiation-induced cellular signaling pathways. This cross-disciplinary work, with biological validation of an evolvable mathematical computational model, will help reduce uncertainties within NSCR and aid risk mitigation for radiation-induced carcinogenesis.

  2. Adaptive surrogate model based multi-objective transfer trajectory optimization between different libration points

    NASA Astrophysics Data System (ADS)

    Peng, Haijun; Wang, Wei

    2016-10-01

    An adaptive surrogate model-based multi-objective optimization strategy that combines the benefits of invariant manifolds and low-thrust control toward developing a low-computational-cost transfer trajectory between libration orbits around the L1 and L2 libration points in the Sun-Earth system has been proposed in this paper. A new structure for a multi-objective transfer trajectory optimization model has been established that divides the transfer trajectory into several segments and assigns dominant roles to invariant manifolds and low-thrust control in different segments. To reduce the computational cost of multi-objective transfer trajectory optimization, a mixed sampling strategy-based adaptive surrogate model has been proposed. Numerical simulations show that the results obtained from the adaptive surrogate-based multi-objective optimization are in agreement with those obtained using direct multi-objective optimization methods, and the computational workload of the adaptive surrogate-based multi-objective optimization is only approximately 10% of that of direct multi-objective optimization. Furthermore, the generating efficiency of the Pareto points of the adaptive surrogate-based multi-objective optimization is approximately 8 times that of the direct multi-objective optimization. Therefore, the proposed adaptive surrogate-based multi-objective optimization provides obvious advantages over direct multi-objective optimization methods.
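
    Detached from the trajectory dynamics of the paper, the surrogate idea itself can be sketched briefly: a Gaussian-process surrogate fitted to a handful of expensive evaluations pre-screens a large candidate pool so that only the most promising designs are evaluated exactly. The 'expensive' objective, the random initial design and the candidate pool below are placeholders; the paper's mixed sampling strategy and Pareto machinery are not reproduced.

        # Surrogate-assisted screening: fit a cheap Gaussian-process model to a few
        # expensive evaluations, then let it pre-select candidates for exact evaluation.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def expensive_objective(x):              # stand-in for, e.g., propagating a transfer
            return np.sum((x - 0.3) ** 2) + 0.05 * np.sin(20 * x).sum()

        rng = np.random.default_rng(1)
        dim, n_init, n_pool, n_exact = 4, 20, 2000, 10

        X_train = rng.random((n_init, dim))      # initial design (random here)
        y_train = np.array([expensive_objective(x) for x in X_train])
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_train, y_train)

        pool = rng.random((n_pool, dim))         # cheap-to-generate candidate designs
        shortlist = pool[np.argsort(gp.predict(pool))[:n_exact]]   # surrogate screening
        exact = np.array([expensive_objective(x) for x in shortlist])
        best = shortlist[exact.argmin()]
        print("best candidate:", np.round(best, 3), "objective:", round(float(exact.min()), 4))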

  3. Task-based optimization of flip angle for texture analysis in MRI

    NASA Astrophysics Data System (ADS)

    Brand, Jonathan F.; Furenlid, Lars R.; Altbach, Maria I.; Galons, Jean-Phillippe; Bhattacharyya, Achyut; Sharma, Puneet; Bhattacharyya, Tulshi; Bilgin, Ali; Martin, Diego R.

    2016-03-01

    Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. The current reference standard for diagnosing HF is biopsy followed by pathologist examination; however, this is limited by sampling error and carries a risk of complications. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network that develops within portal triads. The scale of collagen lobules is characteristically on the order of 1-5 mm, which approximates the resolution limit of in vivo gadolinium-enhanced magnetic resonance imaging in the delayed phase. We have shown that MRI of formalin-fixed human ex vivo liver samples mimics the textural contrast of in vivo Gd-MRI and can be used as MRI phantoms. We have developed local texture analysis that is applied to phantom images, and the results are used to train model observers. The performance of the observer is assessed with the area under the receiver-operating-characteristic curve (AUROC) as the figure of merit. To optimize the MRI pulse sequence, phantoms are scanned multiple times at a range of flip angles. The flip angle associated with the highest AUROC is chosen as optimal, based on the task of detecting HF.
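
    The selection step itself is simple once observer scores are available. The sketch below, with simulated placeholder scores rather than real phantom data, computes the AUROC for each candidate flip angle with scikit-learn and keeps the angle giving the highest value.

        # Task-based flip-angle selection: AUROC of model-observer scores per flip angle.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        flip_angles_deg = [10, 15, 20, 25, 30]

        def observer_scores(angle, n=200):
            """Stand-in for model-observer outputs on texture features at one flip angle."""
            separation = np.exp(-((angle - 20) ** 2) / 50.0)   # toy contrast model
            healthy = rng.normal(0.0, 1.0, n)
            fibrotic = rng.normal(separation, 1.0, n)
            return np.r_[np.zeros(n), np.ones(n)], np.r_[healthy, fibrotic]

        aurocs = {}
        for angle in flip_angles_deg:
            labels, scores = observer_scores(angle)
            aurocs[angle] = roc_auc_score(labels, scores)

        best_angle = max(aurocs, key=aurocs.get)
        print({a: round(v, 3) for a, v in aurocs.items()}, "-> optimal flip angle:", best_angle)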

  4. 12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk Adjustment

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... safe and sound banking practices. (3) The OCC may exclude a national bank otherwise meeting the... deems it consistent with safe and sound banking practices. (c) Scope. The capital requirements of this... maturity) if the payment would cause the issuing bank's risk-based capital ratio to fall or remain...

  5. Optimization-based decision support to assist in logistics planning for hospital evacuations.

    PubMed

    Glick, Roger; Bish, Douglas R; Agca, Esra

    2013-01-01

    The evacuation of a hospital is a very complex process, and evacuation planning is an important part of a hospital's emergency management plan. Numerous factors affect the evacuation plan, including the nature of the threat, the availability of resources and staff, the characteristics of the evacuee population, and the risk to patients and staff. The safety and health of patients is of fundamental importance, but safely moving patients to alternative care facilities while under threat is a very challenging task. This article describes the logistical issues and complexities involved in the planning and execution of hospital evacuations. Furthermore, this article provides examples of how optimization-based decision support tools can help evacuation planners to better plan for complex evacuations by providing real-world solutions to various evacuation scenarios.

  6. Optimizing nanomedicine pharmacokinetics using physiologically based pharmacokinetics modelling.

    PubMed

    Moss, Darren Michael; Siccardi, Marco

    2014-09-01

    The delivery of therapeutic agents is characterized by numerous challenges including poor absorption, low penetration in target tissues and non-specific dissemination in organs, leading to toxicity or poor drug exposure. Several nanomedicine strategies have emerged as an advanced approach to enhance drug delivery and improve the treatment of several diseases. Numerous processes mediate the pharmacokinetics of nanoformulations, with the absorption, distribution, metabolism and elimination (ADME) being poorly understood and often differing substantially from traditional formulations. Understanding how nanoformulation composition and physicochemical properties influence drug distribution in the human body is of central importance when developing future treatment strategies. A helpful pharmacological tool to simulate the distribution of nanoformulations is represented by physiologically based pharmacokinetics (PBPK) modelling, which integrates system data describing a population of interest with drug/nanoparticle in vitro data through a mathematical description of ADME. The application of PBPK models for nanomedicine is in its infancy and characterized by several challenges. The integration of property-distribution relationships in PBPK models may benefit nanomedicine research, giving opportunities for innovative development of nanotechnologies. PBPK modelling has the potential to improve our understanding of the mechanisms underpinning nanoformulation disposition and allow for more rapid and accurate determination of their kinetics. This review provides an overview of the current knowledge of nanomedicine distribution and the use of PBPK modelling in the characterization of nanoformulations with optimal pharmacokinetics.
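
    To make the structure of such a model concrete, the following is a minimal flow-limited sketch with plasma, liver and a lumped 'rest of body' compartment and linear hepatic clearance; every parameter value is an illustrative placeholder, not a validated nanoformulation model.

        # Minimal flow-limited PBPK-style model: plasma, liver and rest-of-body compartments
        # with linear hepatic clearance after an IV bolus. Parameters are placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        Q_liver, Q_rest = 90.0, 210.0                 # blood flows (L/h)
        V_plasma, V_liver, V_rest = 3.0, 1.8, 35.0    # compartment volumes (L)
        Kp_liver, Kp_rest = 5.0, 1.0                  # tissue:plasma partition coefficients
        CL_hep = 20.0                                 # hepatic clearance (L/h)

        def pbpk(t, y):
            Cp, Cl, Cr = y                            # concentrations: plasma, liver, rest
            dCp = (Q_liver * (Cl / Kp_liver - Cp) + Q_rest * (Cr / Kp_rest - Cp)) / V_plasma
            dCl = (Q_liver * (Cp - Cl / Kp_liver) - CL_hep * Cl / Kp_liver) / V_liver
            dCr = Q_rest * (Cp - Cr / Kp_rest) / V_rest
            return [dCp, dCl, dCr]

        dose_mg = 100.0
        sol = solve_ivp(pbpk, (0.0, 24.0), [dose_mg / V_plasma, 0.0, 0.0], dense_output=True)
        print("plasma concentration at 1, 6, 24 h:", np.round(sol.sol([1, 6, 24])[0], 3))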

  7. Optimal control of complex networks based on matrix differentiation

    NASA Astrophysics Data System (ADS)

    Li, Guoqi; Ding, Jie; Wen, Changyun; Pei, Jing

    2016-09-01

    Finding the key node set to be connected to external control sources so as to minimize the energy for controlling a complex network, known as the minimum-energy control problem, is of critical importance but remains open. We address this critical problem, in which matrix differentiation is involved. To this end, the differentiation of the energy/cost function with respect to the input matrix is obtained based on tensor analysis, and the Hessian matrix is compressed from a fourth-order tensor. A normalized projected gradient method (NPGM) and a normalized projected trust-region method (NPTM) are proposed, each with an established convergence property. We show that NPGM is more computationally efficient than NPTM. Simulation results demonstrate satisfactory performance of the algorithms and reveal important insights as well. Two interesting phenomena are observed. One is that the key node set tends to divide elementary paths equally. The other is that low-degree nodes may be more important than hubs from a control point of view, indicating that controlling hub nodes does not help to lower the control energy. These results suggest a way of achieving optimal control of complex networks, and provide meaningful insights for future research.
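
    For readers unfamiliar with the energy function involved, the sketch below evaluates the standard minimum-energy expressions for a small linear network: the finite-horizon controllability Gramian W(T), obtained by integrating exp(A t) B B^T exp(A^T t) over [0, T], and the energy x_f^T W(T)^{-1} x_f needed to steer the state from the origin to x_f for a given driver-node set. The network and driver sets are arbitrary placeholders; the paper's NPGM/NPTM algorithms for optimizing the input matrix itself are not implemented.

        # Minimum control energy for a driver-node set on a small linear network:
        # x' = A x + B u, W(T) = integral of exp(A t) B B^T exp(A^T t) dt, E = x_f^T W^-1 x_f.
        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.linalg import expm

        def gramian(A, B, T=1.0, steps=200):
            """Finite-horizon controllability Gramian via trapezoidal quadrature."""
            ts = np.linspace(0.0, T, steps)
            integrand = [expm(A * t) @ B @ B.T @ expm(A.T * t) for t in ts]
            return trapezoid(integrand, ts, axis=0)

        def min_energy(A, drivers, x_f, T=1.0):
            """Energy needed to steer the origin to x_f when only `drivers` receive input."""
            B = np.zeros((A.shape[0], len(drivers)))
            B[drivers, list(range(len(drivers)))] = 1.0    # one external source per driver node
            return float(x_f @ np.linalg.solve(gramian(A, B, T), x_f))

        rng = np.random.default_rng(3)
        n = 6
        A = rng.normal(scale=0.5, size=(n, n))
        np.fill_diagonal(A, -1.5)                          # self-damping keeps the dynamics stable
        x_f = np.ones(n)

        print("minimum energy, drivers [0]   :", round(min_energy(A, [0], x_f), 2))
        print("minimum energy, drivers [0, 3]:", round(min_energy(A, [0, 3], x_f), 2))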

  8. Finding Risk Groups by Optimizing Artificial Neural Networks on the Area under the Survival Curve Using Genetic Algorithms

    PubMed Central

    Kalderstam, Jonas; Edén, Patrik; Ohlsson, Mattias

    2015-01-01

    We investigate a new method to place patients into risk groups in censored survival data. Properties such as median survival time and end survival rate are implicitly improved by optimizing the area under the survival curve. Artificial neural networks (ANN) are trained to either maximize or minimize this area using a genetic algorithm, and combined into an ensemble to predict one of low, intermediate, or high risk groups. Estimated patient risk can influence treatment choices, and is important for study stratification. A common approach is to sort the patients according to a prognostic index and then group them along the quartile limits. The Cox proportional hazards model (Cox) is one example of this approach. Another method of doing risk grouping is recursive partitioning (Rpart), which constructs a decision tree where each branch point maximizes the statistical separation between the groups. ANN, Cox, and Rpart are compared on five publicly available data sets with varying properties. Cross-validation, as well as separate test sets, are used to validate the models. Results on the test sets show comparable performance, except for the smallest data set, where Rpart’s predicted risk groups turn out to be inverted, an example of crossing survival curves. Cross-validation shows that all three models exhibit crossing of some survival curves on this small data set, but that the ANN model manages the best separation of groups in terms of median survival time before such crossings. The conclusion is that optimizing the area under the survival curve is a viable approach to identify risk groups. Training ANNs to optimize this area combines two key strengths from both prognostic indices and Rpart. First, a desired minimum group size can be specified, as for a prognostic index. Second, non-linear effects among the covariates can be utilized, as Rpart is also able to do. PMID:26352405
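
    The quantity being optimized can be made concrete with a short sketch: the exact area under the step-shaped Kaplan-Meier curve for one candidate risk group, computed up to the last observed follow-up time. The survival times and censoring flags below are placeholders; in the paper an ANN trained by a genetic algorithm assigns patients to groups so that this area is maximized or minimized.

        # Exact area under the Kaplan-Meier survival curve (the optimization target).
        import numpy as np

        def km_area(times, events, horizon=None):
            """Area under the step-shaped Kaplan-Meier curve up to `horizon`."""
            times, events = np.asarray(times, float), np.asarray(events, int)
            order = np.argsort(times)
            times, events = times[order], events[order]
            horizon = times[-1] if horizon is None else horizon

            area, prev_t, surv, at_risk = 0.0, 0.0, 1.0, len(times)
            for t, e in zip(times, events):
                if e:                              # observed death: curve steps down at t
                    area += surv * (t - prev_t)
                    surv *= 1.0 - 1.0 / at_risk
                    prev_t = t
                at_risk -= 1                       # deaths and censorings both leave the risk set
            return area + surv * (horizon - prev_t)

        follow_up = [2, 3, 5, 5, 8, 11, 12, 15]    # placeholder follow-up times (months)
        died = [1, 0, 1, 1, 0, 1, 0, 1]            # 1 = death observed, 0 = censored
        print("area under survival curve:", round(km_area(follow_up, died), 2))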

  9. Approach to proliferation risk assessment based on multiple objective analysis framework

    SciTech Connect

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  10. Risk-based evaluation of Allowed Outage Times (AOTs) considering risk of shutdown

    SciTech Connect

    Mankamo, T.; Kim, I.S.; Samanta, P.K.

    1992-01-01

    When safety systems fail during power operation, Technical Specifications (TS) usually limit the repair within the Allowed Outage Time (AOT). If the repair cannot be completed within the AOT, or no AOT is allowed, the plant is required to be shut down for the repair. However, if the capability to remove decay heat is degraded, shutting down the plant with the need to operate the affected decay-heat removal systems may impose a substantial risk compared to continued power operation over a usual repair time. Thus, defining a proper AOT in such situations can be considered as a risk comparison between the repair in the full power state, with a temporarily increased level of risk, and the alternative of shutting down the plant for the repair in the zero power state, with a specific associated risk. The methodology of the risk-comparison approach, with due consideration of the shutdown risk, has been further developed and applied to the AOT considerations of residual heat removal and standby service water systems of a boiling water reactor (BWR) plant. Based on the completed work, several improvements to the TS requirements for the systems studied can be suggested.

  11. Risk-based evaluation of Allowed Outage Times (AOTs) considering risk of shutdown

    SciTech Connect

    Mankamo, T.; Kim, I.S.; Samanta, P.K.

    1992-12-31

    When safety systems fail during power operation, Technical Specifications (TS) usually limit the repair within the Allowed Outage Time (AOT). If the repair cannot be completed within the AOT, or no AOT is allowed, the plant is required to be shut down for the repair. However, if the capability to remove decay heat is degraded, shutting down the plant with the need to operate the affected decay-heat removal systems may impose a substantial risk compared to continued power operation over a usual repair time. Thus, defining a proper AOT in such situations can be considered as a risk comparison between the repair in the full power state, with a temporarily increased level of risk, and the alternative of shutting down the plant for the repair in the zero power state, with a specific associated risk. The methodology of the risk-comparison approach, with due consideration of the shutdown risk, has been further developed and applied to the AOT considerations of residual heat removal and standby service water systems of a boiling water reactor (BWR) plant. Based on the completed work, several improvements to the TS requirements for the systems studied can be suggested.

  12. Aeroelastic Optimization Study Based on X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley; Pak, Chan-Gi

    2014-01-01

    A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. Two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center were presented. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. A hybrid and discretization optimization approach was implemented to improve accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study. The results provide guidance to modify the fabricated flexible wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished.

  13. Who reaps the benefits, who bears the risks? Comparative optimism, comparative utility, and regulatory preferences for mobile phone technology.

    PubMed

    White, Mathew P; Eiser, J Richard; Harris, Peter R; Pahl, Sabine

    2007-06-01

    Although the issue of risk target (e.g., self, others, children) is widely acknowledged in risk perception research, its importance appears underappreciated. To date, most research has been satisfied with demonstrating comparative optimism, i.e., lower perceived risk for the self than others, and exploring its moderators, such as perceived controllability and personal exposure. Much less research has investigated how the issue of target may affect benefit perceptions or key outcomes such as stated preferences for hazard regulation. The current research investigated these issues using data from a public survey of attitudes toward mobile phone technology (N= 1,320). First, results demonstrated comparative optimism for this hazard, and also found moderating effects of both controllability and personal exposure. Second, there was evidence of comparative utility, i.e., users believed that the benefits from mobile phone technology are greater for the self than others. Third, and most important for policy, preferences for handset regulation were best predicted by perceptions of the risks to others but perceived benefits for the self. Results suggest a closer awareness of target can improve prediction of stated preferences for hazard regulation and that it would be profitable for future research to pay more attention to the issue of target for both risk and benefit perceptions.

  14. Optimizing management of metabolic syndrome to reduce risk: focus on life-style.

    PubMed

    Bianchi, Cristina; Penno, Giuseppe; Daniele, Giuseppe; Benzi, Luca; Del Prato, Stefano; Miccoli, Roberto

    2008-06-01

    The prevalence of metabolic syndrome (MS) is increasing all over the world and its incidence is expected to rise in the next years. Although genetic predisposition appears to play an important role in the regulation of metabolic parameters and in particular of body weight, the rapid increase in the prevalence of obesity and MS suggests that ecological factors (social, economic, cultural and physical environment) are promoting those conditions in susceptible individuals. People with MS are at increased risk of type 2 diabetes and cardiovascular disease and therefore they represent a priority target for preventive strategies. Life-style modifications based on healthy diet and increased physical activity are an effective preventing and therapeutic approach. Unfortunately, implementation of life-style modification and maintenance of effects is a difficult task both at personal and social level, thus drug therapy can be taken into account.

  15. Risk management and maintenance optimization of nuclear reactor cooling piping system

    NASA Astrophysics Data System (ADS)

    Augé, L.; Capra, B.; Lasne, M.; Bernard, O.; Bénéfice, P.; Comby, R.

    2006-11-01

    Seaside nuclear power plants have to face the ageing of nuclear reactor cooling piping systems. In order to minimize the duration of the production unit shutdown, maintenance operations have to be planned well in advance. In a context where owners of infrastructures tend to extend the life span of their goods while having to keep the safety level maximum, Oxand brings its expertise and know-how in management of infrastructures life cycle. A dedicated methodology relies on several modules that all participate in fixing network optimum replacement dates: expertise on ageing mechanisms (corrosion, cement degradation...) and the associated kinetics, expertise on impacts of ageing on functional integrity of piping systems, predictive simulation based on experience feedback, development of monitoring techniques focused on actual threats. More precisely, Oxand has designed a patented monitoring technique based on optic fiber sensors, which aims at controlling the deterioration level of piping systems. This preventive maintenance enables the owner to determine criteria for network replacement based on degradation impacts. This approach helps the owner justify his maintenance strategy and allows him to demonstrate the management of safety level. More generally, all monitoring techniques used by the owners are developed and coupled to predictive simulation tools, notably thanks to processes based on Bayesian approaches. Methodologies to evaluate and optimize operation budgets, depending on predictions of future functional deterioration and available maintenance solutions are also developed and applied. Finally, all information related to infrastructure ageing and available maintenance options are put together to reach the right solution for safe and performing infrastructure management.

  16. Risk-based indicators of Canadians’ exposures to environmental carcinogens

    PubMed Central

    2013-01-01

    Background Tools for estimating population exposures to environmental carcinogens are required to support evidence-based policies to reduce chronic exposures and associated cancers. Our objective was to develop indicators of population exposure to selected environmental carcinogens that can be easily updated over time, and allow comparisons and prioritization between different carcinogens and exposure pathways. Methods We employed a risk assessment-based approach to produce screening-level estimates of lifetime excess cancer risk for selected substances listed as known carcinogens by the International Agency for Research on Cancer. Estimates of lifetime average daily intake were calculated using population characteristics combined with concentrations (circa 2006) in outdoor air, indoor air, dust, drinking water, and food and beverages from existing monitoring databases or comprehensive literature reviews. Intake estimates were then multiplied by cancer potency factors from Health Canada, the United States Environmental Protection Agency, and the California Office of Environmental Health Hazard Assessment to estimate lifetime excess cancer risks associated with each substance and exposure pathway. Lifetime excess cancer risks in excess of 1 per million people are identified as potential priorities for further attention. Results Based on data representing average conditions circa 2006, a total of 18 carcinogen-exposure pathways had potential lifetime excess cancer risks greater than 1 per million, based on varying data quality. Carcinogens with moderate to high data quality and lifetime excess cancer risk greater than 1 per million included benzene, 1,3-butadiene and radon in outdoor air; benzene and radon in indoor air; and arsenic and hexavalent chromium in drinking water. Important data gaps were identified for asbestos, hexavalent chromium and diesel exhaust in outdoor and indoor air, while little data were available to assess risk for substances in dust, food

  17. New knowledge-based genetic algorithm for excavator boom structural optimization

    NASA Astrophysics Data System (ADS)

    Hua, Haiyan; Lin, Shuwen

    2014-03-01

    Because existing genetic algorithms make insufficient use of knowledge to guide the complex optimal search, they fail to effectively solve the excavator boom structural optimization problem. To improve the optimization efficiency and quality, a new knowledge-based real-coded genetic algorithm is proposed. A dual evolution mechanism combining knowledge evolution with the genetic algorithm is established to extract, handle and utilize the shallow and deep implicit constraint knowledge to guide the optimal searching of the genetic algorithm circularly. Based on this dual evolution mechanism, knowledge evolution and population evolution can be connected by knowledge influence operators to improve the configurability of knowledge and genetic operators. Then, new knowledge-based selection, crossover and mutation operators are proposed to integrate the optimal process knowledge and domain culture to guide the excavator boom structural optimization. Eight kinds of testing algorithms, which include different genetic operators, are taken as examples to solve the structural optimization of a medium-sized excavator boom. A comparison of the optimization results shows that the algorithm including all the new knowledge-based genetic operators improves the evolutionary rate and searching ability more remarkably than the other testing algorithms, which demonstrates the effectiveness of knowledge for guiding the optimal search. The proposed knowledge-based genetic algorithm, combining multi-level knowledge evolution with numerical optimization, provides a new effective method for solving complex engineering optimization problems.

  18. HIV treatment optimism and sexual risk behaviors among HIV positive African American men who have sex with men.

    PubMed

    Peterson, John L; Miner, Michael H; Brennan, David J; Rosser, B R Simon

    2012-04-01

    The association between HIV treatment optimism--beliefs about susceptibility to transmit HIV, motivation to use condoms, and severity of HIV--and sexual risk behavior was examined among HIV-positive African American men who have sex with men (MSM). Participants were 174 men recruited in four major metropolitan areas of the United States to participate in a weekend HIV risk reduction intervention. Baseline results revealed that beliefs in less susceptibility to transmit HIV and less motivation to use condoms were significantly associated with more unprotected anal intercourse among serodiscordant casual partners. Less motivation to use condoms also predicted more unprotected insertive and receptive anal sex and was more important than susceptibility beliefs in predicting these behaviors. Suggestions are offered of ways to better inform HIV-positive African American MSM about their misperceptions about HIV treatment and how their level of optimism about HIV treatment may diminish or encourage condom use.

  19. A School-Based Suicide Risk Assessment Protocol

    ERIC Educational Resources Information Center

    Boccio, Dana E.

    2015-01-01

    Suicide remains the third leading cause of death among young people in the United States. Considering that youth who contemplate suicide generally exhibit warning signs before engaging in lethal self-harm, school-based mental health professionals can play a vital role in identifying students who are at risk for suicidal behavior. Nevertheless, the…

  20. Research needs for risk-informed, performance-based regulation

    SciTech Connect

    Cloninger, T.H.

    1997-01-01

    This presentation was made by an executive in the utility which operates the South Texas Project reactors, and summarizes their perspective on probabilistic safety analysis, risk-based operation, and risk-based regulation. They view it as a tool to help them better apply their resources to maintain the level of safety necessary to protect the public health and safety. South Texas served as one of the pilot plants for the application of risk-based regulation to the maintenance rule. The author feels that the process presents opportunities as well as challenges. Among the opportunities is the involvement of more people in the process, and the sense of investment they take in the decisions, in addition to the insight they can offer. In the area of challenges there is the need for better understanding of how to apply what already is known on problems, rather than essentially reinventing the wheel to address problems. Research is needed to better understand when some events are not truly of a significant safety concern. The demarcation between deterministic decisions and the appropriate application of risk-based decisions must be better defined, for the sake of the operator as well as the public observing plant operation.

  1. Active Multitask Learning With Trace Norm Regularization Based on Excess Risk.

    PubMed

    Fang, Meng; Yin, Jie; Hall, Lawrence O; Tao, Dacheng

    2016-07-27

    This paper addresses the problem of active learning on multiple tasks, where labeled data are expensive to obtain for each individual task but the learning problems share some commonalities across multiple related tasks. To leverage the benefits of jointly learning from multiple related tasks and making active queries, we propose a novel active multitask learning approach based on trace norm regularized least squares. The basic idea is to induce an optimal classifier that has the lowest risk and is, at the same time, closest to the true hypothesis. Toward this aim, we devise a new active selection criterion that takes into account not only the risk but also the excess risk, which measures the distance to the true hypothesis. Based on this criterion, our proposed algorithm actively selects the instance to query for its label based on the combination of the two risks. Experiments on both synthetic and real-world datasets show that our proposed algorithm provides superior performance as compared to other state-of-the-art active learning methods.
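
    The regularized estimation component (though not the active selection criterion) can be sketched with a standard proximal-gradient solver: the proximal step for the trace norm is singular-value soft thresholding of the feature-by-task coefficient matrix. The data below are synthetic low-rank placeholders.

        # Trace-norm regularized multitask least squares via proximal gradient descent.
        import numpy as np

        rng = np.random.default_rng(4)
        n, d, tasks, rank = 200, 30, 5, 2
        W_true = rng.normal(size=(d, rank)) @ rng.normal(size=(rank, tasks))
        X = rng.normal(size=(n, d))
        Y = X @ W_true + 0.1 * rng.normal(size=(n, tasks))

        def svt(M, tau):
            """Singular-value soft thresholding: proximal operator of tau * trace norm."""
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

        lam = 5.0
        step = 1.0 / np.linalg.norm(X, 2) ** 2         # 1 / Lipschitz constant of the gradient
        W = np.zeros((d, tasks))
        for _ in range(500):
            grad = X.T @ (X @ W - Y)                   # gradient of 0.5 * ||X W - Y||_F^2
            W = svt(W - step * grad, step * lam)       # proximal (soft-thresholded SVD) step

        print("rank of estimate:", np.linalg.matrix_rank(W, tol=1e-3),
              "  relative error:", round(np.linalg.norm(W - W_true) / np.linalg.norm(W_true), 3))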

  2. Optimal design and selection of magneto-rheological brake types based on braking torque and mass

    NASA Astrophysics Data System (ADS)

    Nguyen, Q. H.; Lang, V. T.; Choi, S. B.

    2015-06-01

    In developing magnetorheological brakes (MRBs), it is well known that the braking torque and the mass of the MRBs are important factors that should be considered in the product’s design. This research focuses on the optimal design of different types of MRBs, from which we identify an optimal selection of MRB types, considering braking torque and mass. In the optimization, common types of MRBs such as disc-type, drum-type, hybrid-type, and T-shape types are considered. The optimization problem is to find an optimal MRB structure that can produce the required braking torque while minimizing its mass. After a brief description of the configuration of the MRBs, the MRBs’ braking torque is derived based on the Herschel-Bulkley rheological model of the magnetorheological fluid. Then, the optimal designs of the MRBs are analyzed. The optimization objective is to minimize the mass of the brake while the braking torque is constrained to be greater than a required value. In addition, the power consumption of the MRBs is also considered as a reference parameter in the optimization. A finite element analysis integrated with an optimization tool is used to obtain optimal solutions for the MRBs. Optimal solutions of MRBs with different required braking torque values are obtained based on the proposed optimization procedure. From the results, we discuss the optimal selection of MRB types, considering braking torque and mass.

  3. Optimism and Pessimism in Social Context: An Interpersonal Perspective on Resilience and Risk

    PubMed Central

    Smith, Timothy W.; Ruiz, John M.; Cundiff, Jenny M.; Baron, Kelly G.; Nealey-Moore, Jill B.

    2016-01-01

    Using the interpersonal perspective, we examined social correlates of dispositional optimism. In Study 1, optimism and pessimism were associated with warm-dominant and hostile-submissive interpersonal styles, respectively, across four samples, and had expected associations with social support and interpersonal stressors. In 300 married couples, Study 2 replicated these findings regarding interpersonal styles, using self-reports and spouse ratings. Optimism-pessimism also had significant actor and partner associations with marital quality. In Study 3 (120 couples), husbands’ and wives’ optimism predicted increases in their own marital adjustment over time, and husbands’ optimism predicted increases in wives’ marital adjustment. Thus, the interpersonal perspective is a useful integrative framework for examining social processes that could contribute to associations of optimism-pessimism with physical health and emotional adjustment. PMID:27840458

  4. Hazardous materials transportation: a risk-analysis-based routing methodology.

    PubMed

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; then arc costs are defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and finally further research developments are proposed.
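
    A toy version of the resulting 'minimum cost flow problem' can be written down directly with networkx: each arc carries a per-shipment cost (out-of-pocket plus risk-related) and a capacity expressing how many shipments the risk criteria allow, and the solver distributes the required shipments at minimum total cost. All numbers are illustrative and not taken from the paper.

        # Hazmat routing as a minimum cost flow: arc weight = cost per shipment,
        # arc capacity = maximum shipments allowed by the risk criteria.
        import networkx as nx

        shipments = 40
        G = nx.DiGraph()
        G.add_node("plant", demand=-shipments)     # negative demand = supply
        G.add_node("port", demand=shipments)

        # (origin, destination, cost per shipment, capacity from risk criteria)
        arcs = [("plant", "A", 3, 25), ("plant", "B", 2, 30),
                ("A", "port", 4, 25), ("B", "A", 1, 15), ("B", "port", 6, 30)]
        for u, v, cost, cap in arcs:
            G.add_edge(u, v, weight=cost, capacity=cap)

        flow = nx.min_cost_flow(G)
        print("flow per arc:", {(u, v): f for u, d in flow.items() for v, f in d.items() if f})
        print("total cost  :", nx.cost_of_flow(G, flow))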

  5. Strategy based on information entropy for optimizing stochastic functions.

    PubMed

    Schmidt, Tobias Christian; Ries, Harald; Spirkl, Wolfgang

    2007-02-01

    We propose a method for the global optimization of stochastic functions. During the course of the optimization, a probability distribution is built up for the location and the value of the global optimum. The concept of information entropy is used to make the optimization as efficient as possible. The entropy measures the information content of a probability distribution, and thus gives a criterion for decisions: From several possibilities we choose the one which yields the most information concerning location and value of the global maximum sought.

  6. Optimization Based Trajectory Planning of Human Upper Body

    DTIC Science & Technology

    2004-01-01

    such as the minimum torque change model. Considerable research has been done to obtain optimal robot path. Saramago et al. (1998; 2000; 2002) have ... Robotics and Automation, Vol. 2, pp. 802-807. 25. Saramago, S.F.P. and Steffen Jr, V., 2000, “Optimal Trajectory Planning of Robot Manipulators in the Presence of Moving Obstacles,” Mechanism and Machine Theory 35(8), 1079–1094. 26. Saramago, S.F.P. and Steffen Jr, V., 1998, “Optimization of the

  7. Recent experiences using finite-element-based structural optimization

    NASA Technical Reports Server (NTRS)

    Paul, B. K.; Mcconnell, J. C.; Love, Mike H.

    1989-01-01

    Structural optimization has been available to the structural analysis community as a tool for many years. The popular use of displacement method finite-element techniques to analyze linearly elastic structures has resulted in an ability to calculate the weight and constraint gradients inexpensively for numerical optimization of structures. Here, recent experiences in the investigation and use of structural optimization are discussed. In particular, experience with the commercially available ADS/NASOPT code is addressed. An overview of the ADS/NASOPT procedure and how it was implemented is given. Two example problems are also discussed.

  8. Optimization of natural lipstick formulation based on pitaya (Hylocereus polyrhizus) seed oil using D-optimal mixture experimental design.

    PubMed

    Kamairudin, Norsuhaili; Gani, Siti Salwa Abd; Masoumi, Hamid Reza Fard; Hashim, Puziah

    2014-10-16

    The D-optimal mixture experimental design was employed to optimize the melting point of natural lipstick based on pitaya (Hylocereus polyrhizus) seed oil. The influence of the main lipstick components-pitaya seed oil (10%-25% w/w), virgin coconut oil (25%-45% w/w), beeswax (5%-25% w/w), candelilla wax (1%-5% w/w) and carnauba wax (1%-5% w/w)-was investigated with respect to the melting point properties of the lipstick formulation. The D-optimal mixture experimental design was applied to optimize the properties of the lipstick by focusing on the melting point with respect to the above influencing components. The D-optimal mixture design analysis showed that the variation in the response (melting point) could be depicted as a quadratic function of the main components of the lipstick. The best combination of each significant factor determined by the D-optimal mixture design was established to be pitaya seed oil (25% w/w), virgin coconut oil (37% w/w), beeswax (17% w/w), candelilla wax (2% w/w) and carnauba wax (2% w/w). With respect to these factors, a 46.0 °C melting point was observed experimentally, close to the theoretical prediction of 46.5 °C. Carnauba wax is the most influential factor on this response (melting point), owing to its role in heat endurance. The quadratic polynomial model sufficiently fit the experimental data.

  9. Mindfulness Based Programs Implemented with At-Risk Adolescents

    PubMed Central

    Rawlett, Kristen; Scrandis, Debra

    2016-01-01

    Objective: This review examines studies on mindfulness-based programs used with adolescents at risk for poor future outcomes, such as not graduating from high school and living in poverty. Method: The keywords mindfulness, at-risk and adolescents were used to search CINAHL (10 items: 2 book reviews, 3 dissertations, and 5 research articles), Medline EBSCO (15 research articles), and PubMed (10 research articles). Only primary research articles on mindfulness and at-risk adolescents published in English between 2009 and 2015 were included, for the most current evidence. Results: Few studies (n = 11) were found that investigate mindfulness in at-risk adolescents. These studies used various mindfulness programs (n = 7), making it difficult to generalize findings for practice. Only three studies were randomized controlled trials, focusing mostly on male students with low socioeconomic status and existing mental health diagnoses. Conclusion: There is a relationship between health behaviors and academic achievement. Future research on mindfulness-based interventions needs to examine their effects on academic achievement in at-risk youth, to decrease problematic behaviors and improve these adolescents' ability to become successful adults. PMID:27347259

  10. Agreement in cardiovascular risk rating based on anthropometric parameters

    PubMed Central

    Dantas, Endilly Maria da Silva; Pinto, Cristiane Jordânia; Freitas, Rodrigo Pegado de Abreu; de Medeiros, Anna Cecília Queiroz

    2015-01-01

    Objective To investigate the agreement in evaluation of risk of developing cardiovascular diseases based on anthropometric parameters in young adults. Methods The study included 406 students, for whom weight, height, and waist and neck circumferences were measured; the waist-to-height ratio and the conicity index were also calculated. The kappa coefficient was used to assess agreement in risk classification for cardiovascular diseases. The positive and negative specific agreement values were calculated as well. The Pearson chi-square (χ2) test was used to assess associations between categorical variables (p<0.05). Results The majority of the parameters assessed (44%) showed slight (k=0.21 to 0.40) and/or poor agreement (k<0.20), with low values of negative specific agreement. The best agreement was observed between waist circumference and waist-to-height ratio both for the general population (k=0.88) and between sexes (k=0.93 to 0.86). There was a significant association (p<0.001) between the risk of cardiovascular diseases and females when using waist circumference and conicity index, and with males when using neck circumference. This resulted in a wide variation in the prevalence of cardiovascular disease risk (5.5%-36.5%), depending on the parameter and the sex that was assessed. Conclusion The results indicate variability in agreement in assessing risk for cardiovascular diseases based on anthropometric parameters, which also seems to be influenced by sex. Further studies in the Brazilian population are required to better understand this issue. PMID:26466060

  11. Optimal planning of LEO active debris removal based on hybrid optimal control theory

    NASA Astrophysics Data System (ADS)

    Yu, Jing; Chen, Xiao-qian; Chen, Li-hu

    2015-06-01

    The mission planning of Low Earth Orbit (LEO) active debris removal problem is studied in this paper. Specifically, the Servicing Spacecraft (SSc) and several debris exist on near-circular near-coplanar LEOs. The SSc should repeatedly rendezvous with the debris, and de-orbit them until all debris are removed. Considering the long-duration effect of J2 perturbation, a linear dynamics model is used for each rendezvous. The purpose of this paper is to find the optimal service sequence and rendezvous path with minimum total rendezvous cost (Δv) for the whole mission, and some complex constraints (communication time window constraint, terminal state constraint, and time distribution constraint) should be satisfied meanwhile. Considering this mission as a hybrid optimal control problem, a mathematical model is proposed, as well as the solution method. The proposed approach is demonstrated by a typical active debris removal problem. Numerical experiments show that (1) the model and solution method proposed in this paper can effectively address the planning problem of LEO debris removal; (2) the communication time window constraint and the J2 perturbation have considerable influences on the optimization results; and (3) under the same configuration, some suboptimal sequences are equivalent to the optimal one since their difference in Δv cost is very small.

  12. News of Biomedical Advances in HIV: Relationship to Treatment Optimism and Expected Risk Behavior in US MSM.

    PubMed

    Zimmerman, Rick S; Kirschbaum, Allison L

    2017-03-14

    HIV treatment optimism and the ways in which news of biomedical advances in HIV is presented to the most at-risk communities interact in ways that affect risk behavior and the incidence of HIV. The goal of the current study was to understand the relationships among HIV treatment optimism, knowledge of HIV biomedical advances, and current and expected increased risk behavior as a result of reading hypothetical news stories of further advances. Most of an online-recruited sample of MSM were quite knowledgeable about current biomedical advances. After reading three hypothetical news stories, 15-24% of those not living with HIV and 26-52% of those living with HIV reported that their condom use would decrease if the story they read were true. Results suggest the importance of more cautious reporting on HIV biomedical advances, and of targeting individuals with greater treatment optimism and those living with HIV via organizations where they are most likely to receive their information about HIV.

  13. Model based climate information on drought risk in Africa

    NASA Astrophysics Data System (ADS)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa Risk-View that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of the performance of agriculture, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa Risk-View it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step to this challenging task is the exploration of the sources of uncertainty affecting the assessment based on modeled climate change scenarios. For this purpose, a limited set of climate models has been selected in order to verify the relevance of using climate model output data with Africa Risk-View and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent, from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa Risk-View risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven with boundary conditions from the reanalysis ERA-Interim to evaluate the skill drought

  14. Electronic Nose Based on an Optimized Competition Neural Network

    PubMed Central

    Men, Hong; Liu, Haiyan; Pan, Yunpeng; Wang, Lei; Zhang, Haiping

    2011-01-01

    Traditional competitive neural networks (CNNs) used in electronic noses (E-noses) have several disadvantages: the class number must be determined in advance, the learning rates are hard to fix, and so on. To address these issues, an optimized CNN method is presented. The optimized CNN was established on the basis of the optimum class number of samples according to the changes of the Davies and Bouldin (DB) value, and it could increase, divide, or delete neurons in order to adjust the number of neurons automatically. Moreover, the learning rate changes according to the number of times each sample has been trained. The traditional CNN and the optimized CNN were applied to five kinds of sorted vinegars with an E-nose. The results showed that the optimized network structure could adjust the number of clusters dynamically and resulted in good classifications. PMID:22163887

  15. Design Optimization of Coronary Stent Based on Finite Element Models

    PubMed Central

    Qiu, Tianshuang; Zhu, Bao; Wu, Jinying

    2013-01-01

    This paper presents an effective optimization method using the Kriging surrogate model combined with modified rectangular grid sampling to reduce the stent dogboning effect in the expansion process. An infilling sampling criterion named expected improvement (EI) is used to balance local and global searches in the optimization iteration. Four commonly used finite element models of stent dilation were used to investigate the stent dogboning rate. Thrombosis models of three typical shapes are built to test the effectiveness of the optimization results. Numerical results show that two finite element models dilated by pressure applied inside the balloon are available; of these, the model that includes the artery and plaque can give an optimal stent with better expansion behavior, while the model without the artery and plaque is more efficient and requires a smaller amount of computation. PMID:24222743
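
    The EI infill loop can be sketched compactly on a one-dimensional placeholder objective standing in for the dogboning rate returned by the finite-element model, with scikit-learn's Gaussian process playing the role of the Kriging surrogate; the kernel, search interval and iteration counts are arbitrary assumptions.

        # Kriging surrogate with Expected Improvement (EI) infill on a 1-D placeholder objective.
        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def dogboning_rate(x):                         # placeholder for the FE model output
            return np.sin(3 * x) + 0.4 * (x - 0.6) ** 2

        def expected_improvement(mu, sigma, f_best):
            """EI for minimization: trades off low predicted mean against high uncertainty."""
            sigma = np.maximum(sigma, 1e-12)
            z = (f_best - mu) / sigma
            return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

        rng = np.random.default_rng(5)
        X = rng.random((5, 1)) * 2.0                   # initial design points in [0, 2]
        y = dogboning_rate(X).ravel()

        for _ in range(15):                            # infill loop
            gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True).fit(X, y)
            grid = np.linspace(0.0, 2.0, 400).reshape(-1, 1)
            mu, sigma = gp.predict(grid, return_std=True)
            x_new = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
            X = np.vstack([X, x_new])
            y = np.append(y, dogboning_rate(x_new)[0])

        print("best design variable:", round(float(X[y.argmin(), 0]), 3),
              "  objective:", round(float(y.min()), 4))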

  16. Reliability based structural optimization - A simplified safety index approach

    NASA Technical Reports Server (NTRS)

    Reddy, Mahidhar V.; Grandhi, Ramana V.; Hopkins, Dale A.

    1993-01-01

    A probabilistic optimal design methodology for complex structures modelled with finite element methods is presented. The main emphasis is on developing probabilistic analysis tools suitable for optimization. An advanced second-moment method is employed to evaluate the failure probability of the performance function. The safety indices are interpolated using the information at mean and most probable failure point. The minimum weight design with an improved safety index limit is achieved by using the extended interior penalty method of optimization. Numerical examples covering beam and plate structures are presented to illustrate the design approach. The results obtained by using the proposed approach are compared with those obtained by using the existing probabilistic optimization techniques.

  17. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  18. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  19. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  20. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  1. A quantile-based Time at Risk: A new approach for assessing risk in financial markets

    NASA Astrophysics Data System (ADS)

    Bolgorian, Meysam; Raei, Reza

    2013-11-01

    In this paper, we provide a new measure for the evaluation of risk in financial markets. This measure is based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model like Value at Risk (VaR). Just as VaR, for a given financial asset, probability level and time horizon, gives a critical value such that the likelihood that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using a probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied our model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and reported the results.
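
    A minimal empirical version of the TaR calculation is sketched below: collect the return intervals between 'critical events' (here, daily losses in the worst 5%) and report the time that an interval exceeds with the chosen probability. The synthetic returns stand in for the TEPIX series, and raw empirical quantiles stand in for the fitted return-interval distribution used in the paper.

        # Time at Risk (TaR): quantile of the distribution of return intervals between
        # critical events, here defined as daily losses in the worst 5% of returns.
        import numpy as np

        rng = np.random.default_rng(6)
        returns = rng.standard_t(df=4, size=5000) * 0.01       # placeholder daily returns

        loss_threshold = np.quantile(returns, 0.05)            # critical event = loss in worst 5%
        event_days = np.flatnonzero(returns <= loss_threshold)
        intervals = np.diff(event_days)                        # return intervals (in days)

        alpha = 0.05                                           # P(interval > TaR) = alpha
        tar = np.quantile(intervals, 1 - alpha)
        print(f"critical events: {event_days.size}, TaR at {alpha:.0%}: {tar:.0f} days")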

  2. Optimality criteria-based topology optimization of a bi-material model for acoustic-structural coupled systems

    NASA Astrophysics Data System (ADS)

    Shang, Linyuan; Zhao, Guozhong

    2016-06-01

    This article investigates topology optimization of a bi-material model for acoustic-structural coupled systems. The design variables are volume fractions of inclusion material in a bi-material model constructed by the microstructure-based design domain method (MDDM). The design objective is the minimization of sound pressure level (SPL) in an interior acoustic medium. Sensitivities of SPL with respect to topological design variables are derived concretely by the adjoint method. A relaxed form of optimality criteria (OC) is developed for solving the acoustic-structural coupled optimization problem to find the optimum bi-material distribution. Based on OC and the adjoint method, a topology optimization method to deal with large calculations in acoustic-structural coupled problems is proposed. Numerical examples are given to illustrate the applications of topology optimization for a bi-material plate under a low single-frequency excitation and an aerospace structure under a low frequency-band excitation, and to prove the efficiency of the adjoint method and the relaxed form of OC.

  3. Quantitative health risk assessment of Cryptosporidium in rivers of southern China based on continuous monitoring.

    PubMed

    An, Wei; Zhang, Dongqing; Xiao, Shumin; Yu, Jianwei; Yang, Min

    2011-06-01

    The concentrations of Cryptosporidium in the source water of several cities of Zhejiang Province, China were determined to be in the range of 0-17 oocysts/10 L in the rainy season in 2008, with a mean value of 7 oocysts/10 L. Based on the investigation data, comprehensive risk assessment of Cryptosporidium infection was performed by considering different water intake routes as well as water consumption. Intakes of unboiled tap water (including drinking and tooth-brushing and food and dish washing) and source water (through swimming in rivers) were estimated to be 2.59-25.9 and 0.32-0.74 L/year-person, respectively. The mortality due to Cryptosporidium infection for people in this region, excluding HIV-infected patients, was calculated as 0-0.0146 per 10^5 persons using a conditional probability formula. Disability-adjusted life years (DALYs) were used to quantify the risk of Cryptosporidium infection, for which uncertainty was analyzed. For people who consumed conventionally treated water, the DALYs due to Cryptosporidium infection were 6.51 per 10^5 persons (95% CI: 2.16 × 10^-5 to 22.35 × 10^-5), which is higher than a risk judged acceptable by some (1.97 × 10^-5 DALYs per year), while the risk for those consuming ozone-treated water was 0.0689 × 10^-5 DALYs per year. The major risk of infection resulted from swimming in the river. This study provides a method to establish the risk of Cryptosporidium infection and optimize the scheme for reducing the risk effectively, which is useful for the modification of water quality standards based on cost-utility analysis given the use of DALYs.
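
    The structure of the exposure-to-risk chain used in this kind of assessment can be illustrated as follows: daily dose, then daily infection probability from a dose-response relation, then annual risk, and finally DALYs. The exponential dose-response form and every numerical value other than the mean oocyst concentration are placeholders, not the parameters used in the paper.

        # Skeleton of a quantitative microbial risk assessment for one exposure route.
        # All parameter values below are illustrative placeholders.
        import numpy as np

        concentration = 0.7          # oocysts per litre (mean of 7 oocysts per 10 L)
        recovery = 0.4               # assumed detection recovery efficiency
        intake_per_day = 0.03        # litres of unboiled tap water ingested per day
        r = 0.004                    # exponential dose-response parameter
        burden_per_case = 1.5e-3     # DALYs per case of cryptosporidiosis

        dose = concentration / recovery * intake_per_day       # oocysts ingested per day
        p_day = 1.0 - np.exp(-r * dose)                        # daily probability of infection
        p_year = 1.0 - (1.0 - p_day) ** 365                    # annual probability of infection
        dalys = p_year * burden_per_case

        print(f"annual infection risk: {p_year:.2e}, burden: {dalys:.2e} DALYs per person-year")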

  4. Projected Flood Risks in China based on CMIP5

    NASA Astrophysics Data System (ADS)

    Xu, Ying

    2016-04-01

    Based on the simulations from 22 CMIP5 models and in combination with data on population, GDP, arable land, and terrain elevation, the spatial distributions of flood risk levels are calculated and analyzed under RCP8.5 for the baseline period (1986-2005), the near-term future period (2016-2035), the mid-term future period (2046-2065), and the long-term future period (2080-2099). (1) Areas with higher flood hazard risk levels in the future are concentrated in southeastern China, and the areas with risk level III continue to expand. The major changes in flood hazard risks will occur in the mid-term and long-term future. (2) In the future, the areas of high vulnerability to flood hazards will be located in China's eastern region. In the middle and late 21st century, the extent of the high-vulnerability area will expand eastward and its intensity will gradually increase. The highest vulnerability values are found in the provinces of Beijing, Tianjin, Hebei, Henan, Anhui, Shandong, Shanghai, Jiangsu, and in parts of the Pearl River Delta. Furthermore, the major cities in northeast China, as well as Wuhan, Changsha and Nanchang, are highly vulnerable. (3) The regions with high flood risk levels will be located in eastern China, in the middle and lower reaches of the Yangtze River and stretching northward to Beijing and Tianjin. High-risk flood areas also occur in major cities in Northeast China, in some parts of Shaanxi and Shanxi, and in some coastal areas in Southeast China. (4) Compared to the baseline period, the high flood risks will increase on a regional level towards the end of the 21st century, although the areas of flood hazards show little variation. In this paper, the projected future flood risks for different periods were analyzed under the RCP8.5 emission scenario. Comparing the results with the simulations under the RCP 2.6 and RCP 4.5 scenarios, both scenarios show no differences in the spatial distribution, but in the intensity of flood

  5. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF -Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  6. Optimizing nanomedicine pharmacokinetics using physiologically based pharmacokinetics modelling

    PubMed Central

    Moss, Darren Michael; Siccardi, Marco

    2014-01-01

    The delivery of therapeutic agents is characterized by numerous challenges including poor absorption, low penetration in target tissues and non-specific dissemination in organs, leading to toxicity or poor drug exposure. Several nanomedicine strategies have emerged as an advanced approach to enhance drug delivery and improve the treatment of several diseases. Numerous processes mediate the pharmacokinetics of nanoformulations, with the absorption, distribution, metabolism and elimination (ADME) being poorly understood and often differing substantially from traditional formulations. Understanding how nanoformulation composition and physicochemical properties influence drug distribution in the human body is of central importance when developing future treatment strategies. A helpful pharmacological tool to simulate the distribution of nanoformulations is represented by physiologically based pharmacokinetics (PBPK) modelling, which integrates system data describing a population of interest with drug/nanoparticle in vitro data through a mathematical description of ADME. The application of PBPK models for nanomedicine is in its infancy and characterized by several challenges. The integration of property–distribution relationships in PBPK models may benefit nanomedicine research, giving opportunities for innovative development of nanotechnologies. PBPK modelling has the potential to improve our understanding of the mechanisms underpinning nanoformulation disposition and allow for more rapid and accurate determination of their kinetics. This review provides an overview of the current knowledge of nanomedicine distribution and the use of PBPK modelling in the characterization of nanoformulations with optimal pharmacokinetics. Linked Articles This article is part of a themed section on Nanomedicine. To view the other articles in this section visit http://dx.doi.org/10.1111/bph.2014.171.issue-17 PMID:24467481
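
    A hedged, minimal sketch of the PBPK building block only, not of the nanomedicine models reviewed in the article: a flow-limited blood-tissue compartment pair integrated with SciPy. All parameter names and values (Q_t, V_b, V_t, Kp, CL, the 10 mg bolus) are illustrative assumptions.

        # Minimal flow-limited PBPK sketch (illustrative values, not from the article):
        # a blood compartment and a single tissue compartment, with clearance from blood.
        from scipy.integrate import solve_ivp

        Q_t, V_b, V_t = 1.5, 5.0, 20.0   # tissue blood flow (L/h) and compartment volumes (L)
        Kp, CL = 4.0, 0.8                # tissue:blood partition coefficient, clearance (L/h)

        def pbpk(t, y):
            Cb, Ct = y                                        # blood and tissue concentrations (mg/L)
            dCb = (Q_t * (Ct / Kp - Cb) - CL * Cb) / V_b      # venous return minus uptake and clearance
            dCt = Q_t * (Cb - Ct / Kp) / V_t                  # flow-limited tissue distribution
            return [dCb, dCt]

        sol = solve_ivp(pbpk, (0.0, 24.0), [10.0 / V_b, 0.0])  # 10 mg intravenous bolus at t = 0
        print("blood concentration at 24 h (mg/L):", float(sol.y[0, -1]))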

  7. OMIGA: Optimized Maker-Based Insect Genome Annotation.

    PubMed

    Liu, Jinding; Xiao, Huamei; Huang, Shuiqing; Li, Fei

    2014-08-01

    Insects are one of the largest classes of animals on Earth and constitute more than half of all living species. The i5k initiative has begun sequencing of more than 5,000 insect genomes, which should greatly help in exploring insect resources and pest control. Insect genome annotation remains challenging because many insects have high levels of heterozygosity. To improve the quality of insect genome annotation, we developed a pipeline, named Optimized Maker-Based Insect Genome Annotation (OMIGA), to predict protein-coding genes from insect genomes. We first mapped RNA-Seq reads to genomic scaffolds to determine transcribed regions using Bowtie, and the putative transcripts were assembled using Cufflinks. We then selected highly reliable transcripts with intact coding sequences to train de novo gene prediction software, including Augustus. The re-trained software was used to predict genes from insect genomes. Exonerate was used to refine gene structure and to determine near-exact exon/intron boundaries in the genome. Finally, we used the software Maker to integrate data from RNA-Seq, de novo gene prediction, and protein alignment to produce an official gene set. The OMIGA pipeline was used to annotate the draft genome of an important insect pest, Chilo suppressalis, yielding 12,548 genes. Different strategies were compared, which demonstrated that OMIGA had the best performance. In summary, we present a comprehensive pipeline for identifying genes in insect genomes that can be widely used to improve the annotation quality in insects. OMIGA is provided at http://ento.njau.edu.cn/omiga.html.

  8. An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics

    NASA Astrophysics Data System (ADS)

    Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe; Slysz, Gordon W.; Payne, Samuel H.; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-12-01

    The comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights on endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation and related platforms, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines, including MS-GF+, SEQUEST, and MS-Align+, for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag and newly developed informed quantification (IQ) approaches, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and less missing data for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage. Taken together, we propose an optimized informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT tag) approaches for identification and label-free quantification for high-throughput, comprehensive, and quantitative peptidomics analysis.

  9. An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics

    SciTech Connect

    Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe; Slysz, Gordon W.; Payne, Samuel H.; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-12-26

    Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights on endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines, including MS-GF+, SEQUEST, and MS-Align+, for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag and newly developed informed quantification (IQ) approaches, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive, and quantitative peptidomics analysis.

  10. Orbit design and optimization based on global telecommunication performance metrics

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Lee, Charles H.; Kerridge, Stuart; Cheung, Kar-Ming; Edwards, Charles D.

    2006-01-01

    The orbit selection of telecommunications orbiters is one of the critical design processes and should be guided by global telecom performance metrics and mission-specific constraints. In order to aid the orbit selection, we have coupled the Telecom Orbit Analysis and Simulation Tool (TOAST) with genetic optimization algorithms. As a demonstration, we have applied the developed tool to select an optimal orbit for general Mars telecommunications orbiters with the constraint of being a frozen orbit. While a typical optimization goal is to minimize telecommunications downtime, several relevant performance metrics are examined: 1) area-weighted average gap time, 2) global maximum of local maximum gap time, 3) global maximum of local minimum gap time. Optimal solutions are found with each of the metrics. Common and different features among the optimal solutions as well as the advantages and disadvantages of each metric are presented. The optimal solutions are compared with several candidate orbits that were considered during the development of Mars Telecommunications Orbiter.
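
    As a hedged illustration of coupling an evolutionary optimizer to a coverage metric, the sketch below runs SciPy's differential evolution (a genetic-style algorithm) on a purely illustrative surrogate for "area-weighted average gap time"; TOAST, the real orbital mechanics, and the bounds used here are all assumptions, not the study's actual setup.

        import numpy as np
        from scipy.optimize import differential_evolution

        def surrogate_gap_time(x):
            alt_km, inc_deg = x
            # Smooth stand-in objective only; a real study would evaluate the telecom
            # analysis tool here and return the chosen gap-time metric.
            return np.cos(np.radians(inc_deg)) ** 2 + ((alt_km - 4500.0) / 3000.0) ** 2

        result = differential_evolution(surrogate_gap_time,
                                        bounds=[(300.0, 17000.0), (0.0, 180.0)], seed=1)
        print("best altitude (km) and inclination (deg):", result.x, "metric:", result.fun)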

  11. Aeroelastic Optimization Study Based on the X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley W.; Pak, Chan-Gi

    2014-01-01

    One way to increase the aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. Such an approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise with ply stacking sequence. A hybrid and discretization optimization approach improves accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study for the fabricated flexible wing of the X-56A model since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.

  12. Optimal seat suspension design based on minimum "simulated subjective response".

    PubMed

    Wan, Y; Schimmels, J M

    1997-11-01

    This work addresses a method for improving vertical whole body vibration isolation through optimal seat suspension design. The primary thrusts of this investigation are: (1) the development of a simple model that captures the essential dynamics of a seated human exposed to vertical vibration, (2) the selection and evaluation of several standards for assessing human sensitivity to vertical vibration, and (3) the determination of the seat suspension parameters that minimize these standards to yield optimal vibration isolation. Results show that the optimal seat and cushion damping coefficients depend very much on the selection of the vibration sensitivity standard and on the lower bound of the stiffnesses used in the constrained optimization procedure. In all cases, however, the optimal seat damping obtained here is significantly larger (by more than a factor of 10) than that obtained using existing seat suspension design methods or from previous optimal suspension studies. This research also indicates that the existing means of assessing vibration in suspension design (ISO 7096) requires modification.

  13. Lessons in risk- versus resilience-based design and management.

    PubMed

    Park, Jeryang; Seager, Thomas P; Rao, P Suresh C

    2011-07-01

    The implications of recent catastrophic disasters, including the Fukushima Daiichi nuclear power plant accident, reach well beyond the immediate, direct environmental and human health risks. In a complex coupled system, disruptions from natural disasters and man-made accidents can quickly propagate through a complex chain of networks to cause unpredictable failures in other economic or social networks and other parts of the world. Recent disasters have revealed the inadequacy of a classical risk management approach. This study calls for a new resilience-based design and management paradigm that draws upon the ecological analogues of diversity and adaptation in response to low-probability and high-consequence disruptions.

  14. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W.; Gumbert, Clyde R.; Newman, Perry A.

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The optimal solutions associated with the MPP provide measurements related to safety probability. This study focuses on two commonly used approximate probability integration methods; i.e., the Reliability Index Approach (RIA) and the Performance Measurement Approach (PMA). Their reliability sensitivity equations are first derived in this paper, based on the derivatives of their respective optimal solutions. Examples are then provided to demonstrate the use of these derivatives for better reliability analysis and Reliability-Based Design Optimization (RBDO).
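
    A minimal sketch of the MPP search itself, under an assumed linear limit-state function in standard normal space (not the RIA/PMA sensitivity derivations of the paper): the reliability index is the distance from the origin to the surface g(u) = 0, and FORM then approximates the failure probability as Phi(-beta).

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):                        # illustrative limit-state function in u-space
            return 3.0 - u[0] - u[1]

        res = minimize(lambda u: np.dot(u, u), x0=np.zeros(2), method="SLSQP",
                       constraints=[{"type": "eq", "fun": g}])
        u_mpp = res.x                    # most probable point on the limit state
        beta = np.linalg.norm(u_mpp)     # reliability index (here 3/sqrt(2), about 2.12)
        print("MPP:", u_mpp, "beta:", beta, "Pf (FORM):", norm.cdf(-beta))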

  15. A Study on the Optimal Generation Mix Based on Portfolio Theory with Considering the Basic Condition for Power Supply

    NASA Astrophysics Data System (ADS)

    Kato, Moritoshi; Zhou, Yicheng

    This paper presents a novel method to analyze the optimal generation mix based on portfolio theory while considering the basic condition for power supply, namely that electricity generation must follow the load curve. The portfolio optimization is integrated with the calculation of the capacity factor of each generation technology in order to satisfy the basic condition for power supply. In addition, each generation technology is treated as an asset, and the risks of that asset in both its operation period and its construction period are considered. Environmental measures are evaluated through restriction of CO2 emissions, which are indicated by CO2 price. Numerical examples show how the optimal generation mix changes with risks such as the deviation of the nuclear capacity factor or the restriction of CO2 emissions, and with the possible introduction of clean coal technology (IGCC, CCS) or renewable energy. The results of this work can be applied to setting future generation mix targets according to the expected risks of each generation technology and the restrictions on CO2 emissions.
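
    A hedged sketch of the portfolio idea only (illustrative numbers, not the paper's data or its capacity-factor coupling): choose technology shares that minimize the variance of the generation cost subject to a cap on the expected cost.

        import numpy as np
        from scipy.optimize import minimize

        techs = ["nuclear", "coal", "gas", "renewables"]                # assumed asset set
        exp_cost = np.array([60.0, 70.0, 90.0, 80.0])                   # expected cost ($/MWh), illustrative
        cov = np.diag([4.0, 9.0, 25.0, 16.0])                           # cost covariance, illustrative

        cons = [{"type": "eq",  "fun": lambda w: w.sum() - 1.0},        # shares sum to one
                {"type": "ineq", "fun": lambda w: 78.0 - exp_cost @ w}] # expected cost cap
        res = minimize(lambda w: w @ cov @ w, x0=np.full(4, 0.25), method="SLSQP",
                       bounds=[(0.0, 1.0)] * 4, constraints=cons)
        print(dict(zip(techs, np.round(res.x, 3))))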

  16. Thermodynamic Analysis and Optimization Based on Exergy Flow for a Two-Staged Pulse Tube Refrigerator

    DTIC Science & Technology

    2010-01-01

    A. Razani, T. Fraser, C. Dodson, and T. Roberts. ... model is convenient for thermodynamic analysis and optimization and shows the important characteristics of two-stage PTRs. Exergy comes into the system ...

  17. Optimizing Biosurveillance Systems that Use Threshold-based Event Detection Methods

    DTIC Science & Technology

    2009-06-01

    Ronald D. Fricker, Jr. and David Banschbach. We describe a methodology for optimizing a threshold detection-based biosurveillance system. The goal is to maximize the system-wide probability of ... Using this approach, public health officials can "tune" their biosurveillance systems to optimally detect various threats, thereby allowing

  18. A multi-objective optimization model with conditional value-at-risk constraints for water allocation equality

    NASA Astrophysics Data System (ADS)

    Hu, Zhineng; Wei, Changting; Yao, Liming; Li, Ling; Li, Chaozhi

    2016-11-01

    Water scarcity is a global problem which causes economic and political conflicts as well as degradation of ecosystems. Moreover, the uncertainty caused by extreme weather increases the risk of economic inefficiency, an essential consideration for water users. In this study, a multi-objective model involving water allocation equality and economic efficiency risk control is developed to help water managers mitigate these problems. Gini coefficient is introduced to optimize water allocation equality in water use sectors (agricultural, domestic, and industrial sectors), and CVaR is integrated into the model constraints to control the economic efficiency loss risk corresponding to variations in water availability. The case study demonstrates the practicability and rationality of the developed model, allowing the river basin authority to determine water allocation strategies for a single river basin.
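
    A sketch of the two building blocks named above, a Gini coefficient over sectoral allocations and a sample-based CVaR of economic loss; the allocation figures and loss distribution are illustrative assumptions, not the case-study data.

        import numpy as np

        def gini(x):
            x = np.sort(np.asarray(x, dtype=float))                  # ascending allocations
            n = x.size
            return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1.0) / n

        def cvar(losses, alpha=0.95):
            losses = np.asarray(losses, dtype=float)
            var = np.quantile(losses, alpha)                         # value-at-risk threshold
            return losses[losses >= var].mean()                      # mean loss in the tail

        alloc = [120.0, 45.0, 80.0]                                  # agricultural, domestic, industrial
        losses = np.random.default_rng(0).gamma(2.0, 10.0, size=10_000)
        print("Gini:", round(gini(alloc), 3), "CVaR(95%):", round(cvar(losses), 1))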

  19. A hybrid algorithm for instant optimization of beam weights in anatomy-based intensity modulated radiotherapy: A performance evaluation study.

    PubMed

    Vaitheeswaran, Ranganathan; Sathiya, Narayanan V K; Bhangle, Janhavi R; Nirhali, Amit; Kumar, Namita; Basu, Sumit; Maiya, Vikram

    2011-04-01

    The study aims to introduce a hybrid optimization algorithm for anatomy-based intensity modulated radiotherapy (AB-IMRT). Our proposal is that by integrating an exact optimization algorithm with a heuristic optimization algorithm, the advantages of both the algorithms can be combined, which will lead to an efficient global optimizer solving the problem at a very fast rate. Our hybrid approach combines Gaussian elimination algorithm (exact optimizer) with fast simulated annealing algorithm (a heuristic global optimizer) for the optimization of beam weights in AB-IMRT. The algorithm has been implemented using MATLAB software. The optimization efficiency of the hybrid algorithm is clarified by (i) analysis of the numerical characteristics of the algorithm and (ii) analysis of the clinical capabilities of the algorithm. The numerical and clinical characteristics of the hybrid algorithm are compared with Gaussian elimination method (GEM) and fast simulated annealing (FSA). The numerical characteristics include convergence, consistency, number of iterations and overall optimization speed, which were analyzed for the respective cases of 8 patients. The clinical capabilities of the hybrid algorithm are demonstrated in cases of (a) prostate and (b) brain. The analyses reveal that (i) the convergence speed of the hybrid algorithm is approximately three times higher than that of FSA algorithm; (ii) the convergence (percentage reduction in the cost function) in hybrid algorithm is about 20% improved as compared to that in GEM algorithm; (iii) the hybrid algorithm is capable of producing relatively better treatment plans in terms of Conformity Index (CI) [~ 2% - 5% improvement] and Homogeneity Index (HI) [~ 4% - 10% improvement] as compared to GEM and FSA algorithms; (iv) the sparing of organs at risk in hybrid algorithm-based plans is better than that in GEM-based plans and comparable to that in FSA-based plans; and (v) the beam weights resulting from the hybrid algorithm are
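
    A hedged sketch of the simulated-annealing ingredient only (the Gaussian-elimination step and the clinical dose engine are not reproduced): nonnegative beam weights are perturbed to reduce a quadratic dose cost built from an illustrative random dose matrix.

        import numpy as np

        rng = np.random.default_rng(0)
        D = rng.random((50, 6))                    # dose per unit weight: 50 voxels x 6 beams (illustrative)
        d_presc = np.full(50, 2.0)                 # prescribed voxel dose (illustrative)

        def cost(w):
            return np.sum((D @ w - d_presc) ** 2)

        w = np.full(6, 1.0)
        cur, T = cost(w), 1.0
        for _ in range(20_000):
            trial = np.clip(w + rng.normal(scale=0.05, size=6), 0.0, None)
            c = cost(trial)
            if c < cur or rng.random() < np.exp(-(c - cur) / T):
                w, cur = trial, c                  # accept improving or occasionally worse moves
            T *= 0.9995                            # geometric cooling schedule
        print("final cost:", round(cur, 3), "beam weights:", np.round(w, 3))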

  20. Assessing risk based on uncertain avalanche activity patterns

    NASA Astrophysics Data System (ADS)

    Zeidler, Antonia; Fromm, Reinhard

    2015-04-01

    Avalanches may affect critical infrastructure and may cause great economic losses. The planning horizon of infrastructures, e.g. hydropower generation facilities, reaches well into the future. Based on the results of previous studies on the effect of changing meteorological parameters (precipitation, temperature) and the effect on avalanche activity we assume that there will be a change of the risk pattern in future. The decision makers need to understand what the future might bring to best formulate their mitigation strategies. Therefore, we explore a commercial risk software to calculate risk for the coming years that might help in decision processes. The software @risk, is known to many larger companies, and therefore we explore its capabilities to include avalanche risk simulations in order to guarantee a comparability of different risks. In a first step, we develop a model for a hydropower generation facility that reflects the problem of changing avalanche activity patterns in future by selecting relevant input parameters and assigning likely probability distributions. The uncertain input variables include the probability of avalanches affecting an object, the vulnerability of an object, the expected costs for repairing the object and the expected cost due to interruption. The crux is to find the distribution that best represents the input variables under changing meteorological conditions. Our focus is on including the uncertain probability of avalanches based on the analysis of past avalanche data and expert knowledge. In order to explore different likely outcomes we base the analysis on three different climate scenarios (likely, worst case, baseline). For some variables, it is possible to fit a distribution to historical data, whereas in cases where the past dataset is insufficient or not available the software allows to select from over 30 different distribution types. The Monte Carlo simulation uses the probability distribution of uncertain variables
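
    A hedged plain-Python analogue of the spreadsheet-style Monte Carlo described above: uncertain inputs are sampled from assumed distributions and combined into a distribution of annual loss. All distributions and figures are illustrative, not fitted to avalanche records.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        p_hit = rng.beta(2, 40, n)                               # uncertain annual hit probability
        hit = rng.random(n) < p_hit                              # does an avalanche reach the object this year?
        vulnerability = rng.triangular(0.1, 0.3, 0.8, n)         # damage fraction if hit
        repair_cost = rng.lognormal(mean=12.0, sigma=0.5, size=n)
        interruption = rng.uniform(5_000.0, 50_000.0, n)         # cost of operational interruption

        loss = np.where(hit, vulnerability * repair_cost + interruption, 0.0)
        print("expected annual loss:", round(float(loss.mean())),
              "95th percentile:", round(float(np.percentile(loss, 95))))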

  1. A Global Airport-Based Risk Model for the Spread of Dengue Infection via the Air Transport Network

    PubMed Central

    Gardner, Lauren; Sarkar, Sahotra

    2013-01-01

    The number of travel-acquired dengue infections has seen a consistent global rise over the past decade. An increased volume of international passenger air traffic originating from regions with endemic dengue has contributed to a rise in the number of dengue cases in both areas of endemicity and elsewhere. This paper reports results from a network-based risk assessment model which uses international passenger travel volumes, travel routes, travel distances, regional populations, and predictive species distribution models (for the two vector species, Aedes aegypti and Aedes albopictus) to quantify the relative risk posed by each airport in importing passengers with travel-acquired dengue infections. Two risk attributes are evaluated: (i) the risk posed by through traffic at each stopover airport and (ii) the risk posed by incoming travelers to each destination airport. The model results prioritize optimal locations (i.e., airports) for targeted dengue surveillance. The model is easily extendible to other vector-borne diseases. PMID:24009672

  2. Optimal colour quality of LED clusters based on memory colours.

    PubMed

    Smet, Kevin; Ryckaert, Wouter R; Pointer, Michael R; Deconinck, Geert; Hanselaer, Peter

    2011-03-28

    The spectral power distributions of tri- and tetrachromatic clusters of Light-Emitting-Diodes, composed of simulated and commercially available LEDs, were optimized with a genetic algorithm to maximize the luminous efficacy of radiation and the colour quality as assessed by the memory colour quality metric developed by the authors. The trade-off of the colour quality as assessed by the memory colour metric and the luminous efficacy of radiation was investigated by calculating the Pareto optimal front using the NSGA-II genetic algorithm. Optimal peak wavelengths and spectral widths of the LEDs were derived, and over half of them were found to be close to Thornton's prime colours. The Pareto optimal fronts of real LED clusters were always found to be smaller than those of the simulated clusters. The effect of binning on designing a real LED cluster was investigated and was found to be quite large. Finally, a real LED cluster of commercially available AlGaInP, InGaN and phosphor white LEDs was optimized to obtain a higher score on memory colour quality scale than its corresponding CIE reference illuminant.
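
    A sketch of the Pareto-front concept only; the NSGA-II machinery, the spectral models, and the memory colour metric itself are not reproduced, and the candidate points below are illustrative.

        import numpy as np

        def pareto_front(points):
            """Indices of non-dominated points when every objective is to be maximized."""
            pts = np.asarray(points, dtype=float)
            front = []
            for i, p in enumerate(pts):
                dominated = np.any(np.all(pts >= p, axis=1) & np.any(pts > p, axis=1))
                if not dominated:
                    front.append(i)
            return front

        # columns: luminous efficacy of radiation (lm/W), colour quality score (illustrative)
        candidates = [(310, 0.81), (355, 0.74), (330, 0.79), (300, 0.83), (340, 0.70)]
        print("Pareto-optimal candidates:", pareto_front(candidates))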

  3. A comprehensive propagation prediction model comprising microfacet based scattering and probability based coverage optimization algorithm.

    PubMed

    Kausar, A S M Zahid; Reza, Ahmed Wasif; Wo, Lau Chun; Ramiah, Harikrishnan

    2014-01-01

    Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part of designing a wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included to make the ray tracing technique more accurate. Here, the rough surface scattering is represented by microfacets, for which it becomes possible to compute the scattering field in all possible directions. New optimization techniques, like dual quadrant skipping (DQS) and closest object finder (COF), are implemented for fast characterization of wireless communications and for making the ray tracing technique more efficient. A probability-based coverage optimization algorithm is then combined with the ray tracing technique to make a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting the unnecessary objects for ray tracing using the DQS technique and by decreasing the ray-object intersection time using the COF technique. On the other hand, the coverage optimization algorithm is based on probability theory, which finds the minimum number of transmitters and their corresponding positions in order to achieve optimal indoor wireless coverage. Both the space and time complexities of the proposed algorithm improve on those of existing algorithms. For the verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by the experimental results.

  4. Patient specific optimization-based treatment planning for catheter-based ultrasound hyperthermia and thermal ablation

    NASA Astrophysics Data System (ADS)

    Prakash, Punit; Chen, Xin; Wootton, Jeffery; Pouliot, Jean; Hsu, I.-Chow; Diederich, Chris J.

    2009-02-01

    A 3D optimization-based thermal treatment planning platform has been developed for the application of catheter-based ultrasound hyperthermia in conjunction with high dose rate (HDR) brachytherapy for treating advanced pelvic tumors. Optimal selection of applied power levels to each independently controlled transducer segment can be used to conform and maximize therapeutic heating and thermal dose coverage to the target region, providing significant advantages over current hyperthermia technology and improving treatment response. Critical anatomic structures, clinical target outlines, and implant/applicator geometries were acquired from sequential multi-slice 2D images obtained from HDR treatment planning and used to reconstruct patient specific 3D biothermal models. A constrained optimization algorithm was devised and integrated within a finite element thermal solver to determine a priori the optimal applied power levels and the resulting 3D temperature distributions such that therapeutic heating is maximized within the target, while placing constraints on maximum tissue temperature and thermal exposure of surrounding non-targeted tissue. This optimization-based treatment planning and modeling system was applied to representative cases of clinical implants for HDR treatment of cervix and prostate to evaluate the utility of this planning approach. The planning provided significant improvement in achievable temperature distributions for all cases, with substantial increase in T90 and thermal dose (CEM43T90) coverage to the hyperthermia target volume while decreasing maximum treatment temperature and reducing thermal dose exposure to surrounding non-targeted tissues and thermally sensitive rectum and bladder. This optimization-based treatment planning platform with catheter-based ultrasound applicators is a useful tool that has potential to significantly improve the delivery of hyperthermia in conjunction with HDR brachytherapy. The planning platform has been extended
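
    A hedged sketch of the constrained power-selection step alone: with a linearized (and here entirely illustrative) thermal response matrix standing in for the finite element solver, bounded least squares picks per-segment power levels that best reach a target temperature rise.

        import numpy as np
        from scipy.optimize import lsq_linear

        rng = np.random.default_rng(3)
        A = np.abs(rng.normal(0.4, 0.1, size=(30, 5)))   # deg C rise per watt: 30 target points x 5 segments
        target = np.full(30, 6.0)                        # desired temperature rise in the target volume (deg C)

        res = lsq_linear(A, target, bounds=(0.0, 15.0))  # applied power limited to 0-15 W per segment
        print("applied powers (W):", np.round(res.x, 2))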

  5. Risk based neoadjuvant chemotherapy in muscle invasive bladder cancer

    PubMed Central

    Jayaratna, Isuru S.; Navai, Neema

    2015-01-01

    Muscle invasive bladder cancer (MIBC) is an aggressive disease that frequently requires radical cystectomy (RC) to achieve durable cure rates. Surgery is most effective when performed in organ-confined disease, with the best outcomes for those patients with a pT0 result. The goals of neoadjuvant chemotherapy (NC) are to optimize surgical outcomes for a malignancy with limited adjuvant therapies and a lack of effective salvage treatments. Despite level 1 evidence demonstrating a survival benefit, the utilization of NC has been hampered by several issues, including, the inability to predict responders and the perception that NC may delay curative surgery. In this article, we review the current efforts to identify patients that are most likely to derive a benefit from NC, in order to create a risk-adapted paradigm that reserves NC for those who need it. PMID:26816830

  6. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering

  7. Treatment referral for sex offenders based on clinical judgment versus actuarial risk assessment: match and analysis of mismatch.

    PubMed

    Smid, Wineke J; Kamphuis, Jan Henk; Wever, Edwin C; Van Beek, Daan

    2013-07-01

    The Risk Need Responsivity (RNR) principles (Andrews & Bonta, 2010) dictate that higher risk sex offenders should receive more intensive treatment. The present study investigates how clinically based treatment assignment relates to risk level in a sex offender sample from The Netherlands. Correlational analyses served to identify sources of mismatches: that is, variables differing significantly in their relation between treatment selection and risk level. Our study sample consisted of 194 convicted rapists and 214 convicted child molesters. All participants' criminal files were retrospectively coded in terms of the items of the STATIC-99R, PCL: SV, and SVR-20. A low to moderate correlation was observed between clinical treatment selection and actuarial risk levels. A substantial part of the sex offenders, especially child molesters, received overly intensive treatment and another substantial part, especially rapists, received treatment of lesser intensity than indicated by their risk levels. General violent and antisocial risk factors seemed to be underemphasized in the clinical evaluation of sex offenders, especially rapists. A negative attitude toward intervention was negatively associated with clinical treatment selection. It is concluded that clinical treatment selection leads to an insufficient match between risk level and treatment level and systematic use of validated structured risk assessment instruments is necessary to ensure optimal adherence to the risk principle.

  8. Transport path optimization algorithm based on fuzzy integrated weights

    NASA Astrophysics Data System (ADS)

    Hou, Yuan-Da; Xu, Xiao-Hao

    2014-11-01

    Natural disasters cause significant damage to roads, making route selection a complicated logistical problem. To overcome this complexity, we present a method of using a trapezoidal fuzzy number to select the optimal transport path. Using the given trapezoidal fuzzy edge coefficients, we calculate a fuzzy integrated matrix, and incorporate the fuzzy multi-weights into fuzzy integrated weights. The optimal path is determined by taking two sets of vertices and transforming undiscovered vertices into discoverable ones. Our experimental results show that the model is highly accurate, and requires only a few measurement data to confirm the optimal path. The model provides an effective, feasible, and convenient method to obtain weights for different road sections, and can be applied to road planning in intelligent transportation systems.
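
    A hedged sketch of the overall idea: trapezoidal fuzzy edge weights (a, b, c, d) are collapsed to crisp values and a shortest path is found; the simple average defuzzification used here stands in for the paper's fuzzy integrated weights, and the small graph is illustrative.

        import heapq

        def defuzzify(trap):
            a, b, c, d = trap
            return (a + b + c + d) / 4.0               # simple average of the trapezoid's vertices

        def shortest_path(graph, source, target):
            dist, prev, heap = {source: 0.0}, {}, [(0.0, source)]
            while heap:
                d_u, u = heapq.heappop(heap)
                if u == target:
                    break
                if d_u > dist.get(u, float("inf")):
                    continue                           # stale heap entry
                for v, trap in graph.get(u, []):
                    nd = d_u + defuzzify(trap)
                    if nd < dist.get(v, float("inf")):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(heap, (nd, v))
            path, node = [target], target
            while node != source:
                node = prev[node]
                path.append(node)
            return list(reversed(path)), dist[target]

        graph = {"A": [("B", (2, 3, 4, 5)), ("C", (1, 2, 2, 3))],
                 "B": [("D", (1, 1, 2, 2))],
                 "C": [("D", (4, 5, 6, 7))]}
        print(shortest_path(graph, "A", "D"))          # expected: (['A', 'B', 'D'], 5.0)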

  9. Optical transfer function optimization based on linear expansions

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2015-09-01

    The Optical Transfer Function (OTF) and its modulus the Modulation Transfer Function (MTF) are metrics of optical system performance. However in system optimization, calculation times for the OTF are often substantially longer than more traditional optimization targets such as wavefront error or transverse ray error. The OTF is typically calculated as either the autocorrelation of the complex pupil function or as the Fourier transform of the Point Spread Function. We recently demonstrated that the on-axis OTF can be represented as a linear combination of analytical functions where the weighting terms are directly related to the wavefront error coefficients and apodization of the complex pupil function. Here, we extend this technique to the off-axis case. The expansion technique offers a potential for accelerating OTF optimization in lens design, as well as insight into the interaction of aberrations with components of the OTF.

  10. Microemulsion-based gel of terbinafine for the treatment of onychomycosis: optimization of formulation using D-optimal design.

    PubMed

    Barot, Bhavesh S; Parejiya, Punit B; Patel, Hetal K; Gohel, Mukesh C; Shelat, Pragna K

    2012-03-01

    The aim of the present investigation was to evaluate microemulsion as a vehicle for dermal drug delivery and to develop microemulsion-based gel of terbinafine for the treatment of onychomycosis. D-optimal mixture experimental design was adopted to optimize the amount of oil (X(1)), Smix (mixture of surfactant and cosurfactant; X(2)) and water (X(3)) in the microemulsion. The formulations were assessed for globule size (in nanometers; Y(1)) and solubility of drug in microemulsion (in milligrams per milliliter; Y(2)). The microemulsion containing 5.75% oil, 53.75% surfactant-cosurfactant mixture and 40.5% water was selected as the optimized batch. The globule size and solubility of the optimized batch were 18.14 nm and 43.71 mg/ml, respectively. Transmission electron microscopy showed that globules were spherical in shape. Drug containing microemulsion was converted into gel employing 0.75% w/w carbopol 934P. The optimized gel showed better penetration and retention in the human cadaver skin as compared to the commercial cream. The cumulative amount of terbinafine permeated after 12 h was 244.65 ± 18.43 μg cm(-2) which was three times more than the selected commercial cream. Terbinafine microemulsion in the gel form showed better activity against Candida albicans and Trichophyton rubrum than the commercial cream. It was concluded that drug-loaded gel could be a promising formulation for effective treatment of onychomycosis.

  11. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    NASA Astrophysics Data System (ADS)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the Logistic map, to generate the pseudo-random numbers used as design variables for global optimization. Many existing studies have indicated that COAs can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs, from a new perspective of both the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COAs. To achieve high efficiency with a COA, it is recommended to adopt a chaotic map that generates chaotic sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
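
    A minimal sketch of the two quantities the abstract emphasizes, using the Logistic map as the example: the chaotic sequence itself and its Lyapunov exponent estimated as the orbit average of ln|f'(x)|. Parameters are illustrative.

        import numpy as np

        r, n_iter, x = 4.0, 100_000, 0.3
        xs = np.empty(n_iter)
        for i in range(n_iter):
            x = r * x * (1.0 - x)                      # Logistic map f(x) = r * x * (1 - x)
            xs[i] = x

        lyapunov = np.mean(np.log(np.abs(r * (1.0 - 2.0 * xs))))   # average of ln|f'(x)| along the orbit
        print("estimated Lyapunov exponent:", round(float(lyapunov), 3))   # about ln 2 = 0.693 for r = 4
        # A histogram of xs shows the strongly non-uniform PDF (peaks near 0 and 1) that the
        # abstract argues should be taken into account when choosing a map for a COA.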

  12. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  13. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are physical global hazards with negative environmental and socio-economic impacts on local and regional scales. The technological evolution during the last decades, especially in the field of geoinformatics, has offered new advantages in hydrological modelling. This study seeks to use this technology in order to quantify flood risk. The study area is an ungauged catchment; by using mostly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed Unit Hydrograph model, a series of outcomes was obtained. More specifically, this paper examined the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis was performed using a Digital Elevation Model with a 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to the flood. Unit Hydrographs are, as is known, useful when there is a lack of data; in this work, a sequence of flood risk assessments was made with the time-area method using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
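
    A hedged sketch of the time-area routing step alone (illustrative numbers, not the Kladeos basin data): an effective-rainfall hyetograph is convolved with a time-area histogram to produce the outflow hydrograph.

        import numpy as np

        dt = 3600.0                                      # computational time step (s)
        area_km2 = np.array([2.0, 5.0, 8.0, 4.0, 1.0])   # contributing area per travel-time band
        rain_mm = np.array([4.0, 10.0, 6.0])             # effective rainfall depth per time step

        # Discharge produced by 1 mm of effective rain on each band: km^2 -> m^2, mm -> m, per dt.
        uh = area_km2 * 1e6 * 1e-3 / dt                  # m^3/s per mm of effective rain
        q = np.convolve(rain_mm, uh)                     # outflow hydrograph ordinates (m^3/s)
        print(np.round(q, 1))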

  14. Random search optimization based on genetic algorithm and discriminant function

    NASA Technical Reports Server (NTRS)

    Kiciman, M. O.; Akgul, M.; Erarslanoglu, G.

    1990-01-01

    The general problem of optimization with arbitrary merit and constraint functions, which could be convex, concave, monotonic, or non-monotonic, is treated using stochastic methods. To improve the efficiency of the random search methods, a genetic algorithm for the search phase and a discriminant function for the constraint-control phase were utilized. The validity of the technique is demonstrated by comparing the results to published test problem results. Numerical experimentation indicated that for cases where a quick near optimum solution is desired, a general, user-friendly optimization code can be developed without serious penalties in both total computer time and accuracy.

  15. MIMO decorrelation for visible light communication based on angle optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Haiyong; Zhu, Yijun

    2017-03-01

    Recently, many researchers have used normal vector tilting to address the low multiplexing rate and strong channel correlation in Visible Light Communication Multiple-Input Multiple-Output (VLC-MIMO) systems, but this approach has lacked theoretical support. In this paper, we establish a channel model for a 2×2 VLC-MIMO system and then translate the problem of finding the optimal tilting angle within a given range into the mathematical problem of minimizing a function. Finally, we derive the mathematical expressions for the optimal tilting angles of the corresponding LEDs and PDs; these expressions provide a theoretical basis for further study.

  16. Optimization-based design of control systems for flexible structures

    NASA Technical Reports Server (NTRS)

    Polak, E.; Baker, T. E.; Wuu, T-L.; Harn, Y-P.

    1988-01-01

    The purpose of this presentation is to show that it is possible to use nonsmooth optimization algorithms to design both closed-loop finite dimensional compensators and open-loop optimal controls for flexible structures modeled by partial differential equations. An important feature of our approach is that it does not require modal decomposition and hence is immune to instabilities caused by spillover effects. Furthermore, it can be used to design control systems for structures that are modeled by mixed systems of coupled ordinary and partial differential equations.

  17. Efficient global optimization based 3D carotid AB-LIB MRI segmentation by simultaneously evolving coupled surfaces.

    PubMed

    Ukwatta, Eranga; Yuan, Jing; Rajchl, Martin; Fenster, Aaron

    2012-01-01

    Magnetic resonance (MR) imaging biomarkers of carotid atherosclerosis are increasingly being investigated for the risk assessment of vulnerable plaques. A fast and robust 3D segmentation of the carotid adventitia (AB) and lumen-intima (LIB) boundaries can greatly alleviate the measurement burden of generating quantitative imaging biomarkers in clinical research. In this paper, we propose a novel global optimization-based approach to segment the carotid AB and LIB from 3D T1-weighted black blood MR images, by simultaneously evolving two coupled surfaces with enforcement of anatomical consistency of the AB and LIB. We show that the evolution of two surfaces at each discrete time-frame can be optimized exactly and globally by means of convex relaxation. Our continuous max-flow based algorithm is implemented on GPUs to achieve high computational performance. The experimental results from 16 carotid MR images show that the algorithm obtained high agreement with manual segmentations and achieved high repeatability in segmentation.

  18. Study on Risk of Enterprise' Technology Innovation Based on ISM

    NASA Astrophysics Data System (ADS)

    Li, Hongyan

    The risk in the process of enterprise technology innovation is grouped into five subsystems: environmental risk, market risk, enterprise capacity risk, project risk, and project management risk, and 16 risk factors under these subsystems are identified. An Interpretative Structural Modeling (ISM) representation of the risk factors is established and their relationships and influence levels are determined; the purpose is to help enterprises assess risks and take countermeasures to minimize potential losses and increase innovation income.
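
    A minimal sketch of the core ISM step, with an assumed 4-factor direct-influence matrix rather than the paper's 16 factors: the reachability matrix is obtained from the adjacency matrix by transitive closure, after which level partitioning compares each factor's reachability and antecedent sets.

        import numpy as np

        A = np.array([[0, 1, 0, 0],      # A[i, j] = 1 where factor i directly influences factor j
                      [0, 0, 1, 0],
                      [0, 0, 0, 1],
                      [0, 0, 0, 0]])

        R = ((A + np.eye(4, dtype=int)) > 0).astype(int)     # include self-reachability
        for _ in range(4):
            R = ((R @ R) > 0).astype(int)                    # Warshall-style transitive closure
        print(R)                                             # reachability matrix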

  19. A ranking of European veterinary medicines based on environmental risks.

    PubMed

    Kools, Stefan A E; Boxall, Alistair; Moltmann, Johann F; Bryning, Gareth; Koschorreck, Jan; Knacker, Thomas

    2008-10-01

    The most likely entry pathways of veterinary pharmaceuticals to the environment are via slurry or manure from intensively reared animals to soil and via dung or urine from animals grazing on pasture. These pathways may result in contamination of surface water via runoff or leaching and drainage. Direct entry into water may occur by defecation by pasture animals or by companion animals. In addition, application of medicines for aquaculture is important for a limited number of veterinary medicinal products. For a large number of veterinary medicinal products, consistent data on the environmental risk have never been generated. In this project, a simple risk-based ranking procedure was developed that should allow assessment of the potential environmental risks of active substances of veterinary medicinal products. In the European Union approximately 2000 products containing 741 active substances were identified. In the prescreening step and in agreement with the technical guidelines released by the European Medicines Agency, 294 natural substances, complex mixtures, and substances with low expected exposure were exempted from the ranking procedure. For 233 active substances, sufficient information was collated on 4 exposure scenarios: intensively reared animals, pasture animals, companion animals, and aquaculture. The ranking approach was performed in 4 phases: (1) usage estimation; (2) characterization of exposure to soil, dung, surface water, and aquatic organisms depending on exposure scenarios; (3) characterization of effects based on therapeutic doses; and (4) risk characterization, which is the ratio of exposure to effects (risk index), and ranking. Generally, the top-ranked substances were from the antibiotic and parasiticide groups of veterinary medicines. Differences occurred in the ranking of substances in soil via application to either intensively reared or pasture animals. In intensive rearing, anticoccidia, for example, are used as feed

  20. Efficiency Mode of Energy Management based on Optimal Flight Path

    NASA Astrophysics Data System (ADS)

    Yang, Ling-xiao

    2016-07-01

    A new method of searching for the optimal flight path with respect to a target function is put forward and applied to the energy-management section of a reentry flight vehicle; the design determines an optimal flight path along which the energy is made to decline rapidly. Research on energy management is meaningful for engineering and can also improve the applicability and flexibility of the vehicle. The angle of attack and the bank angle are used as control variables to regulate energy and range in unpowered reentry flight. First, the angle-of-attack profile for the minimum lift-to-drag ratio is determined from the relation between range and lift-to-drag ratio. Second, the safe boundary of the flight corridor is built from the flight constraints. Third, the D-e profile is optimized for energy expenditure within the corridor using the influence relationship between the D-e profile and range. Finally, the design method is compared with the traditional pseudo-spectral method. Moreover, energy management is achieved by coordinating lateral motion, and the optimized D-e profile is tracked to demonstrate the practicability of planning a flight path with energy management.

  1. Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication

    ERIC Educational Resources Information Center

    Wolf, Michael Maclean

    2009-01-01

    Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…

  2. Model-Based Optimal Experimental Design for Complex Physical Systems

    DTIC Science & Technology

    2015-12-03

    ... computational tools have been inadequate. Our goal has been to develop new mathematical formulations, estimation approaches, and approximation strategies ... previous suboptimal approaches. Subject terms: computational mathematics; optimal experimental design; uncertainty quantification; Bayesian inference.

  3. Optimized dynamic framing for PET-based myocardial blood flow estimation

    NASA Astrophysics Data System (ADS)

    Kolthammer, Jeffrey A.; Muzic, Raymond F.

    2013-08-01

    An optimal experiment design methodology was developed to select the framing schedule to be used in dynamic positron emission tomography (PET) for estimation of myocardial blood flow using 82Rb. A compartment model and an arterial input function based on measured data were used to calculate a D-optimality criterion for a wide range of candidate framing schedules. To validate the optimality calculation, noisy time-activity curves were simulated, from which parameter values were estimated using an efficient and robust decomposition of the estimation problem. D-optimized schedules improved estimate precision compared to non-optimized schedules, including previously published schedules. To assess robustness, a range of physiologic conditions were simulated. Schedules that were optimal for one condition were nearly-optimal for others. The effect of infusion duration was investigated. Optimality was better for shorter than for longer tracer infusion durations, with the optimal schedule for the shortest infusion duration being nearly optimal for other durations. Together this suggests that a framing schedule optimized for one set of conditions will also work well for others and it is not necessary to use different schedules for different infusion durations or for rest and stress studies. The method for optimizing schedules is general and could be applied in other dynamic PET imaging studies.
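
    A hedged illustration of the D-optimality idea with a deliberately simplified one-tissue model and constant input (not the 82Rb kinetics or frame-averaging of the study): for each candidate framing schedule, the criterion is the log-determinant of a Fisher-like matrix built from the model sensitivities at the frame midpoints, and the schedule with the larger value is preferred.

        import numpy as np

        def model(t, K1, k2):                               # tissue activity for a constant input (arbitrary units)
            return (K1 / k2) * (1.0 - np.exp(-k2 * t))

        def d_criterion(frame_edges, K1=0.8, k2=0.15):
            t_mid = 0.5 * (frame_edges[:-1] + frame_edges[1:])
            eps = 1e-4                                      # central-difference sensitivities w.r.t. K1 and k2
            dK1 = (model(t_mid, K1 + eps, k2) - model(t_mid, K1 - eps, k2)) / (2 * eps)
            dk2 = (model(t_mid, K1, k2 + eps) - model(t_mid, K1, k2 - eps)) / (2 * eps)
            J = np.column_stack([dK1, dk2])
            return np.linalg.slogdet(J.T @ J)[1]            # log det of the Fisher-like matrix

        uniform = np.linspace(0.0, 120.0, 13)                                       # 12 equal frames
        early = np.concatenate([np.arange(0.0, 60.0, 5.0), [60.0, 90.0, 120.0]])    # denser early sampling
        print("uniform:", round(d_criterion(uniform), 2),
              "early-weighted:", round(d_criterion(early), 2))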

  4. Fews-Risk: A step towards risk-based flood forecasting

    NASA Astrophysics Data System (ADS)

    Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline

    2015-04-01

    Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to pure hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood prone areas and assets at risk in the protected areas are rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two dimensional flood spreading simulation and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • Location, timing and probability of failure of defined sections of the flood defence line; • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first? This presentation outlines the additional required workflows towards risk-based flood

  5. A Genetics-based Biomarker Risk Algorithm for Predicting Risk of Alzheimer’s Disease

    PubMed Central

    Lutz, Michael W.; Sundseth, Scott S.; Burns, Daniel K.; Saunders, Ann M.; Hayden, Kathleen M.; Burke, James R.; Welsh-Bohmer, Kathleen A.; Roses, Allen D.

    2016-01-01

    Background A straightforward, reproducible blood-based test that predicts age dependent risk of Alzheimer’s disease (AD) could be used as an enrichment tool for clinical development of therapies. This study evaluated the prognostic performance of a genetics-based biomarker risk algorithm (GBRA) established on a combination of Apolipoprotein E (APOE)/Translocase of outer mitochondrial membrane 40 homolog (TOMM40) genotypes and age, then compare it to cerebrospinal fluid (CSF) biomarkers, neuroimaging and neurocognitive tests using data from two independent AD cohorts. Methods The GBRA was developed using data from the prospective Bryan-ADRC study (n=407; 86 conversion events (mild cognitive impairment (MCI) or late onset Alzheimer’s disease (LOAD)). The performance of the algorithm was tested using data from the ADNI study (n=660; 457 individuals categorized as MCI or LOAD). Results The positive predictive values (PPV) and negative predictive values (NPV) of the GBRA are in the range of 70–80%. The relatively high odds ratio (approximately 3–5) and significant net reclassification index (NRI) scores comparing the GBRA to a version based on APOE and age alone support the value of the GBRA in risk prediction for MCI due to LOAD. Performance of the GBRA compares favorably with CSF and imaging (fMRI) biomarkers. In addition, the GBRA “high” and “low” AD-risk categorizations correlated well with pathological CSF biomarker levels, PET amyloid burden and neurocognitive scores. Conclusions Unlike dynamic markers (i.e., imaging, protein or lipid markers) that may be influenced by factors unrelated to disease, genomic DNA is easily collected, stable, and the technical methods for measurement are robust, inexpensive, and widely available. The performance characteristics of the GBRA support its use as a pharmacogenetic enrichment tool for LOAD delay of onset clinical trials, and merits further evaluation for its clinical utility in evaluating therapeutic

  6. Increasing Effectiveness and Efficiency Through Risk-Based Deployments

    DTIC Science & Technology

    2015-12-01

    Only report-form fragments were extracted for this record: "...research into the experiences of other entities with risk-based deployment methodologies." Subject terms: aviation security, Transportation... Abbreviation-list fragment: ...imaging technology; BDO, behavior detection officer; CATA, civil aviation threat assessment; CHDS, Center for Homeland Defense and Security; COTS... Abstract fragment: "...little to no threat to aviation." The TSA has the opportunity to continue this evolution, and address calls from the Government Accountability Office...

  7. CAROTID - a web-based platform for optimal personalized management of atherosclerotic patients.

    PubMed

    Gastounioti, Aimilia; Kolias, Vasileios; Golemati, Spyretta; Tsiaparas, Nikolaos N; Matsakou, Aikaterini; Stoitsis, John S; Kadoglou, Nikolaos P E; Gkekas, Christos; Kakisis, John D; Liapis, Christos D; Karakitsos, Petros; Sarafis, Ioannis; Angelidis, Pantelis; Nikita, Konstantina S

    2014-04-01

    Carotid atherosclerosis is the main cause of fatal cerebral ischemic events, thereby posing a major burden for public health and state economies. We propose a web-based platform named CAROTID to address the need for optimal management of patients with carotid atherosclerosis in a twofold sense: (a) objective selection of patients who need carotid revascularization (i.e., high-risk patients), using a multifaceted description of the disease consisting of ultrasound imaging, biochemical and clinical markers, and (b) effective storage and retrieval of patient data to facilitate frequent follow-ups and direct comparisons with related cases. These two services are achieved by two interconnected modules, namely the computer-aided diagnosis (CAD) tool and the intelligent archival system, in a unified, remotely accessible system. We present the design of the platform and describe three main usage scenarios to demonstrate the use of CAROTID in clinical practice. Additionally, the platform was evaluated in a real clinical environment in terms of CAD performance, end-user satisfaction and time spent on different functionalities. CAROTID classification accuracy for high- and low-risk cases was 87%; the corresponding stenosis-degree-based classification accuracy would have been 61%. Questionnaire-based user satisfaction showed encouraging results in terms of ease-of-use, clinical usefulness and patient data protection. Times for different CAROTID functionalities were generally short; as an example, the time spent generating the diagnostic decision was 5 min for a 4-s ultrasound video. Large datasets and future evaluation sessions in multiple medical institutions are still necessary to reveal with confidence the full potential of the platform.

  8. Optimization of industrial structure based on water environmental carrying capacity in Tieling City.

    PubMed

    Yue, Qiang; Hou, Limin; Wang, Tong; Wang, Liusuo; Zhu, Yue; Wang, Xiu; Cheng, Xilei

    2015-01-01

    A system dynamics optimization model of the industrial structure of Tieling City based on water environmental carrying capacity has been established. This system is divided into the following subsystems: water resources, economics, population, contaminants, and agriculture and husbandry. Three schemes were designed to simulate the model from 2011 to 2020, and these schemes were compared to obtain an optimal social and economic development model in Tieling City. Policy recommendations on industrial structure optimization based on the optimal solution provide scientific decision-making advice to develop a strong and sustainable economy in Tieling.

  9. A global optimization algorithm for simulation-based problems via the extended DIRECT scheme

    NASA Astrophysics Data System (ADS)

    Liu, Haitao; Xu, Shengli; Wang, Xiaofang; Wu, Junnan; Song, Yang

    2015-11-01

    This article presents a global optimization algorithm via the extension of the DIviding RECTangles (DIRECT) scheme to handle problems with computationally expensive simulations efficiently. The new optimization strategy improves the regular partition scheme of DIRECT to a flexible irregular partition scheme in order to utilize information from irregular points. The metamodelling technique is introduced to work with the flexible partition scheme to speed up the convergence, which is meaningful for simulation-based problems. Comparative results on eight representative benchmark problems and an engineering application with some existing global optimization algorithms indicate that the proposed global optimization strategy is promising for simulation-based problems in terms of efficiency and accuracy.
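
    For readers unfamiliar with DIRECT, the toy sketch below conveys the dividing idea in one dimension only: keep a set of intervals, evaluate the objective at each centre, and repeatedly trisect a promising interval. It deliberately omits the Lipschitz-based selection of potentially optimal rectangles and the metamodelling layer that the article adds, so it illustrates the partitioning concept rather than the authors' algorithm.

        import math

        def toy_direct(f, lo, hi, n_iter=30):
            # Keep (left, right, f(centre)) for every interval; greedily trisect the
            # interval with the best centre value.  Real DIRECT instead splits *all*
            # potentially optimal rectangles identified via a Lipschitz argument.
            intervals = [(lo, hi, f(0.5 * (lo + hi)))]
            for _ in range(n_iter):
                k = min(range(len(intervals)), key=lambda i: intervals[i][2])
                a, b, _ = intervals.pop(k)
                third = (b - a) / 3.0
                for j in range(3):
                    aa, bb = a + j * third, a + (j + 1) * third
                    intervals.append((aa, bb, f(0.5 * (aa + bb))))
            a, b, fc = min(intervals, key=lambda t: t[2])
            return 0.5 * (a + b), fc

        # simple multimodal 1-D test function
        x_best, f_best = toy_direct(lambda x: (x - 0.7) ** 2 + 0.1 * math.sin(20 * x), 0.0, 2.0)
        print(x_best, f_best)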

  10. Torque-based optimal acceleration control for electric vehicle

    NASA Astrophysics Data System (ADS)

    Lu, Dongbin; Ouyang, Minggao

    2014-03-01

    Existing research on acceleration control mainly focuses on optimization of the velocity trajectory with respect to a criterion formulation that weights acceleration time and fuel consumption. The minimum-fuel acceleration problem in conventional vehicles has been solved by Pontryagin's maximum principle and dynamic programming algorithms, respectively. Acceleration control with minimum energy consumption for battery electric vehicles (EV) has not been reported. In this paper, the permanent magnet synchronous motor (PMSM) is controlled by the field-oriented control (FOC) method, and the electric drive system for the EV (including the PMSM, the inverter and the battery) is modeled rather than relying on a detailed consumption map. An analytical algorithm is proposed to analyze the optimal acceleration control, and the optimal torque-versus-speed curve in the acceleration process is obtained. Considering the acceleration time, a penalty function is introduced to realize fast vehicle-speed tracking. The optimal acceleration control is also addressed with dynamic programming (DP). This method can solve the optimal acceleration problem with a precise time constraint, but it consumes a large amount of computation time. The EV used in simulation and experiment is a four-wheel hub-motor-drive electric vehicle. The simulation and experimental results show that the required battery energy differs little between the acceleration control obtained by the analytical algorithm and that obtained by DP, and is greatly reduced compared with constant-pedal-opening acceleration. The proposed analytical and DP algorithms can minimize the energy consumption in the EV's acceleration process, and the analytical algorithm is easy to implement in real-time control.
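
    A minimal dynamic-programming sketch of the minimum-energy acceleration idea is given below. The vehicle parameters, loss model and grids are invented for illustration and are far simpler than the PMSM/inverter/battery model used in the paper; the point is only to show how a DP over a speed grid trades energy against acceleration time.

        import numpy as np

        # Toy minimum-energy acceleration DP (all parameters are illustrative, not
        # the paper's vehicle data): state = speed on a uniform grid, decision =
        # acceleration level held over one time step.
        m, dt, T = 1500.0, 0.5, 40                 # mass [kg], step [s], horizon [steps]
        dv = 0.2
        v_grid = np.arange(0.0, 25.0 + dv, dv)     # speed grid [m/s]
        a_set = np.arange(0.4, 3.21, 0.4)          # admissible accelerations [m/s^2]
        eta, c_rr, rho_cda = 0.85, 0.012, 0.75     # drivetrain efficiency, rolling coeff., 0.5*rho*Cd*A

        def step_energy(v, a):
            """Battery energy [J] drawn over one step at speed v with acceleration a."""
            f_trac = m * a + c_rr * m * 9.81 + rho_cda * v**2
            return max(f_trac * (v + 0.5 * a * dt), 0.0) * dt / eta

        cost = np.full((T + 1, v_grid.size), np.inf)
        cost[0, 0] = 0.0                           # start from rest
        for t in range(T):
            for i, v in enumerate(v_grid):
                if not np.isfinite(cost[t, i]):
                    continue
                for a in a_set:
                    j = i + int(round(a * dt / dv))   # accelerations chosen to land on the grid
                    if j < v_grid.size:
                        cost[t + 1, j] = min(cost[t + 1, j], cost[t, i] + step_energy(v, a))

        target = v_grid.size - 1                   # index of the 25 m/s target speed
        best_t = int(np.argmin(cost[:, target]))
        print(f"min energy to 25 m/s: {cost[best_t, target] / 1e3:.1f} kJ, reached in {best_t * dt:.1f} s")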

  11. Integrated biodepuration of pesticide-contaminated wastewaters from the fruit-packaging industry using biobeds: Bioaugmentation, risk assessment and optimized management.

    PubMed

    Karas, Panagiotis A; Perruchon, Chiara; Karanasios, Evangelos; Papadopoulou, Evangelia S; Manthou, Elena; Sitra, Stefania; Ehaliotis, Constantinos; Karpouzas, Dimitrios G

    2016-12-15

    Wastewaters from fruit-packaging plants contain high loads of toxic and persistent pesticides and should be treated on site. We evaluated the depuration performance of five pilot biobeds against these effluents. In addition, we tested bioaugmentation with bacterial inocula as a strategy for optimization of their depuration capacity. Finally, we determined the composition and functional dynamics of the microbial community via q-PCR. Practical issues were also addressed, including the risk associated with the direct environmental disposal of biobed-treated effluents and decontamination methods for the spent packing material. Biobeds showed high depuration capacity (>99.5%) against all pesticides, with bioaugmentation maximizing their depuration performance against the persistent fungicide thiabendazole (TBZ). This was followed by a significant increase in the abundance of bacteria, fungi and of the aromatic-compound catabolic genes catA and pcaH. Bioaugmentation was the most potent decontamination method for spent packing material, with composting being an effective alternative. Risk assessment based on practical scenarios (pome and citrus fruit-packaging plants) and the depuration performance of the pilot biobeds showed that discharge of the treated effluents into a 0.1-ha disposal site did not entail an environmental risk, except for TBZ-containing effluents, where a larger disposal area (0.2 ha) or bioaugmentation alleviated the risk.

  12. Ecological risk assessment of a decommissioned military base

    SciTech Connect

    Starodub, M.E.; Feniak, N.A.; Willes, R.F.; Moore, C.E.; Mucklow, L.; Marshall, L.

    1995-12-31

    The ecological health risks to selected terrestrial animals at a decommissioned military base in Atlantic Canada have been assessed. Areas of the base varied in terrain and ground cover, as well as in the types and extent of contamination, depending on former uses of the sites. Analysis of surficial soils, sediments, water and fish tissue at the base indicated contamination by metals, PCBs, and various petroleum products and their constituents. Identification of chemicals of concern was based on these analyses, in conjunction with detailed chemical selection procedures. Exposures to chemicals of concern for ecological receptors were assessed in one of two ways. The exposures of moose, snowshoe hare and meadow vole were estimated in areas with surficial contamination, based on expected exposures to environmental media via oral, inhalation, and dermal routes of exposure. For two top predators (mink and bald eagle), exposures to bioaccumulative chemicals (cadmium, lead, mercury and PCBs) via transport through the aquatic and/or terrestrial food chain were estimated. A toxicological assessment was conducted for the chemicals of concern, to yield exposure limits derived from governmental regulations or developed based on no-observed-effect-levels (NOELs) reported in scientifically sound toxicological assays in relevant species. The risk evaluation of each chemical of concern was conducted as a comparison of the estimated total exposures to the exposure limits derived for the selected ecological receptors.
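
    The risk evaluation step described above, comparing estimated total exposures with derived exposure limits, amounts to a hazard-quotient style screen. A minimal sketch with invented numbers:

        # Minimal hazard-quotient style screen, mirroring the comparison of estimated
        # total exposure to a toxicity reference value (all numbers invented).
        exposures = {"cadmium": 0.8e-3, "lead": 2.4e-3, "mercury": 0.1e-3}   # mg/kg-day
        limits    = {"cadmium": 1.0e-3, "lead": 3.6e-3, "mercury": 0.3e-3}   # mg/kg-day

        for chem, dose in exposures.items():
            hq = dose / limits[chem]
            flag = "potential concern" if hq >= 1.0 else "below limit"
            print(f"{chem:8s} HQ = {hq:.2f}  ({flag})")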

  13. SU-E-T-617: A Feasibility Study of Navigation Based Multi Criteria Optimization for Advanced Cervical Cancer IMRT Planning

    SciTech Connect

    Ma, C

    2014-06-01

    Purpose: This study aims to validate multi-criteria optimization (MCO) against standard intensity-modulated radiation therapy (IMRT) optimization for advanced cervical cancer in RayStation (v2.4, RaySearch Laboratories, Sweden). Methods: IMRT plans for 10 patients with advanced cervical cancer were randomly selected. These plans had been designed with step-and-shoot optimization; new plans were then designed with MCO based on these plans while keeping the optimization conditions unchanged. The two sets of plans were compared, including the dose-volume histogram parameters of the PTV and OARs, and analysed with a paired t-test. Results: Plans were normalized so that 95% of the PTV volume received the prescribed dose (50 Gy). The rectum volumes receiving 10, 20, 30, and 40 Gy were reduced by 14.7%, 26.8%, 21.1%, and 10.5%, respectively (P≥0.05). The mean rectal dose was reduced by 7.2 Gy (P≤0.05). There were no significant differences in the dosimetric parameters for the bladder. Conclusion: Compared with standard IMRT optimization, MCO reduces the dose to organs at risk with the same PTV coverage, but the result needs further clinical evaluation.
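
    The dose-volume statistics reported above (Vx and mean dose) are simple functions of the per-voxel dose. A small sketch with a synthetic dose array, purely to show how such values are computed:

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy per-voxel dose array for one OAR (e.g. rectum), in Gy.  Purely
        # illustrative; a planning system would export these values.
        dose = rng.gamma(shape=4.0, scale=5.0, size=20000)

        def v_at(dose, threshold):
            """Vx: percentage of the structure volume receiving at least `threshold` Gy."""
            return np.mean(dose >= threshold) * 100.0

        for d in (10, 20, 30, 40):
            print(f"V{d} = {v_at(dose, d):.1f}%")
        print(f"mean dose = {dose.mean():.1f} Gy")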

  14. Risk based analysis: A rational approach to site cleanup

    SciTech Connect

    Arulanatham, R.; So, E.

    1994-12-31

    Soil and groundwater pollution in urban areas can often pose a threat to human health, water quality, or both. Soil and groundwater cleanup can be a very lengthy process and requires significant economic resources. The cleanup levels or requirements imposed by one agency sometimes do not match those required by another agency, especially for soil pollution. The involvement of several agencies at different times during the reclamation process has often diminished the cost-effectiveness of reclamation efforts. To help minimize this kind of problem (which both authors have experienced), the staff of the Alameda County Department of Environmental Health and the Regional Water Quality Control Board, San Francisco Bay Region, have jointly developed workable guidelines to assist responsible parties in deriving target cleanup goals that are protective of both human health (or other ecological receptors) and water quality. The following is a six-step summary of the methodology to assist responsible parties in properly managing their pollution problems: (1) site characterization; (2) initial risk-based screening of contaminants; (3) derivation of health- and/or ecological-risk-based cleanup goals; (4) derivation of groundwater-quality-based cleanup goals; (5) site cleanup goals and site remediation; and (6) risk management decisions.

  15. Developing Hydrogeological Site Characterization Strategies based on Human Health Risk

    NASA Astrophysics Data System (ADS)

    de Barros, F.; Rubin, Y.; Maxwell, R. M.

    2013-12-01

    In order to provide better sustainable groundwater quality management and minimize the impact of contamination on humans, improved understanding and quantification of the interaction between hydrogeological models, geological site information and human health are needed. Considering the joint influence of these components on the overall human health risk assessment, and the corresponding sources of uncertainty, aids decision makers in better allocating resources in data acquisition campaigns. This is important to (1) achieve remediation goals in a cost-effective manner, (2) protect human health and (3) keep water supplies clean in order to comply with quality standards. Such a task is challenging, since a full characterization of the subsurface is unfeasible due to financial and technological constraints. In addition, human exposure and physiological response to contamination are subject to uncertainty and variability. Normally, sampling strategies are developed with the goal of reducing uncertainty, but less often are they developed in the context of their impacts on the overall system uncertainty. Therefore, quantifying the impact of each of these components (hydrogeological, behavioral and physiological) on the final human health risk prediction can provide guidance for decision makers to best allocate resources towards minimal prediction uncertainty. In this presentation, a multi-component human health risk-based framework is presented which allows decision makers to set priorities through an information-entropy-based visualization tool. Results highlight the role of the characteristic length-scales of flow and transport in determining data needs within an integrated hydrogeological-health framework. Conditions where uncertainty reduction in human health risk predictions may benefit from better understanding of the health component, as opposed to a more detailed hydrogeological characterization, are also discussed. Finally, results illustrate how different dose

  16. Poaching risks in community-based natural resource management.

    PubMed

    Kahler, Jessica S; Roloff, Gary J; Gore, Meredith L

    2013-02-01

    Poaching can disrupt wildlife-management efforts in community-based natural resource management systems. Monitoring, estimating, and acquiring data on poaching is difficult. We used local-stakeholder knowledge and poaching records to rank and map the risk of poaching incidents in 2 areas where natural resources are managed by community members in Caprivi, Namibia. We mapped local stakeholder perceptions of the risk of poaching, risk of wildlife damage to livelihoods, and wildlife distribution and compared these maps with spatially explicit records of poaching events. Recorded poaching events and stakeholder perceptions of where poaching occurred were not spatially correlated. However, the locations of documented poaching events were spatially correlated with areas where stakeholders perceived wildlife as a threat to their livelihoods. This result suggests poaching occurred in response to wildlife damage. Local stakeholders thought that wildlife populations were at high risk of being poached and that poaching occurred where there was abundant wildlife. These findings suggest stakeholders were concerned about wildlife resources in their community and indicate a need for integrated and continued monitoring of poaching activities and further interventions at the wildlife-agricultural interface. Involving stakeholders in the assessment of poaching risks promotes their participation in local conservation efforts, a central tenet of community-based management. We treated stakeholders as poaching informants, rather than suspects, and our technique was spatially explicit. Different strategies to reduce poaching are likely needed in different areas. For example, interventions that reduce human-wildlife conflict may be required in residential areas, and increased and targeted patrolling may be required in more remote areas. Stakeholder-generated maps of human-wildlife interactions may be a valuable enforcement and intervention support tool.

  17. Reducing infection risk in implant-based breast-reconstruction surgery: challenges and solutions

    PubMed Central

    Ooi, Adrian SH; Song, David H

    2016-01-01

    Implant-based procedures are the most commonly performed method for postmastectomy breast reconstruction. While donor-site morbidity is low, these procedures are associated with a higher risk of reconstructive loss. Many of these losses are related to infection of the implant, which can lead to prolonged antibiotic treatment, undesired additional surgical procedures, and unsatisfactory results. This review summarizes the recent literature regarding implant-related breast-reconstruction infections and combines this with a practical approach to the patient and surgery aimed at reducing this risk. Prevention of infection begins with appropriate reconstructive choice based on an assessment and optimization of risk factors. These include patient and disease characteristics, such as smoking, obesity, large breast size, and immediate reconstructive procedures, as well as adjuvant therapy, such as radiotherapy and chemotherapy. For implant-based breast reconstruction, preoperative planning and organization are key to reducing infection. A logical and consistent intraoperative and postoperative surgical protocol, including appropriate antibiotic choice, mastectomy-pocket creation, implant handling, and considered use of acellular dermal matrix, contributes toward the reduction of breast-implant infections. PMID:27621667

  18. Web based collaborative decision making in flood risk management

    NASA Astrophysics Data System (ADS)

    Evers, Mariele; Almoradie, Adrian; Jonoski, Andreja

    2014-05-01

    Stakeholder participation in the development of flood risk management (FRM) plans is essential, since stakeholders often have a better understanding or knowledge of the potentials and limitations of their local area. Moreover, a participatory approach also creates trust amongst stakeholders, leading to a successful implementation of measures. Stakeholder participation, however, has its challenges and potential pitfalls that could lead to its premature termination. Such challenges and pitfalls include limited financial resources, stakeholders' spatial distribution and their interest to participate. Different types of participation in FRM may encounter diverse challenges. These types of participation in FRM can be classified into (1) Information and knowledge sharing (IKS), (2) Consultative participation (CP) or (3) Collaborative decision making (CDM), the most challenging type of participation. An innovative approach to address these challenges and potential pitfalls is a web-based mobile or computer-aided environment for stakeholder participation, which enhances the remote interaction between participating entities such as stakeholders. This paper presents a developed framework and an implementation of a CDM web-based environment for the Alster catchment (Hamburg, Germany) and the Cranbrook catchment (London, UK). The CDM framework consists of two main stages: (1) collaborative modelling and (2) participatory decision making. This paper also highlights the stakeholder analyses, the modelling approach and the application of General Public License (GPL) technologies in developing the web-based environments. Testing and evaluation of the environments were carried out through a series of stakeholder workshops. The overall results from the stakeholders' evaluation show that web-based environments can address the challenges and potential pitfalls of stakeholder participation and enhance participation in flood risk management. The web-based environment was developed within the DIANE

  19. Self-affirmation moderates effects of unrealistic optimism and pessimism on reactions to tailored risk feedback.

    PubMed

    Klein, William M P; Lipkus, Isaac M; Scholl, Sarah M; McQueen, Amy; Cerully, Jennifer L; Harris, Peter R

    2010-12-01

    We examined whether self-affirmation would facilitate intentions to engage in colorectal cancer (CRC) screening among individuals who were off-schedule for CRC screening and who were categorised as unrealistically optimistic, realistic or unrealistically pessimistic about their CRC risk. All participants received tailored risk feedback; in addition, one group received threatening social comparison information regarding their risk factors, a second received this information after a self-affirmation exercise and a third was a no-treatment control. When participants were unrealistically optimistic about their CRC risk (determined by comparing their perceived comparative risk to calculations from a risk algorithm), they expressed greater interest in screening if they were self-affirmed (relative to controls). Non-affirmed unrealistic optimists expressed lower interest relative to controls, suggesting that they were responding defensively. Realistic participants and unrealistically pessimistic participants who were self-affirmed expressed relatively less interest in CRC screening, suggesting that self-affirmation can be helpful or hurtful depending on the accuracy of one's risk perceptions.

  20. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for the risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility that defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is, Risk = P(D ≥ d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of the semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and cards compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
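
    The probability convolution of hazard and fragility described above can be written as a short numerical integration. The distributions below are placeholders, not the paper's calibrated hazard or fragility curves:

        import numpy as np
        from scipy import stats

        # Discretised convolution of hazard and fragility, in the spirit of
        # Risk = P(D >= d | S, V); the numbers below are illustrative only.
        s = np.linspace(0.0, 10.0, 1001)                         # landslide intensity (arbitrary units)
        hazard_pdf = stats.lognorm(s=0.8, scale=2.0).pdf(s)      # frequency of a given intensity
        fragility = stats.norm(loc=5.0, scale=1.5).cdf(s)        # P(damage >= threshold | intensity)

        risk = np.sum(fragility * hazard_pdf) * (s[1] - s[0])    # probability of exceeding the damage threshold
        print(f"P(D >= d) = {risk:.3f}")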

  1. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and its impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  2. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed. A comparison of the direct computation and response surface approaches was undertaken.
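
    The Monte Carlo part of such a model reduces, in outline, to sampling the uncertain inputs, pushing them through a thermal-response model and counting bondline-temperature exceedances. The sketch below uses a stand-in algebraic response instead of the trajectory, aeroheating and thermal-response tools named in the abstract:

        import numpy as np

        rng = np.random.default_rng(4)

        # Toy Monte Carlo version of the bondline-temperature failure criterion.
        # The "thermal response" below is an invented algebraic stand-in.
        n = 200_000
        heat_load    = rng.normal(1.00, 0.10, n)    # normalised integrated heat load
        conductivity = rng.normal(1.00, 0.05, n)    # normalised TPS conductivity
        thickness    = rng.normal(1.00, 0.03, n)    # normalised remaining thickness (MMOD damage would shift this)

        bondline_temp = 500.0 * heat_load * conductivity / thickness   # K above ambient (toy model)
        limit = 650.0                                                   # failure threshold

        p_fail = np.mean(bondline_temp > limit)
        print(f"estimated P(failure) = {p_fail:.2e}")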

  3. Risk estimation based on chromosomal aberrations induced by radiation

    NASA Technical Reports Server (NTRS)

    Durante, M.; Bonassi, S.; George, K.; Cucinotta, F. A.

    2001-01-01

    The presence of a causal association between the frequency of chromosomal aberrations in peripheral blood lymphocytes and the risk of cancer has been substantiated recently by epidemiological studies. Cytogenetic analyses of crew members of the Mir Space Station have shown that a significant increase in the frequency of chromosomal aberrations can be detected after flight, and that such an increase is likely to be attributed to the radiation exposure. The risk of cancer can be estimated directly from the yields of chromosomal aberrations, taking into account some aspects of individual susceptibility and other factors unrelated to radiation. However, the use of an appropriate technique for the collection and analysis of chromosomes and the choice of the structural aberrations to be measured are crucial in providing sound results. Based on the fraction of aberrant lymphocytes detected before and after flight, the relative risk after a long-term Mir mission is estimated to be about 1.2-1.3. The new technique of mFISH can provide useful insights into the quantification of risk on an individual basis.

  4. Fire risk reduction through a community-based risk assessment: reflections from Makola Market, Accra, Ghana.

    PubMed

    Oteng-Ababio, Martin; Sarpong, Akwasi Owusu

    2015-07-01

    This paper explores the level of vulnerability to the hazard of fire that exists in Makola Market in Accra, Ghana, and assesses how this threat can be reduced through a community-based risk assessment. It examines the perceptions of both market-stall occupants and primary stakeholders regarding the hazard of fire, and analyses the availability of local assets (coping strategies) with which to address the challenge. Through an evaluation of past instances of fire, as well as in-depth key stakeholder interviews, field visits, and observations, the study produces a detailed hazard map of the market. It goes on to recommend that policymakers consider short-to-long-term interventions to reduce the degree of risk. By foregrounding the essence of holistic and integrated planning, the paper calls for the incorporation of disaster mitigation measures in the overall urban planning process and for the strict enforcement of relevant building and fire safety codes by responsible public agencies.

  5. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions.
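
    The probability of non-compliance for the sight-distance limit state can be illustrated with a crude Monte Carlo estimate (the paper itself uses FORM, and all distributions and values below are invented):

        import numpy as np

        rng = np.random.default_rng(5)

        # Monte Carlo estimate of P(nc) for a horizontal curve:
        # demand = stopping sight distance, supply = available sight distance.
        n = 500_000
        v   = rng.normal(25.0, 2.5, n)               # operating speed [m/s]
        t_p = rng.lognormal(np.log(1.5), 0.3, n)     # perception-reaction time [s]
        a   = rng.normal(3.4, 0.6, n)                # deceleration rate [m/s^2]

        ssd_demand = v * t_p + v**2 / (2.0 * np.clip(a, 0.5, None))
        ssd_supply = 140.0                            # available sight distance on the curve [m]

        p_nc = np.mean(ssd_demand > ssd_supply)       # probability of non-compliance
        print(f"P(nc) = {p_nc:.3f}")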

  6. Optimizing and Validating a Brief Assessment for Identifying Children of Service Members at Risk for Psychological Health Problems Following Parent Deployment

    DTIC Science & Technology

    2013-07-01

    Only report-form fragments were extracted for this record: reference-list entries ("...Children (BASC) ADHD Monitor. Circle Pines, MN: American Guidance Service. Keane, T., Fairbank, J., Caddell, J., Zimering, R., Taylor, K., & Mora, C..."), a contract-number fragment ("...0036"), the report title (as given above), and the period of performance, 15 June 2012 - 14 June 2013.

  7. New displacement-based methods for optimal truss topology design

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Ben-Tal, Aharon; Haftka, Raphael T.

    1991-01-01

    Two alternate methods for maximum stiffness truss topology design are presented. The ground structure approach is used, and the problem is formulated in terms of displacements and bar areas. This large, nonconvex optimization problem can be solved by a simultaneous analysis and design approach. Alternatively, an equivalent, unconstrained, and convex problem in the displacements only can be formulated, and this problem can be solved by a nonsmooth, steepest descent algorithm. In both methods, the explicit solving of the equilibrium equations and the assembly of the global stiffness matrix are circumvented. A large number of examples have been studied, showing the attractive features of topology design as well as exposing interesting features of optimal topologies.
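
    For reference, one common way to write the ground-structure maximum-stiffness problem in the simultaneous analysis-and-design form referred to here, with bar areas a_i, bar lengths l_i, unit-area element stiffness matrices K_i and load vector f (the notation is ours, not necessarily the authors'):

        \min_{\mathbf{a}\ge 0,\ \mathbf{u}} \; \mathbf{f}^{\mathsf T}\mathbf{u}
        \quad \text{subject to} \quad
        \Big(\sum_{i=1}^{m} a_i \mathbf{K}_i\Big)\mathbf{u} = \mathbf{f},
        \qquad \sum_{i=1}^{m} a_i \ell_i \le V .

    The displacement-only reformulation mentioned in the abstract eliminates the bar areas and the equilibrium constraint, leaving an equivalent convex but nonsmooth problem in the displacements alone.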

  8. Optimization of amide-based EP3 receptor antagonists.

    PubMed

    Lee, Esther C Y; Futatsugi, Kentaro; Arcari, Joel T; Bahnck, Kevin; Coffey, Steven B; Derksen, David R; Kalgutkar, Amit S; Loria, Paula M; Sharma, Raman

    2016-06-01

    Prostaglandin E receptor subtype 3 (EP3) antagonism may be useful for treating a variety of conditions, from inflammation to cardiovascular and metabolic diseases. Previously, most EP3 antagonists were large acidic ligands that mimic the endogenous ligand, prostaglandin E2 (PGE2). This manuscript describes the optimization of a neutral small-molecule amide series with improved lipophilic efficiency (LipE), also known as lipophilic ligand efficiency (LLE) ((a) Nat. Rev. Drug Disc. 2007, 6, 881; (b) Annu. Rep. Med. Chem. 2010, 45, 380).
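
    Lipophilic (ligand) efficiency is commonly defined as potency minus lipophilicity, e.g.

        \mathrm{LipE} = \mathrm{LLE} = \mathrm{p}IC_{50}\ (\text{or } \mathrm{p}K_{i}) - \mathrm{c}\log P ,

    so an analogue counts as more efficient only when added potency is not simply bought with added lipophilicity.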

  9. Optimal control based seizure abatement using patient derived connectivity

    PubMed Central

    Taylor, Peter N.; Thomas, Jijju; Sinha, Nishant; Dauwels, Justin; Kaiser, Marcus; Thesen, Thomas; Ruths, Justin

    2015-01-01

    Epilepsy is a neurological disorder in which patients have recurrent seizures. Seizures occur in conjunction with abnormal electrical brain activity which can be recorded by the electroencephalogram (EEG). Often, this abnormal brain activity consists of high-amplitude regular spike-wave oscillations, as opposed to low-amplitude irregular oscillations in the non-seizure state. Active brain stimulation has been proposed as a method to terminate seizures prematurely; however, a general and widely applicable approach to optimal stimulation protocols is still lacking. In this study we use a computational model of epileptic spike-wave dynamics to evaluate the effectiveness of a pseudospectral method for simulated seizure abatement. We incorporate brain connectivity derived from magnetic resonance imaging of a subject with idiopathic generalized epilepsy. We find that the pseudospectral method can successfully generate time-varying stimuli that abate simulated seizures, even when including heterogeneous, patient-specific brain connectivity. The strength of the stimulus required varies in different brain areas. Our results suggest that seizure abatement, modeled as an optimal control problem and solved with the pseudospectral method, offers an attractive approach to treatment for in vivo stimulation techniques. Further, if optimal brain stimulation protocols are to be experimentally successful, then the heterogeneity of cortical connectivity should be accounted for in the development of those protocols, and thus more spatially localized solutions may be preferable. PMID:26089775

  10. Poblano v1.0 : a Matlab toolbox for gradient-based optimization.

    SciTech Connect

    Dunlavy, Daniel M.; Acar, Evrim; Kolda, Tamara Gibson

    2010-03-01

    We present Poblano v1.0, a Matlab toolbox for solving gradient-based unconstrained optimization problems. Poblano implements three optimization methods (nonlinear conjugate gradients, limited-memory BFGS, and truncated Newton) that require only first order derivative information. In this paper, we describe the Poblano methods, provide numerous examples on how to use Poblano, and present results of Poblano used in solving problems from a standard test collection of unconstrained optimization problems.

  11. Foraging on the potential energy surface: A swarm intelligence-based optimizer for molecular geometry

    NASA Astrophysics Data System (ADS)

    Wehmeyer, Christoph; Falk von Rudorff, Guido; Wolf, Sebastian; Kabbe, Gabriel; Schärf, Daniel; Kühne, Thomas D.; Sebastiani, Daniel

    2012-11-01

    We present a stochastic, swarm intelligence-based optimization algorithm for the prediction of global minima on potential energy surfaces of molecular cluster structures. Our optimization approach is a modification of the artificial bee colony (ABC) algorithm which is inspired by the foraging behavior of honey bees. We apply our modified ABC algorithm to the problem of global geometry optimization of molecular cluster structures and show its performance for clusters with 2-57 particles and different interatomic interaction potentials.
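
    A bare-bones version of the ABC idea is sketched below on a three-atom Lennard-Jones cluster in two dimensions. It collapses the employed and onlooker phases into a single greedy sweep and is far simpler than the modified ABC of the paper; it only illustrates the food-source and scout mechanics:

        import numpy as np

        rng = np.random.default_rng(0)

        def lj_energy(x):
            """Total Lennard-Jones energy of a small cluster; x holds flattened 2-D coordinates."""
            pts = x.reshape(-1, 2)
            e = 0.0
            for i in range(len(pts)):
                for j in range(i + 1, len(pts)):
                    r = np.linalg.norm(pts[i] - pts[j])
                    e += 4.0 * (r**-12 - r**-6)
            return e

        def abc_minimize(f, dim, n_food=20, limit=30, iters=400, bound=2.0):
            """Bare-bones artificial bee colony: perturb each food source against a
            random partner, keep improvements, and let scouts reset exhausted sources."""
            foods = rng.uniform(-bound, bound, (n_food, dim))
            fit = np.array([f(x) for x in foods])
            trials = np.zeros(n_food, dtype=int)
            for _ in range(iters):
                for i in range(n_food):
                    k = (i + 1 + rng.integers(n_food - 1)) % n_food   # partner source, k != i
                    j = rng.integers(dim)
                    cand = foods[i].copy()
                    cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
                    fc = f(cand)
                    if fc < fit[i]:
                        foods[i], fit[i], trials[i] = cand, fc, 0
                    else:
                        trials[i] += 1
                for i in np.where(trials > limit)[0]:                 # scout phase
                    foods[i] = rng.uniform(-bound, bound, dim)
                    fit[i], trials[i] = f(foods[i]), 0
            best = np.argmin(fit)
            return foods[best], fit[best]

        x_best, e_best = abc_minimize(lj_energy, dim=6)   # 3 atoms in 2-D
        print(f"best LJ energy found: {e_best:.3f} (ideal trimer: -3.0)")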

  12. Research of trajectory optimization on feeding manipulator based on internal penalty function

    NASA Astrophysics Data System (ADS)

    Wei, Chunli

    2016-10-01

    This paper discusses trajectory optimization of a feeding manipulator based on a penalty function. A type of feeding robot that works on the NC machining centers of a flexible workshop is selected, and a mathematical model with a penalty function is created, with the purpose not only of optimizing its walking path to reduce production cost, but also of improving its safety and production efficiency. Theoretical analysis and practice verify that the path optimization method is feasible.
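
    An internal (interior, or barrier) penalty turns a constrained path problem into a sequence of unconstrained ones whose penalty weight is driven to zero. The toy below keeps a single via-point outside a circular obstacle; the geometry and weights are invented and unrelated to the paper's manipulator model:

        import numpy as np
        from scipy.optimize import minimize

        # Toy path problem (hypothetical): move a via-point as close as possible to a
        # goal while keeping it outside a circular obstacle, g(x) <= 0, using an
        # interior (log-barrier) penalty that blows up near the constraint boundary.
        goal = np.array([2.5, 1.5])
        obstacle, radius = np.array([2.0, 1.5]), 1.0

        def f(x):                       # squared distance of the via-point to the goal
            return np.sum((x - goal)**2)

        def g(x):                       # <= 0 means the via-point stays outside the obstacle
            return radius**2 - np.sum((x - obstacle)**2)

        def barrier_objective(x, mu):
            gv = g(x)
            if gv >= 0:                 # infeasible point: reject with a huge value
                return 1e12
            return f(x) - mu * np.log(-gv)

        x = np.array([4.0, 0.0])        # strictly feasible start
        for mu in [1.0, 0.1, 0.01, 0.001]:          # shrink the barrier weight
            res = minimize(barrier_objective, x, args=(mu,), method="Nelder-Mead")
            x = res.x
        print("via-point:", x, "constraint value:", g(x))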

  13. Local-in-Time Adjoint-Based Method for Optimal Control/Design Optimization of Unsteady Compressible Flows

    NASA Technical Reports Server (NTRS)

    Yamaleev, N. K.; Diskin, B.; Nielsen, E. J.

    2009-01-01

    We study local-in-time adjoint-based methods for minimization of flow-matching functionals subject to the 2-D unsteady compressible Euler equations. The key idea of the local-in-time method is to construct a very accurate approximation of the global-in-time adjoint equations and the corresponding sensitivity derivative by using only local information available on each time subinterval. In contrast to conventional time-dependent adjoint-based optimization methods, which require backward-in-time integration of the adjoint equations over the entire time interval, the local-in-time method solves local adjoint equations sequentially over each time subinterval. Since each subinterval contains relatively few time steps, the storage cost of the local-in-time method is much lower than that of the global adjoint formulation, thus making time-dependent optimization feasible for practical applications. The paper presents a detailed comparison of the local- and global-in-time adjoint-based methods for minimization of a tracking functional governed by the Euler equations describing the flow around a circular bump. Our numerical results show that the local-in-time method converges to the same optimal solution obtained with the global counterpart, while drastically reducing the memory cost as compared to the global-in-time adjoint formulation.

  14. The optimal dynamic immunization under a controlled heterogeneous node-based SIRS model

    NASA Astrophysics Data System (ADS)

    Yang, Lu-Xing; Draief, Moez; Yang, Xiaofan

    2016-05-01

    Dynamic immunizations, under which the state of the propagation network of electronic viruses can be changed by adjusting the control measures, are regarded as an alternative to static immunizations. This paper addresses the optimal dynamical immunization under the widely accepted SIRS assumption. First, based on a controlled heterogeneous node-based SIRS model, an optimal control problem capturing the optimal dynamical immunization is formulated. Second, the existence of an optimal dynamical immunization scheme is shown, and the corresponding optimality system is derived. Next, some numerical examples are given to show that an optimal immunization strategy can be worked out by numerically solving the optimality system, from which it is found that the network topology has a complex impact on the optimal immunization strategy. Finally, the difference between a payoff and the minimum payoff is estimated in terms of the deviation of the corresponding immunization strategy from the optimal immunization strategy. The proposed optimal immunization scheme is justified, because it can achieve a low level of infections at a low cost.
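
    One plausible node-level controlled SIRS model of the kind referred to above is sketched below, with a constant immunization effort standing in for the optimal time-varying control derived in the paper; the network, rates and control value are illustrative only:

        import numpy as np

        rng = np.random.default_rng(6)

        # Plausible controlled node-level SIRS mean-field model (the paper's exact
        # equations and cost functional may differ): immunization effort u_i moves
        # susceptible probability mass of node i directly into the recovered state.
        n = 30
        A = (rng.random((n, n)) < 0.1).astype(float)
        A = np.triu(A, 1); A = A + A.T                       # undirected contact network

        beta, gamma, delta = 0.2, 0.4, 0.1                   # infection / recovery / immunity-loss rates

        def simulate(u, dt=0.05, steps=800):
            S, I, R = np.full(n, 0.95), np.full(n, 0.05), np.zeros(n)
            for _ in range(steps):
                force = beta * (A @ I)                       # per-node infection pressure
                dS = delta * R - force * S - u * S
                dI = force * S - gamma * I
                dR = gamma * I + u * S - delta * R
                S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
            return I.mean()

        print("mean infection level, no immunization:", round(simulate(np.zeros(n)), 4))
        print("mean infection level, u_i = 0.05     :", round(simulate(np.full(n, 0.05)), 4))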

  15. Prediction-based manufacturing center self-adaptive demand side energy optimization in cyber physical systems

    NASA Astrophysics Data System (ADS)

    Sun, Xinyao; Wang, Xue; Wu, Jiangwei; Liu, Youda

    2014-05-01

    Cyber-physical systems (CPS) have recently emerged as a technology that can provide promising approaches to demand-side management (DSM), an important capability in industrial power systems. Meanwhile, the manufacturing center is a typical industrial power subsystem with dozens of high-energy-consumption devices that have complex physical dynamics. DSM, integrated with CPS, is an effective methodology for solving energy optimization problems in the manufacturing center. This paper presents a prediction-based manufacturing center self-adaptive energy optimization method for demand-side management in cyber-physical systems. To gain prior knowledge of DSM operating results, a sparse-Bayesian-learning-based componential forecasting method is introduced to predict 24-hour electric load levels for specific industrial areas in China. From these data, a pricing strategy is designed based on the short-term load forecasting results. To minimize total energy costs while guaranteeing manufacturing center service quality, an adaptive demand-side energy optimization algorithm is presented. The proposed scheme is tested in a machining center energy optimization experiment. An AMI sensing system is used to measure the demand-side energy consumption of the manufacturing center. Based on the data collected from the sensing system, the load-prediction-based energy optimization scheme is implemented. By employing both the PSO and the CPSO methods, the problem of DSM in the manufacturing center is solved. The results of the experiment show that the self-adaptive CPSO energy optimization method improves optimization by 5% compared with the traditional PSO method.
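
    A plain global-best PSO of the kind used as the baseline above can be written in a few lines. The demand-side cost below (a time-of-use tariff plus a soft energy-requirement penalty) is a toy stand-in for the manufacturing-center model, and the CPSO variant is not reproduced here:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical demand-side toy: schedule one device's hourly power over 24 h
        # so that a required amount of energy is delivered at minimum cost under a
        # time-of-use tariff.  All numbers are illustrative.
        price = 0.10 + 0.08 * np.sin(np.linspace(0, 2 * np.pi, 24))   # $/kWh
        energy_required, p_max = 30.0, 3.0                             # kWh, kW

        def cost(p):
            p = np.clip(p, 0.0, p_max)
            return price @ p + 10.0 * abs(p.sum() - energy_required)   # tariff + soft energy constraint

        def pso(f, dim, n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5):
            """Plain global-best particle swarm optimization."""
            x = rng.uniform(0.0, p_max, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_f = x.copy(), np.array([f(xi) for xi in x])
            g = pbest[np.argmin(pbest_f)].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = x + v
                fx = np.array([f(xi) for xi in x])
                improved = fx < pbest_f
                pbest[improved], pbest_f[improved] = x[improved], fx[improved]
                g = pbest[np.argmin(pbest_f)].copy()
            return g, f(g)

        schedule, c = pso(cost, dim=24)
        print(f"optimised daily cost: ${c:.2f}")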

  16. Optimized Deposition Parameters & Coating Properties of Cobalt Phosphorus Alloy Electroplating for Technology Insertion Risk Reduction

    DTIC Science & Technology

    2010-10-01

    Only title-page and report fragments were extracted for this record: "TECHNOLOGY INSERTION RISK REDUCTION (ESTCP Project WP-0411), Ruben Prado & John Benfer, Naval Air Systems Command; Diana Facchini, Integran..."; "...Phosphorus Alloy Electroplating for Technology Insertion Risk Reduction", Ruben Prado, John Benfer, Diana Facchini, Keith Legg, NAVAIR ISSC/FRC-SE NAS...; and an abstract fragment on adhesion testing: "...coating. Attempts were then made to pry the coating with a sharp blade to assess whether there was any lift off of the coating which would indicate..."

  17. A faster optimization method based on support vector regression for aerodynamic problems

    NASA Astrophysics Data System (ADS)

    Yang, Xixiang; Zhang, Weihua

    2013-09-01

    In this paper, a new strategy for the optimal design of complex aerodynamic configurations with reasonably low computational effort is proposed. In order to solve the formulated aerodynamic optimization problem with its heavy computational cost, two steps are taken: (1) a sequential approximation method based on support vector regression (SVR) and a hybrid cross-validation strategy is proposed to predict aerodynamic coefficients, and thus to approximate the objective function and constraint conditions of the originally formulated optimization problem from a limited set of sample points; (2) a sequential optimization algorithm is proposed to ensure that the optimal solution obtained by solving the approximate optimization problem in step (1) is very close to the optimal solution of the originally formulated optimization problem. In the end, we adopt a complex aerodynamic design problem, namely the optimal aerodynamic design of a flight vehicle with grid fins, to demonstrate the proposed optimization method, and numerical results show that better results can be obtained with a significantly lower computational effort than with classical optimization techniques.
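
    The two-step strategy (fit an SVR surrogate on a limited sample, then optimize the cheap surrogate) can be sketched as follows; the "expensive simulation" is replaced by an analytic test function, and no cross-validation or sequential refinement is shown:

        import numpy as np
        from sklearn.svm import SVR
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)

        # Stand-in for an expensive aerodynamic simulation (hypothetical 2-D test function).
        def expensive_sim(x):
            return (x[0] - 0.3)**2 + 2.0 * (x[1] + 0.2)**2 + 0.05 * np.sin(8 * x[0])

        # 1) fit an SVR surrogate on a limited sample of "simulation" runs
        X = rng.uniform(-1, 1, (40, 2))
        y = np.array([expensive_sim(x) for x in X])
        surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, y)

        # 2) optimise the cheap surrogate instead of the expensive model
        res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
                       x0=np.zeros(2), method="Nelder-Mead")

        # 3) verify the candidate with one extra expensive evaluation (one step of
        #    the sequential loop described in the abstract)
        print("surrogate optimum:", res.x, "true value there:", expensive_sim(res.x))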

  18. Agent-Based Mapping of Credit Risk for Sustainable Microfinance

    PubMed Central

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk---a major obstacle to the sustainability of lenders reaching out to the poor. Specifically, using elements of network theory, we constructed an agent-based model that obeys the stylized rules of the microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantiti