Sample records for case study optimal

  1. Managing the Public Sector Research and Development Portfolio Selection Process: A Case Study of Quantitative Selection and Optimization

    DTIC Science & Technology

    2016-09-01

    PUBLIC SECTOR RESEARCH & DEVELOPMENT PORTFOLIO SELECTION PROCESS: A CASE STUDY OF QUANTITATIVE SELECTION AND OPTIMIZATION, by Jason A. Schwartz ... describing how public sector organizations can implement a research and development (R&D) portfolio optimization strategy to maximize the cost ...

  2. A Homogenization Approach for Design and Simulation of Blast Resistant Composites

    NASA Astrophysics Data System (ADS)

    Sheyka, Michael

    Structural composites have been used in aerospace and structural engineering due to their high strength-to-weight ratio. Composite laminates have been successfully and extensively used in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast resistant composites. Three case studies are performed to examine the usefulness of different methods that may be used in designing and optimizing composite plates for blast resistance. The first case study utilizes a single-degree-of-freedom system to simulate the blast together with a reliability-based approach; it examines homogeneous plates, and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of a composite unit cell made of two different materials. The methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates blast using the explicit blast method. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process, and Pareto optimal solutions are determined in both. Case study 3 is an integrative method for determining optimal stacking sequence, microstructure and plate thicknesses. The validity of the different methods, such as homogenization, reliability, explicit blast modeling and multi-objective genetic algorithms, is discussed. Possible extension of the methods to include strain rate effects and parallel computation is also examined.

  3. Optimization of vibratory energy harvesters with stochastic parametric uncertainty: a new perspective

    NASA Astrophysics Data System (ADS)

    Haji Hosseinloo, Ashkan; Turitsyn, Konstantin

    2016-04-01

    Vibration energy harvesting has been shown to be a promising power source for many small-scale applications, mainly because of the considerable reduction in the energy consumption of modern electronics and the scalability issues of conventional batteries. However, energy harvesters may not be as robust as conventional batteries, and their performance can deteriorate drastically in the presence of uncertainty in their parameters. Hence, the study of uncertainty propagation and optimization under uncertainty is essential for proper and robust performance of harvesters in practice. While previous studies have focused on optimizing the expected power, we propose a new and more practical optimization perspective: optimization for the worst-case (minimum) power. We formulate the problem in a generic fashion and, as a simple example, apply it to a linear piezoelectric energy harvester. We study the effect of parametric uncertainty in its natural frequency, load resistance, and electromechanical coupling coefficient on its worst-case power and then optimize for it under different confidence levels. The results show a significant improvement in the worst-case power of the harvester designed in this way compared to that of a naively (deterministically) optimized harvester.
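
    A minimal sketch of the worst-case design idea described above, assuming a hypothetical surrogate power function in place of the harvester's electromechanical model; the design variable, parameter ranges, and sample count are illustrative only.

    ```python
    # Minimal sketch (not the authors' formulation): worst-case vs. expectation-based
    # design of a harvester parameter under sampled parametric uncertainty.
    import numpy as np

    rng = np.random.default_rng(0)

    def power(design_r, omega_n, coupling, load_r):
        """Hypothetical stand-in for harvested average power; a real model would
        come from the harvester's electromechanical equations."""
        mismatch = (design_r - load_r) ** 2 + 0.1 * (omega_n - 1.0) ** 2
        return coupling / (1.0 + 10.0 * mismatch)

    # Sampled uncertainty set (a chosen confidence level would translate into ranges
    # for natural frequency, coupling coefficient and load resistance).
    samples = rng.uniform(low=[0.9, 0.8, 0.8], high=[1.1, 1.2, 1.2], size=(500, 3))

    candidate_designs = np.linspace(0.5, 1.5, 101)
    worst_case = [min(power(r, *s) for s in samples) for r in candidate_designs]
    expected = [np.mean([power(r, *s) for s in samples]) for r in candidate_designs]

    best_wc = candidate_designs[int(np.argmax(worst_case))]
    best_ev = candidate_designs[int(np.argmax(expected))]
    print(f"worst-case-optimal design: {best_wc:.2f}, expectation-optimal design: {best_ev:.2f}")
    ```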

  4. Stochastic Robust Mathematical Programming Model for Power System Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cong; Lee, Changhyeok; Chen, Haoyong

    2016-01-01

    This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.

  5. A case study on topology optimized design for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Gebisa, A. W.; Lemu, H. G.

    2017-12-01

    Topology optimization is an optimization method that employs mathematical tools to optimize material distribution in a part to be designed. Earlier developments of topology optimization assumed conventional manufacturing techniques, which have limitations in producing complex geometries; this has prevented topology optimization efforts from being fully realized. With the emergence of additive manufacturing (AM) technologies, which build a part layer upon layer directly from three-dimensional (3D) model data, producing complex geometry is no longer an issue. Realization of topology optimization through AM provides full design freedom for design engineers. This article focuses on a topology-optimized design approach for additive manufacturing, with a case study on the lightweight design of a jet engine bracket. The results show that, when additive manufacturing is considered, topology optimization is a powerful design technique to reduce the weight of a product while maintaining the design requirements.

  6. The effect of dropout on the efficiency of D-optimal designs of linear mixed models.

    PubMed

    Ortega-Azurduy, S A; Tan, F E S; Berger, M P F

    2008-06-30

    Dropout is often encountered in longitudinal data. Optimal designs will usually not remain optimal in the presence of dropout. In this paper, we study D-optimal designs for linear mixed models where dropout is encountered. Moreover, we estimate the efficiency loss in cases where a D-optimal design for complete data is chosen instead of that for data with dropout. Two types of monotonically decreasing response probability functions are investigated to describe dropout. Our results show that the location of D-optimal design points for the dropout case will shift with respect to that for the complete and uncorrelated data case. Owing to this shift, the information collected at the D-optimal design points for the complete data case does not correspond to the smallest variance. We show that the size of the displacement of the time points depends on the linear mixed model and that the efficiency loss is moderate.
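
    A minimal sketch of how a D-criterion can be weighted by monotone dropout for a simple random-intercept mixed model; the variance components, time points, and retention probabilities are assumed for illustration and are not taken from the paper.

    ```python
    # Minimal sketch (assumed random-intercept model, not the authors' exact setup):
    # D-criterion of a candidate time-point design under complete data versus
    # monotone dropout, for a linear mixed model y = b0 + b1*t + u_i + eps.
    import numpy as np

    sigma_u2, sigma_e2 = 0.5, 1.0   # assumed variance components

    def info_matrix(times):
        """Fisher information of the fixed effects for one subject observed at `times`."""
        X = np.column_stack([np.ones(len(times)), times])
        V = sigma_u2 * np.ones((len(times), len(times))) + sigma_e2 * np.eye(len(times))
        return X.T @ np.linalg.solve(V, X)

    def expected_info(times, retain):
        """Expected information under monotone dropout; retain[k] is the probability
        that a subject is still in the study at the (k+1)-th visit."""
        probs = np.append(retain, 0.0)
        M = np.zeros((2, 2))
        for k in range(1, len(times) + 1):
            p_k = retain[k - 1] - probs[k]      # probability of completing exactly k visits
            M += p_k * info_matrix(times[:k])
        return M

    design = np.array([0.0, 0.33, 0.67, 1.0])    # candidate measurement times
    retain = np.array([1.0, 0.85, 0.70, 0.55])   # hypothetical retention curve

    d_complete = np.linalg.det(info_matrix(design))
    d_dropout = np.linalg.det(expected_info(design, retain))
    # relative D-efficiency with p = 2 fixed effects
    print("D-efficiency of the design under dropout:", (d_dropout / d_complete) ** (1 / 2))
    ```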

  7. Field-based optimal-design of an electric motor: a new sensitivity formulation

    NASA Astrophysics Data System (ADS)

    Barba, Paolo Di; Mognaschi, Maria Evelina; Lowther, David Alister; Wiak, Sławomir

    2017-12-01

    In this paper, a new approach to robust optimal design is proposed. The idea is to consider the sensitivity by means of two auxiliary criteria A and D, related to the magnitude and isotropy of the sensitivity, respectively. The optimal design of a switched-reluctance motor is considered as a case study: since the case study exhibits two design criteria, the relevant Pareto front is approximated by means of evolutionary computing.

  8. Optimal Control for Quantum Driving of Two-Level Systems

    NASA Astrophysics Data System (ADS)

    Qi, Xiao-Qiu

    2018-01-01

    In this paper, the optimal quantum control of two-level systems is studied by means of decompositions of SU(2). Using the Pontryagin maximum principle, the minimum time of quantum control is analyzed in detail. The solution scheme for the optimal control function is given in the general case. Finally, two specific cases, which can be applied in many quantum systems, are used to illustrate the scheme, and the corresponding optimal control functions are obtained.

  9. Applications of artificial neural nets in structural mechanics

    NASA Technical Reports Server (NTRS)

    Berke, Laszlo; Hajela, Prabhat

    1990-01-01

    A brief introduction to the fundamentals of neural nets is given, followed by two applications in structural optimization. In the first case, the feasibility of simulating with neural nets the many structural analyses performed during optimization iterations was studied. In the second case, the concept of using neural nets to capture design expertise was studied.

  10. Applications of artificial neural nets in structural mechanics

    NASA Technical Reports Server (NTRS)

    Berke, L.; Hajela, P.

    1992-01-01

    A brief introduction to the fundamentals of neural nets is given, followed by two applications in structural optimization. In the first case, the feasibility of simulating with neural nets the many structural analyses performed during optimization iterations was studied. In the second case, the concept of using neural nets to capture design expertise was studied.

  11. Optimal design of studies of influenza transmission in households. I: case-ascertained studies.

    PubMed

    Klick, B; Leung, G M; Cowling, B J

    2012-01-01

    Case-ascertained household transmission studies, in which households including an 'index case' are recruited and followed up, are invaluable to understanding the epidemiology of influenza. We used a simulation approach parameterized with data from household transmission studies to evaluate alternative study designs. We compared studies that relied on self-reported illness in household contacts vs. studies that used home visits to collect swab specimens for virological confirmation of secondary infections, allowing for the trade-off between sample size and intensity of follow-up given a fixed budget. For studies estimating the secondary attack proportion, 2-3 follow-up visits with specimens collected from all members regardless of illness were optimal. However, for studies comparing secondary attack proportions between two or more groups, such as controlled intervention studies, designs with reactive home visits following illness reports in contacts were most powerful, while a design with one home visit optimally timed also performed well.

  12. Optimized model tuning in medical systems.

    PubMed

    Kléma, Jirí; Kubalík, Jirí; Lhotská, Lenka

    2005-12-01

    In medical systems it is often advantageous to utilize specific problem situations (cases) in addition to or instead of a general model. Decisions are then based on relevant past cases retrieved from a case memory. The reliability of such decisions depends directly on the ability to identify cases of practical relevance to the current situation. This paper discusses issues of automated tuning in order to obtain a proper definition of mutual case similarity in a specific medical domain. The main focus is on a reasonably time-consuming optimization of the parameters that determine case retrieval and further utilization in decision making/prediction. The two case studies - mortality prediction after cardiological intervention, and resource allocation at a spa - document that the optimization process is influenced by various characteristics of the problem domain.

  13. Geostationary Collocation: Case Studies for Optimal Maneuvers

    DTIC Science & Technology

    2016-03-01

    Naval Postgraduate School, Monterey, California, master's thesis; approved for public release, distribution is unlimited ... GEOSTATIONARY COLLOCATION: CASE STUDIES FOR OPTIMAL MANEUVERS ... The geostationary belt is considered a natural resource, and as time goes by, the physical spaces for geostationary satellites will run out. ...

  14. Transaction fees and optimal rebalancing in the growth-optimal portfolio

    NASA Astrophysics Data System (ADS)

    Feng, Yu; Medo, Matúš; Zhang, Liang; Zhang, Yi-Cheng

    2011-05-01

    The growth-optimal portfolio optimization strategy pioneered by Kelly is based on constant portfolio rebalancing, which makes it sensitive to transaction fees. We examine the effect of fees on an example of a risky asset with a binary return distribution and show that the fees may give rise to an optimal period of portfolio rebalancing. The optimal period is found analytically in the case of lognormal returns. This result is subsequently generalized and numerically verified for broad return distributions and returns generated by a GARCH process. Finally, we study the case when investment is rebalanced only partially and show that this strategy can improve the investment long-term growth rate more than optimization of the rebalancing period.
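
    A minimal simulation sketch of the trade-off described above, assuming a binary-return asset, a fixed target fraction, and a proportional fee; the parameter values are illustrative, not the paper's.

    ```python
    # Minimal sketch (assumed binary returns and proportional fees): estimate the
    # long-run growth rate of a fixed-fraction portfolio as a function of the
    # rebalancing period.
    import numpy as np

    rng = np.random.default_rng(1)
    f_target = 0.5         # target fraction held in the risky asset
    fee = 0.005            # proportional transaction fee on the traded amount
    up, down, p_up = 0.10, -0.08, 0.5
    n_steps = 100_000

    def growth_rate(period):
        risky, safe = f_target, 1.0 - f_target
        for t in range(n_steps):
            r = up if rng.random() < p_up else down
            risky *= 1.0 + r
            if (t + 1) % period == 0:
                total = risky + safe
                traded = abs(risky - f_target * total)
                total -= fee * traded             # pay the fee, then rebalance
                risky, safe = f_target * total, (1.0 - f_target) * total
        return np.log(risky + safe) / n_steps      # per-step log growth

    for period in (1, 5, 10, 20, 50, 100):
        print(f"rebalance every {period:3d} steps: growth rate {growth_rate(period):.6f}")
    ```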

  15. Web-GIS oriented systems viability for municipal solid waste selective collection optimization in developed and transient economies.

    PubMed

    Rada, E C; Ragazzi, M; Fedrizzi, P

    2013-04-01

    Municipal solid waste management is a multidisciplinary activity that includes generation, source separation, storage, collection, transfer and transport, processing and recovery, and, last but not least, disposal. The optimization of waste collection, through source separation, is compulsory where a landfill based management must be overcome. In this paper, a few aspects related to the implementation of a Web-GIS based system are analyzed. This approach is critically analyzed referring to the experience of two Italian case studies and two additional extra-European case studies. The first case is one of the best examples of selective collection optimization in Italy. The obtained efficiency is very high: 80% of waste is source separated for recycling purposes. In the second reference case, the local administration is going to be faced with the optimization of waste collection through Web-GIS oriented technologies for the first time. The starting scenario is far from an optimized management of municipal solid waste. The last two case studies concern pilot experiences in China and Malaysia. Each step of the Web-GIS oriented strategy is comparatively discussed referring to typical scenarios of developed and transient economies. The main result is that transient economies are ready to move toward Web oriented tools for MSW management, but this opportunity is not yet well exploited in the sector. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Comparison of linear and nonlinear programming approaches for "worst case dose" and "minmax" robust optimization of intensity-modulated proton therapy dose distributions.

    PubMed

    Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino

    2017-03-01

    Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to loss of robustness. The purpose of this study was to evaluate and compare the performance, in terms of plan quality and robustness, of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and the conventional planning target volume (PTV)-based optimization approach were applied to designing IMPT plans for five patients: two with prostate cancer, one with skull-base cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-base and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases, in which all uncertainty scenarios were able to satisfy tight dose constraints, while NLP performed better in more difficult cases, in which tight dose limits were hard to meet for most uncertainty scenarios. For robust optimization, the worst case dose approach was less sensitive to uncertainties than was the minmax approach for the prostate and skull-base cancer patients, whereas the minmax approach was superior for the head and neck cancer patients. The robustness of the IMPT plans was remarkably better after robust optimization than after PTV-based optimization, and the NLP-PTV-based optimization outperformed the LP-PTV-based optimization regarding robustness of clinical target volume coverage. In addition, plans generated using LP-based methods had notably fewer scanning spots than did those generated using NLP-based methods. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
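
    A toy sketch contrasting the two robust objectives named above ("minmax" versus "worst-case dose") on random dose-influence matrices; the matrices, voxel counts, dose values and optimizer are placeholders, not a clinical planning system.

    ```python
    # Toy sketch (random influence matrices, arbitrary dose units): the two robust
    # objective constructions, each minimized over a spot-weight vector w.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)
    n_spots, n_target, n_oar, n_scen = 12, 30, 15, 5
    # dose-influence matrices for each uncertainty scenario (target and OAR voxels)
    A_t = rng.uniform(0.0, 1.0, size=(n_scen, n_target, n_spots))
    A_o = rng.uniform(0.0, 0.3, size=(n_scen, n_oar, n_spots))
    d_presc, d_max = 1.0, 0.5         # prescribed target dose and OAR dose limit

    def composite(dose_target, dose_oar):
        """Quadratic penalties on target dose deviation and OAR overdose."""
        return (np.mean((dose_target - d_presc) ** 2)
                + np.mean(np.maximum(dose_oar - d_max, 0.0) ** 2))

    def minmax_obj(w):
        # "minmax": evaluate the composite objective per scenario, optimize the worst one
        return max(composite(A_t[s] @ w, A_o[s] @ w) for s in range(n_scen))

    def worst_case_dose_obj(w):
        # "worst-case dose": build a voxel-wise worst-case dose distribution first
        d_t = np.stack([A_t[s] @ w for s in range(n_scen)])
        d_o = np.stack([A_o[s] @ w for s in range(n_scen)])
        return composite(d_t.min(axis=0), d_o.max(axis=0))

    x0 = np.full(n_spots, 0.3)        # square-parameterization keeps spot weights non-negative
    for name, obj in [("minmax", minmax_obj), ("worst-case dose", worst_case_dose_obj)]:
        res = minimize(lambda x: obj(x ** 2), x0, method="Nelder-Mead",
                       options={"maxiter": 5000, "xatol": 1e-5, "fatol": 1e-8})
        w = res.x ** 2
        print(f"{name}: robust objective = {obj(w):.4f}")
    ```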

  17. Optimizing the construction of devices to control inaccessible surfaces - case study

    NASA Astrophysics Data System (ADS)

    Niţu, E. L.; Costea, A.; Iordache, M. D.; Rizea, A. D.; Babă, Al

    2017-10-01

    The modern concept of the evolution of manufacturing systems requires multi-criteria optimization of technological processes and equipment, prioritizing the associated criteria according to their importance. Technological preparation of manufacturing can be developed, depending on the volume of production, up to the limit of favourable economic effects related to recovering the costs of designing and building the technological equipment. Devices, as subsystems of the technological system, in the general context of modernization and diversification of machines, tools, semi-finished products and drives, are made in a multitude of constructive variants, which in many cases do not allow their identification, study and improvement. This paper presents a case study in which a multi-criteria analysis of some structures, based on a novel general optimization method, is used to determine the optimal construction variant of a control device. The rational construction of the control device confirms that the optimization method and the proposed calculation methods are correct and determine a different system configuration, new features and functions, and a specific method of working to control inaccessible surfaces.

  18. Machining fixture layout optimization using particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Dou, Jianping; Wang, Xingsong; Wang, Lei

    2011-05-01

    Optimization of the fixture layout (locator and clamp locations) is critical to reduce geometric error of the workpiece during the machining process. In this paper, the application of the particle swarm optimization (PSO) algorithm is presented to minimize workpiece deformation in the machining region. A PSO-based approach is developed to optimize the fixture layout by integrating the ANSYS parametric design language (APDL) of finite element analysis to compute the objective function for a given fixture layout. A particle library approach is used to decrease the total computation time. A computational experiment on a 2D case shows that the number of function evaluations is decreased by about 96%. A case study illustrates the effectiveness and efficiency of the PSO-based optimization approach.
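
    A minimal particle swarm optimization sketch in the spirit of the approach above, with a hypothetical smooth function standing in for the APDL/finite-element deformation evaluation; dimensions and coefficients are illustrative.

    ```python
    # Minimal PSO sketch (surrogate objective instead of the paper's FEA call):
    # each particle encodes a candidate fixture layout.
    import numpy as np

    rng = np.random.default_rng(3)

    def deformation(layout):
        """Hypothetical stand-in for the finite-element deformation analysis."""
        return np.sum((layout - 0.3) ** 2) + 0.1 * np.sum(np.sin(5 * layout) ** 2)

    dim, n_particles, iters = 6, 30, 200          # e.g. 3 locators x 2 coordinates
    w, c1, c2 = 0.72, 1.49, 1.49                  # common PSO coefficients
    x = rng.uniform(0, 1, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([deformation(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)
        vals = np.array([deformation(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print("best layout:", np.round(gbest, 3), "deformation:", round(deformation(gbest), 4))
    ```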

  19. Improving the FLORIS wind plant model for compatibility with gradient-based optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Jared J.; Gebraad, Pieter MO; Ning, Andrew

    The FLORIS (FLOw Redirection and Induction in Steady-state) model, a parametric wind turbine wake model that predicts steady-state wake characteristics based on wind turbine position and yaw angle, was developed for optimization of control settings and turbine locations. This article provides details on changes made to the FLORIS model to make the model more suitable for gradient-based optimization. Changes to the FLORIS model were made to remove discontinuities and add curvature to regions of non-physical zero gradient. Exact gradients for the FLORIS model were obtained using algorithmic differentiation. A set of three case studies demonstrate that using exact gradients with gradient-based optimization reduces the number of function calls by several orders of magnitude. The case studies also show that adding curvature improves convergence behavior, allowing gradient-based optimization algorithms used with the FLORIS model to more reliably find better solutions to wind farm optimization problems.
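
    An illustrative sketch (not the FLORIS code) of the kind of change described above: replacing a discontinuous, flat-outside-the-wake response with a smooth blend so that a gradient-based optimizer sees useful gradients everywhere; the functions and constants are placeholders.

    ```python
    # Illustrative sketch of smoothing a discontinuous wake-deficit response.
    import numpy as np

    def deficit_original(r, wake_radius=1.0):
        """Step-like wake deficit: discontinuous at the wake edge, zero gradient outside."""
        return np.where(np.abs(r) < wake_radius, 0.3, 0.0)

    def deficit_smoothed(r, wake_radius=1.0, sharpness=10.0):
        """Sigmoid blend across the wake edge: continuous, with non-zero gradient."""
        return 0.3 / (1.0 + np.exp(sharpness * (np.abs(r) - wake_radius)))

    r = np.linspace(0.0, 2.0, 5)
    print("original :", deficit_original(r))
    print("smoothed :", deficit_smoothed(r))

    # A finite-difference gradient of the smoothed version is informative even outside
    # the nominal wake, which is what lets a gradient-based optimizer make progress.
    h = 1e-6
    grad = (deficit_smoothed(r + h) - deficit_smoothed(r - h)) / (2 * h)
    print("gradient :", grad)
    ```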

  20. Vector-model-supported optimization in volumetric-modulated arc stereotactic radiotherapy planning for brain metastasis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

    Long planning time in volumetric-modulated arc stereotactic radiotherapy (VMA-SRT) cases can limit its clinical efficiency and use. A vector model could retrieve previously successful radiotherapy cases that share various common anatomic features with the current case. The present study aimed to develop a vector model that could reduce planning time by applying the optimization parameters from those retrieved reference cases. Thirty-six VMA-SRT cases of brain metastasis (gender, male [n = 23], female [n = 13]; age range, 32 to 81 years old) were collected and used as a reference database. Another 10 VMA-SRT cases were planned with both conventional optimization and vector-model-supported optimization, following the oncologists' clinical dose prescriptions. Planning time and plan quality measures were compared using the 2-sided paired Wilcoxon signed rank test with a significance level of 0.05, with a positive false discovery rate (pFDR) of less than 0.05. With vector-model-supported optimization, there was a significant reduction in the median planning time, a 40% reduction from 3.7 to 2.2 hours (p = 0.002, pFDR = 0.032), and in the number of iterations, a 30% reduction from 8.5 to 6.0 (p = 0.006, pFDR = 0.047). The quality of plans from both approaches was comparable. From these preliminary results, vector-model-supported optimization can expedite the optimization of VMA-SRT for brain metastasis while maintaining plan quality.

  1. Superstructure-based Design and Optimization of Batch Biodiesel Production Using Heterogeneous Catalysts

    NASA Astrophysics Data System (ADS)

    Nuh, M. Z.; Nasir, N. F.

    2017-08-01

    Biodiesel is a fuel comprised of mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks, such as vegetable oil and animal fat. Biodiesel production is a complex process which needs systematic design and optimization. However, no case study has applied the process system engineering (PSE) element of superstructure optimization to the batch process, which involves complex problems and uses mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims, first, to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the components of the plant for cases A, B, and C using published kinetic data; secondly, to perform an economic analysis of biodiesel production, focusing on heterogeneous catalysts; and finally, to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB software. The optimization, with the objective of minimizing the annual production cost of the batch process, gives a cost of 23.2587 million USD for case C. Overall, applying PSE has streamlined modelling, design and cost estimation, and the optimization helps resolve the complexity of batch biodiesel production and processing.

  2. Investigation of schedules for traffic signal timing optimization.

    DOT National Transportation Integrated Search

    2005-01-01

    Traffic signal optimization is recognized as one of the most cost-effective ways to improve urban mobility; however, the extent of the benefits realized could depend significantly on how often traffic signal re-optimization occurs. Using a case study ...

  3. The operating room case-mix problem under uncertainty and nurses capacity constraints.

    PubMed

    Yahia, Zakaria; Eltawil, Amr B; Harraz, Nermine A

    2016-12-01

    Surgery is one of the key functions in hospitals; it generates significant revenue and admissions to hospitals. In this paper we address the decision of choosing a case-mix for a surgery department. The objective of this study is to generate an optimal case-mix plan of surgery patients with uncertain surgery operations, which includes uncertainty in surgery durations, length of stay, surgery demand and the availability of nurses. In order to obtain an optimal case-mix plan, a stochastic optimization model is proposed and the sample average approximation method is applied. The proposed model is used to determine the number of surgery cases to be served weekly, the amount of operating room time dedicated to each specialty and the number of ward beds dedicated to each specialty. The optimal case-mix selection criterion is based upon a weighted score taking into account both the waiting list and the historical demand of each patient category. The score aims at maximizing the service level of the operating rooms by increasing the total number of surgery cases that can be served. A computational experiment is presented to demonstrate the performance of the proposed method. The results show that the stochastic model solution outperforms the expected value problem solution. Additional analysis is conducted to study the effect of varying the number of ORs and nurse capacity on the overall OR performance.

  4. Determining the optimal mix of federal and contract fire crews: a case study from the Pacific Northwest.

    Treesearch

    Geoffrey H. Donovan

    2006-01-01

    Federal land management agencies in the United States are increasingly relying on contract crews as opposed to agency fire crews. Despite this increasing reliance on contractors, there have been no studies to determine what the optimal mix of contract and agency fire crews should be. A mathematical model is presented to address this question and is applied to a case...

  5. Optimization under variability and uncertainty: a case study for NOx emissions control for a gasification system.

    PubMed

    Chen, Jianjun; Frey, H Christopher

    2004-12-15

    Methods for optimization of process technologies considering the distinction between variability and uncertainty are developed and applied to case studies of NOx control for Integrated Gasification Combined Cycle systems. Existing methods of stochastic optimization (SO) and stochastic programming (SP) are demonstrated. A comparison of SO and SP results provides the value of collecting additional information to reduce uncertainty. For example, an expected annual benefit of 240,000 dollars is estimated if uncertainty can be reduced before a final design is chosen. SO and SP are typically applied to uncertainty. However, when applied to variability, the benefit of dynamic process control is obtained. For example, an annual savings of 1 million dollars could be achieved if the system is adjusted to changes in process conditions. When variability and uncertainty are treated distinctively, a coupled stochastic optimization and programming method and a two-dimensional stochastic programming method are demonstrated via a case study. For the case study, the mean annual benefit of dynamic process control is estimated to be 700,000 dollars, with a 95% confidence range of 500,000 dollars to 940,000 dollars. These methods are expected to be of greatest utility for problems involving a large commitment of resources, for which small differences in designs can produce large cost savings.

  6. Optimization of a Turboprop UAV for Maximum Loiter and Specific Power Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Dinc, Ali

    2016-09-01

    In this study, an original code was developed for optimizing selected parameters of a turboprop engine for an unmanned aerial vehicle (UAV) by employing an elitist genetic algorithm. First, preliminary sizing of a UAV and its turboprop engine was done by the code for a given mission profile. Secondly, single- and multi-objective optimization were performed for selected engine parameters to maximize the loiter duration of the UAV, the specific power of the engine, or both. In the first single-objective case, UAV loiter time was improved by 17.5% from baseline within given bounds (constraints) on compressor pressure ratio and burner exit temperature. In the second case, specific power was enhanced by 12.3% from baseline. In the multi-objective case, where the previous two objectives were considered together, loiter time and specific power were increased by 14.2% and 9.7% from baseline, respectively, for the same constraints.

  7. Case study: Optimizing fault model input parameters using bio-inspired algorithms

    NASA Astrophysics Data System (ADS)

    Plucar, Jan; Grunt, Ondřej; Zelinka, Ivan

    2017-07-01

    We present a case study that demonstrates a bio-inspired approach to finding optimal parameters for a GSM fault model. This model is constructed using a Petri net approach and represents a dynamic model of the GSM network environment in the suburban areas of Ostrava (Czech Republic). We were faced with the task of finding optimal parameters for an application that requires a high volume of data transfers between the application itself and secure servers located in a datacenter. In order to find the optimal set of parameters, we employ bio-inspired algorithms such as Differential Evolution (DE) and the Self Organizing Migrating Algorithm (SOMA). In this paper we present the use of these algorithms, compare their results, and assess their performance in mitigating fault probability.
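
    A minimal sketch of parameter tuning with Differential Evolution via SciPy, with a placeholder objective standing in for the Petri-net fault-model evaluation; the parameter names and bounds are hypothetical.

    ```python
    # Minimal sketch (placeholder objective, not the GSM fault model): tuning a small
    # parameter vector with SciPy's Differential Evolution.
    import numpy as np
    from scipy.optimize import differential_evolution

    def fault_probability(params):
        """Hypothetical stand-in for the Petri-net fault-model evaluation."""
        retry_interval, window, batch_size = params
        return (np.tanh(0.1 * retry_interval) - 0.8) ** 2 + 0.01 * window + 0.002 * batch_size

    bounds = [(1.0, 60.0), (0.0, 30.0), (1.0, 50.0)]   # hypothetical parameter ranges
    result = differential_evolution(fault_probability, bounds, seed=4, maxiter=200)
    print("optimal parameters:", np.round(result.x, 2), "fault score:", round(result.fun, 4))
    ```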

  8. Optimal river monitoring network using optimal partition analysis: a case study of Hun River, Northeast China.

    PubMed

    Wang, Hui; Liu, Chunyue; Rong, Luge; Wang, Xiaoxu; Sun, Lina; Luo, Qing; Wu, Hao

    2018-01-09

    River monitoring networks play an important role in water environmental management and assessment, and it is critical to develop an appropriate method to optimize the monitoring network. In this study, an effective method was proposed based on the attainment rate of National Grade III water quality, optimal partition analysis and Euclidean distance, and the Hun River was taken as a validation case. There were 7 sampling sites in the monitoring network of the Hun River, and 17 monitoring items were analyzed once a month from January 2009 to December 2010. The results showed that the main monitoring items in the surface water of the Hun River were ammonia nitrogen (NH4+-N), chemical oxygen demand, and biochemical oxygen demand. After optimization, the required number of monitoring sites was reduced from seven to three, and 57% of the cost was saved. In addition, there were no significant differences between the non-optimized and optimized monitoring networks, and the optimized monitoring network could correctly represent the original monitoring network. The duplicate setting degree of monitoring sites decreased after optimization, and the rationality of the monitoring network was improved. Therefore, the optimal method was identified as feasible, efficient, and economic.

  9. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and a fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is a conversion of the Signal-to-Noise Ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR (material removal rate) and SR (surface roughness) was a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
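
    A minimal grey relational grade computation for two responses (MRR, larger-is-better; SR, smaller-is-better) on made-up experimental values; the fuzzy conversion step of the paper is omitted, and the weights are assumed equal.

    ```python
    # Minimal sketch: combining two responses into a single grey relational grade.
    import numpy as np

    # hypothetical responses for 6 experimental runs
    mrr = np.array([12.0, 15.5, 18.2, 14.1, 19.0, 16.4])   # material removal rate
    sr = np.array([3.1, 2.6, 3.4, 2.2, 3.0, 2.5])          # surface roughness

    # signal-to-noise ratios
    sn_mrr = -10 * np.log10(1.0 / mrr ** 2)    # larger-the-better
    sn_sr = -10 * np.log10(sr ** 2)            # smaller-the-better

    def grey_relational_coeff(sn, zeta=0.5):
        norm = (sn - sn.min()) / (sn.max() - sn.min())     # normalize S/N to [0, 1]
        delta = 1.0 - norm                                  # deviation from the ideal
        return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    grg = 0.5 * grey_relational_coeff(sn_mrr) + 0.5 * grey_relational_coeff(sn_sr)
    print("grey relational grade per run:", np.round(grg, 3))
    print("best run:", int(np.argmax(grg)) + 1)
    ```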

  10. Multi-probe-based resonance-frequency electrical impedance spectroscopy for detection of suspicious breast lesions: improving performance using partial ROC optimization

    NASA Astrophysics Data System (ADS)

    Lederman, Dror; Zheng, Bin; Wang, Xingwei; Wang, Xiao Hui; Gur, David

    2011-03-01

    We have developed a multi-probe resonance-frequency electrical impedance spectroscope (REIS) system to detect breast abnormalities. Based on assessing asymmetry in REIS signals acquired between left and right breasts, we developed several machine learning classifiers to classify younger women (i.e., under 50YO) into two groups of having high and low risk for developing breast cancer. In this study, we investigated a new method to optimize performance based on the area under a selected partial receiver operating characteristic (ROC) curve when optimizing an artificial neural network (ANN), and tested whether it could improve classification performance. From an ongoing prospective study, we selected a dataset of 174 cases for whom we have both REIS signals and diagnostic status verification. The dataset includes 66 "positive" cases recommended for biopsy due to detection of highly suspicious breast lesions and 108 "negative" cases determined by imaging based examinations. A set of REIS-based feature differences, extracted from the two breasts using a mirror-matched approach, was computed and constituted an initial feature pool. Using a leave-one-case-out cross-validation method, we applied a genetic algorithm (GA) to train the ANN with an optimal subset of features. Two optimization criteria were separately used in GA optimization, namely the area under the entire ROC curve (AUC) and the partial area under the ROC curve, up to a predetermined threshold (i.e., 90% specificity). The results showed that although the ANN optimized using the entire AUC yielded higher overall performance (AUC = 0.83 versus 0.76), the ANN optimized using the partial ROC area criterion achieved substantially higher operational performance (i.e., increasing sensitivity level from 28% to 48% at 95% specificity and/or from 48% to 58% at 90% specificity).
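
    A minimal sketch of the partial-AUC criterion on synthetic scores: the area under the ROC curve restricted to the high-specificity region (FPR <= 0.10, i.e. specificity >= 90%); the score distributions are fabricated for illustration.

    ```python
    # Minimal sketch: full AUC versus partial AUC over the high-specificity region.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(5)
    y = np.concatenate([np.ones(66), np.zeros(108)])       # 66 positive, 108 negative cases
    scores = np.concatenate([rng.normal(0.6, 0.25, 66), rng.normal(0.4, 0.25, 108)])

    fpr, tpr, _ = roc_curve(y, scores)
    full_auc = roc_auc_score(y, scores)

    max_fpr = 0.10                                          # specificity >= 90%
    mask = fpr <= max_fpr
    tpr_at_max = np.interp(max_fpr, fpr, tpr)               # close the partial curve at max_fpr
    fpr_p = np.append(fpr[mask], max_fpr)
    tpr_p = np.append(tpr[mask], tpr_at_max)
    partial_auc = np.sum(np.diff(fpr_p) * (tpr_p[1:] + tpr_p[:-1]) / 2.0)

    print(f"full AUC = {full_auc:.3f}")
    print(f"partial AUC (FPR <= {max_fpr}) = {partial_auc:.3f}")
    ```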

  11. [Case managers experience improved trajectories for cancer patients after implementation of the case manager function].

    PubMed

    Axelsen, Karina Rahbek; Nafei, Hanne; Jakobsen, Stine Finne; Gandrup, Per; Knudsen, Janne Lehmann

    2014-10-13

    Case managers are increasingly used to optimize trajectories for patients. This study is based on a questionnaire among case managers in cancer care, aiming at the clarification of the function and its impact on especially patient safety, when handing over the responsibility. The results show a major variation in how the function is organized, the level of competence and the task to be handled. The responsibility has in general been narrowed to department level. Overall, the case managers believe that the function has optimized pathways for cancer patients and improved safety, but barriers persist.

  12. Comparison of Optimization and Two-point Methods in Estimation of Soil Water Retention Curve

    NASA Astrophysics Data System (ADS)

    Ghanbarian-Alavijeh, B.; Liaghat, A. M.; Huang, G.

    2009-04-01

    The soil water retention curve (SWRC) is one of the soil hydraulic properties whose direct measurement is time consuming and expensive. Since its measurement is unavoidable in environmental studies, e.g. investigation of unsaturated hydraulic conductivity and solute transport, this study attempts to predict the soil water retention curve from two measured points. Using the Cresswell and Paydar (1996) method (two-point method) and an optimization method developed in this study on the basis of two points of the SWRC, the parameters of the Tyler and Wheatcraft (1990) model (fractal dimension and air-entry value) were estimated; water contents at different matric potentials were then estimated and compared with their measured values (n=180). For each method, we used both 3 and 1500 kPa (case 1) and 33 and 1500 kPa (case 2) as the two points of the SWRC. The calculated RMSE values showed that for the Cresswell and Paydar (1996) method there is no significant difference between case 1 and case 2, although the RMSE value in case 2 (2.35) was slightly less than in case 1 (2.37). The results also showed that the optimization method developed in this study had significantly lower RMSE values for cases 1 (1.63) and 2 (1.33) than the Cresswell and Paydar (1996) method.
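
    A minimal sketch of a two-point fit, assuming a Tyler-Wheatcraft-type power-law retention model; the saturated water content, measured points, and bounds are illustrative and may differ from the paper's exact parameterization.

    ```python
    # Minimal sketch: estimating fractal dimension D and air-entry value psi_a from
    # two measured points of the retention curve.
    import numpy as np
    from scipy.optimize import least_squares

    theta_s = 0.45                                  # assumed saturated water content
    points = [(33.0, 0.28), (1500.0, 0.12)]         # (matric potential kPa, water content), "case 2"

    def model(psi, D, psi_a):
        """Assumed fractal retention form: theta = theta_s * (psi_a / psi)**(3 - D)."""
        return theta_s * (psi_a / psi) ** (3.0 - D)

    def residuals(params):
        D, psi_a = params
        return [model(psi, D, psi_a) - theta for psi, theta in points]

    fit = least_squares(residuals, x0=[2.7, 5.0], bounds=([2.0, 0.1], [3.0, 33.0]))
    D, psi_a = fit.x
    print(f"fractal dimension D = {D:.3f}, air-entry value = {psi_a:.2f} kPa")

    # predicted water contents at other matric potentials, for comparison with data
    for psi in (100.0, 500.0, 1000.0):
        print(psi, round(model(psi, D, psi_a), 3))
    ```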

  13. Research on the decision-making model of land-use spatial optimization

    NASA Astrophysics Data System (ADS)

    He, Jianhua; Yu, Yan; Liu, Yanfang; Liang, Fei; Cai, Yuqiu

    2009-10-01

    Using the results of landscape pattern and land use structure optimization as constraints on the cellular automata (CA) simulation, a decision-making model of land-use spatial optimization is established that couples the landscape pattern model with cellular automata to realize land use quantitative and spatial optimization simultaneously. Huangpi district is taken as a case study to verify the rationality of the model.

  14. Applications of polynomial optimization in financial risk investment

    NASA Astrophysics Data System (ADS)

    Zeng, Meilan; Fu, Hongwei

    2017-09-01

    Recently, polynomial optimization has found many important applications in optimization, financial economics, tensor eigenvalue problems, etc. This paper studies the applications of polynomial optimization in financial risk investment. We consider the standard mean-variance risk measurement model and the mean-variance risk measurement model with transaction costs. We use Lasserre's hierarchy of semidefinite programming (SDP) relaxations to solve the specific cases. The results show that polynomial optimization is effective for some financial optimization problems.

  15. The Improvement of Particle Swarm Optimization: a Case Study of Optimal Operation in Goupitan Reservoir

    NASA Astrophysics Data System (ADS)

    Li, Haichen; Qin, Tao; Wang, Weiping; Lei, Xiaohui; Wu, Wenhui

    2018-02-01

    Due to its weakness in maintaining diversity and reaching the global optimum, standard particle swarm optimization has not performed well in optimal reservoir operation. To solve this problem, this paper introduces the downhill simplex method to work together with standard particle swarm optimization. The application of this approach to the optimal operation of Goupitan reservoir shows that the improved method has better accuracy and higher reliability with a small additional investment.

  16. SU-E-T-175: Clinical Evaluations of Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chi, Y; Li, Y; Tian, Z

    2015-06-15

    Purpose: Pencil-beam or superposition-convolution type dose calculation algorithms are routinely used in inverse plan optimization for intensity modulated radiation therapy (IMRT). However, due to their limited accuracy in some challenging cases, e.g. lung, the resulting dose may lose its optimality after being recomputed using an accurate algorithm, e.g. Monte Carlo (MC). It is the objective of this study to evaluate the feasibility and advantages of a new method to include MC in the treatment planning process. Methods: We developed a scheme to iteratively perform MC-based beamlet dose calculations and plan optimization. In the MC stage, a GPU-based dose engine was used and the particle number sampled from a beamlet was proportional to its optimized fluence from the previous step. We tested this scheme in four lung cancer IMRT cases. For each case, the original plan dose, the plan dose re-computed by MC, and the dose optimized by our scheme were obtained. Clinically relevant dosimetric quantities in these three plans were compared. Results: Although the original plan achieved satisfactory PTV dose coverage, after re-computing doses using the MC method it was found that the PTV D95% was reduced by 4.60%–6.67%. After re-optimizing these cases with our scheme, the PTV coverage was improved to the same level as in the original plan, while the critical OAR coverages were maintained at clinically acceptable levels. Regarding the computation time, it took on average 144 sec per case using only one GPU card, including both MC-based beamlet dose calculation and treatment plan optimization. Conclusion: The achieved dosimetric gains and high computational efficiency indicate the feasibility and advantages of the proposed MC-based IMRT optimization method. Comprehensive validations in more patient cases are in progress.

  17. Chopped random-basis quantum optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caneva, Tommaso; Calarco, Tommaso; Montangero, Simone

    2011-08-15

    In this work, we describe in detail the chopped random basis (CRAB) optimal control technique recently introduced to optimize time-dependent density matrix renormalization group simulations [P. Doria, T. Calarco, and S. Montangero, Phys. Rev. Lett. 106, 190501 (2011)]. Here, we study the efficiency of this control technique in optimizing different quantum processes and we show that in the considered cases we obtain results equivalent to those obtained via different optimal control methods while using less resources. We propose the CRAB optimization as a general and versatile optimal control technique.

  18. A Hamiltonian approach to the planar optimization of mid-course corrections

    NASA Astrophysics Data System (ADS)

    Iorfida, E.; Palmer, P. L.; Roberts, M.

    2016-04-01

    Lawden's primer vector theory gives a set of necessary conditions that characterize the optimality of a transfer orbit, defined according to the possibility of adding mid-course corrections. In this paper a novel approach is proposed in which, through a polar coordinate transformation, the primer vector components decouple. Furthermore, the case when the transfer, departure and arrival orbits are coplanar is analyzed using a Hamiltonian approach. This procedure leads to approximate analytic solutions for the in-plane components of the primer vector. Moreover, the solution for the circular transfer case is proven to be Hill's solution. The novel procedure reduces the mathematical and computational complexity of the original case study. It is shown that the primer vector is independent of the semi-major axis of the transfer orbit. The case with a fixed transfer trajectory and variable initial and final thrust impulses is studied. The resulting optimality maps are presented and analyzed; they express the likelihood of a set of trajectories being optimal. Furthermore, it is shown which requirements a set of departure and arrival orbits must fulfil to have the same primer vector profile.

  19. Web-GIS oriented systems viability for municipal solid waste selective collection optimization in developed and transient economies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rada, E.C., E-mail: Elena.Rada@ing.unitn.it; Ragazzi, M.; Fedrizzi, P.

    Highlights: ► As an appropriate solution for MSW management in developed and transient countries. ► As an option to increase the efficiency of MSW selective collection. ► As an opportunity to integrate MSW management needs and services inventories. ► As a tool to develop Urban Mining actions. - Abstract: Municipal solid waste management is a multidisciplinary activity that includes generation, source separation, storage, collection, transfer and transport, processing and recovery, and, last but not least, disposal. The optimization of waste collection, through source separation, is compulsory where a landfill based management must be overcome. In this paper, a few aspects related to the implementation of a Web-GIS based system are analyzed. This approach is critically analyzed referring to the experience of two Italian case studies and two additional extra-European case studies. The first case is one of the best examples of selective collection optimization in Italy. The obtained efficiency is very high: 80% of waste is source separated for recycling purposes. In the second reference case, the local administration is going to be faced with the optimization of waste collection through Web-GIS oriented technologies for the first time. The starting scenario is far from an optimized management of municipal solid waste. The last two case studies concern pilot experiences in China and Malaysia. Each step of the Web-GIS oriented strategy is comparatively discussed referring to typical scenarios of developed and transient economies. The main result is that transient economies are ready to move toward Web oriented tools for MSW management, but this opportunity is not yet well exploited in the sector.

  20. Deep ECGNet: An Optimal Deep Learning Framework for Monitoring Mental Stress Using Ultra Short-Term ECG Signals.

    PubMed

    Hwang, Bosun; You, Jiwoo; Vaessen, Thomas; Myin-Germeys, Inez; Park, Cheolsoo; Zhang, Byoung-Tak

    2018-02-08

    Stress recognition using electrocardiogram (ECG) signals requires the intractable long-term heart rate variability (HRV) parameter extraction process. This study proposes a novel deep learning framework to recognize the stressful states, the Deep ECGNet, using ultra short-term raw ECG signals without any feature engineering methods. The Deep ECGNet was developed through various experiments and analysis of ECG waveforms. We proposed the optimal recurrent and convolutional neural networks architecture, and also the optimal convolution filter length (related to the P, Q, R, S, and T wave durations of ECG) and pooling length (related to the heart beat period) based on the optimization experiments and analysis on the waveform characteristics of ECG signals. The experiments were also conducted with conventional methods using HRV parameters and frequency features as a benchmark test. The data used in this study were obtained from Kwangwoon University in Korea (13 subjects, Case 1) and KU Leuven University in Belgium (9 subjects, Case 2). Experiments were designed according to various experimental protocols to elicit stressful conditions. The proposed framework to recognize stress conditions, the Deep ECGNet, outperformed the conventional approaches with the highest accuracy of 87.39% for Case 1 and 73.96% for Case 2, respectively, that is, 16.22% and 10.98% improvements compared with those of the conventional HRV method. We proposed an optimal deep learning architecture and its parameters for stress recognition, and the theoretical consideration on how to design the deep learning structure based on the periodic patterns of the raw ECG data. Experimental results in this study have proved that the proposed deep learning model, the Deep ECGNet, is an optimal structure to recognize the stress conditions using ultra short-term ECG data.

  1. Selective robust optimization: A new intensity-modulated proton therapy optimization strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yupeng; Niemela, Perttu; Siljamaki, Sami

    2015-08-15

    Purpose: To develop a new robust optimization strategy for intensity-modulated proton therapy as an important step in translating robust proton treatment planning from research to clinical applications. Methods: In selective robust optimization, a worst-case-based robust optimization algorithm is extended, and terms of the objective function are selectively computed from either the worst-case dose or the nominal dose. Two lung cancer cases and one head and neck cancer case were used to demonstrate the practical significance of the proposed robust planning strategy. The lung cancer cases had minimal tumor motion, less than 5 mm, and, for the demonstration of the methodology, were assumed to be static. Results: Selective robust optimization achieved robust clinical target volume (CTV) coverage and at the same time increased nominal planning target volume coverage to 95.8%, compared to the 84.6% coverage achieved with CTV-based robust optimization in one of the lung cases. In the other lung case, the maximum dose in selective robust optimization was lowered from a dose of 131.3% in the CTV-based robust optimization to 113.6%. Selective robust optimization provided robust CTV coverage in the head and neck case, and at the same time improved control over the isodose distribution so that clinical requirements may be readily met. Conclusions: Selective robust optimization may provide the flexibility and capability necessary for meeting various clinical requirements, in addition to achieving the required plan robustness, in practical proton treatment planning settings.

  2. Optimal control of underactuated mechanical systems: A geometric approach

    NASA Astrophysics Data System (ADS)

    Colombo, Leonardo; Martín De Diego, David; Zuccalli, Marcela

    2010-08-01

    In this paper, we consider a geometric formalism for optimal control of underactuated mechanical systems. Our techniques are an adaptation of the classical Skinner and Rusk approach for the case of Lagrangian dynamics with higher-order constraints. We study a regular case where it is possible to establish a symplectic framework and, as a consequence, to obtain a unique vector field determining the dynamics of the optimal control problem. These developments will allow us to develop a new class of geometric integrators based on discrete variational calculus.

  3. Solar Collector Design Optimization: A Hands-on Project Case Study

    ERIC Educational Resources Information Center

    Birnie, Dunbar P., III; Kaz, David M.; Berman, Elena A.

    2012-01-01

    A solar power collector optimization design project has been developed for use in undergraduate classrooms and/or laboratories. The design optimization depends on understanding the current-voltage characteristics of the starting photovoltaic cells as well as how the cell's electrical response changes with increased light illumination. Students…

  4. [Case managers experience improved trajectories for cancer patients after implementation of the case manager function].

    PubMed

    Axelsen, Karina Rahbek; Nafei, Hanne; Jakobsen, Stine Finne; Gandrup, Per; Knudsen, Janne Lehmann

    2015-06-08

    Case managers are increasingly used to optimize trajectories for patients. This study is based on a questionnaire among case managers in cancer care, aiming at the clarification of the function and its impact on especially patient safety, when handing over the responsibility. The results show a major variation in how the function is organized, the level of competence and the task to be handled. The responsibility has in general been narrowed to department level. Overall, the case managers believe that the function has optimized pathways for cancer patients and improved safety, but barriers persist.

  5. Fuzzy multi-objective optimization case study based on an anaerobic co-digestion process of food waste leachate and piggery wastewater.

    PubMed

    Choi, Angelo Earvin Sy; Park, Hung Suck

    2018-06-20

    This paper presents the development and evaluation of fuzzy multi-objective optimization for decision-making, including process optimization of the anaerobic digestion (AD) process. The operating cost criterion, a fundamental research gap in previous AD analyses, was integrated into the case study in this research. In this study, the mixing ratio of food waste leachate (FWL) and piggery wastewater (PWW) and the calcium carbonate (CaCO3) and sodium chloride (NaCl) concentrations were optimized to enhance methane production while minimizing operating cost. The results indicated a maximum of 63.3% satisfaction for both methane production and operating cost under the following optimal conditions: mixing ratio (FWL:PWW) of 1.4, CaCO3 of 2970.5 mg/L and NaCl of 2.7 g/L. In the multi-objective optimization, the specific methane yield (SMY) was 239.0 mL CH4/g VS added, while 41.2% volatile solids reduction (VSR) was obtained at an operating cost of 56.9 US$/ton. In comparison, in a previous optimization study that utilized response surface methodology, the SMY, VSR and operating cost of the AD process were 310 mL/g, 54% and 83.2 US$/ton, respectively. The results from multi-objective fuzzy optimization demonstrate the potential application of this technique for practical decision-making in the process optimization of AD. Copyright © 2018 Elsevier Ltd. All rights reserved.
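
    A minimal max-min fuzzy satisfaction sketch over the two objectives named above, using hypothetical response surfaces for methane yield and operating cost; the surrogate functions, membership bounds and grids are assumptions, not the paper's fitted models.

    ```python
    # Minimal sketch: max-min fuzzy satisfaction over two competing objectives,
    # methane yield (maximize) and operating cost (minimize).
    import numpy as np

    def methane_yield(ratio, caco3, nacl):
        """Hypothetical surrogate for specific methane yield (mL CH4/g VS)."""
        return 320 - 40 * (ratio - 1.4) ** 2 - 2e-5 * (caco3 - 3000) ** 2 - 8 * (nacl - 2.5) ** 2

    def operating_cost(ratio, caco3, nacl):
        """Hypothetical surrogate for operating cost (US$/ton)."""
        return 40 + 0.01 * caco3 + 4 * nacl + 10 * ratio

    def mu_max(value, lo, hi):    # linear membership for a maximized objective
        return np.clip((value - lo) / (hi - lo), 0.0, 1.0)

    def mu_min(value, lo, hi):    # linear membership for a minimized objective
        return np.clip((hi - value) / (hi - lo), 0.0, 1.0)

    best = (-1.0, None)
    for ratio in np.linspace(0.5, 3.0, 26):
        for caco3 in np.linspace(1000, 5000, 41):
            for nacl in np.linspace(0.5, 5.0, 19):
                satisfaction = min(mu_max(methane_yield(ratio, caco3, nacl), 150, 350),
                                   mu_min(operating_cost(ratio, caco3, nacl), 40, 100))
                if satisfaction > best[0]:
                    best = (satisfaction, (ratio, caco3, nacl))

    lam, (ratio, caco3, nacl) = best
    print(f"overall satisfaction = {lam:.2f} at mixing ratio {ratio:.1f}, "
          f"CaCO3 {caco3:.0f} mg/L, NaCl {nacl:.1f} g/L")
    ```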

  6. Transferability of optimally-selected climate models in the quantification of climate change impacts on hydrology

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe

    2016-11-01

    Given the ever increasing number of climate change simulations being carried out, it has become impractical to use all of them to cover the uncertainty of climate change impacts. Various methods have been proposed to optimally select subsets of a large ensemble of climate simulations for impact studies. However, the behaviour of optimally-selected subsets of climate simulations for climate change impacts is unknown, since the transfer process from climate projections to the impact study world is usually highly non-linear. Consequently, this study investigates the transferability of optimally-selected subsets of climate simulations in the case of hydrological impacts. Two different methods were used for the optimal selection of subsets of climate scenarios, and both were found to be capable of adequately representing the spread of selected climate model variables contained in the original large ensemble. However, in both cases, the optimal subsets had limited transferability to hydrological impacts. To capture a similar variability in the impact model world, many more simulations have to be used than those that are needed to simply cover variability from the climate model variables' perspective. Overall, both optimal subset selection methods were better than random selection when small subsets were selected from a large ensemble for impact studies. However, as the number of selected simulations increased, random selection often performed better than the two optimal methods. To ensure adequate uncertainty coverage, the results of this study imply that selecting as many climate change simulations as possible is the best avenue. Where this was not possible, the two optimal methods were found to perform adequately.

  7. A method to accelerate creation of plasma etch recipes using physics and Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Chopra, Meghali J.; Verma, Rahul; Lane, Austin; Willson, C. G.; Bonnecaze, Roger T.

    2017-03-01

    Next generation semiconductor technologies like high density memory storage require precise 2D and 3D nanopatterns. Plasma etching processes are essential to achieving the nanoscale precision required for these structures. Current plasma process development methods rely primarily on iterative trial and error or factorial design of experiment (DOE) to define the plasma process space. Here we evaluate the efficacy of the software tool Recipe Optimization for Deposition and Etching (RODEo) against standard industry methods at determining the process parameters of a high density O2 plasma system with three case studies. In the first case study, we demonstrate that RODEo is able to predict etch rates more accurately than a regression model based on a full factorial design while using 40% fewer experiments. In the second case study, we demonstrate that RODEo performs significantly better than a full factorial DOE at identifying optimal process conditions to maximize anisotropy. In the third case study we experimentally show how RODEo maximizes etch rates while using half the experiments of a full factorial DOE method. With enhanced process predictions and more accurate maps of the process space, RODEo reduces the number of experiments required to develop and optimize plasma processes.
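
    An illustrative Bayesian-optimization-style loop (not the RODEo tool itself): a Gaussian-process surrogate with expected improvement picks the next recipe to try; the etch-rate function and recipe variables are hypothetical.

    ```python
    # Illustrative sketch: sequential recipe selection with a GP surrogate and
    # expected improvement, standing in for physics- and Bayes-guided recipe search.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(6)

    def etch_rate(x):
        """Hypothetical stand-in for the measured etch rate of a recipe (pressure, power)."""
        pressure, power = x
        return -((pressure - 0.6) ** 2 + 0.5 * (power - 0.7) ** 2) + 0.02 * rng.normal()

    # a handful of seed experiments
    X = rng.uniform(0, 1, size=(5, 2))
    y = np.array([etch_rate(x) for x in X])

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(15):
        gp.fit(X, y)
        candidates = rng.uniform(0, 1, size=(2000, 2))
        mu, sigma = gp.predict(candidates, return_std=True)
        best = y.max()
        z = (mu - best) / np.maximum(sigma, 1e-9)
        ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
        x_next = candidates[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, etch_rate(x_next))

    print("best recipe found:", np.round(X[np.argmax(y)], 3), "score:", round(y.max(), 4))
    ```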

  8. Post Pareto optimization-A case

    NASA Astrophysics Data System (ADS)

    Popov, Stoyan; Baeva, Silvia; Marinova, Daniela

    2017-12-01

    Simulation performance may be evaluated according to multiple quality measures that are in competition, and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available trade-offs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of the proposed approach by applying it to multiple stochastic quality measures. We formulate the performance criteria of this use case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Once the underlying Pareto frontier is obtained, we analyze it and prescribe preference-dependent configurations for optimal simulation training.
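
    The record above rests on identifying non-dominated trade-offs among competing quality measures. The following sketch shows a generic non-dominated (Pareto) filter over a set of simulated performance vectors; the random scores are stand-ins for whatever criteria a particular simulation study defines.

```python
# Hedged sketch: Pareto-frontier extraction from simulated performance vectors
# (all criteria assumed to be minimized). Generic filter, not the authors' setup.
import numpy as np

def pareto_front(points: np.ndarray) -> np.ndarray:
    """Return a boolean mask of non-dominated rows (minimization)."""
    n = points.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # A point dominates i if it is <= in every criterion and < in at least one.
        dominated = np.all(points <= points[i], axis=1) & np.any(points < points[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

# Example: 200 candidate configurations scored on two conflicting criteria.
rng = np.random.default_rng(0)
scores = rng.uniform(size=(200, 2))
front = scores[pareto_front(scores)]
print(f"{front.shape[0]} non-dominated configurations out of {scores.shape[0]}")
```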

  9. Does optimal partitioning of color space account for universal color categorization?

    PubMed Central

    2017-01-01

    A 2007 study by Regier, Kay, and Khetarpal purports to show that universal categories emerge as a result of optimal partitioning of color space. Regier, Kay, and Khetarpal only consider color categorizations of up to six categories. However, in most industrialized societies eleven color categories are observed. This paper shows that when applied to the case of eleven categories, Regier, Kay, and Khetarpal’s optimality criterion yields unsatisfactory results. Applications of the criterion to the intermediate cases of seven, eight, nine, and ten color categories are also briefly considered and are shown to yield mixed results. We consider a number of possible explanations of the failure of the criterion in the case of eleven categories, and suggest that, as color categorizations get more complex, further criteria come to play a role, alongside Regier, Kay, and Khetarpal’s optimality criterion. PMID:28570598

  10. A jazz-based approach for optimal setting of pressure reducing valves in water distribution networks

    NASA Astrophysics Data System (ADS)

    De Paola, Francesco; Galdiero, Enzo; Giugni, Maurizio

    2016-05-01

    This study presents a model for valve setting in water distribution networks (WDNs), with the aim of reducing the level of leakage. The approach is based on the harmony search (HS) optimization algorithm. The HS mimics a jazz improvisation process able to find the best solutions, in this case corresponding to valve settings in a WDN. The model also interfaces with the improved version of a popular hydraulic simulator, EPANET 2.0, to check the hydraulic constraints and to evaluate the performances of the solutions. Penalties are introduced in the objective function in case of violation of the hydraulic constraints. The model is applied to two case studies, and the obtained results in terms of pressure reductions are comparable with those of competitive metaheuristic algorithms (e.g. genetic algorithms). The results demonstrate the suitability of the HS algorithm for water network management and optimization.
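
    The harmony search loop itself is generic; a minimal sketch is given below, assuming a stand-in leakage objective with a penalty term in place of the EPANET-based hydraulic evaluation used in the paper. Parameter names (HMCR, PAR, bandwidth) follow common HS usage.

```python
# Hedged sketch of the harmony search (HS) loop for valve settings. The
# leakage_cost() objective is a placeholder, not the authors' hydraulic model.
import numpy as np

rng = np.random.default_rng(1)
n_valves, hms, hmcr, par, bw, iters = 4, 20, 0.9, 0.3, 0.5, 2000
lo, hi = 10.0, 60.0   # hypothetical range of valve pressure settings (m)

def leakage_cost(x):
    # Placeholder: quadratic surrogate plus a penalty for settings that are too low
    penalty = 1e3 * np.sum(np.maximum(25.0 - x, 0.0))
    return np.sum((x - 30.0) ** 2) + penalty

# Initialize the harmony memory with random settings
memory = rng.uniform(lo, hi, size=(hms, n_valves))
costs = np.array([leakage_cost(h) for h in memory])

for _ in range(iters):
    new = np.empty(n_valves)
    for j in range(n_valves):
        if rng.random() < hmcr:                      # memory consideration
            new[j] = memory[rng.integers(hms), j]
            if rng.random() < par:                   # pitch adjustment
                new[j] += bw * rng.uniform(-1.0, 1.0)
        else:                                        # random improvisation
            new[j] = rng.uniform(lo, hi)
    new = np.clip(new, lo, hi)
    c = leakage_cost(new)
    worst = np.argmax(costs)
    if c < costs[worst]:                             # replace the worst harmony
        memory[worst], costs[worst] = new, c

best = memory[np.argmin(costs)]
print("best valve settings:", np.round(best, 2), "cost:", round(costs.min(), 2))
```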

  11. Incidence of flare-ups and evaluation of quality after retreatment of resorcinol-formaldehyde resin ("Russian Red Cement") endodontic therapy.

    PubMed

    Gound, Tom G; Marx, David; Schwandt, Nathan A

    2003-10-01

    The purpose of this retrospective study was to evaluate the quality of treatment and incidence of flare-ups when teeth with resorcinol-formaldehyde resin are retreated in a postgraduate endodontic clinic. Fifty-eight cases were included in this study. Obturated and unfilled canal space was measured on radiographs. Forty-eight percent of the total canal space was filled before retreatment; 90% was filled after retreatment. After retreatment, obturations were rated as optimal in 59%, improved in 33%, unchanged in 6%, and worse in 2%. Seven patients (12%) had postretreatment flare-ups. Data were statistically analyzed using the Cochran-Armitage Test for Discrete Variables. No statistical difference in the incidence of flare-ups was found in teeth that before treatment had more than half the canal space filled compared to teeth with less than half, cases with pre-existing periradicular radiolucencies compared to cases with normal periradicular appearance, symptomatic cases compared to asymptomatic cases, or cases with optimal fillings after retreatment compared to less than optimal cases. It was concluded that teeth with resorcinol-formaldehyde fillings might be retreated with a good prognosis for improving the radiographic quality, but a higher than normal incidence of flare-ups may occur.

  12. Evaluation of optimal water fluoridation on the incidence and skeletal distribution of naturally arising osteosarcoma in pet dogs

    PubMed Central

    Rebhun, R. B.; Kass, P. H.; Kent, M. S.; Watson, K. D.; Withers, S. S.; Culp, W. T. N.; King, A.M.

    2016-01-01

    Experimental toxicological studies in laboratory animals and epidemiological human studies have reported a possible association between water fluoridation and osteosarcoma (OSA). To further explore this possibility, a case-control study of individual dogs evaluated by the UC Davis Veterinary Medical Teaching Hospital was conducted using ecologic data on water fluoridation based on the owner’s residence. The case group included 161 dogs with OSA diagnosed between 2008–2012. Two cancer control groups included dogs diagnosed with lymphoma (LSA) or hemangiosarcoma (HSA) during the same period (n = 134 and n = 145, respectively). Dogs with OSA were not significantly more likely to live in an area with optimized fluoride in the water than dogs with LSA or HSA. Additional analyses within OSA patients also revealed no significant differences in age, or skeletal distribution of OSA cases relative to fluoride status. Taken together, these analyses do not support the hypothesis that optimal fluoridation of drinking water contributes to naturally occurring OSA in dogs. PMID:26762869

  13. Evaluation of optimal water fluoridation on the incidence and skeletal distribution of naturally arising osteosarcoma in pet dogs.

    PubMed

    Rebhun, R B; Kass, P H; Kent, M S; Watson, K D; Withers, S S; Culp, W T N; King, A M

    2017-06-01

    Experimental toxicological studies in laboratory animals and epidemiological human studies have reported a possible association between water fluoridation and osteosarcoma (OSA). To further explore this possibility, a case-control study of individual dogs evaluated by the UC Davis Veterinary Medical Teaching Hospital was conducted using ecologic data on water fluoridation based on the owner's residence. The case group included 161 dogs with OSA diagnosed between 2008-2012. Two cancer control groups included dogs diagnosed with lymphoma (LSA) or hemangiosarcoma (HSA) during the same period (n = 134 and n = 145, respectively). Dogs with OSA were not significantly more likely to live in an area with optimized fluoride in the water than dogs with LSA or HSA. Additional analyses within OSA patients also revealed no significant differences in age, or skeletal distribution of OSA cases relative to fluoride status. Taken together, these analyses do not support the hypothesis that optimal fluoridation of drinking water contributes to naturally occurring OSA in dogs. © 2016 John Wiley & Sons Ltd.

  14. Vector-model-supported approach in prostate plan optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

    Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model for retrieving similar radiotherapy cases was developed based on the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach in the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iteration with vector-model-supported optimization by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer much shortened planning time and iteration number without compromising the plan quality.
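
    A hedged sketch of the retrieval idea, under the assumption that each case is summarized by a numeric feature vector and the closest reference case is found by cosine similarity after normalization; the feature set shown is illustrative, not the descriptors used in the study.

```python
# Hedged sketch: retrieve the most similar reference case by comparing
# normalized feature vectors. The features (volumes, overlaps, ...) are
# illustrative assumptions, not the exact descriptors from the paper.
import numpy as np

def most_similar_case(test_features: np.ndarray, reference_db: np.ndarray) -> int:
    """Return the row index of the reference case closest to the test case."""
    # z-score normalization so no single feature dominates the comparison
    mean, std = reference_db.mean(axis=0), reference_db.std(axis=0) + 1e-12
    db = (reference_db - mean) / std
    q = (test_features - mean) / std
    # cosine similarity between the test vector and every reference vector
    sims = db @ q / (np.linalg.norm(db, axis=1) * np.linalg.norm(q) + 1e-12)
    return int(np.argmax(sims))

# 100 reference cases x 5 hypothetical features (e.g. PTV volume, rectum volume)
rng = np.random.default_rng(2)
reference_db = rng.uniform(size=(100, 5))
test_case = rng.uniform(size=5)
print("reuse planning parameters of reference case", most_similar_case(test_case, reference_db))
```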

  15. Optimization of FNAC findings as a preoperative diagnostic aid for odontogenic cysts.

    PubMed

    Jain, Garima; Shetty, Pushparaja

    2015-01-01

    Fine-needle aspiration cytology (FNAC) is not a definitive preoperative diagnostic procedure done for all cases of odontogenic cysts because of the inconsistent results obtained with it. This study was done to optimize FNAC findings and help in the preoperative characterization of odontogenic cysts. Cystic fluid was collected and centrifuged from 50 odontogenic cysts that were planned for excision. Three smears were prepared from the cell sediment obtained after centrifugation and stained. The stained sections were examined for the presence and type of epithelial cells, to formulate a preoperative diagnosis. Epithelial cells were detected in 46% of cases in smear 1, 48% of cases in smear 2, and 52% of cases in smear 3. When all three smears from one case were studied, 86% of cases showed epithelial cells for evaluation. Cystic aspirate should be centrifuged and the entire cell sediment should be examined by making multiple smears for evaluation of the cystic epithelial lining cells.

  16. Disk MHD generator study

    NASA Technical Reports Server (NTRS)

    Retallick, F. D.

    1980-01-01

    Directly-fired, separately-fired, and oxygen-augmented MHD power plants incorporating a disk geometry for the MHD generator were studied. The base parameters defined for four near-optimum-performance MHD steam power systems of various types are presented. The finally selected systems consisted of (1) two directly-fired cases, one at 1920 K (2996 F) preheat and the other at 1650 K (2500 F) preheat, (2) a separately-fired case where the air is preheated to the same level as the higher-temperature directly-fired case, and (3) an oxygen-augmented case with the same generator inlet temperature of 2839 K (4650 F) as the high-temperature directly-fired and separately-fired cases. Supersonic Mach numbers at the generator inlet, gas inlet swirl, and constant Hall field operation were specified based on disk generator optimization. System pressures were based on optimization of MHD net power. Supercritical reheat steam plants were used in all cases. Open and closed cycle component costs are summarized and compared.

  17. Light extraction efficiency of GaN-based LED with pyramid texture by using ray path analysis.

    PubMed

    Pan, Jui-Wen; Wang, Chia-Shen

    2012-09-10

    We study three different gallium-nitride (GaN) based light emitting diode (LED) cases based on the different locations of the pyramid textures. In case 1, the pyramid texture is located on the sapphire top surface; in case 2, the pyramid texture is located on the P-GaN top surface; while in case 3, the pyramid texture is located on both the sapphire and P-GaN top surfaces. We study the relationship between the light extraction efficiency (LEE) and the slant angle of the pyramid texture. Among the three cases, the optimized total LEE was highest for case 3. Moreover, the seven escape paths along which most of the escaped photon flux propagated were selected in a simulation of the LEDs. The seven escape paths were used to estimate the slant angle that optimizes the LEE and to precisely analyze the photon escape paths.

  18. Optimal insemination and replacement decisions to minimize the cost of pathogen-specific clinical mastitis in dairy cows.

    PubMed

    Cha, E; Kristensen, A R; Hertl, J A; Schukken, Y H; Tauer, L W; Welcome, F L; Gröhn, Y T

    2014-01-01

    Mastitis is a serious production-limiting disease, with effects on milk yield, milk quality, and conception rate, and an increase in the risk of mortality and culling. The objective of this study was 2-fold: (1) to develop an economic optimization model that incorporates all the different types of pathogens that cause clinical mastitis (CM) categorized into 8 classes of culture results, and account for whether the CM was a first, second, or third case in the current lactation and whether the cow had a previous case or cases of CM in the preceding lactation; and (2) to develop this decision model to be versatile enough to add additional pathogens, diseases, or other cow characteristics as more information becomes available without significant alterations to the basic structure of the model. The model provides economically optimal decisions depending on the individual characteristics of the cow and the specific pathogen causing CM. The net returns for the basic herd scenario (with all CM included) were $507/cow per year, where the incidence of CM (cases per 100 cow-years) was 35.6, of which 91.8% of cases were recommended for treatment under an optimal replacement policy. The cost per case of CM was $216.11. The CM cases comprised (incidences, %) Staphylococcus spp. (1.6), Staphylococcus aureus (1.8), Streptococcus spp. (6.9), Escherichia coli (8.1), Klebsiella spp. (2.2), other treated cases (e.g., Pseudomonas; 1.1), other not treated cases (e.g., Trueperella pyogenes; 1.2), and negative culture cases (12.7). The average cost per case, even under optimal decisions, was greatest for Klebsiella spp. ($477), followed by E. coli ($361), other treated cases ($297), and other not treated cases ($280). This was followed by the gram-positive pathogens; among these, the greatest cost per case was due to Staph. aureus ($266), followed by Streptococcus spp. ($174) and Staphylococcus spp. ($135); negative culture had the lowest cost ($115). The model recommended treatment for most CM cases (>85%); the range was 86.2% (Klebsiella spp.) to 98.5% (Staphylococcus spp.). In general, the optimal recommended time for replacement was up to 5 mo earlier for cows with CM compared with cows without CM. Furthermore, although the parameter estimates implemented in this model are applicable to the dairy farms in this study, the parameters may be altered to be specific to other dairy farms. Cow rankings and values based on disease status, pregnancy status, and milk production can be extracted; these provide guidance when determining which cows to keep or cull. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  19. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
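
    One of the information metrics named above, relative entropy, can be approximated from ensemble statistics under a Gaussian assumption. The sketch below scores a toy "update" this way; the ensemble sizes and the update itself are illustrative stand-ins, not the EnKF workflow of the paper.

```python
# Hedged sketch: relative entropy (KL divergence) between Gaussian
# approximations of prior and posterior parameter ensembles, as one possible
# score for a candidate measurement design. Shapes and the toy update are
# illustrative only.
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL( N(mu0, cov0) || N(mu1, cov1) ) for d-dimensional Gaussians."""
    d = mu0.size
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def ensemble_relative_entropy(prior_ens, post_ens):
    """Information gained by the update, from ensemble sample statistics."""
    mu_p, cov_p = prior_ens.mean(axis=0), np.cov(prior_ens, rowvar=False)
    mu_a, cov_a = post_ens.mean(axis=0), np.cov(post_ens, rowvar=False)
    return gaussian_kl(mu_a, cov_a, mu_p, cov_p)   # posterior relative to prior

rng = np.random.default_rng(3)
prior = rng.normal(0.0, 1.0, size=(500, 3))                        # 500 members, 3 parameters
posterior = 0.5 * prior + rng.normal(0.0, 0.2, size=prior.shape)   # toy "update"
print(f"relative entropy of the update: {ensemble_relative_entropy(prior, posterior):.3f}")
```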

  20. Multi-point optimization of recirculation flow type casing treatment in centrifugal compressors

    NASA Astrophysics Data System (ADS)

    Tun, Min Thaw; Sakaguchi, Daisaku

    2016-06-01

    A high pressure ratio and a wide operating range are key requirements for turbochargers in diesel engines. A recirculation flow type casing treatment is effective for flow range enhancement of centrifugal compressors. Two ring grooves on a suction pipe and a shroud casing wall are connected by means of an annular passage, and a stable recirculation flow is formed at small flow rates from the downstream groove toward the upstream groove through the annular bypass. The shape of the baseline recirculation flow type casing is modified and optimized by using a multi-point optimization code with a metamodel-assisted evolutionary algorithm embedding the commercial CFD code CFX from ANSYS. The numerical optimization yields an optimized casing design that improves adiabatic efficiency over a wide operating flow rate range. A sensitivity analysis of the design parameters with respect to efficiency has been performed. It is found that the optimized casing design provides an optimized recirculation flow rate, in which the increment of entropy rise is minimized at the grooves and the passages of the rotating impeller.

  1. Growth Optimization Studies to Develop InAs/GaInSb Superlattice Materials for Very Long Wavelength Infrared Detection (Postprint)

    DTIC Science & Technology

    2014-10-01

    Report AFRL-RX-WP-JA-2015-0188, interim, covering 17 January 2013 - 28 September 2014. Approved for public release; distribution unlimited. This report contains color. PA Case Number: 88ABW.

  2. Dynamic optimization case studies in DYNOPT tool

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. Since analytical solutions are generally very difficult, dedicated software tools are widely used. These software packages are often third-party products built on standard simulation software tools on the market. Typical examples of such tools, TOMLAB and DYNOPT, can be effectively applied to the solution of dynamic programming problems. DYNOPT is presented in this paper due to its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory given a description of the process, the cost to be minimized, and equality and inequality constraints, using the orthogonal collocation on finite elements method. The optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the dynamic model to be optimized may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization problems by means of case studies based on selected laboratory physical educational models.

  3. A fast optimization approach for treatment planning of volumetric modulated arc therapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong; Li, Ye-Xiong

    2018-05-30

    Volumetric modulated arc therapy (VMAT) is widely used in clinical practice. It not only significantly reduces treatment time, but also produces high-quality treatment plans. Current optimization approaches rely heavily on stochastic algorithms which are time-consuming and less repeatable. In this study, a novel approach is proposed to provide a highly efficient optimization algorithm for VMAT treatment planning. A progressive sampling strategy is employed for the beam arrangement of VMAT planning. The initial, equally spaced beams are added to the plan at a coarse sampling resolution. Fluence-map optimization and leaf sequencing are performed for these beams. Then, the coefficients of the fluence-map optimization algorithm are adjusted according to the known fluence maps of these beams. In the next round the sampling resolution is doubled and more beams are added. This process continues until the total number of beams is reached. The performance of the VMAT optimization algorithm was evaluated using three clinical cases and compared to that of a commercial planning system. The dosimetric quality of the VMAT plans is equal to or better than that of the corresponding IMRT plans for the three clinical cases. The maximum dose to critical organs is reduced considerably for VMAT plans compared to the IMRT plans, especially in the head and neck case. The total number of segments and monitor units is also reduced for VMAT plans. For the three clinical cases, VMAT optimization took less than 5 min with the proposed approach, 3-4 times less than that of the commercial system. The proposed VMAT optimization algorithm is able to produce high-quality VMAT plans efficiently and consistently. It presents a new way to accelerate the current optimization process of VMAT planning.
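
    A hedged sketch of the progressive sampling schedule is given below: station points are activated in rounds by halving an index stride, which roughly doubles the angular resolution each round, and a placeholder callback stands in for the fluence-map optimization and leaf sequencing that the paper performs at every round.

```python
# Hedged sketch of a progressive angular sampling schedule: station points on a
# 2-degree grid are activated in rounds until every station carries a beam.
# optimize_round() is a placeholder for fluence-map optimization and leaf
# sequencing, which are not reproduced here.
def progressive_station_rounds(n_stations=180, initial_stride=16):
    """Yield lists of newly activated station indices, round by round."""
    placed = set()
    stride = initial_stride
    while stride >= 1:
        new = [i for i in range(0, n_stations, stride) if i not in placed]
        placed.update(new)
        yield new
        stride //= 2   # halving the stride ~doubles the angular resolution

def optimize_round(new_stations, coeffs):
    # Placeholder: re-tune the optimization coefficients using the fluence maps
    # already known from previous rounds, then optimize the newly added beams.
    return coeffs + len(new_stations)

coeffs = 0
for rnd, stations in enumerate(progressive_station_rounds(), start=1):
    coeffs = optimize_round(stations, coeffs)
    angles = [s * 2.0 for s in stations[:3]]
    print(f"round {rnd}: {len(stations)} new beams, first angles (deg): {angles}")
```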

  4. On the Convergence Analysis of the Optimized Gradient Method.

    PubMed

    Kim, Donghwan; Fessler, Jeffrey A

    2017-01-01

    This paper considers the problem of unconstrained minimization of smooth convex functions having Lipschitz continuous gradients with known Lipschitz constant. We recently proposed the optimized gradient method for this problem and showed that it has a worst-case convergence bound for the cost function decrease that is twice as small as that of Nesterov's fast gradient method, yet has a similarly efficient practical implementation. Drori showed recently that the optimized gradient method has optimal complexity for the cost function decrease over the general class of first-order methods. This optimality makes it important to study fully the convergence properties of the optimized gradient method. The previous worst-case convergence bound for the optimized gradient method was derived for only the last iterate of a secondary sequence. This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method. We then discuss additional convergence properties of the optimized gradient method, including the interesting fact that the optimized gradient method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function. These results help complete the theory of an optimal first-order method for smooth convex minimization.
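
    For reference, a sketch of the optimized gradient method (OGM) iteration as it is usually stated (two coupled sequences and a modified momentum factor on the final step), applied to a toy quadratic. Which of the two sequences each convergence bound covers is the subject of the paper and is not restated here.

```python
# Hedged sketch of the OGM iteration on a toy L-smooth convex quadratic.
# The quadratic and iteration count are illustrative choices.
import numpy as np

def ogm(grad, L, x0, n_iter):
    """Optimized gradient method for an L-smooth convex function."""
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for k in range(n_iter):
        y_next = x - grad(x) / L
        if k < n_iter - 1:
            theta_next = (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2)) / 2.0
        else:  # modified momentum parameter on the final step
            theta_next = (1.0 + np.sqrt(1.0 + 8.0 * theta ** 2)) / 2.0
        x = (y_next
             + (theta - 1.0) / theta_next * (y_next - y)
             + theta / theta_next * (y_next - x))
        y, theta = y_next, theta_next
    return x, y

# Toy problem: f(x) = 0.5 * x' A x - b' x with A positive semidefinite.
rng = np.random.default_rng(4)
M = rng.normal(size=(20, 10))
A, b = M.T @ M, rng.normal(size=10)
L = np.linalg.eigvalsh(A).max()            # Lipschitz constant of the gradient
x_final, y_final = ogm(lambda x: A @ x - b, L, np.zeros(10), n_iter=200)
print("residual gradient norm:", np.linalg.norm(A @ y_final - b))
```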

  5. On the Convergence Analysis of the Optimized Gradient Method

    PubMed Central

    Kim, Donghwan; Fessler, Jeffrey A.

    2016-01-01

    This paper considers the problem of unconstrained minimization of smooth convex functions having Lipschitz continuous gradients with known Lipschitz constant. We recently proposed the optimized gradient method for this problem and showed that it has a worst-case convergence bound for the cost function decrease that is twice as small as that of Nesterov’s fast gradient method, yet has a similarly efficient practical implementation. Drori showed recently that the optimized gradient method has optimal complexity for the cost function decrease over the general class of first-order methods. This optimality makes it important to study fully the convergence properties of the optimized gradient method. The previous worst-case convergence bound for the optimized gradient method was derived for only the last iterate of a secondary sequence. This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method. We then discuss additional convergence properties of the optimized gradient method, including the interesting fact that the optimized gradient method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function. These results help complete the theory of an optimal first-order method for smooth convex minimization. PMID:28461707

  6. Robust stochastic optimization for reservoir operation

    NASA Astrophysics Data System (ADS)

    Pan, Limeng; Housh, Mashor; Liu, Pan; Cai, Ximing; Chen, Xin

    2015-01-01

    Optimal reservoir operation under uncertainty is a challenging engineering problem. Application of classic stochastic optimization methods to large-scale problems is limited due to computational difficulty. Moreover, classic stochastic methods assume that the estimated distribution function or the sample inflow data accurately represents the true probability distribution, which may be invalid and the performance of the algorithms may be undermined. In this study, we introduce a robust optimization (RO) approach, Iterative Linear Decision Rule (ILDR), so as to provide a tractable approximation for a multiperiod hydropower generation problem. The proposed approach extends the existing LDR method by accommodating nonlinear objective functions. It also provides users with the flexibility of choosing the accuracy of ILDR approximations by assigning a desired number of piecewise linear segments to each uncertainty. The performance of the ILDR is compared with benchmark policies including the sampling stochastic dynamic programming (SSDP) policy derived from historical data. The ILDR solves both the single and multireservoir systems efficiently. The single reservoir case study results show that the RO method is as good as SSDP when implemented on the original historical inflows and it outperforms SSDP policy when tested on generated inflows with the same mean and covariance matrix as those in history. For the multireservoir case study, which considers water supply in addition to power generation, numerical results show that the proposed approach performs as well as in the single reservoir case study in terms of optimal value and distributional robustness.

  7. WE-AB-209-09: Optimization of Rotational Arc Station Parameter Optimized Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, P; Xing, L; Ungun, B

    Purpose: To develop a fast optimization method for station parameter optimized radiation therapy (SPORT) and show that SPORT is capable of improving VMAT in both plan quality and delivery efficiency. Methods: The angular space from 0° to 360° was divided into 180 station points (SPs). A candidate aperture was assigned to each of the SPs based on the calculation results using a column generation algorithm. The weights of the apertures were then obtained by optimizing the objective function using a state-of-the-art GPU based Proximal Operator Graph Solver (POGS) within seconds. Apertures with zero or low weight were thrown out. To avoid being trapped in a local minimum, a stochastic gradient descent method was employed which also greatly increased the convergence rate of the objective function. The above procedure repeated until the plan could not be improved any further. A weighting factor associated with the total plan MU also indirectly controlled the complexities of aperture shapes. The number of apertures for VMAT and SPORT was confined to 180. The SPORT allowed the coexistence of multiple apertures in a single SP. The optimization technique was assessed by using three clinical cases (prostate, H&N and brain). Results: Marked dosimetric quality improvement was demonstrated in the SPORT plans for all three studied cases. Prostate case: the volume of the 50% prescription dose was decreased by 22% for the rectum. H&N case: SPORT improved the mean dose for the left and right parotids by 15% each. Brain case: the doses to the eyes, chiasm and inner ears were all improved. SPORT shortened the treatment time by ∼1 min for the prostate case, ∼0.5 min for brain case, and ∼0.2 min for the H&N case. Conclusion: The superior dosimetric quality and delivery efficiency presented here indicates that SPORT is an intriguing alternative treatment modality.

  8. Optimizing random searches on three-dimensional lattices

    NASA Astrophysics Data System (ADS)

    Yang, Benhao; Yang, Shunkun; Zhang, Jiaquan; Li, Daqing

    2018-07-01

    Search is a universal behavior related to many types of intelligent individuals. While most studies have focused on search in two-dimensional or infinite space, it remains unclear how search can be optimized in three-dimensional space. Here we study random searches on three-dimensional (3d) square lattices with periodic boundary conditions, and explore the optimal search strategy with a power-law step length distribution, p(l) ∼ l^(-μ), known as Lévy flights. We find that, compared to random searches on two-dimensional (2d) lattices, the optimal exponent μ_opt on 3d lattices is relatively smaller in the non-destructive case and remains similar in the destructive case. We also find that μ_opt decreases as the lattice length in the z direction increases under high target density. Our findings may help us to understand the role of spatial dimension in search behaviors.
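
    A hedged sketch of the search model described above: step lengths are drawn from a truncated power law p(l) ∼ l^(-μ) and walked site by site on a periodic 3-d lattice until the target is traversed. The lattice size, truncation rule and trial counts are arbitrary choices for illustration.

```python
# Hedged sketch: Lévy-flight search for a single target on a 3-d periodic
# lattice with power-law step lengths. Simplified single-target, single-walker
# setup, not the exact simulation protocol of the paper.
import numpy as np

def levy_search_cost(mu, L=16, max_steps=50_000, rng=None):
    """Number of lattice sites traversed before hitting the target."""
    rng = rng or np.random.default_rng()
    directions = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                           [0, -1, 0], [0, 0, 1], [0, 0, -1]])
    pos = np.zeros(3, dtype=int)
    target = rng.integers(0, L, size=3)
    traversed = 0
    while traversed < max_steps:
        # inverse-CDF sample of a power-law step length l >= 1: l = u^(-1/(mu-1))
        u = 1.0 - rng.random()               # u in (0, 1], avoids division by zero
        step = int(u ** (-1.0 / (mu - 1.0)))
        step = max(1, min(step, L))          # truncate very long flights
        d = directions[rng.integers(6)]
        for _ in range(step):                # walk site by site along the flight
            pos = (pos + d) % L
            traversed += 1
            if np.array_equal(pos, target):
                return traversed
    return max_steps

rng = np.random.default_rng(5)
for mu in (1.5, 2.0, 2.5, 3.0):
    costs = [levy_search_cost(mu, rng=rng) for _ in range(30)]
    print(f"mu = {mu}: mean search cost ~ {np.mean(costs):.0f} sites")
```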

  9. Impact of respiratory motion on worst-case scenario optimized intensity modulated proton therapy for lung cancers.

    PubMed

    Liu, Wei; Liao, Zhongxing; Schild, Steven E; Liu, Zhong; Li, Heng; Li, Yupeng; Park, Peter C; Li, Xiaoqiang; Stoker, Joshua; Shen, Jiajian; Keole, Sameer; Anand, Aman; Fatyga, Mirek; Dong, Lei; Sahoo, Narayan; Vora, Sujay; Wong, William; Zhu, X Ronald; Bues, Martin; Mohan, Radhe

    2015-01-01

    We compared conventionally optimized intensity modulated proton therapy (IMPT) treatment plans against worst-case scenario optimized treatment plans for lung cancer. The comparison of the 2 IMPT optimization strategies focused on the resulting plans' ability to retain dose objectives under the influence of patient setup, inherent proton range uncertainty, and dose perturbation caused by respiratory motion. For each of the 9 lung cancer cases, 2 treatment plans were created that accounted for treatment uncertainties in 2 different ways. The first used the conventional method: delivery of prescribed dose to the planning target volume that is geometrically expanded from the internal target volume (ITV). The second used a worst-case scenario optimization scheme that addressed setup and range uncertainties through beamlet optimization. The plan optimality and plan robustness were calculated and compared. Furthermore, the effects on dose distributions of changes in patient anatomy attributable to respiratory motion were investigated for both strategies by comparing the corresponding plan evaluation metrics at the end-inspiration and end-expiration phase and absolute differences between these phases. The mean plan evaluation metrics of the 2 groups were compared with 2-sided paired Student t tests. Without respiratory motion considered, we affirmed that worst-case scenario optimization is superior to planning target volume-based conventional optimization in terms of plan robustness and optimality. With respiratory motion considered, worst-case scenario optimization still achieved more robust dose distributions to respiratory motion for targets and comparable or even better plan optimality (D95% ITV, 96.6% vs 96.1% [P = .26]; D5%- D95% ITV, 10.0% vs 12.3% [P = .082]; D1% spinal cord, 31.8% vs 36.5% [P = .035]). Worst-case scenario optimization led to superior solutions for lung IMPT. Despite the fact that worst-case scenario optimization did not explicitly account for respiratory motion, it produced motion-resistant treatment plans. However, further research is needed to incorporate respiratory motion into IMPT robust optimization. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  10. The cost of different types of lameness in dairy cows calculated by dynamic programming.

    PubMed

    Cha, E; Hertl, J A; Bar, D; Gröhn, Y T

    2010-10-01

    Traditionally, studies which placed a monetary value on the effect of lameness have calculated the costs at the herd level and rarely have they been specific to different types of lameness. These costs which have been calculated from former studies are not particularly useful for farmers in making economically optimal decisions depending on individual cow characteristics. The objective of this study was to calculate the cost of different types of lameness at the individual cow level and thereby identify the optimal management decision for each of three representative lameness diagnoses. This model would provide a more informed decision making process in lameness management for maximal economic profitability. We made modifications to an existing dynamic optimization and simulation model, studying the effects of various factors (incidence of lameness, milk loss, pregnancy rate and treatment cost) on the cost of different types of lameness. The average cost per case (US$) of sole ulcer, digital dermatitis and foot rot were 216.07, 132.96 and 120.70, respectively. It was recommended that 97.3% of foot rot cases, 95.5% of digital dermatitis cases and 92.3% of sole ulcer cases be treated. The main contributor to the total cost per case of sole ulcer was milk loss (38%), treatment cost for digital dermatitis (42%) and the effect of decreased fertility for foot rot (50%). This model affords versatility as it allows for parameters such as production costs, economic values and disease frequencies to be altered. Therefore, cost estimates are the direct outcome of the farm specific parameters entered into the model. Thus, this model can provide farmers economically optimal guidelines specific to their individual cows suffering from different types of lameness. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Contraction Options and Optimal Multiple-Stopping in Spectrally Negative Lévy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamazaki, Kazutoshi, E-mail: kyamazak@kansai-u.ac.jp

    This paper studies the optimal multiple-stopping problem arising in the context of the timing option to withdraw from a project in stages. The profits are driven by a general spectrally negative Lévy process. This allows the model to incorporate sudden declines of the project values, generalizing greatly the classical geometric Brownian motion model. We solve the one-stage case as well as the extension to the multiple-stage case. The optimal stopping times are of threshold-type and the value function admits an expression in terms of the scale function. A series of numerical experiments are conducted to verify the optimality and to evaluate the efficiency of the algorithm.

  12. OPTIMIZED REAL-TIME CONTROL OF COMBINED SEWERAGE SYSTEMS: TWO CASE STUDIES

    EPA Science Inventory

    The paper presents the results of two case studies evaluating Real-Time Control (RTC) alternatives, conducted on portions of sewerage systems near Paris, France, and in Quebec City, Canada. The studies were performed at real-scale demonstration sites. RTC ...

  13. Academic consortium for the evaluation of computer-aided diagnosis (CADx) in mammography

    NASA Astrophysics Data System (ADS)

    Mun, Seong K.; Freedman, Matthew T.; Wu, Chris Y.; Lo, Shih-Chung B.; Floyd, Carey E., Jr.; Lo, Joseph Y.; Chan, Heang-Ping; Helvie, Mark A.; Petrick, Nicholas; Sahiner, Berkman; Wei, Datong; Chakraborty, Dev P.; Clarke, Laurence P.; Kallergi, Maria; Clark, Bob; Kim, Yongmin

    1995-04-01

    Computer aided diagnosis (CADx) is a promising technology for the detection of breast cancer in screening mammography. A number of different approaches have been developed for CADx research that have achieved significant levels of performance. Research teams now recognize the need for a careful and detailed evaluation study of approaches to accelerate the development of CADx, to make CADx more clinically relevant, and to optimize the CADx algorithms based on unbiased evaluations. The results of such a comparative study may provide each of the participating teams with new insights into the optimization of their individual CADx algorithms. This consortium of experienced CADx researchers is working as a group to compare the results of the algorithms and to optimize the performance of CADx algorithms by learning from each other. Each institution will contribute an equal number of cases, collected under a standard protocol for case selection, truth determination, and data acquisition, to establish a common and unbiased database for the evaluation study. An evaluation procedure for the comparison studies is being developed to analyze the results of the individual algorithms for each of the test cases in the common database. Optimization of individual CADx algorithms can then be made based on the comparison studies. The consortium effort is expected to accelerate the eventual clinical implementation of CADx algorithms at participating institutions.

  14. An approach for multi-objective optimization of vehicle suspension system

    NASA Astrophysics Data System (ADS)

    Koulocheris, D.; Papaioannou, G.; Christodoulou, D.

    2017-10-01

    In this paper, a half-car model with nonlinear suspension systems is selected in order to study the vertical vibrations and optimize the suspension system with respect to ride comfort and road holding. A road bump was used as the road profile. At first, the optimization problem is solved with the use of Genetic Algorithms with respect to 6 optimization targets, and the k-ɛ optimality method is then implemented to locate a single optimum solution. Furthermore, an alternative approach is presented in this work: the previous optimization targets are separated into main and supplementary ones, depending on their importance in the analysis. The supplementary targets are not crucial to the optimization but they can enhance the main objectives. Thus, the problem was solved again using Genetic Algorithms with respect to the 3 main targets of the optimization. Having obtained the Pareto set of solutions, the k-ɛ optimality method was applied to the 3 main targets and the supplementary ones, the latter evaluated by simulation of the vehicle model. The results of both cases are presented and discussed in terms of convergence of the optimization and computational time, and the optimum solutions acquired from both cases are also compared on the basis of performance metrics.

  15. Relationship between Academic Optimism and Classroom Management Styles of Teachers--Case Study: Elementary School Teachers in Isfahan

    ERIC Educational Resources Information Center

    Moghtadaie, L.; Hoveida, R.

    2015-01-01

    The purpose of this study was to investigate the relationship between classroom management styles of the teachers and their academic optimism. In this study, three types of classroom management styles (interventionist style, interactionist style, and non-interventionist style) have been considered. Research community is all public primary school…

  16. The cost and management of different types of clinical mastitis in dairy cows estimated by dynamic programming.

    PubMed

    Cha, E; Bar, D; Hertl, J A; Tauer, L W; Bennett, G; González, R N; Schukken, Y H; Welcome, F L; Gröhn, Y T

    2011-09-01

    The objective of this study was to estimate the cost of 3 different types of clinical mastitis (CM) (caused by gram-positive bacteria, gram-negative bacteria, and other organisms) at the individual cow level and thereby identify the economically optimal management decision for each type of mastitis. We made modifications to an existing dynamic optimization and simulation model, studying the effects of various factors (incidence of CM, milk loss, pregnancy rate, and treatment cost) on the cost of different types of CM. The average costs per case (US$) of gram-positive, gram-negative, and other CM were $133.73, $211.03, and $95.31, respectively. This model provided a more informed decision-making process in CM management for optimal economic profitability and determined that 93.1% of gram-positive CM cases, 93.1% of gram-negative CM cases, and 94.6% of other CM cases should be treated. The main contributor to the total cost per case was treatment cost for gram-positive CM (51.5% of the total cost per case), milk loss for gram-negative CM (72.4%), and treatment cost for other CM (49.2%). The model affords versatility as it allows for parameters such as production costs, economic values, and disease frequencies to be altered. Therefore, cost estimates are the direct outcome of the farm-specific parameters entered into the model. Thus, this model can provide farmers economically optimal guidelines specific to their individual cows suffering from different types of CM. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  17. Optimization applications in aircraft engine design and test

    NASA Technical Reports Server (NTRS)

    Pratt, T. K.

    1984-01-01

    Starting with the NASA-sponsored STAEBL program, optimization methods based primarily upon the versatile program COPES/CONMIN were introduced over the past few years to a broad spectrum of engineering problems in structural optimization, engine design, engine test, and more recently, manufacturing processes. By automating design and testing processes, many repetitive and costly trade-off studies have been replaced by optimization procedures. Rather than taking engineers and designers out of the loop, optimization has, in fact, put them more in control by providing sophisticated search techniques. The ultimate decision whether to accept or reject an optimal feasible design still rests with the analyst. Feedback obtained from this decision process has been invaluable, since it can be incorporated into the optimization procedure to make it more intelligent. On several occasions, optimization procedures have produced novel designs, such as the nonsymmetric placement of rotor case stiffener rings, not anticipated by engineering designers. In another case, a particularly difficult resonance constraint could not be satisfied using hand iterations for a compressor blade; when the STAEBL program was applied to the problem, a feasible solution was obtained in just two iterations.

  18. Entrepreneurial Decision Making and Institutional Governance within the Academy: A Case Study

    ERIC Educational Resources Information Center

    French, Edward F.

    2011-01-01

    This case study explored the relationship between entrepreneurial decision making and optimal institutional governance. The study focused on a single institution, characterized as a small, tuition-driven, private institution. Twelve participants were interviewed in the study, equally divided between members of the faculty and of the…

  19. Optimal control model predictions of system performance and attention allocation and their experimental validation in a display design study

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Govindaraj, T.

    1980-01-01

    The influence of different types of predictor displays in a longitudinal vertical takeoff and landing (VTOL) hover task is analyzed in a theoretical study. Several cases with differing amounts of predictive and rate information are compared. The optimal control model of the human operator is used to estimate human and system performance in terms of root-mean-square (rms) values and to compute optimized attention allocation. The only part of the model which is varied to predict these data is the observation matrix. Typical cases are selected for a subsequent experimental validation. The rms values as well as eye-movement data are recorded. The results agree favorably with those of the theoretical study in terms of relative differences. Better matching is achieved by revised model input data.

  20. Generalized t-statistic for two-group classification.

    PubMed

    Komori, Osamu; Eguchi, Shinto; Copas, John B

    2015-06-01

    In the classic discriminant model of two multivariate normal distributions with equal variance matrices, the linear discriminant function is optimal both in terms of the log likelihood ratio and in terms of maximizing the standardized difference (the t-statistic) between the means of the two distributions. In a typical case-control study, normality may be sensible for the control sample but heterogeneity and uncertainty in diagnosis may suggest that a more flexible model is needed for the cases. We generalize the t-statistic approach by finding the linear function which maximizes a standardized difference but with data from one of the groups (the cases) filtered by a possibly nonlinear function U. We study conditions for consistency of the method and find the function U which is optimal in the sense of asymptotic efficiency. Optimality may also extend to other measures of discriminatory efficiency such as the area under the receiver operating characteristic curve. The optimal function U depends on a scalar probability density function which can be estimated non-parametrically using a standard numerical algorithm. A lasso-like version for variable selection is implemented by adding L1-regularization to the generalized t-statistic. Two microarray data sets in the study of asthma and various cancers are used as motivating examples. © 2014, The International Biometric Society.
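
    As a reference point for the generalization described above, the classical t-statistic-optimal linear function is the pooled-covariance discriminant direction; a minimal sketch follows. The paper's nonlinear filter U, which is estimated nonparametrically, is not reproduced here.

```python
# Hedged sketch: the linear function maximizing the standardized mean
# difference (t-statistic) between two groups is the pooled-covariance
# discriminant direction. Simulated data, not the asthma/cancer examples.
import numpy as np

def t_optimal_direction(x_controls: np.ndarray, x_cases: np.ndarray) -> np.ndarray:
    """Direction w maximizing (w'mu1 - w'mu0) / sqrt(w' S_pooled w)."""
    mu0, mu1 = x_controls.mean(axis=0), x_cases.mean(axis=0)
    s0, s1 = np.cov(x_controls, rowvar=False), np.cov(x_cases, rowvar=False)
    n0, n1 = len(x_controls), len(x_cases)
    pooled = ((n0 - 1) * s0 + (n1 - 1) * s1) / (n0 + n1 - 2)
    w = np.linalg.solve(pooled, mu1 - mu0)
    return w / np.linalg.norm(w)

# Simulated two-group data: controls ~ N(0, I), cases shifted along two axes.
rng = np.random.default_rng(6)
controls = rng.normal(size=(200, 5))
cases = rng.normal(size=(120, 5)) + np.array([1.0, 0.5, 0.0, 0.0, 0.0])

w = t_optimal_direction(controls, cases)
proj_c, proj_k = controls @ w, cases @ w
pooled_var = (((len(proj_c) - 1) * proj_c.var(ddof=1)
               + (len(proj_k) - 1) * proj_k.var(ddof=1))
              / (len(proj_c) + len(proj_k) - 2))
t = (proj_k.mean() - proj_c.mean()) / np.sqrt(pooled_var)
print("discriminant direction:", np.round(w, 3), "standardized difference:", round(t, 3))
```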

  1. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 2. Case study

    NASA Astrophysics Data System (ADS)

    Graham, Wendy D.; Neff, Christina R.

    1994-05-01

    The first-order analytical solution of the inverse problem for estimating spatially variable recharge and transmissivity under steady-state groundwater flow, developed in Part 1, is applied to the Upper Floridan Aquifer in NE Florida. Parameters characterizing the statistical structure of the log-transmissivity and head fields are estimated from 152 measurements of transmissivity and 146 measurements of hydraulic head available in the study region. Optimal estimates of the recharge, transmissivity and head fields are produced throughout the study region by conditioning on the nearest 10 available transmissivity measurements and the nearest 10 available head measurements. Head observations are shown to provide valuable information for estimating both the transmissivity and the recharge fields. Accurate numerical groundwater model predictions of the aquifer flow system are obtained using the optimal transmissivity and recharge fields as input parameters, and the optimal head field to define boundary conditions. For this case study, both the transmissivity field and the uncertainty of the transmissivity field prediction are poorly estimated when the effects of random recharge are neglected.

  2. Optimization of microphysics in the Unified Model, using the Micro-genetic algorithm.

    NASA Astrophysics Data System (ADS)

    Jang, J.; Lee, Y.; Lee, H.; Lee, J.; Joo, S.

    2016-12-01

    This study focuses on parameter optimization of the microphysics in the Unified Model (UM) using the micro-genetic algorithm (Micro-GA). Optimization of the UM microphysics is needed because the microphysics in a Numerical Weather Prediction (NWP) model is important for Quantitative Precipitation Forecasting (QPF). The Micro-GA searches for optimal parameters on the basis of a fitness function. Five target parameters are chosen: x1 and x2, related to the raindrop size distribution, the cloud-rain correlation coefficient, the surface droplet number, and the droplet taper height. The fitness function is based on skill scores, namely BIAS and the Critical Success Index (CSI). An interface between the UM and the Micro-GA is developed and applied to three precipitation cases in Korea: (ⅰ) heavy rainfall in the southern area caused by typhoon NAKRI, (ⅱ) heavy rainfall in the Youngdong area, and (ⅲ) heavy rainfall in the Seoul metropolitan area. When the optimized result is compared to the control result (using the UM default values, CNTL), the optimization leads to improvements in the precipitation forecast, especially for heavy rainfall at late forecast times. We also analyze the skill scores of the precipitation forecasts at various thresholds for CNTL, the optimized result, and experiments in which each of the five parameters is optimized individually. In general, the improvement is maximized when the five optimized parameters are used simultaneously. This study therefore demonstrates the ability to improve Korean precipitation forecasts by optimizing the microphysics in the UM.

  3. A graph decomposition-based approach for water distribution network optimization

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.; Deuerlein, Jochen W.

    2013-04-01

    A novel optimization approach for water distribution network design is proposed in this paper. Using graph theory algorithms, a full water network is first decomposed into different subnetworks based on the connectivity of the network's components. The original whole network is simplified to a directed augmented tree, in which the subnetworks are substituted by augmented nodes and directed links are created to connect them. Differential evolution (DE) is then employed to optimize each subnetwork based on the sequence specified by the assigned directed links in the augmented tree. Rather than optimizing the original network as a whole, the subnetworks are sequentially optimized by the DE algorithm. A solution choice table is established for each subnetwork (except for the subnetwork that includes a supply node) and the optimal solution of the original whole network is finally obtained by use of the solution choice tables. Furthermore, a preconditioning algorithm is applied to the subnetworks to produce an approximately optimal solution for the original whole network. This solution specifies promising regions for the final optimization algorithm to further optimize the subnetworks. Five water network case studies are used to demonstrate the effectiveness of the proposed optimization method. A standard DE algorithm (SDE) and a genetic algorithm (GA) are applied to each case study without network decomposition to enable a comparison with the proposed method. The results show that the proposed method consistently outperforms the SDE and GA (both with tuned parameters) in terms of both the solution quality and efficiency.
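
    A hedged sketch of the decomposition step and the per-subnetwork optimization, assuming a toy two-loop network, bridge removal via networkx, and a placeholder cost-plus-penalty objective in place of hydraulic simulation; the paper's solution-choice tables and sequencing via the augmented tree are not reproduced.

```python
# Hedged sketch: decompose a network into subnetworks by removing bridge links,
# then size each subnetwork's pipes with differential evolution. Toy network
# and toy objective only.
import networkx as nx
import numpy as np
from scipy.optimize import differential_evolution

# Toy network: two loops joined by a single bridge pipe.
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 1),        # loop 1
                  (4, 5),                                  # bridge
                  (5, 6), (6, 7), (7, 8), (8, 5)])         # loop 2

# Decompose: removing bridges leaves the looped subnetworks as components.
H = G.copy()
H.remove_edges_from(list(nx.bridges(G)))
subnetworks = [list(c) for c in nx.connected_components(H) if len(c) > 1]

def pipe_cost(diameters):
    # Placeholder objective: material cost plus a penalty for undersized pipes
    diameters = np.asarray(diameters)
    cost = np.sum(diameters ** 1.5)
    penalty = 1e3 * np.sum(np.maximum(0.25 - diameters, 0.0))
    return cost + penalty

total_cost = 0.0
for nodes in subnetworks:                      # optimize subnetworks in sequence
    n_pipes = G.subgraph(nodes).number_of_edges()
    bounds = [(0.1, 1.0)] * n_pipes            # pipe diameters in metres
    result = differential_evolution(pipe_cost, bounds, seed=7, maxiter=100, tol=1e-6)
    total_cost += result.fun
    print(f"subnetwork {sorted(nodes)}: cost {result.fun:.2f}")
print(f"total (excluding bridge pipes): {total_cost:.2f}")
```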

  4. Multidisciplinary design optimization of the belt drive system considering both structure and vibration characteristics based on improved genetic algorithm

    NASA Astrophysics Data System (ADS)

    Yuan, Yongliang; Song, Xueguan; Sun, Wei; Wang, Xiaobang

    2018-05-01

    The dynamic performance of a belt drive system depends on many factors, such as the efficiency, the vibration, and the choice of parameters. Conventional design considers only the basic performance of the belt drive system, while ignoring its overall performance. To address these challenges, the study of vibration characteristics and optimization strategies is a feasible way forward. This paper proposes a new optimization strategy and takes a belt drive design optimization as a case study based on multidisciplinary design optimization (MDO). The MDO of the belt drive system is established and the corresponding sub-systems are analyzed. The multidisciplinary optimization is performed using an improved genetic algorithm. Based on the optimal results obtained from the MDO, a three-dimensional (3D) model of the belt drive system is established for dynamics simulation by virtual prototyping. Comparison of the results for different velocities and loads shows that the MDO method can effectively reduce the transverse vibration amplitude. The laws governing the vibration displacement and the vibration frequency, and the influence of velocity on the transverse vibrations, have been obtained. The results show that the MDO method is of great help in obtaining the optimal structural parameters. Furthermore, the kinematic principle of the belt drive has been obtained. The belt drive design case indicates that the proposed method can also be used to solve other engineering optimization problems efficiently.

  5. Optimal antiretroviral therapy adherence as evaluated by CASE index score tool is associated with virological suppression in HIV-infected adults in Dakar, Senegal.

    PubMed

    Byabene, A K; Fortes-Déguénonvo, L; Niang, K; Manga, M N; Bulabula, A N H; Nachega, J B; Seydi, M

    2017-06-01

    To determine the prevalence and factors associated with optimal antiretroviral therapy (ART) adherence and virological failure (VLF) among HIV-infected adults enrolled in the national ART programme at the teaching hospital of Fann, Dakar, Senegal. Cross-sectional study from 1 September 2013 to 30 January 2014. The outcomes were (1) optimal ART adherence by the Center for Adherence Support Evaluation (CASE) Index Score (>10) and (2) VLF (HIV RNA > 1000 copies/ml). Diagnostic accuracy of the CASE Index Score was assessed using sensitivity (Se), specificity (Sp), positive predictive value (PPV), negative predictive value (NPV) and corresponding 95% confidence intervals (CIs). Multivariate logistic regression analysis was performed to identify independent factors associated with optimal adherence and VLF. Of 98 HIV-infected patients on ART, 68% were female. The median (IQR) age was 42 (20-50) years. A total of 57 of 98 (60%) had been on ART for more than 3 years, and the majority (88%) were on an NNRTI-based first-line ART regimen. A total of 79 of 98 (80%) patients reported optimal ART adherence, and only five of 84 (5.9%) had documented VLF. Patients with VLF were significantly more likely to have suboptimal ART adherence (17.7% vs. 2.9%; P = 0.02). The CASE Index Score showed the best trade-off in Se (78.9%, 95% CI: 54.4-93.9%), Sp (20.0%, 95% CI: 11.1-31.7%), PPV (22.4%, 95% CI: 13.1-34.2%) and NPV (76.5%, 95% CI: 50.1-93.2%) when a VLF threshold of HIV RNA >50 copies/ml was used. Factors independently associated with VLF were a CASE Index Score <10 ([aOR] = 13.0, 95% CI: 1.1-147.9; P = 0.04) and being on a boosted PI-based ART regimen ([aOR] = 27.0, 95% CI: 2.4-309.4; P = 0.008). Optimal ART adherence is achievable in a high proportion of HIV-infected adults in this study population. The CASE Index Score was independently associated with virological outcomes, supporting the usefulness of this low-cost ART adherence monitoring tool in this setting. © 2017 John Wiley & Sons Ltd.
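
    The accuracy measures quoted above follow directly from a 2x2 table; a small sketch with made-up counts (not the study data) is shown for reference.

```python
# Hedged sketch: sensitivity, specificity, PPV and NPV from a 2x2 table of
# adherence-tool results versus virological outcome. Counts are illustrative.
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 counts."""
    return {
        "sensitivity": tp / (tp + fn),   # flagged among truly failing patients
        "specificity": tn / (tn + fp),   # correctly cleared among non-failing
        "ppv": tp / (tp + fp),           # failures among tool-flagged patients
        "npv": tn / (tn + fn),           # non-failures among tool-cleared patients
    }

# Hypothetical counts: tool flags suboptimal adherence, outcome is VLF.
metrics = diagnostic_metrics(tp=15, fp=52, fn=4, tn=13)
for name, value in metrics.items():
    print(f"{name}: {100 * value:.1f}%")
```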

  6. Sculpt test problem analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweetser, John David

    2013-10-01

    This report details Sculpt's implementation from a user's perspective. Sculpt is an automatic hexahedral mesh generation tool developed at Sandia National Labs by Steve Owen. 54 predetermined test cases are studied while varying the input parameters (Laplace iterations, optimization iterations, optimization threshold, number of processors) and measuring the quality of the resultant mesh. This information is used to determine the optimal input parameters to use for an unknown input geometry. The overall characteristics are covered in Chapter 1. The specific details of every case are then given in Appendix A. Finally, example Sculpt inputs are given in B.1 and B.2.

  7. A fast optimization algorithm for multicriteria intensity modulated proton therapy planning.

    PubMed

    Chen, Wei; Craft, David; Madden, Thomas M; Zhang, Kewu; Kooy, Hanne M; Herman, Gabor T

    2010-09-01

    To describe a fast projection algorithm for optimizing intensity modulated proton therapy (IMPT) plans and to describe and demonstrate the use of this algorithm in multicriteria IMPT planning. The authors develop a projection-based solver for a class of convex optimization problems and apply it to IMPT treatment planning. The speed of the solver permits its use in multicriteria optimization, where several optimizations are performed that span the space of possible treatment plans. The authors describe a plan database generation procedure which is customized to the requirements of the solver. The optimality precision of the solver can be specified by the user. The authors apply the algorithm to three clinical cases: a pancreas case, an esophagus case, and a tumor along the rib cage. Detailed analysis of the pancreas case shows that the algorithm is orders of magnitude faster than industry-standard general-purpose algorithms (MOSEK's interior point, primal simplex, and dual simplex optimizers). Additionally, the projection solver has almost no memory overhead. The speed and guaranteed accuracy of the algorithm make it suitable for use in multicriteria treatment planning, which requires the computation of several diverse treatment plans. Additionally, given the low memory overhead of the algorithm, the method can be extended to include multiple geometric instances and proton range possibilities for robust optimization.
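
    The abstract does not give the solver's internals; purely as a generic illustration of projection-based optimization for convex problems, the sketch below runs a projected-gradient method for a box-constrained least-squares objective, a common pattern in this class of solvers (not necessarily the authors' algorithm).

    ```python
    import numpy as np

    def projected_gradient(A, b, lower, upper, iters=500):
        """Minimize ||A x - b||^2 subject to lower <= x <= upper via projected gradient."""
        step = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)   # 1/L, L = Lipschitz constant of the gradient
        x = np.clip(np.zeros(A.shape[1]), lower, upper)
        for _ in range(iters):
            grad = 2.0 * A.T @ (A @ x - b)
            x = np.clip(x - step * grad, lower, upper)   # gradient step, then projection onto the box
        return x

    rng = np.random.default_rng(1)
    A = rng.random((50, 20))       # stand-in for a dose-influence matrix
    b = rng.random(50)             # stand-in for prescribed doses
    print(projected_gradient(A, b, lower=0.0, upper=1.0).round(3))
    ```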

  8. WMOST v2 Case Study: Monponsett Ponds

    EPA Science Inventory

    This webinar presents an overview of the preliminary results of a case study application of EPA's Watershed Management Optimization Support Tool v2 (WMOST) for stakeholders in the Monponsett Ponds Watershed Workgroup. Monponsett Ponds is a large water system consisting of two ba...

  9. Optimal Elevation and Configuration of Hanford's Double-Shell Tank Waste Mixer Pumps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Yasuo; Yokuda, Satoru T.; Majumder, Catherine A.

    The objective of this study was to compare the mixing performance of the Lawrence pump, which has injection nozzles at the top, with an alternative pump that has injection nozzles at the bottom, and to determine the optimal elevation for the alternative pump. Sixteen cases were evaluated: two sludge thicknesses at eight levels. A two-step evaluation approach was used: Step 1 to evaluate all 16 cases with the non-rotating mixer pump model and Step 2 to further evaluate four of those cases with the more realistic rotating mixer pump model. The TEMPEST code was used.

  10. Interactive Land-Use Optimization Using Laguerre Voronoi Diagram with Dynamic Generating Point Allocation

    NASA Astrophysics Data System (ADS)

    Chaidee, S.; Pakawanwong, P.; Suppakitpaisarn, V.; Teerasawat, P.

    2017-09-01

    In this work, we devise an efficient method for the land-use optimization problem based on the Laguerre Voronoi diagram. Previous Voronoi diagram-based methods are more efficient and more suitable for interactive design than discrete optimization-based methods, but, in many cases, their outputs do not satisfy area constraints. To cope with this problem, we propose a force-directed graph drawing algorithm, which automatically allocates the generating points of the Voronoi diagram to appropriate positions. We then construct a Laguerre Voronoi diagram based on these generating points, use linear programs to adjust each cell, and reconstruct the diagram based on the adjustment. We apply the proposed method to the practical case study of Chiang Mai University's allocated land for a mixed-use complex. For this case study, compared to another Voronoi diagram-based method, we decrease the land allocation error by 62.557%. Although our computation time is larger than that of the previous Voronoi-diagram-based method, it is still suitable for interactive design.

  11. Two-Step Optimization for Spatial Accessibility Improvement: A Case Study of Health Care Planning in Rural China

    PubMed Central

    Luo, Jing; Tian, Lingling; Luo, Lei; Yi, Hong

    2017-01-01

    A recent advancement in location-allocation modeling formulates a two-step approach to a new problem of minimizing disparity of spatial accessibility. Our field work in a health care planning project in a rural county in China indicated that residents valued distance or travel time from the nearest hospital foremost and then considered quality of care including less waiting time as a secondary desirability. Based on the case study, this paper further clarifies the sequential decision-making approach, termed “two-step optimization for spatial accessibility improvement (2SO4SAI).” The first step is to find the best locations to site new facilities by emphasizing accessibility as proximity to the nearest facilities with several alternative objectives under consideration. The second step adjusts the capacities of facilities for minimal inequality in accessibility, where the measure of accessibility accounts for the match ratio of supply and demand and complex spatial interaction between them. The case study illustrates how the two-step optimization method improves both aspects of spatial accessibility for health care access in rural China. PMID:28484707

  12. Two-Step Optimization for Spatial Accessibility Improvement: A Case Study of Health Care Planning in Rural China.

    PubMed

    Luo, Jing; Tian, Lingling; Luo, Lei; Yi, Hong; Wang, Fahui

    2017-01-01

    A recent advancement in location-allocation modeling formulates a two-step approach to a new problem of minimizing disparity of spatial accessibility. Our field work in a health care planning project in a rural county in China indicated that residents valued distance or travel time from the nearest hospital foremost and then considered quality of care including less waiting time as a secondary desirability. Based on the case study, this paper further clarifies the sequential decision-making approach, termed "two-step optimization for spatial accessibility improvement (2SO4SAI)." The first step is to find the best locations to site new facilities by emphasizing accessibility as proximity to the nearest facilities with several alternative objectives under consideration. The second step adjusts the capacities of facilities for minimal inequality in accessibility, where the measure of accessibility accounts for the match ratio of supply and demand and complex spatial interaction between them. The case study illustrates how the two-step optimization method improves both aspects of spatial accessibility for health care access in rural China.
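
    The accessibility measure described (a supply-to-demand match ratio combined with spatial interaction) is in the spirit of gravity-based two-step floating catchment area (2SFCA) indices; the sketch below computes such an index for made-up villages and hospitals and is only an illustration, not the paper's exact formulation.

    ```python
    import numpy as np

    def accessibility(supply, demand, dist, beta=0.1):
        """Gravity-type two-step floating catchment area (2SFCA) accessibility.

        supply[j]  : capacity of facility j (e.g. beds)
        demand[i]  : population at location i
        dist[i, j] : travel impedance between location i and facility j
        """
        w = np.exp(-beta * dist)                   # distance-decay weights
        crowd = (demand[:, None] * w).sum(axis=0)  # demand competing for each facility
        ratio = supply / crowd                     # supply-to-demand match ratio R_j
        return (w * ratio[None, :]).sum(axis=1)    # A_i = sum_j R_j * f(d_ij)

    rng = np.random.default_rng(2)
    dist = rng.uniform(1, 20, size=(6, 3))                     # 6 villages, 3 hospitals (made-up)
    demand = rng.integers(500, 5000, size=6).astype(float)
    supply = np.array([50.0, 80.0, 40.0])
    print(accessibility(supply, demand, dist))                 # disparity in A_i could then be minimized
    ```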

  13. Optimized Real-Time Control of Combined Sewerage Systems: Two Case Studies (Proceedings Paper)

    EPA Science Inventory

    The paper presents results of two case studies of Real-Time Control (RTC) alternatives evaluations that were conducted on portions of sewerage systems near Paris, France and in Quebec City, Canada, respectively. The studies were performed at real-scale demonstration sites. RTC al...

  14. Quantum teleportation scheme by selecting one of multiple output ports

    NASA Astrophysics Data System (ADS)

    Ishizaka, Satoshi; Hiroshima, Tohya

    2009-04-01

    The scheme of quantum teleportation, where Bob has multiple (N) output ports and obtains the teleported state by simply selecting one of the N ports, is thoroughly studied. We consider both the deterministic version and probabilistic version of the teleportation scheme aiming to teleport an unknown state of a qubit. Moreover, we consider two cases for each version: (i) the state employed for the teleportation is fixed to a maximally entangled state and (ii) the state is also optimized as well as Alice’s measurement. We analytically determine the optimal protocols for all the four cases and show the corresponding optimal fidelity or optimal success probability. All these protocols can achieve the perfect teleportation in the asymptotic limit of N→∞ . The entanglement properties of the teleportation scheme are also discussed.

  15. Optimal case-control matching in practice.

    PubMed

    Cologne, J B; Shibata, Y

    1995-05-01

    We illustrate modern matching techniques and discuss practical issues in defining the closeness of matching for retrospective case-control designs (in which the pool of subjects already exists when the study commences). We empirically compare matching on a balancing score, analogous to the propensity score for treated/control matching, with matching on a weighted distance measure. Although both methods in principle produce balance between cases and controls in the marginal distributions of the matching covariates, the weighted distance measure provides better balance in practice because the balancing score can be poorly estimated. We emphasize the use of optimal matching based on efficient network algorithms. An illustration is based on the design of a case-control study of hepatitis B virus infection as a possible confounder and/or effect modifier of radiation-related primary liver cancer in atomic bomb survivors.
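
    As a generic illustration of optimal (rather than greedy, first-available) matching on a weighted covariate distance, the sketch below solves the underlying assignment problem with SciPy's Hungarian-algorithm routine; the covariates and weights are invented placeholders, and this is not the network-flow implementation used in the study.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def optimal_match(case_X, control_X, weights):
        """1:1 optimal matching that minimizes the total weighted covariate distance."""
        diff = case_X[:, None, :] - control_X[None, :, :]       # (n_cases, n_controls, p)
        dist = np.sqrt((weights * diff ** 2).sum(axis=2))       # weighted Euclidean distance
        rows, cols = linear_sum_assignment(dist)                # optimal assignment (Hungarian)
        return list(zip(rows, cols)), dist[rows, cols].sum()

    rng = np.random.default_rng(3)
    cases = rng.normal(size=(5, 2))        # illustrative covariate columns (e.g. age, dose)
    controls = rng.normal(size=(12, 2))
    pairs, total = optimal_match(cases, controls, weights=np.array([1.0, 2.0]))
    print(pairs, round(total, 3))
    ```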

  16. Determination of the optimal case definition for the diagnosis of end-stage renal disease from administrative claims data in Manitoba, Canada.

    PubMed

    Komenda, Paul; Yu, Nancy; Leung, Stella; Bernstein, Keevin; Blanchard, James; Sood, Manish; Rigatto, Claudio; Tangri, Navdeep

    2015-01-01

    End-stage renal disease (ESRD) is a major public health problem with increasing prevalence and costs. An understanding of the long-term trends in dialysis rates and outcomes can help inform health policy. We determined the optimal case definition for the diagnosis of ESRD using administrative claims data in the province of Manitoba over a 7-year period. We determined the sensitivity, specificity, predictive value and overall accuracy of 4 administrative case definitions for the diagnosis of ESRD requiring chronic dialysis over different time horizons from Jan. 1, 2004, to Mar. 31, 2011. The Manitoba Renal Program Database served as the gold standard for confirming dialysis status. During the study period, 2562 patients were registered as recipients of chronic dialysis in the Manitoba Renal Program Database. Over a 1-year period (2010), the optimal case definition was any 2 claims for outpatient dialysis, and it was 74.6% sensitive (95% confidence interval [CI] 72.3%-76.9%) and 94.4% specific (95% CI 93.6%-95.2%) for the diagnosis of ESRD. In contrast, a case definition of at least 2 claims for dialysis treatment more than 90 days apart was 64.8% sensitive (95% CI 62.2%-67.3%) and 97.1% specific (95% CI 96.5%-97.7%). Extending the period to 5 years greatly improved sensitivity for all case definitions, with minimal change to specificity; for example, for the optimal case definition of any 2 claims for dialysis treatment, sensitivity increased to 86.0% (95% CI 84.7%-87.4%) at 5 years. Accurate case definitions for the diagnosis of ESRD requiring dialysis can be derived from administrative claims data. The optimal definition required any 2 claims for outpatient dialysis. Extending the claims period to 5 years greatly improved sensitivity with minimal effects on specificity for all case definitions.

  17. Energy-optimal electrical excitation of nerve fibers.

    PubMed

    Jezernik, Saso; Morari, Manfred

    2005-04-01

    We derive, based on an analytical nerve membrane model and optimal control theory of dynamical systems, an energy-optimal stimulation current waveform for electrical excitation of nerve fibers. Optimal stimulation waveforms for non-leaky and leaky membranes are calculated; the leaky membrane is the realistic case. Finally, we compare the waveforms and energies necessary to excite a leaky membrane when the stimulation waveform is a square-wave current pulse and in the case of energy-optimal stimulation. The optimal stimulation waveform is an exponentially rising waveform and requires considerably less energy to excite the nerve than a square-wave pulse (especially for larger pulse durations). The described theoretical results can lead to drastically increased battery lifetime and/or decreased energy transmission requirements for implanted biomedical systems.
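
    A simple numerical check of the qualitative claim can be made with a passive RC (leaky) membrane model: scale each candidate waveform so that it just reaches threshold at the end of the pulse, then compare the integral of I², which is proportional to the delivered energy for a fixed load. The sketch below does this for a square pulse and an exponentially rising current; the membrane constants are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def simulate_vm(i_t, dt, tau=1e-3, r_m=1.0):
        """Passive (leaky) RC membrane: tau * dV/dt = -V + r_m * I(t), forward Euler."""
        v = 0.0
        for i in i_t:
            v += dt / tau * (-v + r_m * i)
        return v

    def energy_to_threshold(shape, T=2e-3, v_th=1.0, dt=1e-6):
        t = np.arange(0.0, T, dt)
        i_t = shape(t)
        i_t = i_t * (v_th / simulate_vm(i_t, dt))   # scale so threshold is just reached at T
        return np.sum(i_t ** 2) * dt                # energy ~ integral of I^2 dt (unit load)

    square = lambda t: np.ones_like(t)
    rising_exp = lambda t: np.exp(t / 1e-3)         # exponentially rising, membrane time constant

    # The exponential waveform should need noticeably less energy than the square pulse.
    print(energy_to_threshold(square), energy_to_threshold(rising_exp))
    ```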

  18. Static vs stochastic optimization: A case study of FTSE Bursa Malaysia sectorial indices

    NASA Astrophysics Data System (ADS)

    Mamat, Nur Jumaadzan Zaleha; Jaaman, Saiful Hafizah; Ahmad, Rokiah@Rozita

    2014-06-01

    Traditional portfolio optimization methods such as Markowitz's mean-variance model and the semi-variance model utilize static expected return and volatility risk from historical data to generate an optimal portfolio. The optimal portfolio may not truly be optimal in reality because maximum and minimum values in the data may strongly influence the expected return and volatility risk values. This paper considers the distributions of assets' returns and volatility risk to determine a more realistic optimized portfolio. For illustration purposes, sectorial indices data from FTSE Bursa Malaysia are employed. The results show that stochastic optimization provides a more stable information ratio.
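
    For context, a minimal static Markowitz-style mean-variance optimization (the baseline the paper compares against) can be written with SciPy as below; the return series are simulated placeholders, and the paper's stochastic formulation is not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def mean_variance_weights(returns, risk_aversion=3.0):
        """Static mean-variance weights from historical returns (long-only, fully invested)."""
        mu = returns.mean(axis=0)               # expected returns per index
        cov = np.cov(returns, rowvar=False)     # covariance (volatility risk)
        n = len(mu)

        def neg_utility(w):
            return -(w @ mu - 0.5 * risk_aversion * w @ cov @ w)

        cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
        bounds = [(0.0, 1.0)] * n
        res = minimize(neg_utility, np.full(n, 1.0 / n), bounds=bounds, constraints=cons)
        return res.x

    rng = np.random.default_rng(4)
    hist = rng.normal(0.0005, 0.01, size=(250, 5))   # 250 days, 5 sectorial indices (simulated)
    print(mean_variance_weights(hist).round(3))
    ```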

  19. Optimal field-splitting algorithm in intensity-modulated radiotherapy: Evaluations using head-and-neck and female pelvic IMRT cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dou, Xin; Kim, Yusung, E-mail: yusung-kim@uiowa.edu; Bayouth, John E.

    2013-04-01

    To develop an optimal field-splitting algorithm of minimal complexity and verify the algorithm using head-and-neck (H&N) and female pelvic intensity-modulated radiotherapy (IMRT) cases. An optimal field-splitting algorithm was developed in which a large intensity map (IM) was split into multiple sub-IMs (≥2). The algorithm reduced the total complexity by minimizing the monitor units (MU) delivered and the segment number of each sub-IM. The algorithm was verified through comparison studies with the algorithm used in a commercial treatment planning system. Seven IMRT H&N and female pelvic cancer cases (54 IMs) were analyzed by MU, segment numbers, and dose distributions. The optimal field-splitting algorithm was found to reduce both total MU and the total number of segments. We found on average a 7.9 ± 11.8% and 9.6 ± 18.2% reduction in MU and segment numbers for H&N IMRT cases, with an 11.9 ± 17.4% and 11.1 ± 13.7% reduction for female pelvic cases. The overall percent (absolute) reductions in the numbers of MU and segments were on average −9.7 ± 14.6% (−15 ± 25 MU) and −10.3 ± 16.3% (−3 ± 5), respectively. In addition, all dose distributions from the optimal field-splitting method showed improved dose distributions. The optimal field-splitting algorithm shows considerable improvements in both total MU and total segment number. The algorithm is expected to be beneficial for the radiotherapy treatment of large-field IMRT.

  20. The Effect of Teachers' Shared Leadership Perception on Academic Optimism and Organizational Citizenship Behaviour: A Turkish Case

    ERIC Educational Resources Information Center

    Akin Kösterelioglu, Meltem

    2017-01-01

    Purpose: The present study investigates the capability of high school teachers' shared leadership perception to predict the academic optimism and organizational citizenship levels. Research methods: The population of the current descriptive study, which was conducted via screening model, consists of 321 high school teachers working for Amasya…

  1. Financial planning as a policy tool in the petroleum industry (case study: OJSC "Surgutneftegas")

    NASA Astrophysics Data System (ADS)

    Romanyuk, Vera; Karyakina, Anna; Vershkova, Elena; Grinkevish, Larisa; Pozdeeva, Galina

    2016-09-01

    The article deals with the financial planning of oil and gas company activities including capital structure optimization. One of the main tasks of up-to-date financial management is to optimize the capital structure of an organization and minimize the weighted average cost of capital. The applied method in capital structure optimization affects the research quality results, as well as management decisions. The study was conducted on the basis of OJSC "Surgutneftegas" financial statements.

  2. FMRQ-A Multiagent Reinforcement Learning Algorithm for Fully Cooperative Tasks.

    PubMed

    Zhang, Zhen; Zhao, Dongbin; Gao, Junwei; Wang, Dongqing; Dai, Yujie

    2017-06-01

    In this paper, we propose a multiagent reinforcement learning algorithm for fully cooperative tasks, called frequency of the maximum reward Q-learning (FMRQ). FMRQ aims to reach one of the optimal Nash equilibria so as to optimize the performance index in multiagent systems. The frequency of obtaining the highest global immediate reward, rather than the immediate reward itself, is used as the reinforcement signal. With FMRQ, each agent does not need to observe the other agents' actions and only shares its state and reward at each step. We validate FMRQ through case studies of repeated games: four cases of two-player two-action games and one case of a three-player two-action game. It is demonstrated that FMRQ can converge to one of the optimal Nash equilibria in these cases. Moreover, comparison experiments on tasks with multiple states and finite steps are conducted: one is a box-pushing task and the other is a distributed sensor network problem. Experimental results show that the proposed algorithm outperforms the others with higher performance.
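
    A loose, simplified sketch of the core idea, using the frequency of hitting the highest global reward as the action-selection signal in a cooperative 2x2 repeated game, is given below; it is not the published FMRQ algorithm, and the payoff matrix and update rule are invented for illustration.

    ```python
    import numpy as np

    # Cooperative 2x2 repeated game: both agents choosing action 1 yields the highest reward.
    REWARD = np.array([[5.0, 0.0],
                       [0.0, 10.0]])
    R_MAX = REWARD.max()

    rng = np.random.default_rng(5)
    counts = np.ones((2, 2))    # counts[agent, action]: times the action was played
    hits = np.zeros((2, 2))     # hits[agent, action]: times the global max reward was obtained

    def choose(agent, eps):
        if rng.random() < eps:
            return rng.integers(2)
        return int(np.argmax(hits[agent] / counts[agent]))   # frequency of hitting the max reward

    for step in range(3000):
        eps = max(0.05, 1.0 - step / 2000)                   # decaying exploration
        a0, a1 = choose(0, eps), choose(1, eps)
        r = REWARD[a0, a1]                                   # shared (global) immediate reward
        for agent, action in ((0, a0), (1, a1)):
            counts[agent, action] += 1
            hits[agent, action] += float(r == R_MAX)

    print(hits / counts)   # both agents should come to favour action 1 (the optimal equilibrium)
    ```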

  3. An algorithm for the optimal collection of wet waste.

    PubMed

    Laureri, Federica; Minciardi, Riccardo; Robba, Michela

    2016-02-01

    This work develops an approach for planning wet waste (food waste and similar) collection at a metropolitan scale. Some specific modeling features distinguish this waste collection problem from others; for instance, there may be significant differences in the values of the parameters (such as weight and volume) characterizing the various collection points. As in classical waste collection planning, wet waste collection leads to difficult combinatorial problems, where determining an optimal solution may require a very large computational effort for problem instances of appreciable dimensionality. For this reason, a heuristic procedure for the optimal planning of wet waste collection is developed and applied to problem instances drawn from a real case study. The performance obtained by applying this procedure is evaluated by comparison with a general-purpose mathematical programming software package, as well as with very simple decision rules commonly used in practice. The considered case study consists of an area corresponding to the historical center of the Municipality of Genoa. Copyright © 2015 Elsevier Ltd. All rights reserved.
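
    The abstract does not detail the heuristic; as a generic illustration of the kind of simple construction rule such planning starts from, the sketch below builds capacity-feasible collection routes with a nearest-neighbour rule over made-up distances and point weights.

    ```python
    import numpy as np

    def greedy_routes(dist, weights, depot=0, capacity=1000.0):
        """Nearest-neighbour construction heuristic with a vehicle weight capacity.

        dist[i, j] : travel distance between points (index 0 = depot)
        weights[i] : waste weight at collection point i (weights[0] = 0)
        """
        unserved = set(range(1, len(weights)))
        routes = []
        while unserved:
            route, load, here = [depot], 0.0, depot
            while True:
                feasible = [j for j in unserved if load + weights[j] <= capacity]
                if not feasible:
                    break
                nxt = min(feasible, key=lambda j: dist[here, j])   # closest feasible point
                route.append(nxt)
                load += weights[nxt]
                here = nxt
                unserved.remove(nxt)
            routes.append(route + [depot])
        return routes

    rng = np.random.default_rng(6)
    pts = rng.uniform(0, 10, size=(9, 2))                  # depot + 8 collection points (made-up)
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    weights = np.r_[0.0, rng.uniform(100, 400, size=8)]
    print(greedy_routes(dist, weights))
    ```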

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Yousong, E-mail: yousong.luo@rmit.edu.au

    This paper deals with a class of optimal control problems governed by an initial-boundary value problem of a parabolic equation. The case of semi-linear boundary control is studied where the control is applied to the system via the Wentzell boundary condition. The differentiability of the state variable with respect to the control is established and hence a necessary condition is derived for the optimal solution in the case of both unconstrained and constrained problems. The condition is also sufficient for the unconstrained convex problems. A second order condition is also derived.

  5. Reliability centered maintenance : a case study of railway transit maintenance to achieve optimal performance.

    DOT National Transportation Integrated Search

    2010-12-01

    The purpose of this qualitative case study was to identify the types of obstacles and patterns experienced by a single heavy rail transit agency located in North America that embedded a Reliability Centered Maintenance (RCM) Process. The outcome of t...

  6. Employment of Personnel at the Tucson Border Patrol Station

    DTIC Science & Technology

    2017-06-09

    How should the Tucson Border Patrol Station optimally employ personnel? Using a case study research methodology ... BORSTAR provides better capabilities to respond and greater mobility in risk management ... The methodologies of case study comparatives include ...

  7. Full space device optimization for solar cells.

    PubMed

    Baloch, Ahmer A B; Aly, Shahzada P; Hossain, Mohammad I; El-Mellouhi, Fedwa; Tabet, Nouar; Alharbi, Fahhad H

    2017-09-20

    Advances in computational materials have paved the way to designing efficient solar cells by identifying the optimal properties of the device layers. Conventionally, device optimization has been governed by single or double descriptors for an individual layer, mostly the absorbing layer. However, the performance of the device depends collectively on all the properties of the material and the geometry of each layer in the cell. To address this issue of multi-property optimization, and to avoid the paradigm of reoccurring materials in the solar cell field, a full-space, material-independent optimization approach is developed and presented in this paper. The method is employed to obtain an optimized material data set for maximum efficiency and for targeted functionality for each layer. To ensure the robustness of the method, two cases are studied: perovskite solar cell device optimization and a cadmium-free CIGS solar cell. The implementation determines the desirable optoelectronic properties of transport media and contacts that can maximize the efficiency in both cases. The resulting data sets of material properties can be matched with those in materials databases or pursued by further microscopic material design. Moreover, the presented multi-property optimization framework can be extended to design any solid-state device.

  8. Optimal design of green and grey stormwater infrastructure for small urban catchment based on life-cycle cost-effectiveness analysis

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Chui, T. F. M.

    2016-12-01

    Green infrastructure (GI) is identified as a sustainable and environmentally friendly alternative to conventional grey stormwater infrastructure. Commonly used GI (e.g., green roofs, bioretention, porous pavement) can provide multifunctional benefits, e.g., mitigation of urban heat island effects and improvements in air quality. Therefore, to optimize the design of GI and grey drainage infrastructure, it is essential to account for their benefits together with the costs. In this study, a comprehensive simulation-optimization modelling framework that considers the economic and hydro-environmental aspects of GI and grey infrastructure for small urban catchment applications is developed. Several modelling tools (i.e., the EPA SWMM model and the WERF BMP and LID Whole Life Cycle Cost Modelling Tools) and optimization solvers are coupled together to assess the life-cycle cost-effectiveness of GI and grey infrastructure and to further develop optimal stormwater drainage solutions. A typical residential lot in New York City is examined as a case study. The life-cycle cost-effectiveness of various GI and grey infrastructure options is first examined at different investment levels. The results, together with the catchment parameters, are then provided to the optimization solvers to derive the optimal investment and contributing area of each type of stormwater control. The relationship between the investment and the optimized environmental benefit is found to be nonlinear. The optimized drainage solutions demonstrate that grey infrastructure is preferred at low total investments, while more GI should be adopted at high investments. The sensitivity of the optimized solutions to the prices of the stormwater controls is evaluated and is found to be highly associated with their utilization in the base optimization case. The overall simulation-optimization framework can be easily applied to other sites worldwide and further developed into powerful decision support systems.

  9. Structural optimization of large structural systems by optimality criteria methods

    NASA Technical Reports Server (NTRS)

    Berke, Laszlo

    1992-01-01

    The fundamental concepts of the optimality criteria method of structural optimization are presented. The effect of the separability properties of the objective and constraint functions on the optimality criteria expressions is emphasized. The single constraint case is treated first, followed by the multiple constraint case with a more complex evaluation of the Lagrange multipliers. Examples illustrate the efficiency of the method.
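
    For the single-constraint case mentioned above, the optimality criteria reduce to a closed-form resizing rule: stationarity gives each member area in terms of the Lagrange multiplier, which is then fixed by making the displacement constraint active. A minimal sketch, with illustrative member data, is given below.

    ```python
    import numpy as np

    def oc_single_displacement(c, rho_L, u_max):
        """Optimality-criteria sizing for minimum weight with one displacement constraint.

        Stationarity of the Lagrangian gives A_i = sqrt(lambda * c_i / (rho_i * L_i));
        the multiplier lambda is then fixed by making the constraint active:
            u = sum_i c_i / A_i = u_max.
        c[i]     : F_i f_i L_i / E  (real-load * virtual-load contribution of member i)
        rho_L[i] : rho_i * L_i      (weight per unit cross-sectional area of member i)
        """
        sqrt_lambda = np.sum(np.sqrt(c * rho_L)) / u_max
        A = np.sqrt(c / rho_L) * sqrt_lambda
        weight = np.sum(rho_L * A)
        u = np.sum(c / A)          # equals u_max: the constraint is active at the optimum
        return A, weight, u

    c = np.array([4.0, 2.0, 1.0])        # illustrative member contributions
    rho_L = np.array([1.0, 1.0, 2.0])
    print(oc_single_displacement(c, rho_L, u_max=2.0))
    ```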

  10. Self-adaptive multimethod optimization applied to a tailored heating forging process

    NASA Astrophysics Data System (ADS)

    Baldan, M.; Steinberg, T.; Baake, E.

    2018-05-01

    The presented paper describes an innovative self-adaptive multi-objective optimization code. The investigation aims to prove the superiority of this code over NSGA-II and to apply it to an inductor design case study addressed to a "tailored" heating forging application. The choice of the frequency and the heating time is followed by the determination of the number of turns and their positions. Finally, a straightforward optimization is performed in order to minimize energy consumption using "optimal control".

  11. Using Aspen to Teach Chromatographic Bioprocessing: A Case Study in Weak Partitioning Chromatography for Biotechnology Applications

    ERIC Educational Resources Information Center

    Evans, Steven T.; Huang, Xinqun; Cramer, Steven M.

    2010-01-01

    The commercial simulator Aspen Chromatography was employed to study and optimize an important new industrial separation process, weak partitioning chromatography. This case study on antibody purification was implemented in a chromatographic separations course. Parametric simulations were performed to investigate the effect of operating parameters…

  12. Application of a territorial-based filtering algorithm in turbomachinery blade design optimization

    NASA Astrophysics Data System (ADS)

    Bahrami, Salman; Khelghatibana, Maryam; Tribes, Christophe; Yi Lo, Suk; von Fellenberg, Sven; Trépanier, Jean-Yves; Guibault, François

    2017-02-01

    A territorial-based filtering algorithm (TBFA) is proposed as an integration tool in a multi-level design optimization methodology. The design evaluation burden is split between low- and high-cost levels in order to properly balance the cost and required accuracy in different design stages, based on the characteristics and requirements of the case at hand. TBFA is in charge of connecting those levels by selecting a given number of geometrically different promising solutions from the low-cost level to be evaluated in the high-cost level. Two test case studies, a Francis runner and a transonic fan rotor, have demonstrated the robustness and functionality of TBFA in real industrial optimization problems.

  13. Incorporating deliverable monitor unit constraints into spot intensity optimization in intensity modulated proton therapy treatment planning

    PubMed Central

    Cao, Wenhua; Lim, Gino; Li, Xiaoqiang; Li, Yupeng; Zhu, X. Ronald; Zhang, Xiaodong

    2014-01-01

    The purpose of this study is to investigate the feasibility and impact of incorporating deliverable monitor unit (MU) constraints into spot intensity optimization in intensity modulated proton therapy (IMPT) treatment planning. The current treatment planning system (TPS) for IMPT disregards deliverable MU constraints in the spot intensity optimization (SIO) routine. It performs a post-processing procedure on an optimized plan to enforce deliverable MU values that are required by the spot scanning proton delivery system. This procedure can create a significant dose distribution deviation between the optimized and post-processed deliverable plans, especially when small spot spacings are used. In this study, we introduce a two-stage linear programming (LP) approach to optimize spot intensities and constrain deliverable MU values simultaneously, i.e., a deliverable spot intensity optimization (DSIO) model. Thus, the post-processing procedure is eliminated and the associated optimized plan deterioration can be avoided. Four prostate cancer cases at our institution were selected for study and two parallel opposed beam angles were planned for all cases. A quadratic programming (QP) based model without MU constraints, i.e., a conventional spot intensity optimization (CSIO) model, was also implemented to emulate the commercial TPS. Plans optimized by both the DSIO and CSIO models were evaluated for five different settings of spot spacing from 3 mm to 7 mm. For all spot spacings, the DSIO-optimized plans yielded better uniformity for the target dose coverage and critical structure sparing than did the CSIO-optimized plans. With reduced spot spacings, more significant improvements in target dose uniformity and critical structure sparing were observed in the DSIO- than in the CSIO-optimized plans. Additionally, better sparing of the rectum and bladder was achieved when reduced spacings were used for the DSIO-optimized plans. The proposed DSIO approach ensures the deliverability of optimized IMPT plans that take into account MU constraints. This eliminates the post-processing procedure required by the TPS as well as the resultant deteriorating effect on ultimate dose distributions. This approach therefore allows IMPT plans to adopt all possible spot spacings optimally. Moreover, dosimetric benefits can be achieved using smaller spot spacings. PMID:23835656

  14. Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies

    EPA Science Inventory

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  15. Regionalized LCA-based optimization of building energy supply: method and case study for a Swiss municipality.

    PubMed

    Saner, Dominik; Vadenbo, Carl; Steubing, Bernhard; Hellweg, Stefanie

    2014-07-01

    This paper presents a regionalized LCA-based multiobjective optimization model of building energy demand and supply for the case of a Swiss municipality for the minimization of greenhouse gas emissions and particulate matter formation. The results show that the environmental improvement potential is very large: in the optimal case, greenhouse gas emissions from energy supply could be reduced by more than 75% and particulate emissions by over 50% in the municipality. This scenario supposes a drastic shift of heat supply systems from a fossil fuel dominated portfolio to a portfolio consisting of mainly heat pump and woodchip incineration systems. In addition to a change in heat supply technologies, roofs, windows and walls would need to be refurbished in more than 65% of the municipality's buildings. The full potential of the environmental impact reductions will hardly be achieved in reality, particularly in the short term, for example, because of financial constraints and social acceptance, which were not taken into account in this study. Nevertheless, the results of the optimization model can help policy makers to identify the most effective measures for improvement at the decision making level, for example, at the building level for refurbishment and selection of heating systems or at the municipal level for designing district heating networks. Therefore, this work represents a starting point for designing effective incentives to reduce the environmental impact of buildings. While the results of the optimization model are specific to the municipality studied, the model could readily be adapted to other regions.

  16. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected, and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an already existing optimal design software tool. PopED lite is now used in real drug discovery projects, and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive: simple, to give many users access to basic optimal design calculations; fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.); intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
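
    PopED lite itself is not shown here; as a toy illustration of evaluating candidate designs by a function of the Fisher Information Matrix (D-optimality for a mono-exponential response with additive Gaussian error), the sketch below exhaustively scores small sets of sampling times. The model, parameter values, and candidate times are assumptions made for the example.

    ```python
    import numpy as np
    from itertools import combinations

    def fim_logdet(times, A=10.0, k=0.3, sigma=0.5):
        """log det of the Fisher information for y = A * exp(-k t) with iid Gaussian error."""
        t = np.asarray(times, dtype=float)
        # Sensitivities (Jacobian rows) of the model with respect to the parameters (A, k)
        dA = np.exp(-k * t)
        dk = -A * t * np.exp(-k * t)
        J = np.column_stack([dA, dk])
        fim = J.T @ J / sigma ** 2
        sign, logdet = np.linalg.slogdet(fim)
        return logdet if sign > 0 else -np.inf

    candidates = np.arange(0.5, 12.5, 0.5)                     # allowed sampling times (h), illustrative
    best = max(combinations(candidates, 3), key=fim_logdet)    # choose 3 samples by D-optimality
    print(best, fim_logdet(best))
    ```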

  17. Case Study: Fuel Cells Provide Combined Heat and Power at Verizon's Garden City Central Office

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-12-01

    This case study describes how Verizon's Central Office in Garden City, NY, installed a 1.4-MW phosphoric acid fuel cell system as an alternative solution to bolster electric reliability, optimize the company's energy use, and reduce costs in an environmentally responsible manner.

  18. Static vs stochastic optimization: A case study of FTSE Bursa Malaysia sectorial indices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mamat, Nur Jumaadzan Zaleha; Jaaman, Saiful Hafizah; Ahmad, Rokiah Rozita

    2014-06-19

    Traditional portfolio optimization methods such as Markowitz's mean-variance model and the semi-variance model utilize static expected return and volatility risk from historical data to generate an optimal portfolio. The optimal portfolio may not truly be optimal in reality because maximum and minimum values in the data may strongly influence the expected return and volatility risk values. This paper considers the distributions of assets' returns and volatility risk to determine a more realistic optimized portfolio. For illustration purposes, sectorial indices data from FTSE Bursa Malaysia are employed. The results show that stochastic optimization provides a more stable information ratio.

  19. A Study of the Optimal Planning Model for Reservoir Sustainable Management- A Case Study of Shihmen Reservoir

    NASA Astrophysics Data System (ADS)

    Chen, Y. Y.; Ho, C. C.; Chang, L. C.

    2017-12-01

    Reservoir management in Taiwan faces many challenges. Massive sediment loads caused by landslides are flushed into reservoirs, decreasing capacity, raising turbidity, and increasing supply risk. Sediment is usually accompanied by nutrients that cause eutrophication. Moreover, the uneven distribution of rainfall causes water supply instability. Hence, ensuring the sustainable use of reservoirs has become an important task in reservoir management. The purpose of this study is to develop an optimal planning model for sustainable reservoir management that finds optimal operating rules for reservoir flood control and sediment sluicing. The model combines genetic algorithms with artificial neural network models of the hydraulic analysis and reservoir sediment movement. The operating rules aim to prevent reservoir outflow from causing downstream overflow, to minimize the gap between the initial and final water levels of the reservoir, and to maximize sediment sluicing efficiency. The case of Shihmen reservoir was used to explore the difference between the optimal operating rules and the current operation of the reservoir. The results indicate that the optimal operating rules tend to open the desilting tunnel early and extend its opening duration during the flood discharge period. The results also show that the sediment sluicing efficiency of the optimal operating rules is 36%, 44%, and 54% during Typhoons Jangmi, Fung-Wong, and Sinlaku, respectively. The results demonstrate that the optimal operating rules do play a role in extending the service life of Shihmen reservoir and protecting downstream safety. The study introduces a low-cost strategy, the alteration of reservoir operating rules, into sustainable reservoir management as an alternative to pump dredging, in order to address the problem of reservoir sediment removal and its high cost.

  20. Influence of model errors in optimal sensor placement

    NASA Astrophysics Data System (ADS)

    Vincenzi, Loris; Simonini, Laura

    2017-02-01

    The paper investigates the role of model errors and parametric uncertainties in optimal or near-optimal sensor placements for structural health monitoring (SHM) and modal testing. The near-optimal set of measurement locations is obtained by Information Entropy theory; the results of the placement process depend considerably on the so-called covariance matrix of prediction error as well as on the definition of the correlation function. A constant and an exponential correlation function depending on the distance between sensors are first assumed; then a formulation depending on both distance and modal vectors is presented. With reference to a simple case study, the effect of model uncertainties on the results is described, and the reliability and robustness of the proposed correlation function in the presence of model errors are tested with reference to 2D and 3D benchmark case studies. A measure of the quality of the obtained sensor configuration is considered through the use of independent assessment criteria. Finally, the results obtained by applying the proposed procedure to a real 5-span steel footbridge are described. The proposed method also allows higher modes to be estimated better when the number of sensors is greater than the number of modes of interest. In addition, the results show a smaller variation in the sensor positions when uncertainties occur.
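
    As a toy illustration of information-based placement with an exponentially correlated prediction error (one of the correlation functions discussed above), the sketch below scores candidate sensor sets by the log-determinant of the modal Fisher information (maximum information, i.e. minimum entropy) and picks the best set by exhaustive search; the structure, mode shapes, and correlation length are invented, and this is not the paper's procedure.

    ```python
    import numpy as np
    from itertools import combinations

    def logdet_fisher(sensors, phi, coords, corr_len=2.0, sigma=0.05):
        """log det of the modal Fisher information for a candidate sensor set,
        with spatially exponential prediction-error correlation."""
        s = list(sensors)
        d = np.abs(coords[s][:, None] - coords[s][None, :])
        Sigma = sigma ** 2 * np.exp(-d / corr_len)       # prediction-error covariance
        Q = phi[s].T @ np.linalg.solve(Sigma, phi[s])
        sign, ld = np.linalg.slogdet(Q)
        return ld if sign > 0 else -np.inf

    # Toy 1D structure: 20 candidate locations, first 3 sine mode shapes
    coords = np.linspace(0.0, 10.0, 20)
    phi = np.column_stack([np.sin((m + 1) * np.pi * coords / 10.0) for m in range(3)])

    best = max(combinations(range(20), 4), key=lambda s: logdet_fisher(s, phi, coords))
    print(best)   # near-optimal 4-sensor layout for this toy problem
    ```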

  1. Wing-section optimization for supersonic viscous flow

    NASA Technical Reports Server (NTRS)

    Item, Cem C.; Baysal, Oktay (Editor)

    1995-01-01

    To improve the shape of a supersonic wing, an automated method that also includes higher fidelity to the flow physics is desirable. With this impetus, an aerodynamic optimization methodology incorporating the thin-layer Navier-Stokes equations and sensitivity analysis had been previously developed. Prior to embarking upon the wing design task, the present investigation concentrated on testing the feasibility of the methodology, and on identifying adequate problem formulations, by defining two-dimensional, cost-effective test cases. Starting with two distinctly different initial airfoils, two independent shape optimizations resulted in shapes with similar features: slightly cambered, parabolic profiles with sharp leading and trailing edges. Secondly, the section normal to the subsonic portion of the leading edge, which had a high normal angle of attack, was considered. The optimization resulted in a shape with twist and camber which eliminated the adverse pressure gradient, hence exploiting the leading-edge thrust. The wing section shapes obtained in all the test cases had the features predicted by previous studies. Therefore, it was concluded that the flowfield analyses and sensitivity coefficients were computed and fed to the present gradient-based optimizer correctly. As a result of the present two-dimensional study, suggestions were also made for the problem formulations, which should contribute to an effective wing shape optimization.

  2. Numerical verification of two-component dental implant in the context of fatigue life for various load cases.

    PubMed

    Szajek, Krzysztof; Wierszycki, Marcin

    2016-01-01

    Dental implant design is a complex process which must consider many limitations, both biological and mechanical in nature. In earlier studies, a complete procedure for the improvement of a two-component dental implant was proposed. However, the optimization tasks carried out required an assumption about a representative load case, which raised doubts about optimality for the other load cases. This paper deals with verification of the optimal design in the context of fatigue life, and its main goal is to answer the question of whether the assumed load scenario (a solely horizontal occlusal load) leads to a design which is also "safe" for oblique occlusal loads, regardless of the angle from the implant axis. The verification is carried out with a series of finite element analyses for a wide spectrum of physiologically justified loads. The design of experiments methodology with a full factorial technique is utilized. All computations are done in the Abaqus suite. The maximum Mises stress and normalized effective stress amplitude for various load cases are discussed and compared with the assumed "safe" limit (equivalent to a fatigue life of 5e6 cycles). The obtained results prove that the coronal-apical load component should be taken into consideration when the fatigue life of the two-component dental implant is optimized. However, its influence in the analyzed case is small and does not change the fact that the fatigue life improvement is observed for all components within the whole range of analyzed loads.

  3. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process.

    PubMed

    Golkarnarenji, Gelayol; Naebe, Minoo; Badii, Khashayar; Milani, Abbas S; Jazar, Reza N; Khayyam, Hamid

    2018-03-05

    To produce high-quality and low-cost carbon fiber-based composites, optimization of the carbon fiber production process and its properties is one of the main keys. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared, using a limited dataset, to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption of the process. The case study can be beneficial to chemical industries involved in carbon fiber manufacturing for assessing and optimizing different stabilization process conditions at large.
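
    A minimal sketch of the SVR half of such a surrogate, fitted on synthetic stand-in data with scikit-learn, is shown below; the feature choice (zone-2 temperature and residence time) and the data-generating rule are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # Synthetic stand-in data: zone-2 temperature (C), residence time (min) -> OPF density (g/cm3)
    rng = np.random.default_rng(7)
    X = np.column_stack([rng.uniform(230, 270, 200), rng.uniform(20, 60, 200)])
    y = 1.30 + 0.001 * (X[:, 0] - 230) + 0.0008 * X[:, 1] + rng.normal(0, 0.005, 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = make_pipeline(StandardScaler(), SVR(kernel='rbf', C=10.0, epsilon=0.002))
    model.fit(X_tr, y_tr)
    print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))

    # The fitted surrogate could then be searched (e.g. on a grid) for process conditions
    # that hit a target density with minimum energy input.
    ```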

  4. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process

    PubMed Central

    Golkarnarenji, Gelayol; Naebe, Minoo; Badii, Khashayar; Milani, Abbas S.; Jazar, Reza N.; Khayyam, Hamid

    2018-01-01

    To produce high-quality and low-cost carbon fiber-based composites, optimization of the carbon fiber production process and its properties is one of the main keys. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared, using a limited dataset, to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption of the process. The case study can be beneficial to chemical industries involved in carbon fiber manufacturing for assessing and optimizing different stabilization process conditions at large. PMID:29510592

  5. Representations in Problem Solving: A Case Study with Optimization Problems

    ERIC Educational Resources Information Center

    Villegas, Jose L.; Castro, Enrique; Gutierrez, Jose

    2009-01-01

    Introduction: Representations play an essential role in mathematical thinking. They favor the understanding of mathematical concepts and stimulate the development of flexible and versatile thinking in problem solving. Here our focus is on their use in optimization problems, a type of problem considered important in mathematics teaching and…

  6. An optimization model to agroindustrial sector in antioquia (Colombia, South America)

    NASA Astrophysics Data System (ADS)

    Fernandez, J.

    2015-06-01

    This paper proposes a general optimization model for the flower industry, defined using discrete simulation and nonlinear optimization; the mathematical models are solved with ProModel simulation tools and GAMS optimization. It defines the operations that constitute production and marketing in the sector, presents statistically validated data taken directly from each operation through field work, and formulates the discrete simulation model of the operations and the linear optimization model of the entire industry chain. The model is solved with the tools described above, and the results are validated in a case study.

  7. Multi-objective Optimization on Helium Liquefier Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, H. R.; Xiong, L. Y.; Peng, N.; Meng, Y. R.; Liu, L. Q.

    2017-02-01

    Research on the optimization of helium liquefiers is limited at home and abroad, and most of the existing optimization is single-objective and based on the Collins cycle. In this paper, a multi-objective optimization is conducted using a genetic algorithm (GA) on the 40 L/h helium liquefier developed by the Technical Institute of Physics and Chemistry of the Chinese Academy of Sciences (TIPC, CAS), and steady solutions are obtained. In addition, the exergy loss of the optimized system is studied for the cases with and without liquid nitrogen pre-cooling. The results have guiding significance for the future design of large helium liquefiers.

  8. Strontium-90 Biokinetics from Simulated Wound Intakes in Non-human Primates Compared with Combined Model Predictions from National Council on Radiation Protection and Measurements Report 156 and International Commission on Radiological Protection Publication 67.

    PubMed

    Allen, Mark B; Brey, Richard R; Gesell, Thomas; Derryberry, Dewayne; Poudel, Deepesh

    2016-01-01

    This study had a goal to evaluate the predictive capabilities of the National Council on Radiation Protection and Measurements (NCRP) wound model coupled to the International Commission on Radiological Protection (ICRP) systemic model for 90Sr-contaminated wounds using non-human primate data. Studies were conducted on 13 macaque (Macaca mulatta) monkeys, each receiving one-time intramuscular injections of 90Sr solution. Urine and feces samples were collected up to 28 d post-injection and analyzed for 90Sr activity. Integrated Modules for Bioassay Analysis (IMBA) software was configured with default NCRP and ICRP model transfer coefficients to calculate predicted 90Sr intake via the wound based on the radioactivity measured in bioassay samples. The default parameters of the combined models produced adequate fits of the bioassay data, but maximum likelihood predictions of intake were overestimated by a factor of 1.0 to 2.9 when bioassay data were used as predictors. Skeletal retention was also over-predicted, suggesting an underestimation of the excretion fraction. Bayesian statistics and Monte Carlo sampling were applied using IMBA to vary the default parameters, producing updated transfer coefficients for individual monkeys that improved model fit and predicted intake and skeletal retention. The geometric means of the optimized transfer rates for the 11 cases were computed, and these optimized sample population parameters were tested on two independent monkey cases and on the 11 monkeys from which the optimized parameters were derived. The optimized model parameters did not improve the model fit in most cases, and the predicted skeletal activity produced improvements in three of the 11 cases. The optimized parameters improved the predicted intake in all cases but still over-predicted the intake by an average of 50%. The results suggest that the modified transfer rates were not always an improvement over the default NCRP and ICRP model values.

  9. Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with script programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and post-processing. First, the head script launches COMSOL with MATLAB, defines the initial values of all parameters, refers to the objective function J defined in the objective-function script, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment via the API interface), changing the defined optimization parameters so that the objective function is minimized, using the fmincon function to find a minimum of a constrained nonlinear multivariable function. Once the minimum is found, the routine returns an exit flag, terminates the optimization, and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink enhances a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL in chosen case studies in the fields of technical cybernetics and bioengineering.

  10. A Case Study on the Application of a Structured Experimental Method for Optimal Parameter Design of a Complex Control System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report documents a case study on the application of Reliability Engineering techniques to achieve an optimal balance between performance and robustness by tuning the functional parameters of a complex non-linear control system. For complex systems with intricate and non-linear patterns of interaction between system components, analytical derivation of a mathematical model of system performance and robustness in terms of functional parameters may not be feasible or cost-effective. The demonstrated approach is simple, structured, effective, repeatable, and cost and time efficient. This general approach is suitable for a wide range of systems.

  11. SPOTting Model Parameters Using a Ready-Made Python Package

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2017-04-01

    The choice for specific parameter estimation methods is often more dependent on its availability than its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms, 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel from the workstation to large computation clusters using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies to parameterize the Rosenbrock, Griewank and Ackley functions, a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function and a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having one package at hand that includes number of well performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function.
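
    A minimal setup class following the interface described in the SPOTPY documentation (parameters/simulation/evaluation/objectivefunction) is sketched below for the Rosenbrock test function mentioned above; treat it as a sketch rather than verified usage, since details such as the objective-function sign convention differ between samplers.

    ```python
    import spotpy

    class RosenbrockSetup:
        """Minimal setup object following the SPOTPY tutorial interface (assumed here)."""
        def __init__(self):
            self.params = [spotpy.parameter.Uniform('x', -5, 5),
                           spotpy.parameter.Uniform('y', -5, 5)]

        def parameters(self):
            return spotpy.parameter.generate(self.params)

        def simulation(self, vector):
            x, y = vector
            return [(1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2]   # Rosenbrock value as "model output"

        def evaluation(self):
            return [0.0]                      # the known optimum value acts as the observation

        def objectivefunction(self, simulation, evaluation):
            # Sign convention depends on the sampler; Monte Carlo-type samplers in the
            # tutorials maximize, hence the negative RMSE here.
            return -spotpy.objectivefunctions.rmse(evaluation, simulation)

    sampler = spotpy.algorithms.mc(RosenbrockSetup(), dbname='rosenbrock_mc', dbformat='csv')
    sampler.sample(2000)                      # writes sampled parameters and objectives to CSV
    ```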

  12. SPOTting Model Parameters Using a Ready-Made Python Package.

    PubMed

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2015-01-01

    The choice for specific parameter estimation methods is often more dependent on its availability than its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms, 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel from the workstation to large computation clusters using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies to parameterize the Rosenbrock, Griewank and Ackley functions, a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function and a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having one package at hand that includes number of well performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function.

  13. SPOTting Model Parameters Using a Ready-Made Python Package

    PubMed Central

    Houska, Tobias; Kraft, Philipp; Chamorro-Chavez, Alejandro; Breuer, Lutz

    2015-01-01

    The choice for specific parameter estimation methods is often more dependent on its availability than its performance. We developed SPOTPY (Statistical Parameter Optimization Tool), an open source python package containing a comprehensive set of methods typically used to calibrate, analyze and optimize parameters for a wide range of ecological models. SPOTPY currently contains eight widely used algorithms, 11 objective functions, and can sample from eight parameter distributions. SPOTPY has a model-independent structure and can be run in parallel from the workstation to large computation clusters using the Message Passing Interface (MPI). We tested SPOTPY in five different case studies to parameterize the Rosenbrock, Griewank and Ackley functions, a one-dimensional physically based soil moisture routine, where we searched for parameters of the van Genuchten-Mualem function and a calibration of a biogeochemistry model with different objective functions. The case studies reveal that the implemented SPOTPY methods can be used for any model with just a minimal amount of code for maximal power of parameter optimization. They further show the benefit of having one package at hand that includes number of well performing parameter search methods, since not every case study can be solved sufficiently with every algorithm or every objective function. PMID:26680783

  14. By Force or by Fraud: Optimizing U.S. Information Strategy With Deception

    DTIC Science & Technology

    2016-06-01

    Indexed excerpt (table-of-contents and abbreviation fragments): Case-Study Assessments, including Case 1 Overview: The Dhofar Rebellion, 1965; abbreviations SOF (Special Operations Forces), SOG (Studies and Observations Group), USIA (United States Information Agency), VC (Viet Cong); and a footnote citing "The Development of Overt and Covert Propaganda Strategies," Presidential Studies Quarterly 24, no. 2 (Spring 1994): 265.

  15. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
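
    The coupling of a quadratic response surface (single factors, squares and two-factor interactions) with Monte Carlo sampling of uncertain inputs can be sketched compactly. The coefficients, uncertainty levels and mixture ranges below are hypothetical placeholders, not values from the study.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical quadratic response surface for regression rate as a function of
      # two mixture fractions (x1, x2): single factors, squares, and a two-factor interaction.
      beta = {"b0": 1.2, "b1": 0.8, "b2": 0.5, "b11": -0.3, "b22": -0.2, "b12": 0.4}

      def regression_rate(x1, x2):
          return (beta["b0"] + beta["b1"] * x1 + beta["b2"] * x2
                  + beta["b11"] * x1 ** 2 + beta["b22"] * x2 ** 2
                  + beta["b12"] * x1 * x2)

      def dispersed_rate(x1, x2, n=10_000, sigma_rel=0.05):
          """Monte Carlo dispersion: perturb the nominal inputs with assumed
          relative uncertainties and collect the resulting response distribution."""
          samples = regression_rate(x1 * (1 + sigma_rel * rng.standard_normal(n)),
                                    x2 * (1 + sigma_rel * rng.standard_normal(n)))
          return samples.mean(), samples.std()

      # Crude optimization under uncertainty: grid-search the mixture space and
      # rank candidates by mean response minus a penalty on its spread.
      grid = np.linspace(0.0, 1.0, 21)
      best = max(((x1, x2, *dispersed_rate(x1, x2))
                  for x1 in grid for x2 in grid if x1 + x2 <= 1.0),
                 key=lambda t: t[2] - t[3])
      print(f"best mixture x1={best[0]:.2f}, x2={best[1]:.2f}, "
            f"mean rate={best[2]:.3f}, std={best[3]:.3f}")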

  16. Economic and environmental optimization of waste treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Münster, M.; Ravn, H.; Hedegaard, K.

    2015-04-15

    Highlights: • Optimizing waste treatment by incorporating LCA methodology. • Applying different objectives (minimizing costs or GHG emissions). • Prioritizing multiple objectives given different weights. • Optimum depends on objective and assumed displaced electricity production. - Abstract: This article presents the new systems engineering optimization model, OptiWaste, which incorporates a life cycle assessment (LCA) methodology and captures important characteristics of waste management systems. As part of the optimization, the model identifies the most attractive waste management options. The model renders it possible to apply different optimization objectives such as minimizing costs or greenhouse gas emissions or to prioritize several objectives given different weights. A simple illustrative case is analysed, covering alternative treatments of one tonne of residual household waste: incineration of the full amount or sorting out organic waste for biogas production for either combined heat and power generation or as fuel in vehicles. The case study illustrates that the optimal solution depends on the objective and assumptions regarding the background system – illustrated with different assumptions regarding displaced electricity production. The article shows that it is feasible to combine LCA methodology with optimization. Furthermore, it highlights the need for including the integrated waste and energy system into the model.
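
    A toy illustration of the weighted-objective idea follows: a few treatment options with invented cost, greenhouse-gas and electricity-recovery figures, where changing the weights or the assumed carbon intensity of displaced grid electricity changes which option is optimal. None of the numbers come from the OptiWaste study.

      # Hypothetical per-tonne figures; not data from the OptiWaste study.
      options = {
          "incineration":        {"cost": 80.0, "ghg": 120.0, "elec_out_kwh": 600.0},
          "biogas_chp":          {"cost": 95.0, "ghg":  90.0, "elec_out_kwh": 250.0},
          "biogas_vehicle_fuel": {"cost": 90.0, "ghg":  70.0, "elec_out_kwh":   0.0},
      }

      def best_option(w_cost, w_ghg, grid_intensity=0.3):
          """Minimize a weighted sum of cost and net GHG, where net GHG credits each
          option for the grid electricity it displaces (kg CO2-eq per kWh assumed)."""
          def score(o):
              net_ghg = o["ghg"] - grid_intensity * o["elec_out_kwh"]
              return w_cost * o["cost"] + w_ghg * net_ghg
          return min(options, key=lambda name: score(options[name]))

      print(best_option(w_cost=1.0, w_ghg=0.0))                      # cost-only objective
      print(best_option(w_cost=0.0, w_ghg=1.0, grid_intensity=0.8))  # GHG-only, carbon-heavy grid
      print(best_option(w_cost=0.0, w_ghg=1.0, grid_intensity=0.05)) # GHG-only, low-carbon grid

    With the invented figures, the cost-only objective and the GHG objective under a carbon-heavy grid both favour incineration, while a low-carbon grid shifts the GHG optimum to biogas as vehicle fuel, mirroring the abstract's point that the optimum depends on the objective and the assumed displaced electricity production.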

  17. Optimal charges in lead progression: a structure-based neuraminidase case study.

    PubMed

    Armstrong, Kathryn A; Tidor, Bruce; Cheng, Alan C

    2006-04-20

    Collective experience in structure-based lead progression has found electrostatic interactions to be more difficult to optimize than shape-based ones. A major reason for this is that the net electrostatic contribution observed includes a significant nonintuitive desolvation component in addition to the more intuitive intermolecular interaction component. To investigate whether knowledge of the ligand optimal charge distribution can facilitate more intuitive design of electrostatic interactions, we took a series of small-molecule influenza neuraminidase inhibitors with known protein cocrystal structures and calculated the difference between the optimal and actual charge distributions. This difference from the electrostatic optimum correlates with the calculated electrostatic contribution to binding (r² = 0.94) despite small changes in binding modes caused by chemical substitutions, suggesting that the optimal charge distribution is a useful design goal. Furthermore, detailed suggestions for chemical modification generated by this approach are in many cases consistent with observed improvements in binding affinity, and the method appears to be useful despite discrete chemical constraints. Taken together, these results suggest that charge optimization is useful in facilitating generation of compound ideas in lead optimization. Our results also provide insight into design of neuraminidase inhibitors.
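
    For readers unfamiliar with the underlying formalism, the charge-optimization idea can be summarized as follows. This is a standard continuum-electrostatics sketch rather than equations reproduced from the paper: the electrostatic binding free energy is quadratic in the ligand partial charges, so a closed-form optimum exists.

      \Delta G_{\mathrm{elec}}(q) = q^{\mathsf T} A\, q + b^{\mathsf T} q + c,
      \qquad q_{\mathrm{opt}} = -\tfrac{1}{2} A^{-1} b,
      \qquad \Delta G_{\mathrm{elec}}(q) - \Delta G_{\mathrm{elec}}(q_{\mathrm{opt}})
             = (q - q_{\mathrm{opt}})^{\mathsf T} A\, (q - q_{\mathrm{opt}}),

    where q is the vector of ligand partial charges, A the positive-definite desolvation matrix and b collects the ligand-receptor interaction terms. The residual quadratic form on the right is one natural measure of the "difference from the electrostatic optimum" that the abstract reports as correlating with the computed electrostatic contribution to binding.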

  18. Shape Optimization of Rubber Bushing Using Differential Evolution Algorithm

    PubMed Central

    2014-01-01

    The objective of this study is to design a rubber bushing with the desired stiffness characteristics in order to achieve the required ride quality of the vehicle. A differential evolution algorithm based approach is developed to optimize the rubber bushing by integrating a finite element code running in batch mode to compute the objective function values for each generation. Two case studies are given to illustrate the application of the proposed approach. Optimum shape parameters of the 2D bushing model were determined by shape optimization using the differential evolution algorithm. PMID:25276848
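
    As an illustration of the differential evolution approach, decoupled from any finite element code, the sketch below tunes two hypothetical shape parameters toward a target stiffness. The surrogate stiffness function, target value and bounds are invented stand-ins for the batch-mode finite element evaluation used in the study.

      import numpy as np
      from scipy.optimize import differential_evolution

      TARGET_STIFFNESS = 250.0  # hypothetical desired radial stiffness (N/mm)

      def surrogate_stiffness(shape):
          """Stand-in for the finite element evaluation of a 2D bushing model:
          maps shape parameters (inner radius, rubber thickness) to a stiffness value."""
          r_inner, thickness = shape
          return 400.0 * r_inner / (thickness + 0.5)

      def objective(shape):
          # Squared deviation from the target stiffness, as in a calibration-style objective.
          return (surrogate_stiffness(shape) - TARGET_STIFFNESS) ** 2

      bounds = [(5.0, 20.0),   # inner radius (mm)
                (2.0, 15.0)]   # rubber thickness (mm)

      result = differential_evolution(objective, bounds, seed=1, tol=1e-8)
      print("optimum shape parameters:", result.x, "objective:", result.fun)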

  19. A flowsheet model of a well-mixed fluidized bed dryer: Applications in controllability assessment and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langrish, T.A.G.; Harvey, A.C.

    2000-01-01

    A model of a well-mixed fluidized-bed dryer within a process flowsheeting package (SPEEDUP™) has been developed and applied to a parameter sensitivity study, a steady-state controllability analysis and an optimization study. This approach is more general and would be more easily applied to a complex flowsheet than one which relied on stand-alone dryer modeling packages. The simulation has shown that industrial data may be fitted to the model outputs with sensible values of unknown parameters. For this case study, the parameter sensitivity study has found that the heat loss from the dryer and the critical moisture content of the material have the greatest impact on the dryer operation at the current operating point. An optimization study has demonstrated the dominant effect of the heat loss from the dryer on the current operating cost and the current operating conditions, and substantial cost savings (around 50%) could be achieved with a well-insulated and airtight dryer, for the specific case studied here.

  20. Optimal trajectories for the aeroassisted flight experiment. Part 4: Data, tables, and graphs

    NASA Technical Reports Server (NTRS)

    Miele, A.; Wang, T.; Lee, W. Y.; Wang, H.; Wu, G. D.

    1989-01-01

    The determination of optimal trajectories for the aeroassisted flight experiment (AFE) is discussed. Data, tables, and graphs relative to the following transfers are presented: (IA) indirect ascent to a 178 NM perigee via a 197 NM apogee; and (DA) direct ascent to a 178 NM apogee. For both transfers, two cases are investigated: (1) the bank angle is continuously variable; and (2) the trajectory is divided into segments along which the bank angle is constant. For case (2), the following subcases are studied: two segments, three segments, four segments, and five segments; because the time duration of each segment is optimized, the above subcases involve four, six, eight, and ten parameters, respectively. Presented here are systematic data on a total of ten optimal trajectories (OT), five for Transfer IA and five for Transfer DA. For comparison purposes and only for Transfer IA, a five-segment reference trajectory RT is also considered.

  1. Control of three-dimensional waves on thin liquid films. I - Optimal control and transverse mode effects

    NASA Astrophysics Data System (ADS)

    Tomlin, Ruben; Gomes, Susana; Pavliotis, Greg; Papageorgiou, Demetrios

    2017-11-01

    We consider a weakly nonlinear model for interfacial waves on three-dimensional thin films on inclined flat planes - the Kuramoto-Sivashinsky equation. The flow is driven by gravity, and is allowed to be overlying or hanging on the flat substrate. Blowing and suction controls are applied at the substrate surface. In this talk we explore the instability of the transverse modes for hanging arrangements, which are unbounded and grow exponentially. The structure of the equations allows us to construct optimal transverse controls analytically to prevent this transverse growth. In this case and the case of an overlying film, we additionally study the influence of controlling to non-trivial transverse states on the streamwise and mixed mode dynamics. Finally, we solve the full optimal control problem by deriving the first order necessary conditions for existence of an optimal control, and solving these numerically using the forward-backward sweep method.
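
    For context, a commonly used weakly nonlinear model of this type is a two-dimensional Kuramoto-Sivashinsky equation with a distributed control term. The form below is a generic sketch, not quoted from the talk; the coefficients, and in particular the transverse second-derivative term whose character distinguishes overlying from hanging films, depend on the specific flow configuration.

      \partial_t u + u\,\partial_x u + \partial_{xx} u + \delta\,\partial_{yy} u + \nabla^4 u = f(x, y, t),

    where u(x, y, t) is the scaled interface deflection, f is the blowing/suction control applied at the substrate, and δ is a parameter whose sign and magnitude reflect the inclination and whether the film is overlying or hanging; the exponential growth of transverse modes in the hanging case is what the analytically constructed transverse controls are designed to suppress.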

  2. Optimal Analyses for 3×n AB Games in the Worst Case

    NASA Astrophysics Data System (ADS)

    Huang, Li-Te; Lin, Shun-Shii

    The past decades have witnessed a growing interest in research on deductive games such as Mastermind and AB game. Because of the complicated behavior of deductive games, tree-search approaches are often adopted to find their optimal strategies. In this paper, a generalized version of deductive games, called 3×n AB games, is introduced. However, traditional tree-search approaches are not appropriate for solving this problem since it can only solve instances with smaller n. For larger values of n, a systematic approach is necessary. Therefore, intensive analyses of playing 3×n AB games in the worst case optimally are conducted and a sophisticated method, called structural reduction, which aims at explaining the worst situation in this game is developed in the study. Furthermore, a worthwhile formula for calculating the optimal numbers of guesses required for arbitrary values of n is derived and proven to be final.

  3. Injection current minimization of InAs/InGaAs quantum dot laser by optimization of its active region and reflectivity of laser cavity edges

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Maximov, M. V.

    2015-11-01

    Ways to optimize key parameters of the active region and the edge reflectivity of an edge-emitting semiconductor quantum dot laser are provided. It is shown that, in the case of an optimal cavity length and sufficiently large dispersion, a lasing spectrum of a given width can be obtained at an injection current up to an order of magnitude lower than that of a non-optimized sample. The influence of internal loss and edge reflection is also studied in detail.

  4. Optimism and Physical Health: A Meta-analytic Review

    PubMed Central

    Rasmussen, Heather N.; Greenhouse, Joel B.

    2010-01-01

    Background Prior research links optimism to physical health, but the strength of the association has not been systematically evaluated. Purpose The purpose of this study is to conduct a meta-analytic review to determine the strength of the association between optimism and physical health. Methods The findings from 83 studies, with 108 effect sizes (ESs), were included in the analyses, using random-effects models. Results Overall, the mean ES characterizing the relationship between optimism and physical health outcomes was 0.17, p<.001. ESs were larger for studies using subjective (versus objective) measures of physical health. Subsidiary analyses were also conducted grouping studies into those that focused solely on mortality, survival, cardiovascular outcomes, physiological markers (including immune function), immune function only, cancer outcomes, outcomes related to pregnancy, physical symptoms, or pain. In each case, optimism was a significant predictor of health outcomes or markers, all p<.001. Conclusions Optimism is a significant predictor of positive physical health outcomes. PMID:19711142

  5. Design optimization of single mixed refrigerant LNG process using a hybrid modified coordinate descent algorithm

    NASA Astrophysics Data System (ADS)

    Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong

    2018-01-01

    Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to irreversibility, which deteriorates the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm was proposed to cope with the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed optimization algorithm provided an improved result compared to the other existing methodologies for finding the optimal condition of the complex mixed refrigerant natural gas liquefaction process. By applying the proposed optimization algorithm, the SMR process can be designed with a specific compression power of 0.2555 kW, equivalent to a 44.3% energy saving compared to the base case. Furthermore, the coefficient of performance (COP) can be enhanced by up to 34.7% compared to the base case. The proposed optimization algorithm provides a deep understanding of the optimization of the liquefaction process from both technical and numerical perspectives. In addition, the HMCD algorithm can be applied to any mixed refrigerant based liquefaction process in the natural gas industry.
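
    The HMCD algorithm itself is not reproduced here, but the basic coordinate descent idea it builds on can be sketched generically: improve one decision variable at a time while holding the others fixed, and shrink the step when a full cycle brings no improvement. Everything below, including the smooth test objective standing in for the simulator-evaluated compression power, is illustrative rather than the authors' implementation.

      import numpy as np

      def objective(x):
          """Hypothetical smooth, non-linear stand-in for a simulator-evaluated
          objective such as specific compression power."""
          return (x[0] - 1) ** 2 + 2 * (x[1] - 2) ** 2 + (x[2] + 1) ** 2 + 0.5 * x[0] * x[1]

      def coordinate_descent(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_cycles=200):
          x = np.asarray(x0, dtype=float)
          fx = f(x)
          while step > tol and max_cycles > 0:
              improved = False
              for i in range(len(x)):          # one coordinate at a time
                  for direction in (+1.0, -1.0):
                      trial = x.copy()
                      trial[i] += direction * step
                      f_trial = f(trial)
                      if f_trial < fx:
                          x, fx, improved = trial, f_trial, True
              if not improved:
                  step *= shrink               # refine the step when a full cycle stalls
              max_cycles -= 1
          return x, fx

      x_opt, f_opt = coordinate_descent(objective, x0=[0.0, 0.0, 0.0])
      print("decision variables:", x_opt, "objective:", f_opt)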

  6. Impacts of Valuing Resilience on Cost-Optimal PV and Storage Systems for Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laws, Nicholas D; Anderson, Katherine H; DiOrio, Nicholas A

    Decreasing electric grid reliability in the US, along with increasing severe weather events, has greatly increased interest in resilient energy systems. Few studies have included the value of resilience when sizing PV and Battery Energy Storage Systems (BESS), and none have included the cost to island a PV and BESS, grid-connected costs and benefits, and the value of resilience. This work presents a novel method for incorporating the value of resilience provided by a PV and BESS into a techno-economic optimization model. Including the value of resilience in the design of a cost-optimal PV and BESS generally increases the system capacities, and in some cases makes a system economical where it was not before. For example, for a large hotel in Anaheim, CA, no system is economical without resilience valued; however, with a $5317/hr value of resilience, a 363 kW and 60 kWh solar and BESS provides a net present value of $50,000. Lastly, we discuss the effect of the 'islandable premium', which must be balanced against the benefits from serving critical loads during outages. Case studies show that the islandable premium can vary widely, which highlights the necessity for case-by-case solutions in a rapidly developing market.

  7. Production scheduling with ant colony optimization

    NASA Astrophysics Data System (ADS)

    Chernigovskiy, A. S.; Kapulin, D. V.; Noskova, E. E.; Yamskikh, T. N.; Tsarev, R. Yu

    2017-10-01

    The optimum solution of the production scheduling problem for manufacturing processes at an enterprise is crucial, as it allows one to obtain the required amount of production within a specified time frame. An optimum production schedule can be found using a variety of optimization or scheduling algorithms. Ant colony optimization is one of the well-known techniques for solving global multi-objective optimization problems. In this article, the authors present a solution of the production scheduling problem by means of an ant colony optimization algorithm. A case study in which the algorithm's efficiency is estimated against some other production scheduling algorithms is presented. Advantages of the ant colony optimization algorithm and its beneficial effect on the manufacturing process are provided.
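
    For readers who have not seen ant colony optimization applied to scheduling, the sketch below builds job sequences for a toy single-machine problem (minimizing total weighted completion time) using the usual pheromone/heuristic construction rule with evaporation and best-so-far reinforcement. The processing times, weights and ACO parameters are illustrative and unrelated to the case study in the article.

      import random

      # Toy single-machine problem: job -> (processing time, weight); values are illustrative.
      jobs = {0: (4, 1), 1: (2, 3), 2: (6, 2), 3: (3, 5), 4: (5, 1)}

      def weighted_completion_time(sequence):
          t = total = 0
          for j in sequence:
              p, w = jobs[j]
              t += p
              total += w * t
          return total

      def ant_colony(n_ants=20, n_iters=100, alpha=1.0, beta=2.0, rho=0.1, seed=7):
          rng = random.Random(seed)
          n = len(jobs)
          # pheromone[i][j]: desirability of scheduling job j in position i
          pheromone = [[1.0] * n for _ in range(n)]
          heuristic = [jobs[j][1] / jobs[j][0] for j in range(n)]  # weight/time (WSPT rule)
          best_seq, best_cost = None, float("inf")
          for _ in range(n_iters):
              for _ in range(n_ants):
                  remaining, seq = set(range(n)), []
                  for pos in range(n):
                      weights = [(pheromone[pos][j] ** alpha) * (heuristic[j] ** beta)
                                 if j in remaining else 0.0 for j in range(n)]
                      j = rng.choices(range(n), weights=weights)[0]
                      seq.append(j)
                      remaining.remove(j)
                  cost = weighted_completion_time(seq)
                  if cost < best_cost:
                      best_seq, best_cost = seq, cost
              # evaporation plus reinforcement along the best-so-far sequence
              for pos in range(n):
                  for j in range(n):
                      pheromone[pos][j] *= (1.0 - rho)
              for pos, j in enumerate(best_seq):
                  pheromone[pos][j] += 1.0 / best_cost
          return best_seq, best_cost

      print(ant_colony())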

  8. Optimal management of on-farm resources in small-scale dairy systems of Central Mexico: model development and evaluation.

    PubMed

    Castelán-Ortega, Octavio Alonso; Martínez-García, Carlos Galdino; Mould, Fergus L; Dorward, Peter; Rehman, Tahir; Rayas-Amor, Adolfo Armando

    2016-06-01

    This study evaluates the available on-farm resources of five case studies typified as small-scale dairy systems in central Mexico. A comprehensive mixed-integer linear programming model was developed and applied to two case studies. The optimal plan suggested the following: (1) introduction and utilization of maize silage; (2) alfalfa hay making, which added US$140/ha/cut to the total net income; (3) allocation of land to cultivated pastures in a ratio of 27:41 (cultivated pastures/maize crop) rather than the current 14:69, with dairy cattle grazing 12 h/day; (4) avoiding grazing of communal pastures, because this activity represented an opportunity cost of family labor that reduced the farm net income; and (5) that the highest farm net income was obtained when liquid milk and yogurt sales were included in the optimal plan. In the context of small-scale dairy systems of central Mexico, the optimal plan would need to be implemented gradually to enable farmers to develop the required skills and to change management strategies from reliance on forage and purchased concentrate to pasture-based and conserved forage systems.
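
    The general structure of such a mixed-integer linear programming farm model can be indicated schematically; the formulation below is a generic orientation aid, not the authors' model.

      \max_{x \ge 0,\; y \in \{0,1\}^m} \; (c - d)^{\mathsf T} x
      \quad \text{s.t.} \quad A x \le b, \qquad x \le M y,

    where x are continuous activity levels (for example hectares of maize, pasture and alfalfa, hours of grazing, litres of milk or yogurt sold), y are binary adoption decisions (for example whether to make silage or to graze communal pastures), c and d are revenue and cost coefficients, A x ≤ b encodes the land, labor and feed balances, and M is a big-M constant linking activity levels to the adopted practices.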

  9. Watershed Management Optimization Support Tool (WMOST) v1: User Manual and Case Study Examples

    EPA Science Inventory

    The Watershed Management Optimization Support Tool (WMOST) is intended to be used as a screening tool as part of an integrated watershed management process such as that described in EPA’s watershed planning handbook (EPA 2008).1 The objective of WMOST is to serve as a public-doma...

  10. Longitudinal Examination of Optimism, Personal Self-Efficacy and Student Well-Being: A Path Analysis

    ERIC Educational Resources Information Center

    Phan, Huy P.

    2016-01-01

    The present longitudinal study, based on existing theoretical tenets, explored a conceptual model that depicted four major orientations: optimism, self-efficacy, and academic well-being. An important question for consideration, in this case, involved the testing of different untested trajectories that could explain and predict individuals'…

  11. An optimized procedure for obtaining DNA from fired and unfired ammunition.

    PubMed

    Montpetit, Shawn; O'Donnell, Patrick

    2015-07-01

    Gun crimes are a significant problem facing law enforcement agencies. Traditional forensic examination of firearms involves comparisons of markings imparted to bullets and cartridge casings during the firing process. DNA testing of casings and cartridges may not be routinely done in crime laboratories due to a variety of factors, including the typically low amounts of DNA recovered. The San Diego Police Department (SDPD) Crime Laboratory conducted a study to optimize the collection and profiling of DNA from fired and unfired ammunition. The method was optimized to the point where interpretable DNA results were obtained for 26.1% of the total number of forensic casework evidence samples, and provided some insights into the level of secondary transfer that might be expected from this type of evidence. Briefly detailed are the results from the experimental study and the forensic casework analysis using the optimized process. Mixtures (samples having more DNA types than the loader's known genotype detected or visible at any marker) were obtained in 39.8% of research samples, and the likely source of DNA mixtures is discussed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Optimization of CO2 Storage in Saline Aquifers Using Water-Alternating Gas (WAG) Scheme - Case Study for Utsira Formation

    NASA Astrophysics Data System (ADS)

    Agarwal, R. K.; Zhang, Z.; Zhu, C.

    2013-12-01

    For optimization of CO2 storage and reduced CO2 plume migration in saline aquifers, a genetic algorithm (GA) based optimizer has been developed which is combined with the DOE multi-phase flow and heat transfer numerical simulation code TOUGH2. Designated as GA-TOUGH2, this combined solver/optimizer has been verified by performing optimization studies on a number of model problems and comparing the results with brute-force optimization, which requires a large number of simulations. Using GA-TOUGH2, an innovative reservoir engineering technique known as water-alternating-gas (WAG) injection has been investigated to determine the optimal WAG operation for enhanced CO2 storage capacity. The topmost layer (layer #9) of the Utsira formation at the Sleipner Project, Norway, is considered as a case study. A cylindrical domain, which possesses identical characteristics to the detailed 3D Utsira Layer #9 model except for the absence of 3D topography, was used. Topographical details are known to be important in determining the CO2 migration at Sleipner, and are considered in our companion model for history matching of the CO2 plume migration at Sleipner. However, simplification of the topography here, without compromising accuracy, is necessary to analyze the effectiveness of the WAG operation on CO2 migration without incurring excessive computational cost. The selected WAG operation can then be simulated with full topographic details later. We consider a cylindrical domain with a thickness of 35 m and a horizontal flat caprock. All hydrogeological properties are retained from the detailed 3D Utsira Layer #9 model, the most important being the horizontal-to-vertical permeability ratio of 10. Constant Gas Injection (CGI) operation with a nine-year average CO2 injection rate of 2.7 kg/s is considered as the baseline case for comparison. The 30-day, 15-day, and 5-day WAG cycle durations are considered for the WAG optimization design. Our computations show that for the simplified Utsira Layer #9 model, the WAG operation with a 5-day cycle leads to the most noticeable reduction in plume migration. For the 5-day WAG cycle, the values of the design variables corresponding to the optimal WAG operation are found to be an optimal CO2 injection I_CO2,optimal = 11.56 kg/s and an optimal water injection I_water,optimal = 7.62 kg/s. The durations of CO2 and water injection in one WAG cycle are 11 and 19 days, respectively. Identical WAG cycles are repeated 20 times to complete a two-year operation. A significant reduction (22%) in CO2 migration is achieved compared to CGI operation after only two years of WAG operation. In addition, CO2 dissolution is also significantly enhanced, from about 9% to 22% of the total injected CO2. The results obtained from this and other optimization studies suggest that over 50% reduction of the in situ CO2 footprint, greatly enhanced CO2 dissolution, and significantly improved well injectivity can be achieved by employing GA-TOUGH2. The optimization code has also been employed to determine the optimal well placement in a multi-well injection operation. GA-TOUGH2 appears to hold great promise for studying a host of other optimization problems related to Carbon Storage.

  13. Case study on impact performance optimization of hydraulic breakers.

    PubMed

    Noh, Dae-Kyung; Kang, Young-Ky; Cho, Jae-Sang; Jang, Joo-Sup

    2016-01-01

    In order to expand the range of activities of an excavator, attachments, such as hydraulic breakers have been developed to be applied to buckets. However, it is very difficult to predict the dynamic behavior of hydraulic impact devices such as breakers because of high non-linearity. Thus, the purpose of this study is to optimize the impact performance of hydraulic breakers. The ultimate goal of the optimization is to increase the impact energy and impact frequency and to reduce the pressure pulsation of the supply and return lines. The optimization results indicated that the four parameters used to optimize the impact performance of the breaker showed considerable improvement over the results reported in the literature. A test was also conducted and the results were compared with those obtained through optimization in order to verify the optimization results. The comparison showed an average relative error of 8.24 %, which seems to be in good agreement. The results of this study can be used to optimize the impact performance of hydraulic impact devices such as breakers, thus facilitating its application to excavators and increasing the range of activities of an excavator.

  14. On optimal current patterns for electrical impedance tomography.

    PubMed

    Demidenko, Eugene; Hartov, Alex; Soni, Nirmal; Paulsen, Keith D

    2005-02-01

    We develop a statistical criterion for optimal patterns in planar circular electrical impedance tomography. These patterns minimize the total variance of the estimation for the resistance or conductance matrix. It is shown that trigonometric patterns (Isaacson, 1986), originally derived from the concept of distinguishability, are a special case of our optimal statistical patterns. New optimal random patterns are introduced. Recovering the electrical properties of the measured body is greatly simplified when optimal patterns are used. The Neumann-to-Dirichlet map and the optimal patterns are derived for a homogeneous medium with an arbitrary distribution of the electrodes on the periphery. As a special case, optimal patterns are developed for a practical EIT system with a finite number of electrodes. For a general nonhomogeneous medium, with no a priori restriction, the optimal patterns for the resistance and conductance matrix are the same. However, for a homogeneous medium, the best current pattern is the worst voltage pattern and vice versa. We study the effect of the number and the width of the electrodes on the estimate of resistivity and conductivity in a homogeneous medium. We confirm experimentally that the optimal patterns produce minimum conductivity variance in a homogeneous medium. Our statistical model is able to discriminate between a homogeneous agar phantom and one with a 2 mm air hole with error probability (p-value) 1/1000.

  15. Parameter assessment for virtual Stackelberg game in aerodynamic shape optimization

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Xie, Fangfang; Zheng, Yao; Zhang, Jifa

    2018-05-01

    In this paper, parametric studies of the virtual Stackelberg game (VSG) are conducted to assess the impact of critical parameters on aerodynamic shape optimization, including the design cycle, the split of design variables and role assignment. Typical numerical cases, including the inverse design and drag reduction design of an airfoil, have been carried out. The numerical results confirm the effectiveness and efficiency of VSG. Furthermore, the most significant parameters are identified; e.g., increasing the number of design cycles can improve the optimization results but also adds computational burden. These studies will maximize the productivity of the effort in aerodynamic optimization for more complicated engineering problems, such as multi-element airfoil and wing-body configurations.

  16. Full Monte Carlo-Based Biologic Treatment Plan Optimization System for Intensity Modulated Carbon Ion Therapy on Graphics Processing Unit.

    PubMed

    Qin, Nan; Shen, Chenyang; Tsai, Min-Yu; Pinto, Marco; Tian, Zhen; Dedes, Georgios; Pompos, Arnold; Jiang, Steve B; Parodi, Katia; Jia, Xun

    2018-01-01

    One of the major benefits of carbon ion therapy is enhanced biological effectiveness at the Bragg peak region. For intensity modulated carbon ion therapy (IMCT), it is desirable to use Monte Carlo (MC) methods to compute the properties of each pencil beam spot for treatment planning, because of their accuracy in modeling physics processes and estimating biological effects. We previously developed goCMC, a graphics processing unit (GPU)-oriented MC engine for carbon ion therapy. The purpose of the present study was to build a biological treatment plan optimization system using goCMC. The repair-misrepair-fixation model was implemented to compute the spatial distribution of linear-quadratic model parameters for each spot. A treatment plan optimization module was developed to minimize the difference between the prescribed and actual biological effect. We used a gradient-based algorithm to solve the optimization problem. The system was embedded in the Varian Eclipse treatment planning system under a client-server architecture to achieve a user-friendly planning environment. We tested the system with a 1-dimensional homogeneous water case and three 3-dimensional patient cases. Our system generated treatment plans with biological spread-out Bragg peaks covering the targeted regions and sparing critical structures. Using 4 NVidia GTX 1080 GPUs, the total computation time, including spot simulation, optimization, and final dose calculation, was 0.6 hour for the prostate case (8282 spots), 0.2 hour for the pancreas case (3795 spots), and 0.3 hour for the brain case (6724 spots). The computation time was dominated by MC spot simulation. We built a biological treatment plan optimization system for IMCT that performs simulations using a fast MC engine, goCMC. To the best of our knowledge, this is the first time that full MC-based IMCT inverse planning has been achieved in a clinically viable time frame. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Optimization Under Uncertainty of Site-Specific Turbine Configurations

    NASA Astrophysics Data System (ADS)

    Quick, J.; Dykes, K.; Graf, P.; Zahle, F.

    2016-09-01

    Uncertainty affects many aspects of wind energy plant performance and cost. In this study, we explore opportunities for site-specific turbine configuration optimization that accounts for uncertainty in the wind resource. As a demonstration, a simple empirical model for wind plant cost of energy is used in an optimization under uncertainty to examine how different risk appetites affect the optimal selection of a turbine configuration for sites of different wind resource profiles. If there is unusually high uncertainty in the site wind resource, the optimal turbine configuration diverges from the deterministic case and a generally more conservative design is obtained with increasing risk aversion on the part of the designer.

  18. [AWAKE CRANIOTOMY: IN SEARCH FOR OPTIMAL SEDATION].

    PubMed

    Kulikova, A S; Sel'kov, D A; Kobyakov, G L; Shmigel'skiy, A V; Lubnin, A Yu

    2015-01-01

    Awake craniotomy is the "gold standard" for intraoperative brain language mapping. One of the main anesthetic challenges of awake craniotomy is providing optimal sedation for the initial stages of the intervention. The goal of this study was to compare different techniques of anesthesia for awake craniotomy. Materials and methods: 162 operations were divided into 4 groups: 76 cases with propofol sedation (2-4 mg/kg/h) without airway protection; 11 cases with propofol sedation (4-5 mg/kg/h) with mechanical ventilation via a laryngeal mask airway (LMA); 36 cases of xenon anesthesia; and 39 cases with dexmedetomidine sedation without airway protection. Results and discussion: brain language mapping was successful in 90% of cases. There was no difference between the groups in the success of brain mapping; however, respiratory complications were more frequent in the first group, and the three other techniques were safer. Xenon anesthesia was associated with ultrafast awakening for mapping (5±1 min). Dexmedetomidine sedation provided high hemodynamic and respiratory stability during the procedure.

  19. Multiobjective optimization model of intersection signal timing considering emissions based on field data: A case study of Beijing.

    PubMed

    Kou, Weibin; Chen, Xumei; Yu, Lei; Gong, Huibo

    2018-04-18

    Most existing signal timing models aim to minimize the total delay and stops at intersections, without considering environmental factors. This paper analyzes the trade-off between vehicle emissions and traffic efficiency on the basis of field data. First, considering the different operating modes of cruising, acceleration, deceleration, and idling, field data on emissions and Global Positioning System (GPS) records are collected to estimate emission rates for heavy-duty and light-duty vehicles. Second, a multiobjective signal timing optimization model is established based on a genetic algorithm to minimize delay, stops, and emissions. Finally, a case study is conducted in Beijing. Nine scenarios are designed considering different weights of emissions and traffic efficiency. The results, compared with those using the Highway Capacity Manual (HCM) 2010, show that signal timing optimized by the model proposed in this paper can decrease vehicle delay and emissions more significantly. The optimization model can be applied in different cities, which provides support for eco-signal design and development. Vehicle emissions are heavy at signalized intersections in urban areas. The multiobjective signal timing optimization model is proposed considering the trade-off between vehicle emissions and traffic efficiency on the basis of field data. The results indicate that signal timing optimized by the model proposed in this paper can decrease vehicle emissions and delays more significantly. The optimization model can be applied in different cities, which provides support for eco-signal design and development.

  20. Parallel and Preemptable Dynamically Dimensioned Search Algorithms for Single and Multi-objective Optimization in Water Resources

    NASA Astrophysics Data System (ADS)

    Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.

    2015-12-01

    We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms, including DDS, discrete DDS, PA-DDS and DDS-AU. These parallel algorithms are unique from most existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and then sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption terminates simulation model runs early (for example, prior to completely simulating the model calibration period) when intermediate results indicate that the candidate solution is so poor that it will have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems, from multi-core desktops to a supercomputer system) and package them for future modellers within the model-independent calibration software package Ostrich, as well as in MATLAB versions. Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration, and in many cases linear or near-linear speedups are observed.
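
    As background for readers unfamiliar with DDS, the serial algorithm is compact enough to sketch: at iteration i, each decision variable joins the perturbation set with a probability that decays with ln(i), and selected variables are perturbed by a scaled normal step with reflection at the bounds, accepting only improvements. This is an illustrative re-implementation following the published algorithm description (Tolson and Shoemaker, 2007), not the Ostrich or MATLAB code packaged by the authors.

      import numpy as np

      def dds(objective, bounds, max_iter=1000, r=0.2, seed=0):
          """Dynamically Dimensioned Search, greedy serial form."""
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds, dtype=float).T
          x_best = rng.uniform(lo, hi)
          f_best = objective(x_best)
          for i in range(1, max_iter + 1):
              # probability of perturbing each dimension decays as the search progresses
              p = 1.0 - np.log(i) / np.log(max_iter)
              mask = rng.random(len(lo)) < p
              if not mask.any():
                  mask[rng.integers(len(lo))] = True   # always perturb at least one variable
              x_new = x_best.copy()
              step = r * (hi - lo) * rng.standard_normal(len(lo))
              x_new[mask] += step[mask]
              # reflect candidate values that fall outside the bounds
              x_new = np.where(x_new < lo, lo + (lo - x_new), x_new)
              x_new = np.where(x_new > hi, hi - (x_new - hi), x_new)
              x_new = np.clip(x_new, lo, hi)
              f_new = objective(x_new)
              if f_new < f_best:                       # greedy acceptance
                  x_best, f_best = x_new, f_new
          return x_best, f_best

      # Example: calibrate two parameters of a toy quadratic "model error" function.
      x, f = dds(lambda v: (v[0] - 0.3) ** 2 + (v[1] + 1.2) ** 2, bounds=[(-2, 2), (-2, 2)])
      print(x, f)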

  1. SU-E-T-422: Fast Analytical Beamlet Optimization for Volumetric Intensity-Modulated Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Kenny S K; Lee, Louis K Y; Xing, L

    2015-06-15

    Purpose: To implement a fast optimization algorithm on a CPU/GPU heterogeneous computing platform and to obtain the optimal fluence for a given target dose distribution from pre-calculated beamlets in an analytical approach. Methods: The 2D target dose distribution was modeled as an n-dimensional vector and estimated by a linear combination of independent basis vectors. The basis set was composed of the pre-calculated beamlet dose distributions at every 6 degrees of gantry angle, and the cost function was set as the magnitude square of the vector difference between the target and the estimated dose distribution. The optimal weighting of the basis, which corresponds to the optimal fluence, was obtained analytically by the least squares method. Those basis vectors with a positive weighting were selected for entering into the next level of optimization. In total, 7 levels of optimization were implemented in the study. Ten head-and-neck and ten prostate carcinoma cases were selected for the study and mapped to a round water phantom with a diameter of 20 cm. The Matlab computation was performed in a heterogeneous programming environment with an Intel i7 CPU and an NVIDIA Geforce 840M GPU. Results: In all selected cases, the estimated dose distribution was in good agreement with the given target dose distribution, and their correlation coefficients were found to be in the range of 0.9992 to 0.9997. The root-mean-square error was monotonically decreasing and converged after 7 cycles of optimization. The computation took only about 10 seconds, and the optimal fluence maps at each gantry angle throughout an arc were quickly obtained. Conclusion: An analytical approach is derived for finding the optimal fluence for a given target dose distribution, and a fast optimization algorithm implemented in the CPU/GPU heterogeneous computing environment greatly reduces the optimization time.
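
    The core of this kind of analytical approach, a linear least-squares fit of beamlet weights to a target dose vector with negatively weighted beamlets discarded between levels, can be sketched with hypothetical dimensions and random stand-in data; this is not the authors' CPU/GPU code.

      import numpy as np

      rng = np.random.default_rng(3)

      n_voxels, n_beamlets = 400, 60
      # B[:, j] is the pre-calculated dose distribution of beamlet j (hypothetical values).
      B = rng.random((n_voxels, n_beamlets))
      d_target = rng.random(n_voxels)          # prescribed target dose vector

      active = np.arange(n_beamlets)
      w = np.zeros(0)
      for level in range(7):                    # repeated pruning levels, as in the abstract
          w, *_ = np.linalg.lstsq(B[:, active], d_target, rcond=None)
          positive = w > 0.0                    # only positively weighted beamlets enter the next level
          if positive.all():
              break
          active, w = active[positive], w[positive]

      estimate = B[:, active] @ w
      rmse = np.sqrt(np.mean((estimate - d_target) ** 2))
      corr = np.corrcoef(estimate, d_target)[0, 1]
      print(f"{active.size} beamlets retained, RMSE = {rmse:.4f}, correlation = {corr:.4f}")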

  2. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    NASA Technical Reports Server (NTRS)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  3. Best Practices Case Study: Tindall Homes - Princeton, NJ, Legends at Mansfield, Columbus, NJ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2011-09-01

    Case Study of Tindall Homes, who worked with Building America to design an optimal package including advanced framing, insulated precast concrete basement walls, polyurethane foam in the walls, and R-49 of batt plus blown cellulose in the attics. Some homes included a detached garden shed with photovoltaic panel-covered roofs.

  4. Impact of Targeted Preoperative Optimization on Clinical Outcome in Emergency Abdominal Surgeries: A Prospective Randomized Trial.

    PubMed

    Sethi, Ashish; Debbarma, Miltan; Narang, Neeraj; Saxena, Anudeep; Mahobia, Mamta; Tomar, Gaurav Singh

    2018-01-01

    Perforation peritonitis continues to be one of the most common surgical emergencies that needs surgical intervention most of the time. Anesthesiologists are invariably involved in managing such cases efficiently in the perioperative period. The study assessed the Acute Physiology and Chronic Health Evaluation II (APACHE II) score at presentation and 24 h after goal-directed optimization, administration of empirical broad-spectrum antibiotics, and definitive source control postoperatively. Outcome was also assessed in terms of duration of hospital stay and mortality with or without optimization. This is a prospective, randomized, double-blind controlled study in a hospital setting. One hundred and one patients aged ≥18 years, of American Society of Anesthesiologists physical status I and II (E), with a clinical diagnosis of perforation peritonitis posted for surgery were enrolled. Enrolled patients were randomly divided into two groups. Group A was optimized using a goal-directed optimization protocol in the preoperative holding room by anesthesiology residents, whereas Group S was managed by surgery residents in the surgical wards without any fixed algorithm. The APACHE II score was assessed first on admission and again 24 h postoperatively. Duration of hospital stay and mortality in both groups were also measured and compared. Categorical data are presented as frequency counts (percent) and compared using the Chi-square or Fisher's exact test. The statistical significance for categorical variables was determined by Chi-square analysis. For continuous variables, a two-sample t-test was applied. The mean APACHE II score on admission in the case and control groups was comparable. A significant lowering of serial scores was observed in the case group as compared to the control group (P = 0.02). There was a significant lowering of the mean duration of hospital stay in the case group (9.8 ± 1.7 days) as compared to the control group (P = 0.007). Furthermore, a significant decline in the death rate was noted in the case group as compared to the control group (P = 0.03). Goal-directed optimized patients with perforation peritonitis were discharged earlier and had significantly lower mortality compared with randomly optimized patients in the perioperative period.

  5. Beam orientation optimization for intensity-modulated radiation therapy using mixed integer programming

    NASA Astrophysics Data System (ADS)

    Yang, Ruijie; Dai, Jianrong; Yang, Yong; Hu, Yimin

    2006-08-01

    The purpose of this study is to extend an algorithm proposed for beam orientation optimization in classical conformal radiotherapy to intensity-modulated radiation therapy (IMRT) and to evaluate the algorithm's performance in IMRT scenarios. In addition, the effect of the candidate pool of beam orientations, in terms of beam orientation resolution and starting orientation, on the optimized beam configuration, plan quality and optimization time is also explored. The algorithm is based on the technique of mixed integer linear programming in which binary and positive float variables are employed to represent candidates for beam orientation and beamlet weights in beam intensity maps. Both beam orientations and beam intensity maps are simultaneously optimized in the algorithm with a deterministic method. Several different clinical cases were used to test the algorithm and the results show that both target coverage and critical structures sparing were significantly improved for the plans with optimized beam orientations compared to those with equi-spaced beam orientations. The calculation time was less than an hour for the cases with 36 binary variables on a PC with a Pentium IV 2.66 GHz processor. It is also found that decreasing beam orientation resolution to 10° greatly reduced the size of the candidate pool of beam orientations without significant influence on the optimized beam configuration and plan quality, while selecting different starting orientations had large influence. Our study demonstrates that the algorithm can be applied to IMRT scenarios, and better beam orientation configurations can be obtained using this algorithm. Furthermore, the optimization efficiency can be greatly increased through proper selection of beam orientation resolution and starting beam orientation while guaranteeing the optimized beam configurations and plan quality.

  6. SU-E-T-452: Impact of Respiratory Motion On Robustly-Optimized Intensity-Modulated Proton Therapy to Treat Lung Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W; Schild, S; Bues, M

    Purpose: We compared conventionally optimized intensity-modulated proton therapy (IMPT) treatment plans against worst-case robustly optimized treatment plans for lung cancer. The comparison of the two IMPT optimization strategies focused on the resulting plans' ability to retain dose objectives under the influence of patient set-up, inherent proton range uncertainty, and dose perturbation caused by respiratory motion. Methods: For each of the 9 lung cancer cases, two treatment plans were created, accounting for treatment uncertainties in two different ways: the first used the conventional method, delivery of the prescribed dose to the planning target volume (PTV) that is geometrically expanded from the internal target volume (ITV); the second employed the worst-case robust optimization scheme that addresses set-up and range uncertainties through beamlet optimization. The plan optimality and plan robustness were calculated and compared. Furthermore, the effects on dose distributions of the changes in patient anatomy due to respiratory motion were investigated for both strategies by comparing the corresponding plan evaluation metrics at the end-inspiration and end-expiration phases and the absolute differences between these phases. The mean plan evaluation metrics of the two groups were compared using two-sided paired t-tests. Results: Without respiratory motion considered, we affirmed that worst-case robust optimization is superior to PTV-based conventional optimization in terms of plan robustness and optimality. With respiratory motion considered, robust optimization still leads to dose distributions that are more robust to respiratory motion for targets, with comparable or even better plan optimality [D95% ITV: 96.6% versus 96.1% (p=0.26), D5% - D95% ITV: 10.0% versus 12.3% (p=0.082), D1% spinal cord: 31.8% versus 36.5% (p=0.035)]. Conclusion: Worst-case robust optimization led to superior solutions for lung IMPT. Despite the fact that robust optimization did not explicitly account for respiratory motion, it produced motion-resistant treatment plans. However, further research is needed to incorporate respiratory motion into IMPT robust optimization.
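
    Schematically, worst-case robust optimization of this kind is a min-max problem over a finite set of uncertainty scenarios (set-up shifts and range over/undershoot). The notation below is generic rather than taken from the paper; one common formulation takes the maximum of the planning objective over scenarios, and a per-voxel worst-case-dose variant is also widely used.

      \min_{w \ge 0} \; \max_{s \in \mathcal{S}} \; F_s(w),
      \qquad
      F_s(w) = \sum_{i} \big( D_i^{(s)}(w) - d_i^{\mathrm{presc}} \big)^2,

    where w are the beamlet intensities, 𝒮 is the scenario set, D_i^{(s)}(w) is the dose delivered to voxel i under scenario s, and d_i^presc is the prescribed dose for that voxel.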

  7. SU-E-T-436: Fluence-Based Trajectory Optimization for Non-Coplanar VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smyth, G; Bamber, JC; Bedford, JL

    2015-06-15

    Purpose: To investigate a fluence-based trajectory optimization technique for non-coplanar VMAT for brain cancer. Methods: Single-arc non-coplanar VMAT trajectories were determined using a heuristic technique for five patients. Organ at risk (OAR) volume intersected during raytracing was minimized for two cases: absolute volume and the sum of relative volumes weighted by OAR importance. These trajectories and coplanar VMAT formed starting points for the fluence-based optimization method. Iterative least squares optimization was performed on control points 24° apart in gantry rotation. Optimization minimized the root-mean-square (RMS) deviation of PTV dose from the prescription (relative importance 100), maximum dose to the brainstem (10), optic chiasm (5), globes (5) and optic nerves (5), plus mean dose to the lenses (5), hippocampi (3), temporal lobes (2), cochleae (1) and brain excluding other regions of interest (1). Control point couch rotations were varied in steps of up to 10° and accepted if the cost function improved. Final treatment plans were optimized with the same objectives in an in-house planning system and evaluated using a composite metric - the sum of optimization metrics weighted by importance. Results: The composite metric decreased with fluence-based optimization in 14 of the 15 plans. In the remaining case its overall value, and the PTV and OAR components, were unchanged but the balance of OAR sparing differed. PTV RMS deviation was improved in 13 cases and unchanged in two. The OAR component was reduced in 13 plans. In one case the OAR component increased but the composite metric decreased - a 4 Gy increase in OAR metrics was balanced by a reduction in PTV RMS deviation from 2.8% to 2.6%. Conclusion: Fluence-based trajectory optimization improved plan quality as defined by the composite metric. While dose differences were case specific, fluence-based optimization improved both PTV and OAR dosimetry in 80% of cases.

  8. Spatio-temporal optimization of agricultural practices to achieve a sustainable development at basin level; framework of a case study in Colombia

    NASA Astrophysics Data System (ADS)

    Uribe, Natalia; corzo, Gerald; Solomatine, Dimitri

    2016-04-01

    The flood events of recent years in different basins of the Colombian territory have raised questions about the sensitivity of these regions and whether they share common features. Previous studies suggest that important features in the sensitivity of the flood process include land cover change and precipitation anomalies, related to the impacts of agricultural management and water management deficiencies, among others. A significant government investment in outreach activities for adopting and promoting the Colombia National Action Plan on Climate Change (NAPCC) is being carried out in different sectors and regions, with the agriculture sector as a priority. However, more information is still needed at the local level in order to assess where the regions have this sensitivity. The continuous change in a region with seasonal agricultural practices has also been pointed out as critical information for optimal sustainable development. This combined spatio-temporal dynamics of crop cycles in relation to climate change (or variations) has an important impact on flooding events at the basin scale. This research develops the assessment and optimization of the aggregated impact of flood events by determining the spatio-temporal dynamics of changes in agricultural management practices. A number of common best agricultural practices have been identified in order to explore their effect in a spatial hydrological model that evaluates overall changes. The optimization process consists of evaluating the best performance in agricultural production, without having to change crop activities or move to other regions. To achieve these objectives, a deep analysis of different models combined with current and future climate scenarios has been planned. An algorithm has been formulated to cover the parametric updates such that the optimal temporal identification can be evaluated in different regions of the case study area. Different hydroinformatics techniques for optimization and uncertainty analysis are included in a framework that partially addresses the computational load found in the pre-runs of the case study. The work focuses on the Fuquene basin in Colombia, but this does not limit the general methodological applicability of the study to other areas. Keywords: modelling, WFlow_sbm, agricultural practices, climate change, optimization, flooding, spatial and temporal analysis

  9. Some Results of Weak Anticipative Concept Applied in Simulation Based Decision Support in Enterprise

    NASA Astrophysics Data System (ADS)

    Kljajić, Miroljub; Kofjač, Davorin; Kljajić Borštnar, Mirjana; Škraba, Andrej

    2010-11-01

    Simulation models are used for decision support and learning in enterprises and in schools. Three cases of successful applications demonstrate the usefulness of weak anticipative information. Job shop scheduling with a makespan criterion presents a real case of customized flexible furniture production optimization; a genetic algorithm for job shop scheduling optimization is presented. Simulation-based inventory control describes inventory optimization for products with stochastic lead time and demand; dynamic programming and fuzzy control algorithms reduce the total cost without producing stock-outs in most cases. The value of decision-making information based on simulation is also discussed. The cases are discussed from the optimization, modeling and learning points of view.

  10. Optimal causal inference: estimating stored information and approximating causal architecture.

    PubMed

    Still, Susanne; Crutchfield, James P; Ellison, Christopher J

    2010-09-01

    We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.

  11. On a biologically inspired topology optimization method

    NASA Astrophysics Data System (ADS)

    Kobayashi, Marcelo H.

    2010-03-01

    This work concerns the development of a biologically inspired methodology for the study of topology optimization in engineering and natural systems. The methodology is based on L systems and its turtle interpretation for the genotype-phenotype modeling of the topology development. The topology is analyzed using the finite element method, and optimized using an evolutionary algorithm with the genetic encoding of the L system and its turtle interpretation, as well as, body shape and physical characteristics. The test cases considered in this work clearly show the suitability of the proposed method for the study of engineering and natural complex systems.

  12. Cooperative combinatorial optimization: evolutionary computation case study.

    PubMed

    Burgin, Mark; Eberbach, Eugene

    2008-01-01

    This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.

  13. Large-scale hydropower system optimization using dynamic programming and object-oriented programming: the case of the Northeast China Power Grid.

    PubMed

    Li, Ji-Qing; Zhang, Yu-Shan; Ji, Chang-Ming; Wang, Ai-Jing; Lund, Jay R

    2013-01-01

    This paper examines long-term optimal operation using dynamic programming for a large hydropower system of 10 reservoirs in Northeast China. Besides considering flow and hydraulic head, the optimization explicitly includes time-varying electricity market prices to maximize benefit. Two techniques are used to reduce the 'curse of dimensionality' of dynamic programming with many reservoirs. Discrete differential dynamic programming (DDDP) reduces the search space and computer memory needed. Object-oriented programming (OOP) and the ability to dynamically allocate and release memory with the C++ language greatly reduces the cumulative effect of computer memory for solving multi-dimensional dynamic programming models. The case study shows that the model can reduce the 'curse of dimensionality' and achieve satisfactory results.
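
    A stripped-down version of the dynamic-programming recursion for a single reservoir is sketched below. The inflows, price signal, storage grid and release limit are invented for illustration and do not come from the Northeast China case study, which additionally uses DDDP corridors and C++ memory management to handle ten reservoirs.

      # Backward dynamic programming over a discretized storage grid (single-reservoir sketch).
      import numpy as np

      T = 12                                    # stages (e.g., months)
      storages = np.linspace(0.0, 100.0, 21)    # discrete storage states, hypothetical units
      inflow = np.full(T, 20.0)                 # hypothetical inflow per stage
      price = 1.0 + 0.5 * np.sin(np.arange(T))  # hypothetical electricity price signal
      max_release = 40.0

      value = np.zeros(len(storages))           # terminal value function
      policy = np.zeros((T, len(storages)))     # optimal release per stage and state

      for t in reversed(range(T)):
          new_value = np.empty_like(value)
          for i, s in enumerate(storages):
              best, best_r = -np.inf, 0.0
              for r in np.linspace(0.0, max_release, 41):   # candidate releases
                  s_next = s + inflow[t] - r
                  if not (storages[0] <= s_next <= storages[-1]):
                      continue                              # infeasible transition
                  benefit = price[t] * r                    # proxy for energy revenue
                  cont = np.interp(s_next, storages, value) # interpolated future value
                  if benefit + cont > best:
                      best, best_r = benefit + cont, r
              new_value[i] = best
              policy[t, i] = best_r
          value = new_value

      print("Optimal expected benefit from a full reservoir:", round(value[-1], 2))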

  14. Simplex-based optimization of numerical and categorical inputs in early bioprocess development: Case studies in HT chromatography.

    PubMed

    Konstantinidis, Spyridon; Titchener-Hooker, Nigel; Velayudhan, Ajoy

    2017-08-01

    Bioprocess development studies often involve the investigation of numerical and categorical inputs via the adoption of Design of Experiments (DoE) techniques. An attractive alternative is the deployment of a grid compatible Simplex variant which has been shown to yield optima rapidly and consistently. In this work, the method is combined with dummy variables and it is deployed in three case studies wherein spaces are comprised of both categorical and numerical inputs, a situation intractable by traditional Simplex methods. The first study employs in silico data and lays out the dummy variable methodology. The latter two employ experimental data from chromatography based studies performed with the filter-plate and miniature column High Throughput (HT) techniques. The solute of interest in the former case study was a monoclonal antibody whereas the latter dealt with the separation of a binary system of model proteins. The implemented approach prevented the stranding of the Simplex method at local optima, due to the arbitrary handling of the categorical inputs, and allowed for the concurrent optimization of numerical and categorical, multilevel and/or dichotomous, inputs. The deployment of the Simplex method, combined with dummy variables, was therefore entirely successful in identifying and characterizing global optima in all three case studies. The Simplex-based method was further shown to be of equivalent efficiency to a DoE-based approach, represented here by D-Optimal designs. Such an approach failed, however, to both capture trends and identify optima, and led to poor operating conditions. It is suggested that the Simplex-variant is suited to development activities involving numerical and categorical inputs in early bioprocess development. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
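
    The idea of folding categorical choices into a continuous search via dummy variables can be sketched as follows. The objective function, the category levels and the use of SciPy's Nelder-Mead simplex are illustrative stand-ins for the grid-compatible Simplex variant and the chromatography responses described in the paper.

      # Treat a categorical input through dummy variables inside a simplex search (sketch).
      import numpy as np
      from scipy.optimize import minimize

      RESINS = ["resinA", "resinB", "resinC"]          # hypothetical categorical levels

      def snap_category(dummies):
          """Map continuous dummy variables to the categorical level with the largest value."""
          return RESINS[int(np.argmax(dummies))]

      def process_yield(ph, salt_mM, resin):
          """Hypothetical response surface standing in for an HT chromatography experiment."""
          base = {"resinA": 60.0, "resinB": 75.0, "resinC": 70.0}[resin]
          return base - (ph - 7.2) ** 2 * 8.0 - (salt_mM - 150.0) ** 2 * 0.002

      def objective(x):
          ph, salt = x[0], x[1]
          resin = snap_category(x[2:])                 # categorical input handled via dummies
          return -process_yield(ph, salt, resin)       # minimize the negative yield

      x0 = np.array([6.5, 100.0, 0.5, 0.5, 0.5])       # numerical inputs + one dummy per level
      res = minimize(objective, x0, method="Nelder-Mead")
      print("best pH/salt:", np.round(res.x[:2], 2), "best resin:", snap_category(res.x[2:]))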

  15. A search game model of the scatter hoarder's problem

    PubMed Central

    Alpern, Steve; Fokkink, Robbert; Lidbetter, Thomas; Clayton, Nicola S.

    2012-01-01

    Scatter hoarders are animals (e.g. squirrels) who cache food (nuts) over a number of sites for later collection. A certain minimum amount of food must be recovered, possibly after pilfering by another animal, in order to survive the winter. An optimal caching strategy is one that maximizes the survival probability, given worst case behaviour of the pilferer. We modify certain ‘accumulation games’ studied by Kikuta & Ruckle (2000 J. Optim. Theory Appl.) and Kikuta & Ruckle (2001 Naval Res. Logist.), which modelled the problem of optimal diversification of resources against catastrophic loss, to include the depth at which the food is hidden at each caching site. Optimal caching strategies can then be determined as equilibria in a new ‘caching game’. We show how the distribution of food over sites and the site-depths of the optimal caching varies with the animal's survival requirements and the amount of pilfering. We show that in some cases, ‘decoy nuts’ are required to be placed above other nuts that are buried further down at the same site. Methods from the field of search games are used. Some empirically observed behaviour can be shown to be optimal in our model. PMID:22012971

  16. Integrating GIS, cellular automata, and genetic algorithm in urban spatial optimization: a case study of Lanzhou

    NASA Astrophysics Data System (ADS)

    Xu, Xibao; Zhang, Jianming; Zhou, Xiaojian

    2006-10-01

    This paper presents a model integrating GIS, cellular automata (CA) and genetic algorithm (GA) in urban spatial optimization. The model involves three objectives of the maximization of land-use efficiency, the maximization of urban spatial harmony and appropriate proportion of each land-use type. CA submodel is designed with standard Moore neighbor and three transition rules to maximize the land-use efficiency and urban spatial harmony, according to the land-use suitability and spatial harmony index. GA submodel is designed with four constraints and seven steps for the maximization of urban spatial harmony and appropriate proportion of each land-use type, including encoding, initializing, calculating fitness, selection, crossover, mutation and elitism. GIS is used to prepare for the input data sets for the model and perform spatial analysis on the results, while CA and GA are integrated to optimize urban spatial structure, programmed with Matlab 7 and coupled with GIS loosely. Lanzhou, a typical valley-basin city with fast urban development, is chosen as the case study. At the end, a detail analysis and evaluation of the spatial optimization with the model are made, and it proves to be a powerful tool in optimizing urban spatial structure and make supplement for urban planning and policy-makers.

  17. Moving from Student to Professional: Industry Mentors and Academic Internship Coordinators Supporting Intern Learning in the Workplace

    ERIC Educational Resources Information Center

    Kramer-Simpson, Elisabeth

    2018-01-01

    This article offers empirical data to explore ways that both industry mentors and academic internship coordinators support student interns in ways that optimize the workplace experience. Rich description of qualitative data from case studies and interviews shows that to optimize the internship, both the industry mentor and the academic internship…

  18. Achieving Conservation when Opportunity Costs Are High: Optimizing Reserve Design in Alberta's Oil Sands Region

    PubMed Central

    Schneider, Richard R.; Hauer, Grant; Farr, Dan; Adamowicz, W. L.; Boutin, Stan

    2011-01-01

    Recent studies have shown that conservation gains can be achieved when the spatial distributions of biological benefits and economic costs are incorporated in the conservation planning process. Using Alberta, Canada, as a case study we apply these techniques in the context of coarse-filter reserve design. Because targets for ecosystem representation and other coarse-filter design elements are difficult to define objectively we use a trade-off analysis to systematically explore the relationship between conservation targets and economic opportunity costs. We use the Marxan conservation planning software to generate reserve designs at each level of conservation target to ensure that our quantification of conservation and economic outcomes represents the optimal allocation of resources in each case. Opportunity cost is most affected by the ecological representation target and this relationship is nonlinear. Although petroleum resources are present throughout most of Alberta, and include highly valuable oil sands deposits, our analysis indicates that over 30% of public lands could be protected while maintaining access to more than 97% of the value of the region's resources. Our case study demonstrates that optimal resource allocation can be usefully employed to support strategic decision making in the context of land-use planning, even when conservation targets are not well defined. PMID:21858046

  19. Optimization of rotational arc station parameter optimized radiation therapy.

    PubMed

    Dong, P; Ungun, B; Boyd, S; Xing, L

    2016-09-01

    To develop a fast optimization method for station parameter optimized radiation therapy (SPORT) and show that SPORT is capable of matching VMAT in both plan quality and delivery efficiency by using three clinical cases of different disease sites. The angular space from 0° to 360° was divided into 180 station points (SPs). A candidate aperture was assigned to each of the SPs based on the calculation results using a column generation algorithm. The weights of the apertures were then obtained by optimizing the objective function using a state-of-the-art GPU based proximal operator graph solver. To avoid being trapped in a local minimum in beamlet-based aperture selection using the gradient descent algorithm, a stochastic gradient descent was employed here. Apertures with zero or low weight were thrown out. To find out whether there was room to further improve the plan by adding more apertures or SPs, the authors repeated the above procedure with consideration of the existing dose distribution from the last iteration. At the end of the second iteration, the weights of all the apertures were reoptimized, including those of the first iteration. The above procedure was repeated until the plan could not be improved any further. The optimization technique was assessed by using three clinical cases (prostate, head and neck, and brain) with the results compared to that obtained using conventional VMAT in terms of dosimetric properties, treatment time, and total MU. Marked dosimetric quality improvement was demonstrated in the SPORT plans for all three studied cases. For the prostate case, the volume of the 50% prescription dose was decreased by 22% for the rectum and 6% for the bladder. For the head and neck case, SPORT improved the mean dose for the left and right parotids by 15% each. The maximum dose was lowered from 72.7 to 71.7 Gy for the mandible, and from 30.7 to 27.3 Gy for the spinal cord. The mean dose for the pharynx and larynx was reduced by 8% and 6%, respectively. For the brain case, the doses to the eyes, chiasm, and inner ears were all improved. SPORT shortened the treatment time by ∼1 min for the prostate case, ∼0.5 min for brain case, and ∼0.2 min for the head and neck case. The dosimetric quality and delivery efficiency presented here indicate that SPORT is an intriguing alternative treatment modality. With the widespread adoption of digital linac, SPORT should lead to improved patient care in the future.
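
    The aperture-weighting step can be illustrated with a small quadratic fit of non-negative weights to a prescription. The dose-influence matrix, the prescription and the use of non-negative least squares below are hypothetical simplifications of the column-generation and proximal graph solver pipeline used in the paper.

      # Non-negative aperture weights minimizing ||D w - d_rx||^2 (toy SPORT-like step).
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)
      n_voxels, n_apertures = 200, 30
      D = rng.random((n_voxels, n_apertures))          # hypothetical dose-influence matrix
      d_rx = np.full(n_voxels, 2.0)                    # hypothetical prescription dose per voxel

      weights, residual = nnls(D, d_rx)                # enforce physically meaningful w >= 0
      kept = weights > 1e-6                            # zero/low-weight apertures are discarded
      print(f"{kept.sum()} of {n_apertures} apertures kept, residual norm {residual:.3f}")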

  20. Optimization of rotational arc station parameter optimized radiation therapy

    PubMed Central

    Dong, P.; Ungun, B.; Boyd, S.; Xing, L.

    2016-01-01

    Purpose: To develop a fast optimization method for station parameter optimized radiation therapy (SPORT) and show that SPORT is capable of matching VMAT in both plan quality and delivery efficiency by using three clinical cases of different disease sites. Methods: The angular space from 0° to 360° was divided into 180 station points (SPs). A candidate aperture was assigned to each of the SPs based on the calculation results using a column generation algorithm. The weights of the apertures were then obtained by optimizing the objective function using a state-of-the-art GPU based proximal operator graph solver. To avoid being trapped in a local minimum in beamlet-based aperture selection using the gradient descent algorithm, a stochastic gradient descent was employed here. Apertures with zero or low weight were thrown out. To find out whether there was room to further improve the plan by adding more apertures or SPs, the authors repeated the above procedure with consideration of the existing dose distribution from the last iteration. At the end of the second iteration, the weights of all the apertures were reoptimized, including those of the first iteration. The above procedure was repeated until the plan could not be improved any further. The optimization technique was assessed by using three clinical cases (prostate, head and neck, and brain) with the results compared to that obtained using conventional VMAT in terms of dosimetric properties, treatment time, and total MU. Results: Marked dosimetric quality improvement was demonstrated in the SPORT plans for all three studied cases. For the prostate case, the volume of the 50% prescription dose was decreased by 22% for the rectum and 6% for the bladder. For the head and neck case, SPORT improved the mean dose for the left and right parotids by 15% each. The maximum dose was lowered from 72.7 to 71.7 Gy for the mandible, and from 30.7 to 27.3 Gy for the spinal cord. The mean dose for the pharynx and larynx was reduced by 8% and 6%, respectively. For the brain case, the doses to the eyes, chiasm, and inner ears were all improved. SPORT shortened the treatment time by ∼1 min for the prostate case, ∼0.5 min for brain case, and ∼0.2 min for the head and neck case. Conclusions: The dosimetric quality and delivery efficiency presented here indicate that SPORT is an intriguing alternative treatment modality. With the widespread adoption of digital linac, SPORT should lead to improved patient care in the future. PMID:27587028

  1. Optimization of rotational arc station parameter optimized radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, P.; Ungun, B.

    Purpose: To develop a fast optimization method for station parameter optimized radiation therapy (SPORT) and show that SPORT is capable of matching VMAT in both plan quality and delivery efficiency by using three clinical cases of different disease sites. Methods: The angular space from 0° to 360° was divided into 180 station points (SPs). A candidate aperture was assigned to each of the SPs based on the calculation results using a column generation algorithm. The weights of the apertures were then obtained by optimizing the objective function using a state-of-the-art GPU based proximal operator graph solver. To avoid being trapped in a local minimum in beamlet-based aperture selection using the gradient descent algorithm, a stochastic gradient descent was employed here. Apertures with zero or low weight were thrown out. To find out whether there was room to further improve the plan by adding more apertures or SPs, the authors repeated the above procedure with consideration of the existing dose distribution from the last iteration. At the end of the second iteration, the weights of all the apertures were reoptimized, including those of the first iteration. The above procedure was repeated until the plan could not be improved any further. The optimization technique was assessed by using three clinical cases (prostate, head and neck, and brain) with the results compared to that obtained using conventional VMAT in terms of dosimetric properties, treatment time, and total MU. Results: Marked dosimetric quality improvement was demonstrated in the SPORT plans for all three studied cases. For the prostate case, the volume of the 50% prescription dose was decreased by 22% for the rectum and 6% for the bladder. For the head and neck case, SPORT improved the mean dose for the left and right parotids by 15% each. The maximum dose was lowered from 72.7 to 71.7 Gy for the mandible, and from 30.7 to 27.3 Gy for the spinal cord. The mean dose for the pharynx and larynx was reduced by 8% and 6%, respectively. For the brain case, the doses to the eyes, chiasm, and inner ears were all improved. SPORT shortened the treatment time by ∼1 min for the prostate case, ∼0.5 min for brain case, and ∼0.2 min for the head and neck case. Conclusions: The dosimetric quality and delivery efficiency presented here indicate that SPORT is an intriguing alternative treatment modality. With the widespread adoption of digital linac, SPORT should lead to improved patient care in the future.

  2. Using Infant Massage Following a Mother's Unfavorable Neonatal Intensive Care Unit Experiences: A Case Study

    ERIC Educational Resources Information Center

    Lappin, Grace

    2005-01-01

    The purpose of this case study was to explore the synchronous behaviors enacted by mother and infant with blindness. In the study, a mother's less than optimal experience with the neonatal intensive care unit (NICU) had a profound effect not only on her and her infant son, who was born 3 months prematurely and was visually impaired, but also on…

  3. Using genetic algorithms to optimise current and future health planning--the example of ambulance locations.

    PubMed

    Sasaki, Satoshi; Comber, Alexis J; Suzuki, Hiroshi; Brunsdon, Chris

    2010-01-28

    Ambulance response time is a crucial factor in patient survival. The number of emergency cases (EMS cases) requiring an ambulance is increasing due to changes in population demographics, which is lengthening ambulance response times to the emergency scene. This paper predicts EMS cases for 5-year intervals from 2020 to 2050 by correlating current EMS cases with demographic factors at the level of the census area and predicted population changes. It then applies a modified grouping genetic algorithm to compare current and future optimal locations and numbers of ambulances. Sets of potential locations were evaluated in terms of the (current and predicted) EMS case distances to those locations. Future EMS demands were predicted to increase by 2030 using the model (R2 = 0.71). The optimal locations of ambulances based on future EMS cases were compared with current locations and with optimal locations modelled on current EMS case data. Optimising the locations of ambulance stations reduced the average response time by 57 seconds. Current and predicted future EMS demand at modelled locations were calculated and compared. The reallocation of ambulances to optimal locations improved response times and could contribute to higher survival rates from life-threatening medical events. Modelling EMS case 'demand' over census areas allows the data to be correlated with population characteristics and optimal 'supply' locations to be identified. Comparing current and future optimal scenarios allows more nuanced planning decisions to be made. This is a generic methodology that could be used to provide evidence in support of public health planning and decision making.
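
    A toy version of the location search is sketched below: a simple genetic algorithm that selects K station sites minimizing the mean distance from demand points to their nearest station. The candidate sites, the synthetic EMS cases and the plain selection/crossover/mutation operators are invented; the paper itself uses a modified grouping GA driven by census-level demand predictions.

      # Toy genetic algorithm for siting K ambulance stations among candidate sites (sketch).
      import random

      random.seed(1)
      N_SITES, K, POP, GENS = 40, 5, 60, 150
      sites = [(random.random(), random.random()) for _ in range(N_SITES)]   # candidate locations
      cases = [(random.random(), random.random()) for _ in range(500)]       # hypothetical EMS cases

      def fitness(chrom):
          """Mean distance from each EMS case to its nearest selected station (lower is better)."""
          chosen = [sites[i] for i in chrom]
          return sum(min(((cx - sx) ** 2 + (cy - sy) ** 2) ** 0.5 for sx, sy in chosen)
                     for cx, cy in cases) / len(cases)

      def crossover(a, b):
          """Child inherits K distinct sites drawn from the union of both parents."""
          return tuple(sorted(random.sample(sorted(set(a) | set(b)), K)))

      def mutate(chrom):
          """Swap one selected site for a random candidate."""
          kept = set(chrom)
          kept.remove(random.choice(tuple(kept)))
          while len(kept) < K:
              kept.add(random.randrange(N_SITES))
          return tuple(sorted(kept))

      population = [tuple(sorted(random.sample(range(N_SITES), K))) for _ in range(POP)]
      for _ in range(GENS):
          population.sort(key=fitness)
          parents = population[: POP // 2]                 # simple truncation selection
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP - len(parents))]
          population = parents + children

      best = min(population, key=fitness)
      print("selected station indices:", best, "mean distance:", round(fitness(best), 3))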

  4. Aerothermoelastic Topology Optimization with Flutter and Buckling Metrics (Postprint)

    DTIC Science & Technology

    2013-07-01

    Topologies of an unheated panel, thermal buckling-optimal topologies, and flutter-optimality of a heated panel are considered, where the latter case presents a topological compromise between the former two. The effect of various constraint boundaries, temperature gradients, and (for the flutter of the heated panel) ...

  5. Parameter Optimization for Feature and Hit Generation in a General Unknown Screening Method-Proof of Concept Study Using a Design of Experiment Approach for a High Resolution Mass Spectrometry Procedure after Data Independent Acquisition.

    PubMed

    Elmiger, Marco P; Poetzsch, Michael; Steuer, Andrea E; Kraemer, Thomas

    2018-03-06

    High resolution mass spectrometry and modern data independent acquisition (DIA) methods enable the creation of general unknown screening (GUS) procedures. However, even when DIA is used, its potential is far from being exploited, because often, the untargeted acquisition is followed by a targeted search. Applying an actual GUS (including untargeted screening) produces an immense amount of data that must be dealt with. An optimization of the parameters regulating the feature detection and hit generation algorithms of the data processing software could significantly reduce the amount of unnecessary data and thereby the workload. Design of experiment (DoE) approaches allow a simultaneous optimization of multiple parameters. In a first step, parameters are evaluated as crucial or noncrucial. Second, crucial parameters are optimized. The aim in this study was to reduce the number of hits, without missing analytes. The obtained parameter settings from the optimization were compared to the standard settings by analyzing a test set of blood samples spiked with 22 relevant analytes as well as 62 authentic forensic cases. The optimization led to a marked reduction of workload (12.3 to 1.1% and 3.8 to 1.1% hits for the test set and the authentic cases, respectively) while simultaneously increasing the identification rate (68.2 to 86.4% and 68.8 to 88.1%, respectively). This proof of concept study emphasizes the great potential of DoE approaches to master the data overload resulting from modern data independent acquisition methods used for general unknown screening procedures by optimizing software parameters.
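
    The screening step of a DoE workflow can be sketched with a two-level full factorial design and main-effect estimates. The factor names, the coded levels and the stand-in "hit count" function are invented for illustration; they are not the software parameters or responses used in the study.

      # Two-level full-factorial screening of hypothetical feature-detection parameters (sketch).
      import itertools
      import numpy as np

      factors = ["mass_tolerance", "min_peak_width", "intensity_threshold"]   # hypothetical names
      levels = [-1, +1]                                    # coded low/high settings

      def run_screening(settings):
          """Stand-in for processing a data file and counting the generated hits."""
          mass_tol, peak_w, thresh = settings
          return 500 + 120 * mass_tol - 60 * peak_w - 200 * thresh + np.random.normal(0, 10)

      design = list(itertools.product(levels, repeat=len(factors)))   # 2^3 runs
      np.random.seed(0)
      hits = np.array([run_screening(s) for s in design])

      # Main effect of each factor = mean(hits at +1) - mean(hits at -1).
      for j, name in enumerate(factors):
          col = np.array([row[j] for row in design])
          effect = hits[col == 1].mean() - hits[col == -1].mean()
          print(f"{name:20s} main effect on hit count: {effect:+.1f}")

    Factors with large main effects would be flagged as crucial and carried into the subsequent optimization design.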

  6. Terawatt x-ray free-electron-laser optimization by transverse electron distribution shaping

    DOE PAGES

    Emma, C.; Wu, J.; Fang, K.; ...

    2014-11-03

    We study the dependence of the peak power of a 1.5 Å Terawatt (TW), tapered x-ray free-electron laser (FEL) on the transverse electron density distribution. Multidimensional optimization schemes for TW hard x-ray free-electron lasers are applied to the cases of transversely uniform and parabolic electron beam distributions and compared to a Gaussian distribution. The optimizations are performed for a 200 m undulator and a resonant wavelength of λ r = 1.5 Å using the fully three-dimensional FEL particle code GENESIS. The study shows that the flatter transverse electron distributions enhance optical guiding in the tapered section of the undulator and increase the maximum radiation power from a maximum of 1.56 TW for a transversely Gaussian beam to 2.26 TW for the parabolic case and 2.63 TW for the uniform case. Spectral data also shows a 30%–70% reduction in energy deposited in the sidebands for the uniform and parabolic beams compared with a Gaussian. An analysis of the transverse coherence of the radiation shows the coherence area to be much larger than the beam spotsize for all three distributions, making coherent diffraction imaging experiments possible.

  7. Slotting optimization of automated storage and retrieval system (AS/RS) for efficient delivery of parts in an assembly shop using genetic algorithm: A case Study

    NASA Astrophysics Data System (ADS)

    Yue, L.; Guan, Z.; He, C.; Luo, D.; Saif, U.

    2017-06-01

    In recent years, competitive pressure has shifted manufacturing companies from mass production to mass customization in order to produce a large variety of products. It is a great challenge for companies to run a customized, mixed-flow mode of production and meet customized demand on time. With a large variety of products, the storage system that delivers parts to the production lines influences the timely production of that variety, as shown by the simulation study of an inefficient storage system of a real company carried out in the current research. Therefore, this research proposes a slotting optimization model that takes the mixed-model assembly sequence of the final flow lines into account, in order to optimize the whole automated storage and retrieval system (AS/RS) and distribution system in the case company. The research aims to simultaneously minimize the vertical height of the centre of gravity of the AS/RS and the total time spent retrieving materials from the AS/RS. A genetic algorithm is adopted to solve the proposed problem, and computational results show a significant improvement in the stability and efficiency of the AS/RS compared to the existing method used in the case company.

  8. Performance comparison of genetic algorithms and particle swarm optimization for model integer programming bus timetabling problem

    NASA Astrophysics Data System (ADS)

    Wihartiko, F. D.; Wijayanti, H.; Virgantari, F.

    2018-03-01

    The Genetic Algorithm (GA) is a common algorithm used to solve optimization problems with an artificial intelligence approach, as is the Particle Swarm Optimization (PSO) algorithm. Both algorithms have different advantages and disadvantages when applied to the optimization of the Model Integer Programming for the Bus Timetabling Problem (MIPBTP), in which the optimal number of trips must be found subject to various constraints. The comparison results show that the PSO algorithm is superior in terms of complexity, accuracy, iteration count and program simplicity in finding the optimal solution.
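
    A bare-bones PSO with rounding of the decision vector to integers is sketched below. The objective (operating cost plus a penalty for unserved demand), the bounds and the swarm parameters are invented placeholders, not the MIPBTP formulation compared in the paper.

      # Bare-bones particle swarm optimization with rounding for an integer decision vector.
      import numpy as np

      rng = np.random.default_rng(2)
      DIM, SWARM, ITERS = 4, 30, 200        # e.g., trips in 4 hypothetical time bands
      LB, UB = 0, 20                        # integer bounds on trips per band
      demand = np.array([12, 18, 7, 15])    # hypothetical passenger demand per band
      cost_per_trip, penalty = 1.0, 10.0

      def objective(x_int):
          """Operating cost plus a penalty for unserved demand (toy stand-in for MIPBTP)."""
          unserved = np.maximum(demand - x_int, 0).sum()
          return cost_per_trip * x_int.sum() + penalty * unserved

      pos = rng.uniform(LB, UB, (SWARM, DIM))
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pbest_val = np.array([objective(np.rint(p).astype(int)) for p in pos])
      gbest = pbest[pbest_val.argmin()].copy()

      for _ in range(ITERS):
          r1, r2 = rng.random((2, SWARM, DIM))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, LB, UB)
          vals = np.array([objective(np.rint(p).astype(int)) for p in pos])
          improved = vals < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[pbest_val.argmin()].copy()

      best = np.rint(gbest).astype(int)
      print("best integer timetable:", best, "cost:", objective(best))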

  9. Technical Note: Dose effects of 1.5 T transverse magnetic field on tissue interfaces in MRI-guided radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xinfeng; Prior, Phil; Chen, Guang-Pei

    Purpose: The integration of MRI with a linear accelerator (MR-linac) offers great potential for high-precision delivery of radiation therapy (RT). However, the electron deflection resulting from the presence of a transverse magnetic field (TMF) can affect the dose distribution, particularly the electron return effect (ERE) at tissue interfaces. The purpose of the study is to investigate the dose effects of ERE at air-tissue and lung-tissue interfaces during intensity-modulated radiation therapy (IMRT) planning. Methods: IMRT and volumetric modulated arc therapy (VMAT) plans for representative pancreas, lung, breast, and head and neck (HN) cases were generated following commonly used clinical dose volume (DV) criteria. In each case, three types of plans were generated: (1) the original plan generated without a TMF; (2) the reconstructed plan generated by recalculating the original plan with the presence of a TMF of 1.5 T (no optimization); and (3) the optimized plan generated by a full optimization with TMF = 1.5 T. These plans were compared using a variety of DV parameters, including V100%, D95%, DHI [dose heterogeneity index: (D20%–D80%)/Dprescription], Dmax, and D1cc in OARs (organs at risk) and tissue interface. All the optimizations and calculations in this work were performed on static data. Results: The dose recalculation under TMF showed the presence of the 1.5 T TMF can slightly reduce V100% and D95% for PTV, with the differences being less than 4% for all but one lung case studied. The TMF results in considerable increases in Dmax and D1cc on the skin in all cases, mostly between 10% and 35%. The changes in Dmax and D1cc on air cavity walls are dependent upon site, geometry, and size, with changes ranging up to 15%. The VMAT plans lead to much smaller dose effects from ERE compared to fixed-beam IMRT in the pancreas case. When the TMF is considered in the plan optimization, the dose effects of the TMF at tissue interfaces (e.g., air-cavity wall, lung-tissue interfaces, skin) are significantly reduced in most cases. Conclusions: The doses on tissue interfaces can be significantly changed by the presence of a TMF during MR-guided RT when the magnetic field is not included in plan optimization. These changes can be substantially reduced or even eliminated during VMAT/IMRT optimization that specifically considers the TMF, without deteriorating overall plan quality.

  10. Optimizing Motion Planning for Hyper Dynamic Manipulator

    NASA Astrophysics Data System (ADS)

    Aboura, Souhila; Omari, Abdelhafid; Meguenni, Kadda Zemalache

    2012-01-01

    This paper investigates optimal motion planning for a hyper dynamic manipulator. As a case study, we consider a golf swing robot consisting of two actuated joints and mechanical stoppers. A Genetic Algorithm (GA) technique is proposed to solve for the optimal golf swing motion, which is generated by a Fourier series approximation. The objective function for the GA approach minimizes the intermediate and final states, minimizes the robot's energy consumption and maximizes the robot's speed. The obtained simulation results show the effectiveness of the proposed scheme.

  11. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    NASA Astrophysics Data System (ADS)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is provided for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and proved that the optimum ducted propeller improves hydrodynamic performance as predicted.

  12. Evaluating and optimizing the NERSC workload on Knights Landing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, T; Cook, B; Deslippe, J

    2017-01-30

    NERSC has partnered with 20 representative application teams to evaluate performance on the Xeon-Phi Knights Landing architecture and develop an application-optimization strategy for the greater NERSC workload on the recently installed Cori system. In this article, we present early case studies and summarized results from a subset of the 20 applications highlighting the impact of important architecture differences between the Xeon-Phi and traditional Xeon processors. We summarize the status of the applications and describe the greater optimization strategy that has formed.

  13. Evaluating and Optimizing the NERSC Workload on Knights Landing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, Taylor; Cook, Brandon; Doerfler, Douglas

    2016-01-01

    NERSC has partnered with 20 representative application teams to evaluate performance on the Xeon-Phi Knights Landing architecture and develop an application-optimization strategy for the greater NERSC workload on the recently installed Cori system. In this article, we present early case studies and summarized results from a subset of the 20 applications highlighting the impact of important architecture differences between the Xeon-Phi and traditional Xeon processors. We summarize the status of the applications and describe the greater optimization strategy that has formed.

  14. Liner Optimization Studies Using the Ducted Fan Noise Prediction Code TBIEM3D

    NASA Technical Reports Server (NTRS)

    Dunn, M. H.; Farassat, F.

    1998-01-01

    In this paper we demonstrate the usefulness of the ducted fan noise prediction code TBIEM3D as a liner optimization design tool. Boundary conditions on the interior duct wall allow for hard walls or a locally reacting liner with axially segmented, circumferentially uniform impedance. Two liner optimization studies are considered in which farfield noise attenuation due to the presence of a liner is maximized by adjusting the liner impedance. In the first example, the dependence of optimal liner impedance on frequency and liner length is examined. Results show that both the optimal impedance and attenuation levels are significantly influenced by liner length and frequency. In the second example, TBIEM3D is used to compare radiated sound pressure levels between optimal and non-optimal liner cases at conditions designed to simulate take-off. It is shown that significant noise reduction is achieved for most of the sound field by selecting the optimal or near optimal liner impedance. Our results also indicate that there is a relatively large region of the impedance plane over which optimal or near optimal liner behavior is attainable. This is an important conclusion for the designer since there are variations in liner characteristics due to manufacturing imprecisions.

  15. Performance Comparison of Optimized Designs of Francis Turbines Exposed to Sediment Erosion in various Operating Conditions

    NASA Astrophysics Data System (ADS)

    Shrestha, K. P.; Chitrakar, S.; Thapa, B.; Dahlhaug, O. G.

    2018-06-01

    Erosion on hydro turbines mostly depends on the impingement velocity, angle of impact, concentration, shape, size and distribution of the erodent particles, and on the substrate material. In the case of Francis turbines, the sediment particles tend to erode more in off-design conditions than at the best efficiency point. Previous studies focused on optimized runner blade designs to reduce erosion at the design flow. However, the effect of the change in the design on other operating conditions was not studied. This paper demonstrates the performance of an optimized Francis turbine exposed to sediment erosion in various operating conditions. A comparative study has been carried out among five different runner shapes and different sets of guide vane and stay vane angles. The effect of erosion is studied in terms of the average erosion density rate on the optimized Francis runner design, using a Lagrangian particle tracking method in the CFD analysis. The numerical sensitivity of the results is investigated by comparing two turbulence models. The numerical results are validated against velocity measurements carried out in the actual turbine. Results show that the runner blades are susceptible to more erosion at part load conditions compared to the BEP, whereas for the guide vanes, more erosion occurs at full load conditions. Out of the five shapes compared, Shape 5 provides an optimum combination of efficiency and erosion for the studied operating conditions.

  16. Design optimization for active twist rotor blades

    NASA Astrophysics Data System (ADS)

    Mok, Ji Won

    This dissertation introduces the process of optimizing active twist rotor blades in the presence of embedded anisotropic piezo-composite actuators. Optimum design of active twist blades is a complex task, since it involves a rich design space with tightly coupled design variables. The study presents the development of an optimization framework for active helicopter rotor blade cross-sectional design. This optimization framework allows for exploring a rich and highly nonlinear design space in order to optimize the active twist rotor blades. Different analytical components are combined in the framework: cross-sectional analysis (UM/VABS), an automated mesh generator, a beam solver (DYMORE), a three-dimensional local strain recovery module, and a gradient based optimizer within MATLAB. Through the mathematical optimization problem, the static twist actuation performance of a blade is maximized while satisfying a series of blade constraints. These constraints are associated with locations of the center of gravity and elastic axis, blade mass per unit span, fundamental rotating blade frequencies, and the blade strength based on local three-dimensional strain fields under worst loading conditions. Through pre-processing, limitations of the proposed process have been studied. When limitations were detected, resolution strategies were proposed. These include mesh overlapping, element distortion, trailing edge tab modeling, electrode modeling and foam implementation of the mesh generator, and the initial point sensitivity of the current optimization scheme. Examples demonstrate the effectiveness of this process. Optimization studies were performed on the NASA/Army/MIT ATR blade case. Even though that design was built and showed a significant impact on vibration reduction, the proposed optimization process showed that the design could be improved significantly. The second example, based on a model scale of the AH-64D Apache blade, emphasized the capability of this framework to explore the nonlinear design space of a complex planform. Especially for this case, detailed design is carried out to make the actual blade manufacturable. The proposed optimization framework is shown to be an effective tool to design high authority active twist blades to reduce vibration in future helicopter rotor blades.

  17. An approach for aerodynamic optimization of transonic fan blades

    NASA Astrophysics Data System (ADS)

    Khelghatibana, Maryam

    Aerodynamic design optimization of transonic fan blades is a highly challenging problem due to the complexity of the flow field inside the fan, the conflicting design requirements and the high-dimensional design space. In order to address all these challenges, an aerodynamic design optimization method is developed in this study. This method automates the design process by integrating a geometrical parameterization method, a CFD solver and numerical optimization methods that can be applied to both single and multi-point optimization design problems. A multi-level blade parameterization is employed to modify the blade geometry. Numerical analyses are performed by solving the 3D RANS equations combined with the SST turbulence model. Genetic algorithms and hybrid optimization methods are applied to solve the optimization problem. In order to verify the effectiveness and feasibility of the optimization method, a single-point optimization problem aiming to maximize design efficiency is formulated and applied to redesign a test case. However, transonic fan blade design is inherently a multi-faceted problem that deals with several objectives such as efficiency, stall margin, and choke margin. The proposed multi-point optimization method in the current study is formulated as a bi-objective problem to maximize design and near-stall efficiencies while maintaining the required design pressure ratio. Enhancing these objectives significantly deteriorates the choke margin, specifically at high rotational speeds. Therefore, another constraint is embedded in the optimization problem in order to prevent the reduction of choke margin at high speeds. Since capturing stall inception is numerically very expensive, stall margin has not been considered as an objective in the problem statement. However, improving near-stall efficiency results in a better performance at stall condition, which could enhance the stall margin. An investigation is therefore performed on the Pareto-optimal solutions to demonstrate the relation between near-stall efficiency and stall margin. The proposed method is applied to redesign NASA rotor 67 for single and multiple operating conditions. The single-point design optimization showed +0.28 points improvement of isentropic efficiency at design point, while the design pressure ratio and mass flow are, respectively, within 0.12% and 0.11% of the reference blade. Two cases of multi-point optimization are performed: First, the proposed multi-point optimization problem is relaxed by removing the choke margin constraint in order to demonstrate the relation between near-stall efficiency and stall margin. An investigation on the Pareto-optimal solutions of this optimization shows that the stall margin has been increased with improving near-stall efficiency. The second multi-point optimization case is performed considering all the objectives and constraints. One selected optimized design on the Pareto front presents +0.41, +0.56 and +0.9 points improvement in near-peak efficiency, near-stall efficiency and stall margin, respectively. The design pressure ratio and mass flow are, respectively, within 0.3% and 0.26% of the reference blade. Moreover, the optimized design maintains the required choking margin. Detailed aerodynamic analyses are performed to investigate the effect of shape optimization on shock occurrence, secondary flows, tip leakage and shock/tip-leakage interactions in both single and multi-point optimizations.

  18. Analysis Balance Parameter of Optimal Ramp metering

    NASA Astrophysics Data System (ADS)

    Li, Y.; Duan, N.; Yang, X.

    2018-05-01

    Ramp metering is a motorway control method that avoids the onset of congestion by limiting the access of ramp inflows to the main carriageway of the motorway. The optimization model of ramp metering is developed based upon the cell transmission model (CTM). With the piecewise linear structure of the CTM, the corresponding motorway traffic optimization problem can be formulated as a linear programming (LP) problem. LP problems can be solved by established solution algorithms such as SIMPLEX or interior-point methods for the global optimal solution. The commercial solver CPLEX is adopted in this study to solve the LP problem within reasonable computational time. The concept is illustrated through a case study of the United Kingdom M25 Motorway. The optimal solution provides useful insights and guidance on how to manage motorway traffic in order to maximize the corresponding efficiency.
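
    A minimal LP relaxation of a one-cell CTM with a metered on-ramp is sketched below, solved with SciPy's linprog as a stand-in for CPLEX. All parameters (cell length, demands, capacity, jam density) are invented, and the model omits the congestion-wave receiving constraint for brevity, so it is only an illustration of how the piecewise linear CTM becomes an LP.

      # LP relaxation of a one-cell cell-transmission model with a metered on-ramp (sketch).
      import numpy as np
      from scipy.optimize import linprog

      n = 6                                       # time steps
      a = 10.0 / 500.0                            # Ts / L, hypothetical
      v, q_max, rho_jam = 25.0, 2.0, 0.25         # free-flow speed, capacity, jam density
      rho0, d = 0.05, np.full(n, 1.2)             # initial density, mainline demand
      ramp_demand = np.full(n, 0.8)

      # Variable layout: x = [f_0..f_{n-1}, r_0..r_{n-1}, rho_1..rho_n]
      F, R, RHO = 0, n, 2 * n

      A_eq = np.zeros((n, 3 * n)); b_eq = np.zeros(n)
      for t in range(n):                          # conservation: rho_{t+1} = rho_t + a*(d + r - f)
          A_eq[t, RHO + t] = 1.0
          A_eq[t, R + t] = -a
          A_eq[t, F + t] = a
          if t == 0:
              b_eq[t] = rho0 + a * d[t]
          else:
              A_eq[t, RHO + t - 1] = -1.0
              b_eq[t] = a * d[t]

      A_ub, b_ub = [], []
      for t in range(1, n):                       # sending-flow constraint f_t <= v * rho_t
          row = np.zeros(3 * n); row[F + t] = 1.0; row[RHO + t - 1] = -v
          A_ub.append(row); b_ub.append(0.0)

      bounds = ([(0, min(q_max, v * rho0))] + [(0, q_max)] * (n - 1)   # f_t
                + [(0, ramp_demand[t]) for t in range(n)]              # r_t (metering rate)
                + [(0, rho_jam)] * n)                                  # rho_t

      c = np.concatenate([-np.ones(n), -0.01 * np.ones(n), np.zeros(n)])   # maximize throughput
      res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                    A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
      print("optimal metering rates:", np.round(res.x[R:R + n], 3))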

  19. SU-G-JeP2-05: Dose Effects of a 1.5T Magnetic Field On Air-Tissue and Lung-Tissue Interfaces in MRI-Guided Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xinfeng; Prior, Phillip; Chen, Guangpei

    Purpose: The purpose of the study is to investigate the dose effects of the electron return effect (ERE) at air-tissue and lung-tissue interfaces under a 1.5T transverse magnetic field (TMF). Methods: IMRT and VMAT plans for representative pancreas, lung, breast and head & neck (H&N) cases were generated following clinical dose volume (DV) criteria. The air-cavity walls, as well as the lung wall, were delineated to examine the ERE. In each case, the original plan generated without TMF is compared with the reconstructed plan (generated by recalculating the original plan with the presence of TMF) and the optimized plan (generated by a full optimization with TMF), using a variety of DV parameters, including V100%, D95% and dose heterogeneity index for PTV, Dmax, and D1cc for OARs (organs at risk) and tissue interface. Results: The dose recalculation under TMF showed the presence of the 1.5 T TMF can slightly reduce V100% and D95% for PTV, with the differences being less than 4% for all but one lung case studied. The TMF results in considerable increases in Dmax and D1cc on the skin in all cases, mostly between 10-35%. The changes in Dmax and D1cc on air cavity walls are dependent upon site, geometry, and size, with changes ranging up to 15%. In general, the VMAT plans lead to much smaller dose effects from ERE compared to fixed-beam IMRT. When the TMF is considered in the plan optimization, the dose effects of the TMF at tissue interfaces are significantly reduced in most cases. Conclusion: The doses on tissue interfaces can be significantly changed by the presence of a 1.5T TMF during MR-guided RT when the TMF is not included in plan optimization. These changes can be substantially reduced or even removed during VMAT/IMRT optimization that specifically considers the TMF, without deteriorating overall plan quality.

  20. Estimating multivariate response surface model with data outliers, case study in enhancing surface layer properties of an aircraft aluminium alloy

    NASA Astrophysics Data System (ADS)

    Widodo, Edy; Kariyam

    2017-03-01

    Response Surface Methodology (RSM) is used to determine the input variable settings that create the optimal compromise in the response variables. There are three primary steps in an RSM problem, namely data collection, modelling, and optimization. This study focuses on the establishment of response surface models, using the assumption that the collected data are correct. Usually the response surface model parameters are estimated by OLS. However, this method is highly sensitive to outliers. Outliers can generate substantial residuals and often affect the parameter estimates. The resulting estimators can be biased and could lead to errors in the determination of the optimal point, so that the main purpose of RSM is not achieved. Meanwhile, in real applications the collected data often contain several response variables and a set of independent variables. Treating each response separately and applying single-response procedures can result in wrong interpretations. A model for the multi-response case is therefore needed, that is, a multivariate response surface model that is resistant to outliers. As an alternative, this study discusses M-estimation as a parameter estimator for multivariate response surface models containing outliers. As an illustration, a case study is presented on experimental results for the enhancement of the surface layer of an aircraft aluminium alloy by shot peening.
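
    The robust-fitting idea can be sketched for a single response with Huber M-estimation via iteratively reweighted least squares. The quadratic model, the injected outliers and the tuning constant are invented; the paper extends this to the multivariate (multi-response) setting.

      # Huber M-estimation via iteratively reweighted least squares for a quadratic
      # response surface in one input (illustrative sketch; data are invented).
      import numpy as np

      rng = np.random.default_rng(3)
      x = np.linspace(-2, 2, 30)
      y = 5.0 + 1.5 * x - 2.0 * x**2 + rng.normal(0, 0.3, x.size)
      y[[4, 20]] += 8.0                                    # inject two outliers

      X = np.column_stack([np.ones_like(x), x, x**2])      # second-order model matrix
      beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start (outlier-sensitive)
      c = 1.345                                            # Huber tuning constant

      for _ in range(50):                                  # IRLS iterations
          resid = y - X @ beta
          scale = np.median(np.abs(resid)) / 0.6745 + 1e-12
          u = resid / scale
          w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u)) # Huber weights down-weight outliers
          W = np.diag(w)
          beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

      print("robust coefficients (intercept, linear, quadratic):", np.round(beta, 3))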

  1. Optimization Under Uncertainty of Site-Specific Turbine Configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, J.; Dykes, K.; Graf, P.

    Uncertainty affects many aspects of wind energy plant performance and cost. In this study, we explore opportunities for site-specific turbine configuration optimization that accounts for uncertainty in the wind resource. As a demonstration, a simple empirical model for wind plant cost of energy is used in an optimization under uncertainty to examine how different risk appetites affect the optimal selection of a turbine configuration for sites of different wind resource profiles. Lastly, if there is unusually high uncertainty in the site wind resource, the optimal turbine configuration diverges from the deterministic case and a generally more conservative design is obtained with increasing risk aversion on the part of the designer.
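
    The risk-appetite idea can be illustrated with a Monte Carlo sweep over candidate configurations. The cost-of-energy surrogate, the parameter ranges and the mean-plus-standard-deviation risk weighting below are invented stand-ins for the empirical model used in the study.

      # Risk-aware selection of a turbine configuration under wind-resource uncertainty (sketch).
      import numpy as np

      rng = np.random.default_rng(4)
      mean_wind, wind_sd = 7.5, 1.2                 # hypothetical site resource and its uncertainty
      wind_samples = rng.normal(mean_wind, wind_sd, 2000)

      def cost_of_energy(rotor_diameter, wind_speed):
          """Toy surrogate: bigger rotors cost more but capture more energy in weak winds."""
          capital = 1000.0 + 8.0 * rotor_diameter**1.5
          energy = (rotor_diameter**2) * np.clip(wind_speed, 3.0, 12.0) ** 3 * 1e-3
          return capital / energy

      for risk_aversion in (0.0, 1.0, 3.0):          # 0 = optimize the mean only
          best = None
          for D in np.arange(80.0, 161.0, 5.0):      # candidate rotor diameters (m)
              coe = cost_of_energy(D, wind_samples)
              score = coe.mean() + risk_aversion * coe.std()
              if best is None or score < best[0]:
                  best = (score, D)
          print(f"risk aversion {risk_aversion}: choose D = {best[1]:.0f} m")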

  2. Optimization under Uncertainty of Site-Specific Turbine Configurations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Dykes, Katherine; Graf, Peter

    Uncertainty affects many aspects of wind energy plant performance and cost. In this study, we explore opportunities for site-specific turbine configuration optimization that accounts for uncertainty in the wind resource. As a demonstration, a simple empirical model for wind plant cost of energy is used in an optimization under uncertainty to examine how different risk appetites affect the optimal selection of a turbine configuration for sites of different wind resource profiles. If there is unusually high uncertainty in the site wind resource, the optimal turbine configuration diverges from the deterministic case and a generally more conservative design is obtained with increasing risk aversion on the part of the designer.

  3. Optimization Under Uncertainty of Site-Specific Turbine Configurations

    DOE PAGES

    Quick, J.; Dykes, K.; Graf, P.; ...

    2016-10-03

    Uncertainty affects many aspects of wind energy plant performance and cost. In this study, we explore opportunities for site-specific turbine configuration optimization that accounts for uncertainty in the wind resource. As a demonstration, a simple empirical model for wind plant cost of energy is used in an optimization under uncertainty to examine how different risk appetites affect the optimal selection of a turbine configuration for sites of different wind resource profiles. Lastly, if there is unusually high uncertainty in the site wind resource, the optimal turbine configuration diverges from the deterministic case and a generally more conservative design is obtained with increasing risk aversion on the part of the designer.

  4. Optimization Under Uncertainty for Wake Steering Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quick, Julian; Annoni, Jennifer; King, Ryan N

    2017-08-03

    This presentation covers the motivation for this research, the optimization under uncertainty problem formulation, a two-turbine case, the Princess Amalia Wind Farm case, and conclusions and next steps.

  5. Training Post-9/11 Police Officers with a Counter-Terrorism Reality-Based Training Model: A Case Study

    ERIC Educational Resources Information Center

    Biddle, Christopher J.

    2013-01-01

    The purpose of this qualitative holistic multiple-case study was to identify the optimal theoretical approach for a Counter-Terrorism Reality-Based Training (CTRBT) model to train post-9/11 police officers to perform effectively in their counter-terrorism assignments. Post-9/11 police officers assigned to counter-terrorism duties are not trained…

  6. Drag Optimization Of Light Trucks Using Computational Fluid Dynamics

    DTIC Science & Technology

    2003-09-01

    In a design case study on the Lockheed C-141B aircraft wing, Cosentino and Holst [Ref. 10] reduced the number of design variables from 120 to 12... (Naval Postgraduate School, Monterey, CA) ...Two-dimensional light truck shape studies: canopies...

  7. Quantifying the economic benefits of prevention in a healthcare setting with severe financial constraints: the case of hypertension control.

    PubMed

    Athanasakis, Kostas; Kyriopoulos, Ilias-Ioannis; Boubouchairopoulou, Nadia; Stergiou, George S; Kyriopoulos, John

    2015-01-01

    Hypertension significantly contributes to the increased cardiovascular morbidity and mortality, thus leading to rising healthcare costs. The objective of this study was to quantify the clinical and economic benefits of optimal systolic blood pressure (SBP), in a setting under severe financial constraints, as in the case of Greece. Hence, a Markov model projecting 10-year outcomes and costs was adopted, in order to compare two scenarios. The first one depicted the "current setting", where all hypertensives in Greece presented an average SBP of 164 mmHg, while the second scenario namely "optimal SBP control" represented a hypothesis in which the whole population of hypertensives would achieve optimal SBP (i.e. <140 mmHg). Cardiovascular events' occurrence was estimated for four sub-models (according to gender and smoking status). Costs were calculated from the Greek healthcare system's perspective (discounted at a 3% annual rate). Findings showed that compared to the "current setting", universal "optimal SBP control" could, within a 10-year period, reduce the occurrence of non-fatal events and deaths, by 80 and 61 cases/1000 male smokers; 59 and 37 cases/1000 men non-smokers; whereas the respective figures for women were 69 and 57 cases/1000 women smokers; and accordingly, 52 and 28 cases/1000 women non-smokers. Considering health expenditures, they could be reduced by approximately €83 million per year. Therefore, prevention of cardiovascular events through BP control could result in reduced morbidity, thereby in substantial cost savings. Based on clinical and economic outcomes, interventions that promote BP control should be a health policy priority.
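
    The comparison of the two scenarios can be sketched with a minimal discounted Markov cohort model. All transition probabilities, state costs and the cohort size below are invented for illustration; the Greek model parameters and the four gender/smoking sub-models are not reproduced here.

      # Minimal 10-year Markov cohort model comparing two blood-pressure scenarios (sketch).
      import numpy as np

      states = ["well", "post_event", "dead"]
      annual_cost = np.array([100.0, 2500.0, 0.0])     # hypothetical cost per state-year (EUR)
      discount = 0.03
      cohort0 = np.array([1000.0, 0.0, 0.0])           # 1000 hypertensive patients start "well"

      def run(transition):
          cohort, total_cost, events = cohort0.copy(), 0.0, 0.0
          for year in range(10):
              events += cohort[0] * (transition[0, 1] + transition[0, 2])
              cohort = cohort @ transition
              total_cost += (cohort * annual_cost).sum() / (1 + discount) ** (year + 1)
          return events, total_cost

      current = np.array([[0.92, 0.05, 0.03],          # uncontrolled SBP: higher event risk
                          [0.00, 0.90, 0.10],
                          [0.00, 0.00, 1.00]])
      controlled = np.array([[0.96, 0.03, 0.01],       # optimal SBP control: lower event risk
                             [0.00, 0.92, 0.08],
                             [0.00, 0.00, 1.00]])

      for name, P in [("current setting", current), ("optimal SBP control", controlled)]:
          ev, cost = run(P)
          print(f"{name}: {ev:.0f} events or deaths, discounted cost EUR {cost:,.0f}")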

  8. Generalized massive optimal data compression

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin

    2018-05-01

    In this paper, we provide a general procedure for optimally compressing N data down to n summary statistics, where n is equal to the number of parameters of interest. We show that compression to the score function - the gradient of the log-likelihood with respect to the parameters - yields n compressed statistics that are optimal in the sense that they preserve the Fisher information content of the data. Our method generalizes earlier work on linear Karhunen-Loève compression for Gaussian data whilst recovering both lossless linear compression and quadratic estimation as special cases when they are optimal. We give a unified treatment that also includes the general non-Gaussian case as long as mild regularity conditions are satisfied, producing optimal non-linear summary statistics when appropriate. As a worked example, we derive explicitly the n optimal compressed statistics for Gaussian data in the general case where both the mean and covariance depend on the parameters.
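
    For the special case of Gaussian data with a parameter-dependent mean and fixed covariance, the score compression reduces to t = (dmu/dtheta)^T C^{-1} (d - mu(theta_fid)). The sketch below uses an invented linear mean model and noise level purely to show the mechanics.

      # Score compression of N Gaussian data points down to n = 2 parameter summaries (sketch).
      import numpy as np

      rng = np.random.default_rng(5)
      N = 500
      x = np.linspace(0.0, 1.0, N)
      C = np.diag(np.full(N, 0.1**2))                    # known data covariance
      Cinv = np.linalg.inv(C)

      def mean_model(theta):
          """Mean depends linearly on the two parameters of interest (amplitude, slope)."""
          return theta[0] + theta[1] * x

      theta_fid = np.array([1.0, 0.5])                   # fiducial parameters for the expansion
      dmu = np.column_stack([np.ones(N), x])             # d(mean)/d(theta), shape (N, 2)

      data = mean_model(np.array([1.05, 0.45])) + rng.normal(0, 0.1, N)

      # Compressed statistics t and the Fisher matrix for reference.
      t = dmu.T @ Cinv @ (data - mean_model(theta_fid))
      F = dmu.T @ Cinv @ dmu
      print("compressed summaries t:", np.round(t, 2))
      print("quasi maximum-likelihood update:", np.round(theta_fid + np.linalg.solve(F, t), 3))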

  9. Optimal design of zero-water discharge rinsing systems.

    PubMed

    Thöming, Jörg

    2002-03-01

    This paper is about zero liquid discharge in processes that use water for rinsing. Emphasis was given to those systems that contaminate process water with valuable process liquor and compounds. The approach involved the synthesis of optimal rinsing and recycling networks (RRN) that had a priori excluded water discharge. The total annualized costs of the RRN were minimized by the use of a mixed-integer nonlinear program (MINLP). This MINLP was based on a hyperstructure of the RRN and contained eight counterflow rinsing stages and three regenerator units: electrodialysis, reverse osmosis, and ion exchange columns. A "large-scale nickel plating process" case study showed that by means of zero-water discharge and optimized rinsing the total waste could be reduced by 90.4% at a revenue of $448,000/yr. Furthermore, with the optimized RRN, the rinsing performance can be improved significantly at a low-cost increase. In all the cases, the amount of valuable compounds reclaimed was above 99%.

  10. Aerospace Applications of Optimization under Uncertainty

    NASA Technical Reports Server (NTRS)

    Padula, Sharon; Gumbert, Clyde; Li, Wu

    2003-01-01

    The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center develops new methods and investigates opportunities for applying optimization to aerospace vehicle design. This paper describes MDO Branch experiences with three applications of optimization under uncertainty: (1) improved impact dynamics for airframes, (2) transonic airfoil optimization for low drag, and (3) coupled aerodynamic/structures optimization of a 3-D wing. For each case, a brief overview of the problem and references to previous publications are provided. The three cases are aerospace examples of the challenges and opportunities presented by optimization under uncertainty. The present paper will illustrate a variety of needs for this technology, summarize promising methods, and uncover fruitful areas for new research.

  11. Aerospace Applications of Optimization under Uncertainty

    NASA Technical Reports Server (NTRS)

    Padula, Sharon; Gumbert, Clyde; Li, Wu

    2006-01-01

    The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center develops new methods and investigates opportunities for applying optimization to aerospace vehicle design. This paper describes MDO Branch experiences with three applications of optimization under uncertainty: (1) improved impact dynamics for airframes, (2) transonic airfoil optimization for low drag, and (3) coupled aerodynamic/structures optimization of a 3-D wing. For each case, a brief overview of the problem and references to previous publications are provided. The three cases are aerospace examples of the challenges and opportunities presented by optimization under uncertainty. The present paper will illustrate a variety of needs for this technology, summarize promising methods, and uncover fruitful areas for new research.

  12. Query Optimization in Distributed Databases.

    DTIC Science & Technology

    1982-10-01

    In general, the strategy a31 a11 a3 is more time consuming than the strategy a, a, and usually we do not use it. Since the semijoin of R.XJ> RS requires... The focus is the study of the analytic behavior of those heuristic algorithms; although some analytic results of worst case and average case analysis are difficult to obtain, some...

  13. Very high-energy electron (VHEE) beams in radiation therapy; Treatment plan comparison between VHEE, VMAT, and PPBS.

    PubMed

    Schüler, Emil; Eriksson, Kjell; Hynning, Elin; Hancock, Steven L; Hiniker, Susan M; Bazalova-Carter, Magdalena; Wong, Tony; Le, Quynh-Thu; Loo, Billy W; Maxim, Peter G

    2017-06-01

    The aim of this study was to evaluate the performance of very high-energy electron beams (VHEE) in comparison to clinically derived treatment plans generated with volumetric modulated arc therapy (VMAT) and proton pencil beam scanning (PPBS) technology. We developed a custom optimization script that could be applied automatically across modalities to eliminate operator bias during IMRT optimization. Four clinical cases were selected (prostate cancer, lung cancer, pediatric brain tumor, and head and neck cancer (HNC)). The VHEE beams were calculated in the EGSnrc/DOSXYZnrc Monte Carlo code for 100 and 200 MeV beams. Treatment plans with VHEE, VMAT, and PPBS were optimized in a research version of RayStation using an in-house developed script to minimize operator bias between the different techniques. The in-house developed script generated similar or superior plans to the clinically used plans. In the comparisons between the modalities, the integral dose was lowest for the PPBS-generated plans in all cases. For the prostate case, the 200 MeV VHEE plan showed reduced integral dose and reduced organ at risk (OAR) dose compared to the VMAT plan. For all other cases, both the 100 and the 200 MeV VHEE plans were superior to the VMAT plans, and the VHEE plans showed better conformity and lower spinal cord dose in the pediatric brain case and lower brain stem dose in the HNC case when compared to the PPBS plan. The automated optimization developed in this study generated similar or superior plans as compared to the clinically used plan and represents an unbiased approach to compare treatment plans generated for different modalities. In the present study, we also show that VHEE plans are similar or superior to VMAT plans with reduced mean OAR dose and increased target conformity for a variety of clinical cases, and VHEE plans can even achieve reductions in OAR doses compared to PPBS plans for shallow targets. With increased VHEE energy, better conformity and even higher reductions in mean OAR doses are achieved. On the whole, VHEE was intermediate between photon VMAT and PPBS for OAR sparing. © 2017 American Association of Physicists in Medicine.

  14. Minimum error discrimination between similarity-transformed quantum states

    NASA Astrophysics Data System (ADS)

    Jafarizadeh, M. A.; Sufiani, R.; Mazhari Khiavi, Y.

    2011-07-01

    Using the well-known necessary and sufficient conditions for minimum error discrimination (MED), we extract an equivalent form for the MED conditions. In fact, by replacing the inequalities corresponding to the MED conditions with an equivalent but more suitable and convenient identity, the problem of mixed state discrimination with optimal success probability is solved. Moreover, we show that the mentioned optimality conditions can be viewed as a Helstrom family of ensembles under some circumstances. Using the given identity, MED between N similarity-transformed equiprobable quantum states is investigated. When the unitary operators generate an irreducible representation, the optimal set of measurements and the corresponding maximum success probability of discrimination can be determined precisely. In particular, it is shown that for equiprobable pure states, the optimal measurement strategy is the square-root measurement (SRM), whereas for mixed states, SRM is not optimal. When the unitary operators are reducible, there is no closed-form formula in general, but the procedure can be applied to each case individually. Finally, we give the maximum success probability of optimal discrimination for some important examples of mixed quantum states, such as generalized Bloch sphere m-qubit states, spin-j states, particular nonsymmetric qudit states, etc.

  15. Minimum error discrimination between similarity-transformed quantum states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jafarizadeh, M. A.; Institute for Studies in Theoretical Physics and Mathematics, Tehran 19395-1795; Research Institute for Fundamental Sciences, Tabriz 51664

    2011-07-15

    Using the well-known necessary and sufficient conditions for minimum error discrimination (MED), we extract an equivalent form for the MED conditions. In fact, by replacing the inequalities corresponding to the MED conditions with an equivalent but more suitable and convenient identity, the problem of mixed state discrimination with optimal success probability is solved. Moreover, we show that the mentioned optimality conditions can be viewed as a Helstrom family of ensembles under some circumstances. Using the given identity, MED between N similarity-transformed equiprobable quantum states is investigated. When the unitary operators generate an irreducible representation, the optimal set of measurements and the corresponding maximum success probability of discrimination can be determined precisely. In particular, it is shown that for equiprobable pure states, the optimal measurement strategy is the square-root measurement (SRM), whereas for mixed states, SRM is not optimal. When the unitary operators are reducible, there is no closed-form formula in general, but the procedure can be applied to each case individually. Finally, we give the maximum success probability of optimal discrimination for some important examples of mixed quantum states, such as generalized Bloch sphere m-qubit states, spin-j states, particular nonsymmetric qudit states, etc.
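
    For the equiprobable pure-state case mentioned above, the square-root measurement has a simple closed form that is easy to evaluate numerically. The sketch below is a minimal illustration with three assumed symmetric qubit states (not taken from the paper): it builds the SRM elements E_i = p_i rho^{-1/2} rho_i rho^{-1/2}, with rho = sum_j p_j rho_j, and reports the success probability sum_i p_i Tr(rho_i E_i).

    ```python
    # Square-root measurement (SRM) for a set of equiprobable pure qubit states.
    # The three states below are arbitrary examples, not those of the paper.
    import numpy as np
    from scipy.linalg import fractional_matrix_power

    def ket(theta):                      # pure qubit state on a great circle
        return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

    states = [np.outer(ket(t), ket(t).conj()) for t in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]
    p = np.full(len(states), 1.0 / len(states))          # equal prior probabilities

    rho = sum(pi * ri for pi, ri in zip(p, states))      # average density matrix
    rho_inv_sqrt = fractional_matrix_power(rho, -0.5)

    # SRM POVM elements: E_i = p_i * rho^{-1/2} rho_i rho^{-1/2}
    E = [pi * rho_inv_sqrt @ ri @ rho_inv_sqrt for pi, ri in zip(p, states)]
    assert np.allclose(sum(E), np.eye(2))                # completeness check

    p_success = sum(pi * np.trace(ri @ Ei).real for pi, ri, Ei in zip(p, states, E))
    print(f"SRM success probability: {p_success:.4f}")
    ```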

  16. Three dimensional intensity modulated brachytherapy (IMBT): dosimetry algorithm and inverse treatment planning.

    PubMed

    Shi, Chengyu; Guo, Bingqi; Cheng, Chih-Yao; Esquivel, Carlos; Eng, Tony; Papanikolaou, Niko

    2010-07-01

    The feasibility of intensity modulated brachytherapy (IMBT) to improve dose conformity for irregularly shaped targets has been previously investigated by researchers using partially shielded sources. However, partial shielding does not fully explore the potential of IMBT. The goal of this study is to introduce the concept of three dimensional (3D) intensity modulated brachytherapy and solve two fundamental issues regarding the application of 3D IMBT treatment planning: the dose calculation algorithm and the inverse treatment planning method. A 3D IMBT treatment planning system prototype was developed using the MATLAB platform. This system consists of three major components: (1) a comprehensive IMBT source calibration method with dosimetric inputs from Monte Carlo (EGSnrc) simulations; (2) a "modified TG-43" (mTG-43) dose calculation formalism for IMBT dosimetry; and (3) a physical constraint based inverse IMBT treatment planning platform utilizing a simulated annealing optimization algorithm. The model S700 Axxent electronic brachytherapy source developed by Xoft, Inc. (Fremont, CA), was simulated in this application. Ten intracavitary accelerated partial breast irradiation (APBI) cases were studied. For each case, an "isotropic plan" with only optimized source dwell time and a fully optimized IMBT plan were generated and compared to the original plan in various dosimetric aspects, such as plan quality, planning time, and delivery time. The issue of the mechanical complexity of the IMBT applicator is not addressed in this study. IMBT approaches showed superior plan quality compared to the original plans and the isotropic plans to different extents in all studied cases. In an extremely difficult case with a small breast and small distances to the ribs and skin, the IMBT plan reduced the high dose volume V200 by 16.1% and 4.8% compared to the original and the isotropic plans, respectively. The conformity index for the target was increased by 0.13 and 0.04, respectively. The maximum dose to the skin was reduced by 56 and 28 cGy per fraction, respectively. Also, the maximum dose to the ribs was reduced by 104 and 96 cGy per fraction, respectively. The mean doses to the ipsilateral and contralateral breasts and lungs were also slightly reduced by the IMBT plan. The limitations of IMBT are the longer planning and delivery times. The IMBT plan took around 2 h to optimize, while the isotropic plan optimization could reach the global minimum within 5 min. The delivery time for the IMBT plan is typically four to six times longer than that of the corresponding isotropic plan. In this study, a dosimetry method for IMBT sources was proposed and an inverse treatment planning system prototype for IMBT was developed. The improvement of plan quality by 3D IMBT was demonstrated using ten APBI case studies. Faster computers and higher source output can further reduce plan optimization and delivery times, respectively.
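
    The inverse planning step above relies on simulated annealing over source dwell times. The sketch below is a greatly simplified, generic stand-in: a random dose-influence matrix and a quadratic penalty on deviation from a prescription replace the mTG-43 dosimetry and the physical constraints of the paper.

    ```python
    # Simulated annealing over dwell times t >= 0 to push voxel doses D = A @ t
    # toward a prescription. A is a random stand-in for a dose-influence matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    n_voxels, n_dwell = 200, 24
    A = rng.random((n_voxels, n_dwell))          # dose per unit dwell time (toy)
    prescription = np.full(n_voxels, 10.0)

    def objective(t):
        dose = A @ t
        return float(np.sum((dose - prescription) ** 2))   # quadratic dose penalty

    t = np.full(n_dwell, 0.5)                    # initial dwell times
    best_t, best_f = t.copy(), objective(t)
    T = 1.0
    for it in range(20000):
        cand = np.clip(t + rng.normal(scale=0.05, size=n_dwell), 0, None)
        df = objective(cand) - objective(t)
        if df < 0 or rng.random() < np.exp(-df / T):        # Metropolis acceptance
            t = cand
            if objective(t) < best_f:
                best_t, best_f = t.copy(), objective(t)
        T *= 0.9995                              # geometric cooling schedule
    print(f"best objective: {best_f:.2f}")
    ```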

  17. Illumination system development using design and analysis of computer experiments

    NASA Astrophysics Data System (ADS)

    Keresztes, Janos C.; De Ketelaere, Bart; Audenaert, Jan; Koshel, R. J.; Saeys, Wouter

    2015-09-01

    Computer-assisted optimal illumination design is crucial when developing cost-effective machine vision systems. Standard local optimization methods, such as downhill simplex optimization (DHSO), often converge to a local minimum and yield a solution that depends on the starting point, especially when dealing with high-dimensional illumination designs or nonlinear merit spaces. This work presents a novel nonlinear optimization approach based on design and analysis of computer experiments (DACE). The methodology is first illustrated with a 2D case study of four light sources symmetrically positioned along a fixed arc in order to obtain optimal irradiance uniformity on a flat Lambertian reflecting target at the arc center. The first step consists of choosing angular positions with no overlap between sources using a fast, flexible space-filling design. Ray-tracing simulations are then performed at the design points, and a merit function is used for each configuration to quantify the homogeneity of the irradiance at the target. The homogeneities obtained at the design points are further used as input to a Gaussian Process (GP), which develops a preliminary distribution for the expected merit space. Global optimization is then performed on the GP, which is more likely to provide optimal parameters. Next, the light positioning case study is further investigated by varying the radius of the arc and by adding two spots symmetrically positioned along an arc diametrically opposed to the first one. In terms of convergence, DACE reached the same 97% uniformity 6 times faster than the standard simplex method. The obtained results were successfully validated experimentally using a short-wavelength infrared (SWIR) hyperspectral imager monitoring a Spectralon panel illuminated by tungsten halogen sources, with 10% relative error.
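
    The DACE workflow described here (space-filling design, expensive simulations at the design points, Gaussian Process fit, optimization on the surrogate) can be sketched as follows. The merit function is a placeholder analytic function, not the ray-traced irradiance uniformity of the study; scikit-learn and SciPy are assumed to be available.

    ```python
    # DACE-style surrogate optimization: Latin hypercube design -> expensive merit
    # evaluations -> Gaussian Process fit -> optimize on the cheap surrogate.
    import numpy as np
    from scipy.stats import qmc
    from scipy.optimize import minimize
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def merit(x):                      # placeholder for a ray-tracing uniformity merit
        return np.sum((x - 0.3) ** 2) + 0.05 * np.sin(10 * x).sum()

    dim, n_design = 4, 40              # e.g. angular positions of four sources
    sampler = qmc.LatinHypercube(d=dim, seed=1)
    X = sampler.random(n_design)       # space-filling design in [0, 1]^dim
    y = np.array([merit(x) for x in X])

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

    # Optimize the surrogate mean from a few random restarts.
    best = min((minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
                         x0=np.random.rand(dim), bounds=[(0, 1)] * dim)
                for _ in range(10)), key=lambda r: r.fun)
    print("surrogate optimum:", best.x, "predicted merit:", best.fun)
    ```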

  18. Optimal management of substrates in anaerobic co-digestion: An ant colony algorithm approach.

    PubMed

    Verdaguer, Marta; Molinos-Senante, María; Poch, Manel

    2016-04-01

    Sewage sludge (SWS) is inevitably produced in urban wastewater treatment plants (WWTPs). The treatment of SWS on site at small WWTPs is not economical; therefore, the SWS is typically transported to an alternative SWS treatment center. There is increased interest in the use of anaerobic digestion (AnD) with co-digestion as an SWS treatment alternative. Although the availability of different co-substrates has been ignored in most of the previous studies, it is an essential issue for the optimization of AnD co-digestion. In a pioneering approach, this paper applies an Ant-Colony-Optimization (ACO) algorithm that maximizes the generation of biogas through AnD co-digestion in order to optimize the discharge of organic waste from different waste sources in real-time. An empirical application is developed based on a virtual case study that involves organic waste from urban WWTPs and agrifood activities. The results illustrate the dominant role of toxicity levels in selecting contributions to the AnD input. The methodology and case study proposed in this paper demonstrate the usefulness of the ACO approach in supporting a decision process that contributes to improving the sustainability of organic waste and SWS management. Copyright © 2016 Elsevier Ltd. All rights reserved.
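
    The ACO step here selects which waste streams feed the digester under a toxicity limit. The sketch below is a generic ant-colony, knapsack-style selection with invented biogas yields and toxicity scores; it illustrates the pheromone/heuristic mechanics rather than the authors' real-time model.

    ```python
    # Ant colony optimization: choose a subset of co-substrates maximizing biogas
    # yield subject to a toxicity budget. All numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    n_waste = 12
    biogas = rng.uniform(5, 50, n_waste)      # m3 biogas per tonne (toy values)
    toxicity = rng.uniform(1, 10, n_waste)    # relative inhibition score (toy)
    TOX_BUDGET = 30.0

    tau = np.ones(n_waste)                    # pheromone per candidate substrate
    eta = biogas / toxicity                   # heuristic desirability
    best_sel, best_gas = [], -1.0

    for iteration in range(200):
        for ant in range(20):
            sel, tox = [], 0.0
            for i in rng.permutation(n_waste):     # probabilistic inclusion decision
                p = (tau[i] * eta[i]) / (tau[i] * eta[i] + 1.0)
                if rng.random() < p and tox + toxicity[i] <= TOX_BUDGET:
                    sel.append(i)
                    tox += toxicity[i]
            gas = biogas[sel].sum() if sel else 0.0
            if gas > best_gas:
                best_sel, best_gas = sel, gas
        tau *= 0.9                            # evaporation
        tau[best_sel] += best_gas / biogas.sum()   # reinforce best-so-far choice
    print(f"best biogas: {best_gas:.1f} m3 using substrates {sorted(best_sel)}")
    ```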

  19. Optimization of a Coastal Environmental Monitoring Network Based on the Kriging Method: A Case Study of Quanzhou Bay, China

    PubMed Central

    Chen, Kai; Ni, Minjie; Wang, Jun; Huang, Dongren; Chen, Huorong; Wang, Xiao; Liu, Mengyang

    2016-01-01

    Environmental monitoring is fundamental for assessing environmental quality and for fulfilling protection and management measures under permit conditions. However, coastal environmental monitoring faces many problems and challenges, including the facts that monitoring information is not linked to evaluation, that monitoring data do not adequately reflect the current coastal environmental condition, and that monitoring activities are limited by cost constraints. For these reasons, policy makers who intend to solve this issue cannot develop and implement protection and management measures well. In this paper, Quanzhou Bay in southeastern China was selected as a case study, and the Kriging method and a geographic information system were employed to evaluate and optimize the existing monitoring network in a semienclosed bay. This study used coastal environmental monitoring data from 15 sites (including COD, DIN, and PO4-P) to adequately analyze the water quality from 2009 to 2012 by applying the Trophic State Index. The monitoring network in Quanzhou Bay was evaluated and optimized, with the number of sites increased from 15 to 24 and the monitoring precision improved by 32.9%. The results demonstrated that the proposed advanced monitoring network optimization was appropriate for environmental monitoring in Quanzhou Bay. It might provide technical support for coastal management and pollutant reduction in similar areas. PMID:27777951

  20. Optimization of a Coastal Environmental Monitoring Network Based on the Kriging Method: A Case Study of Quanzhou Bay, China.

    PubMed

    Chen, Kai; Ni, Minjie; Cai, Minggang; Wang, Jun; Huang, Dongren; Chen, Huorong; Wang, Xiao; Liu, Mengyang

    2016-01-01

    Environmental monitoring is fundamental for assessing environmental quality and for fulfilling protection and management measures under permit conditions. However, coastal environmental monitoring faces many problems and challenges, including the facts that monitoring information is not linked to evaluation, that monitoring data do not adequately reflect the current coastal environmental condition, and that monitoring activities are limited by cost constraints. For these reasons, policy makers who intend to solve this issue cannot develop and implement protection and management measures well. In this paper, Quanzhou Bay in southeastern China was selected as a case study, and the Kriging method and a geographic information system were employed to evaluate and optimize the existing monitoring network in a semienclosed bay. This study used coastal environmental monitoring data from 15 sites (including COD, DIN, and PO4-P) to adequately analyze the water quality from 2009 to 2012 by applying the Trophic State Index. The monitoring network in Quanzhou Bay was evaluated and optimized, with the number of sites increased from 15 to 24 and the monitoring precision improved by 32.9%. The results demonstrated that the proposed advanced monitoring network optimization was appropriate for environmental monitoring in Quanzhou Bay. It might provide technical support for coastal management and pollutant reduction in similar areas.
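
    As a rough illustration of the kriging step, the sketch below interpolates a water-quality indicator from scattered stations and uses the kriging variance to flag where new stations would add the most information. Station coordinates and values are invented and the pykrige package is assumed; the Trophic State Index and GIS processing of the study are not reproduced.

    ```python
    # Ordinary kriging of a water-quality indicator over a bay, then ranking grid
    # cells by kriging variance as candidate locations for new monitoring sites.
    # Station coordinates and values are invented for illustration.
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    rng = np.random.default_rng(3)
    x = rng.uniform(0, 10, 15)                 # station easting (km)
    y = rng.uniform(0, 6, 15)                  # station northing (km)
    z = 2.0 + 0.3 * x - 0.2 * y + rng.normal(0, 0.1, 15)   # e.g. DIN (mg/L), toy

    ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
    gridx = np.linspace(0, 10, 50)
    gridy = np.linspace(0, 6, 30)
    z_pred, z_var = ok.execute("grid", gridx, gridy)        # estimates + variances

    # Highest-variance cells are the least constrained by the existing network.
    iy, ix = np.unravel_index(np.argsort(z_var, axis=None)[-5:], z_var.shape)
    print("candidate new stations (x, y):",
          list(zip(gridx[ix].round(2), gridy[iy].round(2))))
    ```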

  1. CONDUIT: A New Multidisciplinary Integration Environment for Flight Control Development

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Colbourne, Jason D.; Morel, Mark R.; Biezad, Daniel J.; Levine, William S.; Moldoveanu, Veronica

    1997-01-01

    A state-of-the-art computational facility for aircraft flight control design, evaluation, and integration called CONDUIT (Control Designer's Unified Interface) has been developed. This paper describes the CONDUIT tool and case study applications to complex rotary- and fixed-wing fly-by-wire flight control problems. Control system analysis and design optimization methods are presented, including definition of design specifications and system models within CONDUIT, and the multi-objective function optimization (CONSOL-OPTCAD) used to tune the selected design parameters. Design examples are based on flight test programs for which extensive data are available for validation. CONDUIT is used to analyze baseline control laws against pertinent military handling qualities and control system specifications. In both case studies, CONDUIT successfully exploits trade-offs between forward loop and feedback dynamics to significantly improve the expected handling qualities and minimize the required actuator authority. The CONDUIT system provides a new environment for integrated control system analysis and design, and has potential for significantly reducing the time and cost of control system flight test optimization.

  2. Discrete harmony search algorithm for scheduling and rescheduling the reprocessing problems in remanufacturing: a case study

    NASA Astrophysics Data System (ADS)

    Gao, Kaizhou; Wang, Ling; Luo, Jianping; Jiang, Hua; Sadollah, Ali; Pan, Quanke

    2018-06-01

    In this article, scheduling and rescheduling problems with increasing processing time and new job insertion are studied for reprocessing problems in the remanufacturing process. To handle the unpredictability of reprocessing time, an experience-based strategy is used. Rescheduling strategies are applied for considering the effect of increasing reprocessing time and the new subassembly insertion. To optimize the scheduling and rescheduling objective, a discrete harmony search (DHS) algorithm is proposed. To speed up the convergence rate, a local search method is designed. The DHS is applied to two real-life cases for minimizing the maximum completion time and the mean of earliness and tardiness (E/T). These two objectives are also considered together as a bi-objective problem. Computational optimization results and comparisons show that the proposed DHS is able to solve the scheduling and rescheduling problems effectively and productively. Using the proposed approach, satisfactory optimization results can be achieved for scheduling and rescheduling on a real-life shop floor.
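
    A bare-bones version of the harmony search mechanics used above (harmony memory, memory consideration, pitch adjustment) is sketched below for a permutation-scheduling objective. The tardiness evaluator is a toy single-machine stand-in, not the reprocessing model or the local search of the paper.

    ```python
    # Discrete harmony search over job permutations for a toy scheduling objective.
    # The tardiness measure below is a stand-in; the paper's reprocessing model is richer.
    import numpy as np

    rng = np.random.default_rng(4)
    n_jobs, HMS, HMCR, PAR, iters = 10, 8, 0.9, 0.3, 2000
    proc = rng.uniform(1, 10, n_jobs)          # processing times (toy)
    due = np.sort(rng.uniform(10, 60, n_jobs)) # due dates (toy)

    def cost(perm):                            # total tardiness of a sequence
        finish = np.cumsum(proc[perm])
        return float(np.maximum(finish - due, 0).sum())

    memory = [rng.permutation(n_jobs) for _ in range(HMS)]
    for _ in range(iters):
        if rng.random() < HMCR:                # memory consideration
            new = memory[rng.integers(HMS)].copy()
            if rng.random() < PAR:             # pitch adjustment = swap two jobs
                i, j = rng.integers(n_jobs, size=2)
                new[i], new[j] = new[j], new[i]
        else:                                  # random improvisation
            new = rng.permutation(n_jobs)
        worst = max(range(HMS), key=lambda k: cost(memory[k]))
        if cost(new) < cost(memory[worst]):    # replace the worst harmony
            memory[worst] = new
    best = min(memory, key=cost)
    print("best sequence:", best, "tardiness:", round(cost(best), 2))
    ```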

  3. Functional and Structural Optimality in Plant Growth: A Crop Modelling Case Study

    NASA Astrophysics Data System (ADS)

    Caldararu, S.; Purves, D. W.; Smith, M. J.

    2014-12-01

    Simple mechanistic models of vegetation processes are essential both to our understanding of plant behaviour and to our ability to predict future changes in vegetation. One concept that can take us closer to such models is that of plant optimality, the hypothesis that plants aim to achieve an optimal state. Conceptually, plant optimality can be either structural or functional optimality. A structural constraint would mean that plants aim to achieve a certain structural characteristic such as an allometric relationship or nutrient content that allows optimal function. A functional condition refers to plants achieving optimal functionality, in most cases by maximising carbon gain. Functional optimality conditions are applied on shorter time scales and lead to higher plasticity, making plants more adaptable to changes in their environment. In contrast, structural constraints are optimal given the specific environmental conditions that plants are adapted to and offer less flexibility. We exemplify these concepts using a simple model of crop growth. The model represents annual cycles of growth from sowing date to harvest, including both vegetative and reproductive growth and phenology. Structural constraints to growth are represented as an optimal C:N ratio in all plant organs, which drives allocation throughout the vegetative growing stage. Reproductive phenology - i.e. the onset of flowering and grain filling - is determined by a functional optimality condition in the form of maximising final seed mass, so that vegetative growth stops when the plant reaches maximum nitrogen or carbon uptake. We investigate the plants' response to variations in environmental conditions within these two optimality constraints and show that final yield is most affected by changes during vegetative growth which affect the structural constraint.

  4. Directed Diffusion Modelling for Tesso Nilo National Parks Case Study

    NASA Astrophysics Data System (ADS)

    Yasri, Indra; Safrianti, Ery

    2018-01-01

    Directed Diffusion (DD) can achieve energy efficiency in Wireless Sensor Networks (WSN). This paper proposes Directed Diffusion (DD) modelling for a Tesso Nilo National Parks (TNNP) case study. Four stages of scenarios are involved in this modelling. It starts by selecting the sampling area through GPS coordinates. The sampling area is determined by an optimization process over sizes from 500 m x 500 m up to 1000 m x 1000 m in 100 m increments. The next stage is sensor node placement: sensor nodes are distributed in the sampling area in three different quantities, i.e. 20, 30 and 40 nodes, and one of these quantities is chosen as the optimized sensor node placement. The third stage implements all scenarios from stages 1 and 2 in the DD model. In the last stage, an evaluation identifies the most energy-efficient combination of optimized sampling area and optimized sensor node placement under the Directed Diffusion (DD) routing protocol. The results show that the combination of a 500 m x 500 m sampling area and 20 nodes achieves the energy efficiency needed to support a forest fire prevention system at Tesso Nilo National Parks.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ranganathan, V; Kumar, P; Bzdusek, K

    Purpose: We propose a novel data-driven method to predict the achievability of clinical objectives upfront before invoking the IMRT optimization. Methods: A new metric called “Geometric Complexity (GC)” is used to estimate the achievability of clinical objectives. Here, GC is the measure of the number of “unmodulated” beamlets or rays that intersect the Region-of-interest (ROI) and the target volume. We first compute the geometric complexity ratio (GCratio) between the GC of a ROI (say, parotid) in a reference plan and the GC of the same ROI in a given plan. The GCratio of a ROI indicates the relative geometric complexity of the ROI as compared to the same ROI in the reference plan. Hence GCratio can be used to predict if a defined clinical objective associated with the ROI can be met by the optimizer for a given case. Basically a higher GCratio indicates a lesser likelihood for the optimizer to achieve the clinical objective defined for a given ROI. Similarly, a lower GCratio indicates a higher likelihood for the optimizer to achieve the clinical objective defined for the given ROI. We have evaluated the proposed method on four Head and Neck cases using Pinnacle3 (version 9.10.0) Treatment Planning System (TPS). Results: Out of the total of 28 clinical objectives from four head and neck cases included in the study, 25 were in agreement with the prediction, which implies an agreement of about 85% between predicted and obtained results. The Pearson correlation test shows a positive correlation between predicted and obtained results (Correlation = 0.82, r2 = 0.64, p < 0.005). Conclusion: The study demonstrates the feasibility of the proposed method in head and neck cases for predicting the achievability of clinical objectives with reasonable accuracy.

  6. Fully integrated aerodynamic/dynamic optimization of helicopter rotor blades

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Lamarsh, William J., II; Adelman, Howard M.

    1992-01-01

    This paper describes a fully integrated aerodynamic/dynamic optimization procedure for helicopter rotor blades. The procedure combines performance and dynamics analyses with a general purpose optimizer. The procedure minimizes a linear combination of power required (in hover, forward flight, and maneuver) and vibratory hub shear. The design variables include pretwist, taper initiation, taper ratio, root chord, blade stiffnesses, tuning masses, and tuning mass locations. Aerodynamic constraints consist of limits on power required in hover, forward flight and maneuver; airfoil section stall; drag divergence Mach number; minimum tip chord; and trim. Dynamic constraints are on frequencies, minimum autorotational inertia, and maximum blade weight. The procedure is demonstrated for two cases. In the first case the objective function involves power required (in hover, forward flight, and maneuver) and dynamics. The second case involves only hover power and dynamics. The designs from the integrated procedure are compared with designs from a sequential optimization approach in which the blade is first optimized for performance and then for dynamics. In both cases, the integrated approach is superior.

  7. Fully integrated aerodynamic/dynamic optimization of helicopter rotor blades

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Lamarsh, William J., II; Adelman, Howard M.

    1992-01-01

    A fully integrated aerodynamic/dynamic optimization procedure is described for helicopter rotor blades. The procedure combines performance and dynamic analyses with a general purpose optimizer. The procedure minimizes a linear combination of power required (in hover, forward flight, and maneuver) and vibratory hub shear. The design variables include pretwist, taper initiation, taper ratio, root chord, blade stiffnesses, tuning masses, and tuning mass locations. Aerodynamic constraints consist of limits on power required in hover, forward flight and maneuvers; airfoil section stall; drag divergence Mach number; minimum tip chord; and trim. Dynamic constraints are on frequencies, minimum autorotational inertia, and maximum blade weight. The procedure is demonstrated for two cases. In the first case, the objective function involves power required (in hover, forward flight and maneuver) and dynamics. The second case involves only hover power and dynamics. The designs from the integrated procedure are compared with designs from a sequential optimization approach in which the blade is first optimized for performance and then for dynamics. In both cases, the integrated approach is superior.

  8. Improved Ant Algorithms for Software Testing Cases Generation

    PubMed Central

    Yang, Shunkun; Xu, Jiaqi

    2014-01-01

    Ant colony optimization (ACO) for software test case generation is a very popular topic in software testing engineering. However, traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO), which is based on all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain premature convergence, promote case coverage, and reduce the number of iterations. PMID:24883391

  9. Sequential Optimization Methods for Augmentation of Marine Enzymes Production in Solid-State Fermentation: l-Glutaminase Production a Case Study.

    PubMed

    Sathish, T; Uppuluri, K B; Veera Bramha Chari, P; Kezia, D

    There is an increased worldwide market for l-glutaminase due to its relevant industrial applications. Salt-tolerant l-glutaminases play a vital role in enhancing the flavor of different types of foods such as soya sauce and tofu. This chapter presents the economically viable production of l-glutaminase in solid-state fermentation (SSF) by Aspergillus flavus MTCC 9972 as a case study. Enzyme production was improved following a three-step optimization process. Initially, a mixture design (MD; augmented simplex lattice design) was employed to optimize the solid substrate mixture; a solid substrate mixture consisting of 59:41 wheat bran and Bengal gram husk gave higher amounts of l-glutaminase. Glucose and l-glutamine were screened as the best additional carbon and nitrogen sources for l-glutaminase production with the help of a Plackett-Burman design (PBD). l-Glutamine also acts as a nitrogen source as well as an inducer for secretion of l-glutaminase from A. flavus MTCC 9972. In the final step of optimization, various environmental and nutritive parameters such as pH, temperature, moisture content, inoculum concentration, and glucose and l-glutamine levels were optimized through the use of hybrid feed-forward neural networks (FFNNs) and a genetic algorithm (GA). Through the sequential optimization methods MD-PBD-FFNN-GA, l-glutaminase production in SSF could be improved by 2.7-fold (453-1690 U/g). © 2016 Elsevier Inc. All rights reserved.

  10. A pheromone-rate-based analysis on the convergence time of ACO algorithm.

    PubMed

    Huang, Han; Wu, Chun-Guo; Hao, Zhi-Feng

    2009-08-01

    Ant colony optimization (ACO) has widely been applied to solve combinatorial optimization problems in recent years. There are few studies, however, on its convergence time, which reflects how many iterations ACO algorithms spend converging to the optimal solution. Based on the absorbing Markov chain model, we analyze the ACO convergence time in this paper. First, we present a general result for the estimation of convergence time to reveal the relationship between convergence time and pheromone rate. This general result is then extended to a two-step analysis of the convergence time, which includes the following: 1) the iteration time that the pheromone rate spends on reaching the objective value and 2) the convergence time that is calculated with the objective pheromone rate in expectation. Furthermore, four brief ACO algorithms are investigated by using the proposed theoretical results as case studies. Finally, the conclusion of the case studies, namely that the pheromone rate and its deviation determine the expected convergence time, is numerically verified with the experimental results of four one-ant ACO algorithms and four ten-ant ACO algorithms.

  11. Solving mixed integer nonlinear programming problems using spiral dynamics optimization algorithm

    NASA Astrophysics Data System (ADS)

    Kania, Adhe; Sidarto, Kuntjoro Adji

    2016-02-01

    Many engineering and practical problems can be modeled by mixed integer nonlinear programming. This paper proposes to solve such problems with a modified version of the spiral dynamics inspired optimization method of Tamura and Yasuda. Four test cases have been examined, including problems in engineering and sport. The method succeeds in obtaining the optimal result in all test cases.
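
    The core of the spiral dynamics update of Tamura and Yasuda is that every search point rotates around, and contracts toward, the best point found so far. A minimal continuous 2-D sketch is given below; the handling of integer variables and the authors' modifications are omitted, and the objective is purely illustrative.

    ```python
    # Spiral dynamics optimization (2-D sketch): points spiral in toward the best
    # point found so far. Objective and parameters are illustrative only.
    import numpy as np

    def f(x):                                   # toy objective to minimize
        return (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2 + 0.1 * np.sin(3 * x[0])

    rng = np.random.default_rng(5)
    m, iters, r, theta = 30, 200, 0.95, np.pi / 4
    R = np.array([[np.cos(theta), -np.sin(theta)],       # rotation matrix
                  [np.sin(theta),  np.cos(theta)]])

    X = rng.uniform(-5, 5, size=(m, 2))                  # initial search points
    best = min(X, key=f).copy()
    for _ in range(iters):
        # x_i <- x* + r * R (x_i - x*): rotate about the best point and contract
        X = best + (r * (R @ (X - best).T)).T
        cand = min(X, key=f)
        if f(cand) < f(best):
            best = cand.copy()
    print("best point:", best.round(3), "f =", round(float(f(best)), 4))
    ```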

  12. Computer-aided resource planning and scheduling for radiological services

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.

  13. Investigation of Low-Reynolds-Number Rocket Nozzle Design Using PNS-Based Optimization Procedure

    NASA Technical Reports Server (NTRS)

    Hussaini, M. Moin; Korte, John J.

    1996-01-01

    An optimization approach to rocket nozzle design, based on computational fluid dynamics (CFD) methodology, is investigated for low-Reynolds-number cases. This study is undertaken to determine the benefits of this approach over those of classical design processes such as Rao's method. A CFD-based optimization procedure, using the parabolized Navier-Stokes (PNS) equations, is used to design conical and contoured axisymmetric nozzles. The advantage of this procedure is that it accounts for viscosity during the design process; other processes make an approximated boundary-layer correction after an inviscid design is created. Results showed significant improvement in the nozzle thrust coefficient over that of the baseline case; however, the unusual nozzle design necessitates further investigation of the accuracy of the PNS equations for modeling expanding flows with thick laminar boundary layers.

  14. Establishing a culturally specific nursing home for Finnish-speaking older persons in Sweden: A case study.

    PubMed

    Hadziabdic, Emina; Hjelm, Katarina

    2018-04-01

    The study aims to describe the establishment of a culturally specific nursing home for Finnish-speaking older persons in Sweden. The design is a descriptive qualitative study: a descriptive case study based on a review of 14 public documents and individual interviews with two experts in the area, analysed with qualitative content analysis. This study found that a shared language, the preservation of customs and habits, and collaboration between representatives of the municipality, Finnish-speaking migrant associations and staff at the nursing home influenced the development of the culturally specific nursing home for older Finnish-speaking people, which was intended to avoid loneliness, isolation and misunderstandings among older Finnish-speaking persons. Collaboration between healthcare services for older persons and the minority group resulted in an optimal culturally specific nursing home while simultaneously encountering the majority culture. Nursing and healthcare services need to be aware of the positive effects of collaboration with stakeholders to achieve optimal culturally specific nursing homes.

  15. Molecular taxonomy of phytopathogenic fungi: a case study in Peronospora.

    PubMed

    Göker, Markus; García-Blázquez, Gema; Voglmayr, Hermann; Tellería, M Teresa; Martín, María P

    2009-07-29

    Inappropriate taxon definitions may have severe consequences in many areas. For instance, biologically sensible species delimitation of plant pathogens is crucial for measures such as plant protection or biological control and for comparative studies involving model organisms. However, delimiting species is challenging in the case of organisms for which often only molecular data are available, such as prokaryotes, fungi, and many unicellular eukaryotes. Even in the case of organisms with well-established morphological characteristics, molecular taxonomy is often necessary to emend current taxonomic concepts and to analyze DNA sequences directly sampled from the environment. Typically, for this purpose clustering approaches to delineate molecular operational taxonomic units have been applied using arbitrary choices regarding the distance threshold values, and the clustering algorithms. Here, we report on a clustering optimization method to establish a molecular taxonomy of Peronospora based on ITS nrDNA sequences. Peronospora is the largest genus within the downy mildews, which are obligate parasites of higher plants, and includes various economically important pathogens. The method determines the distance function and clustering setting that result in an optimal agreement with selected reference data. Optimization was based on both taxonomy-based and host-based reference information, yielding the same outcome. Resampling and permutation methods indicate that the method is robust regarding taxon sampling and errors in the reference data. Tests with newly obtained ITS sequences demonstrate the use of the re-classified dataset in molecular identification of downy mildews. A corrected taxonomy is provided for all Peronospora ITS sequences contained in public databases. Clustering optimization appears to be broadly applicable in automated, sequence-based taxonomy. The method connects traditional and modern taxonomic disciplines by specifically addressing the issue of how to optimally account for both traditional species concepts and genetic divergence.

  16. Molecular Taxonomy of Phytopathogenic Fungi: A Case Study in Peronospora

    PubMed Central

    Göker, Markus; García-Blázquez, Gema; Voglmayr, Hermann; Tellería, M. Teresa; Martín, María P.

    2009-01-01

    Background Inappropriate taxon definitions may have severe consequences in many areas. For instance, biologically sensible species delimitation of plant pathogens is crucial for measures such as plant protection or biological control and for comparative studies involving model organisms. However, delimiting species is challenging in the case of organisms for which often only molecular data are available, such as prokaryotes, fungi, and many unicellular eukaryotes. Even in the case of organisms with well-established morphological characteristics, molecular taxonomy is often necessary to emend current taxonomic concepts and to analyze DNA sequences directly sampled from the environment. Typically, for this purpose clustering approaches to delineate molecular operational taxonomic units have been applied using arbitrary choices regarding the distance threshold values, and the clustering algorithms. Methodology Here, we report on a clustering optimization method to establish a molecular taxonomy of Peronospora based on ITS nrDNA sequences. Peronospora is the largest genus within the downy mildews, which are obligate parasites of higher plants, and includes various economically important pathogens. The method determines the distance function and clustering setting that result in an optimal agreement with selected reference data. Optimization was based on both taxonomy-based and host-based reference information, yielding the same outcome. Resampling and permutation methods indicate that the method is robust regarding taxon sampling and errors in the reference data. Tests with newly obtained ITS sequences demonstrate the use of the re-classified dataset in molecular identification of downy mildews. Conclusions A corrected taxonomy is provided for all Peronospora ITS sequences contained in public databases. Clustering optimization appears to be broadly applicable in automated, sequence-based taxonomy. The method connects traditional and modern taxonomic disciplines by specifically addressing the issue of how to optimally account for both traditional species concepts and genetic divergence. PMID:19641601
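
    The clustering-optimization idea (scan clustering settings and distance thresholds, keep the one that agrees best with reference data) can be sketched as below. Synthetic feature vectors stand in for the ITS sequences, made-up labels stand in for the taxonomic/host reference, and agreement is scored with the adjusted Rand index; the paper's distance functions are not reproduced.

    ```python
    # Pick the linkage method and distance threshold whose clusters best agree
    # with a reference partition (adjusted Rand index). Data are synthetic.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(6)
    X = np.vstack([rng.normal(c, 0.4, size=(20, 5)) for c in (0.0, 2.0, 4.0)])
    reference = np.repeat([0, 1, 2], 20)       # stand-in for host/taxonomy labels

    best = (None, None, -1.0)
    D = pdist(X)                               # pairwise distances
    for method in ("single", "average", "complete"):
        Z = linkage(D, method=method)
        for t in np.linspace(0.2, 5.0, 50):    # scan the distance threshold
            labels = fcluster(Z, t=t, criterion="distance")
            score = adjusted_rand_score(reference, labels)
            if score > best[2]:
                best = (method, t, score)
    print(f"best setting: {best[0]} linkage, threshold {best[1]:.2f}, ARI {best[2]:.2f}")
    ```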

  17. Optical Sensor/Actuator Locations for Active Structural Acoustic Control

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Palumbo, Daniel L.; Kincaid, Rex K.

    1998-01-01

    Researchers at NASA Langley Research Center have extensive experience using active structural acoustic control (ASAC) for aircraft interior noise reduction. One aspect of ASAC involves the selection of optimum locations for microphone sensors and force actuators. This paper explains the importance of sensor/actuator selection, reviews optimization techniques, and summarizes experimental and numerical results. Three combinatorial optimization problems are described. Two involve the determination of the number and position of piezoelectric actuators, and the other involves the determination of the number and location of the sensors. For each case, a solution method is suggested, and typical results are examined. The first case, a simplified problem with simulated data, is used to illustrate the method. The second and third cases are more representative of the potential of the method and use measured data. The three case studies and laboratory test results establish the usefulness of the numerical methods.

  18. Controlling imported malaria cases in the United States of America.

    PubMed

    Dembele, Bassidy; Yakubu, Abdul-Aziz

    2017-02-01

    We extend the mathematical malaria epidemic model framework of Dembele et al. and use it to "capture" the 2013 Centers for Disease Control and Prevention (CDC) reported data on the 2011 number of imported malaria cases in the USA. Furthermore, we use our "fitted" malaria models for the top 20 countries of malaria acquisition by USA residents to study the impact of protecting USA residents from malaria infection when they travel to malaria endemic areas, the impact of protecting residents of malaria endemic regions from mosquito bites and the impact of killing mosquitoes in those endemic areas on the CDC number of imported malaria cases in USA. To significantly reduce the number of imported malaria cases in USA, for each top 20 country of malaria acquisition by USA travelers, we compute the optimal proportion of USA international travelers that must be protected against malaria infection and the optimal proportion of mosquitoes that must be killed.

  19. Productivity growth, case mix and optimal size of hospitals. A 16-year study of the Norwegian hospital sector.

    PubMed

    Anthun, Kjartan Sarheim; Kittelsen, Sverre Andreas Campbell; Magnussen, Jon

    2017-04-01

    This paper analyses productivity growth in the Norwegian hospital sector over a period of 16 years, 1999-2014. This period was characterized by a large ownership reform with subsequent hospital reorganizations and mergers. We describe how technological change, technical productivity, scale efficiency and the estimated optimal size of hospitals have evolved during this period. Hospital admissions were grouped into diagnosis-related groups using a fixed-grouper logic. Four composite outputs were defined and inputs were measured as operating costs. Productivity and efficiency were estimated with bootstrapped data envelopment analyses. Mean productivity increased by 24.6 percentage points from 1999 to 2014, an average annual change of 1.5%. There was a substantial growth in productivity and hospital size following the ownership reform. After the reform (2003-2014), average annual growth was <0.5%. There was no evidence of technical change. Estimated optimal size was smaller than the actual size of most hospitals, yet scale efficiency was high even after hospital mergers. However, the later hospital mergers have not been followed by similar productivity growth as around the time of the reform. This study addresses the issues of both cross-sectional and longitudinal comparability of case mix between hospitals, and thus provides a framework for future studies. The study adds to the discussion on optimal hospital size. Copyright © 2017 Elsevier B.V. All rights reserved.
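
    The efficiency scores behind this kind of analysis come from data envelopment analysis, which solves one small linear program per hospital. The sketch below computes input-oriented, constant-returns-to-scale (CCR) efficiencies with SciPy on invented inputs and outputs; the bootstrapping and the decomposition of productivity growth used in the study are not included.

    ```python
    # Input-oriented CCR data envelopment analysis: one LP per decision-making
    # unit (hospital). Inputs and outputs below are invented illustrative numbers.
    import numpy as np
    from scipy.optimize import linprog

    # rows = DMUs; X = inputs (operating costs), Y = outputs (composite activity)
    X = np.array([[100.0], [120.0], [90.0], [150.0]])
    Y = np.array([[80.0, 20.0], [95.0, 25.0], [70.0, 30.0], [110.0, 22.0]])
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]

    def ccr_efficiency(o):
        # variables: [theta, lambda_1 ... lambda_n]; minimize theta
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j * x_j <= theta * x_o   ->   X^T lambda - theta * x_o <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # sum_j lambda_j * y_j >= y_o           ->  -Y^T lambda <= -y_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.x[0]

    for o in range(n):
        print(f"hospital {o}: efficiency = {ccr_efficiency(o):.3f}")
    ```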

  20. Improving performance of breast cancer risk prediction using a new CAD-based region segmentation scheme

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Qiu, Yuchen; Zheng, Bin

    2018-02-01

    The objective of this study is to develop and test a new computer-aided detection (CAD) scheme with improved region of interest (ROI) segmentation combined with an image feature extraction framework to improve performance in predicting short-term breast cancer risk. A dataset involving 570 sets of "prior" negative mammography screening cases was retrospectively assembled. In the next sequential "current" screening, 285 cases were positive and 285 cases remained negative. A CAD scheme was applied to all 570 "prior" negative images to stratify cases into high- and low-risk groups of having cancer detected in the "current" screening. First, a new ROI segmentation algorithm was used to automatically remove useless areas of mammograms. Second, from the matched bilateral craniocaudal view images, a set of 43 image features related to frequency characteristics of ROIs were initially computed from the discrete cosine transform and spatial domain of the images. Third, a support vector machine model based machine learning classifier was used to classify the selected optimal image features and build a CAD-based risk prediction model. The classifier was trained using a leave-one-case-out based cross-validation method. Applying this improved CAD scheme to the testing dataset yielded an area under the ROC curve of AUC = 0.70+/-0.04, which was significantly higher than extracting features directly from the dataset without the improved ROI segmentation step (AUC = 0.63+/-0.04). This study demonstrated that the proposed approach could improve accuracy in predicting short-term breast cancer risk, which may play an important role in helping eventually establish an optimal personalized breast cancer paradigm.
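
    The classification stage described here (support vector machine on selected image features, leave-one-case-out cross-validation, AUC) maps onto a few lines of scikit-learn. In the sketch below, random features with a weak injected signal stand in for the DCT/spatial features of the study.

    ```python
    # SVM risk classifier evaluated with leave-one-out cross-validation and AUC.
    # Random features stand in for the CAD scheme's DCT/spatial image features.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(7)
    n_cases, n_features = 120, 43
    y = np.r_[np.ones(60), np.zeros(60)].astype(int)                # positive / negative
    X = rng.normal(size=(n_cases, n_features)) + 0.4 * y[:, None]   # weak signal

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    scores = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]
    print(f"leave-one-out AUC = {roc_auc_score(y, scores):.2f}")
    ```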

  1. Enhanced genetic algorithm optimization model for a single reservoir operation based on hydropower generation: case study of Mosul reservoir, northern Iraq.

    PubMed

    Al-Aqeeli, Yousif H; Lee, T S; Abd Aziz, S

    2016-01-01

    Achieving optimal hydropower generation from the operation of water reservoirs is a complex problem. The purpose of this study was to formulate and improve a genetic algorithm optimization model (GAOM) in order to maximize annual hydropower generation for a single reservoir. For this purpose, two simulation algorithms were drafted and applied independently in the GAOM over 20 scenarios (years) of operation of Mosul reservoir, northern Iraq. The first algorithm was based on the traditional simulation of reservoir operation, whilst the second algorithm (Salg) enhanced the GAOM by changing the population values of the GA through a new simulation process of reservoir operation. The performances of these two algorithms were evaluated through comparison of their optimal values of annual hydropower generation over the 20 operating scenarios. The GAOM achieved an increase in hydropower generation in 17 scenarios using these two algorithms, with the Salg being superior in all scenarios. All of this was done prior to adding evaporation (Ev) and precipitation (Pr) to the water balance equation. Next, the GAOM using the Salg was applied taking into consideration the volumes of these two parameters. In this case, the optimal values obtained from the GAOM were compared, firstly with their counterparts found using the same algorithm without taking Ev and Pr into consideration, and secondly with the observed values. The first comparison showed that the optimal values obtained in this case decreased in all scenarios, whilst the second comparison showed that they remained good compared with the observed values. The results proved the effectiveness of the Salg in increasing hydropower generation through the enhanced approach of the GAOM. In addition, the results indicated the importance of taking Ev and Pr into account in the modelling of reservoir operation.

  2. Optimization of Personnel Assignment Problem Based on Traveling Time by Using Hungarian Methods: Case Study on the Central Post Office Bandung

    NASA Astrophysics Data System (ADS)

    Supian, Sudradjat; Wahyuni, Sri; Nahar, Julita; Subiyanto

    2018-01-01

    In this paper, the traveling time of workers from the central post office Bandung in delivering packages to destination locations was optimized using the Hungarian method. A sensitivity analysis against data changes that may occur was also conducted. The sampled data in this study are 10 workers to be assigned to deliver mail packages to 10 post office delivery centers in Bandung, namely Cikutra, Padalarang, Ujung Berung, Dayeuh Kolot, Asia-Africa, Soreang, Situ Saeur, Cimahi, Cipedes and Cikeruh. The result of this research is the optimal assignment of the 10 workers to the 10 destination locations, with a total traveling time of 387 minutes. Based on this result, the manager of the central post office Bandung can make optimal decisions when assigning tasks to the workers.
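
    The Hungarian method itself is available in SciPy as linear_sum_assignment. The sketch below solves a worker-to-destination assignment on a random travel-time matrix; the actual minute values of the study are not reproduced.

    ```python
    # Assignment problem via the Hungarian method: assign each worker to one
    # delivery destination so that the total traveling time is minimal.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    rng = np.random.default_rng(8)
    destinations = ["Cikutra", "Padalarang", "Ujung Berung", "Dayeuh Kolot",
                    "Asia-Africa", "Soreang", "Situ Saeur", "Cimahi",
                    "Cipedes", "Cikeruh"]
    travel_min = rng.integers(15, 80, size=(10, 10))   # minutes; illustrative only

    rows, cols = linear_sum_assignment(travel_min)     # optimal worker -> destination
    for w, d in zip(rows, cols):
        print(f"worker {w + 1} -> {destinations[d]} ({travel_min[w, d]} min)")
    print("total traveling time:", travel_min[rows, cols].sum(), "minutes")
    ```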

  3. Breast Cancer and Modifiable Lifestyle Factors in Argentinean Women: Addressing Missing Data in a Case-Control Study

    PubMed Central

    Coquet, Julia Becaria; Tumas, Natalia; Osella, Alberto Ruben; Tanzi, Matteo; Franco, Isabella; Diaz, Maria Del Pilar

    2016-01-01

    A number of studies have evidenced the effect of modifiable lifestyle factors such as diet, breastfeeding and nutritional status on breast cancer risk. However, none have addressed the missing data problem in nutritional epidemiologic research in South America. Missing data is a frequent problem in breast cancer studies and epidemiological settings in general. Estimates of effect obtained from these studies may be biased if no appropriate method for handling missing data is applied. We performed Multiple Imputation for missing values on covariates in a breast cancer case-control study of Córdoba (Argentina) to optimize risk estimates. Data were obtained from a breast cancer case-control study from 2008 to 2015 (318 cases, 526 controls). Complete case analysis and multiple imputation using chained equations were the methods applied to estimate the effects of a Traditional dietary pattern and other recognized factors associated with breast cancer. Physical activity and socioeconomic status were imputed. Logistic regression models were performed. When complete case analysis was performed, only 31% of women were considered. Although a positive association between the Traditional dietary pattern and breast cancer was observed from both approaches (complete case analysis OR=1.3, 95%CI=1.0-1.7; multiple imputation OR=1.4, 95%CI=1.2-1.7), effects of other covariates, like BMI and breastfeeding, were only identified when multiple imputation was considered. A Traditional dietary pattern, BMI and breastfeeding are associated with the occurrence of breast cancer in this Argentinean population when multiple imputation is appropriately performed. Multiple Imputation is suggested in Latin America’s epidemiologic studies to optimize effect estimates in the future. PMID:27892664
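
    Multiple imputation by chained equations can be sketched with scikit-learn's IterativeImputer: draw several imputed datasets and pool the logistic-regression estimates. The covariates and the missingness below are simulated, and the pooling shown is a simple average rather than full Rubin's rules.

    ```python
    # Chained-equation multiple imputation: impute covariates several times,
    # fit a logistic model on each completed dataset, and average the estimates.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(9)
    n = 500
    diet = rng.normal(size=n)                       # "Traditional pattern" score (toy)
    activity = rng.normal(size=n)                   # physical activity (toy)
    y = (rng.random(n) < 1 / (1 + np.exp(-(0.3 * diet - 0.2 * activity)))).astype(int)
    activity[rng.random(n) < 0.3] = np.nan          # ~30% missing covariate values
    X = np.column_stack([diet, activity])

    coefs = []
    for m in range(10):                             # 10 imputed datasets
        imp = IterativeImputer(sample_posterior=True, random_state=m)
        X_complete = imp.fit_transform(X)
        model = LogisticRegression().fit(X_complete, y)
        coefs.append(model.coef_[0])
    print("pooled coefficients (diet, activity):", np.mean(coefs, axis=0).round(3))
    ```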

  4. Minimizing the health and climate impacts of emissions from heavy-duty public transportation bus fleets through operational optimization.

    PubMed

    Gouge, Brian; Dowlatabadi, Hadi; Ries, Francis J

    2013-04-16

    In contrast to capital control strategies (i.e., investments in new technology), the potential of operational control strategies (e.g., vehicle scheduling optimization) to reduce the health and climate impacts of the emissions from public transportation bus fleets has not been widely considered. This case study demonstrates that heterogeneity in the emission levels of different bus technologies and the exposure potential of bus routes can be exploited through optimization (e.g., how vehicles are assigned to routes) to minimize these impacts as well as operating costs. The magnitude of the benefits of the optimization depends on the specific transit system and region. Health impacts were found to be particularly sensitive to different vehicle assignments and varied between the worst-case and best-case assignments by more than a factor of 2, suggesting there is significant potential to reduce health impacts. Trade-offs between climate, health, and cost objectives were also found. Transit agencies that do not consider these objectives in an integrated framework and, for example, optimize for costs and/or climate impacts alone, risk inadvertently increasing health impacts by as much as 49%. Cost-benefit analysis was used to evaluate trade-offs between objectives, but large uncertainties make identifying an optimal solution challenging.

  5. Genetic-evolution-based optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.

    1990-01-01

    This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.
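
    A compact real-coded genetic algorithm with the reproduction, crossover, and mutation operators mentioned above is sketched below. The fitness function is a toy stand-in, not the two-bay truss frequency analysis or the actuator-placement problem of the paper.

    ```python
    # Minimal real-coded genetic algorithm: tournament selection (reproduction),
    # arithmetic crossover, Gaussian mutation. The fitness function is a toy.
    import numpy as np

    rng = np.random.default_rng(10)
    dim, pop_size, gens = 6, 40, 150

    def fitness(x):                       # maximize (stand-in for e.g. a frequency)
        return -np.sum((x - 0.7) ** 2)

    pop = rng.uniform(0, 1, size=(pop_size, dim))
    for _ in range(gens):
        fit = np.array([fitness(p) for p in pop])
        new_pop = [pop[fit.argmax()].copy()]           # elitism
        while len(new_pop) < pop_size:
            i, j = rng.integers(pop_size, size=2)      # tournament reproduction
            a = pop[i] if fit[i] > fit[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fit[i] > fit[j] else pop[j]
            w = rng.random()
            child = w * a + (1 - w) * b                # arithmetic crossover
            if rng.random() < 0.2:                     # mutation
                child = child + rng.normal(0, 0.05, dim)
            new_pop.append(np.clip(child, 0, 1))
        pop = np.array(new_pop)
    best = pop[np.argmax([fitness(p) for p in pop])]
    print("best design variables:", best.round(3))
    ```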

  6. Taguchi optimization: Case study of gold recovery from amalgamation tailing by using froth flotation method

    NASA Astrophysics Data System (ADS)

    Sudibyo, Aji, B. B.; Sumardi, S.; Mufakir, F. R.; Junaidi, A.; Nurjaman, F.; Karna, Aziza, Aulia

    2017-01-01

    The gold amalgamation process has been widely used to treat gold ore. This process produces a tailing, or amalgamation solid waste, which still contains gold at 8-9 ppm. Froth flotation is one of the promising methods to recover gold from this tailing. However, the process requires optimal conditions, which depend on the type of raw material. In this study, the Taguchi method was used to determine the optimum conditions of the froth flotation process. The Taguchi optimization shows that gold recovery was most strongly influenced by particle size, with the best result at 150 mesh, followed by the potassium amyl xanthate concentration, pH, and pine oil concentration, at 1133.98, 4535.92 and 68.04 g/ton of amalgamation tailing, respectively.
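
    The Taguchi analysis itself reduces to computing a signal-to-noise ratio per experimental run and averaging it per factor level. The sketch below uses a standard L9 orthogonal array, invented recovery replicates, and the larger-the-better S/N ratio; the factor levels are placeholders, not the dosages reported above.

    ```python
    # Taguchi larger-the-better analysis on an L9(3^4) orthogonal array.
    # Replicated recovery values are invented; factor levels are placeholders.
    import numpy as np

    # L9 orthogonal array: 9 runs x 4 factors, levels coded 0/1/2
    L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                   [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                   [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
    factors = ["particle size", "PAX dose", "pH", "pine oil dose"]

    rng = np.random.default_rng(11)
    recovery = rng.uniform(40, 90, size=(9, 3))        # % gold recovery, 3 replicates

    # Larger-the-better S/N ratio per run: -10 * log10( mean(1 / y^2) )
    sn = -10 * np.log10(np.mean(1.0 / recovery ** 2, axis=1))

    for f, name in enumerate(factors):
        level_means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
        print(f"{name:14s} level S/N means: {np.round(level_means, 2)}"
              f"  -> best level {int(np.argmax(level_means))}")
    # The factor with the largest spread in level means has the strongest effect.
    ```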

  7. Novel optimization technique of isolated microgrid with hydrogen energy storage.

    PubMed

    Beshr, Eman Hassan; Abdelghany, Hazem; Eteiba, Mahmoud

    2018-01-01

    This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs), a Diesel Generator (DG), a Wind Turbine Generator (WTG) and Photovoltaic (PV) arrays, and is supported by a fuel cell/electrolyzer hydrogen storage system for short-term storage. Multi-objective optimization is performed with a non-dominated sorting genetic algorithm to meet the load requirements under the given constraints. A novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. The isolated microgrid is modelled using the MATLAB software package; dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed under both summer and winter conditions, and three case studies are presented for each condition. The modified IEEE 15 bus system is used to validate the proposed algorithm.

  8. Novel optimization technique of isolated microgrid with hydrogen energy storage

    PubMed Central

    Abdelghany, Hazem; Eteiba, Mahmoud

    2018-01-01

    This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs), a Diesel Generator (DG), a Wind Turbine Generator (WTG) and Photovoltaic (PV) arrays, and is supported by a fuel cell/electrolyzer hydrogen storage system for short-term storage. Multi-objective optimization is performed with a non-dominated sorting genetic algorithm to meet the load requirements under the given constraints. A novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. The isolated microgrid is modelled using the MATLAB software package; dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed under both summer and winter conditions, and three case studies are presented for each condition. The modified IEEE 15 bus system is used to validate the proposed algorithm. PMID:29466433
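
    At the heart of both the non-dominated sorting genetic algorithm and the multi-objective flower pollination algorithm is Pareto dominance. The sketch below extracts the non-dominated front from candidate dispatch solutions scored on two minimization objectives (fuel cost and line losses); the objective values are random placeholders, not results from the IEEE 15 bus system.

    ```python
    # Extract the Pareto (non-dominated) front for two minimization objectives,
    # e.g. fuel cost and line losses of candidate microgrid dispatch solutions.
    import numpy as np

    rng = np.random.default_rng(12)
    objs = rng.uniform(size=(50, 2))          # columns: [fuel cost, line losses]

    def pareto_front(F):
        """Return indices of solutions not dominated by any other solution."""
        idx = []
        for i, fi in enumerate(F):
            dominated = np.any(np.all(F <= fi, axis=1) & np.any(F < fi, axis=1))
            if not dominated:
                idx.append(i)
        return np.array(idx)

    front = pareto_front(objs)
    print(f"{len(front)} non-dominated solutions out of {len(objs)}")
    print(objs[front][np.argsort(objs[front][:, 0])].round(3))   # sorted by cost
    ```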

  9. Optimization of Airport Surface Traffic: A Case-Study of Incheon International Airport

    NASA Technical Reports Server (NTRS)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Jung, Yoon C.; Zhu, Zhifan; Jeong, Myeongsook; Kim, Hyounkong; Oh, Eunmi; Hong, Sungkwon

    2017-01-01

    This study aims to develop a controllers' decision support tool for departure and surface management of ICN. Airport surface traffic optimization for Incheon International Airport (ICN) in South Korea was studied based on the operational characteristics of ICN and airspace of Korea. For surface traffic optimization, a multiple runway scheduling problem and a taxi scheduling problem were formulated into two Mixed Integer Linear Programming (MILP) optimization models. The Miles-In-Trail (MIT) separation constraint at the departure fix shared by the departure flights from multiple runways and the runway crossing constraints due to the taxi route configuration specific to ICN were incorporated into the runway scheduling and taxiway scheduling problems, respectively. Since the MILP-based optimization model for the multiple runway scheduling problem may be computationally intensive, computation times and delay costs of different solving methods were compared for a practical implementation. This research was a collaboration between Korea Aerospace Research Institute (KARI) and National Aeronautics and Space Administration (NASA).

  10. Optimization of Airport Surface Traffic: A Case-Study of Incheon International Airport

    NASA Technical Reports Server (NTRS)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Jung, Yoon Chul; Zhu, Zhifan; Jeong, Myeong-Sook; Kim, Hyoun Kyoung; Oh, Eunmi; Hong, Sungkwon

    2017-01-01

    This study aims to develop a controllers' decision support tool for departure and surface management of ICN. Airport surface traffic optimization for Incheon International Airport (ICN) in South Korea was studied based on the operational characteristics of ICN and airspace of Korea. For surface traffic optimization, a multiple runway scheduling problem and a taxi scheduling problem were formulated into two Mixed Integer Linear Programming (MILP) optimization models. The Miles-In-Trail (MIT) separation constraint at the departure fix shared by the departure flights from multiple runways and the runway crossing constraints due to the taxi route configuration specific to ICN were incorporated into the runway scheduling and taxiway scheduling problems, respectively. Since the MILP-based optimization model for the multiple runway scheduling problem may be computationally intensive, computation times and delay costs of different solving methods were compared for a practical implementation. This research was a collaboration between Korea Aerospace Research Institute (KARI) and National Aeronautics and Space Administration (NASA).
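
    A minimal sketch of the kind of MILP runway-scheduling formulation described above, written with the PuLP modelling library (an assumption; the paper's actual models, separation rules, and data are far richer), sequencing three departures with a single pairwise separation:

      # Toy MILP: sequence three departures on one runway with pairwise separation,
      # minimizing total delay. Flight data and separation are illustrative only.
      import pulp

      flights = ["F1", "F2", "F3"]
      ready = {"F1": 0, "F2": 60, "F3": 90}      # earliest takeoff times (s), illustrative
      sep = 90                                    # required separation between departures (s)
      M = 10_000                                  # big-M constant for ordering constraints

      prob = pulp.LpProblem("runway_sequencing", pulp.LpMinimize)
      t = {f: pulp.LpVariable(f"t_{f}", lowBound=ready[f]) for f in flights}
      # y[(i, j)] = 1 if flight i departs before flight j
      y = {(i, j): pulp.LpVariable(f"y_{i}_{j}", cat="Binary")
           for i in flights for j in flights if i < j}

      prob += pulp.lpSum(t[f] - ready[f] for f in flights)     # objective: total delay
      for (i, j), yij in y.items():
          prob += t[j] >= t[i] + sep - M * (1 - yij)           # i before j
          prob += t[i] >= t[j] + sep - M * yij                 # j before i

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      for f in flights:
          print(f, pulp.value(t[f]))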

  11. Audit and internal quality control in immunohistochemistry

    PubMed Central

    Maxwell, P; McCluggage, W

    2000-01-01

    Aims—Although positive and negative controls are performed and checked in surgical pathology cases undergoing immunohistochemistry, internal quality control procedures for immunohistochemistry are not well described. This study, comprising a retrospective audit, aims to describe a method of internal quality control for immunohistochemistry. A scoring system that allows comparison between cases is described. Methods—Two positive tissue controls for each month over a three year period (1996–1998) of the 10 antibodies used most frequently were evaluated. All test cases undergoing immunohistochemistry in the months of April in this three year period were also studied. When the test case was completely negative for a given antibody, the corresponding positive tissue control from that day was examined. A marking system was devised whereby each immunohistochemical slide was assessed out of a possible score of 8 to take account of staining intensity, uniformity, specificity, background, and counterstaining. Using this scoring system, cases were classified as showing optimal (7–8), borderline (5–6), or unacceptable (0–4) staining. Results—Most positive tissue controls showed either optimal or borderline staining with the exception of neurone specific enolase (NSE), where most slides were unacceptable or borderline as a result of a combination of low intensity, poor specificity, and excessive background staining. All test cases showed either optimal or borderline staining with the exception of a single case stained for NSE, which was unacceptable. Conclusions—This retrospective audit shows that immunohistochemically stained slides can be assessed using this scoring system. With most antibodies, acceptable staining was achieved in most cases. However, there were problems with staining for NSE, which needs to be reviewed. Laboratories should use a system such as this to evaluate which antibodies regularly result in poor staining so that they can be excluded from panels. Routine evaluation of immunohistochemical staining should become part of everyday internal quality control procedures. Key Words: immunohistochemistry • audit • internal quality control PMID:11265178

  12. A nonlinear bi-level programming approach for product portfolio management.

    PubMed

    Ma, Shuang

    2016-01-01

    Product portfolio management (PPM) is a critical decision-making problem for companies across various industries in today's competitive environment. Traditional studies of the PPM problem have focused on engineering feasibility and marketing, paying relatively little attention to competitors' actions and competitive relations, especially in the mathematical optimization domain. The key challenge lies in how to construct a mathematical optimization model that describes this Stackelberg-game-based leader-follower PPM problem and the competitive relations between the players. The primary contribution of this paper is a decision framework and an optimization model that address the PPM problem of the leader and the follower. A nonlinear, integer bi-level programming model is developed based on the decision framework. Furthermore, a bi-level nested genetic algorithm is put forward to solve this nonlinear bi-level programming model for the leader-follower PPM problem. A case study of notebook computer product portfolio optimization is reported. Results and analyses reveal that the leader-follower bi-level optimization model is robust and can empower product portfolio optimization.

  13. Structural damage detection-oriented multi-type sensor placement with multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Lin, Jian-Fu; Xu, You-Lin; Law, Siu-Seong

    2018-05-01

    A structural damage detection-oriented multi-type sensor placement method with multi-objective optimization is developed in this study. The multi-type response covariance sensitivity-based damage detection method is first introduced. Two objective functions for optimal sensor placement are then introduced in terms of the response covariance sensitivity and the response independence. The multi-objective optimization problem is formed by using the two objective functions, and the non-dominated sorting genetic algorithm (NSGA)-II is adopted to find the solution for the optimal multi-type sensor placement to achieve the best structural damage detection. The proposed method is finally applied to a nine-bay three-dimensional frame structure. Numerical results show that the optimal multi-type sensor placement determined by the proposed method can avoid redundant sensors and provide satisfactory results for structural damage detection. The restriction on the number of each type of sensor in the optimization reduces the search space and makes the proposed method more effective. Moreover, the case study demonstrates how to select the most suitable sensor placement from the Pareto solutions via the utility function and the knee point method.
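
    The knee-point selection mentioned above can be illustrated with a small sketch (the Pareto points below are invented): here the knee is taken as the point farthest from the line joining the two extreme solutions of the normalized front.

      # Minimal knee-point sketch: pick the Pareto point with maximum perpendicular
      # distance from the line joining the two extreme solutions (both objectives minimized).
      import numpy as np

      pareto = np.array([[1.0, 9.0],   # illustrative (objective1, objective2) Pareto points
                         [2.0, 5.0],
                         [3.0, 3.5],
                         [5.0, 2.5],
                         [9.0, 2.0]])

      # Normalize objectives so distances are comparable
      p = (pareto - pareto.min(axis=0)) / (pareto.max(axis=0) - pareto.min(axis=0))
      a, b = p[p[:, 0].argmin()], p[p[:, 0].argmax()]     # extreme points of the front
      line = (b - a) / np.linalg.norm(b - a)

      # Perpendicular distance of each point from the extreme-to-extreme line
      rel = p - a
      dist = np.abs(rel[:, 0] * line[1] - rel[:, 1] * line[0])
      knee = int(dist.argmax())
      print("knee point (original scale):", pareto[knee])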

  14. SoMIR framework for designing high-NDBP photonic crystal waveguides.

    PubMed

    Mirjalili, Seyed Mohammad

    2014-06-20

    This work proposes a modularized framework for designing the structure of photonic crystal waveguides (PCWs) and reducing human involvement during the design process. The proposed framework consists of three main modules: a parameters module, a constraints module, and an optimizer module. The first module is responsible for defining the structural parameters of a given PCW. The second module defines various limitations in order to achieve desirable optimum designs. The third module is the optimizer, in which a numerical optimization method is employed to perform the optimization. As case studies, two new structures called the Ellipse PCW (EPCW) and the Hypoellipse PCW (HPCW), with different shapes of holes in each row, are proposed and optimized by the framework. The calculation results show that the proposed framework is able to successfully optimize the structures of the new EPCW and HPCW. In addition, the results demonstrate the applicability of the proposed framework for optimizing different PCWs. The results of the comparative study show that the optimized EPCW and HPCW provide significant improvements of 18% and 9% in normalized delay-bandwidth product (NDBP), respectively, compared to the ring-shape-hole PCW, which has the highest NDBP in the literature. Finally, simulations of pulse propagation confirm the manufacturing feasibility of both optimized structures.

  15. Hybrid algorithms for fuzzy reverse supply chain network design.

    PubMed

    Che, Z H; Chiang, Tzu-An; Kuo, Y C; Cui, Zhihua

    2014-01-01

    In consideration of capacity constraints, fuzzy defect ratio, and fuzzy transport loss ratio, this paper establishes an optimized decision model for production planning and distribution in a multiphase, multiproduct reverse supply chain that addresses defects returned to original manufacturers. In addition, it develops hybrid algorithms, namely Particle Swarm Optimization-Genetic Algorithm (PSO-GA), Genetic Algorithm-Simulated Annealing (GA-SA), and Particle Swarm Optimization-Simulated Annealing (PSO-SA), to solve the optimized model. Through a case study of a multi-phase, multi-product reverse supply chain network, the paper demonstrates the suitability of the optimized decision model and the applicability of the algorithms. Finally, the hybrid algorithms showed excellent solving capability when compared with the original GA and PSO methods.

  16. Hybrid Algorithms for Fuzzy Reverse Supply Chain Network Design

    PubMed Central

    Che, Z. H.; Chiang, Tzu-An; Kuo, Y. C.

    2014-01-01

    In consideration of capacity constraints, fuzzy defect ratio, and fuzzy transport loss ratio, this paper establishes an optimized decision model for production planning and distribution in a multiphase, multiproduct reverse supply chain that addresses defects returned to original manufacturers. In addition, it develops hybrid algorithms, namely Particle Swarm Optimization-Genetic Algorithm (PSO-GA), Genetic Algorithm-Simulated Annealing (GA-SA), and Particle Swarm Optimization-Simulated Annealing (PSO-SA), to solve the optimized model. Through a case study of a multi-phase, multi-product reverse supply chain network, the paper demonstrates the suitability of the optimized decision model and the applicability of the algorithms. Finally, the hybrid algorithms showed excellent solving capability when compared with the original GA and PSO methods. PMID:24892057

  17. [Navigated implantation of total knee endoprostheses--a comparative study with conventional instrumentation].

    PubMed

    Jenny, J Y; Boeri, C

    2001-01-01

    A navigation system should improve the quality of total knee prosthesis implantation in comparison with the classical, surgeon-controlled operative technique. The authors implanted 40 total knee prostheses with an optical infrared navigation system (Orthopilot, AESCULAP, Tuttlingen; group A). The quality of implantation was assessed on postoperative long-leg AP and lateral X-rays and compared with a control group of 40 computer-paired total knee prostheses of the same model (Search Prosthesis, AESCULAP, Tuttlingen) implanted with the classical, surgeon-controlled technique (group B). An optimal mechanical femorotibial angle (3 degrees valgus to 3 degrees varus) was obtained in 33 cases in group A and 31 cases in group B (p > 0.05). Better results were seen for the coronal and sagittal orientation of both tibial and femoral components in group A. Overall, 26 cases in group A and 12 cases in group B were implanted optimally with respect to all studied criteria (p < 0.01). The navigation system used allows a significant improvement in the quality of implantation of a total knee prosthesis in comparison with classical, surgeon-controlled instrumentation. Long-term outcome could consequently be improved.

  18. Connection between optimal control theory and adiabatic-passage techniques in quantum systems

    NASA Astrophysics Data System (ADS)

    Assémat, E.; Sugny, D.

    2012-08-01

    This work explores the relationship between optimal control theory and adiabatic passage techniques in quantum systems. The study is based on a geometric analysis of the Hamiltonian dynamics constructed from Pontryagin's maximum principle. In a three-level quantum system, we show that the stimulated Raman adiabatic passage technique can be associated with a peculiar Hamiltonian singularity. One deduces that the adiabatic pulse is a solution of the optimal control problem only for a specific cost functional. This analysis is extended to the case of a four-level quantum system.

  19. Building-to-Grid Integration through Commercial Building Portfolios Participating in Energy and Frequency Regulation Markets

    NASA Astrophysics Data System (ADS)

    Pavlak, Gregory S.

    Building energy use is a significant contributing factor to growing worldwide energy demands. In pursuit of a sustainable energy future, commercial building operations must be intelligently integrated with the electric system to increase efficiency and enable renewable generation. Toward this end, a model-based methodology was developed to estimate the capability of commercial buildings to participate in frequency regulation ancillary service markets. This methodology was integrated into a supervisory model predictive controller to optimize building operation in consideration of energy prices, demand charges, and ancillary service revenue. The supervisory control problem was extended to building portfolios to evaluate opportunities for synergistic effect among multiple, centrally-optimized buildings. Simulation studies performed showed that the multi-market optimization was able to determine appropriate opportunities for buildings to provide frequency regulation. Total savings were increased by up to thirteen percentage points, depending on the simulation case. Furthermore, optimizing buildings as a portfolio achieved up to seven additional percentage points of savings, depending on the case. Enhanced energy and cost savings opportunities were observed by taking the novel perspective of optimizing building portfolios in multiple grid markets, motivating future pursuits of advanced control paradigms that enable a more intelligent electric grid.

  20. Optimum Design of High-Speed Prop-Rotors

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; McCarthy, Thomas Robert

    1993-01-01

    An integrated multidisciplinary optimization procedure is developed for application to rotary wing aircraft design. The necessary disciplines such as dynamics, aerodynamics, aeroelasticity, and structures are coupled within a closed-loop optimization process. The procedure developed is applied to address two different problems. The first problem considers the optimization of a helicopter rotor blade and the second problem addresses the optimum design of a high-speed tilting proprotor. In the helicopter blade problem, the objective is to reduce the critical vibratory shear forces and moments at the blade root, without degrading rotor aerodynamic performance and aeroelastic stability. In the case of the high-speed proprotor, the goal is to maximize the propulsive efficiency in high-speed cruise without deteriorating the aeroelastic stability in cruise and the aerodynamic performance in hover. The problems studied involve multiple design objectives; therefore, the optimization problems are formulated using multiobjective design procedures. A comprehensive helicopter analysis code is used for the rotary wing aerodynamic, dynamic and aeroelastic stability analyses and an algorithm developed specifically for these purposes is used for the structural analysis. A nonlinear programming technique coupled with an approximate analysis procedure is used to perform the optimization. The optimum blade designs obtained in each case are compared to corresponding reference designs.

  1. Inadequate management of pregnancy-associated listeriosis: lessons from four case reports.

    PubMed

    Charlier, C; Goffinet, F; Azria, E; Leclercq, A; Lecuit, M

    2014-03-01

    Listeria monocytogenes infection during pregnancy can lead to dramatic fetal or neonatal outcomes. No clinical trial has evaluated treatment options, and retrospective studies of cases are therefore important to define optimal regimens. We report four cases of materno-neonatal listeriosis illustrating inadequate antimicrobial therapy management and discuss recommended treatment options. © 2013 The Authors Clinical Microbiology and Infection © 2013 European Society of Clinical Microbiology and Infectious Diseases.

  2. ConvAn: a convergence analyzing tool for optimization of biochemical networks.

    PubMed

    Kostromins, Andrejs; Mozga, Ivars; Stalidzans, Egils

    2012-01-01

    Dynamic models of biochemical networks are usually described as systems of nonlinear differential equations. When models are optimized for parameter estimation or for the design of new properties, mainly numerical methods are used. This causes problems of optimization predictability, as most numerical optimization methods have stochastic properties and convergence of the objective function to the global optimum is hardly predictable. Determining a suitable optimization method and the necessary duration of optimization becomes critical when a high number of combinations of adjustable parameters must be evaluated or when the dynamic models are large. The task is complex due to the variety of optimization methods, software tools, and nonlinearity features of models in different parameter spaces. The software tool ConvAn is developed to analyze the statistical properties of convergence dynamics for optimization runs with a particular optimization method, model, software tool, set of optimization method parameters, and number of adjustable parameters of the model. Convergence curves can be normalized automatically to enable comparison of different methods and models on the same scale. With the help of the biochemistry-adapted graphical user interface of ConvAn, it is possible to compare different optimization methods in terms of their ability to find the global optimum, or values close to it, and the computational time needed to reach them, as well as to estimate optimization performance for different numbers of adjustable parameters. The functionality of ConvAn enables statistical assessment of the necessary optimization time depending on the required optimization accuracy. Optimization methods that are not suitable for a particular optimization task can be rejected if they have poor repeatability or convergence properties. The software ConvAn is freely available at www.biosystems.lv/convan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
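
    The kind of convergence-statistics comparison that ConvAn automates can be sketched as follows (the optimizer here is a naive random perturbation search on a toy objective, purely for illustration):

      # Sketch of convergence analysis: run a stochastic optimizer several times,
      # record best-so-far curves, and summarize their spread. Toy objective only.
      import numpy as np

      rng = np.random.default_rng(0)

      def objective(x):
          return np.sum(x**2) + 5.0 * np.sum(np.sin(3.0 * x)**2)   # multimodal toy function

      def random_search_run(n_iter=200, dim=4):
          x = rng.uniform(-5, 5, dim)
          best = objective(x)
          curve = np.empty(n_iter)
          for k in range(n_iter):
              cand = x + rng.normal(scale=0.5, size=dim)            # random perturbation
              if objective(cand) < objective(x):
                  x = cand
              best = min(best, objective(x))
              curve[k] = best                                       # best-so-far trace
          return curve

      curves = np.array([random_search_run() for _ in range(20)])   # 20 repeated runs
      median = np.median(curves, axis=0)
      q25, q75 = np.percentile(curves, [25, 75], axis=0)
      print("final best-so-far: median=%.3f, IQR=[%.3f, %.3f]" % (median[-1], q25[-1], q75[-1]))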

  3. Dose-mass inverse optimization for minimally moving thoracic lesions

    NASA Astrophysics Data System (ADS)

    Mihaylov, I. B.; Moros, E. G.

    2015-05-01

    In the past decade, several different radiotherapy treatment plan evaluation and optimization schemes have been proposed as viable approaches, aiming for dose escalation or an increase of healthy tissue sparing. In particular, it has been argued that dose-mass plan evaluation and treatment plan optimization might be viable alternatives to the standard of care, which is realized through dose-volume evaluation and optimization. The purpose of this investigation is to apply dose-mass optimization to a cohort of lung cancer patients and compare the achievable healthy tissue sparing to that achievable through dose-volume optimization. Fourteen non-small cell lung cancer (NSCLC) patient plans were studied retrospectively. The range of tumor motion was less than 0.5 cm, and motion management in the treatment planning process was not considered. For each case, dose-volume (DV)-based and dose-mass (DM)-based optimization was performed. Nine-field step-and-shoot IMRT was used, with all of the optimization parameters kept the same between DV and DM optimizations. Commonly used dosimetric indices (DIs), such as dose to 1% of the spinal cord volume, dose to 50% of the esophageal volume, and doses to 20 and 30% of healthy lung volumes, were used for cross-comparison. Similarly, mass-based indices (MIs), such as doses to 20 and 30% of healthy lung masses, 1% of spinal cord mass, and 33% of heart mass, were also tallied. Statistical equivalence tests were performed to quantify the findings for the entire patient cohort. Both DV and DM plans for each case were normalized such that 95% of the planning target volume received the prescribed dose. DM optimization resulted in more organ-at-risk (OAR) sparing than DV optimization. The average sparing of cord, heart, and esophagus was 23, 4, and 6%, respectively. For the majority of the DIs, DM optimization resulted in lower lung doses. On average, the doses to 20 and 30% of healthy lung were lower by approximately 3 and 4%, whereas lung volumes receiving 2000 and 3000 cGy were lower by 3 and 2%, respectively. The behavior of the MIs was very similar. The statistical analyses of the results again indicated better healthy anatomical structure sparing with DM optimization. The presented findings indicate that dose-mass-based optimization results in statistically significant OAR sparing as compared to dose-volume-based optimization for NSCLC. However, the sparing is case-dependent and is not observed for all tallied dosimetric endpoints.
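
    The distinction between dose-volume and dose-mass indices can be illustrated with a short sketch (dose and density arrays are synthetic): a dose-volume histogram weights every voxel equally, whereas a dose-mass histogram weights each voxel by its mass (density times voxel volume).

      # Sketch: cumulative dose-volume vs dose-mass histogram for one structure.
      # Dose and density values are synthetic; in lung, density varies strongly per voxel.
      import numpy as np

      rng = np.random.default_rng(1)
      dose = rng.uniform(0, 60, size=5000)            # voxel doses (Gy), illustrative
      density = rng.uniform(0.2, 1.0, size=5000)      # voxel densities (g/cm^3), illustrative
      voxel_volume = 0.027                            # cm^3 per voxel (3 mm cube)
      mass = density * voxel_volume

      def cumulative_histogram(dose, weight, bins):
          """Fraction of total weight receiving at least each dose level."""
          total = weight.sum()
          return np.array([weight[dose >= b].sum() / total for b in bins])

      bins = np.arange(0, 61, 5.0)
      dvh = cumulative_histogram(dose, np.ones_like(dose), bins)   # volume-weighted
      dmh = cumulative_histogram(dose, mass, bins)                 # mass-weighted
      for b, v, m in zip(bins, dvh, dmh):
          print(f"D >= {b:4.1f} Gy: volume fraction {v:.3f}, mass fraction {m:.3f}")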

  4. Speedup of lexicographic optimization by superiorization and its applications to cancer radiotherapy treatment

    NASA Astrophysics Data System (ADS)

    Bonacker, Esther; Gibali, Aviv; Küfer, Karl-Heinz; Süss, Philipp

    2017-04-01

    Multicriteria optimization problems occur in many real-life applications, for example in cancer radiotherapy treatment and in particular in intensity modulated radiation therapy (IMRT). In this work we focus on optimization problems with multiple objectives that are ranked according to their importance. We solve these problems numerically by combining lexicographic optimization with our recently proposed level set scheme, which yields a sequence of auxiliary convex feasibility problems, solved here via projection methods. The projection enables us to combine the newly introduced superiorization methodology with multicriteria optimization methods to speed up computation while guaranteeing convergence of the optimization. We demonstrate our scheme with a simple 2D academic example (used in the literature) and also present results from calculations on four real head-and-neck cases in IMRT (Radiation Oncology of the Ludwig-Maximilians University, Munich, Germany) for two different choices of superiorization parameter sets, suited to yield either fast convergence for each case individually or robust behavior for all four cases.
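
    A minimal sketch of plain lexicographic optimization (not the authors' level-set/superiorization scheme): optimize the highest-priority objective first, then optimize the next objective subject to keeping the first within a small tolerance of its optimum. SciPy is assumed available, and the two objectives are toy functions.

      # Lexicographic sketch with two ranked objectives on a box-constrained toy problem.
      import numpy as np
      from scipy.optimize import minimize, NonlinearConstraint

      f1 = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2          # priority 1
      f2 = lambda x: (x[0] - 3.0)**2 + (x[1] + 1.0)**2          # priority 2
      bounds = [(-5, 5), (-5, 5)]
      x0 = np.zeros(2)

      # Stage 1: minimize the most important objective alone.
      res1 = minimize(f1, x0, bounds=bounds)
      eps = 1e-3                                                 # allowed degradation of f1

      # Stage 2: minimize f2 subject to f1(x) <= f1* + eps.
      con = NonlinearConstraint(f1, -np.inf, res1.fun + eps)
      res2 = minimize(f2, res1.x, bounds=bounds, constraints=[con])
      print("stage-1 optimum:", res1.x, "stage-2 solution:", res2.x)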

  5. Warpage optimization on a mobile phone case using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Lee, X. N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Shazzuan, S.

    2017-09-01

    Plastic injection moulding is a popular manufacturing method, not only because it is reliable, but also because it is efficient and cost saving. It is able to produce plastic parts with detailed features and complex geometry. However, defects in the injection moulding process degrade the quality and aesthetics of the injection moulded product. The most common defect occurring in the process is warpage. An inappropriate process parameter setting of the injection moulding machine is one of the reasons that lead to the occurrence of warpage. The aim of this study was to improve the quality of the injection moulded part by investigating the optimal parameters for minimizing warpage using Response Surface Methodology (RSM). Subsequent to this, the most significant parameter was identified, and the recommended parameter setting was compared with the optimized parameter setting from RSM. In this research, a mobile phone case was selected as the case study. The mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as variables, whereas warpage in the y-direction was selected as the response. The simulation was carried out using Autodesk Moldflow Insight 2012, and the RSM was performed using Design Expert 7.0. The warpage in the y-direction recommended by RSM was reduced by 70%. RSM performed well in solving the warpage issue.
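
    A compact sketch of the response-surface idea (synthetic data over two coded factors; the study itself used Design Expert on five factors): fit a quadratic model to simulated warpage observations by least squares and locate its minimum on the design region.

      # RSM sketch: fit y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
      # by least squares and locate the predicted minimum. Data are synthetic.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      x1 = rng.uniform(-1, 1, 30)                     # coded melt temperature, illustrative
      x2 = rng.uniform(-1, 1, 30)                     # coded packing pressure, illustrative
      warpage = (0.8 + 0.3*x1 - 0.5*x2 + 0.6*x1**2 + 0.4*x2**2 + 0.2*x1*x2
                 + rng.normal(scale=0.02, size=30))

      X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
      beta, *_ = np.linalg.lstsq(X, warpage, rcond=None)

      def predicted(v):
          a, b = v
          return beta @ np.array([1.0, a, b, a**2, b**2, a*b])

      opt = minimize(predicted, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
      print("fitted coefficients:", np.round(beta, 3))
      print("predicted minimum warpage %.3f at coded settings %s" % (opt.fun, np.round(opt.x, 3)))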

  6. Optimization of medical imaging display systems: using the channelized Hotelling observer for detecting lung nodules: experimental study

    NASA Astrophysics Data System (ADS)

    Platisa, Ljiljana; Vansteenkiste, Ewout; Goossens, Bart; Marchessoux, Cédric; Kimpe, Tom; Philips, Wilfried

    2009-02-01

    Medical-imaging systems are designed to aid medical specialists in a specific task. Therefore, the physical parameters of a system need to be chosen to optimize the task performance of a human observer, which requires measurements of human performance in a given task during system optimization. Typically, psychophysical studies are conducted for this purpose. Numerical observer models have been successfully used to predict human performance in several detection tasks. In particular, the task of signal detection using a channelized Hotelling observer (CHO) in simulated images has been widely explored. However, few studies have been done for clinically acquired images that also contain anatomical noise. In this paper, we investigate the performance of a CHO in the task of detecting lung nodules in real radiographic images of the chest. To evaluate the variability introduced by the limited available data, we employ a commonly used multi-reader multi-case (MRMC) study design, which accounts for both case and reader variability. Finally, we use the "one-shot" methods to estimate the MRMC variance of the area under the ROC curve (AUC). The obtained AUC compares well to those reported for a human observer study on a similar data set. Furthermore, the "one-shot" analysis implies a fairly consistent performance of the CHO, with the variance of the AUC below 0.002. This indicates promising potential for numerical observers in the optimization of medical imaging displays and encourages further investigation of the subject.

  7. Stochastic Optimization for an Analytical Model of Saltwater Intrusion in Coastal Aquifers

    PubMed Central

    Stratis, Paris N.; Karatzas, George P.; Papadopoulou, Elena P.; Zakynthinaki, Maria S.; Saridakis, Yiannis G.

    2016-01-01

    The present study implements a stochastic optimization technique to optimally manage freshwater pumping from coastal aquifers. Our simulations utilize the well-known sharp interface model for saltwater intrusion in coastal aquifers together with its known analytical solution. The objective is to maximize the total volume of freshwater pumped by the wells from the aquifer while, at the same time, protecting the aquifer from saltwater intrusion. With a view to dealing with this problem in real time, the ALOPEX stochastic optimization method is used to optimize the pumping rates of the wells, coupled with a penalty-based strategy that keeps the saltwater front at a safe distance from the wells. Several numerical optimization results, simulating a known real aquifer case, are presented. The results explore the computational performance of the chosen stochastic optimization method as well as its ability to manage freshwater pumping in real aquifer environments. PMID:27689362
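
    As a rough illustration only, a simplified ALOPEX-style correlation update can be sketched as below; this is a generic textbook form of the rule applied to a toy "total pumping minus intrusion penalty" objective, not the authors' model, penalty strategy, or tuning.

      # Simplified ALOPEX-style sketch on a toy pumping objective. Illustrative only.
      import numpy as np

      rng = np.random.default_rng(3)
      n_wells = 4
      q_max = 100.0
      delta, sigma = 0.5, 0.2          # correlation step size and exploration noise

      def reward(q):
          # Toy objective: total pumped freshwater, heavily penalized when a weighted
          # sum of pumping rates (a crude stand-in for saltwater-front advance) is too high.
          front_proxy = 0.4*q[0] + 0.3*q[1] + 0.2*q[2] + 0.5*q[3]
          penalty = 1000.0 * max(0.0, front_proxy - 120.0)
          return q.sum() - penalty

      q = np.full(n_wells, 10.0)
      dq = rng.normal(scale=delta, size=n_wells)
      r_prev = reward(q)

      for step in range(1000):
          q_new = np.clip(q + dq, 0.0, q_max)
          r_new = reward(q_new)
          # ALOPEX-style rule: step each rate in the direction correlated with the
          # previous reward improvement, plus exploration noise.
          corr = (q_new - q) * (r_new - r_prev)
          dq = delta * np.sign(corr) + rng.normal(scale=sigma, size=n_wells)
          q, r_prev = q_new, r_new

      print("pumping rates:", np.round(q, 1), " objective:", round(float(r_prev), 1))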

  8. Optimization of wastewater treatment plant operation for greenhouse gas mitigation.

    PubMed

    Kim, Dongwook; Bowen, James D; Ozelkan, Ertunga C

    2015-11-01

    This study deals with the determination of optimal operation of a wastewater treatment system for minimizing greenhouse gas emissions, operating costs, and pollution loads in the effluent. To do this, an integrated performance index that includes the three objectives was established to assess system performance. The ASMN_G model was used to perform system optimization aimed at determining a set of operational parameters that can satisfy the three different objectives. The complex nonlinear optimization problem was solved using the Nelder-Mead simplex optimization algorithm. A sensitivity analysis was performed to identify the operational parameters that most influence system performance. The results obtained from the optimization simulations for six scenarios demonstrated that there are apparent trade-offs among the three conflicting objectives. The best optimized system simultaneously reduced greenhouse gas emissions by 31%, reduced operating cost by 11%, and improved effluent quality by 2% compared to the base case operation. Copyright © 2015 Elsevier Ltd. All rights reserved.
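
    A skeletal version of the optimization loop described above (the plant model is replaced by a toy surrogate, and the weights and parameters are invented): combine three normalized objectives into one index and minimize it with the Nelder-Mead simplex from SciPy.

      # Sketch: minimize a weighted integrated performance index (GHG, cost, effluent
      # quality) over two operational parameters using Nelder-Mead. Toy surrogate model.
      import numpy as np
      from scipy.optimize import minimize

      def plant_surrogate(params):
          """Stand-in for the treatment plant simulation; returns (ghg, cost, effluent)."""
          aeration, recycle = params
          ghg = 1.0 + 0.8*(aeration - 0.6)**2 + 0.3*recycle          # illustrative
          cost = 2.0 + 0.5*aeration + 0.4*(recycle - 0.5)**2          # illustrative
          effluent = max(1.5 - 0.9*aeration + 0.6*(recycle - 0.7)**2, 0.0)
          return ghg, cost, effluent

      weights = np.array([0.4, 0.3, 0.3])      # relative importance of the three objectives
      baseline = np.array(plant_surrogate([0.5, 0.5]))

      def performance_index(params):
          vals = np.array(plant_surrogate(params))
          return float(weights @ (vals / baseline))   # normalize against base-case operation

      res = minimize(performance_index, x0=[0.5, 0.5], method="Nelder-Mead",
                     options={"xatol": 1e-4, "fatol": 1e-6})
      print("optimized parameters:", np.round(res.x, 3), "index:", round(res.fun, 3))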

  9. Optimizing the parameters of heat transmission in a small heat exchanger with spiral tapes cut as triangles and Aluminum oxide nanofluid using central composite design method

    NASA Astrophysics Data System (ADS)

    Ghasemi, Nahid; Aghayari, Reza; Maddah, Heydar

    2018-07-01

    The present study aims at optimizing heat transmission parameters such as the Nusselt number and friction factor in a small double-pipe heat exchanger equipped with rotating spiral tapes cut as triangles and filled with aluminum oxide nanofluid. The effects of Reynolds number, twist ratio (y/w), rotating twisted tape and concentration (w%) on the Nusselt number and friction factor are also investigated. Central composite design and response surface methodology are used for evaluating the responses necessary for the optimization. According to the optimal curves, the optimum values obtained for the Nusselt number and friction factor were 146.6675 and 0.06020, respectively. Finally, an appropriate correlation is also provided to achieve the optimal model at minimum cost. The optimization results showed that the cost decreased in the best case.

  10. Optimizing the parameters of heat transmission in a small heat exchanger with spiral tapes cut as triangles and Aluminum oxide nanofluid using central composite design method

    NASA Astrophysics Data System (ADS)

    Ghasemi, Nahid; Aghayari, Reza; Maddah, Heydar

    2018-02-01

    The present study aims at optimizing heat transmission parameters such as the Nusselt number and friction factor in a small double-pipe heat exchanger equipped with rotating spiral tapes cut as triangles and filled with aluminum oxide nanofluid. The effects of Reynolds number, twist ratio (y/w), rotating twisted tape and concentration (w%) on the Nusselt number and friction factor are also investigated. Central composite design and response surface methodology are used for evaluating the responses necessary for the optimization. According to the optimal curves, the optimum values obtained for the Nusselt number and friction factor were 146.6675 and 0.06020, respectively. Finally, an appropriate correlation is also provided to achieve the optimal model at minimum cost. The optimization results showed that the cost decreased in the best case.

  11. Comparing the distress thermometer (DT) with the patient health questionnaire (PHQ)-2 for screening for possible cases of depression among patients newly diagnosed with advanced cancer.

    PubMed

    Lazenby, Mark; Dixon, Jane; Bai, Mei; McCorkle, Ruth

    2014-02-01

    Distress screening guidelines call for rapid screening for emotional distress at the time of cancer diagnosis. The purpose of this study was to examine the distress thermometer's (DT) ability to screen in patients in treatment for advanced cancer who may be depressed. Using cross-sectional data collected from patients within 30 days of diagnosis with advanced cancer, this study used ROC analysis to determine the optimal cutoff point of the distress thermometer (DT) for screening for depression as measured by the patient health questionnaire (PHQ)-9; inter-test reliability analysis to compare the DT with the PHQ-2 for screening in possible cases of depression; and multivariate analysis to examine associations of the DT emotional problem list (EPL) items with cases of depression. The average age of the 123 patients in the study was 59.9 (12.9) years. Seventy (56.9%) were female. All had Stage 3 or 4 cancers (40% gastrointestinal, 19% gynecologic, 20% head and neck, 21% lung). The mean DT score was 4 (2.7)/10, and 56 (43%) were depressed as measured by the PHQ-9 ≥ 5. The optimal DT cut-off score to screen in possible cases of depression was ≥ 2/10, with a sensitivity of 0.96, compared to a sensitivity of 0.32 for the PHQ-2 ≥ 2. Correlation coefficients for the DT ≥ 2 and the PHQ-2 with the PHQ-9 ≥ 5 were 0.4 and -0.2, respectively. EPL items associated with cases of depression were Depression (OR = 0.15, 0.02-0.85) and Sadness (OR = 0.21, 0.06-0.72). The optimal DT threshold for identifying possible cases of depression at the time of diagnosis is ≥ 2; this threshold is more sensitive than the PHQ-2 ≥ 2. EPL items may be used with the DT score to triage patients for evaluation.
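
    The optimal-cutoff search reported above can be sketched as follows (the scores and reference labels are simulated, not the study data): compute sensitivity and specificity at each candidate DT threshold and pick the cutoff, for example by requiring high sensitivity or by maximizing Youden's J.

      # Sketch: choose a screening cutoff for a 0-10 distress score against a binary
      # reference standard. Data are simulated, not the study's.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 300
      depressed = rng.random(n) < 0.4                      # reference standard (e.g., PHQ-9 >= 5)
      dt = np.clip(np.round(rng.normal(2.5, 2.0, n) + 2.5*depressed), 0, 10)

      print(" cutoff  sens   spec   youden")
      for cutoff in range(0, 11):
          positive = dt >= cutoff
          tp = np.sum(positive & depressed)
          fn = np.sum(~positive & depressed)
          tn = np.sum(~positive & ~depressed)
          fp = np.sum(positive & ~depressed)
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          print(f"  >={cutoff:2d}  {sens:5.2f}  {spec:5.2f}  {sens + spec - 1:6.2f}")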

  12. Capacity improvement using simulation optimization approaches: A case study in the thermotechnology industry

    NASA Astrophysics Data System (ADS)

    Yelkenci Köse, Simge; Demir, Leyla; Tunalı, Semra; Türsel Eliiyi, Deniz

    2015-02-01

    In manufacturing systems, optimal buffer allocation has a considerable impact on capacity improvement. This study presents a simulation optimization procedure to solve the buffer allocation problem in a heat exchanger production plant so as to improve the capacity of the system. For the optimization, three metaheuristic-based search algorithms, i.e. a binary genetic algorithm (B-GA), a binary simulated annealing algorithm (B-SA) and a binary tabu search algorithm (B-TS), are proposed. These algorithms are integrated with the simulation model of the production line. The simulation model, which captures the stochastic and dynamic nature of the production line, is used as the evaluation function for the proposed metaheuristics. The experimental study with benchmark problem instances from the literature and the real-life problem shows that the proposed B-TS algorithm outperforms B-GA and B-SA in terms of solution quality.

  13. Optimization of Water Resources and Agricultural Activities for Economic Benefit in Colorado

    NASA Astrophysics Data System (ADS)

    LIM, J.; Lall, U.

    2017-12-01

    The limited water resources available for irrigation are a key constraint for the important agricultural sector of Colorado's economy. As climate change and groundwater depletion reshape these resources, it is essential to understand the economic potential of water resources under different agricultural production practices. This study uses linear programming optimization at the county spatial scale and the annual temporal scale to study the optimal allocation of water withdrawals and crop choices. The model, AWASH, reflects streamflow constraints between different extraction points, six field crops, and a distinct irrigation decision for maize and wheat. The optimized decision variables, under different environmental, social, economic, and physical constraints, provide long-term solutions for ground and surface water distribution and for land use decisions so that the state can generate the maximum net revenue. Colorado, one of the largest agricultural producers, is used as a case study, and the sensitivity to water price and to climate variability is explored.
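
    A toy version of this kind of county-scale allocation LP (the crop returns, water requirements, and limits below are invented, not AWASH data): choose irrigated acreage of two crops to maximize net revenue subject to water and land constraints, using scipy.optimize.linprog.

      # Sketch: maximize net revenue from maize and wheat acreage subject to water and
      # land limits. All coefficients are illustrative.
      from scipy.optimize import linprog

      revenue = [450.0, 300.0]        # net revenue per acre ($): maize, wheat
      water_use = [2.1, 1.2]          # acre-feet of irrigation water per acre
      water_available = 180_000.0     # acre-feet
      land_available = 120_000.0      # acres

      # linprog minimizes, so negate revenue to maximize.
      res = linprog(
          c=[-r for r in revenue],
          A_ub=[water_use, [1.0, 1.0]],
          b_ub=[water_available, land_available],
          bounds=[(0, None), (0, None)],
          method="highs",
      )
      maize, wheat = res.x
      print(f"maize acres: {maize:,.0f}, wheat acres: {wheat:,.0f}, revenue: ${-res.fun:,.0f}")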

  14. Portfolio Optimization of Nanomaterial Use in Clean Energy Technologies.

    PubMed

    Moore, Elizabeth A; Babbitt, Callie W; Gaustad, Gabrielle; Moore, Sean T

    2018-04-03

    While engineered nanomaterials (ENMs) are increasingly incorporated in diverse applications, risks of ENM adoption remain difficult to predict and mitigate proactively. Current decision-making tools do not adequately account for ENM uncertainties including varying functional forms, unique environmental behavior, economic costs, unknown supply and demand, and upstream emissions. The complexity of the ENM system necessitates a novel approach: in this study, the adaptation of an investment portfolio optimization model is demonstrated for optimization of ENM use in renewable energy technologies. Where a traditional investment portfolio optimization model maximizes return on investment through optimal selection of stock, ENM portfolio optimization maximizes the performance of energy technology systems by optimizing selective use of ENMs. Cumulative impacts of multiple ENM material portfolios are evaluated in two case studies: organic photovoltaic cells (OPVs) for renewable energy and lithium-ion batteries (LIBs) for electric vehicles. Results indicate ENM adoption is dependent on overall performance and variance of the material, resource use, environmental impact, and economic trade-offs. From a sustainability perspective, improved clean energy applications can help extend product lifespans, reduce fossil energy consumption, and substitute ENMs for scarce incumbent materials.
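
    A stripped-down sketch of the portfolio analogy (the performance means and covariances of the candidate ENMs below are invented): choose material weights to maximize expected performance minus a risk-aversion penalty on variance, mirroring mean-variance stock allocation.

      # Mean-variance sketch for allocating among three candidate nanomaterials.
      # Performance statistics are illustrative, not measured values.
      import numpy as np
      from scipy.optimize import minimize

      mean_perf = np.array([0.12, 0.09, 0.15])        # expected performance gain per material
      cov = np.array([[0.010, 0.002, 0.004],          # variance/covariance of performance
                      [0.002, 0.006, 0.001],
                      [0.004, 0.001, 0.020]])
      risk_aversion = 3.0

      def neg_utility(w):
          return -(w @ mean_perf - risk_aversion * w @ cov @ w)

      n = len(mean_perf)
      constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]   # weights sum to 1
      bounds = [(0.0, 1.0)] * n
      res = minimize(neg_utility, x0=np.full(n, 1.0 / n), bounds=bounds,
                     constraints=constraints)
      print("optimal material weights:", np.round(res.x, 3))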

  15. A quantitative study of IMRT delivery effects in commercial planning systems for the case of oesophagus and prostate tumours.

    PubMed

    Seco, J; Clark, C H; Evans, P M; Webb, S

    2006-05-01

    This study focuses on understanding the impact of intensity-modulated radiotherapy (IMRT) delivery effects when applied to plans generated by commercial treatment-planning systems such as Pinnacle (ADAC Laboratories Inc.) and CadPlan/Helios (Varian Medical Systems). These commercial planning systems have had several version upgrades (with improvements in the optimization algorithm), but the IMRT delivery effects have not been incorporated into the optimization process. IMRT delivery effects include head-scatter fluence from IMRT fields, transmission through leaves and the effect of the rounded shape of the leaf ends. They are usually accounted for after optimization when leaf sequencing the "optimal" fluence profiles, to derive the delivered fluence profile. The study was divided into two main parts: (a) analysing the dose distribution within the planning-target volume (PTV), produced by each of the commercial treatment-planning systems, after the delivered fluence had been renormalized to deliver the correct dose to the PTV; and (b) studying the impact of the IMRT delivery technique on the surrounding critical organs such as the spinal cord, lungs, rectum, bladder etc. The study was performed for tumours of (i) the oesophagus and (ii) the prostate and pelvic nodes. An oesophagus case was planned with the Pinnacle planning system for IMRT delivery, via multiple-static fields (MSF) and compensators, using the Elekta SL25 with a multileaf collimator (MLC) component. A prostate and pelvic nodes IMRT plan was performed with the Cadplan/Helios system for a dynamic delivery (DMLC) using the Varian 120-leaf Millennium MLC. In these commercial planning systems, since IMRT delivery effects are not included into the optimization process, fluence renormalization is required such that the median delivered PTV dose equals the initial prescribed PTV dose. In preparing the optimum fluence profile for delivery, the PTV dose has been "smeared" by the IMRT delivery techniques. In the case of the oesophagus, the critical organ, spinal cord, received a greater dose than initially planned, due to the delivery effects. The increase in the spinal cord dose is of the order of 2-3 Gy. In the case of the prostate and pelvic nodes, the IMRT delivery effects led to an increase of approximately 2 Gy in the dose delivered to the secondary PTV, the pelvic nodes. In addition to this, the small bowel, rectum and bladder received an increased dose of the order of 2-3 Gy to 50% of their total volume. IMRT delivery techniques strongly influence the delivered dose distributions for the oesophagus and prostate/pelvic nodes tumour sites and these effects are not yet accounted for in the Pinnacle and the CadPlan/Helios planning systems. Currently, they must be taken into account during the optimization stage by altering the dose limits accepted during optimization so that the final (sequenced) dose is within the constraints.

  16. A Comparison of the Hot Spot and the Average Cancer Cell Counting Methods and the Optimal Cutoff Point of the Ki-67 Index for Luminal Type Breast Cancer.

    PubMed

    Arima, Nobuyuki; Nishimura, Reiki; Osako, Tomofumi; Nishiyama, Yasuyuki; Fujisue, Mamiko; Okumura, Yasuhiro; Nakano, Masahiro; Tashima, Rumiko; Toyozumi, Yasuo

    2016-01-01

    In this case-control study, we investigated the most suitable cell counting area and the optimal cutoff point of the Ki-67 index. Thirty recurrent cases were selected among hormone receptor (HR)-positive/HER2-negative breast cancer patients. As controls, 90 nonrecurrent cases were randomly selected by allotting 3 controls to each recurrent case based on the following criteria: age, nodal status, tumor size, and adjuvant endocrine therapy alone. Both the hot spot and the average area of the tumor were evaluated on a Ki-67 immunostaining slide. The median Ki-67 index value at the hot spot and average area were 25.0 and 14.5%, respectively. Irrespective of the area counted, the Ki-67 index value was significantly higher in all of the recurrent cases (p < 0.0001). The multivariate analysis revealed that the Ki-67 index value of 20% at the hot spot was the most suitable cutoff point for predicting recurrence. Moreover, higher ΔKi-67 index value (the difference between the hot spot and the average area, ≥10%) and lower progesterone receptor expression (<20%) were significantly correlated with recurrence. A higher Ki-67 index value at the hot spot strongly correlated with recurrence, and the optimal cutoff point was found to be 20%. © 2015 S. Karger AG, Basel.

  17. The hospital anxiety and depression rating scale: A cross-sectional study of psychometrics and case finding abilities in general practice

    PubMed Central

    Olssøn, Ingrid; Mykletun, Arnstein; Dahl, Alv A

    2005-01-01

    Background: General practitioners' (GPs) diagnostic skills lead to underidentification of generalized anxiety disorder (GAD) and major depressive episodes (MDE). The supplement of brief questionnaires could improve the diagnostic accuracy of GPs for these common mental disorders. The aims of this study were to examine the usefulness of the Hospital Anxiety and Depression Rating Scale (HADS) for GPs by: 1) examining its psychometrics in the GPs' setting; 2) testing its case-finding properties compared to patient-rated GAD and MDE (DSM-IV); and 3) comparing its case-finding abilities to those of the GPs using the Clinical Global Impression-Severity (CGI-S) rating. Methods: In a cross-sectional survey study, 1,781 patients attended 141 GPs spread geographically across Norway on three consecutive days in September 2001. Sensitivity, specificity, optimal cut-off score, and area under the curve (AUC) for the HADS and the CGI-S were calculated with the Generalized Anxiety Questionnaire (GAS-Q) as the reference standard for GAD, and the Depression Screening Questionnaire (DSQ) for MDE. Results: The HADS-A had an optimal cut-off of ≥8 (sensitivity 0.89, specificity 0.75) and AUC 0.88, and 76% of patients were correctly classified in relation to GAD. At the optimal cut-off of ≥8 (sensitivity 0.80, specificity 0.88), the HADS-D had AUC 0.93, and 87% of the patients were correctly classified in relation to MDE. The proportions of the total correctly classified at the CGI-S optimal cut-off of ≥3 were 83% of patients for GAD and 81% for MDE. Conclusion: The results indicate that the addition of the patients' HADS scores to GPs' information could improve their diagnostic accuracy for GAD and MDE. PMID:16351733

  18. A method to optimize the shield compact and lightweight combining the structure with components together by genetic algorithm and MCNP code.

    PubMed

    Cai, Yao; Hu, Huasi; Pan, Ziheng; Hu, Guang; Zhang, Tao

    2018-05-17

    To make a shield for neutrons and gamma rays compact and lightweight, a method that optimizes the structure and component materials together was established, employing genetic algorithms and the MCNP code. As a typical case, the fission energy spectrum of 235U, which contains both neutrons and gamma rays, was adopted in this study. Six types of materials were presented and optimized by the method. Spherical geometry was adopted in the optimization after checking the geometry effect. Simulations were made to verify the reliability of the optimization method and the efficiency of the optimized materials. To compare the materials visually and conveniently, the volume and weight needed to build a shield are employed. The results showed that the composite multilayer material has the best performance. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Advanced optimal design concepts for composite material aircraft repair

    NASA Astrophysics Data System (ADS)

    Renaud, Guillaume

    The application of an automated optimization approach for bonded composite patch design is investigated. To do so, a finite element computer analysis tool to evaluate patch design quality was developed. This tool examines both the mechanical and the thermal issues of the problem. The optimized shape is obtained with a bi-quadratic B-spline surface that represents the top surface of the patch. Additional design variables corresponding to the ply angles are also used. Furthermore, a multi-objective optimization approach was developed to treat multiple and uncertain loads. This formulation aims at designing according to the most unfavorable mechanical and thermal loads. The problem of finding the optimal patch shape for several situations is addressed. The objective is to minimize a stress component at a specific point in the host structure (plate) while ensuring acceptable stress levels in the adhesive. A parametric study is performed in order to identify the effects of various shape parameters on the quality of the repair and its optimal configuration. The effects of mechanical loads and service temperature are also investigated. Two bonding methods are considered, as they imply different thermal histories. It is shown that the proposed techniques are effective and inexpensive for analyzing and optimizing composite patch repairs. It is also shown that thermal effects should not only be present in the analysis, but that they play a paramount role on the resulting quality of the optimized design. In all cases, the optimized configuration results in a significant reduction of the desired stress level by deflecting the loads away from rather than over the damage zone, as is the case with standard designs. Furthermore, the automated optimization ensures the safety of the patch design for all considered operating conditions.

  20. Study and Optimization of CPT Resonance Parameters in 87 Rb/Ar/Ne Microcells Aimed for Application in Metrology

    NASA Astrophysics Data System (ADS)

    Masian, Y.; Sivak, A.; Sevostianov, D.; Vassiliev, V.; Velichansky, V.

    The paper presents the results of studies of small-size rubidium cells with argon and neon buffer gases, produced by a patent-pending technique of laser welding [Fishman et al. (2014)]. The cells were designed for a miniature frequency standard. The temperature dependence of the frequency of the coherent population trapping (CPT) resonance was measured and used to optimize the ratio of partial pressures of the buffer gases. The influence of the duration and regime of annealing on the CPT-resonance frequency drift was investigated. The parameters of the FM modulation of the laser current were determined for two cases, corresponding to the highest amplitude of the CPT resonance and to the smallest light shifts of the resonance frequency. The temperature dependences of the CPT resonance frequency were found to be surprisingly different in the two cases. A non-linear dependence of the CPT resonance frequency on the temperature of the cell, with two extrema, was revealed for one of these cases.

  1. Deploying response surface methodology (RSM) and glowworm swarm optimization (GSO) in optimizing warpage on a mobile phone cover

    NASA Astrophysics Data System (ADS)

    Lee, X. N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Shazzuan, S.

    2017-09-01

    Plastic injection moulding is a popular manufacturing method, not only because it is reliable, but also because it is efficient and cost saving. It is able to produce plastic parts with detailed features and complex geometry. However, defects in the injection moulding process degrade the quality and aesthetics of the injection moulded product. The most common defect occurring in the process is warpage. An inappropriate process parameter setting of the injection moulding machine is one of the reasons that lead to the occurrence of warpage. The aim of this study was to improve the quality of the injection moulded part by investigating the optimal parameters for minimizing warpage using Response Surface Methodology (RSM) and Glowworm Swarm Optimization (GSO). Subsequent to this, the most significant parameter was identified, and the recommended parameter setting was compared with the optimized parameter settings from RSM and GSO. In this research, a mobile phone case was selected as the case study. The mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as variables, whereas warpage in the y-direction was selected as the response. The simulation was carried out using Autodesk Moldflow Insight 2012. In addition, the RSM was performed using Design Expert 7.0, whereas the GSO was run in MATLAB. The warpage in the y-direction recommended by RSM was reduced by 70%, and the warpage recommended by GSO was decreased by 61%. The resulting warpages under the optimal parameter settings from RSM and GSO were validated by simulation in AMI 2012. RSM performed better than GSO in solving the warpage issue.

  2. Sant Joan d’Alacant declaration in defense of Open Access to scientific publications, by the group of editors of Spanish journals on health sciences (GERECS)

    PubMed

    Grupo de Editores de Revistas Españolas Sobre Ciencias de la Salud, Gerecs

    2018-01-10

    3-hydroxy-3-methylglutaryl-CoA (HMG-CoA) lyase deficiency is an autosomal recessive disorder that usually presents in the neonatal period with vomiting, metabolic acidosis, hypoglycemia and absent ketonuria. Few cases are reported in the literature, and optimal dietary management and long term outcome are not fully understood. We report a 2 year old girl with HMG-CoA-lyase deficiency who had limited fasting tolerance on a low protein diet, with several recurrent hospital admissions with severe hypoketotic hypoglycaemia and metabolic acidosis. We also review the dietary management and outcome of other reported cases in the literature. In order to define optimal dietary treatment, it is important to collect higher numbers of case studies with detailed dietary management, fasting times and outcome.

  3. Validation of a Case Definition for Pediatric Brain Injury Using Administrative Data.

    PubMed

    McChesney-Corbeil, Jane; Barlow, Karen; Quan, Hude; Chen, Guanmin; Wiebe, Samuel; Jette, Nathalie

    2017-03-01

    Health administrative data are a common population-based data source for traumatic brain injury (TBI) surveillance and research; however, before using these data for surveillance, it is important to develop a validated case definition. The objective of this study was to identify the optimal International Classification of Diseases, 10th edition (ICD-10) case definition to ascertain children with TBI in emergency room (ER) or hospital administrative data. We tested multiple case definitions. Children who visited the ER were identified from the Regional Emergency Department Information System at Alberta Children's Hospital. Secondary data were collected for children with trauma, musculoskeletal, or central nervous system complaints who visited the ER between October 5, 2005, and June 6, 2007. TBI status was determined based on chart review. Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated for each case definition. Of 6639 patients, 1343 had a TBI. The best case definition was "1 hospital or 1 ER encounter coded with an ICD-10 code for TBI in 1 year" (sensitivity 69.8% [95% confidence interval (CI), 67.3-72.2], specificity 96.7% [95% CI, 96.2-97.2], PPV 84.2% [95% CI, 82.0-86.3], NPV 92.7% [95% CI, 92.0-93.3]). The nonspecific code S09.9 identified >80% of TBI cases in our study. The optimal ICD-10-based case definition for pediatric TBI in this study is valid and should be considered for future pediatric TBI surveillance studies. However, external validation is recommended before use in other jurisdictions, particularly because it is plausible that a larger proportion of patients in our cohort had milder injuries.
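
    The validation metrics quoted above follow directly from a 2x2 confusion matrix; a small helper makes the definitions explicit (the counts below are approximate values back-calculated from the reported percentages and cohort size, for illustration only):

      # Sketch: sensitivity, specificity, PPV and NPV from chart-review vs ICD-10 counts.
      # Counts are approximate reconstructions, not the study's exact table.
      def case_definition_metrics(tp, fp, fn, tn):
          return {
              "sensitivity": tp / (tp + fn),   # coded TBI among true TBI
              "specificity": tn / (tn + fp),   # not coded among true non-TBI
              "ppv":         tp / (tp + fp),   # true TBI among coded TBI
              "npv":         tn / (tn + fn),   # true non-TBI among not coded
          }

      metrics = case_definition_metrics(tp=938, fp=175, fn=405, tn=5121)
      for name, value in metrics.items():
          print(f"{name}: {value:.3f}")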

  4. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively.

  5. The Design of Secondary Schools--A Case Study, Singapore.

    ERIC Educational Resources Information Center

    Liew Kok-Pun, Michael; And Others

    Land scarcity dominates the thinking of school planners in Singapore. Techniques for optimizing the use of land for schools include (1) the construction of multi-storied or high-rise schools; (2) operation of a double-shift system and, in some cases, a triple-shift system; (3) multiple use of educational spaces; and (4) construction of several…

  6. Investigation of earthquake factor for optimum tuned mass dampers

    NASA Astrophysics Data System (ADS)

    Nigdeli, Sinan Melih; Bekdaş, Gebrail

    2012-09-01

    In this study, the optimum parameters of tuned mass dampers (TMDs) are investigated under earthquake excitations. The optimization was carried out by using the Harmony Search (HS) algorithm, a metaheuristic method inspired by the nature of musical performances. In addition, the results obtained with the HS algorithm are compared with the results of the other documented method, and the inferior results are eliminated; in that way, the best optimum results are obtained. During the optimization, the optimum TMD parameters were searched for single degree of freedom (SDOF) structure models with different periods. The optimization was done for different earthquakes separately and the results were compared.

  7. Computer model for refinery operations with emphasis on jet fuel production. Volume 1: Program description

    NASA Technical Reports Server (NTRS)

    Dunbar, D. N.; Tunnah, B. G.

    1978-01-01

    A FORTRAN computer program is described for predicting the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.

  8. Computer model for refinery operations with emphasis on jet fuel production. Volume 3: Detailed systems and programming documentation

    NASA Technical Reports Server (NTRS)

    Dunbar, D. N.; Tunnah, B. G.

    1978-01-01

    The FORTRAN computing program predicts flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuels of varying end point and hydrogen content specifications. The program has a provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.

  9. A dynamic feedback-control toll pricing methodology : a case study on Interstate 95 managed lanes.

    DOT National Transportation Integrated Search

    2013-06-01

    Recently, congestion pricing emerged as a cost-effective and efficient strategy to mitigate the congestion problem on freeways. This study develops a feedback-control based dynamic toll approach to formulate and solve for optimal tolls. The study com...

  10. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    PubMed

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed - the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the tradeoff between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness, but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD Anderson Cancer Center, and MD Anderson's cancer center support grant CA016672. © 2012 American Association of Physicists in Medicine.
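
    The per-voxel SD and the SD-volume histogram (SVH) described above reduce to simple array arithmetic. The sketch below uses randomly generated doses for nine hypothetical scenarios (nominal, ± setup shifts along three axes, ± range error) and an arbitrary structure mask, so all numbers are placeholders rather than clinical data.

```python
# Minimal sketch: per-voxel standard deviation across nine scenario doses and
# an SD-volume histogram (SVH) for one structure. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 10_000
nominal = rng.uniform(50.0, 70.0, n_voxels)                # Gy, synthetic
# Nine scenarios: nominal plus +/- setup shifts (x, y, z) and +/- range errors.
scenarios = np.stack([nominal] +
                     [nominal + rng.normal(0.0, 1.5, n_voxels) for _ in range(8)])

per_voxel_sd = scenarios.std(axis=0)                       # SD of dose per voxel

structure_mask = np.zeros(n_voxels, dtype=bool)            # e.g. CTV voxels
structure_mask[:2_000] = True

def svh(sd_values, thresholds):
    """Fraction of the structure volume whose per-voxel SD exceeds each threshold."""
    return np.array([(sd_values >= t).mean() for t in thresholds])

thresholds = np.linspace(0.0, per_voxel_sd.max(), 50)
curve = svh(per_voxel_sd[structure_mask], thresholds)
# Area under the SVH curve as a scalar robustness measure (trapezoid rule).
area = float(np.sum(0.5 * (curve[1:] + curve[:-1]) * np.diff(thresholds)))
print(f"area under SVH: {area:.3f} Gy")
```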

  11. Optimal control problem for linear fractional-order systems, described by equations with Hadamard-type derivative

    NASA Astrophysics Data System (ADS)

    Postnov, Sergey

    2017-11-01

    Two kinds of optimal control problem are investigated for linear time-invariant fractional-order systems with lumped parameters whose dynamics are described by equations with a Hadamard-type derivative: the problem of control with minimal norm and the problem of control with minimal time under a given restriction on the control norm. The problem setting with nonlocal initial conditions is studied. Admissible controls are allowed to be p-integrable functions (p > 1) on a half-interval. The optimal control problems are studied by the moment method. The correctness and solvability conditions for the corresponding moment problem are derived. For several special cases the optimal control problems stated are solved analytically. Some analogies are pointed out between the results obtained and those known for integer-order systems and for fractional-order systems described by equations with Caputo- and Riemann-Liouville-type derivatives.

  12. AN OPTIMAL MAINTENANCE MANAGEMENT MODEL FOR AIRPORT CONCRETE PAVEMENT

    NASA Astrophysics Data System (ADS)

    Shimomura, Taizo; Fujimori, Yuji; Kaito, Kiyoyuki; Obama, Kengo; Kobayashi, Kiyoshi

    In this paper, an optimal management model is formulated for the performance-based rehabilitation/maintenance contract for airport concrete pavement, whereby two types of life cycle cost risks, i.e., ground consolidation risk and concrete depreciation risk, are explicitly considered. A non-homogeneous Markov chain model is formulated to represent the deterioration processes of concrete pavement, which are conditional upon the ground consolidation processes. The optimal non-homogeneous Markov decision model with multiple types of risk is presented to design the optimal rehabilitation/maintenance plans. A methodology is also presented to revise the optimal rehabilitation/maintenance plans based upon monitoring data using Bayesian updating rules. The validity of the methodology presented in this paper is examined based upon case studies carried out for the H airport.
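
    To make the deterioration model concrete, the following is a small sketch of propagating pavement condition-state probabilities through a non-homogeneous Markov chain whose yearly transition matrix changes with time, a crude stand-in for the dependence on ground consolidation. The four condition states and all transition probabilities are invented for illustration.

```python
# Minimal sketch of a non-homogeneous Markov deterioration model: the yearly
# transition matrix worsens over time (a stand-in for ground consolidation).
# States 0..3 run from "good" to "failed"; all probabilities are illustrative.
import numpy as np

def transition_matrix(year):
    """Yearly transition matrix; deterioration accelerates with time."""
    p = min(0.05 + 0.01 * year, 0.4)         # probability of dropping one state
    return np.array([
        [1 - p,     p,   0.0, 0.0],
        [0.0,   1 - p,     p, 0.0],
        [0.0,     0.0, 1 - p,   p],
        [0.0,     0.0,   0.0, 1.0],           # failed state is absorbing
    ])

state = np.array([1.0, 0.0, 0.0, 0.0])        # new pavement: all mass in state 0
horizon = 20
for year in range(horizon):
    state = state @ transition_matrix(year)

print("state probabilities after", horizon, "years:", np.round(state, 3))
print("probability of being in the failed state:", round(float(state[-1]), 3))
```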

  13. Engineering two-wire optical antennas for near field enhancement

    NASA Astrophysics Data System (ADS)

    Yang, Zhong-Jian; Zhao, Qian; Xiao, Si; He, Jun

    2017-07-01

    We study the optimization of near field enhancement in the two-wire optical antenna system. By varying the nanowire sizes we obtain the optimized side-length (width and height) for the maximum field enhancement with a given gap size. The optimized side-length applies to a broadband range (λ = 650-1000 nm). The ratio of extinction cross section to field concentration size is found to be closely related to the field enhancement behavior. We also investigate two experimentally feasible cases, antennas on a glass substrate and on a mirror, and find that the optimized side-length also applies to these systems. It is also found that the optimized side-length tends to increase with the gap size. Our results could find applications in field-enhanced spectroscopies.

  14. Case study: technology initiative led to advanced lead optimization screening processes at Bristol-Myers Squibb, 2004-2009.

    PubMed

    Zhang, Litao; Cvijic, Mary Ellen; Lippy, Jonathan; Myslik, James; Brenner, Stephen L; Binnie, Alastair; Houston, John G

    2012-07-01

    In this paper, we review the key solutions that enabled evolution of the lead optimization screening support process at Bristol-Myers Squibb (BMS) between 2004 and 2009. During this time, technology infrastructure investment and scientific expertise integration laid the foundations to build and tailor lead optimization screening support models across all therapeutic groups at BMS. Together, harnessing advanced screening technology platforms and expanding panel screening strategy led to a paradigm shift at BMS in supporting lead optimization screening capability. Parallel SAR and structure liability relationship (SLR) screening approaches were first and broadly introduced to empower more-rapid and -informed decisions about chemical synthesis strategy and to broaden options for identifying high-quality drug candidates during lead optimization. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Optimization of the Hartmann-Shack microlens array

    NASA Astrophysics Data System (ADS)

    de Oliveira, Otávio Gomes; de Lima Monteiro, Davies William

    2011-04-01

    In this work we propose to optimize the microlens-array geometry for a Hartmann-Shack wavefront sensor. The optimization makes it possible to replace regular microlens arrays with a larger number of microlenses by arrays with fewer microlenses located at optimal sampling positions, with no increase in the reconstruction error. The goal is to propose a straightforward and widely accessible numerical method to calculate an optimized microlens array for known aberration statistics. The optimization comprises the minimization of the wavefront reconstruction error and/or the number of necessary microlenses in the array. We numerically generate, sample and reconstruct the wavefront, and use a genetic algorithm to discover the optimal array geometry. Within an ophthalmological context, as a case study, we demonstrate that an array with only 10 suitably located microlenses can be used to produce reconstruction errors as small as those of a 36-microlens regular array. The same optimization procedure can be employed for any application where the wavefront statistics are known.

  16. Optimal Bayesian Adaptive Design for Test-Item Calibration.

    PubMed

    van der Linden, Wim J; Ren, Hao

    2015-06-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the current posterior distributions of the field-test parameters. Different criteria of optimality based on the two types of posterior distributions are possible. The design can be implemented using an MCMC scheme with alternating stages of sampling from the posterior distributions of the test takers' ability parameters and the parameters of the field-test items while reusing samples from earlier posterior distributions of the other parameters. Results from a simulation study demonstrated the feasibility of the proposed MCMC implementation for operational item calibration. A comparison of performances for different optimality criteria showed faster calibration of substantial numbers of items for the criterion of D-optimality relative to A-optimality, a special case of c-optimality, and random assignment of items to the test takers.

  17. Application of dragonfly algorithm for optimal performance analysis of process parameters in turn-mill operations- A case study

    NASA Astrophysics Data System (ADS)

    Vikram, K. Arun; Ratnam, Ch; Lakshmi, VVK; Kumar, A. Sunny; Ramakanth, RT

    2018-02-01

    Meta-heuristic multi-response optimization methods are widely used to solve multi-objective problems and to obtain Pareto optimal solutions during optimization. This work focuses on the optimal multi-response evaluation of process parameters in generating responses like surface roughness (Ra), surface hardness (H) and tool vibration displacement amplitude (Vib) while performing tangential and orthogonal turn-mill operations on an A-axis Computer Numerical Control vertical milling center. Tool speed, feed rate and depth of cut are considered as process parameters; brass material is machined under dry conditions with high speed steel end milling cutters using Taguchi design of experiments (DOE). A meta-heuristic, the dragonfly algorithm, is used to optimize the multiple objectives 'Ra', 'H' and 'Vib' and to identify the optimal multi-response process parameter combination. Later, the results obtained from the multi-objective dragonfly algorithm (MODA) are compared with another multi-response optimization technique, viz. grey relational analysis (GRA).

  18. Web-based newborn screening system for metabolic diseases: machine learning versus clinicians.

    PubMed

    Chen, Wei-Hsin; Hsieh, Sheau-Ling; Hsu, Kai-Ping; Chen, Han-Ping; Su, Xing-Yu; Tseng, Yi-Ju; Chien, Yin-Hsiu; Hwu, Wuh-Liang; Lai, Feipei

    2013-05-23

    A hospital information system (HIS) that integrates screening data and interpretation of the data is routinely requested by hospitals and parents. However, the accuracy of disease classification may be low because of the disease characteristics and the analytes used for classification. The objective of this study is to describe a system that enhanced the neonatal screening system of the Newborn Screening Center at the National Taiwan University Hospital. The system was designed and deployed according to a service-oriented architecture (SOA) framework under the Web services .NET environment. The system consists of sample collection, testing, diagnosis, evaluation, treatment, and follow-up services among collaborating hospitals. To improve the accuracy of newborn screening, machine learning and optimal feature selection mechanisms were investigated for screening newborns for inborn errors of metabolism. The framework of the Newborn Screening Hospital Information System (NSHIS) used the embedded Health Level Seven (HL7) standards for data exchanges among heterogeneous platforms integrated by Web services in the C# language. In this study, machine learning classification was used to predict phenylketonuria (PKU), hypermethioninemia, and 3-methylcrotonyl-CoA-carboxylase (3-MCC) deficiency. The classification methods used 347,312 newborn dried blood samples collected at the Center between 2006 and 2011. Of these, 220 newborns had values over the diagnostic cutoffs (positive cases) and 1557 had values that were over the screening cutoffs but did not meet the diagnostic cutoffs (suspected cases). The original 35 analytes and the manifested features were ranked based on F score, and combinations of the top 20 ranked features were then selected as input features to support vector machine (SVM) classifiers to obtain optimal feature sets. These feature sets were tested using 5-fold cross-validation and optimal models were generated. The datasets collected in 2011 were used as prediction cases. The feature selection strategies were implemented and the optimal markers for PKU, hypermethioninemia, and 3-MCC deficiency were obtained. The results of the machine learning approach were compared with the cutoff scheme. The number of false positive cases was reduced from 21 to 2 for PKU, from 30 to 10 for hypermethioninemia, and from 209 to 46 for 3-MCC deficiency. This SOA Web service-based newborn screening system can accelerate screening procedures effectively and efficiently. An SVM learning methodology for classification of the PKU, hypermethioninemia, and 3-MCC deficiency metabolic diseases, including optimal feature selection strategies, is presented. By adopting the results of this study, the number of suspected cases could be reduced dramatically.
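
    The classification pipeline described above (F-score ranking of analytes, top-20 feature subsets, SVM classifiers, 5-fold cross-validation) can be sketched with scikit-learn as below. The synthetic, imbalanced data, the number of retained features and the SVM settings are placeholders, not the screening center's actual analytes or cutoffs.

```python
# Minimal sketch of the screening classifier: rank features by ANOVA F-score,
# keep the top-k, and cross-validate an SVM. Data here are synthetic stand-ins
# for the newborn screening analytes; k and the SVM settings are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Imbalanced toy data: few "positive" newborns among many screened samples.
X, y = make_classification(n_samples=2000, n_features=35, n_informative=8,
                           weights=[0.97, 0.03], random_state=0)

pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=f_classif, k=20),   # keep the 20 best-ranked features
    SVC(kernel="rbf", class_weight="balanced"),
)

scores = cross_val_score(pipeline, X, y, cv=5, scoring="f1")
print("5-fold F1 scores:", np.round(scores, 3), "mean:", round(scores.mean(), 3))
```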

  19. Web-Based Newborn Screening System for Metabolic Diseases: Machine Learning Versus Clinicians

    PubMed Central

    Chen, Wei-Hsin; Hsu, Kai-Ping; Chen, Han-Ping; Su, Xing-Yu; Tseng, Yi-Ju; Chien, Yin-Hsiu; Hwu, Wuh-Liang; Lai, Feipei

    2013-01-01

    Background: A hospital information system (HIS) that integrates screening data and interpretation of the data is routinely requested by hospitals and parents. However, the accuracy of disease classification may be low because of the disease characteristics and the analytes used for classification. Objective: The objective of this study is to describe a system that enhanced the neonatal screening system of the Newborn Screening Center at the National Taiwan University Hospital. The system was designed and deployed according to a service-oriented architecture (SOA) framework under the Web services .NET environment. The system consists of sample collection, testing, diagnosis, evaluation, treatment, and follow-up services among collaborating hospitals. To improve the accuracy of newborn screening, machine learning and optimal feature selection mechanisms were investigated for screening newborns for inborn errors of metabolism. Methods: The framework of the Newborn Screening Hospital Information System (NSHIS) used the embedded Health Level Seven (HL7) standards for data exchanges among heterogeneous platforms integrated by Web services in the C# language. In this study, machine learning classification was used to predict phenylketonuria (PKU), hypermethioninemia, and 3-methylcrotonyl-CoA-carboxylase (3-MCC) deficiency. The classification methods used 347,312 newborn dried blood samples collected at the Center between 2006 and 2011. Of these, 220 newborns had values over the diagnostic cutoffs (positive cases) and 1557 had values that were over the screening cutoffs but did not meet the diagnostic cutoffs (suspected cases). The original 35 analytes and the manifested features were ranked based on F score, and combinations of the top 20 ranked features were then selected as input features to support vector machine (SVM) classifiers to obtain optimal feature sets. These feature sets were tested using 5-fold cross-validation and optimal models were generated. The datasets collected in 2011 were used as prediction cases. Results: The feature selection strategies were implemented and the optimal markers for PKU, hypermethioninemia, and 3-MCC deficiency were obtained. The results of the machine learning approach were compared with the cutoff scheme. The number of false positive cases was reduced from 21 to 2 for PKU, from 30 to 10 for hypermethioninemia, and from 209 to 46 for 3-MCC deficiency. Conclusions: This SOA Web service–based newborn screening system can accelerate screening procedures effectively and efficiently. An SVM learning methodology for classification of the PKU, hypermethioninemia, and 3-MCC deficiency metabolic diseases, including optimal feature selection strategies, is presented. By adopting the results of this study, the number of suspected cases could be reduced dramatically. PMID:23702487

  20. Optimization of Micro Metal Injection Molding By Using Grey Relational Grade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, M. H. I.; Precision Process Research Group, Dept. of Mechanical and Materials Engineering, Faculty of Engineering, Universiti Kebangsaan Malaysia; Muhamad, N.

    2011-01-17

    Micro metal injection molding (μMIM), which is a variant of the MIM process, is a promising method towards near net-shape metallic micro components of complex geometry. In this paper, μMIM is applied to produce 316L stainless steel micro components. Due to the highly stringent characteristics of μMIM properties, the study emphasizes optimization of the process parameters, where the Taguchi method associated with Grey Relational Analysis (GRA) is implemented as it represents a novel approach towards the investigation of multiple performance characteristics. The basic idea of GRA is to find a grey relational grade (GRG) which can be used for the conversion from a multi-objective case, here density and strength, to a single-objective case. After considering the form 'the larger the better', results show that the injection time (D) is the most significant parameter, followed by injection pressure (A), holding time (E), mold temperature (C) and injection temperature (B). Analysis of variance (ANOVA) is also employed to strengthen the significance of each parameter involved in this study.
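
    Since the record turns on converting two responses (density and strength) into a single grey relational grade, here is a small numpy sketch of that conversion under the larger-the-better form. The nine hypothetical experimental runs and the distinguishing coefficient of 0.5 are illustrative assumptions.

```python
# Minimal grey relational grade (GRG) sketch for two "larger-the-better"
# responses (e.g. density and strength). The 9 runs and values are invented.
import numpy as np

responses = np.array([          # rows = experimental runs, cols = responses
    [7.52, 410.0], [7.55, 425.0], [7.49, 402.0],
    [7.58, 440.0], [7.53, 418.0], [7.57, 433.0],
    [7.50, 395.0], [7.56, 428.0], [7.54, 421.0],
])

# Step 1: normalize each response with the larger-the-better form.
normalized = (responses - responses.min(axis=0)) / (
    responses.max(axis=0) - responses.min(axis=0))

# Step 2: deviation from the ideal (normalized value of 1) and grey relational
# coefficients with distinguishing coefficient zeta = 0.5.
deviation = 1.0 - normalized
zeta = 0.5
coeff = (deviation.min() + zeta * deviation.max()) / (deviation + zeta * deviation.max())

# Step 3: grey relational grade = mean of the coefficients per run.
grg = coeff.mean(axis=1)
best_run = int(np.argmax(grg))
print("grey relational grades:", np.round(grg, 3))
print("best run (0-based index):", best_run)
```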

  1. An efficient inverse radiotherapy planning method for VMAT using quadratic programming optimization.

    PubMed

    Hoegele, W; Loeschel, R; Merkle, N; Zygmanski, P

    2012-01-01

    The purpose of this study is to investigate the feasibility of an inverse planning optimization approach for the Volumetric Modulated Arc Therapy (VMAT) based on quadratic programming and the projection method. The performance of this method is evaluated against a reference commercial planning system (eclipse(TM) for rapidarc(TM)) for clinically relevant cases. The inverse problem is posed in terms of a linear combination of basis functions representing arclet dose contributions and their respective linear coefficients as degrees of freedom. MLC motion is decomposed into basic motion patterns in an intuitive manner leading to a system of equations with a relatively small number of equations and unknowns. These equations are solved using quadratic programming under certain limiting physical conditions for the solution, such as the avoidance of negative dose during optimization and Monitor Unit reduction. The modeling by the projection method assures a unique treatment plan with beneficial properties, such as the explicit relation between organ weightings and the final dose distribution. Clinical cases studied include prostate and spine treatments. The optimized plans are evaluated by comparing isodose lines, DVH profiles for target and normal organs, and Monitor Units to those obtained by the clinical treatment planning system eclipse(TM). The resulting dose distributions for a prostate (with rectum and bladder as organs at risk), and for a spine case (with kidneys, liver, lung and heart as organs at risk) are presented. Overall, the results indicate that similar plan qualities for quadratic programming (QP) and rapidarc(TM) could be achieved at significantly more efficient computational and planning effort using QP. Additionally, results for the quasimodo phantom [Bohsung et al., "IMRT treatment planning: A comparative inter-system and inter-centre planning exercise of the estro quasimodo group," Radiother. Oncol. 76(3), 354-361 (2005)] are presented as an example for an extreme concave case. Quadratic programming is an alternative approach for inverse planning which generates clinically satisfying plans in comparison to the clinical system and constitutes an efficient optimization process characterized by uniqueness and reproducibility of the solution.
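
    As a rough illustration of the quadratic-programming idea behind such inverse planning, the sketch below solves a tiny weighted least-squares dose problem with non-negative beamlet weights. The three-voxel dose-influence matrix, the prescription values, the voxel weights and the use of scipy's bounded least-squares solver are assumptions made for the example, not the authors' arclet formulation.

```python
# Toy quadratic-programming-style inverse plan: find non-negative beamlet
# weights w minimizing ||D w - d_presc||^2. The influence matrix D and the
# prescribed doses are tiny invented numbers, purely for illustration.
import numpy as np
from scipy.optimize import lsq_linear

# Rows = voxels (2 target voxels, 1 organ-at-risk voxel), cols = 4 beamlets.
D = np.array([
    [0.9, 0.7, 0.1, 0.0],     # target voxel 1
    [0.2, 0.8, 0.9, 0.1],     # target voxel 2
    [0.0, 0.3, 0.4, 0.8],     # organ-at-risk voxel
])
prescription = np.array([60.0, 60.0, 0.0])     # want 60 Gy to target, 0 to OAR
weights = np.array([1.0, 1.0, 0.3])            # importance weighting per voxel

# Weighted least squares with the physical constraint w >= 0 (no negative dose).
W = np.sqrt(weights)[:, None]
result = lsq_linear(W * D, W.ravel() * prescription, bounds=(0.0, np.inf))

dose = D @ result.x
print("beamlet weights:", np.round(result.x, 2))
print("delivered dose per voxel:", np.round(dose, 2))
```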

  2. A robust simulation-optimization modeling system for effluent trading--a case study of nonpoint source pollution control.

    PubMed

    Zhang, J L; Li, Y P; Huang, G H

    2014-04-01

    In this study, a robust simulation-optimization modeling system (RSOMS) is developed for supporting agricultural nonpoint source (NPS) effluent trading planning. The RSOMS can enhance effluent trading through the incorporation of a distributed simulation model and an optimization model within its framework. The modeling system can not only handle uncertainties expressed as probability density functions and interval values but also deal with the variability of second-stage costs that are above the expected level, as well as capture the notion of risk under high-variability situations. A case study is conducted for mitigating agricultural NPS pollution with an effluent trading program in the Xiangxi watershed. Compared with the non-trading policy, the trading scheme can successfully mitigate agricultural NPS pollution with an increased system benefit. Through the trading scheme, [213.7, 288.8] × 10^3 kg of TN and [11.8, 30.2] × 10^3 kg of TP emissions from the cropped area can be cut during the planning horizon. The results can help identify desired effluent trading schemes for water quality management, with the tradeoff between system benefit and reliability being balanced and risk aversion being considered.

  3. SU-E-T-07: 4DCT Robust Optimization for Esophageal Cancer Using Intensity Modulated Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, L; Department of Industrial Engineering, University of Houston, Houston, TX; Yu, J

    2015-06-15

    Purpose: To develop a 4DCT robust optimization method to reduce the dosimetric impact of respiratory motion in intensity modulated proton therapy (IMPT) for esophageal cancer. Methods: Four esophageal cancer patients were selected for this study. The different phases of CT from a set of 4DCT were incorporated into the worst-case dose distribution robust optimization algorithm. 4DCT robust treatment plans were designed and compared with the conventional non-robust plans. The resulting doses were calculated on the average and maximum inhale/exhale phases of the 4DCT. Dose volume histogram (DVH) band graphics and ΔD95%, ΔD98%, ΔD5%, ΔD2% of the CTV between different phases were used to evaluate the robustness of the plans. Results: Compared to the IMPT plans optimized using conventional methods, the 4DCT robust IMPT plans achieve the same quality in nominal cases while yielding better robustness to breathing motion. The mean ΔD95%, ΔD98%, ΔD5% and ΔD2% of the CTV are 6%, 3.2%, 0.9% and 1% for the robustly optimized plans vs. 16.2%, 11.8%, 1.6% and 3.3% for the conventional non-robust plans. Conclusion: A 4DCT robust optimization method was proposed for esophageal cancer using IMPT. We demonstrate that the 4DCT robust optimization can mitigate the dose deviation caused by diaphragm motion.

  4. Optimization of Ballast Design: A Case Study of the Physics Entrepreneurship Program

    NASA Astrophysics Data System (ADS)

    Ding, Jun; Cheng, Norman; Lamouri, Abbas; Sulcs, Juris; Brown, Robert; Taylor, Cyrus

    2001-10-01

    This talk presents a typical internship project for students in the Physics Entrepreneurship Program at Case Western Reserve University. As part of its overall strategy, Advanced Lighting International (ADLT) is involved in the production of magnetic ballasts for metal halide lamps. The systems in which these ballasts function are undergoing rapid evolution, leading to the question of how the design of the ballasts can be optimized in order to deliver superior performance at lower cost. Addressing this question requires a full understanding of a variety of issues ranging from basic modeling of the physics of the magnetic ballasts to questions of overall market strategy, manufacturing considerations, and the competitive environment.

  5. A framework for using ant colony optimization to schedule environmental flow management alternatives for rivers, wetlands, and floodplains

    NASA Astrophysics Data System (ADS)

    Szemis, J. M.; Maier, H. R.; Dandy, G. C.

    2012-08-01

    Rivers, wetlands, and floodplains are in need of management as they have been altered from natural conditions and are at risk of vanishing because of river development. One method to mitigate these impacts involves the scheduling of environmental flow management alternatives (EFMA); however, this is a complex task as there are generally a large number of ecological assets (e.g., wetlands) that need to be considered, each with species with competing flow requirements. Hence, this problem evolves into an optimization problem to maximize an ecological benefit within constraints imposed by human needs and the physical layout of the system. This paper presents a novel optimization framework which uses ant colony optimization to enable optimal scheduling of EFMAs, given constraints on the environmental water that is available. This optimization algorithm is selected because, unlike other currently popular algorithms, it is able to account for all aspects of the problem. The approach is validated by comparing it to a heuristic approach, and its utility is demonstrated using a case study based on the Murray River in South Australia to investigate (1) the trade-off between plant recruitment (i.e., promoting germination) and maintenance (i.e., maintaining habitat) flow requirements, (2) the trade-off between flora and fauna flow requirements, and (3) a hydrograph inversion case. The results demonstrate the usefulness and flexibility of the proposed framework as it is able to determine EFMA schedules that provide optimal or near-optimal trade-offs between the competing needs of species under a range of operating conditions and valuable insight for managers.

  6. Application of Hybrid Optimization-Expert System for Optimal Power Management on Board Space Power Station

    NASA Technical Reports Server (NTRS)

    Momoh, James; Chattopadhyay, Deb; Basheer, Omar Ali AL

    1996-01-01

    The space power system has two sources of energy: photo-voltaic blankets and batteries. The optimal power management problem on-board has two broad operations: off-line power scheduling to determine the load allocation schedule of the next several hours based on the forecast of load and solar power availability. The nature of this study puts less emphasis on speed requirement for computation and more importance on the optimality of the solution. The second category problem, on-line power rescheduling, is needed in the event of occurrence of a contingency to optimally reschedule the loads to minimize the 'unused' or 'wasted' energy while keeping the priority on certain type of load and minimum disturbance of the original optimal schedule determined in the first-stage off-line study. The computational performance of the on-line 'rescheduler' is an important criterion and plays a critical role in the selection of the appropriate tool. The Howard University Center for Energy Systems and Control has developed a hybrid optimization-expert systems based power management program. The pre-scheduler has been developed using a non-linear multi-objective optimization technique called the Outer Approximation method and implemented using the General Algebraic Modeling System (GAMS). The optimization model has the capability of dealing with multiple conflicting objectives viz. maximizing energy utilization, minimizing the variation of load over a day, etc. and incorporates several complex interaction between the loads in a space system. The rescheduling is performed using an expert system developed in PROLOG which utilizes a rule-base for reallocation of the loads in an emergency condition viz. shortage of power due to solar array failure, increase of base load, addition of new activity, repetition of old activity etc. Both the modules handle decision making on battery charging and discharging and allocation of loads over a time-horizon of a day divided into intervals of 10 minutes. The models have been extensively tested using a case study for the Space Station Freedom and the results for the case study will be presented. Several future enhancements of the pre-scheduler and the 'rescheduler' have been outlined which include graphic analyzer for the on-line module, incorporating probabilistic considerations, including spatial location of the loads and the connectivity using a direct current (DC) load flow model.

  7. Optimization of Geothermal Well Placement under Geological Uncertainty

    NASA Astrophysics Data System (ADS)

    Schulte, Daniel O.; Arnold, Dan; Demyanov, Vasily; Sass, Ingo; Geiger, Sebastian

    2017-04-01

    Well placement optimization is critical to the commercial success of geothermal projects. However, uncertainties in geological parameters prohibit optimization based on a single scenario of the subsurface, particularly when few expensive wells are to be drilled. The optimization of borehole locations is usually based on numerical reservoir models to predict reservoir performance and entails the choice of objectives to optimize (total enthalpy, minimum enthalpy rate, production temperature) and of development options to adjust (well location, pump rate, difference in production and injection temperature). Optimization traditionally requires trying different development options on a single geological realization, yet many different interpretations are possible. Therefore, we aim to optimize across a range of representative geological models to account for geological uncertainty in geothermal optimization. We present an approach that uses a response surface methodology based on a large number of geological realizations selected by experimental design to optimize the placement of geothermal wells in a realistic field example. A large number of geological scenarios and design options were simulated and the response surfaces were constructed using polynomial proxy models, which consider both geological uncertainties and design parameters. The polynomial proxies were validated against additional simulation runs and shown to provide an adequate representation of the model response for the cases tested. The resulting proxy models allow for the identification of the optimal borehole locations given the mean response of the geological scenarios from the proxy (i.e., maximizing or minimizing the mean response). The approach is demonstrated on the realistic Watt field example by optimizing the borehole locations to maximize the mean heat extraction from the reservoir under geological uncertainty. The training simulations are based on a comprehensive semi-synthetic data set of a hierarchical benchmark case study for a hydrocarbon reservoir, which specifically considers the interpretational uncertainty in the modeling work flow. The optimal choice of boreholes prolongs the time to cold water breakthrough and allows for higher pump rates and increased water production temperatures.
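
    A stripped-down version of the proxy-model idea, fit a polynomial response surface per geological realization and then pick the design that maximizes the mean response over realizations, is sketched below. The one-dimensional "well location", the synthetic heat-extraction function and the quadratic proxy are illustrative stand-ins for the actual reservoir simulations and experimental design.

```python
# Minimal response-surface sketch: simulate heat extraction for sampled
# (well location, geological scenario) pairs, fit a quadratic-in-location proxy
# per scenario, and choose the location maximizing the mean proxy response.
# The "simulator" below is a synthetic placeholder, not a reservoir model.
import numpy as np

rng = np.random.default_rng(3)

def fake_simulator(location, scenario_shift):
    """Placeholder for an expensive reservoir simulation run."""
    return -(location - (0.6 + scenario_shift)) ** 2 + 1.0

locations = np.linspace(0.0, 1.0, 9)                # candidate well positions
scenario_shifts = rng.normal(0.0, 0.1, 12)          # 12 geological realizations

# Fit one quadratic proxy per geological scenario from the training runs.
proxies = [np.polyfit(locations,
                      [fake_simulator(x, s) for x in locations], deg=2)
           for s in scenario_shifts]

# Evaluate the mean proxy response on a fine grid and pick the best location.
grid = np.linspace(0.0, 1.0, 201)
mean_response = np.mean([np.polyval(p, grid) for p in proxies], axis=0)
best = grid[np.argmax(mean_response)]
print(f"location maximizing mean predicted heat extraction: {best:.3f}")
```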

  8. Hybrid Microgrid Configuration Optimization with Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Lopez, Nicolas

    This dissertation explores the Renewable Energy Integration Problem and proposes a Genetic Algorithm embedded with a Monte Carlo simulation to solve large instances of the problem that are impractical to solve via full enumeration. The Renewable Energy Integration Problem is defined as finding the optimum set of components to supply the electric demand of a hybrid microgrid. The components considered are solar panels, wind turbines, diesel generators, electric batteries, connections to the power grid, and converters, which can be inverters and/or rectifiers. The methodology developed is explained, as well as the combinatorial formulation. In addition, two case studies of a single-objective optimization version of the problem are presented, one minimizing cost and one minimizing global warming potential (GWP), followed by a multi-objective implementation of the proposed methodology utilizing a non-dominated sorting Genetic Algorithm embedded with a Monte Carlo simulation. The method is validated by solving a small instance of the problem with a known solution via a full enumeration algorithm developed by NREL in their software HOMER. The dissertation concludes that evolutionary algorithms embedded with Monte Carlo simulation, namely modified Genetic Algorithms, are an efficient way of solving the problem, finding approximate solutions in the case of single-objective optimization and approximating the true Pareto front in the case of multi-objective optimization of the Renewable Energy Integration Problem.

  9. Optimized PID control of depth of hypnosis in anesthesia.

    PubMed

    Padula, Fabrizio; Ionescu, Clara; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio; Vivacqua, Giulio

    2017-06-01

    This paper addresses the use of proportional-integral-derivative (PID) controllers for regulating the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. In fact, introducing an automatic control system might provide significant benefits for the patient by reducing the risk of under- and over-dosing. In this study, the controller parameters are obtained through genetic algorithms by solving a min-max optimization problem. A set of 12 patient models representative of a large population variance is used to test controller robustness. The worst-case performance in the considered population is minimized, considering two different scenarios: the induction case and the maintenance case. Our results indicate that including a gain scheduling strategy enables optimal performance for the induction and maintenance phases separately. Using a single tuning to address both tasks may result in a loss of performance of up to 102% in the induction phase and up to 31% in the maintenance phase. Further on, it is shown that a suitably designed low-pass filter on the controller output can handle the trade-off between performance and the noise effect in the control variable. Optimally tuned PID controllers provide a fast induction time with an acceptable overshoot and a satisfactory disturbance rejection performance during maintenance. These features make them a very good tool for comparison when other control algorithms are developed. Copyright © 2017 Elsevier B.V. All rights reserved.
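
    The min-max tuning idea, optimize the worst-case performance over a set of patient models, can be sketched with a generic global optimizer. Below, the "patients" are arbitrary first-order-plus-delay surrogates, the cost is the integrated absolute error of a simulated set-point step, and scipy's differential evolution stands in for the genetic algorithm of the study, so every number is an assumption.

```python
# Min-max PID tuning sketch: minimize the worst-case integrated absolute error
# over a small family of first-order-plus-delay "patient" surrogates. The plant
# parameters, the cost and the optimizer settings are illustrative assumptions.
import numpy as np
from scipy.optimize import differential_evolution

DT, T_END = 0.5, 300.0                       # time step and horizon (seconds)

def step_response_iae(pid, gain, tau, delay):
    """Simulate a unit set-point step with a discrete PID and return the IAE."""
    kp, ki, kd = pid
    n_steps = int(T_END / DT)
    delay_steps = int(delay / DT)
    y = 0.0
    integral = 0.0
    prev_error = 1.0
    u_history = [0.0] * (delay_steps + 1)    # actuation delay buffer
    iae = 0.0
    for _ in range(n_steps):
        error = 1.0 - y
        integral += error * DT
        derivative = (error - prev_error) / DT
        prev_error = error
        u = kp * error + ki * integral + kd * derivative
        u_history.append(u)
        u_delayed = u_history.pop(0)
        y += DT * (-y + gain * u_delayed) / tau    # first-order plant update
        iae += abs(error) * DT
    return iae

# A few surrogate "patients" (gain, time constant, delay) -- invented values.
patients = [(1.0, 40.0, 20.0), (1.5, 60.0, 30.0), (0.7, 30.0, 15.0), (1.2, 50.0, 25.0)]

def worst_case_cost(pid):
    return max(step_response_iae(pid, *p) for p in patients)

result = differential_evolution(worst_case_cost,
                                bounds=[(0.0, 3.0), (0.0, 0.3), (0.0, 30.0)],
                                maxiter=30, seed=0, polish=False)
print("worst-case-optimal (Kp, Ki, Kd):", np.round(result.x, 3),
      "| worst-case IAE:", round(result.fun, 2))
```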

  10. Contrast-enhanced spectral mammography with a photon-counting detector.

    PubMed

    Fredenberg, Erik; Hemmendorff, Magnus; Cederström, Björn; Aslund, Magnus; Danielsson, Mats

    2010-05-01

    Spectral imaging is a method in medical x-ray imaging to extract information about the object constituents by the material-specific energy dependence of x-ray attenuation. The authors have investigated a photon-counting spectral imaging system with two energy bins for contrast-enhanced mammography. System optimization and the potential benefit compared to conventional non-energy-resolved absorption imaging was studied. A framework for system characterization was set up that included quantum and anatomical noise and a theoretical model of the system was benchmarked to phantom measurements. Optimal combination of the energy-resolved images corresponded approximately to minimization of the anatomical noise, which is commonly referred to as energy subtraction. In that case, an ideal-observer detectability index could be improved close to 50% compared to absorption imaging in the phantom study. Optimization with respect to the signal-to-quantum-noise ratio, commonly referred to as energy weighting, yielded only a minute improvement. In a simulation of a clinically more realistic case, spectral imaging was predicted to perform approximately 30% better than absorption imaging for an average glandularity breast with an average level of anatomical noise. For dense breast tissue and a high level of anatomical noise, however, a rise in detectability by a factor of 6 was predicted. Another approximately 70%-90% improvement was found to be within reach for an optimized system. Contrast-enhanced spectral mammography is feasible and beneficial with the current system, and there is room for additional improvements. Inclusion of anatomical noise is essential for optimizing spectral imaging systems.

  11. Optimal design of a gas transmission network: A case study of the Turkish natural gas pipeline network system

    NASA Astrophysics Data System (ADS)

    Gunes, Ersin Fatih

    Turkey is located between Europe, which has an increasing demand for natural gas, and the Middle East, Asia and Russia, which have rich and strong natural gas supplies. Because of its geographical location, Turkey has strategic importance with respect to energy sources. To supply this demand, a pipeline network configuration with optimal and efficient lengths, pressures, diameters and number of compressor stations is needed. Because Turkey already has a working, constructed network topology, obtaining an optimal configuration of the pipelines, including an optimal number of compressor stations at optimal locations, is the focus of this study. Identifying a network design with the lowest cost is important because of the high maintenance and set-up costs. The number of compressor stations, the pipeline segment lengths, the diameter sizes and the pressures at compressor stations are considered as decision variables in this study. Two existing optimization models were selected and applied to the case study of Turkey. Because of the fixed cost of investment, both models are formulated as mixed-integer nonlinear programs, which require branch and bound combined with nonlinear programming solution methods. The differences between these two models relate to factors that can affect the natural gas network system, such as wall thickness, material balance, compressor isentropic head and the amount of gas to be delivered. The results obtained with these two techniques are compared with each other and with the current system. The major differences between the results are in costs, pressures and flow rates. These solution techniques are able to find a minimum-cost solution for each model, both of which cost less than the current system while satisfying all the constraints on diameter, length, flow rate and pressure. These results give the big picture of an ideal configuration for the future-state network for Turkey.

  12. Optimal race strategy for a 200-m flying sprint in a human-powered vehicle: A case study of a world-record attempt.

    PubMed

    de Koning, Jos J; van der Zweep, Cees-Jan; Cornelissen, Jesper; Kuiper, Bouke

    2013-03-01

    Optimal pacing strategy was determined for breaking the world speed record on a human-powered vehicle (HPV) using an energy-flow model in which the rider's physical capacities, the vehicle's properties, and the environmental conditions were included. Power data from world-record attempts were compared with data from the model, and race protocols were adjusted to the results from the model. HPV performance can be improved by using an energy-flow model for optimizing race strategy. A biphased in-run followed by a sprint gave best results.

  13. Optimization of the rocket mode trajectory in a rocket based combined cycle (RBCC) engine powered SSTO vehicle

    NASA Astrophysics Data System (ADS)

    Foster, Richard W.

    1989-07-01

    The application of rocket-based combined cycle (RBCC) engines to booster-stage propulsion, in combination with all-rocket second stages in orbital-ascent missions, has been studied since the mid-1960s; attention is presently given to the case of the 'ejector scramjet' RBCC configuration's application to SSTO vehicles. While total mass delivered to initial orbit is optimized at Mach 20, payload delivery capability to initial orbit optimizes at Mach 17, primarily due to the reduction of hydrogen fuel tankage structure, insulation, and thermal protection system weights.

  14. Transmission loss optimization in acoustic sandwich panels

    NASA Astrophysics Data System (ADS)

    Makris, S. E.; Dym, C. L.; MacGregor Smith, J.

    1986-06-01

    Considering the sound transmission loss (TL) of a sandwich panel as the single objective, different optimization techniques are examined and a sophisticated computer program is used to find the optimum TL. Also, for one of the possible case studies, core optimization, closed-form expressions are given relating TL to the core-design variables for different sets of skins. The significance of these functional relationships lies in the fact that the panel designer can bypass the necessity of using a sophisticated software package in order to assess explicitly the dependence of the TL on core thickness and density.

  15. On the functional optimization of a certain class of nonstationary spatial functions

    USGS Publications Warehouse

    Christakos, G.; Paraskevopoulos, P.N.

    1987-01-01

    Procedures are developed in order to obtain optimal estimates of linear functionals for a wide class of nonstationary spatial functions. These procedures rely on well-established constrained minimum-norm criteria, and are applicable to multidimensional phenomena which are characterized by the so-called hypothesis of inherentity. The latter requires elimination of the polynomial, trend-related components of the spatial function, leading to stationary quantities, and it also generates some interesting mathematics within the context of modelling and optimization in several dimensions. The arguments are illustrated using various examples, and a case study is computed in detail. © 1987 Plenum Publishing Corporation.

  16. Automated IMRT planning with regional optimization using planning scripts

    PubMed Central

    Wong, Eugene; Bzdusek, Karl; Lock, Michael; Chen, Jeff Z.

    2013-01-01

    Intensity-modulated radiation therapy (IMRT) has become a standard technique in radiation therapy for treating different types of cancers. Various class solutions have been developed for simple cases (e.g., localized prostate, whole breast) to generate IMRT plans efficiently. However, for more complex cases (e.g., head and neck, pelvic nodes), it can be time-consuming for a planner to generate optimized IMRT plans. To generate optimal plans in these more complex cases, which generally have multiple target volumes and organs at risk, it is often necessary to add IMRT optimization structures such as dose-limiting rings, adjust the beam geometry, select inverse planning objectives and associated weights, and add further IMRT objectives to reduce cold and hot spots in the dose distribution. These parameters are generally adjusted manually with a repeated trial-and-error approach during the optimization process. To improve IMRT planning efficiency in these more complex cases, an iterative method that incorporates some of these adjustment processes automatically in a planning script is designed, implemented, and validated. In particular, regional optimization is implemented iteratively: hot and cold spots are defined and automatically segmented, new objectives and their relative weights are introduced into inverse planning, and the process is repeated until termination criteria are met. The method has been applied to three clinical sites, prostate with pelvic nodes, head and neck, and anal canal cancers, and has been shown to reduce IMRT planning time significantly for clinical applications with improved plan quality. The IMRT planning scripts have been used for more than 500 clinical cases. PACS numbers: 87.55.D, 87.55.de PMID:23318393

  17. Solving bi-level optimization problems in engineering design using kriging models

    NASA Astrophysics Data System (ADS)

    Xia, Yi; Liu, Xiaojie; Du, Gang

    2018-05-01

    Stackelberg game-theoretic approaches are applied extensively in engineering design to handle distributed collaboration decisions. Bi-level genetic algorithms (BLGAs) and response surfaces have been used to solve the corresponding bi-level programming models. However, the computational costs of BLGAs often increase rapidly with the complexity of the lower-level programs, and optimal solution functions sometimes cannot be approximated by response surfaces. This article proposes a new method, namely optimal solution function approximation by kriging model (OSFAKM), in which kriging models are used to approximate the optimal solution functions. A detailed example demonstrates that OSFAKM can obtain better solutions than BLGAs and response surface-based methods, and at the same time considerably reduce the computational workload. Five benchmark problems and a case study of the optimal design of a thin-walled pressure vessel are also presented to illustrate the feasibility and potential of the proposed method for bi-level optimization in engineering design.
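
    The surrogate idea can be made tangible with a toy bi-level problem: approximate the follower's optimal response with a Gaussian-process (kriging) model trained on a handful of exactly solved lower-level problems, then optimize the leader's objective through the surrogate. The quadratic lower-level problem, its closed-form solution and the upper-level objective below are invented, and scikit-learn's GaussianProcessRegressor is used only as a stand-in kriging implementation, not the OSFAKM code.

```python
# Toy bi-level sketch with a kriging surrogate of the lower-level optimum.
# Lower level: y*(x) = argmin_y (y - sin(3x))^2 + 0.1*y^2 (closed form used as
# the "exact solver"); upper level: minimize F(x) = (x - 0.5)^2 + y*(x)^2.
# All functions are invented; GaussianProcessRegressor plays the role of kriging.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def lower_level_optimum(x):
    """Exact optimal follower response for the toy lower-level problem."""
    return np.sin(3.0 * x) / 1.1

# Train the kriging surrogate on a few "expensive" lower-level solutions.
x_train = np.linspace(0.0, 1.0, 8)[:, None]
y_train = lower_level_optimum(x_train.ravel())
kriging = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.2),
                                   normalize_y=True)
kriging.fit(x_train, y_train)

def upper_objective(x, follower):
    return (x - 0.5) ** 2 + follower ** 2

# Optimize the leader's objective through the surrogate on a dense grid.
grid = np.linspace(0.0, 1.0, 1001)
y_pred = kriging.predict(grid[:, None])
best = grid[np.argmin(upper_objective(grid, y_pred))]
exact_best = grid[np.argmin(upper_objective(grid, lower_level_optimum(grid)))]
print(f"surrogate-based leader decision: {best:.3f} (exact grid optimum: {exact_best:.3f})")
```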

  18. A multi-material topology optimization approach for wrinkle-free design of cable-suspended membrane structures

    NASA Astrophysics Data System (ADS)

    Luo, Yangjun; Niu, Yanzhuang; Li, Ming; Kang, Zhan

    2017-06-01

    In order to eliminate stress-related wrinkles in cable-suspended membrane structures and to provide simple and reliable deployment, this study presents a multi-material topology optimization model and an effective solution procedure for generating optimal connected layouts for membranes and cables. On the basis of the principal stress criterion of membrane wrinkling behavior and the density-based interpolation of multi-phase materials, the optimization objective is to maximize the total structural stiffness while satisfying principal stress constraints and specified material volume requirements. By adopting the cosine-type relaxation scheme to avoid the stress singularity phenomenon, the optimization model is successfully solved through a standard gradient-based algorithm. Four-corner tensioned membrane structures with different loading cases were investigated to demonstrate the effectiveness of the proposed method in automatically finding the optimal design composed of curved boundary cables and wrinkle-free membranes.

  19. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    PubMed

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.

  20. Event-driven time-optimal control for a class of discontinuous bioreactors.

    PubMed

    Moreno, Jaime A; Betancur, Manuel J; Buitrón, Germán; Moreno-Andrade, Iván

    2006-07-05

    Discontinuous bioreactors may be further optimized for processing inhibitory substrates using a convenient fed-batch mode. To do so, the filling rate must be controlled in such a way as to push the reaction rate to its maximum value, by increasing the substrate concentration just up to the point where inhibition begins. However, an exact optimal controller requires measuring several variables (e.g., substrate concentrations in the feed and in the tank) and also good model knowledge (e.g., yield and kinetic parameters), requirements rarely satisfied in real applications. An environmentally important case that exemplifies all these handicaps is toxicant wastewater treatment. There, the lack of practical online pollutant sensors may allow unforeseen high shock loads to be fed to the bioreactor, causing biomass inhibition that slows down the treatment process and, in extreme cases, even renders the biological process useless. In this work an event-driven time-optimal control (ED-TOC) is proposed to circumvent these limitations. We show how to detect a "there is inhibition" event by using some computable function of the available measurements. This event drives the ED-TOC to stop the filling. Later, by detecting the symmetric event, "there is no inhibition," the ED-TOC may restart the filling. A fill-react cycling then keeps the process safely hovering near its maximum reaction rate, allowing a robust and practically time-optimal operation of the bioreactor. An experimental case study of a wastewater treatment process application is presented, in which the dissolved oxygen concentration was used to detect the events needed to drive the controller. (c) 2006 Wiley Periodicals, Inc.
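
    The event-driven fill/react logic can be pictured as a hysteresis controller that stops filling when an "inhibition detected" event fires and resumes when the symmetric event fires. The inhibition indicator below, its thresholds and the fake measurement trace are illustrative assumptions, not the published ED-TOC detection functions.

```python
# Toy event-driven fill controller with hysteresis on an "inhibition" indicator
# computed from a measured signal (stand-in for dissolved oxygen). Thresholds,
# the indicator and the fake measurement sequence are illustrative assumptions.
import numpy as np

INHIBITION_ON = 0.8      # indicator above this -> "there is inhibition" event
INHIBITION_OFF = 0.4     # indicator below this -> "there is no inhibition" event

def fill_controller(indicator_series):
    """Return the fill valve command (True = open) for each sample, with hysteresis."""
    filling = True
    commands = []
    for indicator in indicator_series:
        if filling and indicator >= INHIBITION_ON:
            filling = False          # event: inhibition detected -> stop filling
        elif not filling and indicator <= INHIBITION_OFF:
            filling = True           # event: no inhibition -> resume filling
        commands.append(filling)
    return commands

# Fake indicator trace rising and falling as substrate accumulates and is consumed.
t = np.linspace(0.0, 4.0 * np.pi, 200)
indicator = 0.6 + 0.5 * np.sin(t)

commands = fill_controller(indicator)
switches = sum(1 for a, b in zip(commands, commands[1:]) if a != b)
print("fraction of time filling:", round(float(np.mean(commands)), 2),
      "| number of fill/react switches:", switches)
```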

  1. Developing a Shuffled Complex-Self Adaptive Hybrid Evolution (SC-SAHEL) Framework for Water Resources Management and Water-Energy System Optimization

    NASA Astrophysics Data System (ADS)

    Rahnamay Naeini, M.; Sadegh, M.; AghaKouchak, A.; Hsu, K. L.; Sorooshian, S.; Yang, T.

    2017-12-01

    Meta-heuristic optimization algorithms have gained a great deal of attention in a wide variety of fields. The simplicity and flexibility of these algorithms, along with their robustness, make them attractive tools for solving optimization problems. Different optimization methods, however, have algorithm-specific strengths and limitations. The performance of each individual algorithm obeys the "No-Free-Lunch" theorem, which means that a single algorithm cannot consistently outperform all others over every possible optimization problem. From a user's perspective, it is a tedious process to compare, validate, and select the best-performing algorithm for a specific problem or a set of test cases. In this study, we introduce a new hybrid optimization framework, entitled Shuffled Complex-Self Adaptive Hybrid EvoLution (SC-SAHEL), which combines the strengths of different evolutionary algorithms (EAs) in a parallel computing scheme and allows users to select the most suitable algorithm tailored to the problem at hand. The concept of SC-SAHEL is to execute different EAs as separate parallel search cores and let all participating EAs compete during the course of the search. The newly developed SC-SAHEL algorithm is designed to automatically select the best-performing algorithm for the given optimization problem. The algorithm is effective in finding the global optimum for several strenuous benchmark test functions, and computationally efficient compared to individual EAs. We benchmark the proposed SC-SAHEL algorithm over 29 conceptual test functions and two real-world case studies - one hydropower reservoir model and one hydrological model (SAC-SMA). Results show that the proposed framework outperforms individual EAs in an absolute majority of the test problems, and can provide results competitive with the fittest EA algorithm along with more comprehensive information during the search. The proposed framework is also flexible for merging additional EAs, boundary-handling techniques, and sampling schemes, and has good potential to be used in optimal operation and management of Water-Energy systems.

  2. Deterministic and unambiguous dense coding

    NASA Astrophysics Data System (ADS)

    Wu, Shengjun; Cohen, Scott M.; Sun, Yuqing; Griffiths, Robert B.

    2006-04-01

    Optimal dense coding using a partially-entangled pure state of Schmidt rank D̄ and a noiseless quantum channel of dimension D is studied both in the deterministic case, where at most L_d messages can be transmitted with perfect fidelity, and in the unambiguous case, where when the protocol succeeds (probability τ_x) Bob knows for sure that Alice sent message x, and when it fails (probability 1 - τ_x) he knows it has failed. Alice is allowed any single-shot (one use) encoding procedure, and Bob any single-shot measurement. For D̄ ≤ D a bound is obtained for L_d in terms of the largest Schmidt coefficient of the entangled state, and is compared with published results by Mozes [Phys. Rev. A 71, 012311 (2005)]. For D̄ > D it is shown that L_d is strictly less than D^2 unless D̄ is an integer multiple of D, in which case uniform (maximal) entanglement is not needed to achieve the optimal protocol. The unambiguous case is studied for D̄ ≤ D, assuming τ_x > 0 for a set of D̄D messages, and a bound is obtained for the average ⟨1/τ⟩. A bound on the average ⟨τ⟩ requires an additional assumption of encoding by isometries (unitaries when D̄ = D) that are orthogonal for different messages. Both bounds are saturated when τ_x is a constant independent of x, by a protocol based on one-shot entanglement concentration. For D̄ > D it is shown that (at least) D^2 messages can be sent unambiguously. Whether unitary (isometric) encoding suffices for optimal protocols remains a major unanswered question, both for our work and for previous studies of dense coding using partially-entangled states, including noisy (mixed) states.

  3. Improving multi-objective reservoir operation optimization with sensitivity-informed dimension reduction

    NASA Astrophysics Data System (ADS)

    Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.

    2015-08-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
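
    The screening step described above, dropping decision variables whose Sobol indices are negligible before the multi-objective search, might look like the following sketch. It assumes the SALib package's Saltelli sampling and Sobol analysis routines; the toy three-variable "reservoir performance" function, the bounds, the sample size and the 0.05 cut-off are all illustrative assumptions rather than the reservoir models of the study.

```python
# Sketch of Sobol screening: estimate sensitivity indices for a toy objective,
# then flag decision variables whose total-order index falls below a cut-off.
# Assumes the SALib package; the toy function, bounds and the 0.05 threshold
# are illustrative, not the reservoir models of the study.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["release_1", "release_2", "release_3"],
    "bounds": [[0.0, 1.0]] * 3,
}

def toy_performance(x):
    # Deliberately insensitive to the third variable.
    return np.sin(2 * np.pi * x[0]) + 0.5 * x[1] ** 2 + 0.001 * x[2]

X = saltelli.sample(problem, 1024)
Y = np.array([toy_performance(row) for row in X])
Si = sobol.analyze(problem, Y)

for name, st in zip(problem["names"], Si["ST"]):
    decision = "keep" if st >= 0.05 else "screen out"
    print(f"{name}: total-order index = {st:.3f} -> {decision}")
```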

  4. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of applicability of the first-order frameworks.

  5. Pneumafil casing blower through moving reference frame (MRF) - A CFD simulation

    NASA Astrophysics Data System (ADS)

    Manivel, R.; Vijayanandh, R.; Babin, T.; Sriram, G.

    2018-05-01

    In this work, the Pneumafil casing blower of a textile-mill ring frame, with a power rating of 5 kW, has been simulated using a Computational Fluid Dynamics (CFD) code. The CFD analysis of the blower is carried out in Ansys Workbench 16.2 with Fluent, using moving reference frame (MRF) solver settings. The simulation settings and boundary conditions are based on a literature study and on field data acquired. The main objective of this work is to reduce the energy consumption of the blower. The flow analysis indicated that the power consumption is influenced by the orientation of the deflector plate and by the deflector plate strip situated at the outlet casing of the blower. The energy losses in the blower are due to the recirculation zones formed around the deflector plate strip. The deflector plate orientation was therefore changed and optimized to reduce the energy consumption. The proposed optimized model, based on the simulation results, has lower power consumption than the existing design and the other cases considered. The energy losses in the Pneumafil casing blower are thus reduced through CFD analysis.

  6. Optimization in Radiation Therapy: Applications in Brachytherapy and Intensity Modulated Radiation Therapy

    NASA Astrophysics Data System (ADS)

    McGeachy, Philip David

    Over 50% of cancer patients require radiation therapy (RT). RT is an optimization problem requiring maximization of the radiation damage to the tumor while minimizing the harm to healthy tissues. This dissertation focuses on two main RT optimization problems: 1) brachytherapy and 2) intensity modulated radiation therapy (IMRT). The brachytherapy research involved solving a non-convex optimization problem by creating an open-source genetic algorithm optimizer to determine the optimal radioactive seed distribution for a given set of patient volumes and constraints, both dosimetric- and implant-based. The optimizer was tested on a set of 45 prostate brachytherapy patients. All solutions met the clinical standards and benchmarked favorably against those generated by a standard commercial solver: the salient features of the generated solutions were slightly reduced prostate coverage, lower dose to the urethra and rectum, and a smaller number of needles required for an implant. Historically, IMRT modulates fluence while keeping the photon beam energy fixed. The IMRT-related investigation in this thesis aimed at broadening the solution space by also varying photon energy. The problem therefore involved simultaneous optimization of photon beamlet energy and fluence, denoted XMRT. The problem was formulated as a convex one, and linear programming was applied to obtain the optimal energy-dependent fluences while achieving all clinical objectives and constraints imposed. Dosimetric advantages of XMRT over single-energy IMRT, in the form of improved sparing of organs at risk (OARs), were demonstrated in simplified phantom studies. The XMRT algorithm was then extended to include clinical dose-volume constraints, and clinical studies of prostate and head and neck cancer patients were investigated. Compared to IMRT, XMRT provided improved dosimetric benefit in the prostate case, particularly within the intermediate- to low-dose regions (≤ 40 Gy) for OARs. For the head and neck cases, XMRT solutions showed no significant advantage or disadvantage over IMRT. Deliverability concerns for the fluence maps generated by XMRT were addressed by incorporating smoothing constraints during the optimization and through the successful generation of treatment machine files. Further research is needed to explore the full potential of the XMRT approach to RT.
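
    The convex, linear-programming formulation mentioned above can be illustrated with a toy fluence-optimization problem. The dose-influence matrices, prescription level, and objective weighting below are assumptions invented for the sketch; in XMRT each beamlet would also carry an energy index, which changes the size of the problem but not its LP structure.

```python
# A minimal LP sketch: minimize total OAR dose subject to every target voxel
# receiving at least the prescription dose, with non-negative fluences.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)

n_beamlets = 40
n_target, n_oar = 25, 15
D_target = rng.uniform(0.5, 1.5, (n_target, n_beamlets))   # Gy per unit fluence (assumed)
D_oar = rng.uniform(0.0, 0.6, (n_oar, n_beamlets))
prescription = 60.0                                         # Gy to every target voxel (assumed)

c = D_oar.sum(axis=0)                 # objective: total dose to OAR voxels
A_ub = -D_target                      # -D_target @ x <= -prescription  <=>  D_target @ x >= prescription
b_ub = -prescription * np.ones(n_target)
bounds = [(0.0, None)] * n_beamlets   # non-negative fluence

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
if res.success:
    print("mean OAR dose:", (D_oar @ res.x).mean())
    print("min target dose:", (D_target @ res.x).min())
```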

  7. Sensitivity-Based Guided Model Calibration

    NASA Astrophysics Data System (ADS)

    Semnani, M.; Asadzadeh, M.

    2017-12-01

    A common practice in the automatic calibration of hydrologic models is to apply sensitivity analysis prior to the global optimization in order to reduce the number of decision variables (DVs) by identifying the most sensitive ones. This two-stage process aims to improve optimization efficiency. However, parameter sensitivity information can also be used to enhance the ability of optimization algorithms to find good-quality solutions in fewer solution evaluations. This improvement can be achieved by focusing the optimization on sampling the most sensitive parameters in each iteration. In this study, the selection process of the dynamically dimensioned search (DDS) optimization algorithm is enhanced by utilizing a sensitivity analysis method to put more emphasis on the most sensitive decision variables during perturbation. The performance of DDS with sensitivity information is compared to that of the original DDS for different mathematical test functions and a model calibration case study. Overall, the results show that DDS with sensitivity information finds solutions of nearly the same quality as the original DDS, but in significantly fewer solution evaluations.
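
    One simple way to realize this idea is to weight each decision variable's probability of being selected for perturbation in DDS by a pre-computed sensitivity score, as in the sketch below. The weighting rule, bounds, and toy objective are assumptions for illustration; the paper's exact scheme may differ.

```python
# A minimal sketch of DDS with sensitivity-weighted variable selection.
import numpy as np

rng = np.random.default_rng(3)

def dds_sensitivity(f, lo, hi, sens, max_iter=1000, r=0.2):
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    d = len(lo)
    w = np.asarray(sens, float) / np.sum(sens)          # normalized sensitivity weights
    x_best = lo + rng.random(d) * (hi - lo)
    f_best = f(x_best)
    for i in range(1, max_iter + 1):
        p = 1.0 - np.log(i) / np.log(max_iter)          # standard DDS selection schedule
        select = rng.random(d) < np.clip(d * w * p, 0.0, 1.0)
        if not select.any():                            # always perturb at least one DV
            select[rng.choice(d, p=w)] = True
        x_new = x_best.copy()
        step = rng.normal(0.0, r * (hi - lo))           # Gaussian perturbation, as in DDS
        x_new[select] += step[select]
        # reflect back inside the bounds, then clip as a safeguard
        x_new = np.where(x_new < lo, 2 * lo - x_new, x_new)
        x_new = np.where(x_new > hi, 2 * hi - x_new, x_new)
        x_new = np.clip(x_new, lo, hi)
        f_new = f(x_new)
        if f_new < f_best:                              # greedy acceptance
            x_best, f_best = x_new, f_new
    return x_best, f_best

# toy calibration-like objective where the first two parameters dominate
f = lambda x: 10 * (x[0] - 0.3) ** 2 + 5 * (x[1] + 0.7) ** 2 + 0.1 * np.sum(x[2:] ** 2)
sens = [0.5, 0.3, 0.05, 0.05, 0.05, 0.05]               # assumed sensitivity scores
x, fx = dds_sensitivity(f, lo=[-2] * 6, hi=[2] * 6, sens=sens, max_iter=2000)
print("best objective:", fx)
```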

  8. Technology Assessment in Support of the Presidential Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William; Mrozinski, Joe; Hua, Hook; Merida, Sofia; Shelton, Kacie; Adumitroaie, Virgil; Derleth, Jason; Silberg, Robert

    2006-01-01

    This document is a viewgraph presentation that contains: (1) pictorial description of lunar context, (2) Definition of base case, (3) Optimization results, (4) Effects of cost uncertainties for base case and different assumed annual budget levels and (5) Effects of temporal optimization.

  9. A Large-Telescope Natural Guide Star AO System

    NASA Technical Reports Server (NTRS)

    Redding, David; Milman, Mark; Needels, Laura

    1994-01-01

    None given. From overview and conclusion: Keck Telescope case study. Objectives: low cost, good sky coverage. Approach: natural guide star at 0.8 um, correcting at 2.2 um. Conclusions: good performance is possible for Keck with a natural guide star AO system (SR > 0.2 to mag 17+). An AO-optimized CCD should be very effective. Optimizing td is very effective. Spatial coadding is not effective except perhaps at extremely low light levels.

  10. Optimization of Laminated Composite Plates

    DTIC Science & Technology

    1989-09-01

    plane loads has already been studied, and a number of technical publications and software packages can be found. In the present report, an optimization of...described above. There is no difficulty in any case, and commercial software, from personal computers to macro-systems, is available. In the chapter...Reforzado y su Aplicacion a los Medios de Transporte", Ph.D. University of Zaragoza, Spain, 1984. 77. Miravete A., "Caracterisation et mise au Point d'un

  11. Constellation design with geometric and probabilistic shaping

    NASA Astrophysics Data System (ADS)

    Zhang, Shaoliang; Yaman, Fatih

    2018-02-01

    A systematic study, including theory, simulation and experiments, is carried out to review the generalized pairwise optimization algorithm for designing optimized constellations. To verify its effectiveness, the algorithm is applied to three test cases: 2-dimensional 8 quadrature amplitude modulation (QAM), 4-dimensional set-partitioning QAM, and probabilistically shaped (PS) 32QAM. The results suggest that geometric shaping can work together with PS to further bridge the gap toward the Shannon limit.

  12. Aeroelastic Optimization Study Based on the X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley W.; Pak, Chan-Gi

    2014-01-01

    One way to increase aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates an object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high-fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize structural weight while meeting design requirements including strength, buckling, and flutter. This approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise through the ply stacking sequence. A hybrid discretization optimization approach improves the accuracy and computational efficiency of the global optimization algorithm. The second study presents a flutter mass-balancing optimization for the fabricated flexible wing of the X-56A model, since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance for modifying the wing design and moving the design flutter speeds back into the flight envelope, so that the original objective of the X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.

  13. Application of the method of steepest descent to laminated shield weight optimization with several constraints: Theory

    NASA Technical Reports Server (NTRS)

    Lahti, G. P.

    1971-01-01

    The method of steepest descent used in optimizing one-dimensional layered radiation shields is extended to multidimensional, multiconstraint situations. The multidimensional optimization algorithm and equations are developed for the case of a dose constraint in any one direction being dependent only on the shield thicknesses in that direction and independent of shield thicknesses in other directions. Expressions are derived for one-, two-, and three-dimensional cases (one, two, and three constraints). The procedure is applicable to the optimization of shields where there are different dose constraints and layering arrangements in the principal directions.

  14. Patient-specific rehearsal prior to EVAR: a pilot study.

    PubMed

    Desender, L; Rancic, Z; Aggarwal, R; Duchateau, J; Glenck, M; Lachat, M; Vermassen, F; Van Herzeele, I

    2013-06-01

    This study aims to evaluate feasibility, face validity, influence on technical factors and subjective sense of utility of patient-specific rehearsal (PsR) prior to endovascular aortic aneurysm repair (EVAR). A prospective, multicentre pilot study. Patients suitable for EVAR were enrolled and a three-dimensional (3D) model of the patient's anatomy was generated. Less than 24 h prior to the real case, rehearsals were conducted in the laboratory or clinical angiosuite. Technical metrics were recorded during both procedures. A subjective questionnaire was used to evaluate realism, technical and human factor aspects (scale 1-5). Ten patients were enrolled. In one case, the treatment plan was altered based on PsR. In 7/9 patients, the rehearsal significantly altered the optimal C-arm position for the proximal landing zone and an identical fluoroscopy angle was chosen in the real procedure. All team members found the rehearsal useful for selecting the optimal fluoroscopy angle (median 4). The realism of the EVAR procedure simulation was rated highly (median 4). All team members found the PsR useful to prepare the individual team members and the entire team (median 4). PsR for EVAR permits creation of realistic case studies. Subjective evaluation indicates that it may influence optimal C-arm angles and be valuable to prepare the entire team. A randomised controlled trial (RCT) is planned to evaluate how this technology may influence technical and team performance, ultimately leading to improved patient safety. Copyright © 2013 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.

  15. Explore or Exploit? A Generic Model and an Exactly Solvable Case

    NASA Astrophysics Data System (ADS)

    Gueudré, Thomas; Dobrinevski, Alexander; Bouchaud, Jean-Philippe

    2014-02-01

    Finding a good compromise between the exploitation of known resources and the exploration of unknown, but potentially more profitable choices, is a general problem, which arises in many different scientific disciplines. We propose a stylized model for these exploration-exploitation situations, including population or economic growth, portfolio optimization, evolutionary dynamics, or the problem of optimal pinning of vortices or dislocations in disordered materials. We find the exact growth rate of this model for treelike geometries and prove the existence of an optimal migration rate in this case. Numerical simulations in the one-dimensional case confirm the generic existence of an optimum.

  16. Explore or exploit? A generic model and an exactly solvable case.

    PubMed

    Gueudré, Thomas; Dobrinevski, Alexander; Bouchaud, Jean-Philippe

    2014-02-07

    Finding a good compromise between the exploitation of known resources and the exploration of unknown, but potentially more profitable choices, is a general problem, which arises in many different scientific disciplines. We propose a stylized model for these exploration-exploitation situations, including population or economic growth, portfolio optimization, evolutionary dynamics, or the problem of optimal pinning of vortices or dislocations in disordered materials. We find the exact growth rate of this model for treelike geometries and prove the existence of an optimal migration rate in this case. Numerical simulations in the one-dimensional case confirm the generic existence of an optimum.

  17. Influence of robust optimization in intensity-modulated proton therapy with different dose delivery techniques

    PubMed Central

    Liu, Wei; Li, Yupeng; Li, Xiaoqiang; Cao, Wenhua; Zhang, Xiaodong

    2012-01-01

    Purpose: The distal edge tracking (DET) technique in intensity-modulated proton therapy (IMPT) allows for high energy efficiency, fast and simple delivery, and simple inverse treatment planning; however, it is highly sensitive to uncertainties. In this study, the authors explored the application of DET in IMPT (IMPT-DET) and conducted robust optimization of IMPT-DET to see whether the planning technique's sensitivity to uncertainties was reduced. They also compared conventional and robust optimization of IMPT-DET with three-dimensional IMPT (IMPT-3D) to gain understanding of how plan robustness is achieved. Methods: They compared the robustness of IMPT-DET and IMPT-3D plans to uncertainties by analyzing plans created for a typical prostate cancer case and a base of skull (BOS) cancer case (using data for patients who had undergone proton therapy at their institution). Spots in the highest and second highest energy layers were chosen so that the Bragg peak would be at the distal edge of the targets in IMPT-DET, using 36 equally spaced beam angles; in IMPT-3D, 3 beams with angles chosen by a beam angle optimization algorithm were planned. Dose contributions for a number of range and setup uncertainties were calculated, and a worst-case robust optimization was performed. A robust quantification technique was used to evaluate the plans' sensitivity to uncertainties. Results: When uncertainties are not accounted for in planning, the DET technique is less robust to uncertainties than the 3D method but offers better normal tissue protection. Robust optimization accounting for range and setup uncertainties can improve the robustness of IMPT plans to uncertainties; however, the findings show that the extent of improvement varies. Conclusions: IMPT's sensitivity to uncertainties can be improved by using robust optimization. The authors found two possible mechanisms that made improvements possible: (1) a localized single-field uniform dose distribution (LSFUD) mechanism, in which the optimization algorithm attempts to produce a single-field uniform dose distribution while minimizing the patching field as much as possible; and (2) perturbed dose distributions, which follow the change in anatomical geometry. Multiple-instance optimization has more knowledge of the influence matrices; this greater knowledge improves IMPT plans' ability to retain robustness despite the presence of uncertainties. PMID:22755694
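
    The worst-case ("minimax") robust optimization referred to above can be sketched in a few lines: pre-compute one dose-influence matrix per uncertainty scenario and minimize the objective evaluated in the worst scenario. The matrices, penalty weights, and derivative-free optimizer below are stand-ins chosen for illustration, not the clinical formulation used in the study.

```python
# A minimal sketch of worst-case robust optimization of spot weights.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

n_spots, n_target, n_oar, n_scen = 12, 20, 10, 5
# one dose-influence matrix per uncertainty scenario (assumed, randomly perturbed)
D_t = [rng.uniform(0.8, 1.2, (n_target, n_spots)) for _ in range(n_scen)]
D_o = [rng.uniform(0.0, 0.4, (n_oar, n_spots)) for _ in range(n_scen)]
presc = 2.0                             # prescribed target dose per fraction (assumed)

def scenario_objective(w, s):
    dt, do = D_t[s] @ w, D_o[s] @ w
    return np.mean((dt - presc) ** 2) + 0.3 * np.mean(do ** 2)

def worst_case_objective(w):
    w = np.abs(w)                       # enforce non-negative spot weights
    return max(scenario_objective(w, s) for s in range(n_scen))

w0 = np.full(n_spots, 0.1)
res = minimize(worst_case_objective, w0, method="Powell")   # derivative-free stand-in
w_opt = np.abs(res.x)
print("worst-case objective:", worst_case_objective(res.x))
print("nominal target dose range:", (D_t[0] @ w_opt).min(), (D_t[0] @ w_opt).max())
```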

  18. Inverse problem of the vibrational band gap of periodically supported beam

    NASA Astrophysics Data System (ADS)

    Shi, Xiaona; Shu, Haisheng; Dong, Fuzhen; Zhao, Lei

    2017-04-01

    Research on periodic structures has a long history, with most of the work confined to the forward problem. In this paper, the inverse problem is considered and an overall framework is proposed which includes two main stages, i.e., the band gap criterion and its optimization. As a preliminary investigation, the inverse problem of the flexural vibrational band gap of a periodically supported beam is analyzed. According to existing knowledge of its forward problem, the band gap criterion is given in implicit form. Then, two cases with three independent parameters, namely the double-supported case and the triple-supported case, are studied in detail, and explicit expressions for the feasible domain are constructed by numerical fitting. Finally, the parameter optimization of the double-supported case with three variables is conducted using a genetic algorithm, aiming for the best mean attenuation within a specified frequency band.

  19. Fast online Monte Carlo-based IMRT planning for the MRI linear accelerator

    NASA Astrophysics Data System (ADS)

    Bol, G. H.; Hissoiny, S.; Lagendijk, J. J. W.; Raaymakers, B. W.

    2012-03-01

    The MRI accelerator, a combination of a 6 MV linear accelerator with a 1.5 T MRI, facilitates continuous patient anatomy updates regarding translations, rotations and deformations of targets and organs at risk. Accounting for these updates demands high-speed, online intensity-modulated radiotherapy (IMRT) re-optimization. In this paper, a fast IMRT optimization system is described which combines a GPU-based Monte Carlo dose calculation engine for online beamlet generation with a fast inverse dose optimization algorithm. Tightly conformal IMRT plans are generated for four phantom cases and two clinical cases (cervix and kidney) in the presence of magnetic fields of 0 and 1.5 T. We show that for the presented cases the beamlet generation and optimization routines are fast enough for online IMRT planning. Furthermore, there is no influence of the magnetic field on plan quality and complexity, and equal optimization constraints at 0 and 1.5 T lead to almost identical dose distributions.

  20. Deployment-based lifetime optimization for linear wireless sensor networks considering both retransmission and discrete power control.

    PubMed

    Li, Ruiying; Ma, Wenting; Huang, Ning; Kang, Rui

    2017-01-01

    A sophisticated node deployment method can efficiently reduce the energy consumption of a Wireless Sensor Network (WSN) and prolong the corresponding network lifetime. Many node-deployment-based lifetime optimization methods have been proposed for WSNs; however, the retransmission mechanism and the discrete power control strategy, which are widely used in practice and have a large effect on network energy consumption, have often been neglected or assumed to be continuous, respectively, in previous studies. In this paper, both retransmission and discrete power control are considered together, and a more realistic energy-consumption-based network lifetime model for linear WSNs is provided. Using this model, we then propose a generic deployment-based optimization model that maximizes network lifetime under coverage, connectivity and transmission rate success constraints. The more accurate lifetime evaluation leads to a longer optimal network lifetime in realistic situations. To illustrate the effectiveness of our method, both one-tiered and two-tiered, uniformly and non-uniformly distributed linear WSNs are optimized in our case studies, and comparisons between our optimal results and those based on the less accurate lifetime evaluation show the advantage of our method when investigating WSN lifetime optimization problems.

  1. Optimizing groundwater development strategies by genetic algorithm: a case study for balancing the needs for agricultural irrigation and environmental protection in northern China

    NASA Astrophysics Data System (ADS)

    Wu, Jianfeng; Zheng, Li; Liu, Depeng

    2007-11-01

    Gaoqing Plain is a major agricultural center of Shandong Province in northern China. Over the last 30 years, the diversion of Yellow River water for intensive irrigation in Gaoqing Plain has led to elevation of the water table and increased evaporation, and subsequently to a dramatic increase in soil salt content and rapid degradation of crop productivity. Optimal strategies have been explored that balance the need to extract sufficient groundwater for irrigation (to ease the pressure on diverting Yellow River water) with the need to improve the local environment by appropriately lowering the water table. Two simulation-optimization models have been formulated, and a genetic algorithm (GA) is applied to search for the optimal groundwater development strategies in Gaoqing Plain while keeping the adverse environmental impacts in check. Compared with the trial-and-error approach of previous studies, the optimization results demonstrate that using an optimization model coupled with a GA search is both effective and efficient. The optimal solutions identified by the GA will provide Gaoqing Plain with blueprints for developing sustainable groundwater abstraction plans that support local economic development and improve environmental quality.
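
    The simulation-optimization coupling described here can be sketched with a genetic algorithm searching over well pumping rates while a greatly simplified, hypothetical linear response-matrix model stands in for the groundwater simulator; drawdown constraints are handled with a penalty. All coefficients, bounds, and constraint limits in the sketch are invented for illustration.

```python
# A minimal GA-based simulation-optimization sketch for pumping rates.
import numpy as np

rng = np.random.default_rng(5)

n_wells = 6
q_max = 100.0                                    # max pumping rate per well (assumed units)
R = rng.uniform(0.005, 0.02, (n_wells, n_wells)) # drawdown response matrix (hypothetical)
target_dd = np.full(n_wells, 1.0)                # desired minimum drawdown to lower the table
max_dd = np.full(n_wells, 3.0)                   # drawdown ceiling to limit adverse impacts

def fitness(q):
    dd = R @ q                                   # "simulated" drawdown at control points
    supply = q.sum()                             # benefit: total abstraction for irrigation
    penalty = 1e3 * (np.maximum(target_dd - dd, 0).sum() + np.maximum(dd - max_dd, 0).sum())
    return supply - penalty                      # maximize

def ga(pop_size=60, generations=300, pm=0.2):
    pop = rng.uniform(0, q_max, (pop_size, n_wells))
    for _ in range(generations):
        fit = np.array([fitness(q) for q in pop])
        # binary tournament selection
        idx = rng.integers(0, pop_size, (pop_size, 2))
        parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # uniform crossover between two shuffled parent sets
        mask = rng.random((pop_size, n_wells)) < 0.5
        children = np.where(mask, parents, parents[rng.permutation(pop_size)])
        # Gaussian mutation, clipped to the feasible pumping range
        mutate = rng.random(children.shape) < pm
        children = np.clip(children + mutate * rng.normal(0, 0.1 * q_max, children.shape), 0, q_max)
        children[0] = pop[np.argmax(fit)]        # elitism: keep the best individual
        pop = children
    fit = np.array([fitness(q) for q in pop])
    return pop[np.argmax(fit)], fit.max()

q_best, f_best = ga()
print("optimal pumping rates:", np.round(q_best, 1), "fitness:", round(f_best, 1))
```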

  2. Design of optimal groundwater remediation systems under flexible environmental-standard constraints.

    PubMed

    Fan, Xing; He, Li; Lu, Hong-Wei; Li, Jing

    2015-01-01

    In developing optimal groundwater remediation strategies, limited effort has been devoted to handling uncertainty in environmental quality standards. When such uncertainty is not considered, either over-optimistic or over-pessimistic optimization strategies may be developed, probably leading to the formulation of rigid remediation strategies. This study advances a mathematical programming modeling approach for optimizing groundwater remediation design. The approach not only prevents the formulation of over-optimistic and over-pessimistic optimization strategies but also provides a satisfaction level that indicates the degree to which the environmental quality standard is satisfied. The approach may therefore be expected to be more readily accepted by decision makers than approaches that do not consider standard uncertainty. The proposed approach is applied to a petroleum-contaminated site in western Canada. Results from the case study show that (1) the peak benzene concentrations can always satisfy the environmental standard under the optimal strategy, (2) the pumping rates of all wells decrease under a relaxed standard or a long-term remediation approach, (3) the pumping rates are less affected by environmental quality constraints under short-term remediation, and (4) increasingly flexible environmental standards have a reduced effect on the optimal remediation strategy.

  3. Optimal intervention strategies for cholera outbreak by education and chlorination

    NASA Astrophysics Data System (ADS)

    Bakhtiar, Toni

    2016-01-01

    This paper discusses the control of infectious diseases in the framework of the optimal control approach. A case study on cholera control was carried out considering two control strategies, namely education and chlorination. We split the former control into one component addressing person-to-person behaviour and another concerning person-to-environment conduct. The model is divided into two interacting populations: a human population, which follows an SIR model, and a pathogen population. Pontryagin's maximum principle was applied to derive a set of differential equations, consisting of the dynamical and adjoint systems, as optimality conditions. The fourth-order Runge-Kutta method was then used to solve the equation system numerically. An illustrative example is provided to assess the effectiveness of the control strategies over a set of control scenarios.
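
    The numerical recipe described here - state equations integrated forward, adjoint equations integrated backward, and the control updated from the optimality condition - is commonly implemented as a forward-backward sweep. The sketch below applies it to a reduced model: a single SIR population with one "education" control that lowers the transmission rate, omitting the pathogen compartment and the chlorination control of the paper; all parameter values and the quadratic cost are hypothetical.

```python
# Forward-backward sweep for min J = int(A*I + c*u^2) dt on a reduced SIR model.
import numpy as np

beta, gamma = 0.5, 0.2          # transmission and recovery rates (assumed)
A, c, u_max = 1.0, 2.0, 0.9     # infection cost, control cost, control bound (assumed)
T, N = 30.0, 600
h = T / N

def rk4(f, y, a, b, dt):
    # one RK4 step of size dt for y' = f(y, p), with the extra argument p
    # interpolated between its start value a and end value b
    k1 = f(y, a)
    k2 = f(y + 0.5 * dt * k1, 0.5 * (a + b))
    k3 = f(y + 0.5 * dt * k2, 0.5 * (a + b))
    k4 = f(y + dt * k3, b)
    return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def state_rhs(y, u):
    S, I = y
    return np.array([-(1 - u) * beta * S * I,
                     (1 - u) * beta * S * I - gamma * I])

def adjoint_rhs(lam, su):
    S, I, u = su
    lS, lI = lam
    return np.array([(1 - u) * beta * I * (lS - lI),
                     -A + (1 - u) * beta * S * (lS - lI) + gamma * lI])

u = np.zeros(N + 1)
for sweep in range(50):
    # forward pass: integrate S, I with the current control
    x = np.zeros((N + 1, 2))
    x[0] = [0.95, 0.05]
    for k in range(N):
        x[k + 1] = rk4(state_rhs, x[k], u[k], u[k + 1], h)
    # backward pass: integrate the adjoints from lambda(T) = 0
    lam = np.zeros((N + 1, 2))
    for k in range(N, 0, -1):
        lam[k - 1] = rk4(adjoint_rhs, lam[k],
                         np.array([x[k, 0], x[k, 1], u[k]]),
                         np.array([x[k - 1, 0], x[k - 1, 1], u[k - 1]]), -h)
    # optimality condition: u* = clip(beta*S*I*(lambda_I - lambda_S)/(2c), 0, u_max)
    u_new = np.clip(beta * x[:, 0] * x[:, 1] * (lam[:, 1] - lam[:, 0]) / (2 * c), 0.0, u_max)
    if np.max(np.abs(u_new - u)) < 1e-5:
        break
    u = 0.5 * u + 0.5 * u_new    # relaxation stabilizes the sweep

print("peak infected fraction:", round(x[:, 1].max(), 3),
      "| mean control effort:", round(u.mean(), 3))
```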

  4. Optimal feedback scheme and universal time scaling for Hamiltonian parameter estimation.

    PubMed

    Yuan, Haidong; Fung, Chi-Hang Fred

    2015-09-11

    Time is a valuable resource and it is expected that a longer time period should lead to better precision in Hamiltonian parameter estimation. However, recent studies in quantum metrology have shown that in certain cases more time may even lead to worse estimations, which puts this intuition into question. In this Letter we show that by including feedback controls this intuition can be restored. By deriving asymptotically optimal feedback controls we quantify the maximal improvement feedback controls can provide in Hamiltonian parameter estimation and show a universal time scaling for the precision limit under the optimal feedback scheme. Our study reveals an intriguing connection between noncommutativity in the dynamics and the gain of feedback controls in Hamiltonian parameter estimation.

  5. Black-Litterman model on non-normal stock return (Case study four banks at LQ-45 stock index)

    NASA Astrophysics Data System (ADS)

    Mahrivandi, Rizki; Noviyanti, Lienda; Setyanto, Gatot Riwi

    2017-03-01

    Forming an optimal portfolio is a method that can help investors minimize risk and optimize profitability. One model for the optimal portfolio is the Black-Litterman (BL) model. The BL model can incorporate elements of historical data and the views of investors to form a new prediction of portfolio returns as a basis for constructing the asset weighting model. The BL model has two fundamental problems: the assumption of normality, and the estimation of the parameters of the market Bayesian prior framework when returns do not follow a normal distribution. This study provides an alternative solution in which the BL model is built from stock returns and investor views that follow a non-normal distribution.
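
    For reference, the standard normal-theory BL posterior mean that this study modifies combines the equilibrium returns π with the investor views Q through E[R] = [(τΣ)⁻¹ + PᵀΩ⁻¹P]⁻¹[(τΣ)⁻¹π + PᵀΩ⁻¹Q]. The sketch below evaluates this formula for a four-asset example; the covariance matrix, market weights, and the single view are hypothetical numbers chosen only so that the example runs.

```python
# A minimal sketch of the standard (normal-theory) Black-Litterman posterior mean.
import numpy as np

Sigma = np.array([[0.040, 0.010, 0.008, 0.006],
                  [0.010, 0.035, 0.007, 0.005],
                  [0.008, 0.007, 0.030, 0.004],
                  [0.006, 0.005, 0.004, 0.025]])     # return covariance (hypothetical)
w_mkt = np.array([0.30, 0.30, 0.20, 0.20])           # market-cap weights (hypothetical)
delta, tau = 2.5, 0.05                               # risk aversion and prior scaling (assumed)

pi = delta * Sigma @ w_mkt                           # implied equilibrium excess returns

# one view: asset 1 will outperform asset 2 by 2% (hypothetical)
P = np.array([[1.0, -1.0, 0.0, 0.0]])
Q = np.array([0.02])
Omega = np.diag(np.diag(P @ (tau * Sigma) @ P.T))    # common choice for view uncertainty

A = np.linalg.inv(tau * Sigma) + P.T @ np.linalg.inv(Omega) @ P
b = np.linalg.inv(tau * Sigma) @ pi + P.T @ np.linalg.inv(Omega) @ Q
mu_bl = np.linalg.solve(A, b)                        # posterior expected returns

w_opt = np.linalg.solve(delta * Sigma, mu_bl)        # unconstrained mean-variance weights
print("posterior returns:", np.round(mu_bl, 4))
print("optimal weights  :", np.round(w_opt / w_opt.sum(), 3))
```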

  6. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.

  7. Research on robust optimization of emergency logistics network considering the time dependence characteristic

    NASA Astrophysics Data System (ADS)

    WANG, Qingrong; ZHU, Changfeng; LI, Ying; ZHANG, Zhengkun

    2017-06-01

    Considering the time dependence of emergency logistics networks and the complexity of the environment in which such networks operate, this paper combines time-dependent network optimization theory with robust discrete optimization theory and builds a robust dynamic network optimization model for emergency logistics that maximizes timeliness. On this basis, considering the complexity of the dynamic network and the time dependence of the edge weights, an improved ant colony algorithm is proposed to couple the optimization algorithm with the time dependence and robustness of the network. Finally, a case study was carried out to test the validity of the robust optimization model and its algorithm, and the effect of different values of the regulation factor was analyzed, given the importance of this control factor in solving for the optimal path. The analysis results show that the model and algorithm have good timeliness and strong robustness.

  8. The dynamics and optimal control of spinning spacecraft and movable telescoping appendages, part A. [two axis control with single offset boom

    NASA Technical Reports Server (NTRS)

    Bainum, P. M.; Sellappan, R.

    1977-01-01

    The problem of optimal control with a minimum-time criterion, as applied to a single-boom system for achieving two-axis control, is discussed. The special case where the initial conditions are such that the system can be driven to the equilibrium state with only a single switching maneuver in the bang-bang optimal sequence is analyzed, and the system responses are presented. Application of the linear regulator problem to the optimal control of the telescoping system is extended to consider the effects of measurement and plant noise. The noise uncertainties are included through an application of the estimator (Kalman filter) problem. Different schemes for measuring the components of the angular velocity are considered. Analytical results are obtained for special cases, and numerical results are presented for the general case.

  9. A temperature match based optimization method for daily load prediction considering DLC effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Z.

    This paper presents a unique optimization method for short term load forecasting. The new method is based on the optimal template temperature match between the future and past temperatures. The optimal error reduction technique is a new concept introduced in this paper. Two case studies show that for hourly load forecasting, this method can yield results as good as the rather complicated Box-Jenkins Transfer Function method, and better than the Box-Jenkins method; for peak load prediction, this method is comparable in accuracy to the neural network method with back propagation, and can produce more accurate results than the multi-linear regression method. The DLC effect on system load is also considered in this method.

  10. SU-E-T-230: Creating a Large Number of Focused Beams with Variable Patient Head Tilt to Improve Dose Fall-Off for Brain Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiu, J; Ma, L

    2015-06-15

    Purpose: To develop a treatment delivery and planning strategy that increases the number of beams to minimize dose to brain tissue surrounding a target while maximizing dose coverage of the target. Methods: We analyzed 14 different treatment plans via Leksell PFX and 4C. For standardization, single-tumor cases were chosen. Original treatment plans were compared with two optimized plans. The number of beams was increased in the treatment plans by varying the tilt angle of the patient head, while maintaining the original isocenter, the beam positions in the x-, y- and z-axes, the collimator size, and the beam blocking. PFX optimized plans increased beam numbers with three pre-set tilt angles (70, 90, 110 degrees), and 4C optimized plans increased beam numbers with tilt angles varying arbitrarily within the range of 30 to 150 degrees. Optimized treatment plans were compared dosimetrically with the original treatment plans. Results: Comparing total normal tissue isodose volumes between original and optimized plans, the low-level percentage isodose volumes decreased in all plans. Despite increasing the number of beams by up to a factor of 25, beam-on times for 1 tilt angle versus 3 or more tilt angles were comparable (<1 min.). In 64% (9/14) of the studied cases, the volume percentage decreased by >5%, with the highest value reaching 19%. The addition of more tilt angles correlates with a greater decrease in the normal brain irradiated volume. Selectivity and coverage for original and optimized plans remained comparable. Conclusion: Adding a large number of additional focused beams with variable patient head tilt improves dose fall-off for brain radiosurgery. The study demonstrates the technical feasibility of adding beams to decrease the volume of normal tissue receiving low doses.

  11. Calculation of the Target Lumbar Lordosis Angle for Restoring an Optimal Pelvic Tilt in Elderly Patients With Adult Spinal Deformity.

    PubMed

    Yamato, Yu; Hasegawa, Tomohiko; Kobayashi, Sho; Yasuda, Tatsuya; Togawa, Daisuke; Arima, Hideyuki; Oe, Shin; Iida, Takahiro; Matsumura, Akira; Hosogane, Naobumi; Matsumoto, Morio; Matsuyama, Yukihiro

    2016-02-01

    This investigation consisted of a cross-sectional study and a retrospective multicenter case series. It sought to identify the ideal lumbar lordosis (LL) angle for restoring an optimal pelvic tilt (PT) in patients with adult spinal deformity (ASD). To achieve successful corrective fusion in ASD patients with sagittal imbalance, it is essential to correct the sagittal spinal alignment and obtain a suitable pelvic inclination. We determined the LL angle that would restore the optimal PT following ASD surgery. The cross-sectional study included 184 elderly volunteers (mean age 64 years) with an Oswestry Disability Index score less than 20%. The relationship between PT or LL and the pelvic incidence (PI) in normal individuals was investigated. The second study included 116 ASD patients (mean age 66 years) who underwent thoracolumbar corrective fusion at 1 of 4 spine centers. The postoperative PT values were calculated using the parameters measured. On the basis of these studies, an ideal LL angle was determined. In the cross-sectional study, the linear regression equation for the optimal PT as a function of PI was "optimal PT = 0.47 × PI - 7.5." In the second study, the postoperative PT was determined as a function of PI and corrected LL, using the equation "postoperative PT = 0.7 × PI - 0.5 × corrected LL + 8.1." The target LL angle was determined by mathematically equating the PTs of these 2 equations: "target LL = 0.45 × PI + 31.8." The ideal LL angle can thus be determined using the equation "LL = 0.45 × PI + 31.8," which can be used as a reference during surgical planning in ASD cases. Level of evidence: 4.
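
    The three regression equations quoted above can be turned directly into a small calculator, shown below for a hypothetical pelvic incidence; the function names and the example value are assumptions made only for this sketch.

```python
# A small sketch that encodes the regression equations quoted in the abstract
# (all angles in degrees); the example PI is hypothetical.
def optimal_pt(pi: float) -> float:
    return 0.47 * pi - 7.5                       # cross-sectional study: optimal PT vs PI

def target_ll(pi: float) -> float:
    return 0.45 * pi + 31.8                      # target LL obtained by equating the two PT models

def postop_pt(pi: float, corrected_ll: float) -> float:
    return 0.7 * pi - 0.5 * corrected_ll + 8.1   # surgical cohort: predicted postoperative PT

pi = 52.0                                        # hypothetical pelvic incidence (degrees)
ll = target_ll(pi)
print(f"target LL = {ll:.1f} deg")
print(f"predicted postoperative PT = {postop_pt(pi, ll):.1f} deg "
      f"(optimal PT = {optimal_pt(pi):.1f} deg)")
```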

  12. A predictive control framework for optimal energy extraction of wind farms

    NASA Astrophysics Data System (ADS)

    Vali, M.; van Wingerden, J. W.; Boersma, S.; Petrović, V.; Kühn, M.

    2016-09-01

    This paper proposes an adjoint-based model predictive control for optimal energy extraction of wind farms. It employs the axial induction factor of wind turbines to influence their aerodynamic interactions through the wake. The performance index is defined here as the total power production of the wind farm over a finite prediction horizon. A medium-fidelity wind farm model is utilized to predict the inflow propagation in advance. The adjoint method is employed to solve the formulated optimization problem in a cost effective way and the first part of the optimal solution is implemented over the control horizon. This procedure is repeated at the next controller sample time providing the feedback into the optimization. The effectiveness and some key features of the proposed approach are studied for a two turbine test case through simulations.

  13. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    NASA Technical Reports Server (NTRS)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.

  14. Optimality of the barrier strategy in de Finetti's dividend problem for spectrally negative Lévy processes: An alternative approach

    NASA Astrophysics Data System (ADS)

    Yin, Chuancun; Wang, Chunwei

    2009-11-01

    The optimal dividend problem proposed in de Finetti [1] is to find the dividend-payment strategy that maximizes the expected discounted value of dividends which are paid to the shareholders until the company is ruined. Avram et al. [9] studied the case when the risk process is modelled by a general spectrally negative Lévy process and Loeffen [10] gave sufficient conditions under which the optimal strategy is of the barrier type. Recently Kyprianou et al. [11] strengthened the result of Loeffen [10] which established a larger class of Lévy processes for which the barrier strategy is optimal among all admissible ones. In this paper we use an analytical argument to re-investigate the optimality of barrier dividend strategies considered in the three recent papers.

  15. A New Artificial Neural Network Enhanced by the Shuffled Complex Evolution Optimization with Principal Component Analysis (SP-UCI) for Water Resources Management

    NASA Astrophysics Data System (ADS)

    Hayatbini, N.; Faridzad, M.; Yang, T.; Akbari Asanjan, A.; Gao, X.; Sorooshian, S.

    2016-12-01

    Artificial Neural Networks (ANNs) are useful in many fields, including water resources engineering and management. However, due to the non-linear and chaotic characteristics associated with natural processes and human decision making, the use of ANNs in real-world applications is still limited, and their performance needs to be further improved for broader practical use. The commonly used Back-Propagation (BP) scheme and gradient-based optimization for training ANNs have already been found to be problematic in some cases. The BP scheme and gradient-based optimization methods carry the risk of premature convergence and of becoming stuck in local optima, and the search is highly dependent on the initial conditions. Therefore, as an alternative to BP and gradient-based search schemes, we propose an effective and efficient global search method, termed the Shuffled Complex Evolutionary Global optimization algorithm with Principal Component Analysis (SP-UCI), to train the ANN connection weights. A large number of real-world datasets are tested with the SP-UCI-based ANN, as well as with various popular Evolutionary Algorithm (EA)-enhanced ANNs, i.e., Particle Swarm Optimization (PSO)-, Genetic Algorithm (GA)-, Simulated Annealing (SA)-, and Differential Evolution (DE)-enhanced ANNs. Results show that the SP-UCI-enhanced ANN is generally superior to the other EA-enhanced ANNs with regard to convergence and computational performance. In addition, we carried out a case study of hydropower scheduling at Trinity Lake in the western U.S. In this case study, multiple climate indices are used as predictors for the SP-UCI-enhanced ANN. The reservoir inflows and hydropower releases are predicted at sub-seasonal to seasonal scales. Results show that the SP-UCI-enhanced ANN achieves better statistics than the other EA-based ANNs, which indicates the usefulness and power of the proposed SP-UCI-enhanced ANN for reservoir operation, water resources engineering and management. The SP-UCI-enhanced ANN is universally applicable to many other regression and prediction problems, and it has good potential to be an alternative to the classical BP scheme and gradient-based optimization methods.
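
    The general idea - treating the network's connection weights as decision variables for a population-based global optimizer rather than training them by back-propagation - is sketched below. SciPy's differential evolution is used purely as a convenient stand-in for SP-UCI, and the tiny network and synthetic data are illustrative assumptions.

```python
# A minimal sketch of training a small feed-forward network with a global
# optimizer instead of back-propagation.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(6)

# synthetic regression data (a stand-in for a predictor -> inflow relationship)
X = rng.uniform(-1, 1, (200, 3))
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=200)

n_in, n_hid = 3, 5
n_w = n_in * n_hid + n_hid + n_hid + 1        # W1, b1, w2, b2

def unpack(theta):
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    w2 = theta[i:i + n_hid]; i += n_hid
    b2 = theta[i]
    return W1, b1, w2, b2

def predict(theta, X):
    W1, b1, w2, b2 = unpack(theta)
    return np.tanh(X @ W1 + b1) @ w2 + b2

def rmse(theta):
    return float(np.sqrt(np.mean((predict(theta, X) - y) ** 2)))

bounds = [(-3.0, 3.0)] * n_w                  # search range for the connection weights
res = differential_evolution(rmse, bounds, maxiter=100, popsize=20, seed=0, tol=1e-6)
print("training RMSE with globally optimized weights:", round(res.fun, 4))
```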

  16. An optimal consumption and investment problem with quadratic utility and negative wealth constraints.

    PubMed

    Roh, Kum-Hwan; Kim, Ji Yeoun; Shin, Yong Hyun

    2017-01-01

    In this paper, we investigate the optimal consumption and portfolio selection problem with negative wealth constraints for an economic agent who has a quadratic utility function of consumption and receives a constant labor income. Due to the property of the quadratic utility function, we separate our problem into two cases and derive the closed-form solutions for each case. We also illustrate some numerical implications of the optimal consumption and portfolio.

  17. Optimal control for the sun-powered airplane with taking into account efficiency of onboard accumulator charging-discharging and charge limits

    NASA Astrophysics Data System (ADS)

    Serokhvostov, S. V.; Churkina, T. E.

    2018-06-01

    The problem of optimal control for an aircraft with an electric powerplant and solar cells in multiday flight is investigated using more precise equations of motion than in previous investigations. Cases with restrictions on the aircraft's energy storage and with peculiarities of its charging and discharging are also analyzed. Pontryagin's maximum principle is utilized. Optimal trajectories are obtained for the cases considered.

  18. Assessment of quality outcomes for robotic pancreaticoduodenectomy: identification of the learning curve.

    PubMed

    Boone, Brian A; Zenati, Mazen; Hogg, Melissa E; Steve, Jennifer; Moser, Arthur James; Bartlett, David L; Zeh, Herbert J; Zureikat, Amer H

    2015-05-01

    Quality assessment is an important instrument to ensure optimal surgical outcomes, particularly during the adoption of new surgical technology. The use of the robotic platform for complex pancreatic resections, such as the pancreaticoduodenectomy, requires close monitoring of outcomes during its implementation phase to ensure patient safety is maintained and the learning curve identified. To report the results of a quality analysis and learning curve during the implementation of robotic pancreaticoduodenectomy (RPD). A retrospective review of a prospectively maintained database of 200 consecutive patients who underwent RPD in a large academic center from October 3, 2008, through March 1, 2014, was evaluated for important metrics of quality. Patients were analyzed in groups of 20 to minimize demographic differences and optimize the ability to detect statistically meaningful changes in performance. Robotic pancreaticoduodenectomy. Optimization of perioperative outcome parameters. No statistical differences in mortality rates or major morbidity were noted during the study. Statistical improvements in estimated blood loss and conversions to open surgery occurred after 20 cases (600 mL vs 250 mL [P = .002] and 35.0% vs 3.3% [P < .001], respectively), incidence of pancreatic fistula after 40 cases (27.5% vs 14.4%; P = .04), and operative time after 80 cases (581 minutes vs 417 minutes [P < .001]). Complication rates, lengths of stay, and readmission rates showed continuous improvement that did not reach statistical significance. Outcomes for the last 120 cases (representing optimized metrics beyond the learning curve) included a mean operative time of 417 minutes, median estimated blood loss of 250 mL, a conversion rate of 3.3%, 90-day mortality of 3.3%, a clinically significant (grade B/C) pancreatic fistula rate of 6.9%, and a median length of stay of 9 days. Continuous assessment of quality metrics allows for safe implementation of RPD. We identified several inflexion points corresponding to optimization of performance metrics for RPD that can be used as benchmarks for surgeons who are adopting this technology.

  19. TARGETED SEQUENTIAL DESIGN FOR TARGETED LEARNING INFERENCE OF THE OPTIMAL TREATMENT RULE AND ITS MEAN REWARD.

    PubMed

    Chambaz, Antoine; Zheng, Wenjing; van der Laan, Mark J

    2017-01-01

    This article studies the targeted sequential inference of an optimal treatment rule (TR) and its mean reward in the non-exceptional case, i.e., assuming that there is no stratum of the baseline covariates where treatment is neither beneficial nor harmful, and under a companion margin assumption. Our pivotal estimator, whose definition hinges on the targeted minimum loss estimation (TMLE) principle, actually infers the mean reward under the current estimate of the optimal TR. This data-adaptive statistical parameter is worthy of interest on its own. Our main result is a central limit theorem which enables the construction of confidence intervals on both mean rewards, under the current estimate of the optimal TR and under the optimal TR itself. The asymptotic variance of the estimator takes the form of the variance of an efficient influence curve at a limiting distribution, allowing us to discuss the efficiency of inference. As a by-product, we also derive confidence intervals on two cumulated pseudo-regrets, a key notion in the study of bandit problems. A simulation study illustrates the procedure. One of the cornerstones of the theoretical study is a new maximal inequality for martingales with respect to the uniform entropy integral.

  20. Optimization of operation conditions and configurations for solid-propellant ducted rocket combustors

    NASA Astrophysics Data System (ADS)

    Onn, Shing-Chung; Chiang, Hau-Jei; Hwang, Hang-Che; Wei, Jen-Ko; Cherng, Dao-Lien

    1993-06-01

    The dynamic behavior of a 2D turbulent mixing and combustion process has been studied numerically in the main combustion chamber of a solid-propellant ducted rocket (SDR). The mathematical model is based on the Favre-averaged conservation equations developed by Cherng (1990). Combustion efficiency, rather than the specific impulse used in earlier studies, is applied successfully to optimize the effects of two parameters by means of a multiple linear regression model. Specifically, the fuel-air equivalence ratio of the operating conditions and the air inlet location of the SDR combustor configuration have been studied. For an equivalence ratio near the stoichiometric condition, the use of specific impulse or combustion efficiency shows a similar trend in characterizing the reacting flow field in the combustor. For overall fuel-lean operating conditions, combustion efficiency is much more sensitive to the air inlet location than specific impulse is, suggesting that combustion efficiency is a better property than specific impulse for representing conditions approaching the flammability limits. In addition, the air inlet location for maximum efficiency generally appears to lie downstream of that for the highest specific impulse. The optimal case for the combined effects of the two parameters occurs at a fuel-lean condition, which shows a larger recirculation zone in front, deeper penetration of the ram air into the combustor, and a much larger high-temperature zone near the centerline of the combustor exit than the optimal case for an overall equivalence ratio close to stoichiometric.

  1. Rationally optimized cryopreservation of multiple mouse embryonic stem cell lines: II—Mathematical prediction and experimental validation of optimal cryopreservation protocols☆

    PubMed Central

    Kashuba, Corinna M.; Benson, James D.; Critser, John K.

    2014-01-01

    In Part I, we documented differences in cryopreservation success measured by membrane integrity in four mouse embryonic stem cell (mESC) lines from different genetic backgrounds (BALB/c, CBA, FVB, and 129R1), and we demonstrated a potential biophysical basis for these differences through a comparative study characterizing the membrane permeability characteristics and osmotic tolerance limits of each cell line. Here we use these values to predict optimal cryoprotectants, cooling rates, warming rates, and plunge temperatures. We subsequently verified these predictions experimentally for their effects on post-thaw recovery. From this study, we determined that a cryopreservation protocol utilizing 1 M propylene glycol, a cooling rate of 1 °C/minute, and plunging into liquid nitrogen at −41 °C, combined with subsequent warming in a 22 °C water bath with agitation, significantly improved post-thaw recovery for three of the four mESC lines, and did not diminish post-thaw recovery for our single exception. It is proposed that this protocol can be successfully applied to most mESC lines beyond those included within this study once the effect of propylene glycol on mESC gene expression, growth characteristics, and germ-line transmission has been determined. Mouse ESC lines with poor survival using current standard cryopreservation protocols or our proposed protocol can be optimized on a case-by-case basis using the method we have outlined over two papers. For our single exception, the CBA cell line, a cooling rate of 5 °C/minute in the presence of 1.0 M dimethyl sulfoxide or 1.0 M propylene glycol, combined with plunge temperature of −80 °C was optimal. PMID:24560712

  2. Cost-effectiveness of angiographic imaging in isolated perimesencephalic subarachnoid hemorrhage.

    PubMed

    Kalra, Vivek B; Wu, Xiao; Forman, Howard P; Malhotra, Ajay

    2014-12-01

    The purpose of this study is to perform a comprehensive cost-effectiveness analysis of all possible permutations of computed tomographic angiography (CTA) and digital subtraction angiography imaging strategies for both initial diagnosis and follow-up imaging in patients with perimesencephalic subarachnoid hemorrhage on noncontrast CT. Each possible imaging strategy was evaluated in a decision tree created with TreeAge Pro Suite 2014, with parameters derived from a meta-analysis of 40 studies and literature values. Base case and sensitivity analyses were performed to assess the cost-effectiveness of each strategy. A Monte Carlo simulation was conducted with distributional variables to evaluate the robustness of the optimal strategy. The base case scenario showed performing initial CTA with no follow-up angiographic studies in patients with perimesencephalic subarachnoid hemorrhage to be the most cost-effective strategy ($5422/quality adjusted life year). Using a willingness-to-pay threshold of $50 000/quality adjusted life year, the most cost-effective strategy based on net monetary benefit is CTA with no follow-up when the sensitivity of initial CTA is >97.9%, and CTA with CTA follow-up otherwise. The Monte Carlo simulation reported CTA with no follow-up to be the optimal strategy at willingness-to-pay of $50 000 in 99.99% of the iterations. Digital subtraction angiography, whether at initial diagnosis or as part of follow-up imaging, is never the optimal strategy in our model. CTA without follow-up imaging is the optimal strategy for evaluation of patients with perimesencephalic subarachnoid hemorrhage when modern CT scanners and a strict definition of perimesencephalic subarachnoid hemorrhage are used. Digital subtraction angiography and follow-up imaging are not optimal as they carry complications and associated costs. © 2014 American Heart Association, Inc.
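
    The roll-back arithmetic behind such a decision-tree comparison is simple enough to show directly: each strategy's expected cost and expected QALYs follow from its branch probabilities, and strategies are then ranked by net monetary benefit at the chosen willingness-to-pay threshold. Every probability, cost, and utility below is a hypothetical placeholder, not a value from the published model.

```python
# A minimal sketch of rolling back a decision tree and ranking strategies by
# net monetary benefit (NMB = QALY * willingness-to-pay - cost).
WTP = 50_000.0   # willingness-to-pay per QALY

# branches: (probability, cost, QALYs) -- hypothetical numbers for illustration
strategies = {
    "CTA, no follow-up": [
        (0.98, 1_200.0, 20.0),    # aneurysm correctly excluded
        (0.02, 9_000.0, 18.5),    # missed lesion, later treatment
    ],
    "CTA + DSA follow-up": [
        (0.995, 4_500.0, 19.95),  # correct work-up, small procedural risk
        (0.005, 15_000.0, 17.0),  # DSA complication
    ],
}

def rollback(branches):
    cost = sum(p * c for p, c, _ in branches)
    qaly = sum(p * q for p, _, q in branches)
    return cost, qaly

for name, branches in strategies.items():
    cost, qaly = rollback(branches)
    nmb = qaly * WTP - cost
    print(f"{name:22s} cost=${cost:8.0f}  QALY={qaly:6.2f}  NMB=${nmb:,.0f}")
```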

  3. Optimization of Blended Wing Body Composite Panels Using Both NASTRAN and Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.

    2006-01-01

    The blended wing body (BWB) is a concept that has been investigated for improving the performance of transport aircraft. A trade study was conducted by evaluating four regions from a BWB design characterized by three fuselage bays and a 400,000 lb. gross take-off weight (GTW). This report describes the structural optimization of these regions via computational analysis and compares them to the baseline designs of the same construction. The identified regions were simplified for use in the optimization. The regions were represented by flat panels having appropriate classical boundary conditions and uniform force resultants along the panel edges. Panel-edge tractions and internal pressure values applied during the study were those determined by nonlinear NASTRAN analyses. Only one load case was considered in the optimization analysis for each panel region. Optimization was accomplished using both NASTRAN solution 200 and Genetic Algorithm (GA), with constraints imposed on stress, buckling, and minimum thicknesses. The NASTRAN optimization analyses often resulted in infeasible solutions due to violation of the constraints, whereas the GA enforced satisfaction of the constraints and, therefore, always ensured a feasible solution. However, both optimization methods encountered difficulties when the number of design variables was increased. In general, the optimized panels weighed less than the comparable baseline panels.

  4. Computer model for refinery operations with emphasis on jet fuel production. Volume 2: Data and technical bases

    NASA Technical Reports Server (NTRS)

    Dunbar, D. N.; Tunnah, B. G.

    1978-01-01

    The FORTRAN computing program predicts the flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuel of varying end point and hydrogen content specifications. The program has provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case. The report has sufficient detail for the information of most readers.

  5. Trajectory optimization for dynamic couch rotation during volumetric modulated arc radiotherapy

    NASA Astrophysics Data System (ADS)

    Smyth, Gregory; Bamber, Jeffrey C.; Evans, Philip M.; Bedford, James L.

    2013-11-01

    Non-coplanar radiation beams are often used in three-dimensional conformal and intensity modulated radiotherapy to reduce dose to organs at risk (OAR) by geometric avoidance. In volumetric modulated arc radiotherapy (VMAT) non-coplanar geometries are generally achieved by applying patient couch rotations to single or multiple full or partial arcs. This paper presents a trajectory optimization method for a non-coplanar technique, dynamic couch rotation during VMAT (DCR-VMAT), which combines ray tracing with a graph search algorithm. Four clinical test cases (partial breast, brain, prostate only, and prostate and pelvic nodes) were used to evaluate the potential OAR sparing for trajectory-optimized DCR-VMAT plans, compared with standard coplanar VMAT. In each case, ray tracing was performed and a cost map reflecting the number of OAR voxels intersected for each potential source position was generated. The least-cost path through the cost map, corresponding to an optimal DCR-VMAT trajectory, was determined using Dijkstra’s algorithm. Results show that trajectory optimization can reduce dose to specified OARs for plans otherwise comparable to conventional coplanar VMAT techniques. For the partial breast case, the mean heart dose was reduced by 53%. In the brain case, the maximum lens doses were reduced by 61% (left) and 77% (right) and the globes by 37% (left) and 40% (right). Bowel mean dose was reduced by 15% in the prostate only case. For the prostate and pelvic nodes case, the bowel V50 Gy and V60 Gy were reduced by 9% and 45% respectively. Future work will involve further development of the algorithm and assessment of its performance over a larger number of cases in site-specific cohorts.
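
    The search step described here can be sketched with a toy cost map: each (gantry, couch) control point carries a cost standing in for the number of OAR voxels intersected by ray tracing, and Dijkstra's algorithm finds the cheapest trajectory that advances one gantry control point at a time while the couch changes by at most one step between neighbours. The grid size, the random costs, and the couch-motion limit are assumptions for illustration.

```python
# A minimal sketch of least-cost trajectory selection over a (gantry, couch)
# cost map using Dijkstra's algorithm.
import heapq
import numpy as np

rng = np.random.default_rng(7)

n_gantry, n_couch = 36, 19                 # control-point grid (assumed discretization)
cost_map = rng.integers(0, 500, (n_gantry, n_couch)).astype(float)

def optimal_trajectory(cost_map, max_couch_step=1):
    n_g, n_c = cost_map.shape
    dist = np.full((n_g, n_c), np.inf)
    prev = {}
    heap = []
    for c in range(n_c):                   # any starting couch angle is allowed
        dist[0, c] = cost_map[0, c]
        heapq.heappush(heap, (dist[0, c], 0, c))
    while heap:
        d, g, c = heapq.heappop(heap)
        if d > dist[g, c] or g == n_g - 1:
            continue                       # stale entry or terminal gantry position
        for dc in range(-max_couch_step, max_couch_step + 1):
            nc = c + dc
            if 0 <= nc < n_c:
                nd = d + cost_map[g + 1, nc]
                if nd < dist[g + 1, nc]:
                    dist[g + 1, nc] = nd
                    prev[(g + 1, nc)] = (g, c)
                    heapq.heappush(heap, (nd, g + 1, nc))
    end = (n_g - 1, int(np.argmin(dist[n_g - 1])))
    path = [end]
    while path[-1] in prev:                # walk the predecessor links back to the start
        path.append(prev[path[-1]])
    return path[::-1], dist[end]

path, total = optimal_trajectory(cost_map)
print("total OAR-intersection cost:", total)
print("couch index per gantry control point:", [c for _, c in path])
```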

  6. A two‐point scheme for optimal breast IMRT treatment planning

    PubMed Central

    2013-01-01

    We propose an approach to determining optimal beam weights in breast/chest wall IMRT treatment plans. The goal is to decrease breathing effect and to maximize skin dose if the skin is included in the target or, otherwise, to minimize the skin dose. Two points in the target are utilized to calculate the optimal weights. The optimal plan (i.e., the plan with optimal beam weights) consists of high energy unblocked beams, low energy unblocked beams, and IMRT beams. Six breast and five chest wall cases were retrospectively planned with this scheme in Eclipse, including one breast case where CTV was contoured by the physician. Compared with 3D CRT plans composed of unblocked and field‐in‐field beams, the optimal plans demonstrated comparable or better dose uniformity, homogeneity, and conformity to the target, especially at beam junction when supraclavicular nodes are involved. Compared with nonoptimal plans (i.e., plans with nonoptimized weights), the optimal plans had better dose distributions at shallow depths close to the skin, especially in cases where breathing effect was taken into account. This was verified with experiments using a MapCHECK device attached to a motion simulation table (to mimic motion caused by breathing). PACS number: 87.55 de PMID:24257291

  7. Maximum cycle work output optimization for generalized radiative law Otto cycle engines

    NASA Astrophysics Data System (ADS)

    Xia, Shaojun; Chen, Lingen; Sun, Fengrui

    2016-11-01

    An Otto cycle internal combustion engine which includes thermal and friction losses is investigated by finite-time thermodynamics, with maximum cycle work output as the optimization objective. The thermal energy transfer from the working substance to the cylinder inner wall follows the generalized radiative law (q ∝ Δ(T^n)). Under the condition that the fuel consumption, the compression ratio and the cycle period are all given, the optimal piston trajectories for the cases with unlimited and with limited acceleration on every stroke are determined, and the cycle-period distribution among the strokes is also optimized. Numerical results for the radiative law are provided and compared with those obtained for the Newtonian law and the linear phenomenological law. The results indicate that the optimal piston trajectory on each stroke consists of three segments, beginning with a maximum-acceleration part and ending with a maximum-deceleration part. For the radiative law, optimizing the piston motion path achieves an improvement of more than 20% in both the cycle work output and the second-law efficiency of the Otto cycle compared with conventional near-sinusoidal operation, and the heat transfer mechanism has both qualitative and quantitative influences on the optimal piston paths.
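
    For concreteness, the generalized law reduces to familiar special cases by choice of the exponent n: n = 1 gives the Newtonian (convective) law, n = 4 the radiative law, and n = -1 the linear phenomenological law. The snippet below simply evaluates q = k (T_hot^n - T_cold^n) for the three cases; the conductances and temperatures are made-up numbers for illustration, not values from the paper.

        def heat_flux(k, T_hot, T_cold, n):
            """Generalized radiative law q = k * (T_hot**n - T_cold**n).
            n = 1  -> Newtonian law, n = 4 -> radiative law,
            n = -1 -> linear phenomenological law (sign handled by k)."""
            return k * (T_hot**n - T_cold**n)

        # Illustrative numbers only (not from the paper).
        T_gas, T_wall = 1200.0, 400.0   # K
        for n, k in [(1, 50.0), (4, 2.0e-8), (-1, -1.0e4)]:
            print(f"n = {n:+d}: q = {heat_flux(k, T_gas, T_wall, n):.1f} W/m^2")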

  8. Analytical solutions to optimal underactuated spacecraft formation reconfiguration

    NASA Astrophysics Data System (ADS)

    Huang, Xu; Yan, Ye; Zhou, Yang

    2015-11-01

    Underactuated systems can generally be defined as systems with fewer control inputs than degrees of freedom to be controlled. In this paper, analytical solutions to optimal underactuated spacecraft formation reconfiguration without either the radial or the in-track control are derived. Using a linear dynamical model of underactuated spacecraft formation in circular orbits, controllability analysis is conducted for each underactuated case. Indirect optimization methods based on the minimum principle are then introduced to generate analytical solutions to the optimal open-loop underactuated reconfiguration problems. Both fixed and free final-condition constraints are considered for each underactuated case, and comparisons between the two indicate that the optimal control strategies with free final conditions require less control effort than those with fixed final conditions. Meanwhile, closed-loop adaptive sliding mode controllers for both underactuated cases are designed to guarantee optimal trajectory tracking in the presence of unmatched external perturbations, linearization errors, and system uncertainties. The adaptation laws are designed via a Lyapunov-based method to ensure the overall stability of the closed-loop system. Explicit expressions for the terminal convergence regions of each system state have also been obtained. Numerical simulations demonstrate the validity and feasibility of the proposed open-loop and closed-loop control schemes for optimal underactuated spacecraft formation reconfiguration in circular orbits.

  9. Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization

    NASA Technical Reports Server (NTRS)

    Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.

    2014-01-01

    Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model had changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure the worst cases are being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) at which to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial DoE is selected appropriately.
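
    The Expected Improvement criterion at the heart of EGO has a closed form once a Gaussian-process surrogate supplies a predictive mean and standard deviation. The sketch below shows that calculation for a minimization problem; the surrogate (scikit-learn's GaussianProcessRegressor) and the toy objective are stand-ins for the SAGE III thermal model, not the actual analysis or tool chain used in the report.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor

        def expected_improvement(X_cand, gp, f_best):
            """EI for minimization: EI = (f_best - mu)*Phi(z) + sigma*phi(z)."""
            mu, sigma = gp.predict(X_cand, return_std=True)
            sigma = np.maximum(sigma, 1e-12)
            z = (f_best - mu) / sigma
            return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

        # Toy 1D "worst-case metric" standing in for a thermal model run.
        f = lambda x: np.sin(3 * x) + 0.5 * x
        X = np.array([[0.1], [0.9], [1.7], [2.5]])      # initial DoE points
        y = f(X).ravel()
        gp = GaussianProcessRegressor().fit(X, y)

        X_cand = np.linspace(0.0, 3.0, 301).reshape(-1, 1)
        ei = expected_improvement(X_cand, gp, y.min())
        print("next run at x =", float(X_cand[np.argmax(ei)]))

    A multi-start variant, as described in the abstract, would pick several local maxima of the EI surface per iteration instead of the single global maximizer shown here.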

  10. Effect of modulation p-doping level on multi-state lasing in InAs/InGaAs quantum dot lasers having different external loss

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Maximov, M. V.; Zubov, F. I.; Shernyakov, Yu. M.; Kulagina, M. M.; Zhukov, A. E.

    2017-09-01

    The influence of the modulation p-doping level on multi-state lasing in InAs/InGaAs quantum dot (QD) lasers is studied experimentally for devices having various external losses. It is shown that in the case of short cavities (high external loss), there is an increase in the lasing power component corresponding to the ground-state optical transitions of QDs as the p-doping level grows. However, in the case of long cavities (small external loss), higher dopant concentrations may have an opposite effect on the output power. Based on these observations, an optimal design of laser geometry and an optimal doping level are discussed.

  11. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn; Forsgren, Anders

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.

  12. Optimization of auxiliary optics in active-optics telescopes

    NASA Astrophysics Data System (ADS)

    Ragazzoni, Roberto

    1993-04-01

    The a priori knowledge that active optics will be available in a telescope can be exploited in the design, optimization, and specification of tolerances for the auxiliary devices of such a telescope. A modification of the merit function to be used in the optimization process is given, together with some considerations about the design procedure. The different effects of aberrations that typically depend upon the position in the field of view (such as coma or astigmatism) and of those that are typically constant over the whole field of view (such as spherical aberration) are explicitly taken into account in the mathematical treatment. A possible range of applications (prime focus corrector, off-axis field corrector, field flattener, reducing camera, and so on) is discussed, and a case study for a field flattener is shown. The general result of this paper is that tolerances are generally strongly relaxed, while a significant improvement of the nominal performance can be obtained only in particular cases or by assuming a high dynamic range of the active optics correction.

  13. Efficient logistic regression designs under an imperfect population identifier.

    PubMed

    Albert, Paul S; Liu, Aiyi; Nansel, Tonja

    2014-03-01

    Motivated by actual study designs, this article considers efficient logistic regression designs where the population is identified with a binary test that is subject to diagnostic error. We consider the case where the imperfect test is obtained on all participants, while the gold standard test is measured on a small chosen subsample. Under maximum-likelihood estimation, we evaluate the optimal design in terms of sample selection as well as verification. We show that there may be substantial efficiency gains by choosing a small percentage of individuals who test negative on the imperfect test for inclusion in the sample (e.g., verifying 90% test-positive cases). We also show that a two-stage design may be a good practical alternative to a fixed design in some situations. Under optimal and nearly optimal designs, we compare maximum-likelihood and semi-parametric efficient estimators under correct and misspecified models with simulations. The methodology is illustrated with an analysis from a diabetes behavioral intervention trial. © 2013, The International Biometric Society.

  14. Multiple Choice Knapsack Problem: example of planning choice in transportation.

    PubMed

    Zhong, Tao; Young, Rhonda

    2010-05-01

    Transportation programming, a process of selecting projects for funding given budget and other constraints, is becoming more complex as a result of new federal laws, local planning regulations, and increased public involvement. This article describes the use of an integer programming tool, the Multiple Choice Knapsack Problem (MCKP), to provide optimal solutions to transportation programming problems in cases where alternative versions of projects are under consideration. Optimization methods for use in the transportation programming process are compared, and the process of building and solving the optimization problems is then discussed. The concepts behind the use of MCKP are presented, and a real-world transportation programming example at various budget levels is provided. The article illustrates how MCKP addresses these modern complexities and provides timely solutions in transportation programming practice. While the article uses transportation programming as a case study, MCKP can be useful in other fields where a similar decision among a subset of alternatives is required. Copyright 2009 Elsevier Ltd. All rights reserved.
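
    As a sketch of the MCKP formulation described above, the dynamic program below picks exactly one version of each project (a "do nothing" version with zero cost and benefit can be added to make a project optional) so that total benefit is maximized within the budget. The projects, costs, and benefits are invented for illustration and are not taken from the article's example.

        def mckp(groups, budget):
            """groups: list of lists of (cost, benefit); choose one item per group.
            Returns (best_benefit, chosen_indices). Costs are integer units."""
            NEG = float("-inf")
            best = [NEG] * (budget + 1)
            best[0] = 0
            choice = [[None] * (budget + 1) for _ in groups]
            for g, items in enumerate(groups):
                new = [NEG] * (budget + 1)
                for b in range(budget + 1):
                    if best[b] == NEG:
                        continue
                    for i, (cost, benefit) in enumerate(items):
                        nb = b + cost
                        if nb <= budget and best[b] + benefit > new[nb]:
                            new[nb] = best[b] + benefit
                            choice[g][nb] = (i, b)
                best = new
            b_star = max(range(budget + 1), key=lambda b: best[b])
            picks, b = [], b_star
            for g in range(len(groups) - 1, -1, -1):
                i, b_prev = choice[g][b]
                picks.append(i)
                b = b_prev
            return best[b_star], list(reversed(picks))

        # Each group = alternative versions (cost, benefit) of one project.
        projects = [[(0, 0), (3, 8), (5, 12)],    # project A: skip, basic, full
                    [(0, 0), (4, 9)],             # project B: skip, build
                    [(2, 5), (6, 14)]]            # project C: two scopes
        print(mckp(projects, budget=9))

    Rerunning the same routine at several budget levels reproduces the kind of budget-sweep analysis the article describes.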

  15. Optimization of green infrastructure network at semi-urbanized watersheds to manage stormwater volume, peak flow and life cycle cost: Case study of Dead Run watershed in Maryland

    NASA Astrophysics Data System (ADS)

    Heidari Haratmeh, B.; Rai, A.; Minsker, B. S.

    2016-12-01

    Green Infrastructure (GI) has become widely known as a sustainable solution for stormwater management in urban environments. Despite this growing recognition, researchers and practitioners lack clear and explicit guidelines on how GI practices should be implemented in urban settings. This study is developing a noise-tolerant, multi-objective, multi-scale genetic algorithm that determines optimal GI networks for environmental, economic and social objectives. The methodology accounts for uncertainty in modeling results and is designed to operate at the sub-watershed as well as the patch scale using two different simulation models, SWMM and RHESSys, in a Cloud-based implementation with a Web interface. As an initial case study, a semi-urbanized watershed, Dead Run 5, in Baltimore County, Maryland, is selected. The objectives of the study are to minimize life cycle cost, maximize human preference for well-being, and minimize the difference between pre-development hydrographs (generated from current rainfall events and design storms) and those resulting from the proposed GI scenarios. Initial results for the Dead Run 5 watershed suggest that placing GI in the proximity of the watershed outlet optimizes life cycle cost, stormwater volume, and peak flow capture. The framework can easily present outcomes of GI design scenarios to both designers and local stakeholders, and future plans include receiving feedback from users on candidate designs and interactively updating optimal GI network designs in a crowd-sourced design process. This approach can also be helpful in deriving design guidelines that better meet stakeholder needs.

  16. On the existence of touch points for first-order state inequality constraints

    NASA Technical Reports Server (NTRS)

    Seywald, Hans; Cliff, Eugene M.

    1993-01-01

    The appearance of touch points in state constrained optimal control problems with general vector-valued control is studied. Under the assumption that the Hamiltonian is regular, touch points for first-order state inequalities are shown to exist only under very special conditions. In many cases of practical importance these conditions can be used to exclude touch points a priori without solving an optimal control problem. The results are demonstrated on a simple example.

  17. A Simulation-Optimization Model for the Management of Seawater Intrusion

    NASA Astrophysics Data System (ADS)

    Stanko, Z.; Nishikawa, T.

    2012-12-01

    Seawater intrusion is a common problem in coastal aquifers where excessive groundwater pumping can lead to chloride contamination of a freshwater resource. Simulation-optimization techniques have been developed to determine optimal management strategies while mitigating seawater intrusion. The simulation models are often density-independent groundwater-flow models that may assume a sharp interface and/or use equivalent freshwater heads. The optimization methods are often linear-programming (LP) based techniques that require simplifications of the real-world system. However, seawater intrusion is a highly nonlinear, density-dependent flow and transport problem, which requires the use of nonlinear-programming (NLP) or global-optimization (GO) techniques. NLP approaches are difficult because of the need for gradient information; therefore, we have chosen a GO technique for this study. Specifically, we have coupled a multi-objective genetic algorithm (GA) with a density-dependent groundwater-flow and transport model to simulate and identify strategies that optimally manage seawater intrusion. GA is a heuristic approach, often chosen when seeking optimal solutions to highly complex and nonlinear problems where LP or NLP methods cannot be applied. The GA utilized in this study is the Epsilon-Nondominated Sorted Genetic Algorithm II (ɛ-NSGAII), which can approximate a Pareto-optimal front between competing objectives. This algorithm has several key features: real and/or binary variable capabilities; an efficient sorting scheme; preservation and diversity of good solutions; dynamic population sizing; constraint handling; parallelizable implementation; and user controlled precision for each objective. The simulation model is SEAWAT, the USGS model that couples MODFLOW with MT3DMS for variable-density flow and transport. ɛ-NSGAII and SEAWAT were efficiently linked together through a C-Fortran interface. The simulation-optimization model was first tested by using a published density-independent flow model test case that was originally solved using a sequential LP method with the USGS's Ground-Water Management Process (GWM). For the problem formulation, the objective is to maximize net groundwater extraction, subject to head and head-gradient constraints. The decision variables are pumping rates at fixed wells and the system's state is represented with freshwater hydraulic head. The results of the proposed algorithm were similar to the published results (within 1%); discrepancies may be attributed to differences in the simulators and inherent differences between LP and GA. The GWM test case was then extended to a density-dependent flow and transport version. As formulated, the optimization problem is infeasible because of the density effects on hydraulic head. Therefore, the sum of the squared constraint violation (SSC) was used as a second objective. The result is a Pareto curve showing optimal pumping rates versus the SSC. Analysis of this curve indicates that a similar net-extraction rate to the test case can be obtained with a minor violation in vertical head-gradient constraints. This study shows that a coupled ɛ-NSGAII/SEAWAT model can be used for the management of seawater intrusion in groundwater. In the future, the proposed methodology will be applied to a real-world seawater intrusion and resource management problem for Santa Barbara, CA.
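
    A key device in the formulation above is turning the infeasible head-gradient constraints into a second objective, the sum of squared constraint violations (SSC), and reporting the nondominated trade-off between extraction and SSC. The sketch below shows that bookkeeping for a toy two-objective problem; the pumping response and constraint values are placeholders, and a real application would obtain them from SEAWAT rather than from the simple algebra used here.

        import numpy as np

        def ssc(violations):
            """Sum of squared constraint violations (only positive parts count)."""
            v = np.maximum(np.asarray(violations, dtype=float), 0.0)
            return float(np.sum(v ** 2))

        def nondominated(points):
            """Return indices of the Pareto front for (maximize f1, minimize f2)."""
            front = []
            for i, (f1, f2) in enumerate(points):
                dominated = any((g1 >= f1 and g2 <= f2) and (g1 > f1 or g2 < f2)
                                for j, (g1, g2) in enumerate(points) if j != i)
                if not dominated:
                    front.append(i)
            return front

        rng = np.random.default_rng(0)
        candidates = rng.uniform(0.0, 100.0, size=(50, 3))          # pumping rates
        extraction = candidates.sum(axis=1)                          # maximize
        # Placeholder "head-gradient" constraints: violation grows with pumping.
        violations = [ssc([c[0] - 60.0, c[1] + c[2] - 120.0]) for c in candidates]
        points = list(zip(extraction, violations))
        for i in sorted(nondominated(points), key=lambda k: points[k][1]):
            print(f"extraction = {points[i][0]:6.1f}, SSC = {points[i][1]:8.1f}")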

  18. Power optimization in body sensor networks: the case of an autonomous wireless EMG sensor powered by PV-cells.

    PubMed

    Penders, J; Pop, V; Caballero, L; van de Molengraft, J; van Schaijk, R; Vullers, R; Van Hoof, C

    2010-01-01

    Recent advances in ultra-low-power circuits and energy harvesters are making self-powered body sensor nodes a reality. Power optimization at the system and application level is crucial to achieving ultra-low power consumption for the entire system. This paper reviews system-level power optimization techniques and illustrates their impact in the case of autonomous wireless EMG monitoring. The resulting prototype, an autonomous wireless EMG sensor powered by PV cells, is presented.

  19. On the suitability of different representations of solid catalysts for combinatorial library design by genetic algorithms.

    PubMed

    Gobin, Oliver C; Schüth, Ferdi

    2008-01-01

    Genetic algorithms are widely used to solve and optimize combinatorial problems and are increasingly applied to library design in combinatorial chemistry. Because of their flexibility, however, their implementation can be challenging. In this study, the influence of the representation of solid catalysts on the performance of genetic algorithms was systematically investigated on the basis of a new, constrained, multiobjective, combinatorial test problem with properties common to problems in combinatorial materials science. Constraints were satisfied by penalty functions, repair algorithms, or special representations. The tests were performed using three state-of-the-art evolutionary multiobjective algorithms by performing 100 optimization runs for each algorithm and test case. Experimental data obtained during the optimization of a noble-metal-free solid catalyst system active in the selective catalytic reduction of nitric oxide with propene were used to build a predictive model to validate the results of the theoretical test problem. A significant influence of the representation on the optimization performance was observed. Binary encodings were found to be the preferred encoding in most of the cases, and depending on the experimental test unit, repair algorithms or penalty functions performed best.

  20. Optimization of groundwater artificial recharge systems using a genetic algorithm: a case study in Beijing, China

    NASA Astrophysics Data System (ADS)

    Hao, Qichen; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Huang, Linxian

    2018-05-01

    An optimization approach is used for the operation of groundwater artificial recharge systems in an alluvial fan in Beijing, China. The optimization model incorporates a transient groundwater flow model, which allows for simulation of the groundwater response to artificial recharge. The facilities' operation with regard to recharge rates is formulated as a nonlinear programming problem to maximize the volume of surface water recharged into the aquifers under specific constraints. This optimization problem is solved by the parallel genetic algorithm (PGA) based on OpenMP, which could substantially reduce the computation time. To solve the PGA with constraints, the multiplicative penalty method is applied. In addition, the facilities' locations are implicitly determined on the basis of the results of the recharge-rate optimizations. Two scenarios are optimized and the optimal results indicate that the amount of water recharged into the aquifers will increase without exceeding the upper limits of the groundwater levels. Optimal operation of this artificial recharge system can also contribute to the more effective recovery of the groundwater storage capacity.
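
    The multiplicative penalty method mentioned above can be illustrated with a fitness function that scales the total recharge volume down by a factor for every violated groundwater-level limit, so that constraint violations directly degrade the objective seen by the genetic algorithm. The simple head response used below is a made-up linear stand-in for the transient flow model, and all coefficients and limits are illustrative.

        import numpy as np

        H_MAX = np.array([12.0, 10.0, 14.0])   # upper limits on head rise (m), illustrative

        def head_rise(recharge_rates):
            """Placeholder linear response of groundwater levels to recharge;
            the real model would be a transient groundwater-flow simulation."""
            A = np.array([[0.08, 0.02, 0.01],
                          [0.03, 0.07, 0.02],
                          [0.01, 0.03, 0.09]])
            return A @ recharge_rates

        def fitness(recharge_rates):
            """Maximize total recharge, multiplicatively penalizing each violated limit."""
            total = float(np.sum(recharge_rates))
            heads = head_rise(np.asarray(recharge_rates, dtype=float))
            penalty = 1.0
            for h, h_max in zip(heads, H_MAX):
                if h > h_max:
                    penalty *= 1.0 / (1.0 + (h - h_max))   # each violation shrinks fitness
            return total * penalty

        print(fitness([60.0, 50.0, 40.0]))       # feasible candidate, no penalty
        print(fitness([200.0, 180.0, 150.0]))    # heavily penalized candidate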

  1. Cast Off expansion plan by rapid improvement through Optimization tool design, Tool Parameters and using Six Sigma’s ECRS Technique

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, T.; Saravanan, R.

    2017-03-01

    Powerful management concepts improve product quality, save production time and thereby increase the production rate, and improve tools and techniques, work culture, the workplace, and employee motivation and morale. This paper discusses a case study of optimizing the tool design and tool parameters to cast off an expansion plan according to the ECRS technique. The proposed designs and optimal tool parameters yielded the best results and met customer demand without an expansion plan. Hence, the work yielded substantial savings in cost (direct and indirect) and time, and significantly improved employee motivation and morale.

  2. Lead optimization attrition analysis (LOAA): a novel and general methodology for medicinal chemistry.

    PubMed

    Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos

    2015-08-01

    Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Procuring load curtailment from local customers under uncertainty.

    PubMed

    Mijatović, Aleksandar; Moriarty, John; Vogrinc, Jure

    2017-08-13

    Demand side response (DSR) provides a flexible approach to managing constrained power network assets. This is valuable if future asset utilization is uncertain. However, there may be uncertainty over the process of procuring DSR from customers. In this context we combine probabilistic modelling, simulation and optimization to identify economically optimal procurement policies from heterogeneous customers local to the asset, under chance constraints on the adequacy of the procured DSR. Mathematically this gives rise to a search over permutations, and we provide an illustrative example implementation and case study. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  4. A multi-period optimization model for energy planning with CO(2) emission consideration.

    PubMed

    Mirzaesmaeeli, H; Elkamel, A; Douglas, P L; Croiset, E; Gupta, M

    2010-05-01

    A novel deterministic multi-period mixed-integer linear programming (MILP) model for the power generation planning of electric systems is described and evaluated in this paper. The model is developed with the objective of determining the optimal mix of energy supply sources and pollutant mitigation options that meet a specified electricity demand and CO(2) emission targets at minimum cost. Several time-dependent parameters are included in the model formulation; they include forecasted energy demand, fuel price variability, construction lead time, conservation initiatives, and increase in fixed operational and maintenance costs over time. The developed model is applied to two case studies. The objective of the case studies is to examine the economical, structural, and environmental effects that would result if the electricity sector was required to reduce its CO(2) emissions to a specified limit. Copyright 2009 Elsevier Ltd. All rights reserved.
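
    A heavily simplified single-period version of such a MILP can be written with an off-the-shelf solver interface; the sketch below (using the open-source PuLP package) chooses how much of a fixed demand each generation technology supplies at minimum cost subject to a CO(2) cap. The technologies, costs, emission factors, demand, and cap are invented numbers, and the model in the paper is multi-period with many more decision variables and constraint classes.

        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus, value

        # Illustrative data: cost ($/MWh), CO2 (t/MWh), capacity (MWh).
        tech = {
            "coal":   {"cost": 30.0, "co2": 0.95, "cap": 600.0},
            "gas_cc": {"cost": 45.0, "co2": 0.40, "cap": 500.0},
            "wind":   {"cost": 60.0, "co2": 0.00, "cap": 300.0},
        }
        demand = 900.0        # MWh
        co2_limit = 450.0     # t

        prob = LpProblem("generation_mix", LpMinimize)
        gen = {t: LpVariable(f"gen_{t}", lowBound=0, upBound=d["cap"])
               for t, d in tech.items()}

        prob += lpSum(d["cost"] * gen[t] for t, d in tech.items())          # total cost
        prob += lpSum(gen[t] for t in tech) == demand, "meet_demand"
        prob += lpSum(d["co2"] * gen[t] for t, d in tech.items()) <= co2_limit, "co2_cap"

        prob.solve()
        print(LpStatus[prob.status])
        for t in tech:
            print(t, value(gen[t]))
        print("cost =", value(prob.objective))

    Adding a time index to the variables, build decisions as binaries, and fuel-price and demand trajectories as data would turn this toy into a multi-period model of the kind the abstract describes.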

  5. How do employees and managers perceive depression: a worksite case study.

    PubMed

    Hauck, Katelyn; Chard, Gill

    2009-01-01

    The impact of depression in the workplace is significant. If managers and employees understood depression better, they could assist those with depression to achieve optimal work performance. The case study site was a medium-sized, privately owned forest products company located in western Canada. Individual interviews were used to explore the views of employees and managers about depression and its impact on work performance. The findings suggest that how one perceives workplace support for depression is influenced by the interaction of the following factors: a) knowledge and understanding of depression, b) roles and responsibilities within the work environment, and c) perceptions of work role boundaries. Better links are needed between employees and managers to enhance workplace collaboration and achieve optimal work performance. The implementation of mental health support programs and the vocational role of occupational therapy in addressing the impact of depression in the workplace are discussed.

  6. Duodenal and jejunal Dieulafoy’s lesions: optimal management

    PubMed Central

    Yılmaz, Tonguç Utku; Kozan, Ramazan

    2017-01-01

    Dieulafoy’s lesions (DLs) are rare and cause gastrointestinal bleeding resulting from erosion of dilated submucosal vessels. The most common location for DL is the stomach, followed by the duodenum. There is little information about duodenal and jejunal DLs. Challenges for the diagnosis and treatment of Dieulafoy’s lesions include the rare nature of the disease, asymptomatic patients, bleeding that often requires rapid diagnosis and treatment in symptomatic patients, variability in diagnostic and treatment methods resulting from different lesion locations, and the risk of re-bleeding. For these reasons, there is no universal consensus on the diagnosis and treatment approach, and only a few case reports and case series have been published recently. Most duodenal DLs are not evaluated separately in these studies, which makes it difficult to determine the optimal management model. In this study, we summarize the general aspects of and recent approaches used to treat duodenal DL. PMID:29158686

  7. An efficient hybrid approach for multiobjective optimization of water distribution systems

    NASA Astrophysics Data System (ADS)

    Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.

    2014-05-01

    An efficient hybrid approach for the design of water distribution systems (WDSs) with multiple objectives is described in this paper. The objectives are the minimization of the network cost and maximization of the network resilience. A self-adaptive multiobjective differential evolution (SAMODE) algorithm has been developed, in which control parameters are automatically adapted by means of evolution instead of the presetting of fine-tuned parameter values. In the proposed method, a graph algorithm is first used to decompose a looped WDS into a shortest-distance tree (T) or forest, and chords (Ω). The original two-objective optimization problem is then approximated by a series of single-objective optimization problems of the T to be solved by nonlinear programming (NLP), thereby providing an approximate Pareto optimal front for the original whole network. Finally, the solutions at the approximate front are used to seed the SAMODE algorithm to find an improved front for the original entire network. The proposed approach is compared with two other conventional full-search optimization methods (the SAMODE algorithm and the NSGA-II) that seed the initial population with purely random solutions based on three case studies: a benchmark network and two real-world networks with multiple demand loading cases. Results show that (i) the proposed NLP-SAMODE method consistently generates better-quality Pareto fronts than the full-search methods with significantly improved efficiency; and (ii) the proposed SAMODE algorithm (no parameter tuning) exhibits better performance than the NSGA-II with calibrated parameter values in efficiently offering optimal fronts.
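
    The self-adaptation idea in SAMODE, evolving the control parameters alongside the decision variables instead of hand-tuning them, can be illustrated on a single-objective differential evolution in which each individual carries its own F and CR that are occasionally resampled and survive only when they produce a better trial vector. This is a generic jDE-style sketch on a toy cost function, not the authors' multiobjective algorithm or a water distribution system model.

        import numpy as np

        def self_adaptive_de(cost, bounds, pop_size=30, generations=200, seed=1):
            rng = np.random.default_rng(seed)
            dim = len(bounds)
            lo, hi = np.array(bounds).T
            pop = rng.uniform(lo, hi, size=(pop_size, dim))
            F = np.full(pop_size, 0.5)          # per-individual mutation factor
            CR = np.full(pop_size, 0.9)         # per-individual crossover rate
            fit = np.array([cost(x) for x in pop])
            for _ in range(generations):
                for i in range(pop_size):
                    # Occasionally resample this individual's control parameters.
                    Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
                    CRi = rng.uniform(0.0, 1.0) if rng.random() < 0.1 else CR[i]
                    a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                         size=3, replace=False)
                    mutant = np.clip(pop[a] + Fi * (pop[b] - pop[c]), lo, hi)
                    cross = rng.random(dim) < CRi
                    cross[rng.integers(dim)] = True
                    trial = np.where(cross, mutant, pop[i])
                    f_trial = cost(trial)
                    if f_trial <= fit[i]:       # keep the trial and its parameters
                        pop[i], fit[i], F[i], CR[i] = trial, f_trial, Fi, CRi
            best = int(np.argmin(fit))
            return pop[best], fit[best]

        sphere = lambda x: float(np.sum(x ** 2))
        x_best, f_best = self_adaptive_de(sphere, bounds=[(-5, 5)] * 4)
        print(x_best, f_best)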

  8. Optimization of Contrast Detection Power with Probabilistic Behavioral Information

    PubMed Central

    Cordes, Dietmar; Herzmann, Grit; Nandy, Rajesh; Curran, Tim

    2012-01-01

    Recent progress in the experimental design for event-related fMRI experiments made it possible to find the optimal stimulus sequence for maximum contrast detection power using a genetic algorithm. In this study, a novel algorithm is proposed for optimization of contrast detection power by including probabilistic behavioral information, based on pilot data, in the genetic algorithm. As a particular application, a recognition memory task is studied and the design matrix optimized for contrasts involving the familiarity of individual items (pictures of objects) and the recollection of qualitative information associated with the items (left/right orientation). Optimization of contrast efficiency is a complicated issue whenever subjects’ responses are not deterministic but probabilistic. Contrast efficiencies are not predictable unless behavioral responses are included in the design optimization. However, available software for design optimization does not include options for probabilistic behavioral constraints. If the anticipated behavioral responses are included in the optimization algorithm, the design is optimal for the assumed behavioral responses, and the resulting contrast efficiency is greater than what either a block design or a random design can achieve. Furthermore, improvements of contrast detection power depend strongly on the behavioral probabilities, the perceived randomness, and the contrast of interest. The present genetic algorithm can be applied to any case in which fMRI contrasts are dependent on probabilistic responses that can be estimated from pilot data. PMID:22326984

  9. Multi-GPU implementation of a VMAT treatment plan optimization algorithm.

    PubMed

    Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B

    2015-06-01

    Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP problems are implemented on the CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai-Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
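
    The storage scheme described above, keeping the full sparse DDC matrix in COO format on the CPU and handing each GPU a CSR submatrix covering one subset of beam angles, can be mimicked on the CPU side with SciPy; the column-to-angle mapping and the matrix below are invented placeholders, and the actual device transfers and peer-to-peer access are omitted.

        import numpy as np
        from scipy import sparse

        n_voxels, n_beamlets, n_gpus = 1000, 360, 4

        # Placeholder sparse DDC matrix in COO format (voxels x beamlets).
        ddc_coo = sparse.random(n_voxels, n_beamlets, density=0.02,
                                format="coo", random_state=0)

        # Assume beamlets are ordered by beam angle, so an even column split
        # assigns a contiguous block of angles to each GPU.
        col_blocks = np.array_split(np.arange(n_beamlets), n_gpus)
        sub_csr = [ddc_coo.tocsr()[:, cols].tocsr() for cols in col_blocks]

        for g, m in enumerate(sub_csr):
            print(f"GPU {g}: shape {m.shape}, nnz {m.nnz}")
        # Each sub_csr[g] would then be copied to device g (e.g. with a GPU array
        # library), and beamlet prices computed per device before an inter-GPU reduction.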

  10. Combining epidemiology and biomechanics in sports injury prevention research: a new approach for selecting suitable controls.

    PubMed

    Finch, Caroline F; Ullah, Shahid; McIntosh, Andrew S

    2011-01-01

    Several important methodological issues need to be considered when designing sports injury case-control studies. Major design goals for case-control studies include the accounting for prior injury risk exposure, and optimal definitions of both cases and suitable controls are needed to ensure this. This article reviews methodological aspects of published sports injury case-control studies, particularly with regard to the selection of controls. It argues for a new approach towards selecting controls for case-control studies that draws on an interface between epidemiological and biomechanical concepts. A review was conducted to identify sport injury case-control studies published in the peer-review literature during 1985-2008. Overall, 32 articles were identified, of which the majority related to upper or lower extremity injuries. Matching considerations were used for control selection in 16 studies. Specific mention of application of biomechanical principles in the selection of appropriate controls was absent from all studies, including those purporting to evaluate the benefits of personal protective equipment to protect against impact injury. This is a problem because it could lead to biased conclusions, as cases and controls are not fully comparable in terms of similar biomechanical impact profiles relating to the injury incident, such as site of the impact on the body. The strength of the conclusions drawn from case-control studies, and the extent to which results can be generalized, is directly influenced by the definition and recruitment of cases and appropriate controls. Future studies should consider the interface between epidemiological and biomechanical concepts when choosing appropriate controls to ensure that proper adjustment of prior exposure to injury risk is made. To provide necessary guidance for the optimal selection of controls in case-control studies of interventions to prevent sports-related impact injury, this review outlines a new case-control selection strategy that reflects the importance of biomechanical considerations, which ensures that controls are selected based on the presence of the same global injury mechanism as the cases. To summarize, the general biomechanical principles that should apply to the selection of controls in future case-control studies are as follows: (i) each control must have been exposed to the same global injury mechanism as the case, (e.g. head impact, fall onto outstretched arm); and (ii) intrinsic (individual) factors (e.g. age, sex, skill level) that might modify the person's response to the relevant biomechanical loads are adjusted when either selecting the controls or are in the analysis phase. The same considerations for control selection apply to other study designs such as matched cohort studies or case-crossover studies.

  11. The development of multi-objective optimization model for excess bagasse utilization: A case study for Thailand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buddadee, Bancha; Wirojanagud, Wanpen; Watts, Daniel J.

    In this paper, a multi-objective optimization model is proposed as a tool to assist in deciding on the proper utilization scheme for the excess bagasse produced in the sugarcane industry. Two major scenarios for excess bagasse utilization are considered in the optimization. The first scenario is the typical situation in which excess bagasse is used for onsite electricity production. In the second scenario, excess bagasse is processed for offsite ethanol production; the ethanol is then blended with 91-octane gasoline in proportions of 10% and 90% by volume, respectively, and the mixture is used as an alternative fuel for gasoline vehicles in Thailand. The model proposed in this paper, called 'Environmental System Optimization', comprises a life cycle impact assessment of global warming potential (GWP) and the associated cost, followed by a multi-objective optimization that finds the optimal proportion of excess bagasse processed in each scenario. Basic mathematical expressions for the GWP and cost of the entire process of excess bagasse utilization are taken into account in the model formulation and optimization. The outcome of this study is a methodology for decision-making concerning the utilization of the excess bagasse available in Thailand in view of GWP and economic effects. A demonstration example is presented to illustrate the advantage of the methodology, which may be used by policy makers. The methodology satisfies both environmental and economic objectives over the whole life cycle of the system. It is shown in the demonstration example that the first scenario results in positive GWP while the second scenario results in negative GWP; the combination of the two scenarios results in positive or negative GWP depending on the weighting given to each objective. The results on the economics of all scenarios show satisfactory outcomes.

  12. Metaheuristic optimization approaches to predict shear-wave velocity from conventional well logs in sandstone and carbonate case studies

    NASA Astrophysics Data System (ADS)

    Emami Niri, Mohammad; Amiri Kolajoobi, Rasool; Khodaiy Arbat, Mohammad; Shahbazi Raz, Mahdi

    2018-06-01

    Seismic wave velocities, along with petrophysical data, provide valuable information during the exploration and development stages of oil and gas fields. The compressional-wave velocity (VP) is acquired using conventional acoustic logging tools in many drilled wells, but the shear-wave velocity (VS) is recorded using advanced logging tools in only a limited number of wells, mainly because of the high operational costs. In addition, laboratory measurements of seismic velocities on core samples are expensive and time consuming, so alternative methods are often used to estimate VS. Several empirical correlations that predict VS from well logging measurements and petrophysical data such as VP, porosity and density have been proposed; however, these empirical relations can only be used in limited cases. Intelligent systems and optimization algorithms offer inexpensive, fast and efficient approaches for predicting VS. In this study, in addition to the widely used Greenberg–Castagna empirical method, we implement three relatively recently developed metaheuristic algorithms to construct linear and nonlinear models for predicting VS: teaching–learning based optimization, imperialist competitive and artificial bee colony algorithms. We demonstrate the applicability and performance of these algorithms for predicting VS from conventional well logs in two field data examples, a sandstone formation from an offshore oil field and a carbonate formation from an onshore oil field. We compared the VS estimated with each of the employed metaheuristic approaches with the observed VS and with that predicted by the Greenberg–Castagna relations. The results indicate that, for both the sandstone and carbonate case studies, all three implemented metaheuristic algorithms are more efficient and reliable than the empirical correlation for predicting VS. The results also demonstrate that in both case studies the performance of the artificial bee colony algorithm in VS prediction is slightly higher than that of the two other employed approaches.

  13. Improving Curriculum through Blended Learning Pedagogy

    ERIC Educational Resources Information Center

    Darojat, Ojat

    2016-01-01

    This paper is a study of blended learning pedagogy in open and distance learning (ODL), involving two universities in Southeast Asia, STOU Thailand and UT Indonesia. The purpose of this study is to understand the issues related to the implementation of blended-learning pedagogy. Qualitative case study was employed to optimize my understanding of…

  14. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method—A Case Study of Western Jilin Province

    PubMed Central

    An, Yongkai; Lu, Wenxi; Cheng, Weiguo

    2015-01-01

    This paper introduces a surrogate model to identify an optimal exploitation scheme, with the western Jilin province selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu County and Qian Gorlos County, respectively, to supply water to Daan County. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region of the input variables. A surrogate of the numerical groundwater flow model was then developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme, using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as the multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, which is a high approximation accuracy. A comparison between the surrogate-based simulation optimization model and the conventional simulation optimization model for solving the same optimization problem shows that the former needs only 5.5 hours, whereas the latter needs 25 days. These results indicate that the surrogate model developed in this study not only considerably reduces the computational burden of the simulation optimization process, but also maintains high computational accuracy. This can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately. PMID:26264008
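
    The surrogate-building loop described above, sampling the decision space with Latin Hypercube Sampling, running the expensive simulator at those points, and fitting a kriging-type regressor that the optimizer can call cheaply, can be sketched as follows. The "simulator" is a throwaway analytic function standing in for the groundwater flow model, and scikit-learn's Gaussian process regressor is used as a stand-in for the regression kriging fit used in the paper.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def simulator(q):
            """Placeholder for the groundwater model: drawdown as a function of
            two well exploitation rates (purely illustrative algebra)."""
            return 0.02 * q[:, 0] + 0.015 * q[:, 1] + 1e-5 * q[:, 0] * q[:, 1]

        # Latin Hypercube sample of the feasible exploitation rates.
        sampler = qmc.LatinHypercube(d=2, seed=0)
        unit = sampler.random(n=40)
        rates = qmc.scale(unit, l_bounds=[0.0, 0.0], u_bounds=[500.0, 500.0])
        drawdown = simulator(rates)

        # Kriging-style surrogate (Gaussian process with an RBF kernel).
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=100.0),
                                      normalize_y=True).fit(rates, drawdown)

        # Cheap surrogate predictions for use inside the optimization loop.
        test = np.array([[100.0, 400.0], [450.0, 50.0]])
        pred, std = gp.predict(test, return_std=True)
        print(pred, std)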

  15. Designing a multistage supply chain in cross-stage reverse logistics environments: application of particle swarm optimization algorithms.

    PubMed

    Chiang, Tzu-An; Che, Z H; Cui, Zhihua

    2014-01-01

    This study designed a cross-stage reverse logistics course for defective products so that damaged products generated in downstream partners can be directly returned to upstream partners throughout the stages of a supply chain for rework and maintenance. To solve this reverse supply chain design problem, an optimal cross-stage reverse logistics mathematical model was developed. In addition, we developed a genetic algorithm (GA) and three particle swarm optimization (PSO) algorithms: the inertia weight method (PSOA_IWM), V(Max) method (PSOA_VMM), and constriction factor method (PSOA_CFM), which we employed to find solutions to support this mathematical model. Finally, a real case and five simulative cases with different scopes were used to compare the execution times, convergence times, and objective function values of the four algorithms used to validate the model proposed in this study. Regarding system execution time, the GA consumed more time than the other three PSOs did. Regarding objective function value, the GA, PSOA_IWM, and PSOA_CFM could obtain a lower convergence value than PSOA_VMM could. Finally, PSOA_IWM demonstrated a faster convergence speed than PSOA_VMM, PSOA_CFM, and the GA did.
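
    Of the three PSO variants compared above, the inertia-weight version (PSOA_IWM) is the simplest to write down: each particle's velocity is a weighted blend of its previous velocity, a pull toward its own best position, and a pull toward the swarm's best. The sketch below minimizes a toy cost in place of the reverse-logistics objective; all parameter values are conventional defaults, not the paper's settings.

        import numpy as np

        def pso_inertia(cost, bounds, n_particles=30, iters=200,
                        w=0.7, c1=1.5, c2=1.5, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds).T
            dim = len(bounds)
            x = rng.uniform(lo, hi, size=(n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
            g = int(np.argmin(pbest_val))
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = np.clip(x + v, lo, hi)
                vals = np.array([cost(p) for p in x])
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], vals[improved]
                g = int(np.argmin(pbest_val))
                if pbest_val[g] < gbest_val:
                    gbest, gbest_val = pbest[g].copy(), pbest_val[g]
            return gbest, gbest_val

        rosenbrock = lambda p: float((1 - p[0])**2 + 100 * (p[1] - p[0]**2)**2)
        print(pso_inertia(rosenbrock, bounds=[(-2, 2), (-2, 2)]))

    The constriction-factor variant replaces the inertia weight with a factor derived from c1 and c2, and the V(Max) variant simply clamps each velocity component to a maximum magnitude.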

  16. Designing a Multistage Supply Chain in Cross-Stage Reverse Logistics Environments: Application of Particle Swarm Optimization Algorithms

    PubMed Central

    Chiang, Tzu-An; Che, Z. H.

    2014-01-01

    This study designed a cross-stage reverse logistics course for defective products so that damaged products generated in downstream partners can be directly returned to upstream partners throughout the stages of a supply chain for rework and maintenance. To solve this reverse supply chain design problem, an optimal cross-stage reverse logistics mathematical model was developed. In addition, we developed a genetic algorithm (GA) and three particle swarm optimization (PSO) algorithms: the inertia weight method (PSOA_IWM), V Max method (PSOA_VMM), and constriction factor method (PSOA_CFM), which we employed to find solutions to support this mathematical model. Finally, a real case and five simulative cases with different scopes were used to compare the execution times, convergence times, and objective function values of the four algorithms used to validate the model proposed in this study. Regarding system execution time, the GA consumed more time than the other three PSOs did. Regarding objective function value, the GA, PSOA_IWM, and PSOA_CFM could obtain a lower convergence value than PSOA_VMM could. Finally, PSOA_IWM demonstrated a faster convergence speed than PSOA_VMM, PSOA_CFM, and the GA did. PMID:24772026

  17. Joint location, inventory, and preservation decisions for non-instantaneous deterioration items under delay in payments

    NASA Astrophysics Data System (ADS)

    Tsao, Yu-Chung

    2016-02-01

    This study models a joint location, inventory and preservation decision-making problem for non-instantaneously deteriorating items under delay in payments. An outside supplier provides a credit period to the wholesaler, which has a distribution system with distribution centres (DCs). Non-instantaneous deterioration means that no deterioration occurs in the early stage, which is typical of items such as fresh food and fruit. This paper also considers that the deterioration rate will decrease and the preservation cost will increase as the preservation effort increases; therefore, how much preservation effort should be made is a crucial decision. The objective of this paper is to determine the optimal locations and number of DCs, the optimal replenishment cycle time at the DCs, and the optimal preservation effort simultaneously such that the total network profit is maximised. The problem is formulated as piecewise nonlinear functions and has three different cases. Algorithms based on piecewise nonlinear optimisation are provided to solve the joint location and inventory problem for all cases. Computational analysis illustrates the solution procedures and the impacts of the related parameters on decisions and profits. The results of this study can serve as references for business managers or administrators.

  18. Efficient Gradient-Based Shape Optimization Methodology Using Inviscid/Viscous CFD

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1997-01-01

    The formerly developed preconditioned-biconjugate-gradient (PBCG) solvers for the analysis and the sensitivity equations had resulted in very large error reductions per iteration; quadratic convergence was achieved whenever the solution entered the domain of attraction to the root. Their memory requirement was also lower than that of a direct inversion solver. However, this memory requirement was still high enough to preclude the realistic, high-grid-density design of a practical 3D geometry. This limitation served as the impetus for the first-year activity (March 9, 1995 to March 8, 1996). Therefore, the major activity for this period was the development of a low-memory methodology for discrete-sensitivity-based shape optimization. This was accomplished by solving all the resulting sets of equations using an alternating-direction-implicit (ADI) approach. The results indicated that shape optimization problems which required large numbers of grid points could be resolved with a gradient-based approach. Therefore, to better utilize the computational resources, it was recommended that a number of coarse-grid cases, using the PBCG method, should initially be conducted to better define the optimization problem and the design space and to obtain an improved initial shape. Subsequently, a fine-grid shape optimization, which necessitates using the ADI method, should be conducted to accurately obtain the final optimized shape. The other activity during this period was the interaction with members of the Aerodynamic and Aeroacoustic Methods Branch of Langley Research Center during one stage of their investigation to develop an adjoint-variable sensitivity method using the viscous flow equations. This method had algorithmic similarities to the variational sensitivity methods and the control-theory approach. However, unlike the prior studies, it was considered for the three-dimensional, viscous flow equations. The major accomplishment in the second period of this project (March 9, 1996 to March 8, 1997) was the extension of the shape optimization methodology to the Thin-Layer Navier-Stokes (TLNS) equations. Both the Euler-based and the TLNS-based analyses compared well with the analyses obtained using the CFL3D code. The sensitivities, again from both levels of the flow equations, also compared very well with the finite-differenced sensitivities. A fairly large set of shape optimization cases was conducted to study a number of issues previously not well understood. The testbed for these cases was the shaping of an arrow wing in Mach 2.4 flow. All the final shapes, obtained either from a coarse-grid-based or a fine-grid-based optimization, using either a Euler-based or a TLNS-based analysis, were re-analyzed using a fine-grid TLNS solution for their function evaluations. This allowed for a fairer comparison of their relative merits. From the aerodynamic performance standpoint, the fine-grid TLNS-based optimization produced the best shape, and the fine-grid Euler-based optimization produced the lowest cruise efficiency.

  19. The optimal age of measles immunisation in low-income countries: a secondary analysis of the assumptions underlying the current policy

    PubMed Central

    Martins, Cesário L; Garly, May-Lill; Rodrigues, Amabelia; Benn, Christine S; Whittle, Hilton

    2012-01-01

    Objective The current policy of measles vaccination at 9 months of age was decided in the mid-1970s. The policy was not tested for impact on child survival but was based on studies of seroconversion after measles vaccination at different ages. The authors examined the empirical evidence for the six underlying assumptions. Design Secondary analysis. Data sources and methods These assumptions have not been research issues. Hence, the authors examined case reports to assess the empirical evidence for the original assumptions. The authors used existing reviews, and in December 2011, the authors made a PubMed search for relevant papers. The title and abstract of papers in English, French, Portuguese, Spanish, German and Scandinavian languages were assessed to ascertain whether the paper was potentially relevant. Based on cumulative measles incidence figures, the authors calculated how many measles cases had been prevented assuming everybody was vaccinated at a specific age, how many ‘vaccine failures’ would occur after the age of vaccination and how many cases would occur before the specific age of vaccination. In the combined analyses of several studies, the authors used the Mantel–Haenszel weighted RR stratifying for study or age groups to estimate common trends. Setting and participants African community studies of measles infection. Primary and secondary outcomes Consistency between assumptions and empirical evidence and the predicted effect on mortality. Results In retrospect, the major assumptions were based on false premises. First, in the single study examining this point, seronegative vaccinated children had considerable protection against measles infection. Second, in 18 community studies, vaccinated measles cases (‘vaccine failures’) had threefold lower case death than unvaccinated cases. Third, in 24 community studies, infants had twofold higher case death than older measles cases. Fourth, the only study examining the assumption that ‘vaccine failures’ lead to lack of confidence found the opposite because vaccinated children had milder measles infection. Fifth, a one-dose policy was recommended. However, the two randomised trials of early two-dose measles vaccination compared with one-dose vaccination found significantly reduced mortality until 3 years of age. Thus, current evidence suggests that the optimal age for a single dose of measles vaccine should have been 6 or 7 months resulting in fewer severe unvaccinated cases among infants but more mild ‘vaccine failures’ among older children. Furthermore, the two-dose trials indicate that measles vaccine reduces mortality from other causes than measles infection. Conclusions Many lives may have been lost by not determining the optimal age of measles vaccination. Since seroconversion continues to be the basis for policy, the current recommendation is to increase the age of measles vaccination to 12 months in countries with limited measles transmission. This policy may lead to an increase in child mortality. PMID:22815465

  20. Optimal Chebyshev polynomials on ellipses in the complex plane

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Freund, Roland

    1989-01-01

    The design of iterative schemes for sparse matrix computations often leads to constrained polynomial approximation problems on sets in the complex plane. For the case of ellipses, we introduce a new class of complex polynomials which are in general very good approximations to the best polynomials and even optimal in most cases.

  1. An Optimal Algorithm towards Successive Location Privacy in Sensor Networks with Dynamic Programming

    NASA Astrophysics Data System (ADS)

    Zhao, Baokang; Wang, Dan; Shao, Zili; Cao, Jiannong; Chan, Keith C. C.; Su, Jinshu

    In wireless sensor networks, preserving location privacy under successive inference attacks is extremely critical. Although this problem is NP-complete in general cases, we propose a dynamic programming based algorithm and prove that it is optimal in special cases where the correlation only exists between p immediately adjacent observations.

  2. Computed Tomographic Analysis of Ventral Atlantoaxial Optimal Safe Implantation Corridors in 27 Dogs.

    PubMed

    Leblond, Guillaume; Gaitero, Luis; Moens, Noel M M; Zur Linden, Alex; James, Fiona M K; Monteith, Gabrielle J; Runciman, John

    2017-11-01

    Objectives  Ventral atlantoaxial stabilization techniques are challenging surgical procedures in dogs. Available surgical guidelines are based upon subjective anatomical landmarks, and limited radiographic and computed tomographic data. The aims of this study were (1) to provide detailed anatomical descriptions of atlantoaxial optimal safe implantation corridors to generate objective recommendations for optimal implant placements and (2) to compare anatomical data obtained in non-affected Toy breed dogs, affected Toy breed dogs suffering from atlantoaxial instability and non-affected Beagle dogs. Methods  Anatomical data were collected from a prospectively recruited population of 27 dogs using a previously validated method of optimal safe implantation corridor analysis using computed tomographic images. Results  Optimal implant positions and three-dimensional numerical data were generated successfully in all cases. Anatomical landmarks could be used to generate objective definitions of optimal insertion points which were applicable across all three groups. Overall the geometrical distribution of all implant sites was similar in all three groups with a few exceptions. Clinical Significance  This study provides extensive anatomical data available to facilitate surgical planning of implant placement for atlantoaxial stabilization. Our data suggest that non-affected Toy breed dogs and non-affected Beagle dogs constitute reasonable research models to study atlantoaxial stabilization constructs. Schattauer GmbH Stuttgart.

  3. WE-AB-303-06: Combining DAO with MV + KV Optimization to Improve Skin Dose Sparing with Real-Time Fluoroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grelewicz, Z; Wiersma, R

    Purpose: Real-time fluoroscopy may allow for improved patient positioning and tumor tracking, particularly in the treatment of lung tumors. In order to mitigate the effects of the imaging dose, previous studies have demonstrated the effect of including both imaging dose and imaging constraints into the inverse treatment planning objective function. That method of combined MV+kV optimization may result in plans with treatment beams chosen to allow for more gentle imaging beam-on times. Direct-aperture optimization (DAO) is also known to produce treatment plans with fluence maps more conducive to lower beam-on times. Therefore, in this work we demonstrate the feasibility of a combination of DAO and MV+kV optimization for further optimized real-time kV imaging. Methods: Therapeutic and imaging beams were modeled in the EGSnrc Monte Carlo environment, and applied to a patient model for a previously treated lung patient to provide dose influence matrices from DOSXYZnrc. An MV + kV IMRT DAO treatment planning system was developed to compare DAO treatment plans with and without MV+kV optimization. The objective function was optimized using simulated annealing. In order to allow for comparisons between different cases of the stochastically optimized plans, the optimization was repeated twenty times. Results: Across twenty optimizations, combined MV+kV IMRT resulted in an average of 12.8% reduction in peak skin dose. Both non-optimized and MV+kV optimized imaging beams delivered, on average, mean dose of approximately 1 cGy per fraction to the target, with peak doses to target of approximately 6 cGy per fraction. Conclusion: When using DAO, MV+kV optimization is shown to result in improvements to plan quality in terms of skin dose, when compared to the case of MV optimization with non-optimized kV imaging. The combination of DAO and MV+kV optimization may allow for real-time imaging without excessive imaging dose. Financial support for the work has been provided in part by NIH Grant T32 EB002103, ACS RSG-13-313-01-CCE, and NIH S10 RR021039 and P30 CA14599 grants. The contents of this submission do not necessarily represent the official views of any of the supporting organizations.

  4. A novel comprehensive learning artificial bee colony optimizer for dynamic optimization biological problems.

    PubMed

    Su, Weixing; Chen, Hanning; Liu, Fang; Lin, Na; Jing, Shikai; Liang, Xiaodan; Liu, Wei

    2017-03-01

    Many dynamic optimization problems exist in the real world, and they place demands on convergence and searching ability that are obviously different from static optimization cases. This requires an optimization algorithm to adaptively seek the changing optima over dynamic environments, instead of only finding the global optimal solution in a static environment. This paper proposes a novel comprehensive learning artificial bee colony optimizer (CLABC) for dynamic optimization problems, which employs a pool of optimal foraging strategies to balance the exploration and exploitation tradeoff. The main motive of CLABC is to enrich artificial bee foraging behaviors in the ABC model by combining Powell's pattern search method, a life-cycle mechanism, and a crossover-based social learning strategy. The proposed CLABC is a more bee-colony-realistic model in which bees can reproduce and die dynamically throughout the foraging process and the population size varies as the algorithm runs. The experiments for evaluating CLABC are conducted on the dynamic moving peak benchmarks. Furthermore, the proposed algorithm is applied to a real-world application of dynamic RFID network optimization. Statistical analysis of all these cases highlights the significant performance improvement due to the beneficial combination and demonstrates the performance superiority of the proposed algorithm.

  5. Energy efficiency analysis and optimization for mobile platforms

    NASA Astrophysics Data System (ADS)

    Metri, Grace Camille

    The introduction of mobile devices changed the landscape of computing. Gradually, these devices are replacing traditional personal computer (PCs) to become the devices of choice for entertainment, connectivity, and productivity. There are currently at least 45.5 million people in the United States who own a mobile device, and that number is expected to increase to 1.5 billion by 2015. Users of mobile devices expect and mandate that their mobile devices have maximized performance while consuming minimal possible power. However, due to the battery size constraints, the amount of energy stored in these devices is limited and is only growing by 5% annually. As a result, we focused in this dissertation on energy efficiency analysis and optimization for mobile platforms. We specifically developed SoftPowerMon, a tool that can power profile Android platforms in order to expose the power consumption behavior of the CPU. We also performed an extensive set of case studies in order to determine energy inefficiencies of mobile applications. Through our case studies, we were able to propose optimization techniques in order to increase the energy efficiency of mobile devices and proposed guidelines for energy-efficient application development. In addition, we developed BatteryExtender, an adaptive user-guided tool for power management of mobile devices. The tool enables users to extend battery life on demand for a specific duration until a particular task is completed. Moreover, we examined the power consumption of System-on-Chips (SoCs) and observed the impact on the energy efficiency in the event of offloading tasks from the CPU to the specialized custom engines. Based on our case studies, we were able to demonstrate that current software-based power profiling techniques for SoCs can have an error rate close to 12%, which needs to be addressed in order to be able to optimize the energy consumption of the SoC. Finally, we summarize our contributions and outline possible direction for future research in this field.

  6. Potential for Integrating Entry Guidance into the Multi-Disciplinary Entry Vehicle Optimization Environment

    NASA Technical Reports Server (NTRS)

    D'souza, Sarah N.; Kinney, David J.; Garcia, Joseph A.; Sarigul-Klijn, Nesrin

    2014-01-01

    The state-of-the-art in vehicle design decouples flight feasible trajectory generation from the optimization process of an entry spacecraft shape. The disadvantage to this decoupled process is seen when a particular aeroshell does not meet in-flight requirements when integrated into Guidance, Navigation, and Control simulations. It is postulated that the integration of a guidance algorithm into the design process will provide a real-time, rapid trajectory generation technique to enhance the robustness of vehicle design solutions. The potential benefit of this integration is a reduction in design cycles (possible cost savings) and increased accuracy in the aerothermal environment (possible mass savings). This work examines two aspects: 1) the performance of a reference tracking guidance algorithm for five different geometries with the same reference trajectory and 2) the potential of mass savings from improved aerothermal predictions. An Apollo Derived Guidance (ADG) algorithm is used in this study. The baseline geometry and five test case geometries were flown using the same baseline trajectory. The guided trajectory results are compared to separate trajectories determined in a vehicle optimization study conducted for NASA's Mars Entry, Descent, and Landing System Analysis. This study revealed several aspects regarding the potential gains and required developments for integrating a guidance algorithm into the vehicle optimization environment. First, the generation of flight feasible trajectories is only as good as the robustness of the guidance algorithm. The set of dispersed geometries modelled aerodynamic dispersions that ranged from +/-1% to +/-17% and a single extreme case was modelled where the aerodynamics were approximately 80% less than the baseline geometry. The ADG, as expected, was able to guide the vehicle into the aeroshell separation box at the target location for dispersions up to 17%, but failed for the 80% dispersion cases. Finally, the results revealed that including flight feasible trajectories for a set of dispersed geometries has the potential to save mass up to 430 kg.

  7. Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians

    NASA Astrophysics Data System (ADS)

    Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan

    2018-02-01

    Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and thus for whom destructive interference is not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as for solving MAX-k -SAT problems, use k -local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n -body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.

  8. Application of Particle Swarm Optimization Algorithm in the Heating System Planning Problem

    PubMed Central

    Ma, Rong-Jiang; Yu, Nan-Yang; Hu, Jun-Yi

    2013-01-01

    Based on the life cycle cost (LCC) approach, this paper presents an integral mathematical model and particle swarm optimization (PSO) algorithm for the heating system planning (HSP) problem. The proposed mathematical model minimizes the cost of heating system as the objective for a given life cycle time. For the particularity of HSP problem, the general particle swarm optimization algorithm was improved. An actual case study was calculated to check its feasibility in practical use. The results show that the improved particle swarm optimization (IPSO) algorithm can more preferably solve the HSP problem than PSO algorithm. Moreover, the results also present the potential to provide useful information when making decisions in the practical planning process. Therefore, it is believed that if this approach is applied correctly and in combination with other elements, it can become a powerful and effective optimization tool for HSP problem. PMID:23935429
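
    For readers unfamiliar with the baseline algorithm the paper improves upon, a bare-bones particle swarm loop on an arbitrary test objective might look like the sketch below; it is not the authors' IPSO, and the inertia and acceleration weights are generic textbook values.

      import random

      # Minimal particle swarm optimization sketch on an arbitrary objective.
      def pso(f, dim=2, n_particles=20, iters=100, lo=-5.0, hi=5.0):
          x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
          v = [[0.0] * dim for _ in range(n_particles)]
          pbest = [xi[:] for xi in x]
          gbest = min(pbest, key=f)
          for _ in range(iters):
              for i in range(n_particles):
                  for d in range(dim):
                      r1, r2 = random.random(), random.random()
                      v[i][d] = (0.7 * v[i][d] + 1.5 * r1 * (pbest[i][d] - x[i][d])
                                 + 1.5 * r2 * (gbest[d] - x[i][d]))
                      x[i][d] += v[i][d]
                  if f(x[i]) < f(pbest[i]):
                      pbest[i] = x[i][:]
              gbest = min(pbest, key=f)
          return gbest

      print(pso(lambda p: sum(c * c for c in p)))  # should approach the origin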

  9. Optimal design of upstream processes in biotransformation technologies.

    PubMed

    Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K

    2017-01-01

    In this work a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of the equipment cost derived from a recent publication by the US National Renewable Energy Laboratory (NREL) are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation that has been implemented in software available from the journal web page or the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at a lab scale on the industrial scale process economics. It is of paramount important to note that this can be achieved at the early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology. Copyright © 2016. Published by Elsevier Ltd.

  10. Customer demand prediction of service-oriented manufacturing using the least square support vector machine optimized by particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Cao, Jin; Jiang, Zhibin; Wang, Kangzhou

    2017-07-01

    Many nonlinear customer satisfaction-related factors significantly influence the future customer demand for service-oriented manufacturing (SOM). To address this issue and enhance the prediction accuracy, this article develops a novel customer demand prediction approach for SOM. The approach combines the phase space reconstruction (PSR) technique with the optimized least square support vector machine (LSSVM). First, the prediction sample space is reconstructed by the PSR to enrich the time-series dynamics of the limited data sample. Then, the generalization and learning ability of the LSSVM are improved by the hybrid polynomial and radial basis function kernel. Finally, the key parameters of the LSSVM are optimized by the particle swarm optimization algorithm. In a real case study, the customer demand prediction of an air conditioner compressor is implemented. Furthermore, the effectiveness and validity of the proposed approach are demonstrated by comparison with other classical prediction approaches.
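
    The phase space reconstruction step is essentially a time-delay embedding of the demand series; a minimal sketch is shown below, with an arbitrary delay and embedding dimension rather than the values used in the article, and a synthetic series standing in for real demand data.

      import numpy as np

      # Time-delay embedding: turn a scalar series into (samples, m) state vectors.
      # The delay tau and embedding dimension m are illustrative choices only.
      def phase_space_reconstruct(series, m=3, tau=2):
          series = np.asarray(series, dtype=float)
          n = len(series) - (m - 1) * tau
          return np.column_stack([series[i * tau: i * tau + n] for i in range(m)])

      demand = np.sin(np.linspace(0, 6 * np.pi, 60))  # stand-in for a demand series
      X = phase_space_reconstruct(demand)
      print(X.shape)  # each row would serve as one input vector to the LSSVM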

  11. Optimized scheme in coal-fired boiler combustion based on information entropy and modified K-prototypes algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Hui; Zhu, Hongxia; Cui, Yanfeng; Si, Fengqi; Xue, Rui; Xi, Han; Zhang, Jiayu

    2018-06-01

    An integrated combustion optimization scheme is proposed that jointly considers coal-fired boiler combustion efficiency and outlet NOx emissions. Continuous attribute discretization and reduction techniques are handled as optimization preparation by the E-Cluster and C_RED methods, in which the segmentation numbers do not need to be provided in advance and can be continuously adapted to the data characteristics. In order to obtain multi-objective results with a clustering method for mixed data, a modified K-prototypes algorithm is then proposed. This algorithm can be divided into two stages: a K-prototypes stage for self-adaptation of the cluster number, and a clustering stage for multi-objective optimization. Field tests were carried out at a 660 MW coal-fired boiler to provide real data as a case study for controllable attribute discretization and reduction in the boiler system and for obtaining optimization parameters under the [max ηb, min yNOx] multi-objective rule.

  12. Isolation strategy of a two-strain avian influenza model using optimal control

    NASA Astrophysics Data System (ADS)

    Mardlijah, Ariani, Tika Desi; Asfihani, Tahiyatul

    2017-08-01

    Avian influenza has killed many victims, both birds and humans. Most cases of avian influenza infection in humans have resulted from transmission from poultry to humans. The spread of avian influenza can be prevented or minimized by pharmaceutical and non-pharmaceutical measures such as the use of masks, isolation, etc. We analyze a two-strain avian influenza model that focuses on treatment of symptoms with isolation, and investigate the stability of the equilibrium points by using the Routh-Hurwitz criteria. We also use optimal control to reduce the number of infected humans by taking the isolation level as the control, and the resulting optimal control is then simulated. The optimal control problem is solved using the Pontryagin Minimum Principle, and the simulations use the Runge-Kutta method. The results show that applying two controls is more effective than applying only one control.

  13. Global Optimization of Emergency Evacuation Assignments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Lee; Yuan, Fang; Chin, Shih-Miao

    2006-01-01

    Conventional emergency evacuation plans often assign evacuees to fixed routes or destinations based mainly on geographic proximity. Such approaches can be inefficient if the roads are congested, blocked, or otherwise dangerous because of the emergency. By not constraining evacuees to prespecified destinations, a one-destination evacuation approach provides flexibility in the optimization process. We present a framework for the simultaneous optimization of evacuation-traffic distribution and assignment. Based on the one-destination evacuation concept, we can obtain the optimal destination and route assignment by solving a one-destination traffic-assignment problem on a modified network representation. In a county-wide, large-scale evacuation case study, the one-destination model yields substantial improvement over the conventional approach, with the overall evacuation time reduced by more than 60 percent. More importantly, emergency planners can easily implement this framework by instructing evacuees to go to destinations that the one-destination optimization process selects.
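
    The one-destination idea can be illustrated by connecting every candidate shelter to a single zero-cost super-sink and routing each origin to that sink; the tiny network, the node names, and the use of the networkx package below are all assumptions for illustration, not the authors' implementation.

      import networkx as nx

      # Sketch: one-destination evacuation assignment via a zero-cost super-sink.
      G = nx.DiGraph()
      edges = [("A", "B", 4), ("A", "C", 2), ("B", "D", 5), ("C", "D", 8),
               ("B", "S1", 3), ("D", "S2", 1)]        # hypothetical road links (travel times)
      G.add_weighted_edges_from(edges)
      for shelter in ("S1", "S2"):                    # candidate destinations
          G.add_edge(shelter, "SUPER_SINK", weight=0) # zero-cost connector

      for origin in ("A", "B"):                       # evacuation origins
          path = nx.shortest_path(G, origin, "SUPER_SINK", weight="weight")
          print(origin, "is assigned shelter", path[-2], "via", path[:-1])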

  14. Optimal spacecraft attitude control using collocation and nonlinear programming

    NASA Astrophysics Data System (ADS)

    Herman, A. L.; Conway, B. A.

    1992-10-01

    Direct collocation with nonlinear programming (DCNLP) is employed to find the optimal open-loop control histories for detumbling a disabled satellite. The controls are torques and forces applied to the docking arm and joint and torques applied about the body axes of the OMV. Solutions are obtained for cases in which various constraints are placed on the controls and in which the number of controls is reduced or increased from that considered in Conway and Widhalm (1986). DCNLP works well when applied to the optimal control problem of satellite attitude control. The formulation is straightforward and produces good results in a relatively small amount of time on a Cray X/MP with no a priori information about the optimal solution. The addition of joint acceleration to the controls significantly reduces the control magnitudes and optimal cost. In all cases, the torques and accelerations are modest and the optimal cost is very modest.

  15. A hybrid multi-objective evolutionary algorithm for wind-turbine blade optimization

    NASA Astrophysics Data System (ADS)

    Sessarego, M.; Dixon, K. R.; Rival, D. E.; Wood, D. H.

    2015-08-01

    A concurrent-hybrid non-dominated sorting genetic algorithm (hybrid NSGA-II) has been developed and applied to the simultaneous optimization of the annual energy production, flapwise root-bending moment and mass of the NREL 5 MW wind-turbine blade. By hybridizing a multi-objective evolutionary algorithm (MOEA) with gradient-based local search, it is believed that the optimal set of blade designs could be achieved in lower computational cost than for a conventional MOEA. To measure the convergence between the hybrid and non-hybrid NSGA-II on a wind-turbine blade optimization problem, a computationally intensive case was performed using the non-hybrid NSGA-II. From this particular case, a three-dimensional surface representing the optimal trade-off between the annual energy production, flapwise root-bending moment and blade mass was achieved. The inclusion of local gradients in the blade optimization, however, shows no improvement in the convergence for this three-objective problem.

  16. Muscle Synergies May Improve Optimization Prediction of Knee Contact Forces During Walking

    PubMed Central

    Walter, Jonathan P.; Kinney, Allison L.; Banks, Scott A.; D'Lima, Darryl D.; Besier, Thor F.; Lloyd, David G.; Fregly, Benjamin J.

    2014-01-01

    The ability to predict patient-specific joint contact and muscle forces accurately could improve the treatment of walking-related disorders. Muscle synergy analysis, which decomposes a large number of muscle electromyographic (EMG) signals into a small number of synergy control signals, could reduce the dimensionality and thus redundancy of the muscle and contact force prediction process. This study investigated whether use of subject-specific synergy controls can improve optimization prediction of knee contact forces during walking. To generate the predictions, we performed mixed dynamic muscle force optimizations (i.e., inverse skeletal dynamics with forward muscle activation and contraction dynamics) using data collected from a subject implanted with a force-measuring knee replacement. Twelve optimization problems (three cases with four subcases each) that minimized the sum of squares of muscle excitations were formulated to investigate how synergy controls affect knee contact force predictions. The three cases were: (1) Calibrate+Match where muscle model parameter values were calibrated and experimental knee contact forces were simultaneously matched, (2) Precalibrate+Predict where experimental knee contact forces were predicted using precalibrated muscle model parameters values from the first case, and (3) Calibrate+Predict where muscle model parameter values were calibrated and experimental knee contact forces were simultaneously predicted, all while matching inverse dynamic loads at the hip, knee, and ankle. The four subcases used either 44 independent controls or five synergy controls with and without EMG shape tracking. For the Calibrate+Match case, all four subcases closely reproduced the measured medial and lateral knee contact forces (R2 ≥ 0.94, root-mean-square (RMS) error < 66 N), indicating sufficient model fidelity for contact force prediction. For the Precalibrate+Predict and Calibrate+Predict cases, synergy controls yielded better contact force predictions (0.61 < R2 < 0.90, 83 N < RMS error < 161 N) than did independent controls (-0.15 < R2 < 0.79, 124 N < RMS error < 343 N) for corresponding subcases. For independent controls, contact force predictions improved when precalibrated model parameter values or EMG shape tracking was used. For synergy controls, contact force predictions were relatively insensitive to how model parameter values were calibrated, while EMG shape tracking made lateral (but not medial) contact force predictions worse. For the subject and optimization cost function analyzed in this study, use of subject-specific synergy controls improved the accuracy of knee contact force predictions, especially for lateral contact force when EMG shape tracking was omitted, and reduced prediction sensitivity to uncertainties in muscle model parameter values. PMID:24402438

  17. Muscle synergies may improve optimization prediction of knee contact forces during walking.

    PubMed

    Walter, Jonathan P; Kinney, Allison L; Banks, Scott A; D'Lima, Darryl D; Besier, Thor F; Lloyd, David G; Fregly, Benjamin J

    2014-02-01

    The ability to predict patient-specific joint contact and muscle forces accurately could improve the treatment of walking-related disorders. Muscle synergy analysis, which decomposes a large number of muscle electromyographic (EMG) signals into a small number of synergy control signals, could reduce the dimensionality and thus redundancy of the muscle and contact force prediction process. This study investigated whether use of subject-specific synergy controls can improve optimization prediction of knee contact forces during walking. To generate the predictions, we performed mixed dynamic muscle force optimizations (i.e., inverse skeletal dynamics with forward muscle activation and contraction dynamics) using data collected from a subject implanted with a force-measuring knee replacement. Twelve optimization problems (three cases with four subcases each) that minimized the sum of squares of muscle excitations were formulated to investigate how synergy controls affect knee contact force predictions. The three cases were: (1) Calibrate+Match where muscle model parameter values were calibrated and experimental knee contact forces were simultaneously matched, (2) Precalibrate+Predict where experimental knee contact forces were predicted using precalibrated muscle model parameters values from the first case, and (3) Calibrate+Predict where muscle model parameter values were calibrated and experimental knee contact forces were simultaneously predicted, all while matching inverse dynamic loads at the hip, knee, and ankle. The four subcases used either 44 independent controls or five synergy controls with and without EMG shape tracking. For the Calibrate+Match case, all four subcases closely reproduced the measured medial and lateral knee contact forces (R2 ≥ 0.94, root-mean-square (RMS) error < 66 N), indicating sufficient model fidelity for contact force prediction. For the Precalibrate+Predict and Calibrate+Predict cases, synergy controls yielded better contact force predictions (0.61 < R2 < 0.90, 83 N < RMS error < 161 N) than did independent controls (-0.15 < R2 < 0.79, 124 N < RMS error < 343 N) for corresponding subcases. For independent controls, contact force predictions improved when precalibrated model parameter values or EMG shape tracking was used. For synergy controls, contact force predictions were relatively insensitive to how model parameter values were calibrated, while EMG shape tracking made lateral (but not medial) contact force predictions worse. For the subject and optimization cost function analyzed in this study, use of subject-specific synergy controls improved the accuracy of knee contact force predictions, especially for lateral contact force when EMG shape tracking was omitted, and reduced prediction sensitivity to uncertainties in muscle model parameter values.

  18. Optimal mix of renewable power generation in the MENA region as a basis for an efficient electricity supply to europe

    NASA Astrophysics Data System (ADS)

    Alhamwi, Alaa; Kleinhans, David; Weitemeyer, Stefan; Vogt, Thomas

    2014-12-01

    Renewable Energy sources are gaining importance in the Middle East and North Africa (MENA) region. The purpose of this study is to quantify the optimal mix of renewable power generation in the MENA region, taking Morocco as a case study. Based on hourly meteorological data and load data, a 100% solar-plus-wind only scenario for Morocco is investigated. For the optimal mix analyses, a mismatch energy modelling approach is adopted with the objective to minimise the required storage capacities. For a hypothetical Moroccan energy supply system which is entirely based on renewable energy sources, our results show that the minimum storage capacity is achieved at a share of 63% solar and 37% wind power generations.
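
    A stripped-down version of the mismatch analysis can be sketched by scanning the solar share and recording the storage needed to buffer the cumulative mismatch; the hourly profiles below are synthetic placeholders, not the Moroccan meteorological and load data, and the storage proxy is a simplification of the paper's modelling.

      import numpy as np

      # For each solar share a, scale generation to the mean load and take the
      # peak-to-trough excursion of the cumulative mismatch as the storage proxy.
      hours = np.arange(24 * 365)
      load = 1.0 + 0.2 * np.sin(2 * np.pi * hours / 24)
      solar = np.clip(np.sin(2 * np.pi * (hours % 24 - 6) / 24), 0, None)
      wind = 0.6 + 0.4 * np.sin(2 * np.pi * hours / (24 * 30))

      def storage_needed(a):
          gen = a * solar / solar.mean() + (1 - a) * wind / wind.mean()
          mismatch = gen * load.mean() - load
          filling = np.cumsum(mismatch)
          return filling.max() - filling.min()

      shares = np.linspace(0, 1, 101)
      best = shares[np.argmin([storage_needed(a) for a in shares])]
      print(f"storage-minimizing solar share ~ {best:.2f}")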

  19. Constraint factor in optimization of truss structures via flower pollination algorithm

    NASA Astrophysics Data System (ADS)

    Bekdaş, Gebrail; Nigdeli, Sinan Melih; Sayin, Baris

    2017-07-01

    The aim of the paper is to investigate the optimum design of truss structures by considering different stress and displacement constraints. For that reason, a flower pollination algorithm based methodology was applied for the sizing optimization of space truss structures. The flower pollination algorithm is a metaheuristic algorithm inspired by the pollination process of flowering plants. By imitating the cross-pollination and self-pollination processes, the random generation of truss member sizes is done in two ways, and these two types of optimization are controlled with a switch probability. In the study, a 72-bar space truss structure was optimized by using five different cases of the constraint limits. According to the results, a linear relationship between the optimum structure weight and the constraint limits was observed.

  20. Investigation on the optimal magnetic field of a cusp electron gun for a W-band gyro-TWA

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; He, Wenlong; Donaldson, Craig R.; Cross, Adrian W.

    2018-05-01

    High efficiency and broadband operation of a gyrotron traveling wave amplifier (gyro-TWA) require a high-quality electron beam with low velocity spreads. The beam velocity spreads are mainly due to the differences in the electric and magnetic fields that the electrons experience within the electron gun. This paper investigates the possibility of decoupling the design of the electron gun geometry and the magnet system while still achieving optimal results, through a case study of designing a cusp electron gun for a W-band gyro-TWA. A global multiple-objective optimization routine was used to optimize the electron gun geometry for different predefined magnetic field profiles individually. The results were compared and the properties of the required magnetic field profile are summarized.

  1. Research on Collection System Optimal Design of Wind Farm with Obstacles

    NASA Astrophysics Data System (ADS)

    Huang, W.; Yan, B. Y.; Tan, R. S.; Liu, L. F.

    2017-05-01

    In the optimal design of the collection system of an offshore wind farm, the factors to consider include not only the reasonable configuration of the cables and switches, but also the influence of obstacles on the topology design of the wind farm. This paper presents a concrete topology optimization algorithm that accounts for obstacles. The minimal-area bounding rectangle of each obstacle is obtained by using the minimal-area encasing box method. An optimization algorithm combining the advantages of the Dijkstra and Prim algorithms is then used to obtain the obstacle-avoiding path planning scheme. Finally, a fuzzy comprehensive evaluation model based on the analytic hierarchy process is constructed to compare the performance of the different topologies. Case studies demonstrate the feasibility of the proposed algorithm and model.

  2. Polarimetric SAR Interferometry to Monitor Land Subsidence in Tehran

    NASA Astrophysics Data System (ADS)

    Sadeghi, Zahra; Valadan Zoej, Mohammad Javad; Muller, Jan-Peter

    2016-08-01

    This letter uses a combination of ADInSAR with a coherence optimization method. Polarimetric DInSAR is able to enhance pixel phase quality and thus coherent pixel density. The coherence optimization method is a search-based approach to find the optimized scattering mechanism, introduced by Navarro-Sanchez [1]. The case study area is the southwest of the Tehran basin, located in the north of Iran. It suffers from a high rate of land subsidence and is covered by agricultural fields. Such an area would usually decorrelate significantly, but by applying polarimetric ADInSAR it is possible to obtain more coherent pixel coverage. A set of dual-pol TerraSAR-X images was ordered for the polarimetric ADInSAR procedure. The coherence optimization method is shown to have increased the density and phase quality of coherent pixels significantly.

  3. Voronoi Diagram Based Optimization of Dynamic Reactive Power Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Weihong; Sun, Kai; Qi, Junjian

    2015-01-01

    Dynamic var sources can effectively mitigate fault-induced delayed voltage recovery (FIDVR) issues or even voltage collapse. This paper proposes a new approach to optimization of the sizes of dynamic var sources at candidate locations by a Voronoi diagram based algorithm. It first disperses sample points of potential solutions in a searching space, evaluates a cost function at each point by barycentric interpolation for the subspaces around the point, and then constructs a Voronoi diagram about cost function values over the entire space. Accordingly, the final optimal solution can be obtained. Case studies on the WSCC 9-bus system and NPCC 140-bus system have validated that the new approach can quickly identify the boundary of feasible solutions in searching space and converge to the global optimal solution.

  4. Multiple crack detection in 3D using a stable XFEM and global optimization

    NASA Astrophysics Data System (ADS)

    Agathos, Konstantinos; Chatzi, Eleni; Bordas, Stéphane P. A.

    2018-02-01

    A numerical scheme is proposed for the detection of multiple cracks in three dimensional (3D) structures. The scheme is based on a variant of the extended finite element method (XFEM) and a hybrid optimizer solution. The proposed XFEM variant is particularly well-suited for the simulation of 3D fracture problems, and as such serves as an efficient solution to the so-called forward problem. A set of heuristic optimization algorithms are recombined into a multiscale optimization scheme. The introduced approach proves effective in tackling the complex inverse problem involved, where identification of multiple flaws is sought on the basis of sparse measurements collected near the structural boundary. The potential of the scheme is demonstrated through a set of numerical case studies of varying complexity.

  5. Optimizing a Workplace Learning Pattern: A Case Study from Aviation

    ERIC Educational Resources Information Center

    Mavin, Timothy John; Roth, Wolff-Michael

    2015-01-01

    Purpose: This study aims to contribute to current research on team learning patterns. It specifically addresses some negative perceptions of the job performance learning pattern. Design/methodology/approach: Over a period of three years, qualitative and quantitative data were gathered on pilot learning in the workplace. The instructional modes…

  6. A Hypothesis-Driven Approach to Site Investigation

    NASA Astrophysics Data System (ADS)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task versus the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test. Otherwise, false confidence in the resulting answer would be implied. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as the new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. The basic principle is discussed and illustrated on the case of a hypothetical contaminant spill and the exceedance of critical contaminant levels at a downstream location. A tempting and important side question is whether site investigation could be tweaked towards a yes or no answer in maliciously biased campaigns by unfair formulation of the optimization objective.

  7. Optimizing adaptive design for Phase 2 dose-finding trials incorporating long-term success and financial considerations: A case study for neuropathic pain.

    PubMed

    Gao, Jingjing; Nangia, Narinder; Jia, Jia; Bolognese, James; Bhattacharyya, Jaydeep; Patel, Nitin

    2017-06-01

    In this paper, we propose an adaptive randomization design for Phase 2 dose-finding trials to optimize Net Present Value (NPV) for an experimental drug. We replace the traditional fixed sample size design (Patel, et al., 2012) by this new design to see if NPV from the original paper can be improved. Comparison of the proposed design to the previous design is made via simulations using a hypothetical example based on a Diabetic Neuropathic Pain Study. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Forecasting peaks of seasonal influenza epidemics.

    PubMed

    Nsoesie, Elaine; Mararthe, Madhav; Brownstein, John

    2013-06-21

    We present a framework for near real-time forecast of influenza epidemics using a simulation optimization approach. The method combines an individual-based model and a simple root finding optimization method for parameter estimation and forecasting. In this study, retrospective forecasts were generated for seasonal influenza epidemics using web-based estimates of influenza activity from Google Flu Trends for 2004-2005, 2007-2008 and 2012-2013 flu seasons. In some cases, the peak could be forecasted 5-6 weeks ahead. This study adds to existing resources for influenza forecasting and the proposed method can be used in conjunction with other approaches in an ensemble framework.

  9. SEWER AND TANK SEDIMENT FLUSHING: CASE STUDIES

    EPA Science Inventory

    The objective of the report summarized here is to demonstrate that sewer system and storage tank flushing that reduces sediment deposition and accumulation is of prime importance to optimizing performance, maintaining structural integrity, and minimizing pollution of receiving wa...

  10. Methodology for designing and manufacturing complex biologically inspired soft robotic fluidic actuators: prosthetic hand case study.

    PubMed

    Thompson-Bean, E; Das, R; McDaid, A

    2016-10-31

    We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied for other fluidic soft robotic devices.

  11. Interplanetary program to optimize simulated trajectories (IPOST). Volume 4: Sample cases

    NASA Technical Reports Server (NTRS)

    Hong, P. E.; Kent, P. D; Olson, D. W.; Vallado, C. A.

    1992-01-01

    The Interplanetary Program to Optimize Simulated Trajectories (IPOST) is intended to support many analysis phases, from early interplanetary feasibility studies through spacecraft development and operations. The IPOST output provides information for sizing and understanding mission impacts related to propulsion, guidance, communications, sensor/actuators, payload, and other dynamic and geometric environments. IPOST models three degree of freedom trajectory events, such as launch/ascent, orbital coast, propulsive maneuvering (impulsive and finite burn), gravity assist, and atmospheric entry. Trajectory propagation is performed using a choice of Cowell, Encke, Multiconic, Onestep, or Conic methods. The user identifies a desired sequence of trajectory events, and selects which parameters are independent (controls) and dependent (targets), as well as other constraints and the cost function. Targeting and optimization are performed using the Standard NPSOL algorithm. The IPOST structure allows sub-problems within a master optimization problem to aid in the general constrained parameter optimization solution. An alternate optimization method uses implicit simulation and collocation techniques.

  12. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations for an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, and HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust and reliable optimization method which found optimal designs in on the order of 100 objective function evaluations. Perhaps most importantly, from this academic/industrial project, technology has been successfully transferred; this method is the method of choice for optimization problems at Northrop Grumman.
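
    A generic simulated annealing loop of the kind adapted in this work, shown here on an arbitrary one-dimensional objective with invented cooling parameters rather than any of the aircraft design problems, is sketched below.

      import math
      import random

      # Minimal simulated annealing sketch on a rough 1-D objective with local minima.
      def objective(x):
          return x * x + 10.0 * math.sin(3.0 * x)

      def simulated_annealing(x0=4.0, temp=10.0, cooling=0.95, iters=2000):
          x, fx = x0, objective(x0)
          best_x, best_f = x, fx
          for _ in range(iters):
              cand = x + random.gauss(0.0, 0.5)        # random neighbour move
              fc = objective(cand)
              if fc < fx or random.random() < math.exp(-(fc - fx) / temp):
                  x, fx = cand, fc                     # accept downhill, or uphill with some probability
              if fx < best_f:
                  best_x, best_f = x, fx
              temp = max(temp * cooling, 1e-6)         # geometric cooling schedule
          return best_x, best_f

      print(simulated_annealing())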

  13. Optimal operation management of fuel cell/wind/photovoltaic power sources connected to distribution networks

    NASA Astrophysics Data System (ADS)

    Niknam, Taher; Kavousifard, Abdollah; Tabatabaei, Sajad; Aghaei, Jamshid

    2011-10-01

    In this paper a new multiobjective modified honey bee mating optimization (MHBMO) algorithm is presented to investigate the distribution feeder reconfiguration (DFR) problem considering renewable energy sources (RESs) (photovoltaics, fuel cell and wind energy) connected to the distribution network. The objective functions of the problem to be minimized are the electrical active power losses, the voltage deviations, the total electrical energy costs and the total emissions of RESs and substations. During the optimization process, the proposed algorithm finds a set of non-dominated (Pareto) optimal solutions which are stored in an external memory called repository. Since the objective functions investigated are not the same, a fuzzy clustering algorithm is utilized to handle the size of the repository in the specified limits. Moreover, a fuzzy-based decision maker is adopted to select the 'best' compromised solution among the non-dominated optimal solutions of multiobjective optimization problem. In order to see the feasibility and effectiveness of the proposed algorithm, two standard distribution test systems are used as case studies.

  14. Optimization of the volume reconstruction for classical Tomo-PIV algorithms (MART, BIMART and SMART): synthetic and experimental studies

    NASA Astrophysics Data System (ADS)

    Thomas, L.; Tremblais, B.; David, L.

    2014-03-01

    Optimization of multiplicative algebraic reconstruction technique (MART), simultaneous MART and block iterative MART reconstruction techniques was carried out on synthetic and experimental data. Different criteria were defined to improve the preprocessing of the initial images. Knowledge of how each reconstruction parameter influences the quality of particle volume reconstruction and computing time is the key in Tomo-PIV. These criteria were applied to a real case, a jet in cross flow, and were validated.
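
    The multiplicative update shared by these MART variants raises a pixel correction ratio to a relaxation power; the toy sketch below uses a random weighting matrix and synthetic projections instead of a real camera model, so all sizes and the relaxation factor are illustrative assumptions.

      import numpy as np

      # Toy MART iteration: E holds voxel intensities, W maps voxels to pixel
      # projections, p holds the recorded pixel intensities, mu is a relaxation factor.
      rng = np.random.default_rng(0)
      W = rng.random((20, 50))             # stand-in weighting matrix (pixels x voxels)
      p = W @ rng.random(50)               # synthetic "recorded" projections

      E = np.ones(50)                      # uniform initial guess
      mu = 0.5
      for _ in range(30):                  # MART sweeps
          for i in range(W.shape[0]):
              proj = W[i] @ E
              if proj > 0:
                  E *= (p[i] / proj) ** (mu * W[i])
      print(float(np.abs(W @ E - p).mean()))  # residual shrinks over the sweeps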

  15. Harmony search method: theory and applications.

    PubMed

    Gao, X Z; Govindasamy, V; Xu, H; Wang, X; Zenger, K

    2015-01-01

    The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As an example of case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem.
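
    As context for the modified HS discussed above, a bare-bones harmony search on a generic continuous objective is sketched below; the memory size, pitch adjustment width and other parameters are common textbook defaults, not the settings of the paper.

      import random

      # Bare-bones harmony search: keep a memory of good solutions, build new ones by
      # recalling remembered values (with pitch adjustment) or sampling at random.
      def harmony_search(f, dim=2, lo=-5.0, hi=5.0, hms=10, hmcr=0.9, par=0.3, iters=2000):
          memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
          for _ in range(iters):
              new = []
              for d in range(dim):
                  if random.random() < hmcr:              # memory consideration
                      val = random.choice(memory)[d]
                      if random.random() < par:           # pitch adjustment
                          val += random.uniform(-0.1, 0.1)
                  else:                                   # random consideration
                      val = random.uniform(lo, hi)
                  new.append(min(max(val, lo), hi))
              worst = max(memory, key=f)
              if f(new) < f(worst):
                  memory[memory.index(worst)] = new       # replace the worst harmony
          return min(memory, key=f)

      print(harmony_search(lambda x: sum(v * v for v in x)))  # should approach the origin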

  16. Virtual Factory Framework for Supporting Production Planning and Control.

    PubMed

    Kibira, Deogratias; Shao, Guodong

    2017-01-01

    Developing optimal production plans for smart manufacturing systems is challenging because shop floor events change dynamically. A virtual factory incorporating engineering tools, simulation, and optimization generates and communicates performance data to guide wise decision making for different control levels. This paper describes such a platform specifically for production planning. We also discuss verification and validation of the constituent models. A case study of a machine shop is used to demonstrate data generation for production planning in a virtual factory.

  17. Analytic solution to variance optimization with no short positions

    NASA Astrophysics Data System (ADS)

    Kondor, Imre; Papp, Gábor; Caccioli, Fabio

    2017-12-01

    We consider the variance portfolio optimization problem with a ban on short selling. We provide an analytical solution by means of the replica method for the case of a portfolio of independent, but not identically distributed, assets. We study the behavior of the solution as a function of the ratio r between the number N of assets and the length T of the time series of returns used to estimate risk. The no-short-selling constraint acts as an asymmetric \

  18. Wakefield acceleration in planetary atmospheres: A possible source of MeV electrons. The collisionless case

    NASA Astrophysics Data System (ADS)

    Arrayás, M.; Cubero, D.; Montanya, J.; Seviour, R.; Trueba, J. L.

    2018-07-01

    Intense electromagnetic pulses interacting with a plasma can create a wake of plasma oscillations. Electrons trapped in such oscillations can be accelerated under certain conditions to very high energies. We study the optimal conditions for the wakefield acceleration to produce MeV electrons in planetary plasmas under collisionless conditions. The conditions for the optimal plasma densities can be found in the Earth atmosphere at higher altitudes than 10-15 km, which are the altitudes where lightning leaders can take place.

  19. Evaluating the effects of real power losses in optimal power flow based storage integration

    DOE PAGES

    Castillo, Anya; Gayme, Dennice

    2017-03-27

    This study proposes a DC optimal power flow (DCOPF) with losses formulation (the ℓ-DCOPF+S problem) and uses it to investigate the role of real power losses in OPF based grid-scale storage integration. We derive the ℓ-DCOPF+S problem by augmenting a standard DCOPF with storage (DCOPF+S) problem to include quadratic real power loss approximations. This procedure leads to a multi-period nonconvex quadratically constrained quadratic program, which we prove can be solved to optimality using either a semidefinite or second order cone relaxation. Our approach has some important benefits over existing models. It is more computationally tractable than ACOPF with storage (ACOPF+S) formulations and the provably exact convex relaxations guarantee that an optimal solution can be attained for a feasible problem. Adding loss approximations to a DCOPF+S model leads to a more accurate representation of locational marginal prices, which have been shown to be critical to determining optimal storage dispatch and siting in prior ACOPF+S based studies. Case studies demonstrate the improved accuracy of the ℓ-DCOPF+S model over a DCOPF+S model and the computational advantages over an ACOPF+S formulation.

  20. Improving multi-objective reservoir operation optimization with sensitivity-informed problem decomposition

    NASA Astrophysics Data System (ADS)

    Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.

    2015-04-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.
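
    The screening step can be approximated with an off-the-shelf Sobol analysis; the sketch below assumes the SALib package (its saltelli sampler and sobol analyzer), a toy stand-in for the reservoir model, and an arbitrary screening threshold, so none of the names or numbers come from the study.

      import numpy as np
      from SALib.sample import saltelli   # assumes SALib is installed
      from SALib.analyze import sobol

      # Toy stand-in for the reservoir model: only the first two decisions matter much.
      def toy_reservoir_objective(x):
          return 3.0 * x[0] + 2.0 * x[1] ** 2 + 0.01 * np.sum(x[2:])

      problem = {
          "num_vars": 6,
          "names": [f"release_{i}" for i in range(6)],   # hypothetical decision variables
          "bounds": [[0.0, 1.0]] * 6,
      }
      X = saltelli.sample(problem, 512)
      Y = np.array([toy_reservoir_objective(x) for x in X])
      Si = sobol.analyze(problem, Y)

      # Keep only decisions whose total-order index exceeds the screening threshold.
      kept = [n for n, st in zip(problem["names"], Si["ST"]) if st > 0.05]
      print("sensitive decision variables:", kept)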

  1. Oxidation of alkylarenes to the corresponding acids using aqueous potassium permanganate by hydrodynamic cavitation.

    PubMed

    Ambulgekar, G V; Samant, S D; Pandit, A B

    2004-05-01

    Oxidation of toluene using aqueous potassium permanganate was studied under heterogeneous conditions in the presence of hydrodynamic cavitation and compared with the results of the reaction under acoustic cavitation. Various parameters, such as the quantity of potassium permanganate, the toluene to aqueous phase ratio, the reaction time and cavitation parameters such as the orifice plate and pump discharge pressure, were optimized. The reaction was found to be considerably accelerated at ambient temperature in the presence of cavitation. On comparison, it was found that when 1 kJ of energy was passed to the reaction mixture in the case of acoustic cavitation, the product obtained was 4.63 × 10^-6 mol, whereas when 1 kJ of energy was passed to the reaction mixture in the case of hydrodynamic cavitation the product obtained was 2.70 × 10^-5 mol. Hence, about six times more product would be obtained in the case of hydrodynamic cavitation than in the case of acoustic cavitation at the same energy dissipation. It has been observed that further optimization is possible.

  2. Optimal design of studies of influenza transmission in households. II: comparison between cohort and case-ascertained studies.

    PubMed

    Klick, B; Nishiura, H; Leung, G M; Cowling, B J

    2014-04-01

    Both case-ascertained household studies, in which households are recruited after an 'index case' is identified, and household cohort studies, where a household is enrolled before the start of the epidemic, may be used to test and estimate the protective effect of interventions used to prevent influenza transmission. A simulation approach parameterized with empirical data from household studies was used to evaluate and compare the statistical power of four study designs: a cohort study with routine virological testing of household contacts of infected index case, a cohort study where only household contacts with acute respiratory illness (ARI) are sampled for virological testing, a case-ascertained study with routine virological testing of household contacts, and a case-ascertained study where only household contacts with ARI are sampled for virological testing. We found that a case-ascertained study with ARI-triggered testing would be the most powerful design while a cohort design only testing household contacts with ARI was the least powerful. Sensitivity analysis demonstrated that these conclusions varied by model parameters including the serial interval and the risk of influenza virus infection from outside the household.

  3. Multi-parameter optimization of piezoelectric actuators for multi-mode active vibration control of cylindrical shells

    NASA Astrophysics Data System (ADS)

    Hu, K. M.; Li, Hua

    2018-07-01

    A novel technique for the multi-parameter optimization of distributed piezoelectric actuators is presented in this paper. The proposed method is designed to improve the performance of multi-mode vibration control in cylindrical shells. The optimization parameters of actuator patch configuration include position, size, and tilt angle. The modal control force of tilted orthotropic piezoelectric actuators is derived and the multi-parameter cylindrical shell optimization model is established. The linear quadratic energy index is employed as the optimization criterion. A geometric constraint is proposed to prevent overlap between tilted actuators, which is plugged into a genetic algorithm to search the optimal configuration parameters. A simply-supported closed cylindrical shell with two actuators serves as a case study. The vibration control efficiencies of various parameter sets are evaluated via frequency response and transient response simulations. The results show that the linear quadratic energy indexes of position and size optimization decreased by 14.0% compared to position optimization; those of position and tilt angle optimization decreased by 16.8%; and those of position, size, and tilt angle optimization decreased by 25.9%. It indicates that, adding configuration optimization parameters is an efficient approach to improving the vibration control performance of piezoelectric actuators on shells.

  4. A game theory model for stabilizing price of chili: A case study

    NASA Astrophysics Data System (ADS)

    Wardayanti, Ari; Aviv, Afgan Suffan; Sutopo, Wahyudi; Hisjam, Muh.

    2017-11-01

    Chili is one of the important agricultural commodities in Indonesia because it is widely consumed by Indonesians. Chili is one of the commodities that experience price fluctuations and is an important cause of yearly inflation in Indonesia. The unstable price of chili is driven by the scarcity of the commodity in some months and by differences in the harvest season. This study proposes a model to address the problem by considering the substitution of fresh chilies with dried chili. We propose that chili farmers' cooperatives act as the entities that process fresh chili into dried chili. The existence of a substitution product is expected to maintain the price stability of chili. This research was conducted as a case study of the chili commodity markets in Surakarta, which consist of 19 traditional markets. This study aims to create a price stabilization scheme with product substitution using a game theory model. Four strategies are proposed in the game theory model to describe the relationship between producers and consumers; in this case, the producers are the farmers and the consumers are the trade market. A mixed strategy was chosen to determine the optimal value among the four strategies. The calculation yields an optimal value of IDR 201,188,829,000 under the mixed strategy.
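
    A mixed-strategy solution of the kind mentioned above is, in general, obtained by solving a linear program over the payoff matrix of a two-player zero-sum game. The sketch below shows that standard machinery on a hypothetical 4x4 payoff matrix; it is not the study's data or its exact formulation.

```python
# Illustrative sketch: row player's optimal mixed strategy of a two-player
# zero-sum game via linear programming. The payoff matrix is hypothetical.
import numpy as np
from scipy.optimize import linprog

A = np.array([[4.0, 2.0, 1.0, 3.0],
              [3.0, 5.0, 2.0, 1.0],
              [2.0, 3.0, 4.0, 2.0],
              [1.0, 2.0, 3.0, 5.0]])       # hypothetical producer payoffs

n_rows, n_cols = A.shape
# Decision variables: x_1..x_n (mixed strategy) and v (game value); minimize -v.
c = np.concatenate([np.zeros(n_rows), [-1.0]])
# For every consumer strategy j: v - sum_i A[i, j] * x_i <= 0
A_ub = np.hstack([-A.T, np.ones((n_cols, 1))])
b_ub = np.zeros(n_cols)
A_eq = np.concatenate([np.ones(n_rows), [0.0]]).reshape(1, -1)  # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * n_rows + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:n_rows], res.x[-1]
print("optimal mixed strategy:", np.round(x, 3), "game value:", round(v, 3))
```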

  5. Experimental and simulation studies of multivariable adaptive optimization of continuous bioreactors using bilevel forgetting factors.

    PubMed

    Chang, Y K; Lim, H C

    1989-08-20

    A multivariable on-line adaptive optimization algorithm using a bilevel forgetting factor method was developed and applied to a continuous baker's yeast culture in simulation and experimental studies to maximize the cellular productivity by manipulating the dilution rate and the temperature. The algorithm showed good optimization speed, adaptability, and reoptimization capability, and was able to maintain the process stably around the optimum point for an extended period of time. Two cases were investigated: an unconstrained and a constrained optimization. In the constrained optimization, the ethanol concentration was used as an index for the baking quality of the yeast cells, and an equality constraint with a quadratic penalty was imposed on the ethanol concentration to keep its level close to a hypothetical "optimum" value. The developed algorithm was applied experimentally to a baker's yeast culture to demonstrate its validity; only the unconstrained optimization was carried out experimentally. A set of tuning parameter values was suggested after evaluating the results from several experimental runs. With those tuning parameter values the optimization took 50-90 h. At the attained steady state the dilution rate was 0.310 h^(-1), the temperature 32.8 degrees C, and the cellular productivity 1.50 g/L/h.
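
    The building block behind forgetting-factor-based adaptive optimization is recursive least squares (RLS) with exponential discounting of old data. The sketch below shows a generic single-forgetting-factor RLS update on a toy drifting system; the paper's bilevel scheme, which adapts two such factors, is not reproduced here.

```python
# Generic recursive least squares with a forgetting factor (illustrative only).
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One RLS step: theta = parameter estimate, P = covariance,
    phi = regressor vector, y = new measurement, lam = forgetting factor."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + phi.T @ P @ phi)          # gain vector
    theta = theta + (K * (y - phi.T @ theta)).ravel()
    P = (P - K @ phi.T @ P) / lam                  # discount old information
    return theta, P

# Toy usage: track a slowly drifting linear map y = a*u1 + b*u2.
rng = np.random.default_rng(0)
theta = np.zeros(2)
P = np.eye(2) * 100.0
for t in range(200):
    a, b = 1.0 + 0.005 * t, -0.5                   # drifting "true" parameters
    u = rng.uniform(-1, 1, size=2)
    y = a * u[0] + b * u[1] + 0.01 * rng.standard_normal()
    theta, P = rls_update(theta, P, u, y)
print("estimated parameters:", np.round(theta, 3))
```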

  6. On Consistency Test Method of Expert Opinion in Ecological Security Assessment

    PubMed Central

    Wang, Lihong

    2017-01-01

    Ecological safety assessment is of great value for supporting the proactive design of human security management and safety early warning. In the comprehensive evaluation of regional ecological security with the participation of experts, the individual judgment level and ability of each expert, and the consistency of the experts' overall opinion, have a very important influence on the evaluation result. This paper studies consistency and consensus measures based on the multiplicative and additive consistency properties of the fuzzy preference relation (FPR). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure computed from the distance between the original individual judgment and the optimal individual estimation, along with a consensus measure computed from the distance between the original collective judgment and the optimal collective estimation. Finally, we present a case study on ecological security for five cities. The results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgments and the consensus degree of the collective judgment. PMID:28869570
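
    The distance-based consistency idea can be illustrated with a small numerical sketch: compare an expert's original fuzzy preference relation with a consistent estimate of it and report one minus their mean absolute deviation. Both matrices below are hypothetical, and the optimization that produces the consistent estimate is not reproduced.

```python
# Minimal sketch of a distance-based consistency measure between an original
# fuzzy preference relation (FPR) and an assumed consistent estimate of it.
import numpy as np

def fpr_distance(R, R_star):
    """Mean absolute deviation between two n x n fuzzy preference relations."""
    n = R.shape[0]
    return np.abs(R - R_star).sum() / (n * n)

R_original = np.array([[0.5, 0.7, 0.9],
                       [0.3, 0.5, 0.6],
                       [0.1, 0.4, 0.5]])    # expert's raw judgments (hypothetical)
R_consistent = np.array([[0.5, 0.7, 0.8],
                         [0.3, 0.5, 0.6],
                         [0.2, 0.4, 0.5]])  # a consistent estimate (assumed)

consistency_degree = 1 - fpr_distance(R_original, R_consistent)
print(f"consistency degree: {consistency_degree:.3f}")  # closer to 1 = more consistent
```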

  7. On Consistency Test Method of Expert Opinion in Ecological Security Assessment.

    PubMed

    Gong, Zaiwu; Wang, Lihong

    2017-09-04

    Ecological safety assessment is of great value for supporting the proactive design of human security management and safety early warning. In the comprehensive evaluation of regional ecological security with the participation of experts, the individual judgment level and ability of each expert, and the consistency of the experts' overall opinion, have a very important influence on the evaluation result. This paper studies consistency and consensus measures based on the multiplicative and additive consistency properties of the fuzzy preference relation (FPR). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure computed from the distance between the original individual judgment and the optimal individual estimation, along with a consensus measure computed from the distance between the original collective judgment and the optimal collective estimation. Finally, we present a case study on ecological security for five cities. The results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgments and the consensus degree of the collective judgment.

  8. A study of viscous interaction effects on hypersonic waveriders. Ph.D. Thesis, Dec. 1991

    NASA Technical Reports Server (NTRS)

    Chang, Jinhwa

    1992-01-01

    The effects of viscous interaction on the analysis and design of improved classes of viscous optimized hypersonic waveriders are examined. The Corda computer program is used to generate viscous optimized hypersonic waveriders from conical flow fields without viscous interaction. Each waverider is optimized for maximum L/D, and comparison studies are made between cases with and without viscous interaction. The results show that the aerodynamic performance of the viscous-interaction waveriders is reduced, due mainly to a large increase in skin-friction drag associated with viscous interaction phenomena that grow with increasing Mach number and altitude, but some of this performance loss can be recouped by including viscous interactions within the optimization procedure. When the waverider is optimized for viscous interaction, the shape can change dramatically. A significant result of the present work is the delineation, on a velocity-altitude map, of the region where viscous interaction effects are significant for modern hypersonic waveriders, obtained by performing parametric runs to produce L/D, C_L, and C_D contour plots for Mach numbers from 6 to 30 at altitudes from 30 to 80 km.

  9. Improving healthcare services using web based platform for management of medical case studies.

    PubMed

    Ogescu, Cristina; Plaisanu, Claudiu; Udrescu, Florian; Dumitru, Silviu

    2008-01-01

    The paper presents a web based platform for the management of medical cases, a support tool for healthcare specialists in making the best clinical decisions. Research has been oriented mostly towards multimedia data management and classification algorithms for querying, retrieving and processing different medical data types (text and images). The medical case studies can be accessed by healthcare specialists, and by students as anonymous case studies, providing trust and confidentiality in the Internet virtual environment. The MIDAS platform develops an intelligent framework to manage sets of medical data (text, static or dynamic images) in order to optimize the diagnosis and decision process, which will reduce medical errors and increase the quality of the medical act. MIDAS is an integrated project working on medical information retrieval from heterogeneous, distributed medical multimedia databases.

  10. Optimal stocking of species by diameter class for even-aged mid-to-late rotation Appalachian hardwoods

    Treesearch

    Joseph B. Roise; Joosang Chung; Chris B. LeDoux

    1988-01-01

    Nonlinear programming (NP) is applied to the problem of finding optimal thinning and harvest regimes simultaneously with species mix and diameter class distribution. Optimal results for given cases are reported. Results of the NP optimization are compared with prescriptions developed by Appalachian hardwood silviculturists.

  11. Managing simulation-based training: A framework for optimizing learning, cost, and time

    NASA Astrophysics Data System (ADS)

    Richmond, Noah Joseph

    This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints on cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are next provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.

  12. A method to incorporate leakage and head scatter corrections into a tomotherapy inverse treatment planning algorithm

    NASA Astrophysics Data System (ADS)

    Holmes, Timothy W.

    2001-01-01

    A detailed tomotherapy inverse treatment planning method is described which incorporates leakage and head scatter corrections during each iteration of the optimization process, allowing these effects to be directly accounted for in the optimized dose distribution. It is shown that the conventional inverse planning method for optimizing incident intensity can be extended to include a 'concurrent' leaf sequencing operation from which the leakage and head scatter corrections are determined. The method is demonstrated using the steepest-descent optimization technique with constant step size and a least-squared error objective. The method was implemented using the MATLAB scientific programming environment and its feasibility demonstrated for 2D test cases simulating treatment delivery using a single coplanar rotation. The results indicate that this modification does not significantly affect convergence of the intensity optimization method when exposure times of individual leaves are stratified to a large number of levels (>100) during leaf sequencing. In general, the addition of aperture dependent corrections, especially 'head scatter', reduces incident fluence in local regions of the modulated fan beam, resulting in increased exposure times for individual collimator leaves. These local variations can result in 5% or greater local variation in the optimized dose distribution compared to the uncorrected case. The overall efficiency of the modified intensity optimization algorithm is comparable to that of the original unmodified case.
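
    The intensity optimization step described above (steepest descent with a constant step size on a least-squared error objective) can be sketched generically as a projected gradient iteration on nonnegative beamlet weights. The dose matrix and prescription below are random stand-ins, not a tomotherapy model, and the leakage and head scatter corrections are omitted.

```python
# Schematic projected gradient descent on 0.5*||D w - d||^2 with w >= 0.
# D (dose-deposition matrix) and d (target dose) are random placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_beamlets = 200, 40
D = rng.random((n_voxels, n_beamlets))           # dose-deposition matrix (stand-in)
d = rng.random(n_voxels) * D.sum(axis=1).mean()  # target dose (stand-in)

w = np.zeros(n_beamlets)                         # beamlet exposure times / weights
step = 1.0 / np.linalg.norm(D, 2) ** 2           # constant step, stable for least squares
for it in range(500):
    grad = D.T @ (D @ w - d)                     # gradient of the quadratic objective
    w = np.maximum(w - step * grad, 0.0)         # project onto w >= 0
print("objective:", round(0.5 * np.linalg.norm(D @ w - d) ** 2, 2))
```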

  13. The effect of first chromosome long arm duplication on survival of endometrial carcinoma.

    PubMed

    Sever, Erman; Doğer, Emek; Çakıroğlu, Yiğit; Sünnetçi, Deniz; Çine, Naci; Savlı, Hakan; Yücesoy, İzzet

    2014-12-01

    The aim of this study is to investigate the effect of first chromosome long arm duplication (dup(1q)), detected with array-based comparative genomic hybridization (aCGH), on survival in cases of endometrial carcinoma. A total of 53 patients diagnosed with endometrial carcinoma by endometrial biopsy and operated on for this reason were enrolled in the study. Frozen section biopsy and staging surgery were performed for all cases. Samples obtained from the tumoral mass were investigated for chromosomal aberrations with the aCGH method. Kaplan-Meier and Cox regression analyses were performed for survival analysis. Among the 53 cases of endometrial carcinoma, dup(1q) was diagnosed in 14 (26.4%) of the cases. For the patient group followed up for 24 months (3-33 months), dup(1q) (p=.01), optimal cytoreduction (p<.001), lymph node positivity (p=.006), tumor stage >1 (p=.006) and the presence of a high-risk tumor were the factors associated with survival. Cox regression analysis revealed that optimal cytoreduction was the most important prognostic factor (p=.02). The presence of 1q duplication can be used as a prognostic factor in the preoperative period.

  14. The effect of first chromosome long arm duplication on survival of endometrial carcinoma

    PubMed Central

    Sever, Erman; Doğer, Emek; Çakıroğlu, Yiğit; Sünnetçi, Deniz; Çine, Naci; Savlı, Hakan; Yücesoy, İzzet

    2014-01-01

    Objective: The aim of this study is to investigate the effect of first chromosome long arm duplication (dup(1q)), detected with array-based comparative genomic hybridization (aCGH), on survival in cases of endometrial carcinoma. Materials and Methods: A total of 53 patients diagnosed with endometrial carcinoma by endometrial biopsy and operated on for this reason were enrolled in the study. Frozen section biopsy and staging surgery were performed for all cases. Samples obtained from the tumoral mass were investigated for chromosomal aberrations with the aCGH method. Kaplan-Meier and Cox regression analyses were performed for survival analysis. Results: Among the 53 cases of endometrial carcinoma, dup(1q) was diagnosed in 14 (26.4%) of the cases. For the patient group followed up for 24 months (3-33 months), dup(1q) (p=.01), optimal cytoreduction (p<.001), lymph node positivity (p=.006), tumor stage >1 (p=.006) and the presence of a high-risk tumor were the factors associated with survival. Cox regression analysis revealed that optimal cytoreduction was the most important prognostic factor (p=.02). Conclusion: The presence of 1q duplication can be used as a prognostic factor in the preoperative period. PMID:28913021

  15. Multiplex RT-PCR and indirect immunofluorescence assays for detection and subtyping of human influenza virus in Tunisia.

    PubMed

    Ben M'hadheb, Manel; Harrabi, Myriam; Souii, Amira; Jrad-Battikh, Nadia; Gharbi, Jawhar

    2015-03-01

    Influenza viruses are negative-stranded, segmented RNA viruses belonging to the Orthomyxoviridae family. They are classified into three types: A, B, and C. Type A influenza viruses are further classified into subtypes according to the antigenic characteristics of the surface glycoproteins hemagglutinin (H) and neuraminidase (N). The aim of the present study is to develop a fast and reliable multiplex RT-PCR technique for simultaneously detecting the A/H1N1 and A/H3N2 subtypes of influenza virus. Our study included 398 patients (mean age 30.33 ± 19.92 years) with flu or flu-like syndromes, consulting physicians affiliated with collaborating teams. A multiplex RT-PCR detecting A/H1N1 and A/H3N2 influenza viruses and an examination by indirect immunofluorescence (IFI) were performed. Under the optimized conditions, IFI diagnosed a viral infection in 90 patients (22.6%): 85 cases of influenza type A, four cases of influenza type B, and only one case of coinfection with types A and B. An evaluation of the technique was performed on 19 clinical specimens positive by IFI, and we detected eight cases of A/H3N2, five cases of A/H1N1, one case of influenza virus type A that was neither H1N1 nor H3N2, and five negative cases. Multiplex RT-PCR is a sensitive technique allowing effective and fast diagnosis of respiratory infections caused by influenza viruses, for which optimization often runs into problems of sensitivity.

  16. Optimal Policy of Cross-Layer Design for Channel Access and Transmission Rate Adaptation in Cognitive Radio Networks

    NASA Astrophysics Data System (ADS)

    He, Hao; Wang, Jun; Zhu, Jiang; Li, Shaoqian

    2010-12-01

    In this paper, we investigate the cross-layer design of joint channel access and transmission rate adaptation in CR networks with multiple channels for both centralized and decentralized cases. Our target is to maximize the throughput of the CR network under a transmission power constraint while taking spectrum sensing errors into account. In the centralized case, this problem is formulated as a special constrained Markov decision process (CMDP), which can be solved by a standard linear programming (LP) method. As the complexity of finding the optimal policy by LP increases exponentially with the sizes of the action and state spaces, we further apply action set reduction and state aggregation to reduce the complexity without loss of optimality. Meanwhile, for convenience of implementation, we also consider pure policy design and analyze the corresponding characteristics. In the decentralized case, where only local information is available and there is no coordination among the CR users, we prove the existence of the constrained Nash equilibrium and obtain the optimal decentralized policy. Finally, in the case where the traffic load parameters of the licensed users are unknown to the CR users, we propose two methods to estimate the parameters for two different cases. Numerical results validate the theoretical analysis.

  17. Optimizing Natural Gas Networks through Dynamic Manifold Theory and a Decentralized Algorithm: Belgium Case Study

    NASA Astrophysics Data System (ADS)

    Koch, Caleb; Winfrey, Leigh

    2014-10-01

    Natural gas is a major energy source in Europe, yet political instabilities have the potential to disrupt access and supply. Energy resilience is an increasingly essential construct and begins with transmission network design. This study proposes a new way of thinking about modelling natural gas flow. Rather than relying on classical economic models, the problem is cast in terms of time-dependent Hamiltonian dynamics. Traditional natural gas constraints, including inelastic demand and maximum/minimum pipe flows, are portrayed as energy functions and built into the dynamics of each pipe flow. As time progresses in the model, natural gas flow rates evolve toward the minimum energy and thus the optimal gas flow rates. The most important result of this study is the use of dynamical principles to ensure that the output of natural gas at demand nodes remains constant, which is important for country-to-country natural gas transmission. Another important step in this study is building the dynamics of each flow in a decentralized algorithm format. Decentralized regulation has solved congestion problems for internet data flow, traffic flow, and epidemiology, and, as demonstrated in this study, can solve the problem of natural gas congestion. A mathematical description is provided for how decentralized regulation leads to globally optimized network flow. Furthermore, the dynamical principles and decentralized algorithm are applied to a case study of the Fluxys Belgium natural gas network.

  18. Continuous traumatic stress as a mental and physical health challenge: Case studies from South Africa.

    PubMed

    Kaminer, Debra; Eagle, Gillian; Crawford-Browne, Sarah

    2018-07-01

    This article discusses the condition of continuous traumatic stress as common on the African continent and in other international settings characterised by very high levels of ongoing violence and threat of community, political or war-related origin. Through consideration of three case studies from South Africa, contexts of continuous traumatic stress are described, and the mental health and physical health effects of living in such contexts are elaborated. Cautions are raised about attempting to transpose existing posttraumatic stress models onto individuals exposed to situations of continuous traumatic stress, and guidelines for optimal interventions with such cases are proposed.

  19. Five Antiretroviral Drug Class-Resistant HIV-1 in a Treatment-Naïve Patient Successfully Suppressed with Optimized Antiretroviral Drug Selection.

    PubMed

    Volpe, Joseph M; Ward, Douglas J; Napolitano, Laura; Phung, Pham; Toma, Jonathan; Solberg, Owen; Petropoulos, Christos J; Walworth, Charles M

    2015-01-01

    Transmitted HIV-1 exhibiting reduced susceptibility to protease and reverse transcriptase inhibitors is well documented but limited for integrase inhibitors and enfuvirtide. We describe here a case of transmitted 5 drug class-resistance in an antiretroviral (ARV)-naïve patient who was successfully treated based on the optimized selection of an active ARV drug regimen. The value of baseline resistance testing to determine an optimal ARV treatment regimen is highlighted in this case report. © The Author(s) 2015.

  20. Optimization of plasma amplifiers

    DOE PAGES

    Sadler, James D.; Trines, Raoul M. G. M.; Tabak, Max; ...

    2017-05-24

    Here, plasma amplifiers offer a route to side-step limitations on chirped pulse amplification and generate laser pulses at the power frontier. They compress long pulses by transferring energy to a shorter pulse via the Raman or Brillouin instabilities. We present an extensive kinetic numerical study of the three-dimensional parameter space for the Raman case. Further particle-in-cell simulations find the optimal seed pulse parameters for experimentally relevant constraints. The high-efficiency self-similar behavior is observed only for seeds shorter than the linear Raman growth time. A test case similar to an upcoming experiment at the Laboratory for Laser Energetics is found to maintain good transverse coherence and high-energy efficiency. Effective compression of a 10 kJ, nanosecond-long driver pulse is also demonstrated in a 15-cm-long amplifier.

  1. Optimization of plasma amplifiers

    NASA Astrophysics Data System (ADS)

    Sadler, James D.; Trines, Raoul M. G. M.; Tabak, Max; Haberberger, Dan; Froula, Dustin H.; Davies, Andrew S.; Bucht, Sara; Silva, Luís O.; Alves, E. Paulo; Fiúza, Frederico; Ceurvorst, Luke; Ratan, Naren; Kasim, Muhammad F.; Bingham, Robert; Norreys, Peter A.

    2017-05-01

    Plasma amplifiers offer a route to side-step limitations on chirped pulse amplification and generate laser pulses at the power frontier. They compress long pulses by transferring energy to a shorter pulse via the Raman or Brillouin instabilities. We present an extensive kinetic numerical study of the three-dimensional parameter space for the Raman case. Further particle-in-cell simulations find the optimal seed pulse parameters for experimentally relevant constraints. The high-efficiency self-similar behavior is observed only for seeds shorter than the linear Raman growth time. A test case similar to an upcoming experiment at the Laboratory for Laser Energetics is found to maintain good transverse coherence and high-energy efficiency. Effective compression of a 10 kJ, nanosecond-long driver pulse is also demonstrated in a 15-cm-long amplifier.

  2. Optimization of seismic isolation systems via harmony search

    NASA Astrophysics Data System (ADS)

    Melih Nigdeli, Sinan; Bekdaş, Gebrail; Alhan, Cenk

    2014-11-01

    In this article, the optimization of isolation system parameters via the harmony search (HS) optimization method is proposed for seismically isolated buildings subjected to both near-fault and far-fault earthquakes. To obtain optimum values of the isolation system parameters, an optimization program was developed in Matlab/Simulink employing the HS algorithm. The objective was to obtain a set of isolation system parameters within a defined range that minimizes the acceleration response of a seismically isolated structure subjected to various earthquakes without exceeding a peak isolation system displacement limit. Several cases were investigated for different isolation system damping ratios and peak displacement limitations of the seismic isolation devices. Time history analyses were repeated for parameters neighbouring the optimum values, and the results confirmed that the parameters determined via HS were true optima. The performance of the optimum isolation system was tested under a second set of earthquakes different from the one used in the optimization process. The proposed optimization approach is applicable to linear isolation systems; isolation systems composed of inherently nonlinear isolation elements are the subject of a future study. Optimum isolation system parameters have previously been investigated in parametric studies; however, obtaining the best performance of a seismic isolation system requires a true optimization that takes the possibility of both near-fault and far-fault earthquakes into account. HS optimization is proposed here as a viable solution to this problem.
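
    A bare-bones harmony search loop is sketched below for two generic isolation parameters; the placeholder quadratic objective stands in for the seismic response analysis used in the article, and the HS parameter values are illustrative only.

```python
# Minimal harmony search (HS) for a two-parameter minimization problem.
import numpy as np

rng = np.random.default_rng(2)

def objective(x):                       # placeholder objective (assumed)
    return (x[0] - 3.0) ** 2 + 5.0 * (x[1] - 0.15) ** 2

lower, upper = np.array([1.0, 0.05]), np.array([6.0, 0.5])
hms, hmcr, par, bw, iters = 20, 0.9, 0.3, 0.05, 2000

memory = rng.uniform(lower, upper, size=(hms, 2))        # harmony memory
scores = np.array([objective(x) for x in memory])

for _ in range(iters):
    new = np.empty(2)
    for j in range(2):
        if rng.random() < hmcr:                          # consider the memory
            new[j] = memory[rng.integers(hms), j]
            if rng.random() < par:                       # pitch adjustment
                new[j] += bw * (upper[j] - lower[j]) * rng.uniform(-1, 1)
        else:                                            # random consideration
            new[j] = rng.uniform(lower[j], upper[j])
    new = np.clip(new, lower, upper)
    worst = scores.argmax()
    if objective(new) < scores[worst]:                   # replace the worst harmony
        memory[worst], scores[worst] = new, objective(new)

best = memory[scores.argmin()]
print("best parameters:", np.round(best, 3))
```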

  3. Strategies for the Optimization of Natural Leads to Anticancer Drugs or Drug Candidates

    PubMed Central

    Xiao, Zhiyan; Morris-Natschke, Susan L.; Lee, Kuo-Hsiung

    2015-01-01

    Natural products have made significant contributions to cancer chemotherapy over the past decades and remain an indispensable source of molecular and mechanistic diversity for anticancer drug discovery. More often than not, natural products serve as leads for further drug development rather than as effective anticancer drugs by themselves. Generally, optimization of natural leads into anticancer drugs or drug candidates should not only address drug efficacy, but also improve the ADMET profiles and chemical accessibility associated with the natural leads. Optimization strategies involve direct chemical manipulation of functional groups, structure-activity relationship-directed optimization, and pharmacophore-oriented molecular design based on the natural templates. Both fundamental medicinal chemistry principles (e.g., bio-isosterism) and state-of-the-art computer-aided drug design techniques (e.g., structure-based design) can be applied to facilitate optimization efforts. In this review, the strategies to optimize natural leads to anticancer drugs or drug candidates are illustrated with examples and described according to their purposes. Furthermore, successful case studies on lead optimization of bioactive compounds performed in the Natural Products Research Laboratories at UNC are highlighted. PMID:26359649

  4. Surrogate Based Uni/Multi-Objective Optimization and Distribution Estimation Methods

    NASA Astrophysics Data System (ADS)

    Gong, W.; Duan, Q.; Huo, X.

    2017-12-01

    Parameter calibration has been demonstrated to be an effective way to improve the performance of dynamic models, such as hydrological models, land surface models, and weather and climate models. Traditional optimization algorithms usually require a huge number of model evaluations, making dynamic model calibration very difficult, or even computationally prohibitive. With the help of a series of recently developed adaptive surrogate-modelling based optimization methods (the uni-objective optimization method ASMO, the multi-objective optimization method MO-ASMO, and the probability distribution estimation method ASMO-PODE), the number of model evaluations can be significantly reduced to several hundred, making it possible to calibrate very expensive dynamic models, such as regional high-resolution land surface models, weather forecast models such as WRF, and intermediate-complexity earth system models such as LOVECLIM. This presentation provides a brief introduction to the common framework of the adaptive surrogate-based optimization algorithms ASMO, MO-ASMO and ASMO-PODE, a case study of Common Land Model (CoLM) calibration in the Heihe river basin in Northwest China, and an outlook on potential applications of surrogate-based optimization methods.
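
    The adaptive surrogate idea behind ASMO-type methods can be sketched as a loop that fits a cheap surrogate to the expensive model, optimizes the surrogate, evaluates the true model at the proposed point, and refits. In the sketch below the "expensive model" is a toy function standing in for a land surface or weather model, and an RBF interpolant is used as the surrogate; this is a generic illustration of the framework, not the ASMO algorithm itself.

```python
# Generic adaptive surrogate-based optimization loop (illustrative sketch).
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive_model(x):                         # toy stand-in for a costly simulation
    return np.sin(3 * x[0]) + (x[1] - 0.5) ** 2

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(10, 2))             # small initial design
y = np.array([expensive_model(x) for x in X])

for _ in range(15):                             # adaptive refinement loop
    surrogate = RBFInterpolator(X, y)           # cheap approximation of the model
    res = minimize(lambda x: surrogate(x.reshape(1, -1))[0],
                   x0=X[y.argmin()], bounds=[(0, 1), (0, 1)])
    # Small jitter keeps sample points distinct so the interpolant stays well posed.
    x_new = np.clip(res.x + rng.normal(scale=1e-6, size=2), 0, 1)
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new))    # one true-model evaluation per step

print("best point found:", np.round(X[y.argmin()], 3), "value:", round(y.min(), 3))
```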

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beltran, C; Kamal, H

    Purpose: To provide a multicriteria optimization algorithm for intensity modulated radiation therapy using pencil proton beam scanning. Methods: Intensity modulated radiation therapy using pencil proton beam scanning requires efficient optimization algorithms to overcome the uncertainties in the Bragg peak locations. This work is focused on optimization algorithms that are based on Monte Carlo simulation of the treatment planning and use the weights and the dose volume histogram (DVH) control points to steer toward desired plans. The proton beam treatment planning process based on single-objective optimization (representing a weighted sum of multiple objectives) usually leads to time-consuming iterations involving treatment planning team members. We provide a time-efficient multicriteria optimization algorithm developed to run on an NVIDIA GPU (Graphics Processing Unit) cluster. The running time of the multicriteria optimization algorithm benefits from up-sampling of the CT voxel size in the calculations without loss of fidelity. Results: We will present preliminary results of multicriteria optimization for intensity modulated proton therapy based on DVH control points. The results will show optimization results for a phantom case and a brain tumor case. Conclusion: The multicriteria optimization of intensity modulated radiation therapy using pencil proton beam scanning provides a novel tool for treatment planning. Work supported by a grant from Varian Inc.

  6. Comparative evaluation of two dose optimization methods for image-guided, highly-conformal, tandem and ovoids cervix brachytherapy planning

    NASA Astrophysics Data System (ADS)

    Ren, Jiyun; Menon, Geetha; Sloboda, Ron

    2013-04-01

    Although the Manchester system is still extensively used to prescribe dose in brachytherapy (BT) for locally advanced cervix cancer, many radiation oncology centers are transitioning to 3D image-guided BT, owing to the excellent anatomy definition offered by modern imaging modalities. As automatic dose optimization is highly desirable for 3D image-based BT, this study comparatively evaluates the performance of two optimization methods used in BT treatment planning—Nelder-Mead simplex (NMS) and simulated annealing (SA)—for a cervix BT computer simulation model incorporating a Manchester-style applicator. Eight model cases were constructed based on anatomical structure data (for high risk-clinical target volume (HR-CTV), bladder, rectum and sigmoid) obtained from measurements on fused MR-CT images for BT patients. D90 and V100 for HR-CTV, D2cc for organs at risk (OARs), dose to point A, conformation index and the sum of dwell times within the tandem and ovoids were calculated for optimized treatment plans designed to treat the HR-CTV in a highly conformal manner. Compared to the NMS algorithm, SA was found to be superior as it could perform optimization starting from a range of initial dwell times, while the performance of NMS was strongly dependent on their initial choice. SA-optimized plans also exhibited lower D2cc to OARs, especially the bladder and sigmoid, and reduced tandem dwell times. For cases with smaller HR-CTV having good separation from adjoining OARs, multiple SA-optimized solutions were found which differed markedly from each other and were associated with different choices for initial dwell times. Finally and importantly, the SA method yielded plans with lower dwell time variability compared with the NMS method.
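
    A generic simulated annealing loop of the kind used above for dwell-time optimization is sketched below; the quadratic cost is a placeholder for the plan-quality objective (it is not a brachytherapy dose model), and the cooling schedule and move size are arbitrary.

```python
# Generic simulated annealing (SA) over nonnegative dwell times (illustrative).
import numpy as np

rng = np.random.default_rng(4)
n_dwell = 12
target = rng.uniform(1.0, 3.0, size=n_dwell)          # stand-in "ideal" dwell times

def cost(t):                                          # placeholder plan-quality cost
    return np.sum((t - target) ** 2)

t = np.full(n_dwell, 2.0)                             # arbitrary initial dwell times
best_t, best_c = t.copy(), cost(t)
T = 1.0
for it in range(5000):
    cand = np.clip(t + rng.normal(scale=0.1, size=n_dwell), 0.0, None)
    dc = cost(cand) - cost(t)
    if dc < 0 or rng.random() < np.exp(-dc / T):      # Metropolis acceptance rule
        t = cand
        if cost(t) < best_c:
            best_t, best_c = t.copy(), cost(t)
    T *= 0.999                                        # geometric cooling schedule
print("best cost:", round(best_c, 4))
```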

  7. Optimal dose selection accounting for patient subpopulations in a randomized Phase II trial to maximize the success probability of a subsequent Phase III trial.

    PubMed

    Takahashi, Fumihiro; Morita, Satoshi

    2018-02-08

    Phase II clinical trials are conducted to determine the optimal dose of the study drug for use in Phase III clinical trials while also balancing efficacy and safety. In conducting these trials, it may be important to consider subpopulations of patients grouped by background factors such as drug metabolism and kidney and liver function. Determining the optimal dose, as well as maximizing the effectiveness of the study drug by analyzing patient subpopulations, requires a complex decision-making process. In extreme cases, drug development has to be terminated due to inadequate efficacy or severe toxicity. Such a decision may be based on a particular subpopulation. We propose a Bayesian utility approach (BUART) to randomized Phase II clinical trials which uses a first-order bivariate normal dynamic linear model for efficacy and safety in order to determine the optimal dose and study population in a subsequent Phase III clinical trial. We carried out a simulation study under a wide range of clinical scenarios to evaluate the performance of the proposed method in comparison with a conventional method separately analyzing efficacy and safety in each patient population. The proposed method showed more favorable operating characteristics in determining the optimal population and dose.

  8. Emotions and Golf Performance

    ERIC Educational Resources Information Center

    Cohen, Alexander B.; Tenenbaum, Gershon; English, R. William

    2006-01-01

    A multiple case study investigation is reported in which emotions and performance were assessed within the probabilistic individual zone of optimal functioning (IZOF) model (Kamata, Tenenbaum, & Hanin, 2002) to develop idiosyncratic emotion-performance profiles. These profiles were incorporated into a psychological skills training (PST)…

  9. Artificial-intelligence-based optimization of the management of snow removal assets and resources.

    DOT National Transportation Integrated Search

    2002-10-01

    Geographic information systems (GIS) and artificial intelligence (AI) techniques were used to develop an intelligent : snow removal asset management system (SRAMS). The system has been evaluated through a case study examining : snow removal from the ...

  10. Optimization of greenhouse gas emissions in second-hand consumer product recovery through reuse platforms.

    PubMed

    Fortuna, Lorena M; Diyamandoglu, Vasil

    2017-08-01

    Product reuse in the solid waste management sector is promoted as one of the key strategies for waste prevention. This practice is considered to have a favorable impact on the environment, but its benefits have yet to be established. Existing research describes the perspective of "avoided production" only and has failed to examine the interdependent nature of reuse practices within an entire solid waste management system. This study proposes a new framework that uses optimization to minimize the greenhouse gas (GHG) emissions of an integrated solid waste management system that includes reuse strategies and practices such as reuse enterprises, online platforms, and materials exchanges along with traditional solid waste management practices such as recycling, landfilling, and incineration. The proposed framework uses material flow analysis in combination with an optimization model to provide the best outcome in terms of GHG emissions by redistributing product flows in the integrated solid waste management system to the least impacting routes and processes. The optimization results provide a basis for understanding the contributions of reuse to the environmental benefits of the integrated solid waste management system and for exploring the effects of reuse activities on waste prevention. A case study involving second-hand clothing is presented to illustrate the implementation of the proposed framework as applied to the material flow. Results of the case study showed the considerable impact of reuse on GHG emissions even for small replacement rates, and helped illustrate the interdependency of the reuse sector with other waste management practices. One major contribution of this study is the development of a framework centered on product reuse that can be applied to identify the best management strategies to reduce the environmental impact of product disposal and to increase recovery of reusable products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Tracing the Fingerprint of Chemical Bonds within the Electron Densities of Hydrocarbons: A Comparative Analysis of the Optimized and the Promolecule Densities.

    PubMed

    Keyvani, Zahra Alimohammadi; Shahbazian, Shant; Zahedi, Mansour

    2016-10-18

    The equivalence of the molecular graphs emerging from the comparative analysis of the optimized and the promolecule electron densities in two hundred and twenty five unsubstituted hydrocarbons was recently demonstrated [Keyvani et al. Chem. Eur. J. 2016, 22, 5003]. Thus, the molecular graph of an optimized molecular electron density is not shaped by the formation of the C-H and C-C bonds. In the present study, to trace the fingerprint of the C-H and C-C bonds in the electron densities of the same set of hydrocarbons, the amount of electron density and its Laplacian at the (3, -1) critical points associated with these bonds are derived from both optimized and promolecule densities, and compared in a newly proposed comparative analysis. The analysis not only conforms to the qualitative picture of the electron density build up between two atoms upon formation of a bond in between, but also quantifies the resulting accumulation of the electron density at the (3, -1) critical points. The comparative analysis also reveals a unified mode of density accumulation in the case of 2318 studied C-H bonds, but various modes of density accumulation are observed in the case of 1509 studied C-C bonds and they are classified into four groups. The four emerging groups do not always conform to the traditional classification based on the bond orders. Furthermore, four C-C bonds described as exotic bonds in previous studies, for example the inverted C-C bond in 1,1,1-propellane, are naturally distinguished from the analysis. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Optimum aerodynamic design via boundary control

    NASA Technical Reports Server (NTRS)

    Jameson, Antony

    1994-01-01

    These lectures describe the implementation of optimization techniques based on control theory for airfoil and wing design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. Recently the method has been implemented in an alternative formulation which does not depend on conformal mapping, so that it can more easily be extended to treat general configurations. The method has also been extended to treat the Euler equations, and results are presented for both two and three dimensional cases, including the optimization of a swept wing.

  13. A spatial multi-objective optimization model for sustainable urban wastewater system layout planning.

    PubMed

    Dong, X; Zeng, S; Chen, J

    2012-01-01

    Design of a sustainable city has shifted the traditional centralized urban wastewater system towards a decentralized or clustered one. Note that there is considerable spatial variability in the factors that affect urban drainage performance, including urban catchment characteristics. There are numerous potential options for planning the layout of an urban wastewater system, associated with different costs and local environmental impacts. There is thus a need to develop an approach to find the optimal spatial layout for collecting, treating, reusing and discharging the municipal wastewater of a city. In this study, a spatial multi-objective optimization model, called the Urban wastewateR system Layout model (URL), was developed. It is solved by a genetic algorithm embedding Monte Carlo sampling and a series of graph algorithms. The model is illustrated by a case study in a newly developing urban area in Beijing, China. Five optimized system layouts were recommended to the local municipality for further detailed design.

  14. Using the PORS Problems to Examine Evolutionary Optimization of Multiscale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reinhart, Zachary; Molian, Vaelan; Bryden, Kenneth

    2013-01-01

    Nearly all systems of practical interest are composed of parts assembled across multiple scales. For example, an agrodynamic system is composed of flora and fauna on one scale; soil types, slope, and water runoff on another scale; and management practice and yield on another scale. Or consider an advanced coal-fired power plant: combustion and pollutant formation occur on one scale, the plant components on another scale, and the overall performance of the power system is measured on another. In spite of this, there are few practical tools for the optimization of multiscale systems. This paper examines multiscale optimization of systems composed of discrete elements using the plus-one-recall-store (PORS) problem as a test case or study problem for multiscale systems. From this study, it is found that by recognizing the constraints and patterns present in discrete multiscale systems, the solution time can be significantly reduced and much more complex problems can be optimized.

  15. A Case-Based Reasoning Method with Rank Aggregation

    NASA Astrophysics Data System (ADS)

    Sun, Jinhua; Du, Jiao; Hu, Jian

    2018-03-01

    In order to improve the accuracy of case-based reasoning (CBR), this paper proposes a new CBR framework based on the principle of rank aggregation. First, ranking methods are put forward in each attribute subspace of the cases, giving the ordering relation between cases on each attribute; a sorting matrix is then obtained. Second, similar-case retrieval from the ranking matrix is transformed into a rank aggregation optimization problem, which uses the Kemeny optimal aggregation. On this basis, a rank aggregation case-based reasoning algorithm, named RA-CBR, is designed. Experimental results on UCI data sets show that the case retrieval accuracy of the RA-CBR algorithm is higher than that of Euclidean-distance and Mahalanobis-distance CBR, so we can conclude that the RA-CBR method can increase the performance and efficiency of CBR.
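
    The Kemeny-optimal aggregation step can be illustrated by brute force for a handful of cases: the aggregate ranking is the permutation minimizing the total Kendall tau distance to the per-attribute rankings. The attribute rankings below are made up, and brute-force enumeration is only feasible for very small case bases.

```python
# Brute-force Kemeny aggregation of per-attribute rankings (toy example).
from itertools import combinations, permutations

# Each inner list ranks case IDs 0..3 from most to least similar on one attribute.
attribute_rankings = [[0, 1, 2, 3],
                      [1, 0, 2, 3],
                      [0, 2, 1, 3]]       # hypothetical sorting matrix rows

def kendall_tau(r1, r2):
    """Number of case pairs ordered differently by the two rankings."""
    pos1 = {c: i for i, c in enumerate(r1)}
    pos2 = {c: i for i, c in enumerate(r2)}
    return sum(1 for a, b in combinations(r1, 2)
               if (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0)

best = min(permutations(range(4)),
           key=lambda cand: sum(kendall_tau(list(cand), r) for r in attribute_rankings))
print("Kemeny-optimal ordering of cases:", best)
```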

  16. An equivalent method for optimization of particle tuned mass damper based on experimental parametric study

    NASA Astrophysics Data System (ADS)

    Lu, Zheng; Chen, Xiaoyi; Zhou, Ying

    2018-04-01

    A particle tuned mass damper (PTMD) is a creative combination of a widely used tuned mass damper (TMD) and an efficient particle damper (PD) in the vibration control area. The performance of a one-storey steel frame with an attached PTMD is investigated through free vibration and shaking table tests. The influence of some key parameters (filling ratio of particles, auxiliary mass ratio, and particle density) on the vibration control effect is investigated, and it is shown that the attenuation level significantly depends on the filling ratio of particles. Based on the experimental parametric study, some guidelines for optimization of the PTMD that mainly consider the filling ratio are proposed. Furthermore, an approximate analytical solution based on the concept of an equivalent single-particle damper is proposed, and it shows satisfactory agreement between the simulation and experimental results. This simplified method is then used for the preliminary optimal design of a PTMD system, and a case study of a PTMD system attached to a five-storey steel structure following this optimization process is presented.

  17. Optimal control of anthracnose using mixed strategies.

    PubMed

    Fotsa Mbogne, David Jaures; Thron, Christopher

    2015-11-01

    In this paper we propose and study a spatial diffusion model for the control of anthracnose disease in a bounded domain. The model is a generalization of the one previously developed in [15]. We use the model to simulate two different types of control strategies against anthracnose disease. Strategies that employ chemical fungicides are modeled using a continuous control function; while strategies that rely on cultivational practices (such as pruning and removal of mummified fruits) are modeled with a control function which is discrete in time (though not in space). For comparative purposes, we perform our analyses for a spatially-averaged model as well as the space-dependent diffusion model. Under weak smoothness conditions on parameters we demonstrate the well-posedness of both models by verifying existence and uniqueness of the solution for the growth inhibition rate for given initial conditions. We also show that the set [0, 1] is positively invariant. We first study control by impulsive strategies, then analyze the simultaneous use of mixed continuous and pulse strategies. In each case we specify a cost functional to be minimized, and we demonstrate the existence of optimal control strategies. In the case of pulse-only strategies, we provide explicit algorithms for finding the optimal control strategies for both the spatially-averaged model and the space-dependent model. We verify the algorithms for both models via simulation, and discuss properties of the optimal solutions. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Conceptual Design and Structural Optimization of NASA Environmentally Responsible Aviation (ERA) Hybrid Wing Body Aircraft

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Simultaneously achieving the fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project requires innovative and unconventional aircraft concepts. In response, advanced hybrid wing body (HWB) aircraft concepts have been proposed and analyzed as a means of meeting these objectives. For the current study, several HWB concepts were analyzed using the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) analysis code. HCDstruct is a medium-fidelity, finite element based conceptual design and structural optimization tool developed to fill the critical analysis gap existing between lower-order structural sizing approaches and detailed, often finite element based sizing methods for HWB aircraft concepts. Whereas prior versions of the tool used a half-model approach in building the representative finite element model, a full wing-tip-to-wing-tip modeling capability was recently added to HCDstruct, which replaced the symmetry constraints at the model centerline with a free-flying model and allowed for more realistic center body, aft body, and wing loading and trim response. The latest version of HCDstruct was applied to two ERA reference cases, including the Boeing Open Rotor Engine Integration On an HWB (OREIO) concept and the Boeing ERA-0009H1 concept, and results agreed favorably with detailed Boeing design data and related Flight Optimization System (FLOPS) analyses. Following these benchmark cases, HCDstruct was used to size NASA's ERA HWB concepts and to perform a related scaling study.

  19. Two Studies of Complex Nonlinear Systems: Engineered Granular Crystals and Coarse-Graining Optimization Problems

    NASA Astrophysics Data System (ADS)

    Pozharskiy, Dmitry

    In recent years a nonlinear, acoustic metamaterial, named granular crystals, has gained prominence due to its high accessibility, both experimentally and computationally. The observation of a wide range of dynamical phenomena in the system, due to its inherent nonlinearities, has suggested its importance in many engineering applications related to wave propagation. In the first part of this dissertation, we explore the nonlinear dynamics of damped-driven granular crystals. In one case, we consider a highly nonlinear setting, also known as a sonic vacuum, and derive a nonlinear analogue of a linear spectrum, corresponding to resonant periodic propagation and antiresonances. Experimental studies confirm the computational findings and the assimilation of experimental data into a numerical model is demonstrated. In the second case, global bifurcations in a precompressed granular crystal are examined, and their involvement in the appearance of chaotic dynamics is demonstrated. Both results highlight the importance of exploring the nonlinear dynamics, to gain insight into how a granular crystal responds to different external excitations. In the second part, we borrow established ideas from coarse-graining of dynamical systems, and extend them to optimization problems. We combine manifold learning algorithms, such as Diffusion Maps, with stochastic optimization methods, such as Simulated Annealing, and show that we can retrieve an ensemble, of few, important parameters that should be explored in detail. This framework can lead to acceleration of convergence when dealing with complex, high-dimensional optimization, and could potentially be applied to design engineered granular crystals.

  20. A new sparse optimization scheme for simultaneous beam angle and fluence map optimization in radiotherapy planning

    NASA Astrophysics Data System (ADS)

    Liu, Hongcheng; Dong, Peng; Xing, Lei

    2017-08-01

    ℓ2,1-minimization-based sparse optimization was employed to solve the beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) planning. The technique approximates the exact BAO formulation with efficiently computable convex surrogates, leading to plans that are inferior to those attainable with recently proposed gradient-based greedy schemes. In this paper, we alleviate/reduce the nontrivial inconsistencies between the ℓ2,1-based formulations and the exact BAO model by proposing a new sparse optimization framework based on the most recent developments in group variable selection. We propose the incorporation of the group-folded concave penalty (gFCP) as a substitution to the ℓ2,1-minimization framework. The new formulation is then solved by a variation of an existing gradient method. The performance of the proposed scheme is evaluated by both plan quality and the computational efficiency using three IMRT cases: a coplanar prostate case, a coplanar head-and-neck case, and a noncoplanar liver case. Involved in the evaluation are two alternative schemes: the ℓ2,1-minimization approach and the gradient norm method (GNM). The gFCP-based scheme outperforms both counterpart approaches. In particular, gFCP generates better plans than those obtained using the ℓ2,1-minimization for all three cases with a comparable computation time. As compared to the GNM, the gFCP improves both the plan quality and computational efficiency. The proposed gFCP-based scheme provides a promising framework for BAO and promises to improve both planning time and plan quality.
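
    The group-sparsity mechanism underlying the ℓ2,1 formulation can be sketched directly: beamlet weights are grouped by candidate beam angle, the penalty sums the Euclidean norm of each group, and its proximal operator (group soft-thresholding) zeroes out entire beams. The group sizes, weights, and threshold below are arbitrary, and the gFCP penalty proposed in the paper, a concave refinement of this idea, is not reproduced.

```python
# Group (l2,1) penalty and its proximal step over beam-angle groups (toy data).
import numpy as np

rng = np.random.default_rng(6)
groups = [np.arange(0, 8), np.arange(8, 16), np.arange(16, 24)]   # 3 candidate beams
w = rng.normal(scale=0.2, size=24)                                # beamlet weights

def l21_penalty(w, groups):
    return sum(np.linalg.norm(w[g]) for g in groups)

def group_soft_threshold(w, groups, tau):
    """Proximal step of the l2,1 penalty: shrink each beam; drop weak beams."""
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= tau else (1 - tau / norm) * w[g]
    return out

print("penalty before:", round(l21_penalty(w, groups), 3))
w_shrunk = group_soft_threshold(w, groups, tau=0.5)
active = [i for i, g in enumerate(groups) if np.linalg.norm(w_shrunk[g]) > 0]
print("beams kept after shrinkage:", active)
```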

  1. A new sparse optimization scheme for simultaneous beam angle and fluence map optimization in radiotherapy planning.

    PubMed

    Liu, Hongcheng; Dong, Peng; Xing, Lei

    2017-07-20

    ℓ2,1-minimization-based sparse optimization was employed to solve the beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) planning. The technique approximates the exact BAO formulation with efficiently computable convex surrogates, leading to plans that are inferior to those attainable with recently proposed gradient-based greedy schemes. In this paper, we alleviate/reduce the nontrivial inconsistencies between the ℓ2,1-based formulations and the exact BAO model by proposing a new sparse optimization framework based on the most recent developments in group variable selection. We propose the incorporation of the group-folded concave penalty (gFCP) as a substitution to the ℓ2,1-minimization framework. The new formulation is then solved by a variation of an existing gradient method. The performance of the proposed scheme is evaluated by both plan quality and the computational efficiency using three IMRT cases: a coplanar prostate case, a coplanar head-and-neck case, and a noncoplanar liver case. Involved in the evaluation are two alternative schemes: the ℓ2,1-minimization approach and the gradient norm method (GNM). The gFCP-based scheme outperforms both counterpart approaches. In particular, gFCP generates better plans than those obtained using the ℓ2,1-minimization for all three cases with a comparable computation time. As compared to the GNM, the gFCP improves both the plan quality and computational efficiency. The proposed gFCP-based scheme provides a promising framework for BAO and promises to improve both planning time and plan quality.

  2. Generating Alternative Engineering Designs by Integrating Desktop VR with Genetic Algorithms

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Bertoline, Gary; Connolly, Patrick

    2009-01-01

    This study proposes an innovative solution to the problem of multiobjective engineering design optimization by integrating desktop VR with genetic computing. Although this study considers the case of construction design as an example to illustrate the framework, the method can readily be extended to other engineering design problems as well.…

  3. Multifaceted antibiotic treatment analysis of methicillin-sensitive Staphylococcus aureus bloodstream infections.

    PubMed

    Weber, Zhanni; Ariano, Robert; Lagacé-Wiens, Philippe; Zelenitsky, Sheryl

    2016-12-01

    Given the overall prevalence and poor prognosis of Staphylococcus aureus bloodstream infections (BSIs), the study of treatment strategies to improve patient outcomes is important. The aim of this study was to conduct a multifaceted antibiotic treatment analysis of methicillin-sensitive S. aureus (MSSA) BSI and to characterise optimal early antibiotic therapy (within the first 7 days of drawing the index blood culture) for this serious infection. Antibiotic selection was categorised as optimal targeted (intravenous cloxacillin or cefazolin), optimal broad (piperacillin/tazobactam or meropenem), adequate (vancomycin) or inadequate (other antibiotics or oral therapy). A TSE (timing, selection, exposure) score was developed to comprehensively characterise early antibiotic therapy, where higher points corresponded to prompt initiation, optimal antibiotic selection and longer exposure (duration). Amongst 71 cases of complicated MSSA-BSI, end-of-treatment (EOT) response (i.e. clinical cure) was improved when at least adequate antibiotic therapy was initiated within 24 h [71.7% (33/46) vs. 48.0% (12/25); P = 0.047]. Clinical cure was also more likely when therapy included ≥4 days of optimal targeted antibiotics within the first 7 days [74.4% (29/39) vs. 50.0% (16/32); P = 0.03]. The TSE score was an informative index of early antibiotic therapy, with EOT cure documented in 72.0% (36/50) compared with 42.9% (9/21) of cases with scores above and below 15.2, respectively (P = 0.02). In multivariable analysis, lower Charlson comorbidity index, presence of BSI on admission, and optimising early antibiotic therapy, as described above, were associated with clinical cure in patients with MSSA-BSI. Copyright © 2016 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.

  4. Optimization of a simplified automobile finite element model using time varying injury metrics.

    PubMed

    Gaewsky, James P; Danelson, Kerry A; Weaver, Caitlin M; Stitzel, Joel D

    2014-01-01

    In 2011, frontal crashes resulted in 55% of passenger car injuries in the United States, with 10,277 fatalities and 866,000 injuries. To better understand frontal crash injury mechanisms, human body finite element models (FEMs) can be used to reconstruct Crash Injury Research and Engineering Network (CIREN) cases. A limitation of this method is the paucity of vehicle FEMs; therefore, we developed a functionally equivalent simplified vehicle model. The New Car Assessment Program (NCAP) data for the selected vehicle came from a frontal collision with a Hybrid III (H3) Anthropomorphic Test Device (ATD) occupant. From the NCAP test reports, the vehicle geometry was created and the H3 ATD was positioned. The material and component properties optimized using a variation study process were: steering column shear bolt fracture force and stroke resistance, seatbelt pretensioner force, frontal and knee bolster airbag stiffness, and belt friction through the D-ring. These parameters were varied using three successive Latin Hypercube Designs of Experiments with 130-200 simulations each. The H3 injury response was compared to the reported NCAP frontal test results for head, chest, and pelvis accelerations and for seat belt and femur forces. The phase, magnitude, and comprehensive error factors from a Sprague and Geers analysis were calculated for each injury metric and then combined to determine the simulations that best matched the crash test. Sprague and Geers analyses typically yield error factors ranging from 0 to 1, with lower scores indicating better agreement. The total body injury response error factor for the most optimized simulation from each round of the variation study decreased from 0.466 to 0.395 to 0.360. This procedure for optimizing vehicle FEMs is a valuable tool for conducting future CIREN case reconstructions in a variety of vehicles.
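    For readers unfamiliar with the Sprague and Geers metric, the following minimal sketch computes the magnitude, phase, and comprehensive error factors between a reference signal and a candidate signal; the Gaussian pulses stand in for the crash-test and simulation time histories, which are not reproduced here.

    ```python
    import numpy as np

    def sprague_geers(measured, computed):
        """Sprague & Geers magnitude (M), phase (P), and comprehensive (C) error
        factors between a reference (measured) and a candidate (computed) signal
        sampled on a common, uniform time grid (duration factors cancel)."""
        psi_mm = np.mean(measured * measured)
        psi_cc = np.mean(computed * computed)
        psi_mc = np.mean(measured * computed)
        m = np.sqrt(psi_cc / psi_mm) - 1.0
        p = np.arccos(np.clip(psi_mc / np.sqrt(psi_mm * psi_cc), -1.0, 1.0)) / np.pi
        return m, p, np.hypot(m, p)

    # Synthetic example: a slightly scaled and time-shifted copy of a reference pulse.
    t = np.linspace(0.0, 0.15, 1500)
    measured = np.exp(-((t - 0.050) / 0.010) ** 2)
    computed = 1.1 * np.exp(-((t - 0.052) / 0.010) ** 2)
    m, p, c = sprague_geers(measured, computed)
    print(f"M = {m:.3f}, P = {p:.3f}, C = {c:.3f}")
    ```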

  5. An optimization framework for measuring spatial access over healthcare networks.

    PubMed

    Li, Zihao; Serban, Nicoleta; Swann, Julie L

    2015-07-17

    Measurement of healthcare spatial access over a network involves accounting for demand, supply, and network structure. Popular approaches are based on floating catchment areas; however, these methods can overestimate demand over the network and fail to capture cascading effects across the system. Optimization is presented as a framework to measure spatial access, and questions about when and why optimization should be used are addressed. The accuracy of the optimization models relative to the two-step floating catchment area method and its variations is demonstrated analytically, and a case study of specialty care for cystic fibrosis over the continental United States is used to compare the approaches. The optimization models capture a patient's experience rather than their opportunities and avoid overestimating patient demand. They can also capture system effects arising from congestion. Furthermore, the optimization models provide more elements of access than traditional catchment methods. Optimization models can incorporate user choice and other variations, and they can be useful for targeting interventions to improve access. They can easily be adapted to measure access for different types of patients, over different provider types, or with capacity constraints in the network. Moreover, optimization models allow differences in access between rural and urban areas to be quantified.
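    One plausible minimal formulation of an optimization-based access measure (a sketch under assumptions, not the paper's model) is a capacitated assignment of demand to providers that minimizes total travel burden. The ZIP-area demands, provider capacities, and distances below are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)
    n_zip, n_prov = 6, 3
    demand = rng.integers(20, 60, size=n_zip)          # patients per ZIP area
    capacity = np.array([150, 120, 100])               # patients per provider
    dist = rng.uniform(5, 120, size=(n_zip, n_prov))   # travel distance (miles)

    # Decision variables x[i, j]: patients from area i assigned to provider j,
    # flattened row-major so index = i * n_prov + j.
    c = dist.ravel()

    # Capacity constraints: sum_i x[i, j] <= capacity[j]
    A_ub = np.zeros((n_prov, n_zip * n_prov))
    for j in range(n_prov):
        A_ub[j, j::n_prov] = 1.0

    # Demand constraints: sum_j x[i, j] == demand[i]
    A_eq = np.zeros((n_zip, n_zip * n_prov))
    for i in range(n_zip):
        A_eq[i, i * n_prov:(i + 1) * n_prov] = 1.0

    res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand,
                  bounds=(0, None), method="highs")
    x = res.x.reshape(n_zip, n_prov)
    print("avg travel distance per patient:", round(res.fun / demand.sum(), 1), "miles")
    ```

    The optimal average travel distance per patient is one example of an access measure that reflects system-wide congestion rather than each area's opportunities in isolation.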

  6. Traumatic brain injury rehabilitation: case management and insurance-related issues.

    PubMed

    Pressman, Helaine Tobey

    2007-02-01

    Traumatic brain injury (TBI) cases are medically complex, involving the physical, cognitive, behavioral, social, and emotional aspects of the survivor. Often catastrophic, these cases require substantial financial resources, not only for the patient's survival but to achieve the optimal outcome of a functional life with a long-term return to family and work responsibilities. TBI cases involve the injured person, the family, medical professionals such as treating physicians and therapists, attorneys, the employer, community resources, and the funding source, usually an insurance company. Case management is required to facilitate an optimal result by collaborating with all parties involved, assessing priorities and options, coordinating services, and educating and communicating with all concerned.

  7. Weight optimization of large span steel truss structures with genetic algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mojolic, Cristian; Hulea, Radu; Pârv, Bianca Roxana

    2015-03-10

    The paper presents the weight optimization process of the main steel truss that supports the Slatina Sport Hall roof. The structure was loaded with self-weight, dead loads, live loads, snow, wind, and temperature, grouped into eleven load cases. The optimization of the structure was performed using genetic algorithms implemented in Matlab code. A total of four different cases were considered when determining the lowest weight of the structure, depending on the type of connection to the concrete structure (types of supports, bearing modes) and on whether the lower truss chord nodes were allowed to change their vertical position. Restrictions on tension, maximum displacement, and buckling were enforced on the elements, and the cross-sections were chosen by the program from a user database. The results for each of the four cases were analyzed in terms of weight, element tension, element section, and displacement. The paper presents the optimization process and the conclusions drawn.
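    A minimal sketch of this type of genetic algorithm follows: each gene selects a cross-section for one member from a section database, and the fitness is the total weight plus a penalty for violated capacity limits. The member lengths, forces, section database, and penalty weight are invented stand-ins for the structural analysis in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical section database: (area [cm^2], allowable member force [kN]).
    sections = np.array([[10.0, 150.0], [15.0, 230.0], [22.0, 340.0],
                         [30.0, 460.0], [42.0, 650.0]])
    n_members = 40
    lengths = rng.uniform(2.0, 6.0, n_members)     # member lengths [m]
    forces = rng.uniform(50.0, 600.0, n_members)   # member forces [kN] (stand-in)
    rho = 7850e-4                                  # steel mass [kg] per cm^2 of area per m

    def fitness(genes):
        area, capacity = sections[genes, 0], sections[genes, 1]
        weight = np.sum(rho * area * lengths)
        violation = np.sum(np.maximum(forces - capacity, 0.0))
        return weight + 1e3 * violation            # penalized weight

    pop = rng.integers(0, len(sections), size=(60, n_members))
    for gen in range(200):
        f = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(f)[:30]]          # truncation selection
        # One-point crossover plus mutation to refill the population.
        children = []
        while len(children) < 30:
            a, b = parents[rng.integers(30, size=2)]
            cut = rng.integers(1, n_members)
            child = np.concatenate([a[:cut], b[cut:]])
            mask = rng.random(n_members) < 0.05
            child[mask] = rng.integers(0, len(sections), mask.sum())
            children.append(child)
        pop = np.vstack([parents, children])

    best = pop[np.argmin([fitness(ind) for ind in pop])]
    print("best penalized weight [kg]:", round(fitness(best), 1))
    ```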

  8. Building of Reusable Reverse Logistics Model and its Optimization Considering the Decision of Backorder or Next Arrival of Goods

    NASA Astrophysics Data System (ADS)

    Lee, Jeong-Eun; Gen, Mitsuo; Rhee, Kyong-Gu; Lee, Hee-Hyol

    This paper deals with the building of a reusable reverse logistics model that accounts for the decision between backorder and next arrival of goods. An optimization method is proposed to minimize the transportation cost and the volume of backorders or next-arrival goods arising from the Just-in-Time delivery of the final delivery stage between the manufacturer and the processing center. Sub-optimal delivery routes are determined through optimization algorithms using a priority-based genetic algorithm and a hybrid genetic algorithm. Based on a case study of a distilling and sales company in Busan, Korea, the new model of reusable reverse logistics for empty bottles is built and the effectiveness of the proposed method is verified.

  9. Method of determining the optimal dilution ratio for fluorescence fingerprint of food constituents.

    PubMed

    Trivittayasil, Vipavee; Tsuta, Mizuki; Kokawa, Mito; Yoshimura, Masatoshi; Sugiyama, Junichi; Fujita, Kaori; Shibata, Mario

    2015-01-01

    Quantitative determination by fluorescence spectroscopy is possible because of the linear relationship between the intensity of emitted fluorescence and the fluorophore concentration. However, concentration quenching may cause the relationship to become nonlinear, and thus, the optimal dilution ratio has to be determined. In the case of fluorescence fingerprint (FF) measurement, fluorescence is measured under multiple wavelength conditions and a method of determining the optimal dilution ratio for multivariate data such as FFs has not been reported. In this study, the FFs of mixed solutions of tryptophan and epicatechin of different concentrations and composition ratios were measured. Principal component analysis was applied, and the resulting loading plots were found to contain useful information about each constituent. The optimal concentration ranges could be determined by identifying the linear region of the PC score plotted against total concentration.
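    The following sketch illustrates the core idea (not the authors' exact procedure): apply PCA to fingerprint-like spectra measured at several concentrations, then grow the low-concentration window while the first principal component score remains linear in total concentration. The saturating response model, noise level, and R-squared threshold are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)

    # Synthetic "fingerprints": 40 excitation-emission intensities per sample,
    # linear in concentration at first, then saturating (concentration quenching).
    conc = np.linspace(0.05, 5.0, 30)
    base = rng.random(40)
    signal = 1.0 - np.exp(-conc / 1.5)                  # saturating response
    X = np.outer(signal, base) + rng.normal(0, 0.005, (30, 40))

    score = PCA(n_components=1).fit_transform(X).ravel()
    score *= np.sign(np.corrcoef(score, conc)[0, 1])    # orient PC1 with concentration

    # Grow the low-concentration window while the score stays linear (R^2 check).
    upper = 2
    while upper < len(conc):
        c, s = conc[: upper + 1], score[: upper + 1]
        if np.corrcoef(c, s)[0, 1] ** 2 < 0.995:
            break
        upper += 1
    print("linear up to about", round(conc[upper - 1], 2), "(arbitrary concentration units)")
    ```

    In practice the upper end of the detected linear region indicates how far a sample must be diluted before quantitative work is reliable.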

  10. Multiobjective optimization techniques for structural design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1984-01-01

    Multiobjective programming techniques are important in the design of complex structural systems whose quality generally depends on a number of different and often conflicting objective functions that cannot be combined into a single design objective. The applicability of multiobjective optimization techniques is studied with reference to simple design problems. Specifically, the parameter optimization of a cantilever beam with a tip mass and a three-degree-of-freedom vibration isolation system and the trajectory optimization of a cantilever beam are considered. The solutions of these multicriteria design problems are attempted using the global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It was observed that the game theory approach required the most computational effort, but it yielded better optimum solutions with a proper balance of the various objective functions in all cases.
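    As a small, self-contained example of one of the listed techniques, the sketch below applies the global criterion method to a toy two-objective problem: each objective is first minimized individually to get its ideal value, then a compromise design minimizes the summed deviations from the ideal point. The objectives and bounds are invented and are not the beam or isolator models of the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy conflicting objectives of a single design variable x in [0, 2].
    f1 = lambda x: (x - 0.2) ** 2          # e.g. a "weight-like" objective
    f2 = lambda x: (x - 1.8) ** 2          # e.g. a "deflection-like" objective

    bounds = [(0.0, 2.0)]
    # Individual minima (ideal values) required by the global criterion.
    f1_star = minimize(lambda x: f1(x[0]), [1.0], bounds=bounds).fun
    f2_star = minimize(lambda x: f2(x[0]), [1.0], bounds=bounds).fun

    def global_criterion(x, p=2):
        """Sum of p-th powers of deviations from the ideal point
        (absolute form, since the ideal values here are essentially zero)."""
        return abs(f1(x[0]) - f1_star) ** p + abs(f2(x[0]) - f2_star) ** p

    res = minimize(global_criterion, [1.0], bounds=bounds)
    print("compromise design x =", round(res.x[0], 3),
          "objectives:", round(f1(res.x[0]), 3), round(f2(res.x[0]), 3))
    ```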

  11. Optimization of Artificial Neural Network using Evolutionary Programming for Prediction of Cascading Collapse Occurrence due to the Hidden Failure Effect

    NASA Astrophysics Data System (ADS)

    Idris, N. H.; Salim, N. A.; Othman, M. M.; Yasin, Z. M.

    2018-03-01

    This paper presents an Evolutionary Programming (EP) approach proposed to optimize the training parameters of an Artificial Neural Network (ANN) for predicting cascading collapse occurrence due to the effect of protection system hidden failure. The data were collected from simulations of a hidden-failure probability model built on historical data. The training parameters of a multilayer feedforward network with backpropagation were optimized with the objective of minimizing the Mean Square Error (MSE). The optimal training parameters, consisting of the momentum rate, the learning rate, and the numbers of neurons in the first and second hidden layers, are selected by the EP-ANN. The IEEE 14-bus system was used as a case study to validate the proposed technique. The results show reliable prediction performance, validated through the MSE and the correlation coefficient (R).
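    A simplified sketch of this kind of hyperparameter search follows: an EP-style loop mutates the learning rate, momentum, and hidden-layer sizes of a small network and keeps the candidates with the lowest validation MSE. The data set is synthetic, the selection rule is a simplified (truncation) variant rather than classic EP tournament selection, and all parameter ranges are assumptions.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(7)

    # Synthetic stand-in for the hidden-failure data set.
    X = rng.random((400, 6))
    y = np.sin(X @ rng.random(6)) + rng.normal(0, 0.05, 400)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

    def mse_of(params):
        lr, mom, n1, n2 = params
        net = MLPRegressor(hidden_layer_sizes=(int(n1), int(n2)), solver="sgd",
                           learning_rate_init=lr, momentum=mom,
                           max_iter=300, random_state=0)
        net.fit(X_tr, y_tr)
        return mean_squared_error(y_va, net.predict(X_va))

    # Gaussian mutation of (learning rate, momentum, neurons in layers 1 and 2);
    # parents and offspring compete, the best half survives.
    pop = [np.array([rng.uniform(0.001, 0.1), rng.uniform(0.1, 0.9),
                     rng.integers(4, 30), rng.integers(4, 30)]) for _ in range(6)]
    for _ in range(5):
        offspring = [np.clip(p + rng.normal(0, [0.01, 0.05, 2.0, 2.0]),
                             [0.001, 0.05, 2, 2], [0.2, 0.95, 40, 40]) for p in pop]
        merged = pop + offspring
        merged.sort(key=mse_of)
        pop = merged[: len(pop)]

    best = pop[0]
    print("best (lr, momentum, n1, n2):", np.round(best, 3), "MSE:", round(mse_of(best), 5))
    ```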

  12. Attitude determination using vector observations: A fast optimal matrix algorithm

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis

    1993-01-01

    The attitude matrix minimizing Wahba's loss function is computed directly by a method that is competitive with the fastest known algorithm for finding this optimal estimate. The method also provides an estimate of the attitude error covariance matrix. Analysis of the special case of two vector observations identifies those cases for which the TRIAD or algebraic method minimizes Wahba's loss function.
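    For context, the sketch below shows the well-known SVD form of the optimal solution to Wahba's problem (closely related to the fast algorithms discussed in this record): build the attitude profile matrix from weighted vector pairs, take its SVD, and correct the determinant. The two reference vectors, weights, and noise level are illustrative.

    ```python
    import numpy as np

    def wahba_svd(body_vecs, ref_vecs, weights):
        """Attitude matrix minimizing Wahba's loss, via the SVD of the
        attitude profile matrix B = sum_i w_i * b_i * r_i^T."""
        B = sum(w * np.outer(b, r) for w, b, r in zip(weights, body_vecs, ref_vecs))
        U, _, Vt = np.linalg.svd(B)
        d = np.linalg.det(U) * np.linalg.det(Vt)
        return U @ np.diag([1.0, 1.0, d]) @ Vt

    # Two-observation example: rotate known reference vectors by a true attitude,
    # add a little noise, then recover the attitude.
    rng = np.random.default_rng(4)
    ang = np.deg2rad(30.0)
    A_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                       [np.sin(ang),  np.cos(ang), 0.0],
                       [0.0,          0.0,         1.0]])
    refs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]
    bodies = [A_true @ r + rng.normal(0, 1e-3, 3) for r in refs]
    bodies = [b / np.linalg.norm(b) for b in bodies]

    A_est = wahba_svd(bodies, refs, weights=[1.0, 1.0])
    err = np.arccos(np.clip((np.trace(A_est.T @ A_true) - 1.0) / 2.0, -1.0, 1.0))
    print("attitude error (deg):", np.rad2deg(err))
    ```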

  13. Selecting registration schemes in case of interstitial lung disease follow-up in CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vlachopoulos, Georgios; Korfiatis, Panayiotis; Skiadopoulos, Spyros

    Purpose: The primary goal of this study is to select optimal registration schemes in the framework of interstitial lung disease (ILD) follow-up analysis in CT. Methods: A set of 128 multiresolution schemes composed of multiresolution nonrigid and combinations of rigid and nonrigid registration schemes are evaluated, utilizing ten artificially warped ILD follow-up volumes, originating from ten clinical volumetric CT scans of ILD affected patients, to select candidate optimal schemes. Specifically, all combinations of four transformation models (three rigid: rigid, similarity, affine and one nonrigid: third order B-spline), four cost functions (sum-of-square distances, normalized correlation coefficient, mutual information, and normalized mutual information), four gradient descent optimizers (standard, regular step, adaptive stochastic, and finite difference), and two types of pyramids (recursive and Gaussian-smoothing) were considered. The selection process involves two stages. The first stage involves identification of schemes with deformation field singularities, according to the determinant of the Jacobian matrix. In the second stage, evaluation methodology is based on distance between corresponding landmark points in both normal lung parenchyma (NLP) and ILD affected regions. Statistical analysis was performed in order to select near optimal registration schemes per evaluation metric. Performance of the candidate registration schemes was verified on a case sample of ten clinical follow-up CT scans to obtain the selected registration schemes. Results: By considering near optimal schemes common to all ranking lists, 16 out of 128 registration schemes were initially selected. These schemes obtained submillimeter registration accuracies in terms of average distance errors 0.18 ± 0.01 mm for NLP and 0.20 ± 0.01 mm for ILD, in case of artificially generated follow-up data. Registration accuracy in terms of average distance error in clinical follow-up data was in the range of 1.985–2.156 mm and 1.966–2.234 mm, for NLP and ILD affected regions, respectively, excluding schemes with statistically significant lower performance (Wilcoxon signed-ranks test, p < 0.05), resulting in 13 finally selected registration schemes. Conclusions: Selected registration schemes in case of ILD CT follow-up analysis indicate the significance of the adaptive stochastic gradient descent optimizer, as well as the importance of combined rigid and nonrigid schemes providing high accuracy and time efficiency. The selected optimal deformable registration schemes are equivalent in terms of their accuracy and thus compatible in terms of their clinical outcome.
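    The first-stage check described above can be illustrated with a short sketch: compute the determinant of the Jacobian of the deformation field on a voxel grid and flag non-positive values, which indicate folding or other singularities. The random smooth displacement field below is a stand-in for an actual registration output.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(5)

    # Synthetic smooth 3-D displacement field u(x) on a 40^3 grid (voxel units).
    shape = (40, 40, 40)
    u = np.stack([gaussian_filter(rng.normal(0, 3.0, shape), sigma=4) for _ in range(3)])

    # Jacobian of the transformation T(x) = x + u(x): J = I + du/dx.
    grads = [np.gradient(u[c]) for c in range(3)]    # grads[c][d] = d u_c / d x_d
    J = np.zeros(shape + (3, 3))
    for c in range(3):
        for d in range(3):
            J[..., c, d] = grads[c][d] + (1.0 if c == d else 0.0)

    detJ = np.linalg.det(J)
    print("voxels with non-positive |J| (possible folding):", int(np.sum(detJ <= 0)))
    ```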

  14. Uncertainty-Based Multi-Objective Optimization of Groundwater Remediation Design

    NASA Astrophysics Data System (ADS)

    Singh, A.; Minsker, B.

    2003-12-01

    Management of groundwater contamination is a cost-intensive undertaking filled with conflicting objectives and substantial uncertainty. A critical source of this uncertainty in groundwater remediation design problems is the hydraulic conductivity field of the aquifer, on which the predictions of contaminant flow and transport depend. For a remediation solution to be reliable in practice, it must be robust to potential errors in the model predictions. This work focuses on incorporating such uncertainty within a multi-objective optimization framework, to obtain solutions that are both reliable and Pareto optimal. Previous research has shown that small amounts of sampling within a single-objective genetic algorithm can produce highly reliable solutions. However, with multiple objectives, the noise can interfere with the basic operations of a multi-objective solver, such as determining non-domination of individuals, diversity preservation, and elitism. This work proposes several approaches to improve the performance of noisy multi-objective solvers: a simple averaging approach, sampling across the population (which we call extended averaging), and a stochastic optimization approach. All the approaches are tested on standard multi-objective benchmark problems and a hypothetical groundwater remediation case study; the best-performing approach is then tested on a field-scale case at Umatilla Army Depot.
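    The simple-averaging idea can be illustrated in a few lines (a sketch under assumptions, not the authors' solver): each design is re-evaluated several times, the mean objectives are used for the non-domination check, and the resulting front is less distorted by evaluation noise. The two noisy objective functions are toy stand-ins for the uncertain cost and cleanup-performance predictions.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def noisy_objectives(x, noise=0.05):
        """Toy two-objective problem with evaluation noise."""
        f1 = x ** 2 + rng.normal(0, noise)
        f2 = (x - 2.0) ** 2 + rng.normal(0, noise)
        return np.array([f1, f2])

    def averaged(x, n_samples=10):
        return np.mean([noisy_objectives(x) for _ in range(n_samples)], axis=0)

    designs = np.linspace(-1.0, 3.0, 41)
    F = np.array([averaged(x) for x in designs])

    # Simple non-domination check on the averaged objectives.
    nondominated = [i for i in range(len(designs))
                    if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                               for j in range(len(designs)))]
    print("approximate Pareto set:", np.round(designs[nondominated], 2))
    ```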

  15. Teleportation of squeezing: Optimization using non-Gaussian resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio

    2010-12-15

    We study the continuous-variable quantum teleportation of states, statistical moments of observables, and scale parameters such as squeezing. We investigate the problem both in ideal and imperfect Vaidman-Braunstein-Kimble protocol setups. We show how the teleportation fidelity is maximized and the difference between output and input variances is minimized by using suitably optimized entangled resources. Specifically, we consider the teleportation of coherent squeezed states, exploiting squeezed Bell states as entangled resources. This class of non-Gaussian states, introduced by Illuminati and co-workers [F. Dell'Anno, S. De Siena, L. Albano, and F. Illuminati, Phys. Rev. A 76, 022301 (2007); F. Dell'Anno, S. De Siena, and F. Illuminati, ibid. 81, 012333 (2010)], includes photon-added and photon-subtracted squeezed states as special cases. At variance with the case of entangled Gaussian resources, the use of entangled non-Gaussian squeezed Bell resources allows one to choose different optimization procedures that lead to inequivalent results. Performing two independent optimization procedures, one can either maximize the state teleportation fidelity, or minimize the difference between input and output quadrature variances. The two different procedures are compared depending on the degrees of displacement and squeezing of the input states and on the working conditions in ideal and nonideal setups.

  16. Population Modeling Approach to Optimize Crop Harvest Strategy. The Case of Field Tomato.

    PubMed

    Tran, Dinh T; Hertog, Maarten L A T M; Tran, Thi L H; Quyen, Nguyen T; Van de Poel, Bram; Mata, Clara I; Nicolaï, Bart M

    2017-01-01

    In this study, the aim is to develop a population-model-based approach to optimize fruit harvesting strategies with regard to fruit quality and its derived economic value. This approach was applied to the case of tomato fruit harvesting under Vietnamese conditions. Fruit growth and development of tomato (cv. "Savior") was monitored in terms of fruit size and color during both the Vietnamese winter and summer growing seasons. A kinetic tomato fruit growth model was applied to quantify biological fruit-to-fruit variation in terms of physiological maturation, and the model was successfully calibrated. Finally, the model was extended to translate the fruit-to-fruit variation at harvest into the economic value of the harvested crop. It can be concluded that a model-based approach to optimizing harvest date and harvest frequency with regard to the economic value of the crop is feasible. This approach allows growers to optimize their harvesting strategy by harvesting the crop at more uniform maturity stages, meeting the stringent retail demands for homogeneous, high-quality product. The total farm profit would still depend on the impact a change in harvesting strategy might have on related expenditures. This model-based harvest optimization approach can easily be transferred to other fruit and vegetable crops, improving the homogeneity of postharvest product streams.

  17. Goal-Oriented Intelligence in Optimization of Distributed Parameter Systems

    DTIC Science & Technology

    2004-08-01

    Yarus, and R.L. Chambers, editors, AAPG Computer Applications in Geology, No. 3, The American Association of Petroleum Geologists, Tulsa, OK, USA... Stochastic Modeling and Geostatistics – Principles, Methods, and Case Studies, AAPG Computer Applications in Geology, No. 3, The American

  18. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both aleatory types (random launch/OOS operation failure and on-orbit component failure) and epistemic types (the unknown trend of the end-user market price). Firstly, the lifecycle simulation under uncertainties is discussed and the chronological flowchart is presented. The cost and benefit models are established, and their uncertainties are modeled. The dynamic programming method for making optimal decisions in the face of uncertain events is introduced. Secondly, the method for analyzing the propagation effects of the uncertainties on the OOS utilities is studied. Combining probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool, and the OOS system is optimized with SOMUA-MCS. Lastly, conclusions are given and future research prospects are highlighted.

  19. The Abbreviated Injury Scale and its correlation with preventable traumatic accidental deaths: a study from South Delhi.

    PubMed

    Rautji, R; Bhardwaj, D N; Dogra, T D

    2006-04-01

    Anatomic trauma scoring systems are fundamental to trauma research. The Abbreviated Injury Scale (AIS) and its derivative, the Injury Severity Score (ISS), are the most frequently used scales. In a prospective study, 400 autopsies of road traffic accident victims performed between January 2002 and December 2003 were coded according to the AIS and ISS methods. All cases were classified into injury groups according to the ISS. Fifty-eight cases (14.5%) had an ISS below 25; 244 cases (61%) scored between 25 and 49; 38 cases (9.5%) scored between 50 and 74; and 60 cases (15%) had the maximum value of 75. On analysis of medical care, in cases with ISS < 50, about 96% of the victims did not receive optimal care quickly enough, with a lack of pre-hospital resuscitation measures and lengthy transportation times to hospital being of major importance.

  20. Integrated configurable equipment selection and line balancing for mass production with serial-parallel machining systems

    NASA Astrophysics Data System (ADS)

    Battaïa, Olga; Dolgui, Alexandre; Guschinsky, Nikolai; Levin, Genrikh

    2014-10-01

    Solving the equipment selection and line balancing problems together allows better line configurations to be reached and avoids locally optimal solutions. This article considers these two decision problems jointly for mass production lines with serial-parallel workplaces. The study was motivated by the design of production lines based on machines with rotary or mobile tables; nevertheless, the results are more general and can be applied to assembly and production lines with similar structures. The designers' objectives and constraints are studied in order to formulate a relevant mathematical model and an efficient optimization approach to solve it. A real case study is used to validate the model and the developed approach.
