Science.gov

Sample records for design optimization framework

  1. Evaluation of Frameworks for HSCT Design Optimization

    NASA Technical Reports Server (NTRS)

    Krishnan, Ramki

    1998-01-01

    This report is an evaluation of engineering frameworks that could be used to augment, supplement, or replace the existing FIDO 3.5 (Framework for Interdisciplinary Design and Optimization Version 3.5) framework. The report begins with the motivation for this effort, followed by a description of an "ideal" multidisciplinary design and optimization (MDO) framework. The discussion then turns to how each candidate framework stacks up against this ideal. This report ends with recommendations as to the "best" frameworks that should be down-selected for detailed review.

  2. Global optimization framework for solar building design

    NASA Astrophysics Data System (ADS)

    Silva, N.; Alves, N.; Pascoal-Faria, P.

    2017-07-01

    The generative modeling paradigm is a shift from static models to flexible models. It describes a modeling process using functions, methods and operators. The result is an algorithmic description of the construction process. Each evaluation of such an algorithm creates a model instance, which depends on its input parameters (width, height, volume, roof angle, orientation, location). These values are normally chosen according to aesthetic aspects and style. In this study, the model's parameters are automatically generated according to an objective function. A generative model can be optimized according to its parameters; in this way, the best solution for a constrained problem is determined. Besides the establishment of an overall framework design, this work consists of the identification of different building shapes and their main parameters, the creation of an algorithmic description for these main shapes and the formulation of the objective function with respect to a building's energy consumption (solar energy, heating and insulation). Additionally, the conception of an optimization pipeline combining an energy calculation tool with a geometric scripting engine is presented. The methods developed lead to an automated and optimized 3D shape generation for the projected building (based on the desired conditions and according to specific constraints). The proposed approach will help in the construction of real buildings that consume less energy and contribute to a more sustainable world.
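
    As a rough illustration of the kind of pipeline described above (a parametric model driven by an energy objective), the sketch below optimizes a toy box-shaped building with scipy; the parameter names, the crude energy proxy and the bounds are assumptions for illustration only, not the authors' model or tools.

        # Illustrative sketch only: a toy parametric "generative" building model
        # optimized for a crude energy proxy. Parameter names and the objective
        # are assumptions, not the authors' formulation or tools.
        import numpy as np
        from scipy.optimize import minimize

        def energy_proxy(x):
            width, depth, height, roof_angle = x
            envelope = 2 * height * (width + depth) + width * depth / np.cos(np.radians(roof_angle))
            south_glazing = 0.3 * width * height          # assumed fixed glazing ratio
            heating_loss = 0.8 * envelope                 # conduction losses ~ envelope area
            solar_gain = 0.5 * south_glazing * np.cos(np.radians(roof_angle))
            return heating_loss - solar_gain              # lower is better

        # bounds encode simple design constraints (m, m, m, degrees)
        bounds = [(5, 20), (5, 20), (2.5, 9), (10, 45)]
        x0 = [10, 10, 3, 30]
        res = minimize(energy_proxy, x0, bounds=bounds, method="L-BFGS-B")
        print("optimized parameters:", res.x, "objective:", res.fun)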

  3. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    PubMed

    Otero-Muras, Irene; Banga, Julio R

    2017-04-12

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaptation, and fold-change detection, respectively.
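
    The core operation behind such a framework is extracting the non-dominated (Pareto) subset from a pool of scored candidate designs. A minimal sketch follows; the two cost-type metrics and the random candidate pool are placeholders, not the paper's mixed-integer dynamic optimization formulation.

        # Minimal sketch of extracting a Pareto (non-dominated) set from a finite
        # pool of candidate designs scored on two cost-type metrics; the metrics
        # and candidates are placeholders, not the paper's MIDO formulation.
        import numpy as np

        def pareto_front(costs):
            """Return indices of non-dominated rows (all objectives minimized)."""
            n = costs.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                if not keep[i]:
                    continue
                # a point is dominated if another is <= in all objectives and < in one
                dominated = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
                if dominated.any():
                    keep[i] = False
            return np.where(keep)[0]

        rng = np.random.default_rng(0)
        scores = rng.random((50, 2))       # e.g. (protein cost, response time) per circuit
        print("Pareto-optimal designs:", pareto_front(scores))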

  4. A Framework for Designing Optimal Spacecraft Formations

    DTIC Science & Technology

    2002-09-01

    Only fragments of the report are indexed, drawn from its table of contents and body text: a section on the reference frame, a note that the spacecraft state comprises a minimum of six variables (with additional state variables possible depending on the model), and a section on solving optimal control problems.

  5. Deterministic Design Optimization of Structures in OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework are compared and reported here.
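
    For readers unfamiliar with the framework, the sketch below shows the general shape of an OpenMDAO optimization setup using the current public API (which differs from the 2012-era version and the CometBoards comparison discussed in the paper); the one-component bar-sizing model and its numbers are invented for illustration.

        # Hedged sketch of an OpenMDAO-style optimization using the current
        # public API (not the 2012-era framework version described above).
        # The one-component bar-sizing model is purely illustrative.
        import openmdao.api as om

        class BarSizing(om.ExplicitComponent):
            def setup(self):
                self.add_input('area', val=1.0e-3)          # m^2
                self.add_output('mass', val=0.0)            # kg
                self.add_output('stress', val=0.0)          # Pa
                self.declare_partials('*', '*', method='fd')

            def compute(self, inputs, outputs):
                length, density, load = 2.0, 7850.0, 1.0e4
                outputs['mass'] = density * inputs['area'] * length
                outputs['stress'] = load / inputs['area']

        prob = om.Problem()
        prob.model.add_subsystem('bar', BarSizing(), promotes=['*'])
        prob.driver = om.ScipyOptimizeDriver()
        prob.driver.options['optimizer'] = 'SLSQP'
        prob.model.add_design_var('area', lower=1.0e-5, upper=1.0e-2)
        prob.model.add_objective('mass')
        prob.model.add_constraint('stress', upper=250.0e6)  # assumed yield limit
        prob.setup()
        prob.run_driver()
        print(prob.get_val('area'), prob.get_val('mass'))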

  6. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  7. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  8. A Framework to Design and Optimize Chemical Flooding Processes

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  9. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    SciTech Connect

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2004-11-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  10. A framework for robust flight control design using constrained optimization

    NASA Technical Reports Server (NTRS)

    Palazoglu, A.; Yousefpor, M.; Hess, R. A.

    1992-01-01

    An analytical framework is described for the design of feedback control systems to meet specified performance criteria in the presence of structured and unstructured uncertainty. Attention is focused upon the linear time invariant, single-input, single-output problem for the purposes of exposition. The framework provides for control of the degree of the stabilizing compensator or controller.

  11. Optimal Aeroacoustic Shape Design Using the Surrogate Management Framework

    DTIC Science & Technology

    2004-02-09

    Only fragments of the report are indexed, drawn from its acknowledgments and reference list: the authors thank the IMA for providing a forum for collaboration and acknowledge valuable discussions with Charles Audet and Petros Koumoutsakos; the cited works include N. Hansen, D. Müller, and P. Koumoutsakos on reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation, and a Koumoutsakos co-authored brief on optimal aeroacoustic shape design using approximation modeling (Annual Research Briefs, Center for Turbulence Research, Stanford).

  12. A geometry optimization framework for photonic crystal design

    NASA Astrophysics Data System (ADS)

    Hart, E. E.; Sóbester, A.; Djidjeli, K.; Molinari, M.; Thomas, K. S.; Cox, S. J.

    2012-01-01

    The performance of photonic crystal devices can depend strongly on their geometry. Alas, their fundamental physics offers relatively little by way of pointers in terms of optimum shapes, so numerical design search techniques must be used in an attempt to determine high performance layouts. We discuss strategies for solving this type of optimization problem, the main challenge of which is the conflict between the enormous size of the space of potentially useful designs and the relatively high computational cost of evaluating the performance of putative shapes. The optimization technique proposed here operates over increasing levels of fidelity, both in terms of the resolution of its non-parametric shape definition and in terms of the resolution of the numerical analysis of the performance of putative designs. This is a generic method, potentially applicable to any type of electromagnetic device shape design problem. We also consider a methodology for assessing the robustness of the optima generated through this process, investigating the impact of manufacturing errors on their performance. As an illustration, we apply this technology to the design of a two-dimensional photonic crystal structure; the result features a large complete band gap structure and a topology that is different from previously published designs.

  13. Reduced Design Load Basis for Ultimate Blade Loads Estimation in Multidisciplinary Design Optimization Frameworks

    NASA Astrophysics Data System (ADS)

    Pavese, Christian; Tibaldi, Carlo; Larsen, Torben J.; Kim, Taeseong; Thomsen, Kenneth

    2016-09-01

    The aim is to provide a fast and reliable approach to estimate ultimate blade loads for a multidisciplinary design optimization (MDO) framework. For blade design purposes, the standards require a large amount of computationally expensive simulations, which cannot be efficiently run at each cost function evaluation of an MDO process. This work describes a method that allows integrating the calculation of the blade load envelopes inside an MDO loop. Ultimate blade load envelopes are calculated for a baseline design and a design obtained after an iteration of an MDO. These envelopes are computed for a full standard design load basis (DLB) and a deterministic reduced DLB. Ultimate loads extracted from the two DLBs for each of the two blade designs are compared and analyzed. Although the reduced DLB supplies ultimate loads of different magnitude, the shapes of the estimated envelopes are similar to those computed using the full DLB. This observation is used to propose a scheme that is computationally cheap and that can be integrated inside an MDO framework, providing a sufficiently reliable estimation of the blade ultimate loading. The latter aspect is of key importance when design variables implementing passive control methodologies are included in the formulation of the optimization problem. An MDO of a 10 MW wind turbine blade is presented as an applied case study to show the efficacy of the reduced DLB concept.

  14. ROSE: The Design of a General Tool for the Independent Optimization of Object-Oriented Frameworks

    SciTech Connect

    Davis, K.; Philip, B.; Quinlan, D.

    1999-05-18

    ROSE represents a programmable preprocessor for the highly aggressive optimization of C++ object-oriented frameworks. A fundamental feature of ROSE is that it preserves the semantics, the implicit meaning, of the object-oriented framework's abstractions throughout the optimization process, permitting the framework's abstractions to be recognized and optimizations to capitalize upon the added value of the framework's true meaning. In contrast, a C++ compiler only sees the semantics of the C++ language and thus is severely limited in what optimizations it can introduce. The use of the semantics of the framework's abstractions avoids program analysis that would be incapable of recapturing the framework's full semantics from those of the C++ language implementation of the application or framework; for example, no level of program analysis within the C++ compiler could be expected to recognize the use of adaptive mesh refinement and introduce optimizations based upon such information. Since ROSE is programmable, additional specialized program analysis is possible, which then complements the semantics of the framework's abstractions. Enabling an optimization mechanism to use the high-level semantics of the framework's abstractions together with a programmable level of program analysis (e.g. dependence analysis), at the level of the framework's abstractions, allows for the design of high performance object-oriented frameworks with uniquely tailored sophisticated optimizations far beyond the limits of contemporary serial FORTRAN 77, C or C++ language compiler technology. In short, faster, more highly aggressive optimizations are possible. The resulting optimizations are literally driven by the framework's definition of its abstractions. Since the abstractions within a framework are of third party design, the optimizations are similarly of third party design, specifically independent of the compiler and the applications that use the framework. The interface to ROSE is

  15. A multiobjective optimization framework for multicontaminant industrial water network design.

    PubMed

    Boix, Marianne; Montastruc, Ludovic; Pibouleau, Luc; Azzaro-Pantel, Catherine; Domenech, Serge

    2011-07-01

    The optimal design of multicontaminant industrial water networks according to several objectives is carried out in this paper. The general formulation of the water allocation problem (WAP) is given as a set of nonlinear equations with binary variables representing the presence of interconnections in the network. For optimization purposes, three antagonist objectives are considered: F(1), the freshwater flow-rate at the network entrance, F(2), the water flow-rate at the inlet of regeneration units, and F(3), the number of interconnections in the network. The multiobjective problem is solved via a lexicographic strategy, where a mixed-integer nonlinear programming (MINLP) procedure is used at each step. The approach is illustrated by a numerical example taken from the literature involving five processes, one regeneration unit and three contaminants. The set of potential network solutions is provided in the form of a Pareto front. Finally, the strategy for choosing the best network solution among those given by Pareto fronts is presented. This Multiple Criteria Decision Making (MCDM) problem is tackled by means of two approaches: a classical TOPSIS analysis is first implemented, followed by an innovative strategy based on the global equivalent cost (GEC) in freshwater, which turns out to be more efficient for choosing a good network from a practical point of view.
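
    As an illustration of the decision-making step, a compact TOPSIS ranking is sketched below; the three criteria mirror the paper's objectives (freshwater flow-rate, regenerated flow-rate, number of interconnections), but the candidate values and weights are made up.

        # Illustrative TOPSIS ranking of candidate networks; the criteria mirror
        # the paper's objectives but the data and weights are invented.
        import numpy as np

        def topsis(matrix, weights, benefit):
            """matrix: alternatives x criteria; benefit[j] True if larger is better."""
            norm = matrix / np.linalg.norm(matrix, axis=0)          # vector normalization
            weighted = norm * weights
            ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
            anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
            d_plus = np.linalg.norm(weighted - ideal, axis=1)
            d_minus = np.linalg.norm(weighted - anti, axis=1)
            return d_minus / (d_plus + d_minus)                     # closeness in [0, 1]

        designs = np.array([[60.0, 40.0, 12],     # freshwater, regenerated flow, links
                            [55.0, 55.0, 10],
                            [70.0, 30.0,  8]])
        weights = np.array([0.5, 0.3, 0.2])
        closeness = topsis(designs, weights, benefit=np.array([False, False, False]))
        print("ranking (best first):", np.argsort(closeness)[::-1])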

  16. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
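
    The AHP step of such an assessment reduces pairwise comparison judgments to a priority (weight) vector, typically via the principal eigenvector, with a consistency check. A minimal sketch follows; the criteria and comparison values are invented and are not taken from the paper.

        # Sketch of the AHP step used to weight framework criteria: the principal
        # eigenvector of a pairwise comparison matrix gives the priority vector,
        # and the consistency ratio checks judgment coherence. Values are invented.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],        # e.g. usability vs. extensibility vs. speed
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                    # priority (weight) vector

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index
        print("weights:", w, "consistency ratio:", ci / ri)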

  17. A KBE-enabled design framework for cost/weight optimization study of aircraft composite structures

    NASA Astrophysics Data System (ADS)

    Wang, H.; La Rocca, G.; van Tooren, M. J. L.

    2014-10-01

    Traditionally, minimum weight is the objective when optimizing airframe structures. This optimization, however, does not consider the manufacturing cost, which actually determines the profit of the airframe manufacturer. To this purpose, a design framework has been developed that is able to perform cost/weight multi-objective optimization of an aircraft component, including large topology variations of the structural configuration. The key element of the proposed framework is a dedicated knowledge based engineering (KBE) application, called the multi-model generator, which enables modelling very different product configurations and variants and extracting all data required to feed the weight and cost estimation modules, in a fully automated fashion. The weight estimation method developed in this research work uses Finite Element Analysis to calculate the internal stresses of the structural elements and an analytical composite plate sizing method to determine their minimum required thicknesses. The manufacturing cost estimation module was developed on the basis of a cost model available in the literature. The capability of the framework was successfully demonstrated by designing and optimizing the composite structure of a business jet rudder. The case study indicates the design framework is able to find the Pareto optimal set for minimum structural weight and manufacturing cost very quickly. Based on the Pareto set, the rudder manufacturer is in a position to conduct internal trade-off studies between minimum weight and minimum cost solutions, as well as to offer the OEM a full set of optimized options to choose from, rather than a single feasible design.

  18. A Robust and Reliability-Based Optimization Framework for Conceptual Aircraft Wing Design

    NASA Astrophysics Data System (ADS)

    Paiva, Ricardo Miguel

    A robustness- and reliability-based multidisciplinary analysis and optimization framework for aircraft design is presented. Robust Design Optimization and Reliability-Based Design Optimization are merged into a unified formulation which streamlines the setup of optimization problems and aims at preventing foreseeable implementation issues in uncertainty based design. Surrogate models are evaluated to circumvent the intensive computations resulting from using direct evaluation in nondeterministic optimization. Three types of models are implemented in the framework: quadratic interpolation, regression Kriging and artificial neural networks. Regression Kriging presents the best compromise between performance and accuracy in deterministic wing design problems. The performance of the simultaneous implementation of robustness and reliability is evaluated using simple analytic problems and more complex wing design problems, revealing that performance benefits can still be achieved while satisfying probabilistic constraints rather than the simpler (and not as computationally intensive) robust constraints. The latter are proven to be unable to follow a reliability constraint as uncertainty in the input variables increases. The computational effort of the reliability analysis is further reduced through the implementation of a coordinate change in the respective optimization sub-problem. The computational tool developed is a stand-alone application and presents a user-friendly graphical user interface. The multidisciplinary analysis and design optimization tool includes modules for aerodynamics, structural, aeroelastic and cost analysis, which can be used either individually or coupled.

  19. A general framework of marker design with optimal allocation to assess clinical utility.

    PubMed

    Tang, Liansheng; Zhou, Xiao-Hua

    2013-02-20

    This paper proposes a general framework of marker validation designs, which includes most of existing validation designs. The sample size calculation formulas for the proposed general design are derived on the basis of the optimal allocation that minimizes the expected number of treatment failures. The optimal allocation is especially important in the targeted design which is often motivated by preliminary evidence that marker-positive patients respond to one treatment better than the other. Our sample size calculation also takes into account the classification error of a marker. The numerical studies are conducted to investigate the expected reduction on the treatment failures and the relative efficiency between the targeted design and the traditional design based on the optimal ratios. We illustrate the calculation of the optimal allocation and sample sizes through a hypothetical stage II colon cancer trial.

  20. A Hybrid Optimization Framework with POD-based Order Reduction and Design-Space Evolution Scheme

    NASA Astrophysics Data System (ADS)

    Ghoman, Satyajit S.

    The main objective of this research is to develop an innovative multi-fidelity multi-disciplinary design, analysis and optimization suite that integrates certain solution generation codes and newly developed innovative tools to improve the overall optimization process. The research performed herein is divided into two parts: (1) the development of an MDAO framework by integration of variable fidelity physics-based computational codes, and (2) enhancements to such a framework by incorporating innovative features extending its robustness. The first part of this dissertation describes the development of a conceptual Multi-Fidelity Multi-Strategy and Multi-Disciplinary Design Optimization Environment (M3DOE), in context of aircraft wing optimization. M3DOE provides the user a capability to optimize configurations with a choice of (i) the level of fidelity desired, (ii) the use of a single-step or multi-step optimization strategy, and (iii) combination of a series of structural and aerodynamic analyses. The modularity of M3DOE allows it to be a part of other inclusive optimization frameworks. The M3DOE is demonstrated within the context of shape and sizing optimization of the wing of a Generic Business Jet aircraft. Two different optimization objectives, viz. dry weight minimization, and cruise range maximization are studied by conducting one low-fidelity and two high-fidelity optimization runs to demonstrate the application scope of M3DOE. The second part of this dissertation describes the development of an innovative hybrid optimization framework that extends the robustness of M3DOE by employing a proper orthogonal decomposition-based design-space order reduction scheme combined with the evolutionary algorithm technique. The POD method of extracting dominant modes from an ensemble of candidate configurations is used for the design-space order reduction. The snapshot of candidate population is updated iteratively using evolutionary algorithm technique of
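
    The POD-based order reduction mentioned above is, at its core, a singular value decomposition of a snapshot matrix of candidate designs. The sketch below shows that step on random stand-in data; the snapshot contents and the 99% energy threshold are assumptions for illustration.

        # Minimal sketch of the POD step: dominant modes are extracted from a
        # snapshot matrix of candidate designs via the SVD, and each design is
        # re-expressed by a few modal coefficients. Data here are random stand-ins.
        import numpy as np

        rng = np.random.default_rng(1)
        snapshots = rng.random((200, 40))         # 40 candidate designs, 200 shape variables each
        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.99)) + 1     # modes capturing 99% of variance
        modes = U[:, :r]

        coeffs = modes.T @ (snapshots - mean)          # reduced design-space coordinates
        reconstructed = mean + modes @ coeffs          # back to full design variables
        print("retained modes:", r, "max error:", np.abs(reconstructed - snapshots).max())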

  1. Optimization of experimental design in fMRI: a general framework using a genetic algorithm.

    PubMed

    Wager, Tor D; Nichols, Thomas E

    2003-02-01

    This article describes a method for selecting design parameters and a particular sequence of events in fMRI so as to maximize statistical power and psychological validity. Our approach uses a genetic algorithm (GA), a class of flexible search algorithms that optimize designs with respect to single or multiple measures of fitness. Two strengths of the GA framework are that (1) it operates with any sort of model, allowing for very specific parameterization of experimental conditions, including nonstandard trial types and experimentally observed scanner autocorrelation, and (2) it is flexible with respect to fitness criteria, allowing optimization over known or novel fitness measures. We describe how genetic algorithms may be applied to experimental design for fMRI, and we use the framework to explore the space of possible fMRI design parameters, with the goal of providing information about optimal design choices for several types of designs. In our simulations, we considered three fitness measures: contrast estimation efficiency, hemodynamic response estimation efficiency, and design counterbalancing. Although there are inherent trade-offs between these three fitness measures, GA optimization can produce designs that outperform random designs on all three criteria simultaneously.
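
    A stripped-down version of such a GA over binary event sequences is sketched below; the fitness used here merely rewards balance and alternation between two trial types and stands in for the paper's contrast and HRF estimation efficiency measures.

        # Toy genetic algorithm over event sequences, in the spirit of the GA
        # described above; the fitness is a stand-in, not the paper's metrics.
        import numpy as np

        rng = np.random.default_rng(0)
        N_TRIALS, POP, GENS = 60, 40, 200

        def fitness(seq):
            # reward balanced counts and frequent alternation between conditions
            balance = 1.0 - abs(seq.mean() - 0.5)
            alternation = np.mean(seq[1:] != seq[:-1])
            return balance + alternation

        pop = rng.integers(0, 2, size=(POP, N_TRIALS))
        for _ in range(GENS):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[-POP // 2:]]          # truncation selection
            cut = rng.integers(1, N_TRIALS, size=POP // 2)
            kids = np.array([np.concatenate([parents[i][:c], parents[(i + 1) % len(parents)][c:]])
                             for i, c in enumerate(cut)])           # one-point crossover
            flip = rng.random(kids.shape) < 0.02                    # mutation
            kids[flip] = 1 - kids[flip]
            pop = np.vstack([parents, kids])

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print("best sequence:", best, "fitness:", fitness(best))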

  2. Stochastic optimization framework (SOF) for computer-optimized design, engineering, and performance of multi-dimensional systems and processes

    NASA Astrophysics Data System (ADS)

    Fink, Wolfgang

    2008-04-01

    Many systems and processes, both natural and artificial, may be described by parameter-driven mathematical and physical models. We introduce a generally applicable Stochastic Optimization Framework (SOF) that can be interfaced to or wrapped around such models to optimize model outcomes by effectively "inverting" them. The Visual and Autonomous Exploration Systems Research Laboratory (http://autonomy.caltech.edu) at the California Institute of Technology (Caltech) has long-term experience in the optimization of multi-dimensional systems and processes. Several examples of successful application of a SOF are reviewed and presented, including biochemistry, robotics, device performance, mission design, parameter retrieval, and fractal landscape optimization. Applications of a SOF are manifold, such as in science, engineering, industry, defense & security, and reconnaissance/exploration. Keywords: Multi-parameter optimization, design/performance optimization, gradient-based steepest-descent methods, local minima, global minimum, degeneracy, overlap parameter distribution, fitness function, stochastic optimization framework, Simulated Annealing, Genetic Algorithms, Evolutionary Algorithms, Genetic Programming, Evolutionary Computation, multi-objective optimization, Pareto-optimal front, trade studies.
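
    As a schematic of the stochastic optimization idea, the sketch below wraps plain simulated annealing around a black-box forward model; the quadratic test model and cooling schedule are placeholders, not the laboratory's actual framework.

        # Schematic of the stochastic-optimization idea (here plain simulated
        # annealing) wrapped around a black-box model; the quadratic test model
        # is a placeholder for the parameter-driven models discussed above.
        import math
        import numpy as np

        def model_outcome(params):                    # stand-in "forward model"
            target = np.array([1.0, -2.0, 0.5])
            return float(np.sum((params - target) ** 2))

        rng = np.random.default_rng(2)
        x = rng.normal(size=3)
        f = model_outcome(x)
        T = 1.0
        for step in range(5000):
            cand = x + rng.normal(scale=0.1, size=3)  # random perturbation
            fc = model_outcome(cand)
            if fc < f or rng.random() < math.exp(-(fc - f) / T):
                x, f = cand, fc                       # accept better, or occasionally worse
            T *= 0.999                                # cooling schedule
        print("recovered parameters:", x, "misfit:", f)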

  3. Axiomatic Design of a Framework for the Comprehensive Optimization of Patient Flows in Hospitals

    PubMed Central

    Matt, Dominik T.

    2017-01-01

    Lean Management and Six Sigma are nowadays applied not only to the manufacturing industry but also to service industry and public administration. The manifold variables affecting the Health Care system minimize the effect of a narrow Lean intervention. Therefore, this paper aims to discuss a comprehensive, system-based approach to achieve a factual holistic optimization of patient flows. This paper debates the efficacy of Lean principles applied to the optimization of patient flows and related activities, structures, and resources, developing a theoretical framework based on the principles of Axiomatic Design. The demand for patient-oriented and efficient health services leads to the use of these methodologies to improve hospital processes. In the framework, patients with similar characteristics are clustered in families to achieve homogeneous flows through the value stream. An optimization checklist is outlined as the result of the mapping between Functional Requirements and Design Parameters, with the right sequence of the steps to optimize the patient flow according to the principles of Axiomatic Design. The Axiomatic Design-based top-down implementation of Health Care evidence, according to Lean principles, results in a holistic optimization of hospital patient flows, by reducing the complexity of the system.

  4. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    NASA Technical Reports Server (NTRS)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

    The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.

  5. SBROME: a scalable optimization and module matching framework for automated biosystems design.

    PubMed

    Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias

    2013-05-17

    The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we here propose a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed to subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are subsequently characterized and deposited back to the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.

  6. Materials design by evolutionary optimization of functional groups in metal-organic frameworks.

    PubMed

    Collins, Sean P; Daff, Thomas D; Piotrkowski, Sarah S; Woo, Tom K

    2016-11-01

    A genetic algorithm that efficiently optimizes a desired physical or functional property in metal-organic frameworks (MOFs) by evolving the functional groups within the pores has been developed. The approach has been used to optimize the CO2 uptake capacity of 141 experimentally characterized MOFs under conditions relevant for postcombustion CO2 capture. A total search space of 1.65 trillion structures was screened, and 1035 derivatives of 23 different parent MOFs were identified as having exceptional CO2 uptakes of >3.0 mmol/g (at 0.15 atm and 298 K). Many well-known MOF platforms were optimized, with some, such as MIL-47, having their CO2 adsorption increase by more than 400%. The structures of the high-performing MOFs are provided as potential targets for synthesis.

  7. Materials design by evolutionary optimization of functional groups in metal-organic frameworks

    PubMed Central

    Collins, Sean P.; Daff, Thomas D.; Piotrkowski, Sarah S.; Woo, Tom K.

    2016-01-01

    A genetic algorithm that efficiently optimizes a desired physical or functional property in metal-organic frameworks (MOFs) by evolving the functional groups within the pores has been developed. The approach has been used to optimize the CO2 uptake capacity of 141 experimentally characterized MOFs under conditions relevant for postcombustion CO2 capture. A total search space of 1.65 trillion structures was screened, and 1035 derivatives of 23 different parent MOFs were identified as having exceptional CO2 uptakes of >3.0 mmol/g (at 0.15 atm and 298 K). Many well-known MOF platforms were optimized, with some, such as MIL-47, having their CO2 adsorption increase by more than 400%. The structures of the high-performing MOFs are provided as potential targets for synthesis. PMID:28138523

  8. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  9. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.

  10. Multi-layer photonics modeling framework for the design, analysis, and optimization of devices, links, and networks

    NASA Astrophysics Data System (ADS)

    Richter, André; Louchet, Hadrien; Arellano, Cristina; Farina, Jim; Koltchanov, Igor

    2011-01-01

    Requirements on photonics modeling vary significantly when aiming to design, analyze and optimize a single device, a complete transmission link or a complex network. Depending on the task at hand, different levels of detail for emulating the underlying physical characteristics and signal interactions are necessary. We present a multi-layer photonics modeling framework that addresses the different design challenges of devices, links and networks. Our discussed methodology is based on flexible optical signal representations, equipment models ranging from very detailed to high-level parametric, sophisticated numerical algorithms and means for automated parameter and technology variation and optimization. We discuss applications such as the detailed modeling on photonics integrated circuit level, the characterization of a high-speed transmission link utilizing multilevel modulation and coherent detection, the parametric analysis of transmission links and network dynamics, and the cost-optimized placement of equipment in moderately complex networks.

  11. The Automatic Neuroscientist: A framework for optimizing experimental design with closed-loop real-time fMRI

    PubMed Central

    Lorenz, Romy; Monti, Ricardo Pio; Violante, Inês R.; Anagnostopoulos, Christoforos; Faisal, Aldo A.; Montana, Giovanni; Leech, Robert

    2016-01-01

    Functional neuroimaging typically explores how a particular task activates a set of brain regions. Importantly though, the same neural system can be activated by inherently different tasks. To date, there is no approach available that systematically explores whether and how distinct tasks probe the same neural system. Here, we propose and validate an alternative framework, the Automatic Neuroscientist, which turns the standard fMRI approach on its head. We use real-time fMRI in combination with modern machine-learning techniques to automatically design the optimal experiment to evoke a desired target brain state. In this work, we present two proof-of-principle studies involving perceptual stimuli. In both studies optimization algorithms of varying complexity were employed; the first involved a stochastic approximation method while the second incorporated a more sophisticated Bayesian optimization technique. In the first study, we achieved convergence for the hypothesized optimum in 11 out of 14 runs in less than 10 min. Results of the second study showed how our closed-loop framework accurately and with high efficiency estimated the underlying relationship between stimuli and neural responses for each subject in one to two runs: with each run lasting 6.3 min. Moreover, we demonstrate that using only the first run produced a reliable solution at a group-level. Supporting simulation analyses provided evidence on the robustness of the Bayesian optimization approach for scenarios with low contrast-to-noise ratio. This framework is generalizable to numerous applications, ranging from optimizing stimuli in neuroimaging pilot studies to tailoring clinical rehabilitation therapy to patients and can be used with multiple imaging modalities in humans and animals. PMID:26804778
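
    A conceptual sketch of the closed-loop idea follows, using a Gaussian-process surrogate with an expected-improvement acquisition rule (one common form of Bayesian optimization); the simulated one-parameter "brain response" stands in for real-time fMRI measurements and is not the authors' model.

        # Conceptual sketch of closed-loop Bayesian optimization with a GP
        # surrogate and expected improvement; the "brain response" is simulated.
        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def brain_response(stim):                       # unknown stimulus-response curve (toy)
            return -(stim - 0.65) ** 2 + 0.05 * np.random.default_rng(0).normal()

        X = np.array([[0.1], [0.9]])                    # initial stimulus parameters
        y = np.array([brain_response(x[0]) for x in X])
        grid = np.linspace(0, 1, 200).reshape(-1, 1)

        for run in range(8):                            # one "run" per acquisition
            gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
            gp.fit(X, y)
            mu, sigma = gp.predict(grid, return_std=True)
            best = y.max()
            z = (mu - best) / np.maximum(sigma, 1e-9)
            ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
            x_next = grid[np.argmax(ei)]
            X = np.vstack([X, [x_next]])
            y = np.append(y, brain_response(x_next[0]))

        print("estimated optimal stimulus:", X[np.argmax(y)][0])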

  12. Simulation-Based Design for Wearable Robotic Systems: An Optimization Framework for Enhancing a Standing Long Jump.

    PubMed

    Ong, Carmichael F; Hicks, Jennifer L; Delp, Scott L

    2016-05-01

    Technologies that augment human performance are the focus of intensive research and development, driven by advances in wearable robotic systems. Success has been limited by the challenge of understanding human-robot interaction. To address this challenge, we developed an optimization framework to synthesize a realistic human standing long jump and used the framework to explore how simulated wearable robotic devices might enhance jump performance. A planar, five-segment, seven-degree-of-freedom model with physiological torque actuators, which have variable torque capacity depending on joint position and velocity, was used to represent human musculoskeletal dynamics. An active augmentation device was modeled as a torque actuator that could apply a single pulse of up to 100 Nm of extension torque. A passive design was modeled as rotational springs about each lower limb joint. Dynamic optimization searched for physiological and device actuation patterns to maximize jump distance. Optimization of the nominal case yielded a 2.27 m jump that captured salient kinematic and kinetic features of human jumps. When the active device was added to the ankle, knee, or hip, jump distance increased to between 2.49 and 2.52 m. Active augmentation of all three joints increased the jump distance to 3.10 m. The passive design increased jump distance to 3.32 m by adding torques of 135, 365, and 297 Nm to the ankle, knee, and hip, respectively. Dynamic optimization can be used to simulate a standing long jump and investigate human-robot interaction. Simulation can aid in the design of performance-enhancing technologies.

  13. Interactive Genetic Algorithm - An Adaptive and Interactive Decision Support Framework for Design of Optimal Groundwater Monitoring Plans

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Minsker, B. S.

    2006-12-01

    In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem-dependent criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization-simulation system can be a very non-trivial issue. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective?, and b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm? The application of these frameworks on a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants, b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives

  14. Multi-Disciplinary Analysis and Optimization Frameworks

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia Gutierrez

    2009-01-01

    Since July 2008, the Multidisciplinary Analysis & Optimization Working Group (MDAO WG) of the Systems Analysis Design & Optimization (SAD&O) discipline in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project completed one major milestone, Define Architecture & Interfaces for Next Generation Open Source MDAO Framework Milestone (9/30/08), and is completing the Generation 1 Framework validation milestone, which is due December 2008. Included in the presentation are: details of progress on developing the OpenMDAO framework, modeling and testing the Generation 1 Framework, progress toward establishing partnerships with external parties, and discussion of additional potential collaborations.

  15. A framework for simultaneous aerodynamic design optimization in the presence of chaos

    SciTech Connect

    Günther, Stefanie; Gauger, Nicolas R.; Wang, Qiqi

    2017-01-01

    Integrating existing solvers for unsteady partial differential equations into a simultaneous optimization method is challenging due to the forward-in-time information propagation of classical time-stepping methods. This paper applies the simultaneous single-step one-shot optimization method to a reformulated unsteady constraint that allows for both forward- and backward-in-time information propagation. Especially in the presence of chaotic and turbulent flow, solving the initial value problem simultaneously with the optimization problem often scales poorly with the time domain length. The new formulation relaxes the initial condition and instead solves a least squares problem for the discrete partial differential equations. This enables efficient one-shot optimization that is independent of the time domain length, even in the presence of chaos.
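
    Schematically (in our notation, not the authors'), the relaxed formulation drops the fixed initial condition and replaces the exact time-marching recursion u_{n+1} = \Phi(u_n, a) by a least-squares problem over all time levels,

        \min_{u_0, \dots, u_N} \; \sum_{n=0}^{N-1} \big\| u_{n+1} - \Phi(u_n, a) \big\|^2 ,

    which is advanced simultaneously (one-shot) with the update of the design variables a that minimize the aerodynamic objective J(u, a); no condition is imposed on u_0.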

  16. A framework for simultaneous aerodynamic design optimization in the presence of chaos

    NASA Astrophysics Data System (ADS)

    Günther, Stefanie; Gauger, Nicolas R.; Wang, Qiqi

    2017-01-01

    Integrating existing solvers for unsteady partial differential equations into a simultaneous optimization method is challenging due to the forward-in-time information propagation of classical time-stepping methods. This paper applies the simultaneous single-step one-shot optimization method to a reformulated unsteady constraint that allows for both forward- and backward-in-time information propagation. Especially in the presence of chaotic and turbulent flow, solving the initial value problem simultaneously with the optimization problem often scales poorly with the time domain length. The new formulation relaxes the initial condition and instead solves a least squares problem for the discrete partial differential equations. This enables efficient one-shot optimization that is independent of the time domain length, even in the presence of chaos.

  17. A Virtual Reality Framework to Optimize Design, Operation and Refueling of GEN-IV Reactors.

    SciTech Connect

    Rizwan-uddin; Nick Karancevic; Stefano Markidis; Joel Dixon; Cheng Luo; Jared Reynolds

    2008-04-23

    Many GEN-IV candidate designs are currently under investigation. Technical issues related to material, safety and economics are being addressed at research laboratories, industry and in academia. After safety, economic feasibility is likely to be the most important criterion in the success of GEN-IV design(s). Lessons learned from the designers and operators of GEN-II (and GEN-III) reactors must play a vital role in achieving both safety and economic feasibility goals.

  18. A Heuristic Design Information Sharing Framework for Hard Discrete Optimization Problems

    DTIC Science & Technology

    2007-03-01

    Only fragments of the report are indexed, drawn from its body text and reference list: the text notes that if local search is applied and the local optimum obtained is a global optimum, then the problem is solved; it states that Jacobson and Solow (1993) and Armstrong and Jacobson (2005) investigate the complexity of finding polynomial time improvement algorithms for discrete optimization problems; and the visible references include a paper on security devices in Optimization and Engineering 6(3), 339-359, and S.H. Jacobson and D. Solow (1993), "The Effectiveness of Finite Improvement Algorithms for ..."

  19. A Generalized Framework for Constrained Design Optimization of General Supersonic Configurations Using Adjoint Based Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Karman, Steve L., Jr.

    2011-01-01

    The Aeronautics Research Mission Directorate (ARMD) sent out a NASA Research Announcement (NRA) soliciting proposals for research and technical development. The proposed research program was aimed at addressing the desired milestones and outcomes of ROA (ROA-2006) Subtopic A.4.1.1 Advanced Computational Methods. The second milestone, SUP.1.06.02 Robust, validated mesh adaptation and error quantification for near field Computational Fluid Dynamics (CFD), was addressed by the proposed research. Additional research utilizing the direct links to geometry through a CAD interface enabled by this work will allow for geometric constraints to be applied and address the final milestone, SUP2.07.06 Constrained low-drag supersonic aerodynamic design capability. The original product of the proposed research program was an integrated system of tools that can be used for the mesh mechanics required for rapid high fidelity analysis and for design of supersonic cruise vehicles. These Euler and Navier-Stokes volume grid manipulation tools were proposed to efficiently use parallel processing. The mesh adaptation provides a systematic approach for achieving demonstrated levels of accuracy in the solutions. NASA chose to fund only the mesh generation/adaptation portion of the proposal. This report therefore describes the completion of the proposed tasks for mesh creation, manipulation and adaptation as it pertains to sonic boom prediction of supersonic configurations.

  20. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    SciTech Connect

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia; Hough, Patricia Diane; Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  1. Optimal Design in Three-Level Block Randomized Designs with Two Levels of Nesting: An ANOVA Framework with Random Effects

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2013-01-01

    Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…

  2. An optimization-based design framework for steering steady states and improving robustness of glycolysis-glycogenolysis pathway.

    PubMed

    Panja, Surajit; Patra, Sourav; Mukherjee, Anirban; Basu, Madhumita; Sengupta, Sanghamitra; Dutta, Pranab K

    2013-02-01

    A robust synthesis technique is devised for synergism and saturation systems, commonly known as S-systems, for controlling the steady states of the glycolysis-glycogenolysis pathway. The development of the robust biochemical network is essential owing to the fragile response to the perturbation of intrinsic and extrinsic parameters of the nominal S-system. The synthesis problem is formulated in a computationally attractive convex optimization framework. The linear matrix inequalities are framed to aim at the minimization of steady-state error, improvement of robustness, and utilization of minimum control input to the biochemical network.
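
    For reference, an S-system collects power-law production and degradation terms for each metabolite (standard notation, not specific to this paper's glycolysis-glycogenolysis instance):

        \frac{dX_i}{dt} \;=\; \alpha_i \prod_{j=1}^{n} X_j^{\,g_{ij}} \;-\; \beta_i \prod_{j=1}^{n} X_j^{\,h_{ij}}, \qquad i = 1, \dots, n,

    where \alpha_i, \beta_i \ge 0 are rate constants and g_{ij}, h_{ij} are kinetic orders; the synthesis problem described above tunes such parameters, with the steady-state and robustness requirements posed as linear matrix inequalities.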

  3. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    SciTech Connect

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.; Jakeman, John Davis; Swiler, Laura Painton; Stephens, John Adam; Vigil, Dena M.; Wildey, Timothy Michael; Bohnhoff, William J.; Eddy, John P.; Hu, Kenneth T.; Dalbey, Keith R.; Bauman, Lara E; Hough, Patricia Diane

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  4. An Optimization Framework for Driver Feedback Systems

    SciTech Connect

    Malikopoulos, Andreas; Aguilar, Juan P.

    2013-01-01

    Modern vehicles have sophisticated electronic control units that can control engine operation with discretion to balance fuel economy, emissions, and power. These control units are designed for specific driving conditions (e.g., different speed profiles for highway and city driving). However, individual driving styles are different and rarely match the specific driving conditions for which the units were designed. In the research reported here, we investigate driving-style factors that have a major impact on fuel economy and construct an optimization framework to optimize individual driving styles with respect to these driving factors. In this context, we construct a set of polynomial metamodels to reflect the responses produced in fuel economy by changing the driving factors. Then, we compare the optimized driving styles to the original driving styles and evaluate the effectiveness of the optimization framework. Finally, we use this proposed framework to develop a real-time feedback system, including visual instructions, to enable drivers to alter their driving styles in response to actual driving conditions to improve fuel efficiency.
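
    A minimal sketch of the metamodel idea, assuming two hypothetical driving-style factors and entirely synthetic data: fit a quadratic polynomial response of fuel economy and then search it for the best factor settings.

```python
# Sketch: fit a quadratic polynomial metamodel of fuel economy over two
# hypothetical driving-style factors (mean acceleration, cruise speed) and
# search it for the best style.  All data here are synthetic placeholders.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform([0.5, 60.0], [3.0, 120.0], size=(50, 2))   # [accel m/s^2, speed km/h]
# Synthetic "measured" fuel economy (higher is better) with noise.
y = 40 - 4 * (X[:, 0] - 1.0) ** 2 - 0.002 * (X[:, 1] - 90) ** 2 + rng.normal(0, 0.5, 50)

poly = PolynomialFeatures(degree=2)
model = LinearRegression().fit(poly.fit_transform(X), y)

def neg_fuel_economy(x):
    return -model.predict(poly.transform(x.reshape(1, -1)))[0]

res = minimize(neg_fuel_economy, x0=np.array([1.5, 90.0]),
               bounds=[(0.5, 3.0), (60.0, 120.0)])
print("suggested driving factors:", res.x, "predicted economy:", -res.fun)
```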

  5. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that incorporates rework stations into the manufacturing process by identifying the number and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability, and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing the reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several solutions in order to obtain feasible optimal solutions. The GA
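
    The sketch below illustrates, under invented costs and benefits, how a simple genetic algorithm can search over the placement of rework stations; it is not the study's GA or cost model.

```python
# Minimal genetic-algorithm sketch for choosing rework-station locations.
# Chromosome bit i = 1 places a rework station after workstation i.
# Costs, times, and benefits below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_stations = 10
station_cost = rng.uniform(1.0, 3.0, n_stations)     # cost of adding rework here
cycle_penalty = rng.uniform(0.2, 1.0, n_stations)    # added cycle time
rework_benefit = rng.uniform(0.5, 4.0, n_stations)   # value of catching defects here

def fitness(bits):
    # Lower is better: cost + cycle time - benefit (weights are arbitrary).
    return bits @ station_cost + bits @ cycle_penalty - 1.5 * (bits @ rework_benefit)

def evolve(pop_size=40, generations=100, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, n_stations))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # Tournament selection between random pairs.
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(scores[idx[:, 0]] < scores[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # Single-point crossover between consecutive parents, then bit-flip mutation.
        cut = rng.integers(1, n_stations, size=(pop_size, 1))
        gene_idx = np.arange(n_stations)
        children = np.where(gene_idx < cut, parents, np.roll(parents, 1, axis=0))
        mutate = rng.random(children.shape) < p_mut
        pop = np.where(mutate, 1 - children, children)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmin()], scores.min()

best, best_score = evolve()
print("rework stations at:", np.flatnonzero(best), "score:", best_score)
```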

  6. Winglet design using multidisciplinary design optimization techniques

    NASA Astrophysics Data System (ADS)

    Elham, Ali; van Tooren, Michel J. L.

    2014-10-01

    A quasi-three-dimensional aerodynamic solver is integrated with a semi-analytical structural weight estimation method inside a multidisciplinary design optimization framework to design and optimize a winglet for a passenger aircraft. The winglet is optimized for minimum drag and minimum structural weight. The Pareto front between those two objective functions is found applying a genetic algorithm. The aircraft minimum take-off weight and the aircraft minimum direct operating cost are used to select the best winglets among those on the Pareto front.
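
    A minimal sketch of the two-objective selection step, with synthetic (drag, weight) candidates and a hypothetical scalar merit standing in for take-off weight or direct operating cost:

```python
# Sketch: extract the Pareto front of (drag, weight) candidates and pick one
# design with a scalar merit.  The candidate values and the merit weights are
# hypothetical; the paper uses take-off weight and direct operating cost.
import numpy as np

rng = np.random.default_rng(2)
drag = rng.uniform(90.0, 110.0, 200)          # drag counts (synthetic)
weight = rng.uniform(200.0, 400.0, 200)       # winglet structural weight, kg (synthetic)
F = np.column_stack([drag, weight])           # both objectives minimized

def pareto_mask(F):
    """True for points not dominated by any other point (minimization)."""
    mask = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

front = F[pareto_mask(F)]
merit = 0.7 * front[:, 0] + 0.3 * front[:, 1]  # hypothetical weighting
best = front[merit.argmin()]
print(f"{len(front)} non-dominated designs; selected drag={best[0]:.1f}, weight={best[1]:.1f}")
```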

  7. Conceptual design optimization study

    NASA Technical Reports Server (NTRS)

    Hollowell, S. J.; Beeman, E. R., II; Hiyama, R. M.

    1990-01-01

    The feasibility of applying multilevel functional decomposition and optimization techniques to conceptual design of advanced fighter aircraft was investigated. Applying the functional decomposition techniques to the conceptual design phase appears to be feasible. The initial implementation of the modified design process will optimize wing design variables. A hybrid approach, combining functional decomposition techniques for generation of aerodynamic and mass properties linear sensitivity derivatives with existing techniques for sizing mission performance and optimization, is proposed.

  8. FRANOPP: Framework for analysis and optimization problems user's guide

    NASA Technical Reports Server (NTRS)

    Riley, K. M.

    1981-01-01

    Framework for analysis and optimization problems (FRANOPP) is a software aid for the study and solution of design (optimization) problems; it provides the driving program and plotting capability for a user-generated programming system. In addition to FRANOPP, the programming system also contains the optimization code CONMIN and two user-supplied codes, one for analysis and one for output. With FRANOPP the user is provided with five options for studying a design problem. Three of the options utilize the plot capability and present an in-depth study of the design problem. The study can be focused on a history of the optimization process or on the interaction of variables within the design problem.

  9. Multidisciplinary design and optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1992-01-01

    Mutual couplings among the mathematical models of physical phenomena and parts of a system such as an aircraft complicate the design process because each contemplated design change may have a far reaching consequence throughout the system. This paper outlines techniques for computing these influences as system design derivatives useful to both judgmental and formal optimization purposes. The techniques facilitate decomposition of the design process into smaller, more manageable tasks and they form a methodology that can easily fit into existing engineering optimizations and incorporate their design tools.
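
    A toy numerical sketch of system design derivatives for two coupled disciplines (the discipline functions below are invented stand-ins): the coupled state is found by fixed-point iteration, and the total derivative follows from one linear solve.

```python
# Sketch: total ("system") design derivatives for two coupled disciplines
#   y1 = f1(x, y2),  y2 = f2(x, y1)
# obtained by solving (I - dF/dy) dy/dx = dF/dx.  The functions are toy stand-ins.
import numpy as np

def f1(x, y2):
    return 2.0 * x + 0.3 * y2          # discipline 1 output

def f2(x, y1):
    return -x + 0.5 * y1               # discipline 2 output

def solve_coupled(x, iters=50):
    y1, y2 = 0.0, 0.0
    for _ in range(iters):             # fixed-point iteration on the coupling
        y1, y2 = f1(x, y2), f2(x, y1)
    return y1, y2

x = 1.0
y1, y2 = solve_coupled(x)

# Partial derivatives (analytic here; in practice finite differences or adjoints).
df_dy = np.array([[0.0, 0.3],          # d f1/d y1, d f1/d y2
                  [0.5, 0.0]])         # d f2/d y1, d f2/d y2
df_dx = np.array([2.0, -1.0])          # d f1/d x,  d f2/d x

dy_dx = np.linalg.solve(np.eye(2) - df_dy, df_dx)
print("coupled state:", (y1, y2), "system derivatives dy/dx:", dy_dx)
```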

  10. Structural Analysis in a Conceptual Design Framework

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  11. An Optimization Framework for Workplace Charging Strategies

    SciTech Connect

    Huang, Yongxi; Zhou, Yan

    2015-03-01

    Workplace charging (WPC) has recently been recognized as the most important secondary charging option after residential charging for plug-in electric vehicles (PEVs). The current WPC practice is spontaneous and grants every PEV a designated charger, which may not be practical or economic when a large number of PEVs are present at the workplace. This study is the first research undertaken that develops an optimization framework for WPC strategies to satisfy all charging demand while explicitly addressing different eligible levels of charging technology and employees' demographic distributions. The optimization model minimizes the lifetime cost of equipment, installation, and operations and is formulated as an integer program. We demonstrate the applicability of the model using numerical examples based on national average data. The results indicate that the proposed optimization model can reduce the total cost of running a WPC system by up to 70% compared to the current practice. The WPC strategies are sensitive to the time windows and installation costs and are dominated by the PEV population size. WPC has also been identified as an alternative to public transit subsidy programs for sustainable transportation, with both economic and environmental advantages.
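
    A toy version of such an integer program, with invented costs, capacities, and demand, can be written with SciPy's mixed-integer linear programming interface:

```python
# Sketch: tiny integer program choosing Level-1 vs Level-2 charger counts to
# cover a daily PEV energy demand at minimum lifetime cost.  All costs,
# capacities, and the demand figure are hypothetical placeholders.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Decision variables: x = [n_level1, n_level2]
lifetime_cost = np.array([1500.0, 6000.0])     # equipment + install + operation, per unit
daily_kwh_capacity = np.array([12.0, 60.0])    # energy one charger can deliver per day
daily_demand_kwh = 480.0                       # total fleet demand to cover

# Minimize cost subject to capacity >= demand, with integer non-negative counts.
capacity_con = LinearConstraint(A=[daily_kwh_capacity], lb=[daily_demand_kwh], ub=[np.inf])
res = milp(c=lifetime_cost,
           constraints=[capacity_con],
           integrality=np.ones(2),              # both variables integer
           bounds=Bounds(lb=0, ub=np.inf))
print("chargers (L1, L2):", res.x, "lifetime cost:", res.fun)
```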

  12. An Optimization Framework for Dynamic Hybrid Energy Systems

    SciTech Connect

    Wenbo Du; Humberto E Garcia; Christiaan J.J. Paredis

    2014-03-01

    A computational framework for the efficient analysis and optimization of dynamic hybrid energy systems (HES) is developed. A microgrid system with multiple inputs and multiple outputs (MIMO) is modeled using the Modelica language in the Dymola environment. The optimization loop is implemented in MATLAB, with the FMI Toolbox serving as the interface between the computational platforms. Two characteristic optimization problems are selected to demonstrate the methodology and gain insight into the system performance. The first is an unconstrained optimization problem that optimizes the dynamic properties of the battery, reactor and generator to minimize variability in the HES. The second problem takes operating and capital costs into consideration by imposing linear and nonlinear constraints on the design variables. The preliminary optimization results obtained in this study provide an essential step towards the development of a comprehensive framework for designing HES.

  13. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  14. Optimization of digital designs

    NASA Technical Reports Server (NTRS)

    Whitaker, Sterling R. (Inventor); Miles, Lowell H. (Inventor)

    2009-01-01

    An application specific integrated circuit is optimized by translating a first representation of its digital design to a second representation. The second representation includes multiple syntactic expressions that admit a representation of a higher-order function of base Boolean values. The syntactic expressions are manipulated to form a third representation of the digital design.

  15. Hydrodynamic Design Optimization Tool

    DTIC Science & Technology

    2011-08-01

    appreciated. The authors would also like to thank David Walden and Francis Noblesse of Code 50 for being instrumental in defining this project, Wesley...and efficiently during the early stage of the design process. The Computational Fluid Dynamics (CFD) group at George Mason University has an...specific design constraints. In order to apply CFD-based tool to the hydrodynamic design optimization of ship hull forms, an initial hull form is

  16. Towards a hierarchical optimization modeling framework for ...

    EPA Pesticide Factsheets

    Background: Bilevel optimization has been recognized as a 2-player Stackelberg game where players are represented as leaders and followers and each pursue their own set of objectives. Hierarchical optimization problems, which are a generalization of bilevel, are especially difficult because the optimization is nested, meaning that the objectives of one level depend on solutions to the other levels. We introduce a hierarchical optimization framework for spatially targeting multiobjective green infrastructure (GI) incentive policies under uncertainties related to policy budget, compliance, and GI effectiveness. We demonstrate the utility of the framework using a hypothetical urban watershed, where the levels are characterized by multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities), and objectives include minimization of policy cost, implementation cost, and risk; reduction of combined sewer overflow (CSO) events; and improvement in environmental benefits such as reduced nutrient run-off and water availability. Conclusions: While computationally expensive, this hierarchical optimization framework explicitly simulates the interaction between multiple levels of policy makers (e.g., local, regional, national) and policy followers (e.g., landowners, communities) and is especially useful for constructing and evaluating environmental and ecological policy. Using the framework with a hypothetical urba
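
    A toy leader-follower (bilevel) sketch, with invented benefit, cost, and runoff terms and a single follower, illustrates the nesting: the leader's objective can only be evaluated after solving the follower's problem.

```python
# Toy bilevel (Stackelberg) sketch: a policy maker (leader) picks a per-unit
# green-infrastructure subsidy; a landowner (follower) responds by choosing the
# GI area that maximizes private net benefit; the leader minimizes subsidy cost
# plus residual runoff.  All functions and numbers are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

private_benefit = 2.0      # landowner benefit per unit GI area
unit_cost = 5.0            # landowner cost per unit GI area
max_area = 10.0
runoff_weight = 4.0        # leader's weight on residual runoff

def follower_response(subsidy):
    """Landowner maximizes (benefit + subsidy - cost) * area; linear, so bang-bang."""
    net = private_benefit + subsidy - unit_cost
    return max_area if net > 0 else 0.0

def leader_objective(subsidy):
    area = follower_response(subsidy)          # inner (lower-level) problem
    policy_cost = subsidy * area
    residual_runoff = runoff_weight * (max_area - area)
    return policy_cost + residual_runoff

# Leader optimizes the subsidy while anticipating the follower's best response.
res = minimize_scalar(leader_objective, bounds=(0.0, 10.0), method="bounded")
print("subsidy:", round(res.x, 2), "GI area adopted:", follower_response(res.x))
```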

  17. Synthesis and characterization of magnetic metal-organic framework (MOF) as a novel sorbent, and its optimization by experimental design methodology for determination of palladium in environmental samples.

    PubMed

    Bagheri, Akbar; Taghizadeh, Mohsen; Behbahani, Mohammad; Asgharinezhad, Ali Akbar; Salarian, Mani; Dehghani, Ali; Ebrahimzadeh, Homeira; Amini, Mostafa M

    2012-09-15

    This paper describes the synthesis and application of novel magnetic metal-organic framework (MOF) [(Fe(3)O(4)-Pyridine)/Cu(3)(BTC)(2)] for preconcentration of Pd(II) and its determination by flame atomic absorption spectrometry (FAAS). A Box-Behnken design was used to find the optimum conditions for the preconcentration procedure through response surface methodology. Three variables including amount of magnetic MOF, extraction time, and pH of extraction were selected as factors for adsorption step, and in desorption step, four parameters including type, volume, and concentration of eluent, and desorption time were selected in the optimization study. These values were 30 mg, 6 min, 6.9, K(2)SO(4)+NaOH, 6 mL, 9.5 (w/v %)+0.01 mol L(-1), 15.5 min, for amount of MOF, extraction time, pH of extraction, type, volume, and concentration of the eluent, and desorption time, respectively. The preconcentration factor (PF), relative standard deviation (RSD), limit of detection (LOD), and adsorption capacity of the method were found to be 208, 2.1%, 0.37 ng mL(-1), and 105.1 mg g(-1), respectively. It was found that the magnetic MOF has more capacity compared to Fe(3)O(4)-Py. Finally, the magnetic MOF was successfully applied for rapid extraction of trace amounts of Pd (II) ions in fish, sediment, soil, and water samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Optimizing medical data quality based on multiagent web service framework.

    PubMed

    Wu, Ching-Seh; Khoury, Ibrahim; Shah, Hemant

    2012-07-01

    One of the most important issues in e-healthcare information systems is to optimize the medical data quality extracted from distributed and heterogeneous environments, which can greatly improve diagnostic and treatment decision making. This paper proposes a multiagent web service framework based on service-oriented architecture for the optimization of medical data quality in the e-healthcare information system. Based on the design of the multiagent web service framework, an evolutionary algorithm (EA) for the dynamic optimization of the medical data quality is proposed. The framework consists of two main components. First, an EA is used to dynamically optimize the composition of medical processes into an optimal task sequence according to specific quality attributes. Second, a multiagent framework discovers, monitors, and reports any inconsistency between the optimized task sequence and the actual medical records. To demonstrate the proposed framework, experimental results for a breast cancer case study are provided. Furthermore, to show the unique performance of our algorithm, a comparison with other works in the literature is presented.

  19. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed
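
    For reference, a minimal double-loop (nested) RBDO sketch on a toy limit state is shown below; the unilevel and decoupled formulations described above are designed to avoid exactly this repeated inner reliability analysis.

```python
# Minimal nested (double-loop) RBDO sketch: the outer loop minimizes a weight
# proxy over a capacity design variable d; the inner "reliability analysis"
# evaluates the failure probability of a toy limit state g = d - load with a
# normally distributed load.  The limit state and numbers are invented.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

load_mean, load_std = 10.0, 1.5
p_target = 1e-3                               # allowable failure probability

def failure_probability(d):
    # P(load > d) for the toy linear limit state; a real problem would use
    # FORM/SORM or sampling here.
    return norm.sf(d, loc=load_mean, scale=load_std)

def weight(x):
    return x[0]                               # weight proxy grows with capacity

cons = [{"type": "ineq", "fun": lambda x: p_target - failure_probability(x[0])}]
res = minimize(weight, x0=[12.0], bounds=[(8.0, 20.0)], constraints=cons, method="SLSQP")
print(f"capacity d = {res.x[0]:.3f}, Pf = {failure_probability(res.x[0]):.2e}")
```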

  20. An Algorithmic Framework for Multiobjective Optimization

    PubMed Central

    Ganesan, T.; Elamvazuthi, I.; Shaari, Ku Zilati Ku; Vasant, P.

    2013-01-01

    Multiobjective (MO) optimization is an emerging area that is increasingly encountered across many fields. Various metaheuristic techniques such as differential evolution (DE), genetic algorithm (GA), gravitational search algorithm (GSA), and particle swarm optimization (PSO) have been used in conjunction with scalarization techniques such as the weighted-sum approach and the normal-boundary intersection (NBI) method to solve MO problems. Nevertheless, many challenges still arise, especially when dealing with more than two objectives. In addition, problems with extensive computational overhead emerge when dealing with hybrid algorithms. This paper discusses these issues by proposing an alternative framework that utilizes algorithmic concepts related to the problem structure for generating efficient and effective algorithms. This paper proposes a framework to generate new high-performance algorithms with minimal computational overhead for MO optimization. PMID:24470795
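
    A minimal sketch of the weighted-sum scalarization mentioned above, applied to a generic convex two-objective test problem (not the paper's benchmarks):

```python
# Weighted-sum scalarization sketch: sweep the weight w to trace an approximate
# Pareto front for a simple convex two-objective problem (a generic test pair,
# not the paper's benchmarks).  Weighted sums can miss non-convex front regions,
# which is one motivation for methods such as normal-boundary intersection.
import numpy as np
from scipy.optimize import minimize

def f1(x):
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):
    return x[0] ** 2 + (x[1] - 2.0) ** 2

front = []
for w in np.linspace(0.0, 1.0, 11):
    res = minimize(lambda x: w * f1(x) + (1.0 - w) * f2(x), x0=[0.0, 0.0])
    front.append((f1(res.x), f2(res.x)))

for p in front:
    print(f"f1 = {p[0]:6.3f}   f2 = {p[1]:6.3f}")
```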

  1. Constrained Aeroacoustic Shape Optimization Using the Surrogate Management Framework

    DTIC Science & Technology

    2003-12-01

    A. L. Marsden, M. Wang & J. E. Dennis, Jr. rum for collaboration, as well as Charles Audet and Petros Koumoutsakos for valuable discussions...2003 Optimal aeroacoustic shape design using the surrogate management framework. Submitted for review. MARSDEN, A. L., WANG, M. & KOUMOUTSAKOS, P

  2. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 reference manual

    SciTech Connect

    Griffin, Joshua D. (Sandai National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane; Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Guinta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  3. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's reference manual.

    SciTech Connect

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  4. Topology Optimization for Architected Materials Design

    NASA Astrophysics Data System (ADS)

    Osanov, Mikhail; Guest, James K.

    2016-07-01

    Advanced manufacturing processes provide a tremendous opportunity to fabricate materials with precisely defined architectures. To fully leverage these capabilities, however, materials architectures must be optimally designed according to the target application, base material used, and specifics of the fabrication process. Computational topology optimization offers a systematic, mathematically driven framework for navigating this new design challenge. The design problem is posed and solved formally as an optimization problem with unit cell and upscaling mechanics embedded within this formulation. This article briefly reviews the key requirements to apply topology optimization to materials architecture design and discusses several fundamental findings related to optimization of elastic, thermal, and fluidic properties in periodic materials. Emerging areas related to topology optimization for manufacturability and manufacturing variations, nonlinear mechanics, and multiscale design are also discussed.
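
    For reference, the standard density-based (SIMP) compliance-minimization statement, a common instance of the formulation described above, can be written as:

```latex
% Standard density-based (SIMP) compliance-minimization formulation, shown here
% as a generic reference for the topology optimization problems discussed above.
\begin{aligned}
\min_{\boldsymbol{\rho}} \quad & c(\boldsymbol{\rho}) = \mathbf{f}^{\mathsf T}\mathbf{u}(\boldsymbol{\rho}) \\
\text{s.t.} \quad & \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f}, \\
& \sum_{e} \rho_e \, v_e \le V_{\max}, \qquad 0 < \rho_{\min} \le \rho_e \le 1, \\
\text{with} \quad & E_e(\rho_e) = E_{\min} + \rho_e^{\,p}\,(E_0 - E_{\min}), \quad p \approx 3 .
\end{aligned}
```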

  5. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    SciTech Connect

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S; Jakeman, John Davis; Swiler, Laura Painton; Stephens, John Adam; Vigil, Dena M.; Wildey, Timothy Michael; Bohnhoff, William J.; Eddy, John P.; Hu, Kenneth T.; Dalbey, Keith R.; Bauman, Lara E; Hough, Patricia Diane

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  6. An optimization framework of biological dynamical systems.

    PubMed

    Horie, Ryota

    2008-07-07

    Different biological dynamics are often described by different mathematical equations. On the other hand, some mathematical models describe many biological dynamics universally. Here, we focus on three biological dynamics: the Lotka-Volterra equation, the Hopfield neural networks, and the replicator equation. We describe these three dynamical models using a single optimization framework, which is constructed by employing Riemannian geometry. We then show that the optimization structures of these dynamics are identical, and that the differences among the three dynamics lie only in the constraints of the optimization. From this perspective, we discuss a unified view of biological dynamics. We also discuss plausible categorizations, the fundamental nature, and the efficient modeling of biological dynamics that arise from this optimization perspective on dynamical systems.
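
    For reference, the standard textbook forms of the three dynamics named above are:

```latex
% Standard forms of the generalized Lotka-Volterra equation, the continuous
% Hopfield network, and the replicator dynamics referenced in the abstract.
\begin{aligned}
\text{Lotka--Volterra:} \quad & \dot{x}_i = x_i \Big( r_i + \sum_j a_{ij} x_j \Big), \\
\text{Hopfield:} \quad & C_i \,\dot{u}_i = -\frac{u_i}{R_i} + \sum_j T_{ij}\, g(u_j) + I_i, \\
\text{Replicator:} \quad & \dot{x}_i = x_i \Big( (A\mathbf{x})_i - \mathbf{x}^{\mathsf T} A \mathbf{x} \Big).
\end{aligned}
```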

  7. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which design application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
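
    A highly simplified surrogate-assisted loop in this spirit is sketched below; it omits the search/poll structure that gives the actual framework its convergence guarantees, and the "expensive" function is a cheap stand-in.

```python
# Simplified surrogate-assisted optimization loop: fit an RBF surrogate to the
# points evaluated so far, minimize the surrogate, then evaluate the "expensive"
# function at the proposed point and refit.  This omits the poll/search steps
# that give the surrogate management framework its convergence guarantees.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive(x):
    """Stand-in for an expensive simulation (cheap analytic function here)."""
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x[0])

rng = np.random.default_rng(4)
X = rng.uniform(-1.0, 1.0, size=(8, 2))                # initial design sites
y = np.array([expensive(x) for x in X])

for _ in range(10):
    surrogate = RBFInterpolator(X, y)                  # interpolating surrogate
    res = minimize(lambda z: surrogate(z.reshape(1, -1))[0],
                   x0=X[y.argmin()], bounds=[(-1, 1), (-1, 1)])
    x_new = res.x
    if np.min(np.linalg.norm(X - x_new, axis=1)) < 1e-8:
        break     # surrogate minimizer already sampled; a real SMF would poll here
    X = np.vstack([X, x_new])
    y = np.append(y, expensive(x_new))                 # one true evaluation per iteration

best = y.argmin()
print("best point:", X[best], "value:", y[best])
```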

  8. OPTIMAL NETWORK TOPOLOGY DESIGN

    NASA Technical Reports Server (NTRS)

    Yuen, J. H.

    1994-01-01

    This program was developed as part of a research study on the topology design and performance analysis for the Space Station Information System (SSIS) network. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. It is intended that this new design technique consider all important performance measures explicitly and take into account the constraints due to various technical feasibilities. In the current program, technical constraints are taken care of by the user properly forming the starting set of candidate components (e.g. nonfeasible links are not included). As subsets are generated, they are tested to see if they form an acceptable network by checking that all requirements are satisfied. Thus the first acceptable subset encountered gives the cost-optimal topology satisfying all given constraints. The user must sort the set of "feasible" link elements in increasing order of their costs. The program prompts the user for the following information for each link: 1) cost, 2) connectivity (number of stations connected by the link), and 3) the stations connected by that link. Unless instructed to stop, the program generates all possible acceptable networks in increasing order of their total costs. The program is written only to generate topologies that are simply connected. Tests on reliability, delay, and other performance measures are discussed in the documentation, but have not been incorporated into the program. This program is written in PASCAL for interactive execution and has been implemented on an IBM PC series computer operating under PC DOS. The disk contains source code only. This program was developed in 1985.
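
    A simplified sketch of the core search follows: candidate link subsets are examined in increasing order of total cost and the first subset that connects all stations is returned. Here only simple connectivity is checked, and all subsets are sorted up front rather than generated incrementally as in the program.

```python
# Sketch of the core search: examine candidate link subsets in increasing order
# of total cost and return the first subset that connects all stations.  For
# brevity this sorts all subsets up front instead of generating them
# incrementally, and checks only simple connectivity (no delay/reliability).
from itertools import combinations

stations = ["A", "B", "C", "D"]
# (cost, endpoints) for each feasible link -- hypothetical example data.
links = [(1.0, ("A", "B")), (2.0, ("B", "C")), (2.5, ("A", "C")),
         (3.0, ("C", "D")), (4.0, ("B", "D"))]

def connected(subset):
    # Union-find over stations using the links in this subset.
    parent = {s: s for s in stations}
    def find(s):
        while parent[s] != s:
            parent[s] = parent[parent[s]]
            s = parent[s]
        return s
    for _, (u, v) in subset:
        parent[find(u)] = find(v)
    return len({find(s) for s in stations}) == 1

candidates = []
for r in range(1, len(links) + 1):
    candidates.extend(combinations(links, r))
candidates.sort(key=lambda sub: sum(c for c, _ in sub))   # increasing total cost

optimal = next(sub for sub in candidates if connected(sub))
print("cost-optimal topology:", [e for _, e in optimal],
      "total cost:", sum(c for c, _ in optimal))
```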

  9. Parameters optimization using experimental design for headspace solid phase micro-extraction analysis of short-chain chlorinated paraffins in waters under the European water framework directive.

    PubMed

    Gandolfi, F; Malleret, L; Sergent, M; Doumenq, P

    2015-08-07

    The water framework directives (WFD 2000/60/EC and 2013/39/EU) force European countries to monitor the quality of their aquatic environment. Among the priority hazardous substances targeted by the WFD, short chain chlorinated paraffins C10-C13 (SCCPs) still represent an analytical challenge, because few laboratories are nowadays able to analyze them. Moreover, an annual average quality standard as low as 0.4 μg L(-1) was set for SCCPs in surface water. Therefore, to test for compliance, the implementation of sensitive and reliable analysis methods for SCCPs in water is required. The aim of this work was to address this issue by evaluating automated solid phase micro-extraction (SPME) combined on-line with gas chromatography-electron capture negative ionization mass spectrometry (GC/ECNI-MS). Fiber polymer, extraction mode, ionic strength, extraction temperature and time were the most significant thermodynamic and kinetic parameters studied. To determine the suitable working ranges of the factors, the study of the extraction conditions was first carried out using a classical one-factor-at-a-time approach. Then a mixed level factorial 3×2(3) design was performed, in order to identify the most influential parameters and to estimate potential interaction effects between them. The most influential factors, i.e. extraction temperature and duration, were optimized using a second experimental design, in order to maximize the chromatographic response. At the close of the study, a method involving headspace SPME (HS-SPME) coupled to GC/ECNI-MS is proposed. The optimum extraction conditions were sample temperature 90 °C, extraction time 80 min, with the PDMS 100 μm fiber and desorption at 250 °C during 2 min. Linear response from 0.2 ng mL(-1) to 10 ng mL(-1) with r(2)=0.99 and limits of detection and quantification, respectively, of 4 pg mL(-1) and 120 pg mL(-1) in MilliQ water, were achieved. The method proved to be applicable in different types of waters and shows key advantages, such

  10. Optimal designs for comparing curves

    PubMed Central

    Dette, Holger; Schorning, Kirsten

    2016-01-01

    We consider the optimal design problem for a comparison of two regression curves, which is used to establish the similarity between the dose response relationships of two groups. An optimal pair of designs minimizes the width of the confidence band for the difference between the two regression functions. Optimal design theory (equivalence theorems, efficiency bounds) is developed for this non standard design problem and for some commonly used dose response models optimal designs are found explicitly. The results are illustrated in several examples modeling dose response relationships. It is demonstrated that the optimal pair of designs for the comparison of the regression curves is not the pair of the optimal designs for the individual models. In particular it is shown that the use of the optimal designs proposed in this paper instead of commonly used “non-optimal” designs yields a reduction of the width of the confidence band by more than 50%. PMID:27340305

  11. Ocean power technology design optimization

    DOE PAGES

    van Rij, Jennifer; Yu, Yi-Hsiang; Edwards, Kathleen; ...

    2017-07-18

    For this study, the National Renewable Energy Laboratory (NREL) and Ocean Power Technologies (OPT) conducted a collaborative code validation and design optimization study for OPT's PowerBuoy wave energy converter (WEC). NREL utilized WEC-Sim, an open-source WEC simulator, to compare four design variations of OPT's PowerBuoy. As an input to the WEC-Sim models, viscous drag coefficients for the PowerBuoy floats were first evaluated using computational fluid dynamics. The resulting WEC-Sim PowerBuoy models were then validated with experimental power output and fatigue load data provided by OPT. The validated WEC-Sim models were then used to simulate the power performance and loads for operational conditions, extreme conditions, and directional waves, for each of the four PowerBuoy design variations, assuming the wave environment of Humboldt Bay, California. Finally, ratios of power-to-weight, power-to-fatigue-load, power-to-maximum-extreme-load, power-to-water-plane-area, and power-to-wetted-surface-area were used to make a final comparison of the potential PowerBuoy WEC designs. The design comparison methodologies developed and presented in this study are applicable to other WEC devices and may be useful as a framework for future WEC design development projects.

  12. Design optimization for active twist rotor blades

    NASA Astrophysics Data System (ADS)

    Mok, Ji Won

    This dissertation introduces the process of optimizing active twist rotor blades in the presence of embedded anisotropic piezo-composite actuators. Optimum design of active twist blades is a complex task, since it involves a rich design space with tightly coupled design variables. The study presents the development of an optimization framework for active helicopter rotor blade cross-sectional design. This optimization framework allows for exploring a rich and highly nonlinear design space in order to optimize the active twist rotor blades. Different analytical components are combined in the framework: cross-sectional analysis (UM/VABS), an automated mesh generator, a beam solver (DYMORE), a three-dimensional local strain recovery module, and a gradient-based optimizer within MATLAB. Through the mathematical optimization problem, the static twist actuation performance of a blade is maximized while satisfying a series of blade constraints. These constraints are associated with locations of the center of gravity and elastic axis, blade mass per unit span, fundamental rotating blade frequencies, and the blade strength based on local three-dimensional strain fields under worst loading conditions. Through pre-processing, limitations of the proposed process have been studied. When limitations were detected, resolution strategies were proposed. These include mesh overlapping, element distortion, trailing edge tab modeling, electrode modeling and foam implementation of the mesh generator, and the initial-point sensitivity of the current optimization scheme. Examples demonstrate the effectiveness of this process. Optimization studies were performed on the NASA/Army/MIT ATR blade case. Even though that design was built and showed a significant impact on vibration reduction, the proposed optimization process showed that the design could be improved significantly. The second example, based on a model scale of the AH-64D Apache blade, emphasized the capability of this framework to

  13. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual.

    SciTech Connect

    Griffin, Joshua D. (Sandai National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandai National Labs, Livermore, CA); Hough, Patricia Diane (Sandai National Labs, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  14. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual.

    SciTech Connect

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  15. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    SciTech Connect

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  16. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual.

    SciTech Connect

    Griffin, Joshua D. (Sandia National lababoratory, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National lababoratory, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane (Sandia National lababoratory, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  17. Optimal optoacoustic detector design

    NASA Technical Reports Server (NTRS)

    Rosengren, L.-G.

    1975-01-01

    Optoacoustic detectors are used to measure pressure changes occurring in enclosed gases, liquids, or solids being excited by intensity or frequency modulated electromagnetic radiation. Radiation absorption spectra, collisional relaxation rates, substance compositions, and reactions can be determined from the time behavior of these pressure changes. Very successful measurements of gaseous air pollutants have, for instance, been performed by using detectors of this type together with different lasers. The measuring instrument consisting of radiation source, modulator, optoacoustic detector, etc. is often called spectrophone. In the present paper, a thorough optoacoustic detector optimization analysis based upon a review of its theory of operation is introduced. New quantitative rules and suggestions explaining how to design detectors with maximal pressure responsivity and over-all sensitivity and minimal background signal are presented.

  18. Structural Optimization in automotive design

    NASA Technical Reports Server (NTRS)

    Bennett, J. A.; Botkin, M. E.

    1984-01-01

    Although mathematical structural optimization has been an active research area for twenty years, there has been relatively little penetration into the design process. Experience indicates that often this is due to the traditional layout-analysis design process. In many cases, optimization efforts have been outgrowths of analysis groups which are themselves appendages to the traditional design process. As a result, optimization is often introduced into the design process too late to have a significant effect because many potential design variables have already been fixed. A series of examples are given to indicate how structural optimization has been effectively integrated into the design process.

  19. Optimal Flow Control Design

    NASA Technical Reports Server (NTRS)

    Allan, Brian; Owens, Lewis

    2010-01-01

    In support of the Blended-Wing-Body aircraft concept, a new flow control hybrid vane/jet design has been developed for use in a boundary-layer-ingesting (BLI) offset inlet in transonic flows. This inlet flow control is designed to minimize the engine fan-face distortion levels and the first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. This concept represents a potentially enabling technology for quieter and more environmentally friendly transport aircraft. An optimum vane design was found by minimizing the engine fan-face distortion, DC60, and the first five Fourier harmonic half amplitudes, while maximizing the total pressure recovery. The optimal vane design was then used in a BLI inlet wind tunnel experiment at NASA Langley's 0.3-meter transonic cryogenic tunnel. The experimental results demonstrated an 80-percent decrease in DPCPavg, the circumferential distortion level, at an inlet mass flow rate corresponding to the middle of the operational range at the cruise condition. Even though the vanes were designed at a single inlet mass flow rate, they performed very well over the entire inlet mass flow range tested in the wind tunnel experiment with the addition of a small amount of jet flow control. While the circumferential distortion was decreased, the radial distortion on the outer rings at the aerodynamic interface plane (AIP) increased. This was a result of the large boundary layer being redistributed from the bottom of the AIP in the baseline case to the outer edges of the AIP when using the vortex generator (VG) vane flow control. Experimental results, as already mentioned, showed an 80-percent reduction of DPCPavg, the circumferential distortion level at the engine fan-face. The hybrid approach leverages strengths of vane and jet flow control devices, increasing inlet performance over a broader operational range with significant reduction in mass flow requirements. Minimal distortion level requirements

  20. Design of Optimally Robust Control Systems.

    DTIC Science & Technology

    1980-01-01

    approach is that the optimization framework is an artificial device. While some design constraints can easily be incorporated into a single cost function...indicating that that point was indeed the solution. Also, an intelligent initial guess for k was important in order to avoid being hung up at the double

  1. Systematic description of direct push sensor systems: A conceptual framework for system decomposition as a basis for the optimal sensor system design

    NASA Astrophysics Data System (ADS)

    Bumberger, Jan; Paasche, Hendrik; Dietrich, Peter

    2015-11-01

    Systematic decomposition and evaluation of existing sensor systems, as well as the optimal design of future generations of direct push probes, are of high importance for optimized geophysical experiments, since the employed equipment is a constraint on the data space. Direct push technologies have become established methods in the field of geophysical, geotechnical, hydrogeological, and environmental sciences for the investigation of the near subsurface. By using direct push sensor systems it is possible to measure in-situ parameters with high vertical resolution. Such information is frequently used for quantitative geophysical model calibration or interpretation of geotechnical and hydrological subsurface conditions. Most of the available direct push sensor systems are largely based on empirical testing and consecutively evaluated under field conditions. Approaches suitable to identify specific characteristics and problems of direct push sensor systems have not yet been established. We develop a general systematic approach for the classification, analysis, and optimization of direct push sensor systems. First, a classification is presented for different existing sensor systems. The following systematic description, which is based on the conceptual decomposition of an existing sensor system into subsystems, is a suitable way to analyze and explore the transfer behavior of the system components and therefore of the complete system. Also, this approach may serve as a guideline for the synthesis and design of new and optimized direct push sensor systems.

  2. Integrated controls design optimization

    DOEpatents

    Lou, Xinsheng; Neuschaefer, Carl H.

    2015-09-01

    A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225), and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant; others are related to the cost of the plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.

  3. Optimized design for PIGMI

    SciTech Connect

    Hansborough, L.; Hamm, R.; Stovall, J.; Swenson, D.

    1980-01-01

    PIGMI (Pion Generator for Medical Irradiations) is a compact linear proton accelerator design, optimized for pion production and cancer treatment use in a hospital environment. Technology developed during a four-year PIGMI Prototype experimental program allows the design of smaller, less expensive, and more reliable proton linacs. A new type of low-energy accelerating structure, the radio-frequency quadrupole (RFQ), has been tested; it produces an exceptionally good-quality beam and allows the use of a simple 30-kV injector. Average axial electric-field gradients of over 9 MV/m have been demonstrated in a drift-tube linac (DTL) structure. Experimental work is underway to test the disk-and-washer (DAW) structure, another new type of accelerating structure for use in the high-energy coupled-cavity linac (CCL). Sufficient experimental and developmental progress has been made to closely define an actual PIGMI. It will consist of a 30-kV injector, an RFQ linac to a proton energy of 2.5 MeV, a DTL linac to 125 MeV, and a CCL linac to the final energy of 650 MeV. The total length of the accelerator is 133 meters. The RFQ and DTL will be driven by a single 440-MHz klystron; the CCL will be driven by six 1320-MHz klystrons. The peak beam current is 28 mA. The beam pulse length is 60 μs at a 60-Hz repetition rate, resulting in a 100-μA average beam current. The total cost of the accelerator is estimated to be approximately $10 million.

  4. Design Optimization Toolkit: Users' Manual

    SciTech Connect

    Aguilo Valentin, Miguel Alejandro

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient- and nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  5. ELPSA as a Lesson Design Framework

    ERIC Educational Resources Information Center

    Lowrie, Tom; Patahuddin, Sitti Maesuri

    2015-01-01

    This paper offers a framework for a mathematics lesson design that is consistent with the way we learn about, and discover, most things in life. In addition, the framework provides a structure for identifying how mathematical concepts and understanding are acquired and developed. This framework is called ELPSA and represents five learning…

  6. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    NASA Technical Reports Server (NTRS)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes through automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high-fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high-performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  7. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
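
    The sketch below illustrates the FIM-based machinery the abstract refers to: for a candidate set of sampling times for the Verhulst-Pearl logistic model, it builds a finite-difference sensitivity matrix, forms the Fisher Information Matrix, and reports D-, E-, and an SE-type score (sum of squared normalized standard errors). This is only the general idea under assumed parameter values; the paper's rigorous Prohorov-metric formulation is not reproduced.

```python
# Sketch only: FIM-based scoring of candidate sampling times for the logistic model.
import numpy as np

def logistic(t, theta):
    K, r, x0 = theta
    return K * x0 * np.exp(r * t) / (K + x0 * (np.exp(r * t) - 1.0))

def fim(times, theta, sigma=1.0, h=1e-6):
    # sensitivity matrix by central finite differences (assumed adequate here)
    S = np.zeros((len(times), len(theta)))
    for j in range(len(theta)):
        tp, tm = np.array(theta, float), np.array(theta, float)
        tp[j] += h
        tm[j] -= h
        S[:, j] = (logistic(times, tp) - logistic(times, tm)) / (2.0 * h)
    return S.T @ S / sigma**2

theta = np.array([17.5, 0.7, 0.1])              # illustrative K, r, x0
times = np.linspace(0.0, 25.0, 10)              # candidate sampling times
F = fim(times, theta)
cov = np.linalg.inv(F)                          # asymptotic covariance estimate
print("D-criterion (det F):", np.linalg.det(F))
print("E-criterion (min eig F):", np.linalg.eigvalsh(F).min())
print("SE-type score:", np.sum(np.diag(cov) / theta**2))
```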

  8. Comparison of Optimal Design Methods in Inverse Problems.

    PubMed

    Banks, H T; Holm, Kathleen; Kappel, Franz

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29].

  9. A framework for parallelized efficient global optimization with application to vehicle crashworthiness optimization

    NASA Astrophysics Data System (ADS)

    Hamza, Karim; Shalaby, Mohamed

    2014-09-01

    This article presents a framework for simulation-based design optimization of computationally expensive problems, where economizing the generation of sample designs is highly desirable. One popular approach for such problems is efficient global optimization (EGO), where an initial set of design samples is used to construct a kriging model, which is then used to generate new 'infill' sample designs at regions of the search space with high expectancy of improvement. This article addresses one of the limitations of EGO, namely that generation of infill samples can become a difficult optimization problem in its own right, and also allows the generation of multiple samples at a time in order to take advantage of parallel computing in the evaluation of the new samples. The proposed approach is tested on analytical functions and then applied to the vehicle crashworthiness design of a full Geo Metro model under frontal crash conditions.
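
    A minimal sketch of the expected-improvement infill step that EGO builds on, using scikit-learn's Gaussian-process (kriging) regressor on a cheap one-dimensional stand-in for the crash simulation. The test function, kernel choice, and sample counts are assumptions; the article's parallel multi-infill strategy is not reproduced.

```python
# Sketch only: one expected-improvement (EI) infill step of a basic EGO loop.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                                # stand-in for the expensive simulation
    return np.sin(3.0 * x) + 0.5 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(6, 1))          # initial design samples
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

Xc = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)  # candidate infill locations
mu, sd = gp.predict(Xc, return_std=True)
improvement = y.min() - mu
z = improvement / np.maximum(sd, 1e-12)
ei = improvement * norm.cdf(z) + sd * norm.pdf(z)
print("next infill sample at x =", Xc[np.argmax(ei), 0])
```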

  10. Talking about Multimedia: A Layered Design Framework.

    ERIC Educational Resources Information Center

    Taylor, Josie; Sumner, Tamara; Law, Andrew

    1997-01-01

    Describes a layered analytical framework for discussing design and educational issues that can be shared by the many different stakeholders involved in educational multimedia design and deployment. Illustrates the framework using a detailed analysis of the Galapagos Pilot project of the Open University science faculty which examines the processes…

  11. Multidisciplinary Design Optimization on Conceptual Design of Aero-engine

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-bo; Wang, Zhan-xue; Zhou, Li; Liu, Zeng-wen

    2016-06-01

    In order to obtain better integrated performance of an aero-engine during the conceptual design stage, multiple disciplines such as aerodynamics, structure, weight, and aircraft mission must be considered. Unfortunately, the couplings between these disciplines make the problem difficult to model or solve by conventional methods. MDO (Multidisciplinary Design Optimization) methodology, which handles such disciplinary couplings well, is adopted to solve this coupled problem. Approximation methods, optimization methods, coordination methods, and modeling methods for the MDO framework are analyzed in depth. To obtain a more efficient MDO framework, an improved CSSO (Concurrent Subspace Optimization) strategy based on DOE (Design Of Experiments) and RSM (Response Surface Model) methods is proposed in this paper, and an improved DE (Differential Evolution) algorithm is recommended to solve the system-level and discipline-level optimization problems in the MDO framework. The improved CSSO strategy and DE algorithm are evaluated on a numerical test problem. The results show that the methods proposed in this paper are significantly more efficient. The coupled problem of VCE (Variable Cycle Engine) conceptual design is solved using the improved CSSO strategy, and the design parameters it produces are better than the original ones. The integrated performance of the VCE is significantly improved.
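
    As a baseline illustration of the system-level search mentioned above, the sketch below runs SciPy's stock differential evolution on an assumed quadratic response surface standing in for an engine performance metric. It is not the improved CSSO/DE strategy of the paper.

```python
# Sketch only: baseline differential evolution on an assumed response surface.
import numpy as np
from scipy.optimize import differential_evolution

def surrogate_sfc(x):
    # assumed quadratic response surface for a fuel-consumption-like metric
    bypass_ratio, pressure_ratio = x
    return (bypass_ratio - 0.6)**2 + 0.5 * (pressure_ratio - 3.0)**2 \
        + 0.2 * bypass_ratio * pressure_ratio

bounds = [(0.0, 1.5), (1.0, 5.0)]               # illustrative design bounds
result = differential_evolution(surrogate_sfc, bounds, seed=1, tol=1e-8)
print("system-level optimum:", result.x, "objective:", result.fun)
```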

  12. Framework for Optimal Global Vaccine Stockpile Design for Vaccine-Preventable Diseases: Application to Measles and Cholera Vaccines as Contrasting Examples.

    PubMed

    Thompson, Kimberly M; Duintjer Tebbens, Radboud J

    2016-07-01

    Managing the dynamics of vaccine supply and demand represents a significant challenge with very high stakes. Insufficient vaccine supplies can necessitate rationing, lead to preventable adverse health outcomes, delay the achievements of elimination or eradication goals, and/or pose reputation risks for public health authorities and/or manufacturers. This article explores the dynamics of global vaccine supply and demand to consider the opportunities to develop and maintain optimal global vaccine stockpiles for universal vaccines, characterized by large global demand (for which we use measles vaccines as an example), and nonuniversal (including new and niche) vaccines (for which we use oral cholera vaccine as an example). We contrast our approach with other vaccine stockpile optimization frameworks previously developed for the United States pediatric vaccine stockpile to address disruptions in supply and global emergency response vaccine stockpiles to provide on-demand vaccines for use in outbreaks. For measles vaccine, we explore the complexity that arises due to different formulations and presentations of vaccines, consideration of rubella, and the context of regional elimination goals. We conclude that global health policy leaders and stakeholders should procure and maintain appropriate global vaccine rotating stocks for measles and rubella vaccine now to support current regional elimination goals, and should probably also do so for other vaccines to help prevent and control endemic or epidemic diseases. This work suggests the need to better model global vaccine supplies to improve efficiency in the vaccine supply chain, ensure adequate supplies to support elimination and eradication initiatives, and support progress toward the goals of the Global Vaccine Action Plan.

  13. Optimized solar module design

    NASA Technical Reports Server (NTRS)

    Santala, T.; Sabol, R.; Carbajal, B. G.

    1978-01-01

    The minimum cost per unit of power output from flat plate solar modules can most likely be achieved through efficient packaging of higher efficiency solar cells. This paper outlines a module optimization method which is broadly applicable, and illustrates the potential results achievable from a specific high efficiency tandem junction (TJ) cell. A mathematical model is used to assess the impact of various factors influencing the encapsulated cell and packing efficiency. The optimization of the packing efficiency is demonstrated. The effect of encapsulated cell and packing efficiency on the module add-on cost is shown in nomograph form.

  14. Optimizing exchanger design early

    SciTech Connect

    Lacunza, M.; Vaschetti, G.; Campana, H.

    1987-08-01

    It is not practical for process engineers and designers to make a rigorous economic evaluation of each component of a process, due to the time and money involved. However, it is very helpful to have a method for a quick design evaluation of heat exchangers, considering their important contribution to the total fixed investment in a process plant. This article is devoted to this subject, and the authors present a method that has been proven in several design cases. Linking rigorous design procedures with a quick cost-estimation method provides a good technique for obtaining the right heat exchanger. The cost will be appropriate, sometimes not the lowest because of design restrictions, but a good approach to the optimum at an earlier process design stage. The authors intend to show the influence of the design variables of a shell-and-tube heat exchanger on capital investment, taking into account the general limiting factors of the process such as thermodynamics, operability, and corrosion, and/or the constraints arising from the mechanical design of the calculated unit. The last is a special consideration for countries with no access to industrial technology or with difficulties in obtaining certain construction materials or equipment.

  15. Coupling between a multi-physics workflow engine and an optimization framework

    NASA Astrophysics Data System (ADS)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content of each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among the algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to their robustness, GAs can handle non-valid data which may appear during the optimization. Consequently, GAs are applicable in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and on the evaluation of large samples. A test has shown good scaling efficiency of this parallelized framework. This coupling method has been applied to the case of SYCOMORE (SYstem COde for MOdeling tokamak REactor), which is a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization along various figures of merit and constraints.
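
    A minimal sketch of the socket-based coupling idea: a stand-in "workflow engine" listens on a TCP socket and evaluates design points, while the "optimizer" side sends parameter vectors as newline-delimited JSON and reads back figures of merit. The endpoint, one-line protocol, and dummy workflow are assumptions for illustration; the SYCOMORE/URANIE coupling itself is not reproduced.

```python
# Sketch only: exchanging design vectors and objectives over a local TCP socket.
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 50555                   # assumed local endpoint

def dummy_workflow(params):
    # stand-in for the multi-physics workflow evaluation
    return {"cost": sum(v ** 2 for v in params.values())}

server_sock = socket.create_server((HOST, PORT))  # listen before the optimizer connects

def serve_one_client(srv):
    conn, _ = srv.accept()
    with conn:
        reader, writer = conn.makefile("rb"), conn.makefile("wb")
        for line in reader:                       # one JSON request per line
            reply = dummy_workflow(json.loads(line)["params"])
            writer.write((json.dumps(reply) + "\n").encode())
            writer.flush()

threading.Thread(target=serve_one_client, args=(server_sock,), daemon=True).start()

# Optimizer side: send candidate designs and read back their figures of merit
with socket.create_connection((HOST, PORT)) as sock:
    reader, writer = sock.makefile("rb"), sock.makefile("wb")
    for candidate in ({"a": 1.0, "b": 2.0}, {"a": 0.5, "b": 0.1}):
        writer.write((json.dumps({"params": candidate}) + "\n").encode())
        writer.flush()
        print(candidate, "->", json.loads(reader.readline()))
```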

  16. Adaptive optimization as a design and management methodology for coal-mining enterprise in uncertain and volatile market environment - the conceptual framework

    NASA Astrophysics Data System (ADS)

    Mikhalchenko, V. V.; Rubanik, Yu T.

    2016-10-01

    The work is devoted to the problem of cost-effective adaptation of coal mines to volatile and uncertain market conditions. Conceptually, this can be achieved through alignment of the dynamic characteristics of the coal mining system with the power spectrum of market demand for the coal product. In practical terms, this ensures the viability and competitiveness of coal mines. The transformation of dynamic characteristics is to be accomplished by changing the structure of the production system as well as corporate, logistics, and management processes. The proposed methods and algorithms of control are aimed at developing the theoretical foundations of adaptive optimization as a basic methodology for coal mining enterprise management under conditions of high variability and uncertainty in the economic and natural environment. Implementation of the proposed methodology requires a revision of the basic principles of the design of open coal mining enterprises.

  17. Design optimization of transonic airfoils

    NASA Technical Reports Server (NTRS)

    Joh, C.-Y.; Grossman, B.; Haftka, R. T.

    1991-01-01

    Numerical optimization procedures were considered for the design of airfoils in transonic flow based on the transonic small disturbance (TSD) and Euler equations. A sequential approximation optimization technique was implemented with an accurate approximation of the wave drag based on Nixon's coordinate straining approach. A modification of the Euler surface boundary conditions was implemented in order to compute design sensitivities efficiently without remeshing the grid. Two effective design procedures producing converged designs in approximately 10 global iterations were developed: interchanging the roles of the objective function and constraint, and direct lift maximization with move limits specified as fixed absolute values of the design variables.

  18. Multifidelity Analysis and Optimization for Supersonic Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory

    2010-01-01

    Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including aerodynamic tools for supersonic aircraft configurations, a systematic way to manage model uncertainty, and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework and include four analysis routines to estimate the lift and drag of a supersonic airfoil, as well as a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.

  19. Dynamic optimization and adaptive controller design

    NASA Astrophysics Data System (ADS)

    Inamdar, S. R.

    2010-10-01

    In this work, a new type of adaptive tracking controller is presented, which employs dynamic optimization to compute the current value of the controller action for the temperature control of a nonisothermal continuously stirred tank reactor (CSTR). We begin with a two-state model of the nonisothermal CSTR, consisting of the mass and heat balance equations, and then add cooling-system dynamics to eliminate input multiplicity. The initial design value is obtained using local stability of the steady states, where the approach temperature for cooling action is specified as a steady state and a design specification. A correction is then made in the dynamics, in which the material balance is manipulated to use the feed concentration as a system parameter, as an adaptive control measure to avoid actuator saturation in the main control loop. The analysis leading to the design of the dynamic-optimization-based parameter-adaptive controller is presented. An important component of this mathematical framework is the generation of a reference trajectory that forms the adaptive control measure.
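
    For context, the sketch below integrates the textbook two-state nonisothermal CSTR mass and energy balances that such a tracking controller acts on, with cooling represented by a fixed coolant temperature. Parameter values are standard illustrative ones and are not taken from the paper.

```python
# Sketch only: open-loop two-state nonisothermal CSTR model (assumed parameters).
import numpy as np
from scipy.integrate import solve_ivp

q, V = 100.0, 100.0            # feed flow [L/min], reactor volume [L]
CAf, Tf = 1.0, 350.0           # feed concentration [mol/L], feed temperature [K]
k0, E_R = 7.2e10, 8750.0       # rate constant [1/min], activation energy / R [K]
dH, rho_cp = -5.0e4, 239.0     # heat of reaction [J/mol], rho*cp [J/(L K)]
UA, Tc = 5.0e4, 300.0          # heat transfer coefficient [J/(min K)], coolant T [K]

def cstr(t, x):
    CA, T = x
    rate = k0 * np.exp(-E_R / T) * CA
    dCA = q / V * (CAf - CA) - rate                                   # mass balance
    dT = q / V * (Tf - T) - dH / rho_cp * rate + UA / (V * rho_cp) * (Tc - T)  # energy balance
    return [dCA, dT]

sol = solve_ivp(cstr, (0.0, 10.0), [0.5, 350.0], max_step=0.01)
print("concentration and temperature after 10 min:", sol.y[:, -1])
```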

  20. Multidisciplinary Optimization Methods for Preliminary Design

    NASA Technical Reports Server (NTRS)

    Korte, J. J.; Weston, R. P.; Zang, T. A.

    1997-01-01

    An overview of multidisciplinary optimization (MDO) methodology and two applications of this methodology to the preliminary design phase are presented. These applications are being undertaken to improve, develop, validate and demonstrate MDO methods. Each is presented to illustrate different aspects of this methodology. The first application is an MDO preliminary design problem for defining the geometry and structure of an aerospike nozzle of a linear aerospike rocket engine. The second application demonstrates the use of the Framework for Interdisciplinary Design Optimization (FIDO), which is a computational environment system, by solving a preliminary design problem for a High-Speed Civil Transport (HSCT). The two sample problems illustrate the advantages of performing preliminary design with an MDO process.

  1. Habitat Design Optimization and Analysis

    NASA Technical Reports Server (NTRS)

    SanSoucie, Michael P.; Hull, Patrick V.; Tinker, Michael L.

    2006-01-01

    Long-duration surface missions to the Moon and Mars will require habitats for the astronauts. The materials chosen for the habitat walls play a direct role in the protection against the harsh environments found on the surface. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Advanced optimization techniques are necessary for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat design optimization tool utilizing genetic algorithms has been developed. Genetic algorithms use a "survival of the fittest" philosophy, where the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multi-objective formulation of structural analysis, heat loss, radiation protection, and meteoroid protection. This paper presents the research and development of this tool.
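
    A minimal sketch of the "survival of the fittest" loop behind such a tool: a tiny genetic algorithm minimizing a weighted sum of two competing wall objectives (a mass proxy versus an assumed shielding penalty). The models, weights, and GA settings are invented for illustration and do not represent the habitat tool's actual multi-objective formulation.

```python
# Sketch only: tiny genetic algorithm on an assumed habitat-wall trade-off.
import numpy as np

rng = np.random.default_rng(0)

def fitness(thickness):
    mass = thickness.sum(axis=1)                             # proxy for launch mass
    shielding_penalty = 1.0 / (1.0 + thickness.sum(axis=1))  # thicker walls shield better
    return 0.5 * mass + 0.5 * shielding_penalty              # weighted objectives (minimize)

pop = rng.uniform(0.0, 5.0, size=(40, 3))                    # 3 wall layers [cm]
for generation in range(100):
    f = fitness(pop)
    # tournament selection: the fitter of two random individuals survives
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((f[i] < f[j])[:, None], pop[i], pop[j])
    # uniform crossover and Gaussian mutation
    mates = parents[rng.permutation(len(parents))]
    mask = rng.random(parents.shape) < 0.5
    children = np.where(mask, parents, mates) + rng.normal(0.0, 0.1, parents.shape)
    pop = np.clip(children, 0.0, 5.0)

best = pop[np.argmin(fitness(pop))]
print("best layer thicknesses [cm]:", best, "objective:", fitness(best[None, :])[0])
```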

  2. Design optimization of rod shaped IPMC actuator

    NASA Astrophysics Data System (ADS)

    Ruiz, S. A.; Mead, B.; Yun, H.; Yim, W.; Kim, K. J.

    2013-04-01

    Ionic polymer-metal composites (IPMCs) are some of the most well-known electro-active polymers, owing to the large deformation they provide under a relatively low applied voltage. IPMCs have been acknowledged as potential candidates for biomedical applications such as cardiac catheters and surgical probes; however, there is still no mass manufacturing of IPMCs. This study intends to provide a theoretical framework that can be used to design practical-purpose IPMCs depending on the end user's interest. By explicitly coupling electrostatics, transport phenomena, and solid mechanics, design optimization is conducted in simulation in order to provide conceptual motivation for future designs. A multi-physics analysis of three-dimensional cylinder- and tube-type IPMCs provides physically accurate, time-dependent end-effector displacements for a given voltage source. Simulations are conducted with the finite element method and are validated with empirical evidence. An in-depth understanding of the physical coupling provides optimal design parameters that cannot be obtained from a standard electro-mechanical coupling alone. These parameters are altered in order to determine optimal designs for end-effector displacement, maximum force, and improved mobility with a limited voltage magnitude. Design alterations are made to the electrode patterns to provide greater mobility, to the electrode size for efficient bending, and to the Nafion diameter for improved force. The results of this study will provide optimal design parameters of IPMCs for different applications.

  3. A visualization framework for design and evaluation

    NASA Astrophysics Data System (ADS)

    Blundell, Benjamin J.; Ng, Gary; Pettifer, Steve

    2006-01-01

    The creation of compelling visualisation paradigms is a craft often dominated by intuition and issues of aesthetics, with relatively few models to support good design. The majority of problem cases are approached by simply applying a previously evaluated visualisation technique. A large body of work exists covering individual aspects of visualisation design, such as human cognition aspects, visualisation methods for specific problem areas, psychology studies, and so forth, yet most frameworks regarding visualisation are applied after the fact as an evaluation measure. We present an extensible framework for visualisation aimed at structuring the design process, increasing decision traceability, and delineating the notions of function, aesthetics, and usability. The framework can be used to derive a set of requirements for good visualisation design and to evaluate existing visualisations, suggesting possible improvements. Our framework achieves this by being both broad and general, built on top of existing works, with hooks for extensions and customisations. This paper shows how existing theories of information visualisation fit into the scheme, presents our experience in applying this framework to several designs, and offers our evaluation of the framework and the designs studied.

  4. Optimally designing games for behavioural research.

    PubMed

    Rafferty, Anna N; Zaharia, Matei; Griffiths, Thomas L

    2014-07-08

    Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision.
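
    The sketch below illustrates the modeling idea in miniature: value-iterate a small random MDP standing in for a game, then score how likely observed player actions are under a softmax policy for candidate values of an assumed cognitive parameter (a reward weight). The game, rewards, and data are invented for illustration.

```python
# Sketch only: MDP value iteration plus softmax action likelihood for a toy parameter.
import numpy as np

n_states, n_actions, gamma = 5, 2, 0.9
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # transition probabilities

def q_values(reward_weight):
    R = reward_weight * np.linspace(0.0, 1.0, n_states)   # assumed state rewards
    V = np.zeros(n_states)
    for _ in range(200):                                   # value iteration
        Q = R[:, None] + gamma * P @ V
        V = Q.max(axis=1)
    return Q

def log_likelihood(reward_weight, observed, beta=3.0):
    Q = q_values(reward_weight)
    logits = beta * Q                                      # softmax (noisy-rational) policy
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return sum(logp[s, a] for s, a in observed)

observed = [(0, 1), (2, 0), (4, 1)]                        # invented (state, action) data
for w in (0.5, 1.0, 2.0):
    print("reward weight", w, "log-likelihood", log_likelihood(w, observed))
```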

  5. Optimally designing games for behavioural research

    PubMed Central

    Rafferty, Anna N.; Zaharia, Matei; Griffiths, Thomas L.

    2014-01-01

    Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision. PMID:25002821

  6. Intelligent Frameworks for Instructional Design.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    Many researchers are attempting to develop automated instructional development systems to guide subject matter experts through the lengthy and difficult process of courseware development. Because the targeted users often lack instructional design expertise, a great deal of emphasis has been placed on the use of artificial intelligence (AI) to…

  7. Intelligent Frameworks for Instructional Design.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    1992-01-01

    Presents a taxonomy describing various uses of artificial intelligence techniques in automated instructional development systems. Instructional systems development is discussed in relation to the design of computer-based instructional courseware; two systems being developed at the Air Force Armstrong Laboratory are reviewed; and further research…

  8. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
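
    The sketch below illustrates the response-surface step described above: fit a quadratic surface to a handful of samples from a stand-in "FEA" stress model, then evaluate the surrogate cheaply for many random inputs to estimate response statistics. The stress model and input distributions are assumptions, not PRODAF's.

```python
# Sketch only: quadratic response surface fitted to stand-in FEA samples.
import numpy as np

rng = np.random.default_rng(1)

def fea_stress(chord, height):                    # stand-in for the expensive FEA run
    return 200.0 + 35.0 * chord - 12.0 * height + 4.0 * chord * height

# design-of-experiments samples over the geometric variations
chord = rng.uniform(0.8, 1.2, 15)
height = rng.uniform(1.5, 2.5, 15)
y = fea_stress(chord, height)

# quadratic response surface: least-squares fit of a polynomial basis
A = np.column_stack([np.ones_like(chord), chord, height,
                     chord**2, height**2, chord * height])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# cheap probabilistic evaluation of the surrogate
c = rng.normal(1.0, 0.05, 100000)
h = rng.normal(2.0, 0.10, 100000)
B = np.column_stack([np.ones_like(c), c, h, c**2, h**2, c * h])
stress = B @ coeffs
print("mean and 99th-percentile stress:", stress.mean(), np.percentile(stress, 99))
```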

  9. Advanced transport design using multidisciplinary design optimization

    NASA Technical Reports Server (NTRS)

    Barnum, Jennifer; Bathras, Curt; Beene, Kirk; Bush, Michael; Kaupin, Glenn; Lowe, Steve; Sobieski, Ian; Tingen, Kelly; Wells, Douglas

    1991-01-01

    This paper describes the results of the first implementation of multidisciplinary design optimization (MDO) techniques by undergraduates in a design course. The objective of the work was to design a civilian transport aircraft of the Boeing 777 class. The first half of the two-semester design course consisted of applying traditional sizing methods and techniques to form a baseline aircraft. MDO techniques were then applied to this baseline design. This paper describes the evolution of the design, with special emphasis on the application of MDO techniques, and presents the results of four iterations through the design space. Minimization of take-off gross weight was the goal of the optimization process. The resultant aircraft derived from the MDO procedure weighed approximately 13,382 lbs (2.57 percent) less than the baseline aircraft.

  10. Initial Multidisciplinary Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; Naiman, C. G.; Seidel, J. A.; Moore, K. T.; Naylor, B. A.; Townsend, S.

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  11. Design optimization of space structures

    NASA Astrophysics Data System (ADS)

    Felippa, Carlos

    1991-11-01

    The topology-shape-size optimization of space structures is investigated through Kikuchi's homogenization method. The method starts from a 'design domain block,' which is a region of space into which the structure is to materialize. This domain is initially filled with a finite element mesh, typically regular. Force and displacement boundary conditions corresponding to applied loads and supports are applied at specific points in the domain. An optimal structure is to be 'carved out' of the design domain under two conditions: (1) a cost function is to be minimized, and (2) equality or inequality constraints are to be satisfied. The 'carving' process is accomplished by letting microstructure holes develop and grow in elements during the optimization process. These holes have a rectangular shape in two dimensions and a cubical shape in three dimensions, and may also rotate with respect to the reference axes. The properties of the perforated element are obtained through a homogenization procedure. Once a hole reaches the volume of the element, that element effectively disappears. The project has two phases. In the first phase, the method was implemented as the combination of two computer programs: a finite element module and an optimization driver. In the second phase, the focus is on the application of this technique to planetary structures. The finite element part of the method was programmed for the two-dimensional case using four-node quadrilateral elements to cover the design domain. An element homogenization technique different from that of Kikuchi and coworkers was implemented. The optimization driver is based on an augmented Lagrangian optimizer, with the volume constraint treated as a Courant penalty function. The optimizer has to be specially tuned to this type of optimization because the number of design variables can reach into the thousands. The driver is presently under development.

  12. Engineering design optimization using services and workflows.

    PubMed

    Crick, Tom; Dunning, Peter; Kim, Hyunsun; Padget, Julian

    2009-07-13

    Multi-disciplinary design optimization (MDO) is the process whereby the often conflicting requirements that the different disciplines bring to the engineering design process converge upon a description that represents an acceptable compromise in the design space. We present a simple demonstrator of a flexible workflow framework for engineering design optimization using an e-Science tool. This paper provides a concise introduction to MDO, complemented by a summary of the related tools and techniques developed under the umbrella of the UK e-Science programme that we have explored in support of the engineering process. The main contributions of this paper are: (i) a description of the optimization workflow that has been developed in the Taverna workbench, (ii) a demonstrator of a structural optimization process with a range of tool options using common benchmark problems, (iii) some reflections on the experience of software engineering meeting mechanical engineering, and (iv) an indicative discussion on the feasibility of a 'plug-and-play' engineering environment for analysis and design.

  13. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments, particularly the issue of whether the estimation of copula parameters can be enhanced by optimizing experimental conditions, and how robust the parameter estimates for the model are with respect to the type of copula employed. In this paper, an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616

  14. Parallel Object-Oriented Framework Optimization

    SciTech Connect

    Quinlan, D

    2001-05-01

    Object-oriented libraries arise naturally from the increasing complexity of developing related scientific applications. The optimization of the use of libraries within scientific applications is one of many high-performance optimizations and is the subject of this paper. This type of optimization has significant potential because it can reduce the overhead of calls to a library, specialize the library calls given the context of their use within the application, or use the semantics of the library calls to locally rewrite sections of the application. This type of optimization is only now becoming an active area of research. The optimization of the use of libraries within scientific applications is particularly attractive because it maps to the extensive use of libraries within numerous large existing scientific applications sharing common problem domains. This paper presents an approach toward the optimization of parallel object-oriented libraries. ROSE [1] is a tool for building source-to-source preprocessors; ROSETTA is a tool for defining the grammars used within ROSE. The definition of the grammars directly determines what can be recognized at compile time. ROSETTA permits grammars to be automatically generated which are specific to the identification of abstractions introduced within object-oriented libraries. Thus the semantics of complex abstractions defined outside of the C++ language can be leveraged at compile time to introduce library-specific optimizations. The details of the optimizations performed are not a part of this paper and are up to the library developer to define using ROSETTA and ROSE to build such an optimizing preprocessor. For performance optimizations to be automated, the problems of automatically locating where such optimizations can be applied are significant and most often overlooked. Note that a novel part of this work is the degree of automation. Thus library developers can be expected to be able to build their

  15. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77 De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68 Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).

  16. Telemanipulator design and optimization software

    NASA Astrophysics Data System (ADS)

    Cote, Jean; Pelletier, Michel

    1995-12-01

    For many years, industrial robots have been used to execute specific repetitive tasks. In those cases, the optimal configuration and location of the manipulator only has to be found once. The optimal configuration or position was often found empirically according to the tasks to be performed. In telemanipulation, the nature of the tasks to be executed is much wider and can be very demanding in terms of dexterity and workspace. The position/orientation of the robot's base could be required to move during the execution of a task. At present, the initial position of the teleoperator is usually chosen empirically, which can be sufficient in the case of an easy or repetitive task. Otherwise, the time spent moving the teleoperator support platform has to be taken into account during the execution of the task. Automatic optimization of the position/orientation of the platform or a better-designed robot configuration could minimize these movements and save time. This paper presents two algorithms. The first algorithm is used to optimize the position and orientation of a given manipulator (or manipulators) with respect to the environment in which a task has to be executed. The second algorithm is used to optimize the position or the kinematic configuration of a robot. For this purpose, the tasks to be executed are digitized using a position/orientation measurement system and a compact representation based on special octrees. Given a digitized task, the optimal position or Denavit-Hartenberg configuration of the manipulator can be obtained numerically. Constraints on the robot design can also be taken into account. A graphical interface has been designed to facilitate the use of the two optimization algorithms.

  17. Neural Meta-Memes Framework for Combinatorial Optimization

    NASA Astrophysics Data System (ADS)

    Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon

    In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through an empirical study on a class of combinatorial problems, the quadratic assignment problem (QAP).
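
    The sketch below shows the problem class NMMF targets: evaluating a quadratic assignment problem (QAP) permutation cost and applying a simple pairwise-swap local search, i.e. the kind of basic meme such a framework would coordinate. The flow and distance matrices are random stand-ins; the neural coordination layer is not reproduced.

```python
# Sketch only: QAP cost evaluation and a pairwise-swap local-search "meme".
import numpy as np

rng = np.random.default_rng(0)
n = 8
flow = rng.integers(1, 10, (n, n))
dist = rng.integers(1, 10, (n, n))

def qap_cost(perm):
    # sum over i, j of flow[i, j] * dist[perm[i], perm[j]]
    return int((flow * dist[np.ix_(perm, perm)]).sum())

perm = rng.permutation(n)
best = qap_cost(perm)
for _ in range(2000):
    i, j = rng.integers(0, n, 2)
    perm[i], perm[j] = perm[j], perm[i]
    cost = qap_cost(perm)
    if cost <= best:
        best = cost
    else:
        perm[i], perm[j] = perm[j], perm[i]    # undo non-improving swap
print("best assignment:", perm, "cost:", best)
```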

  18. Singularities in Optimal Structural Design

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Guptill, J. D.; Berke, L.

    1992-01-01

    Singularity conditions that arise during structural optimization can seriously degrade the performance of the optimizer. The singularities are intrinsic to the formulation of the structural optimization problem and are not associated with the method of analysis. Certain conditions that give rise to singularities have been identified in earlier papers, encompassing the entire structure. Further examination revealed more complex sets of conditions in which singularities occur. Some of these singularities are local in nature, being associated with only a segment of the structure. Moreover, the likelihood that one of these local singularities may arise during an optimization procedure can be much greater than that of the global singularity identified earlier. Examples are provided of these additional forms of singularities. A framework is also given in which these singularities can be recognized. In particular, the singularities can be identified by examination of the stress displacement relations along with the compatibility conditions and/or the displacement stress relations derived in the integrated force method of structural analysis.

  19. Singularities in optimal structural design

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Guptill, J. D.; Berke, L.

    1992-01-01

    Singularity conditions that arise during structural optimization can seriously degrade the performance of the optimizer. The singularities are intrinsic to the formulation of the structural optimization problem and are not associated with the method of analysis. Certain conditions that give rise to singularities have been identified in earlier papers, encompassing the entire structure. Further examination revealed more complex sets of conditions in which singularities occur. Some of these singularities are local in nature, being associated with only a segment of the structure. Moreover, the likelihood that one of these local singularities may arise during an optimization procedure can be much greater than that of the global singularity identified earlier. Examples are provided of these additional forms of singularities. A framework is also given in which these singularities can be recognized. In particular, the singularities can be identified by examination of the stress displacement relations along with the compatibility conditions and/or the displacement stress relations derived in the integrated force method of structural analysis.

  20. Optimized quadrature surface coil designs

    PubMed Central

    Kumar, Ananda; Bottomley, Paul A.

    2008-01-01

    Background Quadrature surface MRI/MRS detectors comprised of circular loop and figure-8 or butterfly-shaped coils offer improved signal-to-noise-ratios (SNR) compared to single surface coils, and reduced power and specific absorption rates (SAR) when used for MRI excitation. While the radius of the optimum loop coil for performing MRI at depth d in a sample is known, the optimum geometry for figure-8 and butterfly coils is not. Materials and methods The geometries of figure-8 and square butterfly detector coils that deliver the optimum SNR are determined numerically by the electromagnetic method of moments. Figure-8 and loop detectors are then combined to create SNR-optimized quadrature detectors whose theoretical and experimental SNR performance are compared with a novel quadrature detector comprised of a strip and a loop, and with two overlapped loops optimized for the same depth at 3 T. The quadrature detection efficiency and local SAR during transmission for the three quadrature configurations are analyzed and compared. Results The SNR-optimized figure-8 detector has loop radius r8 ∼ 0.6d, so r8/r0 ∼ 1.3 in an optimized quadrature detector at 3 T. The optimized butterfly coil has side length ∼ d and crossover angle of ≥ 150° at the center. Conclusions These new design rules for figure-8 and butterfly coils optimize their performance as linear and quadrature detectors. PMID:18057975

  1. A computational molecular design framework for crosslinked polymer networks

    PubMed Central

    Eslick, J.C.; Ye, Q.; Park, J.; Topp, E.M.; Spencer, P.; Camarda, K.V.

    2013-01-01

    Crosslinked polymers are important in a very wide range of applications including dental restorative materials. However, currently used polymeric materials experience limited durability in the clinical oral environment. Researchers in the dental polymer field have generally used a time-consuming experimental trial-and-error approach to the design of new materials. The application of computational molecular design (CMD) to crosslinked polymer networks has the potential to facilitate development of improved polymethacrylate dental materials. CMD uses quantitative structure property relations (QSPRs) and optimization techniques to design molecules possessing desired properties. This paper describes a mathematical framework which provides tools necessary for the application of CMD to crosslinked polymer systems. The novel parts of the system include the data structures used, which allow for simple calculation of structural descriptors, and the formulation of the optimization problem. A heuristic optimization method, Tabu Search, is used to determine candidate monomers. Use of a heuristic optimization algorithm makes the system more independent of the types of QSPRs used, and more efficient when applied to combinatorial problems. A software package has been created which provides polymer researchers access to the design framework. A complete example of the methodology is provided for polymethacrylate dental materials. PMID:23904665

  2. Framework for computationally efficient optimal irrigation scheduling using ant colony optimization

    USDA-ARS?s Scientific Manuscript database

    A general optimization framework is introduced with the overall goal of reducing search space size and increasing the computational efficiency of evolutionary algorithm application for optimal irrigation scheduling. The framework achieves this goal by representing the problem in the form of a decisi...

  3. Optimal stochastic control in natural resource management: Framework and examples

    USGS Publications Warehouse

    Williams, B.K.

    1982-01-01

    A framework is presented for the application of optimal control methods to natural resource problems. An expression of the optimal control problem appropriate for renewable natural resources is given and its application to Markovian systems is presented in some detail. Three general approaches are outlined for determining optimal control of infinite time horizon systems and three examples from the natural resource literature are used for illustration.

  4. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.

  5. Optimal design of solidification processes

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Tortorelli, Daniel A.

    1991-01-01

    An optimal design algorithm is presented for the analysis of general solidification processes, and is demonstrated for the growth of GaAs crystals in a Bridgman furnace. The system is optimal in the sense that the prespecified temperature distribution in the solidifying materials is obtained to maximize product quality. The optimization uses traditional numerical programming techniques which require the evaluation of cost and constraint functions and their sensitivities. The finite element method is incorporated to analyze the crystal solidification problem, evaluate the cost and constraint functions, and compute the sensitivities. These techniques are demonstrated in the crystal growth application by determining an optimal furnace wall temperature distribution to obtain the desired temperature profile in the crystal, and hence to maximize the crystal's quality. Several numerical optimization algorithms are studied to determine the proper convergence criteria, effective 1-D search strategies, appropriate forms of the cost and constraint functions, etc. In particular, we incorporate the conjugate gradient and quasi-Newton methods for unconstrained problems. The efficiency and effectiveness of each algorithm is presented in the example problem.
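
    As a small illustration of the two unconstrained algorithm families compared in the abstract (conjugate gradient and quasi-Newton), the sketch below runs both on the Rosenbrock test function; in the paper the cost comes from a finite element solidification model instead.

```python
# Sketch only: comparing conjugate gradient and BFGS on a stand-in cost function.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0, 0.8, 1.5])
for method in ("CG", "BFGS"):
    res = minimize(rosen, x0, jac=rosen_der, method=method)
    print(f"{method:5s} iterations={res.nit:4d} f={res.fun:.3e}")
```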

  6. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  7. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    SciTech Connect

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  8. Optimizing Trial Designs for Targeted Therapies

    PubMed Central

    Beckman, Robert A.; Burman, Carl-Fredrik; König, Franz; Stallard, Nigel; Posch, Martin

    2016-01-01

    An important objective in the development of targeted therapies is to identify the populations where the treatment under consideration has a positive benefit-risk balance. We consider pivotal clinical trials, where the efficacy of a treatment is tested in an overall population and/or in a pre-specified subpopulation. Based on a decision-theoretic framework, we derive optimized trial designs by maximizing utility functions. Features to be optimized include the sample size and the population in which the trial is performed (the full population or the targeted subgroup only) as well as the underlying multiple test procedure. The approach accounts for prior knowledge of the efficacy of the drug in the considered populations using a two-dimensional prior distribution. The considered utility functions account for the costs of the clinical trial as well as the expected benefit when demonstrating efficacy in the different subpopulations. We model utility functions from a sponsor’s as well as from a public health perspective, reflecting actual civil interests. Examples of optimized trial designs obtained by numerical optimization are presented for both perspectives. PMID:27684573
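
    The sketch below illustrates the decision-theoretic idea with invented numbers: for each candidate sample size and enrollment choice (full population versus targeted subgroup), trial power is averaged over an assumed prior on the treatment effect and trial cost is subtracted to give an expected utility. Priors, gains, and costs are placeholders, not the paper's utility models.

```python
# Sketch only: expected utility of trial designs under an assumed effect-size prior.
import numpy as np
from scipy.stats import norm

alpha, sigma = 0.025, 1.0
effects = np.array([0.0, 0.2, 0.4])              # assumed prior support for effect size
prior_full = np.array([0.4, 0.4, 0.2])           # assumed prior, full population
prior_sub = np.array([0.2, 0.4, 0.4])            # assumed prior, targeted subgroup
gain = {"full": 1000.0, "subgroup": 400.0}       # assumed reward if the trial succeeds
cost_per_patient = 0.5

def power(effect, n_per_arm):
    # two-sample z-test power under a normal approximation
    z = effect / (sigma * np.sqrt(2.0 / n_per_arm)) - norm.ppf(1.0 - alpha)
    return norm.cdf(z)

for n in (100, 200, 400, 800):
    u_full = prior_full @ power(effects, n) * gain["full"] - 2 * n * cost_per_patient
    u_sub = prior_sub @ power(effects, n) * gain["subgroup"] - 2 * n * cost_per_patient
    print(f"n/arm={n:4d}  E[utility] full={u_full:7.1f}  subgroup={u_sub:7.1f}")
```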

  9. Research on optimization-based design

    NASA Technical Reports Server (NTRS)

    Balling, R. J.; Parkinson, A. R.; Free, J. C.

    1989-01-01

    Research on optimization-based design is discussed. Illustrative examples are given for cases involving continuous optimization with discrete variables and optimization with tolerances. Approximation of computationally expensive and noisy functions, electromechanical actuator/control system design using decomposition and application of knowledge-based systems and optimization for the design of a valve anti-cavitation device are among the topics covered.

  10. A Design Framework for Awareness Systems

    NASA Astrophysics Data System (ADS)

    Markopoulos, Panos

    This chapter discusses the design of awareness systems, whose main function is a social one, namely, to support social communication, mediated social interactions and eventually relationships of the individuals they connect. We focus especially on connecting friends and family rather than on systems used in the context of collaborative work. Readers interested in this latter kind of applications are referred to the design frameworks by Ginelle and Gutwin (2005) and Gutwin and Greenberg (2002). Below, we outline the relevant design space and the corresponding challenges for the design of awareness systems. The challenges pertain to social aspects of interaction design rather than the technological challenges relating to such systems. As such, they are inspired by Jonathan Grudin’s exposition of design challenges for the domain of groupware applications (Grudin, 1994).

  11. Automation enhancements in multidisciplinary design optimization

    NASA Astrophysics Data System (ADS)

    Wujek, Brett Alan

    The process of designing complex systems has necessarily evolved into one which includes the contributions and interactions of multiple disciplines. To date, the Multidisciplinary Design Optimization (MDO) process has been addressed mainly from the standpoint of algorithm development, with the primary concerns being effective and efficient coordination of disciplinary activities, modification of conventional optimization methods, and the utility of approximation techniques toward this goal. The focus of this dissertation is on improving the efficiency of MDO algorithms through the automation of common procedures and the development of improved methods to carry out these procedures. In this research, automation enhancements are made to the MDO process in three different areas: execution, sensitivity analysis and utility, and design variable move-limit management. A framework is developed along with a graphical user interface called NDOPT to automate the setup and execution of MDO algorithms in a research environment. The technology of automatic differentiation (AD) is utilized within various modules of MDO algorithms for fast and accurate sensitivity calculation, allowing for the frequent use of updated sensitivity information. With the use of AD, efficiency improvements are observed in the convergence of system analyses and in certain optimization procedures since gradient-based methods, traditionally considered cost-prohibitive, can be employed at a more reasonable expense. Finally, a method is developed to automatically monitor and adjust design variable move-limits for the approximate optimization process commonly used in MDO algorithms. With its basis in the well-established and provably convergent trust region approach, the Trust region Ratio Approximation method (TRAM) developed in this research accounts for approximation accuracy and the sensitivity of the model error to the design space in providing a flexible move-limit adjustment factor. Favorable results

  12. Designing efficient exponential integrators with EPIRK framework

    NASA Astrophysics Data System (ADS)

    Rainwater, Greg; Tokman, Mayya

    2017-07-01

    Exponential propagation iterative methods of Runge-Kutta type (EPIRK) provide a flexible framework to derive efficient exponential integrators for different types of ODE systems. Different classes of EPIRK methods can be constructed depending on the properties of the equations to be solved. Both classically and stiffly accurate EPIRK schemes can be derived. Flexibility of the order conditions allows to optimize coefficients to construct more efficient schemes. Particularly well-performing fourth-order stiffly accurate methods have been derived and applied to a number of problems. A new efficient three-stage fourth order method is presented and tested here using numerical examples.

  13. A Framework for Optimizing the Placement of Tidal Turbines

    NASA Astrophysics Data System (ADS)

    Nelson, K. S.; Roberts, J.; Jones, C.; James, S. C.

    2013-12-01

    Power generation with marine hydrokinetic (MHK) current energy converters (CECs), often in the form of underwater turbines, is receiving growing global interest. Because of reasonable investment, maintenance, reliability, and environmental friendliness, this technology can contribute to national (and global) energy markets and is worthy of research investment. Furthermore, in remote areas, small-scale MHK energy from river, tidal, or ocean currents can provide a local power supply. However, little is known about the potential environmental effects of CEC operation in coastal embayments, estuaries, or rivers, or of the cumulative impacts of these devices on aquatic ecosystems over years or decades of operation. There is an urgent need for practical, accessible tools and peer-reviewed publications to help industry and regulators evaluate environmental impacts and mitigation measures, while establishing best siting and design practices. Sandia National Laboratories (SNL) and Sea Engineering, Inc. (SEI) have investigated the potential environmental impacts and performance of individual tidal energy converters (TECs) in Cobscook Bay, ME; TECs are a subset of CECs that are specifically deployed in tidal channels. Cobscook Bay is the first deployment location of Ocean Renewable Power Company's (ORPC) TidGen™ unit. One unit is currently in place with four more to follow. Together, SNL and SEI built a coarse-grid, regional-scale model that included Cobscook Bay and all other landward embayments using the modeling platform SNL-EFDC. Within SNL-EFDC, tidal turbines are represented using a unique set of momentum extraction, turbulence generation, and turbulence dissipation equations at TEC locations. The global model was then coupled to a local-scale model that was centered on the proposed TEC deployment locations. An optimization framework was developed that used the refined model to determine optimal device placement locations that maximized array performance. Within the

  14. A Nonconvex Optimization Framework for Low Rank Matrix Estimation*

    PubMed Central

    Zhao, Tuo; Wang, Zhaoran; Liu, Han

    2016-01-01

    We study the estimation of low rank matrices via nonconvex optimization. Compared with convex relaxation, nonconvex optimization exhibits superior empirical performance for large-scale instances of low rank matrix estimation. However, the understanding of its theoretical guarantees is limited. In this paper, we define the notion of projected oracle divergence based on which we establish sufficient conditions for the success of nonconvex optimization. We illustrate the consequences of this general framework for matrix sensing. In particular, we prove that a broad class of nonconvex optimization algorithms, including alternating minimization and gradient-type methods, geometrically converge to the global optimum and exactly recover the true low rank matrices under standard conditions. PMID:28316458
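
    A minimal sketch of a gradient-type method on the factored (nonconvex) objective. A matrix-completion style observation mask is used as a stand-in for the matrix-sensing operator analyzed in the paper; the dimensions, rank, step size, and sampling rate are assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(1)
      d, r = 50, 3
      M_true = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))  # rank-3 target
      mask = rng.random((d, d)) < 0.4                 # observed entries (assumed pattern)

      U = 0.1 * rng.standard_normal((d, r))
      V = 0.1 * rng.standard_normal((d, r))
      lr = 0.01
      for _ in range(3000):
          R = mask * (U @ V.T - M_true)               # residual on observed entries only
          U, V = U - lr * R @ V, V - lr * R.T @ U     # gradient step on the two factors
      print(np.linalg.norm(U @ V.T - M_true) / np.linalg.norm(M_true))  # relative error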

  15. Optimal design of airlift fermenters

    SciTech Connect

    Moresi, M.

    1981-11-01

    In this article, a model of a draft-tube airlift fermenter (ALF) based on perfect back-mixing of the liquid and plug flow for the gas bubbles has been developed to optimize the design and operation of fermentation units at different working capacities. With reference to a whey fermentation by yeasts, the economic optimization has led to a slim ALF with an aspect ratio of about 15. As far as power expended per unit of oxygen transfer is concerned, the responses of the model are highly influenced by kLa. However, a safer use of the model has been suggested in order to assess the feasibility of the fermentation process under study. (Refs. 39).

  16. Multidisciplinary optimization in aircraft design using analytic technology models

    NASA Technical Reports Server (NTRS)

    Malone, Brett; Mason, W. H.

    1991-01-01

    An approach to multidisciplinary optimization is presented which combines the Global Sensitivity Equation method, parametric optimization, and analytic technology models. The result is a powerful yet simple procedure for identifying key design issues. It can be used both to investigate technology integration issues very early in the design cycle, and to establish the information flow framework between disciplines for use in multidisciplinary optimization projects using much more computational intense representations of each technology. To illustrate the approach, an examination of the optimization of a short takeoff heavy transport aircraft is presented for numerous combinations of performance and technology constraints.

  17. Designing directories in distributed systems: A systematic framework

    SciTech Connect

    Chandy, K.M.; Schooler, E.M.

    1996-12-31

    This paper proposes a framework for the systematic design of directory-based distributed applications. We evaluate a space of directory designs using our framework. We present a case study consisting of design, implementation and analysis of directories for a multicast application. Our framework is based on a model that extends the formal concept of process knowledge in distributed systems. This concept is used informally in phrases such as “process p knows when it is in state s that process q is active.” We show that this definition of knowledge is too strong for many distributed applications, including directory design. We propose a weaker concept: estimation. We describe the meaning of phrases of the form: “process p in state s estimates with probability 0.9 that process q is active.” We specify directory design as an optimization problem with the objective function of maximizing estimation probabilities, and with constraints on the amount of bandwidth, computation and storage used. We show how this specification helps in a systematic analysis of alternative directory designs.

  18. Design framework for entanglement-distribution switching networks

    NASA Astrophysics Data System (ADS)

    Drost, Robert J.; Brodsky, Michael

    2016-09-01

    The distribution of quantum entanglement appears to be an important component of applications of quantum communications and networks. The ability to centralize the sourcing of entanglement in a quantum network can provide for improved efficiency and enable a variety of network structures. A necessary feature of an entanglement-sourcing network node comprising several sources of entangled photons is the ability to reconfigurably route the generated pairs of photons to network neighbors depending on the desired entanglement sharing of the network users at a given time. One approach to such routing is the use of a photonic switching network. The requirements for an entanglement distribution switching network are less restrictive than for typical conventional applications, leading to design freedom that can be leveraged to optimize additional criteria. In this paper, we present a mathematical framework defining the requirements of an entanglement-distribution switching network. We then consider the design of such a switching network using a number of 2 × 2 crossbar switches, addressing the interconnection of these switches and efficient routing algorithms. In particular, we define a worst-case loss metric and consider 6 × 6, 8 × 8, and 10 × 10 network designs that optimize both this metric and the number of crossbar switches composing the network. We pay particular attention to the 10 × 10 network, detailing novel results proving the optimality of the proposed design. These optimized network designs have great potential for use in practical quantum networks, thus advancing the concept of quantum networks toward reality.

  19. An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures

    SciTech Connect

    Schuman, Catherine D; Plank, James; Disney, Adam; Reynolds, John

    2016-01-01

    As new neural network and neuromorphic architectures are being developed, new training methods that operate within the constraints of the new architectures are required. Evolutionary optimization (EO) is a convenient training method for new architectures. In this work, we review a spiking neural network architecture and a neuromorphic architecture, and we describe an EO training framework for these architectures. We present the results of this training framework on four classification data sets and compare those results to other neural network and neuromorphic implementations. We also discuss how this EO framework may be extended to other architectures.

  20. Globally optimal trial design for local decision making.

    PubMed

    Eckermann, Simon; Willan, Andrew R

    2009-02-01

    Value of information methods allow decision makers to identify efficient trial designs following a principle of maximizing the expected value to decision makers of information from potential trial designs relative to their expected cost. However, in health technology assessment (HTA) the restrictive assumption has been made that, prospectively, there is only expected value of sample information from research commissioned within jurisdiction. This paper extends the framework for optimal trial design and decision making within jurisdiction to allow for optimal trial design across jurisdictions. This is illustrated in identifying an optimal trial design for decision making across the US, the UK and Australia for early versus late external cephalic version for pregnant women presenting in the breech position. The expected net gain from locally optimal trial designs of US$0.72M is shown to increase to US$1.14M with a globally optimal trial design. In general, the proposed method of globally optimal trial design improves on optimal trial design within jurisdictions by: (i) reflecting the global value of non-rival information; (ii) allowing optimal allocation of trial sample across jurisdictions; (iii) avoiding market failure associated with free-rider effects, sub-optimal spreading of fixed costs and heterogeneity of trial information with multiple trials.

  1. When Playing Meets Learning: Methodological Framework for Designing Educational Games

    NASA Astrophysics Data System (ADS)

    Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich

    Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC-project ELEKTRA a methodological framework for the conceptual design of educational games was developed. Thereby state-of-the-art psycho-pedagogical approaches were combined with insights of media-psychology as well as with best-practice game design. This science-based interdisciplinary approach was enriched by enclosed empirical research to answer open questions on educational game-design. Additionally, several evaluation-cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by the ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback-cycles that enable a close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.

  2. Optimal Designs for the Rasch Model

    ERIC Educational Resources Information Center

    Grasshoff, Ulrike; Holling, Heinz; Schwabe, Rainer

    2012-01-01

    In this paper, optimal designs will be derived for estimating the ability parameters of the Rasch model when difficulty parameters are known. It is well established that a design is locally D-optimal if the ability and difficulty coincide. But locally optimal designs require that the ability parameters to be estimated are known. To attenuate this…

  3. Optimal Designs for the Rasch Model

    ERIC Educational Resources Information Center

    Grasshoff, Ulrike; Holling, Heinz; Schwabe, Rainer

    2012-01-01

    In this paper, optimal designs will be derived for estimating the ability parameters of the Rasch model when difficulty parameters are known. It is well established that a design is locally D-optimal if the ability and difficulty coincide. But locally optimal designs require that the ability parameters to be estimated are known. To attenuate this…

  4. An optimal structural design algorithm using optimality criteria

    NASA Technical Reports Server (NTRS)

    Taylor, J. E.; Rossow, M. P.

    1976-01-01

    An algorithm for optimal design is given which incorporates several of the desirable features of both mathematical programming and optimality criteria, while avoiding some of the undesirable features. The algorithm proceeds by approaching the optimal solution through the solutions of an associated set of constrained optimal design problems. The solutions of the constrained problems are recognized at each stage through the application of optimality criteria based on energy concepts. Two examples are described in which the optimal member size and layout of a truss is predicted, given the joint locations and loads.
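
    As one concrete instance of a stress-based optimality criterion, the sketch below applies the classical fully-stressed (stress-ratio) resizing rule to an assumed two-bar, statically determinate truss. It is a textbook illustration of the idea of resizing toward an optimality condition, not the specific algorithm of the paper.

      import numpy as np

      # Assumed two-bar example: member forces are fixed by statics, so the
      # stress-ratio update a_new = a_old * |sigma| / sigma_allow converges quickly.
      lengths = np.array([1.0, 1.414])        # member lengths (m)
      forces = np.array([50e3, -70.7e3])      # member forces (N) from the load path
      sigma_allow = 150e6                     # allowable stress (Pa)
      areas = np.full(2, 1.0e-3)              # initial cross-sectional areas (m^2)

      for _ in range(10):
          sigma = forces / areas              # member stresses
          areas = areas * np.abs(sigma) / sigma_allow   # resize toward full stress
      print(areas, lengths @ areas)           # final areas and total material volume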

  5. Synergy: A language and framework for robot design

    NASA Astrophysics Data System (ADS)

    Katragadda, Lalitesh Kumar

    Due to escalation in complexity, capability and application, robot design is increasingly difficult. A design environment can automate many design tasks, relieving the designer's burden. Prior to robot development, designers compose a robot from existing or custom developed components, simulate performance, optimize configuration and parameters, and write software for the robot. Robot designers customize these facets to the robot using a variety of software ranging from spreadsheets to C code to CAD tools. Valuable resources are expended, and very little of this expertise and development is reusable. This research begins with the premise that a language to comprehensively represent robots is lacking and that the aforementioned design tasks can be automated once such a language exists. This research proposes and demonstrates the following thesis: "A language to represent robots, along with a framework to generate simulations, optimize designs and generate control software, increases the effectiveness of design." Synergy is the software developed in this research to reflect this philosophy. Synergy was prototyped and demonstrated in the context of lunar rover design, a challenging real-world problem with multiple requirements and a broad design space. Synergy was used to automatically optimize robot parameters and select parts to generate effective designs, while meeting constraints of the embedded components and sub-systems. The generated designs are superior in performance and consistency when compared to designs by teams of designers using the same knowledge. Using a single representation, multiple designs are generated for four distinct lunar exploration objectives. Synergy uses the same representation to auto-generate landing simulations and simultaneously generate control software for the landing. Synergy consists of four software agents. A database and spreadsheet agent compiles the design and component information, generating component interconnections and

  6. A duality framework for stochastic optimal control of complex systems

    SciTech Connect

    Malikopoulos, Andreas A.

    2016-01-01

    In this study, we address the problem of minimizing the long-run expected average cost of a complex system consisting of interactive subsystems. We formulate a multiobjective optimization problem of the one-stage expected costs of the subsystems and provide a duality framework to prove that the control policy yielding the Pareto optimal solution minimizes the average cost criterion of the system. We provide the conditions of existence and a geometric interpretation of the solution. For practical situations having constraints consistent with those studied here, our results imply that the Pareto control policy may be of value when we seek to derive online the optimal control policy in complex systems.

  7. Evolutionary squeaky wheel optimization: a new framework for analysis.

    PubMed

    Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K

    2011-01-01

    Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration SWO does a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from a relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct other poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used in local search (such as simulated annealing) which only move through complete assignments. Generally, the exact details of ESWO will depend on various heuristics, so we focus our approach on a case of ESWO that we call ESWO-II and that has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary distribution probability over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
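
    The analysis hinges on the stationary distribution of a Markov chain over search states. The toy computation below (an assumed 3-state chain, not ESWO-II itself) shows how such a distribution is obtained as the left eigenvector of the transition matrix associated with eigenvalue 1.

      import numpy as np

      # Assumed 3-state transition matrix over (partial-)assignment states.
      P = np.array([[0.6, 0.3, 0.1],
                    [0.2, 0.5, 0.3],
                    [0.1, 0.2, 0.7]])
      eigvals, eigvecs = np.linalg.eig(P.T)
      pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])  # eigenvector for eigenvalue 1
      pi /= pi.sum()
      print(pi)   # stationary probability of being in each state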

  8. A Framework for Optimal Control Allocation with Structural Load Constraints

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Taylor, Brian R.; Jutte, Christine V.; Burken, John J.; Trinh, Khanh V.; Bodson, Marc

    2010-01-01

    Conventional aircraft generally employ mixing algorithms or lookup tables to determine control surface deflections needed to achieve moments commanded by the flight control system. Control allocation is the problem of converting desired moments into control effector commands. Next generation aircraft may have many multipurpose, redundant control surfaces, adding considerable complexity to the control allocation problem. These issues can be addressed with optimal control allocation. Most optimal control allocation algorithms have control surface position and rate constraints. However, these constraints are insufficient to ensure that the aircraft's structural load limits will not be exceeded by commanded surface deflections. In this paper, a framework is proposed to enable a flight control system with optimal control allocation to incorporate real-time structural load feedback and structural load constraints. A proof of concept simulation that demonstrates the framework in a simulation of a generic transport aircraft is presented.
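
    A hedged sketch of the idea in its simplest form: allocation posed as a small constrained least-squares problem, with a linear structural-load constraint standing in for the real-time load feedback described above. The effectiveness matrix, load sensitivities, and limits are invented for illustration and are not the paper's values.

      import numpy as np
      from scipy.optimize import minimize

      B = np.array([[1.0, 0.6, 0.0],          # assumed effectiveness matrix:
                    [0.0, 0.4, 1.0]])         # roll/pitch moment per unit deflection
      m_cmd = np.array([0.8, 0.5])            # commanded moments
      load_sens = np.array([2.0, 1.5, 1.0])   # assumed structural-load sensitivities
      load_limit = 1.0                        # assumed structural-load limit

      objective = lambda u: np.sum((B @ u - m_cmd) ** 2)
      constraints = [{"type": "ineq", "fun": lambda u: load_limit - load_sens @ u}]
      bounds = [(-0.5, 0.5)] * 3              # surface position limits (rad)
      res = minimize(objective, np.zeros(3), method="SLSQP",
                     bounds=bounds, constraints=constraints)
      print(res.x, B @ res.x)                 # deflections and achieved moments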

  9. Optimized IR synchrotron beamline design.

    PubMed

    Moreno, Thierry

    2015-09-01

    Synchrotron infrared beamlines are powerful tools on which to perform spectroscopy on microscopic length scales but require working with large bending-magnet source apertures in order to provide intense photon beams to the experiments. Many infrared beamlines use a single toroidal-shaped mirror to focus the source emission which generates, for large apertures, beams with significant geometrical aberrations resulting from the shape of the source and the beamline optics. In this paper, an optical layout optimized for synchrotron infrared beamlines, that removes almost totally the geometrical aberrations of the source, is presented and analyzed. This layout is already operational on the IR beamline of the Brazilian synchrotron. An infrared beamline design based on a SOLEIL bending-magnet source is given as an example, which could be useful for future IR beamline improvements at this facility.

  10. Pathway Design, Engineering, and Optimization.

    PubMed

    Garcia-Ruiz, Eva; HamediRad, Mohammad; Zhao, Huimin

    2016-09-16

    The microbial metabolic versatility found in nature has inspired scientists to create microorganisms capable of producing value-added compounds. Many endeavors have been made to transfer and/or combine pathways, existing or even engineered enzymes with new function to tractable microorganisms to generate new metabolic routes for drug, biofuel, and specialty chemical production. However, the success of these pathways can be impeded by different complications from an inherent failure of the pathway to cell perturbations. Pursuing ways to overcome these shortcomings, a wide variety of strategies have been developed. This chapter will review the computational algorithms and experimental tools used to design efficient metabolic routes, and construct and optimize biochemical pathways to produce chemicals of high interest.

  11. An Evolutionary Optimization System for Spacecraft Design

    NASA Technical Reports Server (NTRS)

    Fukunaga, A.; Stechert, A.

    1997-01-01

    Spacecraft design optimization is a domain that can benefit from the application of optimization algorithms such as genetic algorithms. In this paper, we describe DEVO, an evolutionary optimization system that addresses these issues and provides a tool that can be applied to a number of real-world spacecraft design applications. We describe two current applications of DEVO: physical design of a Mars Microprobe Soil Penetrator, and system configuration optimization for a Neptune Orbiter.

  12. Optimal design of compact spur gear reductions

    NASA Technical Reports Server (NTRS)

    Savage, M.; Lattime, S. B.; Kimmel, J. A.; Coe, H. H.

    1992-01-01

    The optimal design of compact spur gear reductions includes the selection of bearing and shaft proportions in addition to gear mesh parameters. Designs for single mesh spur gear reductions are based on optimization of system life, system volume, and system weight including gears, support shafts, and the four bearings. The overall optimization allows component properties to interact, yielding the best composite design. A modified feasible directions search algorithm directs the optimization through a continuous design space. Interpolated polynomials expand the discrete bearing properties and proportions into continuous variables for optimization. After finding the continuous optimum, the designer can analyze near optimal designs for comparison and selection. Design examples show the influence of the bearings on the optimal configurations.

  13. Effect of framework design on crown failure.

    PubMed

    Bonfante, Estevam A; da Silva, Nelson R F A; Coelho, Paulo G; Bayardo-González, Daniel E; Thompson, Van P; Bonfante, Gerson

    2009-04-01

    This study evaluated the effect of core-design modification on the characteristic strength and failure modes of glass-infiltrated alumina (In-Ceram) (ICA) compared with porcelain fused to metal (PFM). Premolar crowns of a standard design (PFMs and ICAs) or with a modified framework design (PFMm and ICAm) were fabricated, cemented on dies, and loaded until failure. The crowns were loaded at 0.5 mm/min using a 6.25 mm tungsten-carbide ball at the central fossa. Fracture load values were recorded, and fracture analysis of representative samples was performed using scanning electron microscopy. Probability Weibull curves with two-sided 90% confidence limits were calculated for each group and a contour plot of the characteristic strength was obtained. Design modification showed an increase in the characteristic strength of the PFMm and ICAm groups, with PFM groups showing higher characteristic strength than ICA groups. The PFMm group showed the highest characteristic strength among all groups. Fracture modes of PFMs and of PFMm frequently reached the core interface at the lingual cusp, whereas ICA exhibited bulk fracture through the alumina core. Core-design modification significantly improved the characteristic strength for PFM and for ICA. The PFM groups demonstrated higher characteristic strength than both ICA groups combined.
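
    For readers unfamiliar with the statistics used here, the short sketch below fits a two-parameter Weibull distribution to simulated fracture loads and reports the Weibull modulus and the characteristic strength (the 63.2% quantile). The numbers are invented and are not the study's data.

      import numpy as np
      from scipy import stats

      # Simulated fracture loads (N); the real study uses measured crown failures.
      loads = stats.weibull_min.rvs(8.0, scale=1400.0, size=30, random_state=0)
      shape, loc, scale = stats.weibull_min.fit(loads, floc=0)   # two-parameter fit
      print(f"Weibull modulus ~ {shape:.1f}, characteristic strength ~ {scale:.0f} N")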

  14. Designing Educational Software with Students through Collaborative Design Games: The We!Design&Play Framework

    ERIC Educational Resources Information Center

    Triantafyllakos, George; Palaigeorgiou, George; Tsoukalas, Ioannis A.

    2011-01-01

    In this paper, we present a framework for the development of collaborative design games that can be employed in participatory design sessions with students for the design of educational applications. The framework is inspired by idea generation theory and the design games literature, and guides the development of board games which, through the use…

  15. Designing Educational Software with Students through Collaborative Design Games: The We!Design&Play Framework

    ERIC Educational Resources Information Center

    Triantafyllakos, George; Palaigeorgiou, George; Tsoukalas, Ioannis A.

    2011-01-01

    In this paper, we present a framework for the development of collaborative design games that can be employed in participatory design sessions with students for the design of educational applications. The framework is inspired by idea generation theory and the design games literature, and guides the development of board games which, through the use…

  16. Optimizing SRF Gun Cavity Profiles in a Genetic Algorithm Framework

    SciTech Connect

    Alicia Hofler, Pavel Evtushenko, Frank Marhauser

    2009-09-01

    Automation of DC photoinjector designs using a genetic algorithm (GA) based optimization is an accepted practice in accelerator physics. Allowing the gun cavity field profile shape to be varied can extend the utility of this optimization methodology to superconducting and normal conducting radio frequency (SRF/RF) gun-based injectors. Finding optimal field and cavity geometry configurations can provide guidance for cavity design choices and verify existing designs. We have considered two approaches for varying the electric field profile. The first is to determine the optimal field profile shape that should be used independent of the cavity geometry, and the other is to vary the geometry of the gun cavity structure to produce an optimal field profile. The first method can provide a theoretical optimum and can illuminate where possible gains can be made in field shaping. The second method can produce more realistically achievable designs that can be compared to existing designs. In this paper, we discuss the design and implementation of these two methods for generating field profiles for SRF/RF guns in a GA-based injector optimization scheme and provide preliminary results.
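
    A minimal GA loop of the kind such schemes build on, with an invented merit function standing in for the beam-dynamics simulation; the population size, mutation scale, and 8-parameter field-profile encoding are assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(2)
      target = np.linspace(0.0, 1.0, 8)          # assumed "ideal" field-profile shape

      def fitness(p):
          # Hypothetical merit function standing in for the beam-dynamics figure
          # of merit (e.g., emittance) that a real injector optimization evaluates.
          return -np.sum((p - target) ** 2)

      pop = rng.random((40, 8))                  # 40 candidate profiles, 8 parameters each
      for _ in range(100):
          scores = np.array([fitness(p) for p in pop])
          parents = pop[np.argsort(scores)[-20:]]                 # keep the better half
          children = parents[rng.integers(0, 20, size=20)].copy()
          children += 0.05 * rng.standard_normal(children.shape)  # Gaussian mutation
          pop = np.vstack([parents, children])
      best = max(pop, key=fitness)
      print(best)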

  17. Handling Qualities Optimization for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Theodore, Colin R.; Berger, Tom

    2016-01-01

    Over the past decade, NASA, under a succession of rotary-wing programs, has been moving towards coupling multiple-discipline analyses in a rigorous, consistent manner to evaluate rotorcraft conceptual designs. Handling qualities is one of the component analyses to be included in a future NASA Multidisciplinary Analysis and Optimization framework for conceptual design of VTOL aircraft. Similarly, the future vision for the capability of the Concept Design and Assessment Technology Area (CD&A-TA) of the U.S. Army Aviation Development Directorate also includes a handling qualities component. SIMPLI-FLYD is a tool jointly developed by NASA and the U.S. Army to perform modeling and analysis for the assessment of flight dynamics and control aspects of the handling qualities of rotorcraft conceptual designs. An exploration of handling qualities analysis has been carried out using SIMPLI-FLYD in illustrative scenarios of a tiltrotor in forward flight and a single-main-rotor helicopter in hover. Using SIMPLI-FLYD and the conceptual design tool NDARC integrated into a single process, the effects of variations of design parameters such as tail or rotor size were evaluated in the form of margins to fixed- and rotary-wing handling qualities metrics as well as the vehicle empty weight. The handling qualities design margins are shown to vary across the flight envelope due to both changing flight dynamic and control characteristics and changing handling qualities specification requirements. The current SIMPLI-FLYD capability and future developments are discussed in the context of an overall rotorcraft conceptual design process.

  18. Design optimization method for Francis turbine

    NASA Astrophysics Data System (ADS)

    Kawajiri, H.; Enomoto, Y.; Kurosawa, S.

    2014-03-01

    This paper presents a design optimization system coupled with CFD. The optimization algorithm of the system employs particle swarm optimization (PSO). Blade shape design is carried out using a NURBS curve defined by a series of control points. The system was applied to the design of the stationary vanes and the runner of a higher-specific-speed Francis turbine. As the first step, single-objective optimization was performed on the stay vane profile, and the second step was multi-objective optimization of the runner over a wide operating range. As a result, it was confirmed that the design system is useful for the development of hydro turbines.
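
    A bare-bones PSO loop in the spirit of the system described above, with a trivial quadratic standing in for the CFD evaluation of a candidate blade shape; the inertia and acceleration coefficients are common textbook values and the 6-parameter encoding is an assumption, not the authors' setup.

      import numpy as np

      rng = np.random.default_rng(3)

      def loss(x):
          # Hypothetical stand-in for the CFD objective evaluated on a blade shape
          # parameterized by NURBS control-point values x.
          return np.sum((x - 0.3) ** 2)

      n_particles, dim = 30, 6
      x = rng.random((n_particles, dim))
      v = np.zeros((n_particles, dim))
      pbest = x.copy()
      pbest_val = np.array([loss(p) for p in x])
      gbest = pbest[np.argmin(pbest_val)].copy()
      for _ in range(200):
          r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
          v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
          x = x + v
          vals = np.array([loss(p) for p in x])
          better = vals < pbest_val
          pbest[better], pbest_val[better] = x[better], vals[better]
          gbest = pbest[np.argmin(pbest_val)].copy()
      print(gbest, loss(gbest))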

  19. Design sensitivity analysis and optimization tool (DSO) for sizing design applications

    NASA Technical Reports Server (NTRS)

    Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa

    1992-01-01

    The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.

  20. Design sensitivity analysis and optimization tool (DSO) for sizing design applications

    NASA Technical Reports Server (NTRS)

    Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa

    1992-01-01

    The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.

  1. A dynamic hybrid framework for constrained evolutionary optimization.

    PubMed

    Wang, Yong; Cai, Zixing

    2012-02-01

    Based on our previous work, this paper presents a dynamic hybrid framework, called DyHF, for solving constrained optimization problems. This framework consists of two major steps: global search model and local search model. In the global and local search models, differential evolution serves as the search engine, and Pareto dominance used in multiobjective optimization is employed to compare the individuals in the population. Unlike other existing methods, the above two steps are executed dynamically according to the feasibility proportion of the current population in this paper, with the purpose of reasonably distributing the computational resource for the global and local search during the evolution. The performance of DyHF is tested on 22 benchmark test functions. The experimental results clearly show that the overall performance of DyHF is highly competitive with that of a number of state-of-the-art approaches from the literature.
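
    A rough sketch of the ingredients described above, not DyHF itself: differential evolution serves as the search engine, candidates are compared by a feasibility-first rule, and the feasibility proportion of the population is used to switch between a more exploratory and a more refining setting. The test problem, the constraint, and the way the proportion is used are simplifications invented for illustration.

      import numpy as np

      rng = np.random.default_rng(5)

      def objective(x):
          return np.sum(x ** 2)

      def violation(x):
          return max(0.0, 1.0 - np.sum(x))            # assumed constraint: sum(x) >= 1

      pop = rng.uniform(-2.0, 2.0, (30, 5))
      for _ in range(200):
          feas_prop = np.mean([violation(x) == 0.0 for x in pop])
          F = 0.9 if feas_prop < 0.5 else 0.4          # crude exploration/refinement switch
          for i in range(len(pop)):
              a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
              trial = a + F * (b - c)                  # DE/rand/1 mutation
              # Prefer lower constraint violation, then lower objective value.
              if (violation(trial), objective(trial)) < (violation(pop[i]), objective(pop[i])):
                  pop[i] = trial
      best = min(pop, key=lambda x: (violation(x), objective(x)))
      print(best, objective(best), violation(best))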

  2. Program Aids Analysis And Optimization Of Design

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Lamarsh, William J., II

    1994-01-01

    NETS/PROSSS (NETS Coupled With Programming System for Structural Synthesis) computer program developed to provide system for combining NETS (MSC-21588), neural-network application program, and CONMIN (Constrained Function Minimization, ARC-10836), optimization program. Enables user to reach nearly optimal design. Design then used as starting point in normal optimization process, possibly enabling user to converge to optimal solution in significantly fewer iterations. NETS/PROSSS written in C language and FORTRAN 77.

  3. Program Aids Analysis And Optimization Of Design

    NASA Technical Reports Server (NTRS)

    Rogers, James L., Jr.; Lamarsh, William J., II

    1994-01-01

    NETS/PROSSS (NETS Coupled With Programming System for Structural Synthesis) computer program developed to provide system for combining NETS (MSC-21588), neural-network application program, and CONMIN (Constrained Function Minimization, ARC-10836), optimization program. Enables user to reach nearly optimal design. Design then used as starting point in normal optimization process, possibly enabling user to converge to optimal solution in significantly fewer iterations. NETS/PROSSS written in C language and FORTRAN 77.

  4. A Communication-Optimal Framework for Contracting Distributed Tensors

    SciTech Connect

    Rajbhandari, Samyam; NIkam, Akshay; Lai, Pai-Wei; Stock, Kevin; Krishnamoorthy, Sriram; Sadayappan, Ponnuswamy

    2014-11-16

    Tensor contractions are extremely compute intensive generalized matrix multiplication operations encountered in many computational science fields, such as quantum chemistry and nuclear physics. Unlike distributed matrix multiplication, which has been extensively studied, limited work has been done in understanding distributed tensor contractions. In this paper, we characterize distributed tensor contraction algorithms on torus networks. We develop a framework with three fundamental communication operators to generate communication-efficient contraction algorithms for arbitrary tensor contractions. We show that for a given amount of memory per processor, our framework is communication optimal for all tensor contractions. We demonstrate performance and scalability of our framework on up to 262,144 cores of BG/Q supercomputer using five tensor contraction examples.
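
    For context, a single-node example of the kind of contraction being distributed, written with numpy.einsum; the paper's contribution, communication-optimal distribution of such contractions over a torus network, is not shown here, and the index structure and sizes are assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      T = rng.standard_normal((8, 8, 8, 8))    # a 4-index tensor (assumed sizes)
      A = rng.standard_normal((8, 8))
      # Contract over the two shared indices k and l:
      # C[i, j] = sum_{k, l} T[i, k, j, l] * A[k, l]
      C = np.einsum("ikjl,kl->ij", T, A)
      print(C.shape)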

  5. Analysis and System Design Framework for Infrared Spatial Heterodyne Spectrometers

    SciTech Connect

    Cooke, B.J.; Smith, B.W.; Laubscher, B.E.; Villeneuve, P.V.; Briles, S.D.

    1999-04-05

    The authors present a preliminary analysis and design framework developed for the evaluation and optimization of infrared Imaging Spatial Heterodyne Spectrometer (SHS) electro-optic systems. Commensurate with conventional interferometric spectrometers, SHS modeling requires an integrated analysis environment for rigorous evaluation of system error propagation due to detection process, detection noise, system motion, retrieval algorithm and calibration algorithm. The analysis tools provide for optimization of critical system parameters and components including: (1) optical aperture, f-number, and spectral transmission, (2) SHS interferometer grating and Littrow parameters, (3) image plane requirements as well as cold shield, optical filtering, and focal-plane dimensions, pixel dimensions and quantum efficiency, (4) SHS spatial and temporal sampling parameters, and (5) retrieval and calibration algorithm issues.

  6. Vehicle systems design optimization study

    SciTech Connect

    Gilmour, J. L.

    1980-04-01

    The optimization of an electric vehicle layout requires a weight distribution in the range of 53/47 to 62/38 in order to assure dynamic handling characteristics comparable to current production internal combustion engine vehicles. It is possible to achieve this goal and also provide passenger and cargo space comparable to a selected current production sub-compact car either in a unique new design or by utilizing the production vehicle as a base. Necessary modification of the base vehicle can be accomplished without major modification of the structure or running gear. As long as batteries are as heavy and require as much space as they currently do, they must be divided into two packages - one at the front under the hood and a second at the rear under the cargo area - in order to achieve the desired weight distribution. The weight distribution criterion requires the placement of batteries at the front of the vehicle even when the central tunnel is used for the location of some batteries. The optimum layout has a front motor and front wheel drive. This configuration provides the optimum vehicle dynamic handling characteristics and the maximum passenger and cargo space for a given size vehicle.

  7. Integrated multidisciplinary design optimization of rotorcraft

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.; Mantay, Wayne R.

    1989-01-01

    The NASA/Army research plan for developing the logic elements for helicopter rotor design optimization by integrating appropriate disciplines and accounting for important interactions among the disciplines is discussed. The paper describes the optimization formulation in terms of the objective function, design variables, and constraints. The analysis aspects are discussed, and an initial effort at defining the interdisciplinary coupling is summarized. Results are presented on the achievements made in the rotor aerodynamic performance optimization for minimum hover horsepower, rotor dynamic optimization for vibration reduction, rotor structural optimization for minimum weight, and integrated aerodynamic load/dynamics optimization for minimum vibration and weight.

  8. Multi-Disciplinary Design Optimization Using WAVE

    NASA Technical Reports Server (NTRS)

    Irwin, Keith

    2000-01-01

    develop an associative control structure (framework) in the UG WAVE environment enabling multi-disciplinary design of turbine propulsion systems. The capabilities of WAVE were evaluated to assess its use as a rapid optimization and productivity tool. This project also identified future WAVE product enhancements that will make the tool still more beneficial for product development.

  9. Optimal control concepts in design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.

    1987-01-01

    A close link is established between open loop optimal control theory and optimal design by noting certain similarities in the gradient calculations. The resulting benefits include a unified approach, together with physical insights in design sensitivity analysis, and an efficient approach for simultaneous optimal control and design. Both matrix displacement and matrix force methods are considered, and results are presented for dynamic systems, structures, and elasticity problems.

  10. Design of crashworthy structures with controlled behavior in HCA framework

    NASA Astrophysics Data System (ADS)

    Bandi, Punit

    The field of crashworthiness design is gaining more interest and attention from automakers around the world due to increasing competition and tighter safety norms. In the last two decades, topology and topometry optimization methods from structural optimization have been widely explored to improve existing designs or conceive new designs with better crashworthiness. Although many gradient-based and heuristic methods for topology- and topometry-based crashworthiness design are available these days, most of them result in stiff structures that are suitable only for a set of vehicle components in which maximizing the energy absorption or minimizing the intrusion is the main concern. However, there are some other components in a vehicle structure that should have characteristics of both stiffness and flexibility. Moreover, the load paths within the structure and potential buckle modes also play an important role in efficient functioning of such components. For example, the front bumper, side frame rails, steering column, and occupant protection devices like the knee bolster should all exhibit controlled deformation and collapse behavior. The primary objective of this research is to develop new methodologies to design crashworthy structures with controlled behavior. The well established Hybrid Cellular Automaton (HCA) method is used as the basic framework for the new methodologies, and compliant mechanism-type (sub)structures are the highlight of this research. The ability of compliant mechanisms to efficiently transfer force and/or motion from points of application of input loads to desired points within the structure is used to design solid and tubular components that exhibit controlled deformation and collapse behavior under crash loads. In addition, a new methodology for controlling the behavior of a structure under multiple crash load scenarios by adaptively changing the contributions from individual load cases is developed. Applied to practical design problems

  11. An Exploratory Framework for Combining CFD Analysis and Evolutionary Optimization into a Single Integrated Computational Environment

    SciTech Connect

    McCorkle, Douglas S.; Bryden, Kenneth M.

    2011-01-01

    Several recent reports and workshops have identified integrated computational engineering as an emerging technology with the potential to transform engineering design. The goal is to integrate geometric models, analyses, simulations, optimization and decision-making tools, and all other aspects of the engineering process into a shared, interactive computer-generated environment that facilitates multidisciplinary and collaborative engineering. While integrated computational engineering environments can be constructed from scratch with high-level programming languages, the complexity of these proposed environments makes this type of approach prohibitively slow and expensive. Rather, a high-level software framework is needed to provide the user with the capability to construct an application in an intuitive manner using existing models and engineering tools with minimal programming. In this paper, we present an exploratory open source software framework that can be used to integrate the geometric models, computational fluid dynamics (CFD), and optimization tools needed for shape optimization of complex systems. This framework is demonstrated using the multiphase flow analysis of a complete coal transport system for an 800 MW pulverized coal power station. The framework uses engineering objects and three-dimensional visualization to enable the user to interactively design and optimize the performance of the coal transport system.

  12. A duality framework for stochastic optimal control of complex systems

    DOE PAGES

    Malikopoulos, Andreas A.

    2016-01-01

    In this study, we address the problem of minimizing the long-run expected average cost of a complex system consisting of interactive subsystems. We formulate a multiobjective optimization problem of the one-stage expected costs of the subsystems and provide a duality framework to prove that the control policy yielding the Pareto optimal solution minimizes the average cost criterion of the system. We provide the conditions of existence and a geometric interpretation of the solution. For practical situations having constraints consistent with those studied here, our results imply that the Pareto control policy may be of value when we seek to derive online the optimal control policy in complex systems.

  13. Optimal design criteria - prediction vs. parameter estimation

    NASA Astrophysics Data System (ADS)

    Waldl, Helmut

    2014-05-01

    G-optimality is a popular design criterion for optimal prediction: it tries to minimize the kriging variance over the whole design region, so a G-optimal design minimizes the maximum variance of all predicted values. If we use kriging methods for prediction, it is natural to use the kriging variance as a measure of uncertainty for the estimates. However, computing the kriging variance, and even more so the empirical kriging variance, is very costly, and finding the maximum kriging variance in high-dimensional regions can be so time-consuming that the G-optimal design cannot, in practice, be found with the computing equipment available today. We cannot always avoid this problem by using space-filling designs, because small designs that minimize the empirical kriging variance are often non-space-filling. D-optimality is the design criterion related to parameter estimation: a D-optimal design maximizes the determinant of the information matrix of the estimates. D-optimality in terms of trend parameter estimation and D-optimality in terms of covariance parameter estimation yield fundamentally different designs. The Pareto frontier of these two competing determinant criteria corresponds to designs that perform well under both criteria. Under certain conditions, searching for the G-optimal design on this Pareto frontier yields almost as good results as searching the whole design region, while the maximum of the empirical kriging variance has to be computed only a few times. The method is demonstrated by means of a computer simulation experiment based on data provided by the Belgian institute Management Unit of the North Sea Mathematical Models (MUMM) that describe the evolution of inorganic and organic carbon and nutrients, phytoplankton, bacteria and zooplankton in the Southern Bight of the North Sea.
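
    To make the determinant criterion concrete, the sketch below scores two assumed 16-point designs for a simple linear-trend model by the log-determinant of the information matrix; the model, design region, and candidate designs are illustrative choices, not those of the study.

      import numpy as np

      def log_det_information(points):
          # Information matrix X'X for an assumed linear-trend model 1 + x + y.
          X = np.column_stack([np.ones(len(points)), points[:, 0], points[:, 1]])
          return np.linalg.slogdet(X.T @ X)[1]

      grid = np.array([[i / 3.0, j / 3.0] for i in range(4) for j in range(4)])
      corners = np.array([[x, y] for x in (0.0, 1.0) for y in (0.0, 1.0)] * 4)
      print(log_det_information(grid), log_det_information(corners))  # larger = more D-optimal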

  14. Optimal experimental design strategies for detecting hormesis.

    PubMed

    Dette, Holger; Pepelyshev, Andrey; Wong, Weng Kee

    2011-12-01

    Hormesis is a widely observed phenomenon in many branches of life sciences, ranging from toxicology studies to agronomy, with obvious public health and risk assessment implications. We address optimal experimental design strategies for determining the presence of hormesis in a controlled environment using the recently proposed Hunt-Bowman model. We propose alternative models that have an implicit hormetic threshold, discuss their advantages over current models, and construct and study properties of optimal designs for (i) estimating model parameters, (ii) estimating the threshold dose, and (iii) testing for the presence of hormesis. We also determine maximin optimal designs that maximize the minimum of the design efficiencies when we have multiple design criteria or there is model uncertainty where we have a few plausible models of interest. We apply these optimal design strategies to a teratology study and show that the proposed designs outperform the implemented design by a wide margin for many situations.

  15. Optimal design of isotope labeling experiments.

    PubMed

    Yang, Hong; Mandy, Dominic E; Libourel, Igor G L

    2014-01-01

    Stable isotope labeling experiments (ILE) constitute a powerful methodology for estimating metabolic fluxes. An optimal label design for such an experiment is necessary to maximize the precision with which fluxes can be determined. But often, precision gained in the determination of one flux comes at the expense of the precision of other fluxes, and an appropriate label design therefore foremost depends on the question the investigator wants to address. One could liken ILE to shadows that metabolism casts on products. Optimal label design is the placement of the lamp, creating clear shadows for some parts of metabolism and obscuring others. An optimal isotope label design is influenced by: (1) the network structure; (2) the true flux values; (3) the available label measurements; and (4) commercially available substrates. The first two aspects are dictated by nature and constrain any optimal design. The second two aspects are suitable design parameters. To create an optimal label design, an explicit optimization criterion needs to be formulated. This usually is a property of the flux covariance matrix, which can be augmented by weighting label substrate cost. An optimal design is found by using such a criterion as an objective function for an optimizer. This chapter uses a simple elementary metabolite units (EMU) representation of the TCA cycle to illustrate the process of experimental design of isotope labeled substrates.

  16. Scalar and Multivariate Approaches for Optimal Network Design in Antarctica

    NASA Astrophysics Data System (ADS)

    Hryniw, Natalia

    Observations are crucial for weather and climate, not only for daily forecasts and logistical purposes, but for maintaining representative records and for tuning atmospheric models. Here, scalar theory for optimal network design is expanded into a multivariate framework to allow optimal station siting for full-field optimization. Ensemble sensitivity theory is expanded to produce the covariance trace approach, which optimizes for the trace of the covariance matrix. Relative entropy is also used for multivariate optimization as an information theory approach for finding optimal locations. Antarctic surface temperature data is used as a testbed for these methods. The two methods produce different results, which are tied to the fundamental physical parameters of the Antarctic temperature field.

  17. Integrated multidisciplinary design optimization of rotorcraft

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M.; Mantay, Wayne R.

    1989-01-01

    The NASA/Army research plan for developing the logic elements for helicopter rotor design optimization by integrating appropriate disciplines and accounting for important interactions among the disciplines is discussed. The optimization formulation is described in terms of the objective function, design variables, and constraints. The analysis aspects are discussed, and an initial effort at defining the interdisciplinary coupling is summarized. Results are presented on the achievements made in the rotor dynamic optimization for vibration reduction, rotor structural optimization for minimum weight, and integrated aerodynamic load/dynamics optimization for minimum vibration and weight.

  18. Developmental framework to validate future designs of ballistic neck protection.

    PubMed

    Breeze, J; Midwinter, M J; Pope, D; Porter, K; Hepper, A E; Clasper, J

    2013-01-01

    The number of neck injuries has increased during the war in Afghanistan, and they have become an appreciable source of mortality and long-term morbidity for UK servicemen. A three-dimensional numerical model of the neck is necessary to allow simulation of penetrating injury from explosive fragments so that the design of body armour can be optimal, and a framework is required to validate and describe the individual components of this program. An interdisciplinary consensus group consisting of military maxillofacial surgeons, and biomedical, physical, and material scientists was convened to generate the components of the framework, and as a result it incorporates the following components: analysis of deaths and long-term morbidity, assessment of critical cervical structures for incorporation into the model, characterisation of explosive fragments, evaluation of the material of which the body armour is made, and mapping of the entry sites of fragments. The resulting numerical model will simulate the wound tract produced by fragments of differing masses and velocities, and illustrate the effects of temporary cavities on cervical neurovascular structures. Using this framework, a new shirt to be worn under body armour that incorporates ballistic cervical protection has been developed for use in Afghanistan. New designs of the collar validated by human factors and assessment of coverage are currently being incorporated into early versions of the numerical model. The aim of this paper is to describe this developmental framework and provide an update on the current progress of its individual components.

  19. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.
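    The class of problem OPTNAST targets, minimum-weight sizing under stress and displacement constraints, can be sketched without NASTRAN. The following is a hypothetical two-bar example using scipy.optimize; the geometry, material properties, and allowables are illustrative assumptions rather than values from the study.

```python
import numpy as np
from scipy.optimize import minimize

# Two parallel bars of length L carrying a shared axial load F (illustrative values).
L, F = 1.0, 2.0e4                          # m, N
E, rho = 2.0e11, 7800.0                    # Pa, kg/m^3
sigma_allow, delta_allow = 2.0e8, 1.0e-3   # Pa, m

def weight(a):                             # a = [A1, A2], cross-sectional areas in m^2
    return rho * L * (a[0] + a[1])

def constraints(a):
    # Load shared in proportion to axial stiffness; stress and elongation limits.
    k = E * a / L
    delta = F / k.sum()                    # common elongation of the parallel bars
    stress = E * delta / L                 # same strain, hence same stress in both bars
    return np.array([sigma_allow - stress, delta_allow - delta])

res = minimize(weight, x0=[1e-3, 1e-3],
               constraints=[{"type": "ineq", "fun": constraints}],
               bounds=[(1e-6, None)] * 2, method="SLSQP")
print(res.x, weight(res.x))
```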

  20. Optimal Multiobjective Design of Digital Filters Using Taguchi Optimization Technique

    NASA Astrophysics Data System (ADS)

    Ouadi, Abderrahmane; Bentarzi, Hamid; Recioui, Abdelmadjid

    2014-01-01

    The multiobjective design of digital filters using the powerful Taguchi optimization technique is considered in this paper. This relatively new optimization tool has been recently introduced to the field of engineering and is based on orthogonal arrays. It is characterized by its robustness, immunity to local optima trapping, relative fast convergence and ease of implementation. The objectives of filter design include matching some desired frequency response while having minimum linear phase; hence, reducing the time response. The results demonstrate that the proposed problem solving approach blended with the use of the Taguchi optimization technique produced filters that fulfill the desired characteristics and are of practical use.
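    The underlying objective in both this and the following record is frequency-response matching. The sketch below shows one plausible way to pose that objective for a symmetric (linear-phase) FIR filter so that any search strategy, Taguchi orthogonal arrays included, can be wrapped around it; the filter length, band edge, and error measure are assumptions, not the authors' formulation.

```python
import numpy as np
from scipy.signal import freqz

n_taps = 21                                   # odd length; symmetric taps give exactly linear phase
w_grid = np.linspace(0.0, np.pi, 256)

def desired(w):                               # illustrative low-pass specification
    return (w <= 0.4 * np.pi).astype(float)

def response_error(half_taps):
    """Sum-squared error between achieved and desired magnitude response.

    `half_taps` holds the first (n_taps + 1) // 2 coefficients; mirroring them
    enforces the symmetry that guarantees linear phase.
    """
    h = np.concatenate([half_taps, half_taps[-2::-1]])
    w, H = freqz(h, worN=w_grid)
    return np.sum((np.abs(H) - desired(w)) ** 2)

# Any search strategy (Taguchi orthogonal arrays, spiral search, gradients, ...)
# can be wrapped around `response_error`; here we just evaluate a naive start.
start = np.zeros((n_taps + 1) // 2)
start[-1] = 0.4                               # crude initial guess for the center tap
print(response_error(start))
```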

  1. Optimal multiobjective design of digital filters using spiral optimization technique.

    PubMed

    Ouadi, Abderrahmane; Bentarzi, Hamid; Recioui, Abdelmadjid

    2013-01-01

    The multiobjective design of digital filters using spiral optimization technique is considered in this paper. This new optimization tool is a metaheuristic technique inspired by the dynamics of spirals. It is characterized by its robustness, immunity to local optima trapping, relative fast convergence and ease of implementation. The objectives of filter design include matching some desired frequency response while having minimum linear phase; hence, reducing the time response. The results demonstrate that the proposed problem solving approach blended with the use of the spiral optimization technique produced filters which fulfill the desired characteristics and are of practical use.

  2. Design Optimization of Marine Reduction Gears.

    DTIC Science & Technology

    1983-09-01

    [Abstract garbled in the source record; only fragments are recoverable. Descriptors: Computer Aided Design; Design Optimization; Automated Design Synthesis; Marine Gear Design; Marine Gears; Helical Gears. The fragments reference AGMA recommendations on tooth proportions for fine-pitch involute spur and helical gears, and report sections on materials, gear size, and tooth size.]

  3. Optimization, an Important Stage of Engineering Design

    ERIC Educational Resources Information Center

    Kelley, Todd R.

    2010-01-01

    A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…

  5. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework

    EPA Science Inventory

    Best management practices (BMPs) are perceived as being effective in reducing nutrient loads transported from non-point sources (NPS) to receiving water bodies. The objective of this study was to develop a modeling-optimization framework that can be used by watershed management p...

  7. Optimizing investment fund allocation using vehicle routing problem framework

    NASA Astrophysics Data System (ADS)

    Mamat, Nur Jumaadzan Zaleha; Jaaman, Saiful Hafizah; Ahmad, Rokiah Rozita

    2014-07-01

    The objective of investment is to maximize total returns or minimize total risks. To determine the optimum order of investment, the vehicle routing problem method is used. The method, which is widely used in the field of resource distribution, shares similar characteristics with the problem of investment fund allocation. In this paper we describe and elucidate the concept of using the vehicle routing problem framework to optimize the allocation of investment funds. To better illustrate these similarities, sectoral data from FTSE Bursa Malaysia are used. Results show that different values of utility for risk-averse investors generate the same investment routes.

  8. An OER Architecture Framework: Needs and Design

    ERIC Educational Resources Information Center

    Khanna, Pankaj; Basak, P. C.

    2013-01-01

    This paper describes an open educational resources (OER) architecture framework that would bring significant improvements in a well-structured and systematic way to the educational practices of distance education institutions of India. The OER architecture framework is articulated with six dimensions: pedagogical, technological, managerial,…

  9. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework.

    PubMed

    Yang, Guoxiang; Best, Elly P H

    2015-09-15

    Best management practices (BMPs) can be used effectively to reduce nutrient loads transported from non-point sources to receiving water bodies. However, methodologies of BMP selection and placement in a cost-effective way are needed to assist watershed management planners and stakeholders. We developed a novel modeling-optimization framework that can be used to find cost-effective solutions of BMP placement to attain nutrient load reduction targets. This was accomplished by integrating a GIS-based BMP siting method, a WQM-TMDL-N modeling approach to estimate total nitrogen (TN) loading, and a multi-objective optimization algorithm. Wetland restoration and buffer strip implementation were the two BMP categories used to explore the performance of this framework, both differing greatly in complexity of spatial analysis for site identification. Minimizing TN load and BMP cost were the two objective functions for the optimization process. The performance of this framework was demonstrated in the Tippecanoe River watershed, Indiana, USA. Optimized scenario-based load reduction indicated that the wetland subset selected by the minimum scenario had the greatest N removal efficiency. Buffer strips were more effective for load removal than wetlands. The optimized solutions provided a range of trade-offs between the two objective functions for both BMPs. This framework can be expanded conveniently to a regional scale because the NHDPlus catchment serves as its spatial computational unit. The present study demonstrated the potential of this framework to find cost-effective solutions to meet a water quality target, such as a 20% TN load reduction, under different conditions.
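    The cost-versus-load trade-off at the heart of this framework can be illustrated with a toy example: enumerate small candidate BMP portfolios and keep only the non-dominated (Pareto) ones. The sketch below uses brute-force enumeration with made-up site costs and removal rates; the actual framework relies on a watershed model and a multi-objective evolutionary algorithm, so this is only a conceptual stand-in.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n_sites = 10
cost = rng.uniform(5, 50, n_sites)            # implementation cost per candidate site (k$)
tn_removed = rng.uniform(0.1, 2.0, n_sites)   # TN load removed per site (t/yr)

def non_dominated(points):
    """Keep (cost, -removal) pairs not dominated by any other portfolio."""
    keep = []
    for p in points:
        dominated = any((q[0] <= p[0] and q[1] <= p[1] and q != p) for q in points)
        if not dominated:
            keep.append(p)
    return sorted(set(keep))

portfolios = [np.array(bits, dtype=bool) for bits in product([0, 1], repeat=n_sites)]
points = [(float(cost[b].sum()), float(-tn_removed[b].sum())) for b in portfolios]
front = non_dominated(points)
for c, neg_r in front[:10]:
    print(f"cost {c:6.1f} k$  ->  TN removed {-neg_r:4.2f} t/yr")
```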

  10. Optimal design of structures with buckling constraints.

    NASA Technical Reports Server (NTRS)

    Kiusalaas, J.

    1973-01-01

    The paper presents an iterative, finite element method for minimum weight design of structures with respect to buckling constraints. The redesign equation is derived from the optimality criterion, as opposed to a numerical search procedure, and can handle problems that are characterized by the existence of two fundamental buckling modes at the optimal design. Application of the method is illustrated by beam and orthogonal frame design problems.

  11. A hydroeconomic modeling framework for optimal integrated management of forest and water

    NASA Astrophysics Data System (ADS)

    Garcia-Prats, Alberto; del Campo, Antonio D.; Pulido-Velazquez, Manuel

    2016-10-01

    Forests play a determinant role in the hydrologic cycle, with water being the most important ecosystem service they provide in semiarid regions. However, this contribution is usually neither quantified nor explicitly valued. The aim of this study is to develop a novel hydroeconomic modeling framework for assessing and designing the optimal integrated forest and water management for forested catchments. The optimization model explicitly integrates changes in water yield in the stands (increase in groundwater recharge) induced by forest management and the value of the additional water provided to the system. The model determines the optimal schedule of silvicultural interventions in the stands of the catchment in order to maximize the total net benefit in the system. Canopy cover and biomass evolution over time were simulated using growth and yield allometric equations specific for the species in Mediterranean conditions. Silvicultural operation costs according to stand density and canopy cover were modeled using local cost databases. Groundwater recharge was simulated using HYDRUS, calibrated and validated with data from the experimental plots. In order to illustrate the presented modeling framework, a case study was carried out in a planted pine forest (Pinus halepensis Mill.) located in south-western Valencia province (Spain). The optimized scenario increased groundwater recharge. This novel modeling framework can be used in the design of a "payment for environmental services" scheme in which water beneficiaries could contribute to fund and promote efficient forest management operations.

  12. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  13. Exponential approximations in optimal design

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Rajan, S. D.; Rajgopal, J.

    1990-01-01

    One-point and two-point exponential functions have been developed and shown to be very effective approximations of structural response. The exponential has been compared to the linear, reciprocal, and quadratic fit methods. Four test problems in structural analysis have been selected. Such approximations are attractive in structural optimization because they reduce the number of exact analyses, which involve computationally expensive finite element analysis.
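    For a single variable, one common form of the exponential approximation reduces to the linear expansion when the exponent p = 1 and to the reciprocal expansion when p = -1; a two-point variant fits p so that the gradient is matched at two previous analyses. The sketch below illustrates that behaviour on a cheap stand-in function; the specific formulas and test function are assumptions and may differ from the authors' exact development.

```python
import numpy as np

def f(x):          # stand-in for an expensive structural response, e.g. a stress
    return 4.0 / x + 0.5 * x

def dfdx(x):
    return -4.0 / x**2 + 0.5

x0, x1 = 2.0, 2.5                       # two previous design points ("exact" analyses)

# Exponent fitted so the approximation's gradient matches both points
# (a common two-point exponential form; treated here as an assumption).
p = 1.0 + np.log(dfdx(x1) / dfdx(x0)) / np.log(x1 / x0)

def approx_exponential(x):
    return f(x0) + dfdx(x0) * (x0 ** (1.0 - p) / p) * (x ** p - x0 ** p)

def approx_linear(x):
    return f(x0) + dfdx(x0) * (x - x0)

def approx_reciprocal(x):
    return f(x0) + dfdx(x0) * (-x0**2) * (1.0 / x - 1.0 / x0)

for x in (3.0, 4.0):
    print(x, f(x), approx_exponential(x), approx_linear(x), approx_reciprocal(x))
```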

  14. Optimization design of electromagnetic shielding composites

    NASA Astrophysics Data System (ADS)

    Qu, Zhaoming; Wang, Qingguo; Qin, Siliang; Hu, Xiaofeng

    2013-03-01

    A physical model of the effective electromagnetic parameters of composites, together with prediction formulas for their shielding effectiveness and reflectivity, was derived from micromechanics, the variational principle, and electromagnetic wave transmission theory. Multi-objective optimization design of multilayer composites was then carried out using a genetic algorithm. The optimized results indicate that the material proportioning with the greatest absorption ability can be obtained while a minimum shielding effectiveness is still satisfied over a given frequency band. The validity of the optimization design model was verified, and the scheme has theoretical value and practical significance for the design of high-efficiency shielding composites.

  15. Topology optimization design of space rectangular mirror

    NASA Astrophysics Data System (ADS)

    Qu, Yanjun; Wang, Wei; Liu, Bei; Li, Xupeng

    2016-10-01

    A conceptual lightweight rectangular mirror is designed based on the theory of topology optimization, and its specific structural dimensions are determined through sensitivity analysis and size optimization. Under gravity loading along the optical axis, finite element analysis shows that the topology-optimized reflector supported at six peripheral points outperforms mirrors designed by the traditional method in lightweight ratio, structural stiffness, and reflective surface accuracy. This suggests that the lightweighting method presented here is effective and has potential value for the design of rectangular reflectors.

  16. Optimization methods applied to hybrid vehicle design

    NASA Technical Reports Server (NTRS)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating, and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost, and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported in the literature. The principal conclusions are as follows. First, the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one-year period. Fourth, care must be taken in designing the cost and constraint expressions used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. The overall conclusion is that optimization methods provide a practical tool for the design of a hybrid vehicle propulsion system.

  17. Design Optimization Programmable Calculators versus Campus Computers.

    ERIC Educational Resources Information Center

    Savage, Michael

    1982-01-01

    A hypothetical design optimization problem and technical information on the three design parameters are presented. Although this nested iteration problem can be solved on a computer (flow diagram provided), this article suggests that several hand held calculators can be used to perform the same design iteration. (SK)

  18. Computer optimization of photoreceiver designs

    SciTech Connect

    Wistey, M.

    1994-04-01

    We customized an existing simulator which uses the beam propagation method to analyze optoelectronic devices, in order to test the wavelength filtering characteristics of a number of photoreceiver designs. We tested the simulator against traditional analytical techniques and verified its accuracy for small step sizes. Finally, we applied a simulated annealing algorithm to the BPM simulator in order to produce photoreceiver designs with improved wavelength filtering characteristics.
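    Simulated annealing itself is independent of the BPM simulator it was wrapped around. A minimal sketch of the accept/reject loop is given below, with a cheap stand-in objective in place of the beam-propagation evaluation; the move size, cooling schedule, and objective are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def rejection_cost(x):
    """Stand-in for the BPM-evaluated figure of merit (lower is better)."""
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(20 * x).sum()

x = rng.uniform(0, 1, size=6)          # e.g. normalized layer thicknesses of a filter stack
cost = rejection_cost(x)
T = 1.0
for step in range(2000):
    candidate = np.clip(x + rng.normal(scale=0.05, size=x.size), 0, 1)
    c = rejection_cost(candidate)
    if c < cost or rng.random() < np.exp(-(c - cost) / T):
        x, cost = candidate, c         # accept improvements, and occasionally uphill moves
    T *= 0.998                         # geometric cooling schedule
print(cost, x.round(3))
```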

  19. Novel integrated design framework for radio frequency quadrupoles

    NASA Astrophysics Data System (ADS)

    Jolly, Simon; Easton, Matthew; Lawrie, Scott; Letchford, Alan; Pozimski, Jürgen; Savage, Peter

    2014-01-01

    A novel design framework for Radio Frequency Quadrupoles (RFQs), developed as part of the design of the FETS RFQ, is presented. This framework integrates several previously disparate steps in the design of RFQs, including the beam dynamics design, mechanical design, electromagnetic, thermal and mechanical modelling and beam dynamics simulations. Each stage of the design process is described in detail, including the various software options and reasons for the final software suite selected. Results are given for each of these steps, describing how each stage affects the overall design process, with an emphasis on the resulting design choices for the FETS RFQ.

  20. A Design Framework for Online Teacher Professional Development Communities

    ERIC Educational Resources Information Center

    Liu, Katrina Yan

    2012-01-01

    This paper provides a design framework for building online teacher professional development communities for preservice and inservice teachers. The framework is based on a comprehensive literature review on the latest technology and epistemology of online community and teacher professional development, comprising four major design factors and three…

  1. Building a Framework for Engineering Design Experiences in High School

    ERIC Educational Resources Information Center

    Denson, Cameron D.; Lammi, Matthew

    2014-01-01

    In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…

  3. Interaction prediction optimization in multidisciplinary design optimization problems.

    PubMed

    Meng, Debiao; Zhang, Xiaoling; Huang, Hong-Zhong; Wang, Zhonglai; Xu, Huanwei

    2014-01-01

    The distributed strategy of Collaborative Optimization (CO) is suitable for large-scale engineering systems. However, it is hard for CO to converge when the coupling among disciplines is high-dimensional. Furthermore, the discipline objectives cannot be considered in each discipline optimization problem. In this paper, one large-scale systems control strategy, the interaction prediction method (IPM), is introduced to enhance CO. IPM was originally used to control subsystems and coordinate the production process in large-scale systems. We combine the strategy of IPM with CO and propose the Interaction Prediction Optimization (IPO) method to solve MDO problems. As a hierarchical strategy, IPO has a system level and a subsystem level. The interaction design variables (including shared design variables and linking design variables) are operated at the system level and assigned to the subsystem level as design parameters. Each discipline objective is considered and optimized at the subsystem level simultaneously. The values of the design variables are transferred between the system level and the subsystem level. The compatibility constraints are replaced with enhanced compatibility constraints to reduce the dimension of the design variables in the compatibility constraints. Two examples are presented to show the potential application of IPO for MDO.

  4. Multi-objective optimization of nitinol stent design.

    PubMed

    Alaimo, G; Auricchio, F; Conti, M; Zingales, M

    2017-09-01

    Nitinol stents continuously experience loading due to pulsatile pressure; thus a given stent design should possess adequate fatigue strength and, at the same time, guarantee sufficient vessel scaffolding. The present study proposes an optimization framework aimed at increasing fatigue life by reducing the maximum strut strain along the structure through a local modification of the strut profile. The adopted computational framework relies on nonlinear structural finite element analysis combined with a Multi-Objective Genetic Algorithm based on Kriging response surfaces. In particular, this approach is used to investigate the design optimization of a planar stent cell. The results of the strut profile optimization confirm the key role of a tapered strut design in enhancing stent fatigue strength, suggesting that it is possible to achieve a marked improvement of both the fatigue safety factor and the scaffolding capability simultaneously. The present study underlines the value of advanced engineering tools to optimize the design of medical devices. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
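    A Kriging response surface of the kind mentioned here can be emulated with a Gaussian process surrogate trained on a modest number of expensive evaluations. The sketch below uses scikit-learn and a made-up stand-in for the finite element fatigue metric; the kernel, sample sizes, and parameter ranges are assumptions, not the study's settings.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(3)

def fatigue_safety_factor(widths):
    """Stand-in for an expensive FE evaluation of a strut-width profile."""
    return 1.5 - 0.8 * np.abs(widths[:, 0] - 0.35) + 0.3 * widths[:, 1]

X_train = rng.uniform(0.2, 0.5, size=(25, 2))     # sampled strut-profile parameters
y_train = fatigue_safety_factor(X_train)

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_train, y_train)

# Query the surrogate on a dense random grid and pick the most promising design to verify.
X_query = rng.uniform(0.2, 0.5, size=(2000, 2))
mean, std = gp.predict(X_query, return_std=True)
best = X_query[np.argmax(mean)]
print("candidate profile:", best.round(3), "predicted safety factor:", mean.max().round(3))
```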

  5. A Predictive Framework to Elucidate Venous Stenosis: CFD & Shape Optimization.

    PubMed

    Javid Mahmoudzadeh Akherat, S M; Cassel, Kevin; Boghosian, Michael; Hammes, Mary; Coe, Fredric

    2017-07-01

    The surgical creation of vascular accesses for renal failure patients provides an abnormally high flow rate conduit in the patient's upper arm vasculature that facilitates the hemodialysis treatment. These vascular accesses, however, are very often associated with complications that lead to access failure and thrombotic incidents, mainly due to excessive neointimal hyperplasia (NH) and subsequently stenosis. Development of a framework to monitor and predict the evolution of the venous system post access creation can greatly contribute to maintaining access patency. Computational fluid dynamics (CFD) has been exploited to inspect the non-homeostatic wall shear stress (WSS) distribution that is speculated to trigger NH in the patient cohort under investigation. Thereafter, CFD in liaison with a gradient-free shape optimization method has been employed to analyze the deformation modes of the venous system enduring non-physiological hemodynamics. It is observed that the optimally evolved shapes and their corresponding hemodynamics strive to restore the homeostatic state of the venous system to a normal, pre-surgery condition. It is concluded that a CFD-shape optimization coupling that seeks to regulate the WSS back to a well-defined physiological WSS target range can accurately predict the mode of patient-specific access failure.

  6. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of

  7. Turbomachinery Airfoil Design Optimization Using Differential Evolution

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    An aerodynamic design optimization procedure based on an evolutionary algorithm known as Differential Evolution is described. Differential Evolution is a simple, fast, and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems, including highly nonlinear systems with discontinuities and multiple local optima. The method is combined with a Navier-Stokes solver that evaluates the various intermediate designs and provides inputs to the optimization procedure. An efficient constraint handling mechanism is also incorporated. Results are presented for the inverse design of a turbine airfoil from a modern jet engine and compared to earlier methods. The capability of the method to search large design spaces and obtain the optimal airfoils in an automatic fashion is demonstrated. Substantial reductions in the overall computing time requirements are achieved by using the algorithm in conjunction with neural networks.
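    The core DE/rand/1/bin update is short enough to show directly. The sketch below runs it on a standard multimodal test function in place of the Navier-Stokes evaluation; the population size, mutation factor F, and crossover rate CR are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def rastrigin(x):                       # multimodal stand-in for the aerodynamic objective
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

dim, n_pop, F, CR = 5, 30, 0.7, 0.9
pop = rng.uniform(-5, 5, size=(n_pop, dim))
fitness = np.array([rastrigin(p) for p in pop])

for _ in range(300):
    for i in range(n_pop):
        a, b, c = pop[rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)                       # DE/rand/1 mutation
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True                # ensure at least one gene crosses over
        trial = np.where(cross, mutant, pop[i])
        f_trial = rastrigin(trial)
        if f_trial <= fitness[i]:                      # greedy selection
            pop[i], fitness[i] = trial, f_trial

print("best objective:", fitness.min().round(4))
```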

  8. Turbomachinery Airfoil Design Optimization Using Differential Evolution

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    An aerodynamic design optimization procedure based on an evolutionary algorithm known as Differential Evolution is described. Differential Evolution is a simple, fast, and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems, including highly nonlinear systems with discontinuities and multiple local optima. The method is combined with a Navier-Stokes solver that evaluates the various intermediate designs and provides inputs to the optimization procedure. An efficient constraint handling mechanism is also incorporated. Results are presented for the inverse design of a turbine airfoil from a modern jet engine. The capability of the method to search large design spaces and obtain the optimal airfoils in an automatic fashion is demonstrated. Substantial reductions in the overall computing time requirements are achieved by using the algorithm in conjunction with neural networks.

  9. Comparing, optimizing, and benchmarking quantum-control algorithms in a unifying programming framework

    SciTech Connect

    Machnes, S.; Sander, U.; Glaser, S. J.; Schulte-Herbrueggen, T.; Fouquieres, P. de; Gruslys, A.; Schirmer, S.

    2011-08-15

    For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods which update all controls concurrently, and Krotov-type methods which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient matlab-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.

  10. Geometric methods for optimal sensor design.

    PubMed

    Belabbas, M-A

    2016-01-01

    The Kalman-Bucy filter is the optimal estimator of the state of a linear dynamical system from sensor measurements. Because its performance is limited by the sensors to which it is paired, it is natural to seek optimal sensors. The resulting optimization problem is however non-convex. Therefore, many ad hoc methods have been used over the years to design sensors in fields ranging from engineering to biology to economics. We show in this paper how to obtain optimal sensors for the Kalman filter. Precisely, we provide a structural equation that characterizes optimal sensors. We furthermore provide a gradient algorithm and prove its convergence to the optimal sensor. This optimal sensor yields the lowest possible estimation error for measurements with a fixed signal-to-noise ratio. The results of the paper are proved by reducing the optimal sensor problem to an optimization problem on a Grassmannian manifold and proving that the function to be minimized is a Morse function with a unique minimum. The results presented here also apply to the dual problem of optimal actuator design.
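    The sensor-design problem can be explored numerically by comparing the steady-state Kalman error covariance induced by candidate measurement matrices, obtained from the filtering algebraic Riccati equation. The sketch below brute-forces a family of unit-norm sensors for a small example system rather than following the paper's gradient flow on the Grassmannian; the system matrices and noise levels are assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -0.3]])   # example linear system (assumed values)
Q = np.eye(2) * 0.1                         # process-noise covariance
R = np.array([[0.05]])                      # measurement-noise covariance (fixed SNR)

def steady_state_error(C):
    """Trace of the steady-state Kalman error covariance for sensor matrix C."""
    # The filtering Riccati equation is the control ARE with (A, B) replaced by (A^T, C^T).
    P = solve_continuous_are(A.T, C.T, Q, R)
    return float(np.trace(P))

# Candidate unit-norm sensors measuring different directions of the state.
angles = np.linspace(0, np.pi, 19)
errors = [steady_state_error(np.array([[np.cos(t), np.sin(t)]])) for t in angles]
best = angles[int(np.argmin(errors))]
print(f"best sensing direction ~ {np.degrees(best):.1f} deg, trace(P) = {min(errors):.4f}")
```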

  11. Geometric methods for optimal sensor design

    PubMed Central

    Belabbas, M.-A.

    2016-01-01

    The Kalman–Bucy filter is the optimal estimator of the state of a linear dynamical system from sensor measurements. Because its performance is limited by the sensors to which it is paired, it is natural to seek optimal sensors. The resulting optimization problem is however non-convex. Therefore, many ad hoc methods have been used over the years to design sensors in fields ranging from engineering to biology to economics. We show in this paper how to obtain optimal sensors for the Kalman filter. Precisely, we provide a structural equation that characterizes optimal sensors. We furthermore provide a gradient algorithm and prove its convergence to the optimal sensor. This optimal sensor yields the lowest possible estimation error for measurements with a fixed signal-to-noise ratio. The results of the paper are proved by reducing the optimal sensor problem to an optimization problem on a Grassmannian manifold and proving that the function to be minimized is a Morse function with a unique minimum. The results presented here also apply to the dual problem of optimal actuator design. PMID:26997885

  12. Multidisciplinary design optimization using response surface analysis

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1992-01-01

    Aerospace conceptual vehicle design is a complex process which involves multidisciplinary studies of configuration and technology options considering many parameters at many values. NASA Langley's Vehicle Analysis Branch (VAB) has detailed computerized analysis capabilities in most of the key disciplines required by advanced vehicle design. Given a configuration, the capability exists to quickly determine its performance and life-cycle cost. The next step in vehicle design is to determine the best settings of design parameters that optimize the performance characteristics. The typical approach to design optimization is experience-based, trial-and-error variation of many parameters one at a time, where possible combinations usually number in the thousands. This approach can lead either to a very long and expensive design process or to premature termination of the design process due to budget and/or schedule pressures. Furthermore, a one-variable-at-a-time approach cannot account for the interactions that occur among parts of systems and among disciplines. As a result, the vehicle design may be far from optimal. Advanced multidisciplinary design optimization (MDO) methods are needed to direct the search in an efficient and intelligent manner in order to drastically reduce the number of candidate designs to be evaluated. The payoffs in terms of enhanced performance and reduced cost are significant. A literature review yields two such advanced MDO methods used in aerospace design optimization: Taguchi methods and response surface methods. Taguchi methods provide a systematic and efficient approach to design optimization for performance and cost. However, the response surface method (RSM) leads to a better, more accurate exploration of the parameter space and to estimated optimum conditions with a small expenditure on experimental data. These two methods are described.
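    A response surface study in its simplest form fits a full quadratic model to a small designed experiment and takes the stationary point of the fitted surface as the estimated optimum. The sketch below does exactly that for two normalized design parameters with a made-up analysis function; it is illustrative only, not the RSM procedure used at VAB.

```python
import numpy as np

def vehicle_metric(x1, x2):
    """Stand-in for an expensive vehicle analysis (e.g. gross weight)."""
    return 5.0 + (x1 - 0.6) ** 2 + 2.0 * (x2 - 0.4) ** 2 + 0.5 * x1 * x2

# Small designed experiment over two normalized design parameters (3x3 grid).
pts = np.array([(a, b) for a in (0.0, 0.5, 1.0) for b in (0.0, 0.5, 1.0)])
y = np.array([vehicle_metric(a, b) for a, b in pts])

# Full second-order model: 1, x1, x2, x1^2, x2^2, x1*x2, fitted by least squares.
x1, x2 = pts[:, 0], pts[:, 1]
Xd = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)

def predict(x):
    return beta @ np.array([1.0, x[0], x[1], x[0]**2, x[1]**2, x[0] * x[1]])

# Stationary point of the fitted surface (solve grad = 0).
H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
x_opt = np.linalg.solve(H, -np.array([beta[1], beta[2]]))
print("estimated optimum:", x_opt.round(3), "predicted metric:", round(float(predict(x_opt)), 3))
```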

  13. Multidisciplinary design optimization using genetic algorithms

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1994-01-01

    Multidisciplinary design optimization (MDO) is an important step in the conceptual design and evaluation of launch vehicles, since it can have a significant impact on performance and life cycle cost. The objective is to search the system design space to determine values of design variables that optimize the performance characteristic subject to system constraints. Gradient-based optimization routines have been used extensively for aerospace design optimization. However, one limitation of gradient-based optimizers is their need for gradient information; therefore, design problems which include discrete variables cannot be studied. Such problems are common in launch vehicle design. For example, the number of engines and material choices must be integer values or assume only a few discrete values. In this study, genetic algorithms are investigated as an approach to MDO problems involving discrete variables and discontinuous domains. Optimization by genetic algorithms (GA) uses a search procedure which is fundamentally different from gradient-based methods. Genetic algorithms seek to find good solutions in an efficient and timely manner rather than finding the best solution. GA are designed to mimic evolutionary selection. A population of candidate designs is evaluated at each iteration, and each individual's probability of reproduction (existence in the next generation) depends on its fitness value (related to the value of the objective function). Progress toward the optimum is achieved by the crossover and mutation operations. GA are attractive since they use only objective function values in the search process, so gradient calculations are avoided; hence, GA are able to deal with discrete variables. Studies report success in the use of GA for aircraft design optimization studies, trajectory analysis, space structure design and control systems design. In these studies reliable convergence was achieved, but the number of function evaluations was large compared

  14. A computational framework to empower probabilistic protein design

    PubMed Central

    Fromer, Menachem; Yanover, Chen

    2008-01-01

    Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717

  15. Flow simulation and shape optimization for aircraft design

    NASA Astrophysics Data System (ADS)

    Kroll, Norbert; Gauger, Nicolas R.; Brezillon, Joel; Dwight, Richard; Fazzolari, Antonio; Vollmer, Daniel; Becker, Klaus; Barnewitz, Holger; Schulz, Volker; Hazra, Subhendu

    2007-06-01

    Within the framework of the German aerospace research program, the CFD project MEGADESIGN was initiated. The main goal of the project is the development of efficient numerical methods for shape design and optimization. In order to meet the requirements of industrial implementations a co-operative effort has been set up which involves the German aircraft industry, the DLR, several universities and some small enterprises specialized in numerical optimization. This paper outlines the planned activities within MEGADESIGN, the status at the beginning of the project and it presents some early results achieved in the project.

  16. An expert system for optimal gear design

    SciTech Connect

    Lin, K.C.

    1988-01-01

    By properly developing the mathematical model, numerical optimization can be used to seek the best solution for a given set of geometric constraints. The process of determining the non-geometric design variables is automated by using symbolic computation. This gear-design system is built according to the AGMA standards and a survey of gear-design experts. The recommendations of gear designers and the information provided by AGMA standards are integrated into knowledge bases and databases. By providing fast information retrieval and design guidelines, this expert system greatly streamlines the spur gear design process. The concept of separating the design space into geometric and non-geometric variables can also be applied to the design process for general mechanical elements. The expert-system technique is used to simulate a human designer in determining the non-geometric parameters, and numerical optimization is used to identify the best geometric solution. The combination of the expert-system technique with numerical optimization essentially eliminates the deficiencies of both methods and thus provides a better way of modeling the engineering design process.

  17. Optimal Experimental Design for Model Discrimination

    PubMed Central

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values, and thereby identify an optimal experimental design. After describing the method, it is demonstrated in two content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The optimal design is compared with the quality of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method. PMID:19618983

  18. Optimal design of reverse osmosis module networks

    SciTech Connect

    Maskan, F.; Wiley, D.E.; Johnston, L.P.M.; Clements, D.J.

    2000-05-01

    The structure of individual reverse osmosis modules, the configuration of the module network, and the operating conditions were optimized for seawater and brackish water desalination. The system model included simple mathematical equations to predict the performance of the reverse osmosis modules. The optimization problem was formulated as a constrained multivariable nonlinear optimization. The objective function was the annual profit for the system, comprising the income obtained from the permeate, the capital cost of the process units, and the operating costs associated with energy consumption and maintenance. Several dual-stage reverse osmosis systems were optimized and compared. It was found that the optimal network designs are the ones that produce the most permeate. It may be possible to achieve economic improvements by refining current membrane module designs and their operating pressures.

  19. Vehicle systems design optimization study

    NASA Technical Reports Server (NTRS)

    Gilmour, J. L.

    1980-01-01

    The optimum vehicle configuration and component locations are determined for an electric drive vehicle based on using the basic structure of a current production subcompact vehicle. The optimization of an electric vehicle layout requires a weight distribution in the range of 53/47 to 62/38 in order to assure dynamic handling characteristics comparable to current internal combustion engine vehicles. Necessary modification of the base vehicle can be accomplished without major modification of the structure or running gear. As long as batteries are as heavy and require as much space as they currently do, they must be divided into two packages, one at front under the hood and a second at the rear under the cargo area, in order to achieve the desired weight distribution. The weight distribution criteria requires the placement of batteries at the front of the vehicle even when the central tunnel is used for the location of some batteries. The optimum layout has a front motor and front wheel drive. This configuration provides the optimum vehicle dynamic handling characteristics and the maximum passenger and cargo space for a given size vehicle.

  20. Optimizing participation of children with autism spectrum disorder experiencing sensory challenges: a clinical reasoning framework.

    PubMed

    Ashburner, Jill K; Rodger, Sylvia A; Ziviani, Jenny M; Hinder, Elizabeth A

    2014-02-01

    Remedial sensory interventions currently lack supportive evidence and can be challenging to implement for families and clinicians. It may be timely to shift the focus to optimizing participation of children with autism spectrum disorders (ASD) through accommodation and self-regulation of their sensory differences. A framework to guide practitioners in selecting strategies is proposed based on clinical reasoning considerations, including (a) research evidence, (b) client- and family-centredness, (c) practice contexts, (d) occupation-centredness, and (e) risks. Information-sharing with families and coaching constitute the basis for intervention. Specific strategies are identified where sensory aversions or seeking behaviours, challenges with modulation of arousal, or sensory-related behaviours interfere with participation. Self-regulatory strategies are advocated. The application of universal design principles to shared environments is also recommended. The implications of this framework for future research, education, and practice are discussed. The clinical utility of the framework now needs to be tested.

  1. Torsional ultrasonic transducer computational design optimization.

    PubMed

    Melchor, J; Rus, G

    2014-09-01

    A torsional piezoelectric ultrasonic sensor design is proposed in this paper and computationally tested and optimized to measure shear stiffness properties of soft tissue. These are correlated with a number of pathologies like tumors, hepatic lesions and others. The reason is that, whereas compressibility is predominantly governed by the fluid phase of the tissue, the shear stiffness is dependent on the stroma micro-architecture, which is directly affected by those pathologies. However, diagnostic tools to quantify them are currently not well developed. The first contribution is a new typology of design adapted to quasifluids. A second contribution is the procedure for design optimization, for which an analytical estimate of the Robust Probability Of Detection, called RPOD, is presented for use as optimality criteria. The RPOD is formulated probabilistically to maximize the probability of detecting the least possible pathology while minimizing the effect of noise. The resulting optimal transducer has a resonance frequency of 28 kHz.

  2. A Novel Modeling Framework for Heterogeneous Catalyst Design

    NASA Astrophysics Data System (ADS)

    Katare, Santhoji; Bhan, Aditya; Caruthers, James; Delgass, Nicholas; Lauterbach, Jochen; Venkatasubramanian, Venkat

    2002-03-01

    A systems-oriented, integrated knowledge architecture that enables the use of data from High Throughput Experiments (HTE) for catalyst design is being developed. Higher-level critical reasoning is required to extract information efficiently from the increasingly available HTE data and to develop predictive models that can be used for design purposes. Towards this objective, we have developed a framework that aids the catalyst designer in negotiating the data and model complexities. Traditional kinetic and statistical tools have been systematically implemented, and novel artificial intelligence tools have been developed and integrated to speed up the process of modeling catalytic reactions. Multiple nonlinear models that describe CO oxidation on supported metals have been screened using optimization ideas based on qualitative and quantitative features. Physical constraints of the system have been used to select the optimum model parameters from the multiple solutions to the parameter estimation problem. Preliminary results on the selection of catalyst descriptors that match a target performance and the use of HTE data for refining fundamentals-based models will be discussed.

  3. Interior Design Education within a Human Ecological Framework

    ERIC Educational Resources Information Center

    Kaup, Migette L.; Anderson, Barbara G.; Honey, Peggy

    2007-01-01

    An education based in human ecology can greatly benefit interior designers as they work to understand and improve the human condition. Design programs housed in colleges focusing on human ecology can improve the interior design profession by taking advantage of their home base and emphasizing the human ecological framework in the design curricula.…

  5. Optimal Design of Tests with Dichotomous and Polytomous Items.

    ERIC Educational Resources Information Center

    Berger, Martijn P. F.

    1998-01-01

    Reviews some results on optimal design of tests with items of dichotomous and polytomous response formats and offers rules and guidelines for optimal test assembly. Discusses the problem of optimal test design for two optimality criteria. (Author/SLD)

  6. Theoretical Foundation of Copernicus: A Unified System for Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Ocampo, Cesar; Senent, Juan S.; Williams, Jacob

    2010-01-01

    The fundamental methods are described for the general spacecraft trajectory design and optimization software system called Copernicus. The methods rely on a unified framework that is used to model, design, and optimize spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The trajectory model, with its associated equations of motion and maneuver models, are discussed.

  7. Aircraft design optimization with multidisciplinary performance criteria

    NASA Technical Reports Server (NTRS)

    Morris, Stephen; Kroo, Ilan

    1989-01-01

    The method described here for aircraft design optimization with dynamic response considerations provides an inexpensive means of integrating dynamics into aircraft preliminary design. By defining a dynamic performance index that can be added to a conventional objective function, a designer can investigate the trade-off between performance and handling (as measured by the vehicle's unforced response). The procedure is formulated to permit the use of control system gains as design variables, but does not require full-state feedback. The examples discussed here show how such an approach can lead to significant improvements in the design as compared with the more common sequential design of system and control law.
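    The idea of augmenting a conventional objective with a dynamic performance index can be sketched by penalizing poorly damped eigenvalues of the unforced dynamics and adding that penalty, scaled by a trade-off weight, to a performance surrogate. The toy model, coefficients, and damping target below are assumptions, not the paper's vehicle model.

```python
import numpy as np
from scipy.optimize import minimize

def cruise_drag(x):
    """Conventional performance objective (illustrative surrogate)."""
    wing_area, tail_volume = x
    return (wing_area - 1.0) ** 2 + 0.3 * tail_volume

def handling_index(x):
    """Dynamic performance index from the unforced response: penalize modes of a
    toy 2-state model whose damping ratio falls below a target of 0.5."""
    wing_area, tail_volume = x
    A = np.array([[-0.8 * wing_area, 1.0],
                  [-4.0 * tail_volume, -0.6]])
    eig = np.linalg.eigvals(A)
    damping = -eig.real / np.abs(eig)
    return float(np.sum(np.maximum(0.0, 0.5 - damping) ** 2))

def combined(x, weight=5.0):
    """Performance plus weighted handling penalty, as a single design objective."""
    return cruise_drag(x) + weight * handling_index(x)

res = minimize(combined, x0=[1.2, 0.5], bounds=[(0.5, 2.0), (0.1, 1.5)])
print(res.x.round(3), round(combined(res.x), 4))
```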

  8. Inducing optimally directed non-routine designs

    NASA Technical Reports Server (NTRS)

    Cagan, Jonathan; Agogino, Alice M.

    1990-01-01

    Non-routine designs are characterized by the creation of new variables and thus an expansion of the design space. They differ from routine designs in that the latter are restricted to a fixed set of variables and thus a pre-defined design space. Previous papers described the 1stPRINCE non-routine design methodology, which expands the design space in such a way that optimal trends dictate the direction and form of the expansion. How inductive techniques can determine these optimal trends is examined here. The induction techniques in 1stPRINCE utilize constraint information from monotonicity analysis to determine how to mutate the design space. The process observes the constraint information of mutated designs and induces trends from those constraints. Application of 1stPRINCE to a beam under flexural load leads to an optimally directed tapered beam. An interesting application of 1stPRINCE to determine the optimally directed shape of a spinning square block, such that resistance to spinning is minimized, leads to the discovery of a circular wheel.

  9. A design optimization methodology for Li+ batteries

    NASA Astrophysics Data System (ADS)

    Golmon, Stephanie; Maute, Kurt; Dunn, Martin L.

    2014-05-01

    Design optimization of functionally graded battery electrodes is shown to improve the usable energy capacity of Li batteries predicted by computational simulations, by numerically optimizing the electrode porosities and particle radii. A multi-scale battery model which accounts for nonlinear transient transport processes, electrochemical reactions, and mechanical deformations is used to predict the usable energy storage capacity of the battery over a range of discharge rates. A multi-objective formulation of the design problem is introduced to maximize the usable capacity over a range of discharge rates while limiting the mechanical stresses. The optimization problem is solved via gradient-based optimization. A LiMn2O4 cathode is simulated with a PEO-LiCF3SO3 electrolyte and either a Li foil (half cell) or a LiC6 anode. Studies were performed on both half and full cell configurations, resulting in distinctly different optimal electrode designs. The numerical results show that the highest-rate discharge drives the simulations and that the optimal designs are dominated by Li+ transport rates. The results also suggest that spatially varying electrode porosities and active particle sizes provide an efficient approach to improving the power-to-energy density of Li+ batteries. For the half cell configuration, the optimal design improves the discharge capacity by 29%, while for the full cell the discharge capacity was improved 61% relative to an initial design with a uniform electrode structure. Most of the improvement in capacity was due to the spatially varying porosity, with up to 5% of the gains attributed to the particle radii design variables.

  10. Development and implementation of rotorcraft preliminary design methodology using multidisciplinary design optimization

    NASA Astrophysics Data System (ADS)

    Khalid, Adeel Syed

    Rotorcraft's evolution has lagged behind that of fixed-wing aircraft. One of the reasons for this gap is the absence of a formal methodology to accomplish a complete conceptual and preliminary design. Traditional rotorcraft methodologies are not only time consuming and expensive but also yield sub-optimal designs. Rotorcraft design is an excellent example of a multidisciplinary complex environment where several interdependent disciplines are involved. A formal framework is developed and implemented in this research for preliminary rotorcraft design using IPPD methodology. The design methodology consists of the product and process development cycles. In the product development loop, all the technical aspects of design are considered including the vehicle engineering, dynamic analysis, stability and control, aerodynamic performance, propulsion, transmission design, weight and balance, noise analysis and economic analysis. The design loop starts with a detailed analysis of requirements. A baseline is selected and upgrade targets are identified depending on the mission requirements. An Overall Evaluation Criterion (OEC) is developed that is used to measure the goodness of the design or to compare the design with competitors. The requirements analysis and baseline upgrade targets lead to the initial sizing and performance estimation of the new design. The digital information is then passed to disciplinary experts. This is where the detailed disciplinary analyses are performed. Information is transferred from one discipline to another as the design loop is iterated. To coordinate all the disciplines in the product development cycle, Multidisciplinary Design Optimization (MDO) techniques e.g. All At Once (AAO) and Collaborative Optimization (CO) are suggested. The methodology is implemented on a Light Turbine Training Helicopter (LTTH) design. Detailed disciplinary analyses are integrated through a common platform for efficient and centralized transfer of design

  11. Multiobjective optimization techniques for structural design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1984-01-01

    Multiobjective programming techniques are important in the design of complex structural systems whose quality generally depends on a number of different and often conflicting objective functions which cannot be combined into a single design objective. The applicability of multiobjective optimization techniques is studied with reference to simple design problems. Specifically, the parameter optimization of a cantilever beam with a tip mass and a three-degree-of-freedom vibration isolation system and the trajectory optimization of a cantilever beam are considered. The solutions of these multicriteria design problems are attempted by using global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It has been observed that the game theory approach required the maximum computational effort, but it yielded better optimum solutions with a proper balance of the various objective functions in all the cases.
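
    The global criterion method mentioned above can be sketched in a few lines: each objective is first minimized on its own to obtain the ideal point, and the compromise design then minimizes the summed squared relative deviations from that ideal. The two toy objectives below are hypothetical and unrelated to the paper's beam and isolator models.

      # Sketch only: the global criterion scalarization on a toy two-objective problem.
      import numpy as np
      from scipy.optimize import minimize, minimize_scalar

      def f1(x):          # e.g., a mass-like objective (hypothetical)
          return (x - 0.5) ** 2 + 1.0

      def f2(x):          # e.g., a deflection-like objective (hypothetical)
          return (x - 2.0) ** 2 + 2.0

      # Ideal point: each objective minimized separately over the design range [0, 3].
      f_star = np.array([minimize_scalar(f1, bounds=(0, 3), method="bounded").fun,
                         minimize_scalar(f2, bounds=(0, 3), method="bounded").fun])

      def global_criterion(x):
          f = np.array([f1(x[0]), f2(x[0])])
          return np.sum(((f - f_star) / f_star) ** 2)   # squared relative deviations from the ideal

      result = minimize(global_criterion, x0=[1.0], bounds=[(0.0, 3.0)])
      print("compromise design variable x =", round(float(result.x[0]), 3))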

  12. Incorporating water quality responses into the framework of best management practices optimization

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Wei, Guoyuan; Shen, Zhenyao

    2016-10-01

    Determining cost-effective configurations of best management practices (BMPs) is a notably complex problem, especially for large-scale watersheds. In this paper, a Markov-based simulator that has been developed to quantify water quality responses is described, and a new framework is also proposed for the optimal design of BMPs by integrating the Markov approach, a watershed model, and an evolutionary algorithm. This new framework was then tested in a typical watershed, the Three Gorges Reservoir Region in China. The results obtained from this application indicate that the integration of water quality responses is vital for the optimal design of BMPs, especially for the downstream areas of the targeted river assessment section. The Markov-based algorithm had a computational advantage over the traditional algorithm, and the new algorithm offers the prospect of providing more cost-effective medium-cost solutions. The relative impacts of upstream BMPs were also highlighted in protecting water quality at multiple river assessment sections. This new algorithm can easily be extended to any other watershed to aid decision managers in the optimal design of BMPs at the watershed scale.

  13. Optimization-based controller design for rotorcraft

    NASA Technical Reports Server (NTRS)

    Tsing, N.-K.; Fan, M. K. H.; Barlow, J.; Tits, A. L.; Tischler, M. B.

    1993-01-01

    An optimization-based methodology for linear control system design is outlined by considering the design of a controller for a UH-60 rotorcraft in hover. A wide range of design specifications is taken into account: internal stability, decoupling between longitudinal and lateral motions, handling qualities, and rejection of wind gusts. These specifications are investigated while taking into account physical limitations in the swashplate displacements and rates of displacement. The methodology crucially relies on user-machine interaction for tradeoff exploration.

  14. Optimal control design of preparation pulses for contrast optimization in MRI

    NASA Astrophysics Data System (ADS)

    Van Reeth, Eric; Ratiney, Hélène; Tesch, Michael; Grenier, Denis; Beuf, Olivier; Glaser, Steffen J.; Sugny, Dominique

    2017-06-01

    This work investigates the use of MRI radio-frequency (RF) pulses designed within the framework of optimal control theory for image contrast optimization. The magnetization evolution is modeled with Bloch equations, which defines a dynamic system that can be controlled via the application of the Pontryagin Maximum Principle (PMP). This framework allows the computation of optimal RF pulses that bring the magnetization to a given state to obtain the desired contrast after acquisition. Creating contrast through the optimal manipulation of Bloch equations is a new way of handling contrast in MRI, which can explore the theoretical limits of the system. Simulation experiments carried out on-resonance quantify the contrast improvement when compared to standard T1 or T2 weighting strategies. The use of optimal pulses is also validated for the first time in both in vitro and in vivo experiments on a small-animal 4.7 T MR system. Results demonstrate their robustness to static field inhomogeneities as well as the fact that they can be embedded in standard imaging sequences without affecting standard parameters such as slice selection or echo type. In vivo results on rat and mouse brains illustrate the ability of optimal contrast pulses to create non-trivial contrasts on well-studied structures (white matter versus gray matter).

  15. Global Design Optimization for Fluid Machinery Applications

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Tucker, Kevin; Vaidyanathan, Raj; Griffin, Lisa

    2000-01-01

    Recent experiences in utilizing the global optimization methodology, based on polynomial and neural network techniques, for fluid machinery design are summarized. Global optimization methods can utilize information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. Another advantage is that these methods do not need to calculate the sensitivity of each design variable locally. However, a successful application of the global optimization method needs to address issues related to data requirements, which grow with the number of design variables, and methods for predicting the model performance. Examples of applications selected from rocket propulsion components, including a supersonic turbine, an injector element, and a turbulent flow diffuser, are used to illustrate the usefulness of the global optimization method.
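
    A minimal sketch of the polynomial response-surface idea, assuming a made-up data set in place of CFD or experimental results: a full quadratic surrogate is fit by least squares and then searched cheaply from several starting points.

      # Sketch only: fit a quadratic response surface to (hypothetical) sampled data,
      # then search the cheap surrogate globally from several starting points.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      X = rng.uniform(-1.0, 1.0, size=(30, 2))            # sampled design points (two variables)
      y = (1.0 + X[:, 0] ** 2 + 2.0 * X[:, 1] ** 2
           - 0.5 * X[:, 0] * X[:, 1] + 0.05 * rng.normal(size=30))   # noisy "simulation" output

      def features(x):
          x1, x2 = x[..., 0], x[..., 1]
          # Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
          return np.stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2], axis=-1)

      coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)   # least-squares fit

      def surrogate(x):
          return features(np.asarray(x)) @ coef

      starts = rng.uniform(-1, 1, size=(5, 2))
      best = min((minimize(surrogate, x0, bounds=[(-1, 1)] * 2) for x0 in starts),
                 key=lambda r: r.fun)
      print("surrogate optimum near:", best.x.round(3))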

  16. From task parameters to motor synergies: A hierarchical framework for approximately-optimal control of redundant manipulators

    PubMed Central

    Todorov, Emanuel; Li, Weiwei; Pan, Xiuchuan

    2006-01-01

    We present a hierarchical framework for approximately-optimal control of redundant manipulators. The plant is augmented with a low-level feedback controller, designed to yield input-output behavior that captures the task-relevant aspects of plant dynamics but has reduced dimensionality. This makes it possible to reformulate the optimal control problem in terms of the augmented dynamics, and optimize a high-level feedback controller without running into the curse of dimensionality. The resulting control hierarchy compares favorably to existing methods in robotics. Furthermore we demonstrate a number of similarities to (non-hierarchical) optimal feedback control. Besides its engineering applications, the new framework addresses a key unresolved problem in the neural control of movement. It has long been hypothesized that coordination involves selective control of task parameters via muscle synergies, but the link between these parameters and the synergies capable of controlling them has remained elusive. Our framework provides this missing link. PMID:17710121

  17. Virtual Reality Hypermedia Design Frameworks for Science Instruction.

    ERIC Educational Resources Information Center

    Maule, R. William; Oh, Byron; Check, Rosa

    This paper reports on a study that conceptualizes a research framework to aid software design and development for virtual reality (VR) computer applications for instruction in the sciences. The framework provides methodologies for the processing, collection, examination, classification, and presentation of multimedia information within hyperlinked…

  18. A Framework for Designing Cluster Randomized Trials with Binary Outcomes

    ERIC Educational Resources Information Center

    Spybrook, Jessaca; Martinez, Andres

    2011-01-01

    The purpose of this paper is to provide a framework for approaching a power analysis for a CRT (cluster randomized trial) with a binary outcome. The authors suggest a framework in the context of a simple CRT and then extend it to a blocked design, or a multi-site cluster randomized trial (MSCRT). The framework is based on proportions, an…

  19. Regression analysis as a design optimization tool

    NASA Technical Reports Server (NTRS)

    Perley, R.

    1984-01-01

    The optimization concepts are described in relation to an overall design process as opposed to a detailed, part-design process where the requirements are firmly stated, the optimization criteria are well established, and a design is known to be feasible. The overall design process starts with the stated requirements. Some of the design criteria are derived directly from the requirements, but others are affected by the design concept. It is these design criteria that define the performance index, or objective function, that is to be minimized within some constraints. In general, there will be multiple objectives, some mutually exclusive, with no clear statement of their relative importance. The optimization loop that is given adjusts the design variables and analyzes the resulting design, in an iterative fashion, until the objective function is minimized within the constraints. This provides a solution, but it is only the beginning. In effect, the problem definition evolves as information is derived from the results. It becomes a learning process as we determine what the physics of the system can deliver in relation to the desirable system characteristics. As with any learning process, an interactive capability is a real attribute for investigating the many alternatives that will be suggested as learning progresses.

  20. Lens design: optimization with Global Explorer

    NASA Astrophysics Data System (ADS)

    Isshiki, Masaki

    2013-02-01

    The damped least squares (DLS) optimization method was essentially complete by the late 1960s, and it has dominated local optimization technology ever since. Various efforts were subsequently made to achieve global optimization; such methods appeared after 1990, and the Global Explorer (GE), invented by the author, was one of them, finding plural solutions, each of which is a local minimum of the merit function. The robustness of the designed lens is an important factor as well as the performance of the lens; both of these requirements are balanced in the process of optimization with GE2 (the second version of GE). An idea is also proposed to modify GE2 for aspherical lens systems. A design example is shown.

  1. Self-supporting structure design in additive manufacturing through explicit topology optimization

    NASA Astrophysics Data System (ADS)

    Guo, Xu; Zhou, Jianhua; Zhang, Weisheng; Du, Zongliang; Liu, Chang; Liu, Ying

    2017-08-01

    One of the challenging issues in additive manufacturing (AM) oriented topology optimization is how to design structures that are self-supportive in a manufacture process without introducing additional supporting materials. In the present contribution, it is intended to resolve this problem under an explicit topology optimization framework where optimal structural topology can be found by optimizing a set of explicit geometry parameters. Two solution approaches established based on the Moving Morphable Components (MMC) and Moving Morphable Voids (MMV) frameworks, respectively, are proposed and some theoretical issues associated with AM oriented topology optimization are also analyzed. Numerical examples provided demonstrate the effectiveness of the proposed methods.

  2. A Design Framework for Syllabus Generator

    ERIC Educational Resources Information Center

    Abdous, M'hammed; He, Wu

    2008-01-01

    A well-designed syllabus provides students with a roadmap for an engaging and successful learning experience, whereas a poorly designed syllabus impedes communication between faculty and students, increases student anxiety and potential complaints, and reduces overall teaching effectiveness. In an effort to facilitate, streamline, and improve…

  4. Response Surface Model Building and Multidisciplinary Optimization Using D-Optimal Designs

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Lepsch, Roger A.; McMillin, Mark L.

    1998-01-01

    This paper discusses response surface methods for approximation model building and multidisciplinary design optimization. The response surface methods discussed are central composite designs, Bayesian methods, and D-optimal designs. An over-determined D-optimal design is applied to a configuration design and optimization study of a wing-body launch vehicle. Results suggest that over-determined D-optimal designs may provide an efficient approach for approximation model building and for multidisciplinary design optimization.
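
    One simple way to realize an over-determined D-optimal design, sketched under assumed values (a two-factor quadratic model, a three-level candidate grid, and eight runs), is a greedy exchange that swaps runs for candidate points whenever the determinant of the information matrix |X'X| increases; the paper's actual construction may differ.

      # Sketch only: greedy exchange toward an over-determined D-optimal design
      # (two factors, quadratic model, three-level candidate grid, eight runs).
      import itertools
      import numpy as np

      levels = [-1.0, 0.0, 1.0]
      candidates = np.array(list(itertools.product(levels, levels)))

      def model_matrix(points):
          x1, x2 = points[:, 0], points[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

      def d_criterion(idx):
          X = model_matrix(candidates[idx])
          return np.linalg.det(X.T @ X)            # maximize |X'X|

      n_runs = 8                                    # more runs than the 6 model terms
      design = list(range(n_runs))                  # arbitrary starting design
      improved = True
      while improved:
          improved = False
          for i in range(n_runs):                   # try swapping each run for each candidate
              for j in range(len(candidates)):
                  trial = design.copy()
                  trial[i] = j
                  if d_criterion(trial) > d_criterion(design) + 1e-12:
                      design, improved = trial, True
      print("selected design points:\n", candidates[design])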

  5. Integrating Hybrid Life Cycle Assessment with Multiobjective Optimization: A Modeling Framework.

    PubMed

    Yue, Dajun; Pandya, Shyama; You, Fengqi

    2016-02-02

    By combining life cycle assessment (LCA) with multiobjective optimization (MOO), the life cycle optimization (LCO) framework holds the promise not only to evaluate the environmental impacts for a given product but also to compare different alternatives and identify both ecologically and economically better decisions. Despite the recent methodological developments in LCA, most LCO applications are developed upon process-based LCA, which results in system boundary truncation and underestimation of the true impact. In this study, we propose a comprehensive LCO framework that seamlessly integrates MOO with integrated hybrid LCA. It quantifies both direct and indirect environmental impacts and incorporates them into the decision making process in addition to the more traditional economic criteria. The proposed LCO framework is demonstrated through an application on sustainable design of a potential bioethanol supply chain in the UK. Results indicate that the proposed hybrid LCO framework identifies a considerable amount of indirect greenhouse gas emissions (up to 58.4%) that are essentially ignored in process-based LCO. Among the biomass feedstock options considered, using woody biomass for bioethanol production would be the most preferable choice from a climate perspective, while the mixed use of wheat and wheat straw as feedstocks would be the most cost-effective one.

  6. Toward a More Flexible Web-Based Framework for Multidisciplinary Design

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Salas, A. O.

    1999-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.

  7. Design of optimized piezoelectric HDD-sliders

    NASA Astrophysics Data System (ADS)

    Nakasone, Paulo H.; Yoo, Jeonghoon; Silva, Emilio C. N.

    2010-04-01

    As storage data density in hard-disk drives (HDDs) increases for constant or miniaturizing sizes, precision positioning of HDD heads becomes a more relevant issue to ensure that enormous amounts of data are properly written and read. Since the traditional single-stage voice coil motor (VCM) cannot satisfy the positioning requirement of high-density tracks-per-inch (TPI) HDDs, dual-stage servo systems have been proposed to overcome this issue, using VCMs to coarsely move the HDD head while piezoelectric actuators provide fine and fast positioning. Thus, the aim of this work is to apply the topology optimization method (TOM) to design novel piezoelectric HDD heads by finding the optimal placement of base-plate and piezoelectric material for high-precision positioning of HDD heads. The topology optimization method is a structural optimization technique that combines the finite element method (FEM) with optimization algorithms. The laminated finite element employs the MITC (mixed interpolation of tensorial components) formulation to provide accurate and reliable results. The topology optimization uses a rational approximation of material properties to vary the material properties between 'void' and 'filled' portions. The design problem consists in generating optimal structures that provide maximal displacements, appropriate structural stiffness, and resonance phenomena avoidance. These requirements are achieved by applying formulations to maximize displacements, minimize structural compliance, and maximize resonance frequencies. This paper presents the implementation of the algorithms and shows results that confirm the feasibility of this approach.

  8. Design optimization for cost and quality: The robust design approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The Robust Design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly smaller number of experiments. Robust Design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
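
    A minimal sketch of the orthogonal-array recipe described above, using the standard L4(2^3) array and a larger-the-better signal-to-noise ratio; the response model and noise levels are invented for illustration and do not come from the report.

      # Sketch only: evaluate an L4(2^3) orthogonal array of control factors against
      # repeated noise conditions and compare larger-the-better S/N ratios.
      import numpy as np

      L4 = np.array([[-1, -1, -1],        # 4 runs, 3 two-level control factors
                     [-1,  1,  1],
                     [ 1, -1,  1],
                     [ 1,  1, -1]])

      def response(control, noise):
          """Hypothetical system response; larger is better."""
          a, b, c = control
          return 10 + 2 * a + 1.5 * b - 0.5 * c + noise * (1.0 - 0.6 * a)   # a = +1 reduces noise sensitivity

      rng = np.random.default_rng(1)
      noise_reps = rng.normal(0.0, 1.0, size=5)          # repeated noise conditions

      for run, ctrl in enumerate(L4, start=1):
          y = np.array([response(ctrl, n) for n in noise_reps])
          sn = -10.0 * np.log10(np.mean(1.0 / y ** 2))   # larger-the-better S/N ratio
          print(f"run {run}: factors {ctrl}, S/N = {sn:.2f} dB")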

  9. The optimal design of standard gearsets

    NASA Technical Reports Server (NTRS)

    Savage, M.; Coy, J. J.; Townsend, D. P.

    1983-01-01

    A design procedure for sizing standard involute spur gearsets is presented. The procedure is applied to find the optimal design for two examples - an external gear mesh with a ratio of 5:1 and an internal gear mesh with a ratio of 5:1. In the procedure, the gear mesh is designed to minimize the center distance for a given gear ratio, pressure angle, pinion torque, and allowable tooth strengths. From the methodology presented, a design space may be formulated for either external gear contact or for internal contact. The design space includes kinematics considerations of involute interference, tip fouling, and contact ratio. Also included are design constraints based on bending fatigue in the pinion fillet and Hertzian contact pressure in the full load region and at the gear tip where scoring is possible. This design space is two dimensional, giving the gear mesh center distance as a function of diametral pitch and the number of pinion teeth. The constraint equations were identified for kinematic interference, fillet bending fatigue, pitting fatigue, and scoring pressure, which define the optimal design space for a given gear design. The locus of equal size optimum designs was identified as the straight line through the origin which has the least slope in the design region.

  10. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.

  11. Nonlinear program based optimization of boost and buck-boost converter designs

    NASA Technical Reports Server (NTRS)

    Rahman, S.; Lee, F. C.

    1981-01-01

    The facility of an Augmented Lagrangian (ALAG) multiplier-based nonlinear programming technique is demonstrated for minimum-weight design optimizations of boost and buck-boost power converters. Certain important features of ALAG are presented in the framework of a comprehensive design example for buck-boost power converter design optimization. The study provides refreshing design insight into power converters and presents such information as the weight and loss profiles of various semiconductor components and magnetics as a function of the switching frequency.

  13. A design framework for exploratory geovisualization in epidemiology

    PubMed Central

    Robinson, Anthony C.

    2009-01-01

    This paper presents a design framework for geographic visualization based on iterative evaluations of a toolkit designed to support cancer epidemiology. The Exploratory Spatio-Temporal Analysis Toolkit (ESTAT), is intended to support visual exploration through multivariate health data. Its purpose is to provide epidemiologists with the ability to generate new hypotheses or further refine those they may already have. Through an iterative user-centered design process, ESTAT has been evaluated by epidemiologists at the National Cancer Institute (NCI). Results of these evaluations are discussed, and a design framework based on evaluation evidence is presented. The framework provides specific recommendations and considerations for the design and development of a geovisualization toolkit for epidemiology. Its basic structure provides a model for future design and evaluation efforts in information visualization. PMID:20390052

  14. Using Approximations to Accelerate Engineering Design Optimization

    NASA Technical Reports Server (NTRS)

    Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    Optimization problems that arise in engineering design are often characterized by several features that hinder the use of standard nonlinear optimization techniques. Foremost among these features is that the functions used to define the engineering optimization problem often are computationally intensive. Within a standard nonlinear optimization algorithm, the computational expense of evaluating the functions that define the problem would necessarily be incurred for each iteration of the optimization algorithm. Faced with such prohibitive computational costs, an attractive alternative is to make use of surrogates within an optimization context since surrogates can be chosen or constructed so that they are typically much less expensive to compute. For the purposes of this paper, we will focus on the use of algebraic approximations as surrogates for the objective. In this paper we introduce the use of so-called merit functions that explicitly recognize the desirability of improving the current approximation to the objective during the course of the optimization. We define and experiment with the use of merit functions chosen to simultaneously improve both the solution to the optimization problem (the objective) and the quality of the approximation. Our goal is to further improve the effectiveness of our general approach without sacrificing any of its rigor.
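
    One plausible (hypothetical) realization of such a merit function is sketched below: candidate points are scored by the surrogate's prediction minus a bonus for lying far from existing samples, so that a single criterion rewards both apparent objective improvement and improvement of the approximation; the actual merit functions used in the paper may take a different form.

      # Sketch only: a merit function that rewards both a good surrogate prediction and
      # distance from already-sampled points (so the approximation itself improves).
      import numpy as np

      def merit(x, surrogate, samples, rho=0.5):
          """Smaller is better: prediction minus a space-filling bonus."""
          dist = np.min(np.linalg.norm(samples - x, axis=1))   # distance to nearest sample
          return surrogate(x) - rho * dist

      # Toy usage with a quadratic surrogate and three existing samples (all hypothetical).
      surrogate = lambda x: float((x[0] - 0.3) ** 2 + (x[1] + 0.1) ** 2)
      samples = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 0.5]])

      grid = np.array([[a, b] for a in np.linspace(-1, 1, 21) for b in np.linspace(-1, 1, 21)])
      best = min(grid, key=lambda x: merit(x, surrogate, samples))
      print("next point to evaluate with the expensive model:", best)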

  15. Application of Optimal Designs to Item Calibration

    PubMed Central

    Lu, Hung-Yi

    2014-01-01

    In computerized adaptive testing (CAT), examinees are presented with various sets of items chosen from a precalibrated item pool. Consequently, the attrition speed of the items is extremely fast, and replenishing the item pool is essential. Therefore, item calibration has become a crucial concern in maintaining item banks. In this study, a two-parameter logistic model is used. We applied optimal designs and adaptive sequential analysis to solve this item calibration problem. The results indicated that the proposed optimal designs are cost effective and time efficient. PMID:25188318

  16. Optimization and surgical design for applications in pediatric cardiology

    NASA Astrophysics Data System (ADS)

    Marsden, Alison; Bernstein, Adam; Taylor, Charles; Feinstein, Jeffrey

    2007-11-01

    The coupling of shape optimization to cardiovascular blood flow simulations has potential to improve the design of current surgeries and to eventually allow for optimization of surgical designs for individual patients. This is particularly true in pediatric cardiology, where geometries vary dramatically between patients, and unusual geometries can lead to unfavorable hemodynamic conditions. Interfacing shape optimization to three-dimensional, time-dependent fluid mechanics problems is particularly challenging because of the large computational cost and the difficulty in computing objective function gradients. In this work a derivative-free optimization algorithm is coupled to a three-dimensional Navier-Stokes solver that has been tailored for cardiovascular applications. The optimization code employs mesh adaptive direct search in conjunction with a Kriging surrogate. This framework is successfully demonstrated on several geometries representative of cardiovascular surgical applications. We will discuss issues of cost function choice for surgical applications, including energy loss and wall shear stress distribution. In particular, we will discuss the creation of new designs for the Fontan procedure, a surgery done in pediatric cardiology to treat single ventricle heart defects.

  17. A Framework for the Design of Service Systems

    NASA Astrophysics Data System (ADS)

    Tan, Yao-Hua; Hofman, Wout; Gordijn, Jaap; Hulstijn, Joris

    We propose a framework for the design and implementation of service systems, especially to design controls for long-term sustainable value co-creation. The framework is based on the software support tool e3-control. To illustrate the framework we use a large-scale case study, the Beer Living Lab, for simplification of customs procedures in international trade. The BeerLL shows how value co-creation can be achieved by reduction of administrative burden in international beer export due to electronic customs. Participants in the BeerLL are Heineken, IBM and Dutch Tax & Customs.

  18. Instrument design and optimization using genetic algorithms

    SciTech Connect

    Hoelzel, Robert; Bentley, Phillip M.; Fouquet, Peter

    2006-10-15

    This article describes the design of highly complex physical instruments by using a canonical genetic algorithm (GA). The procedure can be applied to all instrument designs where performance goals can be quantified. It is particularly suited to the optimization of instrument design where local optima in the performance figure of merit are prevalent. Here, a GA is used to evolve the design of the neutron spin-echo spectrometer WASP which is presently being constructed at the Institut Laue-Langevin, Grenoble, France. A comparison is made between this artificial intelligence approach and the traditional manual design methods. We demonstrate that the search of parameter space is more efficient when applying the genetic algorithm, and the GA produces a significantly better instrument design. Furthermore, it is found that the GA increases flexibility, by facilitating the reoptimization of the design after changes in boundary conditions during the design phase. The GA also allows the exploration of 'nonstandard' magnet coil geometries. We conclude that this technique constitutes a powerful complementary tool for the design and optimization of complex scientific apparatus, without replacing the careful thought processes employed in traditional design methods.
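
    A minimal sketch of a canonical genetic algorithm of the kind described here (binary encoding, roulette-wheel selection, one-point crossover, bit-flip mutation); the figure of merit is a made-up one-parameter function with several local optima, not the WASP spectrometer model.

      # Sketch only: a canonical GA with binary encoding, roulette selection,
      # one-point crossover, and bit-flip mutation on a made-up figure of merit.
      import numpy as np

      rng = np.random.default_rng(2)
      N_BITS, POP, GENS = 16, 40, 60

      def decode(bits):                  # map 16 bits to a design parameter in [0, 10]
          return int("".join(map(str, bits)), 2) / (2 ** N_BITS - 1) * 10.0

      def merit(bits):                   # hypothetical performance figure (maximize)
          x = decode(bits)
          return np.exp(-(x - 7.3) ** 2) + 0.3 * np.exp(-(x - 2.0) ** 2)   # two local optima

      pop = rng.integers(0, 2, size=(POP, N_BITS))
      for _ in range(GENS):
          fitness = np.array([merit(ind) for ind in pop])
          parents = pop[rng.choice(POP, size=POP, p=fitness / fitness.sum())]   # roulette wheel
          children = parents.copy()
          for i in range(0, POP, 2):                                            # one-point crossover
              cut = rng.integers(1, N_BITS)
              children[i, cut:], children[i + 1, cut:] = parents[i + 1, cut:], parents[i, cut:]
          flips = rng.random(children.shape) < 0.01                             # bit-flip mutation
          pop = np.where(flips, 1 - children, children)

      best = max(pop, key=merit)
      print("best design parameter:", round(decode(best), 3))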

  19. Optimization methods for alternative energy system design

    NASA Astrophysics Data System (ADS)

    Reinhardt, Michael Henry

    An electric vehicle heating system and a solar thermal coffee dryer are presented as case studies in alternative energy system design optimization. Design optimization tools are compared using these case studies, including linear programming, integer programming, and fuzzy integer programming. Although most decision variables in the designs of alternative energy systems are generally discrete (e.g., numbers of photovoltaic modules, thermal panels, layers of glazing in windows), the literature shows that the optimization methods used historically for design utilize continuous decision variables. Integer programming, used to find the optimal investment in conservation measures as a function of life cycle cost of an electric vehicle heating system, is compared to linear programming, demonstrating the importance of accounting for the discrete nature of design variables. The electric vehicle study shows that conservation methods similar to those used in building design, that reduce the overall UA of a 22 ft. electric shuttle bus from 488 to 202 (Btu/hr-F), can eliminate the need for fossil fuel heating systems when operating in the northeast United States. Fuzzy integer programming is presented as a means of accounting for imprecise design constraints such as being environmentally friendly in the optimization process. The solar thermal coffee dryer study focuses on a deep-bed design using unglazed thermal collectors (UTC). Experimental data from parchment coffee drying are gathered, including drying constants and equilibrium moisture. In this case, fuzzy linear programming is presented as a means of optimizing experimental procedures to produce the most information under imprecise constraints. Graphical optimization is used to show that for every 1 m2 deep-bed dryer, of 0.4 m depth, a UTC array consisting of 5, 1.1 m 2 panels, and a photovoltaic array consisting of 1, 0.25 m 2 panels produces the most dry coffee per dollar invested in the system. In general this study
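
    The discrete selection problem described above can be illustrated with a brute-force search over 0/1 choices of conservation measures that minimizes a life-cycle cost; all costs and UA reductions below are invented illustration values, and the exhaustive search stands in for a formal integer program.

      # Sketch only: brute-force 0/1 selection of conservation measures to minimize a
      # life-cycle cost (capital plus heating cost proportional to the remaining UA).
      import itertools

      measures = {                       # name: (capital cost $, UA reduction Btu/hr-F) -- invented values
          "added insulation":   (1200.0, 120.0),
          "double glazing":     (1800.0,  90.0),
          "door/seal upgrades": ( 400.0,  40.0),
          "floor insulation":   ( 900.0,  60.0),
      }
      UA_BASE = 488.0                    # Btu/hr-F before conservation (from the abstract)
      HEAT_COST_PER_UA = 16.0            # assumed $ of life-cycle heating cost per unit UA

      best = None
      for choice in itertools.product([0, 1], repeat=len(measures)):
          capital = sum(c * cost for c, (cost, _) in zip(choice, measures.values()))
          ua = UA_BASE - sum(c * dua for c, (_, dua) in zip(choice, measures.values()))
          life_cycle = capital + HEAT_COST_PER_UA * ua
          if best is None or life_cycle < best[0]:
              best = (life_cycle, choice, ua)

      selected = [name for name, c in zip(measures, best[1]) if c]
      print(f"selected: {selected}, final UA = {best[2]:.0f}, life-cycle cost = ${best[0]:.0f}")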

  20. Branch target buffer design and optimization

    NASA Technical Reports Server (NTRS)

    Perleberg, Chris H.; Smith, Alan J.

    1993-01-01

    Consideration is given to two major issues in the design of branch target buffers (BTBs), with the goal of achieving maximum performance for a given number of bits allocated to the BTB design. The first issue is BTB management; the second is what information to keep in the BTB. A number of solutions to these problems are reviewed, and various optimizations in the design of BTBs are discussed. Design target miss ratios for BTBs are developed, making it possible to estimate the performance of BTBs for real workloads.

  2. Evolutionary optimization methods for accelerator design

    NASA Astrophysics Data System (ADS)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as: ease of implementation, modest requirements on the objective function, a good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe GATool, the evolutionary algorithm and software package used in this work, in detail. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT. We assess REPROPT's performance on the standard constrained

  3. eCodonOpt: a systematic computational framework for optimizing codon usage in directed evolution experiments

    PubMed Central

    Moore, Gregory L.; Maranas, Costas D.

    2002-01-01

    We present a systematic computational framework, eCodonOpt, for designing parental DNA sequences for directed evolution experiments through codon usage optimization. Given a set of homologous parental proteins to be recombined at the DNA level, the optimal DNA sequences encoding these proteins are sought for a given diversity objective. We find that the free energy of annealing between the recombining DNA sequences is a much better descriptor of the extent of crossover formation than sequence identity. Three different diversity targets are investigated for the DNA shuffling protocol to showcase the utility of the eCodonOpt framework: (i) maximizing the average number of crossovers per recombined sequence; (ii) minimizing bias in family DNA shuffling so that each parental sequence pair contributes a similar number of crossovers to the library; and (iii) maximizing the relative frequency of crossovers in specific structural regions. Each one of these design challenges is formulated as a constrained optimization problem that utilizes 0–1 binary variables as on/off switches to model the selection of different codon choices for each residue position. Computational results suggest that many-fold improvements in the crossover frequency, location and specificity are possible, providing valuable insights for the engineering of directed evolution protocols. PMID:12034828

  4. Optimizing Health Care Coalitions: Conceptual Frameworks and a Research Agenda.

    PubMed

    Hupert, Nathaniel; Biala, Karen; Holland, Tara; Baehr, Avi; Hasan, Aisha; Harvey, Melissa

    2015-12-01

    The US health care system has maintained an objective of preparedness for natural or manmade catastrophic events as part of its larger charge to deliver health services for the American population. In 2002, support for hospital-based preparedness activities was bolstered by the creation of the National Bioterrorism Hospital Preparedness Program, now called the Hospital Preparedness Program, in the US Department of Health and Human Services. Since 2012, this program has promoted linking health care facilities into health care coalitions that build key preparedness and emergency response capabilities. Recognizing that well-functioning health care coalitions can have a positive impact on the health outcomes of the populations they serve, this article informs efforts to optimize health care coalition activity. We first review the landscape of health care coalitions in the United States. Then, using principles from supply chain management and high-reliability organization theory, we present 2 frameworks extending beyond the Office of the Assistant Secretary for Preparedness and Response's current guidance in a way that may help health care coalition leaders gain conceptual insight into how different enterprises achieve similar ends relevant to emergency response. We conclude with a proposed research agenda to advance understanding of how coalitions can contribute to the day-to-day functioning of health care systems and disaster preparedness.

  5. Optimizing Experimental Design for Comparing Models of Brain Function

    PubMed Central

    Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas

    2011-01-01

    This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485

  6. Shape optimization techniques for musical instrument design

    NASA Astrophysics Data System (ADS)

    Henrique, Luis; Antunes, Jose; Carvalho, Joao S.

    2002-11-01

    The design of musical instruments is still mostly based on empirical knowledge and costly experimentation. One interesting improvement is the shape optimization of resonating components, given a number of constraints (allowed parameter ranges, shape smoothness, etc.), so that vibrations occur at specified modal frequencies. Each admissible geometrical configuration generates an error between computed eigenfrequencies and the target set. Typically, error surfaces present many local minima, corresponding to suboptimal designs. This difficulty can be overcome using global optimization techniques, such as simulated annealing. However, these methods are greedy in terms of the number of function evaluations required. Thus, the computational effort can be unacceptable if complex problems, such as bell optimization, are tackled. Those issues are addressed in this paper, and a method for improving optimization procedures is proposed. Instead of using the local geometric parameters as searched variables, the system geometry is modeled in terms of truncated series of orthogonal space-functions, and optimization is performed on their amplitude coefficients. Fourier series and orthogonal polynomials are typical such functions. This technique considerably reduces the number of searched variables and has a potential for significant computational savings in complex problems. It is illustrated by optimizing the shapes of both current and uncommon marimba bars.
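
    A minimal sketch of the proposed parametrization, assuming a made-up surrogate in place of the finite-element modal analysis: a bar's thickness profile is described by a short cosine series and the coefficients are searched with a global optimizer (here SciPy's dual annealing) to drive the predicted modes toward a target set.

      # Sketch only: thickness profile as a truncated cosine series; coefficients searched
      # globally so that (surrogate) modal frequencies approach a target set.
      import numpy as np
      from scipy.optimize import dual_annealing

      s = np.linspace(0.0, 1.0, 101)                 # normalized position along the bar
      TARGETS = np.array([440.0, 1760.0, 3960.0])    # desired partials (Hz), ratios 1:4:9

      def thickness(coeffs):
          # Base thickness plus three cosine shape terms.
          return 0.01 + sum(c * np.cos((k + 1) * np.pi * s) for k, c in enumerate(coeffs))

      def modal_frequencies(coeffs):
          """Hypothetical surrogate linking the shape to the first three modes."""
          h = thickness(coeffs)
          base = np.array([1.0, 4.2, 9.5]) * 440.0   # assumed modes of the uniform bar
          return base * (h.mean() / 0.01) * (1.0 + 0.3 * (h[0] - h[-1]))

      def tuning_error(coeffs):
          f = modal_frequencies(coeffs)
          return float(np.sum(((f - TARGETS) / TARGETS) ** 2))

      result = dual_annealing(tuning_error, bounds=[(-0.005, 0.005)] * 3, maxiter=200, seed=3)
      print("optimized coefficients:", result.x.round(5), " tuning error:", round(result.fun, 5))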

  7. Towards a Framework for Professional Curriculum Design

    ERIC Educational Resources Information Center

    Winch, Christopher

    2015-01-01

    Recent reviews of vocational qualifications in England have noted problems with their restricted nature. However, the underlying issue of how to conceptualise professional agency in curriculum design has not been properly addressed, either by the Richard or the Whitehead reviews. Drawing on comparative work in England and Europe it is argued that…

  9. Achieving Equivalence: A Transnational Curriculum Design Framework

    ERIC Educational Resources Information Center

    Clarke, Angela; Johal, Terry; Sharp, Kristen; Quinn, Shayna

    2016-01-01

    Transnational education is now essential to university international development strategies. As a result, tertiary educators are expected to engage with the complexities of diverse cultural contexts, different delivery modes, and mixed student cohorts to design quality learning experiences for all. To support this transition we developed a…

  11. Integrated structural-aerodynamic design optimization

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Kao, P. J.; Grossman, B.; Polen, D.; Sobieszczanski-Sobieski, J.

    1988-01-01

    This paper focuses on the processes of simultaneous aerodynamic and structural wing design as a prototype for design integration, with emphasis on the major difficulty associated with multidisciplinary design optimization processes, their enormous computational costs. Methods are presented for reducing this computational burden through the development of efficient methods for cross-sensitivity calculations and the implementation of approximate optimization procedures. Utilizing a modular sensitivity analysis approach, it is shown that the sensitivities can be computed without the expensive calculation of the derivatives of the aerodynamic influence coefficient matrix, and the derivatives of the structural flexibility matrix. The same process is used to efficiently evaluate the sensitivities of the wing divergence constraint, which should be particularly useful, not only in problems of complete integrated aircraft design, but also in aeroelastic tailoring applications.

  12. Stented artery biomechanics and device design optimization.

    PubMed

    Timmins, Lucas H; Moreno, Michael R; Meyer, Clark A; Criscione, John C; Rachev, Alexander; Moore, James E

    2007-05-01

    The deployment of a vascular stent aims to increase lumen diameter for the restoration of blood flow, but the accompanying alterations in the mechanical environment possibly affect the long-term patency of these devices. The primary aim of this investigation was to develop an algorithm to optimize stent design, allowing for consideration of competing solid-mechanical concerns (wall stress, lumen gain, and cyclic deflection). Finite element modeling (FEM) was used to estimate artery wall stress and systolic/diastolic geometries, from which single-parameter outputs were derived expressing stress, lumen gain, and cyclic artery wall deflection. An optimization scheme was developed using Lagrangian interpolation elements that sought to minimize the weighted sum of these outputs. Varying the weighting coefficients results in stent designs that prioritize one output over another. The accuracy of the algorithm was confirmed by evaluating the resulting outputs of the optimized geometries using FEM. The capacity of the optimization algorithm to identify optimal geometries and their resulting mechanical measures was retained over a wide range of weighting coefficients. The variety of stent designs identified provides general guidelines that have potential clinical use (i.e., lesion-specific stenting).

  13. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  14. Optimal radar waveform design for moving target

    NASA Astrophysics Data System (ADS)

    Zhu, Binqi; Gao, Yesheng; Wang, Kaizhi; Liu, Xingzhao

    2016-07-01

    An optimal radar waveform-design method is proposed to detect moving targets in the presence of clutter and noise. The clutter is split into moving and static parts. Radar moving-target/clutter models are introduced and combined with the Neyman-Pearson criterion to design optimal waveforms. Results show that the optimal waveform for a moving target differs from that for a static target. The combination of simple-frequency signals could produce maximum detectability under different noise power spectral density situations. Simulations show that our algorithm greatly improves the signal-to-clutter-plus-noise ratio of the radar system. Therefore, this algorithm may be preferable for moving-target detection when prior information on clutter and noise is available.

  15. Photovoltaic design optimization for terrestrial applications

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1978-01-01

    As part of the Jet Propulsion Laboratory's Low-Cost Solar Array Project, a comprehensive program of module cost-optimization has been carried out. The objective of these studies has been to define means of reducing the cost and improving the utility and reliability of photovoltaic modules for the broad spectrum of terrestrial applications. This paper describes one of the methods being used for module optimization, including the derivation of specific equations which allow the optimization of various module design features. The method is based on minimizing the life-cycle cost of energy for the complete system. Comparison of the life-cycle energy cost with the marginal cost of energy each year allows the logical plant lifetime to be determined. The equations derived allow the explicit inclusion of design parameters such as tracking, site variability, and module degradation with time. An example problem involving the selection of an optimum module glass substrate is presented.
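
    The lifetime logic mentioned above can be illustrated numerically: keep operating as long as each additional year's marginal cost of energy stays below the running life-cycle average cost. The capital, O&M, production, and degradation figures below are invented for illustration and are not the paper's equations.

      # Sketch only: run the plant while each extra year's marginal energy cost stays
      # below the running life-cycle average cost. All figures are invented.
      CAPITAL = 12000.0          # up-front array cost ($)
      OM_YEAR1 = 150.0           # operation & maintenance in year 1 ($)
      OM_ESCALATION = 0.05       # O&M grows 5% per year
      ENERGY_YEAR1 = 8000.0      # kWh produced in year 1
      DEGRADATION = 0.01         # 1% output loss per year

      total_cost, total_energy = CAPITAL, 0.0
      for year in range(1, 61):
          om = OM_YEAR1 * (1.0 + OM_ESCALATION) ** (year - 1)
          energy = ENERGY_YEAR1 * (1.0 - DEGRADATION) ** (year - 1)
          marginal = om / energy                                    # $/kWh of one more year
          lifecycle = (total_cost + om) / (total_energy + energy)   # average cost if we run it
          if marginal > lifecycle:                                  # stopping here minimizes the average
              print(f"logical plant lifetime: about {year - 1} years, "
                    f"life-cycle energy cost: about ${total_cost / total_energy:.3f}/kWh")
              break
          total_cost += om
          total_energy += energy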

  16. Optimization of confocal scanning laser ophthalmoscope design.

    PubMed

    LaRocca, Francesco; Dhalla, Al-Hafeez; Kelly, Michael P; Farsiu, Sina; Izatt, Joseph A

    2013-07-01

    Confocal scanning laser ophthalmoscopy (cSLO) enables high-resolution and high-contrast imaging of the retina by employing spatial filtering for scattered light rejection. However, to obtain optimized image quality, one must design the cSLO around scanner technology limitations and minimize the effects of ocular aberrations and imaging artifacts. We describe a cSLO design methodology resulting in a simple, relatively inexpensive, and compact lens-based cSLO design optimized to balance resolution and throughput for a 20-deg field of view (FOV) with minimal imaging artifacts. We tested the imaging capabilities of our cSLO design with an experimental setup from which we obtained fast and high signal-to-noise ratio (SNR) retinal images. At lower FOVs, we were able to visualize parafoveal cone photoreceptors and nerve fiber bundles even without the use of adaptive optics. Through an experiment comparing our optimized cSLO design to a commercial cSLO system, we show that our design demonstrates a significant improvement in both image quality and resolution.

  17. Shape design sensitivity analysis and optimal design of structural systems

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.

    1987-01-01

    The material derivative concept of continuum mechanics and an adjoint variable method of design sensitivity analysis are used to relate variations in structural shape to measures of structural performance. A domain method of shape design sensitivity analysis is used to best utilize the basic character of the finite element method, which gives accurate information not on the boundary but in the domain. Implementation of shape design sensitivity analysis using finite element computer codes is discussed. Recent numerical results are used to demonstrate the accuracy obtainable using the method. The result of the design sensitivity analysis is used to carry out design optimization of a built-up structure.

  18. MDO can help resolve the designer's dilemma. [multidisciplinary design optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Tulinius, Jan R.

    1991-01-01

    Multidisciplinary design optimization (MDO) is presented as a rapidly growing body of methods, algorithms, and techniques that will provide a quantum jump in the effectiveness and efficiency of the quantitative side of design, and will turn that side into an environment in which the qualitative side can thrive. MDO borrows from CAD/CAM for graphic visualization of geometrical and numerical data, data base technology, and in computer software and hardware. Expected benefits from this methodology are a rational, mathematically consistent approach to hypersonic aircraft designs, designs pushed closer to the optimum, and a design process either shortened or leaving time available for different concepts to be explored.

  19. MDO can help resolve the designer's dilemma. [Multidisciplinary design optimization

    SciTech Connect

    Sobieszczanski-Sobieski, Jaroslaw; Tulinius, J. R. (Rockwell International Corp., El Segundo, CA)

    1991-09-01

    Multidisciplinary design optimization (MDO) is presented as a rapidly growing body of methods, algorithms, and techniques that will provide a quantum jump in the effectiveness and efficiency of the quantitative side of design, and will turn that side into an environment in which the qualitative side can thrive. MDO borrows from CAD/CAM for graphic visualization of geometrical and numerical data, data base technology, and in computer software and hardware. Expected benefits from this methodology are a rational, mathematically consistent approach to hypersonic aircraft designs, designs pushed closer to the optimum, and a design process either shortened or leaving time available for different concepts to be explored.

  1. Design Optimization of Structural Health Monitoring Systems

    SciTech Connect

    Flynn, Eric B.

    2014-03-06

    Sensor networks drive decisions. Approach: design networks to minimize the expected total cost (in a statistical sense, i.e., the Bayes risk) associated with making wrong decisions and with installing, maintaining, and running the sensor network itself. Search for optimal solutions using a Monte-Carlo-sampling-adapted genetic algorithm. Applications include structural health monitoring and surveillance.
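
    A toy sketch of the search strategy summarized above, assuming a binary sensor-placement vector, a fixed per-sensor cost, and a Monte-Carlo estimate of the expected cost of wrong decisions; the cost figures, miss-probability model, and genetic-algorithm settings are all hypothetical.

      import numpy as np
      rng = np.random.default_rng(0)

      N_SITES, SENSOR_COST, WRONG_DECISION_COST = 8, 1.0, 50.0

      def expected_decision_cost(layout, n_mc=200):
          # Monte-Carlo estimate of the cost of wrong decisions: a sensor at the
          # (random) damage site lowers the miss probability in this toy model.
          damage = rng.integers(0, N_SITES, size=n_mc)
          p_miss = np.array([0.5 if layout[d] else 0.9 for d in damage])
          wrong = rng.random(n_mc) < p_miss
          return WRONG_DECISION_COST * wrong.mean()

      def bayes_risk(layout):
          return SENSOR_COST * layout.sum() + expected_decision_cost(layout)

      # Simple genetic algorithm over binary sensor layouts.
      pop = rng.integers(0, 2, size=(30, N_SITES))
      for gen in range(40):
          fitness = np.array([bayes_risk(ind) for ind in pop])
          parents = pop[np.argsort(fitness)[:10]]          # truncation selection
          children = []
          while len(children) < len(pop):
              a, b = parents[rng.integers(0, 10, 2)]
              cut = rng.integers(1, N_SITES)
              child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
              flip = rng.random(N_SITES) < 0.05            # mutation
              children.append(np.where(flip, 1 - child, child))
          pop = np.array(children)

      best = min(pop, key=bayes_risk)
      print("best layout:", best, "estimated risk:", bayes_risk(best))

    In the actual application, the inner Monte-Carlo loop would run the structural health monitoring detection model rather than the toy miss-probability rule used here.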

  2. Design Oriented Structural Modeling for Airplane Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The main goal of the research conducted with the support of this grant was to develop design-oriented structural optimization methods for the conceptual design of airplanes. Traditionally in conceptual design, airframe weight is estimated from statistical equations developed over years of fitting airplane weight data in databases of similar existing airplanes. Utilization of such regression equations for the design of new airplanes can be justified only if the new airplanes use structural technology similar to the technology on the airplanes in those weight databases. If any new structural technology is to be pursued, or any new unconventional configurations designed, the statistical weight equations cannot be used. In such cases any structural weight estimate must be based on rigorous "physics based" structural analysis and optimization of the airframes under consideration. Work under this grant explored airframe design-oriented structural optimization techniques along two lines of research: methods based on "fast" design-oriented finite element technology, and methods based on equivalent plate / equivalent shell models of airframes, in which the vehicle is modelled as an assembly of plate and shell components, each simulating a lifting surface or nacelle / fuselage pieces. Since response to changes in geometry is essential in the conceptual design of airplanes, as is the capability to optimize the shape itself, research supported by this grant sought to develop efficient techniques for parametrization of airplane shape and for sensitivity analysis with respect to shape design variables. Towards the end of the grant period a prototype automated structural analysis code designed to work with the NASA Aircraft Synthesis conceptual design code ACSYNT was delivered to NASA Ames.

  3. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
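
    The divorce of the execution process from the model described above can be sketched with a small interface; the class and method names below are illustrative stand-ins, not the actual ROSE API.

      from itertools import product

      class Model:
          """Anything that maps a dict of inputs to a dict of outputs."""
          def run(self, inputs):
              raise NotImplementedError

      class ParaboloidModel(Model):
          def run(self, inputs):
              x, y = inputs["x"], inputs["y"]
              return {"f": (x - 1.0) ** 2 + (y + 2.0) ** 2}

      class FullFactorialStudy:
          """A reusable 'process' that can be applied to any Model."""
          def __init__(self, levels):
              self.levels = levels        # e.g. {"x": [0, 1, 2], "y": [-3, -2]}

          def execute(self, model):
              names = list(self.levels)
              cases = []
              for values in product(*(self.levels[n] for n in names)):
                  inputs = dict(zip(names, values))
                  cases.append((inputs, model.run(inputs)))
              return cases

      study = FullFactorialStudy({"x": [0.0, 1.0, 2.0], "y": [-3.0, -2.0, -1.0]})
      for inputs, outputs in study.execute(ParaboloidModel()):
          print(inputs, "->", outputs)

    Because the study object only talks to the Model interface, the same process could be reused unchanged with an optimizer wrapper, a sensitivity study, or any other model, which is the reuse pattern the abstract describes.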

  4. Collective Framework and Performance Optimizations to Open MPI for Cray XT Platforms

    SciTech Connect

    Ladd, Joshua S; Gorentla Venkata, Manjunath; Shamis, Pavel; Graham, Richard L

    2011-01-01

    The performance and scalability of collective operations plays a key role in the performance and scalability of many scientific applications. Within the Open MPI code base we have developed a general-purpose hierarchical collective operations framework called Cheetah, and applied it at large scale on the Oak Ridge Leadership Computing Facility's (OLCF) Jaguar platform, obtaining better performance and scalability than the native MPI implementation. This paper discusses Cheetah's design and implementation, and optimizations to the framework for Cray XT5 platforms. Our results show that Cheetah's broadcast and barrier perform better than the native MPI implementation. For medium data, Cheetah's broadcast outperforms the native MPI implementation by 93% at 49,152 processes. For small and large data, it outperforms the native MPI implementation by 10% and 9%, respectively, at 24,576 processes. Cheetah's barrier performs 10% better than the native MPI implementation at 12,288 processes.
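
    A minimal sketch of the hierarchical idea behind such collectives, written with mpi4py rather than the Cheetah framework itself; the grouping of ranks into "nodes" of four is an arbitrary assumption for illustration.

      # Run with e.g.: mpiexec -n 8 python hier_bcast.py
      from mpi4py import MPI

      world = MPI.COMM_WORLD
      rank = world.Get_rank()

      RANKS_PER_NODE = 4                   # assumed grouping for this sketch
      node_id = rank // RANKS_PER_NODE
      local = world.Split(color=node_id, key=rank)     # intra-"node" communicator

      # Communicator containing one leader (local rank 0) per node.
      leader_color = 0 if local.Get_rank() == 0 else MPI.UNDEFINED
      leaders = world.Split(color=leader_color, key=rank)

      data = {"payload": list(range(10))} if rank == 0 else None

      # Stage 1: broadcast among node leaders only.
      if leaders != MPI.COMM_NULL:
          data = leaders.bcast(data, root=0)

      # Stage 2: each leader broadcasts within its own node.
      data = local.bcast(data, root=0)
      print(f"rank {rank} received {data['payload'][:3]}...")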

  5. Optimal designs in regression with correlated errors

    PubMed Central

    Dette, Holger; Pepelyshev, Andrey; Zhigljavsky, Anatoly

    2016-01-01

    This paper discusses the problem of determining optimal designs for regression models, when the observations are dependent and taken on an interval. A complete solution of this challenging optimal design problem is given for a broad class of regression models and covariance kernels. We propose a class of estimators which are only slightly more complicated than the ordinary least-squares estimators. We then demonstrate that we can design the experiments, such that asymptotically the new estimators achieve the same precision as the best linear unbiased estimator computed for the whole trajectory of the process. As a by-product we derive explicit expressions for the BLUE in the continuous time model and analytic expressions for the optimal designs in a wide class of regression models. We also demonstrate that for a finite number of observations the precision of the proposed procedure, which includes the estimator and design, is very close to the best achievable. The results are illustrated on a few numerical examples. PMID:27340304
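
    A small numerical sketch of estimation under correlated errors on an interval, assuming an exponential covariance kernel; it contrasts ordinary least squares with the generalized least-squares (BLUE-type) weighting the abstract alludes to, and is not the authors' specific estimator or optimal design.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 40
      t = np.linspace(0.0, 1.0, n)                 # design points on an interval
      X = np.column_stack([np.ones(n), t])         # simple linear regression model

      # Exponential covariance kernel for the correlated errors.
      Sigma = 0.2 ** 2 * np.exp(-np.abs(t[:, None] - t[None, :]) / 0.3)
      errors = rng.multivariate_normal(np.zeros(n), Sigma)
      y = X @ np.array([1.0, 2.0]) + errors

      # Ordinary least squares ignores the correlation structure.
      beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

      # Generalized least squares (the BLUE when Sigma is known).
      Si = np.linalg.inv(Sigma)
      beta_gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)

      print("OLS:", beta_ols, "GLS:", beta_gls)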

  6. Computational design and optimization of energy materials

    NASA Astrophysics Data System (ADS)

    Chan, Maria

    The use of density functional theory (DFT) to understand and improve energy materials for diverse applications - including energy storage, thermal management, catalysis, and photovoltaics - is widespread. The further step of using high-throughput DFT calculations to design materials has led to an acceleration in materials discovery and development. Due to various limitations of DFT, including accuracy and computational cost, however, it is important to leverage effective models and, in some cases, experimental information to aid the design process. In this talk, I will discuss efforts in the design and optimization of energy materials using a combination of effective models, DFT, machine learning, and experimental information.

  7. Optimizing the TRD design for ACCESS

    SciTech Connect

    Cherry, M. L.; Guzik, T. G.; Isbert, J.; Wefel, J. P.

    1999-01-22

    The present ACCESS design combines an ionization calorimeter with a transition radiation detector (TRD) to measure the cosmic ray composition and energy spectrum from H to Fe at energies above 1 TeV/nucleon to the 'knee' in the all particle spectrum. We are in the process of optimizing the TRD design to extend the range of the technique to as high an energy as possible given the constraints of the International Space Station mission and the need to coexist with the calorimeter. The current status of the design effort and preliminary results will be presented.

  8. A concept ideation framework for medical device design.

    PubMed

    Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar

    2015-06-01

    Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. A Comprehensive Learning Event Design Using a Communication Framework

    ERIC Educational Resources Information Center

    Bower, Robert L.

    1975-01-01

    A learning event design for accountability uses a communications framework. The example given is a slide presentation on the invasion of Cuba during the Spanish-American War. Design components include introduction, objectives, media, involvement plans, motivation, bibliography, recapitulation, involvement sheets, evaluation, stimulus-response…

  10. Learning Experience as Transaction: A Framework for Instructional Design

    ERIC Educational Resources Information Center

    Parrish, Patrick E.; Wilson, Brent G.; Dunlap, Joanna C.

    2011-01-01

    This article presents a framework for understanding learning experience as an object for instructional design--as an object for design as well as research and understanding. Compared to traditional behavioral objectives or discrete cognitive skills, the object of experience is more holistic, requiring simultaneous attention to cognition, behavior,…

  11. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  12. A Framework on Causes and Effects of Design Iterations

    NASA Astrophysics Data System (ADS)

    Mujumdar, Purva; Matsagar, Vasant

    2017-06-01

    Design is an evolutionary process that involves several teams collaborating with each other to achieve the final design solution. Due to the intrinsic association of the teams participating during the design stages, frequent exchanges of information take place among them to execute various design activities. These information exchanges occur in cycles/loops in which the design is improved/refined at each exchange. This process is referred to as design iteration. Iteration is one of the most complex and unavoidable phenomena of a design process, and iterations last until the specifications and design requirements are met. Hence, design teams need to acquire a thorough understanding of the factors that cause iterations. As the causes of iterations have not been identified in a classified manner, a causes-and-effects framework of design iteration is presented in this paper. The methodology adopted to develop this framework was a detailed literature review together with interactions with industry experts. This framework guides project planners to identify the possible causes of iterations and enables them to plan their design projects efficiently.

  13. Optimized Uncertainty Quantification Algorithm Within a Dynamic Event Tree Framework

    SciTech Connect

    J. W. Nielsen; Akira Tokuhiro; Robert Hiromoto

    2014-06-01

    Methods for developing Phenomenological Identification and Ranking Tables (PIRT) for nuclear power plants have been a useful tool in providing insight into modelling aspects that are important to safety. These methods have relied on expert knowledge of reactor plant transients and thermal-hydraulic codes to identify the areas of highest importance. Quantified PIRT (QPIRT) provides a rigorous method for quantifying the phenomena that can have the greatest impact. The transients that are evaluated, and the timing of those events, are typically developed in collaboration with the probabilistic risk analysis (PRA). Though quite effective in evaluating risk, traditional PRA methods lack the capability to evaluate complex dynamic systems where end states may vary as a function of the transition time from physical state to physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of complex dynamic systems. A limitation of DPRA is its potential for state or combinatorial explosion, which grows as a function of the number of components as well as the sampling of state-to-state transition times of the entire system. This paper presents a method for performing QPIRT within a dynamic event tree framework such that the timing of events which result in the highest probabilities of failure is captured, and a QPIRT is performed simultaneously with the discrete dynamic event tree evaluation. The simulation thus yields a formal QPIRT for each end state. The use of dynamic event trees results in state explosion as the number of possible component states increases; this paper therefore utilizes a branch-and-bound algorithm to optimize the solution of the dynamic event trees, and summarizes the methods used to implement the branch-and-bound algorithm in solving the discrete dynamic event trees.
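
    A schematic sketch of pruning a discrete dynamic event tree with branch and bound: a branch is not expanded once its path probability can no longer exceed the best failure end-state probability found so far. The tree, branching probabilities, and failure labels are invented for illustration.

      # Each node: (name, probability of taking this branch, list of children).
      # Leaves whose name contains "failure" mark a system-failure end state.
      TREE = ("root", 1.0, [
          ("pump trips",   0.20, [("failure", 0.30, []), ("recovered", 0.70, [])]),
          ("valve sticks", 0.05, [("failure", 0.90, []), ("recovered", 0.10, [])]),
          ("nominal",      0.75, [("success", 1.00, [])]),
      ])

      best = {"prob": 0.0, "path": None}

      def search(node, path_prob, path):
          name, p, children = node
          prob = path_prob * p
          # Bound: a path's probability can only shrink, so prune early.
          if prob <= best["prob"]:
              return
          if not children:                  # leaf / end state
              if "failure" in name and prob > best["prob"]:
                  best["prob"], best["path"] = prob, path + [name]
              return
          for child in children:            # branch
              search(child, prob, path + [name])

      search(TREE, 1.0, [])
      print("most probable failure end state:", best["path"], best["prob"])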

  14. High Speed Civil Transport Design Using Collaborative Optimization and Approximate Models

    NASA Technical Reports Server (NTRS)

    Manning, Valerie Michelle

    1999-01-01

    The design of supersonic aircraft requires complex analysis in multiple disciplines, posing, a challenge for optimization methods. In this thesis, collaborative optimization, a design architecture developed to solve large-scale multidisciplinary design problems, is applied to the design of supersonic transport concepts. Collaborative optimization takes advantage of natural disciplinary segmentation to facilitate parallel execution of design tasks. Discipline-specific design optimization proceeds while a coordinating mechanism ensures progress toward an optimum and compatibility between disciplinary designs. Two concepts for supersonic aircraft are investigated: a conventional delta-wing design and a natural laminar flow concept that achieves improved performance by exploiting properties of supersonic flow to delay boundary layer transition. The work involves the development of aerodynamics and structural analyses, and integration within a collaborative optimization framework. It represents the most extensive application of the method to date.

  15. Aircraft family design using enhanced collaborative optimization

    NASA Astrophysics Data System (ADS)

    Roth, Brian Douglas

    Significant progress has been made toward the development of multidisciplinary design optimization (MDO) methods that are well-suited to practical large-scale design problems. However, opportunities exist for further progress. This thesis describes the development of enhanced collaborative optimization (ECO), a new decomposition-based MDO method. To support the development effort, the thesis offers a detailed comparison of two existing MDO methods: collaborative optimization (CO) and analytical target cascading (ATC). This aids in clarifying their function and capabilities, and it provides inspiration for the development of ECO. The ECO method offers several significant contributions. First, it enhances communication between disciplinary design teams while retaining the low-order coupling between them. Second, it provides disciplinary design teams with more authority over the design process. Third, it resolves several troubling computational inefficiencies that are associated with CO. As a result, ECO provides significant computational savings (relative to CO) for the test cases and practical design problems described in this thesis. New aircraft development projects seldom focus on a single set of mission requirements. Rather, a family of aircraft is designed, with each family member tailored to a different set of requirements. This thesis illustrates the application of decomposition-based MDO methods to aircraft family design. This represents a new application area, since MDO methods have traditionally been applied to multidisciplinary problems. ECO offers aircraft family design the same benefits that it affords to multidisciplinary design problems. Namely, it simplifies analysis integration, it provides a means to manage problem complexity, and it enables concurrent design of all family members. In support of aircraft family design, this thesis introduces a new wing structural model with sufficient fidelity to capture the tradeoffs associated with component

  16. New theoretical framework for designing nonionic surfactant mixtures that exhibit a desired adsorption kinetics behavior.

    PubMed

    Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel

    2010-12-21

    How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C(10)E(4), C(12)E(5), C(12)E(6), and C(10)E(8)) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem as well as the SNOPT commercial optimization solver to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. Finally, we compare the experimental DST profile measured at the surfactant mixture condition
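
    A sketch of how such a mixture-design problem can be posed as a constrained least-squares fit of a predicted dynamic-surface-tension profile to a target curve, using a purely illustrative stand-in for the adsorption kinetics model (not the MSB model) and SciPy's SLSQP solver in place of SNOPT.

      import numpy as np
      from scipy.optimize import minimize

      times = np.logspace(-2, 2, 30)                    # s, sampling of the DST curve
      target_dst = 70.0 - 25.0 / (1.0 + 0.5 / times)    # mN/m, desired profile (invented)

      # Stand-in per-surfactant "kinetics" parameters for four surfactants.
      drop = np.array([20.0, 25.0, 27.0, 22.0])         # equilibrium tension reduction
      rate = np.array([2.0, 0.8, 0.5, 3.0])             # characteristic time scales, s

      def predicted_dst(fractions):
          # Crude mixing rule: composition-weighted single-surfactant profiles.
          per_surf = 70.0 - drop[:, None] / (1.0 + rate[:, None] / times[None, :])
          return fractions @ per_surf

      def misfit(fractions):
          return np.sum((predicted_dst(fractions) - target_dst) ** 2)

      x0 = np.full(4, 0.25)
      res = minimize(
          misfit, x0, method="SLSQP",
          bounds=[(0.0, 1.0)] * 4,
          constraints=[{"type": "eq", "fun": lambda f: f.sum() - 1.0}],
      )
      print("optimal composition:", np.round(res.x, 3), "residual:", res.fun)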

  17. Optimal AFCS: particularities of real design

    NASA Astrophysics Data System (ADS)

    Platonov, A.; Zaitsev, Ie.; Chaciński, H.

    2015-09-01

    The paper discusses particularities of the design of optimal adaptive communication systems (AFCS) that follow from the particularities of their architecture and way of functioning, as well as from the approach to their design. The main one is that AFCS employ an analog method of transmission (in the paper, amplitude modulation) and are intended for short-range transmission of signals from analog sources. Another is that AFCS design is carried out on the basis of strict results of concurrent analytical optimisation of the transmitting and receiving parts of the system. General problems appearing during the transition from the theoretical results to real engineering design, as well as approaches to their solution, are discussed. Some concrete tasks of AFCS design are also considered.

  18. Optimal design of waveguiding periodic structures

    NASA Astrophysics Data System (ADS)

    Diana, Roberto; Giorgio, Agostino; Perri, Anna Gina; Armenise, Mario Nicola

    2003-04-01

    The design of some 1D waveguiding photonic bandgap (PBG) devices and fiber Bragg gratings (FBG) for microstrain-based sensing applications has been carried out using a model based on the Bloch-Floquet theorem. A low-loss, very narrow passband GaAs PBG filter for the operating wavelength λ = 1.55 μm, having an air-bridge configuration, was designed and simulated. Moreover, a resonant Si-on-glass PBG device has been designed to obtain the resonance condition at λ = 1.55 μm. Finally, an FBG-based microstrain sensor design has been carried out, having an array of 32 FBGs. A complete analysis of the propagation characteristics, electromagnetic field harmonics and total field distribution, transmission and reflection coefficients, guided and radiated power, and total losses enabled the optimization of the design in a very short CPU time.

  19. A General Framework of Dynamic Constrained Multiobjective Evolutionary Algorithms for Constrained Optimization.

    PubMed

    Zeng, Sanyou; Jiao, Ruwang; Li, Changhe; Li, Xi; Alkasassbeh, Jawdat S

    2017-09-01

    A novel multiobjective technique is proposed for solving constrained optimization problems (COPs) in this paper. The method highlights three different perspectives: 1) a COP is converted into an equivalent dynamic constrained multiobjective optimization problem (DCMOP) with three objectives: a) the original objective; b) a constraint-violation objective; and c) a niche-count objective; 2) a method of gradually reducing the constraint boundary aims to handle the constraint difficulty; and 3) a method of gradually reducing the niche size aims to handle the multimodal difficulty. A general framework for the design of dynamic constrained multiobjective evolutionary algorithms is proposed for solving DCMOPs. Three popular types of multiobjective evolutionary algorithms, i.e., Pareto ranking-based, decomposition-based, and hypervolume indicator-based, are employed to instantiate the framework. The three instantiations are tested on two benchmark suites. Experimental results show that they perform better than, or competitively with, a set of state-of-the-art constraint optimizers, especially on problems with a large number of dimensions.
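
    A compact sketch of the conversion described above: the constrained problem is recast with the original objective plus a constraint-violation objective, and the admissible violation boundary is shrunk over the generations. The test problem and the epsilon schedule are arbitrary, and the niche-count objective and the full evolutionary loop are omitted.

      import numpy as np

      def f(x):                       # original objective
          return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2

      def violation(x):               # total constraint violation, g_i(x) <= 0 feasible
          g = [x[0] ** 2 + x[1] ** 2 - 1.0,     # unit-disc constraint
               -x[0]]                            # x0 >= 0
          return sum(max(0.0, gi) for gi in g)

      def objectives(x, eps):
          # Points within the relaxed boundary eps count as feasible, so early
          # generations are free to explore the infeasible region.
          return f(x), max(0.0, violation(x) - eps)

      rng = np.random.default_rng(0)
      pop = rng.uniform(-2.0, 2.0, size=(50, 2))
      for gen in range(20):
          eps = 1.0 * (1.0 - gen / 19.0) ** 2   # gradually reduce the boundary to 0
          scored = [objectives(x, eps) for x in pop]
          # ... a Pareto-ranking / decomposition / hypervolume-based EA would
          # select and vary the population here using the two objectives ...

      # With eps = 0 the second objective is the true constraint violation.
      best = min(zip(pop, scored), key=lambda t: (t[1][1], t[1][0]))
      print("least-violating point:", best[0], "objectives:", best[1])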

  20. Robust Design Optimization via Failure Domain Bounding

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2007-01-01

    This paper extends and applies the strategies recently developed by the authors for handling constraints under uncertainty to robust design optimization. For the scope of this paper, robust optimization is a methodology aimed at problems for which some parameters are uncertain and are only known to belong to some uncertainty set. This set can be described by either a deterministic or a probabilistic model. In the methodology developed herein, optimization-based strategies are used to bound the constraint violation region using hyper-spheres and hyper-rectangles. By comparing the resulting bounding sets with any given uncertainty model, it can be determined whether the constraints are satisfied for all members of the uncertainty model (i.e., constraints are feasible) or not (i.e., constraints are infeasible). If constraints are infeasible and a probabilistic uncertainty model is available, upper bounds to the probability of constraint violation can be efficiently calculated. The tools developed enable approximating not only the set of designs that make the constraints feasible but also, when required, the set of designs for which the probability of constraint violation is below a prescribed admissible value. When constraint feasibility is possible, several design criteria can be used to shape the uncertainty model of performance metrics of interest. Worst-case, least-second-moment, and reliability-based design criteria are considered herein. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, these strategies are easily applicable to a broad range of engineering problems.

  1. Multidisciplinary Design Optimization of A Highly Flexible Aeroservoelastic Wing

    NASA Astrophysics Data System (ADS)

    Haghighat, Sohrab

    A multidisciplinary design optimization framework is developed that integrates control system design with aerostructural design for a highly deformable wing. The objective of this framework is to surpass existing aircraft endurance limits through the use of an active load alleviation system designed concurrently with the rest of the aircraft. The novelty of this work is twofold. First, a unified dynamics framework is developed to represent the full six-degree-of-freedom rigid-body dynamics along with the structural dynamics. It allows for an integrated control design that accounts for both manoeuvrability (flying quality) and aeroelasticity criteria simultaneously. Secondly, by synthesizing the aircraft control system along with the structural sizing and aerodynamic shape design, the final design has the potential to exploit synergies among the three disciplines and yield higher-performing aircraft. A co-rotational structural framework featuring Euler-Bernoulli beam elements is developed to capture the wing's nonlinear deformations under the effect of aerodynamic and inertial loadings. In this work, a three-dimensional aerodynamic panel code, capable of calculating both steady and unsteady loadings, is used. Two different control methods, a model predictive controller (MPC) and a 2-DOF mixed-norm robust controller, are considered in this work to control a highly flexible aircraft. Both control techniques offer unique advantages that make them promising for controlling a highly flexible aircraft. The control system works towards executing time-dependent manoeuvres along with performing gust/manoeuvre load alleviation. The developed framework is investigated for demonstration in two design cases: one in which the control system simply works towards achieving or maintaining a target altitude, and another where the control system also performs load alleviation. The use of the active load alleviation system results in a significant improvement in the aircraft performance

  2. Optimal design of a space power system

    NASA Technical Reports Server (NTRS)

    Chun, Young W.; Braun, James F.

    1990-01-01

    The aerospace industry, like many other industries, regularly applies optimization techniques to develop designs which reduce cost, maximize performance, and minimize weight. The desire to minimize weight is of particular importance in space-related products since the costs of launch are directly related to payload weight, and launch vehicle capabilities often limit the allowable weight of a component or system. With these concerns in mind, this paper presents the optimization of a space-based power generation system for minimum mass. The goal of this work is to demonstrate the use of optimization techniques on a realistic and practical engineering system. The power system described uses thermoelectric devices to convert heat into electricity. The heat source for the system is a nuclear reactor. Waste heat is rejected from the system to space by a radiator.

  3. Design Methods and Optimization for Morphing Aircraft

    NASA Technical Reports Server (NTRS)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features that uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal based upon on two competing aircraft performance objectives. The second area has been titled morphing as an independent variable and formulates the sizing of a morphing aircraft as an optimization problem in which the amount of geometric morphing for various aircraft parameters are included as design variables. This second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing to possibly enable transatlantic point-to-point passenger service.

  5. Generalized mathematical models in design optimization

    NASA Technical Reports Server (NTRS)

    Papalambros, Panos Y.; Rao, J. R. Jagannatha

    1989-01-01

    The theory of optimality conditions of extremal problems can be extended to problems continuously deformed by an input vector. The connection between the sensitivity, well-posedness, stability and approximation of optimization problems is steadily emerging. The authors believe that the important realization here is that the underlying basis of all such work is still the study of point-to-set maps and of small perturbations, yet what has been identified previously as being just related to solution procedures is now being extended to study modeling itself in its own right. Many important studies related to the theoretical issues of parametric programming and large deformation in nonlinear programming have been reported in the last few years, and the challenge now seems to be in devising effective computational tools for solving these generalized design optimization models.

  6. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    SciTech Connect

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, which couples the EnKF with information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on first-order and second-order statistics, different information metrics, including the Shannon entropy difference (SD), degrees of freedom for signal (DFS), and relative entropy (RE), are used to design the optimal sampling strategy. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on the various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, a larger ensemble size improves the parameter estimation and the convergence of the optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can equally be applied to any other hydrological problem.
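
    A bare-bones sketch of the EnKF analysis step that underlies the approach above, using perturbed observations; the one-parameter toy model and noise levels are invented, and no optimal-design step is shown.

      import numpy as np

      rng = np.random.default_rng(3)
      n_ens, true_k = 100, 2.5

      def forward(k):
          # Toy "hydrological" model: observation is a nonlinear function of k.
          return np.sqrt(np.abs(k)) + 0.1 * k

      obs = forward(true_k) + rng.normal(0.0, 0.05)    # one noisy measurement
      obs_var = 0.05 ** 2

      ensemble = rng.normal(2.0, 1.0, n_ens)           # prior parameter ensemble
      predicted = forward(ensemble)                    # predicted observations

      # Sample covariances used in the Kalman gain.
      cov_ky = np.cov(ensemble, predicted)[0, 1]
      var_y = predicted.var(ddof=1)
      gain = cov_ky / (var_y + obs_var)

      # Update each member against a perturbed copy of the observation.
      perturbed = obs + rng.normal(0.0, 0.05, n_ens)
      analysis = ensemble + gain * (perturbed - predicted)

      print("prior mean %.3f -> posterior mean %.3f (truth %.1f)"
            % (ensemble.mean(), analysis.mean(), true_k))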

  7. Design and Performance Frameworks for Constructing Problem-Solving Simulations

    PubMed Central

    Stevens, Ron; Palacio-Cayetano, Joycelin

    2003-01-01

    Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks ideally would be guided less by the strengths/limitations of the presentation media and more by cognitive analyses detailing the goals of the tasks, the needs and abilities of students, and the resulting decision outcomes needed by different audiences. This article describes a problem-solving environment and associated theoretical framework for investigating how students select and use strategies as they solve complex science problems. A framework is first described for designing on-line problem spaces that highlights issues of content, scale, cognitive complexity, and constraints. While this framework was originally designed for medical education, it has proven robust and has been successfully applied to learning environments from elementary school through medical school. Next, a similar framework is detailed for collecting student performance and progress data that can provide evidence of students' strategic thinking and that could potentially be used to accelerate student progress. Finally, experimental validation data are presented that link strategy selection and use with other metrics of scientific reasoning and student achievement. PMID:14506505

  8. Multiobjective optimization in integrated photonics design.

    PubMed

    Gagnon, Denis; Dumont, Joey; Dubé, Louis J

    2013-07-01

    We propose the use of the parallel tabu search algorithm (PTS) to solve combinatorial inverse design problems in integrated photonics. To assess the potential of this algorithm, we consider the problem of beam shaping using a two-dimensional arrangement of dielectric scatterers. The performance of PTS is compared to one of the most widely used optimization algorithms in photonics design, the genetic algorithm (GA). We find that PTS can produce comparable or better solutions than the GA, while requiring less computation time and fewer adjustable parameters. For the coherent beam shaping problem as a case study, we demonstrate how PTS can tackle multiobjective optimization problems and represent a robust and efficient alternative to GA.

  9. Initial data sampling in design optimization

    NASA Astrophysics Data System (ADS)

    Southall, Hugh L.; O'Donnell, Terry H.

    2011-06-01

    Evolutionary computation (EC) techniques in design optimization such as genetic algorithms (GA) or efficient global optimization (EGO) require an initial set of data samples (design points) to start the algorithm. They are obtained by evaluating the cost function at selected sites in the input space. A two-dimensional input space can be sampled using a Latin square, a statistical sampling technique which samples a square grid such that there is a single sample in any given row and column. The Latin hypercube is a generalization to any number of dimensions. However, a standard random Latin hypercube can result in initial data sets which may be highly correlated and may not have good space-filling properties. There are techniques which address these issues. We describe and use one technique in this paper.
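
    A short sketch of drawing such an initial sample with SciPy's quasi-Monte Carlo module (scipy.stats.qmc, available from SciPy 1.7 on); the bounds, sample size, and cost function are arbitrary.

      import numpy as np
      from scipy.stats import qmc

      # 12 initial design points in a 2-D input space.
      sampler = qmc.LatinHypercube(d=2, seed=42)
      unit_sample = sampler.random(n=12)               # points in [0, 1)^2

      # Scale to the actual design-variable bounds.
      lower, upper = [0.0, -5.0], [10.0, 5.0]
      design_points = qmc.scale(unit_sample, lower, upper)

      # Evaluate the (expensive) cost function at each initial site.
      def cost(x):
          return (x[0] - 3.0) ** 2 + np.sin(x[1])

      for x in design_points:
          print(np.round(x, 3), "->", round(cost(x), 3))

    Newer SciPy releases also accept an optimization argument (for example "random-cd") on LatinHypercube, which post-processes the sample to improve its correlation and space-filling properties, addressing the issue raised above.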

  10. Social research design: framework for integrating philosophical and practical elements.

    PubMed

    Cunningham, Kathryn Burns

    2014-09-01

    To provide and elucidate a comprehensible framework for the design of social research. An abundance of information exists concerning the process of designing social research. The overall message that can be gleaned is that numerous elements - both philosophical (ontological and epistemological assumptions and theoretical perspective) and practical (issue to be addressed, purpose, aims and research questions) - are influential in the process of selecting a research methodology and methods, and that these elements and their inter-relationships must be considered and explicated to ensure a coherent research design that enables well-founded and meaningful conclusions. There is a lack of guidance concerning the integration of practical and philosophical elements, hindering their consideration and explication. The author's PhD research into loneliness and cancer. This is a methodology paper. A guiding framework that incorporates all of the philosophical and practical elements influential in social research design is presented. The chronological and informative relationships between the elements are discussed. The framework presented can be used by social researchers to consider and explicate the practical and philosophical elements influential in the selection of a methodology and methods. It is hoped that the framework presented will aid social researchers with the design, and the explication of the design, of their research, thereby enhancing the credibility of their projects and enabling their research to establish well-founded and meaningful conclusions.

  11. Optimized design for an electrothermal microactuator

    NASA Astrophysics Data System (ADS)

    Cǎlimǎnescu, Ioan; Stan, Liviu-Constantin; Popa, Viorica

    2015-02-01

    In micromechanical structures, electrothermal actuators are known to be capable of providing larger force and reasonable tip deflection compared to electrostatic ones. Many studies have been devoted to the analysis of flexure actuators. One of the most popular electrothermal actuators is the so-called 'U-shaped' actuator. The device is composed of two suspended beams with variable cross sections joined at the free end, which constrains the tip to move in an arcing motion while current is passed through the actuator. The goal of this research is to determine via FEA the best-fitted geometry of the microactuator (the optimization input parameters) in order to drive some of the output parameters, such as thermal strain or total deformation, to their maximum values. The software used to generate the CAD geometry was SolidWorks 2010 and all the FEA analysis was conducted with Ansys 13. The optimized model has smaller values of the geometric input parameters, that is, a more compact geometry; the maximum temperature reached a smaller value for the optimized model; the calculated heat flux is 13% bigger for the optimized model, and likewise for the Joule heat (26%), total deformation (1.2%) and thermal strain (8%). By simply optimizing the design, the microactuator became more compact and more efficient.

  12. Computational Methods for Design, Control and Optimization

    DTIC Science & Technology

    2007-10-01

    34scenario" that applies to channel flows ( Poiseuille flows , Couette flow ) and pipe flows . Over the past 75 years many complex "transition theories" have... Simulation of Turbulent Flows , Springer Verlag, 2005. Additional Publications Supported by this Grant 1. J. Borggaard and T. Iliescu, Approximate Deconvolution...rigorous analysis of design algorithms that combine numerical simulation codes, approximate sensitivity calculations and optimization codes. The fundamental

  13. Optimal Design of Compact Spur Gear Reductions

    DTIC Science & Technology

    1992-09-01

    Lundberg and Palmgren (1952) developed a theory for the life and capacity of ball and roller bearings. This life model is... bearings (Lundberg and Palmgren, 1952). Lundberg and Palmgren determined that the scatter in the life of a bearing can be modeled with a two-parameter... optimal design of compact spur gear reductions includes the selection of bearing and shaft proportions in

  14. Embedding Educational Design Pattern Frameworks into Learning Management Systems

    NASA Astrophysics Data System (ADS)

    Derntl, Michael; Calvo, Rafael A.

    Educational design patterns describe reusable solutions to the design of learning tasks and environments. While there are many projects producing patterns, there are few approaches dealing with supporting the instructor/user in instantiating and running those patterns on learning management systems (LMS). This paper aims to make a leap forward in this direction by presenting two different methods of embedding design pattern frameworks into LMS: (1) Supplying custom LMS components as part of the design patterns, and (2) Configuring existing LMS components based on design patterns. Descriptions of implementations and implications of these methods are provided.

  15. Optimal design of a tidal turbine

    NASA Astrophysics Data System (ADS)

    Kueny, J. L.; Lalande, T.; Herou, J. J.; Terme, L.

    2012-11-01

    An optimal design procedure has been applied to improve the design of an open-center tidal turbine. Specific software developed in C++ makes it possible to generate geometry adapted to the particular constraints imposed on this machine. Automatic scripts based on the AUTOGRID, IGG, FINE/TURBO and CFView software of the NUMECA CFD suite are used to evaluate all the candidate geometries. This package is coupled with the optimization software EASY, which is based on an evolutionary strategy complemented by an artificial neural network. A new technique is proposed to guarantee the robustness of the mesh over the whole range of the design parameters. An important improvement of the initial geometry has been obtained. To limit the total CPU time necessary for this optimization process, the geometry of the tidal turbine has been treated as axisymmetric, with a uniform upstream velocity. A more complete model (12 M nodes) has been built in order to analyze the effects related to the sea bed boundary layer, the proximity of the sea surface, the presence of a large triangular basement supporting the turbine, and a possible incidence of the upstream velocity.

  16. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

    The author has developed several general-purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial-grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  17. Optimal robust motion controller design using multiobjective genetic algorithm.

    PubMed

    Sarjaš, Andrej; Svečko, Rajko; Chowdhury, Amor

    2014-01-01

    This paper describes the use of a multiobjective genetic algorithm for robust motion controller design. Motion controller structure is based on a disturbance observer in an RIC framework. The RIC approach is presented in the form with internal and external feedback loops, in which an internal disturbance rejection controller and an external performance controller must be synthesised. This paper involves novel objectives for robustness and performance assessments for such an approach. Objective functions for the robustness property of RIC are based on simple even polynomials with nonnegativity conditions. Regional pole placement method is presented with the aims of controllers' structures simplification and their additional arbitrary selection. Regional pole placement involves arbitrary selection of central polynomials for both loops, with additional admissible region of the optimized pole location. Polynomial deviation between selected and optimized polynomials is measured with derived performance objective functions. A multiobjective function is composed of different unrelated criteria such as robust stability, controllers' stability, and time-performance indexes of closed loops. The design of controllers and multiobjective optimization procedure involve a set of the objectives, which are optimized simultaneously with a genetic algorithm-differential evolution.

  18. Optimal Robust Motion Controller Design Using Multiobjective Genetic Algorithm

    PubMed Central

    Svečko, Rajko

    2014-01-01

    This paper describes the use of a multiobjective genetic algorithm for robust motion controller design. Motion controller structure is based on a disturbance observer in an RIC framework. The RIC approach is presented in the form with internal and external feedback loops, in which an internal disturbance rejection controller and an external performance controller must be synthesised. This paper involves novel objectives for robustness and performance assessments for such an approach. Objective functions for the robustness property of RIC are based on simple even polynomials with nonnegativity conditions. Regional pole placement method is presented with the aims of controllers' structures simplification and their additional arbitrary selection. Regional pole placement involves arbitrary selection of central polynomials for both loops, with additional admissible region of the optimized pole location. Polynomial deviation between selected and optimized polynomials is measured with derived performance objective functions. A multiobjective function is composed of different unrelated criteria such as robust stability, controllers' stability, and time-performance indexes of closed loops. The design of controllers and multiobjective optimization procedure involve a set of the objectives, which are optimized simultaneously with a genetic algorithm—differential evolution. PMID:24987749

  19. A clothing modeling framework for uniform and armor system design

    NASA Astrophysics Data System (ADS)

    Man, Xiaolin; Swan, Colby C.; Rahmatalla, Salam

    2006-05-01

    In the analysis and design of military uniforms and body armor systems it is helpful to quantify the effects of the clothing/armor system on a wearer's physical performance capabilities. Toward this end, a clothing modeling framework for quantifying the mechanical interactions between a given uniform or body armor system design and a specific wearer performing defined physical tasks is proposed. The modeling framework consists of three interacting modules: (1) a macroscale fabric mechanics/dynamics model; (2) a collision detection and contact correction module; and (3) a human motion module. In the proposed framework, the macroscopic fabric model is based on a rigorous large deformation continuum-degenerated shell theory representation. The collision and contact module enforces non-penetration constraints between the fabric and human body and computes the associated contact forces between the two. The human body is represented in the current framework, as an assemblage of overlapping ellipsoids that undergo rigid body motions consistent with human motions while performing actions such as walking, running, or jumping. The transient rigid body motions of each ellipsoidal body segment in time are determined using motion capture technology. The integrated modeling framework is then exercised to quantify the resistance that the clothing exerts on the wearer during the specific activities under consideration. Current results from the framework are presented and its intended applications are discussed along with some of the key challenges remaining in clothing system modeling.

  20. Geometric constraints for shape and topology optimization in architectural design

    NASA Astrophysics Data System (ADS)

    Dapogny, Charles; Faure, Alexis; Michailidis, Georgios; Allaire, Grégoire; Couvelas, Agnes; Estevez, Rafael

    2017-02-01

    This work proposes a shape and topology optimization framework oriented towards conceptual architectural design. A particular emphasis is put on the possibility for the user to intervene in the optimization process by supplying information about his personal taste. More precisely, we formulate three novel constraints on the geometry of shapes; while the first two are mainly related to aesthetics, the third one may also be used to handle several fabrication issues that are of special interest in the design of civil structures. The common mathematical ingredient to all three models is the signed distance function to a domain, and its sensitivity analysis with respect to perturbations of this domain; in the present work, this material is extended to the case where the ambient space is equipped with an anisotropic metric tensor. Numerical examples are discussed in two and three space dimensions.

  1. Geometric constraints for shape and topology optimization in architectural design

    NASA Astrophysics Data System (ADS)

    Dapogny, Charles; Faure, Alexis; Michailidis, Georgios; Allaire, Grégoire; Couvelas, Agnes; Estevez, Rafael

    2017-06-01

    This work proposes a shape and topology optimization framework oriented towards conceptual architectural design. A particular emphasis is put on the possibility for the user to intervene in the optimization process by supplying information about his personal taste. More precisely, we formulate three novel constraints on the geometry of shapes; while the first two are mainly related to aesthetics, the third one may also be used to handle several fabrication issues that are of special interest in the design of civil structures. The common mathematical ingredient to all three models is the signed distance function to a domain, and its sensitivity analysis with respect to perturbations of this domain; in the present work, this material is extended to the case where the ambient space is equipped with an anisotropic metric tensor. Numerical examples are discussed in two and three space dimensions.
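
    A small sketch of the common mathematical ingredient mentioned above, the signed distance function, evaluated on a pixel grid with SciPy's Euclidean distance transform; the disc-shaped domain and the isotropic (Euclidean) metric are simplifications of the anisotropic setting treated in the paper.

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      # Binary description of a shape on a grid: True inside the domain.
      n = 200
      y, x = np.mgrid[0:n, 0:n]
      inside = (x - 100) ** 2 + (y - 100) ** 2 < 60 ** 2   # a disc of radius 60

      # Signed distance: negative inside the shape, positive outside.
      dist_outside = distance_transform_edt(~inside)       # distance to the shape
      dist_inside = distance_transform_edt(inside)         # distance to the exterior
      sdf = dist_outside - dist_inside

      print("value at the centre:", sdf[100, 100])         # about -60 (deep inside)
      print("value at a corner  :", sdf[0, 0])             # positive, far outside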

  2. A hybrid-algorithm-based parallel computing framework for optimal reservoir operation

    NASA Astrophysics Data System (ADS)

    Li, X.; Wei, J.; Li, T.; Wang, G.

    2012-12-01

    To date, various optimization models have been developed to offer optimal operating policies for reservoirs. Each optimization model has its own merits and limitations, and no general algorithm exists even today. At times, several optimization models have to be combined to obtain the desired results. In this paper, we present a parallel computing framework that combines various optimization models in a different way than traditional serial computing. This framework consists of three functional processor types: the master processor, the slave processors, and the transfer processor. The master processor holds the full computation scheme and allocates optimization models to slave processors; the slave processors perform the allocated optimization models; and the transfer processor is in charge of solution communication among all slave processors. On this basis, the proposed framework can perform various optimization models in parallel. Because of the solution communication, the framework can also integrate the merits of the involved optimization models during iteration, and the performance of each optimization model can therefore be improved. Moreover, the framework can effectively improve the solution quality and increase the solution speed by making full use of the computing power of parallel computers.
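
    A toy sketch of the master/slave pattern described above, using Python's multiprocessing in place of a real parallel machine: three different optimizers run as parallel slaves for one round, and the master circulates the best solution found as the starting point of the next round. The test function stands in for a reservoir-operation model, and the optimizer choices are arbitrary.

      import numpy as np
      from multiprocessing import Pool
      from scipy.optimize import minimize, differential_evolution

      def objective(x):                     # stand-in for a reservoir operation model
          return np.sum((x - 1.5) ** 2) + np.sin(5 * x).sum()

      def run_slave(args):
          name, x0 = args
          if name == "nelder-mead":
              res = minimize(objective, x0, method="Nelder-Mead")
          elif name == "powell":
              res = minimize(objective, x0, method="Powell")
          else:                             # a global optimizer as the third "model"
              res = differential_evolution(objective, [(-5, 5)] * len(x0),
                                           seed=0, maxiter=50)
          return name, res.x, res.fun

      if __name__ == "__main__":
          best_x = np.zeros(3)              # initial shared solution
          with Pool(processes=3) as pool:
              for round_ in range(3):       # master loop with solution exchange
                  jobs = [("nelder-mead", best_x), ("powell", best_x), ("de", best_x)]
                  results = pool.map(run_slave, jobs)
                  name, best_x, best_f = min(results, key=lambda r: r[2])
                  print(f"round {round_}: best from {name}, f = {best_f:.4f}")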

  3. Project Assessment Framework through Design (PAFTD) - A Project Assessment Framework in Support of Strategic Decision Making

    NASA Technical Reports Server (NTRS)

    Depenbrock, Brett T.; Balint, Tibor S.; Sheehy, Jeffrey A.

    2014-01-01

    Research and development organizations that push the innovation edge of technology frequently encounter challenges when attempting to identify an investment strategy and to accurately forecast the cost and schedule performance of selected projects. Fast-moving and complex environments require managers to quickly analyze and diagnose the value of returns on investment versus allocated resources. Our Project Assessment Framework through Design (PAFTD) tool facilitates decision making for NASA senior leadership to enable more strategic and consistent technology development investment analysis, beginning at implementation and continuing through the project life cycle. The framework takes an integrated approach by leveraging the design principles of usability, feasibility, and viability and aligning them with methods employed by NASA's Independent Program Assessment Office for project performance assessment. The need exists to periodically revisit the justification and prioritization of technology development investments as changes occur over project life cycles. The framework informs management rapidly and comprehensively about diagnosed internal and external root causes of project performance.

  4. A framework for designing and analyzing binary decision-making strategies in cellular systems†

    PubMed Central

    Porter, Joshua R.; Andrews, Burton W.; Iglesias, Pablo A.

    2015-01-01

    Cells make many binary (all-or-nothing) decisions based on noisy signals gathered from their environment and processed through noisy decision-making pathways. Reducing the effect of noise to improve the fidelity of decision-making comes at the expense of increased complexity, creating a tradeoff between performance and metabolic cost. We present a framework based on rate distortion theory, a branch of information theory, to quantify this tradeoff and design binary decision-making strategies that balance low cost and accuracy in optimal ways. With this framework, we show that several observed behaviors of binary decision-making systems, including random strategies, hysteresis, and irreversibility, are optimal in an information-theoretic sense for various situations. This framework can also be used to quantify the goals around which a decision-making system is optimized and to evaluate the optimality of cellular decision-making systems by a fundamental information-theoretic criterion. As proof of concept, we use the framework to quantify the goals of the externally triggered apoptosis pathway. PMID:22370552

  5. Machine Learning Techniques in Optimal Design

    NASA Technical Reports Server (NTRS)

    Cerbone, Giuseppe

    1992-01-01

    Many important applications can be formalized as constrained optimization tasks. For example, we are studying the engineering domain of two-dimensional (2-D) structural design. In this task, the goal is to design a structure of minimum weight that bears a set of loads. A solution to a design problem in which there is a single load (L) and two stationary support points (S1 and S2) is discussed; it consists of four members, E1, E2, E3, and E4, that connect the load to the support points. In principle, optimal solutions to problems of this kind can be found by numerical optimization techniques. However, in practice [Vanderplaats, 1984] these methods are slow and they can produce different local solutions whose quality (ratio to the global optimum) varies with the choice of starting points. Hence, their applicability to real-world problems is severely restricted. To overcome these limitations, we propose to augment numerical optimization by first performing a symbolic compilation stage to produce: (a) objective functions that are faster to evaluate and that depend less on the choice of the starting point and (b) selection rules that associate problem instances with a set of recommended solutions. These goals are accomplished by successive specializations of the problem class and of the associated objective functions. In the end, this process reduces the problem to a collection of independent functions that are fast to evaluate, that can be differentiated symbolically, and that represent smaller regions of the overall search space. However, the specialization process can produce a large number of sub-problems. This is overcome by inductively deriving selection rules which associate problems with small sets of specialized independent sub-problems. Each set of candidate solutions is chosen to minimize a cost function which expresses the tradeoff between the quality of the solution that can be obtained from the sub-problem and the time it takes to produce it. The overall solution

  6. An optimal stratified Simon two-stage design.

    PubMed

    Parashar, Deepak; Bowden, Jack; Starr, Colin; Wernisch, Lorenz; Mander, Adrian

    2016-07-01

    In Phase II oncology trials, therapies are increasingly being evaluated for their effectiveness in specific populations of interest. Such targeted trials require designs that allow for stratification based on the participants' molecular characterisation. A targeted design proposed by Jones and Holmgren (JH) (Jones CL, Holmgren E: 'An adaptive Simon two-stage design for phase 2 studies of targeted therapies', Contemporary Clinical Trials 28 (2007) 654-661) determines whether a drug only has activity in a disease sub-population or in the wider disease population. Their adaptive design uses results from a single interim analysis to decide whether to enrich the study population with a subgroup or not; it is based on two parallel Simon two-stage designs. We study the JH design in detail and extend it by providing a few alternative ways to control the familywise error rate, in the weak sense as well as the strong sense. We also introduce a novel optimal design by minimising the expected sample size. Our extended design contributes to the much needed framework for conducting Phase II trials in stratified medicine. © 2016 The Authors Pharmaceutical Statistics Published by John Wiley & Sons Ltd.

  7. An Airway Network Flow Assignment Approach Based on an Efficient Multiobjective Optimization Framework

    PubMed Central

    Zhang, Xuejun; Lei, Jiaxing

    2015-01-01

    Considering reducing the airspace congestion and the flight delay simultaneously, this paper formulates the airway network flow assignment (ANFA) problem as a multiobjective optimization model and presents a new multiobjective optimization framework to solve it. Firstly, an effective multi-island parallel evolution algorithm with multiple evolution populations is employed to improve the optimization capability. Secondly, the nondominated sorting genetic algorithm II is applied for each population. In addition, a cooperative coevolution algorithm is adapted to divide the ANFA problem into several low-dimensional biobjective optimization problems which are easier to deal with. Finally, in order to maintain the diversity of solutions and to avoid prematurity, a dynamic adjustment operator based on solution congestion degree is specifically designed for the ANFA problem. Simulation results using the real traffic data from China air route network and daily flight plans demonstrate that the proposed approach can improve the solution quality effectively, showing superiority to the existing approaches such as the multiobjective genetic algorithm, the well-known multiobjective evolutionary algorithm based on decomposition, and a cooperative coevolution multiobjective algorithm as well as other parallel evolution algorithms with different migration topology. PMID:26180840
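
    As a small, self-contained sketch of the Pareto machinery behind the nondominated sorting step (illustrative data, not the ANFA model), the nondominated set of a population of two-objective cost vectors, e.g. (airspace congestion, total flight delay), can be extracted as follows.

    ```python
    import numpy as np

    def pareto_front(costs):
        """Boolean mask of nondominated rows for an (n_points, n_objectives) array
        of minimization objectives."""
        n = costs.shape[0]
        nondominated = np.ones(n, dtype=bool)
        for i in range(n):
            if not nondominated[i]:
                continue
            # Point j dominates i if j is <= in every objective and < in at least one.
            dominated = np.any(
                np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
            )
            if dominated:
                nondominated[i] = False
        return nondominated

    # Illustrative population: columns = (airspace congestion, total flight delay).
    population = np.array([[3.0, 9.0], [4.0, 4.0], [7.0, 2.0], [5.0, 5.0], [8.0, 8.0]])
    print(population[pareto_front(population)])
    ```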

  8. Optimal design of biaxial tensile cruciform specimens

    NASA Astrophysics Data System (ADS)

    Demmerle, S.; Boehler, J. P.

    1993-01-01

    For experimental investigations concerning the mechanical behaviour under biaxial stress states of rolled sheet metals, mostly cruciform flat specimens are used. By means of empirical methods, different specimen geometries have been proposed in the literature. In order to evaluate the suitability of a specimen design, a mathematically well defined criterion is developed, based on the standard deviations of the values of the stresses in the test section. Applied to the finite element method, the criterion is employed to realize the shape optimization of biaxial cruciform specimens for isotropic elastic materials. Furthermore, the performance of the obtained optimized specimen design is investigated in the case of off-axes tests on anisotropic materials. Therefore, for the first time, an original testing device, consisting of hinged fixtures with knife edges at each arm of the specimen, is applied to the biaxial test. The obtained results indicate the decisive superiority of the optimized specimens for the proper performance on isotropic materials, as well as the paramount importance of the proposed off-axes testing technique for biaxial tests on anisotropic materials.

  9. Topology optimization design of a space mirror

    NASA Astrophysics Data System (ADS)

    Liu, Jiazhen; Jiang, Bo

    2015-11-01

    As key components of the optical system of a space optical remote sensor, space mirrors have a surface accuracy that directly affects the imaging quality of the remote sensor and cannot be ignored. Large-diameter mirrors will be an important trend in the development of space optical technology. However, a sharp increase in mirror diameter increases the deformation of the mirror as well as the thermal deformation caused by temperature variations, so a reasonable lightweight structure is required to ensure that the optical performance of the system meets the requirements. As a new lightweighting approach, topology optimization is an important direction in current space optical remote sensing research. The lightweight design of a rectangular mirror was studied using the variable density method of topology optimization. The surface figure accuracy of the mirror assembly was obtained under different conditions: the PV value was less than λ/10 and the RMS value was less than λ/50 (λ = 632.8 nm). The results show that the mirror assembly achieves sufficiently high static rigidity, dynamic stiffness and thermal stability, and has sufficient resistance to external environmental disturbances. Key words: topology optimization, space mirror, lightweight, space optical remote sensor

  10. A Multiobjective Optimization Framework for Stochastic Control of Complex Systems

    SciTech Connect

    Malikopoulos, Andreas; Maroulas, Vasileios; Xiong, Professor Jie

    2015-01-01

    This paper addresses the problem of minimizing the long-run expected average cost of a complex system consisting of subsystems that interact with each other and the environment. We treat the stochastic control problem as a multiobjective optimization problem of the one-stage expected costs of the subsystems, and we show that the control policy yielding the Pareto optimal solution is an optimal control policy that minimizes the average cost criterion for the entire system. For practical situations with constraints consistent with those we study here, our results imply that the Pareto control policy may be of value in deriving online an optimal control policy in complex systems.

  11. Design search and optimization in aerospace engineering.

    PubMed

    Keane, A J; Scanlan, J P

    2007-10-15

    In this paper, we take a design-led perspective on the use of computational tools in the aerospace sector. We briefly review the current state-of-the-art in design search and optimization (DSO) as applied to problems from aerospace engineering, focusing on those problems that make heavy use of computational fluid dynamics (CFD). This ranges over issues of representation, optimization problem formulation and computational modelling. We then follow this with a multi-objective, multi-disciplinary example of DSO applied to civil aircraft wing design, an area where this kind of approach is becoming essential for companies to maintain their competitive edge. Our example considers the structure and weight of a transonic civil transport wing, its aerodynamic performance at cruise speed and its manufacturing costs. The goals are low drag and cost while holding weight and structural performance at acceptable levels. The constraints and performance metrics are modelled by a linked series of analysis codes, the most expensive of which is a CFD analysis of the aerodynamics using an Euler code with coupled boundary layer model. Structural strength and weight are assessed using semi-empirical schemes based on typical airframe company practice. Costing is carried out using a newly developed generative approach based on a hierarchical decomposition of the key structural elements of a typical machined and bolted wing-box assembly. To carry out the DSO process in the face of multiple competing goals, a recently developed multi-objective probability of improvement formulation is invoked along with stochastic process response surface models (Krigs). This approach both mitigates the significant run times involved in CFD computation and also provides an elegant way of balancing competing goals while still allowing the deployment of the whole range of single objective optimizers commonly available to design teams.
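
    The multi-objective probability-of-improvement formulation mentioned above builds on the single-objective form PI(x) = Φ((f_best − μ(x))/σ(x)), with μ and σ supplied by the surrogate. A rough sketch of one update step, using scikit-learn's Gaussian-process regressor as an assumed stand-in for the Kriging response surfaces and a cheap analytic function in place of the expensive CFD-based objective:

    ```python
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_objective(x):
        """Cheap 1-D stand-in for a costly CFD-based objective."""
        return np.sin(3 * x) + 0.5 * x

    X = np.array([[0.2], [1.0], [1.8], [2.6]])          # designs evaluated so far
    y = expensive_objective(X).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6).fit(X, y)

    X_cand = np.linspace(0.0, 3.0, 301).reshape(-1, 1)
    mu, sigma = gp.predict(X_cand, return_std=True)
    f_best = y.min()
    pi = norm.cdf((f_best - mu) / np.maximum(sigma, 1e-12))  # probability of improvement

    print("next design to evaluate:", X_cand[np.argmax(pi)])
    ```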

  12. Optimal interferometer designs for optical coherence tomography.

    PubMed

    Rollins, A M; Izatt, J A

    1999-11-01

    We introduce a family of power-conserving fiber-optic interferometer designs for low-coherence reflectometry that use optical circulators, unbalanced couplers, and (or) balanced heterodyne detection. Simple design equations for optimization of the signal-to-noise ratio of the interferometers are expressed in terms of relevant signal and noise sources and measurable system parameters. We use the equations to evaluate the expected performance of the new configurations compared with that of the standard Michelson interferometer that is commonly used in optical coherence tomography (OCT) systems. The analysis indicates that improved sensitivity is expected for all the new interferometer designs, compared with the sensitivity of the standard OCT interferometer, under high-speed imaging conditions.

  13. Sustainable Supply Chain Design by the P-Graph Framework

    EPA Science Inventory

    The present work proposes a computer-aided methodology for designing sustainable supply chains in terms of sustainability metrics by resorting to the P-graph framework. The methodology is an outcome of the collaboration between the Office of Research and Development (ORD) of the ...

  15. A Proposed Conceptual Framework for Curriculum Design in Physical Fitness.

    ERIC Educational Resources Information Center

    Miller, Peter V.; Beauchamp, Larry S.

    A physical fitness curriculum, designed to provide cumulative benefits in a sequential pattern, is based upon a framework of a conceptual structure. The curriculum's ultimate goal is the achievement of greater physiological efficiency through a holistic approach that would strengthen circulatory-respiratory, mechanical, and neuro-muscular…

  16. A Review of Literacy Frameworks for Learning Environments Design

    ERIC Educational Resources Information Center

    Rebmann, Kristen Radsliff

    2013-01-01

    This article charts the development of three literacy research frameworks: multiliteracies, new literacies, and popular literacies. By reviewing the literature surrounding three current conceptions of literacy, an attempt is made to form an integrative grouping that captures the most relevant elements of each for learning environments design.…

  17. TARDIS: An Automation Framework for JPL Mission Design and Navigation

    NASA Technical Reports Server (NTRS)

    Roundhill, Ian M.; Kelly, Richard M.

    2014-01-01

    Mission Design and Navigation at the Jet Propulsion Laboratory has implemented an automation framework tool to assist in orbit determination and maneuver design analysis. This paper describes the lessons learned from previous automation tools and how they have been implemented in this tool. In addition this tool has revealed challenges in software implementation, testing, and user education. This paper describes some of these challenges and invites others to share their experiences.

  19. Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite

    NASA Astrophysics Data System (ADS)

    Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin

    2017-09-01

    State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which significantly differ from the traditional all-chemical ones in orbit-raising, station-keeping, radiation damage protection, and power budget, etc. The design optimization task of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem faces significant challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Then considerable effort is spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of our proposed framework to cope with all-electric GEO satellite system design optimization problems. This proposed surrogate assisted MDO framework can also provide valuable references for other all
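
    A stripped-down sketch of the adaptive response-surface loop described above: fit a quadratic surrogate to the designs evaluated so far, optimize the surrogate, evaluate the true model at the surrogate optimum, and refit. The two-variable "expensive MDA" function, the bounds, and the iteration count are placeholders, not the satellite discipline models.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def expensive_mda(x):
        """Placeholder for the costly multidisciplinary analysis (e.g. total satellite mass)."""
        return (x[0] - 1.2)**2 + 2.0 * (x[1] + 0.5)**2 + 0.3 * x[0] * x[1]

    def quad_features(x):
        x1, x2 = x
        return np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

    bounds = [(-3.0, 3.0), (-3.0, 3.0)]
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(8, 2))              # initial design of experiments
    y = np.array([expensive_mda(x) for x in X])

    for it in range(6):
        A = np.array([quad_features(x) for x in X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # fit quadratic response surface
        surrogate = lambda x, c=coef: quad_features(x) @ c
        res = minimize(surrogate, x0=X[np.argmin(y)], bounds=bounds)
        X = np.vstack([X, res.x])                        # refine the surrogate near its optimum
        y = np.append(y, expensive_mda(res.x))
        print(f"iteration {it}: best true value so far = {y.min():.4f}")
    ```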

  20. Multidisciplinary design optimization for sonic boom mitigation

    NASA Astrophysics Data System (ADS)

    Ozcer, Isik A.

    product design. The simulation tools are used to optimize three geometries for sonic boom mitigation. The first is a simple axisymmetric shape to be used as a generic nose component, the second is a delta wing with lift, and the third is a real aircraft with nose and wing optimization. The objectives are to minimize the pressure impulse or the peak pressure in the sonic boom signal, while keeping the drag penalty under feasible limits. The design parameters for the meridian profile of the nose shape are the lengths and the half-cone angles of the linear segments that make up the profile. The design parameters for the lifting wing are the dihedral angle, angle of attack, non-linear span-wise twist and camber distribution. The test-bed aircraft is the modified F-5E aircraft built by Northrop Grumman, designated the Shaped Sonic Boom Demonstrator. This aircraft is fitted with an optimized axisymmetric nose, and the wings are optimized to demonstrate optimization for sonic boom mitigation for a real aircraft. The final results predict 42% reduction in bow shock strength, 17% reduction in peak Δp, 22% reduction in pressure impulse, 10% reduction in foot print size, 24% reduction in inviscid drag, and no loss in lift for the optimized aircraft. Optimization is carried out using response surface methodology, and the design matrices are determined using standard DoE techniques for quadratic response modeling.

  1. Simulation-based design for robotic care device: Optimizing trajectory of transfer support robot.

    PubMed

    Imamura, Yumeko; Ayusawa, Ko; Endo, Yui; Yoshida, Eiichi

    2017-07-01

    This paper presents a framework of simulation-based design for robotic care devices developed to reduce the burden of caregiver and care receivers. First, physical interaction between the user and device is quantitatively estimated by using a digital human simulator. Then we introduce a method for optimizing the design parameters according to given evaluation criteria. An example of trajectory optimization of transfer support robot is provided to demonstrate the effectiveness of the proposed method.

  2. An Active RBSE Framework to Generate Optimal Stimulus Sequences in a BCI for Spelling

    NASA Astrophysics Data System (ADS)

    Moghadamfalahi, Mohammad; Akcakaya, Murat; Nezamfar, Hooman; Sourati, Jamshid; Erdogmus, Deniz

    2017-10-01

    A class of brain computer interfaces (BCIs) employs noninvasive recordings of electroencephalography (EEG) signals to enable users with severe speech and motor impairments to interact with their environment and social network. For example, EEG based BCIs for typing popularly utilize event related potentials (ERPs) for inference. Presentation paradigm design in current ERP-based letter-by-letter typing BCIs typically queries the user with an arbitrary subset of characters. However, both typing accuracy and typing speed can potentially be enhanced with more informed subset selection and flash assignment. In this manuscript, we introduce the active recursive Bayesian state estimation (active-RBSE) framework for inference and sequence optimization. Prior to presentation in each iteration, rather than showing a subset of randomly selected characters, the developed framework optimally selects a subset based on a query function. Selected queries are made adaptively specialized for users during each intent detection. Through a simulation-based study, we assess the effect of active-RBSE on the performance of a language-model assisted typing BCI in terms of typing speed and accuracy. To provide a baseline for comparison, we also utilize standard presentation paradigms, namely the row and column matrix presentation paradigm and random rapid serial visual presentation paradigms. The results show that utilization of active-RBSE can enhance the online performance of the system, both in terms of typing accuracy and speed.

  3. Designing optimal greenhouse gas monitoring networks for Australia

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Law, R. M.; Rayner, P. J.; Roff, G.

    2016-01-01

    Atmospheric transport inversion is commonly used to infer greenhouse gas (GHG) flux estimates from concentration measurements. The optimal location of ground-based observing stations that supply these measurements can be determined by network design. Here, we use a Lagrangian particle dispersion model (LPDM) in reverse mode together with a Bayesian inverse modelling framework to derive optimal GHG observing networks for Australia. This extends the network design for carbon dioxide (CO2) performed by Ziehn et al. (2014) to also minimise the uncertainty on the flux estimates for methane (CH4) and nitrous oxide (N2O), both individually and in a combined network using multiple objectives. Optimal networks are generated by adding up to five new stations to the base network, which is defined as two existing stations, Cape Grim and Gunn Point, in southern and northern Australia respectively. The individual networks for CO2, CH4 and N2O and the combined observing network show large similarities because the flux uncertainties for each GHG are dominated by regions of biologically productive land. There is little penalty, in terms of flux uncertainty reduction, for the combined network compared to individually designed networks. The location of the stations in the combined network is sensitive to variations in the assumed data uncertainty across locations. A simple assessment of economic costs has been included in our network design approach, considering both establishment and maintenance costs. Our results suggest that, while site logistics change the optimal network, there is only a small impact on the flux uncertainty reductions achieved with increasing network size.
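
    In a linear-Gaussian setting the posterior flux covariance is P = (P0^-1 + H^T R^-1 H)^-1, so candidate stations can be ranked greedily by how much they shrink the total flux uncertainty. A toy sketch with a random sensitivity matrix standing in for the LPDM footprints; all sizes and values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_regions, n_candidates = 10, 12
    H_all = rng.normal(size=(n_candidates, n_regions))   # stand-in for LPDM footprints
    P0 = np.eye(n_regions)                               # prior flux covariance
    r = 0.25                                             # observation error variance

    def posterior_trace(rows):
        """Total posterior flux uncertainty for a set of selected stations."""
        H = H_all[rows]
        P = np.linalg.inv(np.linalg.inv(P0) + H.T @ H / r)
        return np.trace(P)

    selected, remaining = [], list(range(n_candidates))
    for _ in range(5):                                   # add up to five new stations
        best = min(remaining, key=lambda j: posterior_trace(selected + [j]))
        selected.append(best)
        remaining.remove(best)
        print(f"add station {best}: total flux uncertainty = {posterior_trace(selected):.3f}")
    ```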

  4. ODIN: Optimal design integration system. [reusable launch vehicle design

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hague, D. S.

    1975-01-01

    The report provides a summary of the Optimal Design Integration (ODIN) System as it exists at Langley Research Center. A discussion of the ODIN System, the executive program and the data base concepts are presented. Two examples illustrate the capabilities of the system which have been exploited. Appended to the report are a summary of abstracts for the ODIN library programs and a description of the use of the executive program in linking the library programs.

  5. Database Design for Structural Analysis and Design Optimization.

    DTIC Science & Technology

    1984-10-01

    C.C. Wu, Applied-Optimal Design Laboratory ... Relational algebraic operations such as PROJECT, JOIN, and SELECT can be used to form new relations. Figure 2.1 shows a typical relational model of data ... The data set contains the definition of mathematical 3-D surfaces of up to third order to which lines and grids may be projected. The surfaces are defined in

  6. Design optimization of an airbreathing aerospaceplane

    NASA Astrophysics Data System (ADS)

    Janovsky, R.; Staufenbiel, R.

    A new species of advanced space transportation vehicles, called aerospaceplanes, is under discussion, and challenging technology programs are under way. These vehicles rely on airbreathing propulsion, which, compared to rocket propulsion, provides much higher overall efficiency. Aerospaceplanes will be able to take off and land horizontally and are partly or fully reusable. For this new kind of transportation system, the application of new design principles is necessary. Complex airbreathing propulsion systems have to be integrated into the airframe. Compared to conventional rocket systems, a spaceplane has to withstand higher aerodynamic and thermodynamic loads during the longer flight in a denser atmosphere. Therefore, the empty mass fraction of an aerospaceplane will be higher than that of a rocket system and will partly compensate for the higher propulsion efficiency. In this paper, the influence of the key design parameters on the transport efficiency will be determined for the first stage of a two-stage-to-orbit spaceplane with an airbreathing first stage. The configuration of the first stage is a special lifting body, called 'ELAC I'. As a measure of transport efficiency, the ratio of the vehicle take-off mass to payload mass, the so-called Growth Factor, is chosen. Furthermore, the design of the first stage is optimized by using a multidimensional optimization procedure to obtain a minimum Growth Factor for various size ratios between the second and first stages.

  7. A generalized slab-wise framework for parallel transmit multiband RF pulse design

    PubMed Central

    Wu, Xiaoping; Schmitter, Sebastian; Auerbach, Edward J.; Uğurbil, Kâmil; de Moortele, Pierre-François Van

    2015-01-01

    Purpose We propose a new slab-wise framework to design parallel transmit multi-band pulses for volumetric simultaneous multi-slice imaging with a large field of view along the slice direction (FOVs). Theory and Methods The slab-wise framework divides FOVs into a few contiguous slabs and optimizes pulses for each slab. Effects of relevant design parameters including slab number and transmit B1 (B1+) mapping slice placement were investigated for human brain imaging by designing pulses with global or local SAR control based on electromagnetic simulations of a 7T head RF array. Pulse design using in-vivo B1+ maps was demonstrated and evaluated with Bloch simulations. Results RF performance with respect to SAR reduction or B1+ homogenization across the entire human brain improved with increasing slabs; however, this improvement was non-linear and leveled off at ~12 slabs when the slab thickness reduced to ~12 mm. The impact of using different slice placements for B1+ mapping was small. Conclusion Compared to slice-wise approaches where each of the many imaging slices requires both B1+ mapping and pulse optimization, the proposed slab-wise design framework is shown to attain comparable RF performance while drastically reducing the number of required pulses; therefore, it can be used to increase time efficiency for B1+ mapping, pulse calculation and sequence preparation. PMID:25994797

  8. Designing an Advanced Instructional Design Advisor: Conceptual Frameworks. Volume 5

    DTIC Science & Technology

    1991-12-01

    ... Directorate and is aimed at producing automated instructional design guidance for developers of computer-based instructional materials. The process of producing effective computer-based instructional materials is complex and time-consuming. Few experts exist to insure the effectiveness of the process

  9. Design, optimization, and control of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Masic, Milenko

    The contributions of this dissertation may be divided into four categories. The first category involves developing a systematic form-finding method for general and symmetric tensegrity structures. As an extension of the available results, different shape constraints are incorporated in the problem. Methods for treatment of these constraints are considered and proposed. A systematic formulation of the form-finding problem for symmetric tensegrity structures is introduced, and it uses the symmetry to reduce both the number of equations and the number of variables in the problem. The equilibrium analysis of modular tensegrities exploits their peculiar symmetry. The tensegrity similarity transformation completes the contributions in the area of enabling tools for tensegrity form-finding. The second group of contributions develops the methods for optimal mass-to-stiffness-ratio design of tensegrity structures. This technique represents the state-of-the-art for the static design of tensegrity structures. It is an extension of the results available for the topology optimization of truss structures. Besides guaranteeing that the final design satisfies the tensegrity paradigm, the problem constrains the structure from different modes of failure, which makes it very general. The open-loop control of the shape of modular tensegrities is the third contribution of the dissertation. This analytical result offers a closed form solution for the control of the reconfiguration of modular structures. Applications range from the deployment and stowing of large-scale space structures to the locomotion-inducing control for biologically inspired structures. The control algorithm is applicable regardless of the size of the structures, and it represents a very general result for a large class of tensegrities. Controlled deployments of large-scale tensegrity plates and tensegrity towers are shown as examples that demonstrate the full potential of this reconfiguration strategy. The last

  10. Computer optimization of landfill-cover design

    SciTech Connect

    Massmann, J.W.; Moore, C.A.

    1982-12-01

    A finite difference computer program to aid in optimizing landfill-cover design was developed. The program was used to compare the methane yield from sand-covered and clay-covered landfills equipped with methane-recovery systems. The results of this comparison indicate a clay cover can restrict air inflow into the landfill system, thus preventing oxygen poisoning of the methane-producing organisms. The practice of monitoring methane-to-air ratios in the pipelines of the recovery system in order to warn of oxygen infiltration into the fill material was shown to be ineffective in some situations. More-reliable methods to forewarn of oxygen poisoning are suggested.

  11. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS) and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user defined constitutive model (UMAT) for fabric based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models. Then the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, even though a deterministic optimization problem may yield a cost-effective structure, the design becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered. This part of the research starts with an introduction to reliability analysis such as first order reliability analysis and second order reliability analysis, followed by simulation technique that
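
    A minimal Monte Carlo sketch of the first step described above (all distributions and numbers are illustrative, not the Kevlar 49 data): sample the uncertain strength and load, evaluate a simple limit-state function, and estimate the failure probability that an RBDO formulation would then constrain.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 100_000

    # Illustrative random inputs: material strength and applied load (both lognormal).
    strength = rng.lognormal(mean=np.log(500.0), sigma=0.08, size=n_samples)  # MPa
    load = rng.lognormal(mean=np.log(350.0), sigma=0.15, size=n_samples)      # MPa

    # Limit-state function g = strength - load; failure when g < 0.
    p_fail = np.mean(strength - load < 0.0)
    print(f"estimated failure probability: {p_fail:.4e}")

    # An RBDO formulation would then impose a reliability constraint such as
    # p_fail <= 1e-3 alongside the usual deterministic performance constraints.
    ```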

  12. A Cost Comparison Framework for Use in Optimizing Ground Water Pump and Treat Systems

    EPA Pesticide Factsheets

    This fact sheet has been prepared to provide a framework for conducting cost comparisons to evaluate whether or not to pursue potential opportunities from an optimization evaluation for improving, replacing, or supplementing the P&T system.

  13. A framework for the design of ambulance sirens.

    PubMed

    Catchpole, K; McKeown, D

    2007-08-01

    Ambulance sirens are essential for assisting the safe and rapid arrival of an ambulance at the scene of an emergency. In this study, the parameters upon which sirens may be designed were examined and a framework for emergency vehicle siren design was proposed. Validity for the framework was supported through acoustic measurements and the evaluation of ambulance transit times over 240 emergency runs using two different siren systems. Modifying existing siren sounds to add high frequency content would improve vehicle penetration, detectability and sound localization cues, and mounting the siren behind the radiator grill, rather than on the light bar or under the wheel arch, would provide less unwanted noise while maintaining or improving the effective distance in front of the vehicle. Ultimately, these considerations will benefit any new attempt to design auditory warnings for the emergency services.

  14. Inter occasion variability in individual optimal design.

    PubMed

    Kristoffersson, Anders N; Friberg, Lena E; Nyberg, Joakim

    2015-12-01

    Inter occasion variability (IOV) is of importance to consider in the development of a design where individual pharmacokinetic or pharmacodynamic parameters are of interest. IOV may adversely affect the precision of maximum a posteriori (MAP) estimated individual parameters, yet the influence of inclusion of IOV in optimal design for estimation of individual parameters has not been investigated. In this work two methods of including IOV in the maximum a posteriori Fisher information matrix (FIMMAP) are evaluated: (i) MAPocc - the IOV is included as a fixed effect deviation per occasion and individual, and (ii) POPocc - the IOV is included as an occasion random effect. Sparse sampling schedules were designed for two test models and compared to a scenario where IOV is ignored, either by omitting known IOV (Omit) or by mimicking a situation where unknown IOV has inflated the IIV (Inflate). Accounting for IOV in the FIMMAP markedly affected the designs compared to ignoring IOV and, as evaluated by stochastic simulation and estimation, resulted in superior precision in the individual parameters. In addition MAPocc and POPocc accurately predicted precision and shrinkage. For the investigated designs, the MAPocc method was on average slightly superior to POPocc and was less computationally intensive.

  15. A method for nonlinear optimization with discrete design variables

    NASA Technical Reports Server (NTRS)

    Olsen, Gregory R.; Vanderplaats, Garret N.

    1987-01-01

    A numerical method is presented for the solution of nonlinear discrete optimization problems. The applicability of discrete optimization to engineering design is discussed, and several standard structural optimization problems are solved using discrete design variables. The method uses approximation techniques to create subproblems suitable for linear mixed-integer programming methods. The method employs existing software for continuous optimization and integer programming.

  17. Computational design optimization for microfluidic magnetophoresis

    PubMed Central

    Plouffe, Brian D.; Lewis, Laura H.; Murthy, Shashi K.

    2011-01-01

    Current macro- and microfluidic approaches for the isolation of mammalian cells are limited in both efficiency and purity. In order to design a robust platform for the enumeration of a target cell population, high collection efficiencies are required. Additionally, the ability to isolate pure populations with minimal biological perturbation and efficient off-chip recovery will enable subcellular analyses of these cells for applications in personalized medicine. Here, a rational design approach for a simple and efficient device that isolates target cell populations via magnetic tagging is presented. In this work, two magnetophoretic microfluidic device designs are described, with optimized dimensions and operating conditions determined from a force balance equation that considers two dominant and opposing driving forces exerted on a magnetic-particle-tagged cell, namely, magnetic and viscous drag. Quantitative design criteria for an electromagnetic field displacement-based approach are presented, wherein target cells labeled with commercial magnetic microparticles flowing in a central sample stream are shifted laterally into a collection stream. Furthermore, the final device design is constrained to fit on a standard rectangular glass coverslip (60 (L)×24 (W)×0.15 (H) mm³) to accommodate small sample volume and point-of-care design considerations. The anticipated performance of the device is examined via a parametric analysis of several key variables within the model. It is observed that minimal currents (<500 mA) are required to generate magnetic fields sufficient to separate cells from the sample streams flowing at rates as high as 7 ml/h, comparable to the performance of current state-of-the-art magnet-activated cell sorting systems currently used in clinical settings. Experimental validation of the presented model illustrates that a device designed according to the derived rational optimization can effectively isolate (∼100%) a magnetic-particle-tagged cell
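
    The force balance named above can be sketched directly: equating the magnetic force on a tagged cell to Stokes drag, F_mag = 6πηr·v_lat, gives the lateral migration velocity and hence the channel length needed for a given lateral shift at a given axial flow speed. All parameter values below are assumed for illustration, not taken from the paper.

    ```python
    import math

    # Illustrative parameters (not the paper's values).
    F_mag = 5e-12     # magnetic force on a tagged cell, N
    eta = 1e-3        # buffer viscosity, Pa*s
    r_cell = 5e-6     # hydrodynamic radius of cell plus beads, m
    u_axial = 5e-3    # axial flow speed in the channel, m/s
    d_shift = 200e-6  # lateral shift needed to reach the collection stream, m

    # Stokes drag balance: F_mag = 6*pi*eta*r*v_lat.
    v_lat = F_mag / (6 * math.pi * eta * r_cell)
    t_needed = d_shift / v_lat        # residence time for the required shift
    L_channel = u_axial * t_needed    # channel length giving that residence time

    print(f"lateral velocity : {v_lat * 1e6:.1f} um/s")
    print(f"residence time   : {t_needed:.2f} s")
    print(f"channel length   : {L_channel * 1e3:.1f} mm")
    ```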

  18. Automated design of multiphase space missions using hybrid optimal control

    NASA Astrophysics Data System (ADS)

    Chilan, Christian Miguel

    A modern space mission is assembled from multiple phases or events such as impulsive maneuvers, coast arcs, thrust arcs and planetary flybys. Traditionally, a mission planner would resort to intuition and experience to develop a sequence of events for the multiphase mission and to find the space trajectory that minimizes propellant use by solving the associated continuous optimal control problem. This strategy, however, will most likely yield a sub-optimal solution, as the problem is sophisticated for several reasons. For example, the number of events in the optimal mission structure is not known a priori and the system equations of motion change depending on what event is current. In this work a framework for the automated design of multiphase space missions is presented using hybrid optimal control (HOC). The method developed uses two nested loops: an outer-loop that handles the discrete dynamics and finds the optimal mission structure in terms of the categorical variables, and an inner-loop that performs the optimization of the corresponding continuous-time dynamical system and obtains the required control history. Genetic algorithms (GA) and direct transcription with nonlinear programming (NLP) are introduced as methods of solution for the outer-loop and inner-loop problems, respectively. Automation of the inner-loop, continuous optimal control problem solver, required two new technologies. The first is a method for the automated construction of the NLP problems resulting from the use of a direct solver for systems with different structures, including different numbers of categorical events. The method assembles modules, consisting of parameters and constraints appropriate to each event, sequentially according to the given mission structure. The other new technology is for a robust initial guess generator required by the inner-loop NLP problem solver. Two new methods were developed for cases including low-thrust trajectories. The first method, based on GA
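
    A toy sketch of the two nested loops described above: an outer loop over candidate discrete event sequences (enumerated here for brevity, where the dissertation uses a genetic algorithm) and an inner continuous optimization of each sequence's parameters standing in for the NLP transcription. The per-event cost models are invented placeholders.

    ```python
    import itertools
    import numpy as np
    from scipy.optimize import minimize

    # Placeholder per-event cost as a function of one continuous parameter.
    EVENT_COST = {
        "impulse": lambda u: 1.0 + (u - 0.3)**2,
        "coast": lambda u: 0.1 + 0.05 * u**2,
        "thrust": lambda u: 0.8 + (u - 0.6)**2,
    }

    def inner_nlp(sequence):
        """Optimize the continuous parameters of a fixed event sequence."""
        def total_cost(u):
            return sum(EVENT_COST[e](ui) for e, ui in zip(sequence, u))
        res = minimize(total_cost, x0=np.zeros(len(sequence)), method="BFGS")
        return res.fun, res.x

    # Outer loop: enumerate short mission structures (a GA would search this space).
    best = None
    for seq in itertools.product(EVENT_COST, repeat=3):
        cost, _ = inner_nlp(seq)
        if best is None or cost < best[0]:
            best = (cost, seq)

    print("best mission structure:", best[1], "with cost", round(best[0], 4))
    ```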

  19. Study of a Fine Grained Threaded Framework Design

    NASA Astrophysics Data System (ADS)

    Jones, C. D.

    2012-12-01

    Traditionally, HEP experiments exploit the multiple cores in a CPU by having each core process one event. However, future PC designs are expected to use CPUs which double the number of processing cores at the same rate as the cost of memory falls by a factor of two. This effectively means the amount of memory per processing core will remain constant. This is a major challenge for LHC processing frameworks since the LHC is expected to deliver more complex events (e.g. greater pileup events) in the coming years while the LHC experiment's frameworks are already memory constrained. Therefore in the not so distant future we may need to be able to efficiently use multiple cores to process one event. In this presentation we will discuss a design for an HEP processing framework which can allow very fine grained parallelization within one event as well as supporting processing multiple events simultaneously while minimizing the memory footprint of the job. The design is built around the libdispatch framework created by Apple Inc. (a port for Linux is available) whose central concept is the use of task queues. This design also accommodates the reality that not all code will be thread safe and therefore allows one to easily mark modules or sub parts of modules as being thread unsafe. In addition, the design efficiently handles the requirement that events in one run must all be processed before starting to process events from a different run. After explaining the design we will provide measurements from simulating different processing scenarios where the processing times used for the simulation are drawn from processing times measured from actual CMS event processing.

  20. Global optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Arora, Jasbir S.

    1990-01-01

    The problem is to find a global minimum for the Problem P. Necessary and sufficient conditions are available for local optimality. However, a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, existence of a global minimum is guaranteed. However, in view of the fact that no global optimality conditions are available, a global solution can be found only by an exhaustive search to satisfy the inequality. The exhaustive search can be organized in such a way that the entire design space need not be searched for the solution. This way the computational burden is reduced somewhat. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods. More testing is needed; a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations, and since the feasible set keeps on shrinking, a good algorithm to find an initial feasible point is required. Such algorithms need to be developed and evaluated.

  1. Optimal patch code design via device characterization

    NASA Astrophysics Data System (ADS)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The tradeoff between decoding robustness and the number of available code levels is optimized in terms of printing and measurement effort and decoding robustness against noise from the printing and measurement devices. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.
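
    A toy sketch of the tradeoff described above: for K code levels placed on the measured lightness axis, decoding robustness can be summarized as the smallest adjacent gap between measured levels in units of device noise. The L* range, noise level, and uniform placement are assumptions standing in for the device characterization in the paper.

    ```python
    import numpy as np

    def decoding_margin(levels_measured, sigma_noise):
        """Robustness proxy: smallest adjacent gap, in multiples of the noise sigma."""
        gaps = np.diff(np.sort(levels_measured))
        return gaps.min() / sigma_noise

    # Assumed device characterization: measurable L* range and combined noise level.
    L_min, L_max = 20.0, 90.0
    sigma = 1.2  # combined print + measurement noise in L* units

    for k in (4, 6, 8, 12, 16):
        levels = np.linspace(L_min, L_max, k)  # uniform placement in measured space
        print(f"{k:2d} code levels -> minimum gap = {decoding_margin(levels, sigma):.1f} sigma")
    ```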

  2. Optimized design of electric power equipment

    SciTech Connect

    Anderson, O.W. )

    1991-01-01

    In the design of rotating electric machines and transformers, the objective is usually to minimize the cost, including the cost of losses, at the same time meeting performance requirements and other limitations. The minimization of the cost function is further complicated by the fact that the function is discontinuous. Dimensions must be rounded, numbers of turns and parallel strands must be integers, and in rotating machines, number of slots and parallel circuits can assume only certain values. This article describes a method, used since about 1965, for design optimization by computer. It has been applied to a large variety of transformers, turbine generators, hydroelectric generators, smaller salient pole synchronous machines, and squirrel-cage induction motors. The programs have been developed in close cooperation with industrial users. The method is surprisingly simple and efficient.

  3. An Optimization Framework for Dynamic, Distributed Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Eckert, Klaus; Juedes, David; Welch, Lonnie; Chelberg, David; Bruggerman, Carl; Drews, Frank; Fleeman, David; Parrott, David; Pfarr, Barbara

    2003-01-01

    This paper presents a model that is useful for developing resource allocation algorithms for distributed real-time systems that operate in dynamic environments. Interesting aspects of the model include dynamic environments, utility and service levels, which provide a means for graceful degradation in resource-constrained situations and support optimization of the allocation of resources. The paper also provides an allocation algorithm that illustrates how to use the model for producing feasible, optimal resource allocations.
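
    A minimal sketch of the service-level idea (illustrative numbers, not the paper's model): choosing one service level per task to maximize total utility under a resource budget is a multiple-choice knapsack, solvable here with a small dynamic program.

    ```python
    # Each task offers several (resource units, utility) service levels; pick one per task.
    tasks = [
        [(1, 2.0), (2, 3.5), (4, 5.0)],
        [(1, 1.0), (3, 4.0)],
        [(2, 2.5), (3, 3.0), (5, 6.0)],
    ]
    budget = 7

    NEG = float("-inf")
    dp = [0.0] + [NEG] * budget           # dp[r] = best utility using exactly r resource units
    for levels in tasks:
        new = [NEG] * (budget + 1)
        for used in range(budget + 1):
            if dp[used] == NEG:
                continue
            for cost, utility in levels:  # every task must run at some service level
                if used + cost <= budget:
                    new[used + cost] = max(new[used + cost], dp[used] + utility)
        dp = new

    print("best achievable total utility:", max(dp))
    ```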

  4. Harmonizing and Optimizing Fish Testing Methods: The OECD Framework Project

    EPA Science Inventory

    The Organisation for Economic Cooperation and Development (OECD) serves a key role in the international harmonization of testing of a wide variety of chemicals. An integrated fish testing framework project was initiated in mid-2009 through the OECD with the US as the lead country...

  6. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  7. Optimal Designs of Staggered Dean Vortex Micromixers

    PubMed Central

    Chen, Jyh Jian; Chen, Chun Huei; Shie, Shian Ruei

    2011-01-01

    A novel parallel laminar micromixer with a two-dimensional staggered Dean Vortex micromixer is optimized and fabricated in our study. Dean vortices induced by centrifugal forces in curved rectangular channels cause fluids to produce secondary flows. The split-and-recombination (SAR) structures of the flow channels and the impinging effects result in the reduction of the diffusion distance of two fluids. Three different designs of a curved channel micromixer are introduced to evaluate the mixing performance of the designed micromixer. Mixing performances are demonstrated by means of a pH indicator using an optical microscope and fluorescent particles via a confocal microscope at different flow rates corresponding to Reynolds numbers (Re) ranging from 0.5 to 50. The comparison between the experimental data and numerical results shows a very reasonable agreement. At a Re of 50, the mixing length at the sixth segment, corresponding to the downstream distance of 21.0 mm, can be achieved in a distance 4 times shorter than when the Re equals 1. An optimization of this micromixer is performed with two geometric parameters. These are the angle between the lines from the center to two intersections of two consecutive curved channels, θ, and the angle between two lines of the centers of three consecutive curved channels, ϕ. It can be found that the maximal mixing index is related to the maximal value of the sum of θ and ϕ, which is equal to 139.82°. PMID:21747691

  8. Design and Optimization of Supersonic Intake

    NASA Astrophysics Data System (ADS)

    Merchant, Ved; Radhakrishnan, Jayakrishnan

    2017-08-01

    The ideal supersonic intake was designed using the Theta-Beta-Mach relations and CAD modelling software. After the design was completed, the optimization was carried out. Initially, cowl deflection angle variation was used to obtain the optimum results at a particular angle, and then the pressure recovery was maximized using the intake bleed technique. The bleed technique was optimized by fixing the best location of the bleed according to the shock reflections in the intake and then maximizing the results by analyzing the problem for various bleed diameter values. The mesh resolution for all the analyses was kept constant, along with the turbulence model and solution methods. The analyses were carried out considering only the free flow exit condition in the intake; the back pressure condition was not taken into account due to time constraints. Therefore, the pressure recovery, or the efficiency of the intake, is judged solely on the minimum velocity achieved in the intake.
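
    The Theta-Beta-Mach relation used for the ideal intake, tan θ = 2 cot β (M² sin²β − 1) / (M²(γ + cos 2β) + 2), can be inverted numerically for the weak-shock wave angle β. A short sketch; the Mach number, deflection angle, and bracketing angles are illustrative.

    ```python
    import math
    from scipy.optimize import brentq

    def theta_from_beta(beta, M, gamma=1.4):
        """Flow deflection angle theta (rad) for oblique-shock wave angle beta (rad)."""
        num = (2.0 / math.tan(beta)) * (M**2 * math.sin(beta)**2 - 1.0)
        den = M**2 * (gamma + math.cos(2.0 * beta)) + 2.0
        return math.atan(num / den)

    def weak_shock_beta(theta_deg, M):
        """Weak-solution wave angle (deg) for a given deflection angle and Mach number."""
        theta = math.radians(theta_deg)
        beta_lo = math.asin(1.0 / M) + 1e-6   # just above the Mach angle
        beta_hi = math.radians(64.0)          # below the max-deflection wave angle at this Mach number
        return math.degrees(brentq(lambda b: theta_from_beta(b, M) - theta, beta_lo, beta_hi))

    print(f"beta = {weak_shock_beta(10.0, 2.2):.2f} deg for a 10 deg ramp at M = 2.2")
    ```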

  9. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was on average, over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
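
    The scoring step at the heart of such a tool can be sketched compactly: evaluate a candidate design by a function of the Fisher information matrix (here D-optimality, i.e. log det FIM) and search over the feasible designs. The one-compartment model, parameter values, error model, and candidate sampling times below are illustrative, not PopED lite's internals.

    ```python
    import itertools
    import numpy as np

    dose = 100.0
    theta = np.array([2.0, 20.0])  # illustrative PK parameters: clearance CL, volume V

    def conc(t, p):
        cl, v = p
        return (dose / v) * np.exp(-cl / v * t)

    def fim(times, p, sigma=0.1, h=1e-5):
        """Fisher information for an additive error model, via finite-difference sensitivities."""
        t = np.asarray(times, dtype=float)
        J = np.zeros((len(t), len(p)))
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = h
            J[:, j] = (conc(t, p + dp) - conc(t, p - dp)) / (2 * h)
        return J.T @ J / sigma**2

    candidates = [0.25, 0.5, 1, 2, 4, 8, 12, 24]  # feasible sampling times (h)
    best = max(itertools.combinations(candidates, 2),
               key=lambda ts: np.linalg.slogdet(fim(ts, theta))[1])
    print("D-optimal two-point design (h):", best)
    ```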

  10. Recommendations for the optimum design of pultruded frameworks

    NASA Astrophysics Data System (ADS)

    Mottram, J. T.

    1994-09-01

    For the optimum choice of pultruded beam members in frameworks there is a need to have a greater understanding of framework behavior under load. Research on the lateral-torsional buckling of a symmetric I-section has shown how much the resistance may be affected by the loading position and the support boundary conditions. By changing the warping at the connections from free, as assumed in the USA design manual, to fixed, as may be achieved with practical connection designs it is shown that there is a potential doubling in the buckling resistance. In addition, practical connections have some initial stiffness and moment resistance, thus the connections behave in a semirigid manner. This connection behavior makes inappropriate the present procedure for choosing beam sections on the basis of limiting deflection for a simply supported member. It is proposed that research be conducted to establish the potential of semirigid design, as now being used with structural steelwork. Results from such research should provide the first stage in the process for the optimum design of frameworks.

  11. CFD based draft tube hydraulic design optimization

    NASA Astrophysics Data System (ADS)

    McNabb, J.; Devals, C.; Kyriacou, S. A.; Murry, N.; Mullins, B. F.

    2014-03-01

    The draft tube design of a hydraulic turbine, particularly in low to medium head applications, plays an important role in determining the efficiency and power characteristics of the overall machine, since an important proportion of the available energy, being in kinetic form leaving the runner, needs to be recovered by the draft tube into static head. For large units, these efficiency and power characteristics can equate to large sums of money when considering the anticipated selling price of the energy produced over the machine's life-cycle. This same draft tube design is also a key factor in determining the overall civil costs of the powerhouse, primarily in excavation and concreting, which can amount to similar orders of magnitude as the price of the energy produced. Therefore, there is a need to find the optimum compromise between these two conflicting requirements. In this paper, an elaborate approach is described for dealing with this optimization problem. First, the draft tube's detailed geometry is defined as a function of a comprehensive set of design parameters (about 20 of which a subset is allowed to vary during the optimization process) and are then used in a non-uniform rational B-spline based geometric modeller to fully define the wetted surfaces geometry. Since the performance of the draft tube is largely governed by 3D viscous effects, such as boundary layer separation from the walls and swirling flow characteristics, which in turn governs the portion of the available kinetic energy which will be converted into pressure, a full 3D meshing and Navier-Stokes analysis is performed for each design. What makes this even more challenging is the fact that the inlet velocity distribution to the draft tube is governed by the runner at each of the various operating conditions that are of interest for the exploitation of the powerhouse. In order to determine these inlet conditions, a combined steady-state runner and an initial draft tube analysis, using a

  12. Final Project Report: A Polyhedral Transformation Framework for Compiler Optimization

    SciTech Connect

    Sadayappan, Ponnuswamy; Rountev, Atanas

    2015-06-15

    The project developed the polyhedral compiler transformation module PolyOpt/Fortran in the ROSE compiler framework. PolyOpt/Fortran performs automated transformation of affine loop nests within FORTRAN programs for enhanced data locality and parallel execution. A FORTRAN version of the Polybench library was also developed by the project. A third development was a dynamic analysis approach to gauge vectorization potential within loops of programs; software (DDVec) for automated instrumentation and dynamic analysis of programs was developed.
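    PolyOpt/Fortran applies transformations of this kind automatically, at compile time, to affine loop nests. Purely as an illustration of the data-locality idea behind one such transformation, loop tiling, the sketch below (written in Python rather than FORTRAN, and unrelated to the PolyOpt code base) rewrites a naive matrix multiplication into a blocked version; the tile size and matrix dimensions are arbitrary choices for the example.

```python
# Illustrative only: a naive triple loop versus a tiled (blocked) loop nest.
# Tiling reorders the iteration space so that each block of A, B and C is
# reused while it is still in cache, which is the locality effect that
# polyhedral transformations target on affine loop nests.
import numpy as np

def matmul_naive(A, B):
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i, j] += A[i, k] * B[k, j]
    return C

def matmul_tiled(A, B, tile=32):
    n = A.shape[0]
    C = np.zeros((n, n))
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for kk in range(0, n, tile):
                # Work on one (tile x tile) block of the iteration space.
                for i in range(ii, min(ii + tile, n)):
                    for k in range(kk, min(kk + tile, n)):
                        a = A[i, k]
                        for j in range(jj, min(jj + tile, n)):
                            C[i, j] += a * B[k, j]
    return C

A = np.random.rand(64, 64)
B = np.random.rand(64, 64)
assert np.allclose(matmul_naive(A, B), matmul_tiled(A, B))  # same result, new order
```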

  13. A framework of cloud supported collaborative design in glass lens moulds based on aspheric measurement

    NASA Astrophysics Data System (ADS)

    Zhu, Yongjian; Wang, Yu; Na, Jingxin; Zhi, Yanan; Fan, Yufeng

    2013-09-01

    Aspheric mould design includes the top-down design and reversal design. In this paper, a new framework for reversal design is proposed that combines cloud supported collaborative design (CSCD) with aspheric measurement. The framework is a kind of collaborative platform, which is composed of eight modules, including the computerized aspheric precision measurement module (CAPM), computer-aided optical design of the aspheric lens system (CAOD), computer-aided design of the lens mould (CADLM), FEM (finite element method) simulation of lens molding module (FEMLM), computer-aided manufacture of lens and moulds (CAMLM), measurement data analysis module (MDAM), optical product lifecycle management module (OPLM) and cloud computing network module (CCNM). In this framework, the remote clients send an improved requirement or fabrication demand about an optical lens system through CCNM, which transfers this signal to OPLM. In OPLM, one main server is in charge of the task distribution and collaborative work of the other six modules. The first measurement data of the aspheric lens are produced by clients or by our proposed CAPM module, and are then sent to CAOD for optimization, and the electronic drawings of the lens moulds are generated in the CADLM module. According to the design drawings, the FEMLM can give the lens-molding simulation parameters through FEM software. The simulation data are used for the second design of the moulds in the CADLM module. In this case, the moulds can be fabricated in CAMLM by an ultra-precision machine, and the aspheric lens can also be produced by a lens-molding machine in CAMLM. Finally, the final shape of the aspheric lens can be measured in CAPM and the data analysis can be conducted in the MDAM module. Through the proposed framework, all the work described above can be performed in a coordinated manner, and the optimum design data of the lens mould can be realized, saved and then shared by the whole work team.

  14. A user-interactive, response surface approximation-based framework for multidisciplinary design

    NASA Astrophysics Data System (ADS)

    Stelmack, Marc Andrew

    Multidisciplinary Design Optimization (MDO) focuses on reducing the time and cost required to design complex engineering systems. One goal of MDO is to develop systematic approaches to design which are effective and reliable in achieving desired performance improvements. Also, the analysis of engineering systems is potentially expensive and time-consuming. Therefore, enhancing the efficiency of current design methods, in terms of the number of designs that must be evaluated, is desirable. A design framework, Concurrent Subspace Design (CSD), is proposed to address some issues that are prevalent in practical design settings. Previously considered methods of system approximation and optimization were extended to accommodate both discrete and continuous design variables. Additionally, the transition from one application to another was made to be as straightforward as possible in developing the associated software. Engineering design generally requires the expertise of numerous individuals, whose efforts must be focused and coordinated in accordance with a consistent set of design goals. In CSD, system approximations in the form of artificial neural networks provide information pertaining to system performance characteristics. This information provides the basis for design decisions. The approximations enable different designers to operate concurrently and assess the impact of their decisions on the system design goals. The proposed framework was implemented to minimize the weight of an aircraft brake assembly. An existing industrial analysis tool was used to provide design information in that application. CSD was implemented in a user-interactive fashion that permitted human judgement to influence the design process and required minimal modifications to the analysis and design software. The implications of problem formulation and the role of human design experts in automated industrial design processes were explored in the context of that application. In the most

  15. Optimal Design of Automotive Thermoelectric Air Conditioner (TEAC)

    NASA Astrophysics Data System (ADS)

    Attar, Alaa; Lee, HoSung; Weera, Sean

    2014-06-01

    The present work is an analytical study of the optimal design of an automotive thermoelectric air conditioner (TEAC) using a new optimal design method with dimensional analysis that has been recently developed by our research group. The optimal design gives not only the optimal current but also the optimal geometry (i.e., the number of thermocouples, the geometric factor, or the hot fluid parameters). The optimal design for the TEAC is carried out with two configurations: air-to-liquid and air-to-air heat exchangers.

  16. A Connectivity Framework for Social Information Systems Design in Healthcare

    PubMed Central

    Kuziemsky, Craig E.; Andreev, Pavel; Benyoucef, Morad; O'Sullivan, Tracey; Jamaly, Syam

    2016-01-01

    Social information systems (SISs) will play a key role in healthcare systems’ transformation into collaborative patient-centered systems that support care delivery across the entire continuum of care. SISs enable the development of collaborative networks and facilitate relationships to integrate people and processes across time and space. However, we believe that a “connectivity” issue, which refers to the scope and extent of system requirements for a SIS, is a significant challenge of SIS design. This paper’s contribution is the development of the Social Information System Connectivity Framework for supporting SIS design in healthcare. The framework has three parts. First, it defines the structure of a SIS as a set of social triads. Second, it identifies six dimensions that represent the behaviour of a SIS. Third, it proposes the Social Information System Connectivity Factor as our approximation of the extent of connectivity and degree of complexity in a SIS. PMID:28269869

  17. SysSon - A Framework for Systematic Sonification Design

    NASA Astrophysics Data System (ADS)

    Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns

    2015-04-01

    SysSon is a research approach for introducing sonification systematically to a scientific community where it is not yet commonly used - e.g., in climate science. Thereby, both technical and socio-cultural barriers have to be overcome. The approach was further developed with climate scientists, who participated in contextual inquiries, usability tests and a collaborative design workshop. These extensive user tests resulted in our final software framework. As frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/) and works with data in NetCDF format.

  18. A Framework for Robust Multivariable Optimization of Integrated Circuits in Space Applications

    NASA Technical Reports Server (NTRS)

    DuMonthier, Jeffrey; Suarez, George

    2013-01-01

    Application Specific Integrated Circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem which must be solved early in the development cycle of a system, because the time required for testing and qualification severely limits opportunities to modify and iterate. Manual design techniques, which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable, are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way which facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as a framework of software modules, templates and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation. Templates provide a starting point for both while toolbox functions minimize the code required. Once a test bench has been coded to optimize a particular circuit, it is also used to verify the final design. The combination of test bench and cost function can then serve as a template for similar circuits or be re-used to migrate the design to different processes by re-running it with the

  19. A new framework for designing programmes of assessment.

    PubMed

    Dijkstra, J; Van der Vleuten, C P M; Schuwirth, L W T

    2010-08-01

    Research on assessment in medical education has strongly focused on individual measurement instruments and their psychometric quality. Without detracting from the value of this research, such an approach is not sufficient for high-quality assessment of competence as a whole. A programmatic approach is advocated which presupposes criteria for designing comprehensive assessment programmes and for assuring their quality. The paucity of research with relevance to programmatic assessment, and especially its development, prompted us to embark on a research project to develop design principles for programmes of assessment. We conducted focus group interviews to explore the experiences and views of nine assessment experts concerning good practices and new ideas about theoretical and practical issues in programmes of assessment. The discussion was analysed, mapping all aspects relevant for design onto a framework, which was iteratively adjusted to fit the data until saturation was reached. The overarching framework for designing programmes of assessment consists of six assessment programme dimensions: Goals, Programme in Action, Support, Documenting, Improving and Accounting. The model described in this paper can help to frame programmes of assessment; it not only provides a common language, but also a comprehensive picture of the dimensions to be covered when formulating design principles. It helps identify areas concerning assessment in which ample research and development has been done. But, more importantly, it also helps to detect underserved areas. A guiding principle in the design of assessment programmes is fitness for purpose. High quality assessment can only be defined in terms of its goals.

  20. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  1. Design and architecture of the Mars relay network planning and analysis framework

    NASA Technical Reports Server (NTRS)

    Cheung, K. M.; Lee, C. H.

    2002-01-01

    In this paper we describe the design and architecture of the Mars Network planning and analysis framework that supports the generation and validation of efficient planning and scheduling strategies. The goals are to minimize the transmission time, minimize the delay time, and/or maximize the network throughput. The proposed framework would require (1) a client-server architecture to support interactive, batch, WEB, and distributed analysis and planning applications for the relay network analysis scheme, (2) a high-fidelity modeling and simulation environment that expresses spacecraft-to-spacecraft and spacecraft-to-Earth-station link capabilities as time-varying resources, together with spacecraft activities, link priority, Solar System dynamic events, the laws of orbital mechanics, and other limiting factors such as spacecraft power and thermal constraints, and (3) an optimization methodology that casts the resource and constraint models into a standard linear and nonlinear constrained optimization problem that lends itself to commercial off-the-shelf (COTS) planning and scheduling algorithms.

  3. Integration framework for design information of electromechanical systems

    NASA Astrophysics Data System (ADS)

    Qureshi, Sohail Mehboob

    The objective of this research is to develop a framework that can be used to provide an integrated view of electromechanical system design information. The framework is intended to provide a platform where various standard and pseudo standard information models such as STEP and IBIS can be integrated to provide an integrated view of design information beyond just part numbers, CAD drawings, or some specific geometry. A database application can make use of this framework to provide reuse of design information fragments including geometry, function, behavior, design procedures, performance specification, design rationale, project management, product characteristics, and configuration and version. An "Integration Core Model" is developed to provide the basis for the integration framework, and also facilitate integration of product and process data for the purpose of archiving integrated design history. There are two major subdivisions of the integration core model: product core model providing the high level structure needed to associate process information to the product data, and process core model providing the generic process information that is needed to capture and organize process information. The process core model is developed using a hybrid of structure-oriented and process-oriented approaches to process modeling. Using such a scheme the process core model is able to represent information such as hierarchies of processes, logical and temporal relationships between various design activities, and relationships between activities and the product data at various levels of abstraction. Based upon the integration core model, an integration methodology is developed to provide a systematic way of integrating various information models. Mapping theorems have been developed to methodically point out the problems that may be encountered during the integration of two information models. The integration core model is validated through a case study. Design information

  4. Design Principles for Covalent Organic Frameworks in Energy Storage Applications.

    PubMed

    Alahakoon, Sampath B; Thompson, Christina M; Occhialini, Gino; Smaldone, Ronald Alexander

    2017-03-16

    Covalent organic frameworks (COFs) are an exciting class of microporous materials that have been explored as energy storage materials for more than a decade. This review discusses the efforts to develop these materials for applications in gas and electrical power storage. It also discusses some of the design strategies for developing the gas sorption properties of COFs and mechanistic studies on their formation.

  5. Design and applications of a multimodality image data warehouse framework.

    PubMed

    Wong, Stephen T C; Hoo, Kent Soo; Knowlton, Robert C; Laxer, Kenneth D; Cao, Xinhau; Hawkins, Randall A; Dillon, William P; Arenson, Ronald L

    2002-01-01

    A comprehensive data warehouse framework is needed, which encompasses imaging and non-imaging information in supporting disease management and research. The authors propose such a framework, describe general design principles and system architecture, and illustrate a multimodality neuroimaging data warehouse system implemented for clinical epilepsy research. The data warehouse system is built on top of a picture archiving and communication system (PACS) environment and applies an iterative object-oriented analysis and design (OOAD) approach and recognized data interface and design standards. The implementation is based on a Java CORBA (Common Object Request Broker Architecture) and Web-based architecture that separates the graphical user interface presentation, data warehouse business services, data staging area, and backend source systems into distinct software layers. To illustrate the practicality of the data warehouse system, the authors describe two distinct biomedical applications--namely, clinical diagnostic workup of multimodality neuroimaging cases and research data analysis and decision threshold on seizure foci lateralization. The image data warehouse framework can be modified and generalized for new application domains.

  6. Design and Applications of a Multimodality Image Data Warehouse Framework

    PubMed Central

    Wong, Stephen T.C.; Hoo, Kent Soo; Knowlton, Robert C.; Laxer, Kenneth D.; Cao, Xinhau; Hawkins, Randall A.; Dillon, William P.; Arenson, Ronald L.

    2002-01-01

    A comprehensive data warehouse framework is needed, which encompasses imaging and non-imaging information in supporting disease management and research. The authors propose such a framework, describe general design principles and system architecture, and illustrate a multimodality neuroimaging data warehouse system implemented for clinical epilepsy research. The data warehouse system is built on top of a picture archiving and communication system (PACS) environment and applies an iterative object-oriented analysis and design (OOAD) approach and recognized data interface and design standards. The implementation is based on a Java CORBA (Common Object Request Broker Architecture) and Web-based architecture that separates the graphical user interface presentation, data warehouse business services, data staging area, and backend source systems into distinct software layers. To illustrate the practicality of the data warehouse system, the authors describe two distinct biomedical applications—namely, clinical diagnostic workup of multimodality neuroimaging cases and research data analysis and decision threshold on seizure foci lateralization. The image data warehouse framework can be modified and generalized for new application domains. PMID:11971885

  7. Optimal Ground Source Heat Pump System Design

    SciTech Connect

    Ozbek, Metin; Yavuzturk, Cy; Pinder, George

    2015-04-01

    Despite the fact that GSHPs first gained popularity as early as the 1940s and that they can achieve 30 to 60 percent energy savings and carbon emission reductions relative to conventional HVAC systems, the use of geothermal energy in the U.S. has been less than 1 percent of the total energy consumption. The key barriers preventing this technically-mature technology from reaching its full commercial potential have been its high installation cost and limited consumer knowledge and trust in GSHP systems to deliver the technology in a cost-effective manner in the market place. Led by ENVIRON, with support from the University of Hartford and the University of Vermont, the team developed and tested a software-based decision-making tool (‘OptGSHP’) for the least-cost design of ground-source heat pump (‘GSHP’) systems. OptGSHP combines state of the art optimization algorithms with GSHP-specific HVAC and groundwater flow and heat transport simulation. The particular strength of OptGSHP is in integrating heat transport due to groundwater flow into the design, an effect that most GSHP designs do not take credit for and that therefore leads to overdesign.

  8. Optimizing Monitoring Designs under Alternative Objectives

    DOE PAGES

    Gastelum, Jason A.; Porter, Ellen A.; ...

    2014-12-31

    This paper describes an approach to identify monitoring designs that optimize detection of CO2 leakage from a carbon capture and sequestration (CCS) reservoir and compares the results generated under two alternative objective functions. The first objective function minimizes the expected time to first detection of CO2 leakage, the second more conservative objective function minimizes the maximum time to leakage detection across the set of realizations. The approach applies a simulated annealing algorithm that searches the solution space by iteratively mutating the incumbent monitoring design. The approach takes into account uncertainty by evaluating the performance of potential monitoring designs across a set of simulated leakage realizations. The approach relies on a flexible two-tiered signature to infer that CO2 leakage has occurred. This research is part of the National Risk Assessment Partnership, a U.S. Department of Energy (DOE) project tasked with conducting risk and uncertainty analysis in the areas of reservoir performance, natural leakage pathways, wellbore integrity, groundwater protection, monitoring, and systems level modeling.

  9. Optimizing Monitoring Designs under Alternative Objectives

    SciTech Connect

    Gastelum, Jason A.; Porter, Ellen A.

    2014-12-31

    This paper describes an approach to identify monitoring designs that optimize detection of CO2 leakage from a carbon capture and sequestration (CCS) reservoir and compares the results generated under two alternative objective functions. The first objective function minimizes the expected time to first detection of CO2 leakage, the second more conservative objective function minimizes the maximum time to leakage detection across the set of realizations. The approach applies a simulated annealing algorithm that searches the solution space by iteratively mutating the incumbent monitoring design. The approach takes into account uncertainty by evaluating the performance of potential monitoring designs across a set of simulated leakage realizations. The approach relies on a flexible two-tiered signature to infer that CO2 leakage has occurred. This research is part of the National Risk Assessment Partnership, a U.S. Department of Energy (DOE) project tasked with conducting risk and uncertainty analysis in the areas of reservoir performance, natural leakage pathways, wellbore integrity, groundwater protection, monitoring, and systems level modeling.
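    The search strategy described above, simulated annealing that iteratively mutates the incumbent monitoring design and scores it across a set of leakage realizations, can be sketched as follows. The detection-time matrix, candidate locations, objective (expected time to first detection) and cooling schedule below are synthetic stand-ins for the reservoir simulations and two-tiered detection signature used in the study.

```python
# Hypothetical sketch of the annealing loop: choose k monitoring locations out
# of n candidates so as to minimize the expected time to first detection of
# leakage across a set of realizations. detect[r, c] is a synthetic stand-in
# for the time at which a monitor at candidate c detects leakage in realization r.
import random
import numpy as np

rng = np.random.default_rng(0)
n_candidates, n_realizations, k = 50, 200, 5
detect = rng.uniform(1.0, 100.0, size=(n_realizations, n_candidates))

def expected_first_detection(design):
    # First detection in a realization = earliest detection among chosen monitors.
    return detect[:, list(design)].min(axis=1).mean()

def mutate(design):
    # Swap one chosen location for an unused one (the "iterative mutation").
    new = set(design)
    new.remove(random.choice(list(new)))
    new.add(random.choice([c for c in range(n_candidates) if c not in new]))
    return frozenset(new)

random.seed(0)
current = frozenset(random.sample(range(n_candidates), k))
cost, T = expected_first_detection(current), 10.0
for _ in range(5000):
    candidate = mutate(current)
    c = expected_first_detection(candidate)
    # Accept improvements always, worse designs with a temperature-dependent chance.
    if c < cost or random.random() < np.exp((cost - c) / T):
        current, cost = candidate, c
    T *= 0.999                                  # geometric cooling schedule

print("chosen monitoring locations:", sorted(current))
print("expected time to first detection:", round(cost, 2))
```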

  10. Application of a COTS Resource Optimization Framework to the SSN Sensor Tasking Domain - Part I: Problem Definition

    NASA Astrophysics Data System (ADS)

    Tran, T.

    With the onset of the SmallSat era, the RSO catalog is expected to see continuing growth in the near future. This presents a significant challenge to the current sensor tasking of the SSN. The Air Force is in need of a sensor tasking system that is robust, efficient, scalable, and able to respond in real-time to interruptive events that can change the tracking requirements of the RSOs. Furthermore, the system must be capable of using processed data from heterogeneous sensors to improve tasking efficiency. The SSN sensor tasking can be regarded as an economic problem of supply and demand: the amount of tracking data needed by each RSO represents the demand side while the SSN sensor tasking represents the supply side. As the number of RSOs to be tracked grows, demand exceeds supply. The decision-maker is faced with the problem of how to allocate resources in the most efficient manner. Braxton recently developed a framework called Multi-Objective Resource Optimization using Genetic Algorithm (MOROUGA) as one of its modern COTS software products. This optimization framework took advantage of the maturing technology of evolutionary computation in the last 15 years. This framework was applied successfully to an AFSCN-like resource allocation problem. In any resource allocation problem, there are five key elements: (1) the resource pool, (2) the tasks using the resources, (3) a set of constraints on the tasks and the resources, (4) the objective functions to be optimized, and (5) the demand levied on the resources. In this paper we explain in detail how the design features of this optimization framework are directly applicable to the SSN sensor tasking domain. We also discuss our validation effort and present results for the AFSCN resource allocation domain using a prototype based on this optimization framework.
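    As a loose illustration of the genetic-algorithm machinery such a framework builds on, the sketch below evolves assignments of synthetic tracking tasks to a handful of sensors under capacity limits. It is a simplified, single-objective stand-in (MOROUGA itself is a multi-objective, proprietary framework), and every number, name and penalty weight in it is invented.

```python
# Hypothetical sketch of a genetic algorithm for a toy tasking problem: assign
# each tracking task (an RSO pass) to one of a few sensors, or leave it
# unscheduled, so that high-priority demand is met within per-sensor capacity.
# All data are synthetic and the objective is a single penalized score.
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_sensors = 40, 5
priority = rng.integers(1, 10, n_tasks)          # tracking value of each task
load = rng.uniform(0.5, 2.0, n_tasks)            # sensor time each task needs
capacity = np.full(n_sensors, 10.0)              # available time per sensor

def fitness(chrom):
    # chrom[i] in 0..n_sensors-1 assigns task i to a sensor; n_sensors = unscheduled.
    assigned = chrom < n_sensors
    used = np.zeros(n_sensors)
    np.add.at(used, chrom[assigned], load[assigned])
    overload = np.clip(used - capacity, 0.0, None).sum()
    return priority[assigned].sum() - 50.0 * overload

def evolve(pop_size=60, generations=200, p_mut=0.05):
    pop = rng.integers(0, n_sensors + 1, size=(pop_size, n_tasks))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        def pick():                              # tournament selection
            i, j = rng.integers(0, pop_size, 2)
            return pop[i] if scores[i] >= scores[j] else pop[j]
        children = []
        for _ in range(pop_size):
            a, b = pick(), pick()
            cut = rng.integers(1, n_tasks)       # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mut = rng.random(n_tasks) < p_mut    # random reassignment mutation
            child[mut] = rng.integers(0, n_sensors + 1, mut.sum())
            children.append(child)
        pop = np.array(children)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()], scores.max()

best, score = evolve()
print("best fitness:", round(float(score), 1),
      "tasks scheduled:", int((best < n_sensors).sum()))
```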

  11. A Human Factors Framework for Payload Display Design

    NASA Technical Reports Server (NTRS)

    Dunn, Mariea C.; Hutchinson, Sonya L.

    1998-01-01

    During missions to space, one charge of the astronaut crew is to conduct research experiments. These experiments, referred to as payloads, typically are controlled by computers. Crewmembers interact with payload computers by using visual interfaces or displays. To enhance the safety, productivity, and efficiency of crewmember interaction with payload displays, particular attention must be paid to the usability of these displays. Enhancing display usability requires adoption of a design process that incorporates human factors engineering principles at each stage. This paper presents a proposed framework for incorporating human factors engineering principles into the payload display design process.

  12. Space tourism optimized reusable spaceplane design

    NASA Astrophysics Data System (ADS)

    Penn, Jay P.; Lindley, Charles A.

    1997-01-01

    Market surveys suggest that a viable space tourism industry will require flight rates about two orders of magnitude higher than those required for conventional spacelift. Although enabling round-trip cost goals for a viable space tourism business are about $240 per pound ($529/kg), or $72,000 per passenger round-trip, goals should be about $50 per pound ($110/kg) or approximately $15,000 for a typical passenger and baggage. The lower price will probably open space tourism to the general population. Vehicle reliabilities must approach those of commercial aircraft as closely as possible. This paper addresses the development of spaceplanes optimized for the ultra-high flight rate and high reliability demands of the space tourism mission. It addresses the fundamental operability, reliability, and cost drivers needed to satisfy this mission need. Figures of merit similar to those used to evaluate the economic viability of conventional commercial aircraft are developed, including items such as payload/vehicle dry weight, turnaround time, propellant cost per passenger, and insurance and depreciation costs, which show that infrastructure can be developed for a viable space tourism industry. A reference spaceplane design optimized for space tourism is described. Subsystem allocations for reliability, operability, and costs are made and a route to developing such a capability is discussed. The vehicle's ability to also satisfy the traditional spacelift market is shown.

  13. Sampling design optimization for spatial functions

    USGS Publications Warehouse

    Olea, R.A.

    1984-01-01

    A new procedure is presented for minimizing the sampling requirements necessary to estimate a mappable spatial function at a specified level of accuracy. The technique is based on universal kriging, an estimation method within the theory of regionalized variables. Neither actual implementation of the sampling nor universal kriging estimations are necessary to make an optimal design. The average standard error and maximum standard error of estimation over the sampling domain are used as global indices of sampling efficiency. The procedure optimally selects those parameters controlling the magnitude of the indices, including the density and spatial pattern of the sample elements and the number of nearest sample elements used in the estimation. As an illustration, the network of observation wells used to monitor the water table in the Equus Beds of Kansas is analyzed and an improved sampling pattern suggested. This example demonstrates the practical utility of the procedure, which can be applied equally well to other spatial sampling problems, as the procedure is not limited by the nature of the spatial function. © 1984 Plenum Publishing Corporation.
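    The flavor of the procedure, choosing where to sample so that a global index of kriging error is minimized, can be conveyed with a simplified sketch. It uses ordinary rather than universal kriging, an assumed exponential covariance model, and a greedy point-addition search, so it is a stand-in for, not a reproduction of, the published procedure; all coordinates and parameters are invented.

```python
# Hypothetical sketch: rank candidate sampling locations by how much they
# reduce the average ordinary-kriging standard error over a grid of
# estimation points. Covariance model, domain and coordinates are invented.
import numpy as np

sigma2, a = 1.0, 30.0                            # assumed sill and range
def cov(h):
    return sigma2 * np.exp(-h / a)               # exponential covariance model

def kriging_variance(samples, x0):
    """Ordinary kriging variance at x0 for sample coordinates (n x 2)."""
    n = len(samples)
    d = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = cov(d)
    M[n, :n] = M[:n, n] = 1.0                    # unbiasedness constraint
    b = np.append(cov(np.linalg.norm(samples - x0, axis=1)), 1.0)
    w = np.linalg.solve(M, b)
    return sigma2 - w[:n] @ b[:n] - w[n]         # sill - weights.c - Lagrange term

grid = np.array([[x, y] for x in range(0, 101, 10) for y in range(0, 101, 10)])

def avg_std_error(samples):
    return np.mean([np.sqrt(max(kriging_variance(samples, p), 0.0)) for p in grid])

rng = np.random.default_rng(2)
candidates = rng.uniform(0, 100, size=(40, 2))   # candidate well locations
chosen = [0]                                     # start from one arbitrary candidate
for _ in range(6):
    remaining = [i for i in range(len(candidates)) if i not in chosen]
    # Greedy step: add the candidate giving the lowest average standard error.
    best = min(remaining, key=lambda i: avg_std_error(candidates[chosen + [i]]))
    chosen.append(best)
    print("added candidate", best, "avg std error",
          round(avg_std_error(candidates[chosen]), 3))
```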

  14. Three Program Architecture for Design Optimization

    NASA Technical Reports Server (NTRS)

    Miura, Hirokazu; Olson, Lawrence E. (Technical Monitor)

    1998-01-01

    In this presentation, I would like to review the historical perspective on the program architectures used to build design optimization capabilities based on mathematical programming and other numerical search techniques. It is rather straightforward to classify the program architectures into the three categories shown above. However, the relative importance of each of the three approaches has not been static; it has changed dynamically as the capabilities of available computational resources have increased. For example, we once considered that the direct coupling architecture would never be used for practical problems, but the availability of computer systems such as multi-processors has changed that view. I would like to review the roles of the three architectures from a historical as well as a current and future perspective. There may also be some possibility for the emergence of hybrid architectures. I hope to provide some seeds for an active discussion of where we are heading in the very dynamic environment of high-speed computing and communication.

  15. Optimal design of robot accuracy compensators

    SciTech Connect

    Zhuang, H.; Roth, Z.S. (Robotics Center and Electrical Engineering Dept.); Hamano, Fumio (Dept. of Electrical Engineering)

    1993-12-01

    The problem of optimal design of robot accuracy compensators is addressed. Robot accuracy compensation requires that the actual kinematic parameters of a robot be identified beforehand. Additive corrections of joint commands, including those at singular configurations, can be computed without solving the inverse kinematics problem for the actual robot. This is done by either the damped least-squares (DLS) algorithm or the linear quadratic regulator (LQR) algorithm, which is a recursive version of the DLS algorithm. The weight matrix in the performance index can be selected to achieve specific objectives, such as emphasizing the end-effector's positioning accuracy over its orientation accuracy or vice versa, or taking into account proximity to robot joint travel limits and singularity zones. The paper also compares the LQR and the DLS algorithms in terms of computational complexity, storage requirement, and programming convenience. Simulation results are provided to show the effectiveness of the algorithms.
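    The damped least-squares correction itself is compact enough to sketch. The two-link planar arm, the "identified" link lengths, and the damping factor below are invented for illustration; the point is only the additive joint correction dq = J^T (J J^T + lambda^2 I)^-1 dx computed from the identified kinematic model rather than from an inverse-kinematics solution.

```python
# Hypothetical sketch of damped least-squares (DLS) accuracy compensation for
# a 2-link planar arm: the nominal model plans joint angles, the identified
# ("actual") link lengths differ slightly, and DLS computes additive joint
# corrections that drive the actual end-effector to the desired position.
import numpy as np

L_nominal = np.array([1.00, 1.00])          # lengths used by the planner
L_actual = np.array([1.02, 0.97])           # lengths found by calibration

def fk(q, L):
    x = L[0] * np.cos(q[0]) + L[1] * np.cos(q[0] + q[1])
    y = L[0] * np.sin(q[0]) + L[1] * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q, L):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L[0] * s1 - L[1] * s12, -L[1] * s12],
                     [ L[0] * c1 + L[1] * c12,  L[1] * c12]])

def dls_correction(q, target, L, lam=0.05, iters=5):
    dq_total = np.zeros(2)
    for _ in range(iters):
        err = target - fk(q + dq_total, L)
        J = jacobian(q + dq_total, L)
        # Damped least squares: dq = J^T (J J^T + lam^2 I)^-1 err; the damping
        # keeps the correction bounded near singular configurations.
        dq_total += J.T @ np.linalg.solve(J @ J.T + lam**2 * np.eye(2), err)
    return dq_total

q_cmd = np.array([0.6, 0.8])                # joint command from the planner
target = fk(q_cmd, L_nominal)               # where the planner thinks it goes
print("error before:", np.linalg.norm(target - fk(q_cmd, L_actual)))
dq = dls_correction(q_cmd, target, L_actual)
print("error after :", np.linalg.norm(target - fk(q_cmd + dq, L_actual)))
```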

  16. Three-dimensional hemodynamic design optimization of stents for cerebral aneurysms.

    PubMed

    Lee, Chang-Joon; Srinivas, Karkenahalli; Qian, Yi

    2014-03-01

    Flow-diverting stents occlude aneurysms by diverting the blood flow from entering the aneurysm sac. Their effectiveness is determined by the thrombus formation rate, which depends greatly on stent design. The aim of this study was to provide a general framework for efficient stent design using design optimization methods, with a focus on stent hemodynamics as the starting point. The Kriging method was used to carry out the design optimization. Three different cases of idealized stents were considered, and 40-60 samples from each case were evaluated using computational fluid dynamics. Using maximum velocity and vorticity reduction as objective functions, the optimized designs were identified from the samples. A number of optimized stent designs were found, which revealed that a combination of high pore density and thin struts is desirable. Additionally, distributing struts near the proximal end of the aneurysm neck was found to be effective. The success of the methods and framework devised in this study offers a future possibility of incorporating other disciplines to carry out multidisciplinary design optimization.
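    As a rough illustration of kriging-based design optimization of this kind, the sketch below fits a Gaussian-process (kriging) surrogate to a batch of sampled designs and ranks a grid of candidates by the predicted objective. The analytic stand-in for the CFD objective, the two design variables (pore density and strut width), their ranges, and the use of scikit-learn are all assumptions made for the example, not the study's pipeline.

```python
# Hypothetical sketch of kriging-based design optimization: a Gaussian-process
# (kriging) surrogate is fit to a batch of sampled designs and used to rank a
# grid of candidates. The "CFD" objective below is an invented analytic stand-in.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def cfd_stand_in(pore_density, strut_width):
    # Invented objective: lower "sac velocity" for dense, thin-strut designs,
    # with a mild penalty as struts become unrealistically thin.
    return (1.0 / (1.0 + pore_density) + 3.0 * strut_width
            + 0.5 * (0.02 / np.maximum(strut_width, 1e-3)) ** 2)

rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(0.4, 0.9, 50),     # pore density (fraction)
                     rng.uniform(0.02, 0.12, 50)])  # strut width (mm)
y = cfd_stand_in(X[:, 0], X[:, 1])                  # one "CFD run" per sample

gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[0.1, 0.02]), normalize_y=True)
gp.fit(X, y)

# Rank a dense grid of candidate designs by the surrogate's prediction; the
# predictive standard deviation could drive further adaptive sampling.
grid = np.array([[p, w] for p in np.linspace(0.4, 0.9, 60)
                        for w in np.linspace(0.02, 0.12, 60)])
mean, std = gp.predict(grid, return_std=True)
best = grid[mean.argmin()]
print("surrogate-optimal design: pore density %.2f, strut width %.3f mm" % tuple(best))
```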

  17. Incorporating User Preferences Within an Optimal Traffic Flow Management Framework

    NASA Technical Reports Server (NTRS)

    Rios, Joseph Lucio; Sheth, Kapil S.; Guiterrez-Nolasco, Sebastian Armardo

    2010-01-01

    The effectiveness of future decision support tools for Traffic Flow Management in the National Airspace System will depend on two major factors: computational burden and collaboration. Previous research has focused separately on these two aspects without consideration of their interaction. In this paper, their explicit combination is examined. It is shown that when user preferences are incorporated with an optimal approach to scheduling, runtime is not adversely affected. A benefit-cost ratio is used to measure the influence of user preferences on an optimal solution. This metric shows that user preferences can be accommodated without unduly increasing the overall system delay. Specifically, incorporating user preferences will increase delays proportionally to increased user satisfaction.

  18. Optimal screening designs for biomedical technology

    SciTech Connect

    Torney, D.C.; Bruno, W.J.; Knill, E.

    1997-10-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Screening a large number of different types of molecules to isolate a few with desirable properties is essential in biomedical technology. For example, trying to find a particular gene in the Human genome could be akin to looking for a needle in a haystack. Fortunately, testing of mixtures, or pools, of molecules allows the desirable ones to be identified, using a number of experiments proportional only to the logarithm of the total number of types of molecules. We show how to capitalize upon this potential by using optimized pooling schemes, or designs. We propose efficient non-adaptive pooling designs, such as "random sets" designs and modified "row and column" designs. Our results have been applied in the pooling and unique-sequence screening of clone libraries used in the Human Genome Project and in the mapping of Human chromosome 16. This required, for the largest clone libraries, the use of liquid-transferring robots and manifolds. Finally, we developed an efficient technique for finding the posterior probability that each molecule has the desirable property, given the pool assay results. This technique works well, in practice, even if there are substantial rates of errors in the pool assay data. Both our methods and our results are relevant to a broad spectrum of research in modern biology.
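    The logarithmic scaling comes from the fact that a single pool assay constrains every molecule it contains. A minimal sketch of non-adaptive pooled screening with error-free assays is given below; the random incidence design, pool counts and clone numbers are arbitrary, and the decoding rule (discard any clone that appears in a negative pool) is the simplest possible one rather than the posterior-probability method described above.

```python
# Hypothetical sketch of non-adaptive pooled screening with error-free assays:
# clones are assigned to random pools, each pool is assayed once, and any clone
# appearing in a negative pool is ruled out. Pool counts and sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(4)
n_clones, n_positive, n_pools = 1000, 3, 60
positives = set(rng.choice(n_clones, n_positive, replace=False))

# Random incidence design: each clone joins each pool independently.
membership = rng.random((n_pools, n_clones)) < 0.05
pool_result = np.array([any(c in positives for c in np.flatnonzero(row))
                        for row in membership])

# Decode: a clone remains a candidate only if every pool containing it is positive.
ruled_out = membership[~pool_result].any(axis=0)
candidates = set(np.flatnonzero(~ruled_out))
print("true positives:         ", sorted(positives))
print("candidates after decode:", sorted(candidates))
```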

  19. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, the managing of product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  20. An enhanced BSIM modeling framework for selfheating aware circuit design

    NASA Astrophysics Data System (ADS)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.

  1. Optimal design of artificial reefs for sturgeon

    NASA Astrophysics Data System (ADS)

    Yarbrough, Cody; Cotel, Aline; Kleinheksel, Abby

    2015-11-01

    The Detroit River, part of a busy corridor between Lakes Huron and Erie, was extensively modified to create deep shipping channels, resulting in a loss of spawning habitat for lake sturgeon and other native fish (Caswell et al. 2004, Bennion and Manny 2011). Under the U.S.-Canada Great Lakes Water Quality Agreement, there are remediation plans to construct fish spawning reefs to help with historic habitat losses and degraded fish populations, specifically sturgeon. To determine optimal reef design, experimental work has been undertaken. Different sizes and shapes of reefs are tested for a given set of physical conditions, such as flow depth and flow velocity, matching the relevant dimensionless parameters dominating the flow physics. The physical conditions are matched with the natural conditions encountered in the Detroit River. Using Particle Image Velocimetry, Acoustic Doppler Velocimetry and dye studies, flow structures, vorticity and velocity gradients at selected locations have been identified and quantified to allow comparison with field observations and numerical model results. Preliminary results are helping identify the design features to be implemented in the next phase of reef construction. Sponsored by NOAA.

  2. Hard and Soft Constraints in Reliability-Based Design Optimization

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2006-01-01

    This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for a sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology allows (i) to determine if a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds to the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed form expressions are derived, along with conditional sampling. In addition, an l∞ formulation for the efficient manipulation of hyper-rectangular sets is also proposed.
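    A toy version of the two checks can make the distinction concrete. In the sketch below the constraint is linear in the uncertain parameters, so its worst case over a componentwise-bounded box is attained at a vertex and the hard check reduces to vertex enumeration, while the soft check estimates a violation probability by Monte Carlo under an assumed uniform distribution. The constraint function, bounds and design point are invented, and the paper's closed-form bounds and hybrid sampling method are not reproduced here.

```python
# Hypothetical sketch of hard- versus soft-constraint checks for a design x
# with uncertain parameters p bounded componentwise in [p_lo, p_hi]. Because
# this toy constraint is linear in p, its worst case over the box is attained
# at a vertex; the soft check estimates a violation probability by Monte Carlo.
import itertools
import numpy as np

p_lo = np.array([0.8, -0.2, 1.0])
p_hi = np.array([1.2,  0.2, 1.5])

def g(x, p):
    # Design requirement g(x, p) <= 0.
    return p[0] * x[0] ** 2 + p[1] * x[1] - p[2]

def hard_constraint_satisfied(x):
    # Enumerate the 2^3 vertices of the uncertainty box.
    vertices = itertools.product(*zip(p_lo, p_hi))
    return max(g(x, np.array(v)) for v in vertices) <= 0.0

def soft_violation_probability(x, n=100_000, seed=5):
    P = np.random.default_rng(seed).uniform(p_lo, p_hi, size=(n, 3))
    return float(np.mean(g(x, P.T) > 0.0))      # vectorized over all samples

x = np.array([0.9, 0.5])
print("hard constraint satisfied for all p:", hard_constraint_satisfied(x))
print("estimated probability of violation :", soft_violation_probability(x))
```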

  3. Application of numerical optimization to rotor aerodynamic design

    NASA Technical Reports Server (NTRS)

    Pleasants, W. A., III; Wiggins, T. J.

    1984-01-01

    Based on initial results obtained from the performance optimization code, a number of observations can be made regarding the utility of optimization codes in supporting the design of rotors for improved performance. (1) The primary objective of improving the productivity and responsiveness of current design methods can be met. (2) The use of optimization allows the designer to consider a wider range of design variables in a greatly compressed time period. (3) Optimization requires the user to carefully define his problem to avoid unproductive use of computer resources. (4) Optimization will increase the burden on the analyst to validate designs and to improve the accuracy of analysis methods. (5) Direct calculation of finite difference derivatives by the optimizer was not prohibitive for this application but was expensive. Approximate analysis in some form would be considered to improve program response time. (6) Program development is not complete and will continue to evolve to integrate new analysis methods, design problems, and alternate optimizer options.

  4. Ecohydrology frameworks for green infrastructure design and ecosystem service provision

    NASA Astrophysics Data System (ADS)

    Pavao-Zuckerman, M.; Knerl, A.; Barron-Gafford, G.

    2014-12-01

    Urbanization is a dominant form of landscape change that affects the structure and function of ecosystems and alters control points in biogeochemical and hydrologic cycles. Green infrastructure (GI) has been proposed as a solution to many urban environmental challenges and may be a way to manage biogeochemical control points. Despite this promise, there has been relatively limited empirical focus to evaluate the efficacy of GI, relationships between design and function, and the ability of GI to provide ecosystem services in cities. This work has been driven by the goals of adapting GI approaches to dryland cities and of harvesting rain and storm water to provide ecosystem services related to storm water management and urban heat island mitigation, as well as other co-benefits. We will present a modification of ecohydrologic theory for guiding the design and function of green infrastructure for dryland systems that highlights how GI functions in the context of the Trigger-Transfer-Reserve-Pulse (TTRP) dynamic framework. Here we also apply this TTRP framework to observations of established street-scape green infrastructure in Tucson, AZ, and an experimental installation of green infrastructure basins on the campus of Biosphere 2 (Oracle, AZ) where we have been measuring plant performance and soil biogeochemical functions. We found variable sensitivity of microbial activity, soil respiration, N-mineralization, photosynthesis and respiration that was mediated both by elements of basin design (soil texture and composition, choice of surface mulches) and by antecedent precipitation inputs and soil moisture conditions. The adapted TTRP framework and field studies suggest that there are strong connections between design and function that have implications for stormwater management and ecosystem service provision in dryland cities.

  5. Design and optimization of a brachytherapy robot

    NASA Astrophysics Data System (ADS)

    Meltsner, Michael A.

    Trans-rectal ultrasound guided (TRUS) low dose rate (LDR) interstitial brachytherapy has become a popular procedure for the treatment of prostate cancer, the most common type of non-skin cancer among men. The current TRUS technique of LDR implantation may result in less than ideal coverage of the tumor with increased risk of negative response such as rectal toxicity and urinary retention. This technique is limited by the skill of the physician performing the implant, the accuracy of needle localization, and the inherent weaknesses of the procedure itself. The treatment may require 100 or more sources and 25 needles, compounding the inaccuracy of the needle localization procedure. A robot designed for prostate brachytherapy may increase the accuracy of needle placement while minimizing the effect of physician technique in the TRUS procedure. Furthermore, a robot may improve associated toxicities by utilizing angled insertions and freeing implantations from constraints applied by the 0.5 cm-spaced template used in the TRUS method. Within our group, Lin et al. have designed a new type of LDR source. The "directional" source is a seed designed to be partially shielded. Thus, a directional, or anisotropic, source does not emit radiation in all directions. The source can be oriented to irradiate cancerous tissues while sparing normal ones. This type of source necessitates a new, highly accurate method for localization in 6 degrees of freedom. A robot is the best way to accomplish this task accurately. The following presentation of work describes the invention and optimization of a new prostate brachytherapy robot that fulfills these goals. Furthermore, some research has been dedicated to the use of the robot to perform needle insertion tasks (brachytherapy, biopsy, RF ablation, etc.) in nearly any other soft tissue in the body. This can be accomplished with the robot combined with automatic, magnetic tracking.

  6. Cat swarm optimization based evolutionary framework for multi document summarization

    NASA Astrophysics Data System (ADS)

    Rautray, Rasmita; Balabantaray, Rakesh Chandra

    2017-07-01

    Today, the World Wide Web has brought us an enormous quantity of on-line information. As a result, extracting relevant information from massive data has become a challenging issue. In the recent past, text summarization has been recognized as one solution for extracting useful information from vast numbers of documents. Based on the number of documents considered for summarization, the task is categorized as single document or multi document summarization. Compared with single document summarization, multi document summarization is more challenging, since an accurate summary must be found across multiple documents. Hence, in this study a novel Cat Swarm Optimization (CSO) based multi document summarizer is proposed to address the problem of multi document summarization. The proposed CSO based model is also compared with two other nature inspired summarizers, a Harmony Search (HS) based summarizer and a Particle Swarm Optimization (PSO) based summarizer. With respect to the benchmark Document Understanding Conference (DUC) datasets, the performance of all algorithms is compared in terms of different evaluation metrics such as ROUGE score, F score, sensitivity, positive predicate value, summary accuracy, inter sentence similarity and readability metric, to validate non-redundancy, cohesiveness and readability of the summary respectively. The experimental analysis clearly reveals that the proposed approach outperforms the other summarizers included in the study.

  7. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
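    The sketch below conveys the general idea on a toy terminating simulation: a day of a multi-server queue is replicated many times, a chance constraint bounds the estimated probability that waiting time exceeds a limit, and the smallest feasible resource level is selected. The queue model, rates, thresholds and replication counts are all invented, and the statistical treatment is deliberately simpler than the chance-constrained formulation described above.

```python
# Hypothetical sketch: choose the smallest resource level (number of servers)
# for which a chance constraint, estimated from independent replications of a
# terminating queue simulation, is satisfied. All rates and limits are invented.
import heapq
import numpy as np

rng = np.random.default_rng(6)

def exceedance_fraction(c, horizon=480.0, lam=1.0, mu=0.25, limit=15.0):
    """One terminating replication of a c-server queue over `horizon` minutes;
    returns the fraction of customers who waited longer than `limit` minutes."""
    free_at = [0.0] * c                      # next time each server becomes free
    heapq.heapify(free_at)
    t, waits = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam)      # next customer arrival
        if t > horizon:
            break
        start = max(t, free_at[0])           # earliest available server
        waits.append(start - t)
        heapq.heapreplace(free_at, start + rng.exponential(1.0 / mu))
    return np.mean([w > limit for w in waits]) if waits else 0.0

def chance_constraint_met(c, alpha=0.05, reps=200):
    # Point estimate of P(wait > limit) as the mean exceedance over replications.
    est = float(np.mean([exceedance_fraction(c) for _ in range(reps)]))
    return est <= alpha, est

for c in range(1, 12):                       # increase servers until feasible
    ok, est = chance_constraint_met(c)
    print(f"servers={c:2d}  estimated exceedance={est:.3f}  feasible={ok}")
    if ok:
        break
```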

  8. Utilizing object-oriented design to build advanced optimization strategies with generic implementation

    SciTech Connect

    Eldred, M.S.; Hart, W.E.; Bohnhoff, W.J.; Romero, V.J.; Hutchinson, S.A.; Salinger, A.G.

    1996-08-01

    The benefits of applying optimization to computational models are well known, but the range of their widespread application to date has been limited. This effort attempts to extend the disciplinary areas to which optimization algorithms may be readily applied through the development and application of advanced optimization strategies capable of handling the computational difficulties associated with complex simulation codes. Towards this goal, a flexible software framework is under continued development for the application of optimization techniques to broad classes of engineering applications, including those with high computational expense and nonsmooth, nonconvex design space features. Object-oriented software design with C++ has been employed as a tool in providing a flexible, extensible, and robust multidisciplinary toolkit for use with computationally intensive simulations. In this paper, demonstrations of advanced optimization strategies using the software are presented in the hybridization and parallel processing research areas. Performance of the advanced strategies is compared with a benchmark nonlinear programming optimization.
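    The object-oriented decoupling described above can be caricatured in a few lines: an abstract optimizer strategy is written against an abstract simulation interface, so either side can be swapped without touching the other. The sketch is in Python (the framework itself is C++), and every class, function and objective in it is invented for illustration.

```python
# Hypothetical sketch of the object-oriented decoupling: optimization
# strategies and simulation codes meet only through abstract interfaces, so
# either side can be replaced independently.
from abc import ABC, abstractmethod
import random

class Simulation(ABC):
    @abstractmethod
    def evaluate(self, x):
        """Run the (possibly expensive) analysis code and return the objective."""

class Optimizer(ABC):
    def __init__(self, simulation: Simulation):
        self.simulation = simulation

    @abstractmethod
    def minimize(self, x0):
        """Return (best_x, best_f) found by this strategy."""

class RandomLocalSearch(Optimizer):
    """A deliberately simple strategy; a hybrid or parallel strategy would
    subclass Optimizer in exactly the same way."""
    def minimize(self, x0, iterations=500, step=0.1):
        best_x, best_f = list(x0), self.simulation.evaluate(x0)
        for _ in range(iterations):
            cand = [xi + random.uniform(-step, step) for xi in best_x]
            f = self.simulation.evaluate(cand)
            if f < best_f:
                best_x, best_f = cand, f
        return best_x, best_f

class ToyNonsmoothModel(Simulation):
    def evaluate(self, x):
        # Nonsmooth, nonconvex-looking stand-in for an engineering simulation.
        return abs(x[0] - 1.5) + (x[1] - 0.5) ** 2

random.seed(0)
best_x, best_f = RandomLocalSearch(ToyNonsmoothModel()).minimize([0.0, 0.0])
print("best design:", [round(v, 3) for v in best_x], "objective:", round(best_f, 4))
```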

  9. Space tourism optimized reusable spaceplane design

    SciTech Connect

    Penn, J.P.; Lindley, C.A.

    1997-01-01

    Market surveys suggest that a viable space tourism industry will require flight rates about two orders of magnitude higher than those required for conventional spacelift. Although enabling round-trip cost goals for a viable space tourism business are about $240 per pound ($529/kg), or $72,000 per passenger round-trip, goals should be about $50 per pound ($110/kg) or approximately $15,000 for a typical passenger and baggage. The lower price will probably open space tourism to the general population. Vehicle reliabilities must approach those of commercial aircraft as closely as possible. This paper addresses the development of spaceplanes optimized for the ultra-high flight rate and high reliability demands of the space tourism mission. It addresses the fundamental operability, reliability, and cost drivers needed to satisfy this mission need. Figures of merit similar to those used to evaluate the economic viability of conventional commercial aircraft are developed, including items such as payload/vehicle dry weight, turnaround time, propellant cost per passenger, and insurance and depreciation costs, which show that infrastructure can be developed for a viable space tourism industry. A reference spaceplane design optimized for space tourism is described. Subsystem allocations for reliability, operability, and costs are made and a route to developing such a capability is discussed. The vehicle's ability to also satisfy the traditional spacelift market is shown. © 1997 American Institute of Physics.

  10. A Framework for Designing Scaffolds That Improve Motivation and Cognition

    PubMed Central

    Belland, Brian R.; Kim, ChanMin; Hannafin, Michael J.

    2013-01-01

    A problematic, yet common, assumption among educational researchers is that when teachers provide authentic, problem-based experiences, students will automatically be engaged. Evidence indicates that this is often not the case. In this article, we discuss (a) problems with ignoring motivation in the design of learning environments, (b) problem-based learning and scaffolding as one way to help, (c) how scaffolding has strayed from what was originally equal parts motivational and cognitive support, and (d) a conceptual framework for the design of scaffolds that can enhance motivation as well as cognitive outcomes. We propose guidelines for the design of computer-based scaffolds to promote motivation and engagement while students are solving authentic problems. Remaining questions and suggestions for future research are then discussed. PMID:24273351

  11. Seed Design Framework for Mapping SOLiD Reads

    NASA Astrophysics Data System (ADS)

    Noé, Laurent; Gîrdea, Marta; Kucherov, Gregory

    The advent of high-throughput sequencing technologies constituted a major advance in genomic studies, offering new prospects in a wide range of applications. We propose a rigorous and flexible algorithmic solution to mapping SOLiD color-space reads to a reference genome. The solution relies, on the one hand, on an advanced method of seed design that uses a faithful probabilistic model of read matches and, on the other hand, on a novel seeding principle especially adapted to read mapping. Our method can handle both lossy and lossless frameworks and is able to distinguish, at the level of seed design, between SNPs and reading errors. We illustrate our approach with several seed designs and demonstrate their efficiency.
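    The seeding idea can be illustrated with a minimal sketch that works on plain nucleotide sequences (the paper designs its seeds probabilistically and operates in SOLiD color space, neither of which is reproduced here): reference positions are indexed by the characters falling under the '1' positions of a spaced-seed pattern, so a read whose mismatch lands on a don't-care ('0') position still retrieves the correct candidate location. The pattern and sequences below are arbitrary.

```python
# Hypothetical sketch of spaced-seed candidate lookup, the idea underlying the
# seed designs discussed above (shown on plain nucleotides for simplicity).
from collections import defaultdict

SEED = "1101101"          # '1' = must match, '0' = don't-care position

def seed_key(s, pattern=SEED):
    return "".join(c for c, p in zip(s, pattern) if p == "1")

def index_reference(ref, pattern=SEED):
    index = defaultdict(list)
    for i in range(len(ref) - len(pattern) + 1):
        index[seed_key(ref[i:i + len(pattern)], pattern)].append(i)
    return index

def candidate_hits(read, index, pattern=SEED):
    hits = set()
    for j in range(len(read) - len(pattern) + 1):
        for pos in index.get(seed_key(read[j:j + len(pattern)], pattern), []):
            hits.add(pos - j)               # implied alignment start in the reference
    return sorted(h for h in hits if h >= 0)

reference = "ACGTACGGACTTACGTAGCACGTT"
read = "GGACTTACGAAG"                       # read with a mismatch vs. the reference
idx = index_reference(reference)
print("candidate alignment positions:", candidate_hits(read, idx))
```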

  12. A Framework for Designing Scaffolds That Improve Motivation and Cognition.

    PubMed

    Belland, Brian R; Kim, Chanmin; Hannafin, Michael J

    2013-10-01

    A problematic, yet common, assumption among educational researchers is that when teachers provide authentic, problem-based experiences, students will automatically be engaged. Evidence indicates that this is often not the case. In this article, we discuss (a) problems with ignoring motivation in the design of learning environments, (b) problem-based learning and scaffolding as one way to help, (c) how scaffolding has strayed from what was originally equal parts motivational and cognitive support, and (d) a conceptual framework for the design of scaffolds that can enhance motivation as well as cognitive outcomes. We propose guidelines for the design of computer-based scaffolds to promote motivation and engagement while students are solving authentic problems. Remaining questions and suggestions for future research are then discussed.

  13. Design of Optimal Cyclers Using Solar Sails

    DTIC Science & Technology

    2002-12-01

    necessary (but not sufficient) conditions for optimality in these cases. Moreover, the optimal control solution is the one where H is minimized... problems, H = −1 for all time. The first- and second-order necessary (but not sufficient) conditions for optimality using the Hamiltonian are written as... optimization and the initial conditions, the path of the sail could be propagated by means of a numeric ordinary differential equation solver on the non

  14. Robust broadband nanopositioning: fundamental trade-offs, analysis, and design in a two-degree-of-freedom control framework

    NASA Astrophysics Data System (ADS)

    Lee, Chibum; Salapaka, Srinivasa M.

    2009-01-01

    This paper studies and analyses fundamental trade-offs between positioning resolution, tracking bandwidth, and robustness to modeling uncertainties in two-degree-of-freedom (2DOF) control designs for nanopositioning systems. The analysis of these systems is done in an optimal control setting with various architectural constraints imposed on the 2DOF framework. In terms of these trade-offs, our analysis shows that the primary role of feedback is providing robustness to the closed-loop device, whereas the feedforward component is mainly effective in overcoming fundamental algebraic constraints that limit feedback-only designs. This paper presents (1) an optimal prefilter model matching design for a system with an existing feedback controller, (2) a simultaneous feedforward and feedback control design in an optimal mixed sensitivity framework, and (3) a 2DOF optimal robust model matching design. The experimental results on applying these controllers show a significant improvement, as high as a 330% increase in bandwidth for similar robustness and resolution over optimal feedback-only designs. Other performance objectives can be improved similarly. We demonstrate that the 2DOF design achieves performance specifications that are analytically impossible for feedback-only designs.

  15. Design Time Optimization for Hardware Watermarking Protection of HDL Designs

    PubMed Central

    Castillo, E.; Morales, D. P.; García, A.; Parrilla, L.; Todorovich, E.; Meyer-Baese, U.

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time. PMID:25861681
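
    The simulated-annealing variant credited above searches for signature hosting solutions with low area and delay penalties. The Python sketch below is a generic annealing loop of that kind, not the IPP@HDL tool itself; the cost function, neighbourhood move, cooling schedule, and the idea of assigning signature bits to candidate host cells are all illustrative assumptions.

      import math
      import random

      def simulated_annealing(cost, initial, neighbor, t0=1.0, t_min=1e-3,
                              alpha=0.95, iters_per_t=50):
          """Generic simulated annealing: `cost` scores a candidate placement,
          `neighbor` proposes a small random modification of it."""
          current, current_cost = initial, cost(initial)
          best, best_cost = current, current_cost
          t = t0
          while t > t_min:
              for _ in range(iters_per_t):
                  candidate = neighbor(current)
                  delta = cost(candidate) - current_cost
                  # Accept improvements always; accept regressions with Boltzmann probability.
                  if delta < 0 or random.random() < math.exp(-delta / t):
                      current, current_cost = candidate, current_cost + delta
                      if current_cost < best_cost:
                          best, best_cost = current, current_cost
              t *= alpha  # geometric cooling schedule
          return best, best_cost

      # Hypothetical usage: place 128 signature bits into 512 candidate host cells,
      # penalizing a made-up per-cell area estimate for each assignment.
      area = [random.random() for _ in range(512)]
      cost = lambda placement: sum(area[c] for c in placement)
      def neighbor(placement):
          p = list(placement)
          p[random.randrange(len(p))] = random.randrange(512)
          return p
      start = [random.randrange(512) for _ in range(128)]
      print(simulated_annealing(cost, start, neighbor)[1])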

  16. Design time optimization for hardware watermarking protection of HDL designs.

    PubMed

    Castillo, E; Morales, D P; García, A; Parrilla, L; Todorovich, E; Meyer-Baese, U

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time.

  17. An Efficient Radial Basis Function Mesh Deformation Scheme within an Adjoint-Based Aerodynamic Optimization Framework

    NASA Astrophysics Data System (ADS)

    Poirier, Vincent

    Mesh deformation schemes play an important role in numerical aerodynamic optimization. As the aerodynamic shape changes, the computational mesh must adapt to conform to the deformed geometry. In this work, an extension to an existing fast and robust Radial Basis Function (RBF) mesh movement scheme is presented. Using a reduced set of surface points to define the mesh deformation increases the efficiency of the RBF method; however, at the cost of introducing errors into the parameterization by not recovering the exact displacement of all surface points. A secondary mesh movement is implemented, within an adjoint-based optimization framework, to eliminate these errors. The proposed scheme is tested within a 3D Euler flow by reducing the pressure drag while maintaining lift of a wing-body configured Boeing-747 and an Onera-M6 wing. As well, an inverse pressure design is executed on the Onera-M6 wing and an inverse span loading case is presented for a wing-body configured DLR-F6 aircraft.
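
    The core of an RBF mesh movement scheme is an interpolation of known surface displacements onto all volume mesh nodes. The numpy sketch below shows only that basic step under assumed choices (a Gaussian kernel, random point sets); it omits the reduced-point selection, the secondary corrective movement, and the adjoint coupling described in the abstract.

      import numpy as np

      def rbf_deform(ctrl_pts, ctrl_disp, mesh_pts, eps=1.0):
          """Propagate known displacements at control points to all mesh nodes
          using a global radial basis function interpolant."""
          def kernel(r):
              # Gaussian RBF; Wendland or thin-plate kernels are common alternatives.
              return np.exp(-(eps * r) ** 2)

          # Pairwise distances between control points (n_c x n_c).
          d_cc = np.linalg.norm(ctrl_pts[:, None, :] - ctrl_pts[None, :, :], axis=-1)
          # Solve for interpolation weights, one column per displacement component.
          weights = np.linalg.solve(kernel(d_cc), ctrl_disp)

          # Distances from every mesh node to every control point (n_m x n_c).
          d_mc = np.linalg.norm(mesh_pts[:, None, :] - ctrl_pts[None, :, :], axis=-1)
          return kernel(d_mc) @ weights  # displacement of every mesh node

      # Illustrative usage: 20 surface control points driving 1000 volume nodes in 3D.
      ctrl_pts = np.random.rand(20, 3)
      ctrl_disp = 0.05 * np.random.rand(20, 3)
      mesh_pts = np.random.rand(1000, 3)
      new_pts = mesh_pts + rbf_deform(ctrl_pts, ctrl_disp, mesh_pts)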

  18. Constrained Aeroacoustic Shape Optimization Using the Surrogate Management Framework

    NASA Technical Reports Server (NTRS)

    Marsden, Alison L.; Wang, Meng; Dennis, John E., Jr.

    2003-01-01

    Reduction of noise generated by turbulent flow past the trailing-edge of a lifting surface is a challenge in many aeronautical and naval applications. Numerical predictions of trailing-edge noise necessitate the use of advanced simulation techniques such as large-eddy simulation (LES) in order to capture a wide range of turbulence scales which are the source of broadband noise. Aeroacoustic calculations of the flow over a model airfoil trailing edge using LES and aeroacoustic theory have been presented in Wang and Moin and were shown to agree favorably with experiments. The goal of the present work is to apply shape optimization to the trailing edge flow previously studied, in order to control aerodynamic noise.

  19. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    NASA Astrophysics Data System (ADS)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
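
    A simple way to illustrate the entropy-based selection idea is greedy retention of the station subset with the largest joint Gaussian entropy, since for a multivariate normal the entropy depends only on the log-determinant of the covariance. The sketch below uses a synthetic spatial covariance and a single criterion, which is a simplification of the hierarchical Bayesian, multi-criteria design described in the abstract.

      import numpy as np

      def gaussian_entropy(cov):
          """Differential entropy of a multivariate normal with covariance `cov`."""
          k = cov.shape[0]
          sign, logdet = np.linalg.slogdet(cov)
          return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

      def greedy_station_selection(cov, n_keep):
          """Greedily retain the stations whose joint entropy (information) is largest."""
          selected, remaining = [], list(range(cov.shape[0]))
          for _ in range(n_keep):
              best = max(remaining, key=lambda j: gaussian_entropy(
                  cov[np.ix_(selected + [j], selected + [j])]))
              selected.append(best)
              remaining.remove(best)
          return selected

      # Synthetic spatial covariance: nearby stations are strongly correlated.
      coords = np.random.rand(15, 2)
      dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
      cov = np.exp(-dist / 0.3)
      print(greedy_station_selection(cov, 5))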

  20. Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?

    NASA Technical Reports Server (NTRS)

    Moore, Greg; Chainyk, Mike; Schiermeier, John

    2004-01-01

    The viewgraph presentation examines optimal design for precision, large aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments including finite element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solutions and process, and optical modes generation.

  1. Screen Design Guidelines for Motivation in Interactive Multimedia Instruction: A Survey and Framework for Designers.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    1999-01-01

    Identifies guidelines from the literature relating to screen design and design of interactive instructional materials. Describes two types of guidelines--those aimed at enhancing motivation and those aimed at preventing loss of motivation--for typography, graphics, color, and animation and audio. Proposes a framework for considering motivation in…

  2. Screen Design Guidelines for Motivation in Interactive Multimedia Instruction: A Survey and Framework for Designers.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    1999-01-01

    Identifies guidelines from the literature relating to screen design and design of interactive instructional materials. Describes two types of guidelines--those aimed at enhancing motivation and those aimed at preventing loss of motivation--for typography, graphics, color, and animation and audio. Proposes a framework for considering motivation in…

  3. Interactive computer program for optimal designs of longitudinal cohort studies.

    PubMed

    Tekle, Fetene B; Tan, Frans E S; Berger, Martijn P F

    2009-05-01

    Many large scale longitudinal cohort studies have been carried out or are ongoing in different fields of science. Such studies need careful planning to obtain the desired quality of results with the available resources. In the past, a number of studies have been performed on optimal designs for longitudinal studies. However, there was no computer program yet available to help researchers plan their longitudinal cohort design in an optimal way. A new interactive computer program for the optimization of designs of longitudinal cohort studies is therefore presented. The computer program helps users to identify the optimal cohort design with an optimal number of repeated measurements per subject and an optimal allocation of time points within a given study period. Further, users can compute the loss in relative efficiencies of any other alternative design compared to the optimal one. The computer program is described and illustrated using a practical example.

  4. Review of design optimization methods for turbomachinery aerodynamics

    NASA Astrophysics Data System (ADS)

    Li, Zhihui; Zheng, Xinqian

    2017-08-01

    In today's competitive environment, new turbomachinery designs need to be not only more efficient, quieter, and 'greener' but also need to be developed on much shorter time scales and at lower costs. A number of advanced optimization strategies have been developed to achieve these requirements. This paper reviews recent progress in turbomachinery design optimization to solve real-world aerodynamic problems, especially for compressors and turbines. This review covers the following topics that are important for optimizing turbomachinery designs: (1) optimization methods, (2) stochastic optimization combined with blade parameterization methods and the design of experiment methods, (3) gradient-based optimization methods for compressors and turbines, and (4) data mining techniques for Pareto fronts. We also present our own insights regarding the current research trends and the future optimization of turbomachinery designs.

  5. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.

  6. A Robust Control Design Framework for Substructure Models

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    1994-01-01

    A framework for designing control systems directly from substructure models and uncertainties is proposed. The technique is based on combining a set of substructure robust control problems by an interface stiffness matrix which appears as a constant gain feedback. Variations of uncertainties in the interface stiffness are treated as a parametric uncertainty. It is shown that multivariable robust control can be applied to generate centralized or decentralized controllers that guarantee performance with respect to uncertainties in the interface stiffness, reduced component modes and external disturbances. The technique is particularly suited for large, complex, and weakly coupled flexible structures.

  7. Multidisciplinary design optimization - An emerging new engineering discipline

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1993-01-01

    A definition of multidisciplinary design optimization (MDO) is introduced, and the functionality and relationships of the MDO conceptual components are examined. The latter include design-oriented analysis, approximation concepts, mathematical system modeling, design space search, an optimization procedure, and a human interface.

  8. A Robust Kalman Framework with Resampling and Optimal Smoothing

    PubMed Central

    Kautz, Thomas; Eskofier, Bjoern M.

    2015-01-01

    The Kalman filter (KF) is an extremely powerful and versatile tool for signal processing that has been applied extensively in various fields. We introduce a novel Kalman-based analysis procedure that encompasses robustness towards outliers, Kalman smoothing and real-time conversion from non-uniformly sampled inputs to a constant output rate. These features have been mostly treated independently, so that not all of their benefits could be exploited at the same time. Here, we present a coherent analysis procedure that combines the aforementioned features and their benefits. To facilitate utilization of the proposed methodology and to ensure optimal performance, we also introduce a procedure to calculate all necessary parameters. Thereby, we substantially expand the versatility of one of the most widely-used filtering approaches, taking full advantage of its most prevalent extensions. The applicability and superior performance of the proposed methods are demonstrated using simulated and real data. The possible areas of application for the presented analysis procedure range from movement analysis and medical imaging to brain-computer interfaces, robot navigation, and meteorological studies. PMID:25734647
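
    One ingredient mentioned above, handling non-uniformly sampled inputs, can be illustrated with a constant-velocity Kalman filter whose transition and process-noise matrices are rebuilt from each irregular time step. The sketch below shows only that ingredient; outlier robustness, optimal smoothing, and the paper's parameter-calculation procedure are omitted, and the noise levels are assumed values.

      import numpy as np

      def kalman_nonuniform(times, measurements, q=1e-2, r=1e-1):
          """1D constant-velocity Kalman filter over irregularly spaced samples."""
          x = np.array([measurements[0], 0.0])        # state: position, velocity
          P = np.eye(2)
          H = np.array([[1.0, 0.0]])                  # only position is observed
          estimates = [x[0]]
          for k in range(1, len(times)):
              dt = times[k] - times[k - 1]            # irregular step between samples
              F = np.array([[1.0, dt], [0.0, 1.0]])
              Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
              x, P = F @ x, F @ P @ F.T + Q           # predict
              y = measurements[k] - H @ x             # innovation
              S = H @ P @ H.T + r
              K = P @ H.T / S                         # Kalman gain (S is scalar here)
              x = x + (K * y).ravel()
              P = (np.eye(2) - K @ H) @ P
              estimates.append(x[0])
          return np.array(estimates)

      # Irregularly sampled noisy ramp signal.
      t = np.sort(np.random.uniform(0, 10, 50))
      z = 0.5 * t + 0.1 * np.random.randn(50)
      print(kalman_nonuniform(t, z)[:5])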

  9. A robust optimization methodology for preliminary aircraft design

    NASA Astrophysics Data System (ADS)

    Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.

    2016-05-01

    This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.

  10. A continuum mechanics-based framework for optimizing boundary and finite element meshes associated with underground excavations-framework

    NASA Astrophysics Data System (ADS)

    Zsáki, Attila M.; Curran, John H.

    2005-11-01

    Many field problems, from stress analysis and heat transfer to contaminant transport, deal with disturbances in a continuum caused by a source (defined by its discrete geometry) and a region of interest (where a solution is sought). Depending on the location of regions of interest in relation to the sources, the level of geometric detail necessary to represent the sources in a model can vary considerably. A practical application of stress analysis in mining is the evaluation of the effects of continuous excavation on the states of stress around mine openings. Labour intensive model preparation and lengthy computation coupled with the interpretation of analysis results can have considerable impact on the successful operation of an underground mine, where stope failures can cost tens of millions of dollars and possibly lead to closure of the mine. A framework is proposed based on continuum mechanics principles to automatically optimize the level of geometric detail required for an analysis by simplifying the model geometry using expanded and modified algorithms that originated in computer graphics. This reduction in model size directly translates to savings in computational time. The results obtained from an optimized model have accuracy comparable to the uncertainty in input data (e.g. rock mass properties, geology, etc.). This first paper defines the optimization framework, while a companion paper investigates its efficiency and application to practical mining and excavation-related problems.

  11. Integrated Layout and Support Structure Optimization for Offshore Wind Farm Design

    NASA Astrophysics Data System (ADS)

    Ashuri, T.; Ponnurangam, C.; Zhang, J.; Rotea, M.

    2016-09-01

    This paper develops a multidisciplinary design optimization framework for the integrated design optimization of offshore wind farm layout and support structure. A computational model is developed to characterize the physics of the wind farm wake, aerodynamic and hydrodynamic loads, the response of the support structure to these loads, soil-structure interaction, as well as different cost elements. Levelized cost of energy is introduced as the objective function. The design constraints are the farm's external boundary and the support structure's buckling, first modal frequency, fatigue damage, and ultimate stresses. To evaluate the effectiveness of the proposed approach, four optimization scenarios are considered: a feasible baseline design, optimization of layout only, optimization of support structure only, and integrated design of the layout and support structure. Compared to the baseline design, the optimization results show that the isolated support structure design reduces the levelized cost of energy by 0.6%, the isolated layout design reduces the levelized cost of energy by 2.0%, and the integrated layout and support structure design reduces the levelized cost of energy by 2.6%.

  12. Rainfall and streamflow sensor network design: a review of applications, classification, and a proposed framework

    NASA Astrophysics Data System (ADS)

    Chacon-Hurtado, Juan C.; Alfonso, Leonardo; Solomatine, Dimitri P.

    2017-06-01

    Sensors and sensor networks play an important role in decision-making related to water quality, operational streamflow forecasting, flood early warning systems, and other areas. In this paper we review a number of existing applications and analyse a variety of evaluation and design procedures for sensor networks with respect to various criteria. Most of the existing approaches focus on maximising the observability and information content of a variable of interest. In the context of hydrological modelling, only a few studies use the performance of the hydrological simulation, in terms of output discharge, as a design criterion. In addition to the review, we propose a framework for classifying the existing design methods, and a generalised procedure for optimal network design in the context of rainfall-runoff hydrological modelling.

  13. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

    A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.

  14. RTM And VARTM Design, Optimization, And Control With SLIC

    DTIC Science & Technology

    2003-07-02

    Report documentation fragments only (UD-CCM, 2 July 2003; Kuang-Ting Hsiao; ONR workshop presentation): RTM and VARTM design, optimization, and control with SLIC (Simulation-based Liquid Injection Control), applying artificial intelligence to optimized design and sensing for RTM/VARTM composite processing.

  15. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

    A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.

  16. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

    A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.

  17. Microgravity isolation system design: A modern control synthesis framework

    NASA Technical Reports Server (NTRS)

    Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.

    1994-01-01

    Manned orbiters will require active vibration isolation for acceleration-sensitive microgravity science experiments. Since umbilicals are highly desirable or even indispensable for many experiments, and since their presence greatly affects the complexity of the isolation problem, they should be considered in control synthesis. In this paper a general framework is presented for applying extended H2 synthesis methods to the three-dimensional microgravity isolation problem. The methodology integrates control and state frequency weighting and input and output disturbance accommodation techniques into the basic H2 synthesis approach. The various system models needed for design and analysis are also presented. The paper concludes with a discussion of a general design philosophy for the microgravity vibration isolation problem.

  18. Microgravity isolation system design: A modern control synthesis framework

    NASA Technical Reports Server (NTRS)

    Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.

    1994-01-01

    Manned orbiters will require active vibration isolation for acceleration-sensitive microgravity science experiments. Since umbilicals are highly desirable or even indispensable for many experiments, and since their presence greatly affects the complexity of the isolation problem, they should be considered in control synthesis. A general framework is presented for applying extended H2 synthesis methods to the three-dimensional microgravity isolation problem. The methodology integrates control and state frequency weighting and input and output disturbance accommodation techniques into the basic H2 synthesis approach. The various system models needed for design and analysis are also presented. The paper concludes with a discussion of a general design philosophy for the microgravity vibration isolation problem.

  19. A sequential linear optimization approach for controller design

    NASA Technical Reports Server (NTRS)

    Horta, L. G.; Juang, J.-N.; Junkins, J. L.

    1985-01-01

    A linear optimization approach with a simple real arithmetic algorithm is presented for reliable controller design and vibration suppression of flexible structures. Using first order sensitivity of the system eigenvalues with respect to the design parameters in conjunction with a continuation procedure, the method converts a nonlinear optimization problem into a maximization problem with linear inequality constraints. The method of linear programming is then applied to solve the converted linear optimization problem. The general efficiency of the linear programming approach allows the method to handle structural optimization problems with a large number of inequality constraints on the design vector. The method is demonstrated using a truss beam finite element model for the optimal sizing and placement of active/passive-structural members for damping augmentation. Results using both the sequential linear optimization approach and nonlinear optimization are presented and compared. The insensitivity to initial conditions of the linear optimization approach is also demonstrated.

  20. Microgravity isolation system design: A modern control analysis framework

    NASA Technical Reports Server (NTRS)

    Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.

    1994-01-01

    Many acceleration-sensitive, microgravity science experiments will require active vibration isolation from the manned orbiters on which they will be mounted. The isolation problem, especially in the case of a tethered payload, is a complex three-dimensional one that is best suited to modern-control design methods. These methods, although more powerful than their classical counterparts, can nonetheless go only so far in meeting the design requirements for practical systems. Once a tentative controller design is available, it must still be evaluated to determine whether or not it is fully acceptable, and to compare it with other possible design candidates. Realistically, such evaluation will be an inherent part of a necessary iterative design process. In this paper, an approach is presented for applying complex mu-analysis methods to a closed-loop vibration isolation system (experiment plus controller). An analysis framework is presented for evaluating nominal stability, nominal performance, robust stability, and robust performance of active microgravity isolation systems, with emphasis on the effective use of mu-analysis methods.

  1. A Five-Level Design Framework for Bicluster Visualizations.

    PubMed

    Sun, Maoyuan; North, Chris; Ramakrishnan, Naren

    2014-12-01

    Analysts often need to explore and identify coordinated relationships (e.g., four people who visited the same five cities on the same set of days) within some large datasets for sensemaking. Biclusters provide a potential solution to ease this process, because each computed bicluster bundles individual relationships into coordinated sets. By understanding such computed, structural relations within biclusters, analysts can leverage their domain knowledge and intuition to determine the importance and relevance of the extracted relationships for making hypotheses. However, due to the lack of systematic design guidelines, it is still a challenge to design effective and usable visualizations of biclusters to enhance their perceptibility and interactivity for exploring coordinated relationships. In this paper, we present a five-level design framework for bicluster visualizations, with a survey of the state-of-the-art design considerations and applications that are related or that can be applied to bicluster visualizations. We summarize the pros and cons of these design options for supporting user tasks at each of the five levels. Finally, we discuss future research challenges for bicluster visualizations and their incorporation into visual analytics tools.

  2. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  3. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    NASA Astrophysics Data System (ADS)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and we layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
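
    Stochastic collocation propagates input uncertainty by evaluating the deterministic model at a small set of quadrature nodes and forming weighted statistics of the outputs. The sketch below shows the non-adaptive, one-dimensional version of that idea with Gauss-Hermite nodes for a single Gaussian input; the "simulation" is a stand-in function, not a hemodynamics solver, and the adaptive and optimization-under-uncertainty layers are not shown.

      import numpy as np

      def collocation_moments(model, mean, std, n_nodes=7):
          """Propagate one Gaussian uncertain input through `model` using
          Gauss-Hermite collocation nodes; returns mean and variance of the output."""
          nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)  # probabilists' rule
          weights = weights / np.sqrt(2 * np.pi)       # normalize to a probability measure
          samples = np.array([model(mean + std * z) for z in nodes])
          out_mean = np.sum(weights * samples)
          out_var = np.sum(weights * (samples - out_mean) ** 2)
          return out_mean, out_var

      # Stand-in "simulation": pressure drop rising nonlinearly with an uncertain vessel radius.
      pressure_drop = lambda r: 1.0 / r ** 4
      print(collocation_moments(pressure_drop, mean=1.0, std=0.05))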

  4. Communication Optimizations for a Wireless Distributed Prognostic Framework

    NASA Technical Reports Server (NTRS)

    Saha, Sankalita; Saha, Bhaskar; Goebel, Kai

    2009-01-01

    Distributed architecture for prognostics is an essential step in prognostic research in order to enable feasible real-time system health management. Communication overhead is an important design problem for such systems. In this paper we focus on communication issues faced in the distributed implementation of an important class of algorithms for prognostics - particle filters. In spite of being computation- and memory-intensive, particle filters lend themselves well to distributed implementation except for one significant step - resampling. We propose a new resampling scheme, called parameterized resampling, that attempts to reduce communication between collaborating nodes in a distributed wireless sensor network. Analysis and comparison with relevant resampling schemes is also presented. A battery health management system is used as a target application. Our proposed resampling scheme performs significantly better than other schemes by reducing both the communication message length and the total number of communication messages exchanged, while not compromising prediction accuracy and precision. Future work will explore the effects of the new resampling scheme on the overall computational performance of the whole system, as well as full implementation of the new schemes on the Sun SPOT devices. Exploring different network architectures for efficient communication is an important future research direction as well.
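
    For context, the resampling step that complicates distributed particle filters is shown below in its standard systematic form; the parameterized scheme proposed in the record modifies how, and how much of, this information is exchanged between nodes, which is not reproduced here.

      import numpy as np

      def systematic_resampling(weights, rng=np.random.default_rng()):
          """Standard systematic resampling: returns indices of retained particles."""
          n = len(weights)
          positions = (rng.random() + np.arange(n)) / n      # one stratified draw
          cumulative = np.cumsum(weights)
          cumulative[-1] = 1.0                                # guard against round-off
          return np.searchsorted(cumulative, positions)

      weights = np.random.rand(100)
      weights /= weights.sum()
      print(systematic_resampling(weights)[:10])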

  5. A new optimization framework using genetic algorithm and artificial neural network to reduce uncertainties in petroleum reservoir models

    NASA Astrophysics Data System (ADS)

    Maschio, Célio; José Schiozer, Denis

    2015-01-01

    In this article, a new optimization framework to reduce uncertainties in petroleum reservoir attributes using artificial intelligence techniques (neural network and genetic algorithm) is proposed. Instead of using the deterministic values of the reservoir properties, as in a conventional process, the parameters of the probability density function of each uncertain attribute are set as design variables in an optimization process using a genetic algorithm. The objective function (OF) is based on the misfit of a set of models, sampled from the probability density function, and a symmetry factor (which represents the distribution of curves around the history) is used as weight in the OF. Artificial neural networks are trained to represent the production curves of each well and the proxy models generated are used to evaluate the OF in the optimization process. The proposed method was applied to a reservoir with 16 uncertain attributes and promising results were obtained.

  6. Minimax D-Optimal Designs for Item Response Theory Models.

    ERIC Educational Resources Information Center

    Berger, Martijn P. F.; King, C. Y. Joy; Wong, Weng Kee

    2000-01-01

    Proposed minimax designs for item response theory (IRT) models to overcome the problem of local optimality. Compared minimax designs to sequentially constructed designs for the two parameter logistic model. Results show that minimax designs can be nearly as efficient as sequentially constructed designs. (Author/SLD)

  7. Multilevel Optimization Framework for Hierarchical Stiffened Shells Accelerated by Adaptive Equivalent Strategy

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Tian, Kuo; Zhao, Haixin; Hao, Peng; Zhu, Tianyu; Zhang, Ke; Ma, Yunlong

    2016-09-01

    In order to improve the post-buckling optimization efficiency of hierarchical stiffened shells, a multilevel optimization framework accelerated by an adaptive equivalent strategy is presented in this paper. Firstly, the Numerical-based Smeared Stiffener Method (NSSM) for hierarchical stiffened shells is derived by means of the numerical implementation of asymptotic homogenization (NIAH) method. Based on the NSSM, a reasonable adaptive equivalent strategy for hierarchical stiffened shells is developed from the concept of hierarchy reduction. Its core idea is to self-adaptively decide which hierarchy of the structure should be equivalent according to the critical buckling mode rapidly predicted by NSSM. Compared with the detailed model, the high prediction accuracy and efficiency of the proposed model are highlighted. On the basis of this adaptive equivalent model, a multilevel optimization framework is then established by decomposing the complex entire optimization process into major-stiffener-level and minor-stiffener-level sub-optimizations, during which Fixed Point Iteration (FPI) is employed to accelerate convergence. Finally, illustrative examples of the multilevel framework are carried out to demonstrate its efficiency and effectiveness in searching for the global optimum by contrast with the single-level optimization method. Remarkably, the high efficiency and flexibility of the adaptive equivalent strategy are demonstrated by comparison with the single equivalent strategy.

  8. Multilevel Optimization Framework for Hierarchical Stiffened Shells Accelerated by Adaptive Equivalent Strategy

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Tian, Kuo; Zhao, Haixin; Hao, Peng; Zhu, Tianyu; Zhang, Ke; Ma, Yunlong

    2017-06-01

    In order to improve the post-buckling optimization efficiency of hierarchical stiffened shells, a multilevel optimization framework accelerated by an adaptive equivalent strategy is presented in this paper. Firstly, the Numerical-based Smeared Stiffener Method (NSSM) for hierarchical stiffened shells is derived by means of the numerical implementation of asymptotic homogenization (NIAH) method. Based on the NSSM, a reasonable adaptive equivalent strategy for hierarchical stiffened shells is developed from the concept of hierarchy reduction. Its core idea is to self-adaptively decide which hierarchy of the structure should be equivalent according to the critical buckling mode rapidly predicted by NSSM. Compared with the detailed model, the high prediction accuracy and efficiency of the proposed model are highlighted. On the basis of this adaptive equivalent model, a multilevel optimization framework is then established by decomposing the complex entire optimization process into major-stiffener-level and minor-stiffener-level sub-optimizations, during which Fixed Point Iteration (FPI) is employed to accelerate convergence. Finally, illustrative examples of the multilevel framework are carried out to demonstrate its efficiency and effectiveness in searching for the global optimum by contrast with the single-level optimization method. Remarkably, the high efficiency and flexibility of the adaptive equivalent strategy are demonstrated by comparison with the single equivalent strategy.

  9. Optimal Management and Design of Energy Systems under Atmospheric Uncertainty

    NASA Astrophysics Data System (ADS)

    Anitescu, M.; Constantinescu, E. M.; Zavala, V.

    2010-12-01

    This work addresses the optimal management and design of energy systems, such as the power grid or building systems, under uncertainty in atmospheric conditions. The framework is defined in terms of a mathematical paradigm called stochastic programming: minimization of the expected value of the decision-maker's objective function subject to physical and operational constraints, such as a low blackout probability, that are enforced on each scenario. We report results on testing the framework on the optimal management of power grid systems under high wind penetration scenarios, a problem whose time horizon is on the order of days. We discuss the computational effort of scenario generation, which involves running WRF at the high spatio-temporal resolution dictated by the operational constraints, as well as solving the optimal dispatch problem. We demonstrate that accounting for uncertainty in atmospheric conditions results in blackout prevention, whereas decisions using only the mean forecast do not. We discuss issues in using the framework for planning problems, whose time horizon is several decades, and what requirements such problems would entail from climate simulation systems.

  10. A Modified Particle Swarm Optimization Technique for Finding Optimal Designs for Mixture Models

    PubMed Central

    Wong, Weng Kee; Chen, Ray-Bing; Huang, Chien-Chih; Wang, Weichung

    2015-01-01

    Particle Swarm Optimization (PSO) is a meta-heuristic algorithm that has been shown to be successful in solving a wide variety of real and complicated optimization problems in engineering and computer science. This paper introduces a projection-based PSO technique, named ProjPSO, to efficiently find different types of optimal designs, or nearly optimal designs, for mixture models with and without constraints on the components, and also for related models, like the log contrast models. We also compare the modified PSO performance with Fedorov's algorithm, a popular algorithm used to generate optimal designs, the cocktail algorithm, and the recent algorithm proposed by [1]. PMID:26091237
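
    A bare-bones PSO loop of the kind extended by ProjPSO is sketched below; the projection step that maps particles back onto the mixture simplex, the toy objective, and all swarm parameters are illustrative assumptions rather than the authors' settings.

      import numpy as np

      def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
              project=lambda x: x, rng=np.random.default_rng(0)):
          """Minimal particle swarm minimizer; `project` maps particles back onto
          the feasible set after every move (e.g., the mixture simplex)."""
          x = project(rng.random((n_particles, dim)))
          v = np.zeros_like(x)
          pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
          gbest = pbest[np.argmin(pbest_val)]
          for _ in range(iters):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = project(x + v)
              vals = np.apply_along_axis(objective, 1, x)
              improved = vals < pbest_val
              pbest[improved], pbest_val[improved] = x[improved], vals[improved]
              gbest = pbest[np.argmin(pbest_val)]
          return gbest, pbest_val.min()

      # Illustrative use: three mixture proportions constrained to the simplex.
      simplex = lambda x: np.abs(x) / np.abs(x).sum(axis=-1, keepdims=True)
      obj = lambda q: -(q[0] * q[1] * q[2])     # toy criterion, not a real design criterion
      print(pso(obj, dim=3, project=simplex))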

  11. Optimal design for nonlinear estimation of the hemodynamic response function.

    PubMed

    Maus, Bärbel; van Breukelen, Gerard J P; Goebel, Rainer; Berger, Martijn P F

    2012-06-01

    Subject-specific hemodynamic response functions (HRFs) have been recommended to capture variation in the form of the hemodynamic response between subjects (Aguirre et al. [1998]: Neuroimage 8:360-369). The purpose of this article is to find optimal designs for estimation of subject-specific parameters for the double gamma HRF. As the double gamma function is a nonlinear function of its parameters, optimal design theory for nonlinear models is employed in this article. The double gamma function is linearized by a Taylor approximation and the maximin criterion is used to handle dependency of the D-optimal design on the expansion point of the Taylor approximation. A realistic range of double gamma HRF parameters is used for the expansion point of the Taylor approximation. Furthermore, a genetic algorithm (GA) (Kao et al. [2009]: Neuroimage 44:849-856) is applied to find locally optimal designs for the different expansion points, and the maximin design chosen from the locally optimal designs is compared to maximin designs obtained by m-sequences, blocked designs, designs with constant interstimulus interval (ISI) and random event-related designs. The maximin design obtained by the GA is most efficient. Random event-related designs chosen from several generated designs and m-sequences have a high efficiency, while blocked designs and designs with a constant ISI have a low efficiency compared to the maximin GA design.

  12. A derivative-free multistart framework for an automated noncoplanar beam angle optimization in IMRT.

    PubMed

    Rocha, Humberto; Dias, Joana; Ventura, Tiago; Ferreira, Brígida; Lopes, Maria do Carmo

    2016-10-01

    The inverse planning of an intensity-modulated radiation therapy (IMRT) treatment requires decisions regarding the angles used for radiation incidence, even when arcs are used. The possibility of improving the quality of treatment plans by an optimized selection of the beam angle incidences, known as beam angle optimization (BAO), is seldom exploited in clinical practice. The inclusion of noncoplanar beam incidences in an automated optimization routine is even more unusual. However, for some tumor sites, the advantage of considering noncoplanar beam incidences is well known. This paper presents the benefits of using a derivative-free multistart framework for the optimization of the noncoplanar BAO problem. Multistart methods combine a global strategy for sampling the search space with a local strategy for improving the sampled solutions. The proposed global strategy allows a thorough exploration of the continuous search space of the highly nonconvex BAO problem. To avoid local entrapment, a derivative-free method is used as the local procedure. Additional advantages of the derivative-free method include the reduced number of function evaluations required to converge and the ability to use multithreaded computing. Twenty nasopharyngeal clinical cases were selected to test the proposed multistart framework. The planning target volumes included the primary tumor and the high- and low-risk lymph nodes. Organs at risk included the spinal cord, brainstem, optic nerves, chiasm, parotids, oral cavity, brain, and thyroid, among others. For each case, a setup with seven equispaced beams was chosen and the resulting treatment plan, using a multicriteria optimization framework, was then compared against the coplanar and noncoplanar plans using the optimal beam setups obtained by the derivative-free multistart framework. The optimal noncoplanar beam setup obtained by the derivative-free multistart framework leads to high quality treatment plans with better target coverage and improved organ sparing.
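
    The multistart structure described above pairs global sampling of the angle space with derivative-free local refinement. The sketch below reproduces only that generic structure, using random starting points and scipy's Nelder-Mead as the local method; the two-angle toy objective stands in for the treatment-plan score that an actual BAO run would evaluate.

      import numpy as np
      from scipy.optimize import minimize

      def multistart_minimize(objective, bounds, n_starts=20, rng=np.random.default_rng(1)):
          """Sample starting points over the search space (global strategy) and refine
          each with a derivative-free local method (here Nelder-Mead)."""
          lo, hi = np.array(bounds).T
          best = None
          for _ in range(n_starts):
              x0 = lo + rng.random(len(lo)) * (hi - lo)
              res = minimize(objective, x0, method="Nelder-Mead")
              if best is None or res.fun < best.fun:
                  best = res
          return best

      # Stand-in objective over two "beam angles" in degrees; a real BAO objective would
      # score the treatment plan produced for the candidate angles.
      def toy_plan_quality(angles):
          return np.sin(np.radians(angles[0])) ** 2 + np.cos(np.radians(angles[1] / 3)) ** 2

      result = multistart_minimize(toy_plan_quality, bounds=[(0, 360), (0, 360)])
      print(result.x, result.fun)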

  13. Post-Optimality Analysis In Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Kroo, Ilan M.; Gage, Peter J.

    1993-01-01

    This analysis pertains to the applicability of optimal sensitivity information to aerospace vehicle design. An optimal sensitivity (or post-optimality) analysis refers to computations performed once the initial optimization problem is solved. These computations may be used to characterize the design space about the present solution and infer changes in this solution as a result of constraint or parameter variations, without reoptimizing the entire system. The present analysis demonstrates that post-optimality information generated through first-order computations can be used to accurately predict the effect of constraint and parameter perturbations on the optimal solution. This assessment is based on the solution of an aircraft design problem in which the post-optimality estimates are shown to be within a few percent of the true solution over the practical range of constraint and parameter variations. Through solution of a reusable, single-stage-to-orbit, launch vehicle design problem, this optimal sensitivity information is also shown to improve the efficiency of the design process. For a hierarchically decomposed problem, this computational efficiency is realized by estimating the main-problem objective gradient through optimal sensitivity calculations. By reducing the need for finite differentiation of a re-optimized subproblem, a significant decrease in the number of objective function evaluations required to reach the optimal solution is obtained.
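
    The first-order post-optimality idea is that, at a constrained optimum, the Lagrange multiplier of an active constraint estimates the change of the optimal objective per unit change of the constraint bound, so nearby perturbed optima can be predicted without re-optimizing. The sketch below checks this on a made-up quadratic problem (not an aerospace model), comparing a finite-difference sensitivity against the multiplier obtained by hand from the KKT conditions.

      from scipy.optimize import minimize

      # Toy problem: minimize (x0 - 1)^2 + 2*(x1 - 2)^2 subject to x0 + x1 >= b.
      def solve(b):
          cons = {"type": "ineq", "fun": lambda x: x[0] + x[1] - b}
          return minimize(lambda x: (x[0] - 1) ** 2 + 2 * (x[1] - 2) ** 2,
                          x0=[0.0, 0.0], constraints=[cons], method="SLSQP")

      base, db = solve(5.0), 1e-3
      perturbed = solve(5.0 + db)

      # Post-optimality estimate: df*/db equals the multiplier of the active
      # constraint (lambda = 8/3 from the KKT conditions of this toy problem),
      # so f*(b + db) can be predicted as f*(b) + lambda*db without re-optimizing.
      print("finite-difference sensitivity:", (perturbed.fun - base.fun) / db)
      print("KKT multiplier estimate:      ", 8.0 / 3.0)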

  14. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  15. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  16. Multidisciplinary design optimization: An emerging new engineering discipline

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1993-01-01

    This paper defines the Multidisciplinary Design Optimization (MDO) as a new field of research endeavor and as an aid in the design of engineering systems. It examines the MDO conceptual components in relation to each other and defines their functions.

  17. A design optimization process for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chamberlain, Robert G.; Fox, George; Duquette, William H.

    1990-01-01

    The Space Station Freedom Program is used to develop and implement a process for design optimization. Because the relative worth of arbitrary design concepts cannot be assessed directly, comparisons must be based on designs that provide the same performance from the point of view of station users; such designs can be compared in terms of life cycle cost. Since the technology required to produce a space station is widely dispersed, a decentralized optimization process is essential. A formulation of the optimization process is provided and the mathematical models designed to facilitate its implementation are described.

  18. Comparison of two spatial optimization techniques: a framework to solve multiobjective land use distribution problems.

    PubMed

    Meyer, Burghard Christian; Lescot, Jean-Marie; Laplana, Ramon

    2009-02-01

    Two spatial optimization approaches, developed from the opposing perspectives of ecological economics and landscape planning and aimed at the definition of new distributions of farming systems and of land use elements, are compared and integrated into a general framework. The first approach, applied to a small river catchment in southwestern France, uses SWAT (Soil and Water Assessment Tool) and a weighted goal programming model in combination with a geographical information system (GIS) for the determination of optimal farming system patterns, based on selected objective functions to minimize deviations from the goals of reducing nitrogen and maintaining income. The second approach, demonstrated in a suburban landscape near Leipzig, Germany, defines a GIS-based predictive habitat model for the search of unfragmented regions suitable for hare populations (Lepus europaeus), followed by compromise optimization with the aim of planning a new habitat structure distribution for the hare. The multifunctional problem is solved by the integration of the three landscape functions ("production of cereals," "resistance to soil erosion by water," and "landscape water retention"). Through the comparison, we propose a framework for the definition of optimal land use patterns based on optimization techniques. The framework includes the main aspects to solve land use distribution problems with the aim of finding the optimal or best land use decisions. It integrates indicators, goals of spatial developments and stakeholders, including weighting, and model tools for the prediction of objective functions and risk assessments. Methodological limits of the uncertainty of data and model outcomes are stressed. The framework clarifies the use of optimization techniques in spatial planning.
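
    Weighted goal programming of the kind used in the first approach can be written as a linear program in which deviation variables measure how far each goal (here nitrogen export and farm income) is missed, and the objective minimizes their weighted sum. The scipy sketch below uses invented per-hectare coefficients and goal levels purely to show the mechanism; the actual study derives such coefficients from SWAT and GIS data.

      from scipy.optimize import linprog

      # Decision variables: [x1, x2, dN_plus, dI_minus]
      #   x1, x2     hectares allocated to two hypothetical farming systems
      #   dN_plus    overshoot of the nitrogen-export goal (kg N)
      #   dI_minus   shortfall of the farm-income goal (currency units)
      nitrogen = [40.0, 15.0]        # kg N exported per hectare (made-up coefficients)
      income = [600.0, 300.0]        # income per hectare (made-up coefficients)
      area, n_goal, i_goal = 100.0, 2000.0, 45000.0
      weights = [1.0, 0.1]           # relative importance of the two deviations

      c = [0.0, 0.0, weights[0], weights[1]]                 # minimize weighted deviations
      A_ub = [[nitrogen[0], nitrogen[1], -1.0, 0.0],         # N export - overshoot <= goal
              [-income[0], -income[1], 0.0, -1.0]]           # income + shortfall >= goal
      b_ub = [n_goal, -i_goal]
      A_eq = [[1.0, 1.0, 0.0, 0.0]]                          # all land must be allocated
      b_eq = [area]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
      print("allocation:", res.x[:2], "goal deviations:", res.x[2:])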

  19. Comparison of Two Spatial Optimization Techniques: A Framework to Solve Multiobjective Land Use Distribution Problems

    NASA Astrophysics Data System (ADS)

    Meyer, Burghard Christian; Lescot, Jean-Marie; Laplana, Ramon

    2009-02-01

    Two spatial optimization approaches, developed from the opposing perspectives of ecological economics and landscape planning and aimed at the definition of new distributions of farming systems and of land use elements, are compared and integrated into a general framework. The first approach, applied to a small river catchment in southwestern France, uses SWAT (Soil and Water Assessment Tool) and a weighted goal programming model in combination with a geographical information system (GIS) for the determination of optimal farming system patterns, based on selected objective functions to minimize deviations from the goals of reducing nitrogen and maintaining income. The second approach, demonstrated in a suburban landscape near Leipzig, Germany, defines a GIS-based predictive habitat model for the search of unfragmented regions suitable for hare populations ( Lepus europaeus), followed by compromise optimization with the aim of planning a new habitat structure distribution for the hare. The multifunctional problem is solved by the integration of the three landscape functions (“production of cereals,” “resistance to soil erosion by water,” and “landscape water retention”). Through the comparison, we propose a framework for the definition of optimal land use patterns based on optimization techniques. The framework includes the main aspects to solve land use distribution problems with the aim of finding the optimal or best land use decisions. It integrates indicators, goals of spatial developments and stakeholders, including weighting, and model tools for the prediction of objective functions and risk assessments. Methodological limits of the uncertainty of data and model outcomes are stressed. The framework clarifies the use of optimization techniques in spatial planning.

  20. Integration of Physical Design and Sequential Optimization

    DTIC Science & Technology

    2006-03-06

    synchronous digital circuits. A sufficient condition for the solution to the optimal clock scheduling problem in the face of process variations is given... Optimization for the Lagrangian dual: 1: k ← 0; 2: (x, y) ← argmin_{x,y} L(x, y, k); 3: while KKT conditions are not satisfied do; 4: k ← max(0, k + γ·g(x, y)); 5: (x, y) ← argmin_{x,y} L(x... associated with the clock distribution tree in a digital synchronous circuit. A model for this problem and a sufficient condition for its optimal solution is
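
    The fragment above shows a projected subgradient update on the dual multiplier k, alternating with an inner minimization of the Lagrangian. A compact Python rendering of that generic scheme, applied to a made-up convex problem rather than the clock-scheduling formulation, might look like this:

      from scipy.optimize import minimize

      # Generic dual (projected) subgradient ascent matching the fragment above:
      #   repeat: (x, y) <- argmin L(x, y, k);  k <- max(0, k + gamma * g(x, y))
      # Toy problem: minimize x^2 + y^2 subject to g(x, y) = 1 - x - y <= 0.
      def g(v):
          return 1.0 - v[0] - v[1]

      def lagrangian_argmin(k):
          # Inner minimization of L(x, y, k) = x^2 + y^2 + k * g(x, y).
          return minimize(lambda v: v[0] ** 2 + v[1] ** 2 + k * g(v), x0=[0.0, 0.0]).x

      k, gamma = 0.0, 0.2
      for _ in range(100):                      # stand-in for "while KKT not satisfied"
          v = lagrangian_argmin(k)
          k = max(0.0, k + gamma * g(v))        # project the multiplier back to k >= 0
      print("multiplier:", k, "primal point:", v)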

  1. New Approaches to HSCT Multidisciplinary Design and Optimization

    NASA Technical Reports Server (NTRS)

    Schrage, Daniel P.; Craig, James I.; Fulton, Robert E.; Mistree, Farrokh

    1999-01-01

    New approaches to MDO have been developed and demonstrated during this project on a particularly challenging aeronautics problem: HSCT aeroelastic wing design. Tackling this problem required the integration of resources and collaboration from three Georgia Tech laboratories: ASDL, SDL, and PPRL, along with close coordination and participation from industry. Its success can also be attributed to the close interaction and involvement of fellows from the NASA Multidisciplinary Analysis and Optimization (MAO) program, which was going on in parallel and provided additional resources to work the very complex, multidisciplinary problem, along with the methods being developed. The development of the Integrated Design Engineering Simulator (IDES) and its initial demonstration is a necessary first step in transitioning the methods and tools developed to larger, industrial-sized problems of interest. It also provides a framework for the implementation and demonstration of the methodology. Attachment: Appendix A - List of publications. Appendix B - Year 1 report. Appendix C - Year 2 report. Appendix D - Year 3 report. Appendix E - accompanying CDROM.

  3. A Matrix-Free Algorithm for Multidisciplinary Design Optimization

    NASA Astrophysics Data System (ADS)

    Lambe, Andrew Borean

    Multidisciplinary design optimization (MDO) is an approach to engineering design that exploits the coupling between components or knowledge disciplines in a complex system to improve the final product. In aircraft design, MDO methods can be used to simultaneously design the outer shape of the aircraft and the internal structure, taking into account the complex interaction between the aerodynamic forces and the structural flexibility. Efficient strategies are needed to solve such design optimization problems and guarantee convergence to an optimal design. This work begins with a comprehensive review of MDO problem formulations and solution algorithms. First, a fundamental MDO problem formulation is defined from which other formulations may be obtained through simple transformations. Using these fundamental problem formulations, decomposition methods from the literature are reviewed and classified. All MDO methods are presented in a unified mathematical notation to facilitate greater understanding. In addition, a novel set of diagrams, called extended design structure matrices, is used to simultaneously visualize both data communication and process flow between the many software components of each method. For aerostructural design optimization, modern decomposition-based MDO methods cannot efficiently handle the tight coupling between the aerodynamic and structural states. This fact motivates the exploration of methods that can reduce the computational cost. A particular structure in the direct and adjoint methods for gradient computation motivates the idea of a matrix-free optimization method. A simple matrix-free optimizer is developed based on the augmented Lagrangian algorithm. This new matrix-free optimizer is tested on two structural optimization problems and one aerostructural optimization problem. The results indicate that the matrix-free optimizer is able to efficiently solve structural and multidisciplinary design problems with thousands of variables and
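
    As a rough illustration of the augmented-Lagrangian idea behind such a matrix-free optimizer, the sketch below solves a toy equality-constrained quadratic with a gradient-only inner solver. The multiplier update and penalty term follow the textbook method; the objective, constraint, and dimensions are invented and bear no relation to the thesis's aerostructural test cases.

```python
# Augmented-Lagrangian outer loop with a matrix-free (gradient-only) inner solver.
# Toy problem: minimize ||x||^2 subject to c(x) = x0 + x1 - 1 = 0.
import numpy as np
from scipy.optimize import minimize

def f(x):                      # objective
    return float(x @ x)

def gradf(x):
    return 2.0 * x

def c(x):                      # single equality constraint, c(x) = 0
    return np.array([x[0] + x[1] - 1.0])

def jac_c_T_times(x, v):       # matrix-free product J_c(x)^T v (J_c = [1, 1, 0, 0])
    return np.array([v[0], v[0]] + [0.0] * (x.size - 2))

lam = np.zeros(1)              # Lagrange multiplier estimate
mu = 10.0                      # penalty parameter
x = np.zeros(4)

for outer in range(10):
    def LA(x):                 # L_A(x) = f - lam^T c + (mu/2) ||c||^2
        cv = c(x)
        return f(x) - lam @ cv + 0.5 * mu * (cv @ cv)
    def gradLA(x):
        cv = c(x)
        return gradf(x) - jac_c_T_times(x, lam) + mu * jac_c_T_times(x, cv)
    x = minimize(LA, x, jac=gradLA, method="CG").x   # gradient-only inner solve
    lam = lam - mu * c(x)                            # first-order multiplier update
    if np.linalg.norm(c(x)) < 1e-8:
        break

print("x* =", np.round(x, 4), " lambda* =", np.round(lam, 4))
# expect x* ~ [0.5, 0.5, 0, 0] and lambda* ~ [1.0]
```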

  5. Revenue cycle optimization in health care institutions. A conceptual framework for change management.

    PubMed

    Mugdh, Mrinal; Pilla, Satya

    2012-01-01

    Health care providers in the United States are constantly faced with the enormous challenge of optimizing their revenue cycle to improve their overall financial performance. These challenges keep evolving in both scope and complexity owing to a host of internal and external factors. Furthermore, given the lack of control that health care providers have over external factors, any attempt to successfully optimize the revenue cycle hinges on several critical improvements aimed at realigning the internal factors. This study provides an integrated change management model that aims to reengineer and realign the people-process-technology framework by using the principles of lean and Six Sigma for revenue cycle optimization.

  6. Optimizing the Design of Computer Classrooms: The Physical Environment.

    ERIC Educational Resources Information Center

    Huffman, Heather B.; Jernstedt, G. Christian; Reed, Virginia A.; Reber, Emily S.; Burns, Mathew B.; Oostenink, Richard J.; Williams, Margot T.

    2003-01-01

    Suggests two guiding principles as a framework to interpret the research findings of environmental psychology that focus on effective classroom design: effective design promotes attention in the classroom and allows for periodic shifts of learner activities. Examines these principles as they apply to the design of a computer classroom, reviewing…

  7. Progress in multidisciplinary design optimization at NASA Langley

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.

    1993-01-01

    Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.

  8. A bottom-up robust optimization framework for identifying river basin development pathways under deep climate uncertainty

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Ray, P.; Brown, C.

    2016-12-01

    Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, the development of well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each obtained under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and most likely given the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
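
    The selection phase described above can be mimicked with a few lines of array arithmetic: score each candidate pathway under every climate scenario, sweep several plausible scenario-probability vectors, and prefer pathways whose expected performance is both high and insensitive to those probabilities. All numbers below are invented for illustration; the actual framework couples this step to a water-systems simulation of the basin.

```python
# Post-optimization screening of candidate pathways (illustrative numbers only).
import numpy as np

# reliability-style performance of each pathway under each of four scenarios
perf = np.array([
    [0.92, 0.85, 0.70, 0.55],   # pathway A: large early investment
    [0.88, 0.86, 0.78, 0.66],   # pathway B: staged expansion
    [0.80, 0.79, 0.76, 0.72],   # pathway C: mostly demand management
])

# several plausible probability assignments over the four scenarios
prob_sets = np.array([
    [0.40, 0.30, 0.20, 0.10],   # optimistic: mild futures more likely
    [0.25, 0.25, 0.25, 0.25],   # uninformative
    [0.10, 0.20, 0.30, 0.40],   # pessimistic: drier futures more likely
])

expected = perf @ prob_sets.T                          # pathways x probability sets
worst_case = expected.min(axis=1)                      # robust (maximin) score
spread = expected.max(axis=1) - expected.min(axis=1)   # sensitivity to probabilities

for name, e, w, s in zip("ABC", expected, worst_case, spread):
    print(f"pathway {name}: expected={np.round(e, 3)}, worst={w:.3f}, spread={s:.3f}")
print("robust choice:", "ABC"[int(np.argmax(worst_case))])
```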

  9. Optimal flexible sample size design with robust power.

    PubMed

    Zhang, Lanju; Cui, Lu; Yang, Bo

    2016-08-30

    It is well recognized that sample size determination is challenging because of the uncertainty on the treatment effect size. Several remedies are available in the literature. Group sequential designs start with a sample size based on a conservative (smaller) effect size and allow early stop at interim looks. Sample size re-estimation designs start with a sample size based on an optimistic (larger) effect size and allow sample size increase if the observed effect size is smaller than planned. Different opinions favoring one type over the other exist. We propose an optimal approach using an appropriate optimality criterion to select the best design among all the candidate designs. Our results show that (1) for the same type of designs, for example, group sequential designs, there is room for significant improvement through our optimization approach; (2) optimal promising zone designs appear to have no advantages over optimal group sequential designs; and (3) optimal designs with sample size re-estimation deliver the best adaptive performance. We conclude that to deal with the challenge of sample size determination due to effect size uncertainty, an optimal approach can help to select the best design that provides most robust power across the effect size range of interest. Copyright © 2016 John Wiley & Sons, Ltd.
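
    The underlying evaluation step, computing power for candidate designs across a range of plausible effect sizes, can be sketched for simple fixed designs as follows (two-sample z-test approximation). The sample sizes, effect-size range, and 80% criterion below are assumptions; the paper's optimality criterion and the adaptive designs it compares are considerably richer than this.

```python
# Robust-power screening of candidate fixed per-arm sample sizes (assumed values).
import numpy as np
from scipy.stats import norm

alpha = 0.025                              # one-sided significance level
sigma = 1.0
deltas = np.linspace(0.25, 0.45, 9)        # effect-size range of interest (assumed)
z_a = norm.ppf(1 - alpha)

def power(n_per_arm, delta):
    # normal approximation for a two-sample comparison with equal arms
    return norm.cdf(delta / (sigma * np.sqrt(2.0 / n_per_arm)) - z_a)

for n in (100, 150, 200, 250):
    p = power(n, deltas)
    print(f"n/arm={n}: min power={p.min():.2f}, mean power={p.mean():.2f}")
# a simple optimality criterion: smallest n whose minimum power over the range >= 0.80
```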

  10. Multi-objective optimization framework to obtain model-based guidelines for tuning biological synthetic devices: an adaptive network case.

    PubMed

    Boada, Yadira; Reynoso-Meza, Gilberto; Picó, Jesús; Vignoni, Alejandro

    2016-03-11

    Model-based design plays a fundamental role in synthetic biology. Exploiting modularity, i.e., using biological parts and interconnecting them to build new and more complex biological circuits, is one of the key issues. In this context, mathematical models have been used to generate predictions of the behavior of the designed device. Designers not only want the ability to predict the circuit behavior once all its components have been determined, but also help in the design and selection of its biological parts, i.e., guidelines for the experimental implementation. This is tantamount to obtaining proper values of the model parameters, for the circuit behavior results from the interplay between model structure and parameter tuning. However, determining crisp values for the parameters of the involved parts is not a realistic approach. Uncertainty is ubiquitous in biology, and the characterization of biological parts is not exempt from it. Moreover, the desired dynamical behavior for the designed circuit usually results from a trade-off among several goals to be optimized. We propose the use of a multi-objective optimization tuning framework to get a model-based set of guidelines for the selection of the kinetic parameters required to build a biological device with desired behavior. The design criteria are encoded in the formulation of the objectives and the optimization problem itself. As a result, on the one hand the designer obtains qualitative regions/intervals of values of the circuit parameters giving rise to the predefined circuit behavior; on the other hand, useful information to guide the implementation process. These parameters are chosen so that they can effectively be tuned at the wet-lab, i.e., they are effective biological tuning knobs. To show the proposed approach, the methodology is applied to the design of a well-known biological circuit: a genetic incoherent feed-forward circuit showing adaptive behavior. The proposed multi

  11. Genetic algorithm approaches for conceptual design of spacecraft systems including multi-objective optimization and design under uncertainty

    NASA Astrophysics Data System (ADS)

    Hassan, Rania A.

    In the design of complex large-scale spacecraft systems that involve a large number of components and subsystems, many specialized state-of-the-art design tools are employed to optimize the performance of various subsystems. However, there is no structured system-level concept-architecting process. Currently, spacecraft design is heavily based on the heritage of the industry. Old spacecraft designs are modified to adapt to new mission requirements, and feasible solutions, rather than optimal ones, are often all that is achieved. During the conceptual phase of the design, the choices available to designers are predominantly discrete variables describing major subsystems' technology options and redundancy levels. The complexity of spacecraft configurations makes the number of system design variables that need to be traded off in an optimization process prohibitive when manual techniques are used. Such a discrete problem is well suited for solution with a Genetic Algorithm, which is a global search technique that performs optimization-like tasks. This research presents a systems engineering framework that places design requirements at the core of the design activities and transforms the design paradigm for spacecraft systems to a top-down approach rather than the current bottom-up approach. To facilitate decision-making in the early phases of the design process, the population-based search nature of the Genetic Algorithm is exploited to provide computationally inexpensive tools (compared to the state of the practice) for both multi-objective design optimization and design optimization under uncertainty. In terms of computational cost, those tools are nearly on the same order of magnitude as a standard single-objective deterministic Genetic Algorithm. The use of a multi-objective design approach provides system designers with a clear tradeoff optimization surface that allows them to understand the effect of their decisions on all the design objectives
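
    A bare-bones version of the discrete search described above is easy to write down. In the sketch below the chromosome is a vector of technology-option indices, and the fitness trades performance against cost with a soft mass-budget penalty; the option tables and fitness model are entirely invented, and the thesis layers multi-objective and uncertainty handling on top of this kind of loop.

```python
# Minimal genetic algorithm over discrete subsystem options (illustrative data).
import random

random.seed(1)

# each subsystem has discrete technology options: (performance, mass kg, cost M$)
OPTIONS = {
    "power":   [(60, 45, 4.0), (75, 60, 5.5), (90, 85, 8.0)],
    "comms":   [(50, 20, 2.0), (70, 30, 3.5)],
    "adcs":    [(55, 25, 3.0), (70, 35, 4.5), (85, 50, 7.0)],
    "thermal": [(40, 15, 1.0), (60, 25, 2.0)],
}
SUBSYSTEMS = list(OPTIONS)
MASS_BUDGET = 170.0

def fitness(chromo):
    perf = mass = cost = 0.0
    for name, idx in zip(SUBSYSTEMS, chromo):
        p, m, c = OPTIONS[name][idx]
        perf, mass, cost = perf + p, mass + m, cost + c
    penalty = 100.0 * max(0.0, mass - MASS_BUDGET)     # soft mass constraint
    return perf - 2.0 * cost - penalty

def random_chromo():
    return [random.randrange(len(OPTIONS[s])) for s in SUBSYSTEMS]

def mutate(chromo, rate=0.2):
    return [random.randrange(len(OPTIONS[s])) if random.random() < rate else g
            for s, g in zip(SUBSYSTEMS, chromo)]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [random_chromo() for _ in range(30)]
for gen in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                 # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(20)]

best = max(pop, key=fitness)
print("best configuration:", dict(zip(SUBSYSTEMS, best)), "fitness:", round(fitness(best), 1))
```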

  12. Optimal experimental design for diffusion kurtosis imaging.

    PubMed

    Poot, Dirk H J; den Dekker, Arnold J; Achten, Eric; Verhoye, Marleen; Sijbers, Jan

    2010-03-01

    Diffusion kurtosis imaging (DKI) is a new magnetic resonance imaging (MRI) model that describes the non-Gaussian diffusion behavior in tissues. It has recently been shown that DKI parameters, such as the radial or axial kurtosis, are more sensitive to brain physiology changes than the well-known diffusion tensor imaging (DTI) parameters in several white and gray matter structures. In order to estimate either DTI or DKI parameters with maximum precision, the diffusion weighting gradient settings that are applied during the acquisition need to be optimized. Indeed, it has been shown previously that optimizing the set of diffusion weighting gradient settings can have a significant effect on the precision with which DTI parameters can be estimated. In this paper, we focus on the optimization of DKI gradient settings. Commonly, DKI data are acquired using a standard set of diffusion weighting gradients with fixed directions and with regularly spaced gradient strengths. In this paper, we show that such gradient settings are suboptimal with respect to the precision with which DKI parameters can be estimated. Furthermore, the gradient directions and the strengths of the diffusion-weighted MR images are optimized by minimizing the Cramér-Rao lower bound of the DKI parameters. The impact of the optimized gradient settings is evaluated on both simulated and experimentally recorded datasets. It is shown that the precision with which the kurtosis parameters can be estimated increases substantially when the gradient settings are optimized.
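
    The optimization criterion used in the paper, the Cramér-Rao lower bound, is straightforward to evaluate for a simplified single-direction kurtosis model. The sketch below compares the parameter-precision bounds of two candidate b-value sets; the tissue values, noise level, and candidate designs are assumptions, and the real problem optimizes full gradient directions and strengths rather than b-values alone.

```python
# CRLB comparison for the single-direction signal model S(b) = S0*exp(-b*D + b^2*D^2*K/6).
import numpy as np

S0, D, K, sigma = 1.0, 1.0, 1.0, 0.02   # D in um^2/ms, b in ms/um^2 (assumed values)

def crlb(bvals):
    b = np.asarray(bvals, dtype=float)
    S = S0 * np.exp(-b * D + b**2 * D**2 * K / 6.0)
    dS_dD = S * (-b + b**2 * D * K / 3.0)
    dS_dK = S * (b**2 * D**2 / 6.0)
    J = np.column_stack([dS_dD, dS_dK])          # sensitivity of the signal to (D, K)
    F = J.T @ J / sigma**2                       # Fisher information (Gaussian noise)
    return np.sqrt(np.diag(np.linalg.inv(F)))    # lower bounds on std(D) and std(K)

regular   = np.linspace(0.0, 2.5, 6)[1:]         # evenly spaced b-values
clustered = [0.5, 0.5, 1.0, 2.5, 2.5]            # a hand-picked alternative design
for name, design in [("regular", regular), ("clustered", clustered)]:
    sd_D, sd_K = crlb(design)
    print(f"{name:>9}: std(D) >= {sd_D:.4f}, std(K) >= {sd_K:.4f}")
```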

  13. A study of commuter airplane design optimization

    NASA Technical Reports Server (NTRS)

    Roskam, J.; Wyatt, R. D.; Griswold, D. A.; Hammer, J. L.

    1977-01-01

    Problems of commuter airplane configuration design were studied with the aim of minimizing direct operating costs. Factors considered were the minimization of fuselage drag, methods of wing design, and the estimated drag of an airplane submerged in a propeller slipstream; all design criteria were studied under a set of fixed performance, mission, and stability constraints. Configuration design data were assembled for application by a computerized design methodology program similar to the NASA-Ames General Aviation Synthesis Program.

  14. Optimal fractional order PID design via Tabu Search based algorithm.

    PubMed

    Ateş, Abdullah; Yeroglu, Celaleddin

    2016-01-01

    This paper presents an optimization method based on the Tabu Search Algorithm (TSA) to design a Fractional-Order Proportional-Integral-Derivative (FOPID) controller. All FOPID parameters are computed from random initial conditions using the proposed optimization method. Illustrative examples demonstrate the performance of the proposed FOPID controller design method.
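
    A compact version of the Tabu Search loop is sketched below. To stay self-contained, the plant is a first-order lag and the controller an ordinary (integer-order) PID simulated by Euler integration; the plant, gain grid, and tabu tenure are assumptions, and tuning a genuine fractional-order PID would additionally require a fractional-calculus simulation, which is omitted here.

```python
# Tabu Search over a discrete grid of PID gains, minimising the integral of squared error.
import random

random.seed(0)

def ise(gains, T=5.0, dt=0.01):
    """Integral of squared error for a unit-step reference with the given PID gains."""
    kp, ki, kd = gains
    x = integ = prev_e = cost = 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - x                       # error against reference r = 1
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv
        x += dt * (-x + u) / 0.5          # plant: 0.5 * dx/dt = -x + u
        if abs(x) > 1e6:                  # unstable gains: return a large penalty
            return 1e9
        prev_e = e
        cost += e * e * dt
    return cost

grid = [round(0.2 * k, 1) for k in range(1, 26)]   # candidate values for each gain
current = best = (1.0, 1.0, 0.2)
best_cost = ise(best)
tabu = []

for _ in range(60):
    neighbours = []
    for i in range(3):                    # perturb one gain by one grid step
        idx = grid.index(current[i])
        for j in (idx - 1, idx + 1):
            if 0 <= j < len(grid):
                cand = list(current)
                cand[i] = grid[j]
                neighbours.append(tuple(cand))
    allowed = [n for n in neighbours if n not in tabu] or neighbours
    current = min(allowed, key=ise)       # best admissible move, even if worse
    tabu = (tabu + [current])[-10:]       # short-term memory (tabu tenure of 10)
    if ise(current) < best_cost:
        best, best_cost = current, ise(current)

print("best (Kp, Ki, Kd):", best, "ISE:", round(best_cost, 4))
```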

  15. Topology and boundary shape optimization as an integrated design tool

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two-dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via a common FEM mesh generator and CAD-type input-output facilities.

  16. INSTITUTIONALIZING SAFEGUARDS-BY-DESIGN: HIGH-LEVEL FRAMEWORK

    SciTech Connect

    Trond Bjornard PhD; Joseph Alexander; Robert Bean; Brian Castle; Scott DeMuth, Ph.D.; Phillip Durst; Michael Ehinger; Prof. Michael Golay, Ph.D.; Kevin Hase, Ph.D.; David J. Hebditch, DPhil; John Hockert, Ph.D.; Bruce Meppen; James Morgan; Jerry Phillips, Ph.D., PE

    2009-02-01

    participation in facility design options analysis in the conceptual design phase to enhance intrinsic features, among others. The SBD process is unlikely to be broadly applied in the absence of formal requirements to do so, or compelling evidence of its value. Neither exists today. A formal instrument to require the application of SBD is needed and would vary according to both the national and regulatory environment. Several possible approaches to implementation of the requirements within the DOE framework are explored in this report. Finally, there are numerous barriers to the implementation of SBD, including the lack of a strong safeguards culture, intellectual property concerns, the sensitive nature of safeguards information, and the potentially divergent or conflicting interests of participants in the process. In terms of SBD implementation in the United States, there are no commercial nuclear facilities that are under IAEA safeguards. Efforts to institutionalize SBD must address these issues. Specific work in FY09 could focus on the following: finalizing the proposed SBD process for use by DOE and performing a pilot application on a DOE project in the planning phase; developing regulatory options for mandating SBD; further development of safeguards-related design guidance, principles and requirements; development of a specific SBD process tailored to the NRC environment; and development of an engagement strategy for the IAEA and other international partners.

  17. Rosetta:MSF: a modular framework for multi-state computational protein design

    PubMed Central

    Hupfeld, Enrico; Sterner, Reinhard

    2017-01-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta’s protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta’s single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design. PMID:28604768

  18. Ligand design for functional metal-organic frameworks.

    PubMed

    Paz, Filipe A Almeida; Klinowski, Jacek; Vilela, Sérgio M F; Tomé, João P C; Cavaleiro, José A S; Rocha, João

    2012-02-07

    Metal-organic frameworks (MOFs), also known as coordination polymers, are formed by the self-assembly of metallic centres and bridging organic linkers. In this critical review, we review the key advances in the field and discuss the relationship between the nature and structure of specifically designed organic linkers and the properties of the products. Practical examples demonstrate that the physical and chemical properties of the linkers play a decisive role in the properties of novel functional MOFs. We focus on target materials suitable for the storage of hydrogen and methane, sequestration of carbon dioxide, gas separation, heterogeneous catalysis and as magnetic and photoluminescent materials capable of both metal- and ligand-centred emission, ion exchangers and molecular sieves. The advantages of highly active discrete complexes as metal-bearing ligands in the construction of MOFs are also briefly reviewed (128 references). This journal is © The Royal Society of Chemistry 2012

  19. Architectural Design and the Learning Environment: A Framework for School Design Research

    ERIC Educational Resources Information Center

    Gislason, Neil

    2010-01-01

    This article develops a theoretical framework for studying how instructional space, teaching and learning are related in practice. It is argued that a school's physical design can contribute to the quality of the learning environment, but several non-architectural factors also determine how well a given facility serves as a setting for teaching…

  20. Framework for Implementing Engineering Senior Design Capstone Courses and Design Clinics

    ERIC Educational Resources Information Center

    Franchetti, Matthew; Hefzy, Mohamed Samir; Pourazady, Mehdi; Smallman, Christine

    2012-01-01

    Senior design capstone projects for engineering students are essential components of an undergraduate program that enhances communication, teamwork, and problem solving skills. Capstone projects with industry are well established in management, but not as heavily utilized in engineering. This paper outlines a general framework that can be used by…

  3. Development and Application of the Collaborative Optimization Architecture in a Multidisciplinary Design Environment

    NASA Technical Reports Server (NTRS)

    Braun, R. D.; Kroo, I. M.

    1995-01-01

    Collaborative optimization is a design architecture applicable in any multidisciplinary analysis environment but specifically intended for large-scale distributed analysis applications. In this approach, a complex problem is hierarchically decomposed along disciplinary boundaries into a number of subproblems which are brought into multidisciplinary agreement by a system-level coordination process. When applied to problems in a multidisciplinary design environment, this scheme has several advantages over traditional solution strategies. These advantages include a reduction in the amount of information transferred between disciplines, the removal of large iteration loops, the ability to use different subspace optimizers among the various analysis groups, an analysis framework that is easily parallelized and can operate on heterogeneous equipment, and a structural framework that is well suited to conventional disciplinary organizations. In this article, the collaborative architecture is developed and its mathematical foundation is presented. An example application is also presented which highlights the potential of this method for use in large-scale design applications.

  4. Design and Optimization of Composite Gyroscope Momentum Wheel Rings

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    Stress analysis and preliminary design/optimization procedures are presented for gyroscope momentum wheel rings composed of metallic, metal matrix composite, and polymer matrix composite materials. The design of these components involves simultaneously minimizing both true part volume and mass, while maximizing angular momentum. The stress analysis results are combined with an anisotropic failure criterion to formulate a new sizing procedure that provides considerable insight into the design of gyroscope momentum wheel ring components. Results compare the performance of two optimized metallic designs, an optimized SiC/Ti composite design, and an optimized graphite/epoxy composite design. The graphite/epoxy design appears to be far superior to the competitors considered unless a much greater premium is placed on volume efficiency compared to mass efficiency.
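
    The trade the abstract describes, angular momentum per unit mass limited by the allowable hoop stress of the rim material, can be seen directly from the thin-ring relations H = m R² ω and σ = ρ R² ω². The material properties below are rough, assumed figures (not the paper's data), and the real design also applies an anisotropic failure criterion and trades volume as well as mass.

```python
# Back-of-the-envelope rim-material comparison using thin-ring relations.
import math

R = 0.15            # rim radius, m (assumed)
safety = 1.5        # safety factor on the allowable hoop stress
materials = {       # (density kg/m^3, allowable hoop stress Pa) -- rough, assumed values
    "steel":          (7800.0, 500e6),
    "Ti-6Al-4V":      (4430.0, 880e6),
    "SiC/Ti":         (3900.0, 1100e6),
    "graphite/epoxy": (1600.0, 1500e6),
}

for name, (rho, sigma_allow) in materials.items():
    sigma = sigma_allow / safety
    w_max = math.sqrt(sigma / rho) / R          # max spin rate from hoop stress sigma = rho*R^2*w^2
    h_per_kg = R**2 * w_max                     # specific angular momentum H/m = R^2 * w
    print(f"{name:>15}: max speed {w_max * 60 / (2 * math.pi):8.0f} rpm, "
          f"H/m = {h_per_kg:6.1f} N*m*s per kg")
```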

  5. Design optimization of an opposed piston brake caliper

    NASA Astrophysics Data System (ADS)

    Sergent, Nicolas; Tirovic, Marko; Voveris, Jeronimas

    2014-11-01

    Successful brake caliper designs must be light and stiff, preventing excessive deformation and extended brake pedal travel. These conflicting requirements are difficult to optimize owing to complex caliper geometry, loading and interaction of individual brake components (pads, disc and caliper). The article studies a fixed, four-pot (piston) caliper, and describes in detail the computer-based topology optimization methodology applied to obtain two optimized designs. At first sight, relatively different designs (named 'Z' and 'W') were obtained by minor changes to the designable volume and boundary conditions. However, on closer inspection, the same main bridge design features could be recognized. Both designs offered considerable reduction of caliper mass, by 19% and 28%, respectively. Further finite element analyses conducted on one of the optimized designs (Z caliper) showed which individual bridge features and their combinations are the most important in maintaining caliper stiffness.

  6. Efficient optimization of integrated aerodynamic-structural design

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Grossman, B.; Eppard, W. M.; Kao, P. J.

    1987-01-01

    The introduction of composite materials is having a profound effect on the design process. Because these materials permit the designer to tailor material properties to improve structural, aerodynamic and acoustic performance, they require a more integrated multidisciplinary design process. Because of the complexity of the design process, numerical optimization methods are required. The present paper is focused on a major difficulty associated with the multidisciplinary design optimization process - its enormous computational cost. We consider two approaches for reducing this computational burden: (1) development of efficient methods for cross-sensitivity calculation using perturbation methods; and (2) the use of approximate numerical optimization procedures. Our efforts are concentrated upon combined aerodynamic-structural optimization. Results are presented for the integrated design of a sailplane wing. The impact of our computational procedures on the computational costs of integrated designs is discussed.

  7. An uncertain multidisciplinary design optimization method using interval convex models

    NASA Astrophysics Data System (ADS)

    Li, Fangyi; Luo, Zhen; Sun, Guangyong; Zhang, Nong

    2013-06-01

    This article proposes an uncertain multi-objective multidisciplinary design optimization methodology, which employs the interval model to represent the uncertainties of uncertain-but-bounded parameters. The interval number programming method is applied to transform each uncertain objective function into two deterministic objective functions, and a satisfaction degree of intervals is used to convert both the uncertain inequality and equality constraints to deterministic inequality constraints. In doing so, an unconstrained deterministic optimization problem will be constructed in association with the penalty function method. The design will be finally formulated as a nested three-loop optimization, a class of highly challenging problems in the area of engineering design optimization. An advanced hierarchical optimization scheme is developed to solve the proposed optimization problem based on the multidisciplinary feasible strategy, which is a well-studied method able to reduce the dimensions of multidisciplinary design optimization problems by using the design variables as independent optimization variables. In the hierarchical optimization system, the non-dominated sorting genetic algorithm II, sequential quadratic programming method and Gauss-Seidel iterative approach are applied to the outer, middle and inner loops of the optimization problem, respectively. Typical numerical examples are used to demonstrate the effectiveness of the proposed methodology.

  8. An Integrated Framework for Parameter-based Optimization of Scientific Workflows

    PubMed Central

    Kumar, Vijay S.; Sadayappan, P.; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel

    2011-01-01

    Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework. PMID:22068617
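
    The parameter-space view described above can be illustrated with a toy search: one knob (task grouping) affects only runtime, another (a sampling/quality factor) trades output accuracy for speed, and the optimizer looks for the fastest configuration that still meets an accuracy budget. The cost and error models below are invented for illustration.

```python
# Toy two-dimensional parameter search: grouping (performance-only) vs. quality (accuracy trade-off).
import itertools

def runtime(chunk, quality, n_tasks=400, per_task=0.05, overhead=0.8):
    batches = -(-n_tasks // chunk)                  # ceil division: number of scheduled batches
    return batches * overhead + n_tasks * per_task * quality

def output_error(quality):
    return 0.10 / quality                           # finer sampling -> lower error (assumed model)

chunks = (5, 10, 20, 50, 100)
qualities = (0.25, 0.5, 1.0, 2.0)
error_budget = 0.12

feasible = [(runtime(c, q), c, q) for c, q in itertools.product(chunks, qualities)
            if output_error(q) <= error_budget]
t, c, q = min(feasible)
print(f"fastest configuration within error budget: chunk={c}, quality={q}, "
      f"runtime={t:.1f}s, error={output_error(q):.3f}")
```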

  9. The Optimal Decay Estimates on the Framework of Besov Spaces for Generally Dissipative Systems

    NASA Astrophysics Data System (ADS)

    Xu, Jiang; Kawashima, Shuichi

    2015-10-01

    We give a new decay framework for the general dissipative hyperbolic system and the hyperbolic-parabolic composite system, which allows us to pay less attention to the traditional spectral analysis in comparison with previous efforts. New ingredients lie in the high-frequency and low-frequency decomposition of a pseudo-differential operator and an interpolation inequality related to homogeneous Besov spaces of negative order. Furthermore, we develop Littlewood-Paley pointwise energy estimates and new time-weighted energy functionals to establish optimal decay estimates on the framework of spatially critical Besov spaces for the degenerately dissipative hyperbolic system of balance laws. Based on the embedding and the improved Gagliardo-Nirenberg inequality, the optimal decay rates are further shown. Finally, as a direct application, the optimal decay rates for three-dimensional damped compressible Euler equations are also obtained.

  10. Establishing optimal project-level strategies for pavement maintenance and rehabilitation - A framework and case study

    NASA Astrophysics Data System (ADS)

    Irfan, Muhammad; Bilal Khurshid, Muhammad; Bai, Qiang; Labi, Samuel; Morin, Thomas L.

    2012-05-01

    This article presents a framework and an illustrative example for identifying the optimal pavement maintenance and rehabilitation (M&R) strategy using a mixed-integer nonlinear programming model. The objective function is to maximize the cost-effectiveness expressed as the ratio of the effectiveness to the cost. The constraints for the optimization problem are related to performance, budget, and choice. Two different formulations of effectiveness are derived using treatment-specific performance models for each constituent treatment of the strategy; and cost is expressed in terms of the agency and user costs over the life cycle. The proposed methodology is demonstrated using a case study. Probability distributions are established for the optimization input variables and Monte Carlo simulations are carried out to yield optimal solutions. Using the results of these simulations, M&R strategy contours are developed as a novel tool that can help pavement managers quickly identify the optimal M&R strategy for a given pavement section.
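
    The Monte Carlo step of the case study, sampling uncertain inputs and comparing the cost-effectiveness ratio of candidate strategies, can be sketched as follows. The strategies, distributions, and cost figures below are invented placeholders, not values from the article.

```python
# Monte Carlo comparison of cost-effectiveness ratios for candidate M&R strategies.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

strategies = {
    # effectiveness ~ Normal(mean, sd) in condition-years gained (assumed),
    # life-cycle cost ~ Lognormal around a median $/lane-mile, agency + user (assumed)
    "thin overlay every 6 yr":  dict(eff=(14.0, 2.0), cost_median=180_000, cost_sd=0.20),
    "microsurfacing + overlay": dict(eff=(12.0, 1.5), cost_median=140_000, cost_sd=0.15),
    "do minimum":               dict(eff=( 6.0, 1.0), cost_median= 70_000, cost_sd=0.10),
}

for name, s in strategies.items():
    eff = rng.normal(*s["eff"], N)
    cost = rng.lognormal(np.log(s["cost_median"]), s["cost_sd"], N)
    ce = eff / (cost / 100_000)            # effectiveness per $100k of life-cycle cost
    print(f"{name:>24}: mean CE = {ce.mean():.2f}, 5th pct = {np.percentile(ce, 5):.2f}")
```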

  11. Optimal shielding design for minimum materials cost or mass

    DOE PAGES

    Woolley, Robert D.

    2015-12-02

    The mathematical underpinnings of cost optimal radiation shielding designs based on an extension of optimal control theory are presented, a heuristic algorithm to iteratively solve the resulting optimal design equations is suggested, and computational results for a simple test case are discussed. A typical radiation shielding design problem can have infinitely many solutions, all satisfying the problem's specified set of radiation attenuation requirements. Each such design has its own total materials cost. For a design to be optimal, no admissible change in its deployment of shielding materials can result in a lower cost. This applies in particular to very small changes, which can be restated using the calculus of variations as the Euler-Lagrange equations. Furthermore, the associated Hamiltonian function and application of Pontryagin's theorem lead to conditions for a shield to be optimal.
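
    For reference, the first-order conditions invoked above take the following generic form for a one-dimensional shield model with material-allocation control u(x), state y(x) obeying an attenuation law y' = f(y, u, x), and cost J = ∫ c(u, x) dx. This is the standard optimal-control statement, not the paper's specific shielding equations.

```latex
\begin{aligned}
  H(y, u, \lambda, x) &= c(u, x) + \lambda\, f(y, u, x)
      && \text{(Hamiltonian)} \\
  \lambda'(x) &= -\,\partial H / \partial y
      && \text{(costate equation)} \\
  u^{*}(x) &= \arg\min_{u \in U} H(y, u, \lambda, x)
      && \text{(Pontryagin minimum condition)}
\end{aligned}
```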

  12. Gearbox design for uncertain load requirements using active robust optimization

    NASA Astrophysics Data System (ADS)

    Salomon, Shaul; Avigad, Gideon; Purshouse, Robin C.; Fleming, Peter J.

    2016-04-01

    Design and optimization of gear transmissions have been intensively studied, but surprisingly the robustness of the resulting optimal design to uncertain loads has never been considered. Active Robust (AR) optimization is a methodology to design products that attain robustness to uncertain or changing environmental conditions through adaptation. In this study the AR methodology is utilized to optimize the number of transmissions, as well as their gearing ratios, for an uncertain load demand. The problem is formulated as a bi-objective optimization problem where the objectives are to satisfy the load demand in the most energy efficient manner and to minimize production cost. The results show that this approach can find a set of robust designs, revealing a trade-off between energy efficiency and production cost. This can serve as a useful decision-making tool for the gearbox design process, as well as for other applications.

  13. Design optimization of a magnetorheological brake in powered knee orthosis

    NASA Astrophysics Data System (ADS)

    Ma, Hao; Liao, Wei-Hsin

    2015-04-01

    Magneto-rheological (MR) fluids have been utilized in devices like orthoses and prostheses to generate controllable braking torque. In this paper, a flat shape rotary MR brake is designed for powered knee orthosis to provide adjustable resistance. Multiple disk structure with interior inner coil is adopted in the MR brake configuration. In order to increase the maximal magnetic flux, a novel internal structure design with smooth transition surface is proposed. Based on this design, a parameterized model of the MR brake is built for geometrical optimization. Multiple factors are considered in the optimization objective: braking torque, weight, and, particularly, average power consumption. The optimization is then performed with Finite Element Analysis (FEA), and the optimal design is obtained among the Pareto-optimal set considering the trade-offs in design objectives.

  14. Optimal input design for aircraft instrumentation systematic error estimation

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1991-01-01

    A new technique for designing optimal flight test inputs for accurate estimation of instrumentation systematic errors was developed and demonstrated. A simulation model of the F-18 High Angle of Attack Research Vehicle (HARV) aircraft was used to evaluate the effectiveness of the optimal input compared to input recorded during flight test. Instrumentation systematic error parameter estimates and their standard errors were compared. It was found that the optimal input design improved error parameter estimates and their accuracies for a fixed time input design. Pilot acceptability of the optimal input design was demonstrated using a six degree-of-freedom fixed base piloted simulation of the F-18 HARV. The technique described in this work provides a practical, optimal procedure for designing inputs for data compatibility experiments.

  16. Optimal shielding design for minimum materials cost or mass

    SciTech Connect

    Woolley, Robert D.

    2015-12-02

    The mathematical underpinnings of cost optimal radiation shielding designs based on an extension of optimal control theory are presented, a heuristic algorithm to iteratively solve the resulting optimal design equations is suggested, and computational results for a simple test case are discussed. A typical radiation shielding design problem can have infinitely many solutions, all satisfying the problem's specified set of radiation attenuation requirements. Each such design has its own total materials cost. For a design to be optimal, no admissible change in its deployment of shielding materials can result in a lower cost. This applies in particular to very small changes, which can be restated using the calculus of variations as the Euler-Lagrange equations. Furthermore, the associated Hamiltonian function and application of Pontryagin's theorem lead to conditions for a shield to be optimal.

  17. Optimal Design of Calibration Signals in Space-Borne Gravitational Wave Detectors

    NASA Technical Reports Server (NTRS)

    Nofrarias, Miquel; Karnesis, Nikolaos; Gibert, Ferran; Armano, Michele; Audley, Heather; Danzmann, Karsten; Diepholz, Ingo; Dolesi, Rita; Ferraioli, Luigi; Ferroni, Valerio

    2016-01-01

    Future space borne gravitational wave detectors will require a precise definition of calibration signals to ensure the achievement of their design sensitivity. The careful design of the test signals plays a key role in the correct understanding and characterisation of these instruments. In that sense, methods achieving optimal experiment designs must be considered as complementary to the parameter estimation methods being used to determine the parameters describing the system. The relevance of experiment design is particularly significant for the LISA Pathfinder mission, which will spend most of its operation time performing experiments to characterize key technologies for future space borne gravitational wave observatories. Here we propose a framework to derive the optimal signals in terms of minimum parameter uncertainty to be injected to these instruments during its calibration phase. We compare our results with an alternative numerical algorithm which achieves an optimal input signal by iteratively improving an initial guess. We show agreement of both approaches when applied to the LISA Pathfinder case.

  18. Optimal Design of Calibration Signals in Space Borne Gravitational Wave Detectors

    NASA Technical Reports Server (NTRS)

    Nofrarias, Miquel; Karnesis, Nikolaos; Gibert, Ferran; Armano, Michele; Audley, Heather; Danzmann, Karsten; Diepholz, Ingo; Dolesi, Rita; Ferraioli, Luigi; Thorpe, James I.

    2014-01-01

    Future space borne gravitational wave detectors will require a precise definition of calibration signals to ensure the achievement of their design sensitivity. The careful design of the test signals plays a key role in the correct understanding and characterization of these instruments. In that sense, methods achieving optimal experiment designs must be considered as complementary to the parameter estimation methods being used to determine the parameters describing the system. The relevance of experiment design is particularly significant for the LISA Pathfinder mission, which will spend most of its operation time performing experiments to characterize key technologies for future space borne gravitational wave observatories. Here we propose a framework to derive the optimal signals in terms of minimum parameter uncertainty to be injected to these instruments during its calibration phase. We compare our results with an alternative numerical algorithm which achieves an optimal input signal by iteratively improving an initial guess. We show agreement of both approaches when applied to the LISA Pathfinder case.

  19. Optimizing spacecraft design - optimization engine development : progress and plans

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Dunphy, Julia R; Salcedo, Jose; Menzies, Tim

    2003-01-01

    At JPL and NASA, a process has been developed to perform life cycle risk management. This process requires users to identify: goals and objectives to be achieved (and their relative priorities), the various risks to achieving those goals and objectives, and options for risk mitigation (prevention, detection ahead of time, and alleviation). Risks are broadly defined to include the risk of failing to design a system with adequate performance, compatibility and robustness in addition to more traditional implementation and operational risks. The options for mitigating these different kinds of risks can include architectural and design choices, technology plans and technology back-up options, test-bed and simulation options, engineering models and hardware/software development techniques and other more traditional risk reduction techniques.
