Sample records for efficient design tool

  1. Benefits of Efficient Windows | Efficient Windows Collaborative

    Science.gov Websites

    Foundry Foundry New Construction Windows Window Selection Tool Selection Process Design Guidance Installation Replacement Windows Window Selection Tool Assessing Options Selection Process Design Guidance Installation Understanding Windows Benefits Design Considerations Measuring Performance Performance Standards

  2. Window Selection Tool | Efficient Windows Collaborative

    Science.gov Websites


  3. Design Considerations | Efficient Windows Collaborative

    Science.gov Websites


  4. Design Guidance for New Windows | Efficient Windows Collaborative

    Science.gov Websites


  5. Design Guidance for Replacement Windows | Efficient Windows Collaborative

    Science.gov Websites


  6. Development of High Efficiency (14%) Solar Cell Array Module

    NASA Technical Reports Server (NTRS)

    Iles, P. A.; Khemthong, S.; Olah, S.; Sampson, W. J.; Ling, K. S.

    1979-01-01

    The high efficiency solar cells required for the low cost modules were developed. Production tooling for the manufacture of the cells and modules was designed, consisting of: (1) a back contact soldering machine; (2) a vacuum pickup; (3) antireflective coating tooling; and (4) a test fixture.

  7. Gas Fills | Efficient Windows Collaborative

    Science.gov Websites


  8. Understanding Windows | Efficient Windows Collaborative

    Science.gov Websites


  9. Books & Publications | Efficient Windows Collaborative

    Science.gov Websites


  10. Efficient Windows Collaborative | Home

    Science.gov Websites


  11. Resources | Efficient Windows Collaborative

    Science.gov Websites


  12. Provide Views | Efficient Windows Collaborative

    Science.gov Websites


  13. Links | Efficient Windows Collaborative

    Science.gov Websites


  14. Reducing Condensation | Efficient Windows Collaborative

    Science.gov Websites


  15. Reduced Fading | Efficient Windows Collaborative

    Science.gov Websites


  16. EWC Membership | Efficient Windows Collaborative

    Science.gov Websites


  17. Visible Transmittance | Efficient Windows Collaborative

    Science.gov Websites


  18. EWC Members | Efficient Windows Collaborative

    Science.gov Websites


  19. Financing & Incentives | Efficient Windows Collaborative

    Science.gov Websites


  20. Increased Light & View | Efficient Windows Collaborative

    Science.gov Websites


  1. Windows for New Construction | Efficient Windows Collaborative

    Science.gov Websites


  2. Performance Standards for Windows | Efficient Windows Collaborative

    Science.gov Websites


  3. Air Leakage (AL) | Efficient Windows Collaborative

    Science.gov Websites


  4. State Fact Sheets | Efficient Windows Collaborative

    Science.gov Websites


  5. Fact Sheets & Publications | Efficient Windows Collaborative

    Science.gov Websites


  6. Condensation Resistance (CR) | Efficient Windows Collaborative

    Science.gov Websites


  7. Assessing Window Replacement Options | Efficient Windows Collaborative

    Science.gov Websites


  8. National Fenestration Rating Council (NFRC) | Efficient Windows Collaborative

    Science.gov Websites

  9. Low Conductance Spacers | Efficient Windows Collaborative

    Science.gov Websites


  10. Energy & Cost Savings | Efficient Windows Collaborative

    Science.gov Websites


  11. U-Factor (U-value) | Efficient Windows Collaborative

    Science.gov Websites


  12. Replacement Windows for Existing Homes | Efficient Windows Collaborative

    Science.gov Websites

    The Selection Tool will take you through a series of design conditions pertaining to your design and location.

  13. MODEST: a web-based design tool for oligonucleotide-mediated genome engineering and recombineering

    PubMed Central

    Bonde, Mads T.; Klausen, Michael S.; Anderson, Mads V.; Wallin, Annika I.N.; Wang, Harris H.; Sommer, Morten O.A.

    2014-01-01

    Recombineering and multiplex automated genome engineering (MAGE) offer the possibility to rapidly modify multiple genomic or plasmid sites at high efficiencies. This enables efficient creation of genetic variants including both single mutants with specifically targeted modifications as well as combinatorial cell libraries. Manual design of oligonucleotides for these approaches can be tedious, time-consuming, and may not be practical for larger projects targeting many genomic sites. At present, the change from a desired phenotype (e.g. altered expression of a specific protein) to a designed MAGE oligo, which confers the corresponding genetic change, is performed manually. To address these challenges, we have developed the MAGE Oligo Design Tool (MODEST). This web-based tool allows designing of MAGE oligos for (i) tuning translation rates by modifying the ribosomal binding site, (ii) generating translational gene knockouts and (iii) introducing other coding or non-coding mutations, including amino acid substitutions, insertions, deletions and point mutations. The tool automatically designs oligos based on desired genotypic or phenotypic changes defined by the user, which can be used for high efficiency recombineering and MAGE. MODEST is available for free and is open to all users at http://modest.biosustain.dtu.dk. PMID:24838561
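The oligo construction that MODEST automates can be illustrated with a minimal sketch: center the desired single-base change inside a fixed-length window of flanking homology (MAGE oligos are typically ~90-mers). The helper below is a hypothetical illustration of that idea, not code from MODEST.

```python
def design_mage_oligo(genome, pos, new_base, length=90):
    """Center a single-base substitution (0-based `pos`) inside `length`
    nucleotides of flanking homology, as MAGE oligos typically are."""
    half = length // 2
    return genome[pos - half:pos] + new_base + genome[pos + 1:pos + length - half]

# toy sequence: mutate the central G to T within a 21-nt window
oligo = design_mage_oligo("A" * 60 + "G" + "C" * 60, 60, "T", length=21)
```

A real design additionally accounts for replichore orientation, secondary structure, and mismatch-repair avoidance, which is exactly the bookkeeping the web tool handles.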

  14. Selection Process for New Windows | Efficient Windows Collaborative

    Science.gov Websites


  15. Selection Process for Replacement Windows | Efficient Windows Collaborative

    Science.gov Websites


  16. Solar Heat Gain Coefficient (SHGC) | Efficient Windows Collaborative

    Science.gov Websites


  17. A climate responsive urban design tool: a platform to improve energy efficiency in a dry hot climate

    NASA Astrophysics Data System (ADS)

    El Dallal, Norhan; Visser, Florentine

    2017-09-01

    In the Middle East and North Africa (MENA) region, new urban developments should address the climatic conditions to improve outdoor comfort and to reduce the energy consumption of buildings. This article describes a design tool that supports climate responsive design for a dry hot climate. The approach takes the climate as an initiator for the conceptual urban form with a more energy-efficient urban morphology. The methodology relates the different passive strategies suitable for major climate conditions in MENA region (dry-hot) to design parameters that create the urban form. This parametric design approach is the basis for a tool that generates conceptual climate responsive urban forms so as to assist the urban designer early in the design process. Various conceptual scenarios, generated by a computational model, are the results of the proposed platform. A practical application of the approach is conducted on a New Urban Community in Aswan (Egypt), showing the economic feasibility of the resulting urban form and morphology, and the proposed tool.

  18. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usage in a fuselage shell design environment will be integrated with these two primary research objectives.

  19. New Design Tool Can Help Cut building Energy Use

    Science.gov Websites

    The tool helps almost any architect or engineer evaluate passive solar and efficiency design strategies, enables them to walk through the design process and understand the consequences of design decisions, and includes a feature that tells designers how large of a heating, ventilation and air conditioning (HVAC) system...

  20. Design handbook : energy efficiency and water conservation in NAS facilities

    DOT National Transportation Integrated Search

    1997-09-30

    This handbook was created to provide definitive energy efficiency and water conservation design criteria for the design of NAS facilities. FAA-HDBK-001 provides implementation strategies and tools to comply with E.O. 12902, Energy and Water Conservat...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is considered for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.
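The estimation idea above (jobs characterized by CPU time and communication size, replayed through a simulated platform) can be sketched with a toy single-node model. This is an illustrative simplification, not the paper's tool; disk access is folded into the CPU term.

```python
def simulate(jobs, cpu_speed=1.0, net_bw=10.0):
    """Toy discrete-event estimate for one node: each job is
    (arrival, cpu_work, comm_bytes).  The CPU serves jobs in arrival
    order; sending results adds comm_bytes / net_bw after the CPU phase.
    Returns each job's finish time."""
    finish_times, cpu_free = [], 0.0
    for arrival, cpu_work, comm_bytes in sorted(jobs):
        start = max(arrival, cpu_free)          # wait for the CPU if busy
        cpu_free = start + cpu_work / cpu_speed  # compute phase
        finish_times.append(cpu_free + comm_bytes / net_bw)  # comm phase
    return finish_times
```

Swapping in different `cpu_speed`/`net_bw` values is the toy analogue of studying "different types of system configurations" before committing to hardware.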

  2. Economics of Agroforestry

    Treesearch

    D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey

    2014-01-01

    This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis...

  3. Fundamental Aeronautics Program: Overview of Project Work in Supersonic Cruise Efficiency

    NASA Technical Reports Server (NTRS)

    Castner, Raymond

    2011-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2011) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  4. Fundamental Aeronautics Program: Overview of Propulsion Work in the Supersonic Cruise Efficiency Technical Challenge

    NASA Technical Reports Server (NTRS)

    Castner, Ray

    2012-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  5. A front-end automation tool supporting design, verification and reuse of SOC.

    PubMed

    Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing

    2004-09-01

    This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low power embedded CPU core. The authors showed that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. This tool can be used to enhance the reusability of an intellectual property model, and facilitate porting designs to different platforms.
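The code-compression approach VPerl takes (emitting repetitive Verilog from concise loops instead of writing it by hand) can be sketched generically. The function below is a hypothetical illustration in Python, not VPerl itself, which is Perl-based.

```python
def gen_mux_cases(n):
    """Emit the repetitive case arms of an n-way Verilog mux from a loop
    (hypothetical generator in the spirit of VPerl, not its actual syntax)."""
    width = max(1, (n - 1).bit_length())   # selector bits needed for n inputs
    arms = "\n".join(f"    {width}'d{i}: out = in{i};" for i in range(n))
    return f"case (sel)\n{arms}\n    default: out = 0;\nendcase"
```

One loop replaces n hand-written arms, which is where the claimed factor-of-5 source compression comes from on highly regular structures.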

  6. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  7. Electronic Systems for Spacecraft Vehicles: Required EDA Tools

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic

    1999-01-01

    The continuous increase in complexity of electronic systems is making the design and manufacturing of such systems more challenging than ever before. As a result, designers are finding it impossible to design efficient systems without the use of sophisticated Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and lead to a correct by design methodology. This report identifies the EDA tools that would be needed to design, analyze, simulate, and evaluate electronic systems for spacecraft vehicles. In addition, the report presents recommendations to enhance the current JSC electronic design capabilities. This includes cost information and a discussion as to the impact, both positive and negative, of implementing the recommendations.

  8. Synthetic biology: tools to design microbes for the production of chemicals and fuels.

    PubMed

    Seo, Sang Woo; Yang, Jina; Min, Byung Eun; Jang, Sungho; Lim, Jae Hyung; Lim, Hyun Gyu; Kim, Seong Cheol; Kim, Se Yeon; Jeong, Jun Hong; Jung, Gyoo Yeol

    2013-11-01

    The engineering of biological systems to achieve specific purposes requires design tools that function in a predictable and quantitative manner. Recent advances in the field of synthetic biology, particularly in the programmable control of gene expression at multiple levels of regulation, have increased our ability to efficiently design and optimize biological systems to perform designed tasks. Furthermore, implementation of these designs in biological systems highlights the potential of using these tools to build microbial cell factories for the production of chemicals and fuels. In this paper, we review current developments in the design of tools for controlling gene expression at transcriptional, post-transcriptional and post-translational levels, and consider potential applications of these tools. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Morphogenic designer--an efficient tool to digitally design tooth forms.

    PubMed

    Hajtó, J; Marinescu, C; Silva, N R F A

    2014-01-01

    Different digital software tools are available today for the purpose of designing anatomically correct anterior and posterior restorations. The current concepts present weaknesses, which can be potentially addressed by more advanced modeling tools, such as the ones already available in professional CAD (Computer Aided Design) graphical software. This study describes the morphogenic designer (MGD) as an efficient and easy method for digitally designing tooth forms for the anterior and posterior dentition. Anterior and posterior tooth forms were selected from a collection of digitalized natural teeth and subjectively assessed as "average". The models in the form of STL files were filtered, cleaned, idealized, and re-meshed to match the specifications of the software used. The shapes were then imported as wavefront ".obj" model into Modo 701, software built for modeling, texturing, visualization, and animation. In order to create a parametric design system, intentional interactive deformations were performed on the average tooth shapes and then further defined as morph targets. By combining various such parameters, several tooth shapes were formed virtually and their images presented. MGD proved to be a versatile and powerful tool for the purpose of esthetic and functional digital crown designs.
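The morph-target deformation described above is, at its core, a weighted sum of per-vertex offsets between the average tooth shape and each deformed target. A minimal sketch for a single vertex, assuming all targets share the base mesh's topology (names here are illustrative, not Modo's API):

```python
def blend_vertex(base, targets, weights):
    """Morph-target blending for one vertex: the base position plus the
    weighted sum of (target - base) offsets, computed per coordinate."""
    return tuple(
        b + sum(w * (t[i] - b) for t, w in zip(targets, weights))
        for i, b in enumerate(base)
    )

# halfway toward a single target that raises the vertex by 1 unit in z
v = blend_vertex((0.0, 0.0, 0.0), [(0.0, 0.0, 1.0)], [0.5])
```

Applying this over every vertex, with one weight slider per morph target, gives the parametric design system the abstract describes: combining weights produces new plausible tooth shapes without remodeling from scratch.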

  10. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.
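Of the tools listed, response surface methodology is the most compact to illustrate: fit a low-order polynomial surrogate to sampled performance data, then optimize on the cheap surrogate instead of rerunning CFD. A minimal 1-D sketch (illustrative only; the cited work used multi-dimensional surfaces):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x^2 (a 1-D response surface)
    via the 3x3 normal equations, solved by Gaussian elimination."""
    S = [sum(x ** k for x in xs) for k in range(5)]          # moment sums
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]                           # augmented matrix
    for i in range(3):                                       # forward elimination
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))     # partial pivoting
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 4):
                A[r][c] -= f * A[i][c]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                      # back substitution
        coef[i] = (A[i][3] - sum(A[i][c] * coef[c] for c in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]
```

With the fitted `[a, b, c]`, the surrogate's stationary point `-b / (2*c)` gives a candidate design point to verify with higher-fidelity analysis, which mirrors the RSM-then-verify workflow in the abstract.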

  11. Experimental Performance Evaluation of a Supersonic Turbine for Rocket Engine Applications

    NASA Technical Reports Server (NTRS)

    Snellgrove, Lauren M.; Griffin, Lisa W.; Sieja, James P.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis and testing of the turbomachinery is necessary. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. These tools were applied to optimize a supersonic turbine design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The goal of the demonstration was to increase the total-to-static efficiency of the turbine by eight points over the baseline design. A sub-scale, cold flow test article modeling the final optimized turbine was designed, manufactured, and tested in air at MSFC's Turbine Airflow Facility. Extensive on- and off-design point performance data, steady-state data, and unsteady blade loading data were collected during testing.

  12. A Modeling Tool for Household Biogas Burner Flame Port Design

    NASA Astrophysics Data System (ADS)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
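    The port-sizing criterion mentioned above can be made concrete with the standard hydraulic-diameter formula, D_h = 4A/P. The sketch below is illustrative only; the port dimensions are hypothetical and not taken from the Lotus stove study:

```python
import math

def hydraulic_diameter(area, perimeter):
    """D_h = 4A / P, the standard definition for flow passages."""
    return 4.0 * area / perimeter

def circular_port(d):
    # For a circular port, D_h reduces to the diameter itself.
    return hydraulic_diameter(math.pi * d ** 2 / 4.0, math.pi * d)

def rectangular_port(w, h):
    return hydraulic_diameter(w * h, 2.0 * (w + h))

print(circular_port(2.5))                     # 2.5 (mm, hypothetical port)
print(round(rectangular_port(3.0, 2.0), 3))   # 2.4 (mm, hypothetical slot)
```

    Holding D_h in a target band while varying port count and shape is one way to compare circular and rectangular port arrays on equal footing.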

  13. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. 
Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
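    One common low-cost approach to online quality monitoring, re-executing a small random sample of approximate operations precisely and comparing results, can be sketched as follows. This is a generic illustration of the idea, not the thesis's actual implementation; `approximate` here simply injects a bounded synthetic error:

```python
import random

def precise(x):
    return x * x

def approximate(x):
    # Hypothetical approximate kernel: injects a bounded 1% synthetic
    # error to mimic an energy-saving approximate operation.
    return x * x * (1.0 + random.uniform(-0.01, 0.01))

class QualityMonitor:
    """Re-executes a small random sample of calls precisely and tracks
    the observed relative error of the approximate results."""
    def __init__(self, sample_rate=0.05):
        self.sample_rate = sample_rate
        self.errors = []

    def run(self, approx_fn, precise_fn, x):
        y = approx_fn(x)
        if random.random() < self.sample_rate:  # pay for precision only rarely
            exact = precise_fn(x)
            if exact != 0:
                self.errors.append(abs(y - exact) / abs(exact))
        return y

    def mean_error(self):
        return sum(self.errors) / len(self.errors) if self.errors else 0.0

random.seed(0)
mon = QualityMonitor(sample_rate=0.2)
outputs = [mon.run(approximate, precise, x) for x in range(1, 1001)]
print(round(mon.mean_error(), 4))  # small; bounded by the injected 1% error
```

    The sampling rate is the knob that trades monitoring energy against confidence in the error estimate, mirroring the requirement that the monitor not spend more energy than approximation saves.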

  14. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
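    The kind of Markov reliability model the authors propose to generate automatically can be illustrated with a minimal hand-built example: a two-component parallel system with a common failure rate and no repair. The rate value is arbitrary and the forward-Euler integration is chosen for clarity, not efficiency:

```python
import math

lam = 1e-3  # component failure rate per hour (illustrative value)

def reliability(t, steps=100_000):
    """P(system up at time t) for a 2-component parallel system with no
    repair. States: both up, one up, both failed; Euler step on the
    Markov chain's forward equations."""
    p2, p1, p0 = 1.0, 0.0, 0.0
    dt = t / steps
    for _ in range(steps):
        p2, p1, p0 = (p2 - 2 * lam * p2 * dt,
                      p1 + (2 * lam * p2 - lam * p1) * dt,
                      p0 + lam * p1 * dt)
    return p2 + p1

t = 1000.0
closed_form = 2 * math.exp(-lam * t) - math.exp(-2 * lam * t)
print(round(reliability(t), 4), round(closed_form, 4))  # the two agree
```

    Constructing such state spaces by hand is exactly the tedious, error-prone step the proposed next-generation tools would automate from a top-down system description.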

  15. Method for automation of tool preproduction

    NASA Astrophysics Data System (ADS)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency, tool availability, and quality of the machined surfaces with minimal expenditure of means and resources. Selecting the appropriate tool from the set of variants takes considerable time for the personnel engaged in tool preparation. Program software has been developed to solve this problem, which helps to create, systematize and carry out a comparative analysis of tool designs to identify the rational variant under given production conditions. The literature indicates that systematization and selection of the rational tool design has been carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and obtain a significant annual saving.

  16. Study on Ultra-deep Azimuthal Electromagnetic Resistivity LWD Tool by Influence Quantification on Azimuthal Depth of Investigation and Real Signal

    NASA Astrophysics Data System (ADS)

    Li, Kesai; Gao, Jie; Ju, Xiaodong; Zhu, Jun; Xiong, Yanchun; Liu, Shuai

    2018-05-01

    This paper proposes a new tool design of ultra-deep azimuthal electromagnetic (EM) resistivity logging while drilling (LWD) for deeper geosteering and formation evaluation, which can benefit hydrocarbon exploration and development. First, a forward numerical simulation of azimuthal EM resistivity LWD is created based on the fast Hankel transform (FHT) method, and its accuracy is confirmed under classic formation conditions. Then, a reasonable range of tool parameters is designed by analyzing the logging response. However, modern technological limitations pose challenges to selecting appropriate tool parameters for ultra-deep azimuthal detection under detectable signal conditions. Therefore, this paper uses grey relational analysis (GRA) to quantify the influence of tool parameters on voltage and azimuthal investigation depth. After analyzing thousands of simulation data points under different environmental conditions, a random forest is used to fit the data and identify an optimal combination of tool parameters, owing to its high efficiency and accuracy. Finally, the structure of the ultra-deep azimuthal EM resistivity LWD tool is designed with a theoretical azimuthal investigation depth of 27.42-29.89 m in classic isotropic and anisotropic formations. This design serves as a reliable theoretical foundation for efficient geosteering and formation evaluation in high-angle and horizontal (HA/HZ) wells in the future.
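    Grey relational analysis, the influence-quantification step described above, reduces to a short computation: distances from a reference sequence are mapped to relational coefficients and averaged into a grade. The sketch below uses made-up normalized responses, not the paper's simulation data:

```python
def grey_relational_grades(reference, sequences, rho=0.5):
    """Grey relational grade of each comparison sequence against the
    reference sequence (inputs assumed pre-normalized); rho is the
    conventional distinguishing coefficient."""
    deltas = [[abs(r - x) for r, x in zip(reference, seq)] for seq in sequences]
    flat = [d for row in deltas for d in row]
    dmin, dmax = min(flat), max(flat)
    grades = []
    for row in deltas:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Hypothetical normalized tool responses: an ideal reference and two
# candidate parameter settings.
ref = [1.0, 0.9, 0.8, 0.7]
g = grey_relational_grades(ref, [[0.95, 0.88, 0.79, 0.71],
                                 [0.60, 0.50, 0.60, 0.40]])
print([round(x, 3) for x in g])  # the first candidate tracks the reference better
```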

  17. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, low emissions and noise. Moreover, the high working temperatures enable their use for efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.
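    A control-oriented lumped SOFC model of the kind surveyed here typically expresses cell voltage as an ideal potential minus activation, ohmic, and concentration losses. The following sketch uses illustrative parameter values, not values from the paper; `asr` is an area-specific resistance in ohm*cm^2 and `j` a current density in A/cm^2:

```python
import math

R, F = 8.314, 96485.0  # gas constant (J/mol/K), Faraday constant (C/mol)

def cell_voltage(j, T=1073.0, E0=1.0, asr=0.3, j0=0.1, jl=2.0):
    """Lumped SOFC cell voltage (V) at current density j (A/cm^2):
    ideal potential E0 minus activation, ohmic, and concentration
    losses. All parameter values are illustrative assumptions."""
    eta_act = (R * T / (2 * F)) * math.asinh(j / (2 * j0))  # Butler-Volmer approx.
    eta_ohm = asr * j                                       # area-specific resistance
    eta_conc = -(R * T / (2 * F)) * math.log(1 - j / jl)    # mass-transport limit
    return E0 - eta_act - eta_ohm - eta_conc

for j in (0.1, 0.5, 1.0):
    print(j, round(cell_voltage(j), 3))  # voltage falls as load increases
```

    Models at this level of abstraction run fast enough to sit inside the control and diagnostic design loops the paper targets.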

  18. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  19. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies, including two major software systems currently used in the conceptual-level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  20. Operationally efficient propulsion system study (OEPSS) data book. Volume 7; Launch Operations Index (LOI) Design Features and Options

    NASA Technical Reports Server (NTRS)

    Ziese, James M.

    1992-01-01

    A figure-of-merit design tool was developed that allows the operability of a propulsion system design to be measured. This Launch Operations Index (LOI) relates operations efficiency to system complexity. The figure of merit can be used by conceptual designers to compare different propulsion system designs based on their impact on launch operations. The LOI will improve the design process by ensuring that direct launch operations experience feeds back into the design process.

  1. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.

    1992-01-01

    The goal was the design and implementation of software to be used in the conceptual design of aerospace vehicles. Several packages and design studies were completed, including two software tools currently used in the conceptual level design of aerospace vehicles. These tools are the Solid Modeling Aerospace Research Tool (SMART) and the Environment for Software Integration and Execution (EASIE). SMART provides conceptual designers with a rapid prototyping capability and additionally provides initial mass property analysis. EASIE provides a set of interactive utilities that simplify the task of building and executing computer aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors and improving efficiency.

  2. Exploration of depth modeling mode one lossless wedgelets storage strategies for 3D-high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan

    2018-01-01

    The 3D-high efficiency video coding has introduced tools to obtain higher efficiency in 3-D video coding, and most of them are related to the depth maps coding. Among these tools, the depth modeling mode-1 (DMM-1) focuses on better encoding edges regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in the DMM-1 hardware design of both encoder and decoder since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements and a hardware design targeting the most efficient among these algorithms are presented. Experimental results demonstrate that the proposed solutions surpass related works reducing up to 78.8% of the wedgelet memory, without degrading the encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces almost 75% of the power dissipation when compared to the standard approach.

  3. Design and application of a tool for structuring, capitalizing and making more accessible information and lessons learned from accidents involving machinery.

    PubMed

    Sadeghi, Samira; Sadeghi, Leyla; Tricot, Nicolas; Mathieu, Luc

    2017-12-01

    Accident reports are published in order to communicate the information and lessons learned from accidents. An efficient accident recording and analysis system is a necessary step towards improvement of safety. However, currently there is a shortage of efficient tools to support such recording and analysis. In this study we introduce a flexible and customizable tool that allows structuring and analysis of this information. This tool has been implemented under TEEXMA®. We named our prototype TEEXMA®SAFETY. This tool provides an information management system to facilitate data collection, organization, query, analysis and reporting of accidents. A predefined information retrieval module provides ready access to data which allows the user to quickly identify the possible hazards for specific machines and provides information on the source of hazards. The main target audience for this tool includes safety personnel, accident reporters and designers. The proposed data model has been developed by analyzing different accident reports.

  4. Optimal designs for copula models

    PubMed Central

    Perrone, E.; Müller, W.G.

    2016-01-01

    Copula modelling has in the past decade become a standard tool in many areas of applied statistics. However, a largely neglected aspect concerns the design of related experiments, particularly whether the estimation of copula parameters can be enhanced by optimizing experimental conditions, and how robust the parameter estimates are with respect to the type of copula employed. In this paper an equivalence theorem for (bivariate) copula models is provided that allows formulation of efficient design algorithms and quick checks of whether designs are optimal or at least efficient. Some examples illustrate that in practical situations considerable gains in design efficiency can be achieved. A natural comparison between different copula models with respect to design efficiency is provided as well. PMID:27453616
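    The equivalence-theorem check described above, verifying that the standardized variance of a candidate design never exceeds the number of model parameters, can be illustrated for a simple (non-copula) linear model on [-1, 1], where the two-point design at the endpoints is known to be D-optimal:

```python
def variance_function(x, design):
    """Standardized variance d(x, xi) = f(x)^T M^-1 f(x) for the linear
    model f(x) = (1, x); design is a list of (point, weight) pairs."""
    m00 = sum(w for _, w in design)
    m01 = sum(w * xi for xi, w in design)
    m11 = sum(w * xi * xi for xi, w in design)
    det = m00 * m11 - m01 * m01
    inv = [[m11 / det, -m01 / det], [-m01 / det, m00 / det]]
    f = [1.0, x]
    return sum(f[a] * inv[a][b] * f[b] for a in range(2) for b in range(2))

design = [(-1.0, 0.5), (1.0, 0.5)]  # candidate: half the weight at each end
d = [variance_function(x / 100.0, design) for x in range(-100, 101)]
print(max(d))  # never exceeds p = 2, so the design is D-optimal
```

    The copula-model version of this check works the same way, with the information matrix built from the copula likelihood instead of a linear regression.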

  5. High-order computational fluid dynamics tools for aircraft design

    PubMed Central

    Wang, Z. J.

    2014-01-01

    Most forecasts predict an annual airline traffic growth rate between 4.5 and 5% in the foreseeable future. To sustain that growth, the environmental impact of aircraft cannot be ignored. Future aircraft must have much better fuel economy, dramatically less greenhouse gas emissions and noise, in addition to better performance. Many technical breakthroughs must take place to achieve the aggressive environmental goals set up by governments in North America and Europe. One of these breakthroughs will be physics-based, highly accurate and efficient computational fluid dynamics and aeroacoustics tools capable of predicting complex flows over the entire flight envelope and through an aircraft engine, and computing aircraft noise. Some of these flows are dominated by unsteady vortices of disparate scales, often highly turbulent, and they call for higher-order methods. As these tools will be integral components of a multi-disciplinary optimization environment, they must be efficient to impact design. Ultimately, the accuracy, efficiency, robustness, scalability and geometric flexibility will determine which methods will be adopted in the design process. This article explores these aspects and identifies pacing items. PMID:25024419

  6. Circuit design tool. User's manual, revision 2

    NASA Technical Reports Server (NTRS)

    Miyake, Keith M.; Smith, Donald E.

    1992-01-01

    The CAM chip design was produced in a UNIX software environment using a design tool that supports definition of digital electronic modules, composition of these modules into higher level circuits, and event-driven simulation of these circuits. Our design tool provides an interface whose goals include straightforward but flexible primitive module definition and circuit composition, efficient simulation, and a debugging environment that facilitates design verification and alteration. The tool provides a set of primitive modules which can be composed into higher level circuits. Each module is a C-language subroutine that uses a set of interface protocols understood by the design tool. Primitives can be altered simply by recoding their C-code image; in addition, new primitives can be added, allowing higher level circuits to be described in C-code rather than as a composition of primitive modules; this feature can greatly enhance the speed of simulation.
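    The event-driven simulation style described above can be sketched in a few lines: modules become callbacks scheduled on a time-ordered event queue. This is a generic illustration in Python, not the tool's actual C interface:

```python
import heapq

class Simulator:
    """Minimal event-driven simulator: callbacks are dispatched from a
    time-ordered queue, advancing simulated time event by event."""
    def __init__(self):
        self.queue, self.now, self.seq = [], 0, 0

    def schedule(self, delay, callback, value):
        heapq.heappush(self.queue, (self.now + delay, self.seq, callback, value))
        self.seq += 1  # tie-breaker keeps same-time events in insertion order

    def run(self):
        while self.queue:
            self.now, _, cb, v = heapq.heappop(self.queue)
            cb(v)

# Compose two primitive "modules": an inverter driving a probe.
sim, trace = Simulator(), []
def inverter(v): sim.schedule(2, probe, 1 - v)  # 2 time-unit gate delay
def probe(v): trace.append((sim.now, v))

sim.schedule(0, inverter, 0)
sim.schedule(5, inverter, 1)
sim.run()
print(trace)  # [(2, 1), (7, 0)]
```

    Only modules touched by an event do any work, which is what makes the event-driven approach efficient for large, mostly idle circuits.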

  7. 78 FR 79053 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... discretion to the member organizations to define pre-set risk thresholds. The tools are designed to act as a... comments more efficiently, please use only one method. The Commission will post all comments on the... To Offer Risk Management Tools Designed To Allow Member Organizations To Monitor and Address Exposure...

  8. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert systems tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  9. Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites

    NASA Technical Reports Server (NTRS)

    Culver, Michael R.; Soong, Christine; Warner, Joseph D.

    2014-01-01

    In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by simple drag and drop of payload components onto a design area within the program. Information about each picked component is pulled from a database of common space-qualified communication components sold by commercial companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include access to a database of NASA ground based apertures for near Earth and Deep Space communication, the Tracking and Data Relay Satellite System (TDRSS) base apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with using them. Both of the aforementioned tools increase the productivity of space communication systems designers, and are accessible enough to allow non-experts in communications to design preliminary communication payloads.
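    The core of any link budget calculation of the kind this tool automates is received power as EIRP plus receive gain minus free-space path loss and miscellaneous losses, all in decibels. The sketch below uses illustrative numbers for a hypothetical S-band low-Earth-orbit pass, not values from the tool's databases:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 2.998e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def received_power_dbw(eirp_dbw, rx_gain_dbi, distance_m, freq_hz, losses_db=0.0):
    return eirp_dbw + rx_gain_dbi - fspl_db(distance_m, freq_hz) - losses_db

# Hypothetical pass: 2000 km slant range at 2.2 GHz, 20 dBW EIRP,
# 30 dBi ground aperture, 3 dB miscellaneous losses.
pr = received_power_dbw(20.0, 30.0, 2.0e6, 2.2e9, losses_db=3.0)
print(round(fspl_db(2.0e6, 2.2e9), 1), round(pr, 1))
```

    Working in decibels turns the chain of gains and losses into a simple sum, which is why a GUI front end over such formulas can safely replace hand-maintained spreadsheets.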

  10. A Web Based Collaborative Design Environment for Spacecraft

    NASA Technical Reports Server (NTRS)

    Dunphy, Julia

    1998-01-01

    In this era of shrinking federal budgets in the USA we need to dramatically improve our efficiency in the spacecraft engineering design process. We have come up with a method which captures much of the experts' expertise in a dataflow design graph and provides: a seamlessly connectable set of local and remote design tools; seamlessly connectable web-based design tools; and a web browser interface to the developing spacecraft design. We have recently completed our first web browser interface and demonstrated its utility in the design of an aeroshell using design tools located at web sites at three NASA facilities. Multiple design engineers and managers are now able to interrogate the design engine simultaneously and find out what the design looks like at any point in the design cycle, what its parameters are, and how it reacts to adverse space environments.

  11. Computational tool for simulation of power and refrigeration cycles

    NASA Astrophysics Data System (ADS)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring large cost savings in the production of electricity; for that reason, a simulation tool for power cycles allows modeling the optimal changes for best performance. There is also growing research interest in the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, and in which the working fluid is usually a refrigerant. A tool for designing the elements of an ORC cycle and selecting the working fluid would be helpful, because heat sources from cogeneration are very different and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed using the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles, selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculation of plant efficiency, and identification of the flow fractions in each branch, and finally generates a highly instructive report in PDF format via the LaTeX tool.
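    The plant-efficiency calculation such a tool performs reduces, for a simple Rankine cycle, to an enthalpy balance over the four state points. In practice the enthalpies would come from a property library such as CoolProp; the values below are merely illustrative steam-cycle numbers:

```python
def thermal_efficiency(h):
    """First-law efficiency of a simple Rankine cycle from specific
    enthalpies (kJ/kg) at its four state points."""
    w_turbine = h["turbine_in"] - h["turbine_out"]  # expansion work out
    w_pump = h["pump_out"] - h["pump_in"]           # compression work in
    q_in = h["turbine_in"] - h["pump_out"]          # boiler heat input
    return (w_turbine - w_pump) / q_in

# Illustrative enthalpies (a property library would supply real values):
states = {"pump_in": 192.0, "pump_out": 200.0,
          "turbine_in": 3375.0, "turbine_out": 2200.0}
print(round(thermal_efficiency(states), 3))
```

    Swapping working fluids or state points and recomputing this ratio is exactly the what-if loop the simulation tool is meant to speed up.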

  12. 48 CFR 323.7100 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE... the acquisition and use of designated recycled content, and Energy Star ®, Electronic Product Environmental Assessment Tool (EPEAT)-registered, energy-efficient, bio-based, and environmentally preferable...

  13. The effect of ergonomic laparoscopic tool handle design on performance and efficiency.

    PubMed

    Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S

    2015-09-01

    Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact that ergonomic design considerations have on surgeon performance. We hypothesized that there would be measurable differences in efficiency while performing FLS surgical trainer tasks with the two tool handle designs in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. that is widely used during minimally invasive surgery. Twenty-three (13 M, 10 F) participants with no existing upper extremity musculoskeletal disorders or experience performing laparoscopic procedures were selected to participate in this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial was completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing tasks. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater degrees of discomfort in their hands after using the pinch grip tool.
Furthermore, participants completed cutting and peg transfer tasks in a shorter time duration (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion times for the suturing task. Finally, there was no significant interaction between tool type and errors made during trials. There was a significant preference for as well as lower pain experienced during use of the pistol grip tool as seen from the survey feedback. Both evaluation tasks (cutting and peg transfer) were also completed significantly faster with the pistol grip tool. Finally, due to the high degree of variability in the error data, it was not possible to draw any meaningful conclusions about the effect of tool design on the number or degree of errors made.

  14. Experiences on developing digital down conversion algorithms using Xilinx system generator

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi

    2013-07-01

    The Digital Down Conversion (DDC) algorithm is a classical signal processing method which is widely used in radar and communication systems. In this paper, the DDC function is implemented on FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. It is very convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development process of the DDC function based on System Generator shows that System Generator is a very fast and efficient tool for FPGA design.
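    The DDC algorithm itself is compact: mix the input with a complex numerically controlled oscillator, low-pass filter, and decimate. The sketch below uses a crude moving-average filter as the low-pass stage purely for illustration; a real System Generator design would use a proper FIR or CIC filter:

```python
import cmath, math

def ddc(samples, fs, f_center, decimation):
    """Digital down-conversion: complex NCO mix, moving-average low-pass
    (a crude stand-in for a real FIR/CIC filter), then decimation."""
    mixed = [s * cmath.exp(-2j * math.pi * f_center * n / fs)
             for n, s in enumerate(samples)]
    return [sum(mixed[k:k + decimation]) / decimation
            for k in range(0, len(mixed) - decimation + 1, decimation)]

# A 200 kHz real tone sampled at 1 MHz, shifted to baseband:
fs, f0 = 1_000_000, 200_000
x = [math.cos(2 * math.pi * f0 * n / fs) for n in range(1000)]
y = ddc(x, fs, f0, decimation=10)
print(len(y), round(abs(y[0]), 3))  # 100 baseband samples, magnitude 0.5
```

    The positive-frequency half of the real tone lands at DC with amplitude 0.5, while the image at twice the tone frequency averages out over each decimation window.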

  15. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project.

    PubMed

    Portalés, Cristina; Casas, Sergio; Gimeno, Jesús; Fernández, Marcos; Poza, Montse

    2018-04-19

    Energy-efficient Buildings (EeB) are demanded in today’s constructions, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built in modules in factories, are a good example of this. Although this kind of building is quite new, in situ inspection is documented using traditional tools, mainly based on paper annotations; thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid in the regular processes of pre-fab building inspections to detect and eliminate possible quality and energy efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes.

  16. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project

    PubMed Central

    Fernández, Marcos; Poza, Montse

    2018-01-01

    Energy-efficient Buildings (EeB) are demanded in today’s constructions, fulfilling the requirements for green cities. Pre-fab buildings, which are fully built in modules in factories, are a good example of this. Although this kind of building is quite new, in situ inspection is documented using traditional tools, mainly based on paper annotations; thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project, which aims to provide an Augmented Reality (AR) tool that can seamlessly aid in the regular processes of pre-fab building inspections to detect and eliminate possible quality and energy efficiency deviations. In this regard, we describe the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes. PMID:29671799

  17. DyNAVacS: an integrative tool for optimized DNA vaccine design.

    PubMed

    Harish, Nagarajan; Gupta, Rekha; Agarwal, Parul; Scaria, Vinod; Pillai, Beena

    2006-07-01

    DNA vaccines have slowly emerged as keystones in preventive immunology due to their versatility in inducing both cell-mediated as well as humoral immune responses. The design of an efficient DNA vaccine involves the choice of a suitable expression vector, ensuring optimal expression by codon optimization, engineering CpG motifs for enhancing immune responses and providing additional sequence signals for efficient translation. DyNAVacS is a web-based tool created for rapid and easy design of DNA vaccines. It follows a step-wise design flow that guides the user through the sequential steps of vaccine design. Further, it allows restriction enzyme mapping, design of primers spanning user-specified sequences and provides information regarding the vectors currently used for generation of DNA vaccines. The web version runs on the Apache HTTP server. The interface is written in HTML and uses Common Gateway Interface (CGI) scripts written in Perl for its functionality. DyNAVacS is an integrated tool consisting of user-friendly programs, which require minimal information from the user. The software is available free of cost, as a web based application at URL: http://miracle.igib.res.in/dynavac/.
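Codon optimization, one of the design steps the abstract lists, can be sketched as a simple back-translation using a preferred-codon table. The table below is a tiny illustrative subset, not DyNAVacS's actual organism-specific usage data.

```python
# Minimal sketch of codon optimization: back-translate a protein using the
# most frequent synonymous codon per residue. Illustrative subset only.
PREFERRED = {"M": "ATG", "K": "AAG", "L": "CTG", "S": "AGC", "*": "TGA"}

def optimize(protein: str) -> str:
    """Return a coding sequence using the preferred codon for each residue."""
    return "".join(PREFERRED[aa] for aa in protein)

print(optimize("MKLS*"))  # ATGAAGCTGAGCTGA
```

Real tools additionally balance codon usage against constraints such as GC content, CpG motifs, and forbidden restriction sites.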

  18. A non-linear programming approach to the computer-aided design of regulators using a linear-quadratic formulation

    NASA Technical Reports Server (NTRS)

    Fleming, P.

    1985-01-01

    A design technique is proposed for linear regulators in which a feedback controller of fixed structure is chosen to minimize an integral quadratic objective function subject to the satisfaction of integral quadratic constraint functions. Application of a non-linear programming algorithm to this mathematically tractable formulation results in an efficient and useful computer-aided design tool. Particular attention is paid to computational efficiency and various recommendations are made. Two design examples illustrate the flexibility of the approach and highlight the special insight afforded to the designer.
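The formulation described here, a fixed-structure gain minimizing an integral quadratic cost subject to an integral quadratic constraint, can be illustrated on a scalar plant where the integrals have closed forms. The plant, weights, constraint bound, and penalty method below are invented for this sketch and are not the paper's actual design examples.

```python
# Scalar analogue: pick a fixed feedback gain k for x' = (a + b*k)*x,
# x(0) = 1, u = k*x, minimizing an integral quadratic cost subject to an
# integral quadratic constraint, posed as a nonlinear program via a penalty.
a, b, r = 1.0, 1.0, 0.1  # invented plant and weight values

def S(k):
    """State integral: S = int x^2 dt = -1/(2*(a + b*k)), for a + b*k < 0."""
    return -1.0 / (2.0 * (a + b * k))

def J(k):
    """Objective: int (x^2 + r*u^2) dt = (1 + r*k^2) * S(k)."""
    return (1.0 + r * k * k) * S(k)

def penalized(k, mu=1e3):
    """Quadratic penalty enforcing the constraint S(k) <= 0.3."""
    return J(k) + mu * max(0.0, S(k) - 0.3) ** 2

# A crude grid search over the stable region stands in for the NLP solver
candidates = [-1.0 - i * 0.001 for i in range(1, 20000)]
k_opt = min(candidates, key=penalized)
print(round(k_opt, 3))  # near -(1 + sqrt(11)) for these numbers
```

In the paper the decision variables are the free entries of a structured feedback matrix and the integrals are evaluated via Lyapunov equations; the scalar case just makes the cost/constraint trade-off visible.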

  19. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  20. Efficient Windows Collaborative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nils Petermann

    2010-02-28

    The project goals covered both the residential and commercial windows markets and involved a range of audiences such as window manufacturers, builders, homeowners, design professionals, utilities, and public agencies. Essential goals included: (1) Creation of 'Master Toolkits' of information that integrate diverse tools, rating systems, and incentive programs, customized for key audiences such as window manufacturers, design professionals, and utility programs. (2) Delivery of education and outreach programs to multiple audiences through conference presentations, publication of articles for builders and other industry professionals, and targeted dissemination of efficient window curricula to professionals and students. (3) Design and implementation of mechanisms to encourage and track sales of more efficient products through the existing Window Products Database as an incentive for manufacturers to improve products and participate in programs such as NFRC and ENERGY STAR. (4) Development of utility incentive programs to promote more efficient residential and commercial windows. Partnership with regional and local entities on the development of programs and customized information to move the market toward the highest performing products. An overarching project goal was to ensure that different audiences adopt and use the developed information, design and promotion tools and thus increase the market penetration of energy efficient fenestration products. In particular, a crucial success criterion was to move gas and electric utilities to increase the promotion of energy efficient windows through demand side management programs as an important step toward increasing the market share of energy efficient windows.

  1. Efficient simulation of press hardening process through integrated structural and CFD analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswamy, Hariharasudhan; Mondalek, Pamela; Wronski, Maciek

    Press hardened steel parts are being increasingly used in automotive structures for their higher strength to meet safety standards while reducing vehicle weight to improve fuel economy. However, manufacturing of sheet metal parts by press hardening process to achieve desired properties is extremely challenging as it involves complex interaction of plastic deformation, metallurgical change, thermal distribution, and fluid flow. Numerical simulation is critical for successful design of the process and to understand the interaction among the numerous process parameters to control the press hardening process in order to consistently achieve desired part properties. Until now there has been no integrated commercial software solution that can efficiently model the complete process from forming of the blank, heat transfer between the blank and tool, microstructure evolution in the blank, to heat loss from tool to the fluid that flows through water channels in the tools. In this study, a numerical solution based on the Altair HyperWorks® product suite involving RADIOSS®, a non-linear finite element based structural analysis solver, and AcuSolve®, an incompressible fluid flow solver based on the Galerkin Least Squares Finite Element Method, has been utilized to develop an efficient solution for complete press hardening process design and analysis. RADIOSS is used to handle the plastic deformation, heat transfer between the blank and tool, and microstructure evolution in the blank during cooling, while AcuSolve is used to efficiently model heat loss from the tool to the fluid that flows through water channels in the tools. The approach is demonstrated through some case studies.

  2. Current And Future Directions Of Lens Design Software

    NASA Astrophysics Data System (ADS)

    Gustafson, Darryl E.

    1983-10-01

    The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: Low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family. A rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.

  3. LADES: a software for constructing and analyzing longitudinal designs in biomedical research.

    PubMed

    Vázquez-Alcocer, Alan; Garzón-Cortes, Daniel Ladislao; Sánchez-Casas, Rosa María

    2014-01-01

    One of the most important steps in biomedical longitudinal studies is choosing a good experimental design that can provide high accuracy in the analysis of results with a minimum sample size. Several methods for constructing efficient longitudinal designs have been developed based on power analysis and the statistical model used for analyzing the final results. However, development of this technology is not available to practitioners through user-friendly software. In this paper we introduce LADES (Longitudinal Analysis and Design of Experiments Software) as an alternative and easy-to-use tool for conducting longitudinal analysis and constructing efficient longitudinal designs. LADES incorporates methods for creating cost-efficient longitudinal designs, unequal longitudinal designs, and simple longitudinal designs. In addition, LADES includes different methods for analyzing longitudinal data such as linear mixed models, generalized estimating equations, among others. A study of European eels is reanalyzed in order to show LADES capabilities. Three treatments contained in three aquariums with five eels each were analyzed. Data were collected from 0 up to the 12th week post treatment for all the eels (complete design). The response under evaluation is sperm volume. A linear mixed model was fitted to the results using LADES. The complete design had a power of 88.7% using 15 eels. With LADES we propose the use of an unequal design with only 14 eels and 89.5% efficiency. LADES was developed as a powerful and simple tool to promote the use of statistical methods for analyzing and creating longitudinal experiments in biomedical research.

  4. Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle

    NASA Astrophysics Data System (ADS)

    Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.

    2017-06-01

    The low-Reynolds number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high-altitude, long-endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect-ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight, and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft, and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified Genetic Algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally efficient tools in the present design-optimization framework. This paper includes the results of employing the present optimization tools in the design of a HALE flying-wing UAV to indicate that this is a viable design configuration option.

  5. High efficiency, long life terrestrial solar panel

    NASA Technical Reports Server (NTRS)

    Chao, T.; Khemthong, S.; Ling, R.; Olah, S.

    1977-01-01

    The design of a high efficiency, long life terrestrial module was completed. It utilized 256 rectangular, high efficiency solar cells to achieve high packing density and electrical output. Tooling for the fabrication of solar cells was in house and evaluation of the cell performance was begun. Based on the power output analysis, the goal of a 13% efficiency module was achievable.

  6. Design and validation of an improved graphical user interface with the 'Tool ball'.

    PubMed

    Lee, Kuo-Wei; Lee, Ying-Chu

    2012-01-01

    The purpose of this research is to introduce the design of an improved graphical user interface (GUI) and to verify the operational efficiency of the proposed interface. Until now, clicking the toolbar with the mouse has been the usual way to operate software functions. In our research, we designed an improved graphical user interface - a tool ball that is operated by a mouse wheel to perform software functions. Several experiments are conducted to measure the time needed to operate certain software functions with the traditional combination of "mouse click + tool button" and the proposed integration of "mouse wheel + tool ball". The results indicate that the tool ball design can accelerate the speed of operating software functions, decrease the number of icons on the screen, and enlarge the applications of the mouse wheel. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  7. Otto Van Geet | NREL

    Science.gov Websites

    Otto has been involved in the design, construction, and operation of energy-efficient buildings at NREL and in campus and community energy-use design. Mr. Van Geet was one of the founding members of Labs21. His work includes energy assessment, passive solar building design, use of design tools, and photovoltaic (PV) system design.

  8. Investigation of REST-Class Hypersonic Inlet Designs

    NASA Technical Reports Server (NTRS)

    Gollan, Rowan; Ferlemann, Paul G.

    2011-01-01

    Rectangular-to-elliptical shape-transition (REST) inlets are of interest for use on scramjet engines because they are efficient and integrate well with the forebody of a planar vehicle. The classic design technique by Smart for these inlets produces an efficient inlet but the complex three-dimensional viscous effects are only approximately included. Certain undesirable viscous features often occur in these inlets. In the present work, a design toolset has been developed which allows for rapid design of REST-class inlet geometries and the subsequent Navier-Stokes analysis of the inlet performance. This gives the designer feedback on the complex viscous effects at each design iteration. This new tool is applied to design an inlet for on-design operation at Mach 8. The tool allows for rapid investigation of design features that was previously not possible. The outcome is that the inlet shape can be modified to affect aspects of the flow field in a positive way. In one particular example, the boundary layer build-up on the bodyside of the inlet was reduced by 20% of the thickness associated with the classically designed inlet shape.

  9. Recipe for Success: Digital Viewables

    NASA Technical Reports Server (NTRS)

    LaPha, Steven; Gaydos, Frank

    2014-01-01

    The Engineering Services Contract (ESC) and Information Management Communication Support (IMCS) contract at Kennedy Space Center (KSC) provide services to NASA with respect to flight and ground systems design and development. These groups provide the necessary tools, aid, and best-practice methodologies required for efficient, optimized design and process development. The team is responsible for configuring and implementing systems and software, along with training, documentation, and administering standards. The team supports over 200 engineers and design specialists using Windchill, Creo Parametric, NX, AutoCAD, and a variety of other design and analysis tools.

  10. Efficient utilization of graphics technology for space animation

    NASA Technical Reports Server (NTRS)

    Panos, Gregory Peter

    1989-01-01

    Efficient utilization of computer graphics technology has become a major investment in the work of aerospace engineers and mission designers. These new tools are having a significant impact in the development and analysis of complex tasks and procedures which must be prepared prior to actual space flight. Design and implementation of useful methods in applying these tools has evolved into a complex interaction of hardware, software, network, video and various user interfaces. Because few people can understand every aspect of this broad mix of technology, many specialists are required to build, train, maintain and adapt these tools to changing user needs. Researchers have set out to create systems where an engineering designer can easily work to achieve goals with a minimum of technological distraction. This was accomplished with high-performance flight simulation visual systems and supercomputer computational horsepower. Control throughout the creative process is judiciously applied while maintaining generality and ease of use to accommodate a wide variety of engineering needs.

  11. Java web tools for PCR, in silico PCR, and oligonucleotide assembly and analysis.

    PubMed

    Kalendar, Ruslan; Lee, David; Schulman, Alan H

    2011-08-01

    The polymerase chain reaction is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. We have developed and tested efficient tools for PCR primer and probe design, which also predict oligonucleotide properties based on experimental studies of PCR efficiency. The tools provide comprehensive facilities for designing primers for most PCR applications and their combinations, including standard, multiplex, long-distance, inverse, real-time, unique, group-specific, bisulphite modification assays, Overlap-Extension PCR Multi-Fragment Assembly, as well as a programme to design oligonucleotide sets for long sequence assembly by ligase chain reaction. The in silico PCR primer or probe search includes comprehensive analyses of individual primers and primer pairs. It calculates the melting temperature for standard and degenerate oligonucleotides including LNA and other modifications, provides analyses for a set of primers with prediction of oligonucleotide properties, dimer and G-quadruplex detection, linguistic complexity, and provides a dilution and resuspension calculator. Copyright © 2011 Elsevier Inc. All rights reserved.
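As a rough illustration of the melting-temperature estimates such tools report, two widely known textbook formulas can be coded directly. The actual tools use nearest-neighbor thermodynamics and handle LNA and other modified bases, which this sketch does not.

```python
def wallace_tm(primer):
    """Wallace rule, Tm = 2*(A+T) + 4*(G+C), for short oligos (< ~14 nt)."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def basic_tm(primer):
    """Common approximation for longer primers: 64.9 + 41*(GC - 16.4)/N."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(p)

print(wallace_tm("ACGTACGTACGT"))  # 36
```

Both formulas ignore salt and oligo concentration, which production primer-design tools account for.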

  12. Application of modern tools and techniques to maximize engineering productivity in the development of orbital operations plans for the space station program

    NASA Technical Reports Server (NTRS)

    Manford, J. S.; Bennett, G. R.

    1985-01-01

    The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.

  13. Benchmarking CRISPR on-target sgRNA design.

    PubMed

    Yan, Jifang; Chuai, Guohui; Zhou, Chi; Zhu, Chenyu; Yang, Jing; Zhang, Chao; Gu, Feng; Xu, Han; Wei, Jia; Liu, Qi

    2017-02-15

    CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-based gene editing has been widely implemented in various cell types and organisms. A major challenge in the effective application of the CRISPR system is the need to design highly efficient single-guide RNA (sgRNA) with minimal off-target cleavage. Several tools are available for sgRNA design, but few have been compared directly. In our opinion, benchmarking the performance of the available tools and indicating their applicable scenarios are important issues. Moreover, whether the reported sgRNA design rules are reproducible across different sgRNA libraries, cell types and organisms remains unclear. In our study, a systematic and unbiased benchmark of sgRNA prediction efficacy was performed on nine representative on-target design tools, based on six benchmark data sets covering five different cell types. The benchmark study presented here provides novel quantitative insights into the available CRISPR tools. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
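On-target design starts by enumerating candidate sgRNAs: 20-nt protospacers immediately 5' of an NGG PAM, the SpCas9 convention. A minimal forward-strand scan (not taken from any of the benchmarked tools) might look like:

```python
import re

def find_sgrna_sites(seq):
    """Forward-strand scan for SpCas9 sites: a 20-nt protospacer immediately
    5' of an NGG PAM. A lookahead lets overlapping sites be reported."""
    sites = []
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper()):
        sites.append((m.group(1), m.group(2), m.start()))
    return sites

demo = "A" * 20 + "TGG" + "C" * 5  # hypothetical sequence with one site
print(find_sgrna_sites(demo))
```

A full tool would also scan the reverse complement and then score each candidate for predicted efficacy and off-target risk, which is exactly what the benchmarked tools differ on.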

  14. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  15. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types. Both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification and a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
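The discrete multi-location search described here can be sketched with a toy genetic algorithm. The gain values and diminishing-returns fitness below are invented for illustration and stand in for the paper's linearized variance-reduction objective.

```python
import random

random.seed(0)  # deterministic toy run

# Invented per-location information gains; fitness rewards picking the most
# informative locations with diminishing returns across sensors.
GAIN = [0.9, 0.1, 0.7, 0.3, 0.8, 0.2, 0.6, 0.4]
K = 3  # number of sensors to place

def fitness(design):
    g = sorted((GAIN[i] for i in design), reverse=True)
    return sum(v / (rank + 1) for rank, v in enumerate(g))

def random_design():
    return tuple(sorted(random.sample(range(len(GAIN)), K)))

def mutate(design):
    out = set(design)
    out.discard(random.choice(list(out)))  # drop one location...
    while len(out) < K:                    # ...and add a replacement
        out.add(random.randrange(len(GAIN)))
    return tuple(sorted(out))

pop = [random_design() for _ in range(20)]
for _ in range(40):                        # elitist generational loop
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = max(pop, key=fitness)
print(best)  # indices of the selected sensor locations
```

The paper's method additionally mixes sensor types into the chromosome and caches past fitness evaluations in an archive; the skeleton above only shows the discrete location search.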

  16. Commercial Building Energy Asset Score

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software (Asset Scoring Tool) is designed to help building owners and managers gain insight into the as-built efficiency of their buildings. It is a web tool where users can enter their building information and obtain an asset score report. The asset score report consists of modeled building energy use (by end use and by fuel type), building systems (envelope, lighting, heating, cooling, service hot water) evaluations, and recommended energy efficiency measures. The intended users are building owners and operators who have limited knowledge of building energy efficiency. The scoring tool collects minimum building data (~20 data entries) from users and builds a full-scale energy model using the inference functionalities of the Facility Energy Decision System (FEDS). The scoring tool runs real-time building energy simulation using EnergyPlus and performs life-cycle cost analysis using FEDS. An API is also under development to allow third-party applications to exchange data with the web service of the scoring tool.

  17. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguchi, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis and verification processes through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Finally, some future issues are discussed.

  18. SUNREL Related Links | Buildings | NREL

    Science.gov Websites

    SUNREL Related Links SUNREL Related Links DOE Simulation Software Tools Directory a directory of 301 building software tools for evaluation of energy efficiency, renewable energy, and sustainability in buildings. TREAT Software Program a computer program that uses SUNREL and is designed to provide

  19. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory-efficiency and speed, thus allowing for processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/ .

  20. Adaptable Interactive CBL Design Tools for Education.

    ERIC Educational Resources Information Center

    Chandra, Peter

    The design team approach to the development of computer based learning (CBL) courseware relies heavily on the effective communication between different members of the team, including up-to-date paperwork and documentation. This is important for the accurate and efficient overall coordination of the courseware design, and for future maintenance of…

  1. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    PubMed Central

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations. PMID:25340050
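One of the "first generation" singular objectives mentioned, unique restriction-site incorporation, reduces to checking that each recognition sequence occurs exactly once in the designed gene. A minimal check follows; the enzyme subset and gene sequence are hypothetical.

```python
# Hypothetical designed gene and a two-enzyme subset, for illustration only.
SITES = {"EcoRI": "GAATTC", "BamHI": "GGATCC"}

def site_positions(seq, site):
    """0-based start positions of every occurrence of a recognition site."""
    seq, site = seq.upper(), site.upper()
    return [i for i in range(len(seq) - len(site) + 1)
            if seq[i:i + len(site)] == site]

gene = "ATGGAATTCTTGGATCCGGATCCTAA"
for name, site in SITES.items():
    hits = site_positions(gene, site)
    print(name, hits, "unique" if len(hits) == 1 else "not unique")
```

The multi-objective tools the review surveys must satisfy checks like this while simultaneously preserving the protein sequence and optimizing codon usage, which is what makes the search space hard.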

  2. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least-cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.

  3. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages.
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
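    The operator-overloading style of AD benchmarked in the study can be illustrated with a minimal dual-number sketch (forward mode). This toy is only an analogue; the Fortran tools compared above generate full adjoint (reverse-mode) code:

```python
class Dual:
    """Forward-mode AD by operator overloading: each value carries its
    derivative, so evaluating f also evaluates f'. A toy analogue of
    the operator-overloading AD tools discussed above."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(float(other))
    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = self._coerce(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)  # product rule
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # analytic derivative: 6x + 2

y = f(Dual(2.0, 1.0))  # seed dx/dx = 1
print(y.val, y.dot)    # 17.0 14.0
```

    Source-transformation tools instead emit new derivative source code at compile time, which is why they avoid the per-operation overhead that makes overloading slow.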

  4. Examination of CRISPR/Cas9 design tools and the effect of target site accessibility on Cas9 activity.

    PubMed

    Lee, Ciaran M; Davis, Timothy H; Bao, Gang

    2018-04-01

    What is the topic of this review? In this review, we analyse the performance of recently described tools for CRISPR/Cas9 guide RNA design, in particular, design tools that predict CRISPR/Cas9 activity. What advances does it highlight? Recently, many tools designed to predict CRISPR/Cas9 activity have been reported. However, the majority of these tools lack experimental validation. Our analyses indicate that these tools have poor predictive power. Our preliminary results suggest that target site accessibility should be considered in order to develop better guide RNA design tools with improved predictive power. The recent adaptation of the clustered regulatory interspaced short palindromic repeats (CRISPR)/CRISPR-associated protein 9 (Cas9) system for targeted genome engineering has led to its widespread application in many fields worldwide. In order to gain a better understanding of the design rules of CRISPR/Cas9 systems, several groups have carried out large library-based screens leading to some insight into sequence preferences among highly active target sites. To facilitate CRISPR/Cas9 design, these studies have spawned a plethora of guide RNA (gRNA) design tools with algorithms based solely on direct or indirect sequence features. Here, we demonstrate that the predictive power of these tools is poor, suggesting that sequence features alone cannot accurately inform the cutting efficiency of a particular CRISPR/Cas9 gRNA design. Furthermore, we demonstrate that DNA target site accessibility influences the activity of CRISPR/Cas9. With further optimization, we hypothesize that it will be possible to increase the predictive power of gRNA design tools by including both sequence and target site accessibility metrics. © 2017 The Authors. Experimental Physiology © 2017 The Physiological Society.
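    The direct sequence features that the criticized gRNA design tools score are trivial to compute; GC fraction is a typical one. The guide sequences below are made up, and real tools combine many such features into a trained model:

```python
def gc_content(guide):
    """Fraction of G/C bases in a guide sequence, one of the simple
    sequence features that gRNA design tools score. The guides used
    below are invented examples, not validated CRISPR targets."""
    g = guide.upper()
    return (g.count("G") + g.count("C")) / len(g)

print(gc_content("GGCCATAT"))  # -> 0.5
```

    The paper's point is that such sequence-only scores correlate poorly with measured activity, and that chromatin/target-site accessibility would need to be added as a feature.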

  5. DEVELOPING A CAPE-OPEN COMPLIANT METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (CO-MFFP2T)

    EPA Science Inventory

    The USEPA is developing a Computer Aided Process Engineering (CAPE) software tool for the metal finishing industry that helps users design efficient metal finishing processes that are less polluting to the environment. Metal finishing process lines can be simulated and evaluated...

  6. Efficient monitoring of CRAB jobs at CMS

    NASA Astrophysics Data System (ADS)

    Silva, J. M. D.; Balcas, J.; Belforte, S.; Ciangottini, D.; Mascheroni, M.; Rupeika, E. A.; Ivanov, T. T.; Hernandez, J. M.; Vaandering, E.

    2017-10-01

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  7. Efficient Monitoring of CRAB Jobs at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, J. M.D.; Balcas, J.; Belforte, S.

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  8. Optimization of Compressor Mounting Bracket of a Passenger Car

    NASA Astrophysics Data System (ADS)

    Kalsi, Sachin; Singh, Daljeet; Saini, J. S.

    2018-05-01

    In the present work, CAE tools are used to optimize the compressor mounting bracket of an automobile. Both static and dynamic analyses are performed for the bracket. With the objective of minimizing mass and increasing stiffness, the new design is optimized using shape and topology optimization techniques. The optimized design given by the CAE tool is then validated experimentally. The new design results in lower vibration levels, lower mass, and lower cost, benefiting both the air conditioning system and the overall efficiency of the vehicle. The results from the CAE tool showed very good correlation with the experimental results.

  9. Efficient System Design and Sustainable Finance for China's Village Electrification Program: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, S.; Yin, H.; Kline, D. M.

    2006-08-01

    This paper describes a joint effort of the Institute for Electrical Engineering of the Chinese Academy of Sciences (IEE), and the U.S. National Renewable Energy Laboratory (NREL) to support China's rural electrification program. This project developed a design tool that provides guidelines both for off-grid renewable energy system designs and for cost-based tariff and finance schemes to support them. This tool was developed to capitalize on lessons learned from the Township Electrification Program that preceded the Village Electrification Program. We describe the methods used to develop the analysis, some indicative results, and the planned use of the tool in the Village Electrification Program.

  10. Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems

    DTIC Science & Technology

    2002-08-01

    AFRL-IF-RS-TR-2002-190, Final Technical Report, August 2002: Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems. Authors: Narayan R. Aluru, Jacob White. Computer Aided Design (CAD) tools for microfluidic components and systems were developed in this effort. Innovative numerical methods and algorithms for mixed…

  11. Toward Efficient Design of Reversible Logic Gates in Quantum-Dot Cellular Automata with Power Dissipation Analysis

    NASA Astrophysics Data System (ADS)

    Sasamal, Trailokya Nath; Singh, Ashutosh Kumar; Ghanekar, Umesh

    2018-04-01

    Nanotechnologies, notably Quantum-dot Cellular Automata (QCA), offer an attractive perspective for future computing technologies. In this paper, QCA is investigated as an implementation method for designing area- and power-efficient reversible logic gates. The proposed designs achieve superior performance by incorporating a compact 2-input XOR gate. The proposed designs for the Feynman, Toffoli, and Fredkin gates demonstrate 28.12, 24.4, and 7% reductions in cell count and utilize 46, 24.4, and 7.6% less area, respectively, over the previous best designs. The cell counts (area coverage) of the proposed Peres gate and Double Feynman gate are 44.32% (21.5%) and 12% (25%) less, respectively, than those of the most compact previous designs. Further, the delay of the Fredkin and Toffoli gates is 0.75 clock cycles, equal to the delay of the previous best designs, while the Feynman and Double Feynman gates achieve a delay of 0.5 clock cycles, matching the lowest delay reported previously. Energy analysis confirms that the average energy dissipation of the developed Feynman, Toffoli, and Fredkin gates is 30.80, 18.08, and 4.3% lower (at the 1.0 Ek energy level), respectively, compared to the best reported designs. This emphasizes the beneficial role of the proposed reversible gates in designing complex and power-efficient QCA circuits. The QCADesigner tool is used to validate the layouts of the proposed designs, and the QCAPro tool is used to evaluate the energy dissipation.
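    The logical behavior of the reversible gates named above is standard and can be checked directly, independent of their QCA layouts; each gate must be a bijection on its inputs:

```python
from itertools import product

def feynman(a, b):        # Feynman gate = CNOT: (A, A XOR B)
    return a, a ^ b

def toffoli(a, b, c):     # Toffoli = CCNOT: target flips iff both controls are 1
    return a, b, c ^ (a & b)

def fredkin(a, b, c):     # Fredkin = controlled swap of b and c
    return (a, c, b) if a else (a, b, c)

# Reversibility check: all 2^n input vectors map to 2^n distinct outputs.
for gate, n in ((feynman, 2), (toffoli, 3), (fredkin, 3)):
    outputs = {gate(*bits) for bits in product((0, 1), repeat=n)}
    print(gate.__name__, len(outputs) == 2 ** n)  # True for all three
```

    Reversibility (no information loss) is what makes these gates attractive for low-power QCA circuits, since erasing bits has a fundamental energy cost.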

  12. [CRISPR/CAS9, the King of Genome Editing Tools].

    PubMed

    Bannikov, A V; Lavrov, A V

    2017-01-01

    The discovery of CRISPR/Cas9 brought hope for an efficient, reliable, and readily available genome editing tool. CRISPR/Cas9 is certainly easy to use, while its efficiency and reliability remain the focus of studies. The review describes the general principles of the organization and function of Cas nucleases and a number of important issues to be considered when planning genome editing experiments with CRISPR/Cas9. These issues include evaluation of the efficiency and specificity of Cas9, sgRNA selection, artificially designed Cas9 variants, and the use of homologous recombination and nonhomologous end joining in DNA editing.

  13. Efficient Data Generation and Publication as a Test Tool

    NASA Technical Reports Server (NTRS)

    Einstein, Craig Jakob

    2017-01-01

    A tool to facilitate the generation and publication of test data was created to test the individual components of a command and control system designed to launch spacecraft. Specifically, this tool was built to ensure messages are properly passed between system components. The tool can also be used to test whether the appropriate groups have access (read/write privileges) to the correct messages. The messages passed between system components take the form of unique identifiers with associated values. These identifiers are alphanumeric strings that identify the type of message and the additional parameters that are contained within the message. The values that are passed with the message depend on the identifier. The data generation tool allows for the efficient creation and publication of these messages. A configuration file can be used to set the parameters of the tool and also specify which messages to pass.
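    A minimal sketch of the identifier/value messages described above, driven by a configuration. The identifiers, field names, and config shape here are hypothetical stand-ins, not the actual system's message catalog:

```python
import json

# Hypothetical config: the identifiers and values below are invented
# for illustration; the real system's identifiers are not given here.
CONFIG = {
    "GSE.TEMP.001": {"value": 72.4},
    "CMD.VALVE.OPEN": {"value": True},
}

def generate_messages(config):
    """Yield (identifier, JSON payload) pairs ready for publication.
    Each payload carries the alphanumeric identifier plus its
    identifier-dependent value, as the record describes."""
    for ident, spec in config.items():
        yield ident, json.dumps({"id": ident, "value": spec["value"]})

for ident, payload in generate_messages(CONFIG):
    print(ident, payload)
```

    Keeping the message set in a config file, as the record notes, lets testers vary which messages are published without changing the generator itself.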

  14. Improved injection needles facilitate germline transformation of the buckeye butterfly Junonia coenia.

    PubMed

    Beaudette, Kahlia; Hughes, Tia M; Marcus, Jeffrey M

    2014-01-01

    Germline transformation with transposon vectors is an important tool for insect genetics, but progress in developing transformation protocols for butterflies has been limited by high post-injection ova mortality. Here we present an improved glass injection needle design for injecting butterfly ova that increases survival in three Nymphalid butterfly species. Using the needles to genetically transform the common buckeye butterfly Junonia coenia, the hatch rate for injected Junonia ova was 21.7%, the transformation rate was 3%, and the overall experimental efficiency was 0.327%, a substantial improvement over previous results in other butterfly species. Improved needle design and a higher efficiency of transformation should permit the deployment of transposon-based genetic tools in a broad range of less fecund lepidopteran species.

  15. Design and Analysis of Bionic Cutting Blades Using Finite Element Method.

    PubMed

    Li, Mo; Yang, Yuwang; Guo, Li; Chen, Donghui; Sun, Hongliang; Tong, Jin

    2015-01-01

    The praying mantis is one of the most efficient predators in the insect world, armed with a pair of powerful tools: two sharp and strong forelegs. Its femur and tibia both carry a double row of strong spines along their posterior edges, which can firmly grasp the prey when the femur and tibia fold on each other during capture. These spines are so sharp that they can easily and quickly cut into the prey. The geometric characteristics of the praying mantis's foreleg, especially its tibia, have important reference value for the design of agricultural soil-cutting tools. Learning from the profile and arrangement of these spines, cutting blades with a toothed profile were designed in this work. Two different sizes of tooth structure and arrangement were utilized in the design of the cutting edge. A conventional smooth-edge blade was used for comparison with the bionic serrate-edge blades. To compare the working efficiency of the conventional blade and the bionic blades, 3D finite element simulation analysis and experimental measurements were carried out in the present work. Both the simulation and experimental results indicated that the bionic serrate-edge blades showed better performance in cutting efficiency.

  16. Design and Analysis of Bionic Cutting Blades Using Finite Element Method

    PubMed Central

    Li, Mo; Yang, Yuwang; Guo, Li; Chen, Donghui; Sun, Hongliang; Tong, Jin

    2015-01-01

    The praying mantis is one of the most efficient predators in the insect world, armed with a pair of powerful tools: two sharp and strong forelegs. Its femur and tibia both carry a double row of strong spines along their posterior edges, which can firmly grasp the prey when the femur and tibia fold on each other during capture. These spines are so sharp that they can easily and quickly cut into the prey. The geometric characteristics of the praying mantis's foreleg, especially its tibia, have important reference value for the design of agricultural soil-cutting tools. Learning from the profile and arrangement of these spines, cutting blades with a toothed profile were designed in this work. Two different sizes of tooth structure and arrangement were utilized in the design of the cutting edge. A conventional smooth-edge blade was used for comparison with the bionic serrate-edge blades. To compare the working efficiency of the conventional blade and the bionic blades, 3D finite element simulation analysis and experimental measurements were carried out in the present work. Both the simulation and experimental results indicated that the bionic serrate-edge blades showed better performance in cutting efficiency. PMID:27019583

  17. An application of nonlinear programming to the design of regulators of a linear-quadratic formulation

    NASA Technical Reports Server (NTRS)

    Fleming, P.

    1983-01-01

    A design technique is proposed for linear regulators in which a feedback controller of fixed structure is chosen to minimize an integral quadratic objective function subject to the satisfaction of integral quadratic constraint functions. Application of a nonlinear programming algorithm to this mathematically tractable formulation results in an efficient and useful computer aided design tool. Particular attention is paid to computational efficiency and various recommendations are made. Two design examples illustrate the flexibility of the approach and highlight the special insight afforded to the designer. One concerns helicopter longitudinal dynamics and the other the flight dynamics of an aerodynamically unstable aircraft.
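    For a scalar plant, the fixed-structure formulation above reduces to a one-dimensional program that can be solved directly. The sketch below is a toy under stated assumptions (first-order plant, no constraint functions), not the paper's algorithm; it recovers the known analytic LQR gain:

```python
def cost(k, a=1.0, b=1.0, q=1.0, r=1.0):
    """Integral quadratic cost for the scalar plant xdot = -a*x + b*u
    with fixed-structure feedback u = -k*x and x(0) = 1:
    J(k) = (q + r*k^2) / (2*(a + b*k)), valid while a + b*k > 0."""
    return (q + r * k * k) / (2 * (a + b * k))

def golden_min(f, lo, hi, tol=1e-10):
    """Golden-section search over the gain: a minimal stand-in for the
    paper's nonlinear programming step (constraints omitted here)."""
    phi = (5 ** 0.5 - 1) / 2
    while hi - lo > tol:
        m1, m2 = hi - phi * (hi - lo), lo + phi * (hi - lo)
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

k_opt = golden_min(cost, 0.0, 10.0)
print(round(k_opt, 4))  # -> 0.4142, the analytic LQR gain sqrt(2) - 1
```

    The paper's contribution is handling integral quadratic *constraints* alongside the objective and fixed controller structure, which a general nonlinear programming algorithm accommodates where classical LQR synthesis cannot.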

  18. NASA transmission research and its probable effects on helicopter transmission design

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.

    1983-01-01

    Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.

  19. NASA transmission research and its probable effects on helicopter transmission design

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.

    1984-01-01

    Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.

  20. Optimal cost design of water distribution networks using a decomposition approach

    NASA Astrophysics Data System (ADS)

    Lee, Ho Min; Yoo, Do Guen; Sadollah, Ali; Kim, Joong Hoon

    2016-12-01

    Water distribution network decomposition, which is an engineering approach, is adopted to increase the efficiency of obtaining the optimal cost design of a water distribution network using an optimization algorithm. This study applied the source tracing tool in EPANET, which is a hydraulic and water quality analysis model, to the decomposition of a network to improve the efficiency of the optimal design process. The proposed approach was tested by carrying out the optimal cost design of two water distribution networks, and the results were compared with other optimal cost designs derived from previously proposed optimization algorithms. The proposed decomposition approach using the source tracing technique enables the efficient decomposition of an actual large-scale network, and the results can be combined with the optimal cost design process using an optimization algorithm. This proves that the final design in this study is better than those obtained with other previously proposed optimization algorithms.
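    The source-tracing decomposition idea can be sketched as a multi-source traversal over the pipe graph. The tiny network below is invented, and EPANET's source tracing is a full hydraulic/water-quality simulation rather than this graph toy:

```python
from collections import deque

# Invented toy network: S1 and S2 are supply sources; directed edges
# follow the assumed flow direction through the pipes.
PIPES = {
    "S1": ["A"], "S2": ["D"],
    "A": ["B"], "B": ["C"], "D": ["C"], "C": [],
}

def decompose(graph, sources):
    """Multi-source BFS: label each node with the source that reaches it
    first, splitting the network into one subnetwork per source so each
    can be optimized separately."""
    label = {s: s for s in sources}
    queue = deque(sources)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in label:
                label[nxt] = label[node]
                queue.append(nxt)
    return label

print(decompose(PIPES, ["S1", "S2"]))
```

    Optimizing each labeled subnetwork independently shrinks the search space the optimization algorithm must cover, which is the efficiency gain the study reports.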

  1. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling, and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  2. Women's Energy Tool Kit: Home Heating, Cooling and Weatherization.

    ERIC Educational Resources Information Center

    Byalin, Joan

    This book is the first in a series of Energy Tool Kits designed for women by Consumer Action Now, a non-profit organization devoted to promoting energy efficiency and renewable energy resources. Information is provided in 16 sections: introduction, home energy survey; caulking; weatherstripping (double-hung and sliding windows, and casement,…

  3. Analyzing the impact of intermodal-related risk to the design and management of biofuel supply chain.

    DOT National Transportation Integrated Search

    2014-12-01

    The objective of this project is to design decision-support tools for identifying biorefinery locations that ensure a cost-efficient and reliable supply chain. We built mathematical models which take into consideration the benefits (such as acces...

  4. Design as a marketing tool: cater to your clients.

    PubMed

    Falick, J

    1982-09-01

    Competing successfully in the market and functioning efficiently often depend on a reassessment of the environment. Accordingly, upgraded convenience, comfort, and atmosphere have become major marketing mechanisms for hospitals. This article presents several examples of how hospitals have used design to provide marketing advantages.

  5. Methodological Foundations for Designing Intelligent Computer-Based Training

    DTIC Science & Technology

    1991-09-03

    student models, graphic forms, version control data structures, flowcharts, etc. Circuit simulations are an obvious case. A circuit, after all, can… flowcharts as a basic data structure, and we were able to generalize our tools to create a flowchart drawing tool for inputting both the appearance and… the meaning of flowcharts efficiently. For the Sherlock work, we built a tool that permitted inputting of information about front panels and

  6. Pistol Grip Tool (PGT) for Hubble Space Telescope servicing examined in KSC's Vertical Processing Facility

    NASA Image and Video Library

    1997-01-22

    KENNEDY SPACE CENTER, FLA. - In KSC's Vertical Processing Facility, Louise Kleba of the Vehicle Integration Test Team (VITT) and engineer Devin Tailor of Goddard Space Flight Center examine the Pistol Grip Tool (PGT), which was designed for use by astronauts during spacewalks. The PGT is a self-contained, micro-processor controlled, battery-powered tool. It also can be used as a nonpowered ratchet wrench. The experiences of the astronauts on the first Hubble Space Telescope (HST) servicing mission led to recommendations for this smaller, more efficient tool for precision work during spacewalks. The PGT will be used on the second HST servicing mission, STS-82. Liftoff aboard Discovery is scheduled Feb. 11.

  7. Development of Low Global Warming Potential Refrigerant Solutions for Commercial Refrigeration Systems using a Life Cycle Climate Performance Design Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdelaziz, Omar; Fricke, Brian A; Vineyard, Edward Allan

    Commercial refrigeration systems are known to be prone to high leak rates and to consume large amounts of electricity. As such, direct emissions related to refrigerant leakage and indirect emissions resulting from primary energy consumption contribute greatly to their Life Cycle Climate Performance (LCCP). In this paper, an LCCP design tool is used to evaluate the performance of a typical commercial refrigeration system with alternative refrigerants and minor system modifications to provide lower Global Warming Potential (GWP) refrigerant solutions with improved LCCP compared to baseline systems. The LCCP design tool accounts for system performance, ambient temperature, and system load; system performance is evaluated using a validated vapor compression system simulation tool while ambient temperature and system load are derived from a widely used building energy modeling tool (EnergyPlus). The LCCP design tool also accounts for the change in hourly electricity emission rate to yield an accurate prediction of indirect emissions. The analysis shows that conventional commercial refrigeration system life cycle emissions are largely due to direct emissions associated with refrigerant leaks and that system efficiency plays a smaller role in the LCCP. However, as a transition occurs to low GWP refrigerants, the indirect emissions become more relevant. Low GWP refrigerants may not be suitable for drop-in replacements in conventional commercial refrigeration systems; however some mixtures may be introduced as transitional drop-in replacements. These transitional refrigerants have a significantly lower GWP than baseline refrigerants and as such, improved LCCP. The paper concludes with a brief discussion on the tradeoffs between refrigerant GWP, efficiency and capacity.
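    The direct-plus-indirect emissions split discussed above can be sketched as a minimal LCCP calculation. The input values below are illustrative assumptions, not figures from the report:

```python
def lccp(charge_kg, annual_leak_rate, years, gwp,
         annual_kwh, kg_co2_per_kwh):
    """Simplified lifetime emissions (kg CO2-eq): direct refrigerant
    leakage plus indirect emissions from electricity use. Full LCCP
    also counts end-of-life loss, manufacturing, etc., omitted here."""
    direct = charge_kg * annual_leak_rate * years * gwp
    indirect = annual_kwh * years * kg_co2_per_kwh
    return direct + indirect

# Illustrative inputs only: a 100 kg charge of a GWP-3922 refrigerant
# leaking 15%/yr over a 15-year life, with assumed grid intensity.
total = lccp(100, 0.15, 15, 3922, 50000, 0.4)
print(total)
```

    With these numbers the leakage term dominates, mirroring the report's finding that direct emissions drive LCCP for high-GWP refrigerants, while the indirect (efficiency) term grows in relative importance as GWP falls.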

  8. Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds

    NASA Astrophysics Data System (ADS)

    Cheng, Tian

    Venetian blinds are popularly used in buildings to control the amount of incoming daylight for improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems could result in significant energy savings in both lighting and cooling. However, no convenient computer tool currently allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for this purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the first two tools give unacceptable accuracy due to the unrealistic assumptions they adopt, while the last one may generate large errors in certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient and is particularly unsuitable for optimally designing a building at the initial stage, because the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulations and optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to reasonably simulate the daylighting behavior of venetian blinds. Indoor illuminance at any reference point can be directly and efficiently computed. The new models have been validated with both experiments and simulations with Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and the accuracy provided by them is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus.
Two new methods are developed for the thermal simulation of buildings. A fast Fourier transform (FFT) method is presented to avoid the root-searching process in the inverse Laplace transform of multilayered walls. Generalized explicit FFT formulae for calculating the discrete Fourier transform (DFT) are developed for the first time. They can largely facilitate the implementation of FFT. The new method also provides a basis for generating the symbolic response factors. Validation simulations show that it can generate the response factors as accurate as the analytical solutions. The second method is for direct estimation of annual or seasonal cooling loads without the need for tedious hourly energy simulations. It is validated by hourly simulation results with DOE2. Then symbolic long-term cooling load can be created by combining the two methods with thermal network analysis. The symbolic long-term cooling load can keep the design parameters of interest as symbols, which is particularly useful for the optimal design and sensitivity analysis. The methodology is applied to an office building in Hong Kong for the optimal design of building envelope. Design variables such as window-to-wall ratio, building orientation, and glazing optical and thermal properties are included in the study. Results show that the selected design values could significantly impact the energy performance of windows, and the optimal design of side-lit buildings could greatly enhance energy savings. The application example also demonstrates that the developed methodology significantly facilitates the optimal building design and sensitivity analysis, and leads to high computational efficiency.
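    The discrete Fourier transform at the core of the FFT method above can be written directly; an FFT computes the same quantity in O(n log n) instead of the O(n^2) loop shown here:

```python
import cmath

def dft(x):
    """Direct O(n^2) discrete Fourier transform of a real sequence,
    the quantity the thesis's explicit FFT formulae evaluate when
    generating wall response factors."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

X = dft([1.0, 0.0, 0.0, 0.0])         # unit impulse -> flat spectrum
print([round(abs(v), 6) for v in X])  # -> [1.0, 1.0, 1.0, 1.0]
```

    Working in the frequency domain this way is what lets the method avoid the root-searching step of the inverse Laplace transform for multilayered walls.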

  9. Advanced control design for hybrid turboelectric vehicle

    NASA Technical Reports Server (NTRS)

    Abban, Joseph; Norvell, Johnesta; Momoh, James A.

    1995-01-01

    The new environmental standards are both a challenge and an opportunity for the industry and government bodies that manufacture and operate urban mass transit vehicles. A research investigation to provide a control scheme for efficient power management of the vehicle is in progress. Design requirements have been developed using functional analysis and trade studies of alternative power sources and controls. The design issues include portability, weight, and the emission/fuel efficiency of the induction motor, permanent magnet motor, and battery. A strategic design scheme to manage power requirements using advanced control systems is presented; it exploits fuzzy logic technology and a rule-based decision support scheme. The benefits of this study will enhance the economic and technical feasibility of providing a low-emission, fuel-efficient urban mass transit bus. The design team includes undergraduate researchers in our department. Sample results using the NASA HTEV simulation tool are presented.

  10. Architectural evaluation of dynamic and partial reconfigurable systems designed with DREAMS tool

    NASA Astrophysics Data System (ADS)

    Otero, Andrés.; Gallego, Ángel; de la Torre, Eduardo; Riesgo, Teresa

    2013-05-01

    The benefits of dynamic and partial reconfigurable systems are increasingly being accepted by industry. For this reason, SRAM-based FPGA manufacturers have improved, or even introduced for the first time, the support they offer for designing this kind of system. However, commercial tools still offer poor flexibility, which leads to limited efficiency, as witnessed by the overhead introduced by communication primitives and the inability to relocate reconfigurable modules, among other limitations. For this reason, the authors have proposed an academic design tool called DREAMS, which targets the design of dynamically reconfigurable systems. In this paper, the main features offered by DREAMS are described and compared with those of existing commercial and academic tools. Moreover, a graphical user interface (GUI) is described for the first time in this work, with the aim of simplifying the design process and hiding low-level, device-dependent details from the system designer; the overall goal is to increase designer productivity. Using the graphical interface, different reconfigurable architectures are provided as design examples, including both conventional slot-based architectures and mesh-type designs.

  11. A modelling tool for policy analysis to support the design of efficient and effective policy responses for complex public health problems.

    PubMed

    Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew

    2015-03-03

    In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). 
Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.

  12. Topology and boundary shape optimization as an integrated design tool

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two-dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via common FEM mesh generators and CAD-type input/output facilities.

  13. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  14. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life-cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools that could enhance the effectiveness and efficiency of activities directed toward this mission. A group of interrelated applications has been procured or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper reviews the current configuration of these tools and provides information on future milestones and directions.

  15. Design and evaluation of a software prototype for participatory planning of environmental adaptations.

    PubMed

    Eriksson, J; Ek, A; Johansson, G

    2000-03-01

    A software prototype to support the planning process for adapting home and work environments for people with physical disabilities was designed and later evaluated. The prototype exploits low-cost three-dimensional (3-D) graphics products in the home computer market. The essential features of the prototype are: interactive rendering with optional hardware acceleration, interactive walk-throughs, direct manipulation tools for moving objects and measuring distances, and import of 3-D-objects from a library. A usability study was conducted, consisting of two test sessions (three weeks apart) and a final interview. The prototype was then tested and evaluated by representatives of future users: five occupational therapist students, and four persons with physical disability, with no previous experience of the prototype. Emphasis in the usability study was placed on the prototype's efficiency and learnability. We found that it is possible to realise a planning tool for environmental adaptations, both regarding usability and technical efficiency. The usability evaluation confirms our findings from previous case studies, regarding the relevance and positive attitude towards this kind of planning tool. Although the prototype was found to be satisfactorily efficient for the basic tasks, the paper presents several suggestions for improvement of future prototype versions.

  16. Computerized Design Synthesis (CDS), A database-driven multidisciplinary design tool

    NASA Technical Reports Server (NTRS)

    Anderson, D. M.; Bolukbasi, A. O.

    1989-01-01

    The Computerized Design Synthesis (CDS) system under development at McDonnell Douglas Helicopter Company (MDHC) is targeted to make revolutionary improvements in both response time and resource efficiency in the conceptual and preliminary design of rotorcraft systems. It makes the accumulated design database and supporting technology analysis results readily available to designers and analysts of technology, systems, and production, and makes powerful design synthesis software available in a user friendly format.

  17. VirusDetect: An automated pipeline for efficient virus discovery using deep sequencing of small RNAs

    USDA-ARS?s Scientific Manuscript database

    Accurate detection of viruses in plants and animals is critical for agriculture production and human health. Deep sequencing and assembly of virus-derived siRNAs has proven to be a highly efficient approach for virus discovery. However, to date no computational tools specifically designed for both k...

  18. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  19. Technology Prioritization: Transforming the U.S. Building Stock to Embrace Energy Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdelaziz, Omar; Farese, Philip; Abramson, Alexis

    2013-01-01

    The U.S. buildings sector is responsible for about 40% of national energy expenditures, due in part to wasteful use of resources and the limited consideration given to energy efficiency during the design and retrofit phases. Recent studies have indicated the potential for 30-50% energy savings in the U.S. buildings sector using currently available technologies. This paper discusses efforts to accelerate the transformation of the U.S. building energy efficiency sector using a new technology prioritization framework. The underlying analysis examines building energy use micro-segments using the Energy Information Administration Annual Energy Outlook and other publicly available information. The tool includes a stock-and-flow model to track stock vintage and efficiency levels over time. It can be used to investigate energy efficiency measures under a variety of scenarios, and has a built-in energy accounting framework to prevent double counting of energy savings within any given portfolio. The tool is intended to inform decision making and to estimate long-term potential energy savings for different market adoption scenarios.
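    The stock-and-flow idea described above can be sketched in a few lines. This is an illustrative toy, not the OSTI tool itself; the retirement rate, build rate, and efficiency-improvement figures are all assumptions:

```python
# Illustrative stock-and-flow sketch (not the paper's tool): track a building
# stock by vintage, retire a fixed fraction of every vintage each year, and
# add new stock built to an improving efficiency level. All numbers assumed.

retire_rate = 0.02            # fraction of each vintage retired per year
new_build = 1.0               # stock units added per year
years = 10

# (stock_units, energy_use_per_unit) per vintage; start with one old vintage
stock = [(100.0, 1.0)]        # 100 units at baseline intensity 1.0

for year in range(years):
    # retire a fraction of every existing vintage
    stock = [(units * (1.0 - retire_rate), eui) for units, eui in stock]
    # new vintage built at an improving efficiency level (2%/yr, assumed)
    stock.append((new_build, (1.0 - 0.02) ** (year + 1)))

total_units = sum(u for u, _ in stock)
total_energy = sum(u * e for u, e in stock)
print(total_energy / total_units < 1.0)   # True: average intensity falls
```

Tracking vintages separately is what lets such a model avoid double counting: each efficiency measure applies only to the vintages it can actually reach.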

  20. 2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions

    DTIC Science & Technology

    2017-12-21

    modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability... to enable predictive analytics for expeditionary on-demand manufacturing • discovery of design principles to enable programming advanced genetic... goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data

  1. Needs Analysis of Business English Undergraduates and the Implications to Business English Curriculum Design

    ERIC Educational Resources Information Center

    Li, Juan

    2014-01-01

    Needs Analysis is a valuable and irreplaceable tool in the curriculum design of Business English courses. It ensures a focused and efficient curriculum design responsive to the learners' needs. This paper analyses the needs of Business English undergraduates and the information obtained may offer some helpful suggestions to the setting of the…

  2. Design Aids for Real-Time Systems (DARTS)

    NASA Technical Reports Server (NTRS)

    Szulewski, P. A.

    1982-01-01

    Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.

  3. Managing Input during Assistive Technology Product Design

    ERIC Educational Resources Information Center

    Choi, Young Mi

    2011-01-01

    Many different sources of input are available to assistive technology innovators during the course of designing products. However, there is little information on which ones may be most effective or how they may be efficiently utilized within the design process. The aim of this project was to compare how three types of input--from simulation tools,…

  4. Research on AutoCAD secondary development and function expansion based on VBA technology

    NASA Astrophysics Data System (ADS)

    Zhang, Runmei; Gu, Yehuan

    2017-06-01

    AutoCAD is the most widely used drawing tool among similar design drawing products. Producing different types of design drawings of the same product involves a great deal of repetitive, monotonous work, and the traditional manual method of drawing in AutoCAD suffers from low efficiency, a high error rate, and high input cost, among other shortcomings. To solve these problems, this paper presents a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections, built with the VBA secondary development tool and the Access database for large-capacity data storage, together with an analysis of the functional extension of plane drawing and parametric drawing design. Through this secondary development of AutoCAD functions, drawing work is simplified and efficiency is greatly improved. Introducing parametric design into the AutoCAD drawing system promotes the industrial mass production of standardized products such as hot-rolled I-beams and the economic growth of related industries.
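    The essence of parametric drawing is that a cross-section is generated from dimensions rather than drawn by hand. A hedged sketch in Python (not the paper's VBA/AutoCAD code; the I-beam dimensions are example values):

```python
# Illustrative parametric-drawing sketch (Python, not the paper's VBA code):
# the I-beam outline is generated from a few dimensional parameters, so
# changing a dimension regenerates the drawing instead of redrawing by hand.

def i_beam_outline(h, b, tw, tf):
    """Corner points (counter-clockwise) of an I-beam cross-section centred
    on the origin: h overall height, b flange width, tw web thickness,
    tf flange thickness."""
    return [
        (-b / 2, -h / 2), (b / 2, -h / 2),                # bottom flange
        (b / 2, -h / 2 + tf), (tw / 2, -h / 2 + tf),
        (tw / 2, h / 2 - tf), (b / 2, h / 2 - tf),        # web, right side
        (b / 2, h / 2), (-b / 2, h / 2),                  # top flange
        (-b / 2, h / 2 - tf), (-tw / 2, h / 2 - tf),
        (-tw / 2, -h / 2 + tf), (-b / 2, -h / 2 + tf),    # web, left side
    ]

def polygon_area(pts):
    """Shoelace formula for a simple polygon."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

pts = i_beam_outline(h=200.0, b=100.0, tw=6.0, tf=10.0)   # example beam, mm
area = polygon_area(pts)
# Analytic check: two flanges plus the web between them = 3080 mm^2
print(area == 2 * 100.0 * 10.0 + (200.0 - 2 * 10.0) * 6.0)   # True
```

In the paper's system, the same idea is realized by having VBA emit these computed vertices as AutoCAD polyline entities, with the dimensions read from the Access database.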

  5. Aeroelastic Optimization Study Based on the X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley W.; Pak, Chan-Gi

    2014-01-01

    One way to increase aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process that incorporates an object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high-fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize structural weight while meeting design requirements including strength, buckling, and flutter. This approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise through the ply stacking sequence. A hybrid discretization optimization approach improves the accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass-balancing optimization for the fabricated flexible wing of the X-56A model, since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of the X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.

  6. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
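    The cost argument above can be illustrated on a toy linear "simulation": one adjoint solve yields the sensitivity of the output with respect to every input at once. The system below is an arbitrary example, not a CFD code:

```python
import numpy as np

# Sketch of the adjoint idea on a toy linear "simulation" A u = f with
# scalar output J = c^T u. The adjoint solve A^T lam = c yields dJ/df_i =
# lam_i for ALL inputs at the cost of one extra solve, versus one forward
# solve per input for finite differences. Matrices are arbitrary examples.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned system
f = rng.standard_normal(n)
c = rng.standard_normal(n)

# One adjoint solve gives the full gradient dJ/df
lam = np.linalg.solve(A.T, c)

# Check one component against a finite difference of the forward problem
def J(rhs):
    return c @ np.linalg.solve(A, rhs)

eps = 1e-6
fd = (J(f + eps * np.eye(n)[0]) - J(f)) / eps
print(abs(lam[0] - fd) < 1e-4)   # True
```

With n design parameters, the finite-difference route needs n extra forward solves while the adjoint route needs exactly one, which is the scaling advantage the abstract describes.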

  7. Low-power, high-speed 1-bit inexact Full Adder cell designs applicable to low-energy image processing

    NASA Astrophysics Data System (ADS)

    Zareei, Zahra; Navi, Keivan; Keshavarziyan, Peiman

    2018-03-01

    In this paper, three novel low-power and high-speed 1-bit inexact Full Adder cell designs are presented based on current mode logic in 32 nm carbon nanotube field effect transistor technology for the first time. The circuit-level figures of merits, i.e. power, delay and power-delay product as well as application-level metric such as error distance, are considered to assess the efficiency of the proposed cells over their counterparts. The effect of voltage scaling and temperature variation on the proposed cells is studied using HSPICE tool. Moreover, using MATLAB tool, the peak signal to noise ratio of the proposed cells is evaluated in an image-processing application referred to as motion detector. Simulation results confirm the efficiency of the proposed cells.
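    The error-distance metric mentioned above compares the numeric value of the approximate and exact adder outputs over all input patterns. A sketch with a hypothetical single-entry approximation (not one of the paper's three designs):

```python
from itertools import product

# Toy model of the "error distance" metric for an inexact 1-bit full adder.
# The specific approximation below is hypothetical, not one of the paper's
# three designs: it flips the sum output for a single input pattern.

def exact_fa(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def inexact_fa(a, b, cin):
    s, cout = exact_fa(a, b, cin)
    if (a, b, cin) == (1, 1, 1):   # hypothetical single-entry approximation
        s = 0                      # exact sum would be 1
    return s, cout

# Error distance: sum over all inputs of |exact value - approximate value|,
# where the adder's value is 2*cout + s
ed = 0
for a, b, cin in product((0, 1), repeat=3):
    es, ec = exact_fa(a, b, cin)
    xs, xc = inexact_fa(a, b, cin)
    ed += abs((2 * ec + es) - (2 * xc + xs))

print(ed)   # 1: only the (1,1,1) pattern differs, by one
```

In image-processing use, a small error distance is what keeps the peak signal-to-noise ratio acceptable despite the simplified hardware.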

  8. PrimerDesign-M: A multiple-alignment based multiple-primer design tool for walking across variable genomes

    DOE PAGES

    Yoon, Hyejin; Leitner, Thomas

    2014-12-17

    Analyses of entire viral genomes or mtDNA require comprehensive design of many primers across the genome. In addition, simultaneous optimization of several DNA primer design criteria may improve overall experimental efficiency and downstream bioinformatic processing. To achieve these goals, we developed PrimerDesign-M. It includes several options for multiple-primer design, allowing researchers to efficiently design walking primers that cover long DNA targets, such as entire HIV-1 genomes, with primers simultaneously optimized for the genetic diversity in multiple alignments and for experimental design constraints given by the user. PrimerDesign-M can also design primers that include DNA barcodes and minimize primer dimerization. PrimerDesign-M finds optimal primers for highly variable DNA targets and facilitates design flexibility by suggesting alternative designs to adapt to experimental conditions.
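    One of the design criteria mentioned above, minimizing primer dimerization, can be illustrated with a simple 3'-complementarity heuristic. This is a toy screen, not PrimerDesign-M's actual algorithm, and the sequences are made up:

```python
# Toy primer-dimer screen (an illustrative heuristic, not PrimerDesign-M's
# algorithm): flag a primer pair whose 3' ends are mutually complementary,
# a classic dimerization risk. Sequences below are made up.

COMP = str.maketrans("ACGT", "TGCA")

def dimer_3prime(p1, p2, k=4):
    """True if the last k bases of p1 can anneal to the last k bases of p2,
    i.e. the 3' tail of p1 equals the reverse complement of p2's 3' tail."""
    return p1[-k:] == p2[-k:].translate(COMP)[::-1]

# ACGT is its own reverse complement, so these two tails can anneal
print(dimer_3prime("GGGGACGT", "CCCCACGT"))   # True
print(dimer_3prime("GGGGACGT", "CCCCAAAA"))   # False
```

Real primer-design tools score partial and internal complementarity with thermodynamic models rather than an exact tail match, but the screening structure is the same.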

  9. Characteristics of a semi-custom library development system

    NASA Technical Reports Server (NTRS)

    Yancey, M.; Cannon, R.

    1990-01-01

    Standard cell and gate array macro libraries are in common use with workstation computer-aided design (CAD) tools for application-specific integrated circuit (ASIC) semi-custom applications and have resulted in significant improvements in overall design efficiency as contrasted with custom design methodologies. Similar design methodology enhancements providing for the efficient development of the library cells themselves are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for the library development engineer working in state-of-the-art process technologies are presented, along with an overview of Gould's library development system ('Accolade').

  10. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    NASA Astrophysics Data System (ADS)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid-structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures, and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and we layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
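    Stochastic collocation replaces brute-force sampling with model evaluations at quadrature nodes. A minimal non-adaptive sketch for a single Gaussian uncertain input (the quadratic "model" and the input statistics are stand-ins for an expensive flow simulation):

```python
import numpy as np

# Minimal stochastic-collocation sketch (illustrative, not the authors'
# adaptive scheme): propagate a Gaussian uncertain input through a model by
# evaluating it at Gauss-Hermite collocation points rather than at many
# Monte Carlo samples.

def model(x):
    return x ** 2          # stand-in for an expensive flow simulation

mu, sigma = 2.0, 0.5       # uncertain input: x ~ N(mu, sigma^2), assumed

# Gauss-Hermite nodes/weights for weight exp(-t^2); the change of variables
# x = mu + sqrt(2)*sigma*t gives E[model(x)] = (1/sqrt(pi)) * sum w_i f(x_i)
t, w = np.polynomial.hermite.hermgauss(5)
x = mu + np.sqrt(2.0) * sigma * t
mean = (w @ model(x)) / np.sqrt(np.pi)

# Analytic check: E[x^2] = mu^2 + sigma^2 = 4.25
print(round(mean, 6))   # 4.25
```

Five model evaluations recover the exact mean here because the quadrature integrates low-order polynomials exactly; the adaptive schemes in the paper extend the same idea to many uncertain inputs.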

  11. Applicability of the Design Tool for Inventory and Monitoring (DTIM) and the Explore Sample Data Tool for the Assessment of Caribbean Forest Dynamics

    Treesearch

    Humfredo Marcano-Vega; Andrew Lister; Kevin Megown; Charles Scott

    2016-01-01

    There is a growing need within the insular Caribbean for technical assistance in planning forest-monitoring projects and data analysis. This paper gives an overview of software tools developed by the USDA Forest Service’s National Inventory and Monitoring Applications Center and the Remote Sensing Applications Center. We discuss their applicability in the efficient...

  12. Efficient design of multituned transmission line NMR probes: the electrical engineering approach.

    PubMed

    Frydel, J A; Krzystyniak, M; Pienkowski, D; Pietrzak, M; de Sousa Amadeu, N; Ratajczyk, T; Idzik, K; Gutmann, T; Tietze, D; Voigt, S; Fenn, A; Limbach, H H; Buntkowsky, G

    2011-01-01

    Transmission line-based multi-channel solid-state NMR probes have many advantages regarding cost of construction, number of RF channels, and achievable RF power levels. Nevertheless, these probes are only rarely employed in solid-state NMR labs, mainly owing to the difficult experimental determination of the necessary RF parameters. Here, the efficient design of multi-channel solid-state MAS-NMR probes employing transmission line theory and modern techniques of electrical engineering is presented. As a technical realization, a five-channel ((1)H, (31)P, (13)C, (2)H and (15)N) probe for operation at 7 Tesla is described. The design goal is a very cost-efficient, multi-port, single-coil transmission line probe based on the design developed by Schaefer and McKay. The electrical performance of the probe is determined by measuring scattering matrix parameters (S-parameters) at particular input/output ports. These parameters are compared to those calculated for the design using the S-matrix formalism. It is shown that the S-matrix formalism provides an excellent tool for the examination of transmission line probes and thus for their rational design; moreover, the resulting design provides excellent electrical performance. From the point of view of Nuclear Magnetic Resonance (NMR), calibration spectra for the individual ports (channels) are of great importance, and the estimation of the π/2 pulse lengths for all five NMR channels is presented. Copyright © 2011 Elsevier Inc. All rights reserved.
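    The S-matrix comparison described above rests on standard transmission-line network analysis. A hedged sketch using a single illustrative line segment rather than the probe's actual five-channel network: the ABCD matrix of a lossless line is converted to S-parameters and checked against the lossless condition |S11|² + |S21|² = 1.

```python
import cmath
import math

# Sketch of the transmission-line analysis behind S-parameter probe design
# (an illustrative single segment, not the probe's actual network): ABCD
# matrix of a lossless line, converted to S-parameters in a 50-ohm system.

Z0 = 50.0                    # reference impedance
Zline = 75.0                 # line characteristic impedance (assumed)
theta = math.radians(60.0)   # electrical length (assumed)

# ABCD matrix of a lossless transmission line section
A = cmath.cos(theta)
B = 1j * Zline * cmath.sin(theta)
C = 1j * cmath.sin(theta) / Zline
D = cmath.cos(theta)

# Standard ABCD -> S conversion for a 2-port in a Z0 reference system
den = A + B / Z0 + C * Z0 + D
S11 = (A + B / Z0 - C * Z0 - D) / den
S21 = 2.0 / den

# A lossless 2-port satisfies |S11|^2 + |S21|^2 = 1
print(round(abs(S11) ** 2 + abs(S21) ** 2, 6))   # 1.0
```

Cascading the ABCD matrices of all sections of a probe and converting once at the end is what makes calculated S-parameters directly comparable with the measured ones.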

  13. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  14. Design and implementation of an intranet dashboard.

    PubMed

    Wolpin, S E

    2005-01-01

    Healthcare organizations are complex systems and are well served by efficient feedback mechanisms. Many organizations have invested in data warehouses; however, there are few tools for automatically extracting and delivering relevant measures to decision makers. This research study resulted in the design and implementation of an intranet dashboard linked to a data warehouse.

  15. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    PubMed

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
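    The stochastic CBA framing above treats the benefit-cost ratio as a distribution rather than a point estimate. A minimal Monte Carlo sketch with entirely hypothetical numbers:

```python
import random

# Stochastic cost-benefit sketch in the spirit of the paper's frameworks
# (all numbers hypothetical): avoided disaster losses are uncertain, so
# the benefit-cost ratio is a distribution rather than a single value.

random.seed(42)
N = 100_000
cost = 1.0                       # intervention cost (normalized, assumed)

ratios = []
for _ in range(N):
    # annual avoided loss drawn from an assumed lognormal (median 1.0)
    avoided = random.lognormvariate(0.0, 0.5)
    ratios.append(avoided / cost)

ratios.sort()
mean_bcr = sum(ratios) / N
p5 = ratios[int(0.05 * N)]       # 5th-percentile benefit-cost ratio
print(mean_bcr > 1.0, p5 < 1.0)  # efficient on average, not in every draw
```

The gap between the mean and the lower percentile is exactly the distributional information a deterministic CBA hides, and it is what makes the robustness comparison between singular and integrated interventions possible.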

  16. Ensuring Patient Safety in Care Transitions: An Empirical Evaluation of a Handoff Intervention Tool

    PubMed Central

    Abraham, Joanna; Kannampallil, Thomas; Patel, Bela; Almoosa, Khalid; Patel, Vimla L.

    2012-01-01

    Successful handoffs ensure smooth, efficient, and safe patient care transitions. Tools and systems designed for the standardization of clinician handoffs often focus on the communication activity during transitions, with limited support for preparatory activities such as information seeking and organization. We designed and evaluated a Handoff Intervention Tool (HAND-IT) based on a checklist-inspired, body-system format allowing structured information organization, and a problem-case narrative format allowing temporal description of patient care events. Based on a pre-post prospective study using a multi-method analysis, we evaluated the effectiveness of HAND-IT as a documentation tool. We found that the use of HAND-IT led to fewer transition breakdowns and greater tool resilience, and likely led to better learning outcomes for less-experienced clinicians when compared to the current tool. We discuss the implications of our results for improving patient safety with a continuity-of-care-based approach. PMID:23304268

  17. EOSCUBE: A Constraint Database System for High-Level Specification and Efficient Generation of EOSDIS Products. Phase 1; Proof-of-Concept

    NASA Technical Reports Server (NTRS)

    Brodsky, Alexander; Segal, Victor E.

    1999-01-01

    The EOSCUBE constraint database system is designed to be a software productivity tool for high-level specification and efficient generation of EOSDIS and other scientific products. These products are typically derived from large volumes of multidimensional data which are collected via a range of scientific instruments.

  18. A method of designing smartphone interface based on the extended user's mental model

    NASA Astrophysics Data System (ADS)

    Zhao, Wei; Li, Fengmin; Bian, Jiali; Pan, Juchen; Song, Song

    2017-01-01

    The user's mental model is the core guiding theory of product design, especially for practical products. The essence of a practical product is a tool that users employ to meet their needs, and the most important feature of a tool is usability. The design method based on the user's mental model provides practical and feasible theoretical guidance for improving the usability of a product according to the user's awareness of things. In this paper, we propose a method of designing smartphone interfaces based on an extended user's mental model, informed by further research on user groups. This approach achieves personalized customization of the smartphone application interface and enhances the efficiency of application use.

  19. Do Computers Improve the Drawing of a Geometrical Figure for 10 Year-Old Children?

    ERIC Educational Resources Information Center

    Martin, Perrine; Velay, Jean-Luc

    2012-01-01

    Nowadays, computer aided design (CAD) is widely used by designers. Would children learn to draw more easily and more efficiently if they were taught with computerised tools? To answer this question, we made an experiment designed to compare two methods for children to do the same drawing: the classical "pen and paper" method and a CAD…

  20. Minimalist Design of Allosterically Regulated Protein Catalysts.

    PubMed

    Makhlynets, O V; Korendovych, I V

    2016-01-01

    Nature facilitates chemical transformations with exceptional selectivity and efficiency. Despite tremendous progress in understanding and predicting protein function, the overall problem of designing a protein catalyst for a given chemical transformation is far from solved. Over the years, many design techniques with various degrees of complexity and rational input have been developed. A minimalist approach to protein design, which focuses on the bare minimum requirements to achieve activity, presents several important advantages. By focusing on basic physicochemical properties and strategic placement of only a few highly active residues, one can feasibly evaluate in silico a very large variety of possible catalysts. In more general terms, the minimalist approach looks for the mere possibility of catalysis, rather than trying to identify the most active catalyst possible. Even very basic designs that utilize a single residue introduced into nonenzymatic proteins or peptide bundles are surprisingly active. Because of the inherent simplicity of the minimalist approach, computational tools greatly enhance its efficiency. No complex calculations need to be set up, and even a beginner can master this technique in a very short time. Here, we present a step-by-step protocol for minimalist design of functional proteins using basic, easily available, and free computational tools. © 2016 Elsevier Inc. All rights reserved.

  1. SSSFD manipulator engineering using statistical experiment design techniques

    NASA Technical Reports Server (NTRS)

    Barnes, John

    1991-01-01

    The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6-degree-of-freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost-efficient tools were combined to perform a study of robot manipulator design parameters: graphical computer simulation and Taguchi Design of Experiments methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined at minimal cost.
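
    The screening logic of a Taguchi-style study can be sketched in a few lines. The code below is a hypothetical illustration, not the SSSFD analysis: it uses an L4(2^3) orthogonal array over three invented two-level kinematic factors and a placeholder response function, then ranks factor sensitivity by main effects, which is the core of the orthogonal-array approach.

```python
# Hypothetical sketch of Taguchi orthogonal-array screening. The factor
# names and the response function are invented stand-ins; a real study
# would call a manipulator simulation for each run.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # L4(2^3) orthogonal array

def performance(levels):
    # Placeholder response: a weighted sum of the three factor levels
    # (e.g. link length, joint offset, wrist pitch), standing in for a
    # simulated dexterity metric.
    weights = (3.0, 1.0, 0.5)
    return sum(w * l for w, l in zip(weights, levels))

def main_effects(array, response):
    # For each factor, average the response at the high and low levels;
    # the difference ranks that factor's sensitivity.
    effects = []
    for f in range(len(array[0])):
        lo = [response(run) for run in array if run[f] == 0]
        hi = [response(run) for run in array if run[f] == 1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

print(main_effects(L4, performance))  # [3.0, 1.0, 0.5]
```

    Because the array is orthogonal, the four runs recover each factor's weight exactly even though only half of the 2^3 full factorial was evaluated, which is why the method is cheap.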

  2. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    PubMed

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is open-source online software based on existing microelectronics tools that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be realized with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine.
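
    The digital view that such frameworks build on can be sandboxed as a gate netlist evaluated like a logic circuit. The sketch below is a toy illustration, not GeNeDA's actual formats or part library: NOR is chosen because a promoter repressed by either of two proteins behaves as a NOR gate, and NOR is functionally universal.

```python
# Hypothetical sketch: evaluate a combinational "gene gate" netlist.
# The netlist representation is invented for illustration; real tools
# like ABC emit optimized netlists from a Verilog description.
def nor(a, b):
    # A promoter repressed by either input: output only when both are 0.
    return int(not (a or b))

# Each gate: (output signal, input 1, input 2), in topological order.
NETLIST = [
    ("n1", "a", "b"),
    ("n2", "a", "n1"),
    ("n3", "b", "n1"),
    ("y", "n2", "n3"),
]

def evaluate(a, b):
    signals = {"a": a, "b": b}
    for out, i1, i2 in NETLIST:
        signals[out] = nor(signals[i1], signals[i2])
    return signals["y"]

# This four-NOR netlist computes XNOR (equality) of the two inputs.
print([evaluate(a, b) for a in (0, 1) for b in (0, 1)])  # [1, 0, 0, 1]
```

    A design flow of this kind lets the same truth-table specification be compiled to whatever elementary parts the biological library provides.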

  3. An end user evaluation of query formulation and results review tools in three medical meta-search engines.

    PubMed

    Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun

    2007-01-01

    Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.

  4. Development of optimal grinding and polishing tools for aspheric surfaces

    NASA Astrophysics Data System (ADS)

    Burge, James H.; Anderson, Bill; Benjamin, Scott; Cho, Myung K.; Smith, Koby Z.; Valente, Martin J.

    2001-12-01

    The ability to grind and polish steep aspheric surfaces to high quality is limited by the tools used for working the surface. The optician prefers to use large, stiff tools to get good natural smoothing, avoiding small scale surface errors. This is difficult for steep aspheres because the tools must have sufficient compliance to fit the aspheric surface, yet we wish the tools to be stiff so they wear down high regions on the surface. This paper presents a toolkit for designing optimal tools that provide large scale compliance to fit the aspheric surface, yet maintain small scale stiffness for efficient polishing.

  5. Efficient Multidisciplinary Analysis Approach for Conceptual Design of Aircraft with Large Shape Change

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Samareh, Jamshid A.; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.

    2009-01-01

    The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general-purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts.
Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules, as well as examples for a telescoping wing, a folding wing, and a bat-like wing. The paper also includes the verification of a medium-fidelity aerodynamic tool used for the aerodynamic database generation against a steady and unsteady high-fidelity CFD analysis tool for a folding wing example.

  6. Learning Technology: Enhancing Learning in New Designs for the Comprehensive High School.

    ERIC Educational Resources Information Center

    Damyanovich, Mike; And Others

    Technology, directed to each of the parts that collectively give shape and direction to the school, should provide the critical mass necessary to realize the specifications for the New Designs for the Comprehensive High School project. Learners should have access to personal productivity tools that increase effectiveness and efficiency in the…

  7. Design and Implementation of an Intranet Dashboard

    PubMed Central

    Wolpin, SE

    2005-01-01

    Healthcare organizations are complex systems and well served by efficient feedback mechanisms. Many organizations have invested in data warehouses; however, there are few tools for automatically extracting and delivering relevant measures to decision makers. This research study resulted in the design and implementation of an intranet dashboard linked to a data warehouse. PMID:16779440

  8. Design and implementation of an efficient single layer five input majority voter gate in quantum-dot cellular automata.

    PubMed

    Bahar, Ali Newaz; Waheed, Sajjad

    2016-01-01

    The fundamental logic element of a quantum-dot cellular automata (QCA) circuit is the majority voter (MV) gate, and the efficiency of a QCA circuit depends on the efficiency of the MV. This paper presents an efficient single-layer five-input majority voter gate (MV5). The structure of the proposed MV5 is very simple and easy to implement in any logic circuit; it reduces the number of cells and uses conventional QCA cells. Using MV5, a multilayer 1-bit full adder (FA) is designed. The functional accuracy of the proposed MV5 and FA is confirmed with QCADesigner, a well-known QCA layout design and verification tool. Furthermore, the power dissipation of the proposed circuits is estimated, showing that these circuits dissipate an extremely small amount of energy and are suitable for reversible computing. The simulation outcomes demonstrate the superiority of the proposed circuit.
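
    The majority-vote behaviour behind these gates is easy to check at the logic level. The sketch below models only the Boolean functions, not the QCA cell layout or clocking that the paper actually contributes; the sum identity Sum = MV5(A, B, Cin, Cout', Cout') is a known majority-logic full-adder construction of the kind an MV5 enables.

```python
# Logic-level model of majority voter gates (not a QCA layout).
def mv(*inputs):
    # Majority vote over an odd number of binary inputs.
    return int(sum(inputs) > len(inputs) // 2)

def full_adder(a, b, cin):
    carry = mv(a, b, cin)                    # 3-input majority gives carry
    s = mv(a, b, cin, 1 - carry, 1 - carry)  # MV5-based sum
    return s, carry

# Exhaustive check against ordinary binary addition.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, c = full_adder(a, b, cin)
            assert 2 * c + s == a + b + cin
print("full adder verified")
```

    The appeal of a single-layer MV5 in hardware is visible even here: the sum bit needs one five-input gate instead of a tree of three-input majorities and inverters.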

  9. Isolated Open Rotor Noise Prediction Assessment Using the F31A31 Historical Blade Set

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, William T.; Boyd, D. Douglas, Jr.; Zawodny, Nikolas S.

    2016-01-01

    In an effort to mitigate next-generation fuel efficiency and environmental impact concerns for aviation, open rotor propulsion systems have received renewed interest. However, maintaining the high propulsive efficiency while simultaneously meeting noise goals has been one of the challenges in making open rotor propulsion a viable option. Improvements in prediction tools and design methodologies have opened the design space for next generation open rotor designs that satisfy these challenging objectives. As such, validation of aerodynamic and acoustic prediction tools has been an important aspect of open rotor research efforts. This paper describes validation efforts of a combined computational fluid dynamics and Ffowcs Williams and Hawkings equation methodology for open rotor aeroacoustic modeling. Performance and acoustic predictions were made for a benchmark open rotor blade set and compared with measurements over a range of rotor speeds and observer angles. Overall, the results indicate that the computational approach is acceptable for assessing low-noise open rotor designs. Additionally, this approach may be used to provide realistic incident source fields for acoustic shielding/scattering studies on various aircraft configurations.

  10. Designing overall stoichiometric conversions and intervening metabolic reactions

    DOE PAGES

    Chowdhury, Anupam; Maranas, Costas D.

    2015-11-04

    Existing computational tools for de novo metabolic pathway assembly, either based on mixed integer linear programming techniques or graph-search applications, generally only find linear pathways connecting the source to the target metabolite. The overall stoichiometry of conversion along with alternate co-reactant (or co-product) combinations is not part of the pathway design. Therefore, global carbon and energy efficiency is in essence fixed with no opportunities to identify more efficient routes for recycling carbon flux closer to the thermodynamic limit. Here, we introduce a two-stage computational procedure that both identifies the optimum overall stoichiometry (i.e., optStoic) and selects for (non-)native reactions (i.e., minRxn/minFlux) that maximize carbon, energy or price efficiency while satisfying thermodynamic feasibility requirements. Implementation for recent pathway design studies identified non-intuitive designs with improved efficiencies. Specifically, multiple alternatives for non-oxidative glycolysis are generated and non-intuitive ways of co-utilizing carbon dioxide with methanol are revealed for the production of C2+ metabolites with higher carbon efficiency.
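
    The "optimum overall stoichiometry" idea can be shown on a toy case. The sketch below is a hypothetical stand-in for the optStoic stage: the real tool solves a MILP over a full metabolite database, whereas this brute force searches small integer stoichiometries for converting one glucose into ethanol, CO2, and water, keeps the C/H/O balances, and maximizes carbon routed into the product.

```python
from fractions import Fraction
from itertools import product

# C, H, O atom counts per molecule.
ELEMENTS = {
    "glucose": (6, 12, 6),
    "ethanol": (2, 6, 1),
    "co2":     (1, 0, 2),
    "h2o":     (0, 2, 1),
}

def balanced(n_eth, n_co2, n_h2o):
    # One glucose consumed; the products must balance every element.
    for e in range(3):
        rhs = (n_eth * ELEMENTS["ethanol"][e]
               + n_co2 * ELEMENTS["co2"][e]
               + n_h2o * ELEMENTS["h2o"][e])
        if ELEMENTS["glucose"][e] != rhs:
            return False
    return True

# Enumerate small integer coefficient triples and keep the one that
# sends the most carbon into ethanol.
best = max(
    (c for c in product(range(7), repeat=3) if balanced(*c)),
    key=lambda c: c[0],
)
carbon_eff = Fraction(best[0] * 2, 6)  # ethanol carbons / glucose carbons
print(best, carbon_eff)  # (2, 2, 0) 2/3
```

    Here the familiar fermentation stoichiometry (2 ethanol + 2 CO2) emerges as the carbon-optimal overall conversion; the paper's contribution is doing this at scale, with thermodynamic feasibility constraints and a second stage that fills in the intervening reactions.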

  11. Electrical safety device

    DOEpatents

    White, David B.

    1991-01-01

    An electrical safety device for use in power tools that is designed to automatically discontinue operation of the power tool upon physical contact of the tool with a concealed conductive material. A step down transformer is used to supply the operating power for a disconnect relay and a reset relay. When physical contact is made between the power tool and the conductive material, an electrical circuit through the disconnect relay is completed and the operation of the power tool is automatically interrupted. Once the contact between the tool and conductive material is broken, the power tool can be quickly and easily reactivated by a reset push button activating the reset relay. A remote reset is provided for convenience and efficiency of operation.

  12. MIMO: an efficient tool for molecular interaction maps overlap

    PubMed Central

    2013-01-01

    Background Molecular pathways represent an ensemble of interactions occurring among molecules within the cell and between cells. The identification of similarities between molecular pathways across organisms and functions has a critical role in understanding complex biological processes. To infer such novel information, the comparison of molecular pathways requires accounting for imperfect matches (flexibility) and efficiently handling complex network topologies. To date, these characteristics are only partially available in tools designed to compare molecular interaction maps. Results Our approach MIMO (Molecular Interaction Maps Overlap) addresses the first problem by allowing the introduction of gaps and mismatches between query and template pathways and permits, when necessary, supervised queries incorporating a priori biological information. It then addresses the second issue by relying directly on the rich graph topology described in the Systems Biology Markup Language (SBML) standard, and uses multidigraphs to efficiently handle multiple queries on biological graph databases. The algorithm has been successfully used here to highlight the contact points between various human pathways in the Reactome database. Conclusions MIMO offers a flexible and efficient graph-matching tool for comparing complex biological pathways. PMID:23672344
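
    Gap-tolerant pathway matching can be illustrated with a small directed graph. The sketch below is a toy in the spirit of MIMO, not its algorithm: it matches a linear query pathway against a template digraph, allowing up to one template node to be skipped between consecutive query nodes, whereas MIMO works on SBML multidigraphs with mismatch scoring.

```python
# Hypothetical gap-tolerant matcher: query is a list of node names,
# edges is a set of (u, v) directed edges in the template pathway.
def matches(query, edges, max_gap=1):
    succ = {}
    for u, v in edges:
        succ.setdefault(u, set()).add(v)

    def reachable(u, v, hops):
        # Is v reachable from u in at most `hops` edges?
        if hops == 0:
            return False
        if v in succ.get(u, ()):
            return True
        return any(reachable(w, v, hops - 1) for w in succ.get(u, ()))

    # Each consecutive query pair may span up to max_gap skipped nodes.
    return all(
        reachable(a, b, max_gap + 1) for a, b in zip(query, query[1:])
    )

# Template pathway: A -> B -> C -> D. Querying A -> C -> D succeeds
# because skipping B counts as one tolerated gap.
template = {("A", "B"), ("B", "C"), ("C", "D")}
print(matches(["A", "C", "D"], template))        # True
print(matches(["A", "D"], template, max_gap=1))  # False: two skips needed
```

    Allowing gaps is what lets a query written against one organism's pathway still hit a template where an intermediate reaction is lumped or missing.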

  13. Center for Efficient Exascale Discretizations Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir

    The CEED software suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP) program.

  14. The design and construction of a cost-efficient confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Xi, Peng; Rajwa, Bartlomiej; Jones, James T.; Robinson, J. Paul

    2007-03-01

    The optical sectioning ability of confocal microscopy makes it a powerful tool for studying biological materials. However, the cost and complexity of confocal laser scanning microscopy hinder its wide application in education. We describe the construction of a simplified confocal laser scanning microscope and demonstrate three-dimensional projection based on cost-efficient commercial hardware, together with available open-source software.

  15. Nonlinear Shaping Architecture Designed with Using Evolutionary Structural Optimization Tools

    NASA Astrophysics Data System (ADS)

    Januszkiewicz, Krystyna; Banachowicz, Marta

    2017-10-01

    The paper explores the possibilities of using Evolutionary Structural Optimization (ESO) digital tools in integrated structural and architectural design, in response to current needs geared towards sustainability, combining ecological and economic efficiency. The first part of the paper defines the Evolutionary Structural Optimization tools, which were developed specifically for engineering purposes using finite element analysis as a framework. The development of ESO has led to several incarnations, which are all briefly discussed (Additive ESO, Bi-directional ESO, Extended ESO). The second part presents results of using these tools in structural and architectural design. Actual building projects which involve optimization as a part of the original design process are presented (the Crematorium in Kakamigahara, Gifu, Japan, 2006; SANAA's Learning Centre, EPFL, Lausanne, Switzerland, 2008; among others). The conclusion emphasizes that structural engineering and architectural design mean directing attention to the solutions used by Nature, designing works that are optimally shaped and form their own environments. Architectural forms never constitute the optimum shape derived through a form-finding process driven only by structural optimization, but rather embody and integrate a multitude of parameters. It might be assumed that there is a similarity between these processes in nature and the presented design methods. Contemporary digital methods make the simulation of such processes possible, and thus enable us to refer back to the empirical methods of previous generations.

  16. Reversible Flip-Flops in Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Rad, Samaneh Kazemi; Heikalabad, Saeed Rasouli

    2017-09-01

    Quantum-dot cellular automata (QCA) is a new technology for designing efficient combinational and sequential circuits at the nanoscale. This technology has many desirable advantages compared to CMOS technology, such as low power consumption, small occupied area, and low latency. These features make it suitable for use in flip-flop design. In this paper, drawing on the characteristics of reversible logic, we design new structures for flip-flops. The operation of these structures is evaluated with the QCADesigner Version 2.0.3 simulator. In addition, we calculate the power dissipation of these structures with the QCAPro tool. The results illustrate that the proposed structures are efficient compared to previous ones.

  17. Advanced Solar Power Systems

    NASA Technical Reports Server (NTRS)

    Atkinson, J. H.; Hobgood, J. M.

    1984-01-01

    The Advanced Solar Power System (ASPS) concentrator uses a technically sophisticated design and extensive tooling to produce very efficient (80 to 90%) and versatile energy supply equipment which is inexpensive to manufacture and requires little maintenance. The advanced optical design has two 10th-order, generalized aspheric surfaces in a Cassegrainian configuration, which gives outstanding performance and is relatively insensitive to temperature changes and wind loading. The required manufacturing tolerances have also been achieved. The key to the ASPS is the direct absorption of concentrated sunlight in the working fluid by radiative transfer in a black-body cavity. The basic ASPS design concepts, efficiency, optical system, and tracking and focusing controls are described.

  18. Promoting climate literacy through social engagement: the Green Ninja Project

    NASA Astrophysics Data System (ADS)

    Cordero, E. C.; Todd, A.

    2012-12-01

    One of the challenges of communicating climate change to younger audiences is the disconnect between global issues and local impacts. The Green Ninja is a climate-action superhero that aims to energize young people about climate science through media and social engagement tools. In this presentation, we'll highlight two of the tools designed to help K-12 students implement appropriate local mitigation strategies. A mobile phone application builds and supports a social community around taking action at local businesses regarding themes such as food, packaging and energy efficiency. An energy efficiency contest in local schools utilizes smart meter technology to provide feedback on household energy use and conservation. These tools are supported by films and lesson plans that link formal and informal education channels. The effectiveness of these methodologies as tools to engage young people in climate science and action will be discussed.

  19. Grid Data and Tools | Grid Modernization | NREL

    Science.gov Websites

    technologies and strategies, including renewable resource data sets and models of the electric power system . Renewable Resource Data A library of resource information to inform the design of efficient, integrated

  20. Using computer-aided drug design and medicinal chemistry strategies in the fight against diabetes.

    PubMed

    Semighini, Evandro P; Resende, Jonathan A; de Andrade, Peterson; Morais, Pedro A B; Carvalho, Ivone; Taft, Carlton A; Silva, Carlos H T P

    2011-04-01

    The aim of this work is to present a simple, practical, and efficient protocol for drug design, applied here to diabetes, which includes selection of the illness, a good choice of target and bioactive ligand, and then usage of various computer-aided drug design and medicinal chemistry tools to design novel potential drug candidates. We have selected the validated target dipeptidyl peptidase IV (DPP-IV), whose inhibition contributes to reduced glucose levels in type 2 diabetes patients. The most active inhibitor with a reported complex X-ray structure was initially extracted from the BindingDB database. By using molecular modification strategies widely used in medicinal chemistry, alongside current state-of-the-art drug design tools (including flexible docking, virtual screening, molecular interaction fields, molecular dynamics, and ADME and toxicity predictions), we have proposed 4 novel potential DPP-IV inhibitors with drug-like properties for diabetes control, supported and validated by all the computational tools used herewith.

  1. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  2. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    NASA Astrophysics Data System (ADS)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. To find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns for removing large debris. Two types of architectures are considered to be efficient: the Chaser (single-debris spacecraft) and the Mothership/Kits (multiple-debris spacecraft); both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power, and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.
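
    A flavour of what a high-level parametric sizing module does can be given with the rocket equation. The sketch below is an illustrative assumption, not one of TCAT's actual models: the dry mass, delta-v budget, and specific impulse are invented numbers for a monopropellant chaser.

```python
import math

# Parametric propulsion sizing via the Tsiolkovsky rocket equation,
# solved for the propellant mass needed to deliver a delta-v budget.
def propellant_mass(dry_mass_kg, delta_v_ms, isp_s, g0=9.80665):
    mass_ratio = math.exp(delta_v_ms / (isp_s * g0))
    return dry_mass_kg * (mass_ratio - 1)

# Hypothetical chaser deorbiting one large debris: assumed 500 kg dry
# mass, 250 m/s total delta-v, 220 s monopropellant Isp.
mp = propellant_mass(500, 250, 220)
print(round(mp, 1))  # 61.4 kg of propellant
```

    Chaining simple closed-form models like this per subsystem is what makes it cheap to sweep many campaign plans and architectures before committing to detailed design.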

  3. Investigation of the effects of process and geometrical parameters on formability in tube hydroforming using a modular hydroforming tool

    NASA Astrophysics Data System (ADS)

    Joghan, Hamed Dardaei; Staupendahl, Daniel; Hassan, Hamad ul; Henke, Andreas; Keesser, Thorsten; Legat, Francois; Tekkaya, A. Erman

    2018-05-01

    Tube hydroforming is one of the most important manufacturing processes for the production of exhaust systems. It allows generating parts with highly complex geometries at the forming accuracies needed in the automotive sector, which is possible due to the form-closed nature of the production process. One of the main cost drivers is tool manufacturing, which is expensive and time consuming, especially when forming large parts. To cope with the design trend towards individuality, which is gaining more and more importance and leads to a high number of product variants, a new flexible tool design was developed. The designed tool offers high flexibility in manufacturing different tube shapes and geometries with just local alterations and relocation of tool segments. The tolerancing problems of state-of-the-art segmented tools are overcome by an innovative and flexible die-holder design. The break-even point of this initially more expensive tool design is already reached when forming more than 4 different tube shapes. Together with an additionally designed rotary hydraulic tube feeding system, a highly adaptable forming setup is obtained. To investigate the performance of the developed tool setup, a study of geometrical and process parameters during forming of a spherical dome was carried out. Austenitic stainless steel (grade 1.4301) tube with a diameter of 40 mm and a wall thickness of 1.5 mm was used for the investigations. The experimental analyses were supported by finite element simulations and statistical analyses. The results show that the flexible tool setup can efficiently be used to analyze the interaction of the internal pressure, friction, and the location of the spherical dome, and demonstrate the high influence of the feeding rate on the formed part.

  4. Designing workload analysis questionnaire to evaluate needs of employees

    NASA Astrophysics Data System (ADS)

    Astuti, Rahmaniyah Dwi; Navi, Muhammad Abdu Haq

    2018-02-01

    A mismatch between workload and work capacity is one of the main obstacles to optimal results. In an office setting, workload is difficult to determine because the work is non-repetitive: employees work toward targets set for a working period, and at the end of the period an evaluation of employee performance is usually carried out to assess staffing needs. The aim of this study is to design a workload questionnaire tool that evaluates the efficiency level of a position as an indicator of staffing needs, based on the Indonesian State Employment Agency Regulation on workload analysis. The tool is applied to the state-owned enterprise PT. X, with 3 positions chosen as a pilot project: position A is held by 2 employees, position B by 7 employees, and position C by 6 employees. From the calculation results, position A has an efficiency level of 1.33 ("very good"), position B 1.71 ("enough"), and position C 1.03 ("very good"). The tool suggests staffing needs of 3 people for position A, 5 for position B, and 6 for position C. The differences between the current numbers of employees and the calculated results were then analyzed by interviewing the employees to gather more data on personal perception. It can be concluded that this workload evaluation tool can be used as an alternative solution for evaluating staffing needs in an office.
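
    The arithmetic behind such an evaluation can be sketched briefly. The abstract does not state the regulation's exact formula, so the code below assumes a common workload-analysis convention: efficiency level = total workload hours / (headcount x effective working hours per employee), and staffing need = workload rounded up to whole employees. All numbers are hypothetical.

```python
import math

EFFECTIVE_HOURS = 1250  # assumed effective working hours per period

def efficiency_level(workload_hours, headcount):
    # Workload carried per employee relative to the effective capacity.
    return workload_hours / (headcount * EFFECTIVE_HOURS)

def staffing_need(workload_hours):
    # Whole employees required to cover the workload.
    return math.ceil(workload_hours / EFFECTIVE_HOURS)

workload = 3300  # hypothetical total workload hours for one position
print(round(efficiency_level(workload, 2), 2))  # 1.32 with 2 employees
print(staffing_need(workload))                  # 3 employees needed
```

    An efficiency level above 1 thus signals overload of the current headcount, which is the pattern the study reports for positions A and B.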

  5. EBF3 Design and Sustainability Considerations

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M. B.

    2015-01-01

    Electron beam freeform fabrication (EBF3) is a cross-cutting technology for producing structural metal parts using an electron beam and wire feed in a layer-additive fashion. This process was developed by researchers at NASA Langley to specifically address needs for aerospace applications. Additive manufacturing technologies like EBF3 enable efficient design of materials and structures by tailoring microstructures and chemistries at the local level to improve performance at the global level. Additive manufacturing also facilitates design freedom by integrating assemblies into complex single-piece components, eliminating flanges, fasteners and joints, resulting in reduced size and mass. These same efficiencies that permit new design paradigms also lend themselves to supportability and sustainability. Long duration space missions will require a high degree of self-sustainability. EBF3 is a candidate technology being developed to allow astronauts to conduct repairs and fabricate new components and tools on demand, with efficient use of feedstock materials and energy.

  6. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as optimization problems, have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  7. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor Designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer based rotor analysis and optimization have been advanced by the development of industry standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics, simplified inflow models and perform aeroelastic analysis using Computational Structural Dynamics (CSD). Due to the simplified aerodynamics, most design studies are performed varying structural related design variables like sectional mass and stiffness. The optimization of shape related variables in forward flight using these tools is complicated and results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize Computational Fluid Dynamics (CFD) but have historically been considered too computationally intensive to be used in computer based optimization, where numerous simulations are required. An approach is needed where high fidelity CFD rotor analysis can be utilized in a shape variable optimization problem with multiple objectives. Any approach should be capable of working in forward flight in addition to hover. An alternative is proposed and founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower fidelity physics based analysis and surrogate modeling. Together, they are used with high fidelity analysis in an intelligent process of surrogate model building of parameters in the high fidelity domain. Closing the loop between high and low fidelity analysis is a key aspect of the proposed approach. 
This is done by using information from higher fidelity analysis to improve predictions made with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. 
The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of building approximate models of high fidelity parameters has been developed. The method uses a combination of low and high fidelity results and combines Design of Experiments, statistical effects analysis, and aspects of approximation model management. And third, the determination of rotor blade shape variables through optimization using CFD based analysis in forward flight has been performed. This was done using the high fidelity CFD/CSD/AA framework and method mentioned above. While the low and high fidelity predictions methods used in the work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With improvements in the low and high fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
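    The closing of the loop between fidelity levels described above is often realized as an additive-correction surrogate: fit a cheap model to many low-fidelity samples, then fit a small discrepancy model to the few expensive high-fidelity runs, and predict with their sum. A hedged one-dimensional sketch of that idea follows; the analytic functions are stand-ins for the comprehensive-code and CFD/CSD outputs, not the thesis's models.

```python
import numpy as np

# Many cheap low-fidelity samples, a few expensive high-fidelity ones.
x_lo = np.linspace(0.0, 1.0, 40)
y_lo = np.sin(2 * np.pi * x_lo)               # stand-in low-fidelity metric
x_hi = np.linspace(0.0, 1.0, 6)
y_hi = np.sin(2 * np.pi * x_hi) + 0.3 * x_hi  # "high fidelity" = low + trend

lo_model = np.polynomial.Polynomial.fit(x_lo, y_lo, deg=7)
# Discrepancy model trained only on the sparse high-fidelity data.
delta_model = np.polynomial.Polynomial.fit(x_hi, y_hi - lo_model(x_hi), deg=1)

def surrogate(x):
    # Approximate high-fidelity prediction = low-fidelity fit + correction.
    return lo_model(x) + delta_model(x)
```

Optimization then runs on `surrogate`, with occasional new high-fidelity runs used to refresh the discrepancy model.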

  8. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goals of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
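    Evaluating a design "using some function of the Fisher Information Matrix" typically means, for D-optimality, maximizing the determinant of the FIM built from model sensitivities at the candidate sampling times. The toy sketch below uses a hypothetical one-compartment bolus model with assumed parameters k and V; it illustrates the criterion only and is not PopED lite's algorithm.

```python
import itertools
import numpy as np

def fim(times, k=0.1, v=10.0, dose=100.0, sigma=0.1):
    """Fisher information for (k, v) in C(t) = (dose/v) * exp(-k*t),
    with additive measurement noise sigma (all values hypothetical)."""
    t = np.asarray(times, dtype=float)
    dc_dk = -(dose / v) * t * np.exp(-k * t)   # sensitivity w.r.t. k
    dc_dv = -(dose / v**2) * np.exp(-k * t)    # sensitivity w.r.t. v
    jac = np.column_stack([dc_dk, dc_dv])
    return jac.T @ jac / sigma**2

# D-optimality: pick the pair of sampling times maximizing det(FIM)
# by brute force over a candidate grid.
grid = np.linspace(0.5, 24.0, 48)
best = max(itertools.combinations(grid, 2),
           key=lambda ts: np.linalg.det(fim(ts)))
```

A constrained discrete search like PopED lite's global optimizer replaces this brute-force enumeration for realistic problem sizes.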

  9. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
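    The time-frequency plug-in mentioned above is based on complex Morlet wavelets: the signal is convolved with a family of Gaussian-windowed complex exponentials and the squared magnitude gives power per frequency and time. A minimal sketch of that computation follows, with illustrative parameters; this is not AnyWave's actual plug-in code.

```python
import numpy as np

def morlet_tf(signal, fs, freqs, n_cycles=7):
    """Time-frequency power via complex Morlet wavelets (a minimal
    sketch of this kind of plug-in, not AnyWave's implementation)."""
    out = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma = n_cycles / (2 * np.pi * f)            # Gaussian envelope width
        tw = np.arange(-4 * sigma, 4 * sigma, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * tw) * np.exp(-tw**2 / (2 * sigma**2))
        wavelet /= np.abs(wavelet).sum()              # rough normalization
        out[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return out

fs = 256.0
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t)                      # pure 10 Hz test signal
power = morlet_tf(sig, fs, freqs=[6.0, 10.0, 20.0])
# the 10 Hz row carries by far the most power
```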

  10. Advanced Structural Optimization Under Consideration of Cost Tracking

    NASA Astrophysics Data System (ADS)

    Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.

    2014-06-01

    In order to improve the design process of launcher configurations in the early development phase, the Multidisciplinary Optimization (MDO) software was developed. The tool combines efficient software tools such as Optimal Design Investigations (ODIN) for structural optimization and the Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking, as well as future improvements concerning cost optimization, is also indicated.

  11. Marshall Space Flight Center's Virtual Reality Applications Program 1993

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1993-01-01

    A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goal of this technology program is to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process in which perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies.
These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.

  12. Fine-Scale Structure Design for 3D Printing

    NASA Astrophysics Data System (ADS)

    Panetta, Francis Julian

    Modern additive fabrication technologies can manufacture shapes whose geometric complexities far exceed what existing computational design tools can analyze or optimize. At the same time, falling costs have placed these fabrication technologies within the average consumer's reach. Especially for inexpert designers, new software tools are needed to take full advantage of 3D printing technology. This thesis develops such tools and demonstrates the exciting possibilities enabled by fine-tuning objects at the small scales achievable by 3D printing. The thesis applies two high-level ideas to invent these tools: two-scale design and worst-case analysis. The two-scale design approach addresses the problem that accurately simulating--let alone optimizing--the full-resolution geometry sent to the printer requires orders of magnitude more computational power than currently available. However, we can decompose the design problem into a small-scale problem (designing tileable structures achieving a particular deformation behavior) and a macro-scale problem (deciding where to place these structures in the larger object). This separation is particularly effective, since structures for every useful behavior can be designed once, stored in a database, then reused for many different macroscale problems. Worst-case analysis refers to determining how likely an object is to fracture by studying the worst possible scenario: the forces most efficiently breaking it. This analysis is needed when the designer has insufficient knowledge or experience to predict what forces an object will undergo, or when the design is intended for use in many different scenarios unknown a priori. The thesis begins by summarizing the physics and mathematics necessary to rigorously approach these design and analysis problems. Specifically, the second chapter introduces linear elasticity and periodic homogenization. 
The third chapter presents a pipeline to design microstructures achieving a wide range of effective isotropic elastic material properties on a single-material 3D printer. It also proposes a macroscale optimization algorithm placing these microstructures to achieve deformation goals under prescribed loads. The thesis then turns to worst-case analysis, first considering the macroscale problem: given a user's design, the fourth chapter aims to determine the distribution of pressures over the surface creating the highest stress at any point in the shape. Solving this problem exactly is difficult, so we introduce two heuristics: one to focus our efforts on only regions likely to concentrate stresses and another converting the pressure optimization into an efficient linear program. Finally, the fifth chapter introduces worst-case analysis at the microscopic scale, leveraging the insight that the structure of periodic homogenization enables us to solve the problem exactly and efficiently. Then we use this worst-case analysis to guide a shape optimization, designing structures with prescribed deformation behavior that experience minimal stresses in generic use.

  13. Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.

    PubMed

    Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca

    2018-05-01

    CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
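    The first design step such pipelines automate is locating candidate protospacers adjacent to a PAM. The toy scan below looks for 20-nt guides followed by SpCas9's NGG PAM; it is illustrative only, since tools like CRISPOR additionally score on-target efficiency and off-target potential.

```python
import re

def find_sgrna_sites(seq):
    """Candidate SpCas9 targets: a 20-nt protospacer followed by an NGG PAM.

    Toy illustration of the design step CRISPOR automates; real tools also
    score on-target efficiency and off-target potential.
    """
    sites = []
    # Lookahead so overlapping candidates are all reported.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper()):
        sites.append((m.start(1), m.group(1), m.group(2)))  # pos, guide, PAM
    return sites

sites = find_sgrna_sites("a" * 20 + "tgg" + "ccccc")
# one candidate: a guide of 20 A's with PAM "TGG" at position 0
```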

  14. Helping coaches apply the principles of representative learning design: validation of a tennis specific practice assessment tool.

    PubMed

    Krause, Lyndon; Farrow, Damian; Reid, Machar; Buszard, Tim; Pinder, Ross

    2018-06-01

    Representative Learning Design (RLD) is a framework for assessing the degree to which experimental or practice tasks simulate key aspects of specific performance environments (i.e. competition). The key premise being that when practice replicates the performance environment, skills are more likely to transfer. In applied situations, however, there is currently no simple or quick method for coaches to assess the key concepts of RLD (e.g. during on-court tasks). The aim of this study was to develop a tool for coaches to efficiently assess practice task design in tennis. A consensus-based tool was developed using a 4-round Delphi process with 10 academic and 13 tennis-coaching experts. Expert consensus was reached for the inclusion of seven items, each consisting of two sub-questions related to (i) the task goal and (ii) the relevance of the task to competition performance. The Representative Practice Assessment Tool (RPAT) is proposed for use in assessing and enhancing practice task designs in tennis to increase the functional coupling between information and movement, and to maximise the potential for skill transfer to competition contexts.

  15. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
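    The cooperation of small, specialized knowledge sources over shared data can be sketched in a few lines. The skeleton below is a generic illustration of the blackboard pattern, not the thesis's Formal Model or its concurrent implementation: each knowledge source is a (condition, action) pair, and a simple control loop fires any source whose condition holds until no source makes progress.

```python
class Blackboard:
    """Minimal sequential sketch of the blackboard pattern (illustrative)."""

    def __init__(self):
        self.data = {}             # the shared blackboard
        self.sources = []          # independent knowledge sources

    def register(self, condition, action):
        self.sources.append((condition, action))

    def run(self):
        progress = True
        while progress:            # control loop: fire any applicable source
            progress = False
            for condition, action in self.sources:
                update = condition(self.data) and action(self.data)
                if update:
                    self.data.update(update)
                    progress = True

bb = Blackboard()
bb.data["raw"] = "7"
bb.register(lambda d: "raw" in d and "parsed" not in d,
            lambda d: {"parsed": int(d["raw"])})
bb.register(lambda d: "parsed" in d and "squared" not in d,
            lambda d: {"squared": d["parsed"] ** 2})
bb.run()
# bb.data["squared"] == 49
```

A concurrent system replaces the sequential loop with scheduled, possibly parallel, knowledge-source activations.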

  16. Percussive Excavation of Lunar Soil

    NASA Technical Reports Server (NTRS)

    Whittaker, Matthew P.

    2008-01-01

    It has been suggested that using a percussive motion could improve the efficiency of excavation by up to 90%. If proven true, this would be very beneficial to excavation projects on the Moon and Mars. The purpose of this study is to design, build, and test a percussive tool that can dig a trench, and then compare the resulting data against that of a non-percussive tool of the same shape and size. The results of this test thus far have been inconclusive due to malfunctions in the testbed and percussive bucket; however, results from small-scale experiments confirm this higher efficiency and support further testing.

  17. Systems analysis of a closed loop ECLSS using the ASPEN simulation tool. Thermodynamic efficiency analysis of ECLSS components. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, Sharmista

    1993-01-01

    Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of that system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulation tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. The second law of thermodynamics is used to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system to a minimum.
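    The component-level second-law bookkeeping described above reduces, at steady state, to computing entropy generation from the stream entropies and any boundary heat transfer, with the lost work potential given by the Gouy-Stodola theorem. A sketch with made-up stream values (not ASPEN output):

```python
def entropy_generation(streams_in, streams_out, q_dot=0.0, t_boundary=298.15):
    """Steady-state entropy generation for one component.

    Streams are (mass flow kg/s, specific entropy kJ/(kg*K)) pairs:
    Sgen = sum(m*s)_out - sum(m*s)_in - Q/Tb, which is >= 0 by the
    second law. All numbers below are illustrative, not ASPEN results.
    """
    s_out = sum(m * s for m, s in streams_out)
    s_in = sum(m * s for m, s in streams_in)
    return s_out - s_in - q_dot / t_boundary

def exergy_destroyed(s_gen, t0=298.15):
    # Gouy-Stodola theorem: lost work (kW) = T0 * Sgen
    return t0 * s_gen

s_gen = entropy_generation([(0.02, 6.5)], [(0.02, 6.9)])
lost = exergy_destroyed(s_gen)   # about 2.39 kW of lost work potential
```

Ranking components by `lost` identifies where redesign buys the most system-level efficiency.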

  18. Nose-to-tail analysis of an airbreathing hypersonic vehicle using an in-house simplified tool

    NASA Astrophysics Data System (ADS)

    Piscitelli, Filomena; Cutrone, Luigi; Pezzella, Giuseppe; Roncioni, Pietro; Marini, Marco

    2017-07-01

    SPREAD (Scramjet PREliminary Aerothermodynamic Design) is a simplified, in-house method developed by CIRA (Italian Aerospace Research Centre) that provides a preliminary estimation of engine/aeroshape performance for airbreathing configurations. It is especially useful for scramjet engines, for which the strong coupling between the aerothermodynamic (external) and propulsive (internal) flow fields requires real-time screening of several engine/aeroshape configurations and the identification of the most promising one(s) with respect to user-defined constraints and requirements. The outcome of this tool defines the baseline configuration for further design analyses with more accurate tools, e.g., CFD simulations and wind tunnel testing. The SPREAD tool has been used to perform the nose-to-tail analysis of the LAPCAT-II Mach 8 MR2.4 vehicle configuration. The numerical results demonstrate SPREAD's capability to quickly predict reliable values of aero-propulsive balance (i.e., net thrust) and aerodynamic efficiency in a pre-design phase.

  19. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. Incorporating the telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system's efficiency.

  20. Modal control theory and application to aircraft lateral handling qualities design

    NASA Technical Reports Server (NTRS)

    Srinathkumar, S.

    1978-01-01

    A multivariable synthesis procedure based on eigenvalue/eigenvector assignment is reviewed and is employed to develop a systematic design procedure to meet the lateral handling qualities design objectives of a fighter aircraft over a wide range of flight conditions. The closed loop modal characterization developed provides significant insight into the design process and plays a pivotal role in the synthesis of robust feedback systems. The simplicity of the synthesis algorithm yields an efficient computer aided interactive design tool for flight control system synthesis.
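    For a full-state-feedback system, eigenvalue assignment of the kind reviewed above can be illustrated in the single-input case with Ackermann's formula. The matrices below are a toy second-order stand-in, not an aircraft lateral-dynamics model; for multi-input problems with eigenvector shaping, a robust routine such as scipy.signal.place_poles would be used instead.

```python
import numpy as np

def ackermann(A, B, poles):
    """Single-input eigenvalue assignment: gain K such that the
    eigenvalues of A - B K are `poles` (Ackermann's formula)."""
    n = A.shape[0]
    ctrb = np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)])
    coeffs = np.poly(poles)                     # desired characteristic poly
    phi_A = sum(c * np.linalg.matrix_power(A, n - i)
                for i, c in enumerate(coeffs))  # phi(A)
    e_last = np.zeros((1, n))
    e_last[0, -1] = 1.0
    return e_last @ np.linalg.inv(ctrb) @ phi_A

# Toy 2-state stand-in, not a real aircraft lateral model.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
K = ackermann(A, B, [-4.0, -5.0])               # K == [[18., 6.]]
closed = A - B @ K                              # eigenvalues -4, -5
```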

  1. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers; CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies, including Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  2. TAE Plus: Transportable Applications Environment Plus tools for building graphic-oriented applications

    NASA Technical Reports Server (NTRS)

    Szczur, Martha R.

    1989-01-01

    The Transportable Applications Environment Plus (TAE Plus), developed by NASA's Goddard Space Flight Center, is a portable User Interface Management System (UIMS) that provides an intuitive WYSIWYG WorkBench for prototyping and designing an application's user interface, integrated with tools for efficiently implementing the designed user interface and for effectively managing the user interface while the application runs. During the development of TAE Plus, many design and implementation decisions were based on the state of the art in graphics workstations, windowing systems and object-oriented programming languages. Some of the problems and issues experienced during implementation are discussed. A description of the next development steps planned for TAE Plus is also given.

  3. Quality engineering tools focused on high power LED driver design using boost power stages in switch mode

    NASA Astrophysics Data System (ADS)

    Ileana, Ioan; Risteiu, Mircea; Marc, Gheorghe

    2016-12-01

    This paper is part of our research dedicated to designing high-power LED lamps. The selected boost topology is intended to meet driver producers' requirements regarding efficiency and disturbance constraints. In our work we used modeling and simulation tools to implement scenarios of driver operation while several control functions are executed (output voltage/current versus input voltage at fixed switching frequency, input and output electric power transfer versus switching frequency, transient inductor voltage analysis, and transient output capacitor analysis). Some electrical and thermal stress conditions are also analyzed. Based on these aspects, a highly reliable power LED driver has been designed.
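    The steady-state relations behind the simulated scenarios (output voltage versus input voltage and duty cycle, component sizing at a fixed switching frequency) follow from the ideal continuous-conduction-mode boost equations. A first-pass sizing sketch with hypothetical LED-string numbers, not the paper's design:

```python
def boost_design(v_in, v_out, i_out, f_sw, ripple_frac=0.3, v_ripple=0.1):
    """First-pass sizing of an ideal CCM boost stage for an LED load.

    Lossless relations only; all numbers are hypothetical, not the
    paper's simulated design. f_sw in Hz, currents in A, voltages in V.
    """
    duty = 1.0 - v_in / v_out                  # Vout = Vin / (1 - D)
    i_in = i_out * v_out / v_in                # ideal power balance
    delta_il = ripple_frac * i_in              # allowed inductor ripple (A)
    inductance = v_in * duty / (f_sw * delta_il)
    capacitance = i_out * duty / (f_sw * v_ripple)
    return duty, inductance, capacitance

# e.g. a 12 V supply driving a 36 V / 0.7 A LED string at 200 kHz
duty, L, C = boost_design(v_in=12.0, v_out=36.0, i_out=0.7, f_sw=200e3)
# duty = 2/3, L ≈ 63 µH, C ≈ 23 µF
```

Transient and thermal stress analysis, as in the paper, then refines these first-pass values in simulation.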

  4. CASE tools and UML: state of the ART.

    PubMed

    Agarwal, S

    2001-05-01

    With increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of art in computer-aided software engineering (CASE) tools and unified modeling language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages like cheaper, shorter, and efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.

  5. Nozzle Extension for Safety Air Gun

    NASA Technical Reports Server (NTRS)

    Zumbrun, H. N.; Croom, Delwin R., Jr.

    1986-01-01

    A new nozzle-extension design overcomes problems and incorporates the original commercial nozzle, retaining its intrinsic safety features. Components include an extension tube, the length of which is made to suit the application; an adaptor fitting; and a nozzle adaptor repinned to maintain the original safety features. The design moves the conical airstream to the end of the extension to blow machine chips away from the operator. The nozzle-extension modification allows safe and efficient operation of machine tools while maintaining the integrity of the original safety-air-gun design.

  6. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    PubMed

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The herein presented results highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
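    The constraint-aware design idea can be illustrated with a small sketch. The checks, thresholds, and function names below are illustrative assumptions, not BOOST's actual rules or API; they screen a candidate sequence against two constraints commonly imposed by DNA synthesis vendors (overall GC content and maximum homopolymer run length):

```python
# Illustrative sketch (not BOOST's actual API): screen a designed DNA
# sequence against two common synthesis constraints -- overall GC content
# and maximum homopolymer run length. Thresholds are assumed values.

def gc_content(seq):
    """Fraction of G/C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def max_homopolymer(seq):
    """Length of the longest run of a single repeated base."""
    longest, run = 1, 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def synthesis_violations(seq, gc_min=0.25, gc_max=0.65, max_run=8):
    """Return a list of human-readable constraint violations."""
    issues = []
    gc = gc_content(seq)
    if not gc_min <= gc <= gc_max:
        issues.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
    run = max_homopolymer(seq)
    if run > max_run:
        issues.append(f"homopolymer run of {run} exceeds {max_run}")
    return issues
```

    A design pipeline of the kind described would run such checks before ordering, so that violating sequences are flagged for redesign rather than triggering price premiums at the vendor.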

  7. CONSOLE: A CAD tandem for optimization-based design interacting with user-supplied simulators

    NASA Technical Reports Server (NTRS)

    Fan, Michael K. H.; Wang, Li-Sheng; Koninckx, Jan; Tits, Andre L.

    1989-01-01

    CONSOLE employs a recently developed design methodology (International Journal of Control 43:1693-1721) which provides the designer with a congenial environment in which to express his problem as a multiple-objective constrained optimization problem, and allows him to refine his characterization of optimality as a suboptimal design is approached. To this end, in CONSOLE, the designer formulates the design problem using a high-level language and performs design tasks and explores tradeoffs through a few short and clearly defined commands. The range of problems that can be solved efficiently using a CAD tool depends very much on the ability of that tool to be interfaced with user-supplied simulators. For instance, when designing a control system one makes use of the characteristics of the plant, and therefore a model of the plant under study has to be made available to the CAD tool. CONSOLE allows for easy interfacing with almost any simulator the user has available. To date, CONSOLE has been used successfully in many applications, including the design of controllers for a flexible arm and for a robotic manipulator, and the solution of a parameter selection problem for a neural network.

  8. Energy efficiency façade design in high-rise apartment buildings using the calculation of solar heat transfer through windows with shading devices

    NASA Astrophysics Data System (ADS)

    Ha, P. T. H.

    2018-04-01

    The architectural design orientation chosen at the first design stage plays a key role in, and has a great impact on, the energy consumption of a building throughout its life-cycle. To provide designers with a simple and useful tool for quantitatively determining and optimizing the energy efficiency of a building at the very first stage of conceptual design, a factor called the building envelope energy efficiency factor (Khqnl) is investigated and proposed. Heat transfer through windows and other glazed areas of mezzanine floors accounts for 86% of the overall thermal transfer through the building envelope, so the Khqnl factor of high-rise buildings largely depends on shading solutions. The author has established tables and charts giving reference values of the Khqnl factor for certain high-rise apartment buildings in Hanoi, calculated with a software program for various inputs, including types and sizes of shading devices, building orientations, and different points in time. Architects can readily refer to these tables and charts in façade design to achieve a higher level of energy efficiency.
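    The kind of quantity behind such tables can be sketched in back-of-envelope form. The formula and all numeric values below are illustrative assumptions (the paper itself relies on a dedicated simulation program): instantaneous solar heat gain through a window is approximated as area × SHGC × incident irradiance × a shading-device factor:

```python
# Back-of-envelope window heat-gain sketch (illustrative only; the paper
# uses a dedicated simulation program). SHGC is the solar heat gain
# coefficient of the glazing; shading_factor is the fraction of incident
# solar radiation that the shading device lets through.

def window_heat_gain(area_m2, shgc, irradiance_w_m2, shading_factor=1.0):
    """Instantaneous solar heat gain through a window, in watts."""
    return area_m2 * shgc * irradiance_w_m2 * shading_factor

# Example (assumed values): a 2 m2 window with SHGC 0.6 under 500 W/m2,
# behind an overhang that blocks 40% of the sun (shading factor 0.6).
gain_w = window_heat_gain(2.0, 0.6, 500.0, 0.6)
```

    Comparing such gains with and without shading devices, across orientations and times of day, is essentially what the paper's tables and charts tabulate at much higher fidelity.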

  9. Parametric and Generative Design Techniques for Digitalization in Building Industry: the Case Study of Glued- Laminated-Timber Industry

    NASA Astrophysics Data System (ADS)

    Pasetti Monizza, G.; Matt, D. T.; Benedetti, C.

    2016-11-01

    According to the Wortmann classification, the Building Industry (BI) can be defined as an engineer-to-order (ETO) industry: the engineering process starts only when an order is acquired. This definition implies that every final product (building) is almost unique and that processes cannot be easily standardized or automated. Because of this, BI is one of the least efficient industries today, still largely led by craftsmanship. In recent years, several improvements in process efficiency have been made, focusing on manufacturing and installation processes only. In order to improve the efficiency of design and engineering processes as well, the scientific community agrees that the most fruitful strategy should be Front-End Design (FED). Nevertheless, effective techniques and tools are missing. This paper discusses outcomes of a research activity that aims at highlighting whether Parametric and Generative Design techniques allow reducing wastes of resources and improving the overall efficiency of the BI by pushing the digitalization of design and engineering processes of products. Focusing on the Glued-Laminated-Timber industry, the authors show how Parametric and Generative Design techniques can be introduced in a standard supply-chain system, highlighting potentials and criticisms of the supply-chain system as a whole.

  10. Aeroelastic Optimization Study Based on X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley; Pak, Chan-Gi

    2014-01-01

    A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high-fidelity finite element models to characterize the design space was successfully developed and established. Two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center are presented. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting design requirements including strength, buckling, and flutter. A hybrid and discretization optimization approach was implemented to improve the accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass-balancing optimization study. The results provide guidance to modify the fabricated flexible wing design and move the design flutter speeds back into the flight envelope so that the original objective of the X-56A flight test can be accomplished.

  11. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

    Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at the Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools, as proposed in the IEEE VHDL 2004 standard, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA-vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.

  12. Exploring the Role of Social Memory of Floods for Designing Flood Early Warning Operations

    NASA Astrophysics Data System (ADS)

    Girons Lopez, Marc; Di Baldassarre, Giuliano; Grabs, Thomas; Halldin, Sven; Seibert, Jan

    2016-04-01

    Early warning systems are an important tool for natural-disaster mitigation, especially for flooding events. Warnings rely on near-future forecasts to provide time to take preventive actions before a flood occurs, thus reducing potential losses. Beyond technical capacity, however, successful warnings require efficient coordination and communication among a range of different actors and stakeholders. The complexity of integrating the technical and social spheres of warning systems has resulted in system designs that neglect important aspects such as social awareness of floods, leading to suboptimal results. A better understanding of the interactions and feedbacks among the different elements of early warning systems is therefore needed to improve their efficiency and, in turn, social resilience. When designing an early warning system, two important decisions need to be made: (i) the hazard magnitude at and above which a warning should be issued, and (ii) the degree of confidence required for issuing a warning. The first decision is usually based on social vulnerability and climatic variability, while the second is related to the performance (i.e. accuracy) of the forecasting tools. Consequently, by estimating the vulnerability and the accuracy of the forecasts, these two variables can be optimized to minimize costs and losses. Important parameters with a strong influence on the efficiency of warning systems, such as social awareness, are however not considered in their design. In this study we present a theoretical exploration of the impact of social awareness on the design of early warning systems. For this purpose we use a definition of social memory of flood events as a proxy for flood risk awareness and test its effect on the optimization of the warning-system design variables. Understanding the impact of social awareness on warning-system design is important for making more robust warnings that can better adapt to different social settings and more efficiently reduce vulnerability.
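    The two design variables discussed above can be sketched with a toy cost-loss model. Everything here is an illustrative assumption (hypothetical costs, a simple threshold scan), not the study's actual socio-hydrological model: a warning is issued whenever the forecast probability reaches a threshold, and the threshold is chosen to minimize expected cost over historical forecast/outcome pairs:

```python
# Toy cost-loss sketch of the warning-threshold decision: pay an action
# cost C on every warning issued and a loss L on every missed flood,
# then pick the forecast-probability threshold that minimises total
# cost over historical (probability, flood_occurred) pairs.
# All cost values are illustrative assumptions.

def expected_cost(threshold, forecasts, C=1.0, L=10.0):
    """Total cost over (probability, flood_occurred) pairs."""
    cost = 0.0
    for p, flooded in forecasts:
        if p >= threshold:
            cost += C              # warning issued: pay the action cost
        elif flooded:
            cost += L              # missed event: pay the full loss
    return cost

def best_threshold(forecasts, C=1.0, L=10.0, steps=101):
    """Scan candidate thresholds and return the cost-minimising one."""
    candidates = [i / (steps - 1) for i in range(steps)]
    return min(candidates, key=lambda t: expected_cost(t, forecasts, C, L))
```

    A social-memory effect of the kind the study explores could then be modelled by letting the loss L (or the response to warnings) decay as time since the last flood grows.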

  13. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoringprograms

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

    Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries and organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  14. A supportive architecture for CFD-based design optimisation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of the computational fluid dynamics (CFD) technique has led to a rise in its applications in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind-tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation; however, our review of existing works found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and developed algorithms perform successfully and efficiently in a design optimisation with over 200 design variables.

  15. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations

    PubMed Central

    Duarte, Belmiro P.M.; Wong, Weng Kee; Oliveira, Nuno M.C.

    2015-01-01

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations, to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space to be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D-, A- and E-optimal designs using design problems in biochemical engineering and show the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D-optimal designs but requires more computation than the SDP formulation. The efficiencies of the designs generated by the two methods are generally very close, and so we recommend the SDP formulation in practice. PMID:26949279

  16. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations.

    PubMed

    Duarte, Belmiro P M; Wong, Weng Kee; Oliveira, Nuno M C

    2016-02-15

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations, to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space to be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D-, A- and E-optimal designs using design problems in biochemical engineering and show the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D-optimal designs but requires more computation than the SDP formulation. The efficiencies of the designs generated by the two methods are generally very close, and so we recommend the SDP formulation in practice.
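    The notion of a D-optimal design used in these two records can be illustrated on the simplest possible case. The sketch below is not the paper's SDP/NLP machinery; it exhaustively searches a discretized design space for the n-point design of the straight-line model y = b0 + b1*x that maximizes det(X^T X), which is the D-optimality criterion:

```python
# Minimal D-optimal design sketch for the linear model y = b0 + b1*x:
# exhaustively pick the n-point design (replication allowed) over a
# discretised design space that maximises det(X^T X).

from itertools import combinations_with_replacement

def det_xtx(points):
    """det(X^T X) for the design matrix X with rows (1, x)."""
    n = len(points)
    sx = sum(points)
    sxx = sum(x * x for x in points)
    return n * sxx - sx * sx          # 2x2 determinant, expanded

def d_optimal(candidates, n):
    """Best n-point design (with replication) over the candidate grid."""
    return max(combinations_with_replacement(candidates, n), key=det_xtx)
```

    For a symmetric grid on [-1, 1] with n = 4, this search places two runs at each endpoint, which is the classical D-optimal answer for a straight-line model; the paper's formulations reach the same kind of result for far richer models without brute force.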

  17. Application of the gene editing tool, CRISPR-Cas9, for treating neurodegenerative diseases.

    PubMed

    Kolli, Nivya; Lu, Ming; Maiti, Panchanan; Rossignol, Julien; Dunbar, Gary L

    2018-01-01

    Increased accumulation of proteins transcribed from damaged DNA and reduced DNA-repair capability contribute to numerous neurological diseases for which effective treatments are lacking. Gene-editing techniques provide new hope for replacing defective genes and DNA associated with neurological diseases. With advances in editing tools such as zinc finger nucleases (ZFNs), meganucleases, and transcription activator-like effector nucleases (TALENs), scientists are able to design DNA-binding proteins that can make precise double-strand breaks (DSBs) at the target DNA. Recent developments in CRISPR-Cas9 gene-editing technology have proven to be more precise and efficient compared with most other gene-editing techniques. Two repair pathways, non-homologous end joining (NHEJ) and homology-directed repair (HDR), are exploited in the CRISPR-Cas9 system to efficiently excise defective genes and incorporate exogenous DNA at the target site. In this review article, we provide an overview of the CRISPR-Cas9 methodology, including its molecular mechanism, with a focus on how this gene-editing tool can be used to counteract certain genetic defects associated with neurological diseases. Detailed understanding of this new tool could help researchers design specific gene-editing strategies to repair genetic disorders in selected neurological diseases. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  19. WASP: a Web-based Allele-Specific PCR assay designing tool for detecting SNPs and mutations

    PubMed Central

    Wangkumhang, Pongsakorn; Chaichoompu, Kridsadakorn; Ngamphiw, Chumpol; Ruangrit, Uttapong; Chanprasert, Juntima; Assawamakin, Anunchai; Tongsima, Sissades

    2007-01-01

    Background: Allele-specific (AS) Polymerase Chain Reaction is a convenient and inexpensive method for genotyping Single Nucleotide Polymorphisms (SNPs) and mutations. It is applied in many recent studies, including population genetics, molecular genetics and pharmacogenomics. Using existing AS primer design tools is a cumbersome process for inexperienced users, since information about the SNP/mutation must be acquired from public databases prior to the design. Furthermore, most of these tools do not offer mismatch enhancement for the designed primers, and the available web applications provide neither a user-friendly graphical input interface nor an intuitive visualization of their primer results. Results: This work presents a web-based AS primer design application called WASP. This tool can efficiently design AS primers for human SNPs as well as mutations. To assist scientists with collecting the necessary information about target polymorphisms, the tool provides a local SNP database containing over 10 million SNPs of various populations, drawn from the public domain databases NCBI dbSNP, HapMap and JSNP. This database is tightly integrated with the tool, so that users can perform designs for existing SNPs without leaving the site. To guarantee the specificity of AS primers, the proposed system incorporates a primer-specificity enhancement technique widely used in experimental protocols. In particular, WASP exploits a destabilizing effect by introducing one deliberate mismatch at the penultimate base (second from the 3'-end) of each AS primer. Furthermore, WASP offers a graphical user interface, via scalable vector graphics (SVG), that allows users to select SNPs and graphically visualize the designed primers and their conditions. Conclusion: WASP offers a tool for designing AS primers for both SNPs and mutations. By integrating the database of known SNPs (searchable by gene ID or rs number), this tool eases the awkward process of obtaining flanking sequences and other related information from public SNP databases. It takes into account the underlying destabilizing effect to ensure the effectiveness of designed primers. With its user-friendly SVG interface, WASP intuitively presents the resulting designed primers, helping users to export them or make further adjustments to the design. This software can be freely accessed at . PMID:17697334
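    The penultimate-mismatch idea can be shown in a few lines. The substitution table and function below are illustrative assumptions rather than WASP's implementation: the second base from the 3'-end of an allele-specific primer is deliberately swapped to destabilize extension from the non-target allele:

```python
# Sketch of the mismatch-enhancement idea (illustrative, not WASP's
# implementation): swap the penultimate base (second from the 3' end)
# of an allele-specific primer for a destabilising substitute.

# A destabilising substitution scheme, assumed here for illustration.
SUBSTITUTE = {"A": "C", "C": "A", "G": "T", "T": "G"}

def add_penultimate_mismatch(primer):
    """Return the primer with its second-to-last base swapped."""
    bases = list(primer.upper())
    bases[-2] = SUBSTITUTE[bases[-2]]
    return "".join(bases)
```

    In a real design workflow the choice of substitute base would depend on the allele-discriminating base at the 3' terminus and the thermodynamics of the resulting duplex, which is the part WASP automates.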

  20. Designing and evaluating a STEM teacher learning opportunity in the research university.

    PubMed

    Hardré, Patricia L; Ling, Chen; Shehab, Randa L; Herron, Jason; Nanny, Mark A; Nollert, Matthias U; Refai, Hazem; Ramseyer, Christopher; Wollega, Ebisa D

    2014-04-01

    This study examines the design and evaluation strategies for a year-long teacher learning and development experience, including their effectiveness, efficiency and recommendations for strategic redesign. Design characteristics include programmatic features and outcomes: cognitive, affective and motivational processes; interpersonal and social development; and performance activities. Program participants were secondary math and science teachers, partnered with engineering faculty mentors, in a research university-based education and support program. Data from multiple sources demonstrated strengths and weaknesses in design of the program's learning environment, including: face-to-face and via digital tools; on-site and distance community interactions; and strategic evaluation tools and systems. Implications are considered for the strategic design and evaluation of similar grant-funded research experiences intended to support teacher learning, development and transfer. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. C3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs

    NASA Astrophysics Data System (ADS)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-02-01

    Modern astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the petabyte-scale era. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and is conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing maximum flexibility to the end user in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data extracted from public surveys, we discuss its cross-matching capabilities and computing-time efficiency, including a direct comparison with some publicly available tools chosen among the most used within the community and representative of different interface paradigms. We verified that the C3 tool has excellent capabilities for performing efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C3 competitive in the context of public astronomical tools.
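    The core operation that C3 parallelizes can be sketched naively. The function names below are illustrative, and this O(N*M) loop is nothing like C3's partitioned multi-core algorithm; it simply pairs each source in one catalog with its nearest neighbour in the other, using the haversine great-circle separation, and keeps pairs within a matching radius:

```python
# Naive positional cross-match sketch (illustrative; C3 itself uses a
# parallel, partitioned algorithm and supports elliptical regions).
# Catalogs are lists of (RA, Dec) in degrees.

from math import radians, degrees, sin, cos, asin, sqrt

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    ra1, dec1, ra2, dec2 = map(radians, (ra1, dec1, ra2, dec2))
    h = (sin((dec2 - dec1) / 2) ** 2
         + cos(dec1) * cos(dec2) * sin((ra2 - ra1) / 2) ** 2)
    return degrees(2 * asin(sqrt(h)))

def cross_match(cat_a, cat_b, radius_deg):
    """Pairs (i, j): cat_b[j] is cat_a[i]'s nearest source within radius."""
    matches = []
    for i, (ra_a, dec_a) in enumerate(cat_a):
        seps = [angular_sep_deg(ra_a, dec_a, ra_b, dec_b)
                for ra_b, dec_b in cat_b]
        j = min(range(len(cat_b)), key=seps.__getitem__)
        if seps[j] <= radius_deg:
            matches.append((i, j))
    return matches
```

    At survey scale this brute-force loop is unusable, which is precisely why tools like C3 combine spatial partitioning with multi-core parallelism.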

  2. Adapting Pharmacoeconomics to Shape Efficient Health Systems en Route to UHC - Lessons from Two Continents.

    PubMed

    Miot, Jacqui; Thiede, Michael

    2017-01-01

    Background: Pharmacoeconomics is receiving increasing attention globally as a set of tools ensuring efficient use of resources in health systems, albeit with different applications depending on the contextual, cultural and development stages of each country. The factors guiding design, implementation and optimisation of pharmacoeconomics as a steering tool under the universal health coverage paradigm are explored using case studies of Germany and South Africa. Findings: German social health insurance is subject to the efficiency precept. Pharmaco-regulatory tools reflect the respective framework conditions under which they developed at particular points in time. The institutionalization and integration of pharmacoeconomics into the remit of the Institute for Quality and Efficiency in Health Care occurred only rather recently. The road has not been smooth, requiring political discourse and complex processes of negotiation. Although enshrined in the National Drug Policy, South Africa has had a more fragmented approach to medicine selection and pricing with different policies in private and public sectors. The regulatory reform for use of pharmacoeconomic tools is ongoing and will be further shaped by the introduction of National Health Insurance. Conclusion: A clear vision or framework is essential as the regulatory introduction of pharmacoeconomics is not a single event but rather a growing momentum. The path will always be subject to influences of politics, economics and market forces beyond the healthcare system so delays and modifications to pharmacoeconomic tools are to be expected. Health systems are dynamic and pharmacoeconomic reforms need to be sufficiently flexible to evolve alongside.

  3. Automated visual imaging interface for the plant floor

    NASA Astrophysics Data System (ADS)

    Wutke, John R.

    1991-03-01

    The paper provides an overview of the challenges facing users of automated visual imaging ("AVI") machines and the philosophies that should be employed in designing them. As manufacturing tools and equipment become more sophisticated, it is increasingly difficult to maintain efficient interaction between the operator and the machine. The typical user of an AVI machine in a production environment is technically unsophisticated, and operator and machine ergonomics are often a neglected or poorly addressed part of an efficient manufacturing process. This paper presents a number of man-machine interface design techniques and philosophies that effectively solve these problems.

  4. Rational, computer-enabled peptide drug design: principles, methods, applications and future directions.

    PubMed

    Diller, David J; Swanson, Jon; Bayden, Alexander S; Jarosinski, Mark; Audie, Joseph

    2015-01-01

    Peptides provide promising templates for developing drugs to occupy a middle space between small molecules and antibodies and for targeting 'undruggable' intracellular protein-protein interactions. Importantly, rational or in cerebro design, especially when coupled with validated in silico tools, can be used to efficiently explore chemical space and identify islands of 'drug-like' peptides to satisfy diverse drug discovery program objectives. Here, we consider the underlying principles of and recent advances in rational, computer-enabled peptide drug design. In particular, we consider the impact of basic physicochemical properties, potency and ADME/Tox opportunities and challenges, and recently developed computational tools for enabling rational peptide drug design. Key principles and practices are spotlighted by recent case studies. We close with a hypothetical future case study.

  5. Improved design method of a rotating spool compressor using a comprehensive model and comparison to experimental results

    NASA Astrophysics Data System (ADS)

    Bradshaw, Craig R.; Kemp, Greg; Orosz, Joe; Groll, Eckhard A.

    2017-08-01

    An improvement to the design process of the rotating spool compressor is presented. This improvement utilizes a comprehensive model to explore two working fluids (R410A and R134a) and various displaced volumes across a variety of geometric parameters. The geometric parameters explored consist of the eccentricity ratio and the length-to-diameter ratio: the eccentricity ratio is varied between 0.81 and 0.92 and the length-to-diameter ratio between 0.4 and 3. The key tradeoffs are evaluated, and the results show that there is an optimum eccentricity and length-to-diameter ratio, unique to a particular fluid and displaced volume, that maximizes the model-predicted performance. For R410A, the modeling tool predicts that the overall isentropic efficiency is optimized at a lower length-to-diameter ratio than for R134a. Additionally, the tool predicts that as the displaced volume increases, the overall isentropic efficiency increases and the ideal length-to-diameter ratio shifts. The results from this study are utilized to develop a basic design for a 141 kW (40 tonsR) capacity prototype spool compressor for light-commercial air-conditioning applications. Results from a prototype compressor constructed based on these efforts are presented. The volumetric efficiency predictions are found to be very accurate, while the overall isentropic efficiency is shown to be slightly over-predicted.

  6. Statistical properties of mean stand biomass estimators in a LIDAR-based double sampling forest survey design.

    Treesearch

    H.E. Anderson; J. Breidenbach

    2007-01-01

    Airborne laser scanning (LIDAR) can be a valuable tool in double-sampling forest survey designs. LIDAR-derived forest structure metrics are often highly correlated with important forest inventory variables, such as mean stand biomass, and LIDAR-based synthetic regression estimators have the potential to be highly efficient compared to single-stage estimators, which...

  7. Evaluating two process scale chromatography column header designs using CFD.

    PubMed

    Johnson, Chris; Natarajan, Venkatesh; Antoniou, Chris

    2014-01-01

    Chromatography is an indispensable unit operation in the downstream processing of biomolecules. Scaling of chromatographic operations typically involves a significant increase in the column diameter. At this scale, the flow distribution within a packed bed could be severely affected by the distributor design in process scale columns. Different vendors offer process scale columns with varying design features. The effect of these design features on the flow distribution in packed beds and the resultant effect on column efficiency and cleanability needs to be properly understood in order to prevent unpleasant surprises on scale-up. Computational Fluid Dynamics (CFD) provides a cost-effective means to explore the effect of various distributor designs on process scale performance. In this work, we present a CFD tool that was developed and validated against experimental dye traces and tracer injections. Subsequently, the tool was employed to compare and contrast two commercially available header designs. © 2014 American Institute of Chemical Engineers.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
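    As a rough illustration of the optimization layer, the sketch below runs a standard global-best PSO over three bounded design variables. The torque-density objective here is a hypothetical quadratic stand-in (peak placed at (0.5, 0.3, 0.2)) for the MEC model described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical objective over (stator pole length, magnet length, rotor
# thickness); in the paper this evaluation would come from the MEC model.
def torque_density(p):
    target = np.array([0.5, 0.3, 0.2])
    return -np.sum((p - target) ** 2)

lo, hi = np.array([0.1, 0.1, 0.1]), np.array([1.0, 1.0, 1.0])
n_particles, n_iter = 30, 200
x = rng.uniform(lo, hi, size=(n_particles, 3))
v = np.zeros_like(x)
pbest = x.copy()
pbest_f = np.array([torque_density(p) for p in x])
g = pbest[np.argmax(pbest_f)].copy()          # global best

w_in, c1, c2 = 0.7, 1.5, 1.5                  # standard convergent PSO weights
for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w_in * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = np.clip(x + v, lo, hi)                # keep particles inside bounds
    f = np.array([torque_density(p) for p in x])
    better = f > pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    g = pbest[np.argmax(pbest_f)].copy()

print(g)  # converges near [0.5, 0.3, 0.2]
```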

  9. Engagement and Empowerment Through Self-Service.

    PubMed

    Endriss, Jason

    2016-01-01

    Self-service tools represent the next frontier for leave and disability. This article discusses several critical components of a successful leave and disability self-service tool. If given the proper investment and thoughtfully designed, self-service tools have the potential to augment an organization's existing interaction channels, improving the employee experience while delivering efficiencies for an administrative model. In an operating environment in which cost savings sometimes come at the expense of employee experience, such a win-win solution should not be taken lightly and, more importantly, should not be missed.

  10. Introduction on Using the FastPCR Software and the Related Java Web Tools for PCR and Oligonucleotide Assembly and Analysis.

    PubMed

    Kalendar, Ruslan; Tselykh, Timofey V; Khassenov, Bekbolat; Ramanculov, Erlan M

    2017-01-01

    This chapter introduces the FastPCR software as an integrated tool environment for PCR primer and probe design, which predicts properties of oligonucleotides based on experimental studies of PCR efficiency. The software provides comprehensive facilities for designing primers for most PCR applications and their combinations. These include standard PCR as well as multiplex, long-distance, inverse, real-time, group-specific, unique, and overlap-extension PCR for multi-fragment assembly cloning, and loop-mediated isothermal amplification (LAMP). It also contains a built-in program to design oligonucleotide sets both for long sequence assembly by ligase chain reaction and for design of amplicons that tile across a region(s) of interest. The software calculates the melting temperature for standard and degenerate oligonucleotides, including locked nucleic acid (LNA) and other modifications. It also provides analyses for a set of primers, with prediction of oligonucleotide properties, dimer and G/C-quadruplex detection, and linguistic complexity, as well as a primer dilution and resuspension calculator. The program includes various bioinformatics tools for analysis of sequences with GC or AT skew, CG% and GA% content, and purine-pyrimidine skew. It also analyzes linguistic sequence complexity, generates random DNA sequences, and performs restriction endonuclease analysis. The program allows the user to find or create restriction enzyme recognition sites for coding sequences and supports the clustering of sequences. It performs efficient and complete detection of various repeat types with visual display. The FastPCR software supports batch processing of sequence files, which is essential for automation. The program is available for download at http://primerdigital.com/fastpcr.html, and its online version is located at http://primerdigital.com/tools/pcr.html.
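    For orientation, the simplest melting-temperature estimate for short primers is the textbook Wallace rule (2 °C per A/T, 4 °C per G/C); FastPCR itself uses more accurate nearest-neighbor thermodynamics, so this is only a rule-of-thumb sketch:

```python
# Wallace rule: Tm ~ 2*(A+T) + 4*(G+C), a rough estimate valid only for
# short (~14-20 nt) primers; not the nearest-neighbor model FastPCR uses.
def wallace_tm(primer: str) -> int:
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

# 18-mer with 10 A/T and 8 G/C bases: 2*10 + 4*8 = 52 °C
print(wallace_tm("ATGCATGCATGCATGCAT"))  # 52
```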

  11. Efficiently Photocontrollable or not? Biological Activity of Photoisomerizable Diarylethenes.

    PubMed

    Komarov, Igor V; Afonin, Sergii; Babii, Oleg; Schober, Tim; Ulrich, Anne S

    2018-04-06

    Diarylethene derivatives, whose biological activity can be reversibly changed by irradiation with light of different wavelengths, have shown promise as scientific tools and as candidates for photocontrollable drugs. However, examples demonstrating efficient photocontrol of their biological activity are still relatively rare. This concept article discusses the possible reasons for this situation and presents a critical analysis of existing data and hypotheses in this field, in order to extract the design principles enabling the construction of efficient photocontrollable diarylethene-based molecules. Papers addressing biologically relevant interactions between diarylethenes and biomolecules are analyzed; however, in most published cases, the efficiency of photocontrol in living systems remains to be demonstrated. We hope that this article will encourage further discussion of design principles, primarily among pharmacologists and synthetic and medicinal chemists. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Progress Toward an Efficient and General CFD Tool for Propulsion Design/Analysis

    NASA Technical Reports Server (NTRS)

    Cox, C. F.; Cinnella, P.; Westmoreland, S.

    1996-01-01

    The simulation of propulsive flows inherently involves chemical activity. Recent years have seen substantial strides in the development of numerical schemes for reacting flowfields, in particular those involving finite-rate chemistry. However, finite-rate calculations are computationally intensive and require knowledge of the actual kinetics, which are not always known with sufficient accuracy. Alternatively, flow simulations based on the assumption of local chemical equilibrium are capable of obtaining physically reasonable results at far less computational cost. The present study summarizes the development of efficient numerical techniques for the simulation of flows in local chemical equilibrium, whereby a 'black box' chemical equilibrium solver is coupled to the usual gasdynamic equations. The generality of the methods enables the modelling of any arbitrary mixture of thermally perfect gases, including air, combustion mixtures, and plasmas. As a demonstration of the potential of these methodologies, several solutions involving reacting and perfect gas flows will be presented, including a preliminary simulation of the SSME startup transient. Future enhancements to the proposed techniques will be discussed, including more efficient finite-rate and hybrid (partial equilibrium) schemes. The algorithms that have been developed and are being optimized provide an efficient and general tool for the design and analysis of propulsion systems.

  13. Web Audio/Video Streaming Tool

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2003-01-01

    In order to promote a NASA-wide educational outreach program to educate and inform the public about space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high-level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, a prototype of a web-based tool is designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database, while the assets reside in a separate repository. The prototype tool is designed using ColdFusion 5.0.

  14. Efficient runner safety assessment during early design phase and root cause analysis

    NASA Astrophysics Data System (ADS)

    Liang, Q. W.; Lais, S.; Gentner, C.; Braun, O.

    2012-11-01

    Fatigue-related problems in Francis turbines, especially high-head Francis turbines, have been reported several times in recent years. During operation the runner is exposed to various steady and unsteady hydraulic loads; therefore the analysis of the forced response of the runner structure requires a combined approach of fluid dynamics and structural dynamics. Due to the high complexity of the phenomena and the limitations of computer power, such numerical prediction was in the past too expensive to be feasible as a standard design tool. However, thanks to continuous improvement of the knowledge base and the simulation tools, such complex analysis has become part of the design procedure at ANDRITZ HYDRO. This article describes the application of advanced analysis techniques in the runner safety check (RSC) during the early design phase, including steady-state CFD analysis, transient CFD analysis considering rotor-stator interaction (RSI), static FE analysis, and modal analysis in water considering the added-mass effect. This procedure allows very efficient interaction between the hydraulic designer and the mechanical designer during the design phase, such that a risk of failure can be detected and avoided at an early design stage. The RSC procedure can also be applied in a root cause analysis (RCA), both to find the cause of failure and to quickly define a technical solution that meets the safety criteria. An efficient application to an RCA of cracks in a Francis runner is quoted in this article as an example. The results of the RCA are presented together with an efficient and inexpensive solution whose effectiveness could be proven by again applying the described RSC techniques. It is shown that, with the RSC procedure developed and applied as a standard procedure at ANDRITZ HYDRO, such a failure is excluded in an early design phase. Moreover, the RSC procedure is compatible with different commercial and open-source codes and can easily be adapted to other types of turbines, such as pump turbines and Pelton runners.

  15. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  16. Unsteady Loss in the Stator Due to the Incoming Rotor Wake in a Highly-Loaded Transonic Compressor

    NASA Technical Reports Server (NTRS)

    Hah, Chunill

    2015-01-01

    The present paper reports an investigation of unsteady loss generation in the stator due to the incoming rotor wake in an advanced GE transonic compressor design, using a high-fidelity numerical method. This advanced compressor with high reaction and high stage loading has been investigated both experimentally and analytically in the past. The measured efficiency of this advanced compressor is significantly lower than the design goal. The general understanding is that the current generation of compressor design analysis tools misses some important flow physics in this modern compressor design. To pinpoint the source of the efficiency shortfall, an advanced test with a detailed flow traverse was performed for the front one and a half stages at the NASA Glenn Research Center.

  17. Numerical continuation and bifurcation analysis in aircraft design: an industrial perspective.

    PubMed

    Sharma, Sanjiv; Coetzee, Etienne B; Lowenberg, Mark H; Neild, Simon A; Krauskopf, Bernd

    2015-09-28

    Bifurcation analysis is a powerful method for studying the steady-state nonlinear dynamics of systems. Software tools exist for the numerical continuation of steady-state solutions as parameters of the system are varied. These tools make it possible to generate 'maps of solutions' in an efficient way that provide valuable insight into the overall dynamic behaviour of a system and can potentially influence the design process. While this approach has been employed in the military aircraft control community to understand the effectiveness of controllers, the use of bifurcation analysis in the wider aircraft industry is as yet limited. This paper reports progress on how bifurcation analysis can play a role as part of the design process for passenger aircraft. © 2015 The Author(s).

  18. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
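    The tool's core bookkeeping, updating projected emissions as portfolio shares change, can be sketched as a weighted sum over per-technology emission factors. The factors below are illustrative placeholders, not the study's actual numbers:

```python
# Illustrative emission factors in kg CO2 per MWh (hypothetical values, not
# those used in the published decision tool).
FACTORS = {
    "coal_ccs": 150, "natural_gas": 450, "nuclear": 12,
    "wind": 11, "efficiency": 0,
}

def portfolio_intensity(shares: dict) -> float:
    # Shares are generation fractions; the real tool also enforces realism
    # constraints (capacity limits, reliability) beyond this simple check.
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(FACTORS[t] * s for t, s in shares.items())

mix = {"coal_ccs": 0.2, "natural_gas": 0.2, "nuclear": 0.2,
       "wind": 0.3, "efficiency": 0.1}
print(portfolio_intensity(mix))  # 30 + 90 + 2.4 + 3.3 + 0 = 125.7
```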

  19. Adaptation Design Tool for Climate-Smart Management of Coral Reefs and Other Natural Resources.

    PubMed

    West, Jordan M; Courtney, Catherine A; Hamilton, Anna T; Parker, Britt A; Gibbs, David A; Bradley, Patricia; Julius, Susan H

    2018-06-22

    Scientists and managers of natural resources have recognized an urgent need for improved methods and tools to enable effective adaptation of management measures in the face of climate change. This paper presents an Adaptation Design Tool that uses a structured approach to break down an otherwise overwhelming and complex process into tractable steps. The tool contains worksheets that guide users through a series of design considerations for adapting their planned management actions to be more climate-smart given changing environmental stressors. Also provided with other worksheets is a framework for brainstorming new adaptation options in response to climate threats not yet addressed in the current plan. Developed and tested in collaboration with practitioners in Hawai'i and Puerto Rico using coral reefs as a pilot ecosystem, the tool and associated reference materials consist of worksheets, instructions and lessons-learned from real-world examples. On the basis of stakeholder feedback from expert consultations during tool development, we present insights and recommendations regarding how to maximize tool efficiency, gain the greatest value from the thought process, and deal with issues of scale and uncertainty. We conclude by reflecting on how the tool advances the theory and practice of assessment and decision-making science, informs higher level strategic planning, and serves as a platform for a systematic, transparent and inclusive process to tackle the practical implications of climate change for management of natural resources.

  20. Exploring the combinatorial space of complete pathways to chemicals.

    PubMed

    Wang, Lin; Ng, Chiam Yu; Dash, Satyakam; Maranas, Costas D

    2018-04-06

    Computational pathway design tools often face the challenges of balancing the stoichiometry of co-metabolites and cofactors, and dealing with reaction rule utilization in a single workflow. To this end, we provide an overview of two complementary stoichiometry-based pathway design tools optStoic and novoStoic developed in our group to tackle these challenges. optStoic is designed to determine the stoichiometry of overall conversion first which optimizes a performance criterion (e.g. high carbon/energy efficiency) and ensures a comprehensive search of co-metabolites and cofactors. The procedure then identifies the minimum number of intervening reactions to connect the source and sink metabolites. We also further the pathway design procedure by expanding the search space to include both known and hypothetical reactions, represented by reaction rules, in a new tool termed novoStoic. Reaction rules are derived based on a mixed-integer linear programming (MILP) compatible reaction operator, which allow us to explore natural promiscuous enzymes, engineer candidate enzymes that are not already promiscuous as well as design de novo enzymes. The identified biochemical reaction rules then guide novoStoic to design routes that expand the currently known biotransformation space using a single MILP modeling procedure. We demonstrate the use of the two computational tools in pathway elucidation by designing novel synthetic routes for isobutanol. © 2018 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
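    The first step of optStoic, fixing a balanced overall conversion before searching for intervening reactions, reduces in essence to an elemental balance check. A minimal sketch, using the overall conversion glucose → isobutanol + 2 CO2 + H2O:

```python
import numpy as np

# Molecular formulas as (C, H, O) element-count vectors.
FORMULAS = {
    "glucose":    np.array([6, 12, 6]),   # C6H12O6
    "isobutanol": np.array([4, 10, 1]),   # C4H10O
    "co2":        np.array([1, 0, 2]),
    "h2o":        np.array([0, 2, 1]),
}

def is_balanced(stoich: dict) -> bool:
    # Negative coefficients are consumed, positive are produced; the net
    # element vector of a balanced overall conversion must be zero.
    net = sum(c * FORMULAS[m] for m, c in stoich.items())
    return bool(np.all(net == 0))

# Candidate overall conversion: glucose -> isobutanol + 2 CO2 + H2O
print(is_balanced({"glucose": -1, "isobutanol": 1, "co2": 2, "h2o": 1}))  # True
```

    optStoic goes well beyond this, optimizing the overall stoichiometry against a performance criterion via MILP, but every candidate it considers must pass this balance condition.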

  1. Formative Evaluation of Care Nexus: a Tool for the Visualization and Management of Care Teams of Complex Pediatric Patients

    PubMed Central

    Ranade-Kharkar, Pallavi; Norlin, Chuck; Del Fiol, Guilherme

    2017-01-01

    Complex and chronic conditions in pediatric patients with special needs often result in large and diverse patient care teams. Having a comprehensive view of the care teams is crucial to achieving effective and efficient care coordination for these vulnerable patients. In this study, we iteratively design and develop two alternative user interfaces (graphical and tabular) of a prototype of a tool for visualizing and managing care teams and conduct a formative assessment of the usability, usefulness, and efficiency of the tool. The median time to task completion for the 21 study participants was less than 7 seconds for 19 out of the 22 usability tasks. While both the prototype formats were well-liked in terms of usability and usefulness, the tabular format was rated higher for usefulness (p=0.02). Inclusion of CareNexus-like tools in electronic and personal health records has the potential to facilitate care coordination in complex pediatric patients. PMID:29854215

  2. DUK - A Fast and Efficient Kmer Based Sequence Matching Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Mingkun; Copeland, Alex; Han, James

    2011-03-21

    A new tool, DUK, was developed to perform the matching task. Matching means finding whether a query sequence partially or totally matches given reference sequences. Matching is similar to alignment; indeed, many traditional analysis tasks like contaminant removal use alignment tools. But for matching there is no need to know which bases of a query sequence match which positions of a reference sequence; it only needs to determine whether a match exists. This subtle difference can make the matching task much faster than alignment. DUK is accurate, versatile, fast, and has efficient memory usage. It uses a k-mer hashing method to index reference sequences and a Poisson model to calculate p-values. DUK is carefully implemented in C++ in an object-oriented design. The resulting classes can also be used to develop other tools quickly. DUK has been widely used at JGI for a wide range of applications such as contaminant removal, organelle genome separation, and assembly refinement. Many real applications and simulated datasets demonstrate its power.
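    The core idea, index the reference's k-mers, then test a query for any shared k-mer, can be sketched in a few lines (this shows the concept only, not DUK's C++ implementation or its Poisson p-value model):

```python
# Index every k-mer of the reference in a set.
def kmers(seq: str, k: int) -> set:
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# A query "matches" if any of its k-mers appears in the reference index;
# unlike alignment, no positional correspondence is computed.
def matches(query: str, ref_index: set, k: int) -> bool:
    return any(query[i:i + k] in ref_index for i in range(len(query) - k + 1))

ref = "ACGTACGTGGTTAACC"
idx = kmers(ref, 8)
print(matches("TTTTACGTACGTGGTT", idx, 8))  # True: shares k-mer "ACGTACGT"
print(matches("GGGGGGGGGGGGGGGG", idx, 8))  # False: no shared k-mer
```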

  3. Computer-aided design of biological circuits using TinkerCell

    PubMed Central

    Bergmann, Frank T; Sauro, Herbert M

    2010-01-01

    Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. PMID:21327060

  4. From parabolic-trough to metasurface-concentrator: assessing focusing in the wave-optics limit.

    PubMed

    Hsu, Liyi; Dupré, Matthieu; Ndao, Abdoulaye; Kanté, Boubacar

    2017-04-15

    Metasurfaces are promising tools toward novel designs for flat optics applications. As such, their quality and tolerance to fabrication imperfections need to be evaluated with specific tools. However, most such tools rely on the geometrical optics approximation and are not straightforwardly applicable to metasurfaces. In this Letter, we introduce and evaluate for metasurfaces parameters such as intercept factor and slope error usually defined for solar concentrators in the realm of ray-optics. After proposing definitions valid in physical optics, we put forward an approach to calculate them. As examples, we design three different concentrators based on three specific unit cells and assess them numerically. The concept allows for comparison of the efficiency of the metasurfaces and their sensitivities to fabrication imperfections and will be critical for practical systems implementation.

  5. Bio-jETI: a service integration, design, and provisioning platform for orchestrated bioinformatics processes.

    PubMed

    Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard

    2008-04-25

    With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way.

  6. A procedural method for the efficient implementation of full-custom VLSI designs

    NASA Technical Reports Server (NTRS)

    Belk, P.; Hickey, N.

    1987-01-01

    An embedded language system for the layout of very large scale integration (VLSI) circuits is examined. It is shown that, through the judicious use of this system, a large variety of circuits can be designed with circuit density and performance comparable to traditional full-custom design methods, but with design costs more comparable to semi-custom design methods. The high performance of this methodology is attributable to the flexibility of procedural descriptions of VLSI layouts and to a number of automatic and semi-automatic tools within the system.

  7. Design and Control of Modular Spine-Like Tensegrity Structures

    NASA Technical Reports Server (NTRS)

    Mirletz, Brian T.; Park, In-Won; Flemons, Thomas E.; Agogino, Adrian K.; Quinn, Roger D.; SunSpiral, Vytas

    2014-01-01

    We present a methodology enabled by the NASA Tensegrity Robotics Toolkit (NTRT) for the rapid structural design of tensegrity robots in simulation and an approach for developing control systems using central pattern generators, local impedance controllers, and parameter optimization techniques to determine effective locomotion strategies for the robot. Biomimetic tensegrity structures provide advantageous properties to robotic locomotion and manipulation tasks, such as their adaptability and force distribution properties, flexibility, energy efficiency, and access to extreme terrains. While strides have been made in designing insightful static biotensegrity structures, gaining a clear understanding of how a particular structure can efficiently move has been an open problem. The tools in the NTRT enable the rapid exploration of the dynamics of a given morphology, and the links between structure, controllability, and resulting gait efficiency. To highlight the effectiveness of the NTRT at this exploration of morphology and control, we will provide examples from the designs and locomotion of four different modular spine-like tensegrity robots.

  8. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a proposed two-step design-point updating rule. This part finishes after a small number of samples are generated. Then RSM starts to work using the Bucher experimental design, with the last design point as the center point and a proposed effective length as the radius of Bucher's approach. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
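    The importance-sampling half of the algorithm can be illustrated on a one-dimensional toy problem: estimate a small failure probability by sampling from a proposal density centered at the design point and reweighting. The limit state below is a made-up example, not one from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy limit state g(x) = 3 - x with x ~ N(0, 1); failure when g(x) <= 0.
# Exact failure probability: P(x >= 3) = 1 - Phi(3) ~ 1.35e-3, which crude
# MCS would need millions of samples to resolve.
n = 50_000
x = rng.normal(loc=3.0, scale=1.0, size=n)   # proposal h = N(3, 1), centered at the design point
log_w = -0.5 * x**2 + 0.5 * (x - 3.0) ** 2   # log f(x) - log h(x); normalizing constants cancel
w = np.exp(log_w)
pf = np.mean((x >= 3.0) * w)                 # importance-sampling estimate of Pf
print(f"estimated Pf = {pf:.2e}")
```

    Because roughly half the proposal samples land in the failure region, the estimator's variance is far lower than crude MCS at the same sample count, which is the motivation for starting the proposed algorithm with importance sampling.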

  9. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  10. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  11. Options to improve energy efficiency for educational building

    NASA Astrophysics Data System (ADS)

    Jahan, Mafruha

    The cost of energy is a major factor that must be considered for educational facility budget planning purposes. The analysis of energy-related issues and options can be complex and requires significant time and detailed effort. One way to facilitate the inclusion of energy option planning in facility planning efforts is to utilize a tool that allows a quick appraisal of the facility's energy profile. Once such an appraisal is accomplished, energy improvement options can be ranked consistently alongside other facility needs and requirements. After an energy efficiency option has been determined to have meaningful value in comparison with other facility planning options, the initial appraisal can then serve as the basis for an expanded consideration of additional facility and energy use detail using the same analytic system. This thesis has developed a methodology and an associated analytic model to assist in these tasks and thereby improve the energy efficiency of educational facilities. A detailed energy efficiency analysis tool is described that utilizes specific university building characteristics such as size, architecture, envelope, lighting, occupancy, and thermal design, allowing annual energy consumption to be reduced. Improving various aspects of an educational building's energy performance can be complex and can require significant time and experience. The approach developed in this thesis initially assesses the energy design of a university building. This initial appraisal is intended to assist administrators in assessing the potential value of energy efficiency options for their particular facility. Subsequently, this scoping design can be extended, as another stage of the model, by local facility or planning personnel to add more detail and engineering aspects to the initial screening model.
This approach can assist university planning efforts to identify the most cost-effective combinations of energy efficiency strategies. The model analyzes and compares the payback periods of all proposed Energy Performance Measures (EPMs) to determine which has the greatest potential value.
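    The payback comparison the model performs can be illustrated with a minimal sketch; the measure names, installation costs, and annual savings below are invented for illustration only:

```python
def payback_years(install_cost, annual_savings):
    # Simple (undiscounted) payback: years until cumulative savings
    # cover the installation cost.
    if annual_savings <= 0:
        return float("inf")
    return install_cost / annual_savings

# Hypothetical Energy Performance Measures: (install cost $, annual savings $).
epms = {
    "LED lighting retrofit": (40_000, 12_000),
    "Envelope air sealing": (15_000, 3_000),
    "HVAC scheduling controls": (8_000, 4_000),
}

# Rank measures from shortest payback (greatest near-term value) to longest.
ranked = sorted(epms, key=lambda m: payback_years(*epms[m]))
```

A full model would also discount future savings and account for measure interactions, but the ranking idea is the same.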

  12. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, J. Storrs; Levy, Saul; Smith, Donald E.; Miyake, Keith M.

    1992-01-01

    A parameterized version of the tree processor was designed and tested (by simulation). The leaf processor design is 90 percent complete. We expect to complete and test a combination of tree and leaf cell designs in the next period. Work is proceeding on algorithms for the content addressable memory (CAM), and once the design is complete we will begin simulating algorithms for large problems. The following topics are covered: (1) the practical implementation of content addressable memory; (2) design of a LEAF cell for the Rutgers CAM architecture; (3) a circuit design tool user's manual; and (4) design and analysis of efficient hierarchical interconnection networks.

  13. Computer Aided Design of Computer Generated Holograms for electron beam fabrication

    NASA Technical Reports Server (NTRS)

    Urquhart, Kristopher S.; Lee, Sing H.; Guest, Clark C.; Feldman, Michael R.; Farhoosh, Hamid

    1989-01-01

    Computer Aided Design (CAD) systems that have been developed for electrical and mechanical design tasks are also effective tools for the process of designing Computer Generated Holograms (CGHs), particularly when these holograms are to be fabricated using electron beam lithography. CAD workstations provide efficient and convenient means of computing, storing, displaying, and preparing for fabrication many of the features that are common to CGH designs. Experience gained in the process of designing CGHs with various types of encoding methods is presented. Suggestions are made so that future workstations may further accommodate the CGH design process.

  14. Experience with Using Multiple Types of Visual Educational Tools during Problem-Based Learning.

    PubMed

    Kang, Bong Jin

    2012-06-01

    This study describes the experience of using multiple types of visual educational tools in the setting of problem-based learning (PBL). The author intends to demonstrate their roles in diverse and efficient ways of clinical reasoning and problem solving. Visual educational tools were introduced in a lecture that covered their various types, possible benefits, and some examples. Each group made one mechanistic case diagram per week, and each student designed one diagnostic schema or therapeutic algorithm per week, based on their learning issues. The students were also told to provide commentary, which was intended to give insight into the truthfulness of their work. Subsequently, the author administered a questionnaire about the usefulness and weaknesses of visual educational tools and the difficulties encountered in performing the work. The quality of the products was also assessed by the author. There were many complaints about the adequacy of the introduction to visual educational tools, a problem also revealed by the many initially inappropriate products. However, the exercise presentation in the first week improved the level of understanding of their purposes and the method of design. In general, students agreed that the tools helped provide a deep understanding of the cases and the possibility of solving clinical problems efficiently. The commentary was helpful in evaluating the truthfulness of their efforts. Students suggested increasing the percentage of their scores in consideration of the effort involved. Using multiple types of visual educational tools during PBL can be useful in understanding the diverse routes of clinical reasoning and clinical features.

  15. Special Report: Part One. New Tools for Professionals.

    ERIC Educational Resources Information Center

    Liskin, Miriam; And Others

    1984-01-01

    This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)

  16. Microsystems Enabled Photovoltaics

    ScienceCinema

    Gupta, Vipin; Nielson, Greg; Okandan, Murat; Granata, Jennifer; Nelson, Jeff; Haney, Mike; Cruz-Campa, Jose Luiz

    2018-06-07

    Sandia's microsystems enabled photovoltaic advances combine mature technology and tools currently used in microsystem production with groundbreaking advances in photovoltaics cell design, decreasing production and system costs while improving energy conversion efficiency. The technology has potential applications in buildings, houses, clothing, portable electronics, vehicles, and other contoured structures.

  17. Gemi: PCR Primers Prediction from Multiple Alignments

    PubMed Central

    Sobhy, Haitham; Colson, Philippe

    2012-01-01

    Designing primers and probes for polymerase chain reaction (PCR) is a preliminary and critical step that requires the identification of highly conserved regions in a given set of sequences. This task can be challenging if the targeted sequences display a high level of diversity, as frequently encountered in microbiologic studies. We developed Gemi, an automated, fast, and easy-to-use bioinformatics tool with a user-friendly interface to design primers and probes based on multiple aligned sequences. This tool can be used for the purposes of real-time and conventional PCR and can deal efficiently with large sets of long sequences. PMID:23316117
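    The core task such a tool automates, locating highly conserved regions in a multiple alignment, can be approximated by a sliding-window identity scan. This is a simplified stand-in for illustration, not Gemi's actual algorithm; the toy alignment, window size, and threshold are assumptions:

```python
def conserved_windows(alignment, window, min_identity):
    # Slide a window across the alignment and report start positions where
    # every column reaches the identity threshold and contains no gaps.
    length = len(alignment[0])
    hits = []
    for start in range(length - window + 1):
        ok = True
        for col in range(start, start + window):
            chars = [seq[col] for seq in alignment]
            top = max(chars.count(c) for c in set(chars))
            if top / len(alignment) < min_identity or "-" in chars:
                ok = False
                break
        if ok:
            hits.append(start)
    return hits

# Toy alignment: three sequences differing at positions 7 and 13.
aln = [
    "ATGCCGTACGTTAGC",
    "ATGCCGTTCGTTAGC",
    "ATGCCGTACGTTAAC",
]
```

Windows passing the scan are candidate primer sites; a real primer-design tool would additionally check melting temperature, GC content, and self-complementarity.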

  18. Optimal Number and Allocation of Data Collection Points for Linear Spline Growth Curve Modeling: A Search for Efficient Designs

    ERIC Educational Resources Information Center

    Wu, Wei; Jia, Fan; Kinai, Richard; Little, Todd D.

    2017-01-01

    Spline growth modelling is a popular tool to model change processes with distinct phases and change points in longitudinal studies. Focusing on linear spline growth models with two phases and a fixed change point (the transition point from one phase to the other), we detail how to find optimal data collection designs that maximize the efficiency…
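    The mean trajectory of such a two-phase model is a linear spline with a fixed knot: one slope before the change point and another after it. A minimal sketch (all parameter values invented):

```python
def spline_mean(t, b0, b1, b2, knot):
    # Two-phase linear spline growth curve: intercept b0, slope b1 up to
    # the fixed change point (knot), slope b2 afterwards.
    return b0 + b1 * min(t, knot) + b2 * max(t - knot, 0.0)

# Example: growth rises at rate 2 until t = 2, then declines at rate 1.
trajectory = [spline_mean(t, 10.0, 2.0, -1.0, 2.0) for t in range(5)]
```

In an optimal-design search, one would compare allocations of measurement occasions by the sampling variance they imply for the estimated slopes and change-point-adjacent parameters, and keep the allocation that minimizes it.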

  19. Adaptive Origami for Efficiently Folded Structures

    DTIC Science & Technology

    2016-02-01

    design optimization to find optimal origami patterns for in-plane compression. 3. Self-folding and programmable material systems were developed for...2014, 1st place in the Midwest and 2nd place in the National 2014 SAMPE student research symposium). • Design of self-folding and programmable material systems: Nafion SMP Programming: To integrate active materials into origami, mechanical analysis and optimization tools were applied to the

  20. Agile: From Software to Mission Systems

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shirley, Mark; Hobart, Sarah

    2017-01-01

    To maximize efficiency and flexibility in Mission Operations System (MOS) design, we are evolving principles from agile and lean methods for software, to the complete mission system. This allows for reduced operational risk at reduced cost, and achieves a more effective design through early integration of operations into mission system engineering and flight system design. The core principles are assessment of capability through demonstration, risk reduction through targeted experiments, early test and deployment, and maturation of processes and tools through use.

  1. Appraisal of comparative single-case experimental designs for instructional interventions with non-reversible target behaviors: Introducing the CSCEDARS ("Cedars").

    PubMed

    Schlosser, Ralf W; Belfiore, Phillip J; Sigafoos, Jeff; Briesch, Amy M; Wendt, Oliver

    2018-05-28

    Evidence-based practice as a process requires the appraisal of research as a critical step. In the field of developmental disabilities, single-case experimental designs (SCEDs) figure prominently as a means for evaluating the effectiveness of non-reversible instructional interventions. Comparative SCEDs contrast two or more instructional interventions to document their relative effectiveness and efficiency. As such, these designs have great potential to inform evidence-based decision-making. To harness this potential, however, interventionists and authors of systematic reviews need tools to appraise the evidence generated by these designs. Our literature review revealed that existing tools do not adequately address the specific methodological considerations of comparative SCEDs that aim to compare instructional interventions of non-reversible target behaviors. The purpose of this paper is to introduce the Comparative Single-Case Experimental Design Rating System (CSCEDARS, "cedars") as a tool for appraising the internal validity of comparative SCEDs of two or more non-reversible instructional interventions. Pertinent literature will be reviewed to establish the need for this tool and to underpin the rationales for individual rating items. Initial reliability information will be provided as well. Finally, directions for instrument validation will be proposed. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. AIRNOISE: A Tool for Preliminary Noise-Abatement Terminal Approach Route Design

    NASA Technical Reports Server (NTRS)

    Li, Jinhua; Sridhar, Banavar; Xue, Min; Ng, Hok

    2016-01-01

    Noise from aircraft in the airport vicinity is one of the leading aviation-induced environmental issues. The FAA developed the Integrated Noise Model (INM) and its replacement, the Aviation Environmental Design Tool (AEDT), to assess the noise impact of all aviation activities. However, a software tool is needed that is simple to use for terminal route modification, quick and reasonably accurate for preliminary noise impact evaluation, and flexible enough for iterative design of optimal noise-abatement terminal routes. In this paper, we extend our previous work on a noise-abatement terminal approach route design tool, named AIRNOISE, to satisfy these criteria. First, software efficiency has been increased more than tenfold by reimplementing the tool in C instead of MATLAB; in addition, a GPU-accelerated computing module was implemented that tested hundreds of times faster than the C implementation. Second, a Graphical User Interface (GUI) was developed that allows users to import current terminal approach routes and modify them interactively to design new ones; the corresponding noise impacts are then calculated and displayed in the GUI within seconds. Finally, AIRNOISE was applied to a Baltimore-Washington International Airport terminal approach route to demonstrate its usage.

  3. From design to manufacturing of asymmetric teeth gears using computer application

    NASA Astrophysics Data System (ADS)

    Suciu, F.; Dascalescu, A.; Ungureanu, M.

    2017-05-01

    Asymmetric cylindrical gears, with involute tooth profiles having different base circle diameters, are nonstandard gears used with the aim of obtaining better functional parameters for the active profile. One would expect that manufacturing such gears becomes possible only after designing and producing specific tools. This paper presents how computer-aided design and applications developed in MATLAB, used to obtain the geometrical parameters and to calculate functional parameters such as stress and displacement, transmission error, and gear efficiency, together with 2D models generated with AutoLISP applications, enable computer-aided manufacturing of asymmetric gears with standard tools. The specific tools, considered one of the disadvantages of these gears, are therefore unnecessary, and the expected supplementary costs are avoided. The calculation algorithm established for the asymmetric gear design application uses the "direct design" of spur gears. This method first determines the parameters of the gears, followed by the determination of the asymmetric gear rack's parameters based on those of the gears. Using the original design method and computer applications, the geometrical parameters and the 2D and 3D models of the asymmetric gears have been determined, and on the basis of these models asymmetric gears have been manufactured on a CNC machine tool.

  4. Quantitative optical imaging and sensing by joint design of point spread functions and estimation algorithms

    NASA Astrophysics Data System (ADS)

    Quirin, Sean Albert

    The joint application of tailored optical Point Spread Functions (PSF) and estimation methods is an important tool for designing quantitative imaging and sensing solutions. By enhancing the information transfer encoded by the optical waves into an image, matched post-processing algorithms are able to complete tasks with improved performance relative to conventional designs. In this thesis, new engineered PSF solutions with image processing algorithms are introduced and demonstrated for quantitative imaging using information-efficient signal processing tools and/or optical-efficient experimental implementations. The use of a 3D engineered PSF, the Double-Helix (DH-PSF), is applied as one solution for three-dimensional, super-resolution fluorescence microscopy. The DH-PSF is a tailored PSF which was engineered to have enhanced information transfer for the task of localizing point sources in three dimensions. Both an information- and optical-efficient implementation of the DH-PSF microscope are demonstrated here for the first time. This microscope is applied to image single-molecules and micro-tubules located within a biological sample. A joint imaging/axial-ranging modality is demonstrated for application to quantifying sources of extended transverse and axial extent. The proposed implementation has improved optical-efficiency relative to prior designs due to the use of serialized cycling through select engineered PSFs. This system is demonstrated for passive-ranging, extended Depth-of-Field imaging and digital refocusing of random objects under broadband illumination. Although the serialized engineered PSF solution is an improvement over prior designs for the joint imaging/passive-ranging modality, it requires the use of multiple PSFs---a potentially significant constraint. 
Therefore an alternative design is proposed, the Single-Helix PSF, where only one engineered PSF is necessary and the chromatic behavior of objects under broadband illumination provides the necessary information transfer. The matched estimation algorithms are introduced along with an optically-efficient experimental system to image and passively estimate the distance to a test object. An engineered PSF solution is proposed for improving the sensitivity of optical wave-front sensing using a Shack-Hartmann Wave-front Sensor (SHWFS). The performance limits of the classical SHWFS design are evaluated and the engineered PSF system design is demonstrated to enhance performance. This system is fabricated and the mechanism for additional information transfer is identified.

  5. Efficient Digital Implementation of The Sigmoidal Function For Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Pratap, Rana; Subadra, M.

    2011-10-01

    An efficient piecewise linear approximation of a nonlinear function (PLAN) is proposed. It uses a Simulink design to perform a direct transformation from X to Y, where X is the input and Y is the approximated sigmoidal output. This PLAN is then used at the outputs of an artificial neural network to perform the nonlinear approximation. This paper proposes a method to implement different approximations of the sigmoid function in FPGA (Field Programmable Gate Array) circuits. The major benefit of the proposed method lies in the possibility of designing neural networks by means of predefined block systems created in the System Generator environment and of creating higher-level design tools for implementing neural networks in logic circuits.
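    A PLAN-style approximation replaces the sigmoid with a handful of line segments whose slopes are powers of two, so the multiplications reduce to bit shifts in hardware. The breakpoints below follow a commonly cited PLAN coefficient set; treat the exact values as an assumption rather than this paper's own table:

```python
import math

def plan_sigmoid(x):
    # Piecewise linear approximation of the logistic sigmoid. The slopes
    # (0.25, 0.125, 0.03125) are powers of two, i.e. shift operations.
    ax = abs(x)
    if ax >= 5.0:
        y = 1.0
    elif ax >= 2.375:
        y = 0.03125 * ax + 0.84375
    elif ax >= 1.0:
        y = 0.125 * ax + 0.625
    else:
        y = 0.25 * ax + 0.5
    # Exploit the symmetry sigmoid(-x) = 1 - sigmoid(x).
    return y if x >= 0 else 1.0 - y
```

The worst-case error of this segment set is around 0.02 near |x| = 1, which is usually acceptable for fixed-point neural-network inference.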

  6. Systems Engineering Building Advances Power Grid Research

    ScienceCinema

    Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob

    2018-01-16

    Researchers and industry are now better equipped to tackle the nation’s most pressing energy challenges through PNNL’s new Systems Engineering Building – including challenges in grid modernization, buildings efficiency and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.

  7. Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics

    DTIC Science & Technology

    2010-09-01

    AFRL-RX-WP-TR-2010-4297, Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics, by Harry Millwater, Ronald Bagley, Jose Garza, D. Wagner, Andrew Bates, and Andy Voorhees (program element 62102F). Cited references include Nondestructive Evaluation System Reliability Assessment, MIL-HDBK-1823, 30 April 1999, and Leverant GR, Millwater HR, McClung RC, Enright MP, A New Tool for Design and Certification of

  8. On-Line Tool for the Assessment of Radiation in Space - Deep Space Mission Enhancements

    NASA Technical Reports Server (NTRS)

    Sandridge, Chris A.; Blattnig, Steve R.; Norman, Ryan B.; Slaba, Tony C.; Walker, Steve A.; Spangler, Jan L.

    2011-01-01

    The On-Line Tool for the Assessment of Radiation in Space (OLTARIS, https://oltaris.nasa.gov) is a web-based set of tools and models that allows engineers and scientists to assess the effects of space radiation on spacecraft, habitats, rovers, and spacesuits. The site is intended to be a design tool for those studying the effects of space radiation for current and future missions as well as a research tool for those developing advanced material and shielding concepts. The tools and models are built around the HZETRN radiation transport code and are primarily focused on human- and electronic-related responses. The focus of this paper is to highlight new capabilities that have been added to support deep space (outside Low Earth Orbit) missions. Specifically, the electron, proton, and heavy ion design environments for the Europa mission have been incorporated along with an efficient coupled electron-photon transport capability to enable the analysis of complicated geometries and slabs exposed to these environments. In addition, a neutron albedo lunar surface environment was also added, which will be of value for the analysis of surface habitats. These updates will be discussed in terms of their implementation and of how OLTARIS can be used by instrument vendors, mission designers, and researchers to analyze their specific requirements.

  9. OSLay: optimal syntenic layout of unfinished assemblies.

    PubMed

    Richter, Daniel C; Schuster, Stephan C; Huson, Daniel H

    2007-07-01

    The whole genome shotgun approach to genome sequencing results in a collection of contigs that must be ordered and oriented to facilitate efficient gap closure. We present a new tool OSLay that uses synteny between matching sequences in a target assembly and a reference assembly to layout the contigs (or scaffolds) in the target assembly. The underlying algorithm is based on maximum weight matching. The tool provides an interactive visualization of the computed layout and the result can be imported into the assembly editing tool Consed to support the design of primer pairs for gap closure. To enhance efficiency in the gap closure phase of a genome project it is crucial to know which contigs are adjacent in the target genome. Related genome sequences can be used to layout contigs in an assembly. OSLay is freely available from: http://www-ab.informatik.unituebingen.de/software/oslay.
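    The layout step rests on matching contigs into adjacencies by synteny score. A greedy approximation to maximum weight matching conveys the idea; the contig names and scores below are invented, and a real implementation such as OSLay's exact matching works on oriented contig ends rather than whole contigs:

```python
def greedy_adjacency_matching(edges):
    # edges: (score, contig_a, contig_b) candidate adjacencies, scored by
    # how strongly the reference assembly supports placing a next to b.
    # Greedily keep the highest-scoring edges such that each contig is
    # used at most once -- a simple approximation to max weight matching.
    chosen, used = [], set()
    for score, a, b in sorted(edges, reverse=True):
        if a not in used and b not in used:
            chosen.append((a, b))
            used.update((a, b))
    return chosen

candidates = [
    (90, "ctg1", "ctg2"),
    (80, "ctg2", "ctg3"),
    (70, "ctg3", "ctg4"),
]
```

The selected pairs are the adjacencies worth targeting with primer pairs during gap closure.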

  10. Energy Tracking Software Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryan Davis; Nathan Bird; Rebecca Birx

    2011-04-04

    Acceleration has created an interactive energy tracking and visualization platform that supports decreasing electric, water, and gas usage. Homeowners have access to tools that allow them to gauge their use and track progress toward a smaller energy footprint. Real estate agents have access to consumption data, allowing for sharing a comparison with potential home buyers. Home builders have the opportunity to compare their neighborhood's energy efficiency with competitors. Home energy raters have a tool for gauging the progress of their clients after efficiency changes. And social groups are able to help encourage members to reduce their energy bills and help their environment. EnergyIT.com is the business umbrella for all energy tracking solutions and is designed to provide information about our energy tracking software and promote sales. CompareAndConserve.com (Gainesville-Green.com) helps homeowners conserve energy through education and competition. ToolsForTenants.com helps renters factor energy usage into their housing decisions.

  11. Assessing motivation for work environment improvements: internal consistency, reliability and factorial structure.

    PubMed

    Hedlund, Ann; Ateg, Mattias; Andersson, Ing-Marie; Rosén, Gunnar

    2010-04-01

    Workers' motivation to actively take part in improvements to the work environment is assumed to be important for the efficiency of investments for that purpose. That gives rise to the need for a tool to measure this motivation. A questionnaire to measure motivation for improvements to the work environment has been designed. Internal consistency and test-retest reliability of the domains of the questionnaire have been measured, and the factorial structure has been explored, from the answers of 113 employees. The internal consistency is high (0.94), as well as the correlation for the total score (0.84). Three factors are identified accounting for 61.6% of the total variance. The questionnaire can be a useful tool in improving intervention methods. The expectation is that the tool can be useful, particularly with the aim of improving efficiency of companies' investments for work environment improvements. Copyright 2010 Elsevier Ltd. All rights reserved.
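    The internal-consistency figure reported (0.94) is a Cronbach's alpha. For reference, a minimal computation of alpha from per-item scores; the toy responses below are invented:

```python
def cronbach_alpha(items):
    # items: one list of scores per questionnaire item, all of equal length
    # (one score per respondent). Alpha = k/(k-1) * (1 - sum of item
    # variances / variance of the respondents' total scores).
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Two perfectly correlated toy items answered by four respondents.
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 4, 6, 8]])
```

Values above roughly 0.9, as in the study, indicate that the items measure a single underlying construct consistently.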

  12. OLTARIS: On-Line Tool for the Assessment of Radiation in Space

    NASA Technical Reports Server (NTRS)

    Sandridge, Chris A.; Blattnig, Steve R.; Clowdsley, Martha S.; Norbury, John; Qualis, Garry D.; Simonsen, Lisa C.; Singleterry, Robert C.; Slaba, Tony C.; Walker, Steven A.; Badavi, Francis F.

    2009-01-01

    The effects of ionizing radiation on humans in space is a major technical challenge for exploration to the moon and beyond. The radiation shielding team at NASA Langley Research Center has been working for over 30 years to develop techniques that can efficiently assist the engineer throughout the entire design process. OLTARIS: On-Line Tool for the Assessment of Radiation in Space is a new NASA website (http://oltaris.larc.nasa.gov) that allows engineers and physicists to access a variety of tools and models to study the effects of ionizing space radiation on humans and shielding materials. The site is intended to be an analysis and design tool for those working radiation issues for current and future manned missions, as well as a research tool for developing advanced material and shielding concepts. The site, along with the analysis tools and models within, have been developed using strict software practices to ensure reliable and reproducible results in a production environment. They have also been developed as a modular system so that models and algorithms can be easily added or updated.

  13. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  14. Parametric Geometry, Structured Grid Generation, and Initial Design Study for REST-Class Hypersonic Inlets

    NASA Technical Reports Server (NTRS)

    Ferlemann, Paul G.; Gollan, Rowan J.

    2010-01-01

    Computational design and analysis of three-dimensional hypersonic inlets with shape transition has been a significant challenge due to the complex geometry and grid required for three-dimensional viscous flow calculations. Currently, the design process utilizes an inviscid design tool to produce initial inlet shapes by streamline tracing through an axisymmetric compression field. However, the shape is defined by a large number of points rather than a continuous surface and lacks important features such as blunt leading edges. Therefore, a design system has been developed to parametrically construct true CAD geometry and link the topology of a structured grid to the geometry. The Adaptive Modeling Language (AML) constitutes the underlying framework that is used to build the geometry and grid topology. Parameterization of the CAD geometry allows the inlet shapes produced by the inviscid design tool to be generated, but also allows a great deal of flexibility to modify the shape to account for three-dimensional viscous effects. By linking the grid topology to the parametric geometry, the GridPro grid generation software can be used efficiently to produce a smooth hexahedral multiblock grid. To demonstrate the new capability, a matrix of inlets was designed by varying four geometry parameters in the inviscid design tool. The goals of the initial design study were to explore inviscid design tool geometry variations with a three-dimensional analysis approach, demonstrate a solution rate which would enable the use of high-fidelity viscous three-dimensional CFD in future design efforts, process the results for important performance parameters, and perform a sample optimization.

  15. High-Fidelity Multidisciplinary Design Optimization of Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Martins, Joaquim R. R. A.; Kenway, Gaetan K. W.; Burdette, David; Jonsson, Eirikur; Kennedy, Graeme J.

    2017-01-01

    To evaluate new airframe technologies we need design tools based on high-fidelity models that consider multidisciplinary interactions early in the design process. The overarching goal of this NRA is to develop tools that enable high-fidelity multidisciplinary design optimization of aircraft configurations, and to apply these tools to the design of high aspect ratio flexible wings. We developed a geometry engine that is capable of quickly generating conventional and unconventional aircraft configurations including the internal structure. This geometry engine features adjoint derivative computation for efficient gradient-based optimization. We also added overset capability to a computational fluid dynamics solver, complete with an adjoint implementation and semiautomatic mesh generation. We also developed an approach to constraining buffet and started the development of an approach for constraining flutter. On the applications side, we developed a new common high-fidelity model for aeroelastic studies of high aspect ratio wings. We performed optimal design trade-offs between fuel burn and aircraft weight for metal, conventional composite, and carbon nanotube composite wings. We also assessed a continuous morphing trailing edge technology applied to high aspect ratio wings. This research resulted in the publication of 26 manuscripts so far, and the developed methodologies were used in two other NRAs.

  16. TeraStitcher - A tool for fast automatic 3D-stitching of teravoxel-sized microscopy images

    PubMed Central

    2012-01-01

    Background Further advances in modern microscopy are leading to teravoxel-sized tiled 3D images at high resolution, thus increasing the dimension of the stitching problem by at least two orders of magnitude. The existing software solutions do not seem adequate to address the additional requirements arising from these datasets, such as the minimization of memory usage and the need to process just a small portion of data. Results We propose a free and fully automated 3D stitching tool designed to meet the special requirements of teravoxel-sized tiled microscopy images, able to stitch them in a reasonable time even on workstations with limited resources. The tool was tested on teravoxel-sized whole mouse brain images with micrometer resolution and it was also compared with the state-of-the-art stitching tools on megavoxel-sized publicly available datasets. This comparison confirmed that the solutions we adopted are suited for stitching very large images and also perform well on datasets with different characteristics. Indeed, some of the algorithms embedded in other stitching tools could be easily integrated in our framework if they turned out to be more effective on other classes of images. To this purpose, we designed a software architecture which separates the strategies that use memory resources efficiently from the algorithms which may depend on the characteristics of the acquired images. Conclusions TeraStitcher is a free tool that enables the stitching of teravoxel-sized tiled microscopy images even on workstations with relatively limited resources of memory (<8 GB) and processing power. It exploits the knowledge of approximate tile positions and uses ad-hoc strategies and algorithms designed for such very large datasets. The produced images can be saved into a multiresolution representation to be efficiently retrieved and processed. We provide TeraStitcher both as a standalone application and as a plugin of the free software Vaa3D.
PMID:23181553

  17. Using GIS to generate spatially balanced random survey designs for natural resource applications.

    PubMed

    Theobald, David M; Stevens, Don L; White, Denis; Urquhart, N Scott; Olsen, Anthony R; Norman, John B

    2007-07-01

    Sampling of a population is frequently required to understand trends and patterns in natural resource management because financial and time constraints preclude a complete census. A rigorous probability-based survey design specifies where to sample so that inferences from the sample apply to the entire population. Probability survey designs should be used in natural resource and environmental management situations because they provide the mathematical foundation for statistical inference. Development of long-term monitoring designs demands survey designs that achieve statistical rigor and are efficient but remain flexible to inevitable logistical or practical constraints during field data collection. Here we describe an approach to probability-based survey design, called the Reversed Randomized Quadrant-Recursive Raster, based on the concept of spatially balanced sampling and implemented in a geographic information system. This provides environmental managers with a practical tool to generate flexible and efficient survey designs for natural resource applications. Factors commonly used to modify sampling intensity, such as categories, gradients, or accessibility, can be readily incorporated into the spatially balanced sample design.
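    The quadrant-recursive idea can be illustrated with a toy ordering: recursively split the cells into quadrants, randomize the quadrant order at each level, and interleave the results so any prefix of the ordering is spread across space. This is a simplified sketch of the spatial-balance concept, not the published RRQRR algorithm:

```python
import random
from itertools import zip_longest

def balanced_order(cells, rng):
    """Order grid cells so that any prefix is spread across space:
    split into quadrants, order each quadrant recursively, then
    interleave the quadrant orderings round-robin."""
    cells = list(cells)
    if len(cells) <= 1:
        return cells
    xs = [c[0] for c in cells]
    ys = [c[1] for c in cells]
    mx = (min(xs) + max(xs)) / 2.0
    my = (min(ys) + max(ys)) / 2.0
    quads = [[], [], [], []]
    for c in cells:
        quads[(c[0] > mx) * 2 + (c[1] > my)].append(c)
    if any(len(q) == len(cells) for q in quads):
        rng.shuffle(cells)              # cannot split further
        return cells
    rng.shuffle(quads)                  # randomize quadrant order per level
    subs = [balanced_order(q, rng) for q in quads]
    out = []
    for group in zip_longest(*subs):
        out.extend(c for c in group if c is not None)
    return out

grid = [(x, y) for x in range(8) for y in range(8)]
order = balanced_order(grid, random.Random(0))
sample = order[:4]   # a prefix is itself a spatially balanced sample
print(sample)
```

    Taking the first n cells of the ordering yields a sample whose points fall in different quadrants, which is the practical property monitoring designs need.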

  18. Boundary layers in centrifugal compressors. [application of boundary layer theory to compressor design

    NASA Technical Reports Server (NTRS)

    Dean, R. C., Jr.

    1974-01-01

    The utility of boundary-layer theory in the design of centrifugal compressors is demonstrated. Boundary-layer development in the diffuser entry region is shown to be important to stage efficiency. The result of an earnest attempt to analyze this boundary layer with the best tools available is displayed. Acceptable prediction accuracy was not achieved. The inaccuracy of boundary-layer analysis in this case would result in stage efficiency prediction as much as four points low. Fluid dynamic reasons for analysis failure are discussed with support from flow data. Empirical correlations used today to circumnavigate the weakness of the theory are illustrated.

  19. Wireless data collection retrievals of bridge inspection/management information.

    DOT National Transportation Integrated Search

    2017-02-28

    To increase the efficiency and reliability of bridge inspections, MDOT contracted to have a 3D-model-based data entry application for mobile tablets developed to aid inspectors in the field. The 3D Bridge App is a mobile software tool designed to fac...

  20. Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.

    PubMed

    Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas

    2016-06-17

    Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
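    The DOE idea of testing a balanced fraction instead of all variants can be shown with a toy library: three pathway components with three parts each give 27 variants, but a 9-run Latin-square fraction keeps every pair of factors balanced. Part names here are hypothetical, and Double Dutch's actual grammar and heuristics go well beyond this:

```python
from itertools import product

# Hypothetical part libraries for a three-component pathway.
parts = {"promoter": ["P1", "P2", "P3"],
         "rbs":      ["R1", "R2", "R3"],
         "cds":      ["C1", "C2", "C3"]}

# Latin-square fraction: pick cds index = (promoter + rbs) mod 3, so each
# level of every factor co-occurs exactly once with each level of the others.
design = []
for i, j in product(range(3), range(3)):
    k = (i + j) % 3
    design.append((parts["promoter"][i], parts["rbs"][j], parts["cds"][k]))

print(len(design))  # 9 variants instead of 27
```

    Fitting a main-effects model to measurements of these 9 variants estimates each part's contribution without constructing the full factorial, which is the economy DOE buys.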

  1. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
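    The search layer of such a tool is a standard particle swarm optimizer wrapped around a fast analytical objective. A minimal PSO sketch with a toy stand-in objective (the real objective would come from the MEC model, e.g. negative torque density over the three geometric variables):

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, seed=1):
    """Minimal particle swarm optimizer (inertia + cognitive + social
    terms, positions clamped to bounds). Returns the best point found."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy stand-in for the MEC objective, with known minimum at (2, 1, 3):
obj = lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2 + (x[2] - 3) ** 2
best, fbest = pso(obj, [(0, 5), (0, 5), (0, 5)])
```

    Because each objective evaluation is an algebraic circuit solution rather than an FEA run, thousands of such evaluations remain cheap, which is where the reported time savings come from.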

  2. Optimum Design of LLC Resonant Converter using Inductance Ratio (Lm/Lr)

    NASA Astrophysics Data System (ADS)

    Palle, Kowstubha; Krishnaveni, K.; Ramesh Reddy, Kolli

    2017-06-01

    The main benefits of the LLC resonant dc/dc converter over conventional series and parallel resonant converters are its light load regulation, less circulating current, larger bandwidth for zero voltage switching, and less tuning of switching frequency for controlled output. A unique analytical tool, called fundamental harmonic approximation with peak gain adjustment, is used for designing the converter. In this paper, an optimum design of the converter is proposed by considering three different design criteria with different values of inductance ratio (Lm/Lr) to achieve good efficiency at high input voltage. The optimum design includes the analysis of operating range, switching frequency range, primary side losses of a switch, and stability. The analysis is carried out with simulation using software tools such as MATLAB and PSIM. The performance of the optimized design is demonstrated for a design specification of 12 V, 5 A output operating with an input voltage range of 300-400 V using the FSFR 2100 IC of Texas Instruments.
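    The fundamental harmonic approximation reduces the tank to an algebraic gain expression in the normalized frequency, quality factor, and inductance ratio. A sketch using one common textbook form of the FHA gain, with m = (Lm + Lr)/Lr = 1 + Lm/Lr (the paper's exact definitions and adjustments may differ):

```python
import math

def llc_gain(fn, q, k):
    """FHA voltage gain of an LLC tank vs. normalized switching frequency
    fn = fs/fr, quality factor q, and inductance ratio k = Lm/Lr
    (common textbook form; not the paper's adjusted expression)."""
    m = k + 1.0
    num = fn ** 2 * (m - 1.0)
    den = math.sqrt((m * fn ** 2 - 1.0) ** 2
                    + (fn * (fn ** 2 - 1.0) * (m - 1.0) * q) ** 2)
    return num / den

# At the resonant frequency the gain is unity regardless of k and q:
print(llc_gain(1.0, 0.4, 5.0))
# A smaller Lm/Lr gives a higher attainable gain below resonance,
# which is the trade-off the inductance-ratio design study exploits:
print(llc_gain(0.7, 0.4, 3.0) > llc_gain(0.7, 0.4, 8.0))
```

    Sweeping fn for several (q, k) pairs reproduces the familiar gain-curve families used to pick the operating and switching-frequency ranges.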

  3. A modified operational sequence methodology for zoo exhibit design and renovation: conceptualizing animals, staff, and visitors as interdependent coworkers.

    PubMed

    Kelling, Nicholas J; Gaalema, Diann E; Kelling, Angela S

    2014-01-01

    Human factors analyses have been used to improve efficiency and safety in various work environments. Although generally limited to humans, the universality of these analyses allows for their formal application to a much broader domain. This paper outlines a model for the use of human factors to enhance zoo exhibits and optimize spaces for all user groups: zoo animals, zoo visitors, and zoo staff members. Zoo exhibits are multi-faceted, and each user group has a distinct set of requirements that can clash with or complement each other. Careful analysis and a reframing of the three groups as interdependent coworkers can enhance safety, efficiency, and experience for all user groups. This paper details the general approach and gives specific examples of the use of modified human factors tools: function allocation, operational sequence diagrams, and needs assessment. These tools allow for adaptability and ease of understanding in the design or renovation of exhibits. © 2014 Wiley Periodicals, Inc.

  4. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
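    The hybrid idea reduces to comparing measured data against a physics-based expectation and flagging large residuals. A minimal sketch with a hypothetical part-load COP curve standing in for the engineering model (the paper's actual cooling-plant models and thresholds differ):

```python
def expected_power(load_kw, cop_curve):
    """Physics-based expectation: electric power implied by the cooling
    load and an efficiency (COP) curve."""
    return load_kw / cop_curve(load_kw)

def fdd(measurements, cop_curve, tol=0.15):
    """Rule-based check on top of the physics model: flag time steps
    where measured power deviates from the prediction by more than tol."""
    faults = []
    for t, (load_kw, power_kw) in enumerate(measurements):
        pred = expected_power(load_kw, cop_curve)
        if abs(power_kw - pred) / pred > tol:
            faults.append(t)
    return faults

# Hypothetical part-load COP curve and (load, measured power) samples.
cop = lambda load: 3.0 + 2.0 * min(load / 500.0, 1.0)
data = [(400.0, 85.0), (400.0, 120.0), (250.0, 63.0)]
print(fdd(data, cop))  # flags only the anomalous second sample
```

    The same residual can also drive optimization: persistent positive residuals point at degraded equipment or suboptimal setpoints rather than one-off faults.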

  5. Aligator: A computational tool for optimizing total chemical synthesis of large proteins.

    PubMed

    Jacobsen, Michael T; Erickson, Patrick W; Kay, Michael S

    2017-09-15

    The scope of chemical protein synthesis (CPS) continues to expand, driven primarily by advances in chemical ligation tools (e.g., reversible solubilizing groups and novel ligation chemistries). However, the design of an optimal synthesis route can be an arduous and fickle task due to the large number of theoretically possible, and in many cases problematic, synthetic strategies. In this perspective, we highlight recent CPS tool advances and then introduce a new and easy-to-use program, Aligator (Automated Ligator), for analyzing and designing the most efficient strategies for constructing large targets using CPS. As a model set, we selected the E. coli ribosomal proteins and associated factors for computational analysis. Aligator systematically scores and ranks all feasible synthetic strategies for a particular CPS target. The Aligator script methodically evaluates potential peptide segments for a target using a scoring function that includes solubility, ligation site quality, segment lengths, and number of ligations to provide a ranked list of potential synthetic strategies. We demonstrate the utility of Aligator by analyzing three recent CPS projects from our lab: TNFα (157 aa), GroES (97 aa), and DapA (312 aa). As the limits of CPS are extended, we expect that computational tools will play an increasingly important role in the efficient execution of ambitious CPS projects such as production of a mirror-image ribosome. Copyright © 2017 Elsevier Ltd. All rights reserved.
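    Ranking synthesis strategies is essentially a sequence-partitioning search. A dynamic-programming sketch of the idea: split a sequence into segments of bounded length, cutting only before residues that support ligation (here, hypothetically, Cys or Ala), and minimize the number of ligations. Aligator's real scoring also weighs solubility, ligation-site quality, and segment-length balance:

```python
import functools

def best_strategy(seq, max_len=50, site_ok=lambda aa: aa in "CA"):
    """Return (n_ligations, segments) for the minimal-ligation split of
    seq into segments of at most max_len residues, with each cut placed
    before a ligation-compatible residue."""
    n = len(seq)

    @functools.lru_cache(maxsize=None)
    def solve(start):
        if n - start <= max_len:
            return 0, [seq[start:]]          # final segment, no ligation
        best = None
        for cut in range(start + 1, min(start + max_len, n - 1) + 1):
            if site_ok(seq[cut]):
                ligs, segs = solve(cut)
                cand = (1 + ligs, [seq[start:cut]] + segs)
                if best is None or cand[0] < best[0]:
                    best = cand
        if best is None:
            raise ValueError("no feasible strategy")
        return best

    return solve(0)

# Toy 113-residue sequence with two usable ligation sites (C, A):
toy = "M" + "G" * 40 + "C" + "G" * 40 + "A" + "G" * 30
ligations, segments = best_strategy(toy, max_len=50)
print(ligations, [len(s) for s in segments])
```

    Replacing the "minimize ligation count" objective with a weighted score per segment turns the same recursion into a ranked list of strategies, which is the shape of output the abstract describes.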

  6. Self-running and self-floating two-dimensional actuator using near-field acoustic levitation

    NASA Astrophysics Data System (ADS)

    Chen, Keyu; Gao, Shiming; Pan, Yayue; Guo, Ping

    2016-09-01

    Non-contact actuators are promising technologies in metrology, machine tools, and hovercars, but have suffered from low energy efficiency, complex design, and low controllability. Here we report a new design of a self-running and self-floating actuator capable of two-dimensional motion with an unlimited travel range. The proposed design exploits near-field acoustic levitation for heavy object lifting, and coupled resonant vibration to generate acoustic streaming for non-contact motion in designated directions. The device utilizes resonant vibration of the structure for high energy efficiency, and adopts a single piezo element to achieve both levitation and non-contact motion for a compact and simple design. Experiments demonstrate that the proposed actuator can reach a moving speed of 1.65 cm/s or faster and is capable of transporting a total weight of 80 g under 1.2 W power consumption.

  7. Structural efficiency study of composite wing rib structures

    NASA Technical Reports Server (NTRS)

    Swanson, Gary D.; Gurdal, Zafer; Starnes, James H., Jr.

    1988-01-01

    A series of short stiffened panel designs which may be applied to a preliminary design assessment of an aircraft wing rib is presented. The computer program PASCO is used as the primary design and analysis tool to assess the structural efficiency and geometry of a tailored corrugated panel, a corrugated panel with a continuous laminate, a hat stiffened panel, a blade stiffened panel, and an unstiffened flat plate. To correct some of the shortcomings in the PASCO analysis when shear is present, a two step iterative process using the computer program VICON is used. The loadings considered include combinations of axial compression, shear, and lateral pressure. The loading ranges considered are broad enough such that the designs presented may be applied to other stiffened panel applications. An assessment is made of laminate variations, increased spacing, and nonoptimum geometric variations, including a beaded panel, on the design of the panels.

  8. Some aspects of precise laser machining - Part 1: Theory

    NASA Astrophysics Data System (ADS)

    Wyszynski, Dominik; Grabowski, Marcin; Lipiec, Piotr

    2018-05-01

    The paper describes the role of laser beam polarization and deflection in the quality of laser-beam-machined parts made of difficult-to-cut materials (used for cutting tools). Application of an efficient and precise cutting tool (the laser beam) has a significant impact on preparation and finishing operations of cutting tools for aviation part manufacturing. Understanding the phenomena occurring in polarized-light laser cutting made it possible to design, build, and test opto-mechanical instrumentation to control and maintain process parameters and conditions. The research was carried out within the INNOLOT program funded by the Polish National Centre for Research and Development.

  9. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport method, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  10. Experimental Investigation on Design Enhancement of Axial Fan Using Fixed Guide Vane

    NASA Astrophysics Data System (ADS)

    Munisamy, K. M.; Govindasamy, R.; Thangaraju, S. K.

    2015-09-01

    Airflow passing through the rotating blade of an axial flow fan experiences a helical flow pattern. This swirling effect causes the system to suffer swirl energy losses, or pressure drop, thus reducing the total efficiency of the fan system. A robust way to counter this air spin past the blade is to introduce a guide vane into the system. Owing to its importance, a new approach to outlet guide vane design for a commercial 1250 mm diameter axial fan with a 30° pitch angle impeller is introduced in this paper. A guide vane design technique based on a single metal line of proper curvature was adopted for this study. With fan total efficiency chosen as the target variable to be improved, the total and static pressure at the design point were set as constraints. Therefore, the guide vane design was based on the targeted improvement in static pressure in the system. The research shows that, with a 29.63% improvement in static pressure through guide vane installation, the total fan efficiency increased by 5.12%, thus reducing the fan power by 5.32%. Good agreement was found: as the fan total efficiency increases, the power consumption of the fan is reduced. Therefore, this new approach to guide vane design can be applied to improve axial fan performance.

  11. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Technical Reports Server (NTRS)

    Monell, Donald W.; Piland, William M.

    1999-01-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  12. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Technical Reports Server (NTRS)

    Monell, Donald W.; Piland, William M.

    2000-01-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
    NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  13. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Astrophysics Data System (ADS)

    Monell, Donald W.; Piland, William M.

    2000-07-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. 
NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  14. Shuttle's 160 hour ground turnaround - A design driver

    NASA Technical Reports Server (NTRS)

    Widick, F.

    1977-01-01

    Turnaround analysis added a new dimension to the Space Program with the advent of the Space Shuttle. The requirement to turn the flight hardware around in 160 working hours from landing to launch was a significant design driver and a useful tool in forcing the integration of flight and ground systems design to permit an efficient ground operation. Although there was concern that time constraints might increase program costs, the result of the analysis was to minimize facility requirements and simplify operations with resultant cost savings.

  15. Nanobubbles: a promising efficient tool for therapeutic delivery.

    PubMed

    Cavalli, Roberta; Soster, Marco; Argenziano, Monica

    2016-01-01

    In recent decades ultrasound-guided delivery of drugs loaded on nanocarriers has been the focus of increasing attention to improve therapeutic treatments. Ultrasound has often been used in combination with microbubbles, micron-sized spherical gas-filled structures stabilized by a shell, to amplify the biophysical effects of the ultrasonic field. Nanometer-sized bubbles are defined as nanobubbles. They were designed to obtain more efficient drug delivery systems. Indeed, their small sizes allow extravasation from blood vessels into surrounding tissues and ultrasound-targeted site-specific release with minimal invasiveness. Additionally, nanobubbles might be endowed with improved stability and longer residence time in systemic circulation. This review will describe the physico-chemical properties of nanobubbles, the formulation parameters and the drug loading approaches, besides potential applications as a therapeutic tool.

  16. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials into existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  17. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    NASA Astrophysics Data System (ADS)

    Barr, David R. W.; Dudek, Piotr

    2009-12-01

    We present a software environment for the efficient simulation of cellular processor arrays (CPAs). This software (APRON) is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary and can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.
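    The execution model being simulated — every cell of the array synchronously applying the same local operation to its neighbourhood — can be sketched in a few lines of numpy. This is an illustration of the SIMD-style CPA semantics, not APRON's optimized core:

```python
import numpy as np

def cpa_step(state, kernel):
    """One synchronous update of a cellular processor array: every cell
    applies the same weighted 3x3 neighbourhood operation in lockstep
    (edges zero-padded)."""
    padded = np.pad(state, 1)
    h, w = state.shape
    out = np.zeros_like(state)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

# Example: a diffusion-like averaging kernel applied to a point source.
kernel = np.full((3, 3), 1.0 / 9.0)
state = np.zeros((5, 5))
state[2, 2] = 9.0
after = cpa_step(state, kernel)
print(after)
```

    Chaining such steps, with per-cell activity flags and different kernels, is how massively parallel vision-chip algorithms are typically prototyped before mapping them to hardware.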

  18. Bio-jETI: a service integration, design, and provisioning platform for orchestrated bioinformatics processes

    PubMed Central

    Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard

    2008-01-01

    Background With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Methods Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. Conclusions As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way. PMID:18460173

  19. Cost of enlarged operating zone for an existing Francis runner

    NASA Astrophysics Data System (ADS)

    Monette, Christine; Marmont, Hugues; Chamberland-Lauzon, Joël; Skagerstrand, Anders; Coutu, André; Carlevi, Jens

    2016-11-01

    Traditionally, hydro power plants have been operated close to the best efficiency point, the most stable operating condition, for which they were designed. However, because of changes in the electricity market, many hydro power plant operators wish to operate their machines differently to fulfil new market needs. New operating conditions can include whole-range operation, many starts/stops, extensive low-load operation, synchronous condenser mode, and power/frequency regulation. Many of these new operating conditions may impose more severe fatigue damage than traditional base-load operation close to the best efficiency point. Under these conditions, the fatigue life of the runner may be significantly reduced, and repair or replacement costs may be incurred sooner than expected. In order to design reliable Francis runners for these new and challenging operating scenarios, Andritz Hydro has developed various proprietary tools and design rules. These are used within Andritz Hydro to design mechanically robust Francis runners for the operating scenarios specified by customers. To estimate the residual life, under different operating scenarios, of an existing runner designed years ago for best-efficiency base-load operation, Andritz Hydro's design rules and tools would necessarily lead to conservative results. While the geometry of a new runner can be modified to fulfil all conservative mechanical design rules, the predicted fatigue life of an existing runner under off-design operating conditions may appear rather short because of the conservative safety factors included in the calculations. The most precise and reliable way to calculate the residual life of an existing runner under different operating scenarios is to perform a strain gauge measurement campaign on the runner. 
This paper presents the runner strain gauge measurement campaign of a mid-head Francis turbine over all the operating conditions available during the test, the analysis of the measurement signals, and the runner residual-life assessment under different operating scenarios. With these results, the maintenance cost of the change in operating mode can be calculated and foreseen by the power plant owner.
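Residual-life bookkeeping of this kind is commonly based on the Palmgren-Miner linear damage rule; the sketch below uses that textbook rule with invented cycle counts, not Andritz Hydro's proprietary tools:

```python
def miner_damage(cycles_per_year, cycles_to_failure):
    """Palmgren-Miner linear damage accumulation: D = sum(n_i / N_i),
    summed over stress-range bins from a rainflow count.
    Residual life in years is 1 / D (failure assumed at D = 1)."""
    return sum(n / N for n, N in zip(cycles_per_year, cycles_to_failure))

# Hypothetical rainflow-counted cycles for two operating scenarios:
# bin 1 = low-amplitude cycles, bin 2 = high-amplitude (start/stop, low load)
base_load = miner_damage([1e4, 1e2], [1e9, 1e7])    # near best efficiency
off_design = miner_damage([1e4, 1e5], [1e9, 1e7])   # extensive low load
print(1 / base_load, 1 / off_design)  # residual life drops sharply off-design
```

The toy numbers show the qualitative effect the abstract describes: multiplying the count of high-amplitude cycles shortens the predicted life by orders of magnitude.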

  20. Numerical Model of Flame Spread Over Solids in Microgravity: A Supplementary Tool for Designing a Space Experiment

    NASA Technical Reports Server (NTRS)

    Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)

    2001-01-01

    The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to support the design of a space experiment. The steady, two- and three-dimensional compressible Navier-Stokes equations with chemical reactions are solved, coupled with a multi-dimensional radiative heat transfer solver. The model is capable of answering a number of questions regarding the experiment concept and the hardware design. In this paper, the capabilities of the numerical model are demonstrated by providing guidance on several experimental design issues. The test matrix and operating conditions of the experiment are estimated from the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with a realistic hardware configuration. The computed detailed flame structures provide insight for data collection. In addition, the heating load and the product-exhaust cleanup requirements for the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment.

  1. Basic principles of stability.

    PubMed

    Egan, William; Schofield, Timothy

    2009-11-01

    An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to the management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency through expiry. Statistical tools such as least squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, whereas holding stability measurements to specification presents a disincentive for collecting valuable data. Kinetic laws such as Arrhenius behavior help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. The design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
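The least squares and Arrhenius ideas can be illustrated with a minimal sketch (the potency values, the 80-unit specification limit, and the activation energy are all hypothetical):

```python
import numpy as np

# Hypothetical stability data: natural-log potency vs. months at 25 C
months = np.array([0.0, 3.0, 6.0, 12.0, 24.0])
log_potency = np.array([4.60, 4.57, 4.54, 4.48, 4.36])

# Least squares fit of the log-linear degradation model
slope, intercept = np.polyfit(months, log_potency, 1)
shelf_life = (np.log(80.0) - intercept) / slope  # months to an 80-unit spec

# Arrhenius acceleration factor from 25 C to 37 C (Ea is an assumed value)
R, Ea = 8.314, 83_000.0                          # J/(mol K), J/mol
AF = np.exp(Ea / R * (1.0 / 298.15 - 1.0 / 310.15))
print(round(float(slope), 4), round(float(shelf_life), 1), round(float(AF), 2))
```

The fitted slope is the degradation rate; the acceleration factor tells a practitioner how much faster potency decays at the elevated temperature, which is the basis of accelerated stability programs.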

  2. Rational protein design: developing next-generation biological therapeutics and nanobiotechnological tools.

    PubMed

    Wilson, Corey J

    2015-01-01

    Proteins are the most functionally diverse macromolecules observed in nature, participating in a broad array of catalytic, biosensing, transport, scaffolding, and regulatory functions. Fittingly, proteins have become one of the most promising nanobiotechnological tools to date, and through the use of recombinant DNA and other laboratory methods we have produced a vast number of biological therapeutics derived from human genes. Our emerging ability to rationally design proteins (e.g., via computational methods) holds the promise of significantly expanding the number and diversity of protein therapies and has opened the gateway to realizing true and uncompromised personalized medicine. In the last decade, computational protein design has been transformed from a set of fundamental strategies for stringently testing our understanding of the protein structure-function relationship into practical tools for developing useful biological processes, nano-devices, and novel therapeutics. As protein design strategies improve (i.e., in accuracy and efficiency), clinicians will be able to leverage individual genetic data and biological metrics to develop and deliver personalized protein therapeutics with minimal delay. © 2014 Wiley Periodicals, Inc.

  3. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    USGS Publications Warehouse

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of the resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model, based on a local model inset into the Great Lakes Water Availability Pilot model, is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
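A minimal first-order (linear propagation of uncertainty) sketch of this data-worth idea, with an invented two-parameter model, might look like:

```python
import numpy as np

def pred_var(C0, X, r, y):
    """Linear (first-order second-moment) prediction variance.
    C0: prior parameter covariance; X: Jacobian of observations
    (or None for the uncalibrated case); r: observation error variance;
    y: sensitivity of the prediction to the parameters."""
    if X is None:                        # no observations: prior only
        C = C0
    else:                                # Bayesian linear update
        C = np.linalg.inv(np.linalg.inv(C0) + X.T @ X / r)
    return float(y @ C @ y)

C0 = np.diag([1.0, 1.0])             # prior parameter uncertainty
y = np.array([1.0, 0.5])             # prediction sensitivity vector
head_obs = np.array([[1.0, 0.0]])    # candidate observation, senses param 1

v_prior = pred_var(C0, None, 0.01, y)
v_post = pred_var(C0, head_obs, 0.01, y)
print(v_prior, v_post)   # the "worth" of the observation is v_prior - v_post
```

Ranking candidate observations by how much each would reduce the prediction variance is the essence of the monitoring-network design guidance described in the abstract (the actual studies used more elaborate tools).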

  4. Design and thermal analysis of a mold used in the injection of elastomers

    NASA Astrophysics Data System (ADS)

    Fekiri, Nasser; Canto, Cécile; Madec, Yannick; Mousseau, Pierre; Plot, Christophe; Sarda, Alain

    2017-10-01

    In the process of injection molding of elastomers, improving the energy efficiency of the tools is a current challenge for industry in terms of energy consumption, productivity and product quality. In the rubber industry, 20% of the energy consumed by capital goods comes from heating processes; more than 50% of heat losses are linked to insufficient control and thermal insulation of molds. The design of the tooling is evolving, in particular towards reduction of the heated mass and thermal insulation of the molds. In this paper, we present a complex tool composed, on the one hand, of a multi-cavity mold designed with a reduced heated mass and equipped with independent control zones placed as close as possible to each molding cavity and, on the other hand, of a regulated channel block (RCB) which makes it possible to limit the waste of rubber during injection. The originality of this tool lies in thermally isolating the regulated channel block from the mold, and the cavities from each other, in order to better control the temperature field in the material being transformed. We present the design and instrumentation of the experimental set-up. Experimental measurements allow us to understand the thermal behavior of the tool and to show the thermal heterogeneities on the surface of the mold and in the various cavities. Injection molding tests with rubber and a thermal balance of the energy consumption of the tool are carried out.

  5. Digital technology and clinical decision making in depression treatment: Current findings and future opportunities.

    PubMed

    Hallgren, Kevin A; Bauer, Amy M; Atkins, David C

    2017-06-01

    Clinical decision making encompasses a broad set of processes that contribute to the effectiveness of depression treatments. There is emerging interest in using digital technologies to support effective and efficient clinical decision making. In this paper, we provide "snapshots" of research and current directions on ways that digital technologies can support clinical decision making in depression treatment. Practical facets of clinical decision making are reviewed, then research, design, and implementation opportunities where technology can potentially enhance clinical decision making are outlined. Discussions of these opportunities are organized around three established movements designed to enhance clinical decision making for depression treatment, including measurement-based care, integrated care, and personalized medicine. Research, design, and implementation efforts may support clinical decision making for depression by (1) improving tools to incorporate depression symptom data into existing electronic health record systems, (2) enhancing measurement of treatment fidelity and treatment processes, (3) harnessing smartphone and biosensor data to inform clinical decision making, (4) enhancing tools that support communication and care coordination between patients and providers and within provider teams, and (5) leveraging treatment and outcome data from electronic health record systems to support personalized depression treatment. The current climate of rapid changes in both healthcare and digital technologies creates an urgent need for research, design, and implementation of digital technologies that explicitly support clinical decision making. Ensuring that such tools are efficient, effective, and usable in frontline treatment settings will be essential for their success and will require engagement of stakeholders from multiple domains. © 2017 Wiley Periodicals, Inc.

  6. Investigating rate-limiting barriers to nanoscale nonviral gene transfer with nanobiophotonics

    NASA Astrophysics Data System (ADS)

    Chen, Hunter H.

    Nucleic acids are a novel class of therapeutics poised to address many unmet clinical needs. Safe and efficient delivery remains a significant challenge that has delayed the realization of the full therapeutic potential of nucleic acids. Nanoscale nonviral vectors offer an attractive alternative to viral vectors as natural and synthetic polymers or polypeptides may be rationally designed to meet the unique demands of individual applications. A mechanistic understanding of cellular barriers is necessary to develop guidelines for designing custom gene carriers which are expected to greatly impact this delivery challenge. The work herein focused on the relationships among nanocomplex stability, intracellular trafficking and unpacking kinetics, and DNA degradation. Ultrasensitive nanosensors based on QD-FRET were developed to characterize the biophysical properties of nanocomplexes and study these rate-limiting steps. Quantitative image analysis enabled the distributions of the subpopulation of condensed or released DNA to be determined within the major cellular compartments encountered during gene transfer. The steady state stability and unpacking kinetics within these compartments were found to impact transgene expression, elucidating multiple design strategies to achieve efficient gene transfer. To address enzymatic barriers, a novel two-step QD-FRET nanosensor was developed to analyze unpacking and DNA degradation simultaneously, which has not been accomplished previously. Bioresponsive strategies such as disulfide crosslinking and thermosensitivity were evaluated by QD-FRET and quantitative compartmental analysis as case studies to determine appropriate design specifications for thiolated polymers and thermoresponsive polypeptides. 
Relevant nanobiophotonic tools were developed as a platform to study major rate-limiting barriers to nanomedicine and demonstrated the feasibility of using mechanistic information gained from these tools to guide the rational design of gene carriers and achieve the desired properties that enable efficient gene transfer.

  7. Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.

    Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden to efficiently update the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE to a pilot plant test campaign for CO2 capture suggest that, relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.
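One simple uncertainty-driven SDOE criterion, choosing the next run where a quadratic surrogate is least certain (largest leverage), can be sketched as follows (the run values are invented, and FOQUS's actual SDOE machinery is considerably more sophisticated):

```python
import numpy as np

def features(x):
    """Quadratic surrogate basis for a single design variable."""
    return np.array([1.0, x, x * x])

def next_run(done_x, candidates):
    """Pick the candidate point where the fitted surrogate has the
    largest leverage f(x)^T (X^T X)^{-1} f(x), i.e. where a new run
    would reduce model uncertainty the most."""
    X = np.array([features(x) for x in done_x])
    XtX_inv = np.linalg.inv(X.T @ X)
    lev = [features(c) @ XtX_inv @ features(c) for c in candidates]
    return candidates[int(np.argmax(lev))]

done = [0.0, 0.5, 1.0, 0.2]            # completed pilot-plant runs (toy data)
print(next_run(done, [0.1, 0.6, 0.95]))  # picks the least-explored candidate
```

After each new run the surrogate is refitted and the criterion re-evaluated, which is the "sequential" part of SDOE.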

  8. A Global Review of Incentive Programs to Accelerate Energy-Efficient Appliances and Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de la Rue du Can, Stephane; Phadke, Amol; Leventis, Greg

    Incentive programs are an essential policy tool to move the market toward energy-efficient products. They offer a favorable complement to mandatory standards and labeling policies by accelerating the market penetration of energy-efficient products above equipment standard requirements and by preparing the market for increased future mandatory requirements. They sway purchase decisions and in some cases production decisions and retail stocking decisions toward energy-efficient products. Incentive programs are structured according to their regulatory environment, the way they are financed, by how the incentive is targeted, and by who administers them. This report categorizes the main elements of incentive programs, using case studies from the Major Economies Forum to illustrate their characteristics. To inform future policy and program design, it seeks to recognize design advantages and disadvantages through a qualitative overview of the variety of programs in use around the globe. Examples range from rebate programs administered by utilities under an Energy-Efficiency Resource Standards (EERS) regulatory framework (California, USA) to the distribution of Eco-Points that reward customers for buying efficient appliances under a government recovery program (Japan). We found that evaluations have demonstrated that financial incentives programs have greater impact when they target highly efficient technologies that have a small market share. We also found that the benefits and drawbacks of different program design aspects depend on the market barriers addressed, the target equipment, and the local market context and that no program design surpasses the others. The key to successful program design and implementation is a thorough understanding of the market and effective identification of the most important local factors hindering the penetration of energy-efficient technologies.

  9. Evaluating a digital ship design tool prototype: Designers' perceptions of novel ergonomics software.

    PubMed

    Mallam, Steven C; Lundh, Monica; MacKinnon, Scott N

    2017-03-01

    Computer-aided solutions are essential for naval architects to manage and optimize technical complexities when developing a ship's design. Although there is an array of software solutions aimed at optimizing the human element in design, practical ergonomics methodologies and technological solutions have struggled to gain widespread application in ship design processes. This paper explores how a new ergonomics technology is perceived by naval architecture students using a mixed-methods framework. Thirteen Naval Architecture and Ocean Engineering Masters students participated in the study. Overall, results found that participants perceived the software and its embedded ergonomics tools to benefit their design work, increasing their empathy for and ability to understand the work environment and work demands end-users face. However, participants questioned whether ergonomics could be practically and efficiently implemented under real-world project constraints. This revealed underlying social biases and a fundamental lack of understanding of applied ergonomics in naval architecture among engineering postgraduate students. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Evaluating hospital design from an operations management perspective.

    PubMed

    Vos, Leti; Groothuis, Siebren; van Merode, Godefridus G

    2007-12-01

    This paper describes an evaluation method for the assessment of hospital building design from the viewpoint of operations management, to assure that the building design supports efficient and effective operation of care processes now and in the future. The different steps of the method are illustrated by a case study, in which an experimental design is applied to assess the effects of the logistical concepts used, the patient mix, and technologies. The study shows that the evaluation method provides a valuable tool for assessing both the functionality of a building design and its ability to accommodate future developments in operational control.

  11. Peregrine Sustainer Motor Development

    NASA Technical Reports Server (NTRS)

    Brodell, Chuck; Franklin, Philip

    2015-01-01

    The Peregrine sounding rocket is an in-house NASA design that provides approximately 15 percent better performance than the motor it replaces. The design utilizes common materials and well-characterized architecture to reduce flight issues encountered with the current motors. It engages NASA designers, analysts, test engineers and technicians, ballisticians, and systems engineers. The in-house work and collaboration within the government provide flexibility to efficiently accommodate design and program changes as the design matures, and enhance the ability to meet schedule milestones. The effort provides a valuable tool for comparing industry costs and developing contracts, and it builds foundational knowledge for the next generation of NASA engineers.

  12. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
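A Metropolis Monte Carlo step of the kind such a molecular mechanics package might perform can be sketched for a single Lennard-Jones pair (a toy illustration in reduced units, not Atomdroid's code; the temperature, step size, and lower cutoff are arbitrary choices):

```python
import math
import random

def lj(r, eps=1.0, sigma=1.0):
    """12-6 Lennard-Jones pair energy in reduced units."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

random.seed(1)
r, kT = 2.0, 0.1                 # start far out, at low reduced temperature
for _ in range(5000):            # Metropolis Monte Carlo moves
    # floor at 0.8 keeps trial moves off the steep repulsive wall
    r_new = max(0.8, r + random.uniform(-0.05, 0.05))
    dE = lj(r_new) - lj(r)
    if dE < 0 or random.random() < math.exp(-dE / kT):
        r = r_new                # accept downhill and some uphill moves
print(round(r, 2))  # distance settles near the LJ minimum at 2**(1/6)
```

A force-field program generalizes this to many atoms and many energy terms, but the accept/reject kernel is the same.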

  13. Combustor design tool for a gas fired thermophotovoltaic energy converter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindler, K.W.; Harper, M.J.

    1995-12-31

    Recently, there has been a renewed interest in thermophotovoltaic (TPV) energy conversion. A TPV device converts radiant energy from a high temperature incandescent emitter directly into electricity by photovoltaic cells. The current Department of Energy sponsored research involves the design, construction and demonstration of a prototype TPV converter that uses a hydrocarbon fuel (such as natural gas) as the energy source. As the photovoltaic cells are designed to efficiently convert radiant energy at a prescribed wavelength, it is important that the temperature of the emitter be nearly constant over its entire surface. The U.S. Naval Academy has been tasked with the development of a small emitter (with a high emissivity) that can be maintained at 1756 K (2700 F). This paper describes the computer spreadsheet model that was developed as a tool to be used for the design of the high temperature emitter.
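The core quantity such a spreadsheet model must balance against fuel input and losses is the grey-body emission at the target temperature; a minimal sketch (the emissivity of 0.9 is an assumed value, not from the paper):

```python
SIGMA = 5.670374419e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_exitance(T, emissivity=0.9):
    """Grey-body radiant exitance M = eps * sigma * T^4, the power per
    unit emitter area the combustor must supply at steady state."""
    return emissivity * SIGMA * T ** 4

M = radiant_exitance(1756.0)     # W/m^2 at the target emitter temperature
print(round(M / 1000.0, 1), "kW/m^2")
```

At 1756 K this works out to roughly half a megawatt per square metre of emitter surface, which shows why a small emitter area and tight temperature uniformity are design drivers.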

  14. P-Hint-Hunt: a deep parallelized whole genome DNA methylation detection tool.

    PubMed

    Peng, Shaoliang; Yang, Shunyun; Gao, Ming; Liao, Xiangke; Liu, Jie; Yang, Canqun; Wu, Chengkun; Yu, Wenqiang

    2017-03-14

    An increasing number of studies have used whole-genome DNA methylation detection, one of the most important parts of epigenetics research, to find significant relationships between DNA methylation and several typical diseases, such as cancers and diabetes. In many of those studies, mapping bisulfite-treated sequences to the whole genome has been the main method for studying DNA cytosine methylation. However, most existing tools suffer from inaccuracy and long running times. In our study, we designed a new DNA methylation prediction tool ("Hint-Hunt") to solve this problem. Through an optimal complex alignment computation and Smith-Waterman matrix dynamic programming, Hint-Hunt can analyze and predict DNA methylation status. However, when Hint-Hunt predicts DNA methylation status on large-scale datasets, speed and temporal-spatial efficiency remain problems. To address the limits of Smith-Waterman dynamic programming and the low temporal-spatial efficiency, we further designed a deeply parallelized whole-genome DNA methylation detection tool ("P-Hint-Hunt") on the Tianhe-2 (TH-2) supercomputer. To the best of our knowledge, P-Hint-Hunt is the first parallel DNA methylation detection tool with high speed-up for processing large-scale datasets, and it runs on both CPUs and Intel Xeon Phi coprocessors. Moreover, we deploy and evaluate Hint-Hunt and P-Hint-Hunt on the TH-2 supercomputer at different scales. The experimental results show that our tools eliminate the deviation caused by bisulfite treatment in the mapping procedure and that the multi-level parallel program yields a 48-fold speed-up with 64 threads. P-Hint-Hunt achieves deep acceleration on the heterogeneous CPU/Intel Xeon Phi platform, exploiting the advantages of both multi-core CPUs and many-core Phi coprocessors.
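The Smith-Waterman dynamic programming kernel mentioned above can be sketched in its scalar form (the scoring parameters are illustrative; P-Hint-Hunt's parallel implementation and bisulfite-aware scoring differ):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Smith-Waterman local alignment score via dynamic programming.
    H[i][j] is the best score of any local alignment ending at a[i-1],
    b[j-1]; clamping at 0 is what makes the alignment local."""
    H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACGTCG", "ACGACG"))  # -> 9 (five matches, one mismatch)
```

The O(len(a) x len(b)) matrix fill is the hot loop that tools like P-Hint-Hunt parallelize across reads and across anti-diagonals.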

  15. Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.

    2011-01-01

    Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.

  16. Increasingly mobile: How new technologies can enhance qualitative research

    PubMed Central

    Moylan, Carrie Ann; Derr, Amelia Seraphia; Lindhorst, Taryn

    2015-01-01

    Advances in technology, such as the growth of smart phones, tablet computing, and improved access to the internet have resulted in many new tools and applications designed to increase efficiency and improve workflow. Some of these tools will assist scholars using qualitative methods with their research processes. We describe emerging technologies for use in data collection, analysis, and dissemination that each offer enhancements to existing research processes. Suggestions for keeping pace with the ever-evolving technological landscape are also offered. PMID:25798072

  17. DOVIS 2.0: An Efficient and Easy to Use Parallel Virtual Screening Tool Based on AutoDock 4.0

    DTIC Science & Technology

    2008-09-08

    under the GNU General Public License. Background Molecular docking is a computational method that predicts how a ligand interacts with a receptor... Hence, it is an important tool in studying receptor-ligand interactions and plays an essential role in drug design. Particularly, molecular docking has... libraries from OpenBabel and set up a molecular data structure as a C++ object in our program. This makes handling of molecular structures (e.g., atoms

  18. Way Beyond Widgets: Delivering Integrated Lighting Design in Actionable Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myer, Michael; Richman, Eric E.; Jones, Carol C.

    2008-08-17

    Previously, energy-efficiency strategies for commercial spaces have focused on using efficient equipment without providing specific detailed instructions. Designs by experts in their fields are an energy-efficiency product in their own right. A new national program has developed interactive application-specific lighting designs for widespread use in four major commercial sectors. This paper describes the technical basis for the solutions, the energy-efficiency and cost-savings methodology, and installations and measurement/verification to date. Lighting designs have been developed for five types of retail stores (big box, small box, grocery, specialty market, and pharmacy) and are planned for the office, healthcare, and education sectors as well. Nationally known sustainable lighting designers developed the designs using high-performance commercially available products, daylighting, and lighting controls. Input and peer review were received from stakeholders, including manufacturers, architects, utilities, energy-efficiency program sponsors (EEPS), and end-users (i.e., retailers). An interactive web tool delivers the lighting solutions and analyzes anticipated energy savings using project-specific inputs. The lighting solutions were analyzed against a reference building using the space-by-space method as allowed in the Energy Standard for Buildings Except Low-Rise Residential Buildings (ASHRAE 2004), co-sponsored by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) and the Illuminating Engineering Society of North America (IESNA). The results showed that the design vignettes ranged from a 9% to 28% reduction in allowed lighting power density. Detailed control strategies are offered to further reduce actual kilowatt-hour consumption. When used together, the lighting design vignettes and control strategies show a modeled decrease in energy consumption (kWh) of 33% to 50% below the baseline design.

  19. Front panel engineering with CAD simulation tool

    NASA Astrophysics Data System (ADS)

    Delacour, Jacques; Ungar, Serge; Mathieu, Gilles; Hasna, Guenther; Martinez, Pascal; Roche, Jean-Christophe

    1999-04-01

    The progress made recently in display technology covers many fields of application. The specification of radiance, colorimetry and lighting efficiency creates new challenges for designers. Photometric design is limited by the ability to correctly predict the result of a lighting system, which would save the costs and time of building multiple prototypes or breadboard benches. The second step of the research carried out by the company OPTIS is to propose an optimization method for the lighting system, developed in the software SPEOS. The main features required of the tool include a CAD interface, to enable fast and efficient transfer between mechanical and light design software, source modeling, a light transfer model and an optimization tool. The CAD interface is mainly a matter of data transfer and is not the subject here. Photometric simulation is efficiently achieved by using measured source encoding and simulation by the Monte Carlo method. Today, the advantages and limitations of the Monte Carlo method are well known. Noise reduction requires a long calculation time, which increases with the complexity of the display panel. A successful optimization is difficult to achieve, because each optimization pass includes a Monte Carlo simulation and therefore a long calculation time. The problem was initially defined as an engineering study. Experience shows that understanding and mastering the phenomenon of light transfer is limited by the complexity of non-sequential propagation, so the engineer must call on a simulation and optimization tool. The main requirement for efficient optimization is a quick method for simulating light transfer. Much work has been done in this area and some interesting results can be observed.
    It must be said that the Monte Carlo method wastes time calculating results and information that are not required by the simulation, and low-efficiency transfer systems cost a great deal of time. More generally, light transfer simulation can be treated efficiently when the integrated result is composed of elementary sub-results with quick, analytically calculated intersections. Two axes of research thus appear: quick integration, and quick calculation of geometric intersections. The first brings some general solutions that are also valid for multi-reflection systems. The second requires deep thinking about the intersection calculation. An interesting approach is the subdivision of space into voxels, a method of 3D division of space adapted to the objects and their locations. Experimental software has been developed to validate the method. The gain is particularly high in complex systems, and an important reduction in calculation time has been achieved.
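The voxel subdivision described above can be illustrated with a minimal sketch. This is not the SPEOS implementation; the uniform grid, the crude fixed-step ray march (a production tracer would use 3D-DDA traversal), and the example objects are all illustrative assumptions:

```python
from collections import defaultdict

class VoxelGrid:
    """Uniform spatial subdivision: each cell lists the objects whose
    bounding box overlaps it, so a ray only tests nearby objects."""
    def __init__(self, lo, hi, res):
        self.lo, self.res = lo, res
        self.cell = tuple((hi[i] - lo[i]) / res for i in range(3))
        self.cells = defaultdict(list)

    def _index(self, p):
        return tuple(min(self.res - 1, max(0, int((p[i] - self.lo[i]) / self.cell[i])))
                     for i in range(3))

    def insert(self, obj, bb_lo, bb_hi):
        i0, i1 = self._index(bb_lo), self._index(bb_hi)
        for x in range(i0[0], i1[0] + 1):
            for y in range(i0[1], i1[1] + 1):
                for z in range(i0[2], i1[2] + 1):
                    self.cells[(x, y, z)].append(obj)

    def candidates(self, origin, direction, step, t_max):
        """Crude ray march: visit cells along the ray and collect the
        objects stored there, skipping every cell the ray never enters."""
        seen, out, t = set(), [], 0.0
        while t <= t_max:
            p = tuple(origin[i] + t * direction[i] for i in range(3))
            for obj in self.cells.get(self._index(p), ()):
                if obj not in seen:
                    seen.add(obj)
                    out.append(obj)
            t += step
        return out

grid = VoxelGrid((0, 0, 0), (10, 10, 10), res=10)
grid.insert("lens", (1, 1, 1), (2, 2, 2))
grid.insert("panel", (8, 8, 8), (9, 9, 9))
# A ray along the x-axis at y = z = 1.5 only ever reaches the "lens" cells
hits = grid.candidates((0, 1.5, 1.5), (1, 0, 0), step=0.5, t_max=10)
```

Only objects sharing cells with the ray are returned for exact intersection testing, which is where the gain in complex scenes comes from.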

  20. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

    Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and life-cycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles, and vehicle performance characteristics can be determined with these tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools operated by disciplinary experts; they are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO and fast sensitivity simulations. A second-order response surface model is commonly used in RSM since, in many cases, it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
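A full second-order response surface in two variables, y = b0 + b1·x1 + b2·x2 + b11·x1² + b22·x2² + b12·x1·x2, can be fitted by ordinary least squares. A minimal sketch with made-up design points and a synthetic quadratic response (not VAB's actual analysis codes or data):

```python
import numpy as np

# Illustrative 3^2 factorial design for two coded variables
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
              [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
# Synthetic response from a known quadratic, so the fit recovers it exactly
y = 3 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0]**2 + 0.25 * X[:, 0] * X[:, 1]

def quadratic_design(X):
    """Model matrix columns: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

# Least-squares estimate of the six model coefficients
beta, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)

def predict(points):
    return quadratic_design(np.atleast_2d(np.asarray(points, float))) @ beta
```

Once fitted, the cheap polynomial `predict` stands in for the expensive disciplinary code during optimization and sensitivity sweeps.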

  1. Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, J C; Fisher, J M; Gordon, J B

    2007-10-02

    The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when remote manual computer controls are used, making them ideal candidates for software automation. The Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed.

  2. Assessment of Lightning Transients on a De-Iced Rotor Blade with Predictive Tools and Coaxial Return Measurements

    NASA Astrophysics Data System (ADS)

    Guillet, S.; Gosmain, A.; Ducoux, W.; Ponçon, M.; Fontaine, G.; Desseix, P.; Perraud, P.

    2012-05-01

    The increasing use of composite materials in aircraft primary structures has raised new issues in the field of safety of flight in lightning conditions. The consequences of this technological shift, which occurs in a parallel context of growing electrification of critical functions, are addressed by aircraft manufacturers through the enhancement of their available means of assessing lightning transients. On the one hand, simulation tools, provided an accurate description of the aircraft design is available, are today valuable assessment tools in both predictive and operative terms. On the other hand, in-house test means allow confirmation and consolidation of design-office hardening solutions. The combined use of predictive simulation tools and in-house test means offers efficient and reliable support for all aircraft developments at their various life-cycle stages. The present paper provides PREFACE research project results that illustrate the strategy introduced above on the de-icing system of the NH90 composite main rotor blade.

  3. Sonic Boom Research at NASA Dryden: Objectives and Flight Results from the Lift and Nozzle Change Effects on Tail Shock (LaNCETS) Project

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.

    2009-01-01

    The principal objective of the Supersonics Project is to develop and validate multidisciplinary physics-based predictive design, analysis and optimization capabilities for supersonic vehicles. For aircraft, the focus will be on eliminating the efficiency, environmental and performance barriers to practical supersonic flight. Previous flight projects found that a shaped sonic boom could propagate all the way to the ground (F-5 SSBD experiment) and validated design tools for forebody shape modifications (F-5 SSBD and Quiet Spike experiments). The current project, Lift and Nozzle Change Effects on Tail Shock (LaNCETS), seeks to obtain flight data to develop and validate design tools for low-boom tail shock modifications. Attempts will be made to alter the shock structure of NASA's NF-15B TN/837 by changing the lift distribution through biased canard positions, changing the plume shape by under- and over-expanding the nozzles, and changing the plume shape using thrust vectoring. Additional efforts will measure the resulting shocks with a probing aircraft (F-15B TN/836) and use the results to validate and update predictive tools. Preliminary flight results are presented and are available to provide truth data for developing and validating the CFD tools required to design low-boom supersonic aircraft.

  4. LoRTE: Detecting transposon-induced genomic variants using low coverage PacBio long read sequences.

    PubMed

    Disdero, Eric; Filée, Jonathan

    2017-01-01

    Population genomic analysis of transposable elements (TEs) has greatly benefited from recent advances in sequencing technologies. However, the short size of the reads and the propensity of transposable elements to nest in highly repeated regions of genomes limit the efficiency of bioinformatic tools when Illumina or 454 technologies are used. Fortunately, long-read sequencing technologies generating read lengths that may span the entire length of full transposons are now available. However, existing TE population genomic software was not designed to handle long reads, and the development of new dedicated tools is needed. LoRTE is the first tool able to use PacBio long-read sequences to identify transposon deletions and insertions between a reference genome and the genomes of different strains or populations. Tested against simulated and genuine Drosophila melanogaster PacBio datasets, LoRTE appears to be a reliable and broadly applicable tool for studying the dynamics and evolutionary impact of transposable elements using low-coverage, long-read sequences. LoRTE is an efficient and accurate tool for identifying structural genomic variants caused by TE insertion or deletion. LoRTE is available for download at http://www.egce.cnrs-gif.fr/?p=6422.

  5. Restructuring a Higher Education Institution: A Case Study from a Developing Country

    ERIC Educational Resources Information Center

    Sohail, M. Sadiq; Daud, Salina; Rajadurai, Jegatheesan

    2006-01-01

    Purpose: The competitive environment facing all organizations has forced many of them to choose strategies that enhance organizational effectiveness and efficiency. Re-engineering is one of the tools used in administering productivity improvements, cost control and asset management. Design/methodology/approach: This paper examines the…

  6. Appendix X. ComField Information Management System.

    ERIC Educational Resources Information Center

    Coffin, Robert W.

    This appendix outlines the ComField information management system which is designed to give the project management a comprehensive tool for decisionmaking and to free instructors from tasks of keeping current records of every student's performance, help them plan their time more efficiently for counseling students and planning instruction, and…

  7. IMPLEMENTATION OF A CAPE-OPEN COMPLIANT PROCESS SIMULATOR USING MICROSOFT'S VISUAL STUDIO.NET AND THE .NET FRAMEWORK

    EPA Science Inventory

    The United States Environmental Protection Agency is developing a Computer
    Aided Process Engineering (CAPE) software tool for the metal finishing
    industry that helps users design efficient metal finishing processes that
    are less polluting to the environment. Metal finish...

  8. Predicting Academic Success Using Admission Profiles

    ERIC Educational Resources Information Center

    Davidovitch, Nitza; Soen, Dan

    2015-01-01

    This study, conducted at a tertiary education institution in Israel, following two previous studies, was designed to deal again with a question that is a topic of debate in Israel and worldwide: Is there justification for the approach that considers restrictive university admission policies an efficient tool for predicting students' success at the…

  9. Simplified tools for evaluating domestic ventilation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maansson, L.G.; Orme, M.

    1999-07-01

    Within an International Energy Agency (IEA) project, Annex 27, experts from 8 countries (Canada, France, Italy, Japan, The Netherlands, Sweden, UK and USA) have developed simplified tools for evaluating domestic ventilation systems during the heating season. Tools for building and user aspects, thermal comfort, noise, energy, life cycle cost, reliability and indoor air quality (IAQ) have been devised. The results can be used both for dwellings at the design stage and after construction. The tools lead to immediate answers and indications about the consequences of different choices that may arise during discussion with clients. This paper presents an introduction to these tools. Example applications of the indoor air quality and energy simplified tools are also provided. The IAQ tool accounts for constant emission sources, CO2, cooking products, tobacco smoke, condensation risks, humidity levels (i.e., for judging the risk of mould and house dust mites), and pressure difference (for identifying the risk of radon or landfill spillage entering the dwelling or problems with indoor combustion appliances). An elaborated set of design parameters was worked out, resulting in about 17,000 combinations. By using multivariate analysis it was possible to reduce this to 174 combinations for IAQ. In addition, a sensitivity analysis was made using 990 combinations. The results from all the runs were used to develop a simplified tool, as well as quantifying equations relying on the design parameters. A computerized energy tool has also been developed within this project, which takes into account air tightness, climate, window airing pattern, outdoor air flow rate and heat exchange efficiency.

  10. Efficient, Multi-Scale Designs Take Flight

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.

  11. Computer-aided design of biological circuits using TinkerCell.

    PubMed

    Chandran, Deepak; Bergmann, Frank T; Sauro, Herbert M

    2010-01-01

    Synthetic biology is an engineering discipline that builds on modeling practices from systems biology and wet-lab techniques from genetic engineering. As synthetic biology advances, efficient procedures will be developed that will allow a synthetic biologist to design, analyze, and build biological networks. In this idealized pipeline, computer-aided design (CAD) is a necessary component. The role of a CAD application would be to allow efficient transition from a general design to a final product. TinkerCell is a design tool for serving this purpose in synthetic biology. In TinkerCell, users build biological networks using biological parts and modules. The network can be analyzed using one of several functions provided by TinkerCell or custom programs from third-party sources. Since best practices for modeling and constructing synthetic biology networks have not yet been established, TinkerCell is designed as a flexible and extensible application that can adjust itself to changes in the field. © 2010 Landes Bioscience

  12. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data.

    PubMed

    Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun

    2006-08-25

    We recently developed the Paired-End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads and how to map PETs to reference genome sequences correctly yet efficiently. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computational program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analysis of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz Linux machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
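The Extractor step can be illustrated with a toy function that splits a ditag read at a known internal linker. The linker sequence and 18-bp tag length below are illustrative assumptions, not PET-Tool's actual parameters:

```python
def extract_pet(read, linker, tag_len=18):
    """Locate the internal linker in a ditag read and return the
    (5' tag, 3' tag) pair, or None if the linker is missing or either
    flanking tag is shorter than tag_len."""
    i = read.find(linker)
    if i < tag_len or len(read) - (i + len(linker)) < tag_len:
        return None  # artifact: no linker, or truncated tag
    return read[i - tag_len:i], read[i + len(linker):i + len(linker) + tag_len]

# Hypothetical linker and tags, chosen so the linker occurs exactly once
tag5, linker, tag3 = "ACGTACGTACGTACGTAC", "GTCGGAGGCC", "TTGCATTGCATTGCATTG"
pets = extract_pet(tag5 + linker + tag3, linker)
```

A production extractor would additionally tolerate sequencing errors in the linker and filter on base quality, which is what the Examiner module's quality checks address.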

  13. The development of a new breast feeding assessment tool and the relationship with breast feeding self-efficacy

    PubMed Central

    Ingram, Jenny; Johnson, Debbie; Copeland, Marion; Churchill, Cathy; Taylor, Hazel

    2015-01-01

    Objective: to develop a breast feeding assessment tool to facilitate improved targeting of optimum positioning and attachment advice, and to describe the changes seen following the release of a tongue-tie. Design: development and validation of the Bristol Breastfeeding Assessment Tool (BBAT) and correlation with breast feeding self-efficacy. Setting: maternity hospital in South West England. Participants: 218 breast feeds (160 mother–infant dyads); seven midwife assessors. Findings: the tool has more explanation than other tools to remind those supporting breast-feeding women about the components of an efficient breast feed. There was good internal reliability for the final 4-item BBAT (Cronbach's alpha=0.668), and the midwives who used it showed high consistency in its use (ICC=0.782). Midwives were able to score a breast feed consistently using the BBAT and felt that it helped them advise mothers on improving positioning and attachment to make breast feeding less painful, particularly with a tongue-tied infant. The tool showed strong correlation with breast feeding self-efficacy, indicating that a more efficient breast feeding technique is associated with increased confidence in breast feeding an infant. Conclusions: the BBAT is a concise breast feeding assessment tool facilitating accurate, rapid breast feeding appraisal and targeting of breast feeding advice to mothers acquiring early breast feeding skills or those experiencing problems with an older infant. Accurate assessment is essential to ensure enhanced breast feeding efficiency and increased maternal self-confidence. Implications for practice: the BBAT could be used both clinically and in research to target advice to improve breast feeding efficacy. Further research is needed to establish its wider usefulness. PMID:25061006
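Internal-reliability figures such as the Cronbach's alpha reported above follow directly from the item scores. A sketch using the standard formula, with made-up 4-item ratings rather than the BBAT data:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_subjects, k_items). Standard formula:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Made-up ratings of 5 feeds on a 4-item scale (0-2 per item)
ratings = [[2, 2, 1, 2],
           [1, 1, 1, 1],
           [2, 1, 2, 2],
           [0, 1, 0, 0],
           [2, 2, 2, 1]]
alpha = cronbach_alpha(ratings)
```

Higher alpha indicates the items vary together, i.e. they measure a common construct such as feeding efficiency.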

  14. Polyamine conjugation of curcumin analogues toward the discovery of mitochondria-directed neuroprotective agents.

    PubMed

    Simoni, Elena; Bergamini, Christian; Fato, Romana; Tarozzi, Andrea; Bains, Sandip; Motterlini, Roberto; Cavalli, Andrea; Bolognesi, Maria Laura; Minarini, Anna; Hrelia, Patrizia; Lenaz, Giorgio; Rosini, Michela; Melchiorre, Carlo

    2010-10-14

    Mitochondria-directed antioxidants 2-5 were designed by conjugating curcumin congeners with different polyamine motifs as vehicle tools. The conjugates emerged as efficient antioxidants in mitochondria and fibroblasts and also exerted a protective role through heme oxygenase-1 activation. Notably, the insertion of a polyamine function into the curcumin-like moiety allowed efficient intracellular uptake and mitochondrial targeting; it also resulted in a significant decrease in cytotoxic effects. Compounds 2-5 are therefore promising molecules for neuroprotectant lead discovery.

  15. Fundamental Studies and Development of III-N Visible LEDs for High-Power Solid-State Lighting Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupuis, Russell

    The goal of this program is to understand in a fundamental way the impact of strain, defects, polarization, and Stokes loss, in relation to unique device structures, on the internal quantum efficiency (IQE) and efficiency droop (ED) of III-nitride (III-N) light-emitting diodes (LEDs), and to employ this understanding in the design and growth of high-efficiency LEDs capable of highly reliable, high-current, high-power operation. This knowledge will be the basis for advanced device epitaxial designs that lead to improved device performance. The primary approach is to exploit new scientific and engineering knowledge, generated through the application of a set of unique advanced growth and characterization tools, to develop new concepts in strain-, polarization-, and carrier-dynamics-engineered and low-defect materials and device designs having reduced dislocations and improved carrier collection followed by efficient photon generation. We studied the effects of crystalline defects, polarization, hole transport, electron spillover, the electron blocking layer, and the underlying layer below the multiple-quantum-well active region, and developed high-efficiency, efficiency-droop-mitigated blue LEDs with new LED epitaxial structures. We believe the new LEDs developed in this program will enable a breakthrough in the development of high-efficiency, high-power visible III-N LEDs from the violet to the green spectral region.

  16. A Summary of the NASA Design Environment for Novel Vertical Lift Vehicles (DELIVER) Project

    NASA Technical Reports Server (NTRS)

    Theodore, Colin R.

    2018-01-01

    The number of new markets and use cases being developed for vertical take-off and landing vehicles continues to explode, including the highly publicized urban air taxi and package delivery applications. There is an equally exploding variety of novel vehicle configurations and sizes being proposed to fill these new market applications. The challenge for vehicle designers is that there is currently no easy and consistent way to go from a compelling mission or use case to a vehicle that is best configured and sized for that particular mission, because the availability of accurate and validated conceptual design tools for these novel types and sizes of vehicles has not kept pace with the new markets and vehicles themselves. The Design Environment for Novel Vertical Lift Vehicles (DELIVER) project was formulated to address this vehicle design challenge by demonstrating the use of current conceptual design tools, which have been used for decades to design and size conventional rotorcraft, applied to these novel vehicle types, configurations and sizes. In addition to demonstrating the applicability of current design and sizing tools to novel vehicle configurations and sizes, DELIVER also demonstrated the addition of the key transformational technologies of noise, autonomy, and hybrid-electric and all-electric propulsion to the vehicle conceptual design process. Noise is key for community acceptance; autonomy and the ability to operate autonomously are key for efficient, reliable and safe operations; and electrification of the propulsion system is a key enabler for these new vehicle types and sizes. This paper provides a summary of the DELIVER project and shows the applicability of current conceptual design and sizing tools to the novel vehicle configurations and sizes being proposed for urban air taxi and package delivery applications.

  17. New trends in radiology workstation design

    NASA Astrophysics Data System (ADS)

    Moise, Adrian; Atkins, M. Stella

    2002-05-01

    In radiology workstation design, the race to add more features is now morphing into an iterative, user-centric design process focused on ergonomics and usability. The extent of a radiology workstation's feature list used to be one of the most significant factors in a Picture Archiving and Communication System (PACS) vendor's ability to sell it. Not anymore: the feature list is now very much the same among the major players in the PACS market, and how these features work together is what distinguishes different radiology workstations. Integration (with PACS/Radiology Information System (RIS) systems, with 3D tools, reporting tools, etc.), usability (user-specific preferences, advanced display protocols, smart activation of tools, etc.) and efficiency (the output a radiologist can generate with the workstation) are now core factors in selecting a workstation. This paper discusses these new trends in radiology workstation design. We demonstrate the importance of interaction between the PACS vendor (software engineers) and the customer (radiologists) during radiology workstation design. We focus on iterative aspects of workstation development, such as the presentation of early prototypes to as many representative users as possible during the software development cycle, and present the results of a survey of eight radiologists on designing a radiology workstation.

  18. Intelligent control system based on ARM for lithography tool

    NASA Astrophysics Data System (ADS)

    Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan

    2014-08-01

    The control system of a traditional lithography tool is based on a PC and an MCU. The PC handles complex algorithms and human-computer interaction, and communicates with the MCU via a serial port; the MCU controls motors, electromagnetic valves, etc. This mode has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an embedded intelligent control system for a lithography tool, based on ARM, is presented. The control system uses the S5PV210 as its processor, taking over the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Using Android 4.0.3 as the operating system, the equipment provides a clean, easy UI that makes control more user-friendly, and implements remote control and debugging, pushing video information about the product over the network. As a result, it is convenient for the equipment vendor to provide technical support to users. Finally, compared with a traditional lithography tool, this design removes the PC, using hardware resources efficiently and reducing cost and volume. Introducing an embedded OS and concepts from the Internet of Things into the design of lithography tools may become a development trend.

  19. Desktop microsimulation: a tool to improve efficiency in the medical office practice.

    PubMed

    Montgomery, James B; Linville, Beth A; Slonim, Anthony D

    2013-01-01

    Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
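The discrete-event approach can be sketched as a priority queue of room-free times. This toy model (exponential arrivals, fixed service time, FIFO queueing; all parameters illustrative, not from the study) shows how adding an exam room changes waits and utilization:

```python
import heapq
import random

def simulate(n_patients, n_rooms, mean_interarrival, service_time, seed=1):
    """Toy discrete-event model: patients arrive at random, wait for the
    first free exam room (FIFO), occupy it for service_time, then leave.
    Returns (room utilization, mean patient wait)."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at = [0.0] * n_rooms          # min-heap of next-free times per room
    heapq.heapify(free_at)
    waits = []
    for arr in arrivals:
        start = max(arr, free_at[0])   # wait if every room is busy
        waits.append(start - arr)
        heapq.heapreplace(free_at, start + service_time)
    makespan = max(free_at)
    utilization = n_patients * service_time / (n_rooms * makespan)
    return utilization, sum(waits) / len(waits)

# Same patient stream, two room counts: a fourth room can only shorten waits
u3, w3 = simulate(200, n_rooms=3, mean_interarrival=5.0, service_time=12.0)
u4, w4 = simulate(200, n_rooms=4, mean_interarrival=5.0, service_time=12.0)
```

Comparing scenarios on identical arrival streams is what lets a model like this inform space-allocation decisions before occupancy.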

  20. Toward a Real-Time Measurement-Based System for Estimation of Helicopter Engine Degradation Due to Compressor Erosion

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Simo, Donald L.

    2007-01-01

    This paper presents a preliminary demonstration of an automated health assessment tool capable of real-time on-board operation using existing engine control hardware. The tool allows operators to discern how rapidly individual turboshaft engines are degrading. As the compressor erodes, performance is lost, and with it the ability to generate power. Thus, such a tool would provide an instant assessment of the engine's fitness to perform a mission and would help to pinpoint any abnormal wear or performance anomalies before they became serious, thereby decreasing uncertainty and enabling improved maintenance scheduling. The research described in the paper utilized test stand data from a T700-GE-401 turboshaft engine that underwent sand-ingestion testing to scale a model-based compressor efficiency degradation estimation algorithm. This algorithm was then applied to real-time Health Usage and Monitoring System (HUMS) data from a T700-GE-701C to track compressor efficiency on-line. The approach uses an optimal estimator called a Kalman filter, designed to estimate the compressor efficiency using only data from the engine's sensors as input.
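A scalar Kalman filter tracking a slowly drifting efficiency from noisy sensor readings can be sketched as follows. The random-walk state model and the synthetic erosion data are illustrative assumptions, not the T700 engine model:

```python
import random

def kalman_track(measurements, q=1e-6, r=1e-4, x0=1.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state x: predict with
    process-noise variance q, then correct with measurement-noise variance r."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p = p + q                  # predict: efficiency drifts slowly
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # correct with the sensor reading
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Synthetic truth: efficiency eroding linearly from 0.85; noisy sensor
rng = random.Random(0)
true_eff = [0.85 - 1e-4 * i for i in range(500)]
readings = [e + rng.gauss(0.0, 0.01) for e in true_eff]
est = kalman_track(readings, x0=0.85)
```

The filtered trace follows the erosion trend while suppressing sensor noise, which is the behavior an on-board degradation tracker needs.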

  1. Engineering Design Tools for Shape Memory Alloy Actuators: CASMART Collaborative Best Practices and Case Studies

    NASA Technical Reports Server (NTRS)

    Wheeler, Robert W.; Benafan, Othmane; Gao, Xiujie; Calkins, Frederick T; Ghanbari, Zahra; Hommer, Garrison; Lagoudas, Dimitris; Petersen, Andrew; Pless, Jennifer M.; Stebner, Aaron P.; et al.

    2016-01-01

    The primary goal of the Consortium for the Advancement of Shape Memory Alloy Research and Technology (CASMART) is to enable the design of revolutionary applications based on shape memory alloy (SMA) technology. In order to help realize this goal and reduce the development time and required experience for the fabrication of SMA actuation systems, several modeling tools have been developed for common actuator types and are discussed herein along with case studies, which highlight the capabilities and limitations of these tools. Due to their ability to sustain high stresses and recover large deformations, SMAs have many potential applications as reliable, lightweight, solid-state actuators. Their advantage over classical actuators can be further improved when the actuator geometry is modified to fit the specific application. In this paper, three common actuator designs are studied: wires, which are lightweight, low-profile, and easily implemented; springs, which offer actuation strokes upwards of 200 at reduced mechanical loads; and torque tubes, which can provide large actuation forces in small volumes and develop a repeatable zero-load actuation response (known as the two-way shape memory effect). The modeling frameworks, which have been implemented in the design tools, are developed for each of these frequently used SMA actuator types. In order to demonstrate the versatility and flexibility of the presented design tools, as well as validate their modeling frameworks, several design challenges were completed. These case studies include the design and development of an active hinge for the deployment of a solar array or foldable space structure, an adaptive solar array deployment and positioning system, a passive air temperature controller for regulating flow temperatures inside a jet engine, and a redesign of the Corvette active hatch, which allows for pressure equalization of the car interior.
    For each of the presented case studies, a prototype or proof-of-concept was fabricated, and the experimental results and lessons learned are discussed. This analysis presents a collection of CASMART collaborative best practices so that readers can utilize the available design tools and understand their modeling principles. These design tools, which are based on engineering models, can provide first-order optimal designs and are a basic and efficient method for either demonstrating design feasibility or refining design parameters. Although the design and integration of an SMA-based actuation system always requires application- and environment-specific engineering considerations, common modeling tools can significantly reduce the investment required for actuation system development and provide valuable engineering insight.

  2. A web-based 3D visualisation and assessment system for urban precinct scenario modelling

    NASA Astrophysics Data System (ADS)

    Trubka, Roman; Glackin, Stephen; Lade, Oliver; Pettit, Chris

    2016-07-01

    Recent years have seen an increasing number of spatial tools and technologies for enabling better decision-making in the urban environment. They have largely arisen because of the need for cities to be more efficiently planned to accommodate growing populations while mitigating urban sprawl, and also because of innovations in rendering data in 3D being well suited for visualising the urban built environment. In this paper we review a number of systems that are better known and more commonly used in the field of urban planning. We then introduce Envision Scenario Planner (ESP), a web-based 3D precinct geodesign, visualisation and assessment tool, developed using Agile and Co-design methods. We provide a comprehensive account of the tool, beginning with a discussion of its design and development process and concluding with an example use case and a discussion of the lessons learned in its development.

  3. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we have developed a suite of analytical tools to support an integrated data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available, along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  4. The X-windows interactive navigation data editor

    NASA Technical Reports Server (NTRS)

    Rinker, G. C.

    1992-01-01

    A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.

  5. Graphical Tests for Power Comparison of Competing Designs.

    PubMed

    Hofmann, H; Follett, L; Majumder, M; Cook, D

    2012-12-01

    Lineups have been established as tools for visual testing similar to standard statistical inference tests, allowing us to evaluate the validity of graphical findings in an objective manner. In simulation studies lineups have been shown to be efficient: the power of visual tests is comparable to classical tests while being much less stringent in terms of the distributional assumptions made. This makes lineups versatile, yet powerful, tools in situations where conditions for regular statistical tests are not or cannot be met. In this paper we introduce lineups as a tool for evaluating the power of competing graphical designs. We highlight some of the theoretical properties and then show results from two studies evaluating competing designs; both studies are designed to go to the limits of our perceptual abilities to highlight differences between designs. We use both accuracy and speed of evaluation as measures of a successful design. The first study compares the choice of coordinate system: polar versus Cartesian coordinates. The results show strong support in favor of Cartesian coordinates for finding fast and accurate answers when spotting patterns. The second study is aimed at finding shift differences between distributions. Both studies are motivated by data problems that we have recently encountered, and use simulated data to evaluate the plot designs under controlled conditions. Amazon Mechanical Turk (MTurk) is used to conduct the studies. The lineups provide an effective mechanism for objectively evaluating plot designs.
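    The lineup protocol itself is simple to state in code. The sketch below is a hypothetical illustration of the idea, not the authors' implementation: the plot of the real data is hidden among null plots, and the power of a design is estimated as the proportion of observers who single it out.

```python
# Lineup protocol sketch (illustrative, not the authors' code): hide the
# real data among m-1 null datasets and score observer identifications.
import random

def make_lineup(real_data, null_generator, m=20, rng=None):
    """Return (panels, true_position): real data hidden among m-1 nulls."""
    rng = rng or random.Random(0)
    panels = [null_generator(rng) for _ in range(m - 1)]
    pos = rng.randrange(m)
    panels.insert(pos, real_data)
    return panels, pos

def visual_power(picks, true_positions):
    """Proportion of observers who identified the true-data panel."""
    hits = sum(p == t for p, t in zip(picks, true_positions))
    return hits / len(picks)
```

With m = 20 panels, an observer guessing at random picks the true panel with probability 1/20, so identification rates well above 5% indicate a detectable effect.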

  6. Energy-Saving Melting and Revert Reduction Technology (E-SMARRT): Use of Laser Engineered Net Shaping for Rapid Manufacturing of Dies with Protective Coatings and Improved Thermal Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brevick, Jerald R.

    2014-06-13

    In the high pressure die casting process, molten metal is introduced into a die cavity at high pressure and velocity, enabling castings of thin wall section and complex geometry to be obtained. Traditional die materials have been hot work die steels, commonly H13. Manufacture of the dies involves machining the desired geometry from monolithic blocks of annealed tool steel, heat treating to desired hardness and toughness, and final machining, grinding and polishing. The die is fabricated with internal water cooling passages created by drilling. These materials and fabrication methods have been used for many years; however, there are limitations. Tool steels have relatively low thermal conductivity, and as a result, it takes time to remove the heat from the tool steel via the drilled internal water cooling passages. Furthermore, the low thermal conductivity generates large thermal gradients at the die cavity surfaces, which ultimately leads to thermal fatigue cracking on the surfaces of the die steel. The high die surface temperatures also promote the metallurgical bonding of the aluminum casting alloy to the surface of the die steel (soldering). In terms of process efficiency, these tooling limitations reduce the number of die castings that can be made per unit time by increasing the cycle time required for cooling, and increase the downtime and cost to replace tooling which has failed either by soldering or by thermal fatigue cracking (heat checking). The objective of this research was to evaluate the feasibility of designing, fabricating, and testing high pressure die casting tooling having properties equivalent to H13 on the surface in contact with molten casting alloy (for high-temperature, high-velocity molten metal erosion resistance) but with the ability to conduct heat rapidly to interior water cooling passages. A layered bimetallic tool design was selected, and the design evaluated for thermal and mechanical performance via finite element analysis.
H13 was retained as the exterior layer of the tooling, while commercially pure copper was chosen for the interior structure of the tooling. The tooling was fabricated by traditional machining of the copper substrate, and H13 powder was deposited on the copper via the Laser Engineered Net Shape (LENS™) process. The H13 deposition layer was then final machined by traditional methods. Two tooling components were designed and fabricated: a thermal fatigue test specimen, and a core for a commercial aluminum high pressure die casting tool. The bimetallic thermal fatigue specimen demonstrated promising performance during testing, and the test results were used to improve the design and LENS™ deposition methods for subsequent manufacture of the commercial core. Results of the thermal finite element analysis for the thermal fatigue test specimen indicate that it has the ability to lose heat to the internal water cooling passages, and to external spray cooling, significantly faster than a monolithic H13 thermal fatigue sample. The commercial core is currently in the final stages of fabrication, and will be evaluated in an actual production environment at Shiloh Die Casting. In this research, the feasibility of designing and fabricating copper/H13 bimetallic die casting tooling via LENS™ processing, for the purpose of improving die casting process efficiency, is demonstrated.
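    The motivation for the copper core can be made concrete with a thermal diffusivity comparison. The property values below are typical handbook-style figures, assumed for illustration rather than taken from this study:

```python
# Why a copper core cools faster: thermal diffusivity comparison, a
# back-of-the-envelope sketch using handbook-style property values
# (assumptions, not measurements from this study).
def thermal_diffusivity(k_W_mK, rho_kg_m3, cp_J_kgK):
    """alpha = k / (rho * cp), in m^2/s."""
    return k_W_mK / (rho_kg_m3 * cp_J_kgK)

alpha_h13 = thermal_diffusivity(24.0, 7800.0, 460.0)   # H13 tool steel
alpha_cu = thermal_diffusivity(390.0, 8960.0, 385.0)   # pure copper
ratio = alpha_cu / alpha_h13
```

Copper's diffusivity comes out roughly 17 times that of H13, which is why a thin H13 shell over a copper substrate can move heat to the cooling passages far faster than monolithic tool steel.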

  7. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored. There is efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as TOPEX and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.
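    The core idea, retrieval across either the spatial or the time domain, can be sketched in a few lines. The class below is a hypothetical illustration of that access pattern, not the Stackfile API:

```python
# Minimal sketch of measurement storage with retrieval across either the
# time domain or the spatial domain (illustrative only; not Stackfile).
class MeasurementStore:
    def __init__(self):
        self.records = []  # (time, lat, lon, value) tuples

    def add(self, t, lat, lon, value):
        self.records.append((t, lat, lon, value))

    def by_time(self, t0, t1):
        """All records in the time window [t0, t1]."""
        return [r for r in self.records if t0 <= r[0] <= t1]

    def by_region(self, lat0, lat1, lon0, lon1):
        """All records in the lat/lon bounding box."""
        return [r for r in self.records
                if lat0 <= r[1] <= lat1 and lon0 <= r[2] <= lon1]
```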

  8. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, and especially to 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as very large packages containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use, with complicated user interfaces required to accommodate long feature lists. These complex packages aim to provide a powerful tool for a specific group of specialists, but most of their features are not required by the growing population of average point cloud users. In addition to their complexity and high cost, they generally rely on expensive, modern hardware and are compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts and are not willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing, designed to accommodate the needs of the average point cloud user. To reduce cost and complexity, our approach focuses on one functionality at a time, in contrast with most available software and tools, which aim to solve as many problems as possible at once. This simple, user-oriented design improves the user experience and lets us optimize our methods to create efficient software. In this paper we introduce the Pointo family, a series of connected programs that provide easy-to-use tools with a simple design for different point cloud processing requirements.
PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotations and documentation to the point clouds.

  9. PrecisePrimer: an easy-to-use web server for designing PCR primers for DNA library cloning and DNA shuffling.

    PubMed

    Pauthenier, Cyrille; Faulon, Jean-Loup

    2014-07-01

    PrecisePrimer is a web-based primer design tool made to assist experimentalists in repetitive primer design tasks such as preparing, cloning and shuffling DNA libraries. Unlike other popular primer design tools, it is conceived to generate primer libraries with popular PCR polymerase buffers offered as pre-set options. PrecisePrimer is also meant to design primers in batches, such as for DNA library creation or DNA shuffling experiments, and to have the simplest interface possible. It integrates the most up-to-date melting-temperature algorithms, validated with experimental data and cross-validated against other computational tools. We generated a library of primers for the extraction and cloning of 61 genes from a yeast genomic DNA extract using default parameters. All primer pairs efficiently amplified their target without any optimization of the PCR conditions. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
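    To make the melting-temperature step concrete: the server integrates up-to-date, experimentally validated Tm models, but even the classic Wallace rule (Tm = 2(A+T) + 4(G+C), valid for short oligonucleotides) shows the kind of calculation involved. The function below implements that simple rule only, as an illustration, not the server's algorithm:

```python
# Primer melting-temperature estimate via the classic Wallace rule
# (Tm = 2(A+T) + 4(G+C), short oligos only). Shown for illustration;
# PrecisePrimer uses more elaborate, validated Tm models.
def wallace_tm(primer: str) -> float:
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2.0 * at + 4.0 * gc

tm = wallace_tm("ATGCATGCATGC")  # 6 A/T + 6 G/C -> 2*6 + 4*6 = 36 degC
```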

  10. Using naturalistic driving films as a design tool for investigating driver requirements in HMI design for ADAS.

    PubMed

    Wang, Minjuan; Sun, Dong; Chen, Fang

    2012-01-01

    In recent years many naturalistic driving projects have been conducted, such as the 100-Car Project (a naturalistic driving study in the United States), EuroFOT (European Large-Scale Field Operational Tests on Vehicle Systems), and SeMiFOT (Sweden-Michigan Naturalistic Field Operational Test). However, these valuable naturalistic driving data have not been applied to Human-Machine Interaction (HMI) design for Advanced Driver Assistance Systems (ADAS), even though a good HMI design for ADAS requires a deep understanding of the driving environment and of the interactions between the driving car and other road users in different situations. The results demonstrated the benefits of using naturalistic driving films as a means of enhancing focus group discussion for better understanding drivers' needs and traffic environment constraints. The method provided an efficient tool for designers to gain inside knowledge about driving and the needs for information presentation. Recommendations for how to apply this method are discussed in the paper.

  11. High performance TWT development for the microwave power module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whaley, D.R.; Armstrong, C.M.; Groshart, G.

    1996-12-31

    Northrop Grumman's ongoing development of microwave power modules (MPM) provides microwave power at various power levels, frequencies, and bandwidths for a variety of applications. Present-day requirements for the vacuum power booster traveling wave tubes of the microwave power module are becoming increasingly demanding, necessitating further enhancement of tube performance. The MPM development program at Northrop Grumman is designed specifically to meet this need through the construction and test of a series of new tubes aimed at verifying computation and reaching high-efficiency design goals. Tubes under test incorporate several different helix designs, as well as varying electron gun and magnetic confinement configurations. Current efforts also include further development of state-of-the-art TWT modeling and computational methods at Northrop Grumman, incorporating new, more accurate models into existing design tools and developing new tools to be used in all aspects of traveling wave tube design. The current status of the Northrop Grumman MPM TWT development program will be presented.

  12. Development of a Mobile Tool That Semiautomatically Screens Patients for Stroke Clinical Trials.

    PubMed

    Spokoyny, Ilana; Lansberg, Maarten; Thiessen, Rosita; Kemp, Stephanie M; Aksoy, Didem; Lee, YongJae; Mlynash, Michael; Hirsch, Karen G

    2016-10-01

    Despite several national coordinated research networks, enrollment in many cerebrovascular trials remains challenging. An electronic tool was needed that would improve the efficiency and efficacy of screening for multiple simultaneous acute clinical stroke trials by automating the evaluation of inclusion and exclusion criteria, improving screening procedures and streamlining the communication process between the stroke research coordinators and the stroke clinicians. A multidisciplinary group consisting of physicians, study coordinators, and biostatisticians designed and developed an electronic clinical trial screening tool on a HIPAA (Health Insurance Portability and Accountability Act)-compliant platform. A web-based tool was developed that uses branch logic to determine eligibility for simultaneously enrolling clinical trials and automatically notifies the study coordinator teams about eligible patients. After 12 weeks of use, 225 surveys were completed, and 51 patients were enrolled in acute stroke clinical trials. Compared with the 12 weeks before implementation of the tool, there was an increase in enrollment from 16.5% of patients screened to 23.4% of patients screened (P<0.05). Clinicians and coordinators reported increased satisfaction with the process and improved ease of screening. We created a semiautomated electronic screening tool that uses branch logic to screen patients for stroke clinical trials. The tool has improved efficiency and efficacy of screening, and it could be adapted for use at other sites and in other medical fields. © 2016 American Heart Association, Inc.
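    The branch-logic idea can be sketched simply: each trial is a list of predicate checks, and a patient is flagged for every trial whose criteria all pass. The trial names and thresholds below are hypothetical illustrations, not the actual study criteria:

```python
# Branch-logic eligibility screening, sketched. Trial names and criteria
# below are hypothetical, for illustration only.
def eligible_trials(patient, trials):
    """Return names of trials whose criteria all pass for this patient."""
    return [name for name, criteria in trials.items()
            if all(check(patient) for check in criteria)]

trials = {
    "TRIAL-A": [lambda p: p["age"] >= 18,
                lambda p: p["hours_since_onset"] <= 6],
    "TRIAL-B": [lambda p: p["age"] >= 18,
                lambda p: p["nihss"] >= 6],
}
hits = eligible_trials({"age": 72, "hours_since_onset": 3, "nihss": 4}, trials)
# only TRIAL-A's criteria all pass for this patient
```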

  13. The Trojan. [supersonic transport

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Trojan is the culmination of thousands of engineering person-hours by the Cones of Silence Design Team. The goal was to design an economically and technologically viable supersonic transport. The Trojan is the embodiment of the latest engineering tools and technology necessary for such an advanced aircraft. The efficient design of the Trojan allows for supersonic cruise at Mach 2.0 for 5,200 nautical miles, carrying 250 passengers. The per-aircraft price is placed at $200 million, making the Trojan a very realistic solution for tomorrow's transportation needs. The following is a detailed study of the driving factors that determined the Trojan's design.

  14. New multivariable capabilities of the INCA program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1989-01-01

    The INteractive Controls Analysis (INCA) program was developed at NASA's Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of control systems, specifically spacecraft control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. The INCA program was initially developed as a comprehensive classical design analysis tool for small and large order control systems. The latest version of INCA, expected to be released in February of 1990, was expanded to include the capability to perform multivariable controls analysis and design.

  15. Control system design and analysis using the INteractive Controls Analysis (INCA) program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.

  16. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance, and material property tailorability. To fully exploit the capability of composites, it is necessary to know the load-carrying capacity of the parts made from them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon- and component-level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon-level tests to fully characterize their behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools that can predict the complex failure mechanisms accurately, reducing the cost to the associated computational expenses and yielding significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must match that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of a virtual tool are the savings in time and money, so computational efficiency is one of its most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic, and fatigue conditions, and a good virtual testing tool should make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool that can model the progressive failure of composite laminates under different quasi-static loading conditions.
The analysis tool is validated by comparing the simulations against experiments for a selected number of quasi-static loading cases.

  17. Novel inter and intra prediction tools under consideration for the emerging AV1 video codec

    NASA Astrophysics Data System (ADS)

    Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil

    2017-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-generation codec, AV1, in a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.
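    Combined inter-intra prediction can be sketched as a per-pixel weighted blend of the two predictors. The weights below are arbitrary illustrative values, not AV1's actual smooth weighting tables:

```python
# Combined inter-intra prediction, sketched: a position-dependent weighted
# blend of an intra predictor and an inter predictor. This mirrors the idea
# of AV1's compound inter-intra modes but is an illustration, not the
# codec's actual weighting tables.
def blend_inter_intra(inter_row, intra_row, weights):
    """Per-pixel weighted average; weights in [0, 1] favor intra near the
    predicted edge and inter elsewhere."""
    return [round(w * a + (1.0 - w) * b)
            for a, b, w in zip(intra_row, inter_row, weights)]

pred = blend_inter_intra([100, 100, 100, 100],   # inter predictor row
                         [60, 60, 60, 60],       # intra predictor row
                         [1.0, 0.75, 0.25, 0.0]) # blend weights
# first pixel is fully intra (60), last fully inter (100)
```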

  18. Development of the electric vehicle analyzer

    NASA Astrophysics Data System (ADS)

    Dickey, Michael R.; Klucz, Raymond S.; Ennix, Kimberly A.; Matuszak, Leo M.

    1990-06-01

    The increasing technological maturity of high power (greater than 20 kW) electric propulsion devices has led to renewed interest in their use as a means of efficiently transferring payloads between earth orbits. Several systems and architecture studies have identified the potential cost benefits of high performance Electric Orbital Transfer Vehicles (EOTVs). These studies led to the initiation of the Electric Insertion Transfer Experiment (ELITE) in 1988. Managed by the Astronautics Laboratory, ELITE is a flight experiment designed to sufficiently demonstrate key technologies and options to pave the way for the full-scale development of an operational EOTV. An important consideration in the development of the ELITE program is the capability of available analytical tools to simulate the orbital mechanics of a low thrust, electric propulsion transfer vehicle. These tools are necessary not only for ELITE mission planning exercises but also for continued, efficient, accurate evaluation of DoD space transportation architectures which include EOTVs. This paper presents such a tool: the Electric Vehicle Analyzer (EVA).
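    The kind of orbital mechanics such a tool must capture can be illustrated with the standard Edelbaum approximation for the delta-v of a continuous low-thrust transfer between circular orbits with a plane change. This is a textbook first-order formula, shown as an example of the analysis involved, not necessarily EVA's implementation:

```python
# Edelbaum approximation for low-thrust circular-to-circular transfer
# delta-v with plane change: a standard textbook formula, shown as an
# illustration (not necessarily EVA's method).
import math

MU_EARTH = 3.986004418e14  # Earth gravitational parameter, m^3/s^2

def circular_speed(radius_m):
    return math.sqrt(MU_EARTH / radius_m)

def edelbaum_dv(r0_m, r1_m, delta_incl_rad=0.0):
    """dv = sqrt(v0^2 - 2*v0*v1*cos(pi/2 * di) + v1^2), di in radians."""
    v0, v1 = circular_speed(r0_m), circular_speed(r1_m)
    return math.sqrt(v0**2 - 2.0*v0*v1*math.cos(math.pi/2 * delta_incl_rad) + v1**2)

# Coplanar LEO (7000 km radius) to GEO (42164 km radius):
dv = edelbaum_dv(7.0e6, 42.164e6)  # reduces to v0 - v1 when coplanar
```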

  19. Kernel based machine learning algorithm for the efficient prediction of type III polyketide synthase family of proteins.

    PubMed

    Mallika, V; Sivakumar, K C; Jaichand, S; Soniya, E V

    2010-07-13

    Type III polyketide synthases (PKS) are a family of proteins considered to have significant roles in the biosynthesis of various polyketides in plants, fungi and bacteria. As these proteins show positive effects on human health, research on this particular protein family is increasing. Developing a tool to identify the probability of a sequence being a type III polyketide synthase will minimize time consumption and manpower efforts. In this approach, we have designed and implemented PKSIIIpred, a high-performance prediction server for type III PKS in which the classifier is a Support Vector Machine (SVM). Based on the limited training dataset, the tool efficiently predicts the type III PKS superfamily of proteins with high sensitivity and specificity. PKSIIIpred is available at http://type3pks.in/prediction/. We expect that this tool may serve as a useful resource for type III PKS researchers. Work is currently in progress to further improve prediction accuracy by including more sequence features in the training dataset.
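    A typical first step for such a sequence classifier is turning a variable-length protein sequence into a fixed-length feature vector; amino-acid composition is the simplest such encoding. The sketch below illustrates that step only (the paper's actual feature set is not described in this abstract):

```python
# Amino-acid composition: the simplest fixed-length feature vector that can
# feed a sequence classifier such as an SVM. Illustrative only; not the
# PKSIIIpred feature set.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq: str):
    """Fraction of each of the 20 standard amino acids, in a fixed order."""
    s = seq.upper()
    n = max(len(s), 1)
    return [s.count(a) / n for a in AMINO_ACIDS]

vec = aa_composition("MKTAYIAKQR")  # 20-dimensional feature vector
```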

  20. Automatic design of conformal cooling channels in injection molding tooling

    NASA Astrophysics Data System (ADS)

    Zhang, Yingming; Hou, Binkui; Wang, Qian; Li, Yang; Huang, Zhigao

    2018-02-01

    The generation of the cooling system plays an important role in injection molding design. A conformal cooling system can effectively improve molding efficiency and product quality. This paper provides a generic approach for building conformal cooling channels. The centrelines of these channels are generated in two steps. First, we extract conformal loops based on the geometric information of the product. Second, centrelines in a spiral shape are built by blending these loops. We devise algorithms to implement the entire design process. A case study verifies the feasibility of this approach.
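    The two-step construction can be sketched geometrically: given two conformal loops sampled at matching parameter values, a spiral centreline is obtained by sweeping around the loops while blending from one to the other. The code below is an illustrative toy version of this blending, not the paper's algorithm:

```python
# Toy spiral-from-loops blending (illustrative only; not the paper's
# algorithm): sweep around two matched loops while interpolating between
# them, producing a spiral centreline.
import math

def spiral_from_loops(loop_a, loop_b, turns=1):
    """loop_a/loop_b: lists of (x, y, z) points sampled at matching
    parameter values. Returns points sweeping from loop_a to loop_b."""
    n = len(loop_a)
    pts = []
    for i in range(n * turns + 1):
        t = i / (n * turns)      # blend factor, 0 -> 1 over the sweep
        a = loop_a[i % n]
        b = loop_b[i % n]
        pts.append(tuple((1 - t) * ca + t * cb for ca, cb in zip(a, b)))
    return pts

def circle(r, z, n=16):
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n), z) for k in range(n)]

path = spiral_from_loops(circle(10.0, 0.0), circle(10.0, 5.0))
```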

  1. Rapid Geometry Creation for Computer-Aided Engineering Parametric Analyses: A Case Study Using ComGeom2 for Launch Abort System Design

    NASA Technical Reports Server (NTRS)

    Hawke, Veronica; Gage, Peter; Manning, Ted

    2007-01-01

    ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.

  2. Factorial Experiments: Efficient Tools for Evaluation of Intervention Components

    PubMed Central

    Collins, Linda M.; Dziak, John J.; Kugler, Kari C.; Trail, Jessica B.

    2014-01-01

    Background An understanding of the individual and combined effects of a set of intervention components is important for moving the science of preventive medicine interventions forward. This understanding can often be achieved in an efficient and economical way via a factorial experiment, in which two or more independent variables are manipulated. The factorial experiment is a complement to the randomized controlled trial (RCT); the two designs address different research questions. Purpose This article offers an introduction to factorial experiments aimed at investigators trained primarily in the RCT. Method The factorial experiment is compared and contrasted with other experimental designs used commonly in intervention science to highlight where each is most efficient and appropriate. Results Several points are made: factorial experiments make very efficient use of experimental subjects when the data are properly analyzed; a factorial experiment can have excellent statistical power even if it has relatively few subjects per experimental condition; and when conducting research to select components for inclusion in a multicomponent intervention, interactions should be studied rather than avoided. Conclusions Investigators in preventive medicine and related areas should begin considering factorial experiments alongside other approaches. Experimental designs should be chosen from a resource management perspective, which states that the best experimental design is the one that provides the greatest scientific benefit without exceeding available resources. PMID:25092122
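    The efficiency claim above is easy to demonstrate: in a 2^k factorial design every subject contributes to the estimate of every main effect. The sketch below builds the 2^3 design matrix and estimates main effects from invented toy cell means (not data from the article):

```python
# 2^k factorial design sketch: build the full design matrix and estimate
# main effects. Cell means below are invented toy values, not the
# article's data.
from itertools import product

def full_factorial(k):
    """All 2^k conditions as tuples of -1/+1 factor levels."""
    return list(product((-1, 1), repeat=k))

def main_effect(conditions, means, factor):
    """Difference between mean outcome at +1 and at -1 on one factor;
    every condition (hence every subject) enters this estimate."""
    hi = [m for c, m in zip(conditions, means) if c[factor] == 1]
    lo = [m for c, m in zip(conditions, means) if c[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

conds = full_factorial(3)                 # 8 experimental conditions
means = [10, 12, 10, 12, 14, 16, 14, 16]  # toy cell means
```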

  3. Course Management Systems in Higher Education: Understanding Student Experiences

    ERIC Educational Resources Information Center

    Yuen, Allan; Fox, Robert; Sun, Angie; Deng, Liping

    2009-01-01

    Purpose: The course management system (CMS), as an evolving tool and innovation, is increasingly used to promote the quality, efficiency and flexibility of teaching and learning in higher education. This paper aims to examine students' experiences of CMSs across faculties at a comprehensive university in Hong Kong. Design/methodology/approach:…

  4. Uses of Metaphors & Imagery in Counseling. Instructor's Manual.

    ERIC Educational Resources Information Center

    Gladding, Samuel T.

    This document presents an instructor's manual designed to accompany the videotape, "Uses of Metaphors and Imagery in Counseling," a tool to teach beginning and experienced counselors how to more efficiently help their clients by focusing on the use of non-literal language and thoughts (i.e., metaphors and images). The format and content of the…

  5. SNAD: Sequence Name Annotation-based Designer.

    PubMed

    Sidorov, Igor A; Reshetov, Denis A; Gorbalenya, Alexander E

    2009-08-14

    A growing diversity of biological data is tagged with unique identifiers (UIDs) associated with polynucleotides and proteins to ensure efficient computer-mediated data storage, maintenance, and processing. These identifiers, which are not informative for most people, are often substituted by biologically meaningful names in various presentations to facilitate the utilization and dissemination of sequence-based knowledge. This substitution is commonly done manually, which can be a tedious exercise prone to mistakes and omissions. Here we introduce SNAD (Sequence Name Annotation-based Designer), which mediates automatic conversion of sequence UIDs (associated with a multiple alignment or phylogenetic tree, or supplied as a plain text list) into biologically meaningful names and acronyms. This conversion is directed by precompiled or user-defined templates that exploit the wealth of annotation available in cognate entries of external databases. Using examples, we demonstrate how this tool can be used to generate names for practical purposes, particularly in virology. A tool for controllable annotation-based conversion of sequence UIDs into biologically meaningful names and acronyms has been developed and placed into service, fostering links between the quality of sequence annotation and the efficiency of communication and knowledge dissemination among researchers.
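    The conversion SNAD automates can be sketched as template-driven renaming against an annotation table. The accessions, field names, and template below are hypothetical, for illustration only:

```python
# Template-driven UID-to-name conversion, sketched. Accessions, annotation
# fields, and the template are hypothetical illustrations, not SNAD's
# actual templates or database lookups.
def rename_uids(uids, annotations, template="{organism}_{gene}"):
    """Map each UID to a name built from its annotation record;
    UIDs without annotation are kept as-is."""
    return [template.format(**annotations[u]) if u in annotations else u
            for u in uids]

annotations = {
    "AB123456": {"organism": "HCoV-229E", "gene": "spike"},
    "XY987654": {"organism": "SARS-CoV", "gene": "nsp12"},
}
names = rename_uids(["AB123456", "XY987654", "ZZ000000"], annotations)
# the unannotated UID "ZZ000000" passes through unchanged
```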

  6. Recent Advances in the Synthesis, Characterization and Application of Zn+-containing Heterogeneous Catalysts.

    PubMed

    Chen, Guangbo; Zhao, Yufei; Shang, Lu; Waterhouse, Geoffrey I N; Kang, Xiaofeng; Wu, Li-Zhu; Tung, Chen-Ho; Zhang, Tierui

    2016-07-01

    Monovalent Zn+ (3d^10 4s^1) systems possess a special electronic structure that can be exploited in heterogeneous catalysis and photocatalysis, though it remains challenging to synthesize Zn+-containing materials. By careful design, Zn+-related species can be synthesized in zeolite and layered double hydroxide systems, which in turn exhibit excellent catalytic potential in methane, CO and CO2 activation. Furthermore, by utilizing advanced characterization tools, including electron spin resonance, X-ray absorption fine structure and density functional theory calculations, the formation mechanism of the Zn+ species and their structure-performance relationships can be understood. Such advanced characterization tools guide the rational design of high-performance Zn+-containing catalysts for efficient energy conversion.

  7. Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couch, R; Becker, R; Rhee, M

    2004-09-24

    Lawrence Livermore National Laboratory participated in a U.S. Department of Energy/Office of Industrial Technology sponsored research project, 'Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery', as a Cooperative Agreement TC-02028 with the Alcoa Technical Center (ATC). The objective of the joint project with Alcoa is to develop a numerical modeling capability to optimize the hot rolling process used to produce aluminum plate. Product lost in the rolling process and subsequently recycled wastes the resources consumed in the energy-intensive steps of remelting and reprocessing the ingot. The modeling capability developed by the project partners will be used to produce plate more efficiently and with reduced product loss.

  8. Recent Advances in the Synthesis, Characterization and Application of Zn+‐containing Heterogeneous Catalysts

    PubMed Central

    Chen, Guangbo; Zhao, Yufei; Shang, Lu; Waterhouse, Geoffrey I. N.; Kang, Xiaofeng; Wu, Li‐Zhu; Tung, Chen‐Ho

    2016-01-01

    Monovalent Zn+ (3d^10 4s^1) systems possess a special electronic structure that can be exploited in heterogeneous catalysis and photocatalysis, though it remains challenging to synthesize Zn+-containing materials. By careful design, Zn+-related species can be synthesized in zeolite and layered double hydroxide systems, which in turn exhibit excellent catalytic potential in methane, CO and CO2 activation. Furthermore, by utilizing advanced characterization tools, including electron spin resonance, X-ray absorption fine structure and density functional theory calculations, the formation mechanism of the Zn+ species and their structure-performance relationships can be understood. Such advanced characterization tools guide the rational design of high-performance Zn+-containing catalysts for efficient energy conversion. PMID:27818902

  9. Design of thermoelectrically highly efficient Heusler compounds using phase separations and nano-composites under an economic point of view

    NASA Astrophysics Data System (ADS)

    Balke, Benjamin

    Half-Heusler (HH) compounds are among the most promising thermoelectric materials for automotive and industrial waste heat recovery applications. In this talk, I will give an overview of our recent investigations of phase separations in HH thermoelectrics, focusing on the ternary system TiNiSn-ZrNiSn-HfNiSn. I will show how we adapted this knowledge to design a p-type HH compound which exhibits a ZT that is increased by 130% compared to the best published bulk p-type Heusler. I will also present how we used the phase separation to design thermoelectrically highly efficient nano-composites of different single-phase materials. Since the price of Hafnium doubled within the last year, our research focused on the design of HH compounds without Hafnium. I will present a very recent calculation of ZT per Euro and efficiency per Euro for various materials, followed by our latest very promising results for n-type Heusler compounds without Hafnium, resulting in 20 times higher ZT/Euro values. These results strongly underline the importance of phase separations as a powerful tool for designing highly efficient materials for thermoelectric applications that fulfill the industrial demands for a thermoelectric converter. The author gratefully acknowledges financial support by the thermoHEUSLER2 Project (Project No. 19U15006F) of the German Federal Ministry of Economics and Technology (BMWi).

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, S; Ji, Y; Kim, K

    Purpose: A diagnostic Multileaf Collimator (MLC) was designed for diagnostic radiography dose reduction. Monte Carlo simulation was used to evaluate the efficiency of the shielding material for producing the leaves of the Multileaf Collimator. Material & Methods: The general radiography unit (Rex-650R, Listem, Korea) was modeled with Monte Carlo simulation (MCNPX, LANL, USA) and we used the SRS-78 program to calculate the energy spectrum of the tube voltage (80, 100, 120 kVp). The shielding material was SKD 11 alloy tool steel, composed of 1.6% carbon (C), 0.4% silicon (Si), 0.6% manganese (Mn), 5% chromium (Cr), 1% molybdenum (Mo), and vanadium (V). Its density was 7.89 g/cm3. We simulated the leaves of the diagnostic MLC using SKD 11 with the general radiography unit. We calculated the efficiency of the diagnostic MLC using the tally6 card of MCNPX as a function of energy. Results: The diagnostic MLC consisted of 25 individual metal shielding leaves on both sides, with dimensions of 10 × 0.5 × 0.5 cm3. The leaves of the MLC were controlled by motors positioned on both sides of the MLC. Depending on the energy (tube voltage), the shielding efficiency of the MLC in the Monte Carlo simulation was 99% (80 kVp), 96% (100 kVp) and 93% (120 kVp). Conclusion: We verified the efficiency of a diagnostic MLC fabricated from SKD 11 alloy tool steel. Based on the results, the diagnostic MLC was designed. We will fabricate the diagnostic MLC for dose reduction in diagnostic radiography.
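    The falling shielding efficiency at higher tube voltages follows the familiar exponential attenuation law, 1 - exp(-(mu/rho)·rho·t). A minimal mono-energetic sketch is below; the mass attenuation coefficients are rough iron-like placeholders (not NIST values), and a real evaluation would integrate over the full kVp spectrum as the authors did with MCNPX:

```python
import math

def shielding_efficiency(mu_rho, density, thickness_cm):
    """Fraction of a mono-energetic photon beam stopped by a slab:
    1 - exp(-(mu/rho) * rho * t), with mu/rho in cm^2/g,
    density in g/cm^3 and thickness in cm."""
    return 1.0 - math.exp(-mu_rho * density * thickness_cm)

# rough iron-like placeholder coefficients (NOT NIST data), 0.5 cm leaf
for label, mu_rho in [("~40 keV", 3.0), ("~60 keV", 1.2), ("~80 keV", 0.6)]:
    print(label, f"{shielding_efficiency(mu_rho, 7.89, 0.5):.1%}")
```

Because mu/rho drops steeply with photon energy, a fixed leaf thickness stops a smaller fraction of the harder beam, mirroring the 99% to 93% trend reported above.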

  11. Design and Evaluation of the Terminal Area Precision Scheduling and Spacing System

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Thipphavong, Jane; Sadovsky, Alex; Chen, Liang; Sullivan, Chris; Martin, Lynne

    2011-01-01

    This paper describes the design, development and results from a high-fidelity human-in-the-loop simulation of an integrated set of trajectory-based automation tools providing precision scheduling, sequencing, and controller merging and spacing functions. These integrated functions are combined into a system called the Terminal Area Precision Scheduling and Spacing (TAPSS) system. It is a strategic and tactical planning tool that provides Traffic Management Coordinators and En Route and Terminal Radar Approach Control air traffic controllers with the ability to efficiently optimize the arrival capacity of a demand-impacted airport while simultaneously enabling fuel-efficient descent procedures. The TAPSS system consists of four-dimensional trajectory prediction, arrival runway balancing, aircraft separation constraint-based scheduling, traffic flow visualization and trajectory-based advisories to assist controllers in efficient metering, sequencing and spacing. The TAPSS system was evaluated and compared to today's ATC operations through an extensive series of human-in-the-loop simulations of arrival flows into Los Angeles International Airport. The test conditions varied aircraft demand from a baseline of today's capacity-constrained periods through 5%, 10% and 20% increases. Performance data were collected for engineering and human factors analysis and compared with similar operations both with and without the TAPSS system. The engineering data indicate that operations with the TAPSS system show up to a 10% increase in airport throughput during capacity-constrained periods while maintaining fuel-efficient aircraft descent profiles from cruise to landing.

  12. Compressor and Turbine Multidisciplinary Design for Highly Efficient Micro-gas Turbine

    NASA Astrophysics Data System (ADS)

    Barsi, Dario; Perrone, Andrea; Qu, Yonglei; Ratto, Luca; Ricci, Gianluca; Sergeev, Vitaliy; Zunino, Pietro

    2018-06-01

    Multidisciplinary design optimization (MDO) is widely employed to enhance the efficiency of turbomachinery components. The aim of this work is to describe a complete tool for the aero-mechanical design of a radial inflow turbine and a centrifugal compressor. The high rotational speed of such machines and the high exhaust gas temperature (for the turbine only) expose the blades to very high stresses, and therefore the aerodynamic design has to be coupled with the mechanical one through an integrated procedure. The described approach employs a fully 3D Reynolds-Averaged Navier-Stokes (RANS) solver for the aerodynamics and an open-source Finite Element Analysis (FEA) solver for the mechanical integrity assessment. Owing to the high computational cost of these two solvers, a meta-model, such as an artificial neural network (ANN), is used to speed up the design optimization process. The interaction between the two codes, the mesh generation and the post-processing of the results are achieved via in-house developed scripting modules. The obtained results are presented and discussed in detail.

  13. Designing and Interpreting Limiting Dilution Assays: General Principles and Applications to the Latent Reservoir for Human Immunodeficiency Virus-1.

    PubMed

    Rosenbloom, Daniel I S; Elliott, Oliver; Hill, Alison L; Henrich, Timothy J; Siliciano, Janet M; Siliciano, Robert F

    2015-12-01

    Limiting dilution assays are widely used in infectious disease research. These assays are crucial for current human immunodeficiency virus (HIV)-1 cure research in particular. In this study, we offer new tools to help investigators design and analyze dilution assays based on their specific research needs. Limiting dilution assays are commonly used to measure the extent of infection, and in the context of HIV they represent an essential tool for studying latency and potential curative strategies. Yet standard assay designs may not discern whether an intervention reduces an already minuscule latent infection. This review addresses challenges arising in this setting and in the general use of dilution assays. We illustrate the major statistical method for estimating frequency of infectious units from assay results, and we offer an online tool for computing this estimate. We recommend a procedure for customizing assay design to achieve desired sensitivity and precision goals, subject to experimental constraints. We consider experiments in which no viral outgrowth is observed and explain how using alternatives to viral outgrowth may make measurement of HIV latency more efficient. Finally, we discuss how biological complications, such as probabilistic growth of small infections, alter interpretations of experimental results.
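    The standard statistical method referred to here fits a single-hit Poisson model: a well receiving n cells stays negative with probability exp(-f·n), and the infectious-unit frequency f is estimated by maximum likelihood. A minimal sketch follows; the dilution scheme and well counts are invented for illustration:

```python
import math

def neg_log_likelihood(f, wells):
    """Single-hit Poisson model.  wells: (cells_per_well, n_positive, n_total)."""
    nll = 0.0
    for cells, pos, total in wells:
        p_neg = math.exp(-f * cells)        # P(well stays negative)
        p_pos = 1.0 - p_neg
        nll -= pos * math.log(p_pos) if pos else 0.0
        nll -= (total - pos) * math.log(p_neg) if total - pos else 0.0
    return nll

def mle_frequency(wells, lo=1e-9, hi=1e-4, iters=200):
    """Golden-section search for the f minimizing the negative log-likelihood
    (the log-likelihood is concave in f, so a bracketing search suffices)."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if neg_log_likelihood(c, wells) < neg_log_likelihood(d, wells):
            b = d
        else:
            a = c
    return (a + b) / 2.0

# invented example: serial dilutions of 1e6, 2e5 and 4e4 cells, 12 wells each
wells = [(1e6, 10, 12), (2e5, 4, 12), (4e4, 1, 12)]
f_hat = mle_frequency(wells)
print(f"estimated frequency: {f_hat:.3g} infectious units per cell")
```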

  14. Seamless transitions from early prototypes to mature operational software - A technology that enables the process for planning and scheduling applications

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Wunderlich, Dana A.; Willoughby, John K.

    1992-01-01

    New and innovative software technology is presented that provides a cost effective bridge for smoothly transitioning prototype software, in the field of planning and scheduling, into an operational environment. Specifically, this technology mixes the flexibility and human design efficiency of dynamic data typing with the rigor and run-time efficiencies of static data typing. This new technology provides a very valuable tool for conducting the extensive, up-front system prototyping that leads to specifying the correct system and producing a reliable, efficient version that will be operationally effective and will be accepted by the intended users.

  15. Lowering the barriers to consumer-directed health care: responding to concerns.

    PubMed

    Baicker, Katherine; Dow, William H; Wolfson, Jonathan

    2007-01-01

    Consumer-directed health care is a potentially promising tool for moving toward more efficient use of health care resources. Tax policy has long been biased against health plans with significant patient cost sharing. Tax advantages created by health savings accounts (HSAs) began to change that, and proposed tax reforms could go even further. We assess various critiques of these plans, focusing on why they benefit not just the healthy and wealthy. Lower costs and more efficient health spending would help all patients and reduce uninsurance. Potential negative distributional effects are important but can be remedied more efficiently without distorting insurance design.

  16. Comparison Of Human Modelling Tools For Efficiency Of Prediction Of EVA Tasks

    NASA Technical Reports Server (NTRS)

    Dischinger, H. Charles, Jr.; Loughead, Tomas E.

    1998-01-01

    Construction of the International Space Station (ISS) will require extensive extravehicular activity (EVA, spacewalks), and estimates of the actual time needed continue to rise. As recently as September 1996, the amount of time to be spent in EVA was believed to be about 400 hours, excluding spacewalks on the Russian segment. This estimate has recently risen to over 1100 hours, and it could go higher before assembly begins in the summer of 1998. These activities are extremely expensive and hazardous, so any design tools which help assure mission success and improve the efficiency of the astronaut in task completion can pay off in reduced design and EVA costs and increased astronaut safety. The tasks which astronauts can accomplish in EVA are limited by spacesuit mobility. They are therefore relatively simple, from an ergonomic standpoint, requiring gross movements rather than fine motor skills. The actual tasks include driving bolts, mating and demating electric and fluid connectors, and actuating levers; the important characteristics to be considered in design improvement include the ability of the astronaut to see and reach the item to be manipulated and the clearance required to accomplish the manipulation. This makes the tasks amenable to simulation in a Computer-Assisted Design (CAD) environment. For EVA, the spacesuited astronaut must have his or her feet attached to a work platform called a foot restraint to obtain a purchase against which work forces may be actuated. An important component of the design is therefore the proper placement of foot restraints.

  17. Development of a component design tool for metal hydride heat pumps

    NASA Astrophysics Data System (ADS)

    Waters, Essene L.

    Given current demands for more efficient and environmentally friendly energy sources, hydrogen-based energy systems are an increasingly popular field of interest. Within the field, metal hydrides have become a prominent focus of research due to their large hydrogen storage capacity and relative system simplicity and safety. Metal hydride heat pumps constitute one such application, in which heat and hydrogen are transferred to and from metal hydrides. While a significant amount of work has been done to study such systems, the scope of materials selection has been quite limited. Typical studies compare only a few metal hydride materials and provide limited justification for the choice of those few. In this work, a metal hydride component design tool has been developed to enable the targeted down-selection of an extensive database of metal hydrides to identify the most promising materials for use in metal hydride thermal systems. The material database contains over 300 metal hydrides with various physical and thermodynamic properties included for each material. Sub-models for equilibrium pressure, thermophysical data, and default properties are used to predict the behavior of each material within the given system. For a given thermal system, this tool can be used to identify optimal materials out of over 100,000 possible hydride combinations. The selection tool described herein has been applied to a stationary combined heat and power system containing a high-temperature proton exchange membrane (PEM) fuel cell, a hot water tank, and two metal hydride beds used as a heat pump. A variety of factors can be used to select materials, including efficiency, maximum and minimum system pressures, pressure difference, coefficient of performance (COP), and COP sensitivity. The targeted down-selection of metal hydrides for this system focuses on the system's COP for each potential pair. The values of COP and COP sensitivity have been used to identify pairs of highest interest for use in this application. The metal hydride component design tool developed in this work selects between metal hydride materials on an unprecedented scale. It can be easily applied to other hydrogen-based thermal systems, making it a powerful and versatile tool.
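    The equilibrium-pressure sub-model such a tool relies on is typically the van't Hoff relation, ln(P_eq/P0) = -ΔH/(RT) + ΔS/R (desorption convention). A minimal sketch of screening hydrides by their plateau pressures at the heat pump's two operating temperatures; the property values are illustrative placeholders, not entries from the thesis database:

```python
import math

R = 8.314  # J/(mol K)

def p_eq(dH_des, dS_des, T):
    """Van't Hoff plateau pressure in bar (reference pressure 1 bar):
    ln(P_eq) = -dH_des/(R*T) + dS_des/R, desorption convention,
    with dH_des in J/mol H2, dS_des in J/(mol H2 K) and T in K."""
    return math.exp(-dH_des / (R * T) + dS_des / R)

# illustrative property values (NOT from the paper's database)
hydrides = {"alloy_A": (28000.0, 105.0), "alloy_B": (35000.0, 110.0)}

T_hot, T_cold = 400.0, 300.0  # heat pump operating temperatures
for name, (dH, dS) in hydrides.items():
    print(f"{name}: {p_eq(dH, dS, T_hot):6.1f} bar at {T_hot:.0f} K, "
          f"{p_eq(dH, dS, T_cold):6.2f} bar at {T_cold:.0f} K")
```

A pair is only workable when the two plateau pressures provide a positive driving pressure difference between the beds at the chosen temperatures, which is one of the screening criteria the abstract lists.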

  18. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it is inaccurate in calculating the failure probability for highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the high computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for the RBDO of engineering structures.
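    The symmetric rank-one (SR1) update mentioned here avoids explicit Hessian evaluation: given a step s and gradient change y, the approximation B receives a rank-one correction. A minimal sketch with the standard denominator safeguard, independent of the paper's RBDO formulation:

```python
import numpy as np

def sr1_update(B, s, y, eps=1e-8):
    """Symmetric rank-one update of a Hessian approximation B,
    with s = x_new - x_old and y = grad_new - grad_old."""
    r = y - B @ s
    denom = r @ s
    # standard safeguard: skip the update when the denominator is tiny
    if abs(denom) <= eps * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom

# sanity check on a quadratic f(x) = 0.5 x^T H x, where y = H s exactly,
# so a few independent steps recover H
H = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.eye(2)
for s in (np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])):
    B = sr1_update(B, s, H @ s)
print(np.allclose(B, H))  # True
```

Unlike BFGS, the SR1 update does not force the approximation to stay positive definite, which is precisely why it can capture the curvature information the second order reliability method needs.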

  19. Hybrid photonic-plasmonic near-field probe for efficient light conversion into the nanoscale hot spot.

    PubMed

    Koshelev, Alexander; Munechika, Keiko; Cabrini, Stefano

    2017-11-01

    In this Letter, we present the design and simulation of a novel hybrid photonic-plasmonic near-field probe. Near-field optics is a unique imaging tool that provides optical images with resolution down to tens of nanometers. One of the main limitations of this technology is its low light sensitivity. The presented hybrid probe solves this problem by combining a campanile plasmonic probe with a photonic layer consisting of a diffractive optical element (DOE). The DOE is designed to match the plasmonic field at the broad side of the campanile probe with the fiber mode. This makes it possible to optimize the size of the campanile tip to convert light efficiently into the hot spot. The simulations show that the hybrid probe is approximately 540 times more efficient on average than the conventional campanile probe in the 600-900 nm spectral range.

  20. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572
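    Metabolic flux analysis, one of the seven areas reviewed, is commonly posed as flux balance analysis: a linear program maximizing a target flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy sketch using scipy; the three-reaction network is invented purely for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis.  Metabolites A and B; reactions:
#   v0: uptake -> A,  v1: A -> B,  v2: B -> biomass
# Steady state requires S @ v = 0; we maximize the biomass flux v2.
S = np.array([[1.0, -1.0,  0.0],    # mass balance on A
              [0.0,  1.0, -1.0]])   # mass balance on B
c = np.array([0.0, 0.0, -1.0])      # linprog minimizes, so negate v2
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 units

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # optimal flux distribution: [10, 10, 10]
```

At the optimum every reaction carries the full uptake flux, since the toy network is a single linear pathway; genome-scale models apply the same formulation to thousands of reactions.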

  1. Microgrid Analysis Tools Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez, Antonio; Haase, Scott G; Mathur, Shivani

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life cycle costs, while also improving overall system reliability, security, and resilience. One goal of the partnership is to investigate whether a combination of energy efficiency and high-contribution (renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from the respective tool websites, tool developers, and author experience.

  2. System-on-Chip Data Processing and Data Handling Spaceflight Electronics

    NASA Technical Reports Server (NTRS)

    Kleyner, I.; Katz, R.; Tiggeler, H.

    1999-01-01

    This paper presents a methodology and a tool set which implements automated generation of moderate-size blocks of customized intellectual property (IP), thus effectively reusing prior work and minimizing the labor intensive, error-prone parts of the design process. Customization of components allows for optimization for smaller area and lower power consumption, which is an important factor given the limitations of resources available in radiation-hardened devices. The effects of variations in HDL coding style on the efficiency of synthesized code for various commercial synthesis tools are also discussed.

  3. Tools for identifying gelator scaffolds and solvents.

    PubMed

    Zurcher, Danielle M; McNeil, Anne J

    2015-03-06

    Small molecule gelators are serendipitously discovered more often than they are designed. As a consequence, it has been challenging to develop applications based on the limited set of known materials. This synopsis highlights recent strategies to streamline the process of gelator discovery, with a focus on the role of unidirectional intermolecular interactions and solvation. We present these strategies as a series of tools that can be employed to help identify gelator scaffolds and solvents for gel formation. Overall, we suggest that this guided approach is more efficient than random derivatization and screening.

  4. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
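    The benefit of analytic derivatives over finite differencing can be reproduced with scipy on a standard test function; this is a generic illustration, not PyCycle itself:

```python
from scipy.optimize import minimize, rosen, rosen_der

x0 = [1.3, 0.7, 0.8, 1.9, 1.2]

# finite-difference gradients: every gradient costs extra function calls
fd = minimize(rosen, x0, method="BFGS")
# analytic gradients: the same optimizer needs far fewer function calls
an = minimize(rosen, x0, jac=rosen_der, method="BFGS")

print(fd.nfev, an.nfev)
```

Both runs converge to the same minimum at (1, ..., 1), but the analytic-gradient run evaluates the objective far fewer times and avoids the step-size noise of finite differencing, which is the stability argument made in the abstract.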

  5. Optimization in Cardiovascular Modeling

    NASA Astrophysics Data System (ADS)

    Marsden, Alison L.

    2014-01-01

    Fluid mechanics plays a key role in the development, progression, and treatment of cardiovascular disease. Advances in imaging methods and patient-specific modeling now reveal increasingly detailed information about blood flow patterns in health and disease. Building on these tools, there is now an opportunity to couple blood flow simulation with optimization algorithms to improve the design of surgeries and devices, incorporating more information about the flow physics in the design process to augment current medical knowledge. In doing so, a major challenge is the need for efficient optimization tools that are appropriate for unsteady fluid mechanics problems, particularly for the optimization of complex patient-specific models in the presence of uncertainty. This article reviews the state of the art in optimization tools for virtual surgery, device design, and model parameter identification in cardiovascular flow and mechanobiology applications. In particular, it reviews trade-offs between traditional gradient-based methods and derivative-free approaches, as well as the need to incorporate uncertainties. Key future challenges are outlined, which extend to the incorporation of biological response and the customization of surgeries and devices for individual patients.

  6. Feedback and efficient behavior

    PubMed Central

    2017-01-01

    Feedback is an effective tool for promoting efficient behavior: it enhances individuals’ awareness of choice consequences in complex settings. Our study aims to isolate the mechanisms underlying the effects of feedback on achieving efficient behavior in a controlled environment. We design a laboratory experiment in which individuals are not aware of the consequences of different alternatives and, thus, cannot easily identify the efficient ones. We introduce feedback as a mechanism to enhance the awareness of consequences and to stimulate exploration and search for efficient alternatives. We assess the efficacy of three different types of intervention: provision of social information, manipulation of the frequency, and framing of feedback. We find that feedback is most effective when it is framed in terms of losses, that it reduces efficiency when it includes information about inefficient peers’ behavior, and that a lower frequency of feedback does not disrupt efficiency. By quantifying the effect of different types of feedback, our study suggests useful insights for policymakers. PMID:28430787

  7. pgRNAFinder: a web-based tool to design distance independent paired-gRNA.

    PubMed

    Xiong, Yuanyan; Xie, Xiaowei; Wang, Yanzhi; Ma, Wenbing; Liang, Puping; Songyang, Zhou; Dai, Zhiming

    2017-11-15

    The CRISPR/Cas system has been shown to be an efficient and accurate genome-editing technique. A number of tools exist to design guide RNA sequences and predict potential off-target sites, but most existing gRNA design tools are restricted to small deletions. To address this issue, we present pgRNAFinder, with an easy-to-use web interface, which enables researchers to design single or distance-free paired-gRNA sequences. The web interface of pgRNAFinder contains both a gRNA search and a scoring system. After users input query sequences, it searches for gRNAs by the 3' protospacer-adjacent motif (PAM) and possible off-targets, and rapidly scores the conservation of the deleted sequences. Filters can be applied to identify high-quality CRISPR sites. pgRNAFinder offers gRNA design functionality for 8 vertebrate genomes. Furthermore, to keep pgRNAFinder open and extensible to any organism, we provide the source package for local use. pgRNAFinder is freely available at http://songyanglab.sysu.edu.cn/wangwebs/pgRNAFinder/, and the source code and user manual can be obtained from https://github.com/xiexiaowei/pgRNAFinder. Contact: songyang@bcm.edu or daizhim@mail.sysu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
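    The PAM-based gRNA search step can be sketched in a few lines. This toy version scans only the forward strand for 20-nt protospacers followed by an NGG PAM; pgRNAFinder additionally handles paired gRNAs, reverse strands, off-target prediction and conservation scoring:

```python
import re

def find_grna_sites(seq, guide_len=20):
    """Forward-strand scan for SpCas9 candidate sites: a 20-nt
    protospacer immediately followed by an NGG PAM.  Only a sketch;
    a full tool also scans the reverse strand and scores each site."""
    pattern = r"(?=([ACGT]{%d})[ACGT]GG)" % guide_len  # lookahead: overlaps too
    return [(m.start(), m.group(1)) for m in re.finditer(pattern, seq)]

seq = "ATGC" * 6 + "TGG" + "ACGT" * 3
print(find_grna_sites(seq))  # [(4, 'ATGCATGCATGCATGCATGC')]
```

The zero-width lookahead ensures overlapping candidate sites are all reported, which a plain (consuming) regular expression match would miss.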

  8. CAESY - COMPUTER AIDED ENGINEERING SYSTEM

    NASA Technical Reports Server (NTRS)

    Wette, M. R.

    1994-01-01

    Many developers of software and algorithms for control system design have recognized that current tools have limits in both flexibility and efficiency. Many forces drive the development of new tools including the desire to make complex system modeling design and analysis easier and the need for quicker turnaround time in analysis and design. Other considerations include the desire to make use of advanced computer architectures to help in control system design, adopt new methodologies in control, and integrate design processes (e.g., structure, control, optics). CAESY was developed to provide a means to evaluate methods for dealing with user needs in computer-aided control system design. It is an interpreter for performing engineering calculations and incorporates features of both Ada and MATLAB. It is designed to be reasonably flexible and powerful. CAESY includes internally defined functions and procedures, as well as user defined ones. Support for matrix calculations is provided in the same manner as MATLAB. However, the development of CAESY is a research project, and while it provides some features which are not found in commercially sold tools, it does not exhibit the robustness that many commercially developed tools provide. CAESY is written in C-language for use on Sun4 series computers running SunOS 4.1.1 and later. The program is designed to optionally use the LAPACK math library. The LAPACK math routines are available through anonymous ftp from research.att.com. CAESY requires 4Mb of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. CAESY was developed in 1993 and is a copyrighted work with all copyright vested in NASA.

  9. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; hide

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA, for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions where there are appreciable differences between the three computer codes.

  10. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    PubMed Central

    2016-01-01

    Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and aid drug design. We present a novel local structure alignment tool, G‐LoSA. G‐LoSA aligns protein local structures in a sequence-order-independent way and provides a GA‐score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G‐LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure‐centric comparative biology studies. In particular, G‐LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G‐LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer‐aided drug design. We hope that G‐LoSA can be a useful computational method for exploring interesting biological problems through large‐scale comparison of protein local structures and facilitating drug discovery research and development. G‐LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  11. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    PubMed

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and aid drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.
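    For context, the classical way to compare two local structures is rigid superposition followed by RMSD, which, unlike the GA-score above, depends on residue order and structure size. The baseline can be sketched with the standard Kabsch algorithm; the coordinates below are random test points, not real protein data.

```python
# Baseline for comparison only: classical order-dependent superposition
# (Kabsch) + RMSD. G-LoSA's GA-score is sequence-order independent and
# size independent; the random coordinates here are synthetic test data.
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD of paired point sets P, Q (n x 3) after optimal rigid fit."""
    P = P - P.mean(axis=0)                    # remove translation
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)         # SVD of the covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation
    return float(np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P)))

rng = np.random.default_rng(0)
A = rng.normal(size=(10, 3))
theta = 0.7                                   # rotate A by a known angle
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
B = A @ Rz.T + np.array([1.0, 2.0, 3.0])      # rigid transform of A
print(kabsch_rmsd(A, B))                      # ~0 for an exact rigid copy
```

    The fact that this baseline requires a one-to-one, ordered pairing of points is precisely the limitation that sequence-order-independent methods such as G-LoSA remove.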

  12. An online network tool for quality information to answer questions about occupational safety and health: usability and applicability.

    PubMed

    Rhebergen, Martijn D F; Hulshof, Carel T J; Lenderink, Annet F; van Dijk, Frank J H

    2010-10-22

    Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online Question & Answer (Q&A) network tools, which link questioners directly to experts, can overcome some of these barriers. When designing and testing online tools, assessing the usability and applicability is essential. Therefore, the purpose of this study is to assess the usability and applicability of a new online Q&A network tool for answers on OSH questions. We applied a cross-sectional usability test design. Eight occupational health experts and twelve potential questioners from the working population (workers) were purposively selected to include a variety of computer and internet experience. During the test, participants were first observed while executing eight tasks that entailed important features of the tool. In addition, they were interviewed. Through task observations and interviews we assessed applicability, usability (effectiveness, efficiency and satisfaction) and facilitators and barriers to use. Most features were usable, though several could be improved. Most tasks were executed effectively. Some tasks, for example searching stored questions in categories, were not executed efficiently, and participants were less satisfied with the corresponding features. Participants' recommendations led to improvements. The tool was found mostly applicable for additional information, to observe new OSH trends and to improve contact between OSH experts and workers. Hosting and support by a trustworthy professional organization, effective implementation campaigns, timely answering and anonymity were seen as important use requirements. This network tool is a promising new strategy for offering company workers high-quality information to answer OSH questions. Q&A network tools can be an addition to existing information facilities in the field of OSH, but also to other healthcare fields struggling with how to answer questions from people in practice with high-quality information. In the near future, we will focus on the use of the tool and its effects on information and knowledge dissemination.

  13. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    PubMed

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeats/CRISPR-associated nucleases) has recently been shown to hold great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and can result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools can be used efficiently to guide the design of constructs for engineered nucleases and to evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it describes tools that have been developed to analyse post-genome-editing results, and discusses important design parameters that should be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
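    The specificity problem the review surveys comes down to enumerating genomic sites that differ from the intended target by only a few bases. A toy version of that scan for a Cas9-style 20-nt guide with an NGG PAM is sketched below; the guide and "genome" sequences are invented, and real predictors add position-dependent scoring and genome-scale indexing.

```python
# Toy off-target scan: find sites within a few mismatches of a 20-nt guide
# followed by an NGG PAM. Real predictors use scoring models and indexed
# genome search; the sequences below are invented for illustration.
def off_target_sites(genome: str, guide: str, max_mismatches: int = 3):
    """Return (position, site, mismatches) for NGG-PAM sites near the guide."""
    n = len(guide)
    hits = []
    for i in range(len(genome) - n - 2):
        site, pam = genome[i:i + n], genome[i + n:i + n + 3]
        if pam[1:] != "GG":                     # Cas9-style NGG PAM required
            continue
        mm = sum(a != b for a, b in zip(site, guide))
        if mm <= max_mismatches:
            hits.append((i, site, mm))
    return hits

guide = "GACGTTACGGAGCATTGCAA"
# one exact site and one single-mismatch site, each followed by a PAM
genome = "TT" + guide + "TGG" + "AC" + guide[:-2] + "CA" + "AGG"
for pos, site, mm in off_target_sites(genome, guide):
    print(pos, mm)
```

    The exact site and the one-mismatch site are both reported; a design tool would then rank candidate guides by how few such near-matches exist elsewhere in the genome.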

  14. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background: Integrating metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can help researchers perform more effective and more efficient systems biology research on the regulation of metabolic networks. Therefore, the tasks of (a) integrating regulatory metabolic networks and existing models under a single database environment, and (b) building tools to help with modeling and analysis, are desirable and intellectually challenging computational tasks. Description: PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and an API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools to facilitate the development of kinetic models for biological systems. PathCase-SB aims to integrate data from selected biological data sources on the web (currently, the BioModels database and KEGG) and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions: The PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  15. Educational software usability: Artifact or Design?

    PubMed

    Van Nuland, Sonya E; Eagleson, Roy; Rogers, Kem A

    2017-03-01

    Online educational technologies and e-learning tools are providing new opportunities for students to learn worldwide, and they continue to play an important role in anatomical sciences education. Yet, as we shift to teaching online, particularly within the anatomical sciences, it has become apparent that e-learning tool success is based on more than just user satisfaction and preliminary learning outcomes; rather, it is a multidimensional construct that should be addressed from an integrated perspective. The efficiency, effectiveness and satisfaction with which a user can navigate an e-learning tool is known as usability, and represents a construct which we propose can be used to quantitatively evaluate e-learning tool success. To assess the usability of an e-learning tool, usability testing should be employed during the design and development phases (i.e., prior to its release to users) as well as during its delivery (i.e., following its release to users). However, both the commercial educational software industry and individual academic developers in the anatomical sciences have overlooked the added value of additional usability testing. Reducing learner frustration and anxiety during e-learning tool use is essential in ensuring e-learning tool success, and will require a commitment on the part of the developers to engage in usability testing during all stages of an e-learning tool's life cycle. Anat Sci Educ 10: 190-199. © 2016 American Association of Anatomists.

  16. The Experiment Data Depot: A Web-Based Software Tool for Biological Experimental Data Storage, Sharing, and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrell, William C.; Birkel, Garrett W.; Forrer, Mark

    Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.

  17. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

    The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined HyperText Transfer Protocol (HTTP)-based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
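    The value of a GLIDE-style parameter server is that every discipline reads and writes one versioned copy of each study parameter instead of manually exchanged spreadsheets. The in-memory class below is an illustrative stand-in for those semantics; its names and methods are assumptions, not GLIDE's actual HTTP API.

```python
# Illustrative stand-in for GLIDE-style parameter sharing: one versioned
# copy of each study parameter. This class and its method names are
# assumptions for the sketch, not GLIDE's real interface.
import time

class ParameterStore:
    """Versioned key/value store mimicking a shared parameter server."""
    def __init__(self):
        self._data = {}                      # name -> (value, version, stamp)

    def put(self, name, value):
        """Publish or revise a parameter; returns the new version number."""
        _, version, _ = self._data.get(name, (None, 0, None))
        self._data[name] = (value, version + 1, time.time())
        return version + 1

    def get(self, name):
        """Fetch the latest value and its version."""
        value, version, _ = self._data[name]
        return value, version

store = ParameterStore()
store.put("dry_mass_kg", 1450.0)             # propulsion publishes a value
store.put("dry_mass_kg", 1480.0)             # ...then revises it
print(store.get("dry_mass_kg"))              # → (1480.0, 2)
```

    Version numbers let a consumer (for example, the Excel add-in) detect that a parameter changed since it was last read, which is what removes the manual copy-and-paste errors the paper mentions.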

  18. The Experiment Data Depot: A Web-Based Software Tool for Biological Experimental Data Storage, Sharing, and Visualization

    DOE PAGES

    Morrell, William C.; Birkel, Garrett W.; Forrer, Mark; ...

    2017-08-21

    Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.

  19. The Experiment Data Depot: A Web-Based Software Tool for Biological Experimental Data Storage, Sharing, and Visualization.

    PubMed

    Morrell, William C; Birkel, Garrett W; Forrer, Mark; Lopez, Teresa; Backman, Tyler W H; Dussault, Michael; Petzold, Christopher J; Baidoo, Edward E K; Costello, Zak; Ando, David; Alonso-Gutierrez, Jorge; George, Kevin W; Mukhopadhyay, Aindrila; Vaino, Ian; Keasling, Jay D; Adams, Paul D; Hillson, Nathan J; Garcia Martin, Hector

    2017-12-15

    Although recent advances in synthetic biology allow us to produce biological designs more efficiently than ever, our ability to predict the end result of these designs is still nascent. Predictive models require large amounts of high-quality data to be parametrized and tested, which are not generally available. Here, we present the Experiment Data Depot (EDD), an online tool designed as a repository of experimental data and metadata. EDD provides a convenient way to upload a variety of data types, visualize these data, and export them in a standardized fashion for use with predictive algorithms. In this paper, we describe EDD and showcase its utility for three different use cases: storage of characterized synthetic biology parts, leveraging proteomics data to improve biofuel yield, and the use of extracellular metabolite concentrations to predict intracellular metabolic fluxes.

  20. Evidence of absence (v2.0) software user guide

    USGS Publications Warehouse

    Dalthorp, Daniel; Huso, Manuela; Dail, David

    2017-07-06

    Evidence of Absence software (EoA) is a user-friendly software application for estimating bird and bat fatalities at wind farms and for designing search protocols. The software is particularly useful in addressing whether the number of fatalities is below a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software also includes tools (1) for estimating carcass persistence distributions and searcher efficiency parameters from field trials, (2) for projecting future mortality based on past monitoring data, and (3) for exploring the potential consequences of various choices in the design of long-term incidental take permits for protected species. The software was designed specifically for cases where tolerance for mortality is low and carcass counts are small or even 0, but the tools also may be used for mortality estimates when carcass counts are large.
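    The core question EoA addresses, "how many fatalities are plausible given that zero carcasses were found?", can be sketched with a single per-carcass detection probability g (searcher efficiency combined with carcass persistence). EoA's actual estimator is Bayesian and models those components separately; the g value below is invented for illustration.

```python
# Hedged sketch of the "evidence of absence" idea: with per-carcass
# detection probability g, the chance of finding zero carcasses when M
# fatalities occurred is (1 - g)**M. EoA's real estimator is Bayesian;
# the numbers here are illustrative only.
def mortality_upper_bound(g: float, alpha: float) -> int:
    """Largest M with P(zero carcasses | M, g) = (1-g)**M >= alpha."""
    M = 0
    while (1.0 - g) ** (M + 1) >= alpha:
        M += 1
    return M

g = 0.6                    # searcher efficiency x persistence, illustrative
for alpha in (0.5, 0.1, 0.01):
    print(alpha, mortality_upper_bound(g, alpha))
```

    The sketch shows why low tolerance for mortality demands high detection probability: as alpha shrinks, the mortality level still consistent with a zero count grows quickly unless g is large.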

  1. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with the FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for the Dallas-Fort Worth area, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.

  2. Knowledge management in a waste based biorefinery in the QbD paradigm.

    PubMed

    Rathore, Anurag S; Chopda, Viki R; Gomes, James

    2016-09-01

    Shifting resource base from fossil feedstock to renewable raw materials for production of chemical products has opened up an area of novel applications of industrial biotechnology-based process tools. This review aims to provide a concise and focused discussion on recent advances in knowledge management to facilitate efficient and optimal operation of a biorefinery. Application of quality by design (QbD) and process analytical technology (PAT) as tools for knowledge creation and management at different levels has been highlighted. Role of process integration, government policies, knowledge exchange through collaboration, and use of databases and computational tools have also been touched upon. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  4. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  5. Interoperability science cases with the CDPP tools

    NASA Astrophysics Data System (ADS)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, to cross-compare observations and modeled data, and finally to perform in-depth analysis. Over the years these protocols, including SAMP from the IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, and 3DView, among others. This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.

  6. Austin Energy: Pumping System Improvement Project Saves Energy and Improves Performance at a Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2005-06-01

    This two-page performance spotlight describes how, in 2004, Austin Energy (the electric utility for the city of Austin, Texas) began saving about $1.2 million in energy and maintenance costs annually as a direct result of a pumping system efficiency project. The project was designed to improve the efficiency of the circulating water pumping system serving the utility's 405-MW steam turbine. A U.S. Department of Energy Qualified Pumping System Assessment Tool Specialist at Flowserve Corporation assisted in the initial assessment of the system.

  7. Precise and Efficient Static Array Bound Checking for Large Embedded C Programs

    NASA Technical Reports Server (NTRS)

    Venet, Arnaud

    2004-01-01

    In this paper we describe the design and implementation of a static array-bound checker for a family of embedded programs: the flight control software of recent Mars missions. These codes are large (up to 250 KLOC), pointer intensive, heavily multithreaded and written in an object-oriented style, which makes their analysis very challenging. We designed a tool called C Global Surveyor (CGS) that can analyze the largest code in a couple of hours with a precision of 80%. The scalability and precision of the analyzer are achieved by using an incremental framework in which a pointer analysis and a numerical analysis of array indices mutually refine each other. CGS has been designed so that it can distribute the analysis over several processors in a cluster of machines. To the best of our knowledge this is the first distributed implementation of static analysis algorithms. Throughout the paper we will discuss the scalability setbacks that we encountered during the construction of the tool and their impact on the initial design decisions.
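    The numerical half of such an analyzer is typically an interval abstract domain: each index variable carries a [lo, hi] range, and an array access is proven safe only when that range fits the array bounds. The sketch below shows the idea on a toy loop; it is not CGS's actual implementation, which additionally tracks pointers and refines the two analyses against each other.

```python
# Toy sketch of the numeric half of a static array-bound checker: an
# interval abstract domain for index variables. CGS pairs this with a
# pointer analysis; none of that machinery is reproduced here.
class Interval:
    def __init__(self, lo: int, hi: int):
        self.lo, self.hi = lo, hi
    def join(self, other):                    # least upper bound of ranges
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))
    def add(self, k: int):
        return Interval(self.lo + k, self.hi + k)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def check_access(idx: Interval, array_len: int) -> str:
    """Classify a[idx] for an array of length array_len."""
    if 0 <= idx.lo and idx.hi < array_len:
        return "safe"
    if idx.hi < 0 or idx.lo >= array_len:
        return "error"
    return "warning"                          # possibly out of bounds

# for (i = 0; i < 10; i++) { ... a[i] ... a[i+1] ... }  with len(a) == 10
i = Interval(0, 9)                            # range of i inside the loop
print(check_access(i, 10))                    # → safe
print(check_access(i.add(1), 10))             # → warning (a[10] reachable)
```

    The "warning" verdicts are the checks a tool like CGS must either discharge with more precise analysis or report for human review, which is what the quoted 80% precision figure measures.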

  8. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared to not suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as requirements elicitation method. An information processing model was developed through an analysis of think aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivors screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  9. Energy efficiency analysis and implementation of AES on an FPGA

    NASA Astrophysics Data System (ADS)

    Kenney, David

    The Advanced Encryption Standard (AES) was developed by Joan Daemen and Vincent Rijmen and endorsed by the National Institute of Standards and Technology in 2001. It was designed to replace the aging Data Encryption Standard (DES) and be useful for a wide range of applications with varying throughput, area, power dissipation and energy consumption requirements. Field Programmable Gate Arrays (FPGAs) are flexible and reconfigurable integrated circuits that are useful for many different applications, including the implementation of AES. Though they are highly flexible, FPGAs are often less efficient than Application Specific Integrated Circuits (ASICs); they tend to operate slower, take up more space and dissipate more power. There have been many FPGA AES implementations that focus on obtaining high throughput or low area usage, but very little research has been done on low-power or energy-efficient FPGA-based AES; in fact, it is rare for estimates of power dissipation to be made at all. This thesis presents a methodology to evaluate the energy efficiency of FPGA-based AES designs and proposes a novel FPGA AES implementation which is highly flexible and energy efficient. The proposed methodology is implemented as part of a novel scripting tool, the AES Energy Analyzer, which is able to fully characterize the power dissipation and energy efficiency of FPGA-based AES designs. Additionally, this thesis introduces a new FPGA power reduction technique called Opportunistic Combinational Operand Gating (OCOG), which is used in the proposed energy-efficient implementation. The AES Energy Analyzer was able to estimate the power dissipation and energy efficiency of the proposed AES design during its most commonly performed operations. It was found that the proposed implementation consumes less energy per operation than any previous FPGA-based AES implementation that included power estimations. Finally, the use of Opportunistic Combinational Operand Gating on an AES cipher was found to reduce its dynamic power consumption by up to 17% when compared to an identical design that did not employ the technique.
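    The headline metric such an analyzer reports, energy per operation, is simply average power multiplied by the time one block takes at a given throughput. A sketch with invented power and throughput figures follows; note the 17% figure above applies to dynamic power only, while this example scales total power for simplicity.

```python
# Sketch of the metric an AES energy analysis reports: energy per encrypted
# block = average power x time per block. Power and throughput figures are
# invented; the 17% OCOG saving applies to dynamic power, but total power
# is scaled here for simplicity.
def energy_per_op_nJ(avg_power_mW: float, throughput_Mbps: float,
                     bits_per_op: int = 128) -> float:
    """Energy (nJ) per AES block: mW * (bits / Mbps) = mW * us = nJ."""
    t_us = bits_per_op / throughput_Mbps     # microseconds per 128-bit block
    return avg_power_mW * t_us

baseline = energy_per_op_nJ(avg_power_mW=120.0, throughput_Mbps=800.0)
gated = energy_per_op_nJ(avg_power_mW=120.0 * 0.83, throughput_Mbps=800.0)
print(f"{baseline:.2f} nJ/block -> {gated:.2f} nJ/block with operand gating")
```

    The unit bookkeeping (mW times microseconds equals nanojoules) is why a design can trade slower clocks for lower power without necessarily improving energy per operation.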

  10. OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation

    NASA Technical Reports Server (NTRS)

    Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blattnig, Steve R.

    2011-01-01

    The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.
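    The slab optimization described above can be sketched as a search over layer thicknesses under an areal-density budget. The two materials, their attenuation constants, and the exponential dose model below are all invented stand-ins for a real OLTARIS evaluation.

```python
# Sketch of a slab-optimization loop: enumerate layer thicknesses under an
# areal-density budget and keep the lowest-dose combination. The dose()
# model and attenuation constants are invented stand-ins for an OLTARIS run.
from itertools import product
import math

MU = {"polyethylene": 0.085, "aluminum": 0.045}   # fake cm^2/g coefficients

def dose(layers):
    """Fake transmitted dose (arbitrary units) through stacked layers."""
    return 100.0 * math.exp(-sum(MU[m] * x for m, x in layers))

budget = 30.0                                     # g/cm^2 areal density cap
grid = [0.0, 10.0, 20.0, 30.0]                    # candidate thicknesses
best = None
for pe, al in product(grid, repeat=2):
    if pe + al > budget:
        continue                                  # over the mass budget
    layers = [("polyethylene", pe), ("aluminum", al)]
    d = dose(layers)
    if best is None or d < best[0]:
        best = (d, layers)
print(best[1])    # best layup under this toy model
```

    In practice each dose() call would be a transport calculation through the slab, so the optimization loop is only as fast as the underlying evaluator; this is exactly why an efficient deterministic code is preferred over Monte Carlo for design sweeps.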

  11. A new approach to road accident rescue.

    PubMed

    Morales, Alejandro; González-Aguilera, Diego; López, Alfonso I; Gutiérrez, Miguel A

    2016-01-01

    This article develops and validates a new methodology and tool for rescue assistance in traffic accidents, with the aim of improving efficiency and safety in the evacuation of people and reducing the number of victims in road accidents. Different tests supported by professionals and experts have been designed under different circumstances and with different categories of damaged vehicles coming from real accidents and simulated trapped victims, in order to calibrate and refine the proposed methodology and tool. To validate this new approach, a tool called App_Rescue has been developed. This tool is based on the use of a computer system that allows efficient access to the technical information of the vehicle and the health information of its usual passengers. The time spent during rescue using the standard protocol and the proposed method was compared. This rescue assistance system makes vital information accessible in post-trauma care services, improving the effectiveness of interventions by the emergency services, reducing the rescue time and therefore minimizing the consequences involved and the number of victims. This could often mean saving lives. In the different simulated rescue operations, the rescue time was reduced by an average of 14%.

  12. Near-Infrared Neuroimaging with NinPy

    PubMed Central

    Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas

    2009-01-01

    There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449

  13. The value of online learning and MRI: finding a niche for expensive technologies.

    PubMed

    Cook, David A

    2014-11-01

    The benefits of online learning come at a price; how can we optimize the overall value? This narrative review critically appraises the value of online learning. Several prevalent myths overinflate that value. These include that online learning is cheap and easy (it is usually more expensive), that it is more efficient (efficiency depends on the instructional design, not the modality), that it will transform education (fundamental learning principles have not changed), and that the Net Generation expects it (there is no evidence of pent-up demand). However, online learning does add real value by enhancing flexibility, control and analytics. Costs may also go down if disruptive innovations (e.g. low-cost, low-tech, but instructionally sound "good enough" online learning) supplant technically superior but more expensive online learning products. Cost-lowering strategies include focusing on core principles of learning rather than technologies, using easy-to-learn authoring tools, repurposing content (organizing and sequencing existing resources rather than creating new content) and using course templates. Online learning represents just one tool in an educator's toolbox, as does the MRI for clinicians. We need to use the right tool(s) for the right learner at the right dose, time and route.

  14. An efficient framework for Java data processing systems in HPC environments

    NASA Astrophysics Data System (ADS)

    Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül

    2011-11-01

    Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).

  15. Attitude estimation of earth orbiting satellites by decomposed linear recursive filters

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1975-01-01

    Attitude estimation of earth-orbiting satellites (including the Large Space Telescope) subjected to environmental disturbances and noise was investigated. Modern control and estimation theory is used as a tool to design an efficient estimator for attitude estimation. Decomposed linear recursive filters for both continuous-time and discrete-time systems are derived. Using this accurate estimate of a spacecraft's attitude, a state-variable feedback controller may be designed to satisfy stringent system-performance requirements.

  16. Design and in vivo evaluation of more efficient and selective deep brain stimulation electrodes

    NASA Astrophysics Data System (ADS)

    Howell, Bryan; Huynh, Brian; Grill, Warren M.

    2015-08-01

    Objective. Deep brain stimulation (DBS) is an effective treatment for movement disorders and a promising therapy for treating epilepsy and psychiatric disorders. Despite its clinical success, the efficiency and selectivity of DBS can be improved. Our objective was to design electrode geometries that increased the efficiency and selectivity of DBS. Approach. We coupled computational models of electrodes in brain tissue with cable models of axons of passage (AOPs), terminating axons (TAs), and local neurons (LNs); we used engineering optimization to design electrodes for stimulating these neural elements; and the model predictions were tested in vivo. Main results. Compared with the standard electrode used in the Medtronic Model 3387 and 3389 arrays, model-optimized electrodes consumed 45-84% less power. Similar gains in selectivity were evident with the optimized electrodes: 50% of parallel AOPs could be activated while reducing activation of perpendicular AOPs from 44-48% with the standard electrode to 0-14% with bipolar designs; 50% of perpendicular AOPs could be activated while reducing activation of parallel AOPs from 53-55% with the standard electrode to 1-5% with an array of cathodes; and 50% of TAs could be activated while reducing activation of AOPs from 43-100% with the standard electrode to 2-15% with a distal anode. In vivo, both the geometry and polarity of the electrode had a profound impact on the efficiency and selectivity of stimulation. Significance. Model-based design is a powerful tool that can be used to improve the efficiency and selectivity of DBS electrodes.

  17. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications.

    PubMed

    Christen, Matthias; Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remain a labor-intensive process. Given this complexity, computer-assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult-to-synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable for translating DNA designs into ready-to-order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner.
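
The fragmentation step described above can be illustrated with a toy sketch: splitting a long construct into roughly 1 kb synthesis blocks whose neighbours share a short overlap for assembly. The block size, overlap length and function name here are illustrative assumptions; the actual Genome Partitioner additionally scores sequence synthesizability and partitions hierarchically, which this sketch omits.

```python
# Hypothetical sketch of flat, single-level partitioning into overlapping
# synthesis blocks. Block and overlap sizes are illustrative only.

def partition(seq, block=1000, overlap=40):
    """Return (start, end) windows covering seq; adjacent windows share
    `overlap` bases so blocks can be joined by overlap-based assembly."""
    blocks = []
    start = 0
    while start < len(seq):
        end = min(start + block, len(seq))
        blocks.append((start, end))
        if end == len(seq):
            break
        start = end - overlap  # step back so neighbours overlap
    return blocks
```

For a 20 kb test segment this yields twenty-odd sub-kilobase-to-1 kb blocks, each sharing 40 bp with its neighbour; a real pipeline would also adjust block boundaries away from hard-to-synthesize motifs.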

  18. EUV focus sensor: design and modeling

    NASA Astrophysics Data System (ADS)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    2005-05-01

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3 NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths) and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  19. EUV Focus Sensor: Design and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Kenneth A.; Teyssier, Maureen E.; Liddle, J. Alexander

    We describe performance modeling and design optimization of a prototype EUV focus sensor (FS) designed for use with existing 0.3-NA EUV projection-lithography tools. At 0.3 NA and 13.5-nm wavelength, the depth of focus shrinks to 150 nm, increasing the importance of high-sensitivity focal-plane detection tools. The FS is a free-standing Ni grating structure that works in concert with a simple mask pattern of regular lines and spaces at constant pitch. The FS pitch matches that of the image-plane aerial-image intensity: it transmits the light with high efficiency when the grating is aligned with the aerial image laterally and longitudinally. Using a single-element photodetector to detect the transmitted flux, the FS is scanned laterally and longitudinally so the plane of peak aerial-image contrast can be found. The design under consideration has a fixed image-plane pitch of 80 nm, with aperture widths of 12-40 nm (1-3 wavelengths) and aspect ratios of 2-8. TEMPEST-3D is used to model the light transmission. Careful attention is paid to the annular, partially coherent, unpolarized illumination and to the annular pupil of the Micro-Exposure Tool (MET) optics for which the FS is designed. The system design balances the opposing needs of high sensitivity and high throughput, optimizing the signal-to-noise ratio in the measured intensity contrast.

  20. Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.

    PubMed

    Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M

    2012-04-01

    We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data. © 2012 Society for Applied Microbiology and Blackwell Publishing Ltd.

  1. Quality by design: optimization of a liquid filled pH-responsive macroparticles using Draper-Lin composite design.

    PubMed

    Rafati, Hasan; Talebpour, Zahra; Adlnasab, Laleh; Ebrahimi, Samad Nejad

    2009-07-01

    In this study, pH-responsive macroparticles incorporating peppermint oil (PO) were prepared using a simple emulsification/polymer precipitation technique. The formulations were examined for their properties and the desired quality was then achieved using a quality by design (QBD) approach. For this purpose, a Draper-Lin small composite design study was employed in order to investigate the effect of four independent variables, including the PO to water ratio, the concentration of pH-sensitive polymer (hydroxypropyl methylcellulose phthalate), and the acid and plasticizer concentrations, on the encapsulation efficiency and PO loading. The analysis of variance showed that the polymer concentration was the most important variable for encapsulation efficiency (p < 0.05). The multiple regression analysis of the results led to equations that adequately described the influence of the independent variables on the selected responses. Furthermore, the desirability function was employed as an effective tool for transforming each response separately and encompassing all of these responses in an overall desirability function for global optimization of the encapsulation process. The optimized macroparticles were predicted to yield 93.4% encapsulation efficiency and 72.8% PO loading, which were remarkably close to the experimental values of 89.2% and 69.5%, respectively.
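
The desirability-function step can be sketched as follows. This is a minimal Derringer-Suich-style illustration, assuming one-sided linear desirabilities and made-up target bounds; it is not the paper's actual transforms or data.

```python
# Sketch of a desirability analysis: map each response onto [0, 1], then
# combine them via a geometric mean. Bounds below are hypothetical.

def desirability_larger_is_better(y, y_min, y_max):
    """One-sided linear desirability: 0 at/below y_min, 1 at/above y_max."""
    if y <= y_min:
        return 0.0
    if y >= y_max:
        return 1.0
    return (y - y_min) / (y_max - y_min)

def overall_desirability(ds):
    """Geometric mean of individual desirabilities; any zero kills the score."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Illustrative run: encapsulation efficiency (%) and PO loading (%).
d_ee = desirability_larger_is_better(89.2, y_min=50.0, y_max=95.0)
d_load = desirability_larger_is_better(69.5, y_min=40.0, y_max=80.0)
D = overall_desirability([d_ee, d_load])
```

The geometric mean is the usual choice because a formulation that fails any single response (desirability 0) should score 0 overall, which an arithmetic mean would mask.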

  2. An energy and cost efficient majority-based RAM cell in quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Khosroshahy, Milad Bagherian; Moaiyeri, Mohammad Hossein; Navi, Keivan; Bagherzadeh, Nader

    Nanotechnologies, notably quantum-dot cellular automata (QCA), have attracted major attention for their prominent features compared with conventional CMOS circuitry. QCA, particularly owing to its considerable reduction in size, high switching speed and ultra-low energy consumption, is considered a potential alternative to CMOS technology. As the memory unit is one of the most essential components in a digital system, designing a well-optimized QCA random access memory (RAM) cell is an important area of research. In this paper, a new five-input majority gate is presented which is suitable for implementing efficient single-layer QCA circuits. In addition, a new RAM cell with set and reset capabilities is designed based on the proposed majority gate, which has an efficient and low-energy structure. The functionality, performance and energy consumption of the proposed designs are evaluated using the QCADesigner and QCAPro tools. According to the simulation results, the proposed RAM design achieves on average 38% lower total energy dissipation, 25% smaller area, 20% lower cell count, 28% lower delay and 60% lower QCA cost compared with its previous counterparts.
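
Logically, the five-input majority primitive on which such designs rest can be sketched as follows; this models only the Boolean function, not the QCA cell layout, clocking or energy behaviour, and the function names are ours.

```python
# Minimal sketch of the 5-input majority function used as a QCA building block.

def maj5(a, b, c, d, e):
    """Return 1 when three or more of the five binary inputs are 1."""
    return int(a + b + c + d + e >= 3)

# Majority gates subsume AND and OR: fixing two inputs to 0 turns maj5 into
# a 3-input AND, and fixing two inputs to 1 turns it into a 3-input OR.
def and3(a, b, c):
    return maj5(a, b, c, 0, 0)

def or3(a, b, c):
    return maj5(a, b, c, 1, 1)
```

This subsumption is why a single well-optimized five-input majority gate can reduce cell count across many derived circuits, such as the RAM cell above.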

  3. DEVELOPMENT OF DIAGNOSTIC ANALYTICAL AND MECHANICAL ABILITY TESTS THROUGH FACET DESIGN AND ANALYSIS.

    ERIC Educational Resources Information Center

    GUTTMAN, LOUIS,; SCHLESINGER, I.M.

    Methodology based on facet theory (modified set theory) was used in test construction and analysis to provide an efficient tool of evaluation for vocational guidance and vocational school use. The type of test development undertaken was limited to the use of nonverbal pictorial items. Items for testing ability to identify elements belonging to an…

  4. The Complete Toolkit for Building High-Performance Work Teams.

    ERIC Educational Resources Information Center

    Golden, Nancy; Gall, Joyce P.

    This workbook is designed for leaders and members of work teams in educational and social-service systems. It presents in a systematic fashion a set of tested facilitation tools that will allow teams to work more efficiently and harmoniously, enabling them to achieve their goals, to deal directly with both personal and work-related issues that…

  5. Spray Cooling Processes for Space Applications

    NASA Technical Reports Server (NTRS)

    Kizito, John P.; VanderWal, Randy L.; Berger, Gordon; Tryggvason, Gretar

    2004-01-01

    The present paper reports ongoing work to develop numerical and modeling tools for designing efficient and effective spray cooling processes and for determining the characteristic non-dimensional parametric dependences for practical fluids and conditions. In particular, we present data that delineate the conditions for controlling the impingement dynamics of droplets upon a heated substrate, germane to practical situations.

  6. Evaluation of Computerised Reading-Assistance Systems for Reading Japanese Texts--From a Linguistic Point of View

    ERIC Educational Resources Information Center

    Toyoda, Etsuko

    2016-01-01

    For second-language learners to effectively and efficiently gather information from online texts in their target language, a well-designed computerised system to assist their reading is essential. While many articles and websites which introduce electronic second-language learning tools exist, evaluation of their functions in relation to the…

  7. Marrying Two Existing Software Packages into an Efficient Online Tutoring Tool

    ERIC Educational Resources Information Center

    Byrne, Timothy

    2007-01-01

    Many teachers today use Learning Management Systems (LMS), several of which are open-source. Specific examples are Claroline and Moodle. However, they are not specifically designed for language learning, and hence not entirely suitable. In this article, I will compare two uses of the Claroline LMS available at Louvain-la-Neuve within the framework…

  8. Cognitive Readiness Assessment and Reporting: An Open Source Mobile Framework for Operational Decision Support and Performance Improvement

    ERIC Educational Resources Information Center

    Heric, Matthew; Carter, Jenn

    2011-01-01

    Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…

  9. Granting Teachers the "Benefit of the Doubt" in Performance Evaluations

    ERIC Educational Resources Information Center

    Rogge, Nicky

    2011-01-01

    Purpose: This paper proposes a benefit of the doubt (BoD) approach to construct and analyse teacher effectiveness scores (i.e. SET scores). Design/methodology/approach: The BoD approach is related to data envelopment analysis (DEA), a linear programming tool for evaluating the relative efficiency performance of a set of similar units (e.g. firms,…

  10. Disrupted rhythms and mobile ICT in a surgical department.

    PubMed

    Hasvold, Per Erlend; Scholl, Jeremiah

    2011-08-01

    This paper presents a study of mobile information and communication technology (ICT) for healthcare professionals in a surgical ward. The purpose of the study was to use a participatory design process to investigate factors that affect the acceptance of mobile ICT in a surgical ward. Observations, interviews, a participatory design process, and pilot testing of a prototype of a co-constructed application were used. Informal rhythms existed at the department that ensured people met and interacted several times throughout the day. These gatherings allowed for opportunistic encounters that were extensively used for dialogue, problem solving, coordination, and message and logistics handling. A prototype based on handheld mobile computers was introduced. The tool supported information-seeking functionality that previously required local mobility. By making the nurses more freely mobile, the tool disrupted these informal rhythms. This created dissatisfaction with the system and led to discussion and the introduction of other arenas to solve coordination and other problems. Mobile ICT tools may break down informal communication and coordination structures. This may reduce the efficiency of the new tools, or contribute to resistance towards such systems. In some situations, however, such "disrupted rhythms" may be overcome by including additional sociotechnical mechanisms in the overall design to counteract this negative side effect. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  11. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  12. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE PAGES

    Theiler, James Patrick; Korber, Bette Tina Marie

    2017-01-29

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
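
The path formulation above can be illustrated with a toy sketch, under loud simplifying assumptions: potential epitopes are plain k-mers, the k-mer overlap graph is assumed acyclic, and a single antigen is taken as the maximum-weight path with each k-mer weighted by its population frequency. The function name is ours and none of this code comes from the epigraph suite, which handles cyclic graphs and multi-antigen complementarity.

```python
# Toy epigraph-style antigen design: nodes are k-mers weighted by population
# frequency, edges join k-mers overlapping by k-1 residues, and the antigen
# is read off the maximum-weight path (valid only for acyclic graphs).
from collections import Counter, defaultdict

def best_antigen(seqs, k=3):
    freq = Counter(s[i:i + k] for s in seqs for i in range(len(s) - k + 1))
    edges = defaultdict(list)
    for a in freq:
        for b in freq:
            if a[1:] == b[:-1]:
                edges[a].append(b)

    memo = {}
    def score(node):  # best total frequency of any path starting at node
        if node not in memo:
            memo[node] = freq[node] + max((score(n) for n in edges[node]), default=0)
        return memo[node]

    path = [max(freq, key=score)]          # best starting k-mer
    while edges[path[-1]]:
        path.append(max(edges[path[-1]], key=score))  # follow best successor
    return path[0] + "".join(p[-1] for p in path[1:])
```

Here the dynamic program scores each node by the best path it can start, so greedily following the highest-scoring successor reconstructs the optimal path; the real algorithm must additionally break or approximate cycles, which the paper's heuristics address.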

  13. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James Patrick; Korber, Bette Tina Marie

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.

  14. Research and Design on a Product Data Definition System of Semiconductor Packaging Industry

    NASA Astrophysics Data System (ADS)

    Shi, Jinfei; Ma, Qingyao; Zhou, Yifan; Chen, Ruwen

    2017-12-01

    This paper develops a product data definition (PDD) system with independent intellectual property rights for a semiconductor packaging and testing company. The new PDD system addresses problems such as the effective control of production plans, the timely feedback of production processes, and the efficient scheduling of resources. Firstly, this paper introduces the general requirements of the PDD system and depicts the operation flow and the data flow of the PDD system. Secondly, the overall design scheme of the PDD system is put forward. After that, the physical data model is developed using the PowerDesigner 15.0 tool, and the database system is built. Finally, the function realization and running effects of the PDD system are analysed. The successful operation of the PDD system realizes the information flow among the various production departments of the enterprise, meeting the standard of enterprise manufacturing integration and improving the efficiency of production management.

  15. A novel reversible logic gate and its systematic approach to implement cost-efficient arithmetic logic circuits using QCA.

    PubMed

    Ahmad, Peer Zahoor; Quadri, S M K; Ahmad, Firdous; Bahar, Ali Newaz; Wani, Ghulam Mohammad; Tantary, Shafiq Maqbool

    2017-12-01

    Quantum-dot cellular automata (QCA) is an extremely small-scale, ultra-low-power nanotechnology and a possible alternative to current CMOS technology. Reversible QCA logic is at present the most important approach to reducing power losses. This paper presents a novel reversible logic gate called the F-Gate. It is simple in design and a powerful technique for implementing reversible logic. A systematic approach has been used to implement a novel single-layer reversible Full-Adder, Full-Subtractor and a Full Adder-Subtractor using the F-Gate. The proposed Full Adder-Subtractor achieves significant improvements in overall circuit parameters among the previously most cost-efficient designs that exploit the inevitable nano-level issues to perform arithmetic computing. The proposed designs have been verified and simulated using the QCADesigner tool ver. 2.0.3.

  16. An automated performance budget estimator: a process for use in instrumentation

    NASA Astrophysics Data System (ADS)

    Laporte, Philippe; Schnetler, Hermine; Rees, Phil

    2016-08-01

    Current-day astronomy projects continue to increase in size and complexity, regardless of the wavelength domain, while risks in terms of safety, cost and operability have to be reduced to ensure an affordable total cost of ownership. All of these drivers have to be considered carefully during the development process of an astronomy project, at the same time as there is a strong drive to shorten the development life-cycle. From the systems engineering point of view, this evolution is a significant challenge. Big instruments imply managing interfaces within large consortia and dealing with tight design-phase schedules, which necessitate efficient and rapid interactions between all the stakeholders, firstly to ensure that the system is defined correctly and secondly that the designs will meet all the requirements. It is essential that team members respond quickly so that the time available for the design team is maximised. In this context, performance prediction tools can be very helpful during the concept phase of a project to help select the best design solution. In the first section of this paper we present the development of such a prediction tool, which the systems engineer can use to determine the overall performance of the system and to evaluate the impact on the science of the proposed design. The tool can also be used in "what-if" design analyses to assess the impact on the overall performance of the system, based on the numbers calculated by the automated system performance prediction tool. Having such a tool available from the beginning of a project allows for a faster turn-around, firstly between the design engineers and the systems engineer and secondly between the systems engineer and the instrument scientist.
    Following the first section, we describe the process for constructing a performance estimator tool and then describe three astronomy projects in which such a tool has been utilised. The three use cases are: EAGLE, one of the European Extremely Large Telescope (E-ELT) Multi-Object Spectrograph (MOS) instruments, studied from 2007 to 2009; the Multi-Object Optical and Near-Infrared Spectrograph (MOONS) for the European Southern Observatory's Very Large Telescope (VLT), currently under development; and SST-GATE.
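
As a hedged illustration of the kind of roll-up such an estimator automates, end-to-end throughput can be budgeted as the product of subsystem contributions. The subsystem names and values below are placeholders, not figures from EAGLE, MOONS or SST-GATE.

```python
# Illustrative throughput budget: each entry is a subsystem efficiency in
# [0, 1], and the end-to-end figure is their product. Values are placeholders.

budget = {
    "telescope": 0.85,
    "fore_optics": 0.90,
    "spectrograph": 0.60,
    "detector_qe": 0.80,
}

def end_to_end(terms):
    """Multiply subsystem efficiencies into a single end-to-end throughput."""
    total = 1.0
    for value in terms.values():
        total *= value
    return total

print(f"end-to-end throughput: {end_to_end(budget):.3f}")
```

A "what-if" analysis then amounts to perturbing one entry and recomputing the product, which is exactly the fast design-engineer/systems-engineer turn-around the abstract describes.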

  17. Micro-Spec: An Ultracompact, High-sensitivity Spectrometer for Far-Infrared and Submillimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Hsieh, Wen-Ting; Huang, Wei-Chung; Moseley, S. Harvey; Stevenson, Thomas R.; Wollack, Edward J.

    2014-01-01

    High-performance, integrated spectrometers operating in the far-infrared and submillimeter ranges promise to be powerful tools for the exploration of the epochs of reionization and initial galaxy formation. These devices, using high-efficiency superconducting transmission lines, can achieve the performance of a meter-scale grating spectrometer in an instrument implemented on a 4 inch silicon wafer. Such a device, when combined with a cryogenic telescope in space, provides an enabling capability for studies of the early universe. Here, the optical design process for Micro-Spec (μ-Spec) is presented, with particular attention given to its two-dimensional diffractive region, where the light of different wavelengths is focused on the different detectors. The method is based on the stigmatization and minimization of the light path function in this bounded region, which results in an optimized geometrical configuration. A point design with an efficiency of approximately 90% has been developed for initial demonstration and can serve as the basis for future instruments. Design variations on this implementation are also discussed, which can lead to lower efficiencies due to diffractive losses in the multimode region.
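    The stigmatization and light-path-function minimization described above can be illustrated with a toy calculation. The sketch below is not the Micro-Spec design code; it assumes a simplified 2-D geometry with hypothetical `entrance`, `facets`, and `detector` coordinates, and merely quantifies how far successive facet path lengths deviate from the constant one-wavelength increment that a stigmatic grating design drives toward:

```python
import math

def path_length(facet, entrance, detector):
    """Geometric path entrance -> facet -> detector in a 2-D plane."""
    leg1 = math.hypot(facet[0] - entrance[0], facet[1] - entrance[1])
    leg2 = math.hypot(detector[0] - facet[0], detector[1] - facet[1])
    return leg1 + leg2

def order_error(facets, entrance, detector, wavelength):
    """RMS deviation of successive path-length differences from one
    wavelength -- the quantity a stigmatic design drives toward zero."""
    paths = [path_length(f, entrance, detector) for f in facets]
    diffs = [b - a for a, b in zip(paths, paths[1:])]
    return math.sqrt(sum((d - wavelength) ** 2 for d in diffs) / len(diffs))

# Facets placed so each successive round-trip path grows by exactly
# one wavelength (entrance and detector collocated for simplicity):
facets = [(1.0, 0.0), (1.25, 0.0), (1.5, 0.0)]
err = order_error(facets, (0.0, 0.0), (0.0, 0.0), 0.5)
```

    A real design would vary the facet positions to minimize this error across the full wavelength band simultaneously.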

  18. Micro-Spec: An Ultra-Compact, High-Sensitivity Spectrometer for Far-Infrared and Sub-Millimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Hsieh, Wen-Ting; Huang, Wei-Chung; Moseley, S. Harvey; Stevenson, Thomas R.; Wollack, Edward J.

    2013-01-01

    High-performance, integrated spectrometers operating in the far-infrared and sub-millimeter promise to be powerful tools for the exploration of the epochs of reionization and initial galaxy formation. These devices, using high-efficiency superconducting transmission lines, can achieve the performance of a meter-scale grating spectrometer in an instrument implemented on a four-inch silicon wafer. Such a device, when combined with a cryogenic telescope in space, provides an enabling capability for studies of the early universe. Here, the optical design process for Micro-Spec (μ-Spec) is presented, with particular attention given to its two-dimensional diffractive region, where the light of different wavelengths is focused on the different detectors. The method is based on the stigmatization and minimization of the light path function in this bounded region, which results in an optimized geometrical configuration. A point design with an efficiency of approx. 90% has been developed for initial demonstration, and can serve as the basis for future instruments. Design variations on this implementation are also discussed, which can lead to lower efficiencies due to diffractive losses in the multimode region.

  19. Advantages of Crystallographic Fragment Screening: Functional and Mechanistic Insights from a Powerful Platform for Efficient Drug Discovery

    PubMed Central

    Patel, Disha; Bauman, Joseph D.; Arnold, Eddy

    2015-01-01

    X-ray crystallography has been an under-appreciated screening tool for fragment-based drug discovery due to the perception of low throughput and technical difficulty. Investigators in industry and academia have overcome these challenges by taking advantage of key factors that contribute to a successful crystallographic screening campaign. Efficient cocktail design and soaking methodologies have evolved to maximize throughput while minimizing false positives/negatives. In addition, technical improvements at synchrotron beamlines have dramatically increased data collection rates thus enabling screening on a timescale comparable to other techniques. The combination of available resources and efficient experimental design has resulted in many successful crystallographic screening campaigns. The three-dimensional crystal structure of the bound fragment complexed to its target, a direct result of the screening effort, enables structure-based drug design while revealing insights regarding protein dynamics and function not readily obtained through other experimental approaches. Furthermore, this “chemical interrogation” of the target protein crystals can lead to the identification of useful reagents for improving diffraction resolution or compound solubility. PMID:25117499

  20. Advantages of crystallographic fragment screening: functional and mechanistic insights from a powerful platform for efficient drug discovery.

    PubMed

    Patel, Disha; Bauman, Joseph D; Arnold, Eddy

    2014-01-01

    X-ray crystallography has been an under-appreciated screening tool for fragment-based drug discovery due to the perception of low throughput and technical difficulty. Investigators in industry and academia have overcome these challenges by taking advantage of key factors that contribute to a successful crystallographic screening campaign. Efficient cocktail design and soaking methodologies have evolved to maximize throughput while minimizing false positives/negatives. In addition, technical improvements at synchrotron beamlines have dramatically increased data collection rates thus enabling screening on a timescale comparable to other techniques. The combination of available resources and efficient experimental design has resulted in many successful crystallographic screening campaigns. The three-dimensional crystal structure of the bound fragment complexed to its target, a direct result of the screening effort, enables structure-based drug design while revealing insights regarding protein dynamics and function not readily obtained through other experimental approaches. Furthermore, this "chemical interrogation" of the target protein crystals can lead to the identification of useful reagents for improving diffraction resolution or compound solubility. Copyright © 2014. Published by Elsevier Ltd.

  1. Micro-Spec: an ultracompact, high-sensitivity spectrometer for far-infrared and submillimeter astronomy.

    PubMed

    Cataldo, Giuseppe; Hsieh, Wen-Ting; Huang, Wei-Chung; Moseley, S Harvey; Stevenson, Thomas R; Wollack, Edward J

    2014-02-20

    High-performance, integrated spectrometers operating in the far-infrared and submillimeter ranges promise to be powerful tools for the exploration of the epochs of reionization and initial galaxy formation. These devices, using high-efficiency superconducting transmission lines, can achieve the performance of a meter-scale grating spectrometer in an instrument implemented on a 4 inch silicon wafer. Such a device, when combined with a cryogenic telescope in space, provides an enabling capability for studies of the early universe. Here, the optical design process for Micro-Spec (μ-Spec) is presented, with particular attention given to its two-dimensional diffractive region, where the light of different wavelengths is focused on the different detectors. The method is based on the stigmatization and minimization of the light path function in this bounded region, which results in an optimized geometrical configuration. A point design with an efficiency of ~90% has been developed for initial demonstration and can serve as the basis for future instruments. Design variations on this implementation are also discussed, which can lead to lower efficiencies due to diffractive losses in the multimode region.

  2. Insect transformation with piggyBac: getting the number of injections just right

    PubMed Central

    Morrison, N. I.; Shimeld, S. M.

    2016-01-01

    The insertion of exogenous genetic cargo into insects using transposable elements is a powerful research tool with potential applications in meeting food security and public health challenges facing humanity. piggyBac is the transposable element most commonly utilized for insect germline transformation. The reported efficiency of this process varies across the published literature, and a comprehensive review of transformation efficiency in insects is lacking. This study compared and contrasted all available published data with a comprehensive data set provided by a biotechnology group specializing in insect transformation. Based on analysis of these data, with particular focus on the more complete observational data from the biotechnology group, we designed a decision tool to aid researchers' decision-making when using piggyBac to transform insects by microinjection. A combination of statistical techniques was used to define appropriate summary statistics of piggyBac transformation efficiency by species and insect order. Publication bias was assessed by comparing the data sets, using strategies co-opted from the medical literature. The work culminated in building the Goldilocks decision tool, a Markov chain Monte Carlo simulation operated via a graphical interface and providing guidance on best practice for those seeking to transform insects using piggyBac. PMID:27027400
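    The Goldilocks tool itself is not reproduced here, but the kind of question it answers can be sketched with a minimal beta-binomial Monte Carlo simulation using only the Python standard library. The function name, the flat Beta(1, 1) prior, and all parameter values are illustrative assumptions, not details from the paper:

```python
import random

def injections_needed(successes, trials, target=1, confidence=0.95,
                      n_samples=2000, seed=1):
    """Smallest number of injections n such that, averaging over a
    Beta(successes + 1, failures + 1) posterior for the per-injection
    transformation probability, P(at least `target` transformants)
    meets the requested confidence. Pure Monte Carlo, stdlib only."""
    rng = random.Random(seed)
    for n in range(1, 1000):
        hits = 0
        for _ in range(n_samples):
            # Draw a plausible per-injection success probability...
            p = rng.betavariate(successes + 1, trials - successes + 1)
            # ...then simulate n injections under that probability.
            transformants = sum(rng.random() < p for _ in range(n))
            if transformants >= target:
                hits += 1
        if hits / n_samples >= confidence:
            return n
    return None
```

    With a fixed seed the estimate is reproducible, and demanding higher confidence can only increase the recommended number of injections.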

  3. Force feedback requirements for efficient laparoscopic grasp control.

    PubMed

    Westebring-van der Putten, Eleonora P; van den Dobbelsteen, John J; Goossens, Richard H M; Jakimowicz, Jack J; Dankelman, Jenny

    2009-09-01

    During laparoscopic grasping, tissue damage may occur due to the use of excessive grasp forces and tissue slippage, whereas in barehanded grasping, humans control their grasp to prevent slippage and excessive force (a safe grasp). This study investigates the differences in grasp control between barehanded and laparoscopic lifts. Ten novices performed lifts in order to compare pinch forces under four conditions: barehanded; using tweezers; using a low-efficiency grasper; and using a high-efficiency grasper. Results showed that participants increased their pinch force significantly later during a barehanded lift (at a pull-force level of 2.63 N) than when lifting laparoscopically (from pull-force levels of 0.77 to 1.08 N). In barehanded lifts all participants could accomplish a safe grasp, whereas in laparoscopic lifts excessive force (up to 7.9 N) and slippage (in up to 38% of the trials) occurred frequently. For novices, it can be concluded that force feedback (additional to that at the hand-tool interface), as in skin-tissue contact, is a prerequisite for maintaining a safe grasp. Much is known about grasp control during barehanded object manipulation, especially the adaptation of pinch forces to changing loads, whereas little is known about force perception and grasp control during tool usage. This knowledge is a prerequisite for the ergonomic design of tools that are used to manipulate objects.

  4. Langley Ground Facilities and Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Kegelman, Jerome T.; Kilgore, William A.

    2010-01-01

    A strategic approach for retaining and more efficiently operating the essential Langley Ground Testing Facilities in the 21st Century is presented. This effort takes advantage of previously completed and ongoing studies at the Agency and national levels. This integrated approach takes into consideration the overall decline in the test business base within the nation and reduced utilization of each of the Langley facilities with capabilities to test in the subsonic, transonic, supersonic, and hypersonic speed regimes. The strategy accounts for capability needs to meet the Agency programmatic requirements and strategic goals and to execute test activities in the most efficient and flexible facility operating structure. The structure currently being implemented at Langley offers agility to right-size our capability and capacity from a national perspective and to accommodate the dynamic nature of testing needs, and it will address the influence of existing and emerging analytical tools for design. The paradigm for testing in the retained facilities is to efficiently and reliably provide more accurate and high-quality test results at an affordable cost, to support design information needs for flight regimes where computational capability is not adequate, and to verify and validate existing and emerging computational tools. Each of these goals is planned to be achieved while keeping in mind the growing small-industry customer base engaged in developing unpiloted aerial vehicles and commercial space transportation systems.

  5. Research Needs and Impacts in Predictive Simulation for Internal Combustion Engines (PreSICE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckerle, Wayne; Rutland, Chris; Rohlfing, Eric

    This report is based on a SC/EERE Workshop to Identify Research Needs and Impacts in Predictive Simulation for Internal Combustion Engines (PreSICE), held March 3, 2011, to determine strategic focus areas that will accelerate innovation in engine design to meet national goals in transportation efficiency. The U.S. has reached a pivotal moment when pressures of energy security, climate change, and economic competitiveness converge. Oil prices remain volatile and have exceeded $100 per barrel twice in five years. At these prices, the U.S. spends $1 billion per day on imported oil to meet our energy demands. Because the transportation sector accounts for two-thirds of our petroleum use, energy security is deeply entangled with our transportation needs. At the same time, transportation produces one-quarter of the nation’s carbon dioxide output. Increasing the efficiency of internal combustion engines is a technologically proven and cost-effective approach to dramatically improving the fuel economy of the nation’s fleet of vehicles in the near- to mid-term, with the corresponding benefits of reducing our dependence on foreign oil and reducing carbon emissions. Because of their relatively low cost, high performance, and ability to utilize renewable fuels, internal combustion engines—including those in hybrid vehicles—will continue to be critical to our transportation infrastructure for decades. Achievable advances in engine technology can improve the fuel economy of automobiles by over 50% and trucks by over 30%. Achieving these goals will require the transportation sector to compress its product development cycle for cleaner, more efficient engine technologies by 50% while simultaneously exploring innovative design space. Concurrently, fuels will also be evolving, adding another layer of complexity and further highlighting the need for efficient product development cycles.
Current design processes, using “build and test” prototype engineering, will not suffice. Current market penetration of new engine technologies is simply too slow—it must be dramatically accelerated. These challenges present a unique opportunity to marshal U.S. leadership in science-based simulation to develop predictive computational design tools for use by the transportation industry. The use of predictive simulation tools for enhancing combustion engine performance will shrink engine development timescales, accelerate time to market, and reduce development costs, while ensuring the timely achievement of energy security and emissions targets and enhancing U.S. industrial competitiveness. In 2007 Cummins achieved a milestone in engine design by bringing a diesel engine to market solely with computer modeling and analysis tools. The only testing was after the fact to confirm performance. Cummins achieved a reduction in development time and cost. As important, they realized a more robust design, improved fuel economy, and met all environmental and customer constraints. This important first step demonstrates the potential for computational engine design. But, the daunting complexity of engine combustion and the revolutionary increases in efficiency needed require the development of simulation codes and computation platforms far more advanced than those available today. Based on these needs, a Workshop to Identify Research Needs and Impacts in Predictive Simulation for Internal Combustion Engines (PreSICE) convened over 60 U.S. leaders in the engine combustion field from industry, academia, and national laboratories to focus on two critical areas of advanced simulation, as identified by the U.S. automotive and engine industries. First, modern engines require precise control of the injection of a broad variety of fuels that is far more subtle than achievable to date and that can be obtained only through predictive modeling and simulation. 
Second, the simulation, understanding, and control of these stochastic in-cylinder combustion processes lie on the critical path to realizing more efficient engines with greater power density. Fuel sprays set the initial conditions for combustion in essentially all future transportation engines; yet today designers primarily use empirical methods that limit the efficiency achievable. Three primary spray topics were identified as focus areas in the workshop: the fuel delivery system, which includes fuel manifolds and internal injector flow; the multi-phase fuel–air mixing in the combustion chamber of the engine; and the heat transfer and fluid interactions with cylinder walls. Current understanding and modeling capability of stochastic processes in engines remains limited and prevents designers from achieving significantly higher fuel economy. To improve this situation, the workshop participants identified three focus areas for stochastic processes: improve fundamental understanding that will help to establish and characterize the physical causes of stochastic events; develop physics-based simulation models that are accurate and sensitive enough to capture performance-limiting variability; and quantify and manage uncertainty in model parameters and boundary conditions. Improved models and understanding in these areas will allow designers to develop engines with reduced design margins, operating reliably in more efficient regimes. All of these areas require improved basic understanding, high-fidelity model development, and rigorous model validation. These advances will greatly reduce the uncertainties in current models and improve understanding of sprays and fuel–air mixture preparation that limit the investigation and development of advanced combustion technologies. The two strategic focus areas have distinctive characteristics but are inherently coupled.
Coordinated activities in basic experiments, fundamental simulations, and engineering-level model development and validation can be used to successfully address all of the topics identified in the PreSICE workshop. The outcomes will be: new and deeper understanding of the relevant fundamental physical and chemical processes in advanced combustion technologies; implementation of this understanding in models and simulation tools appropriate for both exploration and design; and sufficient validation, with uncertainty quantification, to provide confidence in the simulation results. These outcomes will provide the design tools for industry to reduce development time by up to 30% and improve engine efficiencies by 30% to 50%. The improved efficiencies, applied to the national mix of transportation applications, have the potential to save over 5 million barrels of oil per day, a current cost savings of $500 million per day.

  6. Computer Graphics-aided systems analysis: application to well completion design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.E.; Sarma, M.P.

    1985-03-01

    The development of an engineering tool (in the form of a computer model) for solving design and analysis problems related to oil and gas well production operations is discussed. The development of the method is based on integrating the concepts of "Systems Analysis" with the techniques of "Computer Graphics". The concepts behind the method are very general in nature. This paper, however, illustrates the application of the method in solving gas well completion design problems. The use of the method will save time and improve the efficiency of such design and analysis work. The method can be extended to other design and analysis aspects of oil and gas wells.

  7. Application of BIM technology in green scientific research office building

    NASA Astrophysics Data System (ADS)

    Ni, Xin; Sun, Jianhua; Wang, Bo

    2017-05-01

    BIM technology, as a form of information technology, has gradually been adopted in the domestic building industry along with the advancement of building industrialization. Based on a well-constructed BIM model and a BIM technology platform, collaborative design tools can effectively improve design efficiency and design quality. In the scientific research office building project of Vanda Northwest Engineering Design and Research Institute Co., Ltd., BIM technology was applied in combination with the practical situation of the project: a building energy model (BEM) was formed from the BIM model combined with related information, the application of BIM technology in the construction management stage was explored, and the direct experience and achievements gained in the architectural design part are summarized.

  8. Design and Fabrication of Graphene Reinforced Polymer Conductive Patch-Based Inset Fed Microstrip Antenna

    NASA Astrophysics Data System (ADS)

    Deepak, A.; Kannan, P. Muthu; Shankar, P.

    This work explores the design and fabrication of a graphene-reinforced polyvinylidene fluoride (PVDF) patch-based microstrip antenna. First, the antenna was designed at a frequency of 6 GHz and simulation results were obtained using the Ansoft HFSS tool. The antenna was then fabricated with graphene-PVDF films as the conducting patch deposited on a bakelite substrate, with copper as the ground plane. The graphene-PVDF films were prepared using a solvent casting process. The radiation efficiency of the fabricated microstrip patch antenna was 48%, qualifying it as a practically functional antenna. The simulated and measured results were compared and analyzed.
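    For readers wanting to reproduce a starting-point geometry, the standard transmission-line design equations for a rectangular patch (found in common antenna textbooks) can be sketched as below. This is not the authors' procedure; the 6 GHz frequency comes from the abstract, while the substrate permittivity and height are illustrative assumptions:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def patch_dimensions(f0, eps_r, h):
    """Width and length of a rectangular microstrip patch from the
    standard transmission-line design equations; f0 in Hz, substrate
    height h in metres. Returns (W, L) in metres."""
    # Patch width for efficient radiation:
    W = C / (2 * f0) * math.sqrt(2.0 / (eps_r + 1.0))
    # Effective permittivity accounting for fringing fields:
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h / W)
    # Fringing-field length extension at each radiating edge:
    dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / \
         ((eps_eff - 0.258) * (W / h + 0.8))
    # Physical length: a half guided wavelength minus both extensions.
    L = C / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL
    return W, L

# Illustrative 6 GHz design on an assumed eps_r = 4.4, 1.6 mm substrate:
W, L = patch_dimensions(6e9, 4.4, 1.6e-3)
```

    For these assumed substrate values the patch comes out in the low tens of millimetres, a sanity check one can compare against any full-wave solver result.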

  9. Design, Specification and Construction of Specialized Measurement System in the Experimental Building

    NASA Astrophysics Data System (ADS)

    Fedorczak-Cisak, Malgorzata; Kwasnowski, Pawel; Furtak, Marcin; Hayduk, Grzegorz

    2017-10-01

    Experimental buildings for “in situ” research are a very important tool for collecting data on the energy efficiency of energy-saving technologies. One of the most advanced buildings of this type in Poland is the Malopolska Laboratory of Energy-saving Buildings at Cracow University of Technology. The building itself is used by scientists as both a research object and a research tool to test energy-saving technologies. It is equipped with a specialized measuring system consisting of approximately 3,000 different sensors distributed throughout the technical installations, the structural elements of the building (walls, ceilings, cornices) and the ground. The authors of the paper present the innovative design and technology of this specialized instrumentation and discuss issues arising during the implementation and use of the building.

  10. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

    There is increased interest in understanding and mitigating the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse climate effects. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions, and contrails. To ensure that these new models are beneficial to the larger climate research community, their outputs are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.
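    A flavor of the emissions bookkeeping such models layer on top of a fuel-flow simulation can be sketched as follows. The CO2 emission index of roughly 3.16 kg per kg of jet fuel burned is a widely published approximation; the NOx index varies strongly with engine and thrust setting, so it is left as a caller-supplied assumption. This is an illustration, not the actual models described above:

```python
# Standard approximation: kerosene combustion yields about 3.16 kg of
# CO2 per kg of fuel burned, independent of engine type.
EI_CO2 = 3.16

def co2_from_fuel(fuel_burn_kg):
    """CO2 mass (kg) emitted for a given fuel burn (kg)."""
    return fuel_burn_kg * EI_CO2

def nox_from_fuel(fuel_burn_kg, ei_nox_g_per_kg):
    """NOx mass (kg) emitted; the emission index (g NOx per kg fuel)
    is engine- and thrust-dependent, so it is an explicit input."""
    return fuel_burn_kg * ei_nox_g_per_kg / 1000.0
```

    A trajectory-level model would integrate fuel flow along the flight path first, then apply indices of this kind segment by segment.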

  11. Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM

    NASA Technical Reports Server (NTRS)

    Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip

    2017-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.
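    The many sub-image registrations IPATS performs are not specified in detail in the abstract, but the core idea can be sketched as a brute-force integer-pixel registration using sum-of-squared-differences matching. The data and function names below are hypothetical; a production tool would add sub-pixel interpolation and a much faster search:

```python
def ssd(ref, img, dy, dx):
    """Mean squared difference between ref and img shifted by (dy, dx),
    computed over the overlapping region only."""
    h, w = len(ref), len(ref[0])
    total, count = 0.0, 0
    for y in range(h):
        for x in range(w):
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:
                d = ref[y][x] - img[yy][xx]
                total += d * d
                count += 1
    return total / count

def best_shift(ref, img, max_shift=3):
    """Exhaustive search for the integer shift of img that best matches ref."""
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = ssd(ref, img, dy, dx)
            if best is None or score < best[0]:
                best = (score, dy, dx)
    return best[1], best[2]
```

    Applied to a reference tile and the corresponding tile from a later image, the recovered shift is one sample of the navigation or registration error that INR metrics aggregate.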

  12. Design of a final approach spacing tool for TRACON air traffic control

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh

    1989-01-01

    This paper describes an automation tool that assists air traffic controllers in the Terminal Radar Approach Control (TRACON) Facilities in providing safe and efficient sequencing and spacing of arrival traffic. The automation tool, referred to as the Final Approach Spacing Tool (FAST), allows the controller to interactively choose various levels of automation and advisory information ranging from predicted time errors to speed and heading advisories for controlling time error. FAST also uses a timeline to display current scheduling and sequencing information for all aircraft in the TRACON airspace. FAST combines accurate predictive algorithms and state-of-the-art mouse and graphical interface technology to present advisory information to the controller. Furthermore, FAST exchanges various types of traffic information and communicates with automation tools being developed for the Air Route Traffic Control Center. Thus it is part of an integrated traffic management system for arrival traffic at major terminal areas.
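    As a toy illustration of turning a predicted time error into an advisory (not the actual FAST algorithms; the speed envelope below is an assumed one), consider:

```python
def speed_advisory(distance_nm, time_to_sta_hr, v_min=160.0, v_max=250.0):
    """Ground speed (knots) needed to cover distance_nm in exactly
    time_to_sta_hr, clamped to an assumed operational envelope."""
    required = distance_nm / time_to_sta_hr
    return min(max(required, v_min), v_max)

def time_error_hr(distance_nm, speed_kt, time_to_sta_hr):
    """Predicted arrival-time error: positive means the aircraft is late."""
    return distance_nm / speed_kt - time_to_sta_hr
```

    When the required speed falls outside the envelope, the residual time error would instead have to be absorbed by heading (path-stretching) advisories, which FAST also provides.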

  13. Usability in product design--the importance and need for systematic assessment models in product development--Usa-Design Model (U-D) ©.

    PubMed

    Merino, Giselle Schmidt Alves Díaz; Teixeira, Clarissa Stefani; Schoenardie, Rodrigo Petry; Merino, Eugenio Andrés Diáz; Gontijo, Leila Amaral

    2012-01-01

    In product design, human factors are considered an element of differentiation, given that today's consumer demands are increasing. Safety, wellbeing, satisfaction, health, effectiveness, efficiency, and other aspects must be effectively incorporated into the product development process. This work proposes a usability assessment model that can be incorporated as an assessment tool. The methodological approach is settled in two stages. First, a literature review focuses specifically on usability and on developing user-centred products. A usability model named Usa-Design (U-D©) is then presented, consisting of four phases: understanding the use context; preliminary usability assessment (efficiency/effectiveness/satisfaction); assessment of usability principles; and results. U-D© features are modular and flexible, allowing the principles used in Phase 3 to be changed according to the needs and scenario of each situation. With qualitative/quantitative measurement scales that are easy to understand and apply, the model's results are viable and applicable throughout the product development process.

  14. Multidisciplinary Multiobjective Optimal Design for Turbomachinery Using Evolutionary Algorithm

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This report summarizes Dr. Lian's efforts toward developing a robust and efficient tool for multidisciplinary, multi-objective optimal design for turbomachinery using evolutionary algorithms. The work consisted of two stages. In the first stage (July 2003 to June 2004), Dr. Lian focused on building the essential capabilities required for the project, working on two subjects: an enhanced genetic algorithm (GA) and an integrated optimization system combining a GA with a surrogate model. In the second stage (July 2004 to February 2005), Dr. Lian formulated aerodynamic optimization and structural optimization as a multi-objective optimization problem and performed multidisciplinary, multi-objective optimizations of a transonic compressor blade based on the proposed model. Dr. Lian's numerical results showed that the proposed approach can effectively reduce the blade weight and increase the stage pressure ratio in an efficient manner. In addition, the new design was structurally safer than the original design. Five conference papers and three journal papers were published on this topic by Dr. Lian.
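    The selection step at the heart of such multi-objective evolutionary methods is Pareto (non-dominated) filtering, which can be sketched in a few lines. The example objective pairs are hypothetical, standing in for (blade weight, negated stage pressure ratio), both minimized:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated subset -- the candidate set a multi-objective GA
    carries forward each generation."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (weight, -pressure_ratio) pairs for five candidate blades:
designs = [(1.0, 5.0), (2.0, 4.0), (3.0, 3.0), (4.0, 6.0), (2.0, 2.0)]
front = pareto_front(designs)
```

    Full algorithms such as NSGA-II add ranking and crowding-distance tie-breaking on top of this dominance test, and a surrogate model can stand in for the expensive aero-structural evaluation of each candidate.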

  15. A method of transmissibility design for dual-chamber pneumatic vibration isolator

    NASA Astrophysics Data System (ADS)

    Lee, Jeung-Hoon; Kim, Kwang-Joon

    2009-06-01

    Dual-chamber pneumatic vibration isolators have a wide range of applications in the vibration isolation of vibration-sensitive equipment. Recent advances in precision machine tools and instruments, such as medical devices and those related to nanotechnology, require better isolation performance, which can be achieved efficiently through precise modeling and design of the isolation system. This paper discusses an efficient transmissibility design method for a pneumatic vibration isolator, in which a complex stiffness model of a dual-chamber pneumatic spring developed in our previous study is employed. Three design parameters, the volume ratio between the two pneumatic chambers, the geometry of the capillary tube connecting the two chambers, and the stiffness of the diaphragm employed to prevent air leakage, were found to be important factors in transmissibility design. Based on a design technique that maximizes damping of the dual-chamber pneumatic spring, trade-offs among the resonance frequency of transmissibility, peak transmissibility, and transmissibility in the high-frequency range were found, which had not been reported in previous research. Furthermore, this paper discusses the negative role of the diaphragm in transmissibility design. The proposed design method is illustrated through experimental measurements.
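    The trade-off analysis above rests on a complex-stiffness transmissibility model. As an illustration, the single-degree-of-freedom version (a textbook simplification of the paper's dual-chamber model, with purely illustrative parameter values) can be written as:

```python
import math

def transmissibility(omega, m, k, eta):
    """|X/Y| for a single-DOF isolator with complex stiffness k(1 + i*eta);
    a simplified stand-in for the dual-chamber complex-stiffness model."""
    kc = k * (1 + 1j * eta)
    return abs(kc / (kc - m * omega ** 2))

m, k, eta = 100.0, 4.0e4, 0.5   # illustrative payload mass, stiffness, loss factor
w_n = math.sqrt(k / m)          # undamped natural frequency, rad/s
peak = transmissibility(w_n, m, k, eta)
```

    This toy model already shows the trade-off the paper explores: a larger loss factor lowers the resonance peak (here peak = sqrt(1 + eta^2)/eta at the natural frequency) but degrades the high-frequency roll-off.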

  16. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices

    NASA Astrophysics Data System (ADS)

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-09-01

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on experimental results from two distinctive actuator prototypes (linear and bending actuators), including free-displacement and blocked-force measurements, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body.

  17. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices

    PubMed Central

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-01-01

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes’ (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body. PMID:27670953

  18. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing these dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, the tools developed for it, and its potential benefits are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to give the user an estimate of the closed-loop performance (response time) and operability (high-pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can influence the engine design constraints and potentially lead to a more efficient engine.
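
    TTECTrA itself is a closed-loop engine simulation tool; as a hedged stand-in for its response-time metric, the sketch below times how long a toy first-order engine lag takes to reach a commanded value after a unit step. The time constant and the 95% threshold are assumptions for illustration, not TTECTrA's actual definitions.

```python
def response_time(time_constant, threshold=0.95, dt=0.001):
    """Time for a first-order lag dy/dt = (u - y)/tau, driven by a unit
    step command, to first reach `threshold` of the commanded value
    (forward-Euler integration)."""
    y, t = 0.0, 0.0
    while y < threshold:
        y += dt * (1.0 - y) / time_constant
        t += dt
    return t
```

    For a first-order lag the 95% response time is roughly three time constants, so slower engine dynamics (a larger time constant) directly degrade this closed-loop metric.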

  19. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices.

    PubMed

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-09-27

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes' (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body.

  20. New paradigms in internal architecture design and freeform fabrication of tissue engineering porous scaffolds.

    PubMed

    Yoo, Dongjin

    2012-07-01

    Advanced additive manufacture (AM) techniques are now being developed to fabricate scaffolds with controlled internal pore architectures in the field of tissue engineering. In general, these techniques use a hybrid method which combines computer-aided design (CAD) with computer-aided manufacturing (CAM) tools to design and fabricate complicated three-dimensional (3D) scaffold models. The mathematical descriptions of micro-architectures along with the macro-structures of the 3D scaffold models are limited by current CAD technologies as well as by the difficulty of transferring the designed digital models to standard formats for fabrication. To overcome these difficulties, we have developed an efficient internal pore architecture design system based on triply periodic minimal surface (TPMS) unit cell libraries and associated computational methods to assemble TPMS unit cells into an entire scaffold model. In addition, we have developed a process planning technique based on TPMS internal architecture pattern of unit cells to generate tool paths for freeform fabrication of tissue engineering porous scaffolds. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
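
    TPMS unit cells such as the gyroid are defined by implicit trigonometric equations, so a scaffold designer can evaluate pore fraction directly from the field. The sketch below samples a gyroid unit cell to estimate porosity; the level-set offset `t` and the grid resolution are illustrative assumptions, not parameters of the paper's unit cell library.

```python
import math

def gyroid(x, y, z, t=0.0):
    """Implicit gyroid field; the surface is the zero level set, and the
    offset `t` shifts the solid/void boundary (controls porosity)."""
    return (math.sin(x) * math.cos(y)
            + math.sin(y) * math.cos(z)
            + math.sin(z) * math.cos(x)) - t

def porosity(t=0.0, n=40):
    """Fraction of the unit cell [0, 2*pi)^3 where the field is negative,
    i.e. the pore volume fraction, estimated by uniform grid sampling."""
    step = 2.0 * math.pi / n
    void = total = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                total += 1
                if gyroid(i * step, j * step, k * step, t) < 0.0:
                    void += 1
    return void / total
```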

  1. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters, such as winding flux linkages and voltages; average, cogging, and ripple torques; stator core flux densities; core losses; efficiencies; and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of a design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable when compared to current and prevalent state-of-the-art methods.
These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
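
    The dissertation couples a differential evolution optimizer to a finite element solver. As a minimal sketch of the algorithmic core only, the following implements DE/rand/1/bin on a cheap stand-in objective; the population size, scale factor `f`, and crossover rate `cr` are conventional defaults, not the dissertation's settings.

```python
import random

def differential_evolution(objective, bounds, pop_size=20, f=0.8, cr=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin minimizer over a box `bounds` = [(lo, hi), ...]."""
    rng = random.Random(seed)
    dim = len(bounds)

    def clip(v, d):
        lo, hi = bounds[d]
        return min(max(v, lo), hi)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutate three distinct donors, then binomial crossover.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [clip(pop[a][d] + f * (pop[b][d] - pop[c][d]), d)
                     if (d == j_rand or rng.random() < cr) else pop[i][d]
                     for d in range(dim)]
            trial_cost = objective(trial)
            if trial_cost <= cost[i]:       # greedy one-to-one selection
                pop[i], cost[i] = trial, trial_cost
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]
```

    In the dissertation the expensive `objective` call is the finite element evaluation, which is why reducing solver time per candidate yields the reported order-of-magnitude savings.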

  2. Factorial experiments: efficient tools for evaluation of intervention components.

    PubMed

    Collins, Linda M; Dziak, John J; Kugler, Kari C; Trail, Jessica B

    2014-10-01

    An understanding of the individual and combined effects of a set of intervention components is important for moving the science of preventive medicine interventions forward. This understanding can often be achieved in an efficient and economical way via a factorial experiment, in which two or more independent variables are manipulated. The factorial experiment is a complement to the RCT; the two designs address different research questions. This article offers an introduction to factorial experiments aimed at investigators trained primarily in the RCT. The factorial experiment is compared and contrasted with other experimental designs used commonly in intervention science to highlight where each is most efficient and appropriate. Several points are made: factorial experiments make very efficient use of experimental subjects when the data are properly analyzed; a factorial experiment can have excellent statistical power even if it has relatively few subjects per experimental condition; and when conducting research to select components for inclusion in a multicomponent intervention, interactions should be studied rather than avoided. Investigators in preventive medicine and related areas should begin considering factorial experiments alongside other approaches. Experimental designs should be chosen from a resource management perspective, which states that the best experimental design is the one that provides the greatest scientific benefit without exceeding available resources. Copyright © 2014 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
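
    The efficiency argument can be made concrete with a small 2^k design: every observation contributes to the estimate of every main effect. A minimal sketch, assuming the usual -1/+1 factor coding:

```python
from itertools import product

def full_factorial(k):
    """All 2**k experimental conditions, each factor coded -1/+1."""
    return list(product((-1, 1), repeat=k))

def main_effects(design, responses):
    """Main effect of each factor: mean response at +1 minus mean at -1.
    Every observation appears in one of the two means for every factor,
    which is why factorial designs use subjects so efficiently."""
    k = len(design[0])
    effects = []
    for f in range(k):
        hi = [y for row, y in zip(design, responses) if row[f] == 1]
        lo = [y for row, y in zip(design, responses) if row[f] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects
```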

  3. Modified Universal Design Survey: Enhancing Operability of Launch Vehicle Ground Crew Worksites

    NASA Technical Reports Server (NTRS)

    Blume, Jennifer L.

    2010-01-01

    Operability is a driving requirement for next-generation space launch vehicles. Launch site ground operations include numerous operator tasks to prepare the vehicle for launch or to perform preflight maintenance. Ensuring that components requiring operator interaction at the launch site are designed for optimal human use is a high priority for operability. To promote operability, a Design Quality Evaluation Survey based on the Universal Design framework was developed to support Human Factors Engineering (HFE) evaluation for NASA's launch vehicles. Universal Design per se is not a priority for launch vehicle processing; however, applying principles of Universal Design will increase the probability of an error-free and efficient design, which promotes operability. The Design Quality Evaluation Survey incorporates and tailors the seven Universal Design principles and adds new measures for safety and efficiency. Adapting an approach proven to measure Universal Design performance in products, each principle is associated with multiple performance measures, which are rated by the degree to which the statement is true. The Design Quality Evaluation Survey was employed for several launch vehicle ground processing worksite analyses. The tool was found to be most useful for comparative judgments as opposed to an assessment of a single design option. It provided a useful piece of additional data when assessing possible operator interfaces or worksites for operability.
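
    Mechanically, a survey of this kind reduces to aggregating agreement ratings within each principle and then across principles. The sketch below assumes a 1-5 rating scale and illustrative principle names; the survey's actual principles, measures, and scale are defined in the paper, not here.

```python
def design_quality_score(ratings):
    """Aggregate survey ratings: average the agreement ratings within each
    principle, then average across principles so that principles with many
    measures do not dominate. `ratings` maps principle -> list of ratings."""
    per_principle = {name: sum(vals) / len(vals)
                     for name, vals in ratings.items()}
    overall = sum(per_principle.values()) / len(per_principle)
    return per_principle, overall
```

    Scoring two worksite options this way supports the comparative judgments the abstract found most useful, rather than an absolute verdict on a single design.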

  4. Medication Reconciliation: Work Domain Ontology, prototype development, and a predictive model.

    PubMed

    Markowitz, Eliz; Bernstam, Elmer V; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System's and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load.
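
    A Keystroke-Level Model analysis predicts task time by summing nominal operator times over the sequence of physical and mental operators a task requires. The sketch below uses the commonly cited default values from Card, Moran, and Newell; the actual operator sequences for the three MR tasks in the study are not reproduced here.

```python
# Nominal operator times in seconds; these are the commonly cited defaults
# from the Keystroke-Level Model (Card, Moran, and Newell).
KLM_OPERATORS = {
    "K": 0.20,  # press a key or button (skilled typist)
    "P": 1.10,  # point with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Predicted execution time for an operator sequence such as 'MHPK'."""
    return sum(KLM_OPERATORS[op] for op in sequence)
```

    Because the mental operator M is the most expensive, a design that removes mental operations, as the WDO-based prototype did, shows the largest predicted time savings.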

  5. Medication Reconciliation: Work Domain Ontology, Prototype Development, and a Predictive Model

    PubMed Central

    Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R.

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System’s and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load. PMID:22195146

  6. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation-based design plays an important role in designing almost any kind of automotive, aerospace, and consumer product under these competitive conditions. Single-discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability-based design optimization in a simulation-based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability-based design optimization, the development of an innovative decoupled reliability-based design optimization methodology, the application of homotopy techniques in the unilevel reliability-based design optimization methodology, and the development of a new framework for reliability-based design optimization under epistemic uncertainty. The unilevel methodology for reliability-based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% compared to the nested approach. The decoupled reliability-based design optimization methodology is an approximate technique for obtaining consistent reliable designs at lower computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach.
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
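
    The quantity that a reliability-based design optimization loop repeatedly evaluates is the probability of failure for a limit state g(x) <= 0. The crude Monte Carlo sketch below is only a conceptual stand-in: the nested and unilevel formulations in the dissertation embed the reliability calculation very differently, and the limit state here is a toy example.

```python
import random

def failure_probability(limit_state, sample, n=20000, seed=0):
    """Crude Monte Carlo estimate of P[g(X) <= 0], the failure probability
    that a reliability constraint must bound at each candidate design."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0.0)
    return fails / n

# Toy limit state: a capacity of 3.0 minus a standard-normal load effect.
pf = failure_probability(lambda x: 3.0 - x, lambda rng: rng.gauss(0.0, 1.0))
```

    The nested formulation re-runs an inner calculation like this for every outer design iterate, which is the cost the unilevel and decoupled methodologies are designed to avoid.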

  7. An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

    Crucial to an efficient aircraft simulation-based design is a robust data modeling methodology for both recording the information and providing data transfer readily and reliably. To meet this goal, data modeling issues involved in multidisciplinary aircraft design are first analyzed in this study. Next, an XML-based, extensible data object model for multidisciplinary aircraft design is constructed and implemented. The implementation of the model through aircraft databinding allows the design applications to access and manipulate any disciplinary data with a lightweight and easy-to-use API. In addition, language-independent representation of aircraft disciplinary data in the model fosters interoperability amongst heterogeneous systems, thereby facilitating data sharing and exchange between various design tools and systems.
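
    The databinding idea (each discipline reads and writes its own section of a shared aircraft model through a lightweight API) can be sketched with Python's standard XML tooling. The element names and values below are hypothetical illustrations, not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical disciplinary data for one aircraft concept; the schema
# is invented here purely for illustration.
AIRCRAFT_XML = """
<aircraft name="concept-1">
  <aerodynamics><cruiseMach>0.78</cruiseMach></aerodynamics>
  <propulsion><thrust unit="kN">121.4</thrust></propulsion>
  <structures><wingMass unit="kg">8450</wingMass></structures>
</aircraft>
"""

def load_aircraft(xml_text):
    """Bind the XML into a plain dict keyed by discipline, so each design
    tool accesses its own section through one lightweight API."""
    root = ET.fromstring(xml_text)
    return {child.tag: {leaf.tag: float(leaf.text) for leaf in child}
            for child in root}
```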

  8. In-Space Radiator Shape Optimization using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Kittredge, Ken; Tinker, Michael; SanSoucie, Michael

    2006-01-01

    Future space exploration missions will require the development of more advanced in-space radiators. These radiators should be highly efficient, lightweight, deployable heat rejection systems. Typical radiators for in-space heat mitigation commonly comprise a substantial portion of the total vehicle mass, so a mass savings of even 5-10% can greatly improve vehicle performance. The objective of this paper is to present the development of detailed tools for the analysis and design of in-space radiators using evolutionary computation techniques. The optimality criterion is defined as a two-dimensional radiator shape demonstrating the smallest mass for the greatest overall heat transfer; the end result is thus a set of highly functional radiator designs. This cross-disciplinary work combines topology optimization and thermal analysis design by means of a genetic algorithm. The proposed design tool consists of the following steps: design parameterization based on the exterior boundary of the radiator; objective function definition (mass minimization and heat loss maximization); objective function evaluation via finite element analysis (thermal radiation analysis); and optimization based on evolutionary algorithms. The radiator design problem is defined as follows: the input is a driving temperature and the output response is heat loss. Appropriate modeling of the space environment is added to capture its effect on the radiator. The design parameters chosen for this radiator shape optimization problem fall into two classes: variable height along the width of the radiator, and a spline curve defining the material boundary of the radiator. The implementation of multiple design parameter schemes allows the user to have more confidence in the radiator optimization tool upon demonstration of convergence between the two schemes.
This tool easily allows the user to manipulate the driving temperature regions, thus permitting detailed design of in-space radiators for unique situations. Preliminary results indicate an optimized shape following that of the temperature distribution regions in the "cooler" portions of the radiator. The results closely follow the expected radiator shape.
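
    A minimal sketch of the evolutionary loop follows, using the variable-height parameterization and a toy fitness in place of the paper's finite element thermal analysis: each strip of the radiator has a design height, hotter strips reject more heat (scaling like T^4), and mass grows with total height. The strip temperatures, penalty weight, and GA settings are all assumptions for illustration.

```python
import random

# Toy stand-in for the FE thermal analysis: strip i of the radiator has a
# design height h_i in [0, 1]; assumed strip temperatures in kelvin.
TEMPS = [400.0, 380.0, 350.0, 320.0, 300.0]

def fitness(heights, mass_penalty=1.3):
    """Heat rejected (~ h * (T/300)^4 per strip) minus a mass penalty (~ h)."""
    heat = sum(h * (t / 300.0) ** 4 for h, t in zip(heights, TEMPS))
    return heat - mass_penalty * sum(heights)

def genetic_algorithm(pop_size=30, generations=80, seed=1):
    """Elitist GA with tournament selection, uniform crossover, and
    clipped gaussian mutation over the strip heights."""
    rng = random.Random(seed)
    dim = len(TEMPS)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = max(pop, key=fitness)
        children = [elite[:]]                    # elitism
        while len(children) < pop_size:
            p1, p2 = (max(rng.sample(pop, 2), key=fitness) for _ in range(2))
            child = [p1[d] if rng.random() < 0.5 else p2[d] for d in range(dim)]
            if rng.random() < 0.3:               # clipped gaussian mutation
                d = rng.randrange(dim)
                child[d] = min(1.0, max(0.0, child[d] + rng.gauss(0.0, 0.2)))
            children.append(child)
        pop = children
    return max(pop, key=fitness)
```

    With this toy fitness, the evolved shape grows tall where the strips are hot and shrinks where they are cool, mirroring the temperature-following shapes reported in the preliminary results.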

  9. Implementation of Health Insurance Support Tools in Community Health Centers.

    PubMed

    Huguet, Nathalie; Hatch, Brigit; Sumic, Aleksandra; Tillotson, Carrie; Hicks, Elizabeth; Nelson, Joan; DeVoe, Jennifer E

    2018-01-01

    Health information technology (HIT) provides new opportunities for primary care clinics to support patients with health insurance enrollment and maintenance. We present strategies, early findings, and clinic reflections on the development and implementation of HIT tools designed to streamline and improve health insurance tracking at community health centers. We are conducting a hybrid implementation-effectiveness trial to assess novel health insurance enrollment and support tools in primary care clinics. Twenty-three clinics in 7 health centers from the OCHIN practice-based research network are participating in the implementation component of the trial. Participating health centers were randomized to 1 of 2 levels of implementation support: arm 1 (n = 4 health centers, 11 clinic sites) received HIT tools and educational materials, and arm 2 (n = 3 health centers, 12 clinic sites) received HIT tools, educational materials, and individualized implementation support with a practice coach. We used mixed methods (qualitative and quantitative) to assess tool use rates and the facilitators and barriers to implementation in the first 6 months. Clinics reported favorable attitudes toward the HIT tools, which replace less efficient and more cumbersome processes, and reflected on the importance of clinic engagement in tool development and refinement. Five of 7 health centers are now regularly using the tools and are actively working to increase tool use. Six months after formal implementation, arm 2 clinics demonstrated higher rates of tool use compared with arm 1. These results highlight the value of early clinic input in tool development, the potential benefit of practice coaching during HIT tool development and implementation, and a novel method for coupling a hybrid implementation-effectiveness design with principles of improvement science in primary care research. © Copyright 2018 by the American Board of Family Medicine.

  10. Designing and Developing a NASA Research Projects Knowledge Base and Implementing Knowledge Management and Discovery Techniques

    NASA Astrophysics Data System (ADS)

    Dabiru, L.; O'Hara, C. G.; Shaw, D.; Katragadda, S.; Anderson, D.; Kim, S.; Shrestha, B.; Aanstoos, J.; Frisbie, T.; Policelli, F.; Keblawi, N.

    2006-12-01

    The Research Project Knowledge Base (RPKB) is currently being designed and will be implemented in a manner that is fully compatible and interoperable with enterprise architecture tools developed to support NASA's Applied Sciences Program. Through user needs assessment and collaboration with Stennis Space Center, Goddard Space Flight Center, and NASA's DEVELOP staff, insight into the information needs for the RPKB was gathered from across NASA scientific communities of practice. To enable efficient, consistent, standard, structured, and managed data entry and research results compilation, a prototype RPKB has been designed and fully integrated with the existing NASA Earth Science Systems Components database. The RPKB will compile research project and keyword information of relevance to the six major science focus areas, 12 national applications, and the Global Change Master Directory (GCMD). The RPKB will include information about projects awarded from NASA research solicitations, project investigator information, research publications, NASA data products employed, and model or decision support tools used or developed, as well as new data product information. The RPKB will be developed in a multi-tier architecture that will include a SQL Server relational database backend, middleware, and front-end client interfaces for data entry. The purpose of this project is to intelligently harvest the results of research sponsored by the NASA Applied Sciences Program and related research program results. We present various approaches for a wide spectrum of knowledge discovery of research results, publications, projects, etc. from the NASA Systems Components database and global information systems, and show how this is implemented in a SQL Server database. The application of knowledge discovery is useful for intelligent query answering and multiple-layered database construction.
Using advanced EA tools such as the Earth Science Architecture Tool (ESAT), RPKB will enable NASA and partner agencies to efficiently identify significant results for new experiment directions, and will allow principal investigators to formulate experiment directions for new proposals.

  11. Multidisciplinary Design Technology Development: A Comparative Investigation of Integrated Aerospace Vehicle Design Tools

    NASA Technical Reports Server (NTRS)

    Renaud, John E.; Batill, Stephen M.; Brockman, Jay B.

    1999-01-01

    This research effort is a joint program between the Departments of Aerospace and Mechanical Engineering and the Computer Science and Engineering Department at the University of Notre Dame. The purpose of the project was to develop a framework and systematic methodology to facilitate the application of Multidisciplinary Design Optimization (MDO) to a diverse class of system design problems. For all practical aerospace systems, the design of a system is a complex sequence of events which integrates the activities of a variety of discipline "experts" and their associated "tools". The development, archiving, and exchange of information between these individual experts is central to the design task, and it is this information which provides the basis for these experts to make coordinated design decisions (i.e., compromises and trade-offs) resulting in the final product design. Grant efforts focused on developing and evaluating frameworks for effective design coordination within an MDO environment. Central to these research efforts was the concept that the individual discipline "expert", using the most appropriate "tools" available and the most complete description of the system, should be empowered to have the greatest impact on the design decisions and final design. This means that the overall process must be highly interactive and efficiently conducted if the resulting design is to be developed in a manner consistent with cost and time requirements. The methods developed as part of this research effort include: extensions to a sensitivity-based Concurrent Subspace Optimization (CSSO) MDO algorithm; the development of a neural network response surface based CSSO-MDO algorithm; and the integration of distributed computing and process scheduling into the MDO environment. This report overviews research efforts in each of these focus areas. A complete bibliography of research produced with support of this grant is attached.

  12. Toward Genome-Based Metabolic Engineering in Bacteria.

    PubMed

    Oesterle, Sabine; Wuethrich, Irene; Panke, Sven

    2017-01-01

    Prokaryotes modified stably on the genome are of great importance for production of fine and commodity chemicals. Traditional methods for genome engineering have long suffered from imprecision and low efficiencies, making construction of suitable high-producer strains laborious. Here, we review the recent advances in discovery and refinement of molecular precision engineering tools for genome-based metabolic engineering in bacteria for chemical production, with focus on the λ-Red recombineering and the clustered regularly interspaced short palindromic repeats/Cas9 nuclease systems. In conjunction, they enable the integration of in vitro-synthesized DNA segments into specified locations on the chromosome and allow for enrichment of rare mutants by elimination of unmodified wild-type cells. Combination with concurrently developing improvements in important accessory technologies such as DNA synthesis, high-throughput screening methods, regulatory element design, and metabolic pathway optimization tools has resulted in novel efficient microbial producer strains and given access to new metabolic products. These new tools have made and will likely continue to make a big impact on the bioengineering strategies that transform the chemical industry. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. IDEAL: Images Across Domains, Experiments, Algorithms and Learning

    NASA Astrophysics Data System (ADS)

    Ushizima, Daniela M.; Bale, Hrishikesh A.; Bethel, E. Wes; Ercius, Peter; Helms, Brett A.; Krishnan, Harinarayan; Grinberg, Lea T.; Haranczyk, Maciej; Macdowell, Alastair A.; Odziomek, Katarzyna; Parkinson, Dilworth Y.; Perciano, Talita; Ritchie, Robert O.; Yang, Chao

    2016-11-01

    Research across science domains is increasingly reliant on image-centric data. Software tools are in high demand to uncover relevant, but hidden, information in digital images, such as those coming from faster next generation high-throughput imaging platforms. The challenge is to analyze the data torrent generated by the advanced instruments efficiently, and provide insights such as measurements for decision-making. In this paper, we overview work performed by an interdisciplinary team of computational and materials scientists, aimed at designing software applications and coordinating research efforts connecting (1) emerging algorithms for dealing with large and complex datasets; (2) data analysis methods with emphasis in pattern recognition and machine learning; and (3) advances in evolving computer architectures. Engineering tools around these efforts accelerate the analyses of image-based recordings, improve reusability and reproducibility, scale scientific procedures by reducing time between experiments, increase efficiency, and open opportunities for more users of the imaging facilities. This paper describes our algorithms and software tools, showing results across image scales, demonstrating how our framework plays a role in improving image understanding for quality control of existent materials and discovery of new compounds.

  14. Advanced data management for optimising the operation of a full-scale WWTP.

    PubMed

    Beltrán, Sergio; Maiza, Mikel; de la Sota, Alejandro; Villanueva, José María; Ayesa, Eduardo

    2012-01-01

    The lack of appropriate data management tools is presently a limiting factor for a broader implementation and a more efficient use of sensors and analysers, monitoring systems and process controllers in wastewater treatment plants (WWTPs). This paper presents a technical solution for advanced data management of a full-scale WWTP. The solution is based on an efficient and intelligent use of the plant data by a standard centralisation of the heterogeneous data acquired from different sources, effective data processing to extract adequate information, and a straightforward connection to other emerging tools focused on the operational optimisation of the plant such as advanced monitoring and control or dynamic simulators. A pilot study of the advanced data manager tool was designed and implemented in the Galindo-Bilbao WWTP. The results of the pilot study showed its potential for agile and intelligent plant data management by generating new enriched information combining data from different plant sources, facilitating the connection of operational support systems, and developing automatic plots and trends of simulated results and actual data for plant performance and diagnosis.

  15. Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach.

    PubMed

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D; Duvenaud, David; Maclaurin, Dougal; Blood-Forsythe, Martin A; Chae, Hyun Sik; Einzinger, Markus; Ha, Dong-Gwang; Wu, Tony; Markopoulos, Georgios; Jeon, Soonok; Kang, Hosuk; Miyazaki, Hiroshi; Numata, Masaki; Kim, Sunghan; Huang, Wenliang; Hong, Seong Ik; Baldo, Marc; Adams, Ryan P; Aspuru-Guzik, Alán

    2016-10-01

    Virtual screening is becoming a ground-breaking tool for molecular discovery due to the exponential growth of available computer time and constant improvement of simulation and machine learning techniques. We report an integrated organic functional material design process that incorporates theoretical insight, quantum chemistry, cheminformatics, machine learning, industrial expertise, organic synthesis, molecular characterization, device fabrication and optoelectronic testing. After exploring a search space of 1.6 million molecules and screening over 400,000 of them using time-dependent density functional theory, we identified thousands of promising novel organic light-emitting diode molecules across the visible spectrum. Our team collaboratively selected the best candidates from this set. The experimentally determined external quantum efficiencies for these synthesized candidates were as large as 22%.
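The funnel structure of such a campaign, enumerate a large library, score each candidate with an expensive predictor, and keep only the top fraction for expert review, can be sketched generically. The library, the scoring function, and the keep fraction below are toy stand-ins, not the TD-DFT pipeline of the paper.

```python
# Toy screening funnel: rank candidates by a predicted property and keep
# the top fraction for downstream (expensive) evaluation.
def screen(candidates, score, keep=0.01):
    ranked = sorted(candidates, key=score, reverse=True)
    n = max(1, int(len(ranked) * keep))
    return ranked[:n]

# Pretend library of 1000 molecule ids with a deterministic mock score
# standing in for a quantum-chemistry prediction.
library = list(range(1000))
mock_score = lambda m: (m * 37) % 1000

shortlist = screen(library, mock_score, keep=0.01)
```

Real campaigns stack several such funnels (cheap filters first, TD-DFT later), so that only a small, promising subset ever reaches synthesis and device fabrication.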

  16. Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach

    NASA Astrophysics Data System (ADS)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Duvenaud, David; MacLaurin, Dougal; Blood-Forsythe, Martin A.; Chae, Hyun Sik; Einzinger, Markus; Ha, Dong-Gwang; Wu, Tony; Markopoulos, Georgios; Jeon, Soonok; Kang, Hosuk; Miyazaki, Hiroshi; Numata, Masaki; Kim, Sunghan; Huang, Wenliang; Hong, Seong Ik; Baldo, Marc; Adams, Ryan P.; Aspuru-Guzik, Alán

    2016-10-01

    Virtual screening is becoming a ground-breaking tool for molecular discovery due to the exponential growth of available computer time and constant improvement of simulation and machine learning techniques. We report an integrated organic functional material design process that incorporates theoretical insight, quantum chemistry, cheminformatics, machine learning, industrial expertise, organic synthesis, molecular characterization, device fabrication and optoelectronic testing. After exploring a search space of 1.6 million molecules and screening over 400,000 of them using time-dependent density functional theory, we identified thousands of promising novel organic light-emitting diode molecules across the visible spectrum. Our team collaboratively selected the best candidates from this set. The experimentally determined external quantum efficiencies for these synthesized candidates were as large as 22%.

  17. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems, as well as cost and schedule constraints, requires a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters) and of requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements and, finally, a weighted decision analysis to optimize system objectives.

  18. Laser beam machining of polycrystalline diamond for cutting tool manufacturing

    NASA Astrophysics Data System (ADS)

    Wyszyński, Dominik; Ostrowski, Robert; Zwolak, Marek; Bryk, Witold

    2017-10-01

    The paper concerns the application of a DPSS Nd:YAG 532 nm pulsed laser source for machining of polycrystalline, WC-based diamond (PCD) inserts. The goal of the research was to determine optimal laser cutting parameters for cutting-tool shaping. The basic criteria were cutting-edge quality (minimization of finishing operations), material removal rate (time and cost efficiency) and the choice of laser beam characteristics (polarization, power, focused beam diameter). The research was planned, realised and analysed according to design of experiments (DOE) rules. The analysis of the cutting edge was performed with the Alicona Infinite Focus measurement system.

  19. iDrug: a web-accessible and interactive drug discovery and design platform

    PubMed Central

    2014-01-01

    Background Progress in computer-aided drug design (CADD) approaches over the past decades has accelerated early-stage pharmaceutical research. Many powerful standalone tools for CADD have been developed in academia. Because these programs are developed by various research groups, a consistent, user-friendly online graphical working environment combining computational techniques such as pharmacophore mapping, similarity calculation, scoring, and target identification is needed. Results We present a versatile, user-friendly, and efficient online tool for computer-aided drug design based on pharmacophore and 3D molecular similarity searching. The web interface enables binding site detection, virtual screening hit identification, and drug target prediction in an interactive manner through a seamless interface to all adapted packages (e.g., Cavity, PocketV.2, PharmMapper, SHAFTS). Several commercially available compound databases for hit identification and a well-annotated pharmacophore database for drug target prediction were integrated into iDrug as well. The web interface provides tools for real-time molecular building/editing, converting, displaying, and analyzing. All customized configurations of the functional modules can be accessed through the featured session files provided, which can be saved to the local disk and uploaded to resume or update earlier work. Conclusions iDrug is easy to use, and provides a novel, fast and reliable tool for conducting drug design experiments. Using iDrug, various molecular design tasks can be submitted and visualized simply in one browser without locally installing any standalone modeling software. iDrug is accessible free of charge at http://lilab.ecust.edu.cn/idrug. PMID:24955134

  20. A tool for simulating parallel branch-and-bound methods

    NASA Astrophysics Data System (ADS)

    Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail

    2016-01-01

    The Branch-and-Bound method is known as one of the most powerful but also most resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution; the design and study of load balancing algorithms is therefore a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolving the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
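The two modeling ideas in the abstract, a stochastic branching process in place of the real search and logical time in place of wall-clock time, can be sketched in a few lines. This is a minimal sketch under invented assumptions (a round-based tick model and a naive steal-half policy); it is not the authors' simulator or their load balancing algorithms.

```python
import random
from collections import deque

def simulate(n_workers, max_depth, p=0.9, balance=True, seed=1):
    """Round-based toy simulation of parallel Branch-and-Bound.

    The actual search is replaced by a stochastic branching process: a node
    at depth d < max_depth spawns two children with probability p (p < 1
    mimics pruning). Each pass of the outer loop is one tick of logical
    time; when balancing is on, an idle worker steals half of the most
    loaded worker's queue. Returns the ticks needed to drain all queues.
    """
    rng = random.Random(seed)
    queues = [deque() for _ in range(n_workers)]
    queues[0].append(0)                          # root subproblem, depth 0
    ticks = 0
    while any(queues):
        ticks += 1
        for q in queues:                         # each worker expands one node
            if q:
                d = q.popleft()
                if d < max_depth and rng.random() < p:
                    q.extend([d + 1, d + 1])     # branch into two children
        if balance:                              # idle workers steal work
            for q in queues:
                if not q:
                    donor = max(queues, key=len)
                    for _ in range(len(donor) // 2):
                        q.append(donor.pop())
    return ticks
```

With p=1.0 the tree is deterministic (a full binary tree of 15 nodes at depth 3): a single worker needs 15 ticks, while four workers with stealing finish in 5, which is the kind of load-distribution comparison the simulator is built to study.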

  1. Monitoring of European corn borer with pheromone-baited traps: review of trapping system basics and remaining problems.

    PubMed

    Pélozuelo, Laurent; Frérot, Brigitte

    2007-12-01

    Since the identification of female European corn borer, Ostrinia nubilalis (Hübner) pheromone, pheromone-baited traps have been regarded as a promising tool to monitor populations of this pest. This article reviews the literature produced on this topic since the 1970s. Its aim is to provide extension entomologists and other researchers with all the necessary information to establish an efficient trapping procedure for this moth. The different pheromone races of the European corn borer are described, and research results relating to the optimization of pheromone blend, pheromone bait, trap design, and trap placement are summarized followed by a state-of-the-art summary of data comparing blacklight trap and pheromone-baited trap techniques to monitor European corn borer flight. Finally, we identify the information required to definitively validate/invalidate the pheromone-baited traps as an efficient decision support tool in European corn borer control.

  2. User-oriented evaluation of mechanical single-channel axial pipettes.

    PubMed

    Sormunen, Erja; Nevala, Nina

    2013-09-01

    Hand tools should be designed so that they are comfortable to use, fit the hand and are user-oriented. Six different manual, single-channel axial pipettes were evaluated for such objective outcomes as muscular activity, wrist postures and efficiency, as well as for subjective outcomes concerning self-assessed features of pipette usability and musculoskeletal strain. Ten experienced laboratory employees volunteered for the study. The results showed that light and short pipettes with better tool comfort resulted in reduced muscular activity and perceived musculoskeletal strain when they were compared with a long and heavy pipette. There were no differences in the efficiency between the different pipettes. Combining both the objective and subjective measures enabled a broader evaluation of product usability. The results of this study can be used both in product development and as information on which to base the purchase of new pipettes for laboratory work. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. The Effects of a Grouping by Tens Manipulative on Children's Strategy Use, Base Ten Understanding and Mathematical Knowledge

    ERIC Educational Resources Information Center

    Pagar, Dana

    2013-01-01

    Manipulatives have the potential to be powerful tools in helping children improve their number sense, develop advanced mathematical strategies, and build an understanding of the base ten number system. Physical manipulatives used in classrooms, however, are often not designed to promote efficient strategy use, such as counting on, and typically do…

  4. Louisiana State University Health Sciences Center Katrina Inspired Disaster Screenings (KIDS): Psychometric Testing of the National Child Traumatic Stress Network Hurricane Assessment and Referral Tool

    ERIC Educational Resources Information Center

    Hansel, Tonya Cross; Osofsky, Joy D.; Osofsky, Howard J.

    2015-01-01

    Background: Post disaster psychosocial surveillance procedures are important for guiding effective and efficient recovery. The Louisiana State University Health Sciences Center Katrina Inspired Disaster Screenings (KIDS) is a model designed with the goal of assisting recovering communities in understanding the needs of and targeting services…

  5. Towards Context-Aware and User-Centered Analysis in Assistive Environments: A Methodology and a Software Tool.

    PubMed

    Fontecha, Jesús; Hervás, Ramón; Mondéjar, Tania; González, Iván; Bravo, José

    2015-10-01

    One of the main challenges in Ambient Assisted Living (AAL) is to reach an appropriate level of acceptance of assistive systems, as well as to analyze and monitor end-user tasks in a feasible and efficient way. The development and evaluation of AAL solutions from a user-centered perspective helps to achieve these goals. In this work, we have designed a methodology to integrate and develop analytics user-centered tools into assistive systems. An analysis software tool gathers information about end users from adapted psychological questionnaires and naturalistic observation of their own context. The aim is to enable an in-depth analysis focused on improving the quality of life of elderly people and their caregivers.

  6. Freiburg RNA tools: a central online resource for RNA-focused research and teaching.

    PubMed

    Raden, Martin; Ali, Syed M; Alkhnbashi, Omer S; Busch, Anke; Costa, Fabrizio; Davis, Jason A; Eggenhofer, Florian; Gelhausen, Rick; Georg, Jens; Heyne, Steffen; Hiller, Michael; Kundu, Kousik; Kleinkauf, Robert; Lott, Steffen C; Mohamed, Mostafa M; Mattheis, Alexander; Miladi, Milad; Richter, Andreas S; Will, Sebastian; Wolff, Joachim; Wright, Patrick R; Backofen, Rolf

    2018-05-21

    The Freiburg RNA tools webserver is a well-established online resource for RNA-focused research. It provides a unified user interface and comprehensive result visualization for efficient command line tools. The webserver includes RNA-RNA interaction prediction (IntaRNA, CopraRNA, metaMIR), sRNA homology search (GLASSgo), sequence-structure alignments (LocARNA, MARNA, CARNA, ExpaRNA), CRISPR repeat classification (CRISPRmap), sequence design (antaRNA, INFO-RNA, SECISDesign), structure aberration evaluation of point mutations (RaSE) and RNA/protein-family model visualization (CMV), among other methods. Open education resources offer interactive visualizations of RNA structure and RNA-RNA interaction prediction as well as basic and advanced sequence alignment algorithms. The services are freely available at http://rna.informatik.uni-freiburg.de.

  7. Using Teamcenter engineering software for a successive punching tool lifecycle management

    NASA Astrophysics Data System (ADS)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents the results of studies and research on the implementation of Teamcenter (TC) for integrated management of a product lifecycle in a virtual enterprise. The results can also be implemented in a real enterprise. The product considered was a successive punching and cutting tool designed to produce a sheet-metal part. The paper defines the technical documentation flow (flow of information) in the process of computer-aided constructive design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and transferred to other departments of the company in order to supply the materials and resources necessary to achieve the final product. The paper also describes the procedure for modifying certain dimensions of the sheet-metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the next lifecycle phase, the manufacturing process. For each operation of the technological process, the corresponding phases are described in detail. Teamcenter makes it possible to describe the manufacturing company structure, including the workstations that carry out the various operations of the manufacturing process. The paper showed that implementing Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching, verifying and correcting documentation, while ensuring the uniqueness and completeness of the product data.
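The BOM hand-off described above, a flat list of standard and manufactured components exported to a spreadsheet-readable format, can be sketched generically. The row fields and part numbers below are invented for illustration and are not Teamcenter's schema or API.

```python
import csv
import io

# Hypothetical flat BOM rows as they might be pulled from a PLM system
# (field names invented for illustration).
BOM = [
    {"item": "PUNCH-01",  "type": "manufactured", "qty": 2, "material": "HS6-5-2"},
    {"item": "DIN912-M8", "type": "standard",     "qty": 8, "material": "8.8"},
]

def export_bom(rows):
    """Serialise the bill of materials to CSV text, openable in MS Excel."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["item", "type", "qty", "material"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Writing to a neutral tabular format is what lets other departments consume the parts list without access to the PLM system itself.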

  8. Portfolio: a prototype workstation for development and evaluation of tools for analysis and management of digital portal images.

    PubMed

    Boxwala, A A; Chaney, E L; Fritsch, D S; Friedman, C P; Rosenman, J G

    1998-09-01

    The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images to detect treatment setup errors. The study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows-based user interface, a compact kit of carefully selected image analysis tools, and an object-oriented database infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. Acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Radiation oncologists, the subjects of this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical.

  9. Design and implementation of visualization methods for the CHANGES Spatial Decision Support System

    NASA Astrophysics Data System (ADS)

    Cristal, Irina; van Westen, Cees; Bakker, Wim; Greiving, Stefan

    2014-05-01

    The CHANGES Spatial Decision Support System (SDSS) is a web-based system aimed at risk assessment and the evaluation of optimal risk reduction alternatives at the local level, serving as a decision support tool in long-term natural risk management. The SDSS uses multidimensional information, integrating thematic, spatial, temporal and documentary data. The role of visualization in this context becomes vitally important for efficiently representing each dimension. The multidimensional nature of the risk information required by the system, combined with the diversity of the end users, calls for sophisticated visualization methods and tools. The key goal of the present work is to exploit the large amount of data efficiently in relation to the needs of the end user, utilizing proper visualization techniques. Three main tasks have been accomplished for this purpose: categorization of the end users, definition of the system's modules and definition of the data. The graphical representation of the data and the visualization tools were designed to be relevant to the data type and the purpose of the analysis. Depending on the end-user category, each user has access to different modules of the system and thus to the appropriate visualization environment. The technologies used for the development of the visualization component combine the latest and most innovative open-source JavaScript frameworks, such as OpenLayers 2.13.1, ExtJS 4 and GeoExt 2. Moreover, the model-view-controller (MVC) pattern is used in order to ensure flexibility of the system at the implementation level. Using the above technologies, the visualization techniques implemented so far offer interactive map navigation, querying and comparison tools.
The map comparison tools are of great importance within the SDSS and include the following: a swiping tool for comparison of different data for the same location; raster subtraction for comparison of the same phenomenon varying in time; linked views for comparison of data from different locations; and a time slider tool for monitoring changes in spatio-temporal data. All these techniques are part of the interactive interface of the system and make use of spatial and spatio-temporal data. Further significant aspects of the visualization component include conventional cartographic techniques and visualization of non-spatial data. The main expectation of the present work is to offer efficient visualization of risk-related data in order to facilitate the decision-making process, which is the final purpose of the CHANGES SDSS. This work is part of the "CHANGES" project, funded by the European Community's 7th Framework Programme.
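The raster-subtraction comparison mentioned above reduces to a cell-wise difference of two co-registered grids. The sketch below uses plain nested lists and invented values to stay self-contained; a real SDSS would operate on georeferenced raster layers.

```python
# Cell-wise difference of two hazard rasters of the same area at different
# times: non-zero cells mark where the modelled phenomenon changed.
def raster_subtract(a, b):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

# Toy 2x2 hazard grids at times t0 and t1 (values invented).
t0 = [[0.2, 0.5],
      [0.9, 0.1]]
t1 = [[0.2, 0.7],
      [0.4, 0.1]]

diff = raster_subtract(t1, t0)
changed = sum(1 for row in diff for v in row if abs(v) > 1e-9)
```

The same difference grid can then feed a choropleth or the swipe view, so the user sees at a glance which cells gained or lost hazard between the two epochs.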

  10. Gulf Petro Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fathi Boukadi

    2011-02-05

    In this report, technologies for petroleum production and exploration enhancement in deepwater and mature fields are developed through basic and applied research by: (1) Designing new fluids to efficiently drill deepwater wells that cannot be cost-effectively drilled with current technologies. The new fluids will be heavy liquid foams that have low density at shallow depth, to avoid formation breakdown, and high density at drilling depth, to control formation pressure. The goal of this project is to provide industry with formulations of new fluids for reducing casing programs and thus well construction cost in deepwater development. (2) Studying the effects of flue gas/CO2 huff-n-puff on incremental oil recovery in Louisiana oilfields bearing light oil. An artificial neural network (ANN) model will be developed and used to map recovery efficiencies for candidate reservoirs in Louisiana. (3) Arriving at a quantitative understanding of the three-dimensional controlled-source electromagnetic (CSEM) geophysical response of typical Gulf of Mexico hydrocarbon reservoirs. We will seek to make available tools for the qualitative, rapid interpretation of marine CSEM signatures, and tools for efficient, three-dimensional subsurface conductivity modeling.

  11. Protein structural similarity search by Ramachandran codes

    PubMed Central

    Lo, Wei-Cheng; Huang, Po-Jung; Chang, Chih-Hung; Lyu, Ping-Chiang

    2007-01-01

    Background Protein structural data have increased exponentially, such that fast and accurate tools are necessary for structural similarity search. To improve search speed, several methods have been designed to reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, accuracy is usually sacrificed and the speed is still unable to match that of sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Classical sequence similarity search methods can then be applied to the structural similarity search. Its accuracy is similar to Combinatorial Extension (CE) and it works over 243,000 times faster, searching 34,000 proteins in 0.34 s on a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented as a web service and as a stand-alone Java program able to run on many different platforms. Conclusion As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. These search tools should be applicable to automated, high-throughput functional annotation and prediction for the ever-increasing number of published protein structures in this post-genomic era. PMID:17716377
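The core encoding idea, replacing each residue's backbone (phi, psi) pair with the letter of its nearest cluster centre on the Ramachandran map, can be sketched as below. The three centres and their letters are invented for illustration; SARST's actual clustering uses many more states, and angle wrap-around at ±180° is ignored here for brevity.

```python
import math

# Toy cluster centres on the Ramachandran map (degrees); letters invented.
CENTRES = {"H": (-60.0, -45.0),    # roughly the alpha-helical region
           "E": (-120.0, 130.0),   # roughly the beta-sheet region
           "L": (60.0, 45.0)}      # left-handed helix / loop region

def encode(dihedrals):
    """Map a list of (phi, psi) pairs to a one-letter-per-residue string."""
    def nearest(phi, psi):
        return min(CENTRES, key=lambda c: math.hypot(phi - CENTRES[c][0],
                                                     psi - CENTRES[c][1]))
    return "".join(nearest(phi, psi) for phi, psi in dihedrals)

# A short helix followed by a strand becomes a plain string that any
# sequence-alignment tool can index and search.
code = encode([(-57, -47), (-63, -41), (-119, 135), (-125, 128)])
```

Once structures are strings, the heavy machinery of sequence search (substitution matrices, E-values, indexed databases) applies directly, which is what buys the reported speed.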

  12. Determining preventability of pediatric readmissions using fault tree analysis.

    PubMed

    Jonas, Jennifer A; Devon, Erin Pete; Ronan, Jeanine C; Ng, Sonia C; Owusu-McKenzie, Jacqueline Y; Strausbaugh, Janet T; Fieldston, Evan S; Hart, Jessica K

    2016-05-01

    Previous studies attempting to distinguish preventable from nonpreventable readmissions reported challenges in completing reviews efficiently and consistently. Our objectives were to: (1) examine the efficiency and reliability of a Web-based fault tree tool designed to guide physicians through chart reviews to a determination about preventability; and (2) investigate root causes of general pediatrics readmissions and identify the percentage that are preventable. General pediatricians from The Children's Hospital of Philadelphia used a Web-based fault tree tool to classify the root causes of all general pediatrics 15-day readmissions in 2014. The tool guided reviewers through a logical progression of questions, which resulted in 1 of 18 root causes of readmission, 8 of which were considered potentially preventable. Twenty percent of cases were cross-checked to measure inter-rater reliability. Of the 7252 discharges, 248 were readmitted, for an all-cause general pediatrics 15-day readmission rate of 3.4%. Of those readmissions, 15 (6.0%) were deemed potentially preventable, corresponding to 0.2% of total discharges. The most common cause of potentially preventable readmissions was premature discharge. For the 50 cross-checked cases, both reviews resulted in the same root cause for 44 (86%) of the files (κ = 0.79; 95% confidence interval: 0.60-0.98). Completing one review using the tool took approximately 20 minutes. The Web-based fault tree tool helped physicians identify root causes of hospital readmissions and classify them as either preventable or not preventable in an efficient and consistent way. It also confirmed that only a small percentage of general pediatrics 15-day readmissions are potentially preventable. Journal of Hospital Medicine 2016;11:329-335. © 2016 Society of Hospital Medicine.
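The inter-rater check above is a standard Cohen's kappa computation over a reviewer-by-reviewer confusion table. The sketch below collapses the study's 18 root-cause categories to a binary preventable/not-preventable split with invented counts (chosen only to match the 44-of-50 agreement figure), so the resulting kappa differs from the study's κ = 0.79, which was computed over the full category set.

```python
def cohens_kappa(table):
    """Cohen's kappa from a square table: table[i][j] counts cases rated
    category i by reviewer 1 and category j by reviewer 2."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / n   # observed agreement
    pe = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
             for i in range(len(table)))                   # chance agreement
    return (po - pe) / (1 - pe)

# Invented binary table for 50 cross-checked charts: 25 + 19 agreements
# on each category, 6 disagreements (44/50 = 86% raw agreement).
kappa = cohens_kappa([[25, 3], [3, 19]])
```

Kappa discounts the raw 86% agreement by what two reviewers would agree on by chance given their marginal rates, which is why it is the reliability statistic reported rather than percent agreement alone.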

  13. A knowledge-based design framework for airplane conceptual and preliminary design

    NASA Astrophysics Data System (ADS)

    Anemaat, Wilhelmus A. J.

    The goal of the work described herein is to develop the second generation of Advanced Aircraft Analysis (AAA) into an object-oriented structure which can be used in different environments. One such environment is the third generation of AAA, with its own user interface; the other is the AAA-AML program, which uses the same AAA methods (i.e., the knowledge). AAA-AML automates the initial airplane design process using current AAA methods in combination with AMRaven methodologies for dependency tracking and knowledge management, using the TechnoSoft Adaptive Modeling Language (AML). This leads to the following benefits: (1) Reduced design time: computer-aided design methods can reduce design and development time and replace tedious hand calculations. (2) A better product through improved design: more alternative designs can be evaluated in the same time span, which can lead to improved quality. (3) Reduced design cost: less training and fewer calculation errors yield substantial savings in design time and related cost. (4) Improved efficiency: the design engineer can avoid technically correct but irrelevant calculations on incomplete or out-of-sync information, particularly if the process enables robust geometry earlier. Although numerous advancements in knowledge-based design have been developed for detailed design, no such integrated knowledge-based conceptual and preliminary airplane design system currently exists. The third-generation AAA methods have been tested over a ten-year period on many different airplane designs. Using AAA methods will demonstrate significant time savings. The AAA-AML system will be exercised and tested using 27 existing airplanes ranging from single-engine propeller aircraft, business jets, airliners and UAVs to fighters. Data for the various sizing methods will be compared with AAA results to validate these methods.
One new design, a Light Sport Aircraft (LSA), will be developed as an exercise in using the tool to design a new airplane. Using these tools will show an improvement in efficiency over using separate programs, due to the automatic recalculation with any change of input data. The direct visual feedback of 3D geometry in AAA-AML will lead to quicker resolution of problems compared with conventional methods.

  14. An Overview of Tools for Creating, Validating and Using PDS Metadata

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.

    2017-12-01

    NASA's Planetary Data System (PDS) has defined information models for creating metadata to describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely used and describes most of the existing planetary archive. Recently, PDS released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve the consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some third-party tools, both free and paid, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those currently under development and provide guidance as to which tools may be most useful for missions, instrument teams and the individual researcher.

  15. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was designed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating systems. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. The user interface's design therefore minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface; this choice keeps future software maintenance costs within a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  16. SYRCLE’s risk of bias tool for animal studies

    PubMed Central

    2014-01-01

    Background Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCT) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063

  17. GMOseek: a user friendly tool for optimized GMO testing.

    PubMed

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

    With the increasing pace at which new Genetically Modified Organisms (GMOs) are authorized or in the pipeline for commercialization worldwide, the task of the laboratories charged with testing the compliance of food, feed or seed samples with the relevant regulations has become difficult and costly. Many of them have already adopted the so-called "matrix approach" to rationalize resources and increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and, at best, a proprietary computational tool to make efficient use of the available data. The GMOseek software is designed to support decision making in all phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, of the laboratory analysis history, and of the available information about the sample at hand. It uses an optimization approach to suggest the screening assays best suited to the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user in selecting appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps in taking a final decision regarding the GMO composition of the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his/her preferences in terms of multiplexing and cost.
The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference laboratory for GMO testing and by comparing its performance to existing tools which use the matrix approach. GMOseek proves superior when tested on real samples in terms of GMO coverage and cost efficiency of its screening strategies, including its capacity of simple interpretation of the testing results.
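    The abstract does not specify GMOseek's actual optimization algorithm, but the problem it describes, covering all GM events of interest with a cost-efficient combination of screening assays, is a weighted set-cover problem. The following sketch uses a standard greedy heuristic for that problem; the event names, assay names, and costs are invented for illustration.

```python
# Illustrative greedy weighted set-cover heuristic for assay selection.
# Not GMOseek's actual algorithm; a sketch of the underlying idea.
def greedy_assay_selection(events, assays):
    """events: set of GM events that must be covered.
    assays: dict mapping assay name -> (cost, set of events detected).
    Greedily picks the assay with the lowest cost per newly covered event."""
    uncovered = set(events)
    chosen = []
    while uncovered:
        best = min(
            (name for name, (cost, hits) in assays.items()
             if hits & uncovered),
            key=lambda n: assays[n][0] / len(assays[n][1] & uncovered),
            default=None,
        )
        if best is None:          # remaining events cannot be covered
            break
        chosen.append(best)
        uncovered -= assays[best][1]
    return chosen

# Hypothetical screening matrix: two cheap element screens (p35S promoter,
# tNOS terminator) and one expensive event-specific test.
events = {"MON810", "NK603", "GT73"}
assays = {
    "p35S": (1.0, {"MON810", "NK603"}),
    "tNOS": (1.0, {"NK603", "GT73"}),
    "event-specific-GT73": (3.0, {"GT73"}),
}
print(greedy_assay_selection(events, assays))  # ['p35S', 'tNOS']
```

    Here the two element screens together cover all three events for a total cost of 2.0, so the expensive event-specific assay is never selected, which is exactly the kind of screening-cost saving the matrix approach aims for.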

  18. Miniaturization of Planar Horn Motors

    NASA Technical Reports Server (NTRS)

    Sherrit, Stewart; Ostlund, Patrick N.; Chang, Zensheu; Bao, Xiaoqi; Bar-Cohen, Yoseph; Widholm, Scott E.; Badescu, Mircea

    2012-01-01

    There is a great need for compact, efficient motors for driving various mechanisms, including robots and mobility platforms. A study is currently underway to develop a new type of piezoelectric actuator with significantly greater strength, low mass, small footprint, and high efficiency. The actuators/motors utilize piezoelectrically actuated horns, which have a very high power density and high electromechanical conversion efficiency. The horns are fabricated using our recently developed pre-stress flexures, which make them thermally stable and increase their coupling efficiency. The monolithic design, with integrated flexures that pre-stress the piezoelectric stack, eliminates the need for a stress bolt. This design allows embedding solid-state motors and actuators in any structure so that the only macroscopically moving parts are the rotor or the linear translator. The developed actuator uses stack/horn actuation in a Barth motor configuration, which potentially generates very large torque and speeds that do not require gearing. Finite element modeling and design tools were used to determine the requirements and operating parameters, and the results were used to design and fabricate a motor. This new design offers a highly promising actuation mechanism that can potentially be miniaturized and integrated into systems and structures. It can be configured in many shapes to operate as a multi-degree-of-freedom, multi-dimensional motor/actuator, including unidirectional, bidirectional, 2D, and 3D configurations. In this manuscript, we report experimental measurements from a benchtop design and the results of efforts to miniaturize the design using 2 x 2 x 2 mm piezoelectric stacks integrated into thin plates on the order of 3 x 3 x 0.2 cm.

  19. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    PubMed Central

    Calcagno, Cristina; Coppo, Mario

    2014-01-01

    This paper presents enabling methodologies for the design of a fully parallel, online, interactive tool aimed at supporting bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are demonstrated on a simulation tool for the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation requires thousands of independent simulation trajectories, which turn into big data that should be analysed with statistical and data-mining tools. In the considered approach, the two stages are pipelined in such a way that the simulation stage streams the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological system behavior, on a multicore platform and on representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide software designers with key features such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed. PMID:25050327
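    The pipelined simulation-analysis pattern described above can be sketched with plain Python threads in place of FastFlow: a simulation stage streams partial trajectory values through a bounded queue to an analysis stage that maintains online statistics, so no trajectory is ever held in memory. The trajectory model (a Gaussian random walk) and Welford's running mean/variance are illustrative choices, not the paper's specific methods.

```python
# Minimal producer/consumer sketch of a streaming simulation-analysis
# pipeline with online statistics (Welford's algorithm).
import queue
import random
import threading

def simulate(n_trajectories, steps, out_q):
    """Simulation stage: stream every partial trajectory value downstream."""
    random.seed(42)
    for _ in range(n_trajectories):
        x = 0.0
        for _ in range(steps):
            x += random.gauss(0, 1)   # one stochastic step
            out_q.put(x)              # stream the partial result
    out_q.put(None)                   # end-of-stream marker

def analyse(in_q, stats):
    """Analysis stage: online mean/variance; stores no trajectory data."""
    n, mean, m2 = 0, 0.0, 0.0
    while (x := in_q.get()) is not None:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    stats["n"] = n
    stats["mean"] = mean
    stats["var"] = m2 / (n - 1) if n > 1 else 0.0

q, stats = queue.Queue(maxsize=1024), {}
producer = threading.Thread(target=simulate, args=(100, 50, q))
consumer = threading.Thread(target=analyse, args=(q, stats))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(stats["n"])  # 5000 partial results processed online
```

    The bounded queue is the key design choice: it provides back-pressure, so the simulation stage never runs arbitrarily far ahead of the analysis stage, which is what keeps the in-memory footprint constant regardless of how many trajectories are produced.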

  20. E-KIT: An Electronic-Knowledge Information Tool for Organizing Site Information and Improving Technical Communication with Stakeholders - 13082

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kautsky, Mark; Findlay, Richard C.; Hodges, Rex A.

    2013-07-01

    Managing technical references for projects with long histories is hampered by the large collection of documents, each of which might contain discrete pieces of information relevant to the site conceptual model. A database application has been designed to improve the efficiency of retrieving technical information for a project. Although many databases are currently used for accessing analytical and geo-referenced data, applications designed specifically to manage technical reference material for projects are scarce, and retrieving site data from the array of available references becomes an increasingly inefficient use of labor. The electronic-Knowledge Information Tool (e-KIT) is designed as a project-level resource for accessing and communicating technical information. The e-KIT is a living tool that grows as new information becomes available, and its value to the project increases as the volume of site information increases. Having all references assembled in one location, with complete citations and links to elements of the site conceptual model, offers a way to enhance communication with outside groups. Both published and unpublished references are incorporated into the e-KIT, and the compendium of references serves as a complete bibliography for the project. (authors)
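    The core of an e-KIT-style index, references with full citations linked many-to-many to elements of the site conceptual model, can be sketched as a small relational schema. The abstract does not describe the actual e-KIT schema, so every table and column name below is invented for illustration.

```python
# Hypothetical relational sketch of a reference index linking documents
# to site-conceptual-model elements (not the actual e-KIT schema).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE reference (
    id        INTEGER PRIMARY KEY,
    citation  TEXT NOT NULL,       -- complete bibliographic citation
    published INTEGER NOT NULL     -- 1 = published, 0 = unpublished
);
CREATE TABLE model_element (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL             -- element of the site conceptual model
);
CREATE TABLE reference_link (      -- many-to-many link table
    reference_id INTEGER REFERENCES reference(id),
    element_id   INTEGER REFERENCES model_element(id)
);
""")
con.execute("INSERT INTO reference VALUES (1, 'Site report, 1998', 1)")
con.execute("INSERT INTO model_element VALUES (1, 'groundwater plume')")
con.execute("INSERT INTO reference_link VALUES (1, 1)")

# Retrieve every reference relevant to one conceptual-model element.
rows = con.execute("""
    SELECT r.citation FROM reference r
    JOIN reference_link l ON l.reference_id = r.id
    JOIN model_element  e ON e.id = l.element_id
    WHERE e.name = 'groundwater plume'
""").fetchall()
print(rows)  # [('Site report, 1998',)]
```

    The link table is what makes the collection queryable in both directions: all references bearing on one model element, or all model elements a given document touches, which is the retrieval-efficiency gain the abstract describes.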
