Sample records for concurrent engineering design

  1. Characterizing Distributed Concurrent Engineering Teams: A Descriptive Framework for Aerospace Concurrent Engineering Design Teams

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Debarati; Hihn, Jairus; Warfield, Keith

    2011-01-01

    As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades in a cost-efficient manner. To successfully accomplish these complex missions with limited funding, it is also essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies by increased cooperation between organizations. As a result, the existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. This paper is an extension of a recent white paper written by the Concurrent Engineering Working Group, which details the unique challenges of distributed collaborative concurrent engineering. This paper includes a short history of aerospace concurrent engineering, and defines the terms 'concurrent', 'collaborative' and 'distributed' in the context of aerospace concurrent engineering. In addition, a model for the levels of complexity of concurrent engineering teams is presented to provide a way to conceptualize information and data flow within these types of teams.

  2. Aerospace Concurrent Engineering Design Teams: Current State, Next Steps and a Vision for the Future

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Debarati; Karpati, Gabriel; McGuire, Melissa; Borden, Chester; Panek, John; Warfield, Keith

    2011-01-01

    Over the past sixteen years, government aerospace agencies and aerospace industry have developed and evolved operational concurrent design teams to create novel spaceflight mission concepts and designs. These capabilities and teams, however, have evolved largely independently. In today's environment of increasingly complex missions with limited budgets it is becoming readily apparent that both implementing organizations and today's concurrent engineering teams will need to interact more often than they have in the past. This will require significant changes in the current state of practice. This paper documents the findings from a concurrent engineering workshop held in August 2010 to identify the key near term improvement areas for concurrent engineering capabilities and challenges to the long-term advancement of concurrent engineering practice. The paper concludes with a discussion of a proposed vision for the evolution of these teams over the next decade.

  3. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes the first two steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
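    The first step above amounts to differencing the session's design models around the converged point design. As a rough illustration of the idea (not the paper's actual models or tooling), the sketch below captures local sensitivities of a toy system model by central finite differences; the model, parameter names, and values are hypothetical.

    ```python
    # Sketch: capture local trade-space sensitivities around a point design
    # via central finite differences. The model below is a hypothetical
    # stand-in, not the paper's actual subsystem model.
    def mass_model(params):
        """Toy system model: spacecraft dry mass as a function of requirements."""
        return (150.0 + 0.8 * params["power_w"]
                + 0.05 * params["data_rate_kbps"]
                + 0.1 * params["delta_v_ms"])

    def local_sensitivities(model, point, rel_step=0.01):
        """Central-difference partial derivatives at the point design."""
        sens = {}
        for key, value in point.items():
            h = rel_step * abs(value) or rel_step
            hi = dict(point); hi[key] = value + h
            lo = dict(point); lo[key] = value - h
            sens[key] = (model(hi) - model(lo)) / (2.0 * h)
        return sens

    point_design = {"power_w": 400.0, "data_rate_kbps": 2000.0, "delta_v_ms": 1200.0}
    print(local_sensitivities(mass_model, point_design))
    ```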

  4. Concurrent Engineering Working Group White Paper Distributed Collaborative Design: The Next Step in Aerospace Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Debarati; Karpati, Gabriel; McGuire, Melissa; Panek, John; Warfield, Keith; Borden, Chester

    2011-01-01

    As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades of performance, cost and schedule. To successfully accomplish these complex missions with limited funding, it is essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies by increased cooperation between organizations. As a result, the existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. The purpose of this white paper is to identify a near-term vision for the future of distributed collaborative concurrent engineering design for aerospace missions as well as discuss the challenges to achieving that vision. The white paper also documents the advantages of creating a working group to investigate how to engage the expertise of different teams in joint design sessions while enabling each organization to maintain its competitive advantage.

  5. Concurrent engineering: Spacecraft and mission operations system design

    NASA Technical Reports Server (NTRS)

    Landshof, J. A.; Harvey, R. J.; Marshall, M. H.

    1994-01-01

    Despite our awareness of the mission design process, spacecraft historically have been designed and developed by one team and then turned over as a system to the Mission Operations organization to operate on-orbit. By applying concurrent engineering techniques and envisioning operability as an essential characteristic of spacecraft design, tradeoffs can be made in the overall mission design to minimize mission lifetime cost. Lessons learned from previous spacecraft missions will be described, as well as the implementation of concurrent mission operations and spacecraft engineering for the Near Earth Asteroid Rendezvous (NEAR) program.

  6. The MEOW lunar project for education and science based on concurrent engineering approach

    NASA Astrophysics Data System (ADS)

    Roibás-Millán, E.; Sorribes-Palmer, F.; Chimeno-Manguán, M.

    2018-07-01

    The use of concurrent engineering in the design of space missions allows the high level of coupling and iteration among mission subsystems to be taken into account within an interrelated methodology during the preliminary conceptual phase. This work presents the result of applying concurrent engineering over a short time span to design the main elements of the preliminary design for a lunar exploration mission, developed within the ESA Academy Concurrent Engineering Challenge 2017. During this program, students of the Master in Space Systems at the Technical University of Madrid designed a low-cost satellite to find water at the Moon's south pole as a prospect for a future human lunar base. The resulting mission, the Moon Explorer And Observer of Water/Ice (MEOW), comprises a 262 kg spacecraft to be launched into a Geostationary Transfer Orbit as a secondary payload in the 2023/2025 time frame. A three-month Weak Stability Boundary transfer via the Sun-Earth L1 Lagrange point allows for high launch-timeframe flexibility. The different aspects of the mission (orbit analysis, spacecraft design, and payload) and the possibilities of concurrent engineering are described.

  7. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  8. Concurrent Software Engineering Project

    ERIC Educational Resources Information Center

    Stankovic, Nenad; Tillo, Tammam

    2009-01-01

    Concurrent engineering or overlapping activities is a business strategy for schedule compression on large development projects. Design parameters and tasks from every aspect of a product's development process and their interdependencies are overlapped and worked on in parallel. Concurrent engineering suffers from negative effects such as excessive…

  9. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural, and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  10. DARPA Concurrent Design/Concurrent Engineering Workshop Held in Key West, Florida on December 6-8, 1988

    DTIC Science & Technology

    1988-12-01

    Workshop topics included training in multifunction team management disciplines, quality engineering methods, and experimental design. The view of strategic issues has been evolving: the speed of design and product deployment, accelerated experimentation with new manufacturing process designs, and new technologies (e.g., composites) which can revolutionize product technical design in some cases.

  11. Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling

    NASA Technical Reports Server (NTRS)

    Schuman, Todd; de Weck, Olivier L.; Sobieski, Jaroslaw

    2005-01-01

    The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
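    To make the ISLOCE idea concrete, here is a minimal sketch under stated assumptions: an inexpensive neural-network surrogate is fitted to samples of a subsystem model, and a system-level optimizer then searches the surrogate instead of the expensive code. The toy tank-mass function, bounds, and network size are illustrative stand-ins, not the paper's actual models (input/output scaling is also omitted for brevity).

    ```python
    # Sketch of the ISLOCE idea: replace an expensive subsystem model with a
    # neural-network approximation, then hand the cheap surrogate to a
    # system-level optimizer. The tank-mass function is a toy stand-in.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from scipy.optimize import minimize

    def tank_mass(x):  # x = [radius_m, wall_thickness_mm]
        r, t = x
        return 50.0 * r**2 * t + 200.0 / (r * t)  # toy structural model

    rng = np.random.default_rng(0)
    X = rng.uniform([1.0, 2.0], [4.0, 10.0], size=(400, 2))  # sample design space
    y = np.array([tank_mass(x) for x in X])

    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    surrogate.fit(X, y)

    # The system-level optimization runs against the fast surrogate, not the real code.
    result = minimize(lambda x: float(surrogate.predict([x])[0]),
                      x0=[2.0, 5.0], bounds=[(1.0, 4.0), (2.0, 10.0)])
    print("surrogate optimum:", result.x, "predicted mass:", result.fun)
    ```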

  12. Implementation of a Three-Semester Concurrent Engineering Design Sequence for Lower-Division Engineering Students

    ERIC Educational Resources Information Center

    Bertozzi, N.; Hebert, C.; Rought, J.; Staniunas, C.

    2007-01-01

    Over the past decade the software products available for solid modeling, dynamic, stress, thermal, and flow analysis, and computer-aiding manufacturing (CAM) have become more powerful, affordable, and easier to use. At the same time it has become increasingly important for students to gain concurrent engineering design and systems integration…

  13. Group Design Problems in Engineering Design Graphics.

    ERIC Educational Resources Information Center

    Kelley, David

    2001-01-01

    Describes group design techniques used within the engineering design graphics sequence at Western Washington University. Engineering and design philosophies such as concurrent engineering place an emphasis on group collaboration for the solving of design problems. (Author/DDR)

  14. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  15. Model-Based Systems Engineering in Concurrent Engineering Centers

    NASA Technical Reports Server (NTRS)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a focused design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  16. Model-Based Systems Engineering in Concurrent Engineering Centers

    NASA Technical Reports Server (NTRS)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a narrow design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  17. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

    The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
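    The abstract describes an HTTP-based API for publishing and reading shared parameters but does not document the actual endpoints, so the client sketch below is purely hypothetical: the server URL, paths, and payload fields are invented to illustrate the pattern, not GLIDE's real interface.

    ```python
    # Hypothetical sketch of an HTTP parameter-sharing client in the style
    # the abstract describes. Endpoints and payloads are assumptions.
    import requests

    BASE = "https://glide.example.nasa.gov/api"  # hypothetical server URL

    def publish_parameter(session_id, name, value, units, token):
        """Push one discipline output so other stations see it immediately."""
        resp = requests.post(f"{BASE}/sessions/{session_id}/parameters",
                             json={"name": name, "value": value, "units": units},
                             headers={"Authorization": f"Bearer {token}"},
                             timeout=10)
        resp.raise_for_status()
        return resp.json()

    def read_parameter(session_id, name, token):
        """Pull another discipline's latest value instead of re-typing it."""
        resp = requests.get(f"{BASE}/sessions/{session_id}/parameters/{name}",
                            headers={"Authorization": f"Bearer {token}"},
                            timeout=10)
        resp.raise_for_status()
        return resp.json()["value"]
    ```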

  18. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.

  19. The Application of Concurrent Engineering Tools and Design Structure Matrix in Designing Tire

    NASA Astrophysics Data System (ADS)

    Ginting, Rosnani; Fachrozi Fitra Ramadhan, T.

    2016-02-01

    The automobile industry in Indonesia is growing rapidly. This phenomenon means that companies related to the automobile industry, such as the tire industry, must develop products based on customers' needs while considering the timeliness of delivering the product to the customer. This can be achieved by applying strategic planning to develop an integrated concept of product development. This research was conducted at PT. XYZ, which applied a sequential approach to designing and developing products. The need for improvement at one stage of product development could force re-design, which lengthens new product development. This research is intended to produce an integrated product design concept for a tire pertaining to the customer's needs using Concurrent Engineering tools, by implementing the two phases of product development. The implementation of the Concurrent Engineering approach results in applying the stages of project planning, conceptual design, and product modules. The product modules consist of four modules derived using a Product Architecture Design Structure Matrix to ease the design process of new product development.
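    For readers unfamiliar with the Design Structure Matrix used above, a minimal sketch follows; the module names and dependencies are invented for illustration and are not taken from the paper.

    ```python
    # Minimal Design Structure Matrix (DSM) sketch for tire modules.
    # Module names and dependencies are illustrative.
    import numpy as np

    modules = ["tread", "belt package", "carcass", "bead"]
    # dsm[i, j] = 1 means module i needs information from module j.
    dsm = np.array([
        [0, 1, 0, 0],
        [1, 0, 1, 0],   # belt package and tread feed each other
        [0, 0, 0, 1],
        [0, 0, 0, 0],
    ])

    # Mutually dependent pairs must be designed concurrently (iterated);
    # one-way dependencies can simply be sequenced.
    coupled = [(modules[i], modules[j])
               for i in range(len(modules)) for j in range(i)
               if dsm[i, j] and dsm[j, i]]
    print("coupled module pairs requiring concurrent iteration:", coupled)
    ```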

  20. How Engineers Really Think About Risk: A Study of JPL Engineers

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Deb; Valerdi, Ricardo

    2011-01-01

    The objectives of this work are to improve risk assessment practices as used during the mission design process by JPL's concurrent engineering teams by (1) developing effective ways to identify and assess mission risks, (2) providing a process for more effective dialog between stakeholders about the existence and severity of mission risks, and (3) enabling the analysis of interactions of risks across concurrent engineering roles.

  1. Product design for energy reduction in concurrent engineering: An Inverted Pyramid Approach

    NASA Astrophysics Data System (ADS)

    Alkadi, Nasr M.

    Energy factors in product design in concurrent engineering (CE) are becoming an emerging dimension for several reasons: (a) the rising interest in "green design and manufacturing", (b) national energy security concerns and the dramatic increase in energy prices, (c) global competition in the marketplace and global climate change commitments including carbon tax and emission trading systems, and (d) the widespread recognition of the need for sustainable development. This research presents a methodology for the intervention of energy factors in the concurrent engineering product development process to significantly reduce the manufacturing energy requirement. The work presented here is the first attempt at integrating design for energy into a concurrent engineering framework. It adds an important tool to the DFX toolbox for evaluating the impact of design decisions on the product manufacturing energy requirement early during the design phase. The research hypothesis states that "Product Manufacturing Energy Requirement is a Function of Design Parameters". The hypothesis was tested by conducting experimental work in machining and heat treating that took place at the manufacturing lab of the Industrial and Management Systems Engineering Department (IMSE) at West Virginia University (WVU) and at a major U.S. steel manufacturing plant, respectively. The objective of the machining experiment was to study the effect of changing specific product design parameters (material type and diameter) and process design parameters (metal removal rate) on a gear-head lathe's input power requirement through performing defined sets of machining experiments. The objective of the heat-treating experiment was to study the effect of varying product charging temperature on the fuel consumption of a walking-beam reheat furnace. The experimental work in both directions has revealed important insights into energy utilization in machining and heat-treating processes and its variance based on product, process, and system design parameters. An in-depth evaluation of how design and manufacturing normally happen in concurrent engineering provided a framework to develop energy system levels in machining within the concurrent engineering environment, using the "Inverted Pyramid Approach" (IPA). The IPA features varying levels of output energy-based information depending on the input design parameters that are available during each stage (level) of product design. The experimental work, the in-depth evaluation of design and manufacturing in CE, and the developed energy system levels in machining provided a solid base for the development of the model for design for energy reduction in CE. The model was used to analyze an example part where 12 evolving designs were thoroughly reviewed to investigate the sensitivity of energy to design parameters in machining. The model allowed product design teams to address manufacturing energy concerns early during the design stage. As a result, ranges for energy-sensitive design parameters impacting product manufacturing energy consumption were found at earlier levels. As the designer proceeds to deeper levels in the model, this range tightens, resulting in significant energy reductions.
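    The machining side of the hypothesis can be illustrated with the standard specific-cutting-energy relation P = U x MRR, which ties input power directly to product parameters (material, diameter) and process parameters (feed, depth of cut, spindle speed). The sketch below uses textbook order-of-magnitude U values, not the dissertation's measured data.

    ```python
    # Back-of-envelope sketch: manufacturing energy as a function of design
    # parameters, via the specific-cutting-energy relation for turning.
    # U values are textbook-order-of-magnitude, not measured data.
    SPECIFIC_CUTTING_ENERGY_J_PER_MM3 = {"aluminum": 0.7, "mild steel": 3.0}

    def cutting_power_w(material, diameter_mm, feed_mm_per_rev, depth_mm, rpm):
        """Estimate cutting power from product (material, diameter) and
        process (feed, depth of cut, spindle speed) design parameters."""
        cutting_speed_mm_s = 3.14159 * diameter_mm * rpm / 60.0
        mrr_mm3_s = feed_mm_per_rev * depth_mm * cutting_speed_mm_s
        return SPECIFIC_CUTTING_ENERGY_J_PER_MM3[material] * mrr_mm3_s

    # A harder material (or a larger diameter at the same rpm) raises the power
    # draw: a design decision visible before the part reaches the shop floor.
    print(cutting_power_w("aluminum", 50, 0.2, 2.0, 800))    # ~0.6 kW
    print(cutting_power_w("mild steel", 50, 0.2, 2.0, 800))  # ~2.5 kW
    ```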

  2. True Concurrent Thermal Engineering Integrating CAD Model Building with Finite Element and Finite Difference Methods

    NASA Technical Reports Server (NTRS)

    Panczak, Tim; Ring, Steve; Welch, Mark

    1999-01-01

    Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from, and have little useful connection, if any, to CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system is fully integrated into a neutral, low-cost CAD system and utilizes both FEM and FD methods, yet does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric surface-based radiation and FD-based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.
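    As a reminder of what the FD side of such a tool computes, here is a minimal explicit finite-difference sketch of 1D transient conduction, far simpler than a SINDA/FLUINT network but of the same family; the material properties, node count, and boundary conditions are illustrative.

    ```python
    # Explicit 1D transient conduction on a small node network: a toy
    # example of the finite-difference modeling the abstract refers to.
    import numpy as np

    alpha = 9.7e-5      # thermal diffusivity of aluminum, m^2/s
    dx, dt = 0.01, 0.2  # node spacing (m) and time step (s)
    assert alpha * dt / dx**2 < 0.5, "explicit scheme stability limit"

    T = np.full(20, 293.0)   # 20 nodes, initially 20 C
    T[0] = 400.0             # fixed heated boundary node

    for _ in range(500):     # march the diffusion equation forward in time
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    print(T.round(1))        # temperature profile after 100 s
    ```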

  3. How to Quickly Import CAD Geometry into Thermal Desktop

    NASA Technical Reports Server (NTRS)

    Wright, Shonte; Beltran, Emilio

    2002-01-01

    There are several groups at JPL (Jet Propulsion Laboratory) that are committed to concurrent design efforts, two are featured here. Center for Space Mission Architecture and Design (CSMAD) enables the practical application of advanced process technologies in JPL's mission architecture process. Team I functions as an incubator for projects that are in the Discovery, and even pre-Discovery proposal stages. JPL's concurrent design environment is to a large extent centered on the CAD (Computer Aided Design) file. During concurrent design sessions CAD geometry is ported to other more specialized engineering design packages.

  4. Utilization of CAD/CAE for concurrent design of structural aircraft components

    NASA Technical Reports Server (NTRS)

    Kahn, William C.

    1993-01-01

    The feasibility of installing the Stratospheric Observatory for Infrared Astronomy telescope (named SOFIA) into an aircraft for NASA astronomy studies is investigated using CAD/CAE equipment to either design or supply data for every facet of design engineering. The aircraft selected for the platform was a Boeing 747, chosen on the basis of its ability to meet the flight profiles required for the given mission and payload. CAD models of the fuselage of two of the aircraft models studied (747-200 and 747 SP) were developed, and models for the component parts of the telescope and subsystems were developed by the various concurrent engineering groups of the SOFIA program, to determine the requirements for the cavity opening and for design configuration. It is noted that, by developing a plan to use CAD/CAE for concurrent engineering at the beginning of the study, it was possible to produce results in about two-thirds of the time required using traditional methods.

  5. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a 'next-generation CED; in addition to a point design, the Team develops a model of the local trade space. The process is a balance between the power of a model developing tools and the creativity of humal experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the ED environment. Example results illustrate the benefit of this approach.

  6. Risk Identification and Visualization in a Concurrent Engineering Team Environment

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Debarati; Shishko, Robert

    2010-01-01

    Incorporating risk assessment into the dynamic environment of a concurrent engineering team requires rapid response and adaptation. Generating consistent risk lists with inputs from all the relevant subsystems and presenting the results clearly to the stakeholders in a concurrent engineering environment is difficult because of the speed with which decisions are made. In this paper we describe the various approaches and techniques that have been explored for the point designs of JPL's Team X and the Trade Space Studies of the Rapid Mission Architecture Team. The paper will also focus on the issues of the misuse of categorical and ordinal data that keep arising within current engineering risk approaches and also in the applied risk literature.
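    One pitfall the paper alludes to can be shown in a few lines: likelihood and consequence scores on a 5x5 matrix are ordinal, so multiplying them treats arbitrary labels as ratios, and any monotone relabeling of the same scale can reorder the resulting "risk ranking". The risks and scores below are invented for illustration.

    ```python
    # Sketch of the ordinal-data pitfall: multiplying 5x5 likelihood x
    # consequence scores is not invariant under monotone relabeling.
    risks = {"thruster leak": (2, 5), "schedule slip": (4, 3)}

    score_linear = {k: l * c for k, (l, c) in risks.items()}
    relabel = {1: 1, 2: 2, 3: 3, 4: 4, 5: 10}  # same ordering, new labels
    score_relabeled = {k: relabel[l] * relabel[c] for k, (l, c) in risks.items()}

    print(score_linear)     # schedule slip ranks higher (12 vs 10)
    print(score_relabeled)  # thruster leak now ranks higher (20 vs 12)
    ```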

  7. Use of Concurrent Engineering in Space Mission Design

    NASA Technical Reports Server (NTRS)

    Wall, S.

    2000-01-01

    In recent years, conceptual-phase (proposal level) design of space missions has been improved considerably. Team structures, tool linkage, specialized facilities known as design centers and scripted processes have been demonstrated to cut proposal-level engineering design time from a few months to a few weeks.

  8. The Concurrent Engineering Design Paradigm Is Now Fully Functional for Graphics Education

    ERIC Educational Resources Information Center

    Krueger, Thomas J.; Barr, Ronald E.

    2007-01-01

    Engineering design graphics education has come a long way in the past two decades. The emergence of solid geometric modeling technology has become the focal point for the graphical development of engineering design ideas. The main attraction of this 3-D modeling approach is the downstream application of the data base to analysis and…

  9. A general engineering scenario for concurrent engineering environments

    NASA Astrophysics Data System (ADS)

    Mucino, V. H.; Pavelic, V.

    The paper describes an engineering method scenario which categorizes the various activities and tasks into blocks, seen as subjects which consume and produce data and information. These methods, tools, and associated utilities interact with other engineering tools by exchanging information in such a way that a relationship between customers and suppliers of engineering data is established clearly, while data exchange consistency is maintained throughout the design process. The events and data transactions are presented in the form of flowcharts in which data transactions represent the connections between the various blocks, which in turn represent the engineering activities developed for the particular task required in the concurrent engineering environment.

  10. Systems Engineering News | Wind | NREL

    Science.gov Websites

    The Wind Plant Optimization and Systems Engineering newsletter covers topics ranging from multi-disciplinary design analysis and optimization of wind turbine sub-components, to wind plant optimization and uncertainty analysis, to concurrent engineering and financial engineering.

  11. A Design-Based Engineering Graphics Course for First-Year Students.

    ERIC Educational Resources Information Center

    Smith, Shana Shiang-Fong

    2003-01-01

    Describes the first-year Introduction to Design course at Iowa State University which incorporates design for manufacturing and concurrent engineering principles into the curriculum. Autodesk Inventor was used as the primary CAD tool for parametric solid modeling. Test results show that student spatial visualization skills were dramatically…

  12. The Design of a Primary Flight Trainer using Concurrent Engineering Concepts

    NASA Technical Reports Server (NTRS)

    Ladesic, James G.; Eastlake, Charles N.; Kietzmann, Nicholas H.

    1993-01-01

    Concurrent Engineering (CE) concepts seek to coordinate the expertise of various disciplines from initial design configuration selection through product disposal so that cost-efficient design solutions may be achieved. Integrating this methodology into an undergraduate design course sequence may provide a needed enhancement to engineering education. The Advanced Design Program (ADP) project at Embry-Riddle Aeronautical University (ERAU) is focused on developing recommendations for the general aviation Primary Flight Trainer (PFT) of the twenty-first century using methods of CE. This project, over the next two years, will continue synthesizing the collective knowledge of teams composed of engineering students along with students from other degree programs, their faculty, and key industry representatives. During the past year (Phase I), conventional trainer configurations that comply with current regulations and existing technologies have been evaluated. Phase I efforts have resulted in two baseline concepts, a high-wing, conventional design named Triton and a low-wing, mid-engine configuration called Viper. In the second and third years (Phases II and III), applications of advanced propulsion, advanced materials, and unconventional airplane configurations, along with military and commercial technologies which are anticipated to be within the economic range of general aviation by the year 2000, will be considered.

  13. Concurrent engineering design and management knowledge capture

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The topics are presented in viewgraph form and include the following: real-time management, personnel management, project management, conceptual design and decision making; the SITRF design problem; and the electronic-design notebook.

  14. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting system and discipline analyses and integrations, and illustrates the process application in experienced aerostructural designs.

  15. Design for improved maintenance of the fiber-optic cable system (As carried out in a concurrent engineering environment)

    NASA Astrophysics Data System (ADS)

    Tremoulet, P. C.

    The author describes a number of maintenance improvements in the Fiber Optic Cable System (FOCS). They were achieved during a production phase pilot concurrent engineering program. Listed in order of importance (saved maintenance time and material) by maintenance level, they are: (1) organizational level: improved fiber optic converter (FOC) BITE; (2) intermediate level: reduced FOC adjustments from 20 to 2; partitioned FOC into electrical and optical parts; developed cost-effective fault isolation test points and test using standard test equipment; improved FOC chassis to have lower mean time to repair; and (3) depot level: revised test requirements documents (TRDs) for common automatic test equipment and incorporated ATE testability into circuits and assemblies and application-specific integrated circuits. These improvements met this contract's tailored logistics MIL-STD 1388-1A requirements of monitoring the design for supportability and determining the most effective support equipment. Important logistics lessons learned while accomplishing these maintainability and supportability improvements on the pilot concurrent engineering program are also discussed.

  16. Multidisciplinary optimization for engineering systems - Achievements and potential

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    The currently common sequential design process for engineering systems is likely to lead to suboptimal designs. Recently developed decomposition methods offer an alternative for coming closer to optimum by breaking the large task of system optimization into smaller, concurrently executed and, yet, coupled tasks, identified with engineering disciplines or subsystems. The hierarchic and non-hierarchic decompositions are discussed and illustrated by examples. An organization of a design process centered on the non-hierarchic decomposition is proposed.

  17. Multidisciplinary optimization for engineering systems: Achievements and potential

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    The currently common sequential design process for engineering systems is likely to lead to suboptimal designs. Recently developed decomposition methods offer an alternative for coming closer to optimum by breaking the large task of system optimization into smaller, concurrently executed and, yet, coupled tasks, identified with engineering disciplines or subsystems. The hierarchic and non-hierarchic decompositions are discussed and illustrated by examples. An organization of a design process centered on the non-hierarchic decomposition is proposed.

  18. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.

  19. Processing multilevel secure test and evaluation information

    NASA Astrophysics Data System (ADS)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  20. Multi-Attribute Tradespace Exploration in Space System Design

    NASA Astrophysics Data System (ADS)

    Ross, A. M.; Hastings, D. E.

    2002-01-01

    The complexity inherent in space systems necessarily requires intense expenditures of resources both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g. scientists, engineers, managers, etc). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.

  1. Studies on spatial modes and the correlation anisotropy of entangled photons generated from 2D quadratic nonlinear photonic crystals

    NASA Astrophysics Data System (ADS)

    Luo, X. W.; Xu, P.; Sun, C. W.; Jin, H.; Hou, R. J.; Leng, H. Y.; Zhu, S. N.

    2017-06-01

    Concurrent spontaneous parametric down-conversion (SPDC) processes have proved to be an appealing approach for engineering the path-entangled photonic state with designable and tunable spatial modes. In this work, we propose a general scheme to construct high-dimensional path entanglement and demonstrate the basic properties of concurrent SPDC processes from domain-engineered quadratic nonlinear photonic crystals, including the spatial modes and the photon flux, as well as the anisotropy of spatial correlation under noncollinear quasi-phase-matching geometry. The overall understanding about the performance of concurrent SPDC processes will give valuable references to the construction of compact path entanglement and the development of new types of photonic quantum technologies.
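    For context, the concurrent SPDC processes arise from quasi-phase-matching in a 2D poled crystal. The standard energy- and momentum-conservation conditions (textbook quasi-phase-matching, not this paper's specific derivation) read:

    ```latex
    \omega_p = \omega_s + \omega_i, \qquad
    \vec{k}_p - \vec{k}_s - \vec{k}_i = \vec{G}_{m,n} = m\,\vec{b}_1 + n\,\vec{b}_2,
    ```

    where \vec{b}_1 and \vec{b}_2 are the reciprocal primitive vectors of the poling lattice. Several \vec{G}_{m,n} can satisfy the condition simultaneously, and each such concurrent process defines one spatial path mode of the entangled state.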

  2. Integration of Design, Thermal, Structural, and Optical Analysis, Including Thermal Animation

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.

    1993-01-01

    In many industries there has recently been a concerted movement toward 'quality management' and the issue of how to accomplish work more efficiently. Part of this effort is focused on concurrent engineering; the idea of integrating the design and analysis processes so that they are not separate, sequential processes (often involving design rework due to analytical findings) but instead form an integrated system with smooth transfers of information. Presented herein are several specific examples of concurrent engineering methods being carried out at Langley Research Center (LaRC): integration of thermal, structural and optical analyses to predict changes in optical performance based on thermal and structural effects; integration of the CAD design process with thermal and structural analyses; and integration of analysis and presentation by animating the thermal response of a system as an active color map -- a highly effective visual indication of heat flow.

  3. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1992-05-01

    Keywords include methodology, knowledge acquisition, requirements definition, information systems, information engineering, and systems engineering. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be used to evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enabling assets of the enterprise.

  4. A Response Surface Methodology for Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Altus, Troy David; Sobieski, Jaroslaw (Technical Monitor)

    2002-01-01

    The report describes a new method for optimization of engineering systems such as aerospace vehicles whose design must harmonize a number of subsystems and various physical phenomena, each represented by a separate computer code, e.g., aerodynamics, structures, propulsion, performance, etc. To represent the system's internal couplings, the codes receive output from other codes as part of their inputs. The system analysis and optimization task is decomposed into subtasks that can be executed concurrently, each subtask conducted using local state and design variables and holding constant a set of the system-level design variables. The subtask results are stored in the form of Response Surfaces (RS) fitted in the space of the system-level variables, to be used as subtask surrogates in a system-level optimization whose purpose is to optimize the system objective(s) and to reconcile the system's internal couplings. By virtue of decomposition and execution concurrency, the method enables a broad workfront in the organization of an engineering project involving a number of specialty groups that might be geographically dispersed, and it exploits the contemporary computing technology of massively concurrent and distributed processing. The report includes a demonstration test case of a supersonic business jet design.
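    A minimal sketch of the response-surface step, under simplifying assumptions: one subtask (a toy drag model standing in for a discipline code) is sampled over a single system-level variable, a quadratic RS is fitted, and the system-level optimizer searches the cheap fit instead of the discipline code.

    ```python
    # Response-surface sketch: sample a subtask over a system-level variable,
    # fit a quadratic surrogate, optimize the surrogate. Numbers are toys.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def subtask_drag(sweep_deg):  # stand-in for a discipline code
        return 0.02 + 0.0004 * (sweep_deg - 30.0)**2 + 0.0001 * sweep_deg

    sweeps = np.linspace(10.0, 50.0, 9)            # concurrent subtask runs
    drags = np.array([subtask_drag(s) for s in sweeps])
    rs_coeffs = np.polyfit(sweeps, drags, deg=2)   # the response surface

    # The system-level optimizer only ever touches the cheap polynomial.
    result = minimize_scalar(lambda s: np.polyval(rs_coeffs, s),
                             bounds=(10.0, 50.0), method="bounded")
    print("RS optimum sweep:", round(result.x, 2), "deg")
    ```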

  5. Interdisciplinary and multilevel optimum design. [in aerospace structural engineering

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1987-01-01

    Interactions among engineering disciplines and subsystems in engineering system design are surveyed, and specific instances of such interactions are described. Examination of the interactions shows that a traditional design process, in which the numerical values of major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced here, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, enabling him to modify the system design so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.
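    The mechanism described above can be summarized as a first-order expansion: with each subsystem response F and its sensitivities generated concurrently, the system designer can estimate the effect of a proposed change in the system variables without rerunning the subsystem analyses:

    ```latex
    F(\mathbf{x}_0 + \Delta\mathbf{x}) \;\approx\; F(\mathbf{x}_0)
      + \sum_i \left.\frac{\partial F}{\partial x_i}\right|_{\mathbf{x}_0} \Delta x_i .
    ```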

  6. NASA's Planetary Science Summer School: Training Future Mission Leaders in a Concurrent Engineering Environment

    NASA Astrophysics Data System (ADS)

    Mitchell, K. L.; Lowes, L. L.; Budney, C. J.; Sohus, A.

    2014-12-01

    NASA's Planetary Science Summer School (PSSS) is an intensive program for postdocs and advanced graduate students in science and engineering fields with a keen interest in planetary exploration. The goal is to train the next generation of planetary science mission leaders in a hands-on environment involving a wide range of engineers and scientists. It was established in 1989, and has undergone several incarnations. Initially a series of seminars, it became a more formal mission design experience in 1999. Admission is competitive, with participants given financial support. The competitively selected trainees develop an early mission concept study in teams of 15-17, responsive to a typical NASA Science Mission Directorate Announcement of Opportunity. They select the mission concept from options presented by the course sponsors, based on high-priority missions as defined by the Decadal Survey, prepare a presentation for a proposal authorization review, present it to a senior review board, and receive critical feedback. Each participant assumes multiple roles, on science, instrument, and project teams. Trainees develop an understanding of top-level science requirements and instrument priorities in advance through a series of reading assignments and webinars. Then, during the five-day session at the Jet Propulsion Laboratory, they work closely with concurrent engineers, including JPL's Advanced Projects Design Team ("Team X"), a cross-functional multidisciplinary team of engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis, and evaluation of mission concept designs. All are mentored and assisted directly by Team X members and course tutors in their assigned project roles. There is a strong emphasis on making difficult trades, simulating a real mission design process as accurately as possible. The process is intense and at times dramatic, with fast-paced design sessions and late evening study sessions. A survey of PSSS alumni administered in 2013 provides information on the program's impact on trainees' career choices and leadership roles as they pursue employment in planetary science and related fields. Results will be presented during the session, along with highlights of topics and missions covered since the program's inception.

  7. A minimum cost tolerance allocation method for rocket engines and robust rocket engine design

    NASA Technical Reports Server (NTRS)

    Gerth, Richard J.

    1993-01-01

    Rocket engine design follows three phases: systems design, parameter design, and tolerance design. Systems design and parameter design are most effectively conducted in a concurrent engineering (CE) environment that utilizes methods such as Quality Function Deployment and Taguchi methods. However, tolerance allocation remains an art driven by experience, handbooks, and rules of thumb, so it was desirable to develop an optimization approach to tolerancing. The case study engine was the STME gas generator cycle. The design of the major components had been completed, and the functional relationship between the component tolerances and system performance had been computed using the Generic Power Balance model. The system performance nominals (thrust, MR, and Isp) and tolerances were already specified, as were an initial set of component tolerances. However, the question was whether there existed an optimal combination of tolerances that would result in the minimum cost without any degradation in system performance.
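    A minimal sketch of tolerance allocation posed as an optimization, under stated assumptions: a reciprocal cost-versus-tolerance model and a root-sum-square stack constraint, with illustrative numbers rather than STME data.

    ```python
    # Minimum-cost tolerance allocation sketch: loosen tolerances where they
    # are expensive, subject to the RSS system tolerance budget. Toy numbers.
    import numpy as np
    from scipy.optimize import minimize

    a = np.array([4.0, 1.0, 2.5])   # cost sensitivity of each component
    T_SYS = 0.05                    # allowed system tolerance (RSS stack)

    cost = lambda t: float(np.sum(a / t))                       # tighter = costlier
    rss_margin = lambda t: T_SYS - float(np.sqrt(np.sum(t**2)))  # must stay >= 0

    res = minimize(cost, x0=np.full(3, 0.02),
                   bounds=[(1e-4, T_SYS)] * 3,
                   constraints=[{"type": "ineq", "fun": rss_margin}])
    print("optimal tolerances:", res.x.round(4), "min cost:", round(res.fun, 1))
    ```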

  8. Integrating post-manufacturing issues into design and manufacturing decisions

    NASA Technical Reports Server (NTRS)

    Eubanks, Charles F.

    1996-01-01

    An investigation is conducted into some of the fundamental issues underlying design for manufacturing, service, and recycling that affect engineering decisions early in the conceptual design phase of mechanical systems. The investigation focuses on a system-based approach to material selection, manufacturing methods, and assembly processes related to overall product requirements, performance, and life-cycle costs. Particular emphasis is placed on concurrent engineering decision support for post-manufacturing issues such as serviceability, recyclability, and product retirement.

  9. A 20k Payload Launch Vehicle Fast Track Development Concept Using an RD-180 Engine and a Centaur Upper Stage

    NASA Technical Reports Server (NTRS)

    Toelle, Ronald (Compiler)

    1995-01-01

    A launch vehicle concept to deliver 20,000 lb of payload to a 100-nmi orbit has been defined. A new liquid oxygen/kerosene booster powered by an RD-180 engine was designed, while a slightly modified Centaur upper stage was used. The design, development, and test program met the imposed 40-month schedule by eliminating major structural testing through increased factors of safety and by concurrent engineering concepts. A growth path to attain 65,000 lb of payload is developed.

  10. Computer Program Re-layers Engineering Drawings

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  11. Advanced main combustion chamber program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The topics presented are covered in viewgraph form and include the following: investment of low cost castings; usage of SSME program; usage of MSFC personnel for design effort; and usage of concurrent engineering techniques.

  12. Interdisciplinary and multilevel optimum design

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1986-01-01

    Interactions among engineering disciplines and subsystems in engineering system design are surveyed and specific instances of such interactions are described. Examination of these interactions shows that a traditional design process, in which the numerical values of major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced here, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, enabling the designer to modify the system design so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.

  13. Integrated Engineering Information Technology, FY93 accomplishments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  14. Elements of Designing for Cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1992-01-01

    During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.

  15. Elements of designing for cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1992-01-01

    During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity-based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.

  16. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    PubMed

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
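    The prospective, computational route to strain design mentioned above is often implemented as flux balance analysis: a linear program over a stoichiometric model of metabolism. The sketch below runs such an analysis on a made-up three-metabolite network; the matrix, bounds, and objective are illustrative, not a real organism model.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix S (metabolites x reactions) for a
    # hypothetical network: r0 uptake -> M0, r1 M0 -> M1, r2 M1 -> M2,
    # r3 M1 + M2 -> biomass. Not a real organism model.
    S = np.array([[1, -1,  0,  0],
                  [0,  1, -1, -1],
                  [0,  0,  1, -1]])

    # Flux balance analysis: maximize biomass flux v3 subject to the
    # steady-state condition S v = 0 and illustrative flux bounds.
    c = np.array([0.0, 0.0, 0.0, -1.0])   # linprog minimizes, so negate
    res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=[(0, 10)] * 4)
    print("optimal flux distribution:", res.x)
    ```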

  17. Resource Management and Contingencies in Aerospace Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Karpati, Gabe; Hyde, Tupper; Peabody, Hume; Garrison, Matthew

    2012-01-01

    A significant concern in designing complex systems implementing new technologies is that while knowledge about the system is acquired incrementally, substantial financial commitments, even make-or-break decisions, must be made upfront, essentially in the unknown. One practice that helps in dealing with this dichotomy is the smart embedding of contingencies and margins in the design to serve as buffers against surprises. This issue presents itself in full force in the aerospace industry, where unprecedented systems are formulated and committed to as a matter of routine. As more and more aerospace mission concepts are generated by concurrent design laboratories, it is imperative that such laboratories apply well thought-out contingency and margin structures to their designs. The first part of this publication provides an overview of resource management techniques and standards used in the aerospace industry. That is followed by a thought-provoking treatise on margin policies, which presents actual flight telemetry data recorded by the thermal discipline during several recent NASA Goddard Space Flight Center missions. The margins actually achieved in flight are compared against pre-flight predictions, and the appropriateness and the ramifications of having designed with rigid margins to bounding stacked worst-case conditions are assessed. The second half of the paper examines the particular issues associated with the application of contingencies and margins in the concurrent engineering environment. In closure, a discipline-by-discipline disclosure of the contingency and margin policies in use at the Integrated Design Center at NASA's Goddard Space Flight Center is made.

  18. Probabilistic simulation of concurrent engineering of propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of engineering processes. An approach is outlined for computationally formalizing the concurrent engineering process from cradle-to-grave via discipline-dedicated workstations linked with a common database.

  19. Designing Microstructures/Structures for Desired Functional Material and Local Fields

    DTIC Science & Technology

    2015-12-02

    utilized to engineer multifunctional soft materials for multi-sensing, multi-actuating, human-machine interfaces. [3] Establish a theoretical framework... model for surface elasticity, (ii) derived a new type of Maxwell stress in soft materials due to quantum mechanical-elasticity coupling and... elucidated its ramification in engineering multifunctional soft materials, and (iii) demonstrated the possibility of concurrent magnetoelectricity and

  20. Early Formulation Model-centric Engineering on NASA's Europa Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Bayer, Todd; Chung, Seung; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Chris; Gontijo, Ivair; Lewis, Kari; Moshir, Mehrdad; Rasmussen, Robert

    2012-01-01

    The proposed Jupiter Europa Orbiter and Jupiter Ganymede Orbiter missions were formulated using current state-of-the-art MBSE facilities: JPL's TeamX and Rapid Mission Architecting, ESA's Concurrent Design Facility, and APL's ACE Concurrent Engineering Facility. When JEO became an official "pre-project" in September 2010, we had already developed a strong partnership with JPL's Integrated Model Centric Engineering (IMCE) initiative, decided to apply architecting and SysML-based MBSE from the beginning, and begun laying these foundations to support work in Phase A. Release of the Planetary Science Decadal Survey and the FY12 President's Budget in March 2011 changed the landscape, and JEO reverted to being a pre-phase A study. A conscious choice was made to continue application of MBSE on the Europa Study, refocused for early formulation. This presentation describes the approach, results, and lessons.

  1. Concurrent Mission and Systems Design at NASA Glenn Research Center: The Origins of the COMPASS Team

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Oleson, Steven R.; Sarver-Verhey, Timothy R.

    2012-01-01

    Established at the NASA Glenn Research Center (GRC) in 2006 to meet the need for rapid mission analysis and multi-disciplinary systems design for in-space and human missions, the Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team is a multidisciplinary, concurrent engineering group whose primary purpose is to perform integrated systems analysis, but it is also capable of designing any system that involves one or more of the disciplines present in the team. The authors were involved in the development of the COMPASS team and its design process, and are continuously making refinements and enhancements. The team was unofficially started in the early 2000s as part of the distributed team known as Team JIMO (Jupiter Icy Moons Orbiter) in support of the multi-center collaborative JIMO spacecraft design during Project Prometheus. This paper documents the origins of a concurrent mission and systems design team at GRC and how it evolved into the COMPASS team, including defining the process, gathering the team and tools, building the facility, and performing studies.

  2. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods - is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  3. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to developing such methods--is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  4. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to developing such methods--is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  5. Integrated design and manufacturing for the high speed civil transport

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In June 1992, Georgia Tech's School of Aerospace Engineering was awarded a NASA University Space Research Association (USRA) Advanced Design Program (ADP) to address 'Integrated Design and Manufacturing for the High Speed Civil Transport (HSCT)' in its graduate aerospace systems design courses. This report summarizes the results of the five courses incorporated into the Georgia Tech's USRA ADP program. It covers AE8113: Introduction to Concurrent Engineering, AE4360: Introduction to CAE/CAD, AE4353: Design for Life Cycle Cost, AE6351: Aerospace Systems Design One, and AE6352: Aerospace Systems Design Two. AE8113: Introduction to Concurrent Engineering was an introductory course addressing the basic principles of concurrent engineering (CE) or integrated product development (IPD). The design of a total system was not the objective of this course. The goal was to understand and define the 'up-front' customer requirements, their decomposition, and determine the value objectives for a complex product, such as the high speed civil transport (HSCT). A generic CE methodology developed at Georgia Tech was used for this purpose. AE4353: Design for Life Cycle Cost addressed the basic economic issues for an HSCT using a robust design technique, Taguchi's parameter design optimization method (PDOM). An HSCT economic sensitivity assessment was conducted using a Taguchi PDOM approach to address the robustness of the basic HSCT design. AE4360: Introduction to CAE/CAD permitted students to develop and utilize CAE/CAD/CAM knowledge and skills using CATIA and CADAM as the basic geometric tools. AE6351: Aerospace Systems Design One focused on the conceptual design refinement of a baseline HSCT configuration as defined by Boeing, Douglas, and NASA in their system studies. It required the use of NASA's synthesis codes FLOPS and ACSYNT. A criterion called the productivity index (P.I.) was used to evaluate disciplinary sensitivities and provide refinements of the baseline HSCT configuration. AE6352: Aerospace Systems Design Two was a continuation of Aerospace Systems Design One in which wing concepts were researched and analyzed in more detail. FLOPS and ACSYNT were again used at the system level while other off-the-shelf computer codes were used for more detailed wing disciplinary analysis and optimization. The culmination of all efforts and submission of this report conclude the first year's efforts of Georgia Tech's NASA USRA ADP. It will hopefully provide the foundation for next year's efforts concerning continuous improvement of integrated design and manufacturing for the HSCT.

  6. Concurrent Engineering for Composites

    DTIC Science & Technology

    1991-10-01

    1990), 44. Cooper, R.G. and Kleinschmidt, E.J., Journal of Product Innovation Management, 3(2), (1986), 71. Drucker, P.F., Harvard Business Review... Journal of Product Innovation Management, 6(1), (1989), 43. Hollins, B. and Pugh, S., Successful Product Design, Butterworths, London, 1990. Johnson

  7. Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Agte, Jeremy S.; Sandusky, Robert R., Jr.

    1998-01-01

    BLISS is a method for optimization of engineering systems by decomposition. It separates the system level optimization, having a relatively small number of design variables, from the potentially numerous subsystem optimizations that may each have a large number of local design variables. The subsystem optimizations are autonomous and may be conducted concurrently. Subsystem and system optimizations alternate, linked by sensitivity data, producing a design improvement in each iteration. Starting from a best guess initial design, the method improves that design in iterative cycles, each cycle comprised of two steps. In step one, the system level variables are frozen and the improvement is achieved by separate, concurrent, and autonomous optimizations in the local variable subdomains. In step two, further improvement is sought in the space of the system level variables. Optimum sensitivity data link the second step to the first. The method prototype was implemented using MATLAB and iSIGHT programming software and tested on a simplified, conceptual level supersonic business jet design, and a detailed design of an electronic device. Satisfactory convergence and favorable agreement with the benchmark results were observed. Modularity of the method is intended to fit the human organization and map well on the computing technology of concurrent processing.
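    The two-step BLISS cycle described above can be sketched on a toy problem as a simplified block-coordinate rendition: with the system variable frozen, each subsystem optimizes its own local variable autonomously (and could do so concurrently), after which a system-level step improves the shared variable (in the full method, optimum sensitivity data link the system step to the subsystem optima). The functions and coupling below are hypothetical stand-ins, not the business-jet or electronic-device benchmarks.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy system: one system-level variable z, two subsystems with local
    # variables x1 and x2. Overall objective f = f1(z, x1) + f2(z, x2).
    # All functions are illustrative, not the BLISS benchmark problems.
    def f1(z, x1): return (x1 - z) ** 2 + z ** 2
    def f2(z, x2): return (x2 + z) ** 2 + 0.5 * z ** 2

    z = 2.0                                  # best-guess initial system design
    for cycle in range(10):
        # Step 1: freeze z; autonomous subsystem optimizations that could
        # run concurrently on separate processors or by separate teams.
        x1 = minimize(lambda x: f1(z, x[0]), [0.0]).x[0]
        x2 = minimize(lambda x: f2(z, x[0]), [0.0]).x[0]
        # Step 2: seek further improvement in the system-variable space,
        # holding the subsystem local optima fixed.
        z = minimize(lambda v: f1(v[0], x1) + f2(v[0], x2), [z]).x[0]

    print("converged design:", z, x1, x2)
    ```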

  8. Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.

    1997-01-01

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques, including intermediate responses, linking variables, and compatibility constraints, are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigations and modeling of partitioned subproblems. A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared, and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. The method and the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system. The case study was developed in collaboration with Allison Engine Company, Rolls-Royce Aerospace, and is based on the existing Allison AE3007 engine designed for midsize commercial and regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low-pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified.
The method and solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
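    The response-surface step in this approach can be illustrated with a small sketch: run a designed experiment on a subsystem model, then fit a quadratic surface by least squares for use in system-level synthesis. The subsystem function and factors below are hypothetical stand-ins, not the AE3007 models.

    ```python
    import numpy as np

    # Hypothetical subsystem analysis: a response as a function of two
    # factors (e.g., pressure ratio and inlet temperature, coded to [-1, 1]).
    def subsystem(x1, x2):
        return 3 + 2 * x1 - x2 + 0.5 * x1 * x2 + x1 ** 2

    # Face-centered composite-style design points in coded units.
    pts = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                    [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]])
    y = np.array([subsystem(*p) for p in pts])

    # Quadratic model terms: 1, x1, x2, x1*x2, x1^2, x2^2.
    X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                         pts[:, 0] * pts[:, 1], pts[:, 0] ** 2, pts[:, 1] ** 2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("fitted response-surface coefficients:", beta)
    ```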

  9. Fuzzy simulation in concurrent engineering

    NASA Technical Reports Server (NTRS)

    Kraslawski, A.; Nystrom, L.

    1992-01-01

    Concurrent engineering is becoming a very important practice in manufacturing. A problem in concurrent engineering is the uncertainty associated with the values of the input variables and operating conditions. The problem discussed in this paper concerns the simulation of processes in which the raw materials and the operational parameters possess fuzzy characteristics. The processing of fuzzy input information is performed by the vertex method and the commercial simulation packages POLYMATH and GEMS. Examples are presented to illustrate the usefulness of the method in the simulation of chemical engineering processes.
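    The vertex method referenced above propagates fuzzy inputs by evaluating the model at every corner of the interval box obtained at each alpha-cut and taking the extremes as the output interval (exact for monotonic models). The fuzzy numbers and process function below are illustrative, not drawn from POLYMATH or GEMS.

    ```python
    import itertools
    import numpy as np

    # Triangular fuzzy numbers (low, peak, high) for two illustrative
    # inputs, e.g. feed concentration and temperature; values are made up.
    inputs = [(0.8, 1.0, 1.2), (340.0, 350.0, 360.0)]

    def alpha_cut(tri, a):
        lo, peak, hi = tri
        return (lo + a * (peak - lo), hi - a * (hi - peak))

    def process(c, T):                 # illustrative process model
        return c * np.exp(-2000.0 / T)

    for a in (0.0, 0.5, 1.0):
        intervals = [alpha_cut(t, a) for t in inputs]
        # Vertex method: evaluate the model at every corner of the input
        # box; the min/max give the output interval at this alpha level.
        vals = [process(*v) for v in itertools.product(*intervals)]
        print(f"alpha={a}: output in [{min(vals):.4g}, {max(vals):.4g}]")
    ```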

  10. Domain-specific languages and diagram customization for a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Cole, B.; Dubos, G.; Banazadeh, P.; Reh, J.; Case, K.; Wang, Y.; Jones, S.; Picha, F.

    A major open question for advocates of Model-Based Systems Engineering (MBSE) is the question of how system and subsystem engineers will work together. The Systems Modeling Language (SysML), like any language intended for a large audience, is in tension between the desires for simplicity and for expressiveness. In order to be more expressive, many specialized language elements may be introduced, which will unfortunately make a complete understanding of the language a more daunting task. While this may be acceptable for systems modelers, it will increase the challenge of including subsystem engineers in the modeling effort. One possible answer to this situation is the use of Domain-Specific Languages (DSL), which are fully supported by the Unified Modeling Language (UML); SysML is in fact a DSL for systems engineering. The expressive power of a DSL can be enhanced through the use of diagram customization. Various domains have already developed their own schematic vocabularies; within the space engineering community, two excellent examples are the propulsion and telecommunication subsystems. A return to simple box-and-line diagrams (e.g., the SysML Internal Block Diagram) is in many ways a step backward. In order to allow subsystem engineers to contribute directly to the model, it is necessary to make a system modeling tool at least approach the accessibility of drawing tools like Microsoft PowerPoint and Visio. The challenge is made more extreme in a concurrent engineering environment, where designs must often be drafted in an hour or two. In the case of the Jet Propulsion Laboratory's Team X concurrent design team, a subsystem is specified using a combination of PowerPoint for drawing and Excel for calculation. A pilot has been undertaken to meld the drawing portion and the production of master equipment lists (MELs) via a SysML authoring tool, MagicDraw. Team X currently interacts with its customers in a process of sharing presentations. There are several inefficiencies that arise from this situation. The first is that a customer team must wait two weeks to a month (which is 2-4 times the duration of most Team X studies themselves) for a finalized, detailed design description. Another is that this information must be re-entered by hand into the set of engineering artifacts and design tools that the mission concept team uses after a study is complete. Further, there is no persistent connection to Team X or institutionally shared formulation design tools and data after a given study, again reducing the direct reuse of designs created in a Team X study. This paper presents the underpinnings of subsystem DSLs as they were developed for this pilot, including specialized semantics for different domains as well as the process by which major categories of objects were derived in support of defining the DSLs. The feedback given to us by the domain experts on usability, along with a pilot study with the partial inclusion of these tools, is also discussed.

  11. Domain-Specific Languages and Diagram Customization for a Concurrent Engineering Environment

    NASA Technical Reports Server (NTRS)

    Cole, Bjorn; Dubos, Greg; Banazadeh, Payam; Reh, Jonathan; Case, Kelley; Wang, Yeou-Fang; Jones, Susan; Picha, Frank

    2013-01-01

    A major open question for advocates of Model-Based Systems Engineering (MBSE) is the question of how system and subsystem engineers will work together. The Systems Modeling Language (SysML), like any language intended for a large audience, is in tension between the desires for simplicity and for expressiveness. In order to be more expressive, many specialized language elements may be introduced, which will unfortunately make a complete understanding of the language a more daunting task. While this may be acceptable for systems modelers, it will increase the challenge of including subsystem engineers in the modeling effort. One possible answer to this situation is the use of Domain-Specific Languages (DSL), which are fully supported by the Unified Modeling Language (UML); SysML is in fact a DSL for systems engineering. The expressive power of a DSL can be enhanced through the use of diagram customization. Various domains have already developed their own schematic vocabularies; within the space engineering community, two excellent examples are the propulsion and telecommunication subsystems. A return to simple box-and-line diagrams (e.g., the SysML Internal Block Diagram) is in many ways a step backward. In order to allow subsystem engineers to contribute directly to the model, it is necessary to make a system modeling tool at least approach the accessibility of drawing tools like Microsoft PowerPoint and Visio. The challenge is made more extreme in a concurrent engineering environment, where designs must often be drafted in an hour or two. In the case of the Jet Propulsion Laboratory's Team X concurrent design team, a subsystem is specified using a combination of PowerPoint for drawing and Excel for calculation. A pilot has been undertaken to meld the drawing portion and the production of master equipment lists (MELs) via a SysML authoring tool, MagicDraw. Team X currently interacts with its customers in a process of sharing presentations. There are several inefficiencies that arise from this situation. The first is that a customer team must wait two weeks to a month (which is 2-4 times the duration of most Team X studies themselves) for a finalized, detailed design description. Another is that this information must be re-entered by hand into the set of engineering artifacts and design tools that the mission concept team uses after a study is complete. Further, there is no persistent connection to Team X or institutionally shared formulation design tools and data after a given study, again reducing the direct reuse of designs created in a Team X study. This paper presents the underpinnings of subsystem DSLs as they were developed for this pilot, including specialized semantics for different domains as well as the process by which major categories of objects were derived in support of defining the DSLs. The feedback given to us by the domain experts on usability, along with a pilot study with the partial inclusion of these tools, is also discussed.

  12. Application of Concurrent Engineering Methods to the Design of an Autonomous Aerial Robot

    DTIC Science & Technology

    1991-12-01

    power within the system, either airborne or at a ground station, was left to the team's discretion. Data link from the aerial vehicle to the ground... [Figure 15: design freedom versus knowledge about the design over the conceptual, preliminary, and detailed design phases.] ...mission planning and control tasks was accomplished. Key system issues regarding power up and component initialization procedures began to be addressed

  13. Advanced engineering environment pilot project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwegel, Jill; Pomplun, Alan R.; Abernathy, Rusty

    2006-10-01

    The Advanced Engineering Environment (AEE) is a concurrent engineering concept that enables real-time process tooling design and analysis, collaborative process flow development, automated document creation, and full process traceability throughout a product's life cycle. The AEE will enable NNSA's Design and Production Agencies to collaborate through a singular integrated process. Sandia National Laboratories and Parametric Technology Corporation (PTC) are working together on a prototype AEE pilot project to evaluate PTC's product collaboration tools relative to the needs of the NWC. The primary deliverable for the project is a set of validated criteria for defining a complete commercial off-the-shelf (COTS) solution to deploy the AEE across the NWC.

  14. An Example of Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney; Whitten, David; Cloyd, Richard; Coppens, Chris; Rodriguez, Pedro

    1998-01-01

    The Collaborative Engineering Design and Analysis Room (CEDAR) facility allows on-the-spot design review capability for any project during all phases of development. The required disciplines assemble in this facility to work on any problems (analysis, manufacturing, inspection, etc.) associated with a particular design. A small, highly focused team of specialists can meet in this room to better expedite the process of developing a solution to an engineering task within the framework of the constraints that are unique to each discipline. This facility provides the engineering tools and translators to develop a concept within the confines of the room or with remote team members that could access the team's data from other locations. The CEDAR area is envisioned as excellent for failure investigation meetings to be conducted where the computer capabilities can be utilized in conjunction with the Smart Board display to develop failure trees, brainstorm failure modes, and evaluate possible solutions.

  15. Launch vehicle systems design analysis

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Verderaime, V.

    1993-01-01

    Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, the quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least cost can be realized through competent concurrent engineering teams and the brilliance of their technical leadership.

  16. Computational Infrastructure for Engine Structural Performance Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Select computer codes developed over the years to simulate specific aspects of engine structures are described. These codes include blade impact integrated multidisciplinary analysis and optimization, progressive structural fracture, quantification of uncertainties for structural reliability and risk, benefits estimation of new technology insertion, and hierarchical simulation of engine structures made from metal matrix and ceramic matrix composites. Collectively, these codes constitute an infrastructure ready to credibly evaluate new and future engine structural concepts throughout the development cycle, from initial concept, to design and fabrication, to service performance, maintenance, and repairs, and to retirement for cause and even possible recycling. Stated differently, they provide 'virtual' concurrent engineering for the total life-cycle cost of engine structures.

  17. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable, and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF - the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib) - and how their usage improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of the application of the KM tools have shown the potential of merging the three software platforms and their functionalities as the next step toward the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. The settlement of this practice will result in a much more extensive knowledge and experience exchange within the Concurrent Engineering environment and, consequently, the outcome of the studies will comprise higher quality in the design of space systems.

  18. Systems Engineering Model for ART Energy Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendez Cruz, Carmen Margarita; Rochau, Gary E.; Wilson, Mollye C.

    The near-term objective of the EC team is to establish an operating, commercially scalable Recompression Closed Brayton Cycle (RCBC) to be constructed for the NE - STEP demonstration system (demo) with the lowest risk possible. A systems engineering approach is recommended to ensure adequate requirements gathering, documentation, and modeling that supports technology development relevant to advanced reactors while supporting crosscut interests in potential applications. A holistic systems engineering model was designed for the ART Energy Conversion program by leveraging Concurrent Engineering, Balance Model, Simplified V Model, and Project Management principles. The resulting model supports the identification and validation of lifecycle Brayton systems requirements, and allows designers to detail system-specific components relevant to the current stage in the lifecycle, while maintaining a holistic view of all system elements.

  19. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  20. Product development: the making of the Abbott ARCHITECT.

    PubMed

    Kisner, H J

    1997-01-01

    Many laboratorians have a limited perspective on what is involved in developing an instrument and bringing it to market. This article traces the product development process used by Abbott Diagnostics Division that resulted in Abbott being named the 1996 Concurrent Engineering Company of the Year for the design of the ARCHITECT.

  1. X-33 Attitude Control Using the XRS-2200 Linear Aerospike Engine

    NASA Technical Reports Server (NTRS)

    Hall, Charles E.; Panossian, Hagop V.

    1999-01-01

    The Vehicle Control Systems Team at Marshall Space Flight Center, Structures and Dynamics Laboratory, Guidance and Control Systems Division is designing, under a cooperative agreement with Lockheed Martin Skunkworks, the Ascent, Transition, and Entry flight attitude control systems for the X-33 experimental vehicle. Test flights, while suborbital, will achieve sufficient altitudes and Mach numbers to test Single Stage To Orbit, Reusable Launch Vehicle technologies. The Ascent flight control phase, the focus of this paper, begins at liftoff and ends at linear aerospike main engine cutoff (MECO). The X-33 attitude control system design is confronted by a myriad of design challenges: a short design cycle, the X-33 incremental test philosophy, the concurrent design philosophy chosen for the X-33 program, and the fact that the attitude control system design is, as usual, closely linked to many other subsystems and must deal with constraints and requirements from these subsystems. Additionally, and of special interest, the use of the linear aerospike engine is a departure from the gimbaled engines traditionally used for thrust vector control (TVC) in launch vehicles and poses certain design challenges. This paper discusses the unique problem of designing the X-33 attitude control system with the linear aerospike engine, requirements development, and the modeling and analyses that verify the design.

  2. Concurrent design of an RTP chamber and advanced control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spence, P.; Schaper, C.; Kermani, A.

    1995-12-31

    A concurrent-engineering approach is applied to the development of an axisymmetric rapid-thermal-processing (RTP) reactor and its associated temperature controller. Using a detailed finite-element thermal model as a surrogate for actual hardware, the authors have developed and tested a multi-input multi-output (MIMO) controller. Closed-loop simulations are performed by linking the control algorithm with the finite-element code. Simulations show that good temperature uniformity is maintained on the wafer during both steady and transient conditions. A numerical study shows the effect of ramp rate, feedback gain, sensor placement, and wafer-emissivity patterns on system performance.
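    The closed-loop simulation idea above, a controller linked to a plant model standing in for hardware, can be sketched with a drastically simplified two-zone lumped thermal model in place of the finite-element code; the matrices, gain, and setpoints below are illustrative, not from the cited study.

    ```python
    import numpy as np

    # Two-zone lumped thermal model as a stand-in for the finite-element
    # plant: x' = A x + B u. All values are illustrative only.
    A = np.array([[-0.10, 0.02], [0.02, -0.08]])
    B = np.array([[0.05, 0.00], [0.00, 0.04]])
    dt = 0.1
    x = np.zeros(2)                       # zone temperatures above ambient
    setpoint = np.array([600.0, 600.0])

    Kp = 40.0 * np.eye(2)                 # simple MIMO proportional gain
    for step in range(5000):
        # Lamp powers from feedback, saturated at actuator limits.
        u = np.clip(Kp @ (setpoint - x), 0.0, 1000.0)
        x = x + dt * (A @ x + B @ u)      # explicit Euler integration
    print("final zone temperatures:", x)
    ```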

  3. Additive manufacturing: Toward holistic design

    DOE PAGES

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...

    2017-03-18

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  4. Infusion of a Gaming Paradigm into Computer-Aided Engineering Design Tools

    DTIC Science & Technology

    2012-05-03

    Virtual Test Bed (VTB), and the gaming tool, Unity3D. This hybrid gaming environment coupled a three-dimensional (3D) multibody vehicle system model... from Google Earth to the 3D visual front-end fabricated around Unity3D. The hybrid environment was sufficiently developed to support analyses of the... The VTB simulation of the vehicle dynamics ran concurrently with and interacted with the gaming engine, Unity3D, which

  5. Quality Function Deployment for Large Systems

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
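    The core QFD arithmetic that such a linked process relies on can be shown in a few lines: customer importance weights multiply the customer-to-characteristic relationship matrix to rank engineering characteristics. The desires, characteristics, weights, and 9/3/1 strengths below are illustrative, not from the paper.

    ```python
    import numpy as np

    # Rows: customer desires; columns: engineering characteristics.
    # Relationship strengths use the conventional 9/3/1 scale; the
    # specific desires, characteristics, and weights are made up.
    relationship = np.array([[9, 3, 0],    # "low cost"
                             [3, 9, 1],    # "high reliability"
                             [0, 3, 9]])   # "easy maintenance"
    importance = np.array([5, 4, 3])       # customer-assigned weights

    # Technical priority of each characteristic = weighted column sum.
    priority = importance @ relationship
    for name, p in zip(["unit cost", "MTBF", "access time"], priority):
        print(f"{name}: {p}")
    ```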

  6. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

    This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry thought the study was important and provided the university access to innovative tools under cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allow them to monitor and track performance while it is in operation.
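    One simple consequence measure that reliability and maintainability tools of this kind report is steady-state availability built up from per-unit MTBF and MTTR figures. The sketch below computes it for a hypothetical series configuration; the unit names and values are made up.

    ```python
    # Hypothetical line-replaceable units with (MTBF, MTTR) in hours.
    units = {"power supply": (5000.0, 2.0),
             "processor":    (8000.0, 4.0),
             "transmitter":  (3000.0, 6.0)}

    availability = 1.0
    for name, (mtbf, mttr) in units.items():
        a = mtbf / (mtbf + mttr)           # steady-state unit availability
        availability *= a                  # series system: all must be up
        print(f"{name}: A = {a:.5f}")

    print(f"system availability: {availability:.5f}")
    ```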

  7. Bi-Level Integrated System Synthesis (BLISS) for Concurrent and Distributed Processing

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Altus, Troy D.; Phillips, Matthew; Sandusky, Robert

    2002-01-01

    The paper introduces a new version of the Bi-Level Integrated System Synthesis (BLISS) methods intended for optimization of engineering systems conducted by distributed specialty groups working concurrently and using a multiprocessor computing environment. The method decomposes the overall optimization task into subtasks associated with disciplines or subsystems where the local design variables are numerous and a single, system-level optimization whose design variables are relatively few. The subtasks are fully autonomous as to their inner operations and decision making. Their purpose is to eliminate the local design variables and generate a wide spectrum of feasible designs whose behavior is represented by Response Surfaces to be accessed by a system-level optimization. It is shown that, if the problem is convex, the solution of the decomposed problem is the same as that obtained without decomposition. A simplified example of an aircraft design shows the method working as intended. The paper includes a discussion of the method merits and demerits and recommendations for further research.

  8. Flight Testing Surfaces Engineered for Mitigating Insect Adhesion on a Falcon HU-25C

    NASA Technical Reports Server (NTRS)

    Shanahan, Michelle; Wohl, Chris J.; Smith, Joseph G., Jr.; Connell, John W.; Siochi, Emilie J.; Doss, Jereme R.; Penner, Ronald K.

    2015-01-01

    Insect residue contamination on aircraft wings can decrease fuel efficiency in aircraft designed for natural laminar flow. Insect residues can cause a premature transition to turbulent flow, increasing fuel burn and making the aircraft less environmentally friendly. Surfaces, designed to minimize insect residue adhesion, were evaluated through flight testing on a Falcon HU-25C aircraft flown along the coast of Virginia and North Carolina. The surfaces were affixed to the wing leading edge and the aircraft remained at altitudes lower than 1000 feet throughout the flight to assure high insect density. The number of strikes on the engineered surfaces was compared to, and found to be lower than, untreated aluminum control surfaces flown concurrently. Optical profilometry was used to determine insect residue height and areal coverage. Differences in results between flight and laboratory tests suggest the importance of testing in realistic use environments to evaluate the effectiveness of engineered surface designs.

  9. A new nano-engineered hierarchical membrane for concurrent removal of surfactant and oil from oil-in-water nanoemulsion

    PubMed Central

    Qin, Detao; Liu, Zhaoyang; Bai, Hongwei; Sun, Darren Delai; Song, Xiaoxiao

    2016-01-01

    Surfactant stabilized oil-in-water nanoemulsions pose a severe threat to both the environment and human health. Recent development of membrane filtration technology has enabled efficient oil removal from oil/water nanoemulsion; however, the concurrent removal of surfactant and oil remains unsolved because the existing filtration membranes still suffer from a low surfactant removal rate and serious surfactant-induced fouling. In this study, to realize the concurrent removal of surfactant and oil from nanoemulsion, a novel hierarchically-structured membrane is designed with a nanostructured selective layer on top of a microstructured support layer. The physical and chemical properties of the overall membrane, including wettability, surface roughness, electric charge, thickness and structures, are delicately tailored through a nano-engineered fabrication process, that is, graphene oxide (GO) nanosheet assisted phase inversion coupled with surface functionalization. Compared with the membrane fabricated by conventional phase inversion, this novel membrane has four times higher water flux, significantly higher rejections of both oil (~99.9%) and surfactant (as high as 93.5%), and a two-thirds lower fouling ratio when treating surfactant stabilized oil-in-water nanoemulsion. Due to its excellent performance and facile fabrication process, this nano-engineered membrane is expected to have wide practical applications in the oil/water separation fields of environmental protection and water purification. PMID:27087362

  10. Additive manufacturing: Toward holistic design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The U.S. Department of Energy's (DOE) Co-Optimization of Fuels & Engines (Co-Optima) initiative is accelerating the introduction of affordable, scalable, and sustainable fuels and high-efficiency, low-emission engines with a first-of-its-kind effort to simultaneously tackle fuel and engine research and development (R&D). This report summarizes accomplishments in the first year of the project. Co-Optima is conducting concurrent research to identify the fuel properties and engine design characteristics needed to maximize vehicle performance and affordability, while deeply cutting emissions. Nine national laboratories - the National Renewable Energy Laboratory and Argonne, Idaho, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest, and Sandia National Laboratories - are collaborating with industry and academia on this groundbreaking research.

  12. Ames Engineering Directorate

    NASA Technical Reports Server (NTRS)

    Phillips, Veronica J.

    2017-01-01

    The Ames Engineering Directorate is the principal engineering organization supporting aerospace systems and spaceflight projects at NASA's Ames Research Center in California's Silicon Valley. The Directorate supports all phases of engineering and project management for flight and mission projects, from R&D to close-out, by leveraging the capabilities of multiple divisions and facilities. The Mission Design Center (MDC) has full end-to-end mission design capability with sophisticated analysis and simulation tools in a collaborative concurrent design environment. Services include concept maturity level (CML) maturation, spacecraft design and trades, scientific instrument selection, feasibility assessments, and proposal support and partnerships. The Engineering Systems Division provides robust project management support as well as systems engineering, mechanical and electrical analysis and design, technical authority, and project integration support to a variety of programs and projects across NASA centers. The Applied Manufacturing Division turns abstract ideas into tangible hardware for aeronautics, spaceflight, and science applications, specializing in fabrication methods and management of complex fabrication projects. The Engineering Evaluation Lab (EEL) provides full satellite or payload environmental testing services including vibration, temperature, humidity, immersion, pressure/altitude, vacuum, high-G centrifuge, and shock impact testing, as well as the Flight Processing Center (FPC), which includes cleanrooms, bonded stores, and flight preparation resources. The Multi-Mission Operations Center (MMOC) is composed of the facilities, networks, IT equipment, software, and support services needed by flight projects to effectively and efficiently perform all mission functions, including planning, scheduling, command, telemetry processing, and science analysis.

  13. Update on Integrated Optical Design Analyzer

    NASA Technical Reports Server (NTRS)

    Moore, James D., Jr.; Troy, Ed

    2003-01-01

    Updated information on the Integrated Optical Design Analyzer (IODA) computer program has become available. IODA was described in "Software for Multidisciplinary Concurrent Optical Design" (MFS-31452), NASA Tech Briefs, Vol. 25, No. 10 (October 2001), page 8a. To recapitulate: IODA facilitates multidisciplinary concurrent engineering of highly precise optical instruments. The architecture of IODA was developed by reviewing design processes and software in an effort to automate design procedures. IODA significantly reduces design iteration cycle time and eliminates many potential sources of error. IODA integrates the modeling efforts of a team of experts in different disciplines (e.g., optics, structural analysis, and heat transfer) working at different locations and provides seamless fusion of data among thermal, structural, and optical models used to design an instrument. IODA is compatible with data files generated by the NASTRAN structural-analysis program and the Code V (Registered Trademark) optical-analysis program, and can be used to couple analyses performed by these two programs. IODA supports multiple-load-case analysis for quickly accomplishing trade studies. IODA can also model the transient response of an instrument under the influence of dynamic loads and disturbances.

  14. Space Transportation Engine Program (STEP), phase B

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Space Transportation Engine Program (STEP) Phase 2 effort includes preliminary design and activities plan preparation that will allow a smooth and timely transition into a Prototype Phase and then into Phases 3, 4, and 5. A Concurrent Engineering approach using Total Quality Management (TQM) techniques is being applied to define an oxygen-hydrogen engine. The baseline from Phase 1/1' studies was used as a point of departure for trade studies and analyses. Existing STME system models are being enhanced as more detailed module/component characteristics are determined. Preliminary designs for the open expander, closed expander, and gas generator cycles were prepared, and recommendations for cycle selection were made at the Design Concept Review (DCR). As a result of the July '90 DCR, and information subsequently supplied to the Technical Review Team, a gas generator cycle was selected. Results of the various Advanced Development Programs (ADPs) for the Advanced Launch Systems (ALS) contributed to this effort. An active vehicle integration effort is supplying the NASA, Air Force, and vehicle contractors with engine parameters and data, and flowing down appropriate vehicle requirements. Engine design and analysis trade studies are being documented in a database that was developed and is being used to organize information. To date, seventy-four trade studies have been input to the database.

  15. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol are suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development, as this is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
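    The publish/subscribe pattern recommended above for framework-to-framework communication can be sketched minimally as a topic-based broker; this is an illustration of the pattern only, not the Real Time Publish/Subscribe protocol or the ARINC 653 API, and the topic and payload are made up.

    ```python
    from collections import defaultdict
    from typing import Any, Callable

    class Bus:
        """Toy topic-based publish/subscribe broker (illustrative only;
        real RTPS adds discovery, QoS, and wire-protocol details)."""
        def __init__(self):
            self._subs = defaultdict(list)   # topic -> list of handlers

        def subscribe(self, topic: str, handler: Callable[[Any], None]):
            self._subs[topic].append(handler)

        def publish(self, topic: str, data: Any):
            for handler in self._subs[topic]:
                handler(data)

    bus = Bus()
    # A guidance partition publishes state; a telemetry partition consumes
    # it, without either knowing the other exists.
    bus.subscribe("vehicle.state", lambda d: print("telemetry got:", d))
    bus.publish("vehicle.state", {"alt_km": 102.5, "vel_mps": 7600.0})
    ```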

  16. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    ... vital processes of a business. (Keywords: process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems.) Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve the integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems ...

  17. Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report

    DTIC Science & Technology

    1995-06-01

    Technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve the integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that ...

  18. The development of internet based ship design support system for small and medium sized shipyards

    NASA Astrophysics Data System (ADS)

    Shin, Sung-Chul; Lee, Soon-Sup; Kang, Dong-Hoon; Lee, Kyung-Ho

    2012-03-01

    In this paper, a prototype ship basic planning system for small and medium sized shipyards is implemented, based on internet technology and the concurrent engineering concept. The system is designed from user requirements; accordingly, a standardized development environment and tools are selected and used for system development to define and evaluate core application technologies. The system will contribute to the competitiveness of small and medium sized shipyards in the 21st century industrial environment.

  19. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
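
    A toy rendering of the state- and goal-based idea follows, assuming invented Goal and StateVariable classes: operators express intent as constraints on state variables, and the control loop issues actions until the goal is satisfied.

      # Sketch of goal-directed control: intent is a constraint on a state
      # variable, and the controller acts until the constraint holds. The
      # classes, gain, and scenario are illustrative, not the paper's design.
      class StateVariable:
          def __init__(self, name, value):
              self.name, self.value = name, value

      class Goal:
          # Operational intent: keep a state variable inside a range.
          def __init__(self, var, low, high):
              self.var, self.low, self.high = var, low, high

          def satisfied(self):
              return self.low <= self.var.value <= self.high

      def control_step(goal, gain=0.5):
          # Drive the state toward the middle of the goal range.
          target = 0.5 * (goal.low + goal.high)
          goal.var.value += gain * (target - goal.var.value)

      temperature = StateVariable("antenna_temp_C", 80.0)
      goal = Goal(temperature, 20.0, 40.0)
      while not goal.satisfied():
          control_step(goal)
      print(f"{temperature.name} settled at {temperature.value:.1f} C")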

  20. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.
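
    The adaptive idea can be sketched in a few lines of Python: a child window re-solves a locally modified scenario and grows until its boundary values again agree with the parent run. The 1-D field and the decaying disturbance below are toy stand-ins for the actual ADCIRC++ hydrodynamics.

      # Conceptual sketch of adaptive subdomain modeling: grow the child domain
      # until the altered solution is contained (matches the parent at the
      # boundaries). The model and tolerance are illustrative.
      import numpy as np

      def parent_solution(n=200):
          # Full-scale run; returns a water-surface field on a 1-D grid.
          x = np.linspace(0.0, 1.0, n)
          return np.sin(2 * np.pi * x)

      def child_solution(parent, lo, hi, amp=0.2, sigma=15.0):
          # Re-solve the window [lo, hi) with a local modification at its
          # center; the disturbance decays with distance from the change.
          idx = np.arange(lo, hi)
          center = (lo + hi) // 2
          return parent[lo:hi] + amp * np.exp(-((idx - center) ** 2) / (2 * sigma**2))

      parent = parent_solution()
      lo, hi, tol = 90, 110, 1e-3
      while True:
          child = child_solution(parent, lo, hi)
          # Contained when the child agrees with the parent on its boundaries.
          if abs(child[0] - parent[lo]) < tol and abs(child[-1] - parent[hi - 1]) < tol:
              break
          lo, hi = max(lo - 10, 0), min(hi + 10, len(parent))  # grow the window
      print(f"adapted child extent: [{lo}, {hi})")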

  1. Design and Implementation of a Threaded Search Engine for Tour Recommendation Systems

    NASA Astrophysics Data System (ADS)

    Lee, Junghoon; Park, Gyung-Leen; Ko, Jin-Hee; Shin, In-Hye; Kang, Mikyung

    This paper implements a threaded search engine for the O(n!) search space and measures its performance, aiming to provide a responsive tour recommendation and scheduling service. As a preliminary step toward integrating POI ontology, a mobile object database, and personalization profiles for the development of new vehicular telematics services, this implementation provides a useful guideline for designing a challenging, computation-intensive vehicular telematics service. The implemented engine allocates subtrees to the respective threads and runs them concurrently, exploiting the primitives provided by the operating system and the underlying multiprocessor architecture. It also makes it easy to add a variety of constraints; for example, the search tree is pruned if the cost of a partial allocation already exceeds the current best. The performance measurement results show that the service can run even on a low-power telematics device when the number of destinations does not exceed 15, given appropriate constraint processing.
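
    The engine's core, subtree-per-thread enumeration with cost-bound pruning, can be sketched in Python as below. The random cost matrix and depot convention are illustrative, and note that CPython's GIL serializes CPU-bound threads, whereas the paper's engine ran native threads on a multiprocessor.

      # One thread per first destination: each owns the permutation subtree
      # rooted there, and any branch whose partial cost already exceeds the
      # best complete tour is pruned.
      import random
      import threading

      def dfs(order, remaining, cost, dist, best):
          if cost >= best["cost"]:          # bound: prune this whole subtree
              return
          if not remaining:                 # complete tour: try to record it
              with best["lock"]:
                  if cost < best["cost"]:
                      best["cost"], best["order"] = cost, tuple(order)
              return
          here = order[-1]
          for nxt in sorted(remaining):
              remaining.remove(nxt)
              order.append(nxt)
              dfs(order, remaining, cost + dist[here][nxt], dist, best)
              order.pop()
              remaining.add(nxt)

      random.seed(1)
      n = 8                                 # destinations 1..n; node 0 is the depot
      dist = [[random.uniform(1.0, 10.0) for _ in range(n + 1)] for _ in range(n + 1)]
      best = {"cost": float("inf"), "order": None, "lock": threading.Lock()}
      threads = [threading.Thread(
                     target=dfs,
                     args=([0, d], set(range(1, n + 1)) - {d}, dist[0][d], dist, best))
                 for d in range(1, n + 1)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()
      print(f"best cost {best['cost']:.2f} via {best['order']}")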

  2. DARPA Initiative in Concurrent Engineering (DICE). Phase 2

    DTIC Science & Technology

    1990-07-31

    XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); ... shared global object space and object persistence. Technical results, module development, XS integration environment: a prototype of the wrapper concepts ... for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, (initially) targeted for ...

  3. Initiative in Concurrent Engineering (DICE). Phase 1.

    DTIC Science & Technology

    1990-02-09

    ... and power of commercial and military electronics systems. The continual evolution of HDE technology offers far greater flexibility in circuit design ... powerful magnetic field of the permanent magnets in the Sawyer motors. This makes it possible to have multiple robots in the workcell and to have them ... Controller. The Adept IC was chosen because of its extensive processing power, integrated grayscale vision, and standard 28 industrial I/O control ...

  4. Implementing Set Based Design into Department of Defense Acquisition

    DTIC Science & Technology

    2016-12-01

    ... challenges for the DOD. This report identifies the original SBD principles and characteristics based on Toyota Motor Corporation's Set Based Concurrent Engineering Model. Additionally, the team reviewed DOD case studies that implemented SBD. The SBD principles, along with the common themes from the ...

  5. Economical launching and accelerating control strategy for a single-shaft parallel hybrid electric bus

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Song, Jian; Li, Liang; Li, Shengbo; Cao, Dongpu

    2016-08-01

    This paper presents an economical launching and accelerating mode comprising four ordered phases: pure electric driving, clutch engagement and engine start-up, engine active charging, and engine driving. The mode suits the alternating conditions of typical city-bus driving scenarios and improves the fuel economy of a hybrid electric bus (HEB). By utilizing the fast response of the electric motor (EM), an adaptive EM controller is designed to meet the power demand during the pure electric driving, engine starting, and engine active charging modes. Concurrently, the smoothness issues induced by the sequential mode transitions are resolved with a coordinated control logic for the engine, EM, and clutch. Simulation and experimental results show that the proposed launching and accelerating mode and its control methods are effective in improving fuel economy and ensuring drivability during fast transitions between the operating modes of the HEB.
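
    A minimal sketch of the ordered mode sequence as a state machine follows; the speed and state-of-charge thresholds are invented for illustration and stand in for the paper's coordinated engine/EM/clutch control logic.

      # Toy sequencer for the four ordered launching/accelerating phases.
      # Thresholds and signals are illustrative assumptions.
      PHASES = ["pure_electric", "clutch_engage_engine_start",
                "engine_active_charging", "engine_driving"]

      def next_phase(phase, speed_kph, soc):
          # Advance through the ordered phases based on vehicle speed and
          # battery state of charge (SOC).
          if phase == "pure_electric" and speed_kph > 20.0:
              return "clutch_engage_engine_start"
          if phase == "clutch_engage_engine_start" and speed_kph > 25.0:
              return "engine_active_charging" if soc < 0.6 else "engine_driving"
          if phase == "engine_active_charging" and soc >= 0.6:
              return "engine_driving"
          return phase

      phase, soc = "pure_electric", 0.5
      for speed in range(0, 60, 5):
          # Charging raises SOC while the engine actively charges the battery.
          soc = min(1.0, soc + (0.02 if phase == "engine_active_charging" else 0.0))
          new = next_phase(phase, float(speed), soc)
          if new != phase:
              print(f"{speed:>2} km/h: {phase} -> {new}")
              phase = new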

  6. A sociocultural analysis of Latino high school students' funds of knowledge and implications for culturally responsive engineering education

    NASA Astrophysics Data System (ADS)

    Mejia, Joel Alejandro

    Previous studies have suggested that, when funds of knowledge are incorporated into science and mathematics curricula, students are more engaged and often develop richer understandings of scientific concepts. While there has been a growing body of research addressing how teachers may integrate students' linguistic, social, and cultural practices with science and mathematics instruction, very little research has been conducted on how the same can be accomplished with Latino and Latina students in engineering. The purpose of this study was to address this gap in the literature by investigating how fourteen Latino and Latina high school adolescents used their funds of knowledge to address engineering design challenges. This project was intended to enhance the educational experience of underrepresented minorities whose social and cultural practices have been traditionally undervalued in schools. This ethnographic study investigated the funds of knowledge of fourteen Latino and Latina high school adolescents and how they used these funds of knowledge in engineering design. Participant observation, bi-monthly group discussion, retrospective and concurrent protocols, and monthly one-on-one interviews were conducted during the study. A constant comparative analysis suggested that Latino and Latina adolescents, although profoundly underrepresented in engineering, bring a wealth of knowledge and experiences that are relevant to engineering design thinking and practice.

  7. Engineering design activities and conceptual change in middle school science

    NASA Astrophysics Data System (ADS)

    Schnittka, Christine G.

    The purpose of this research was to investigate the impact of engineering design classroom activities on conceptual change in science, and on attitudes toward and knowledge about engineering. Students were given a situated learning context and a rationale for learning science in an active, inquiry-based method, and worked in small collaborative groups. One eighth-grade physical science teacher and her students participated in a unit on heat transfer and thermal energy. One class served as the control while two others received variations of an engineering design treatment. Data were gathered from teacher and student entrance and exit interviews, audio recordings of student dialog during group work, video recordings and observations of all classes, pre- and posttests on science content and engineering attitudes, and artifacts and all assignments completed by students. Qualitative and quantitative data were collected concurrently, but analysis took place in two phases. Qualitative data were analyzed in an ongoing manner so that the researcher could explore emerging theories and trends as the study progressed. These results were compared to and combined with the results of the quantitative data analysis. Analysis of the data was carried out in the interpretive framework of analytic induction. Findings indicated that students overwhelmingly possessed alternative conceptions about heat transfer, thermal energy, and engineering prior to the interventions. While all three classes made statistically significant gains in their knowledge about heat and energy, students in the engineering design class with the targeted demonstrations made the most significant gains over the other two classes. Engineering attitudes changed significantly in the two classes that received the engineering design intervention. Implications from this study can inform teachers' use of engineering design activities in science classrooms. These implications are: (1) Alternative conceptions will persist when not specifically addressed. (2) Engineering design activities alone are not enough to promote conceptual change. (3) A middle school teacher can successfully implement an engineering design-based curriculum in a science class. (4) Results may also be of interest to science curriculum developers and engineering educators involved in developing engineering outreach curricula for middle school students.

  8. Space Station logistics policy - Risk management from the top down

    NASA Technical Reports Server (NTRS)

    Paules, Granville; Graham, James L., Jr.

    1990-01-01

    Considerations are presented in the area of risk management specifically relating to logistics and system supportability. These considerations form a basis for confident application of concurrent engineering principles to a development program, aiming at simultaneous consideration of support and logistics requirements within the engineering process as the system concept and designs develop. It is shown that, by applying such a process, the chances of minimizing program logistics and supportability risk in the long term can be improved. The problem of analyzing and minimizing integrated logistics risk for the Space Station Freedom Program is discussed.

  9. Space Station Freedom - Configuration management approach to supporting concurrent engineering and total quality management. [for NASA Space Station Freedom Program

    NASA Technical Reports Server (NTRS)

    Gavert, Raymond B.

    1990-01-01

    Some experiences of NASA configuration management in providing concurrent engineering support to the Space Station Freedom program for the achievement of life cycle benefits and total quality are discussed. Three change-decision experiences involving requirements tracing and automated information systems of the electrical power system are described. The potential benefits of concurrent engineering and total quality management include improved operational effectiveness, reduced logistics and support requirements, prevention of schedule slippages, and life cycle cost savings. It is shown how configuration management can influence the benefits attained through disciplined approaches and innovations that compel consideration of all the technical elements of engineering and quality factors that apply to program development, the transition to operations, and operations themselves. Configuration management experiences involving the Space Station program's tiered management structure, the work package contractors, international partners, and the participating NASA centers are discussed.

  10. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing.

    PubMed

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2014-10-01

    Push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels.
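
    The flavor of occupancy-based performance modeling can be conveyed with a back-of-envelope sketch: estimate what fraction of a streaming multiprocessor's thread slots a kernel can keep busy, then scale runtime inversely for compute-bound kernels. The resource limits and the linear model below are illustrative, not the paper's fitted model.

      # Back-of-envelope occupancy model: compute-bound kernels sharing a GPU
      # get throughput roughly in proportion to the resources they occupy.
      def occupancy(threads_per_block, regs_per_thread,
                    max_threads=2048, max_regs=65536):
          # Fraction of an SM's thread slots a kernel can fill, taking the
          # tighter of the thread-slot and register-file limits.
          blocks_by_threads = max_threads // threads_per_block
          blocks_by_regs = max_regs // (regs_per_thread * threads_per_block)
          blocks = min(blocks_by_threads, blocks_by_regs)
          return blocks * threads_per_block / max_threads

      def predicted_runtime(work_units, unit_time, occ):
          # Compute-bound model: runtime scales inversely with occupancy.
          return work_units * unit_time / occ

      occ_a = occupancy(256, 32)    # kernel A: modest register use
      occ_b = occupancy(256, 128)   # kernel B: register-hungry, lower occupancy
      ratio = predicted_runtime(1e6, 1e-9, occ_b) / predicted_runtime(1e6, 1e-9, occ_a)
      print(f"occupancy A={occ_a:.2f}, B={occ_b:.2f}, runtime ratio B/A={ratio:.2f}")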

  11. Process and assembly plans for low cost commercial fuselage structure

    NASA Technical Reports Server (NTRS)

    Willden, Kurtis; Metschan, Stephen; Starkey, Val

    1991-01-01

    Cost and weight reduction for a composite structure results from selecting design concepts that can be built using efficient, low-cost manufacturing and assembly processes. Since design and manufacturing are inherently cost-dependent, concurrent engineering in the form of a Design-Build Team (DBT) is essential for low-cost designs. Detailed cost analysis of DBT designs and hardware verification must be performed to identify the cost drivers and the relationships between design and manufacturing processes. Results from the global evaluation are used to quantitatively rank designs, identify cost centers for the higher-ranking design concepts, define and prioritize a list of technical/economic issues and barriers, and identify parameters that control concept response. These results are then used for final design optimization.

  12. Feasibility study for convertible engine torque converter

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The feasibility study has shown that a dump/fill-type torque converter has excellent potential for the convertible fan/shaft engine. The torque converter's space requirement permits housing it internally within the normal flow path of a turbofan engine at acceptable engine weight. The unit permits operating the engine in the turboshaft mode by decoupling the fan. To convert to turbofan mode, the torque converter's overdrive capability brings the fan speed up to the power turbine speed to permit engagement of a mechanical lockup device once the shaft speeds are synchronized. The conversion to turbofan mode can be made in less than 10 sec without a drop in power turbine speed. Total thrust delivered to the aircraft by the proprotor, fan, and engine during the transient can be controlled to prevent loss of airspeed or altitude. Heat rejection to the oil is low, and additional oil cooling capacity is not required. The turbofan engine aerodynamic design is basically uncompromised by convertibility and allows proper fan design for quiet and efficient cruise operation. Although the results of the feasibility study are exceedingly encouraging, it must be noted that they are based on extrapolation of limited existing data on torque converters. A component test program with three trial torque converter designs and concurrent computer modeling for fluid flow, stress, and dynamics, updated with test results from each unit, is recommended.

  13. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of a parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory; and (10) a performance analyzer for the PISCES system.

  14. System software for the finite element machine

    NASA Technical Reports Server (NTRS)

    Crockett, T. W.; Knott, J. D.

    1985-01-01

    The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.

  15. The Effect of Enhanced Visualization Instruction on First Grade Students' Scores on the North Carolina Standard Course Assessment

    ERIC Educational Resources Information Center

    Thompson, Amber Cole

    2012-01-01

    Visualization was once thought to be an important skill for professions only related to engineering, but due to the realization of concurrent design and the fast pace of technology, it is now desirable in other professions as well. The importance of learning basic knowledge of geometrical concepts has a greater impact than it did prior to the 21st…

  16. Selected Current Acquisitions and Articles from Periodicals

    DTIC Science & Technology

    1994-06-01

    ... on Theater High Altitude Area Defense (THAAD) System: briefing report to the Chairman, Committee on Foreign Relations, U.S. Senate. [Washington, D.C. ...] John Marshall Law School, 1993- (law periodicals). CONCURRENT ENGINEERING: Karbhari, Vistaspa Maneck. Concurrent engineering for composites ... Postgraduate School, [1990]. VC267.U6 H37 1991. United States. DOD FAR supplement: Department of Defense as of ... Chicago, Ill.: Commerce Clearing House, 1994.

  17. Improving generalized inverted index lock wait times

    NASA Astrophysics Data System (ADS)

    Borodin, A.; Mirvoda, S.; Porshnev, S.; Ponomareva, O.

    2018-01-01

    Concurrent operations on tree-like data structures are a cornerstone of any database system. Such operations are intended to improve read/write performance and are usually implemented via some form of locking. Deadlock-free methods of concurrency control are known as tree locking protocols. These protocols provide basic operations (verbs) and algorithms (ways of invoking those operations) for application to any tree-like data structure. The algorithms operate on data managed by a storage engine, and storage engines differ widely among RDBMS implementations. In this paper, we discuss a tree locking protocol implementation for the Generalized Inverted Index (GIN) applied to the multiversion concurrency control (MVCC) storage engine inside the PostgreSQL RDBMS. We then introduce improvements to the locking protocol and provide usage statistics from an evaluation of our improvement in a very-high-load environment at one of the world's largest IT companies.
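
    One classic deadlock-free tree locking discipline, lock coupling (also called "crabbing"), is easy to sketch: hold a node's lock while acquiring the child's, then release the parent, so no walker can be overtaken on the same path. The Python below is a generic illustration, not GIN's actual PostgreSQL implementation.

      # Lock coupling on a toy tree: never hold more than two locks at once,
      # and always take the child's lock before releasing the parent's.
      import threading

      class Node:
          def __init__(self, key, children=None):
              self.key = key
              self.children = children or {}
              self.lock = threading.Lock()

      def descend(root, path):
          # Walk `path` from the root using lock coupling.
          root.lock.acquire()
          node = root
          for step in path:
              child = node.children[step]
              child.lock.acquire()      # take the child before...
              node.lock.release()       # ...letting go of the parent
              node = child
          node.lock.release()
          return node.key

      leaf = Node("leaf")
      mid = Node("mid", {"b": leaf})
      root = Node("root", {"a": mid})
      print(descend(root, ["a", "b"]))  # -> "leaf"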

  18. Lessons learned for composite structures

    NASA Technical Reports Server (NTRS)

    Whitehead, R. S.

    1991-01-01

    Lessons learned for composite structures are presented in three technology areas: materials, manufacturing, and design. In addition, future challenges for composite structures are presented. Composite materials have long gestation periods from the developmental stage to fully matured production status. Many examples exist of unsuccessful attempts to accelerate this gestation period. Experience has shown that technology transition of a new material system to fully matured production status is time consuming, involves risk, is expensive and should not be undertaken lightly. The future challenges for composite materials require an intensification of the science based approach to material development, extension of the vendor/customer interaction process to include all engineering disciplines of the end user, reduced material costs because they are a significant factor in overall part cost, and improved batch-to-batch pre-preg physical property control. Historical manufacturing lessons learned are presented using current in-service production structure as examples. Most producibility problems for these structures can be traced to their sequential engineering design. This caused an excessive emphasis on design-to-weight and schedule at the expense of design-to-cost. This resulted in expensive performance originated designs, which required costly tooling and led to non-producible parts. Historically these problems have been allowed to persist throughout the production run. The current/future approach for the production of affordable composite structures mandates concurrent engineering design where equal emphasis is placed on product and process design. Design for simplified assembly is also emphasized, since assembly costs account for a major portion of total airframe costs. The future challenge for composite manufacturing is, therefore, to utilize concurrent engineering in conjunction with automated manufacturing techniques to build affordable composite structures. Composite design experience has shown that significant weight savings have been achieved, outstanding fatigue and corrosion resistance have been demonstrated, and in-service performance has been very successful. Currently no structural design show stoppers exist for composite structures. A major lesson learned is that the full scale static test is the key test for composites, since it is the primary structural 'hot spot' indicator. The major durability issue is supportability of thin skinned structure. Impact damage has been identified as the most significant issue for the damage tolerance control of composite structures. However, delaminations induced during assembly operations have demonstrated a significant nuisance value. The future challenges for composite structures are threefold. Firstly, composite airframe weight fraction should increase to 60 percent. At the same time, the cost of composite structures must be reduced by 50 percent to attain the goal of affordability. To support these challenges it is essential to develop lower cost materials and processes.

  19. Three gene expression vector sets for concurrently expressing multiple genes in Saccharomyces cerevisiae.

    PubMed

    Ishii, Jun; Kondo, Takashi; Makino, Harumi; Ogura, Akira; Matsuda, Fumio; Kondo, Akihiko

    2014-05-01

    Yeast has the potential to be used in bulk-scale fermentative production of fuels and chemicals due to its tolerance for low pH and robustness against autolysis. However, expression of multiple external genes in one host yeast strain is considerably labor-intensive due to the lack of polycistronic transcription. To promote the metabolic engineering of yeast, we generated systematic and convenient genetic engineering tools to express multiple genes in Saccharomyces cerevisiae. We constructed a series of multi-copy and integration vector sets for concurrently expressing two or three genes in S. cerevisiae by embedding three classical promoters. The comparative expression capabilities of the constructed vectors were monitored with green fluorescent protein, and the concurrent expression of genes was monitored with three different fluorescent proteins. Our multiple-gene expression tool will be helpful for the advanced construction of genetically engineered yeast strains in a variety of research fields other than metabolic engineering. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  20. Decibel: The Relational Dataset Branching System

    PubMed Central

    Maddox, Michael; Goehring, David; Elmore, Aaron J.; Madden, Samuel; Parameswaran, Aditya; Deshpande, Amol

    2017-01-01

    As scientific endeavors and data analysis become increasingly collaborative, there is a need for data management systems that natively support the versioning or branching of datasets to enable concurrent analysis, cleaning, integration, manipulation, or curation of data across teams of individuals. Common practice for sharing and collaborating on datasets involves creating or storing multiple copies of the dataset, one for each stage of analysis, with no provenance information tracking the relationships between these datasets. This results not only in wasted storage, but also makes it challenging to track and integrate modifications made by different users to the same dataset. In this paper, we introduce the Relational Dataset Branching System, Decibel, a new relational storage system with built-in version control designed to address these shortcomings. We present our initial design for Decibel and provide a thorough evaluation of three versioned storage engine designs that focus on efficient query processing with minimal storage overhead. We also develop an exhaustive benchmark to enable the rigorous testing of these and future versioned storage engine designs. PMID:28149668
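
    The branching idea can be illustrated with a copy-on-write version chain: each branch stores only its own writes and falls back to its parent for everything else, so unchanged data is shared rather than duplicated. This toy Python sketch ignores Decibel's actual storage layouts and query processing.

      # Toy version-controlled table: branches are copy-on-write deltas over
      # their parent. Class and method names are invented for illustration.
      class Branch:
          def __init__(self, parent=None):
              self.parent = parent
              self.delta = {}                 # local writes only

          def put(self, key, value):
              self.delta[key] = value

          def get(self, key):
              branch = self
              while branch is not None:       # walk the version chain
                  if key in branch.delta:
                      return branch.delta[key]
                  branch = branch.parent
              raise KeyError(key)

      master = Branch()
      master.put("row1", "raw")
      cleaning = Branch(parent=master)        # branch for a cleaning pass
      cleaning.put("row1", "cleaned")
      print(master.get("row1"), cleaning.get("row1"))   # raw cleaned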

  1. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly." Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1,500 models (15,000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the 777's needs for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems than previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.

  2. Analysis of Aurora's Performance Simulation Engine for Three Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  3. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.

    1984-01-01

    A method for systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks, to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.

  4. Rosetta:MSF: a modular framework for multi-state computational protein design.

    PubMed

    Löffler, Patrick; Schmitz, Samuel; Hupfeld, Enrico; Sterner, Reinhard; Merkl, Rainer

    2017-06-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta's protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta's single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design.

  5. Rosetta:MSF: a modular framework for multi-state computational protein design

    PubMed Central

    Hupfeld, Enrico; Sterner, Reinhard

    2017-01-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta's protocols optimize sequences based on a single conformation (i.e., design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta's single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design. PMID:28604768

  6. A testing platform for durability studies of polymers and fiber-reinforced polymer composites under concurrent hygrothermo-mechanical stimuli.

    PubMed

    Gomez, Antonio; Pires, Robert; Yambao, Alyssa; La Saponara, Valeria

    2014-12-11

    The durability of polymers and fiber-reinforced polymer composites under service conditions is a critical aspect to be addressed for their robust design and condition-based maintenance. These materials are adopted in a wide range of engineering applications, from aircraft and ship structures to bridges, wind turbine blades, biomaterials, and biomedical implants. Polymers are viscoelastic materials, and their response may be highly nonlinear, making it challenging to predict and monitor their in-service performance. The laboratory-scale testing platform presented herein assists the investigation of the influence of concurrent mechanical loadings and environmental conditions on these materials. The platform was designed to be low-cost and user-friendly. Its chemically resistant materials make the platform adaptable to studies of chemical degradation due to in-service exposure to fluids. An example experiment was conducted at room temperature (RT) on closed-cell polyurethane foam samples loaded with a weight corresponding to ~50% of their ultimate static and dry load. Results show that the testing apparatus is appropriate for these studies. Results also highlight the larger vulnerability of the polymer under concurrent loading, based on the higher mid-point displacements and lower residual failure loads. Recommendations are made for additional improvements to the testing apparatus.

  7. A Testing Platform for Durability Studies of Polymers and Fiber-reinforced Polymer Composites under Concurrent Hygrothermo-mechanical Stimuli

    PubMed Central

    Gomez, Antonio; Pires, Robert; Yambao, Alyssa; La Saponara, Valeria

    2014-01-01

    The durability of polymers and fiber-reinforced polymer composites under service conditions is a critical aspect to be addressed for their robust design and condition-based maintenance. These materials are adopted in a wide range of engineering applications, from aircraft and ship structures to bridges, wind turbine blades, biomaterials, and biomedical implants. Polymers are viscoelastic materials, and their response may be highly nonlinear, making it challenging to predict and monitor their in-service performance. The laboratory-scale testing platform presented herein assists the investigation of the influence of concurrent mechanical loadings and environmental conditions on these materials. The platform was designed to be low-cost and user-friendly. Its chemically resistant materials make the platform adaptable to studies of chemical degradation due to in-service exposure to fluids. An example experiment was conducted at room temperature (RT) on closed-cell polyurethane foam samples loaded with a weight corresponding to ~50% of their ultimate static and dry load. Results show that the testing apparatus is appropriate for these studies. Results also highlight the larger vulnerability of the polymer under concurrent loading, based on the higher mid-point displacements and lower residual failure loads. Recommendations are made for additional improvements to the testing apparatus. PMID:25548950

  8. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing

    PubMed Central

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2015-01-01

    Push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels. PMID:26566545

  9. Functional assembly of engineered myocardium by electrical stimulation of cardiac myocytes cultured on scaffolds.

    PubMed

    Radisic, Milica; Park, Hyoungshin; Shing, Helen; Consi, Thomas; Schoen, Frederick J; Langer, Robert; Freed, Lisa E; Vunjak-Novakovic, Gordana

    2004-12-28

    The major challenge of tissue engineering is directing the cells to establish the physiological structure and function of the tissue being replaced across different hierarchical scales. To engineer myocardium, biophysical regulation of the cells needs to recapitulate multiple signals present in the native heart. We hypothesized that excitation-contraction coupling, critical for the development and function of a normal heart, determines the development and function of engineered myocardium. To induce synchronous contractions of cultured cardiac constructs, we applied electrical signals designed to mimic those in the native heart. Over only 8 days in vitro, electrical field stimulation induced cell alignment and coupling, increased the amplitude of synchronous construct contractions by a factor of 7, and resulted in a remarkable level of ultrastructural organization. Development of conductive and contractile properties of cardiac constructs was concurrent, with strong dependence on the initiation and duration of electrical stimulation.

  10. New technologies for space avionics

    NASA Technical Reports Server (NTRS)

    Aibel, David W.; Dingus, Peter; Lanciault, Mark; Hurdlebrink, Debra; Gurevich, Inna; Wenglar, Lydia

    1994-01-01

    This report reviews a 1994 effort that continued 1993 investigations into issues associated with the definition of requirements and with the practice of concurrent engineering and rapid prototyping, in the context of developing a prototype next-generation reaction jet driver controller. This report discusses lessons learned, the testing of the current prototype, the details of the current design, and the nature and performance of a mathematical model of the life cycle of a pilot-operated valve solenoid.

  11. Training mechanical engineering students to utilize biological inspiration during product development.

    PubMed

    Bruck, Hugh A; Gershon, Alan L; Golden, Ira; Gupta, Satyandra K; Gyger, Lawrence S; Magrab, Edward B; Spranklin, Brent W

    2007-12-01

    The use of bio-inspiration for the development of new products and devices requires new educational tools for students consisting of appropriate design and manufacturing technologies, as well as curriculum. At the University of Maryland, new educational tools have been developed that introduce bio-inspired product realization to undergraduate mechanical engineering students. These tools include the development of a bio-inspired design repository, a concurrent fabrication and assembly manufacturing technology, a series of undergraduate curriculum modules, and a new senior elective in the bio-inspired robotics area. This paper first presents an overview of the two new design and manufacturing technologies that enable students to realize bio-inspired products, and describes how these technologies are integrated into the undergraduate educational experience. Then, the undergraduate curriculum modules are presented, which provide students with the fundamental design and manufacturing principles needed to support bio-inspired product and device development. Finally, an elective bio-inspired robotics project course is presented, which provides undergraduates with the opportunity to demonstrate the application of the knowledge acquired through the curriculum modules in their senior year using the new design and manufacturing technologies.

  12. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability, and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  13. Multi-Organization Multi-Discipline Effort Developing a Mitigation Concept for Planetary Defense

    NASA Technical Reports Server (NTRS)

    Leung, Ronald Y.; Barbee, Brent W.; Seery, Bernard D.; Bambacus, Myra; Finewood, Lee; Greenaugh, Kevin C.; Lewis, Anthony; Dearborn, David; Miller, Paul L.; Weaver, Robert P.

    2017-01-01

    There have been significant recent efforts in addressing mitigation approaches to neutralize Potentially Hazardous Asteroids (PHAs). One such research effort was performed in 2015 by an integrated, inter-disciplinary team of asteroid scientists, energy-deposition modeling scientists, payload engineers, orbital dynamics engineers, spacecraft discipline engineers, and systems architecture engineers from NASA's Goddard Space Flight Center (GSFC) and the Department of Energy (DoE) National Nuclear Security Administration (NNSA) laboratories (Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories). The study team collaborated with the GSFC Integrated Design Center's Mission Design Lab (MDL), which engaged a team of GSFC flight hardware discipline engineers to work with GSFC, LANL, and LLNL NEA-related subject matter experts during a one-week intensive concept formulation study in an integrated concurrent engineering environment. This team has analyzed the first of several distinct study cases for a multi-year NASA research grant. This Case 1 study references the Near-Earth Asteroid (NEA) named Bennu as the notional target due to the availability of a very detailed Design Reference Asteroid (DRA) model for its orbit and physical characteristics (courtesy of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) mission team). The research involved the formulation and optimization of spacecraft trajectories to intercept Bennu, overall mission and architecture concepts, and high-fidelity modeling of both kinetic impact (a spacecraft collision to change a NEA's momentum and orbit) and nuclear detonation effects on Bennu, for purposes of deflecting Bennu.

  14. Radiation Shielding for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Caffrey, Jarvis A.

    2016-01-01

    Design and analysis of radiation shielding for nuclear thermal propulsion has continued at Marshall Space Flight Center. A set of optimization tools are in development, and strategies for shielding optimization will be discussed. Considerations for the concurrent design of internal and external shielding are likely required for a mass optimal shield design. The task of reducing radiation dose to crew from a nuclear engine is considered to be less challenging than the task of thermal mitigation for cryogenic propellant, especially considering the likely implementation of additional crew shielding for protection from solar particles and cosmic rays. Further consideration is thus made for the thermal effects of radiation absorption in cryogenic propellant. Materials challenges and possible methods of manufacturing are also discussed.

  15. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan propulsion system: the system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  16. A novel method for biomaterial scaffold internal architecture design to match bone elastic properties with desired porosity.

    PubMed

    Lin, Cheng Yu; Kikuchi, Noboru; Hollister, Scott J

    2004-05-01

    An often-proposed tissue engineering design hypothesis is that the scaffold should provide a biomimetic mechanical environment for initial function and appropriate remodeling of regenerating tissue while concurrently providing sufficient porosity for cell migration and cell/gene delivery. To provide a systematic study of this hypothesis, the ability to precisely design and manufacture biomaterial scaffolds is needed. Traditional methods for scaffold design and fabrication cannot provide the control over scaffold architecture design to achieve specified properties within fixed limits on porosity. The purpose of this paper was to develop a general design optimization scheme for 3D internal scaffold architecture to match desired elastic properties and porosity simultaneously, by introducing the homogenization-based topology optimization algorithm (also known as general layout optimization). With an initial target for bone tissue engineering, we demonstrate that the method can produce highly porous structures that match human trabecular bone anisotropic stiffness using accepted biomaterials. In addition, we show that anisotropic bone stiffness may be matched with scaffolds of widely different porosity. Finally, we also demonstrate that prototypes of the designed structures can be fabricated using solid free-form fabrication (SFF) techniques.
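
    Stripped to one dimension, the trade the paper formalizes, stiffness versus porosity, can be seen with a power-law stiffness model. The moduli and exponent below are illustrative assumptions; the actual method optimizes the full 3D microstructure topology via homogenization.

      # Toy version of the design trade: choose a relative density
      # (1 - porosity) whose homogenized stiffness matches a bone-like
      # target, using an assumed SIMP-style power law E(rho) = E0 * rho^p.
      E0 = 15.0e9        # base-material modulus, Pa (illustrative)
      E_target = 0.5e9   # target trabecular-bone-like stiffness, Pa (illustrative)
      p = 2.0            # penalization exponent (assumed)

      rho = (E_target / E0) ** (1.0 / p)     # invert E(rho) = E0 * rho^p
      print(f"relative density = {rho:.3f}  ->  porosity = {1 - rho:.3f}")
      print(f"achieved stiffness = {E0 * rho**p / 1e9:.2f} GPa")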

  17. Concurrency in product realization

    NASA Astrophysics Data System (ADS)

    Kelly, Michael J.

    1994-03-01

    Technology per se does not provide a competitive advantage. Timely exploitation of technology is what gives the competitive edge, and this demands a major shift in the product development process and in the management of the industrial enterprise. 'Teaming to win' is more than a management theme; it is the disciplined engineering practice that is essential to success in today's global marketplace. Teaming supports the concurrent engineering practices required to integrate the activities of the people responsible for product realization, achieving shorter development cycles, lower costs, and defect-free products.

  18. Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications

    DTIC Science & Technology

    1992-09-01

    ... STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a "bottom up" approach; this contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down ... Fault Tree Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment, utilizing a pictorial approach.

  19. MEMS product engineering: methodology and tools

    NASA Astrophysics Data System (ADS)

    Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer

    2011-03-01

    The development of MEMS comprises structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design and vice versa; product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology, the authors introduce a software environment that links commercial design tools from both areas into a common design flow. In this paper, emphasis is put on automatic, low-threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step, software tools that automatically extract data from spreadsheets or file systems and put them in context with existing information are presented. The developments are currently being carried out in a European research project.

  20. Peroxide Propulsion at the Turn of the Century

    NASA Technical Reports Server (NTRS)

    Anderson, William E.; Butler, Kathy; Crocket, Dave; Lewis, Tim; McNeal, Curtis

    2000-01-01

    A resurgence of interest in peroxide propulsion has occurred in the last years of the 20th century. This interest is driven by the need for lower-cost propulsion systems and for storable, reusable propulsion systems to meet future space transportation system architectures. NASA and the Air Force are jointly developing two propulsion systems for flight demonstration early in the 21st century. One system will be a development of Boeing's AR2-3 engine, which was successfully fielded in the 1960s. The other is a new pressure-fed design by Orbital Sciences Corporation for expendable mission requirements. Concurrently, NASA and industry are pursuing the key peroxide technologies needed to design, fabricate, and test advanced peroxide engines to meet mission needs beyond 2005. This paper presents a description of the AR2-3, reports the status of its current test program, and describes its intended flight demonstration. It then describes the Orbital 10K engine, the status of its test program, and its planned flight demonstration. Finally, the paper presents a plan, or technology roadmap, for the development of an advanced peroxide engine for the 21st century.

  1. Automated Concurrent Blackboard System Generation in C++

    NASA Technical Reports Server (NTRS)

    Kaplan, J. A.; McManus, J. W.; Bynum, W. L.

    1999-01-01

    In his 1992 Ph.D. thesis, "Design and Analysis Techniques for Concurrent Blackboard Systems", John McManus defined several performance metrics for concurrent blackboard systems and developed a suite of tools for creating and analyzing such systems. These tools allow a user to analyze a concurrent blackboard system design and predict the performance of the system before any code is written. The design can be modified until simulated performance is satisfactory. Then, the code generator can be invoked to generate automatically all of the code required for the concurrent blackboard system except for the code implementing the functionality of each knowledge source. We have completed the port of the source code generator and a simulator for a concurrent blackboard system. The source code generator generates the necessary C++ source code to implement the concurrent blackboard system using Parallel Virtual Machine (PVM) running on a heterogeneous network of UNIX(trademark) workstations. The concurrent blackboard simulator uses the blackboard specification file to predict the performance of the concurrent blackboard design. The only part of the source code for the concurrent blackboard system that the user must supply is the code implementing the functionality of the knowledge sources.
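
    The generated system's division of labor can be pictured with a toy blackboard: the control loop and data flow stand in for the generated scaffolding, while the knowledge-source functions are the one part the user supplies. The Python below is a single-process stand-in for the C++/PVM code the generator emits.

      # Minimal blackboard sketch: knowledge sources watch a shared blackboard
      # and fire when their inputs appear; the control loop runs them until no
      # source can contribute. Names and the example problem are illustrative.
      class KnowledgeSource:
          def __init__(self, name, inputs, output, fn):
              self.name, self.inputs, self.output, self.fn = name, inputs, output, fn

          def ready(self, bb):
              return self.output not in bb and all(k in bb for k in self.inputs)

          def fire(self, bb):
              bb[self.output] = self.fn(*(bb[k] for k in self.inputs))

      blackboard = {"a": 3, "b": 4}
      sources = [
          KnowledgeSource("square_sum", ["a", "b"], "a2b2", lambda a, b: a * a + b * b),
          KnowledgeSource("root", ["a2b2"], "hypotenuse", lambda s: s ** 0.5),
      ]
      fired = True
      while fired:                      # control loop: run any ready source
          fired = False
          for ks in sources:
              if ks.ready(blackboard):
                  ks.fire(blackboard)
                  fired = True
      print(blackboard["hypotenuse"])   # 5.0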

  2. Software modifications to the Demonstration Advanced Avionics Systems (DAAS)

    NASA Technical Reports Server (NTRS)

    Nedell, B. F.; Hardy, G. H.

    1984-01-01

    Critical information required for the design of integrated avionics suitable for generation aviation is applied towards software modifications for the Demonstration Advanced Avionics System (DAAS). The program emphasizes the use of data busing, distributed microprocessors, shared electronic displays and data entry devices, and improved functional capability. A demonstration advanced avionics system (DAAS) is designed, built, and flight tested in a Cessna 402, twin engine, general aviation aircraft. Software modifications are made to DAAS at Ames concurrent with the flight test program. The changes are the result of the experience obtained with the system at Ames, and the comments of the pilots who evaluated the system.

  3. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Giles, G. L.; Barthelemy, J.-F. M.

    1984-01-01

    This paper describes a method for systematic analysis and optimization of large engineering systems, e.g., aircraft, by decomposition of a large task into a set of smaller, self-contained subtasks that can be solved concurrently. The subtasks may be arranged in many hierarchical levels with the assembled system at the top level. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization. It is pointed out that the method is intended to be compatible with the typical engineering organization and the modern technology of distributed computing.
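
    As an illustration only, the alternation of subtask analyses and optimizations can be made concrete with a toy two-subtask example (hypothetical quadratic objectives; scipy's scalar minimizer stands in for each subtask optimizer). Each subtask is re-optimized for the latest inputs from the other, and the sweep repeats until the coupling variables stop changing. The real method additionally propagates optimum sensitivities to the upper level, which this sketch omits.

    from scipy.optimize import minimize_scalar

    def subtask_a(y_b):
        # Subtask A chooses its variable for fixed input y_b from B.
        return minimize_scalar(lambda y: (y - 1.0) ** 2 + 0.3 * y * y_b).x

    def subtask_b(y_a):
        # Subtask B chooses its variable for fixed input y_a from A.
        return minimize_scalar(lambda y: (y + 2.0) ** 2 + 0.3 * y * y_a).x

    y_a, y_b = 0.0, 0.0
    for it in range(50):
        y_a_new, y_b_new = subtask_a(y_b), subtask_b(y_a)
        if abs(y_a_new - y_a) + abs(y_b_new - y_b) < 1e-10:
            break
        y_a, y_b = y_a_new, y_b_new
    print(f"converged in {it} sweeps: y_a={y_a:.4f}, y_b={y_b:.4f}")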

  4. Analyzing and designing object-oriented missile simulations with concurrency

    NASA Astrophysics Data System (ADS)

    Randorf, Jeffrey Allen

    2000-11-01

    A software object model for the six-degree-of-freedom missile modeling domain is presented. As a precursor, a domain analysis of the missile modeling domain was performed, based on the Feature-Oriented Domain Analysis (FODA) technique described by the Software Engineering Institute (SEI). It was subsequently determined that the FODA methodology is functionally equivalent to the Object Modeling Technique (OMT). The analysis used legacy software documentation and code from the ENDOSIM, KDEC, and TFrames 6-DOF modeling tools, as well as other technical literature. The SEI Object Connection Architecture (OCA) was the template for designing the object model. Three variants of the OCA were considered: a reference structure, a recursive structure, and a reference structure with augmentation for flight vehicle modeling. The reference OCA design option was chosen for maintaining simplicity while not compromising the expressive power of the OMT model. The missile architecture was then analyzed for potential areas of concurrent computing. It was shown how protected objects could be used for data passing between OCA object managers, allowing concurrent access without changing the OCA reference design intent or structure. The implementation language was Ada 95, and it was shown how OCA software components could be expressed as Ada child packages. While acceleration of several low-level and higher-level operations is possible on appropriate hardware, a 33% performance degradation was observed for a 4th-order Runge-Kutta integrator solving two simultaneous ordinary differential equations with Ada tasking on a single-processor machine. The Defense Department's High Level Architecture (HLA) was introduced and explained in the context of the OCA. It was shown that the HLA and OCA are not mutually exclusive but complementary architectures: HLA serves as an interoperability solution, with the OCA as an architectural vehicle for software reuse. Further directions for implementing a 6-DOF missile modeling environment are discussed.
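
    For concreteness, the integrator workload at issue is just a classic fourth-order Runge-Kutta step over a two-equation state vector (the dynamics below are a hypothetical oscillator, not the missile model). On a single processor, splitting the two derivative evaluations across tasks adds synchronization cost with no parallel gain, which is consistent with the reported 33% degradation.

    def deriv(t, y):
        """Hypothetical dynamics: simple harmonic oscillator."""
        y1, y2 = y
        return (y2, -y1)

    def rk4_step(t, y, h):
        k1 = deriv(t, y)
        k2 = deriv(t + h/2, [y[i] + h/2 * k1[i] for i in range(2)])
        k3 = deriv(t + h/2, [y[i] + h/2 * k2[i] for i in range(2)])
        k4 = deriv(t + h,   [y[i] + h   * k3[i] for i in range(2)])
        return [y[i] + h/6 * (k1[i] + 2*k2[i] + 2*k3[i] + k4[i])
                for i in range(2)]

    t, y, h = 0.0, [1.0, 0.0], 0.01
    for _ in range(100):
        y = rk4_step(t, y, h)
        t += h
    print(t, y)  # y[0] should track cos(t): cos(1.0) ~ 0.5403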

  5. Radioisotope Power Systems Reference Book for Mission Designers and Planners

    NASA Technical Reports Server (NTRS)

    Lee, Young; Bairstow, Brian

    2015-01-01

    The RPS Program's Program Planning and Assessment (PPA) Office commissioned the Mission Analysis team to develop the Radioisotope Power Systems (RPS) Reference Book for Mission Planners and Designers to define a baseline of RPS technology capabilities with specific emphasis on performance parameters and technology readiness. The main objective of this book is to provide RPS technology information that could be utilized by future mission concept studies and concurrent engineering practices. A progress summary from the major branches of RPS technology research provides mission analysis teams with a vital tool for assessing the RPS trade space, and provides concurrent engineering centers with a consistent set of guidelines for RPS performance characteristics. This book will be iterated when substantial new information becomes available to ensure continued relevance, serving as one of the cornerstone products of the RPS PPA Office. This book updates the original 2011 internal document, using data from the relevant publicly released RPS technology references and consultations with RPS technologists. Each performance parameter and RPS product subsection has been reviewed and cleared by at least one subject matter representative. A virtual workshop was held to reach consensus on the scope and contents of the book, and the definitions and assumptions that should be used. The subject matter experts then reviewed and updated the appropriate sections of the book. The RPS Mission Analysis Team then performed further updates and crosschecked the book for consistency. Finally, a second virtual workshop was held to ensure all subject matter experts and stakeholders concurred on the contents.

  6. 78 FR 8596 - Committee on Equal Opportunities in Science and Engineering #1173; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-06

    ... NATIONAL SCIENCE FOUNDATION Committee on Equal Opportunities in Science and Engineering 1173... Science and Engineering (CEOSE). Dates/Time: February 25, 2013, 9:00 a.m.-5:30 p.m.; February 26, 2013, 9... participation in science and engineering. Agenda: Opening Statement by the CEOSE Chair Discussions: Concurrence...

  7. Structural optimization: Status and promise

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.

    Chapters contained in this book include fundamental concepts of optimum design, mathematical programming methods for constrained optimization, function approximations, approximate reanalysis methods, dual mathematical programming methods for constrained optimization, a generalized optimality criteria method, and a tutorial and survey of multicriteria optimization in engineering. Also included are chapters on the compromise decision support problem and the adaptive linear programming algorithm, sensitivity analyses of discrete and distributed systems, the design sensitivity analysis of nonlinear structures, optimization by decomposition, mixed elements in shape sensitivity analysis of structures based on local criteria, and optimization of stiffened cylindrical shells subjected to destabilizing loads. Other chapters are on applications to fixed-wing aircraft and spacecraft, integrated optimum structural and control design, modeling concurrency in the design of composite structures, and tools for structural optimization. (No individual items are abstracted in this volume)

  8. The Effectiveness of Concurrent Design on the Cost and Schedule Performance of Defense Weapons System Acquisitions

    NASA Astrophysics Data System (ADS)

    Robertson, Randolph B.

    This study investigates the impact of concurrent design on the cost growth and schedule growth of US Department of Defense Major Defense Acquisition Programs (MDAPs). It is motivated by the question of whether employing concurrent design in the development of a major weapon system produces better cost and schedule results than traditional serial development methods. Selected Acquisition Reports were used to determine the cost and schedule growth of MDAPs as well as the degree of concurrency employed. Two simple linear regression analyses were used to determine the degree to which cost growth and schedule growth vary with concurrency. The results were somewhat surprising: concurrency, as it was implemented in the programs under study, was shown to have no effect on cost performance, and performance to development schedule, one of the purported benefits of concurrency, was actually shown to deteriorate as concurrency increased. These results, while not an indictment of the concept of concurrency, indicate that better practices and methods are needed in its implementation in major weapon systems. The findings are instructive to stakeholders in the weapons acquisition process in considering whether and how to employ concurrent design strategies in planning new weapons acquisition programs.
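
    The form of the study's analysis (not its data) is easy to state: regress each growth measure on the concurrency measure and examine the slope. The values below are placeholders, not Selected Acquisition Report figures.

    from scipy.stats import linregress

    concurrency     = [0.10, 0.25, 0.40, 0.55, 0.70]  # placeholder overlap fractions
    cost_growth     = [0.30, 0.28, 0.33, 0.29, 0.31]  # placeholder: flat (no effect)
    schedule_growth = [0.10, 0.18, 0.22, 0.30, 0.38]  # placeholder: worsens with concurrency

    for name, growth in (("cost", cost_growth), ("schedule", schedule_growth)):
        fit = linregress(concurrency, growth)
        print(f"{name}: slope={fit.slope:.3f}  r^2={fit.rvalue**2:.3f}  p={fit.pvalue:.3f}")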

  9. Ceramic applications in turbine engines

    NASA Technical Reports Server (NTRS)

    Byrd, J. A.; Janovicz, M. A.; Thrasher, S. R.

    1981-01-01

    Development testing activities on the 1900 F-configuration ceramic parts were completed, 2070 F-configuration ceramic component rig and engine testing was initiated, and the conceptual design for the 2265 F-configuration engine was identified. Fabrication of the 2070 F-configuration ceramic parts continued, along with burner rig development testing of the 2070 F-configuration metal combustor in preparation for 1132 C (2070 F) qualification test conditions. Shakedown testing of the hot engine simulator (HES) rig was also completed in preparation for testing of a spin-rig-qualified ceramic-bladed rotor assembly at 1132 C (2070 F) test conditions. Concurrently, ceramics from new sources and alternate materials continued to be evaluated, and fabrication of 2070 F-configuration ceramic components from these new sources continued. Cold spin testing of the critical 2070 F-configuration blade continued in the spin test rig to qualify a set of ceramic blades at 117% engine speed for the gasifier turbine rotor. Rig testing of the ceramic-bladed gasifier turbine rotor assembly at 108% engine speed was also performed, which resulted in the failure of one blade. The new three-piece hot seal with the nickel oxide/calcium fluoride wearface composition was qualified in the regenerator rig and introduced to engine operation with marginal success.

  10. Collaborative Mission Design at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Gough, Kerry M.; Allen, B. Danette; Amundsen, Ruth M.

    2005-01-01

    NASA Langley Research Center (LaRC) has developed and tested two facilities dedicated to increasing efficiency in key mission design processes, including payload design, mission planning, and implementation plan development, among others. The Integrated Design Center (IDC) is a state-of-the-art concurrent design facility which allows scientists and spaceflight engineers to produce project designs and mission plans in a real-time collaborative environment, using industry-standard physics-based development tools and the latest communication technology. The Mission Simulation Lab (MiSL), a virtual reality (VR) facility focused on payload and project design, permits engineers to quickly translate their design and modeling output into enhanced three-dimensional models and then examine them in a realistic full-scale virtual environment. The authors were responsible for envisioning both facilities and turning those visions into fully operational mission design resources at LaRC with multiple advanced capabilities and applications. In addition, the authors have created a synergistic interface between these two facilities. This combined functionality is the Interactive Design and Simulation Center (IDSC), a meta-facility which offers project teams a powerful array of highly advanced tools, permitting them to rapidly produce project designs while maintaining the integrity of the input from every discipline expert on the project. The concept-to-flight mission support provided by IDSC has shown improved inter- and intra-team communication and a reduction in the resources required for proposal development, requirements definition, and design effort.

  11. Cost-engineering modeling to support rapid concept development of an advanced infrared satellite system

    NASA Astrophysics Data System (ADS)

    Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.

    1995-12-01

    Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Decisions made during the concept exploration and development (CE&D) phase drive the cost of a program more than those in any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers and reduce acquisition costs. The automated interconnectivity between subsystem models allows quick and consistent assessment of system design impacts and relative cost impacts due to requirement changes. The effort differs from most past CEM efforts in that it incorporates more detailed spacecraft and sensor payload models, and it has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM comprises integrated, detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-stare and scanner sensor types incorporate models of the focal plane array, optics, processing, thermal control, communications, and mission performance. The current CEM effort has provided visibility into requirements, design, and cost drivers for system architects and decision makers in determining the configuration of an infrared satellite architecture that meets essential requirements cost-effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications. Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.
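
    The flavor of the engineering-to-cost linkage can be sketched with a generic power-law cost estimating relationship (CER); the coefficients and the aperture-to-mass sizing rule below are hypothetical, not SBIRS values. A requirement change flows through the engineering relation to mass, and from mass to cost.

    def subsystem_cost(mass_kg, a=1.2, b=0.8):
        """Generic power-law CER: first-unit cost ($M) vs. mass."""
        return a * mass_kg ** b

    for aperture_m in (0.5, 0.7):
        sensor_mass = 40.0 * aperture_m ** 1.5  # hypothetical sizing relation
        print(f"aperture {aperture_m} m -> mass {sensor_mass:.1f} kg "
              f"-> cost ${subsystem_cost(sensor_mass):.1f}M")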

  12. Modeling of Broadband Liners Applied to the Advanced Noise Control Fan

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.

    2015-01-01

    The broadband component of fan noise has grown in relevance with increases in bypass ratio and the incorporation of advanced fan designs. Therefore, while the attenuation of fan tones remains a major factor in engine nacelle acoustic liner design, the simultaneous reduction of broadband fan noise levels has received increased interest. A previous investigation accordingly focused on improvements to an established broadband acoustic liner optimization process, using the Advanced Noise Control Fan (ANCF) rig as a demonstrator. Constant-depth, double-degree-of-freedom and variable-depth, multi-degree-of-freedom liner designs were carried through design, fabrication, and testing. This paper addresses a number of areas for further research identified in the initial assessment of the ANCF study. Specifically, incident source specification and uncertainty in some aspects of the predicted liner impedances are addressed. This information is incorporated into updated predictions of liner performance, and comparisons with measurements are greatly improved. The results illustrate the value of the design process in concurrently evaluating the relative costs and benefits of various liner designs. This study also provides further confidence in the integrated use of duct acoustic propagation/radiation and liner modeling tools in the design and evaluation of novel broadband liner concepts for complex engine configurations.

  13. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

  14. The nuclear thermal electric rocket: a proposed innovative propulsion concept for manned interplanetary missions

    NASA Astrophysics Data System (ADS)

    Dujarric, C.; Santovincenzo, A.; Summerer, L.

    2013-03-01

    Conventional propulsion technology (chemical and electric) currently limits the possibilities for human space exploration to the neighborhood of the Earth. If farther destinations (such as Mars) are to be reached with humans on board, a more capable interplanetary transfer engine featuring high thrust and high specific impulse is required. The energy source which could in principle best meet these engine requirements is nuclear thermal. However, nuclear thermal rocket technology is not yet ready for flight application. The development of the new materials necessary for the nuclear core will require further ground testing of full-scale nuclear rocket engines. Such testing is a powerful inhibitor to nuclear rocket development, as the risks of nuclear contamination of the environment cannot be entirely avoided with current concepts. Alongside more mature activities in the field of space nuclear power sources for generating on-board power, a low-level investigation into nuclear propulsion has long been running within ESA, and innovative concepts were proposed at an IAF conference in 1999 [1, 2]. Following a slow maturation process, a new concept was defined and submitted to a concurrent design exercise at ESTEC in 2007. Great care was taken in the selection of the design parameters to ensure that this quite innovative concept would in all respects likely be feasible with margins. However, a thorough feasibility demonstration will require a more detailed design, including the selection of appropriate materials and the verification that these can withstand the expected mechanical, thermal, and chemical environment. So far, the predefinition work has made clear that, based on conservative technology assumptions, a specific impulse of 920 s could be obtained with a thrust of 110 kN. Despite the heavy engine dry mass, a preliminary mission analysis using conservative assumptions showed that the concept reduces the required Initial Mass in Low Earth Orbit compared to conventional nuclear thermal rockets for a human mission to Mars. Of course, the realization of this concept still requires proper engineering and the dimensioning of quite unconventional machinery. A patent was filed on the concept. Because the operating parameters of the nuclear core are very specific to this type of concept, it appears possible to ground-test this kind of engine at full scale in a closed loop, using a reasonably sized test facility under safe and clean conditions. Such tests can be conducted within a fully confined enclosure, which would substantially increase the associated inherent nuclear safety levels. This breakthrough removes a showstopper for nuclear rocket engine development. The present paper discloses the NTER (Nuclear Thermal Electric Rocket) engine concept, presents some of the results of the ESTEC concurrent engineering exercise, and explains the concept for the NTER on-ground testing facility. Regulations and safety issues related to the development and implementation of the NTER concept are addressed as well.
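
    As a back-of-envelope check (not taken from the paper), the quoted performance figures fix the propellant mass flow through the standard relation between thrust and specific impulse:

    \[
    \dot{m} = \frac{F}{g_0\, I_{sp}}
            = \frac{110\,000\ \mathrm{N}}{(9.81\ \mathrm{m/s^2})(920\ \mathrm{s})}
            \approx 12.2\ \mathrm{kg/s}.
    \]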

  15. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise, and respective software are not geographically co-located. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partitioning the design software across different machines allows each constituent tool to run on the machine that provides the greatest economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  16. Identification and Classification of Common Risks in Space Science Missions

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Chattopadhyay, Debarati; Hanna, Robert A.; Port, Daniel; Eggleston, Sabrina

    2010-01-01

    Due to the highly constrained schedules and budgets that NASA missions must contend with, the identification and management of cost, schedule, and risk in the earliest stages of the lifecycle is critical. At the Jet Propulsion Laboratory (JPL), the concurrent engineering teams are the first to address these items in a systematic manner. Foremost among these concurrent engineering teams is Team X. Started in 1995, Team X has carried out over 1000 studies, dramatically reducing the time and cost involved, and has been the model for other concurrent engineering teams both within NASA and throughout the larger aerospace community. The ability to do integrated risk identification and assessment was first introduced into Team X in 2001. Since that time, the mission risks identified in each study have been kept in a database. In this paper we describe how the Team X risk process is evolving, highlighting the strengths and weaknesses of the different approaches. The paper especially focuses on the identification and classification of common risks that have arisen during Team X studies of space-based science missions.

  17. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1996-01-01

    The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.

  18. Using SFOC to fly the Magellan Venus mapping mission

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.; Leonard, Robert E., Jr.; Short, Owen G.

    1993-01-01

    Traditionally, spacecraft flight operations at the Jet Propulsion Laboratory (JPL) have been performed by teams of spacecraft experts utilizing ground software designed specifically for the current mission. The Jet Propulsion Laboratory set out to reduce the cost of spacecraft mission operations by designing ground data processing software that could be used by multiple spacecraft missions, either sequentially or concurrently. The Space Flight Operations Center (SFOC) System was developed to provide the ground data system capabilities needed to monitor several spacecraft simultaneously and provide enough flexibility to meet the specific needs of individual projects. The Magellan Spacecraft Team utilizes the SFOC hardware and software designed for engineering telemetry analysis, both real-time and non-real-time. The flexibility of the SFOC System has allowed the spacecraft team to integrate their own tools with SFOC tools to perform the tasks required to operate a spacecraft mission. This paper describes how the Magellan Spacecraft Team is utilizing the SFOC System in conjunction with their own software tools to perform the required tasks of spacecraft event monitoring as well as engineering data analysis and trending.

  19. Control-structure-thermal interactions in analysis of lunar telescopes

    NASA Technical Reports Server (NTRS)

    Thompson, Roger C.

    1992-01-01

    The lunar telescope project was an excellent model for the CSTI study because a telescope is a very sensitive instrument, and thermal expansion or mechanical vibration of the mirror assemblies will rapidly degrade the resolution of the device. Consequently, the interactions are strongly coupled. The lunar surface experiences very large temperature variations, ranging from approximately -180 C to over 100 C. Although the optical assemblies of the telescopes will be well insulated, the temperature of the mirrors will inevitably fluctuate in a similar cycle, though of much smaller magnitude. In order to obtain images of high quality and clarity, allowable thermal deformations of any point on a mirror must be less than 1 micron. Initial estimates indicate that this corresponds to a temperature variation of much less than 1 deg through the thickness of the mirror. Therefore, a lunar telescope design will most probably include active thermal control, a means of controlling the shape of the mirrors, or a combination of both systems. Historically, the design of a complex vehicle was primarily a sequential process in which the basic structure was defined without concurrent detailed analyses of other subsystems. The basic configuration was then passed to the different teams responsible for each subsystem, and their task was to produce a workable solution without requiring major alterations to any principal components or subsystems. Consequently, the final design of the vehicle was not always the most efficient, owing to the fact that each subsystem design was partially constrained by the previous work. This procedure was necessary at the time because the analysis process was extremely time-consuming and had to be started over with each significant alteration of the vehicle. With recent advances in the power and capacity of small computers, and the parallel development of powerful software for structural, thermal, and control system analysis, it is now possible to produce very detailed analyses of intermediate designs in a much shorter period of time. The subsystems can thus be designed concurrently, and alterations in the overall design can be quickly adopted into each analysis; the design becomes an iterative process in which it is much easier to experiment with new ideas, configurations, and components. Concurrent engineering has the potential to produce efficient, highly capable designs because the effect of one subsystem on another can be assessed in much more detail at a very early point in the program. The research program consisted of several tasks: scale a prototype telescope assembly to a 1 m aperture; develop a model of the telescope assembly using finite element (FEM) codes available on site; determine structural deflections of the mirror surfaces due to the temperature variations; develop a prototype control system to maintain the proper shape of the optical elements; and, most important of all, demonstrate the concurrent engineering approach with this example. In addition, the software used for the finite element models and thermal analysis was relatively new within the Program Development Office and had yet to be applied to systems this large or complex; understanding the software and modifying it for use with this project was also required. The I-DEAS software by Structural Dynamics Research Corporation (SDRC) was used to build the finite element models, and TMG, developed by Maya Heat Transfer Technologies, Ltd. (which runs as an I-DEAS module), was used for the thermal model calculations. All control system development was accomplished with MATRIX(sub X) by Integrated Systems, Inc.
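
    A rough order-of-magnitude estimate (not from the paper) shows why the thermal requirement is so tight. For linear thermal expansion \(\delta = \alpha L \Delta T\), with a mirror dimension of \(L = 1\) m and a borosilicate-like coefficient \(\alpha \approx 3\times10^{-6}\ \mathrm{K^{-1}}\) (material assumed purely for illustration),

    \[
    \delta = (3\times10^{-6}\ \mathrm{K^{-1}})(1\ \mathrm{m})(0.3\ \mathrm{K})
           \approx 0.9\ \mu\mathrm{m},
    \]

    so keeping deformations below 1 micron indeed demands temperature control to a small fraction of a degree.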

  20. The accomplishment of the Engineering Design Activities of IFMIF/EVEDA: The European-Japanese project towards a Li(d,xn) fusion relevant neutron source

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Ibarra, A.; Abal, J.; Abou-Sena, A.; Arbeiter, F.; Arranz, F.; Arroyo, J. M.; Bargallo, E.; Beauvais, P.-Y.; Bernardi, D.; Casal, N.; Carmona, J. M.; Chauvin, N.; Comunian, M.; Delferriere, O.; Delgado, A.; Diaz-Arocas, P.; Fischer, U.; Frisoni, M.; Garcia, A.; Garin, P.; Gobin, R.; Gouat, P.; Groeschel, F.; Heidinger, R.; Ida, M.; Kondo, K.; Kikuchi, T.; Kubo, T.; Le Tonqueze, Y.; Leysen, W.; Mas, A.; Massaut, V.; Matsumoto, H.; Micciche, G.; Mittwollen, M.; Mora, J. C.; Mota, F.; Nghiem, P. A. P.; Nitti, F.; Nishiyama, K.; Ogando, F.; O'hira, S.; Oliver, C.; Orsini, F.; Perez, D.; Perez, M.; Pinna, T.; Pisent, A.; Podadera, I.; Porfiri, M.; Pruneri, G.; Queral, V.; Rapisarda, D.; Roman, R.; Shingala, M.; Soldaini, M.; Sugimoto, M.; Theile, J.; Tian, K.; Umeno, H.; Uriot, D.; Wakai, E.; Watanabe, K.; Weber, M.; Yamamoto, M.; Yokomine, T.

    2015-08-01

    The International Fusion Materials Irradiation Facility (IFMIF), presently in its Engineering Validation and Engineering Design Activities (EVEDA) phase under the frame of the Broader Approach Agreement between Europe and Japan, accomplished its EDA phase on schedule in summer 2013 with the release of the engineering design report of the IFMIF plant, which is described here. Many improvements of the design over former phases have been implemented: a reduction of beam losses and operational costs thanks to the superconducting accelerator concept; the relocation of the quench tank outside the test cell (TC), with a reduction of tritium inventory and a simplification of its replacement in case of failure; the separation of the irradiation modules from the shielding block, gaining irradiation flexibility, enhancing remote handling equipment reliability, and reducing cost; and the water cooling of the liner and biological shielding of the TC, enhancing the efficiency and economy of the related sub-systems. In addition, the maintenance strategy has been modified to allow a shorter yearly stop of the irradiation operations and a more careful management of the irradiated samples. The design of the IFMIF plant is intimately linked with the EVA phase carried out since the entry into force of IFMIF/EVEDA in June 2007. These validation activities and their on-going accomplishment have been thoroughly described elsewhere (Knaster J et al [19]); combined with the present paper, they allow a clear understanding of the maturity of the European-Japanese international efforts. The released IFMIF Intermediate Engineering Design Report (IIEDR), which could, if required, be complemented concurrently with the outcome of the on-going EVA, will allow decision making on construction and/or serve as the basis for the definition of the next step, aligned with the evolving needs of the fusion community.

  1. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
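
    The polynomial half of the comparison is compact enough to sketch: fit a quadratic response surface to sampled responses by least squares, then use the cheap surrogate in place of the expensive analysis during optimization. The design points and responses below are illustrative, not data from the article.

    import numpy as np

    # Sampled design points (e.g., from a design-of-experiments plan)
    x1 = np.array([0.0, 0.5, 1.0, 0.0, 1.0, 0.5])
    x2 = np.array([0.0, 0.5, 1.0, 1.0, 0.0, 0.0])
    y  = np.array([1.0, 0.2, 1.1, 0.9, 0.8, 0.4])  # observed responses

    # Quadratic basis: [1, x1, x2, x1^2, x1*x2, x2^2]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1*x2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surrogate(p1, p2):
        """Cheap polynomial prediction at an untried design point."""
        return coef @ np.array([1.0, p1, p2, p1**2, p1*p2, p2**2])

    print(surrogate(0.25, 0.75))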

  2. Real engineering in a virtual world

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deitz, D.

    1995-07-01

    VR technology can be thought of as the next point on a continuum that leads from 1-D data (such as the text and numbers on a finite element analysis printout), through 2-D drawings and 3-D solid models, to 4-D digital prototypes that eventually will have texture and weight and can be held in one's hand. If it lives up to its potential, VR could become just another tool--like 3-D CAD/CAM systems and FEA software--that can be used to pursue continuous improvements in design and manufacturing processes. For example, VR could help manufacturers reduce the number of prototypes and engineering change orders (ECOs) generated during the product life cycle. Virtual reality could also be used to promote concurrent engineering. Because realistic virtual models are easier to interpret and interrogate than 2-D drawings or even 3-D solid models, they have the potential to simplify design reviews. They could also make it easier for non-engineers (such as salespeople and potential customers) to contribute to the design process. VR technology still has a way to go before it becomes a standard engineering tool, however. Peripheral devices are still being perfected, and engineers seem to agree that the jury's still out on which peripherals are most appropriate for which applications. Further, advanced VR applications are largely confined to the research and development departments of large corporations or to public and private research centers. Finally, potential users will have to wait a few years before desktop computers are powerful enough to run such applications--and inexpensive enough to survive a cost-benefit analysis.

  3. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

    This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the ACSYNT (AirCraft SYNThesis) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students, who view it as a significant advance over prior methods.

  4. Verified compilation of Concurrent Managed Languages

    DTIC Science & Technology

    2017-11-01

    designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these ... ideas to prove the correctness of a state-of-the-art concurrent garbage collector. SUBJECT TERMS: Program verification, compiler design ... Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs. A

  5. NASA Planetary Science Summer School: Preparing the Next Generation of Planetary Mission Leaders

    NASA Astrophysics Data System (ADS)

    Budney, C. J.; Lowes, L. L.; Sohus, A.; Wheeler, T.; Wessen, A.; Scalice, D.

    2010-12-01

    Sponsored by NASA’s Planetary Science Division, and managed by the Jet Propulsion Laboratory, the Planetary Science Summer School prepares the next generation of engineers and scientists to participate in future solar system exploration missions. Participants learn the mission life cycle, roles of scientists and engineers in a mission environment, mission design interconnectedness and trade-offs, and the importance of teamwork. For this professional development opportunity, applicants are sought who have a strong interest and experience in careers in planetary exploration, and who are science and engineering post-docs, recent PhDs, and doctoral students, and faculty teaching such students. Disciplines include planetary science, geoscience, geophysics, environmental science, aerospace engineering, mechanical engineering, and materials science. Participants are selected through a competitive review process, with selections based on the strength of the application and advisor’s recommendation letter. Under the mentorship of a lead engineer (Dr. Charles Budney), students select, design, and develop a mission concept in response to the NASA New Frontiers Announcement of Opportunity. They develop their mission in the JPL Advanced Projects Design Team (Team X) environment, which is a cross-functional multidisciplinary team of professional engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis and evaluation of mission concept designs. About 36 students participate each year, divided into two summer sessions. In advance of an intensive week-long session in the Project Design Center at JPL, students select the mission and science goals during a series of six weekly WebEx/telecons, and develop a preliminary suite of instrumentation and a science traceability matrix. Students assume both a science team and a mission development role with JPL Team X mentors. Once at JPL, students participate in a series of Team X project design sessions, during which their mentors aid them in finalizing their mission design and instrument suite, and in making the necessary trade-offs to stay within the cost cap. Tours of JPL facilities highlight the end-to-end life cycle of a mission. At week’s end, students present their Concept Study to a “proposal review board” of JPL scientists and engineers and NASA Headquarters executives, who feed back the strengths and weaknesses of their proposal and mission design. The majority of students come from top US universities with planetary science or engineering programs, such as Brown University, MIT, Georgia Tech, University of Colorado, Caltech, Stanford, University of Arizona, UCLA, and University of Michigan. Almost a third of Planetary Science Summer School alumni from the last 10 years of the program are currently employed by NASA or JPL. The Planetary Science Summer School is implemented by the JPL Education Office in partnership with JPL’s Team X Project Design Center.

  6. 78 FR 72859 - Concurrence With OIE Risk Designations for Bovine Spongiform Encephalopathy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-04

    ... DEPARTMENT OF AGRICULTURE Animal and Plant Health Inspection Service [Docket No. APHIS-2013-0064] Concurrence With OIE Risk Designations for Bovine Spongiform Encephalopathy AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice. SUMMARY: We are advising the public of our preliminary concurrence...

  7. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
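
    The core move can be illustrated with a toy design structure matrix (DSM): count the feedback couplings (dependencies on processes scheduled later) and greedily try pairwise swaps of the sequence that reduce the count. The matrix below is made up, and a greedy pass may stop at a local minimum; DeMAID's actual algorithms are more sophisticated.

    import itertools

    # dsm[i][j] = 1 means process i needs an output of process j.
    dsm = [[0, 1, 0, 0],
           [0, 0, 0, 1],
           [1, 0, 0, 1],
           [0, 0, 0, 0]]

    def feedbacks(order):
        pos = {p: k for k, p in enumerate(order)}
        return sum(dsm[i][j] for i in order for j in order
                   if dsm[i][j] and pos[j] > pos[i])

    order, improved = [0, 1, 2, 3], True
    while improved:
        improved = False
        for a, b in itertools.combinations(range(len(order)), 2):
            trial = order[:]
            trial[a], trial[b] = trial[b], trial[a]
            if feedbacks(trial) < feedbacks(order):
                order, improved = trial, True
    print(order, feedbacks(order))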

  8. Simplex turbopump design

    NASA Technical Reports Server (NTRS)

    Marsh, Matt; Cowan, Penny

    1994-01-01

    Turbomachinery used in liquid rocket engines is typically composed of complex geometries made from high strength-to-weight superalloys and has long design and fabrication cycle times (3 to 5 years). A simple, low-cost turbopump is being designed in-house to demonstrate the ability to reduce the overall cost to $500K and compress the life cycle time to 18 months. The Simplex turbopump was designed to provide a discharge pressure of 1500 psia of liquid oxygen at 90 lbm/s. The turbine will be powered by gaseous oxygen, which eliminates the need for an inter-propellant seal typically required to separate fuel-rich turbine gases from liquid oxygen pump components. Materials used in the turbine flow paths will be existing characterized metals at 800 deg R that are compatible with a warm oxygen environment. This turbopump design would be suitable for integration with a 40K-pound-thrust hybrid motor that provides warm oxygen from a tap-off location to power the turbine. The preliminary and detailed analysis was completed in a year by a multiple-discipline, concurrent engineering team. Manpower, schedule, and cost data were tracked during the process for comparison with the initial goal. The Simplex hardware is in the procurement cycle, with the first test expected to occur approximately 1.5 months behind the original schedule goal.

  9. Toward Genome-Based Metabolic Engineering in Bacteria.

    PubMed

    Oesterle, Sabine; Wuethrich, Irene; Panke, Sven

    2017-01-01

    Prokaryotes stably modified at the genome level are of great importance for the production of fine and commodity chemicals. Traditional methods for genome engineering have long suffered from imprecision and low efficiency, making the construction of suitable high-producer strains laborious. Here, we review recent advances in the discovery and refinement of molecular precision engineering tools for genome-based metabolic engineering in bacteria for chemical production, with focus on the λ-Red recombineering and the clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 nuclease systems. In conjunction, they enable the integration of in vitro-synthesized DNA segments into specified locations on the chromosome and allow for the enrichment of rare mutants by elimination of unmodified wild-type cells. Combination with concurrently developing improvements in important accessory technologies such as DNA synthesis, high-throughput screening methods, regulatory element design, and metabolic pathway optimization tools has resulted in novel efficient microbial producer strains and given access to new metabolic products. These new tools have made, and will likely continue to make, a big impact on the bioengineering strategies that are transforming the chemical industry. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A definition of high-level decisions in the engineering of systems

    NASA Astrophysics Data System (ADS)

    Powell, Robert Anthony

    The role of the systems engineer demands that he or she be proactive and guide the program manager and customers through their decisions to enhance the effectiveness of system development, producing faster, better, and cheaper systems. The present lack of coverage in the literature on what these decisions are and how they relate to each other may be a contributing factor in the high rate of failure among system projects. At the onset of the system development process, decisions have an integral role in the design of a system that meets stakeholders' needs. This is apparent during the design and qualification of both the Development System and the Operational System. The performance, cost, and schedule of the Development System affect the performance of the Operational System and are affected by decisions that influence physical elements of the Development System. The performance, cost, and schedule of the Operational System are affected by decisions that influence physical elements of the Operational System. Traditionally, product and process have been designed using know-how and trial and error. However, the empiricism of engineers and program managers is limited, which can lead, and has led, to costly mistakes. To date, very little research has explored the decisions made in the engineering of a system. In government, literature exists on procurement processes for major system development; but in general, literature on the decisions, how they relate to each other, and the key information requirements within each of the two systems and across the two systems is not readily available. This research aims to improve the processes inherent in the engineering of systems. The primary focus of this research is on Department of Defense (DoD) military systems, specifically aerospace systems, though the results may generalize more broadly. The result of this research is a process tool, a Decision System Model, which can be used by systems engineers to guide the program manager and customers through the decisions about concurrently designing and qualifying both the Development and Operational systems.

  11. NASA Planetary Science Summer School: Preparing the Next Generation of Planetary Mission Leaders

    NASA Astrophysics Data System (ADS)

    Lowes, L. L.; Budney, C. J.; Sohus, A.; Wheeler, T.; Urban, A.; NASA Planetary Science Summer School Team

    2011-12-01

    Sponsored by NASA's Planetary Science Division, and managed by the Jet Propulsion Laboratory, the Planetary Science Summer School prepares the next generation of engineers and scientists to participate in future solar system exploration missions. Participants learn the mission life cycle, roles of scientists and engineers in a mission environment, mission design interconnectedness and trade-offs, and the importance of teamwork. For this professional development opportunity, applicants are sought who have a strong interest and experience in careers in planetary exploration, and who are science and engineering post-docs, recent PhDs, and doctoral students, and faculty teaching such students. Disciplines include planetary science, geoscience, geophysics, environmental science, aerospace engineering, mechanical engineering, and materials science. Participants are selected through a competitive review process, with selections based on the strength of the application and advisor's recommendation letter. Under the mentorship of a lead engineer (Dr. Charles Budney), students select, design, and develop a mission concept in response to the NASA New Frontiers Announcement of Opportunity. They develop their mission in the JPL Advanced Projects Design Team (Team X) environment, which is a cross-functional multidisciplinary team of professional engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis and evaluation of mission concept designs. About 36 students participate each year, divided into two summer sessions. In advance of an intensive week-long session in the Project Design Center at JPL, students select the mission and science goals during a series of six weekly WebEx/telecons, and develop a preliminary suite of instrumentation and a science traceability matrix. Students assume both a science team and a mission development role with JPL Team X mentors. Once at JPL, students participate in a series of Team X project design sessions, during which their mentors aid them in finalizing their mission design and instrument suite, and in making the necessary trade-offs to stay within the cost cap. Tours of JPL facilities highlight the end-to-end life cycle of a mission. At week's end, students present their Concept Study to a "proposal review board" of JPL scientists and engineers and NASA Headquarters executives, who feed back the strengths and weaknesses of their proposal and mission design. A survey of Planetary Science Summer School alumni administered in summer of 2011 provides information on the program's impact on students' career choices and leadership roles as they pursue their employment in planetary science and related fields. Preliminary results will be discussed during the session. Almost a third of the approximately 450 Planetary Science Summer School alumni from the last 10 years of the program are currently employed by NASA or JPL. The Planetary Science Summer School is implemented by the JPL Education Office in partnership with JPL's Team X Project Design Center.

  12. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.

  13. An Integrated Approach to Risk Assessment for Concurrent Design

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Voss, Luke; Feather, Martin; Cornford, Steve

    2005-01-01

    This paper describes an approach to risk assessment and analysis suited to the early phase, concurrent design of a space mission. The approach integrates an agile, multi-user risk collection tool, a more in-depth risk analysis tool, and repositories of risk information. A JPL developed tool, named RAP, is used for collecting expert opinions about risk from designers involved in the concurrent design of a space mission. Another in-house developed risk assessment tool, named DDP, is used for the analysis.

  14. Experimental clean combustor program, phase 2

    NASA Technical Reports Server (NTRS)

    Roberts, R.; Peduzzi, A.; Vitti, G. E.

    1976-01-01

    Combustor pollution reduction technology for commercial CTOL engines was generated, and this technology was demonstrated in a full-scale JT9D engine in 1976. Component rig refinement tests were conducted on the two best combustor concepts: the Vorbix combustor, and a hybrid combustor that combines the pilot zone of the staged premix combustor with the main zone of the swirl-can combustor. Both concepts significantly reduced all pollutant emissions relative to the JT9D-7 engine combustor; however, neither concept met all program goals. The hybrid combustor met the pollution goals for unburned hydrocarbons and carbon monoxide but did not achieve the oxides-of-nitrogen goal, and it had significant performance deficiencies. The Vorbix combustor met the goals for unburned hydrocarbons and oxides of nitrogen but did not achieve the carbon monoxide goal; its performance approached the engine requirements. On the basis of these results, the Vorbix combustor was selected for the engine demonstration program. A control study was conducted to establish the fuel control requirements imposed by the low-emission combustor concepts and to identify conceptual control system designs. Concurrent efforts were also completed on two addenda: an alternate-fuels addendum and a combustion-noise addendum.

  15. Regeneratively cooled rocket engine for space storable propellants

    NASA Technical Reports Server (NTRS)

    Wagner, W. R.

    1973-01-01

    Analysis, design, fabrication, and test efforts were performed for the existing OF2/B2H6 regeneratively cooled 1K (4448 N) thrust chamber to demonstrate simultaneous B2H6 fuel and OF2 oxidizer cooling and to provide results for a gaseous propellant condition injected into the combustion chamber. Data derived from performance, thermal, and flow measurements confirmed predictions from previous test work and from a concurrent analytical study. The development data derived from the experimental study were judged sufficient to support development of a preflight thrust chamber demonstrator prototype for future space mission objectives.

  16. Materials technology assessment for Stirling engines

    NASA Technical Reports Server (NTRS)

    Stephens, J. R.; Witzke, W. R.; Watson, G. K.; Johnston, J. R.; Croft, W. J.

    1977-01-01

    A materials technology assessment of high temperature components in the improved (metal) and advanced (ceramic) Stirling engines was undertaken to evaluate the current state-of-the-art of metals and ceramics, identify materials research and development required to support the development of automotive Stirling engines, and to recommend materials technology programs to assure material readiness concurrent with engine system development programs. The most critical component for each engine is identified and some of the material problem areas are discussed.

  17. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  18. A Centaur Reconnaissance Mission: a NASA JPL Planetary Science Summer Seminar mission design experience

    NASA Astrophysics Data System (ADS)

    Chou, L.; Howell, S. M.; Bhattaru, S.; Blalock, J. J.; Bouchard, M.; Brueshaber, S.; Cusson, S.; Eggl, S.; Jawin, E.; Marcus, M.; Miller, K.; Rizzo, M.; Smith, H. B.; Steakley, K.; Thomas, N. H.; Thompson, M.; Trent, K.; Ugelow, M.; Budney, C. J.; Mitchell, K. L.

    2017-12-01

    The NASA Planetary Science Summer Seminar (PSSS), sponsored by the Jet Propulsion Laboratory (JPL), offers advanced graduate students and recent doctoral graduates the unique opportunity to develop a robotic planetary exploration mission that answers NASA's Science Mission Directorate's Announcement of Opportunity for the New Frontiers Program. Preceded by a series of 10 weekly webinars, the seminar is an intensive one-week exercise at JPL, where students work directly with JPL's project design team, "Team X", on the process behind developing mission concepts through concurrent engineering, project design sessions, instrument selection, science traceability matrix development, and risk and cost management. The 2017 NASA PSSS team included 18 participants from various U.S. institutions with diverse backgrounds in science and engineering. We proposed a Centaur Reconnaissance Mission, named CAMILLA, designed to investigate the geologic state, surface evolution, composition, and ring system of Chariklo through a flyby and impact. Centaurs are minor planets whose semi-major axes lie between the orbits of Jupiter and Neptune. Chariklo is both the largest Centaur and the only known minor planet with rings. CAMILLA was designed to address high-priority cross-cutting themes defined in the National Research Council's Vision and Voyages for Planetary Science in the Decade 2013-2022. At the end of the seminar, the participants gave a final presentation to a review board of JPL scientists and engineers as well as NASA Headquarters executives. The feedback received on the strengths and weaknesses of our proposal provided a rich and valuable learning experience in how to design a successful NASA planetary exploration mission and generate a successful New Frontiers proposal. The NASA PSSS is an educational experience that trains the next generation of NASA's planetary explorers by bridging the gap between scientists and engineers, allowing participants to learn how to design a mission and build a spacecraft in a collaborative and fast-paced environment.

  19. Composite Crew Module: Primary Structure

    NASA Technical Reports Server (NTRS)

    Kirsch, Michael T.

    2011-01-01

    In January 2007, the NASA Administrator and the Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center to design, build, and test a full-scale crew module primary structure using carbon-fiber-reinforced, epoxy-based composite materials. The overall goal of the Composite Crew Module (CCM) project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing, in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. This report discusses the project management aspects of the project, including team organization, decision making, independent technical reviews, and the cost and schedule management approach.

  20. NASA Planetary Science Summer School: Longitudinal Study

    NASA Astrophysics Data System (ADS)

    Giron, Jennie M.; Sohus, A.

    2006-12-01

    NASA’s Planetary Science Summer School is a program designed to prepare the next generation of scientists and engineers to participate in future missions of solar system exploration. The opportunity is advertised to science and engineering post-doctoral and graduate students with a strong interest in careers in planetary exploration. Preference is given to U.S. citizens. The “school” consists of a one-week intensive team exercise in which participants learn the process of developing a robotic mission concept into reality through concurrent engineering, working with JPL’s Advanced Project Design Team (Team X). This program benefits the students by providing them with skills, knowledge, and the experience of collaborating on a concept mission design. A longitudinal study was conducted to assess the impact of the program on past participants. Data collected included their current contact information, whether they are currently part of the planetary exploration community, whether participation in the program contributed to any career choices, and whether the program benefited their career paths. Approximately 37% of 250 past participants responded to the online survey. Of these, 83% indicated that they are actively involved in planetary exploration or aerospace in general; 78% said they had been able to apply what they learned in the program to their current job or professional career; and 100% said they would recommend this program to a colleague.

  1. X-33 Attitude Control System Design for Ascent, Transition, and Entry Flight Regimes

    NASA Technical Reports Server (NTRS)

    Hall, Charles E.; Gallaher, Michael W.; Hendrix, Neal D.

    1998-01-01

    The Vehicle Control Systems Team at Marshall Space Flight Center, Systems Dynamics Laboratory, Guidance and Control Systems Division is designing, under a cooperative agreement with Lockheed Martin Skunkworks, the Ascent, Transition, and Entry flight attitude control system for the X-33 experimental vehicle. Ascent flight control begins at liftoff and ends at linear aerospike main engine cutoff (MECO), while Transition and Entry flight control begins at MECO and concludes at the terminal area energy management (TAEM) interface. TAEM occurs at approximately Mach 3.0. This task includes not only the design of the vehicle attitude control systems but also the development of requirements for attitude control system components and subsystems. The X-33 attitude control system design is challenged by a short design cycle, the design environment (Mach 0 to about Mach 15), and the X-33 incremental test philosophy. The X-33 design-to-launch cycle of less than 3 years requires a concurrent design approach, while the test philosophy requires design adaptation to vehicle variations that are a function of Mach number and mission profile. The flight attitude control system must deal with the mixing of aerosurfaces, reaction control thrusters, and linear aerospike engine control effectors, and it must handle parasitic effects such as vehicle flexibility and propellant sloshing from the uniquely shaped propellant tanks. The attitude control system design is, as usual, closely linked to many other subsystems and must deal with constraints and requirements from these subsystems.

  2. Multidisciplinary Design Technology Development: A Comparative Investigation of Integrated Aerospace Vehicle Design Tools

    NASA Technical Reports Server (NTRS)

    Renaud, John E.; Batill, Stephen M.; Brockman, Jay B.

    1998-01-01

    This research effort is a joint program between the Departments of Aerospace and Mechanical Engineering and the Computer Science and Engineering Department at the University of Notre Dame. Three principal investigators, Drs. Renaud, Brockman, and Batill, directed this effort. During the four-and-a-half-year grant period, six Aerospace and Mechanical Engineering Ph.D. students and one Masters student received full or partial support, while four Computer Science and Engineering Ph.D. students and one Masters student were supported. During each of the summers, up to four undergraduate students were involved in related research activities. The purpose of the project was to develop a framework and systematic methodology to facilitate the application of Multidisciplinary Design Optimization (MDO) to a diverse class of system design problems. For all practical aerospace systems, the design of a system is a complex sequence of events which integrates the activities of a variety of discipline "experts" and their associated "tools". The development, archiving, and exchange of information between these individual experts is central to the design task, and it is this information which provides the basis for these experts to make coordinated design decisions (i.e., compromises and trade-offs) - resulting in the final product design. Grant efforts focused on developing and evaluating frameworks for effective design coordination within an MDO environment. Central to these research efforts was the concept that the individual discipline "expert", using the most appropriate "tools" available and the most complete description of the system, should be empowered to have the greatest impact on the design decisions and final design. This means that the overall process must be highly interactive and efficiently conducted if the resulting design is to be developed in a manner consistent with cost and time requirements. The methods developed as part of this research effort include: extensions to a sensitivity-based Concurrent Subspace Optimization (CSSO) MDO algorithm; the development of a neural network response surface based CSSO-MDO algorithm; and the integration of distributed computing and process scheduling into the MDO environment. This report overviews research efforts in each of these focus areas. A complete bibliography of research produced with support of this grant is attached.

  3. Action Learning in Undergraduate Engineering Thesis Supervision

    ERIC Educational Resources Information Center

    Stappenbelt, Brad

    2017-01-01

    In the present action learning implementation, twelve action learning sets were conducted over eight years. The action learning sets consisted of students involved in undergraduate engineering research thesis work. The concurrent study accompanying this initiative investigated the influence of the action learning environment on student approaches…

  4. Bioreactor System Using Noninvasive Imaging and Mechanical Stretch for Biomaterial Screening

    PubMed Central

    Kluge, Jonathan A.; Leisk, Gary G.; Cardwell, Robyn S.; Fernandes, Alexander P.; House, Michael; Ward, Andrew; Dorfmann, A. Luis; Kaplan, David L.

    2012-01-01

    Screening biomaterial and tissue systems in vitro, for guidance of performance in vivo, remains a major requirement in the field of tissue engineering. It is critical to understand how culture stimulation affects both tissue construct maturation and function, with the goal of eliminating resource-intensive trial-and-error screening and better matching specifications for various in vivo needs. We present a multifunctional and robust bioreactor design that addresses this need. The design enables a range of mechanical inputs, durations, and frequencies to be applied in coordination with noninvasive optical assessments. A variety of biomaterial systems, including micro- and nano-fiber and porous sponge biomaterials, as well as cell-laden tissue engineering constructs were used in validation studies in order to demonstrate the versatility and utility of this new bioreactor design. The silk-based biomaterials highlighted in these studies offered several unique optical signatures for use in label-free nondestructive imaging that allowed for sequential profiling. Both short- and long-term culture studies were conducted to evaluate several practical scenarios of usage: on a short-term basis, we demonstrate that construct cellularity can be monitored by usage of nonpermanent dyes; on a more long-term basis, we show that cell ingrowth can be monitored by GFP-labeling and construct integrity probed with concurrent load/displacement data. The ability to nondestructively track cells, biomaterials, and new matrix formation without harvesting designated samples at each time point will lead to less resource-intensive studies and should enhance our understanding and the discovery of biomaterial designs related to functional tissue engineering. PMID:21298345

  5. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    NASA Technical Reports Server (NTRS)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters from basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model with regard to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data from completed NASA missions.
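
    As an illustration of the kind of top-down estimate described above, the sketch below fits a log-linear model of cost against a few mission characteristics and evaluates it for a new mission. The feature set, data values, and resulting coefficients are invented placeholders; the actual COMPASS model and its database are not given in this record.

```python
# Hypothetical sketch of a top-down parametric estimate: fit log(cost)
# as a linear function of basic mission characteristics via least squares.
# Data values are invented placeholders, not COMPASS database entries.
import numpy as np

# Columns: log(payload mass, kg), log(delta-v, km/s), duration (yr)
X = np.array([
    [np.log(150), np.log(3.2), 2.0],
    [np.log(400), np.log(5.1), 4.0],
    [np.log(80),  np.log(1.8), 1.0],
    [np.log(600), np.log(7.0), 6.0],
])
y = np.log([210.0, 580.0, 95.0, 900.0])   # cost in $M (invented)

# Add an intercept column and solve the least-squares problem.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def estimate_cost(payload_kg, delta_v_kms, duration_yr):
    """Rough-order-of-magnitude cost from basic mission characteristics."""
    features = np.array([1.0, np.log(payload_kg),
                         np.log(delta_v_kms), duration_yr])
    return float(np.exp(features @ coef))

print(f"Estimated cost: ${estimate_cost(250, 4.0, 3.0):.0f}M")
```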

  6. ACSYNT inner loop flight control design study

    NASA Technical Reports Server (NTRS)

    Bortins, Richard; Sorensen, John A.

    1993-01-01

    The NASA Ames Research Center developed the Aircraft Synthesis (ACSYNT) computer program to synthesize conceptual future aircraft designs and to evaluate critical performance metrics early in the design process before significant resources are committed and cost decisions made. ACSYNT uses steady-state performance metrics, such as aircraft range, payload, and fuel consumption, and static performance metrics, such as the control authority required for the takeoff rotation and for landing with an engine out, to evaluate conceptual aircraft designs. It can also optimize designs with respect to selected criteria and constraints. Many modern aircraft have stability provided by the flight control system rather than by the airframe. This may allow the aircraft designer to increase combat agility, or decrease trim drag, for increased range and payload. This strategy requires concurrent design of the airframe and the flight control system, making trade-offs of performance and dynamics during the earliest stages of design. ACSYNT presently lacks means to implement flight control system designs but research is being done to add methods for predicting rotational degrees of freedom and control effector performance. A software module to compute and analyze the dynamics of the aircraft and to compute feedback gains and analyze closed loop dynamics is required. The data gained from these analyses can then be fed back to the aircraft design process so that the effects of the flight control system and the airframe on aircraft performance can be included as design metrics. This report presents results of a feasibility study and the initial design work to add an inner loop flight control system (ILFCS) design capability to the stability and control module in ACSYNT. The overall objective is to provide a capability for concurrent design of the aircraft and its flight control system, and enable concept designers to improve performance by exploiting the interrelationships between aircraft and flight control system design parameters.

  7. Concurrent Design used in the Design of Space Instruments

    NASA Technical Reports Server (NTRS)

    Oxnevad, Knut I.

    1998-01-01

    At the Project Design Center at the Jet Propulsion Laboratory, a concurrent design environment is under development for supporting the development and analysis of space instruments in the early, conceptual design phases. This environment is being utilized by Team I, a multidisciplinary group of experts providing study and proposal support. To provide the required support, the Team I concurrent design environment features effectively interconnected high-end optics, CAD, and thermal design and analysis tools. Innovative approaches for linking tools and for transferring files between applications have been implemented. These approaches, together with effective sharing of geometry between the optics, CAD, and thermal tools, are already showing significant time savings.

  8. 7 CFR 1794.10 - Applicant responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... prepare the applicable environmental documentation concurrent with a proposed action's engineering... AGRICULTURE (CONTINUED) ENVIRONMENTAL POLICIES AND PROCEDURES Implementation of the National Environmental...

  9. Software engineering aspects of real-time programming concepts

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1986-08-01

    Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics, and operating systems development. General concepts of concurrent programming and constructs for process synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions, or real-time languages like Concurrent PASCAL, MODULA, CHILL, and ADA are explained and compared with each other. The second part deals with the structuring and modularization of technical processes to build reliable and maintainable real-time systems. Software quality and software engineering aspects are considered throughout the paper.
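
    As a concrete instance of the semaphore-based synchronization constructs surveyed above, the following minimal producer/consumer sketch uses Python threads in place of the real-time languages named in the abstract; the structure (a counting semaphore signalling availability plus a mutex guarding the shared buffer) is the textbook pattern, not code from the paper.

```python
# Minimal producer/consumer sketch of semaphore-based synchronization.
import threading
from collections import deque

buffer = deque()
items = threading.Semaphore(0)   # counts filled slots
mutex = threading.Lock()         # guards the shared buffer

def producer():
    for i in range(5):
        with mutex:
            buffer.append(i)
        items.release()          # signal: one more item available

def consumer():
    for _ in range(5):
        items.acquire()          # block until an item is available
        with mutex:
            print("consumed", buffer.popleft())

threads = [threading.Thread(target=producer),
           threading.Thread(target=consumer)]
for t in threads: t.start()
for t in threads: t.join()
```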

  10. Concurrent and Collaborative Engineering Implementation in an R and D Organization

    NASA Technical Reports Server (NTRS)

    DelRosario, Ruben; Davis, Jose M.; Keys, L. Ken

    2003-01-01

    Concurrent Engineering (CE) and Collaborative Engineering (or Collaborative Product Development - CPD) have emerged as new paradigms with significant impact on the development of new products and processes. With documented and substantiated success in the automotive and technology industries, CE and, most recently, CPD are being touted as innovative management philosophies for many other business sectors, including Research and Development. This paper introduces two independent research initiatives conducted at the NASA Glenn Research Center (GRC) in Cleveland, Ohio investigating the application of CE and CPD in an R&D environment. Since little research has been conducted on the use of CE and CPD in sectors other than high-mass-production manufacturing, the objective of these independent studies is to provide a systematic evaluation of the applicability of these paradigms (concurrent and collaborative) in a low/no-production, service environment, in particular R&D.

  11. Using Life-Cycle Human Factors Engineering to Avoid $2.4 Million in Costs: Lessons Learned from NASA's Requirements Verification Process for Space Payloads

    NASA Technical Reports Server (NTRS)

    Carr, Daniel; Ellenberger, Rich

    2008-01-01

    The Human Factors Implementation Team (HFIT) process has been used to verify human factors requirements for NASA International Space Station (ISS) payloads since 2003, resulting in $2.4 million in avoided costs. This cost benefit has been realized by greatly reducing the need to process time-consuming formal waivers (exceptions) for individual requirements violations. The HFIT team, which includes astronauts and their technical staff, acts as the single source for human factors requirements integration of payloads. HFIT has the authority to provide inputs during early design phases, thus eliminating many potential requirements violations in a cost-effective manner. In those instances where it is not economically or technically feasible to meet the precise metric of a given requirement, HFIT can work with the payload engineers to develop common sense solutions and formally document that the resulting payload design does not materially affect the astronaut's ability to operate and interact with the payload. The HFIT process is fully ISO 9000 compliant and works concurrently with NASA's formal systems engineering work flow. Due to its success with payloads, the HFIT process is being adapted and extended to ISS systems hardware. Key aspects of this process are also being considered for NASA's Space Shuttle replacement, the Crew Exploration Vehicle.

  12. Career and Workforce Impacts of the NASA Planetary Science Summer School: TEAM X model 1999-2015

    NASA Astrophysics Data System (ADS)

    Lowes, Leslie L.; Budney, Charles; Mitchell, Karl; Wessen, Alice; JPL Education Office, JPL Team X

    2016-10-01

    Sponsored by NASA's Planetary Science Division, and managed by the Jet Propulsion Laboratory (JPL), the Planetary Science Summer School prepares the next generation of engineers and scientists to participate in future solar system exploration missions. PSSS utilizes JPL's concurrent mission design team, "Team X", as mentors. With this model, participants learn the mission life cycle, the roles of scientists and engineers in a mission environment, mission design interconnectedness and trade-offs, and the importance of teamwork. Applicants are sought who have a strong interest and experience in careers in planetary exploration, and who are science and engineering post-docs, recent PhDs, doctoral or graduate students, or faculty teaching such students. An overview of the program will be presented, along with the results of a diversity study conducted in fall 2015 to assess the gender and ethnic diversity of participants since 1999. PSSS seeks to have a positive influence on participants' career choice and career progress, and to help feed the employment pipeline for NASA, aerospace, and related academia. Results will also be presented from an online search that located alumni in fall 2015 and identified their current occupations (primarily through LinkedIn and university and corporate websites), as well as from a 2015 survey of alumni.

  13. The Development and Validation of a Life Experience Inventory for the Identification of Creative Electrical Engineers.

    ERIC Educational Resources Information Center

    Michael, William B.; Colson, Kenneth R.

    1979-01-01

    The construction and validation of the Life Experience Inventory (LEI) for the identification of creative electrical engineers are described. Using the number of patents held or pending as a criterion measure, the LEI was found to have high concurrent validity. (JKS)

  14. High-throughput state-machine replication using software transactional memory.

    PubMed

    Zhao, Wenbing; Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin

    2016-11-01

    State-machine replication is a common way of constructing general-purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this can severely limit system throughput. The issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing those requests concurrently. However, identifying and tracking non-conflicting requests requires intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit of using software transactional memory in state-machine replication is that general-purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replication with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory, namely, ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that the speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload; the conventional timestamp-based multiversion concurrency control offers the worst performance, due to a high abort rate in the presence of even moderate contention between transactions. The ordered strong strict two-phase locking mechanism offers the simplest solution, with excellent performance in low-contention workloads and fairly good performance in high-contention workloads.
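
    The ordered-locking idea in this abstract can be sketched compactly: replicas agree on a total order of requests, and a request may execute as soon as it reaches the head of the lock queue for every datum it touches, so requests on disjoint data run concurrently while conflicting ones serialize in the agreed order. The sketch below is a simplified illustration of that scheduling rule, not the paper's implementation; all names are hypothetical.

```python
# Minimal sketch of ordered locking for state-machine replication:
# requests on disjoint keys run concurrently; conflicting requests
# serialize in the replica-agreed total order.
import threading
from collections import defaultdict, deque

state = defaultdict(int)
key_queues = defaultdict(deque)      # per-key FIFO of request ids
cond = threading.Condition()

def build_queues(requests):
    # Enqueue request ids per key, in the agreed total order.
    for rid, keys, _ in requests:
        for k in keys:
            key_queues[k].append(rid)

def execute(rid, keys, fn):
    with cond:
        # Block until this request heads every relevant key queue.
        cond.wait_for(lambda: all(key_queues[k][0] == rid for k in keys))
    fn(state)                        # exclusive access to these keys
    with cond:
        for k in keys:
            key_queues[k].popleft()  # release the keys, in order
        cond.notify_all()

requests = [
    (1, ["a"], lambda s: s.__setitem__("a", s["a"] + 1)),
    (2, ["b"], lambda s: s.__setitem__("b", s["b"] + 10)),  # concurrent with 1
    (3, ["a", "b"], lambda s: s.__setitem__("a", s["a"] + s["b"])),
]
build_queues(requests)
threads = [threading.Thread(target=execute, args=r) for r in requests]
for t in threads: t.start()
for t in threads: t.join()
print(dict(state))                   # {'a': 11, 'b': 10} on every replica
```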

  15. High-throughput state-machine replication using software transactional memory

    PubMed Central

    Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin

    2017-01-01

    State-machine replication is a common way of constructing general-purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this can severely limit system throughput. The issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing those requests concurrently. However, identifying and tracking non-conflicting requests requires intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit of using software transactional memory in state-machine replication is that general-purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replication with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory, namely, ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that the speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload; the conventional timestamp-based multiversion concurrency control offers the worst performance, due to a high abort rate in the presence of even moderate contention between transactions. The ordered strong strict two-phase locking mechanism offers the simplest solution, with excellent performance in low-contention workloads and fairly good performance in high-contention workloads. PMID:29075049

  16. DeMAID/GA an Enhanced Design Manager's Aid for Intelligent Decomposition

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1996-01-01

    Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One such tool is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial public release of DeMAID in 1989, much research has been done in the areas of decomposition, concurrent engineering, parallel processing, and process management, and many new tools and techniques have emerged. Based on these recent research and development efforts, numerous enhancements have been added to DeMAID to further aid the design manager in saving both cost and time in a design cycle. The key enhancement, a genetic algorithm (GA), will be available in the next public release, called DeMAID/GA. The GA sequences the design processes to minimize the cost and time required to converge to a solution. The major enhancements in the upgrade of DeMAID to DeMAID/GA are discussed in this paper. A sample conceptual design project is used to show how these enhancements can be applied to improve the design cycle.
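
    A toy version of such a sequencing GA is sketched below: candidate orderings of design processes are permutations, fitness counts feedback couplings (a process needing input from one scheduled after it), and order-preserving crossover plus swap mutation search for an ordering with few feedbacks. The dependency data and GA settings are invented for illustration and are not DeMAID/GA's actual encoding.

```python
# Toy genetic algorithm ordering design processes to minimize feedback
# couplings. Dependencies and GA settings are hypothetical.
import random
random.seed(1)

# deps[i] = processes whose output process i needs
deps = {0: [2], 1: [0], 2: [4], 3: [1, 2], 4: []}

def feedbacks(order):
    """Count dependencies on processes scheduled later (feedbacks)."""
    pos = {p: i for i, p in enumerate(order)}
    return sum(1 for p in order for d in deps[p] if pos[d] > pos[p])

def crossover(a, b):
    """Order crossover: keep a prefix of a, fill the rest in b's order."""
    cut = random.randrange(1, len(a))
    head = a[:cut]
    return head + [p for p in b if p not in head]

pop = [random.sample(range(5), 5) for _ in range(20)]
for _ in range(50):                                  # generations
    pop.sort(key=feedbacks)
    survivors = pop[:10]                             # elitist selection
    children = [crossover(random.choice(survivors), random.choice(survivors))
                for _ in range(10)]
    for c in children:                               # swap mutation
        if random.random() < 0.3:
            i, j = random.sample(range(5), 2)
            c[i], c[j] = c[j], c[i]
    pop = survivors + children

best = min(pop, key=feedbacks)
print(best, "feedback couplings:", feedbacks(best))
```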

  17. Thrust reverser design studies for an over-the-wing STOL transport

    NASA Technical Reports Server (NTRS)

    Ammer, R. C.; Sowers, H. D.

    1977-01-01

    Aerodynamic and acoustic analytical studies were conducted to evaluate three thrust reverser designs for potential use on commercial over-the-wing STOL transports. The concepts were: (1) an integral D nozzle/target reverser, (2) an integral D nozzle/top arc cascade reverser, and (3) a post-exit target reverser integral with the wing. Aerodynamic flowpaths and kinematic arrangements for each concept were established to provide a 50% thrust reversal capability. Analytical aircraft stopping-distance/noise trade studies conducted concurrently with the flowpath design showed that these high-efficiency reverser concepts can be employed at substantially reduced power settings to meet noise goals of 100 PNdB on a 152.4 m sideline and still meet 609.6 m landing runway length requirements. From an overall installation standpoint, only the integral D nozzle/target reverser concept was found to penalize nacelle cruise performance; for this concept, a larger nacelle diameter was required to match the engine cycle's effective area demand in reverse thrust.

  18. Highlights of X-Stack ExM Deliverable: MosaStore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripeanu, Matei

    2016-07-20

    This brief report highlights the experience gained with MosaStore, an exploratory part of the X-Stack project “ExM: System support for extreme-scale, many-task applications”. The ExM project proposed to use concurrent workflows supported by the Swift language and runtime as an innovative programming model to exploit parallelism in exascale computers. MosaStore aims to support this endeavor by improving storage support for workflow-based applications, more precisely by exploring the gains that can be obtained from co-designing the storage system and the workflow runtime engine. MosaStore has been developed primarily at the University of British Columbia.

  19. Cost Validation Using PRICE H

    NASA Technical Reports Server (NTRS)

    Jack, John; Kwan, Eric; Wood, Milana

    2011-01-01

    PRICE H was introduced into the JPL cost estimation tool set circa 2003. It became more widely available at JPL when the IPAO funded a NASA-wide site license for all NASA centers. PRICE H was mainly used as one of the cost tools to validate proposal grassroots cost estimates. Program offices at JPL view PRICE H as an additional crosscheck to Team X (JPL Concurrent Engineering Design Center) estimates. PRICE H became widely accepted circa 2007 at JPL, when the program offices moved away from grassroots cost estimation for Step 1 proposals. PRICE H is now one of the key cost tools used for cost validation, cost trades, and independent cost estimates.

  20. Structural optimization by multilevel decomposition

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; James, B.; Dovi, A.

    1983-01-01

    A method is described for decomposing an optimization problem into a set of subproblems and a coordination problem which preserves coupling between the subproblems. The method is introduced as a special case of multilevel, multidisciplinary system optimization, and its algorithm is fully described for two-level optimization of structures assembled of finite elements of arbitrary type. Numerical results are given for an example of a framework to show that the decomposition method converges and yields results comparable to those obtained without decomposition. It is pointed out that optimization by decomposition should reduce the design time by allowing groups of engineers, using different computers, to work concurrently on the same large problem.
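
    The flavor of such a two-level scheme can be shown in a few lines: a coordination problem chooses a coupling variable (here, how a load is split between two substructures), and each subproblem is sized independently (and, in practice, concurrently) for its share. The structural models, constants, and grid-search coordinator below are invented placeholders, not the paper's finite-element formulation.

```python
# Minimal two-level decomposition sketch: the system level coordinates a
# shared variable (the load split); each subproblem sizes one substructure
# for its share of the load. All models and numbers are hypothetical.

def size_substructure(load, weight_coeff):
    """Subproblem: smallest area carrying `load` at an allowable stress
    of 100 (hypothetical units); weight grows like area**1.5."""
    area = load / 100.0
    return weight_coeff * area ** 1.5

def total_weight(split, total_load=1000.0):
    """Coordination objective: combined weight for a given load split.
    The two subproblems are independent and could run concurrently."""
    return (size_substructure(split * total_load, 2.0)
            + size_substructure((1.0 - split) * total_load, 3.0))

# Coordination problem solved here by brute-force search over the split.
best_split = min((i / 1000.0 for i in range(1001)), key=total_weight)
print(f"optimal split: {best_split:.3f}, "
      f"total weight: {total_weight(best_split):.2f}")
```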

  1. Advancement of Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Emiley, Mark S.; Agte, Jeremy S.; Sandusky, Robert R., Jr.

    2000-01-01

    Bi-Level Integrated System Synthesis (BLISS) is a method for optimization of an engineering system, e.g., an aerospace vehicle. BLISS consists of optimizations at the subsystem (module) and system levels to divide the overall large optimization task into sets of smaller ones that can be executed concurrently. In the initial version of BLISS that was introduced and documented in previous publications, analysis in the modules was kept at the early conceptual design level. This paper reports on the next step in the BLISS development in which the fidelity of the aerodynamic drag and structural stress and displacement analyses were upgraded while the method's satisfactory convergence rate was retained.

  2. Fully Integral, Flexible Composite Driveshaft

    NASA Technical Reports Server (NTRS)

    Lawrie, Duncan

    2014-01-01

    An all-composite driveshaft incorporating integral flexible diaphragms was developed for prime contractor testing. This new approach makes obsolete the split lines required to attach metallic flex elements and either metallic or composite spacing tubes in current solutions. Subcritical driveshaft weights can be achieved that are half those of incumbent technology for typical rotary-wing shaft lengths. The spacing tubes form an integral part of the initial tooling but remain part of the finished shaft and control natural frequencies and torsional stability. A concurrently engineered manufacturing process and a design for performance compete with incumbent solutions at significantly lower weight and with the probability of improved damage tolerance and fatigue life.

  3. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than those of serial computers. Developing large-scale software for a variety of MIMD computers is difficult and expensive, so there is a need for tools that facilitate programming these machines. First, the issues that must be considered in developing such tools are examined. The two main areas of concern are architecture independence and data management. Architecture-independent software facilitates software portability and improves the longevity and utility of the software product; it provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems, and it must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture-independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture-independent software; identifying and exploiting concurrency within the application program; data coherence; and engineering database and memory management.

  4. Structural Design Exploration of an Electric Powered Multi-Propulsor Wing Configuration

    NASA Technical Reports Server (NTRS)

    Moore, James B.; Cutright, Steve

    2017-01-01

    Advancements in aircraft electric propulsion may enable an expanded operational envelope for electrically powered vehicles compared to their internal combustion engine counterparts. High aspect ratio wings provide additional lift and drag reduction for a proposed multi-propulsor design; however, the challenge is to reduce the weight of the wing structure while maintaining adequate structural and aeroelastic margins. Design exploration using a conventional design-and-build philosophy coupled with a finite element method (FEM)-based design of experiments (DOE) strategy is presented to examine high aspect ratio wing structures that have spanwise distributed electric motors. Multiple leading-edge-mounted engine masses presented a challenge to designing a wing within acceptable limits for dynamic and aeroelastic stability. Because the first four primary bending eigenmodes of the proposed wing structure are very sensitive to outboard motor placement, safety-of-flight requirements drove the need for multiple spars, rib attachments, and outboard structural reinforcements in the design. Global aeroelasticity became an increasingly important design constraint during the ongoing design process, with outboard motor pod flutter ultimately becoming a primary design constraint. Designers successively generated models to examine stress, dynamics, and aeroelasticity concurrently. This research specifically addressed satisfying multidisciplinary design criteria to generate fluid-structure interaction solution sets, and it produced high aspect ratio primary structure designs for the NASA Scalable Convergent Electric Propulsion Technology and Operations Research (SCEPTOR) project in the Aeronautics Research Mission Directorate at NASA. In this paper, a dynamics-driven, quasi-inverse design methodology is presented to address the aerodynamic performance goals and structural challenges encountered for the SCEPTOR demonstrator vehicle. These results are compared with those of a traditional computer-aided design (CAD)-based approach.

  5. Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities

    NASA Astrophysics Data System (ADS)

    Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi

    2017-04-01

    Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years. Recently, much of the relevant research has focused on the development of integrated solutions: collaborative optimisation of the geographical, just-in-time (JIT), quality (customer demand/satisfaction), and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as enterprise resource planning (ERP) and SCM information technology (IT) enablers to enhance integrated product development and concurrent engineering principles. This article draws on three main organisation theory applications to position its assumptions, proposing a feasible industry-specific framework not currently included within level four (the implementation level) of the SCOR model, nor within other existing SCM integration reference models such as the Process Interchange Format (PIF) of the MIT Process Handbook or the TOVE project, and one that could also be replicated in other supply chains. The wider contribution of this paper, however, is a complementary framework proposed as an extension to the SCC's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to data collected from a qualitative empirical real-life industrial pilot case study, were used to propose a conceptual concurrent enterprise framework for SCM network activities. The research adopts a design structure matrix simulation approach to propose an optimal enterprise SCM-networked, value-adding, customised master-data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design team structure. Furthermore, social network theory analysis is employed, in triangulation with statistical correlation analysis, to assess the frequency, importance, level of collaboration, and mutual trust, as well as the roles and responsibilities, within the enterprise SCM network for systems product development (PD) design teams' technical communication, together with extensive literature reviews.
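
    The design structure matrix step mentioned above can be illustrated with a small partitioning sketch: tasks whose inputs are already scheduled are peeled off in sequence, and whatever remains is a coupled block whose members must exchange information and iterate concurrently. The task names and dependencies below are hypothetical, and real DSM tools use more sophisticated partitioning and tearing algorithms.

```python
# Small design structure matrix (DSM) partitioning sketch. Each task maps
# to the set of tasks it needs input from; dependencies are hypothetical.
dsm = {
    "requirements": set(),
    "structures":   {"requirements", "thermal"},   # coupled with thermal
    "thermal":      {"structures"},                # coupled with structures
    "propulsion":   {"requirements"},
}

ordered, remaining = [], dict(dsm)
while remaining:
    # A task is ready when all of its inputs are already scheduled.
    ready = [t for t, inputs in remaining.items() if inputs <= set(ordered)]
    if not ready:        # nothing is ready: the rest form a feedback loop
        break
    for t in ready:
        ordered.append(t)
        del remaining[t]

print("sequential order:", ordered)            # requirements, propulsion
print("coupled block:", sorted(remaining))     # structures <-> thermal
```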

  6. Enhanced identification of eligibility for depression research using an electronic medical record search engine.

    PubMed

    Seyfried, Lisa; Hanauer, David A; Nease, Donald; Albeiruti, Rashad; Kavanagh, Janet; Kales, Helen C

    2009-12-01

    Electronic medical records (EMRs) have become part of daily practice for many physicians. Attempts have been made to apply electronic search engine technology to speed EMR review. This was a prospective, observational study to compare the speed and clinical accuracy of a medical record search engine vs. manual review of the EMR. Three raters reviewed 49 cases in the EMR to screen for eligibility in a depression study using the electronic medical record search engine (EMERSE). One week later raters received a scrambled set of the same patients including 9 distractor cases, and used manual EMR review to determine eligibility. For both methods, accuracy was assessed for the original 49 cases by comparison with a gold standard rater. Use of EMERSE resulted in considerable time savings; chart reviews using EMERSE were significantly faster than traditional manual review (p=0.03). The percent agreement of raters with the gold standard (e.g. concurrent validity) using either EMERSE or manual review was not significantly different. Using a search engine optimized for finding clinical information in the free-text sections of the EMR can provide significant time savings while preserving clinical accuracy. The major power of this search engine is not from a more advanced and sophisticated search algorithm, but rather from a user interface designed explicitly to help users search the entire medical record in a way that protects health information.

  7. Enhanced Identification of Eligibility for Depression Research Using an Electronic Medical Record Search Engine

    PubMed Central

    Seyfried, Lisa; Hanauer, David; Nease, Donald; Albeiruti, Rashad; Kavanagh, Janet; Kales, Helen C.

    2009-01-01

    Purpose Electronic medical records (EMR) have become part of daily practice for many physicians. Attempts have been made to apply electronic search engine technology to speed EMR review. This was a prospective, observational study to compare the speed and accuracy of electronic search engine vs. manual review of the EMR. Methods Three raters reviewed 49 cases in the EMR to screen for eligibility in a depression study using the electronic search engine (EMERSE). One week later raters received a scrambled set of the same patients including 9 distractor cases, and used manual EMR review to determine eligibility. For both methods, accuracy was assessed for the original 49 cases by comparison with a gold standard rater. Results Use of EMERSE resulted in considerable time savings; chart reviews using EMERSE were significantly faster than traditional manual review (p=0.03). The percent agreement of raters with the gold standard (e.g. concurrent validity) using either EMERSE or manual review was not significantly different. Conclusions Using a search engine optimized for finding clinical information in the free-text sections of the EMR can provide significant time savings while preserving reliability. The major power of this search engine is not from a more advanced and sophisticated search algorithm, but rather from a user interface designed explicitly to help users search the entire medical record in a way that protects health information. PMID:19560962

  8. Concurrent Engineering for the Management of Research and Development

    NASA Technical Reports Server (NTRS)

    DelRosario, Ruben; Petersen, Paul F.; Keys, L. Ken; Chen, Injazz J.

    2004-01-01

    The management of research and development (R&D) is facing the challenges of reducing the time from R&D to customer, reducing the cost of R&D, providing higher accountability for results (improved quality), and increasing focus on customers. Concurrent engineering (CE) has shown great success in the automotive and technology industries, resulting in significant decreases in cycle time, reductions in total cost, and increases in quality and reliability. This philosophy of concurrency can have similar implications or benefits for the management of R&D organizations. Since most studies on the application of CE have been performed in manufacturing environments, research into the benefits of CE in other environments is needed. This paper presents research conducted at the NASA Glenn Research Center (GRC) investigating the application of CE in the management of an R&D organization. In particular, the paper emphasizes possible barriers and enhancers that this environment presents to the successful implementation of CE. Preliminary results and recommendations are based on a series of interviews and subsequent surveys, from which data have been gathered and analyzed as part of GRC's Continuous Improvement Process.

  9. The Rapid Response Radiation Survey (R3S) Mission Using the HISat Conformal Satellite Architecture

    NASA Technical Reports Server (NTRS)

    Miller, Nathanael

    2015-01-01

    The Rapid Response Radiation Survey (R3S) experiment, designed as a quick-turnaround mission to make radiation measurements in LEO, will fly as a hosted payload in partnership with NovaWurks using their Hyper-integrated Satlet (HiSat) architecture. The need for the mission arises as the Nowcast of Atmospheric Ionization Radiation for Aviation Safety (NAIRAS) model moves from a research effort into an operational radiation assessment tool. The data collected by R3S, in addition to the complementary data from a NASA Langley Research Center (LaRC) atmospheric balloon mission entitled Radiation Dosimetry Experiment (RaDX), will validate the exposure prediction capabilities of NAIRAS. This paper discusses the development of the R3S experiment as made possible by use of the HiSat architecture. The system design and operational modes of the experiment are described, as well as the experiment's interfaces to the HiSat satellite via the user-defined adapter (UDA) provided by NovaWurks. This paper outlines the steps taken by the project to execute the R3S mission in 4 months of design, build, and test. Finally, a description of the engineering process is provided, including the use of facilitated rapid/concurrent engineering sessions, the associated documentation, and the review process employed.

  10. NASA's Gravitational - Wave Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Stebbins, Robin; Jennrich, Oliver; McNamara, Paul

    2012-01-01

    With the conclusion of the NASA/ESA partnership on the Laser Interferometer Space Antenna (LISA) Project, NASA initiated a study to explore mission concepts that would accomplish some or all of the LISA science objectives at lower cost. The Gravitational-Wave Mission Concept Study consisted of a public Request for Information (RFI), a Core Team of NASA engineers and scientists, a Community Science Team, a Science Task Force, and an open workshop. The RFI yielded 12 mission concepts, 3 instrument concepts, and 2 technologies. The responses ranged from concepts that eliminated the drag-free test mass of LISA to concepts that replace the test mass with an atom interferometer. The Core Team reviewed the noise budgets and sensitivity curves; the payload and spacecraft designs and requirements; orbits and trajectories; and technical readiness and risk. The Science Task Force assessed the science performance by calculating the detection horizons, the detection rates, and the accuracy of astrophysical parameter estimation for massive black hole mergers, stellar-mass compact objects inspiraling into central engines, and close compact binary systems. Three mission concepts have been studied by Team X, JPL's concurrent design facility, to define a conceptual design, evaluate key performance parameters, assess risk, and estimate cost and schedule. The study results are summarized.

  11. Multidisciplinary collaboration as a sustainable research model for device development.

    PubMed

    Chandra, Ankur

    2013-02-01

    The concurrent problems of research sustainability and decreased clinician involvement with medical device development can be jointly addressed through a novel, multidisciplinary solution. The University of Rochester Cardiovascular Device Design Program is a sustainable program in medical device design supported through a collaboration between the Schools of Medicine and Engineering. This article provides a detailed description of the motivation for starting the program, the current structure of the program, the methods of financial sustainability, and the direct impact it intends to have on the national vascular surgery community. The further expansion of this program and encouragement for development of similar programs throughout the country aims to address many of our current challenges in both research funding and device development education. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  12. Multidisciplinary Design Technology Development: A Comparative Investigation of Integrated Aerospace Vehicle Design Tools

    NASA Technical Reports Server (NTRS)

    Renaud, John E.; Batill, Stephen M.; Brockman, Jay B.

    1999-01-01

    This research effort is a joint program between the Departments of Aerospace and Mechanical Engineering and the Computer Science and Engineering Department at the University of Notre Dame. The purpose of the project was to develop a framework and systematic methodology to facilitate the application of Multidisciplinary Design Optimization (MDO) to a diverse class of system design problems. For all practical aerospace systems, the design of a system is a complex sequence of events which integrates the activities of a variety of discipline "experts" and their associated "tools". The development, archiving, and exchange of information between these individual experts is central to the design task, and it is this information which provides the basis for these experts to make coordinated design decisions (i.e., compromises and trade-offs) - resulting in the final product design. Grant efforts focused on developing and evaluating frameworks for effective design coordination within an MDO environment. Central to these research efforts was the concept that the individual discipline "expert", using the most appropriate "tools" available and the most complete description of the system, should be empowered to have the greatest impact on the design decisions and final design. This means that the overall process must be highly interactive and efficiently conducted if the resulting design is to be developed in a manner consistent with cost and time requirements. The methods developed as part of this research effort include: extensions to a sensitivity-based Concurrent Subspace Optimization (CSSO) MDO algorithm; the development of a neural network response surface based CSSO-MDO algorithm; and the integration of distributed computing and process scheduling into the MDO environment. This report overviews research efforts in each of these focus areas. A complete bibliography of research produced with support of this grant is attached.

  13. Concurrent Formative Evaluation: Guidelines and Implications for Multimedia Designers.

    ERIC Educational Resources Information Center

    Northrup, Pamela Taylor

    1995-01-01

    Discusses formative evaluation for multimedia instruction and presents guidelines for formatively evaluating multimedia instruction concurrent with analysis, design, and development. Data collection criteria that include group involvement, data collection strategies, and information to be gathered are presented, and rapid prototypes and…

  14. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
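    A minimal sketch of the 'real-time systems-analysis object' idea, read as an object that owns a set of states and state-transition rules; the valve example and all identifiers are invented for illustration.

        # A systems-analysis object with its own time-behavior defined by states
        # and state-transition rules. The valve example is an invented placeholder.
        class AnalysisObject:
            def __init__(self, name, initial, transitions):
                self.name = name
                self.state = initial
                self.transitions = transitions     # {(state, event): next_state}

            def signal(self, event):
                key = (self.state, event)
                if key in self.transitions:        # fire a matching transition rule
                    self.state = self.transitions[key]
                return self.state

        valve = AnalysisObject("inlet_valve", "closed",
                               {("closed", "open_cmd"): "opening",
                                ("opening", "limit_switch"): "open",
                                ("open", "close_cmd"): "closed"})
        print(valve.signal("open_cmd"))       # -> opening
        print(valve.signal("limit_switch"))   # -> open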

  15. Concurrent Engineering through Product Data Standards

    DTIC Science & Technology

    1991-05-01

    standards, represents the power of a new industrial revolution. The role of the NIST National PDES testbed, providing technical leadership and a testing-based foundation for the development of STEP, is described.

  16. Fuel quantity modulation in pilot ignited engines

    DOEpatents

    May, Andrew

    2006-05-16

    An engine system includes a first fuel regulator adapted to control an amount of a first fuel supplied to the engine, a second fuel regulator adapted to control an amount of a second fuel supplied to the engine concurrently with the first fuel, and a controller coupled to at least the second fuel regulator. In steady state engine operation, the controller determines the amount of the second fuel in a relationship to the amount of the first fuel chosen so that the second fuel ignites the first fuel at a specified time; in transient engine operation, it determines the amount of the second fuel in a manner different from that steady state relationship.
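    The control law in this abstract can be sketched as follows; the linear pilot-fuel map, the rate threshold, and the transient enrichment are invented placeholders, not values from the patent.

        # Pilot (second) fuel quantity tracks the primary (first) fuel through a
        # steady-state relationship, but is computed differently during transients.
        def pilot_fuel_quantity(primary_fuel, primary_fuel_rate, rate_threshold=5.0):
            steady_state_amount = 0.02 * primary_fuel + 0.1   # assumed calibration map
            if abs(primary_fuel_rate) < rate_threshold:       # steady state operation
                return steady_state_amount
            # transient operation: depart from the steady-state relationship
            return steady_state_amount * 1.5                  # assumed enrichment

        print(pilot_fuel_quantity(primary_fuel=40.0, primary_fuel_rate=0.0))    # steady state
        print(pilot_fuel_quantity(primary_fuel=40.0, primary_fuel_rate=12.0))   # transient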

  17. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.

  18. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  19. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk make this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and the uncertainty associated with competitive reactions. A normal-form matrix is created to enumerate players, their moves and payoffs, and to formulate a process by which an optimal decision can be achieved. The non-cooperative model is tested using the concept of a Nash equilibrium to identify potential strategies that are robust to uncertain market fluctuations (e.g., uncertainty in airline demand, airframe requirements, and competitor positioning). A first/second-mover advantage parameter is used as a scenario dial to adjust market rewards and firms' payoffs. The methodology is applied to a commercial aircraft engine selection study where engine firms must select an optimal engine project for development. An engine modeling and simulation framework is developed to generate a broad engine project portfolio. The creation of a customer value model enables designers to incorporate airline operation characteristics into the engine modeling and simulation process to improve the accuracy of engine/customer matching. Several key findings are made that provide recommendations on project selection strategies for firms uncertain as to when they will enter the market. The proposed study demonstrates that within a technical design environment, a rational and analytical means of modeling project development strategies is beneficial in high market risk situations.
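    The normal-form analysis described above can be illustrated with a short sketch that enumerates the pure-strategy Nash equilibria of a two-firm project selection game; the payoff numbers are invented for illustration.

        # Enumerate pure-strategy Nash equilibria of a two-firm normal-form game.
        import numpy as np

        # payoff_a[i, j] / payoff_b[i, j]: firm A picks project i, firm B picks project j
        payoff_a = np.array([[3, 1], [4, 2]])
        payoff_b = np.array([[3, 4], [1, 2]])

        equilibria = []
        for i in range(payoff_a.shape[0]):
            for j in range(payoff_a.shape[1]):
                best_for_a = payoff_a[i, j] >= payoff_a[:, j].max()  # A cannot improve
                best_for_b = payoff_b[i, j] >= payoff_b[i, :].max()  # B cannot improve
                if best_for_a and best_for_b:
                    equilibria.append((i, j))
        print(equilibria)   # -> [(1, 1)] for these payoffs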

  20. Summer Research Program - 1997 Summer Faculty Research Program Volume 6 Arnold Engineering Development Center United States Air Force Academy Air Logistics Centers

    DTIC Science & Technology

    1997-12-01

    Excerpt from the volume's contents and references: "Fracture Analysis of the F-5, 15%-Spar Bolt," Dr. Devendra Kumar, SAALC/LD, CUNY-City College, New York, NY; "A Simple, Multiversion Concurrency Control..." ..., University of Dayton, Dayton, OH; [3] AFGROW, Air Force Crack Propagation Analysis Program, Version 3.82 (1997); Air Force Office of Scientific Research, Bolling Air Force Base, DC, and San Antonio Air Logistics Center, August 1997.

  1. Software For Drawing Design Details Concurrently

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    Software system containing five computer-aided-design programs enables more than one designer to work on same part or assembly at same time. Reduces time necessary to produce design by implementing concept of parallel or concurrent detailing, in which all detail drawings documenting three-dimensional model of part or assembly produced simultaneously, rather than sequentially. Keeps various detail drawings consistent with each other and with overall design by distributing changes in each detail to all other affected details.
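    The change-distribution mechanism described here resembles an observer pattern, sketched below under that assumption; the model, drawing names, and dimensions are invented, not taken from the software.

        # Detail drawings subscribe to the shared 3-D model; any model change is
        # distributed to every affected detail so the set stays consistent.
        class Model3D:
            def __init__(self):
                self.dims, self.details = {}, []

            def attach(self, detail):
                self.details.append(detail)

            def set_dim(self, name, value):
                self.dims[name] = value
                for d in self.details:            # distribute the change
                    d.update(name, value)

        class DetailDrawing:
            def __init__(self, name, watched):
                self.name, self.watched = name, set(watched)

            def update(self, dim, value):
                if dim in self.watched:           # only affected details redraw
                    print(f"{self.name}: redraw with {dim} = {value}")

        model = Model3D()
        model.attach(DetailDrawing("side view", ["bore_dia"]))
        model.attach(DetailDrawing("section A-A", ["bore_dia", "depth"]))
        model.set_dim("bore_dia", 12.5)           # both affected details redraw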

  2. IMAGE: A Design Integration Framework Applied to the High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.

    1993-01-01

    Effective design of the High Speed Civil Transport requires the systematic application of design resources throughout a product's life-cycle. Information obtained from the use of these resources is used for the decision-making processes of Concurrent Engineering. Integrated computing environments facilitate the acquisition, organization, and use of required information. State-of-the-art computing technologies provide the basis for the Intelligent Multi-disciplinary Aircraft Generation Environment (IMAGE) described in this paper. IMAGE builds upon existing agent technologies by adding a new component called a model. With the addition of a model, the agent can provide accountable resource utilization in the presence of increasing design fidelity. The development of a zeroth-order agent is used to illustrate agent fundamentals. Using a CATIA(TM)-based agent from previous work, a High Speed Civil Transport visualization system linking CATIA, FLOPS, and ASTROS will be shown. These examples illustrate the important role of the agent technologies used to implement IMAGE, and together they demonstrate that IMAGE can provide an integrated computing environment for the design of the High Speed Civil Transport.

  3. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The initial part of the study was begun with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle. An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization environment. As the study progressed, we relied increasingly upon a networking approach to lead us to new information. The departure point for such searches often was a government-sponsored project or a company initiative. The advantage of this approach was that short conversations with knowledgeable persons would usually cut through confusion over differences of terminology, thereby somewhat reducing the search space of the study. Even so, it was not until late in our eight-month inquiry that we began to see signs of convergence of the search, in the sense that a number of the latest inquiries began to turn up references to earlier contacts. As suggested above, this convergence often occurred with respect to particular government or company projects.

  4. Assessing the validity of sales self-efficacy: a cautionary tale.

    PubMed

    Gupta, Nina; Ganster, Daniel C; Kepes, Sven

    2013-07-01

    We developed a focused, context-specific measure of sales self-efficacy and assessed its incremental validity against the broad Big 5 personality traits with department store salespersons, using (a) both a concurrent and a predictive design and (b) both objective sales measures and supervisory ratings of performance. We found that in the concurrent study, sales self-efficacy predicted objective and subjective measures of job performance more than did the Big 5 measures. Significant differences between the predictability of subjective and objective measures of performance were not observed. Predictive validity coefficients were generally lower than concurrent validity coefficients. The results suggest that there are different dynamics operating in concurrent and predictive designs and between broad and contextualized measures; they highlight the importance of distinguishing between these designs and measures in meta-analyses. The results also point to the value of focused, context-specific personality predictors in selection research. PsycINFO Database Record (c) 2013 APA, all rights reserved.
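    The incremental-validity comparison described above amounts to a hierarchical regression; the sketch below shows the R-squared increment on synthetic data, with all numbers invented rather than drawn from the study.

        # Does a focused self-efficacy score add predictive power beyond the Big 5?
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        big5 = rng.normal(size=(n, 5))             # broad personality traits
        efficacy = rng.normal(size=n)              # focused, context-specific measure
        sales = 0.2 * big5[:, 0] + 0.6 * efficacy + rng.normal(size=n)

        def r_squared(X, y):
            X1 = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            return 1.0 - (y - X1 @ beta).var() / y.var()

        r2_big5 = r_squared(big5, sales)
        r2_full = r_squared(np.column_stack([big5, efficacy]), sales)
        print(f"Big 5 alone: R^2 = {r2_big5:.3f}")
        print(f"Big 5 + self-efficacy: R^2 = {r2_full:.3f} "
              f"(increment = {r2_full - r2_big5:.3f})")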

  5. Metric integration architecture for product development

    NASA Astrophysics Data System (ADS)

    Sieger, David B.

    1997-06-01

    Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations that are involved, they provide no guidance facilities. The significance of this feature, and the role it plays in the time required to develop products, is increasing in importance due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, discrete event system specification (DEVS), and multidimensional state variable based metrics. This approach is unique in its capability to quantify designers' actions throughout product development, provide recommendations about subsequent activity selection, and coordinate distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision making process.
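    A minimal sketch of a DEVS-style atomic model of a single design activity, in the spirit of the methodology above; the phases and time advance are invented examples.

        # DEVS atomic model: external transitions react to inputs, the time advance
        # (sigma) schedules the internal transition, and output fires just before it.
        class AtomicActivity:
            def __init__(self):
                self.phase, self.sigma = "idle", float("inf")

            def ext_transition(self, event):    # external event: work arrives
                if event == "task":
                    self.phase, self.sigma = "busy", 3.0   # assumed duration

            def int_transition(self):           # internal event: work completes
                self.phase, self.sigma = "idle", float("inf")

            def output(self):                   # emitted just before int_transition
                return "design_revision"

        m = AtomicActivity()
        m.ext_transition("task")
        print(m.phase, m.sigma)    # busy 3.0
        print(m.output())
        m.int_transition()
        print(m.phase)             # idle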

  6. Insulin analogs for the treatment of diabetes mellitus: therapeutic applications of protein engineering.

    PubMed

    Berenson, Daniel F; Weiss, Allison R; Wan, Zhu-Li; Weiss, Michael A

    2011-12-01

    The engineering of insulin analogs represents a triumph of structure-based protein design. A framework has been provided by structures of insulin hexamers. Containing a zinc-coordinated trimer of dimers, such structures represent a storage form of the active insulin monomer. Initial studies focused on destabilization of subunit interfaces. Because disassembly facilitates capillary absorption, such targeted destabilization enabled development of rapid-acting insulin analogs. Converse efforts were undertaken to stabilize the insulin hexamer and promote higher-order self-assembly within the subcutaneous depot toward the goal of enhanced basal glycemic control with reduced risk of hypoglycemia. Current products either operate through isoelectric precipitation (insulin glargine, the active component of Lantus(®); Sanofi-Aventis) or employ an albumin-binding acyl tether (insulin detemir, the active component of Levemir(®); Novo-Nordisk). To further improve pharmacokinetic properties, modified approaches are presently under investigation. Novel strategies have recently been proposed based on subcutaneous supramolecular assembly coupled to (a) large-scale allosteric reorganization of the insulin hexamer (the TR transition), (b) pH-dependent binding of zinc ions to engineered His-X(3)-His sites at hexamer surfaces, or (c) the long-range vision of glucose-responsive polymers for regulated hormone release. Such designs share with wild-type insulin and current insulin products a susceptibility to degradation above room temperature, and so their delivery, storage, and use require the infrastructure of an affluent society. Given the global dimensions of the therapeutic supply chain, we envisage that concurrent engineering of ultra-stable protein analog formulations would benefit underprivileged patients in the developing world.

  7. Innovative Approaches to Fuel-Air Mixing and Combustion in Airbreathing Hypersonic Engines

    NASA Astrophysics Data System (ADS)

    MacLeod, C.

    This paper describes some innovative methods for achieving enhanced fuel-air mixing and combustion in Scramjet-like spaceplane engines. A multimodal approach to the problem is discussed; this involves using several concurrent methods of forced mixing. The paper concentrates on Electromagnetic Activation (EMA) and Electrostatic Attraction as suitable techniques for this purpose - although several other potential methods are also discussed. Previously published empirical data is used to draw conclusions about the likely effectiveness of the system and possible engine topologies are outlined.

  8. Concurrently adjusting interrelated control parameters to achieve optimal engine performance

    DOEpatents

    Jiang, Li; Lee, Donghoon; Yilmaz, Hakan; Stefanopoulou, Anna

    2015-12-01

    Methods and systems for real-time engine control optimization are provided. A value of an engine performance variable is determined, a value of a first operating condition and a value of a second operating condition of a vehicle engine are detected, and initial values for a first engine control parameter and a second engine control parameter are determined based on the detected first and second operating conditions. The initial values for the first and second engine control parameters are then adjusted, based on the determined value of the engine performance variable, to cause the engine performance variable to approach a target engine performance variable. Because the parameters are interrelated, adjusting the initial value for the first engine control parameter necessitates a corresponding adjustment of the initial value for the second engine control parameter.
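    The coupled-adjustment idea in this abstract can be sketched as a small feedback loop in which a correction to the first control parameter forces a corresponding correction to the second; the toy engine response, parameter names, and gains are invented, not the patent's.

        # Nudge one parameter toward a performance target; the interrelated second
        # parameter follows, so both are adjusted concurrently.
        def engine_performance(spark_advance, egr_rate):
            # toy response surface standing in for the real engine
            return 30.0 + 0.8 * spark_advance - 0.5 * (spark_advance - 2.0 * egr_rate) ** 2

        target = 35.0
        spark, egr = 10.0, 4.0                 # initial values from assumed operating maps
        for step in range(100):
            error = target - engine_performance(spark, egr)
            if abs(error) < 1e-3:
                break
            spark += 0.5 * error               # adjust first control parameter
            egr = spark / 2.0                  # corresponding adjustment of the second
        print(round(spark, 2), round(egr, 2), round(engine_performance(spark, egr), 2))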

  9. Separate versus Concurrent Calibration Methods in Vertical Scaling.

    ERIC Educational Resources Information Center

    Karkee, Thakur; Lewis, Daniel M.; Hoskens, Machteld; Yao, Lihua; Haug, Carolyn

    Two methods to establish a common scale across grades within a content area using a common item design (separate and concurrent) have previously been studied under simulated conditions. Separate estimation is accomplished through separate calibration and grade-by-grade chained linking. Concurrent calibration established the vertical scale in a…

  10. Accounting for Test Variability through Sizing Local Domains in Sequential Design Optimization with Concurrent Calibration-Based Model Validation

    DTIC Science & Technology

    2013-08-01

    Report documentation excerpt: "Accounting for Test Variability through Sizing Local Domains in Sequential Design Optimization with Concurrent Calibration-Based Model Validation." Authors: Dorin Drignei (Mathematics and Statistics Department), Zissimos Mourelatos, and Vijitashwa Pandey.

  11. Description of inpatient medication management using cognitive work analysis.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Sengstacke, L T C Daniel N

    2009-01-01

    The purpose of this article was to describe key elements of an inpatient medication system using the cognitive work analysis method of Rasmussen et al (Cognitive Systems Engineering. Wiley Series in Systems Engineering; 1994). The work of nurses and physicians was observed in routine care of inpatients on a medical-surgical unit and an attached ICU. Interaction with pharmacists was included. Preoperative, postoperative, and medical care was observed. Personnel were interviewed to obtain information not easily observable during routine work. Communication between healthcare workers was projected onto an abstraction/decomposition hierarchy. Decision ladders and information flow charts were developed. Results suggest that decision making on an inpatient medical/surgical unit or ICU setting is a parallel, distributed process. Personnel are highly mobile and often are working on multiple issues concurrently. In this setting, communication is key to maintaining organization and synchronization for effective care. Implications for research approaches to system and interface designs and decision support for personnel involved in the process are discussed.

  12. Artificial concurrent catalytic processes involving enzymes.

    PubMed

    Köhler, Valentin; Turner, Nicholas J

    2015-01-11

    The concurrent operation of multiple catalysts can lead to enhanced reaction features, including (i) simultaneous linear multi-step transformations in a single reaction flask; (ii) the control of intermediate equilibria; (iii) stereoconvergent transformations; and (iv) rapid processing of labile reaction products. Enzymes occupy a prominent position for the development of such processes, due to their high potential compatibility with other biocatalysts. Genes for different enzymes can be co-expressed to reconstruct natural or construct artificial pathways and applied in the form of engineered whole cell biocatalysts to carry out complex transformations or, alternatively, the enzymes can be combined in vitro after isolation. Moreover, enzyme variants provide a wider substrate scope for a given reaction and often display altered selectivities and specificities. Man-made transition metal catalysts and engineered or artificial metalloenzymes also widen the range of reactivities and catalysed reactions that are potentially employable. Cascades for simultaneous cofactor or co-substrate regeneration or co-product removal are now firmly established. Many applications of more ambitious concurrent cascade catalysis are only just beginning to appear in the literature. The current review presents some of the most recent examples, with an emphasis on the combination of transition metal catalysis with enzymatic catalysis, and aims to encourage researchers to contribute to this emerging field.

  13. Computer aided design and manufacturing of composite propfan blades for a cruise missile wind tunnel model

    NASA Technical Reports Server (NTRS)

    Thorp, Scott A.; Downey, Kevin M.

    1992-01-01

    One of the propulsion concepts being investigated for future cruise missiles is advanced unducted propfans. To support the evaluation of this technology applied to the cruise missile, a joint DOD and NASA test project was conducted to design and then test the characteristics of the propfans on a 0.55-scale cruise missile model in a NASA wind tunnel. The configuration selected for study is a counterrotating rearward swept propfan. The forward blade row, having six blades, rotates in a counterclockwise direction, and the aft blade row, having six blades, rotates in a clockwise direction, as viewed from aft of the test model. Figures show the overall cruise missile and propfan blade configurations. The objective of this test was to evaluate propfan performance and suitability as a viable propulsion option for the next generation of cruise missiles. This paper details the concurrent computer aided design, engineering, and manufacturing of the carbon fiber/epoxy propfan blades at the NASA Lewis Research Center.

  14. Emerging Applications of Porphyrins in Photomedicine

    NASA Astrophysics Data System (ADS)

    Huang, Haoyuan; Song, Wentao; Rieffel, James; Lovell, Jonathan

    2015-04-01

    Biomedical applications of porphyrins and related molecules have been extensively pursued in the context of photodynamic therapy (PDT). Recent advances in nanoscale engineering have opened the door for new ways that porphyrins stand to potentially benefit human health. Metalloporphyrins are inherently suitable for many types of medical imaging and therapy. Traditional nanocarriers such as liposomes, dendrimers and silica nanoparticles have been explored for photosensitizer delivery. Concurrently, entirely new classes of porphyrin nanostructures are being developed, such as smart materials that are activated by specific biochemicals encountered at disease sites. Techniques have been developed that improve treatments by combining biomaterials with photosensitizers and functional moieties such as peptides, DNA and antibodies. Compared to simpler structures, these more complex and functional designs can potentially decrease side effects and lead to safer and more efficient phototherapies. This review examines recent research on porphyrin-derived materials in multimodal imaging, drug delivery, bio-sensing, phototherapy and probe design, demonstrating their bright future for biomedical applications.

  15. The development of turbojet aircraft in Germany, Britain, and the United States: A multi-national comparison of aeronautical engineering, 1935--1946

    NASA Astrophysics Data System (ADS)

    Pavelec, Sterling Michael

    In the 1930s aeronautical engineering needed revision. A presumptive anomaly was envisaged as piston-engine aircraft flew higher and faster. Radical alternatives to piston engines were considered in the unending quest for speed. Concurrently, but unwittingly, two turbojet engine programs were undertaken in Europe. The air-breathing three-stage turbojet engine was based on previous turbine technology; the revolutionary idea was the gas turbine as a prime mover for aircraft. In Germany, Dr. Hans von Ohain was the first to complete a flight-worthy turbojet engine for aircraft. Installed in a Heinkel designed aircraft, the Germans began the jet age on 27 August 1939. The Germans led throughout the war and were the first to produce jet aircraft for combat operations. The principal limiting factor for the German jet program was a lack of reliable engines. The continuing myths that Hitler's orders, too little fuel, or too few pilots hindered the program are false. In England, Frank Whittle, without substantial support but with dogged determination, also developed a turbojet engine. The British came second in the jet race when the Whittle engine powered the Gloster Pioneer on 15 May 1941. The Whittle-Gloster relationship continued and produced the only Allied combat jet aircraft during the war, the Meteor, which was confined to Home Defense in Britain. The American turbojet program was built directly from the Whittle engine. General Electric copied the Whittle designs and Bell Aircraft was contracted to build the first American jet plane. The Americans began the jet age on 1 October 1942 with a lackluster performance from their first jet, the Airacomet. But the Americans forged ahead, and had numerous engine and airframe programs in development by the end of the war. But the Germans did it right and did it first. Partly because of a predisposition towards excellent engineering and physics, partly out of necessity, the Germans were able to produce combat turbojet aircraft during the war. The Allies lagged from a lack of necessity, operational incompatibility, and stringent acceptance requirements. By the end of the war the Germans needed qualitative technological superiority to combat an overwhelming Allied quantitative advantage.

  16. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.

  17. Mechatronics by Analogy and Application to Legged Locomotion

    NASA Astrophysics Data System (ADS)

    Ragusila, Victor

    A new design methodology for mechatronic systems, dubbed Mechatronics by Analogy (MbA), is introduced and applied to designing a leg mechanism. The new methodology argues that by establishing a similarity relation between a complex system and a number of simpler models it is possible to design the former using the analysis and synthesis means developed for the latter. The methodology provides a framework for concurrent engineering of complex systems while maintaining the transparency of the system behaviour through making formal analogies between the system and those with more tractable dynamics. The application of the MbA methodology to the design of a monopod robot leg, called the Linkage Leg, is also studied. A series of simulations show that the dynamic behaviour of the Linkage Leg is similar to that of a combination of a double pendulum and a spring-loaded inverted pendulum, based on which the system kinematic, dynamic, and control parameters can be designed concurrently. The first stage of Mechatronics by Analogy is a method of extracting significant features of system dynamics through simpler models. The goal is to determine a set of simpler mechanisms with similar dynamic behaviour to that of the original system in various phases of its motion. A modular bond-graph representation of the system is determined, and subsequently simplified using two simplification algorithms. The first algorithm determines the relevant dynamic elements of the system for each phase of motion, and the second algorithm finds the simple mechanism described by the remaining dynamic elements. In addition to greatly simplifying the controller for the system, using simpler mechanisms with similar behaviour provides a greater insight into the dynamics of the system. This is seen in the second stage of the new methodology, which concurrently optimizes the simpler mechanisms together with a control system based on their dynamics. Once the optimal configuration of the simpler system is determined, the original mechanism is optimized such that its dynamic behaviour is analogous. It is shown that, if this analogy is achieved, the control system designed based on the simpler mechanisms can be directly applied to the more complex system, and their dynamic behaviours are close enough for the system performance to be effectively the same. Finally, it is shown that, for the employed objective of fast legged locomotion, the proposed methodology achieves a better design than Reduction-by-Feedback, a competing methodology that uses control layers to simplify the dynamics of the system.

  18. Computational predictions of the tensile properties of electrospun fiber meshes: effect of fiber diameter and fiber orientation

    PubMed Central

    Stylianopoulos, Triantafyllos; Bashur, Chris A.; Goldstein, Aaron S.; Guelcher, Scott A.; Barocas, Victor H.

    2008-01-01

    The mechanical properties of biomaterial scaffolds are crucial for their efficacy in tissue engineering and regenerative medicine. At the microscopic scale, the scaffold must be sufficiently rigid to support cell adhesion, spreading, and normal extracellular matrix deposition. Concurrently, at the macroscopic scale the scaffold must have mechanical properties that closely match those of the target tissue. The achievement of both goals may be possible by careful control of the scaffold architecture. Recently, electrospinning has emerged as an attractive means to form fused fiber scaffolds for tissue engineering. The diameter and relative orientation of fibers affect cell behavior, but their impact on the tensile properties of the scaffolds has not been rigorously characterized. To examine the structure-property relationship, electrospun meshes were made from a polyurethane elastomer with different fiber diameters and orientations and mechanically tested to determine the dependence of the elastic modulus on the mesh architecture. Concurrently, a multiscale modeling strategy developed for type I collagen networks was employed to predict the mechanical behavior of the polyurethane meshes. Experimentally, the measured elastic modulus of the meshes varied from 0.56 to 3.0 MPa depending on fiber diameter and the degree of fiber alignment. Model predictions for tensile loading parallel to fiber orientation agreed well with experimental measurements for a wide range of conditions when a fitted fiber modulus of 18 MPa was used. Although the model predictions were less accurate in transverse loading of anisotropic samples, these results indicate that computational modeling can assist in design of electrospun artificial tissue scaffolds. PMID:19627797

  19. Requiem for a Data Base System.

    DTIC Science & Technology

    1979-01-18

    were defined; 2) the final syntax and semantics of QUEL were defined; 3) protection was figured out; 4) EQUEL was designed; 5) concurrency control and... features which were not thought about in the initial design (such as concurrency control and recovery) and began worrying about distributed data... made in progress rather than on eventual corrections. Some attention is also given to the role of structured design in a data base system implementation.

  20. Concurrent design of quasi-random photonic nanostructures

    PubMed Central

    Lee, Won-Kyu; Yu, Shuangcheng; Engel, Clifford J.; Reese, Thaddeus; Rhee, Dongjoon; Chen, Wei

    2017-01-01

    Nanostructured surfaces with quasi-random geometries can manipulate light over broadband wavelengths and wide ranges of angles. Optimization and realization of stochastic patterns have typically relied on serial, direct-write fabrication methods combined with real-space design. However, this approach is not suitable for customizable features or scalable nanomanufacturing. Moreover, trial-and-error processing cannot guarantee fabrication feasibility because processing–structure relations are not included in conventional designs. Here, we report wrinkle lithography integrated with concurrent design to produce quasi-random nanostructures in amorphous silicon at wafer scales that achieved over 160% light absorption enhancement from 800 to 1,200 nm. The quasi-periodicity of patterns, materials filling ratio, and feature depths could be independently controlled. We statistically represented the quasi-random patterns by Fourier spectral density functions (SDFs) that could bridge the processing–structure and structure–performance relations. Iterative search of the optimal structure via the SDF representation enabled concurrent design of nanostructures and processing. PMID:28760975
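    The spectral density function representation mentioned above can be sketched as a radially averaged Fourier power spectrum; the synthetic binary pattern below is a stand-in, not the paper's wrinkle data.

        # Reduce a quasi-random binary pattern to a 1-D spectral density function.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 128
        pattern = (rng.normal(size=(n, n)) > 0).astype(float)   # stand-in pattern

        power = np.abs(np.fft.fftshift(np.fft.fft2(pattern - pattern.mean()))) ** 2

        # Radial average: bin spatial frequencies by |k| and average the power.
        ky, kx = np.indices((n, n)) - n // 2
        radius = np.hypot(kx, ky).astype(int)
        sdf = (np.bincount(radius.ravel(), weights=power.ravel())
               / np.bincount(radius.ravel()))
        print(sdf[:10])   # low-frequency end of the SDF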

  1. On the Difficulties of Concurrent-System Design, Illustrated with a 2×2 Switch Case Study

    NASA Astrophysics Data System (ADS)

    Daylight, Edgar G.; Shukla, Sandeep K.

    While various specification languages for concurrent-system design exist today, it is often not clear which specification language is more suitable than another for a particular case study. To address this problem, we study four different specification languages for the same 2×2 Switch case study: TLA+, Bluespec, Statecharts, and the Algebra of Communicating Processes (ACP). By slightly altering the design intent of the Switch, we obtain more complicated behaviors of the Switch. For each design intent, we investigate how each specification, in each of the specification languages, captures the corresponding behavior. By using three different criteria, we judge each specification and specification language. For our case study, however, all four specification languages perform poorly in at least two criteria! Hence, this paper illustrates, on a seemingly simple case study, some of the prevailing difficulties of concurrent-system design.
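    To make the object of study concrete, here is a toy transition-system model of a 2×2 switch with one-place output buffers; it is our own illustration, not one of the paper's four specifications.

        # A 2x2 switch: packets from two input ports go to two output ports; an
        # output accepts a packet only when its one-place buffer is free.
        class Switch2x2:
            def __init__(self):
                self.out = {0: None, 1: None}      # one-place output buffers

            def put(self, in_port, dest, payload):
                if self.out[dest] is None:
                    self.out[dest] = (in_port, payload)
                    return True
                return False                       # contention: sender must retry

            def pop(self, out_port):
                pkt, self.out[out_port] = self.out[out_port], None
                return pkt

        sw = Switch2x2()
        print(sw.put(0, 1, "a"))   # True
        print(sw.put(1, 1, "b"))   # False: output 1 busy, the interesting concurrent case
        print(sw.pop(1))           # (0, 'a')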

  2. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
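    The "action density" measure described above can be sketched as a sweep over event start and end times; the interval data below are invented.

        # Count how many recorded events overlap each moment of the observation.
        events = [(0, 5), (2, 9), (4, 6), (8, 12), (10, 12)]   # (start, end) minutes

        changes = []
        for start, end in events:
            changes.append((start, +1))    # an action begins
            changes.append((end, -1))      # an action ends

        density, active = [], 0
        for t, delta in sorted(changes):
            active += delta
            density.append((t, active))    # concurrent actions from time t onward
        print(density)   # e.g. [(0, 1), (2, 2), (4, 3), (5, 2), ...]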

  3. Distributed Parallel Processing and Dynamic Load Balancing Techniques for Multidisciplinary High Speed Aircraft Design

    NASA Technical Reports Server (NTRS)

    Krasteva, Denitza T.

    1998-01-01

    Multidisciplinary design optimization (MDO) for large-scale engineering problems poses many challenges (e.g., the design of an efficient concurrent paradigm for global optimization based on disciplinary analyses, expensive computations over vast data sets, etc.). This work focuses on the application of distributed schemes for massively parallel architectures to MDO problems, as a tool for reducing computation time and solving larger problems. The specific problem considered here is configuration optimization of a high speed civil transport (HSCT), and the efficient parallelization of the embedded paradigm for reasonable design space identification. Two distributed dynamic load balancing techniques (random polling and global round robin with message combining) and two necessary termination detection schemes (global task count and token passing) were implemented and evaluated in terms of effectiveness and scalability to large problem sizes and a thousand processors. The effect of certain parameters on execution time was also inspected. Empirical results demonstrated stable performance and effectiveness for all schemes, and the parametric study showed that the selected algorithmic parameters have a negligible effect on performance.
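    The random polling scheme evaluated above can be sketched as work stealing in which an idle processor asks a randomly chosen peer for half of its queue; this is a sequential toy simulation, not the dissertation's message-passing implementation.

        # Random-polling dynamic load balancing, simulated sequentially.
        import random

        random.seed(0)
        queues = [list(range(40)), [], [], []]   # all work starts on processor 0
        done = 0

        while any(queues):
            for p, q in enumerate(queues):
                if q:
                    q.pop()                      # "execute" one task
                    done += 1
                else:
                    victim = random.randrange(len(queues))   # random polling
                    if victim != p and queues[victim]:
                        half = len(queues[victim]) // 2
                        q.extend(queues[victim][:half])      # steal half the work
                        del queues[victim][:half]
        print("tasks completed:", done)          # -> 40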

  4. First Thin Film Festival

    NASA Astrophysics Data System (ADS)

    Samson, Philippe

    2005-05-01

    The constant evolution of the satellite market is asking for better technical performance and reliability at reduced cost. The solar array is on the front line of this challenge. This can be achieved by progressive improvement of present technologies for cost reduction, or by technological breakthrough. Reaching an effective end-of-life performance of 100 W/kg for a solar array is not so easy, even if you suppose that the mass of everything is nothing! Thin film cells are potential candidates to contribute to this challenge with a certain confidence level, given a consequent development plan with validation and qualification on ground and in flight. Based on a strong flight heritage in flexible solar array design, work in recent years has paved the way on the road map of thin film technologies. This is encouraged by ESA through many technological contracts run in concurrent engineering. CIGS was the selected cell, and the design strategy, contributions, and results will be presented. Trade-off results and design-to-cost solutions will be discussed. The main technical drivers, system design constraints, market access, and key technologies needed will be detailed in this paper, and the resulting road map and development plan will be presented.

  5. Design of object-oriented distributed simulation classes

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D. (Principal Investigator)

    1995-01-01

    Distributed simulation of aircraft engines as part of a computer aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for 'Numerical Propulsion Simulation System'. NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is the need for communication among the parallel executing processors which in turn implies need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT 'Actor' model of a concurrent object and uses 'connectors' to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out with the result that communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented. Its application to realistic configurations has not been carried out.
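    A minimal sketch of the actor-with-connectors structure described above: each component is an actor with a mailbox, and connectors route outputs to other actors' mailboxes; the component names and numbers are invented, not NPSS code.

        # Actors exchange values only through connectors, so components can be
        # rewired (or distributed) without programming changes to the components.
        from collections import deque

        class Actor:
            def __init__(self, name, behavior):
                self.name, self.behavior = name, behavior
                self.mailbox, self.connectors = deque(), []

            def connect(self, other):            # dynamic connector creation
                self.connectors.append(other)

            def step(self):
                if self.mailbox:
                    result = self.behavior(self.mailbox.popleft())
                    for target in self.connectors:
                        target.mailbox.append(result)

        compressor = Actor("compressor", lambda p: p * 8.0)   # toy pressure ratio
        burner = Actor("burner", lambda p: p + 50.0)          # toy pressure rise
        probe = Actor("probe", lambda p: print("burner exit:", p) or p)
        compressor.connect(burner)
        burner.connect(probe)

        compressor.mailbox.append(1.0)
        compressor.step(); burner.step(); probe.step()        # -> burner exit: 58.0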

  6. Design of Object-Oriented Distributed Simulation Classes

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1995-01-01

    Distributed simulation of aircraft engines as part of a computer aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for "Numerical Propulsion Simulation System". NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is the need for communication among the parallel executing processors which in turn implies need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT "Actor" model of a concurrent object and uses "connectors" to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out with the result that communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented. Its application to realistic configurations has not been carried out.

  7. Testing and evaluation for astronaut extravehicular activity (EVA) operability.

    PubMed

    Shields, N; King, L C

    1998-09-01

    Because it is the human component that defines space mission success, careful planning is required to ensure that hardware can be operated and maintained by crews on-orbit. Several methods exist to allow researchers and designers to better predict how hardware designs will behave under the harsh environment of low Earth orbit, and whether designs incorporate the necessary features for Extra Vehicular Activity (EVA) operability. Testing under conditions of simulated microgravity can occur during the design concept phase when verifying design operability, during mission training, or concurrently with on-orbit mission operations. The bulk of testing is focused on normal operations, but also includes evaluation of credible mission contingencies or "what would happen if" planning. The astronauts and cosmonauts who fly these space missions are well prepared and trained to survive and be productive in Earth's orbit. The engineers, designers, and training crews involved in space missions subject themselves to Earth-based simulation techniques that also expose them to extreme environments. Aircraft falling ten thousand feet, alternating g-loads, underwater testing at a 45-foot depth, and enclosure in a vacuum chamber subject to thermal extremes each carry inherent risks to the humans preparing for space missions.

  8. Future requirements in surface modeling and grid generation

    NASA Technical Reports Server (NTRS)

    Cosner, Raymond R.

    1995-01-01

    The past ten years have seen steady progress in surface modeling procedures, and wholesale changes in grid generation technology. Today, it seems fair to state that a satisfactory grid can be developed to model nearly any configuration of interest. The issues at present focus on operational concerns such as cost and quality. Continuing evolution of the engineering process is placing new demands on the technologies of surface modeling and grid generation. In the evolution toward a multidisciplinary analysis-based design environment, methods developed for Computational Fluid Dynamics are finding acceptance in many additional applications. These two trends, the normal evolution of the process and a watershed shift toward concurrent and multidisciplinary analysis, will be considered in assessing current capabilities and needed technological improvements.

  9. Concurrent engineering research center

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    The projects undertaken by The Concurrent Engineering Research Center (CERC) at West Virginia University are reported and summarized. CERC's participation in the Department of Defense's Defense Advanced Research Project relating to technology needed to improve the product development process is described, particularly in the area of advanced weapon systems. The efforts committed to improving collaboration among the diverse and distributed health care providers are reported, along with the research activities for NASA in Independent Software Verification and Validation. CERC also takes part in the electronic respirator certification initiated by The National Institute for Occupational Safety and Health, as well as in the efforts to find a solution to the problem of producing environment-friendly end-products for product developers worldwide. The 3M Fiber Metal Matrix Composite Model Factory Program is discussed. CERC technologies, facilities, and personnel-related issues are described, along with its library and technical services and recent publications.

  10. NREL Advancements in Methane Conversion Lead to Cleaner Air, Useful Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-01

    Researchers at NREL leveraged the recent on-site development of gas fermentation capabilities and novel genetic tools to directly convert methane to lactic acid using an engineered methanotrophic bacterium. The results provide proof-of-concept data for a gas-to-liquids bioprocess that concurrently produces fuels and chemicals from methane. NREL researchers developed genetic tools to express heterologous genes in methanotrophic organisms, which have historically been difficult to genetically engineer. Using these tools, researchers demonstrated microbial conversion of methane to lactate, a high-volume biochemical precursor predominantly utilized for the production of bioplastics. Methane biocatalysis offers a means to concurrently liquefy and upgrade natural gas and renewable biogas, enabling their utilization in conventional transportation and industrial manufacturing infrastructure. Producing chemicals and fuels from methane expands the suite of products currently generated from biorefineries, municipalities, and agricultural operations, with the potential to increase revenue and significantly reduce greenhouse gas emissions.

  11. Cold-Flow Testing of a Proposed Integrated Center-Body Diffuser/Steam Blocker Concept for Plum Brook Station's B-2 Test Facility

    NASA Technical Reports Server (NTRS)

    Edwards, Daryl A.; Weaver, Harold F; Kastner, Carl E., Jr.

    2009-01-01

    The center-body diffuser (CBD) steam blocker (SB) system is a concept that incorporates a set of secondary drive nozzles into the envelope of a CBD, such that both nozzle systems (i.e., the rocket engine and the steam blocking nozzles) utilize the same supersonic diffuser, and will operate either singly or concurrently. In this manner, the SB performs as an exhaust system stage when the rocket engine is not operating, and virtually eliminates discharge flow on rocket engine shutdown. A 2.25-percent scale model of a proposed SB integrated into a diffuser for the Plum Brook B-2 facility was constructed and cold-flow tested for the purpose of evaluating performance characteristics of various design options. These specific design options addressed secondary drive nozzle design (method of steam injection), secondary drive nozzle location relative to the CBD throat, and center-body throat length to diameter (L/D) ratios. The objective of the test program is to identify the desired configuration to carry forward should the next phase of design proceed. The tested scale model can provide data for various pressure ratios; however, its design is based on a proposed B-2 spray chamber (SC) operating pressure of 4.0 psia and a steam supply pressure of 165 psia. Evaluation of the test data acquired during these tests indicates that either the discrete axial or annular nozzle configuration integrated into a CBD, with an annular throat length of 1.5 L/D at the nominal injection position, would be suitable to carry forward from the SB's perspective. Selection between these two then becomes more a function of constructability and implementation than performance. L/D also has some flexibility, and final L/D selection can be a function of constructability issues within a limited range.

  12. History of the Fluids Engineering Division

    DOE PAGES

    Cooper, Paul; Martin, C. Samuel; O'Hern, Timothy J.

    2016-08-03

    The 90th Anniversary of the Fluids Engineering Division (FED) of ASME will be celebrated on July 10–14, 2016 in Washington, DC. The venue is ASME's Summer Heat Transfer Conference (SHTC), Fluids Engineering Division Summer Meeting (FEDSM), and International Conference on Nanochannels and Microchannels (ICNMM). The occasion is an opportune time to celebrate and reflect on the origin of FED and its predecessor—the Hydraulic Division (HYD), which existed from 1926–1963. Furthermore, the FED Executive Committee decided that it would be appropriate to publish concurrently a history of the HYD/FED.

  13. History of the Fluids Engineering Division

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, Paul; Martin, C. Samuel; O'Hern, Timothy J.

    The 90th Anniversary of the Fluids Engineering Division (FED) of ASME will be celebrated on July 10–14, 2016 in Washington, DC. The venue is ASME's Summer Heat Transfer Conference (SHTC), Fluids Engineering Division Summer Meeting (FEDSM), and International Conference on Nanochannels and Microchannels (ICNMM). The occasion is an opportune time to celebrate and reflect on the origin of FED and its predecessor—the Hydraulic Division (HYD), which existed from 1926–1963. Furthermore, the FED Executive Committee decided that it would be appropriate to publish concurrently a history of the HYD/FED.

  14. The CIRP International Workshop on Concurrent Engineering for Product Realization (1st) Held in Tokyo, Japan on June 27 - 28, 1992

    DTIC Science & Technology

    1992-11-01

    Excerpt from the workshop participant directory: Dr. Samuel K. M. Ho, Dept. of Engineering, Warwick University, Coventry, UK; Mr. Mikio Inagaki, Sapporo, Japan; Mr. Mikio Kitano, Motomachi Plant, Toyota Motor Corporation, Japan; Professor Hisayoshi Sato, Director, Mechanical Engineering Laboratory.

  15. Implementing model-based system engineering for the whole lifecycle of a spacecraft

    NASA Astrophysics Data System (ADS)

    Fischer, P. M.; Lüdtke, D.; Lange, C.; Roshani, F.-C.; Dannemann, F.; Gerndt, A.

    2017-09-01

    Design information of a spacecraft is collected over all phases in the lifecycle of a project. A lot of this information is exchanged between different engineering tasks and business processes. In some lifecycle phases, model-based system engineering (MBSE) has introduced system models and databases that help to organize such information and to keep it consistent for everyone. Nevertheless, none of the existing databases has yet approached the whole lifecycle. Virtual Satellite is the MBSE database developed at DLR. It has been used for quite some time in Phase A studies and is currently being extended to cover the whole lifecycle of spacecraft projects. Since it is unforeseeable which future use cases such a database needs to support in all these different projects, the underlying data model has to provide tailoring and extension mechanisms for its conceptual data model (CDM). This paper explains these mechanisms as they are implemented in Virtual Satellite, which enable extending the CDM along the project without corrupting already stored information. As an upcoming major use case, Virtual Satellite will be implemented as the MBSE tool in the S2TEP project. This project provides a new satellite bus for internal research and several different payload missions in the future. This paper explains how Virtual Satellite will be used to manage configuration control problems associated with such a multi-mission platform. It discusses how the S2TEP project starts using the software for collecting the first design information from concurrent engineering studies, then making use of the extension mechanisms of the CDM to introduce further information artefacts such as functional electrical architecture, thus linking more and more processes into an integrated MBSE approach.
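    The additive extension mechanism described above might look like the following sketch, in which categories of engineering data can gain properties over the lifecycle without invalidating earlier records; this is an invented illustration, not Virtual Satellite's actual schema.

        # A conceptual data model that only ever grows: categories accumulate
        # properties, so elements stored in earlier phases remain valid.
        class ConceptualDataModel:
            def __init__(self):
                self.categories = {}              # category name -> set of properties

            def register_category(self, name, properties):
                existing = self.categories.setdefault(name, set())
                existing.update(properties)       # additive changes only

        cdm = ConceptualDataModel()
        cdm.register_category("Component", {"mass_kg", "power_w"})   # Phase A study
        element = {"category": "Component", "mass_kg": 4.2, "power_w": 1.5}

        # Later lifecycle phase: extend the CDM without touching stored elements.
        cdm.register_category("Component", {"harness_pin_count"})
        print(cdm.categories["Component"])
        print(element)                            # earlier record remains valid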

  16. Differences in results of analyses of concurrent and split stream-water samples collected and analyzed by the US Geological Survey and the Illinois Environmental Protection Agency, 1985-91

    USGS Publications Warehouse

    Melching, C.S.; Coupe, R.H.

    1995-01-01

    During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of concurrent samples were compared to the precision of the laboratory or field method used, and the differences in concentration between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all concentrations and properties except pH, and that differences between split samples were significant for all concentrations and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
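
    As a rough illustration of the statistical workflow described in this record, the sketch below applies the Wilcoxon signed ranks test and the paired t-test to hypothetical paired pH measurements from two agencies, and computes a mean relative percentage difference. The data values are invented; only standard numpy/scipy calls are used.

      import numpy as np
      from scipy import stats

      # Hypothetical paired pH results: the same samples analyzed by two agencies
      agency_a = np.array([7.1, 7.3, 6.9, 7.4, 7.2, 7.0, 7.5, 7.1])
      agency_b = np.array([7.3, 7.5, 7.0, 7.6, 7.4, 7.3, 7.7, 7.2])

      # Wilcoxon signed ranks test: is there a statistically significant difference?
      w_stat, w_p = stats.wilcoxon(agency_a, agency_b)

      # Paired t-test and mean relative percentage difference gauge importance
      t_stat, t_p = stats.ttest_rel(agency_a, agency_b)
      mrpd = 100.0 * np.mean((agency_a - agency_b) / ((agency_a + agency_b) / 2.0))

      print(f"Wilcoxon p = {w_p:.4f}, paired-t p = {t_p:.4f}, MRPD = {mrpd:.2f}%")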

  17. Building the interspace: Digital library infrastructure for a University Engineering Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schatz, B.

    A large-scale digital library is being constructed and evaluated at the University of Illinois, with the goal of bringing professional search and display to Internet information services. A testbed planned to grow to 10K documents and 100K users is being constructed in the Grainger Engineering Library Information Center, as a joint effort of the University Library and the National Center for Supercomputing Applications (NCSA), with evaluation and research by the Graduate School of Library and Information Science and the Department of Computer Science. The electronic collection will be articles from engineering and science journals and magazines, obtained directly from publishers in SGML format and displayed containing all text, figures, tables, and equations. The publisher partners include IEEE Computer Society, AIAA (Aerospace Engineering), American Physical Society, and Wiley & Sons. The software will be based upon NCSA Mosaic as a network engine connected to commercial SGML displayers and full-text searchers. The users will include faculty/students across the midwestern universities in the Big Ten, with evaluations via interviews, surveys, and transaction logs. Concurrently, research into scaling the testbed is being conducted. This includes efforts in computer science, information science, library science, and information systems. These efforts will evaluate different semantic retrieval technologies, including automatic thesaurus and subject classification graphs. New architectures will be designed and implemented for a next generation digital library infrastructure, the Interspace, which supports interaction with information spread across information spaces within the Net.

  18. 40 CFR 35.927-3 - Rehabilitation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... minor rehabilitation concurrently with the sewer system evaluation survey in any step under a grant if... consisting of major rehabilitation work may be awarded concurrently with step 2 work for the design of the...

  19. Design of the Detector II: A CMOS Gate Array for the Study of Concurrent Error Detection Techniques.

    DTIC Science & Technology

    1987-07-01

    [OCR fragment:] ...detection schemes and temporary failures. The circuit consists of six different adders with concurrent error detection (CED) schemes. The error detection schemes are simple duplication, duplication with functional dual implementation, duplication with different implementations, and two-rail encoding. [The remainder of the record is table-of-contents residue from the report.]

  20. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key challenge in practice is the ability to scale the processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Traditional methods are therefore becoming increasingly inefficient in coping with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate the feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence, and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering, model-based design environment.

  1. Catalog of Training and Education Sources in Concurrent Engineering

    DTIC Science & Technology

    1989-11-01

    [OCR fragment of program admission requirements:] Undergraduate degree in engineering or hard science. TOEFL (Test of English as a Foreign Language) score of 550 or better for international students and GMAT (Graduate... GRE (Graduate Record Examination) of 1000 (Verbal + Quantitative); TOEFL of 550 for students whose first language... GRE (Graduate Record Examination) and TOEFL scores. Comments: Recipient of the CASA/SME 1988 University LEAD...

  2. Air Force Engineering Research Initiation Grant Program

    DTIC Science & Technology

    1994-06-21

    [OCR fragment of the grant list:] "MISFET Structures for High-Frequency Device Applications"; RI-B-91-13, Prof. John W. Silvestro, Clemson University, "The Effect of Scattering by a Near..."; "...Synthesis Method for Concurrent Engineering Applications"; RI-B-92-03, Prof. Steven H. Collicott, Purdue University, "An Experimental Study of the Effect of a..."; ...beams is studied. The effect of interply delaminations on natural frequencies and mode shapes is evaluated analytically. A generalized variational...

  3. A Concurrent Distributed System for Aircraft Tactical Decision Generation

    NASA Technical Reports Server (NTRS)

    McManus, John W.

    1990-01-01

    A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of a concurrent version of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS) program, a second generation TDG, is presented. Concurrent computing environments and programming approaches are discussed and the design and performance of a prototype concurrent TDG system are presented.

  4. Parametric optimization in virtual prototyping environment of the control device for a robotic system used in thin layers deposition

    NASA Astrophysics Data System (ADS)

    Enescu (Balaş), M. L.; Alexandru, C.

    2016-08-01

    The paper deals with the optimal design of the control system for a 6-DOF robot used in thin layers deposition. The optimization is based on a parametric technique: the design objective is modelled as a numerical function, and the optimal values of the design variables are then established so as to minimize the objective function. The robotic system is a mechatronic product, which integrates the mechanical device and the controlled operating device. The mechanical device of the robot was designed in the CAD (Computer Aided Design) software CATIA, the 3D model being then transferred to the MBS (Multi-Body Systems) environment ADAMS/View. The control system was developed in the concurrent engineering concept, through integration with the MBS mechanical model, by using the DFC (Design for Control) software solution EASY5. The necessary angular motions in the six joints of the robot, required to obtain the imposed trajectory of the end-effector, were established by performing an inverse kinematic analysis. The positioning error in each joint of the robot is used as the design objective, the optimization goal being to minimize its root mean square during simulation, which is a measure of the magnitude of the varying positioning error.
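
    The design objective named above, the root mean square of a joint's positioning error over a simulation run, reduces to a short numerical function. The sketch below uses a synthetic error trace purely for illustration; in the study itself the error history comes from the ADAMS/EASY5 co-simulation.

      import numpy as np

      def rms_positioning_error(error_samples):
          """Root mean square of a joint's positioning-error time history."""
          e = np.asarray(error_samples, dtype=float)
          return np.sqrt(np.mean(e ** 2))

      # Synthetic error trace (radians) standing in for co-simulation output
      trace = 0.001 * np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
      print(f"RMS positioning error: {rms_positioning_error(trace):.6f} rad")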

  5. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subtask optimizations that may be executed concurrently, plus a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are ideally suited for concurrent processing. Algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
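
    The decomposition structure described above lends itself to a toy sketch: subsystem optimizations run concurrently in separate processes, and a system-level step coordinates their results. This is only a schematic of the concurrent-subtask idea, not the BLISS algorithm; the objective functions and the coordination rule are hypothetical.

      from concurrent.futures import ProcessPoolExecutor
      from scipy.optimize import minimize_scalar

      def subsystem_opt(coupling):
          """Optimize one subsystem's local variable for a given coupling value
          (hypothetical quadratic-plus-quartic objective)."""
          res = minimize_scalar(lambda x: (x - coupling) ** 2 + 0.1 * x ** 4)
          return res.x

      if __name__ == "__main__":
          coupling_targets = [0.5, 1.0, 1.5]       # one per subsystem
          with ProcessPoolExecutor() as pool:      # subtasks execute concurrently
              local_optima = list(pool.map(subsystem_opt, coupling_targets))
          # System-level coordination step (here just an average, for illustration)
          system_variable = sum(local_optima) / len(local_optima)
          print(local_optima, system_variable)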

  6. Design and Implementation of the Boundary Layer Transition Flight Experiment on Space Shuttle Discovery

    NASA Technical Reports Server (NTRS)

    Spanos, Theodoros A.; Micklos, Ann

    2010-01-01

    In an effort to better understand high speed aerodynamics, a series of flight experiments was installed on Space Shuttle Discovery during the STS-119 and STS-128 missions. This experiment, known as the Boundary Layer Transition Flight Experiment (BLTFE), provided the technical community with actual entry flight data from a known-height protuberance at Mach numbers at and above Mach 15. Any such data above Mach 15 is irreproducible in a laboratory setting. Years of effort were invested in obtaining this valuable data, and many obstacles had to be overcome to ensure the success of implementing an Orbiter modification. Many Space Shuttle systems were involved in the installation of the appropriate components, and 'concurrent engineering' proved to be a key integration tool, allowing the coordination of the various parts and pieces that had to be sequenced appropriately and installed at the right time. Several issues encountered included Orbiter configuration and access, design requirements versus current layout, implementing the modification within typical processing timelines, and optimizing the engineering design cycles and changes. Open lines of communication within the entire modification team were essential to project success, as the team was spread out across the United States, from NASA Kennedy Space Center in Florida, to NASA Johnson Space Center in Texas, to Boeing Huntington Beach in California, among others. This paper discusses processing concerns from the design phase to the implementation phase, which eventually saw successful flights and data acquisition on STS-119 in March 2009 and on STS-128 in September 2009.

  7. Challenges in Teaching Modern Manufacturing Technologies

    ERIC Educational Resources Information Center

    Ngaile, Gracious; Wang, Jyhwen; Gau, Jenn-Terng

    2015-01-01

    Teaching of manufacturing courses for undergraduate engineering students has become a challenge due to industrial globalisation coupled with influx of new innovations, technologies, customer-driven products. This paper discusses development of a modern manufacturing course taught concurrently in three institutions where students collaborate in…

  8. NASA's Gravitational-Wave Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Stebbins, Robin

    2012-01-01

    With the conclusion of the NASA/ESA partnership on the Laser Interferometer Space Antenna (LISA) Project, NASA initiated a study to explore mission concepts that will accomplish some or all of the LISA science objectives at lower cost. The Gravitational-Wave Mission Concept Study consists of a public Request for Information (RFI), a Core Team of NASA engineers and scientists, a Community Science Team, a Science Task Force, and an open workshop. The RFI yielded 12 mission concepts, 3 instrument concepts, and 2 technologies. The responses ranged from concepts that eliminated the drag-free test mass of LISA to concepts that replaced the test mass with an atom interferometer. The Core Team reviewed the noise budgets and sensitivity curves; the payload and spacecraft designs and requirements; orbits and trajectories; and technical readiness and risk. The Science Task Force assessed the science performance. Three mission concepts have been studied by Team-X, JPL's concurrent design facility, to refine the conceptual design, evaluate key performance parameters, assess risk, and estimate cost and schedule. The status of the study is reported.

  9. Addressing Research Design Problem in Mixed Methods Research

    NASA Astrophysics Data System (ADS)

    Alavi, Hamed; Hąbek, Patrycja

    2016-03-01

    Alongside other disciplines in the social sciences, management researchers increasingly use mixed methods research in the conduct of their scientific investigations. The mixed methods approach can also be used in the field of production engineering. In comparison with traditional quantitative and qualitative research methods, the growing popularity of the mixed research method in management science can be traced to several factors. First, any particular discipline in management can be theoretically related to it. Second, the concurrent approach of the mixed research method to inductive and deductive research logic provides researchers with the opportunity to generate theory and test hypotheses in one study simultaneously. In addition, it provides a better justification for the chosen method of investigation and higher validity for the obtained answers to the research questions. Despite the increasing popularity of mixed research methods among management scholars, there is still a need for a comprehensive approach to research design typology and process in the mixed research method from the perspective of management science. In this paper, the authors explain the fundamental principles of the mixed research method, its typology, and the different steps in its design process.

  10. An Integrated Approach to Functional Engineering: An Engineering Database for Harness, Avionics and Software

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni

    2012-08-01

    In the design and development phase of a new program, one of the critical aspects is the integration of all the functional requirements of the system and the control of the overall consistency between the identified needs on one side and the available resources on the other, especially when both the required needs and the available resources are not yet consolidated but are evolving as the program maturity increases. The Integrated Engineering Harness Avionics and Software database (IDEHAS) is a tool that has been developed to support this process in the frame of the avionics and software disciplines through the different phases of the program. The tool is designed to allow an incremental build-up of the avionics and software systems, from the description of the high-level architectural data (available in the early stages of the program), to the definition of the pin-to-pin connectivity information (typically consolidated in the design finalization stages), and finally to the construction and validation of the detailed telemetry parameters and commands to be used in the test phases and in the Mission Control Centre. The key feature of this approach and of the associated tool is that it allows the definition and the maintenance/update of all these data in a single, consistent environment. On one side, a system-level and concurrent approach requires the ability to easily integrate and update the best data available since the early stages of a program, in order to improve confidence in the consistency and to control the design information. On the other side, the amount of information of different typologies and the cross-relationships among the data imply highly consolidated structures requiring many checks to guarantee the consistency of the data content, with negative effects on simplicity and flexibility, often limiting the attention paid to special needs and to the interfaces with other disciplines.

  11. Method for reducing peak phase current and decreasing starting time for an internal combustion engine having an induction machine

    DOEpatents

    Amey, David L.; Degner, Michael W.

    2002-01-01

    A method for reducing the starting time and the peak phase currents for an internal combustion engine that is started using an induction machine starter/alternator. The starting time is reduced by pre-fluxing the induction machine, and the peak phase currents are reduced by reducing the flux current command after a predetermined period of time has elapsed, concurrently with the application of the torque current command. The method of the present invention also provides a strategy for anticipating the start command for an internal combustion engine and determines a start strategy based on the start command and the operating state of the internal combustion engine.

  12. Electro-optic holography method for determination of surface shape and deformation

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-06-01

    Current demanding engineering analysis and design applications require effective experimental methodologies for characterization of surface shape and deformation. Such characterization is of primary importance in many applications, because these quantities are related to the functionality, performance, and integrity of the objects of interest, especially in view of advances relating to concurrent engineering. In this paper, a new approach to characterization of surface shape and deformation using a simple optical setup is described. The approach consists of a fiber-optic-based electro-optic holography (EOH) system built on an IR, temperature-tuned laser diode, a single-mode fiber optic directional coupler assembly, and a video processing computer. The EOH can be arranged in multiple configurations, which include the three-camera, three-illumination, and speckle correlation modes. In particular, the three-camera mode is described, along with a brief description of the procedures for obtaining quantitative 3D shape and deformation information. A representative application of the three-camera EOH system demonstrates the viability of the approach as an effective engineering tool. A particular feature of this system and the procedure described in this paper is that the 3D quantitative data are written to data files which can be readily interfaced to commercial CAD/CAM environments.

  13. Re-engineering the Multimission Command System at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Alexander, Scott; Biesiadecki, Jeff; Cox, Nagin; Murphy, Susan C.; Reeve, Tim

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed the multimission command system as part of JPL's Advanced Multimission Operations System. The command system provides an advanced multimission environment for secure, concurrent commanding of multiple spacecraft. The command functions include real-time command generation, command translation and radiation, status reporting, some remote control of Deep Space Network antenna functions, and command file management. The mission-independent architecture has allowed easy adaptation to new flight projects and the system currently supports all JPL planetary missions (Voyager, Galileo, Magellan, Ulysses, Mars Pathfinder, and CASSINI). This paper will discuss the design and implementation of the command software, especially trade-offs and lessons learned from practical operational use. The lessons learned have resulted in a re-engineering of the command system, especially in its user interface and new automation capabilities. The redesign has allowed streamlining of command operations with significant improvements in productivity and ease of use. In addition, the new system has provided a command capability that works equally well for real-time operations and within a spacecraft testbed. This paper will also discuss new development work including a multimission command database toolkit, a universal command translator for sequencing and real-time commands, and incorporation of telecommand capabilities for new missions.

  14. Bioinspired magnetoreception and navigation using magnetic signatures as waypoints.

    PubMed

    Taylor, Brian K

    2018-05-15

    Diverse taxa use Earth's magnetic field in conjunction with other sensory modalities to accomplish navigation tasks ranging from local homing to long-distance migration across continents and ocean basins. However, despite extensive research, the mechanisms that underlie animal magnetoreception are not clearly understood, and how animals use Earth's magnetic field to navigate is an active area of investigation. Concurrently, Earth's magnetic field offers a signal that engineered systems can leverage for navigation in environments where man-made systems such as GPS are unavailable or unreliable. Using a proxy for Earth's magnetic field, and inspired by migratory animal behavior, this work implements a behavioral strategy that uses combinations of magnetic field properties as rare or unique signatures that mark specific locations. Using a discrete number of these signatures as goal waypoints, the strategy navigates through a closed set of points several times in a variety of environmental conditions, and with various levels of sensor noise. The results from this engineering/quantitative biology approach support existing notions that some animals may use combinations of magnetic properties as navigational markers, and provides insights into features and constraints that would enable navigational success or failure. The findings also offer insights into how autonomous engineered platforms might be designed to leverage the magnetic field as a navigational resource.
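
    A bare-bones sketch of the signature-waypoint strategy: the platform "recognizes" a goal location when the sensed combination of magnetic properties falls within tolerance of a stored signature, then switches to the next goal. The property choices, field values, and tolerances below are hypothetical.

      # Stored signatures: (total intensity in microtesla, inclination in degrees)
      WAYPOINTS = [
          {"name": "A", "intensity": 48.2, "inclination": 62.0},
          {"name": "B", "intensity": 51.7, "inclination": 66.5},
      ]

      def matches(signature, sensed, tol_int=0.5, tol_incl=0.5):
          """True when both sensed magnetic properties are within tolerance."""
          return (abs(signature["intensity"] - sensed["intensity"]) <= tol_int and
                  abs(signature["inclination"] - sensed["inclination"]) <= tol_incl)

      sensed = {"intensity": 51.5, "inclination": 66.3}   # simulated reading
      for wp in WAYPOINTS:
          if matches(wp, sensed):
              print(f"At waypoint {wp['name']}; switching to the next goal signature")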

  15. Examination of Various Methods Used in Support of Concurrent Engineering

    DTIC Science & Technology

    1990-03-01

    [OCR fragment of the references and text:] 1989. F.Y.I. Drawing ... Productivity. Industrial Engineering 21: 80. [Ishi82] Ishikawa, Kaoru. 1982. Guide to Quality Control. White Plains, NY: Kraus. ...those who observe it in practice have an easier time identifying the different methods or techniques (such as the Ishikawa tools) used than understanding the... simple histogram to show what problems should be attacked first. Cause and effect diagrams, sometimes called fishbone or Ishikawa diagrams, are a kind...

  16. Mixing modes in a population-based interview survey: comparison of a sequential and a concurrent mixed-mode design for public health research.

    PubMed

    Mauz, Elvira; von der Lippe, Elena; Allen, Jennifer; Schilling, Ralph; Müters, Stephan; Hoebel, Jens; Schmich, Patrick; Wetzstein, Matthias; Kamtsiuris, Panagiotis; Lange, Cornelia

    2018-01-01

    Population-based surveys currently face the problem of decreasing response rates. Mixed-mode designs are now being implemented more often to account for this, to improve sample composition and to reduce overall costs. This study examines whether a concurrent or sequential mixed-mode design achieves better results on a number of indicators of survey quality. Data were obtained from a population-based health interview survey of adults in Germany that was conducted as a methodological pilot study as part of the German Health Update (GEDA). Participants were randomly allocated to one of two surveys; each of the surveys had a different design. In the concurrent mixed-mode design (n = 617), two types of self-administered questionnaires (SAQ-Web and SAQ-Paper) and computer-assisted telephone interviewing were offered simultaneously to the respondents along with the invitation to participate. In the sequential mixed-mode design (n = 561), SAQ-Web was initially provided, followed by SAQ-Paper, with an option for a telephone interview being sent out together with the reminders at a later date. Finally, this study compared the response rates, sample composition, health indicators, item non-response, the scope of fieldwork and the costs of both designs. No systematic differences were identified between the two mixed-mode designs in terms of response rates, the socio-demographic characteristics of the achieved samples, or the prevalence rates of the health indicators under study. The sequential design gained a higher rate of online respondents. Very few telephone interviews were conducted for either design. With regard to data quality, the sequential design (which had more online respondents) showed less item non-response. There were minor differences between the designs in terms of their costs. Postage and printing costs were lower in the concurrent design, but labour costs were lower in the sequential design. No differences in health indicators were found between the two designs. Modelling these results for higher response rates and larger net sample sizes indicated that the sequential design was more cost- and time-effective. This study contributes to the research available on implementing mixed-mode designs as part of public health surveys. Our findings show that SAQ-Paper and SAQ-Web questionnaires can be combined effectively. Sequential mixed-mode designs with higher rates of online respondents may be of greater benefit to studies with larger net sample sizes than concurrent mixed-mode designs.

  17. Concurrent flow lanes - phase III.

    DOT National Transportation Integrated Search

    2011-01-01

    This report describes efforts taken to develop and calibrate VISSIM models of existing : concurrent flow lane designs of north- and southbound lanes of I-270 from the interchange at : I-70 to interchanges on I-495 at Connecticut Avenue in Maryland an...

  18. Engineered Nanomaterials, Sexy New Technology and Potential Hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaulieu, R A

    Engineered nanomaterials enable exciting new applications that can greatly benefit society in areas of cancer treatments, solar energy, energy storage, and water purification. While nanotechnology shows incredible promise in these and other areas by exploiting nanomaterials' unique properties, these same properties can potentially cause adverse health effects to workers who may be exposed during work. Dispersed nanoparticles in air can cause adverse health effects in animals not merely due to their chemical properties but due to their size, structure, shape, surface chemistry, solubility, carcinogenicity, reproductive toxicity, mutagenicity, dermal toxicity, and parent material toxicity. Nanoparticles have a greater likelihood of lung deposition and blood absorption than larger particles due to their size. Nanomaterials can also pose physical hazards due to their unusually high reactivity, which makes them useful as catalysts but has the potential to cause fires and explosions. Characterization of the hazards (and potential for exposures) associated with nanomaterial development and incorporation in other products is an essential step in the development of nanotechnologies. Developing controls for these hazards is equally important. Engineered controls should be integrated into nanomaterial manufacturing process design according to 10CFR851, DOE Policy 456.1, and DOE Notice 456.1 as safety-related hardware or administrative controls for worker safety. Nanomaterial hazards in a nuclear facility must also meet control requirements per DOE standards 3009, 1189, and 1186. Integration of safe designs into manufacturing processes for new applications, concurrent with the developing technology, is essential for worker safety. This paper presents a discussion of nanotechnology, nanomaterial properties/hazards, and controls.

  19. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local-memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  20. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.

    2016-01-01

    The spectroscopic diagnostic technique of two-photon absorption laser-induced fluorescence (LIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper will document the latest improvements of the LIF system design and demonstrations of the redeveloped AHF and IHF LIF systems.

  1. The Effect of Basepair Mismatch on DNA Strand Displacement.

    PubMed

    Broadwater, D W Bo; Kim, Harold D

    2016-04-12

    DNA strand displacement is a key reaction in DNA homologous recombination and DNA mismatch repair and is also heavily utilized in DNA-based computation and locomotion. Despite its ubiquity in science and engineering, sequence-dependent effects on displacement kinetics have not been extensively characterized. Here, we measured toehold-mediated strand displacement kinetics using single-molecule fluorescence in the presence of a single basepair mismatch. The apparent displacement rate varied significantly when the mismatch was introduced in the invading DNA strand. The rate generally decreased as the mismatch in the invader was encountered earlier in displacement. Our data indicate that a single base pair mismatch in the invader stalls branch migration, and displacement occurs via direct dissociation of the destabilized incumbent strand from the substrate strand. We combined both branch migration and direct dissociation into a model, which we term the concurrent displacement model, and used the first passage time approach to quantitatively explain the salient features of the observed relationship. We also introduce the concept of splitting probabilities to justify that the concurrent model can be simplified into a three-step sequential model in the presence of an invader mismatch. We expect our model to become a powerful tool to design DNA-based reaction schemes with broad functionality. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
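
    The first-passage framing of the concurrent displacement model can be illustrated with a small Markov-chain calculation: branch migration is a random walk over discrete positions, forward stepping is strongly slowed at the mismatch, and the incumbent can dissociate directly from any position. The rates, chain length, and mismatch position below are placeholders, not the paper's fitted values.

      import numpy as np

      n = 10            # transient branch-migration positions 0..n-1
      k_step = 100.0    # hopping rate between adjacent positions (1/s)
      k_slow = 1.0      # forward rate at the mismatch position (stalled)
      k_dissoc = 0.5    # direct dissociation rate of the incumbent (1/s)
      mismatch = 6

      Q = np.zeros((n, n))          # generator over transient states
      to_complete = np.zeros(n)     # rate from each state into "completed"
      for i in range(n):
          fwd = k_slow if i == mismatch else k_step
          back = k_step if i > 0 else 0.0
          if i + 1 < n:
              Q[i, i + 1] = fwd
          else:
              to_complete[i] = fwd  # last position steps into completion
          if i > 0:
              Q[i, i - 1] = back
          Q[i, i] = -(fwd + back + k_dissoc)

      # Splitting probability: chance of completing displacement before the
      # incumbent dissociates, as a function of the starting position.
      p_complete = np.linalg.solve(Q, -to_complete)
      print(f"P(complete from position 0) = {p_complete[0]:.4f}")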

  2. Validation of Reverse-Engineered and Additive-Manufactured Microsurgical Instrument Prototype.

    PubMed

    Singh, Ramandeep; Suri, Ashish; Anand, Sneh; Baby, Britty

    2016-12-01

    With advancements in imaging techniques, neurosurgical procedures are becoming highly precise and minimally invasive, thus demanding development of new ergonomically aesthetic instruments. Conventionally, neurosurgical instruments are manufactured using subtractive manufacturing methods. Such a process is complex, time-consuming, and impractical for prototype development and validation of new designs. Therefore, an alternative design process has been used, utilizing blue light scanning, computer-aided designing, and additive manufacturing by direct metal laser sintering (DMLS) for microsurgical instrument prototype development. Deviations of the DMLS-fabricated instrument were studied by superimposing scan data of the fabricated instrument onto the computer-aided design model. Content and concurrent validity of the fabricated prototypes was assessed by a group of 15 neurosurgeons performing sciatic nerve anastomosis in small laboratory animals. Comparative scoring was obtained for the control and study instruments. A t-test was applied to the individual parameters, and P values for force (P < .0001) and surface roughness (P < .01) were found to be statistically significant. These 2 parameters were further analyzed using objective measures. The results show that additive manufacturing by DMLS provides an effective method for prototype development. However, direct application of these additive-manufactured instruments in the operating room requires further validation. © The Author(s) 2016.

  3. An optimization design proposal of automated guided vehicles for mixed type transportation in hospital environments.

    PubMed

    González, Domingo; Romero, Luis; Espinosa, María Del Mar; Domínguez, Manuel

    2017-01-01

    The aim of this paper is to present an optimization proposal for the design of the automated guided vehicles used in hospital logistics, as well as to analyze the impact of its implementation in a real environment. This proposal is based on the design of those elements that would allow the vehicles to deliver an extra cart by the towing method; the intention is thus to improve the productivity and performance of the current vehicles by using a transportation method of combined carts. The study has been developed following concurrent engineering premises from three different viewpoints. First, the sequence of operations is described; second, a design proposal for the equipment is undertaken. Finally, the impact of the proposal is analyzed according to real data from the Hospital Universitario Rio Hortega in Valladolid (Spain). In this particular case, implementing the analyzed proposal in the hospital can achieve a reduction of over 35% of the current time of use. This result may allow adding new tasks to the vehicles, and accordingly, both a new kind of vehicle and a specific module can be developed in order to achieve better performance.

  4. Reconciling pairs of concurrently used clinical practice guidelines using Constraint Logic Programming.

    PubMed

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
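
    As a toy illustration of what identifying "points of contention" means operationally, the sketch below merges the recommended actions of two hypothetical guideline plans and checks them against a declared table of adverse combinations. The clinical content is invented, and plain Python stands in for the Constraint Logic Programming model actually used in the paper.

      # Hypothetical action sets derived from two concurrently applied CPGs
      ULCER_PLAN = {"prescribe_ppi", "avoid_antiplatelets"}
      TIA_PLAN = {"prescribe_aspirin", "order_carotid_imaging"}

      # Hypothetical table of adverse/contradictory action combinations
      CONTENTIONS = {
          frozenset({"prescribe_aspirin", "avoid_antiplatelets"}):
              "substitute or adjust antiplatelet therapy",
      }

      def points_of_contention(plan_a, plan_b):
          """Return adverse action combinations present in the merged plan."""
          combined = plan_a | plan_b
          return [(sorted(pair), advice)
                  for pair, advice in CONTENTIONS.items()
                  if pair <= combined]

      for pair, advice in points_of_contention(ULCER_PLAN, TIA_PLAN):
          print(f"Point of contention {pair}: {advice}")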

  5. Improved Broadband Liner Optimization Applied to the Advanced Noise Control Fan

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.; Ayle, Earl; Ichihashi, Fumitaka

    2014-01-01

    The broadband component of fan noise has grown in relevance with the utilization of increased bypass ratio and advanced fan designs. Thus, while the attenuation of fan tones remains paramount, the ability to simultaneously reduce broadband fan noise levels has become more desirable. This paper describes improvements to a previously established broadband acoustic liner optimization process using the Advanced Noise Control Fan rig as a demonstrator. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners aimed at producing impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increased weighting to specific frequencies and/or operating conditions. Constant-depth, double-degree of freedom and variable-depth, multi-degree of freedom designs are carried through design, fabrication, and testing to validate the efficacy of the design process. Results illustrate the value of the design process in concurrently evaluating the relative costs/benefits of these liner designs. This study also provides an application for demonstrating the integrated use of duct acoustic propagation/radiation and liner modeling tools in the design and evaluation of novel broadband liner concepts for complex engine configurations.
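
    The acceptance criterion mentioned above can be pictured as a weighted mismatch score between a candidate liner's predicted impedance and the optimum impedance over the frequencies and conditions of interest. The impedance values and weights in this sketch are hypothetical.

      import numpy as np

      weights = np.array([1.0, 2.0, 1.0])    # per-frequency weights (emphasize one band)
      z_optimum = np.array([1.0 + 0.2j, 0.9 - 0.1j, 1.1 + 0.0j])     # normalized
      z_candidate = np.array([1.1 + 0.1j, 0.95 - 0.05j, 1.3 + 0.1j])

      def acceptance_score(z_cand, z_opt, w):
          """Weighted impedance mismatch across frequencies; lower is better."""
          return float(np.sum(w * np.abs(z_cand - z_opt) ** 2) / np.sum(w))

      print(f"candidate score = {acceptance_score(z_candidate, z_optimum, weights):.4f}")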

  6. Concurrent System Engineering and Risk Reduction for Dual-Band (RF/optical) Spacecraft Communications

    NASA Technical Reports Server (NTRS)

    Fielhauer, Karl, B.; Boone, Bradley, G.; Raible, Daniel, E.

    2012-01-01

    This paper describes a system engineering approach to examining the potential for combining elements of a deep-space RF and optical communications payload, for the purpose of reducing the size, weight and power burden on the spacecraft and the mission. Figures of merit and analytical methodologies are discussed to conduct trade studies, and several potential technology integration strategies are presented. Finally, the NASA Integrated Radio and Optical Communications (iROC) project is described, which directly addresses the combined RF and optical approach.

  7. Stacking transgenes in forest trees.

    PubMed

    Halpin, Claire; Boerjan, Wout

    2003-08-01

    Huge potential exists for improving plant raw materials and foodstuffs via metabolic engineering. To date, progress has mostly been limited to modulating the expression of single genes of well-studied pathways, such as the lignin biosynthetic pathway, in model species. However, a recent report illustrates a new level of sophistication in metabolic engineering by overexpressing one lignin enzyme while simultaneously suppressing the expression of another lignin gene in a tree, aspen. This novel approach to multi-gene manipulation has succeeded in concurrently improving several wood-quality traits.

  8. Advanced Technology Lifecycle Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Mankins, John C.

    2004-01-01

    Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. The integrator also estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.
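
    The system-oriented spreadsheet models described above amount to simple parametric functions plus an integrator that rolls results up over a campaign. The sketch below shows the shape of such a rollup; the model forms and coefficients are invented and are not ATLAS's.

      # Hypothetical parametric system models (placeholder coefficients)
      def launcher_cost(payload_kg, cost_per_kg=10_000.0):
          return payload_kg * cost_per_kg

      def habitat_mass(crew, mass_per_crew_kg=4_000.0):
          return crew * mass_per_crew_kg

      # A two-mission campaign spanning two years
      campaign = [
          {"year": 1, "crew": 0, "cargo_kg": 8_000.0},
          {"year": 2, "crew": 4, "cargo_kg": 2_000.0},
      ]

      # Integrator step: roll launched mass and launch cost up over the campaign
      total_cost = 0.0
      for mission in campaign:
          launched_kg = mission["cargo_kg"] + habitat_mass(mission["crew"])
          total_cost += launcher_cost(launched_kg)
      print(f"Campaign launch cost estimate: ${total_cost:,.0f}")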

  9. The AP1000{sup R} nuclear power plant innovative features for extended station blackout mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vereb, F.; Winters, J.; Schulz, T.

    2012-07-01

    Station Blackout (SBO) is defined as 'a condition wherein a nuclear power plant sustains a loss of all offsite electric power system concurrent with turbine trip and unavailability of all onsite emergency alternating current (AC) power system. Station blackout does not include the loss of available AC power to buses fed by station batteries through inverters or by alternate AC sources as defined in this section, nor does it assume a concurrent single failure or design basis accident...' in accordance with Reference 1. In this paper, the innovative features of the AP1000 plant design are described with their operation in the scenario of an extended station blackout event. General operation of the passive safety systems is described, as well as the unique features that allow the AP1000 plant to cope for at least 7 days during station blackout. Points of emphasis include: - Passive safety system operation during SBO - 'Fail-safe' nature of key passive safety system valves, which automatically places each valve in a conservatively safe alignment even in case of multiple failures in all power supply systems, including normal AC and battery backup - Passive Spent Fuel Pool cooling and makeup water supply during SBO - Robustness of the AP1000 plant due to the location of key systems, structures and components required for safe shutdown - Diverse means of supplying makeup water to the Passive Containment Cooling System (PCS) and the Spent Fuel Pool (SFP) through use of an engineered, safety-related piping interface and portable equipment, as well as with permanently installed onsite ancillary equipment. (authors)

  10. Development of a Prototype Simulation Executive with Zooming in the Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1995-01-01

    A major difficulty in designing aeropropulsion systems is that of identifying and understanding the interactions between the separate engine components and disciplines (e.g., fluid mechanics, structural mechanics, heat transfer, material properties, etc.). The traditional analysis approach is to decompose the system into separate components with the interaction between components being evaluated by the application of each of the single disciplines in a sequential manner. Here, one discipline uses information from the calculation of another discipline to determine the effects of component coupling. This approach, however, may not properly identify the consequences of these effects during the design phase, leaving the interactions to be discovered and evaluated during engine testing. This contributes to the time and cost of developing new propulsion systems as, typically, several design-build-test cycles are needed to fully identify multidisciplinary effects and reach the desired system performance. The alternative to sequential isolated component analysis is to use multidisciplinary coupling at a more fundamental level. This approach has been made more plausible due to recent advancements in computation simulation along with application of concurrent engineering concepts. Computer simulation systems designed to provide an environment which is capable of integrating the various disciplines into a single simulation system have been proposed and are currently being developed. One such system is being developed by the Numerical Propulsion System Simulation (NPSS) project. The NPSS project, being developed at the Interdisciplinary Technology Office at the NASA Lewis Research Center is a 'numerical test cell' designed to provide for comprehensive computational design and analysis of aerospace propulsion systems. It will provide multi-disciplinary analyses on a variety of computational platforms, and a user-interface consisting of expert systems, data base management and visualization tools, to allow the designer to investigate the complex interactions inherent in these systems. An interactive programming software system, known as the Application Visualization System (AVS), was utilized for the development of the propulsion system simulation. The modularity of this system provides the ability to couple propulsion system components, as well as disciplines, and provides for the ability to integrate existing, well established analysis codes into the overall system simulation. This feature allows the user to customize the simulation model by inserting desired analysis codes. The prototypical simulation environment for multidisciplinary analysis, called Turbofan Engine System Simulation (TESS), which incorporates many of the characteristics of the simulation environment proposed herein, is detailed.

  11. RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks

    DTIC Science & Technology

    2016-10-09

    Robotic tasks are becoming increasingly complex, and with this also the robotic systems. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept

  12. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.

  13. Formulation and demonstration of a robust mean variance optimization approach for concurrent airline network and aircraft design

    NASA Astrophysics Data System (ADS)

    Davendralingam, Navindran

    Conceptual design of aircraft and of the airline network (routes) on which aircraft fly is inextricably linked to passenger-driven demand. Many factors influence passenger demand for various Origin-Destination (O-D) city pairs, including demographics, geographic location, seasonality, socio-economic factors and, naturally, the operations of directly competing airlines. The expansion of airline operations involves the identification of appropriate aircraft to meet projected future demand. The decisions made in incorporating and subsequently allocating these new aircraft to serve air travel demand affect the inherent risk and profit potential as predicted through the airline revenue management systems. Competition between airlines then translates to latent passenger observations of the routes served between O-D pairs and ticket pricing; this in effect reflexively drives future states of demand. This thesis addresses the integrated nature of aircraft design, airline operations and passenger demand, in order to maximize future expected profits as new aircraft are brought into service. The goal of this research is to develop an approach that treats aircraft design, airline network design and passenger demand as a unified framework, providing better integrated design solutions that maximize the expected profits of an airline. This is investigated through two approaches. The first is a static model that poses the concurrent engineering paradigm above as an investment portfolio problem. Modern financial portfolio optimization techniques are used to leverage the risk of serving future projected demand using a yet-to-be-introduced aircraft against potentially generated future profits. Robust optimization methodologies are incorporated to mitigate model sensitivity and address estimation risks associated with such optimization techniques. The second extends the portfolio approach to include dynamic effects of an airline's operations. A dynamic programming approach is employed to simulate the reflexive nature of airline supply-demand interactions by modeling the aggregate changes in demand that would result from tactical allocations of aircraft to maximize profit. The best yet-to-be-introduced aircraft maximizes profit by minimizing the long-term fleetwide direct operating costs.
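
    The static portfolio formulation can be illustrated in a few lines: routes play the role of assets, expected route profits the role of returns, and a risk-aversion parameter trades mean profit against the covariance of demand-driven profits. The numbers are hypothetical, and the closed-form unconstrained optimum below stands in for the thesis's robust, constrained formulation.

      import numpy as np

      mu = np.array([1.2, 0.9, 1.5])            # expected profit per aircraft-route
      sigma = np.array([[0.20, 0.02, 0.04],
                        [0.02, 0.10, 0.01],
                        [0.04, 0.01, 0.30]])    # demand-driven profit covariance
      risk_aversion = 2.0

      # Unconstrained mean-variance optimum: maximize mu'w - (a/2) w'Sigma w,
      # whose solution is w = (1/a) inv(Sigma) mu
      weights = np.linalg.solve(risk_aversion * sigma, mu)
      allocation = weights / weights.sum()      # normalize to fleet fractions
      print(f"Fleet fractions by route: {np.round(allocation, 3)}")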

  14. A simulator-based analysis of engineering treatments for right-hook bicycle crashes at signalized intersections.

    PubMed

    Warner, Jennifer; Hurwitz, David S; Monsere, Christopher M; Fleskes, Kayla

    2017-07-01

    A right-hook crash is a crash between a right-turning motor vehicle and an adjacent through-moving bicycle. At signalized intersections, these crashes can occur during any portion of the green interval when conflicting bicycles and vehicles are moving concurrently. The objective of this research was to evaluate the effectiveness of four types of engineering countermeasures - regulatory signage, intersection pavement marking, smaller curb radius, and protected intersection design - at modifying driver behaviors that are known contributing factors in these crashes. This research focused on right-hook crashes that occur during the latter stage of the circular green indication at signalized intersections with a shared right-turn and through lane. Changes in driver performance in response to treatments were measured in a high-fidelity driving simulator. Twenty-eight participants each completed 22 right-turn maneuvers. A partially counterbalanced experimental design exposed drivers to critical scenarios, which had been determined in a previous experiment. For each turn, driver performance measures, including visual attention, crash avoidance, and potential crash severity, were collected. A total of 75 incidents (47 near-collisions and 28 collisions) were observed during the 616 right turns. All treatments had some positive effect on measured driver performance with respect to the right-turn vehicle conflicts. Further work is required to map the magnitude of these changes in driver performance to crash-based outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Evaluation of concurrent priority queue algorithms. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Q.

    1991-02-01

    The priority queue is a fundamental data structure that is used in a large variety of parallel algorithms, such as multiprocessor scheduling and parallel best-first search of state-space graphs. This thesis addresses the design and experimental evaluation of two novel concurrent priority queues: a parallel Fibonacci heap and a concurrent priority pool, and compares them with the concurrent binary heap. The parallel Fibonacci heap is based on the sequential Fibonacci heap, which is theoretically the most efficient data structure for sequential priority queues. This scheme not only preserves the efficient operation time bounds of its sequential counterpart, but also has very low contention by distributing locks over the entire data structure. The experimental results show its linearly scalable throughput and speedup up to as many processors as tested (currently 18). A concurrent access scheme for a doubly linked list is described as part of the implementation of the parallel Fibonacci heap. The concurrent priority pool is based on the concurrent B-tree and the concurrent pool. The concurrent priority pool has the highest throughput among the priority queues studied. Like the parallel Fibonacci heap, the concurrent priority pool scales linearly up to as many processors as tested. The priority queues are evaluated in terms of throughput and speedup. Some applications of concurrent priority queues such as the vertex cover problem and the single source shortest path problem are tested.
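    The baseline against which such designs are measured is a priority queue serialized by one global lock, sketched below in Python (an illustration, not the thesis's implementation). The parallel Fibonacci heap's distributed locking exists precisely to avoid the contention at this single lock.

```python
# Minimal thread-safe priority queue with a single global lock.
# Every thread serializes on the same lock, which is the contention
# bottleneck that distributed-lock designs like the parallel
# Fibonacci heap are built to avoid.
import heapq
import threading

class LockedPriorityQueue:
    def __init__(self):
        self._heap = []
        self._lock = threading.Lock()

    def insert(self, priority, item):
        with self._lock:                  # all inserts serialize here
            heapq.heappush(self._heap, (priority, item))

    def delete_min(self):
        with self._lock:                  # all deletes serialize here too
            return heapq.heappop(self._heap) if self._heap else None

q = LockedPriorityQueue()
q.insert(2, "b"); q.insert(1, "a")
print(q.delete_min())   # -> (1, 'a')
```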

  16. A concurrent distributed system for aircraft tactical decision generation

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1990-01-01

    A research program investigating the use of AI techniques to aid in the development of a tactical decision generator (TDG) for within visual range (WVR) air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of a concurrent version of the computerized logic for air-to-air warfare simulations (CLAWS) program, a second-generation TDG, is presented. Concurrent computing environments and programming approaches are discussed, and the design and performance of a prototype concurrent TDG system (Cube CLAWS) are presented. It is concluded that the Cube CLAWS has provided a useful testbed to evaluate the development of a distributed blackboard system. The project has shown that the complexity of developing specialized software on a distributed, message-passing architecture such as the Hypercube is not overwhelming, and that reasonable speedups and processor efficiency can be achieved by a distributed blackboard system. The project has also highlighted some of the costs of using a distributed approach to designing a blackboard system.

  17. Integrating Design and Manufacturing for a High Speed Civil Transport Wing

    NASA Technical Reports Server (NTRS)

    Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.

    1994-01-01

    The aerospace industry is currently addressing the problem of integrating design and manufacturing. Because of the difficulties associated with using conventional, procedural techniques and algorithms, it is the authors' belief that the only feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors propose a methodology for an aircraft producibility assessment, including a KBS, that addresses both procedural and heuristic aspects of integrating design and manufacturing of a High Speed Civil Transport (HSCT) wing. The HSCT was chosen as the focus of this investigation since it is a current NASA/aerospace industry initiative full of technological challenges involving many disciplines. The paper gives a brief background of selected previous supersonic transport studies followed by descriptions of key relevant design and manufacturing methodologies. Georgia Tech's Concurrent Engineering/Integrated Product and Process Development methodology is discussed with reference to this proposed conceptual producibility assessment. Evaluation criteria are presented that relate pertinent product and process parameters to overall product producibility. In addition, the authors' integration methodology and reasons for selecting a KBS to integrate design and manufacturing are presented in this paper. Finally, a proposed KBS is given, as well as statements of future work and overall investigation objectives.

  18. Solving a mathematical model integrating unequal-area facilities layout and part scheduling in a cellular manufacturing system by a genetic algorithm.

    PubMed

    Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi

    2016-01-01

    In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS) considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due date, material handling time, operation sequence, processing time, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved by the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, two approaches, sequential and concurrent, are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm (GA) is designed. As a consequence, computational results of this study indicate that the best solutions found by the GA are better than the solutions found by branch and bound (B&B) in much less time for both the sequential and concurrent approaches. Moreover, the comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement is on average around 17% by GA and 14% by B&B.
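    The flavor of a permutation-encoded GA like the one the authors design can be sketched as follows (Python; the toy cost function is a purely hypothetical stand-in for the real makespan/tardiness/handling objective):

```python
# Minimal GA sketch for a permutation-encoded layout problem.
import random

def cost(perm):
    # hypothetical stand-in for makespan + tardiness + handling cost
    return sum(abs(machine - slot) for slot, machine in enumerate(perm))

def crossover(a, b):
    # order crossover (OX): keep a slice of parent a, fill the rest from b
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [g for g in b if g not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(p, rate=0.2):
    q = p[:]
    if random.random() < rate:          # occasional swap of two positions
        i, j = random.sample(range(len(q)), 2)
        q[i], q[j] = q[j], q[i]
    return q

def ga(n_machines=8, pop_size=40, generations=200):
    pop = [random.sample(range(n_machines), n_machines) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]   # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=cost)

print(ga())   # near-optimal layout permutation for the toy cost
```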

  19. Hands-On Teaching and Entrepreneurship Development.

    ERIC Educational Resources Information Center

    da Silveira, Marcos Azevedo; da Silva, Mauro Schwanke; Kelber, Christian R.; de Freitas, Manuel R.

    This paper presents the experiment being conducted in the Electric Circuits II course (ELE1103) at PUC-Rio's Electrical Engineering Department since March 1997. This experiment was held in both the fall and the spring semesters of 1997. The basis for the experiment was concurrent teaching methodology, to which the principles of entrepreneurship…

  20. 40 CFR 87.7 - Exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Exemptions. 87.7 Section 87.7... POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES General Provisions § 87.7 Exemptions. (a) Exemptions based on..., with the concurrence of the Administrator, that application of any standard under § 87.21 is not...

  1. Review of Electrocution Deaths in Iraq: Part 1 - Electrocution of Staff Sergeant Ryan D. Maseth, U.S. Army

    DTIC Science & Technology

    2009-07-24

    concurrently. Photographs of LSF-1 from before and after June 2006 are consistent with the believed installation date. A forensic engineering...other services such as refuse collection and disposal, entomology, etc. Starting in November 2003, Washington Group International/Black and Veatch

  2. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  3. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  4. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  5. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  6. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  7. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    PubMed

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions.

  8. Uncovering the Problem-Solving Process: Cued Retrospective Reporting Versus Concurrent and Retrospective Reporting

    ERIC Educational Resources Information Center

    van Gog, Tamara; Paas, Fred; Merrienboer, Jeroen J. G.; Witte, Puk

    2005-01-01

    This study investigated the amounts of problem-solving process information ("action," "why," "how," and "metacognitive") elicited by means of concurrent, retrospective, and cued retrospective reporting. In a within-participants design, 26 participants completed electrical circuit troubleshooting tasks under different reporting conditions. The…

  9. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
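    The blackboard control cycle described in this thesis can be sketched minimally as follows (Python; the knowledge sources and toy problem are hypothetical, and real concurrent blackboard systems add scheduling and per-region locking):

```python
# Minimal blackboard sketch: independent knowledge sources read a shared
# blackboard and post partial results when their preconditions hold.
class Blackboard:
    def __init__(self):
        self.data = {'raw': [3, 1, 2], 'sorted': None, 'report': None}

class SorterKS:
    def can_act(self, bb): return bb.data['sorted'] is None
    def act(self, bb): bb.data['sorted'] = sorted(bb.data['raw'])

class ReporterKS:
    def can_act(self, bb):
        return bb.data['sorted'] is not None and bb.data['report'] is None
    def act(self, bb): bb.data['report'] = f"min={bb.data['sorted'][0]}"

def control_loop(bb, sources):
    # naive control: fire any ready knowledge source until quiescent
    fired = True
    while fired:
        fired = False
        for ks in sources:
            if ks.can_act(bb):
                ks.act(bb)
                fired = True

bb = Blackboard()
control_loop(bb, [ReporterKS(), SorterKS()])   # order does not matter
print(bb.data['report'])                       # -> min=1
```

    The independence of the knowledge sources is what the thesis identifies as the key performance lever: the less they must coordinate, the more of this loop a concurrent implementation can run in parallel.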

  10. Concurrent removal of elemental mercury and SO2 from flue gas using a thiol-impregnated CaCO3-based adsorbent: a full factorial design study.

    PubMed

    Balasundaram, Karthik; Sharma, Mukesh

    2018-06-01

    Mercury (Hg) emitted from coal-based thermal power plants (CTPPs) can accumulate and bio-magnify in the food chain, thereby posing a risk to humans and wildlife. The central idea of this study was to develop an adsorbent which can concurrently remove elemental mercury (Hg⁰) and SO₂ emitted from CTPPs in a single unit operation. Specifically, a composite adsorbent of CaCO₃ impregnated with 2-mercaptobenzimidazole (2-MBI) (referred to as modified calcium carbonate (MCC)) was developed. While 2-MBI, having a sulfur functional group, could selectively adsorb Hg⁰, CaCO₃ could remove SO₂. Performance of the adsorbent was evaluated in terms of (i) removal (%) of Hg⁰ and SO₂, (ii) adsorption mechanism, (iii) adsorption kinetics, and (iv) leaching potential of mercury from the spent adsorbent. The adsorption studies were performed using a 2² full factorial design of experiments with 15 ppbV of Hg⁰ and 600 ppmV of SO₂. Two factors, (i) reaction temperature (80 and 120 °C; the temperature range in flue gas) and (ii) mass of 2-MBI (10 and 15 wt%), were investigated for the removal of Hg⁰ and SO₂ (as %). The maximum Hg⁰ and SO₂ removal was 86% and 93%, respectively. The results of XPS characterization showed that chemisorption is the predominant mechanism of Hg⁰ and SO₂ adsorption on MCC. The Hg⁰ adsorption on MCC followed the Elovich kinetic model, which is also indicative of chemisorption on a heterogeneous surface. Mercury leached from the spent adsorbent in the toxicity characteristic leaching procedure (TCLP) and the synthetic precipitation leaching procedure (SPLP) was within the acceptable levels defined in these tests. The engineering significance of this study is that the 2-MBI-modified CaCO₃-based adsorbent has potential for concurrent removal of Hg⁰ and SO₂ in a single unit operation. With only minor process modifications, the newly developed adsorbent can replace CaCO₃ in the flue-gas desulfurization (FGD) system.
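    For readers unfamiliar with 2² full factorial designs, the effect estimation works as sketched below (Python; the response values are hypothetical placeholders, not the study's data):

```python
# Minimal 2^2 full-factorial effects calculation (hypothetical responses).
# Factors: temperature and 2-MBI loading, at coded levels -1 and +1.
import itertools

# coded levels: -1 = low (80 C, 10 wt%), +1 = high (120 C, 15 wt%)
runs = list(itertools.product([-1, 1], repeat=2))
removal = {(-1, -1): 70.0, (1, -1): 78.0,
           (-1, 1): 80.0, (1, 1): 86.0}   # hypothetical Hg removal (%)

def main_effect(index):
    # average response at the high level minus average at the low level
    hi = [removal[r] for r in runs if r[index] == 1]
    lo = [removal[r] for r in runs if r[index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print("temperature effect:", main_effect(0))    # -> 7.0
print("2-MBI loading effect:", main_effect(1))  # -> 9.0
```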

  11. 77 FR 66864 - Delegation of Concurrent Authority to the Deputy Secretary

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5671-D-01] Delegation of Concurrent Authority to the Deputy Secretary AGENCY: Office of the Secretary, HUD. ACTION: Notice of delegation of... officers and employees of the Department as the Secretary may designate, and may authorize successive...

  12. Developmental problems and their solution for the Space Shuttle main engine alternate liquid oxygen high-pressure turbopump: Anomaly or failure investigation the key

    NASA Astrophysics Data System (ADS)

    Ryan, R.; Gross, L. A.

    1995-05-01

    The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering, multidisciplinary team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly in both money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper will use these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the turbomachinery discipline interactions and sensitivities and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.

  13. Developmental problems and their solution for the Space Shuttle main engine alternate liquid oxygen high-pressure turbopump: Anomaly or failure investigation the key

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Gross, L. A.

    1995-01-01

    The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering, multidisciplinary team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly in both money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper will use these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the turbomachinery discipline interactions and sensitivities and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.

  14. Multifidelity, multidisciplinary optimization of turbomachines with shock interaction

    NASA Astrophysics Data System (ADS)

    Joly, Michael Marie

    Research on high-speed air-breathing propulsion aims at developing aircraft with antipodal range and space access. Before reaching high speed at high altitude, the flight vehicle needs to accelerate from takeoff to scramjet takeover. Air turbo rocket engines combine turbojet and rocket engine cycles to provide the necessary thrust in the so-called low-speed regime. Challenges related to turbomachinery components are multidisciplinary, since both the high compression ratio compressor and the powering high-pressure turbine operate in the transonic regime in compact environments with strong shock interactions. In addition, low weight is vital to avoid hindering the scramjet operation. Recent progress in evolutionary computing provides aerospace engineers with robust and efficient optimization algorithms to address concurrent objectives. The present work investigates Multidisciplinary Design Optimization (MDO) of innovative transonic turbomachinery components. Inter-stage aerodynamic shock interactions in turbomachines are known to generate high-cycle fatigue on the rotor blades, compromising their structural integrity. A soft-computing strategy is proposed to mitigate the vane downstream distortion, and is shown to successfully attenuate the unsteady forcing on the rotor of a high-pressure turbine. Counter-rotation offers promising prospects to reduce the weight of the machine, with fewer stages and increased load per row. An integrated approach based on increasing levels of fidelity and aero-structural coupling is then presented and allows achieving a highly loaded compact counter-rotating compressor.

  15. Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Cavlazoglu, Baki; Stuessy, Carol

    2018-02-01

    The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.

  16. A comparison of forward and concurrent chaining strategies in teaching laundromat skills to students with severe handicaps.

    PubMed

    McDonnell, J; McFarland, S

    1988-01-01

    This study compared the relative efficiency of forward and concurrent chaining strategies in teaching the use of a commercial washing machine and laundry soap dispenser to four high school students with severe handicaps. Acquisition and maintenance of the laundromat skills were assessed through a multielement, alternating-treatments, within-subject design. Results indicated that the concurrent chaining strategy was more efficient than forward chaining in facilitating acquisition of the activities. Four-week and eight-week follow-up probes indicated that concurrent chaining resulted in better maintenance of the activities. The implications of these results for teaching community activities and future research in building complex chains are discussed.

  17. On the operation of machines powered by quantum non-thermal baths

    DOE PAGES

    Niedenzu, Wolfgang; Gelbwaser-Klimovsky, David; Kofman, Abraham G.; ...

    2016-08-02

    Diverse models of engines energised by quantum-coherent, hence non-thermal, baths allow the engine efficiency to transgress the standard thermodynamic Carnot bound. These transgressions call for an elucidation of the underlying mechanisms. Here we show that non-thermal baths may impart not only heat, but also mechanical work to a machine. The Carnot bound is inapplicable to such a hybrid machine. Intriguingly, it may exhibit dual action, concurrently as engine and refrigerator, with up to 100% efficiency. Here, we conclude that even though a machine powered by a quantum bath may exhibit an unconventional performance, it still abides by the traditional principles of thermodynamics.

  18. Concurrent topological design of composite structures and materials containing multiple phases of distinct Poisson's ratios

    NASA Astrophysics Data System (ADS)

    Long, Kai; Yuan, Philip F.; Xu, Shanqing; Xie, Yi Min

    2018-04-01

    Most studies on composites assume that the constituent phases have different values of stiffness. Little attention has been paid to the effect of constituent phases having distinct Poisson's ratios. This research focuses on a concurrent optimization method for simultaneously designing composite structures and materials with distinct Poisson's ratios. The proposed method aims to minimize the mean compliance of the macrostructure with a given mass of base materials. In contrast to the traditional interpolation of the stiffness matrix through numerical results, an interpolation scheme of Young's modulus and Poisson's ratio using different parameters is adopted. The numerical results demonstrate that the Poisson effect plays a key role in reducing the mean compliance of the final design. An important contribution of the present study is that the proposed concurrent optimization method can automatically distribute base materials with distinct Poisson's ratios between the macrostructural and microstructural levels under a single constraint of the total mass.
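    The separate-interpolation idea can be sketched as follows (Python; the SIMP-like form, exponents, and material bounds are illustrative assumptions, not necessarily the paper's exact scheme):

```python
# Minimal sketch of interpolating Young's modulus and Poisson's ratio
# separately, each with its own penalization parameter, as a function of
# a per-cell design variable x. All values are illustrative assumptions.
def interpolate_material(x, E1=1.0, E2=0.2, nu1=0.45, nu2=0.05,
                         p_E=3.0, p_nu=1.0):
    """x in [0, 1]: volume fraction of phase 1 in a design cell."""
    E = E2 + (x ** p_E) * (E1 - E2)        # penalized stiffness interpolation
    nu = nu2 + (x ** p_nu) * (nu1 - nu2)   # separate Poisson's-ratio interpolation
    return E, nu

for x in (0.0, 0.5, 1.0):
    print(x, interpolate_material(x))
```

    Interpolating the two properties with different exponents is what lets the optimizer exploit the Poisson effect independently of stiffness, rather than folding both into a single stiffness-matrix interpolation.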

  19. 33 CFR 385.5 - Guidance memoranda.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Secretary of the Army. The Corps of Engineers and the South Florida Water Management District shall also... achieving the goals and purposes of the Plan. (2) The Secretary of the Army shall afford the public an... concurrence of the Secretary of the Interior and the Governor. Within 180 days after being provided with the...

  20. Materials and Process Activities for NASA's Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Polis, Daniel L.

    2012-01-01

    In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). The overall goal of the CCM project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. The materials and process activities were prioritized based on a rapid prototype approach. This approach focused developmental activities on design details with greater risk and uncertainty, such as out-of-autoclave joining, over some of the more traditional lamina and laminate building block levels. While process development and associated building block testing were performed, several anomalies were still observed at the full-scale level due to interactions between process robustness and manufacturing scale-up. This paper describes the process anomalies that were encountered during the CCM development and the subsequent root cause investigations that led to the final design solutions. These investigations highlight the importance of full-scale developmental work early in the schedule of a complex composite design/build project.

  1. Reconciling Pairs of Concurrently Used Clinical Practice Guidelines Using Constraint Logic Programming

    PubMed Central

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack. PMID:22195153
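    At its core, a point of contention is an action one guideline requires and another forbids, which can be sketched as a set-intersection check (Python; the drug names and rules below are hypothetical, and the paper's CLP formulation is far richer, covering full guideline action graphs):

```python
# Minimal sketch of detecting a "point of contention" between two
# concurrently applied guidelines, each modeled as required and
# forbidden actions. Names and rules are hypothetical.
guideline_ulcer = {'prescribe': {'clopidogrel_stop'},   # bleeding risk (hypothetical)
                   'forbid': set()}
guideline_tia   = {'prescribe': {'clopidogrel'},        # antiplatelet need (hypothetical)
                   'forbid': {'clopidogrel_stop'}}

def points_of_contention(g1, g2):
    # actions one guideline requires and the other forbids
    return ((g1['prescribe'] & g2['forbid']) |
            (g2['prescribe'] & g1['forbid']))

print(points_of_contention(guideline_ulcer, guideline_tia))
# -> {'clopidogrel_stop'}: alert the physician and revise one CPG
```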

  2. Construct and Concurrent Validity of a Prototype Questionnaire to Survey Public Attitudes toward Stuttering

    ERIC Educational Resources Information Center

    St. Louis, Kenneth O.; Reichel, Isabella K.; Yaruss, J. Scott; Lubker, Bobbie Boyd

    2009-01-01

    Purpose: Construct validity and concurrent validity were investigated in a prototype survey instrument, the "Public Opinion Survey of Human Attributes-Experimental Edition" (POSHA-E). The POSHA-E was designed to measure public attitudes toward stuttering within the context of eight other attributes, or "anchors," assumed to range from negative…

  3. Wireless Computing Architecture III

    DTIC Science & Technology

    2013-09-01

    MIMO Multiple-Input and Multiple-Output; MIMO/CON MIMO with concurrent channel access and estimation; MU-MIMO Multiuser MIMO; OFDM Orthogonal...compressive sensing; a design for concurrent channel estimation in scalable multiuser MIMO networking; and novel networking protocols based on machine...Network, Antenna Arrays, UAV networking, Angle of Arrival, Localization, MIMO, Access Point, Channel State Information, Compressive Sensing

  4. An Adaptive Tradeoff Algorithm for Multi-issue SLA Negotiation

    NASA Astrophysics Data System (ADS)

    Son, Seokho; Sim, Kwang Mong

    Since participants in a Cloud may be independent bodies, mechanisms are necessary for resolving different preferences in leasing Cloud services. Whereas there are currently mechanisms that support service-level agreement negotiation, there is little or no negotiation support for concurrent price and timeslot negotiation for Cloud service reservations. For concurrent price and timeslot negotiation, a tradeoff algorithm that generates and evaluates proposals consisting of a price and a timeslot is necessary. The contribution of this work is thus to design an adaptive tradeoff algorithm for a multi-issue negotiation mechanism. The tradeoff algorithm, referred to as "adaptive burst mode", is especially designed to increase negotiation speed and total utility and to reduce computational load by adaptively generating concurrent sets of proposals. The empirical results obtained from simulations carried out using a testbed suggest that with the concurrent price and timeslot negotiation mechanism and the adaptive tradeoff algorithm: 1) both agents achieve the best performance in terms of negotiation speed and utility; and 2) the number of evaluations of each proposal is comparatively lower than in the previous scheme (burst-N).
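    Burst-mode generation of a concurrent, roughly iso-utility set of (price, timeslot) proposals can be sketched as follows (Python; the linear utility, its weights, and the scales are illustrative assumptions, not the paper's exact model):

```python
# Minimal sketch of burst-mode proposal generation: at a target utility
# level, emit a concurrent set of proposals trading price vs. timeslot.
def utility(price, slot, w_price=0.6, w_slot=0.4,
            price_max=100.0, slot_max=24):
    # higher utility for lower price and earlier timeslot (consumer view)
    return w_price * (1 - price / price_max) + w_slot * (1 - slot / slot_max)

def burst_proposals(target_u, n=4, price_max=100.0, slot_max=24):
    # sample n timeslots, then solve for the price holding utility ~ target
    # (weights 0.6/0.4 below are fixed to match utility()'s defaults)
    proposals = []
    for k in range(n):
        slot = round(k * slot_max / (n - 1))
        u_slot = 0.4 * (1 - slot / slot_max)
        price = max(0.0, (1 - (target_u - u_slot) / 0.6) * price_max)
        proposals.append((round(price, 2), slot))
    return proposals

print(burst_proposals(0.7))
# -> [(50.0, 0), (27.78, 8), (5.56, 16), (0.0, 24)]: one concurrent burst
```

    Offering the whole burst at once, rather than one proposal per round, is what lets both agents locate mutually acceptable (price, timeslot) pairs in fewer rounds.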

  5. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease migration to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  6. An optimization design proposal of automated guided vehicles for mixed type transportation in hospital environments

    PubMed Central

    González, Domingo; Espinosa, María del Mar; Domínguez, Manuel

    2017-01-01

    Aim: The aim of this paper is to present an optimization proposal in the design of the automated guided vehicles used in hospital logistics, as well as to analyze the impact of its implementation in a real environment. Method: This proposal is based on the design of those elements that would allow the vehicles to deliver an extra cart by the towing method. The proposal's intention is thus to improve the productivity and the performance of the current vehicles by using a transportation method of combined carts. Results: The study has been developed following concurrent engineering premises from three different viewpoints. First, the sequence of operations has been described, and second, a proposal of the design of the equipment has been undertaken. Finally, the impact of the proposal has been analyzed according to real data from the Hospital Universitario Rio Hortega in Valladolid (Spain). In this particular case, implementing the analyzed proposal in the hospital can achieve a reduction of over 35% of the current time of use. This result may allow adding new tasks to the vehicles, and accordingly, both a new kind of vehicle and a specific module can be developed in order to get a better performance. PMID:28562681

  7. A novel approach to quality improvement in a safety-net practice: concurrent peer review visits.

    PubMed

    Fiscella, Kevin; Volpe, Ellen; Winters, Paul; Brown, Melissa; Idris, Amna; Harren, Tricia

    2010-12-01

    Concurrent peer review visits are structured office visits conducted by clinician peers of the primary care clinician that are specifically designed to reduce competing demands, clinical inertia, and bias. We assessed whether a single concurrent peer review visit reduced clinical inertia and improved control of hypertension, hyperlipidemia, and diabetes among underserved patients. We conducted a randomized encouragement trial to evaluate concurrent peer review visits within a community health center. Seven hundred twenty-seven patients with hypertension, hyperlipidemia, and/or diabetes who were not at goal for systolic blood pressure (SBP), low-density lipoprotein cholesterol (LDL-C), and/or glycated hemoglobin (A1c) were randomly assigned to an invitation to participate in a concurrent peer review visit or to usual care. We compared change in these measures using mixed models and rates of therapeutic intensification during concurrent peer review visits with control visits. One hundred seventy-one patients completed a concurrent peer review visit. SBP improved significantly (p < .01) more among those completing concurrent peer review visits than among those who failed to respond to a concurrent peer review invitation or those randomized to usual care. There were no differences seen for changes in LDL-C or A1c. Concurrent peer review visits were associated with statistically significantly greater clinician intensification of blood pressure (p < .001), lipid (p < .001), and diabetes (p < .005) treatment than control visits for patients in either the nonresponse group or the usual care group. Concurrent peer review visits represent a promising strategy for improving blood pressure control and improving therapeutic intensification in community health centers.

  8. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5000 statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000 statement Ada system and a personal computer based system that will be written in Modula-2. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life-cycle, such as specification and testing, is being studied concurrently.

  9. Ares I-X Roll Control System Development

    NASA Technical Reports Server (NTRS)

    Unger, Ronald J.; Massey, Edmund C.

    2009-01-01

    Project Managers often face challenging technical, schedule and budget issues. This presentation will explore how the Ares I-X Roll Control System Integrated Product Team (IPT) mitigated challenges such as concurrent engineering requirements and environments and evolving program processes, while successfully managing an aggressive project schedule and tight budget. IPT challenges also included communications and negotiations among inter- and intra-government agencies, including the US Air Force, NASA/MSFC Propulsion Engineering, LaRC, GRC, KSC, WSTF, and the Constellation Program. In order to successfully meet these challenges it was essential that the IPT define those items that most affected the schedule critical path, define early mitigation strategies to reduce technical, schedule, and budget risks, and maintain the end-product focus of an "unmanned test flight" context for the flight hardware. The makeup of the IPT and how it would function were also important considerations. The IPT consisted of NASA/MSFC (project management, engineering, and safety/quality) and contractors (Teledyne Brown Engineering and Pratt and Whitney Rocketdyne, who supplied heritage hardware experience). The early decision to have a small focused IPT working "badgelessly" across functional lines to eliminate functional stove-piping allowed for many more tasks to be done by fewer people. It also enhanced a sense of ownership of the products, while still being able to revert back to traditional roles in order to provide the required technical independence in design reviews and verification closures. This presentation will highlight several prominent issues and discuss how they were mitigated and the resulting Lessons Learned that might benefit other projects.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael; Lethin, Richard

    Programming models and environments play the essential roles in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability since our codes have lifespans measured in decades, so the advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  11. Methodologies and systems for heterogeneous concurrent computing

    NASA Technical Reports Server (NTRS)

    Sunderam, V. S.

    1994-01-01

    Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
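    The task-farm pattern at the heart of such systems can be sketched as follows (Python's multiprocessing standing in, purely for illustration, for PVM-style message passing across heterogeneous networked hosts):

```python
# Minimal master-worker "task farm" sketch in the spirit of PVM-style
# concurrent computing. multiprocessing on one machine is a stand-in
# assumption; PVM itself distributes tasks across heterogeneous hosts.
from multiprocessing import Pool

def work(task):
    # stand-in for a compute kernel shipped to a worker host
    return task * task

if __name__ == "__main__":
    tasks = range(10)
    with Pool(processes=4) as pool:        # four "hosts"
        results = pool.map(work, tasks)    # scatter tasks, gather results
    print(results)
```

    As the abstract notes, on a real heterogeneous network the hard parts are exactly what this sketch hides: partitioning work across hosts of unequal speed, and scheduling around variable network load.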

  12. Diffusion phenomena of cells and biomolecules in microfluidic devices.

    PubMed

    Yildiz-Ozturk, Ece; Yesil-Celiktas, Ozlem

    2015-09-01

    Biomicrofluidics is an emerging field at the cross roads of microfluidics and life sciences which requires intensive research efforts in terms of introducing appropriate designs, production techniques, and analysis. The ultimate goal is to deliver innovative and cost-effective microfluidic devices to biotech, biomedical, and pharmaceutical industries. Therefore, creating an in-depth understanding of the transport phenomena of cells and biomolecules becomes vital and concurrently poses significant challenges. The present article outlines the recent advancements in diffusion phenomena of cells and biomolecules by highlighting transport principles from an engineering perspective, cell responses in microfluidic devices with emphases on diffusion- and flow-based microfluidic gradient platforms, macroscopic and microscopic approaches for investigating the diffusion phenomena of biomolecules, microfluidic platforms for the delivery of these molecules, as well as the state of the art in biological applications of mammalian cell responses and diffusion of biomolecules.
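    The governing transport principle, Fickian diffusion, can be sketched numerically as follows (Python; the channel length, diffusivity, and boundary conditions are illustrative values, not from the review):

```python
# Minimal explicit finite-difference sketch of 1D Fickian diffusion,
# dC/dt = D * d2C/dx2, the principle behind diffusion-based microfluidic
# gradient generators. All parameter values are illustrative.
import numpy as np

D = 1e-9              # diffusivity, m^2/s (small-molecule scale)
L, n = 1e-3, 101      # 1 mm channel, grid points
dx = L / (n - 1)
dt = 0.4 * dx**2 / D  # stable explicit step (requires dt <= dx^2 / (2D))

C = np.zeros(n)
C[0] = 1.0            # source reservoir at unit concentration

for _ in range(20000):
    C[1:-1] += D * dt / dx**2 * (C[2:] - 2 * C[1:-1] + C[:-2])
    C[0], C[-1] = 1.0, 0.0   # fixed-concentration boundaries (source/sink)

print(C[::20])        # coarse concentration profile along the channel
```

    At steady state the profile between fixed-concentration boundaries becomes linear, which is precisely the stable gradient that diffusion-based platforms expose to cells.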

  13. Diffusion phenomena of cells and biomolecules in microfluidic devices

    PubMed Central

    Yildiz-Ozturk, Ece; Yesil-Celiktas, Ozlem

    2015-01-01

    Biomicrofluidics is an emerging field at the cross roads of microfluidics and life sciences which requires intensive research efforts in terms of introducing appropriate designs, production techniques, and analysis. The ultimate goal is to deliver innovative and cost-effective microfluidic devices to biotech, biomedical, and pharmaceutical industries. Therefore, creating an in-depth understanding of the transport phenomena of cells and biomolecules becomes vital and concurrently poses significant challenges. The present article outlines the recent advancements in diffusion phenomena of cells and biomolecules by highlighting transport principles from an engineering perspective, cell responses in microfluidic devices with emphases on diffusion- and flow-based microfluidic gradient platforms, macroscopic and microscopic approaches for investigating the diffusion phenomena of biomolecules, microfluidic platforms for the delivery of these molecules, as well as the state of the art in biological applications of mammalian cell responses and diffusion of biomolecules. PMID:26180576

  14. Development of a forestry government agency enterprise GIS system: a disconnected editing approach

    NASA Astrophysics Data System (ADS)

    Zhu, Jin; Barber, Brad L.

    2008-10-01

    The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This Enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters since most field offices are operating with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically up-loaded into a central master geodatabase stored in a Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work off-line when editing data and requires connecting to the central geodatabase only when needed.
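    The disconnected-editing check-out/edit/sync cycle can be sketched as follows (Python; the dict-based store and last-write-wins merge are illustrative stand-ins for the personal geodatabase and SDE check-in, not the TFS implementation):

```python
# Minimal sketch of disconnected editing: field offices edit a local
# copy offline and periodically push changes to a central store.
central = {}   # feature_id -> (version, attributes)

def checkout(feature_ids):
    # copy requested features into a local, offline-editable store
    return {fid: central[fid] for fid in feature_ids if fid in central}

def edit(local, fid, attrs):
    version, _ = local.get(fid, (0, {}))
    local[fid] = (version + 1, attrs)     # bump version on each local edit

def sync(local):
    # upload: accept a local feature only if newer than central's copy
    for fid, (version, attrs) in local.items():
        if version > central.get(fid, (0, {}))[0]:
            central[fid] = (version, attrs)

central['tree_17'] = (1, {'status': 'healthy'})
local = checkout(['tree_17'])
edit(local, 'tree_17', {'status': 'oak wilt suspected'})
sync(local)
print(central['tree_17'])   # -> (2, {'status': 'oak wilt suspected'})
```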

  15. Improved Signal Processing Technique Leads to More Robust Self Diagnostic Accelerometer System

    NASA Technical Reports Server (NTRS)

    Tokars, Roger; Lekki, John; Jaros, Dave; Riggs, Terrence; Evans, Kenneth P.

    2010-01-01

    The self diagnostic accelerometer (SDA) is a sensor system designed to actively monitor the health of an accelerometer. In this case an accelerometer is considered healthy if it can be determined that it is operating correctly and its measurements may be relied upon. The SDA system accomplishes this by actively monitoring the accelerometer for a variety of failure conditions including accelerometer structural damage, an electrical open circuit, and most importantly accelerometer detachment. In recent testing of the SDA system in emulated engine operating conditions it has been found that a more robust signal processing technique was necessary. An improved accelerometer diagnostic technique and test results of the SDA system utilizing this technique are presented here. Furthermore, the real time, autonomous capability of the SDA system to concurrently compensate for effects from real operating conditions such as temperature changes and mechanical noise, while monitoring the condition of the accelerometer health and attachment, will be demonstrated.

  16. Optimization of Car Body under Constraints of Noise, Vibration, and Harshness (NVH), and Crash

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yang, Ren-Jye; Sobieszczanski-Sobieski, Jaroslaw (Editor)

    2000-01-01

    To be competitive in today's market, cars have to be as light as possible while meeting the Noise, Vibration, and Harshness (NVH) requirements and conforming to Government-mandated crash survival regulations. The latter are difficult to meet because they involve very compute-intensive, nonlinear analysis: e.g., the code RADIOSS, capable of simulating the dynamics and the geometrical and material nonlinearities of a thin-walled car structure in crash, would require over 12 days of elapsed time for a single design of a 390K elastic-degrees-of-freedom model if executed on a single processor of the state-of-the-art SGI Origin 2000 computer. Of course, in optimization that crash analysis would have to be invoked many times. Needless to say, that has rendered such optimization intractable until now. The advent of computers that comprise large numbers of concurrently operating processors has created a new environment wherein the above optimization, and other engineering problems heretofore regarded as intractable, may be solved. The procedure is a piecewise-approximation-based method and involves using a sensitivity-based Taylor series approximation model for NVH and a polynomial response surface model for crash. In that method the NVH constraints are evaluated using a finite element code (MSC/NASTRAN) that yields the constraint values and their derivatives with respect to design variables. The crash constraints are evaluated using the explicit code RADIOSS on the Origin 2000 operating on 256 processors simultaneously to generate data for a polynomial response surface in the design variable domain. The NVH constraints and their derivatives combined with the response surface for the crash constraints form an approximation to the system analysis (surrogate analysis) that enables a cycle of multidisciplinary optimization within move limits. In the inner loop, the NVH sensitivities are recomputed to update the NVH approximation model while keeping the crash response surface constant. In every outer loop, the crash response surface approximation is updated, including a gradual increase in the order of the response surface and the response surface extension in the direction of the search. In this optimization task, the NVH discipline has 30 design variables while the crash discipline has 20 design variables; a subset of these design variables (10) is common to both disciplines. In order to construct a linear response surface for the crash discipline constraints, a minimum of 21 design points would have to be analyzed using the RADIOSS code. On a single processor of the Origin 2000, that amount of computing would require over 9 months. In this work, these runs were carried out concurrently on the Origin 2000 using multiple processors, ranging from 8 to 16, for each crash (RADIOSS) analysis. Wall-time measurements for a single RADIOSS analysis on varying numbers of processors were collected, along with a comparison of two common data placement procedures within the allotted memories for each analysis. The initial design is an infeasible design with NVH-discipline static torsion constraint violations of over 10%. The final optimized design is a feasible design with a weight reduction of 15 kg compared to the initial design.
This work demonstrates how advanced methodology for optimization combined with the technology of concurrent processing enables applications that until now were out of reach because of very long time-to-solution.
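    The surrogate-based cycle described above, fitting a response surface to a handful of expensive crash runs and optimizing it together with a cheap NVH model inside move limits, can be sketched as follows (Python; the toy metrics are hypothetical stand-ins for the RADIOSS and MSC/NASTRAN analyses):

```python
# Minimal surrogate-based optimization sketch: fit a linear response
# surface to a few expensive "crash" evaluations, then optimize it plus
# a cheap "NVH" model within move limits around the current design.
import numpy as np

rng = np.random.default_rng(0)

def crash_metric(x):
    # stand-in for an expensive RADIOSS crash response (hypothetical)
    return 5.0 + 2.0 * x[0] - 1.5 * x[1] + 0.1 * rng.normal()

def nvh_metric(x):
    # stand-in for a cheap MSC/NASTRAN NVH response (hypothetical)
    return float(x @ x)

# 1) sample the crash response and fit a linear response surface
X = rng.uniform(-1.0, 1.0, size=(8, 2))
y = np.array([crash_metric(x) for x in X])
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # crash ~ c0 + c1*x1 + c2*x2

# 2) minimize surrogate (NVH + crash surface) within move limits around x0
x0, move = np.zeros(2), 0.3
best_x, best_f = x0, np.inf
for _ in range(2000):
    x = x0 + rng.uniform(-move, move, size=2)  # candidate inside move limits
    f = nvh_metric(x) + coef[0] + coef[1:] @ x
    if f < best_f:
        best_x, best_f = x, f
print("best design in trust region:", best_x.round(3))
```

    In the paper's full cycle, step 1 is refreshed in every outer loop (with a richer polynomial) and step 2 alternates with NVH sensitivity updates; the sketch shows only a single pass.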

  17. Research Needs and Impacts in Predictive Simulation for Internal Combustion Engines (PreSICE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckerle, Wayne; Rutland, Chris; Rohlfing, Eric

    This report is based on a SC/EERE Workshop to Identify Research Needs and Impacts in Predictive Simulation for Internal Combustion Engines (PreSICE), held March 3, 2011, to determine strategic focus areas that will accelerate innovation in engine design to meet national goals in transportation efficiency. The U.S. has reached a pivotal moment when pressures of energy security, climate change, and economic competitiveness converge. Oil prices remain volatile and have exceeded $100 per barrel twice in five years. At these prices, the U.S. spends $1 billion per day on imported oil to meet our energy demands. Because the transportation sector accounts for two-thirds of our petroleum use, energy security is deeply entangled with our transportation needs. At the same time, transportation produces one-quarter of the nation’s carbon dioxide output. Increasing the efficiency of internal combustion engines is a technologically proven and cost-effective approach to dramatically improving the fuel economy of the nation’s fleet of vehicles in the near- to mid-term, with the corresponding benefits of reducing our dependence on foreign oil and reducing carbon emissions. Because of their relatively low cost, high performance, and ability to utilize renewable fuels, internal combustion engines—including those in hybrid vehicles—will continue to be critical to our transportation infrastructure for decades. Achievable advances in engine technology can improve the fuel economy of automobiles by over 50% and trucks by over 30%. Achieving these goals will require the transportation sector to compress its product development cycle for cleaner, more efficient engine technologies by 50% while simultaneously exploring innovative design space. Concurrently, fuels will also be evolving, adding another layer of complexity and further highlighting the need for efficient product development cycles. Current design processes, using “build and test” prototype engineering, will not suffice. Current market penetration of new engine technologies is simply too slow—it must be dramatically accelerated. These challenges present a unique opportunity to marshal U.S. leadership in science-based simulation to develop predictive computational design tools for use by the transportation industry. The use of predictive simulation tools for enhancing combustion engine performance will shrink engine development timescales, accelerate time to market, and reduce development costs, while ensuring the timely achievement of energy security and emissions targets and enhancing U.S. industrial competitiveness. In 2007 Cummins achieved a milestone in engine design by bringing a diesel engine to market solely with computer modeling and analysis tools. The only testing was after the fact to confirm performance. Cummins achieved a reduction in development time and cost. As important, they realized a more robust design, improved fuel economy, and met all environmental and customer constraints. This important first step demonstrates the potential for computational engine design. But the daunting complexity of engine combustion and the revolutionary increases in efficiency needed require the development of simulation codes and computation platforms far more advanced than those available today. Based on these needs, a Workshop to Identify Research Needs and Impacts in Predictive Simulation for Internal Combustion Engines (PreSICE) convened over 60 U.S. leaders in the engine combustion field from industry, academia, and national laboratories to focus on two critical areas of advanced simulation, as identified by the U.S. automotive and engine industries. First, modern engines require precise control of the injection of a broad variety of fuels that is far more subtle than achievable to date and that can be obtained only through predictive modeling and simulation. Second, the simulation, understanding, and control of these stochastic in-cylinder combustion processes lie on the critical path to realizing more efficient engines with greater power density. Fuel sprays set the initial conditions for combustion in essentially all future transportation engines; yet today designers primarily use empirical methods that limit the efficiency achievable. Three primary spray topics were identified as focus areas in the workshop: (1) the fuel delivery system, which includes fuel manifolds and internal injector flow; (2) the multi-phase fuel–air mixing in the combustion chamber of the engine; and (3) the heat transfer and fluid interactions with cylinder walls. Current understanding and modeling capability of stochastic processes in engines remains limited and prevents designers from achieving significantly higher fuel economy. To improve this situation, the workshop participants identified three focus areas for stochastic processes: (1) improve fundamental understanding that will help to establish and characterize the physical causes of stochastic events; (2) develop physics-based simulation models that are accurate and sensitive enough to capture performance-limiting variability; and (3) quantify and manage uncertainty in model parameters and boundary conditions. Improved models and understanding in these areas will allow designers to develop engines with reduced design margins and that operate reliably in more efficient regimes. All of these areas require improved basic understanding, high-fidelity model development, and rigorous model validation. These advances will greatly reduce the uncertainties in current models and improve understanding of sprays and fuel–air mixture preparation that limit the investigation and development of advanced combustion technologies. The two strategic focus areas have distinctive characteristics but are inherently coupled. Coordinated activities in basic experiments, fundamental simulations, and engineering-level model development and validation can be used to successfully address all of the topics identified in the PreSICE workshop. The outcome will be: (1) new and deeper understanding of the relevant fundamental physical and chemical processes in advanced combustion technologies; (2) implementation of this understanding into models and simulation tools appropriate for both exploration and design; and (3) sufficient validation with uncertainty quantification to provide confidence in the simulation results. These outcomes will provide the design tools for industry to reduce development time by up to 30% and improve engine efficiencies by 30% to 50%. The improved efficiencies applied to the national mix of transportation applications have the potential to save over 5 million barrels of oil per day, a current cost savings of $500 million per day.

  18. Predicting the language proficiency of Chinese student pilots within American airspace: Single-task versus dual-task English-language assessment

    NASA Astrophysics Data System (ADS)

    Noble, Clifford Elliott, II

    2002-09-01

    The problem. The purpose of this study was to investigate the ability of three single-task instruments and three dual-task instruments to predict the language performance of 10 Chinese student pilots speaking English as a second language when operating single-engine and multiengine aircraft within American airspace. The single-task instruments were (a) the Test of English as a Foreign Language, (b) the Aviation Test of Spoken English, and (c) the Single Manual-Tracking Test; the dual-task instruments were (a) the Concurrent Manual-Tracking and Communication Test, (b) the Certified Flight Instructor's Test, and (c) the Simulation-Based English Test. Method. This research implemented a correlational design to investigate the ability of the six instruments to predict the mean score of the criterion evaluation, the Examiner's Test, which assessed the oral communication skill of student pilots on the flight portion of the terminal checkride in the Piper Cadet, Piper Seminole, and Beechcraft King Air airplanes. Results. Data from the Single Manual-Tracking Test and the Concurrent Manual-Tracking and Communication Test were discarded due to performance ceiling effects. Hypothesis 1, which stated that the mean scores of the dual-task evaluations would predict the mean score of the Examiner's Test more accurately than those of the single-task evaluations, was not supported. Hypothesis 2, which stated that the mean scores on the Simulation-Based English Test would predict the mean score of the Examiner's Test more accurately than those of all other single- and dual-task evaluations, was also not supported. The findings suggest that single- and dual-task assessments administered after initial flight training are equivalent predictors of language performance when piloting single-engine and multiengine aircraft.

  19. The James Webb Telescope Instrument Suite Layout: Optical System Engineering Considerations for a Large, Deployable Space Telescope

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Davila, Pam; Jurotich, Matthew; Hobbs, Gurnie; Lightsey, Paul; Contreras, Jim; Whitman, Tony

    2003-01-01

    The James Webb Space Telescope (JWST) is a space-based, infrared observatory designed to study the early stages of galaxy formation in the Universe. The telescope will be launched into an elliptical orbit about the second Lagrange point and passively cooled to 30-50 K to enable astronomical observations from 0.6 to 28 microns. A group from the NASA Goddard Space Flight Center and the Northrop Grumman Space Technology prime contractor team has developed an optical and mechanical layout for the science instruments within the JWST field of view that satisfies the telescope's high-level performance requirements. Four instruments required accommodation within the telescope's field of view: a Near-Infrared Camera (NIRCam) provided by the University of Arizona; a Near-Infrared Spectrograph (NIRSpec) provided by the European Space Agency; a Mid-Infrared Instrument (MIRI) provided by the Jet Propulsion Laboratory and a European consortium; and a Fine Guidance Sensor (FGS) with a tunable filter module provided by the Canadian Space Agency. The size and position of each instrument's field of view allocation were developed through an iterative, concurrent engineering process involving the key observatory stakeholders. While some of the system design considerations were those typically encountered during the development of an infrared observatory, others were unique to the deployable and controllable nature of JWST. This paper describes the optical and mechanical issues considered during the field of view layout development, as well as the supporting modeling and analysis activities.

  20. The Teenage Nonviolence Test: Concurrent and Discriminant Validity.

    ERIC Educational Resources Information Center

    Konen, Kristopher; Mayton, Daniel M., II; Delva, Zenita; Sonnen, Melinda; Dahl, William; Montgomery, Richard

    This study was designed to document the validity of the Teenage Nonviolence Test (TNT). It assessed the concurrent validity of the TNT in various ways, its validity using known groups, and its discriminant validity by evaluating its relationships with other psychological constructs. The results showed that the…

  1. Concurrent display of both α- and β-turns in a model peptide.

    PubMed

    Srinivas, Deekonda; Vijayadas, Kuruppanthara N; Gonnade, Rajesh; Phalgune, Usha D; Rajamohanan, Pattuparambil R; Sanjayan, Gangadhar J

    2011-08-21

    This article describes a model peptide that concurrently displays both α- and β-turns, as demonstrated by structural investigations using single crystal X-ray crystallography and solution-state NMR studies. The motif reported herein has the potential for the design of novel conformationally ordered synthetic oligomers with structural architectures distinct from those classically observed.

  2. A Mixed Methods Approach to Understanding School Counseling Program Evaluation: High School Counselors' Methods and Perceptions

    ERIC Educational Resources Information Center

    Aucoin, Jennifer Mangrum

    2013-01-01

    The purpose of this mixed methods concurrent triangulation study was to examine the program evaluation practices of high school counselors. A total of 294 high school counselors in Texas were assessed using a mixed methods concurrent triangulation design. A researcher-developed survey, the School Counseling Program Evaluation Questionnaire…

  3. Proposed variations of the stepped-wedge design can be used to accommodate multiple interventions

    PubMed Central

    Lyons, Vivian H; Li, Lingyu; Hughes, James P; Rowhani-Rahbar, Ali

    2018-01-01

    Objective: Stepped wedge design (SWD) cluster randomized trials have traditionally been used for evaluating a single intervention. We aimed to explore design variants suitable for evaluating multiple interventions in a SWD trial. Study Design and Setting: We identified four specific variants of the traditional SWD that would allow two interventions to be conducted within a single cluster randomized trial: Concurrent, Replacement, Supplementation and Factorial SWDs. These variants were chosen to flexibly accommodate study characteristics that limit a one-size-fits-all approach for multiple interventions. Results: In the Concurrent SWD, each cluster receives only one intervention, unlike the other variants. The Replacement SWD supports two interventions that will not or cannot be employed at the same time. The Supplementation SWD is appropriate when the second intervention requires the presence of the first intervention, and the Factorial SWD supports the evaluation of intervention interactions. The precision for estimating intervention effects varies across the four variants. Conclusion: Selection of the appropriate design variant should be driven by the research question while considering the trade-off between the number of steps, number of clusters, restrictions for concurrent implementation of the interventions, lingering effects of each intervention, and precision of the intervention effect estimates. PMID:28412466
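
    As a rough sketch of how such a variant can be laid out, the code below builds a cluster-by-step schedule matrix for the concurrent variant: clusters are split between the two interventions, and each cluster crosses over from control (0) to its assigned intervention (1 or 2) at its step. The encoding and the specific assignment rule are illustrative assumptions, not the authors' design.

      def concurrent_swd(n_clusters_per_arm, n_steps):
          """Schedule matrix: rows are clusters, columns are time steps."""
          schedule = []
          for intervention in (1, 2):
              for cluster in range(n_clusters_per_arm):
                  # Step at which this cluster switches from control (assumed rule)
                  crossover = cluster % (n_steps - 1) + 1
                  schedule.append([intervention if step >= crossover else 0
                                   for step in range(n_steps)])
          return schedule

      for row in concurrent_swd(n_clusters_per_arm=3, n_steps=4):
          print(row)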

  4. A Training Tool and Methodology to Allow Concurrent Multidisciplinary Experimental Projects in Engineering Education

    ERIC Educational Resources Information Center

    Maseda, F. J.; Martija, I.; Martija, I.

    2012-01-01

    This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE[subscript TT]), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…

  5. Concurrent topology optimization for minimization of total mass considering load-carrying capabilities and thermal insulation simultaneously

    NASA Astrophysics Data System (ADS)

    Long, Kai; Wang, Xuan; Gu, Xianguang

    2017-09-01

    The present work introduces a novel concurrent optimization formulation to meet the requirements of lightweight design and various constraints simultaneously. Nodal displacement of the macrostructure and effective thermal conductivity of the microstructure are regarded as the constraint functions, which means taking into account both load-carrying capability and thermal insulation. The effective properties of the porous material derived from numerical homogenization are used for macrostructural analysis. Meanwhile, displacement vectors of the macrostructure from the original and adjoint load cases are used for sensitivity analysis of the microstructure. Design variables in the form of reciprocal functions of the relative densities are introduced and used to linearize the constraint functions. The objective function of total mass is approximated by a second-order Taylor series expansion. The proposed concurrent optimization problem is then solved using a sequential quadratic programming algorithm, by splitting it into a series of sub-problems in the form of quadratic programs. Finally, several numerical examples are presented to validate the effectiveness of the proposed optimization method. The effects of initial designs, prescribed limits on nodal displacement, and effective thermal conductivity on the optimized designs are also investigated. A number of optimized macrostructures and their corresponding microstructures are obtained.
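
    The solution strategy named here, sequential quadratic programming on a mass objective with a displacement-style constraint, can be illustrated on a deliberately tiny stand-in problem. The two-variable model below is hypothetical and only mirrors the structure of mass minimization under a compliance limit; it is not the paper's homogenization-based formulation.

      from scipy.optimize import minimize

      def mass(a):                  # objective: total mass (unit density/length)
          return a[0] + a[1]

      def displacement_limit(a):    # constraint: compliance 1/a1 + 1/a2 <= 4
          return 4.0 - (1.0 / a[0] + 1.0 / a[1])

      res = minimize(mass, x0=[1.0, 1.0],
                     method="SLSQP",          # an SQP solver, as in the paper
                     bounds=[(0.1, 10.0), (0.1, 10.0)],
                     constraints=[{"type": "ineq", "fun": displacement_limit}])
      print(res.x, res.fun)         # optimum: a1 = a2 = 0.5, total mass = 1.0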

  6. Proposed variations of the stepped-wedge design can be used to accommodate multiple interventions.

    PubMed

    Lyons, Vivian H; Li, Lingyu; Hughes, James P; Rowhani-Rahbar, Ali

    2017-06-01

    Stepped-wedge design (SWD) cluster-randomized trials have traditionally been used for evaluating a single intervention. We aimed to explore design variants suitable for evaluating multiple interventions in an SWD trial. We identified four specific variants of the traditional SWD that would allow two interventions to be conducted within a single cluster-randomized trial: concurrent, replacement, supplementation, and factorial SWDs. These variants were chosen to flexibly accommodate study characteristics that limit a one-size-fits-all approach for multiple interventions. In the concurrent SWD, each cluster receives only one intervention, unlike the other variants. The replacement SWD supports two interventions that will not or cannot be used at the same time. The supplementation SWD is appropriate when the second intervention requires the presence of the first intervention, and the factorial SWD supports the evaluation of intervention interactions. The precision for estimating intervention effects varies across the four variants. Selection of the appropriate design variant should be driven by the research question while considering the trade-off between the number of steps, number of clusters, restrictions for concurrent implementation of the interventions, lingering effects of each intervention, and precision of the intervention effect estimates.

  7. Changes in Concurrent Risk of Warm and Dry Years under Impact of Climate Change

    NASA Astrophysics Data System (ADS)

    Sarhadi, A.; Wiper, M.; Touma, D. E.; Ausín, M. C.; Diffenbaugh, N. S.

    2017-12-01

    Anthropogenic global warming has changed the nature and the risk of extreme climate phenomena. The changing concurrence of multiple climatic extremes (warm and dry years) may intensify undesirable consequences for water resources, human and ecosystem health, and environmental equity. The present study assesses how global warming influences the probability that warm and dry years co-occur on a global scale. In the first step of the study, a designed multivariate Mann-Kendall trend analysis is used to detect the areas in which the concurrence of warm and dry years has increased, both in historical climate records and in climate models, at the global scale. The next step investigates the concurrent risk of the extremes under dynamic nonstationary conditions. A fully generalized multivariate risk framework is designed to evolve through time under these conditions. In this methodology, Bayesian dynamic copulas are developed to model the time-varying dependence structure between the two climate extremes (warm and dry years). The results reveal an increasing trend in the concurrence risk of warm and dry years, in agreement with the multivariate trend analysis of historical records and climate models. In addition to providing a novel quantification of the changing probability of compound extreme events, the results of this study can help decision makers develop short- and long-term strategies to prepare for climate stresses now and in the future.
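
    A much-simplified sketch of the "concurrent risk" quantity itself: the empirical joint probability of a warm AND dry year in a sliding window, computed here on synthetic anomalies with an imposed warming drift. The study's Bayesian dynamic copulas are not reproduced; data and thresholds are invented.

      import random

      random.seed(1)
      years = 120
      # Synthetic temperature and (negatively correlated) precipitation anomalies,
      # with a warming drift so the joint risk changes over time (assumed model).
      temp = [random.gauss(0.01 * t, 1.0) for t in range(years)]
      precip = [-0.5 * temp[t] + random.gauss(0.0, 1.0) for t in range(years)]

      def joint_risk(t_series, p_series, t_thresh, p_thresh):
          hits = sum(1 for T, P in zip(t_series, p_series)
                     if T > t_thresh and P < p_thresh)
          return hits / len(t_series)

      window = 40
      for start in range(0, years - window + 1, 40):
          r = joint_risk(temp[start:start + window], precip[start:start + window],
                         t_thresh=0.5, p_thresh=-0.5)
          print(f"years {start}-{start + window - 1}: concurrent warm+dry risk = {r:.2f}")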

  8. Parallel Treatments Design: A Nested Single Subject Design for Comparing Instructional Procedures.

    ERIC Educational Resources Information Center

    Gast, David L.; Wolery, Mark

    1988-01-01

    This paper describes the parallel treatments design, a nested single subject experimental design that combines two concurrently implemented multiple probe designs, allows control for effects of extraneous variables through counterbalancing, and replicates its effects across behaviors. Procedural guidelines for the design's use and issues related…

  9. Soft robot design methodology for `push-button' manufacturing

    NASA Astrophysics Data System (ADS)

    Paik, Jamie

    2018-06-01

    `Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.

  10. Comparisons on Efficacy of Elcatonin and Limaprost Alfadex in Patients with Lumbar Spinal Stenosis and Concurrent Osteoporosis: A Preliminary Study Using a Crossover Design

    PubMed Central

    Imajo, Yasuaki; Suzuki, Hidenori; Yoshida, Yuichiro; Taguchi, Toshihiko; Tominaga, Toshikatsu; Toyoda, Koichiro

    2014-01-01

    Study Design: Multicenter prospective study with a crossover design. Purpose: The objective of this study is to compare the efficacy of limaprost alfadex (LP) and elcatonin (EL) for lumbar spinal stenosis (LSS) patients with concurrent osteoporosis. Overview of Literature: It has become increasingly important to improve quality of life by establishing appropriate conservative treatments for LSS patients with concurrent osteoporosis, whose numbers will presumably continue to increase as the population ages; however, there has been no prospective study. Methods: A total of 19 patients with LSS and concurrent osteoporosis were enrolled in this study. The patients were divided into two groups and compared using a crossover design. The Japanese Orthopaedic Association Back Pain Evaluation Questionnaire (JOABPEQ) and short-form (SF)-8 health survey scale were used for clinical evaluations. Results: There was a significant improvement of buttock-leg pain and numbness in the EL group. A significant improvement of impaired walking function was noted for the LP group according to the JOABPEQ, while the rest of the items in the JOABPEQ showed no significant differences. The SF-8 health survey revealed that somatic pain and physical summary scores in the EL group, and physical functioning and physical summary scores in the LP group, tended to improve but not to statistically significant extents. Conclusions: Concomitant use of EL may be useful in patients who do not respond satisfactorily to treatment with LP for 6-8 weeks. PMID:25187864

  11. Interpretive model for ''A Concurrency Method''

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, C.L.

    1987-01-01

    This paper describes an interpreter for ''A Concurrency Method,'' in which concurrency is the inherent mode of operation and not an appendage to sequentiality. This method is based on the notions of data-drive and single-assignment while preserving a natural manner of programming. The interpreter is designed for and implemented on a network of Corvus Concept Personal Workstations, which are based on the Motorola MC68000 super-microcomputer. The interpreter utilizes the MC68000 processors in each workstation by communicating across OMNINET, the local area network designed for the workstations. The interpreter is a complete system, containing an editor, a compiler, an operating system with load balancer, and a communication facility. The system includes the basic arithmetic and trigonometric primitive operations for mathematical computations as well as the ability to construct more complex operations from these. 9 refs., 5 figs.

  12. Comparing two methods to promote generalization of receptive identification in children with autism spectrum disorders.

    PubMed

    Dufour, Marie-Michèle; Lanovaz, Marc J

    2017-11-01

    The purpose of our study was to compare the effects of serial and concurrent training on the generalization of receptive identification in children with autism spectrum disorders (ASD). We taught one to three pairs of stimulus sets to nine children with ASD between the ages of three and six. One stimulus set within each pair was taught using concurrent training and the other using serial training. We alternated the training sessions within a multielement design and staggered the introduction of subsequent pairs for each participant as in a multiple baseline design. Overall, six participants generalized at least one stimulus set more rapidly with concurrent training whereas two participants showed generalization more rapidly with serial training. Our results differ from other comparison studies on the topic and indicate that practitioners should consider assessing the effects of both procedures prior to teaching receptive identification to children with ASD.

  13. Flywheels Upgraded for Systems Research

    NASA Technical Reports Server (NTRS)

    Jansen, Ralph H.

    2003-01-01

    With the advent of high-strength composite materials and microelectronics, flywheels are becoming attractive as a means of storing electrical energy. In addition to the high energy density that flywheels provide, other advantages over conventional electrochemical batteries include long life, high reliability, high efficiency, greater operational flexibility, and higher depths of discharge. High pulse energy is another capability that flywheels can provide. These attributes are favorable for satellites as well as terrestrial energy storage applications. In addition to energy storage for satellites, several flywheels operating concurrently can provide attitude control, thus combining two functions into one system. This translates into significant weight savings. The NASA Glenn Research Center is involved in the development of this technology for space and terrestrial applications. Glenn is well suited for this research because of its world-class expertise in power electronics design, rotor dynamics, composite material research, magnetic bearings, and motor design and control. Several Glenn organizations are working together on this program. The Structural Mechanics and Dynamics Branch is providing magnetic bearing, controls, and mechanical engineering skills. It is working with the Electrical Systems Development Branch, which has expertise in motors and generators, controls, and avionics systems. Facility support is being provided by the Space Electronic Test Engineering Branch, and the program is being managed by the Space Flight Project Branch. NASA is funding an Aerospace Flywheel Technology Development Program to design, fabricate, and test the Attitude Control/Energy Storage Experiment (ACESE). Two flywheels will be integrated onto a single power bus and run simultaneously to demonstrate a combined energy storage and 1-degree-of-freedom momentum control system. An algorithm that independently regulates direct-current bus voltage and net torque output will be experimentally demonstrated.
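
    A minimal sketch of why two flywheels on one bus can serve both functions at once: commanding net torque and bus power simultaneously reduces, for two wheels at speeds w1 and w2, to a 2x2 linear solve for the individual wheel torques. This is an illustrative toy model, not NASA's ACESE control algorithm; all numbers are assumed.

      def allocate(torque_cmd, power_cmd, w1, w2):
          # Solve [[1, 1], [w1, w2]] @ [t1, t2] = [torque_cmd, power_cmd]
          det = w2 - w1
          if abs(det) < 1e-9:
              raise ValueError("equal wheel speeds: commands are not independent")
          t1 = (torque_cmd * w2 - power_cmd) / det
          t2 = (power_cmd - torque_cmd * w1) / det
          return t1, t2

      t1, t2 = allocate(torque_cmd=0.0, power_cmd=-200.0, w1=3000.0, w2=3500.0)
      print(t1, t2)   # equal and opposite torques move power at zero net torque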

  14. Interphase layer optimization for metal matrix composites with fabrication considerations

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, C. C.

    1991-01-01

    A methodology is presented to reduce the final matrix microstresses for metal matrix composites by concurrently optimizing the interphase characteristics and fabrication process. Application cases include interphase tailoring with and without fabrication considerations for two material systems, graphite/copper and silicon carbide/titanium. Results indicate that concurrent interphase/fabrication optimization produces significant reductions in the matrix residual stresses and strong coupling between interphase and fabrication tailoring. The interphase coefficient of thermal expansion and the fabrication consolidation pressure are the most important design parameters and must be concurrently optimized to further reduce the microstresses to more desirable magnitudes.

  15. Concurrent Image Processing Executive (CIPE)

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1988-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local-memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented.

  16. Analysis and design of algorithm-based fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. Sukumaran

    1990-01-01

    An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems were formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach was developed for the analysis of large systems.
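
    As an illustration of the checksum idea behind ABFT, the sketch below uses the classic Huang-Abraham scheme for matrix multiplication (a standard textbook example, not the matrix-based model of this record): row and column checksums carried through the computation both detect and locate a single erroneous element.

      import numpy as np

      def abft_matmul(A, B):
          Ac = np.vstack([A, A.sum(axis=0)])                  # column-checksum matrix
          Br = np.hstack([B, B.sum(axis=1, keepdims=True)])   # row-checksum matrix
          return Ac @ Br                                      # full checksum product

      A = np.arange(9.0).reshape(3, 3)
      B = np.ones((3, 3))
      C = abft_matmul(A, B)

      C[1, 2] += 5.0                                          # inject a transient error
      row_err = np.flatnonzero(~np.isclose(C[:-1, :-1].sum(axis=1), C[:-1, -1]))
      col_err = np.flatnonzero(~np.isclose(C[:-1, :-1].sum(axis=0), C[-1, :-1]))
      print("error located at", (row_err[0], col_err[0]))     # -> (1, 2)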

  17. Religion and Wellbeing: Concurrent Validation of the Spiritual Well-Being Scale.

    ERIC Educational Resources Information Center

    Bufford, Rodger K.; Parker, Thomas G., Jr.

    This study was designed to explore the concurrent validity of the Spiritual Well-being Scale (SWB). Ninety first-year student volunteers at an evangelical seminary served as subjects. As part of a larger study, the students completed the SWB and the Interpersonal Behavior Survey (IBS). The SWB Scale is a 20-item self-report scale. Ten items…

  18. Investigating Separate and Concurrent Approaches for Item Parameter Drift in 3PL Item Response Theory Equating

    ERIC Educational Resources Information Center

    Arce-Ferrer, Alvaro J.; Bulut, Okan

    2017-01-01

    This study examines separate and concurrent approaches to combine the detection of item parameter drift (IPD) and the estimation of scale transformation coefficients in the context of the common item nonequivalent groups design with the three-parameter item response theory equating. The study uses real and synthetic data sets to compare the two…
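
    For readers unfamiliar with scale transformation coefficients, a minimal sketch of the mean-sigma method on made-up common-item difficulties follows; the study's data, 3PL estimation, and IPD detection steps are not reproduced here.

      import statistics as st

      b_ref = [-1.2, -0.4, 0.3, 0.9, 1.6]   # common-item difficulties, reference form
      b_new = [-1.0, -0.1, 0.6, 1.1, 1.9]   # same items estimated on the new form

      A = st.stdev(b_ref) / st.stdev(b_new)      # slope of the linear transformation
      B = st.mean(b_ref) - A * st.mean(b_new)    # intercept
      print(f"theta_ref = {A:.3f} * theta_new + {B:+.3f}")
      # Items flagged as drifting (IPD) would be removed from the common-item
      # pool before recomputing A and B.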

  19. Advanced Collaborative Environments Supporting Systems Integration and Design

    DTIC Science & Technology

    2003-03-01

    These environments allow multiple individuals to concurrently view a virtual system or product model while maintaining natural, human communication. These virtual systems operate within a computer-generated... As a result, TARDEC researchers and system developers are using this advanced high-end visualization technology to develop future

  20. Concurrent Smalltalk on the Message-Driven Processor

    DTIC Science & Technology

    1991-09-01

    A language close to Concurrent Smalltalk, having an almost identical name, is ConcurrentSmalltalk [39] [40], independently developed by Yasuhiko Yokote and... Laboratory Memo 1044, October 1988. [39] Yokote, Yasuhiko, and Tokoro, Mario. "The Design and Implementation of ConcurrentSmalltalk." Proceedings of the 1986 Object-Oriented Programming Systems, Languages, and Applications Conference, September 1986. [40] Yokote, Yasuhiko, and

  1. Concurrent Validity and Sensitivity to Change of Direct Behavior Rating Single-Item Scales (DBR-SIS) within an Elementary Sample

    ERIC Educational Resources Information Center

    Smith, Rhonda L.; Eklund, Katie; Kilgus, Stephen P.

    2018-01-01

    The purpose of this study was to evaluate the concurrent validity, sensitivity to change, and teacher acceptability of Direct Behavior Rating single-item scales (DBR-SIS), a brief progress monitoring measure designed to assess student behavioral change in response to intervention. Twenty-four elementary teacher-student dyads implemented a daily…

  2. From Desktop to Teraflop: Exploiting the U.S. Lead in High Performance Computing. NSF Blue Ribbon Panel on High Performance Computing.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms also learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…

  3. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  4. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  5. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  6. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  7. Proceedings of the Department of Defense Environmental Technology Workshop

    DTIC Science & Technology

    1995-05-01

    Fabrication Laboratory Results in Waste Elimination, William J. Kelso, Parsons Engineering Science, Inc.; Susan H. Errett, Lt. Col. Ronald D. Fancher... Williams, Ocean City Research Corporation... NDCEE Reduces Risk in Technology Transfer, Jack H. Cavanaugh, Concurrent... Ecological Receptors, William R. Alsop, Mark E. Stelljes, Elizabeth T. Hawkins, Harding Lawson Associates; William Collins, U.S. Department of the Army

  8. A Primer for DoD Reliability, Maintainability and Safety Standards

    DTIC Science & Technology

    1988-03-02

    ...the project engineer and the concurrence of their respective managers. The primary consideration in such cases is the thoroughness of the... basic approaches to the application of environmental stress screening. In one approach, the government explicitly specifies the screens and screening... TO USE DOD-HDBK-344 (USAF): There are two basic approaches to the application of environmental stress

  9. 76 FR 31958 - Information Collection Being Submitted for Review and Approval to the Office of Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... enhance the quality, utility, and clarity of the information collected; (d) ways to minimize the burden of... frequencies. Section 90.545(c)(1) requires that public safety applicants select one of three ways to meet TV... engineering study to justify other separations; or (3) obtain concurrence from the applicable TV/DTV station(s...

  10. A Model-based Approach to Reactive Self-Configuring Systems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Nayak, P. Pandurang

    1996-01-01

    This paper describes Livingstone, an implemented kernel for a self-reconfiguring autonomous system, that is reactive and uses component-based declarative models. The paper presents a formal characterization of the representation formalism used in Livingstone, and reports on our experience with the implementation in a variety of domains. Livingstone's representation formalism achieves broad coverage of hybrid software/hardware systems by coupling the concurrent transition system models underlying concurrent reactive languages with the discrete qualitative representations developed in model-based reasoning. We achieve a reactive system that performs significant deductions in the sense/response loop by drawing on our past experience at building fast propositional conflict-based algorithms for model-based diagnosis, and by framing a model-based configuration manager as a propositional, conflict-based feedback controller that generates focused, optimal responses. Livingstone automates all these tasks using a single model and a single core deductive engine, thus making significant progress towards achieving a central goal of model-based reasoning. Livingstone, together with the HSTS planning and scheduling engine and the RAPS executive, has been selected as the core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.

  11. Engineering S. equi subsp. zooepidemicus towards concurrent production of hyaluronic acid and chondroitin biopolymers of biomedical interest.

    PubMed

    Cimini, Donatella; Iacono, Ileana Dello; Carlino, Elisabetta; Finamore, Rosario; Restaino, Odile F; Diana, Paola; Bedini, Emiliano; Schiraldi, Chiara

    2017-12-01

    Glycosaminoglycans, such as hyaluronic acid and chondroitin sulphate, are increasingly required not only as main ingredients in cosmeceutical and nutraceutical preparations but also as active principles in medical devices and pharmaceutical products. However, while biotechnological production of hyaluronic acid is industrially established through fermentation of Streptococcus spp. and, more recently, Bacillus subtilis, biotechnological chondroitin is not yet on the market. A non-hemolytic, hyaluronidase-negative S. equi subsp. zooepidemicus mutant strain was engineered in this work by the addition of two E. coli K4 genes, namely kfoA and kfoC, involved in the biosynthesis of a chondroitin-like polysaccharide. Chondroitin is the precursor of chondroitin sulphate, a nutraceutical on the market as an anti-arthritic agent that is lately being studied for its intrinsic bioactivity. In small-scale bioreactor batch experiments, the production of about 1.46 ± 0.38 g/L hyaluronic acid and 300 ± 28 mg/L chondroitin, with average molecular weights of 1750 and 25 kDa, respectively, was demonstrated, providing an approach to the concurrent production of both biopolymers in a single fermentation.

  12. Concurrent bariatric operations and association with perioperative outcomes: registry based cohort study.

    PubMed

    Liu, Jason B; Ban, Kristen A; Berian, Julia R; Hutter, Matthew M; Huffman, Kristopher M; Liu, Yaoming; Hoyt, David B; Hall, Bruce L; Ko, Clifford Y

    2017-09-26

    Objective: To determine whether perioperative outcomes differ between patients undergoing concurrent compared with non-concurrent bariatric operations in the USA. Design: Retrospective, propensity score matched cohort study. Setting: Hospitals in the US accredited by the American College of Surgeons' metabolic and bariatric surgery accreditation and quality improvement program. Participants: 513 167 patients undergoing bariatric operations between 1 January 2014 and 31 December 2016. Main outcome measures: The primary outcome measure was a composite of 30 day death, morbidity, readmission, reoperation, anastomotic or staple line leak, and bleeding events. Operative duration and lengths of stay were also assessed. Operations were defined as concurrent if they overlapped by 60 or more minutes or in their entirety. Results: In this study of 513 167 operations, 739 (29.5%) surgeons at 483 (57.8%) hospitals performed 6087 (1.2%) concurrent operations. The most frequently performed concurrent bariatric operations were sleeve gastrectomy (n=3250, 53.4%) and Roux-en-Y gastric bypass (n=1601, 26.3%). Concurrent operations were more often performed at large academic medical centers with higher operative volumes and numbers of trainees and by higher volume surgeons. Compared with non-concurrent operations, concurrent operations lasted a median of 34 minutes longer (P<0.001) and resulted in 0.3 days longer average length of stay (P<0.001). Perioperative adverse events were not observed to be more likely in concurrent than in non-concurrent operations (7.5% v 7.4%; relative risk 1.02, 95% confidence interval 0.90 to 1.15; P=0.84). Conclusions: Concurrent bariatric operations occurred infrequently, but when they did, there was no observable increased risk of adverse perioperative outcomes compared with non-concurrent operations. These results, however, do not argue against improved and more meaningful disclosure of concurrent surgery practices.
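
    The paper's operational definition of concurrency is easy to state in code. The sketch below uses hypothetical case times and checks the two stated conditions: 60 or more minutes of overlap, or overlap in an operation's entirety.

      def overlap_minutes(a_start, a_end, b_start, b_end):
          return max(0, min(a_end, b_end) - max(a_start, b_start))

      def is_concurrent(a, b, threshold=60):
          ov = overlap_minutes(*a, *b)
          # An operation overlapped "in its entirety" if the overlap equals the
          # shorter operation's full duration.
          fully_contained = ov == min(a[1] - a[0], b[1] - b[0])
          return ov >= threshold or fully_contained

      case1 = (0, 150)      # minutes from some reference time (made-up data)
      case2 = (100, 190)
      print(is_concurrent(case1, case2))   # 50-minute overlap, not contained -> False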

  13. Integrating design science theory and methods to improve the development and evaluation of health communication programs.

    PubMed

    Neuhauser, Linda; Kreps, Gary L

    2014-12-01

    Traditional communication theory and research methods provide valuable guidance about designing and evaluating health communication programs. However, efforts to use health communication programs to educate, motivate, and support people to adopt healthy behaviors often fail to meet the desired goals. One reason for this failure is that health promotion issues are complex, changeable, and highly related to the specific needs and contexts of the intended audiences. It is a daunting challenge to effectively influence health behaviors, particularly culturally learned and reinforced behaviors concerning lifestyle factors related to diet, exercise, and substance (such as alcohol and tobacco) use. Too often, program development and evaluation are not adequately linked to provide rapid feedback to health communication program developers so that important revisions can be made to design the most relevant and personally motivating health communication programs for specific audiences. Design science theory and methods commonly used in engineering, computer science, and other fields can address such program and evaluation weaknesses. Design science researchers study human-created programs using tightly connected build-and-evaluate loops in which they use intensive participatory methods to understand problems and develop solutions concurrently and throughout the duration of the program. Such thinking and strategies are especially relevant to address complex health communication issues. In this article, the authors explore the history, scientific foundation, methods, and applications of design science and its potential to enhance health communication programs and their evaluation.

  14. Comparisons on efficacy of elcatonin and limaprost alfadex in patients with lumbar spinal stenosis and concurrent osteoporosis: a preliminary study using a crossover design.

    PubMed

    Kanchiku, Tsukasa; Imajo, Yasuaki; Suzuki, Hidenori; Yoshida, Yuichiro; Taguchi, Toshihiko; Tominaga, Toshikatsu; Toyoda, Koichiro

    2014-08-01

    Multicenter prospective study with a crossover design. The objective of this study is to compare the efficacy of limaprost alfadex (LP) and elcatonin (EL) for lumbar spinal stenosis (LSS) patients with concurrent osteoporosis. It has become increasingly important to improve quality of life by establishing appropriate conservative treatments for LSS patients with concurrent osteoporosis, whose numbers will presumably continue to increase as the population ages; however, there has been no prospective study. A total of 19 patients with LSS and concurrent osteoporosis were enrolled in this study. The patients were divided into two groups and compared using a crossover design. The Japanese Orthopaedic Association Back Pain Evaluation Questionnaire (JOABPEQ) and short-form (SF)-8 health survey scale were used for clinical evaluations. There was a significant improvement of buttock-leg pain and numbness in the EL group. A significant improvement of impaired walking function was noted for the LP group according to the JOABPEQ, while the rest of the items in the JOABPEQ showed no significant differences. The SF-8 health survey revealed that somatic pain and physical summary scores in the EL group, and physical functioning and physical summary scores in the LP group, tended to improve but not to statistically significant extents. Concomitant use of EL may be useful in patients who do not respond satisfactorily to treatment with LP for 6-8 weeks.

  15. Exascale computing and what it means for shock physics

    NASA Astrophysics Data System (ADS)

    Germann, Timothy

    2015-06-01

    The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.

  16. Taguchi-generalized regression neural network micro-screening for physical and sensory characteristics of bread.

    PubMed

    Besseris, George J

    2018-03-01

    Generalized regression neural networks (GRNN) may act as crowdsourcing cognitive agents to screen small, dense and complex datasets. The concurrent screening and optimization of several complex physical and sensory traits of bread is developed using a structured Taguchi-type micro-mining technique. A novel product outlook is offered to industrial operations to cover separate aspects of smart product design, engineering and marketing. Four controlling factors were selected to be modulated directly on a modern production line: 1) the dough weight, 2) the proofing time, 3) the baking time, and 4) the oven zone temperatures. Concentrated experimental recipes were programmed using the Taguchi-type L9(3^4) OA-sampler to detect potentially non-linear multi-response tendencies. The fused behavior of the master-ranked bread characteristics was smart-sampled with GRNN-crowdsourcing and robust analysis. The combination of the oven zone temperatures was found to play a highly influential role in all investigated scenarios. Moreover, the oven zone temperatures and the dough weight appeared to be instrumental when attempting to synchronously adjust all four physical characteristics. The optimal oven-zone temperature setting for concurrent screening-and-optimization was found to be 270-240 °C. The optimized (median) responses for loaf weight, moisture, height, width, color, flavor, crumb structure, softness, and elasticity are: 782 g, 34.8 %, 9.36 cm, 10.41 cm, 6.6, 7.2, 7.6, 7.3, and 7.0, respectively.
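
    A minimal sketch of the GRNN prediction rule itself (Specht's form: a Gaussian-kernel weighted average of the training responses). The factors and responses below are invented stand-ins for the study's L9(3^4) recipes and bread measurements.

      import math

      def grnn_predict(x, X_train, y_train, sigma=0.5):
          # Weight each training point by a Gaussian kernel on its distance to x.
          weights = [math.exp(-sum((xi - ti) ** 2 for xi, ti in zip(x, t)) /
                              (2.0 * sigma ** 2)) for t in X_train]
          total = sum(weights)
          return sum(w * y for w, y in zip(weights, y_train)) / total

      # Toy factors (scaled 0-1): dough weight, proofing time, baking time, oven temp
      X = [(0.1, 0.2, 0.5, 0.9), (0.5, 0.5, 0.5, 0.5), (0.9, 0.8, 0.4, 0.1)]
      y = [7.2, 6.8, 5.9]                     # e.g., a sensory score per recipe
      print(grnn_predict((0.4, 0.4, 0.5, 0.6), X, y))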

  17. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example. software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost effective means to assure safe software within a safe system.

  18. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling, which can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
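
    A toy sketch of the contract-net exchange described above: the shop floor manager broadcasts a task, each machine agent bids only if it offers the process, and the lowest-cost bid wins the contract. The machine data are hypothetical, and the lowest-cost award rule is a stand-in for the paper's negotiation over processing time, due date, and cost.

      machines = [
          {"name": "mill-1",  "processes": {"milling", "drilling"}, "cost_per_hr": 60},
          {"name": "lathe-1", "processes": {"turning"},             "cost_per_hr": 45},
          {"name": "mill-2",  "processes": {"milling"},             "cost_per_hr": 50},
      ]

      def broadcast(task, hours):
          # Collect bids only from machines that can perform the process.
          bids = [(m["cost_per_hr"] * hours, m["name"])
                  for m in machines if task in m["processes"]]
          if not bids:
              return None                    # no machine offers the process
          cost, winner = min(bids)           # negotiation reduced to lowest cost
          return winner, cost

      print(broadcast("milling", hours=2.0))   # -> ('mill-2', 100.0)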

  19. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) a sparse matrix iterative solver in PISCES Fortran; (5) an image processing application of PISCES; and (6) a formal model of concurrent computation under development.

  20. Theory of remote entanglement via quantum-limited phase-preserving amplification

    NASA Astrophysics Data System (ADS)

    Silveri, Matti; Zalys-Geller, Evan; Hatridge, Michael; Leghtas, Zaki; Devoret, Michel H.; Girvin, S. M.

    2016-06-01

    We show that a quantum-limited phase-preserving amplifier can act as a which-path information eraser when followed by heterodyne detection. This "beam splitter with gain" implements a continuous joint measurement on the signal sources. As an application, we propose heralded concurrent remote entanglement generation between two qubits coupled dispersively to separate cavities. Dissimilar qubit-cavity pairs can be made indistinguishable by simple engineering of the cavity driving fields providing further experimental flexibility and the prospect for scalability. Additionally, we find an analytic solution for the stochastic master equation, a quantum filter, yielding a thorough physical understanding of the nonlinear measurement process leading to an entangled state of the qubits. We determine the concurrence of the entangled states and analyze its dependence on losses and measurement inefficiencies.

  1. Program Correctness, Verification and Testing for Exascale (Corvette)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Koushik; Iancu, Costin; Demmel, James W

    The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low overhead automated and precise detection of concurrency bugs at scale. 2. Using low overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.

  2. Design of a network for concurrent message passing systems

    NASA Astrophysics Data System (ADS)

    Song, Paul Y.

    1988-08-01

    We describe the design of the network design frame (NDF), a self-timed routing chip for a message-passing concurrent computer. The NDF uses a partitioned data path, low-voltage output drivers, and a distributed token-passing arbiter to provide a bandwidth of 450 Mbits/sec into the network. Wormhole routing and bidirectional virtual channels are used to provide low latency communications: less than 2 us latency to deliver a 216-bit message across the diameter of a 1K-node mesh-connected machine. To support concurrent software systems, the NDF provides two logical networks, one for user messages and one for system messages; the two networks share the same set of physical wires. To facilitate the development of network nodes, the NDF is a design frame: the NDF circuitry is integrated into the pad frame of a chip, leaving the center of the chip uncommitted. We define an analytic framework in which to study the effects of network size, network buffering capacity, bidirectional channels, and traffic on this class of networks. The response of the network to various combinations of these parameters is obtained through extensive simulation of the network model. Through simulation, we are able to observe the macro behavior of the network as opposed to the micro behavior of the NDF routing controller.
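
    A back-of-envelope check of the quoted latency under the standard wormhole-routing model (latency ~ hops x per-hop delay + message bits / channel bandwidth). Only the 450 Mbits/sec rate, the 216-bit message, and the 1K-node mesh come from the abstract; the per-hop delay is an assumed value chosen for illustration.

      hops = 2 * (32 - 1)      # diameter of a 32x32 (1K-node) 2-D mesh
      t_hop = 24e-9            # assumed per-hop routing delay, seconds
      bandwidth = 450e6        # channel bandwidth, bits/s (from the abstract)
      bits = 216               # message length, bits (from the abstract)

      latency = hops * t_hop + bits / bandwidth
      print(f"{latency * 1e6:.2f} us")   # ~1.97 us, consistent with the <2 us claim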

  3. Immune-tolerant elastin-like polypeptides (iTEPs) and their application as CTL vaccine carriers.

    PubMed

    Cho, S; Dong, S; Parent, K N; Chen, M

    2016-01-01

    Cytotoxic T lymphocyte (CTL) vaccine carriers are known to enhance the efficacy of vaccines, but a search for more effective carriers is warranted. Elastin-like polypeptides (ELPs) have been examined for many medical applications but not as CTL vaccine carriers. We aimed to create immune tolerant ELPs using a new polypeptide engineering practice and create CTL vaccine carriers using the ELPs. Four sets of novel ELPs, termed immune-tolerant elastin-like polypeptide (iTEP) were generated according to the principles dictating humoral immunogenicity of polypeptides and phase transition property of ELPs. The iTEPs were non-immunogenic in mice. Their phase transition feature was confirmed through a turbidity assay. An iTEP nanoparticle (NP) was assembled from an amphiphilic iTEP copolymer plus a CTL peptide vaccine, SIINFEKL. The NP facilitated the presentation of the vaccine by dendritic cells (DCs) and enhanced vaccine-induced CTL responses. A new ELP design and development practice was established. The non-canonical motif and the immune tolerant nature of the iTEPs broaden our insights about ELPs. ELPs, for the first time, were successfully used as carriers for CTL vaccines. It is feasible to concurrently engineer both immune-tolerant and functional peptide materials. ELPs are a promising type of CTL vaccine carriers.

  4. Composite chronicles: A study of the lessons learned in the development, production, and service of composite structures

    NASA Technical Reports Server (NTRS)

    Vosteen, Louis F.; Hadcock, Richard N.

    1994-01-01

    A study of past composite aircraft structures programs was conducted to determine the lessons learned during the programs. The study focused on finding major underlying principles and practices that experience showed have significant effects on the development process and should be recognized and understood by those responsible for the use of composites. Published information on programs was reviewed and interviews were conducted with personnel associated with current and past major development programs. In all, interviews were conducted with about 56 people representing 32 organizations. Most of the people interviewed have been involved in the engineering and manufacturing development of composites for the past 20 to 25 years. Although composites technology has made great advances over the past 30 years, the effective application of composites to aircraft is still a complex problem that requires experienced personnel with special knowledge. All disciplines involved in the development process must work together in real time to minimize risk and assure total product quality and performance at acceptable costs. The most successful programs have made effective use of integrated, collocated, concurrent engineering teams, and most often used well-planned, systematic development efforts wherein the design and manufacturing processes are validated in a step-by-step or 'building block' approach. Such approaches reduce program risk and are cost effective.

  5. Uncovering the problem-solving process: cued retrospective reporting versus concurrent and retrospective reporting.

    PubMed

    van Gog, Tamara; Paas, Fred; van Merriënboer, Jeroen J G; Witte, Puk

    2005-12-01

    This study investigated the amounts of problem-solving process information ("action," "why," "how," and "metacognitive") elicited by means of concurrent, retrospective, and cued retrospective reporting. In a within-participants design, 26 participants completed electrical circuit troubleshooting tasks under different reporting conditions. The method of cued retrospective reporting used the original computer-based task and a superimposed record of the participant's eye fixations and mouse-keyboard operations as a cue for retrospection. Cued retrospective reporting (with the exception of why information) and concurrent reporting (with the exception of metacognitive information) resulted in a higher number of codes on the different types of information than did retrospective reporting.

  6. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  7. Rotor Re-Design for the SSME Fuel Flowmeter

    NASA Technical Reports Server (NTRS)

    Marcu, Bogdan

    1999-01-01

    This report describes the redesign of the rotor for the SSME Fuel Flowmeter. The new design addresses the specific requirement of a lower rotor speed, which would allow SSME operation at 115% rated power level (RPL) without the wakes behind the upstream hexagonal flow straightener exciting the blades at frequencies close to the blade natural frequency. A series of calculations combining fleet flowmeter test data, airfoil fluid dynamics, and CFD simulations of the flow patterns behind the flowmeter's hexagonal straightener led to a blade twist design alpha = alpha(radius) targeting a kf constant of 0.8256. The kf constant relates the fuel volume flow to the flowmeter rotor speed; for this particular value, 17685 GPM corresponds to 3650 RPM. Based on this angle distribution, two blade designs were developed. The first, using the same blade airfoil as the original design, targeted the new kf value only. The second, using a variable blade chord length and airfoil relative thickness, simultaneously targeted the new kf value and an optimized blade intended to provide smooth, stable operation and a significant increase in the natural frequency of the first bending mode, such that a comfortable margin could be obtained at 115% RPL. The second design is the result of a concurrent engineering process, during which several iterations were made to achieve a targeted first-bending-mode natural frequency of 1300 Hz. Preliminary water-flow test results indicate a kf value of 0.8179 for the first design, within 1% of the target value. The second rotor design shows a first-bending-mode natural frequency of 1308 Hz and a water-flow calibration constant of kf = 0.8169.
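
    The abstract does not state the units of kf, but its figures are mutually consistent if kf is read as a turbine-flowmeter calibration constant (K-factor) in pulses per gallon, with the rotor producing four pulses per revolution. The Python sketch below works through that reading; both the pulses-per-revolution count and the unit interpretation are assumptions, not statements from the report.

    ```python
    # Hedged reading of the kf constant: assume a turbine flowmeter whose rotor
    # produces PULSES_PER_REV pulses per revolution and whose K-factor kf is in
    # pulses per gallon. Neither assumption is given in the report excerpt.
    PULSES_PER_REV = 4  # assumed; note 4 * 3650 / 0.8256 ~= 17685

    def flow_gpm(rpm: float, kf: float) -> float:
        """Volume flow (GPM) implied by rotor speed (RPM) for a given K-factor."""
        pulses_per_min = PULSES_PER_REV * rpm
        return pulses_per_min / kf

    print(flow_gpm(3650, 0.8256))  # ~17684 GPM, matching the report's pairing
    print(flow_gpm(3650, 0.8169))  # flow implied by the second design's measured kf
    ```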

  8. Concurrent versus sequential sorafenib therapy in combination with radiation for hepatocellular carcinoma.

    PubMed

    Wild, Aaron T; Gandhi, Nishant; Chettiar, Sivarajan T; Aziz, Khaled; Gajula, Rajendra P; Williams, Russell D; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F; Cosgrove, David; Pawlik, Timothy M; Maitra, Anirban; Wong, John; Hales, Russell K; Torbenson, Michael S; Herman, Joseph M; Tran, Phuoc T

    2013-01-01

    Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design.

  9. Concurrent versus Sequential Sorafenib Therapy in Combination with Radiation for Hepatocellular Carcinoma

    PubMed Central

    Chettiar, Sivarajan T.; Aziz, Khaled; Gajula, Rajendra P.; Williams, Russell D.; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A.; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F.; Cosgrove, David; Pawlik, Timothy M.; Maitra, Anirban; Wong, John; Hales, Russell K.; Torbenson, Michael S.; Herman, Joseph M.; Tran, Phuoc T.

    2013-01-01

    Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design. PMID:23762417

  10. Inter-examiner classification reliability of Mechanical Diagnosis and Therapy for extremity problems - Systematic review.

    PubMed

    Takasaki, Hiroshi; Okuyama, Kousuke; Rosedale, Richard

    2017-02-01

    Mechanical Diagnosis and Therapy (MDT) is used in the treatment of extremity problems. Classifying clinical problems is one method of providing effective treatment to a target population. Classification reliability is a key factor to determine the precise clinical problem and to direct an appropriate intervention. To explore inter-examiner reliability of the MDT classification for extremity problems in three reliability designs: 1) vignette reliability using surveys with patient vignettes, 2) concurrent reliability, where multiple assessors decide a classification by observing someone's assessment, 3) successive reliability, where multiple assessors independently assess the same patient at different times. Systematic review with data synthesis in a quantitative format. Agreement of MDT subgroups was examined using the Kappa value, with the operational definition of acceptable reliability set at ≥ 0.6. The level of evidence was determined considering the methodological quality of the studies. Six studies were included and all studies met the criteria for high quality. Kappa values for the vignette reliability design (five studies) were ≥ 0.7. There was data from two cohorts in one study for the concurrent reliability design and the Kappa values ranged from 0.45 to 1.0. Kappa values for the successive reliability design (data from three cohorts in one study) were < 0.6. The current review found strong evidence of acceptable inter-examiner reliability of MDT classification for extremity problems in the vignette reliability design, limited evidence of acceptable reliability in the concurrent reliability design and unacceptable reliability in the successive reliability design. Copyright © 2017 Elsevier Ltd. All rights reserved.
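
    For readers unfamiliar with the agreement statistic used in this review, the minimal Python sketch below computes Cohen's kappa for two examiners and applies the review's acceptability threshold of 0.6. The MDT subgroup labels and the ratings are invented for illustration only.

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters assigning categorical labels."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected chance agreement from the two raters' marginal label frequencies.
        ca, cb = Counter(rater_a), Counter(rater_b)
        expected = sum(ca[k] * cb[k] for k in ca) / n**2
        return (observed - expected) / (1 - expected)

    # Hypothetical MDT classifications of ten extremity patients by two examiners.
    a = ["derangement", "dysfunction", "derangement", "other", "derangement",
         "dysfunction", "derangement", "other", "derangement", "dysfunction"]
    b = ["derangement", "dysfunction", "derangement", "other", "dysfunction",
         "dysfunction", "derangement", "other", "derangement", "dysfunction"]

    kappa = cohens_kappa(a, b)
    print(f"kappa = {kappa:.2f}, acceptable = {kappa >= 0.6}")  # kappa = 0.84
    ```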

  11. Design, Development and Utilization Perspectives on Database Management Systems

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1977-01-01

    This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)

  12. 23 CFR 950.7 - Interoperability requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...

  13. 23 CFR 950.7 - Interoperability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...

  14. 23 CFR 950.7 - Interoperability requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...

  15. 23 CFR 950.7 - Interoperability requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...

  16. 23 CFR 950.7 - Interoperability requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS... receive the FHWA's concurrence that the facility's toll collection system's standards and design meet the... system design shall include the communications requirements between roadside equipment and toll...

  17. Concurrent Engineering Teams. Volume 2: Annotated Bibliography

    DTIC Science & Technology

    1990-11-01

    publishes. They normally embody results of major projects which (a) have a direct bearing on decisions affecting major programs, (b) address...D., "What Processes do You Own? How are They Doing?," Program Manager, Journal of the Defense Systems Management College, September-October 1989, pp...216. The key ingredient to any successful TQM program is top management commitment and involvement. The early top management involvement reflects

  18. Adaptive intensity modulated radiotherapy for advanced prostate cancer

    NASA Astrophysics Data System (ADS)

    Ludlum, Erica Marie

    The purpose of this research is to develop and evaluate improvements in intensity modulated radiotherapy (IMRT) for concurrent treatment of prostate and pelvic lymph nodes. The first objective is to decrease delivery time while maintaining treatment quality, and to evaluate the effectiveness and efficiency of novel one-step optimization compared to conventional two-step optimization. Both planning methods are examined at multiple levels of complexity by comparing the number of beam apertures, or segments, the amount of radiation delivered as measured by monitor units (MUs), and delivery time. One-step optimization is demonstrated to simplify IMRT planning and reduce segments (from 160 to 40), MUs (from 911 to 746), and delivery time (from 22 to 7 min) with comparable plan quality. The second objective is to examine the capability of three commercial dose calculation engines employing different levels of accuracy and efficiency to handle high-Z materials, such as metallic hip prostheses, included in the treatment field. Pencil beam, convolution superposition, and Monte Carlo dose calculation engines are compared by examining the dose differences for patient plans with unilateral and bilateral hip prostheses, and for phantom plans with a metal insert for comparison with film measurements. Convolution superposition and Monte Carlo methods calculate doses that are 1.3% and 34.5% less than the pencil beam method, respectively. Film results demonstrate that Monte Carlo most closely represents actual radiation delivery, but none of the three engines accurately predicts the dose distribution when high-Z heterogeneities exist in the treatment fields. The final objective is to improve the accuracy of IMRT delivery by accounting for independent organ motion during concurrent treatment of the prostate and pelvic lymph nodes. A leaf-shifting algorithm is developed to track daily prostate position without requiring online dose calculation. Compared to conventional methods of adjusting patient position, adjusting the multileaf collimator (MLC) leaves associated with the prostate in each segment significantly improves lymph node dose coverage (maintains 45 Gy compared to 42.7, 38.3, and 34.0 Gy for iso-shifts of 0.5, 1, and 1.5 cm). Altering the MLC portal shape is demonstrated as a new and effective solution to independent prostate movement during concurrent treatment.

  19. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.

    2016-01-01

    The spectroscopic diagnostic technique of two photon absorption laser-induced fluorescence (TALIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. Use of TALIF expanded at NASA Ames and to NASA Johnson's arc jet facility in the late 2000s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the original AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper documents the overall system design from measurement requirements to implementation. Representative data from the redeveloped AHF and IHF LIF systems are also presented.

  20. X-43 Hypersonic Vehicle Technology Development

    NASA Technical Reports Server (NTRS)

    Voland, Randall T.; Huebner, Lawrence D.; McClinton, Charles R.

    2005-01-01

    NASA recently completed two major programs in hypersonics: Hyper-X, with the record-breaking flights of the X-43A, and the Next Generation Launch Technology (NGLT) Program. The X-43A flights, the culmination of the Hyper-X Program, were the first-ever examples of a scramjet engine propelling a hypersonic vehicle and provided unique, convincing, detailed flight data required to validate the design tools needed for design and development of future operational hypersonic airbreathing vehicles. Concurrent with Hyper-X, NASA's NGLT Program focused on technologies needed for future revolutionary launch vehicles. The NGLT was "completed" by NASA in response to the President's redirection of the agency to space exploration, after making significant progress towards maturing technologies required to enable airbreathing hypersonic launch vehicles. NGLT quantified the benefits, identified technology needs, developed airframe and propulsion technology, chartered a broad university base, and developed detailed plans to mature and validate hypersonic airbreathing technology for space access. NASA is currently in the process of defining plans for a new Hypersonic Technology Program. Details of that plan are not currently available. This paper highlights results from the successful Mach 7 and 10 flights of the X-43A, and the current state of hypersonic technology.

  1. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay; Wilder, Michael C.; Porter, Barry; Brown, Jeff; Yeung, Dickson; Battazzo, Steve; Brubaker, Tim

    2016-01-01

    The spectroscopic diagnostic technique of two photon absorption laser-induced fluorescence (TALIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. Use of TALIF expanded at NASA Ames and to NASA Johnson's arc jet facility in the late 2000s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the original AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper documents the overall system design from measurement requirements to implementation. Representative data from the redeveloped AHF and IHF LIF systems are also presented.

  2. Simplified spacecraft vulnerability assessments at component level in early design phase at the European Space Agency's Concurrent Design Facility

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith

    2016-12-01

    During recent years, the state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites to assess various failure levels - from failure of specific sub-systems to catastrophic break-up. This new assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which is applicable for describing failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the form of the software tool Particle Impact Risk and vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, the PIRAT functionality was expanded in order to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both the expanded geometrical flexibility, which is provided by external computer aided design (CAD) modelling, and an ease of import of existing data without the need for extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes it feasible for application during early design phase, at which point modifications to satellite design can be undertaken with relatively little extra effort. The integration of PIRAT in the CDF represents the first time that vulnerability analyses can be performed in-session in ESA's CDF and the first time that comprehensive vulnerability studies can be applied cost-effectively in early design phase in general.

  3. Visual scanning with or without spatial uncertainty and time-sharing performance

    NASA Technical Reports Server (NTRS)

    Liu, Yili; Wickens, Christopher D.

    1989-01-01

    An experiment is reported that examines the pattern of task interference between visual scanning, as a sequential and selective attention process, and other concurrent spatial or verbal processing tasks. A distinction is proposed between visual scanning with or without spatial uncertainty, regarding the possible differential effects of these two types of scanning on interference with other concurrent processes. The experiment required the subject to perform a simulated primary tracking task, which was time-shared with a secondary spatial or verbal decision task. The relevant information needed to perform the decision tasks was displayed with or without spatial uncertainty. The experiment employed a 2 x 2 x 2 design with type of scanning (with or without spatial uncertainty), expected scanning distance (low/high), and code of concurrent processing (spatial/verbal) as the three experimental factors. The results provide strong evidence that visual scanning as a spatial exploratory activity produces greater task interference with concurrent spatial tasks than with concurrent verbal tasks. Furthermore, spatial uncertainty in visual scanning is identified as the crucial factor in producing this differential effect.

  4. Cybernetics and Workshop Design.

    ERIC Educational Resources Information Center

    Eckstein, Daniel G.

    1979-01-01

    Cybernetic sessions allow for the investigation of several variables concurrently, resulting in a large volume of input compacted into a concise time frame. Three session questions are reproduced to illustrate the variety of ideas generated relative to workshop design. (Author)

  5. Scaffolding knowledge building in a Web-based communication and cultural competence program for international medical graduates.

    PubMed

    Lax, Leila R; Russell, M Lynn; Nelles, Laura J; Smith, Cathy M

    2009-10-01

    Professional behaviors, tacitly understood by Canadian-trained physicians, are difficult to teach and often create practice barriers for IMGs. The purpose of this design research study was to develop a Web-based program simulating Canadian medical literacy and culture, and to evaluate strategies for scaffolding individual knowledge building. Study 1 (N = 20) examined usability and pedagogic design. Studies 2 (N = 39) and 3 (N = 33) examined case participation patterns. The model design was validated in Study 1. Studies 2 and 3 demonstrated high levels of participation, including unprompted third tries on knowledge tests. Recursive patterns were strongest on Reflective Exercises. Five strategies scaffolded knowledge building: (1) video simulations, (2) contextualized resources, (3) concurrent feedback, (4) Reflective Exercises, and (5) commentaries prompting "reflection on reflection." Scaffolded design supports complex knowledge building. These findings are consistent with educational research on the importance of recursion and revision of knowledge for improvable and relational understanding.

  6. Self-controlled concurrent feedback facilitates the learning of the final approach phase in a fixed-base flight simulator.

    PubMed

    Huet, Michaël; Jacobs, David M; Camachon, Cyril; Goulon, Cedric; Montagne, Gilles

    2009-12-01

    This study (a) compares the effectiveness of different types of feedback for novices who learn to land a virtual aircraft in a fixed-base flight simulator and (b) analyzes the informational variables that learners come to use after practice. An extensive body of research exists concerning the informational variables that allow successful landing. In contrast, few studies have examined how the attention of pilots can be directed toward these sources of information. In this study, 15 participants were asked to land a virtual Cessna 172 on 245 trials while trying to follow the glide-slope area as accurately as possible. Three groups of participants practiced under different feedback conditions: with self-controlled concurrent feedback (the self-controlled group), with imposed concurrent feedback (the yoked group), or without concurrent feedback (the control group). The self-controlled group outperformed the yoked group, which in turn outperformed the control group. Removing or manipulating specific sources of information during transfer tests had different effects for different individuals. However, removing the cockpit from the visual scene had a detrimental effect on the performance of the majority of the participants. Self-controlled concurrent feedback helps learners to more quickly attune to the informational variables that allow them to control the aircraft during the approach phase. Knowledge concerning feedback schedules can be used for the design of optimal practice methods for student pilots, and knowledge about the informational variables used by expert performers has implications for the design of cockpits and runways that facilitate the detection of these variables.

  7. Manufacturing process and material selection in concurrent collaborative design of MEMS devices

    NASA Astrophysics Data System (ADS)

    Zha, Xuan F.; Du, H.

    2003-09-01

    In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. Fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, manufacturing process and material hierarchies, and the selection strategy, are first addressed. Then, a fuzzy decision support scheme for the multi-criteria decision-making problem is proposed for estimating, ranking, and selecting possible manufacturing processes, materials, and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed on a client-knowledge server architecture to help the designer find good processes and materials for MEMS devices. The system, one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool, which can be used as a standalone application or a Java applet via the Web. The running sessions of the system are inter-linked with webpages of tutorials and reference pages to explain the facets, fabrication processes, and material choices, and the calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and an interactive knowledge base that can be maintained and updated via the Internet. The use of the developed system, including an operation scenario, user support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
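
    The paper's fuzzy decision support scheme is not reproduced in this abstract, but the general shape of ranking process-material combinations against weighted criteria can be sketched as below. The candidate combinations, criteria, scores, and weights are all invented; the real WebMEMS-MASS system draws capability and property data from a remote database and uses fuzzy rather than crisp scores.

    ```python
    # Minimal weighted-score multi-criteria ranking sketch, in the spirit of
    # (but much simpler than) the paper's fuzzy decision support scheme.
    candidates = {
        # (process, material): criterion scores in [0, 1], all invented
        ("bulk micromachining", "silicon"):        {"cost": 0.8, "precision": 0.70, "compatibility": 0.9},
        ("surface micromachining", "polysilicon"): {"cost": 0.6, "precision": 0.90, "compatibility": 0.8},
        ("LIGA", "nickel"):                        {"cost": 0.3, "precision": 0.95, "compatibility": 0.6},
    }
    weights = {"cost": 0.3, "precision": 0.5, "compatibility": 0.2}

    def score(criterion_scores):
        """Weighted sum of a candidate's criterion scores."""
        return sum(weights[c] * s for c, s in criterion_scores.items())

    # Rank candidates from best to worst overall score.
    for combo, crit in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
        print(f"{combo}: {score(crit):.2f}")
    ```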

  8. Regenerative life support system research

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report contains sections on modeling, experimental activities during the grant period, and topics under consideration for the future. The sections contain discussions of: four concurrent modeling approaches that were being integrated near the end of the period (knowledge-based modeling support infrastructure and database management, object-oriented steady-state simulations for three concepts, steady-state mass-balance engineering tradeoff studies, and object-oriented time-step, quasidynamic simulations of generic concepts); interdisciplinary research activities, beginning with a discussion of RECON lab development and use, and followed by discussions of waste processing research, algae studies and subsystem modeling, low-pressure growth testing of plants, subsystem modeling of plants, control of plant growth using lighting and CO2 supply as variables, the search for and development of lunar soil simulants, preliminary design parameters for a lunar base life support system, and research considerations for food processing in space; and appendix materials, including a discussion of the CELSS Conference, detailed analytical equations for mass-balance modeling, plant modeling equations, and parametric data on existing life support systems for use in modeling.
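
    As a minimal illustration of the steady-state mass-balance modeling mentioned above, the sketch below balances crew CO2 production against uptake by an algae subsystem. The crew metabolic rate is a round textbook figure and the areal uptake rate is purely assumed; neither comes from the report.

    ```python
    # Steady-state mass balance: at steady state, CO2 produced by the crew must
    # equal CO2 fixed by the plant/algae subsystem. Rates are illustrative only.
    crew_size = 4
    co2_per_person = 1.0    # kg CO2 produced per crew member per day (round figure)
    co2_uptake_rate = 0.25  # kg CO2 fixed per m^2 of algae per day (assumed)

    # Solve production = uptake_rate * area for the required growth area.
    required_area = crew_size * co2_per_person / co2_uptake_rate
    print(f"algae area for CO2 balance: {required_area:.1f} m^2")  # 16.0 m^2
    ```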

  9. Pituitary hyperplasia and gigantism in mice caused by a cholera toxin transgene.

    PubMed

    Burton, F H; Hasel, K W; Bloom, F E; Sutcliffe, J G

    1991-03-07

    Cyclic AMP is thought to act as an intracellular second messenger, mediating the physiological response of many cell types to extracellular signals. In the pituitary, growth hormone (GH)-producing cells (somatotrophs) proliferate and produce GH in response to hypothalamic GH-releasing factor, which binds a receptor that stimulates Gs protein activation of adenylyl cyclase. We have now determined whether somatotroph proliferation and GH production are stimulated by cAMP alone, or require concurrent, non-Gs-mediated induction of other regulatory molecules by designing a transgene to induce chronic supraphysiological concentrations of cAMP in somatotrophs. The rat GH promoter was used to express an intracellular form of cholera toxin, a non-cytotoxic and irreversible activator of Gs. Introduction of this transgene into mice caused gigantism, elevated serum GH levels, somatotroph proliferation and pituitary hyperplasia. These results support the direct triggering of these events by cAMP, and illustrate the utility of cholera toxin transgenes as a tool for physiological engineering.

  10. Key technologies for manufacturing and processing sheet materials: A global perspective

    NASA Astrophysics Data System (ADS)

    Demeri, Mahmoud Y.

    2001-02-01

    Modern industrial technologies continue to seek new materials and processes to produce products that meet design and functional requirements. Sheet materials made from ferrous and non-ferrous metals, laminates, composites, and reinforced plastics constitute a large percentage of today’s products, components, and systems. Major manufacturers of sheet products include automotive, aerospace, appliance, and food-packaging industries. The Second Global Symposium on Innovations in Materials Processing & Manufacturing: Sheet Materials is organized to provide a forum for presenting advances in sheet processing and manufacturing by worldwide researchers and engineers from industrial, research, and academic centers. The symposium, sponsored by the TMS Materials Processing & Manufacturing Division (MPMD), was planned for the 2001 TMS Annual Meeting, New Orleans, Louisiana, February 11-15, 2001. This article is a review of key papers submitted for publication in the concurrent volume. The selected papers present significant developments in the rapidly expanding areas of advanced sheet materials, innovative forming methods, industrial applications, primary and secondary processing, composite processing, and numerical modeling of manufacturing processes.

  11. An Annotated Reading List for Concurrent Engineering

    DTIC Science & Technology

    1989-07-01

    The seven tools are sometimes referred to as the seven old tools.) Ishikawa, Kaoru, What is Total Quality Control? The Japanese Way, Prentice-Hall...some solutions. * Ishikawa (1982) presents a practical guide (with easy-to-use tools) for implementing quality control at the working level...study of ...engineering for the last two years. Ishikawa, Kaoru, Guide to Quality Control, Kraus International Publications, White Plains, NY, 1982. The

  12. The Role of Concurrent Engineering in Weapons System Acquisition

    DTIC Science & Technology

    1988-12-01

    check sheets, Pareto diagrams, graphs, control charts, and scatter diagrams. Kaoru Ishikawa, Guide to Quality Control, Asian Productivity Organization...Deming [13], Juran [14], and Ishikawa [15]). Managers in the United States and Japan have used techniques of statistics to measure performance and they have...New York (1962). 15. Kaoru Ishikawa, Guide to Quality Control, KRAUS International Publications, White Plains, NY (1982). 16. Robert H. Hayes. St

  13. Systematic and Scalable Testing of Concurrent Programs

    DTIC Science & Technology

    2013-12-16

    The evaluation of CHESS [107] checked eight different programs ranging from process management libraries to a distributed execution engine to a research...tool (§3.1) targets systematic testing of scheduling nondeterminism in multithreaded components of the Omega cluster management system [129], while...tool for systematic testing of multithreaded components of the Omega cluster management system [129]. In particular, §3.1.1 defines a model for

  14. Semiannual Report for Contract NAS1-19480 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1994-06-01

    algorithms for large, irreducibly coupled systems iteratively solve concurrent problems within different subspaces of a Hilbert space, or within different...effective on problems amenable to SIMD solution. Together with researchers at AT&T Bell Labs (Boris Lubachevsky, Albert Greenberg) we have developed...reasonable measurement. In the study of different speedups, various causes of superlinear speedup are also presented. Greenberg, Albert G., Boris D

  15. Submicron Systems Architecture Project

    DTIC Science & Technology

    1981-11-01

    This project is concerned with the architecture, design, and testing of VLSI systems. The principal activities in this report period include: the Tree Machine; COPE, the Homogeneous Machine; Computational Arrays; the Switch-Level Model for MOS Logic Design; Testing; Local Network and Designer Workstations; Self-timed Systems; Characterization of Deadlock-Free Resource Contention; Concurrency Algebra; and Language Design and Logic for Program Verification.

  16. Instructional Design Issues in a Distributed Collaborative Engineering Design (CED) Instructional Environment

    ERIC Educational Resources Information Center

    Koszalka, Tiffany A.; Wu, Yiyan

    2010-01-01

    Changes in engineering practices have spawned changes in engineering education and prompted the use of distributed learning environments. A distributed collaborative engineering design (CED) course was designed to engage engineering students in learning about and solving engineering design problems. The CED incorporated an advanced interactive…

  17. Extensive Reading Program Which Changes Reluctant Engineering Students into Autonomous Learners of English

    NASA Astrophysics Data System (ADS)

    Nishizawa, Hitoshi; Yoshioka, Takayoshi; Itoh, Kazuaki

    This article introduces extensive reading (ER) as an approach to improving the fundamental English communication skills of reluctant EFL learners: average Japanese engineering students. It is distinct from the concurrent translation approach in that learners use English rather than Japanese to grasp the meaning of what they read, and they enjoy reading. In the ER program at Toyota National College of Technology, many students developed a more positive attitude toward English, increased their reading speed, and achieved higher TOEIC scores than students assessed before the ER program was introduced. A comparison of three groups of students showed a strong correlation between their TOEIC scores and the amount they read.

  18. Problem Decomposition and Recomposition in Engineering Design: A Comparison of Design Behavior between Professional Engineers, Engineering Seniors, and Engineering Freshmen

    ERIC Educational Resources Information Center

    Song, Ting; Becker, Kurt; Gero, John; DeBerard, Scott; DeBerard, Oenardi; Reeve, Edward

    2016-01-01

    The authors investigated the differences in using problem decomposition and problem recomposition between dyads of engineering experts, engineering seniors, and engineering freshmen. Participants worked in dyads to complete an engineering design challenge within 1 hour. The entire design process was video and audio recorded. After the design…

  19. Engineering design skills coverage in K-12 engineering program curriculum materials in the USA

    NASA Astrophysics Data System (ADS)

    Chabalengula, Vivien M.; Mumba, Frackson

    2017-11-01

    The current K-12 Science Education framework and Next Generation Science Standards (NGSS) in the United States emphasise the integration of engineering design in science instruction to promote scientific literacy and engineering design skills among students. As such, many engineering education programmes have developed curriculum materials that are being used in K-12 settings. However, little is known about the nature and extent to which the engineering design skills outlined in NGSS are addressed in these K-12 engineering education programme curriculum materials. We analysed nine K-12 engineering education programmes for the nature and extent of engineering design skills coverage. Results show that developing possible solutions and the actual designing of prototypes were the most highly covered engineering design skills; specification of clear goals, criteria, and constraints received medium coverage; and defining and identifying an engineering problem, optimising the design solution, demonstrating how a prototype works, and making iterations to improve designs received low coverage. These trends were similar across grade levels and across discipline-specific curriculum materials. These results have implications for engineering design-integrated science teaching and learning in K-12 settings.

  20. Civil engineering reference guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merritt, F.S.

    1986-01-01

    The civil engineering reference guide contains the following: Structural theory. Structural steel design. Concrete design and construction. Wood design and construction. Bridge engineering. Geotechnical engineering. Water engineering. Environmental engineering. Surveying.

  1. The Influence of Toy Design Activities on Middle School Students' Understanding of the Engineering Design Processes

    NASA Astrophysics Data System (ADS)

    Zhou, Ninger; Pereira, Nielsen L.; George, Tarun Thomas; Alperovich, Jeffrey; Booth, Joran; Chandrasegaran, Senthil; Tew, Jeffrey David; Kulkarni, Devadatta M.; Ramani, Karthik

    2017-10-01

    The societal demand for inspiring and engaging science, technology, engineering, and mathematics (STEM) students and preparing our workforce for the emerging creative economy has necessitated developing students' self-efficacy and understanding of engineering design processes from as early as elementary school levels. Hands-on engineering design activities have shown the potential to promote middle school students' self-efficacy and understanding of engineering design processes. However, traditional classrooms often lack hands-on engineering design experiences, leaving students unprepared to solve real-world design problems. In this study, we introduce the framework of a toy design workshop and investigate the influence of the workshop activities on students' understanding of and self-efficacy beliefs in engineering design. Using a mixed method approach, we conducted quantitative analyses to show changes in students' engineering design self-efficacy and qualitative analyses to identify students' understanding of the engineering design processes. Findings show that among the 24 participants, there is a significant increase in students' self-efficacy beliefs after attending the workshop. We also identified major themes such as design goals and prototyping in students' understanding of engineering design processes. This research provides insights into the key elements of middle school students' engineering design learning and the benefits of engaging middle school students in hands-on toy design workshops.

  2. The Interaction of Sexual Validation, Criminal Justice Involvement, and Sexually Transmitted Infection Risk Among Adolescent and Young Adult Males.

    PubMed

    Matson, Pamela A; Towe, Vivian; Ellen, Jonathan M; Chung, Shang-En; Sherman, Susan G

    2018-03-01

    Young men who have been involved with the criminal justice system are more likely to have concurrent sexual partners, a key driver of sexually transmitted infections. The value men place on having sexual relationships to validate themselves may play an important role in understanding this association. Data were from a household survey. Young men (N = 132), aged 16 to 24 years, self-reported whether they ever spent time in jail or juvenile detention and whether they had sexual partnerships that overlapped in time. A novel scale, "Validation through Sex and Sexual Relationships" (VTSSR), assessed the importance young men place on sex and sexual relationships (α = 0.91). Weighted logistic regression accounted for the sampling design. The mean (SD) VTSSR score was 23.7 (8.8) with no differences by race. Both criminal justice involvement (CJI) (odds ratio [OR], 3.69; 95% confidence interval [CI], 1.12-12.1) and sexual validation (OR, 1.10; 95% CI, 1.04-1.16) were associated with increased odds of concurrency; however, CJI did not remain associated with concurrency in the fully adjusted model. There was effect modification: CJI was associated with concurrency among those who scored high on sexual validation (OR, 9.18; 95% CI, 1.73-48.6); however, there was no association among those who scored low on sexual validation. Racial differences were observed between CJI and concurrency, but not between sexual validation and concurrency. Sexual validation may be an important driver of concurrency for men who have been involved with the criminal justice system. Study findings have important implications for how sexual validation may explain racial differences in rates of concurrency.

  3. Concurrent prediction of muscle and tibiofemoral contact forces during treadmill gait.

    PubMed

    Guess, Trent M; Stylianou, Antonis P; Kia, Mohammad

    2014-02-01

    Detailed knowledge of knee kinematics and dynamic loading is essential for improving the design and outcomes of surgical procedures, tissue engineering applications, prosthetics design, and rehabilitation. This study used publicly available data provided by the "Grand Challenge Competition to Predict in-vivo Knee Loads" for the 2013 American Society of Mechanical Engineers Summer Bioengineering Conference (Fregly et al., 2012, "Grand Challenge Competition to Predict in vivo Knee Loads," J. Orthop. Res., 30, pp. 503-513) to develop a full body, musculoskeletal model with subject specific right leg geometries that can concurrently predict muscle forces, ligament forces, and knee and ground contact forces. The model includes representation of foot/floor interactions and predicted tibiofemoral joint loads were compared to measured tibial loads for two different cycles of treadmill gait. The model used anthropometric data (height and weight) to scale the joint center locations and mass properties of a generic model and then used subject bone geometries to more accurately position the hip and ankle. The musculoskeletal model included 44 muscles on the right leg, and subject specific geometries were used to create a 12 degrees-of-freedom anatomical right knee that included both patellofemoral and tibiofemoral articulations. Tibiofemoral motion was constrained by deformable contacts defined between the tibial insert and femoral component geometries and by ligaments. Patellofemoral motion was constrained by contact between the patellar button and femoral component geometries and the patellar tendon. Shoe geometries were added to the feet, and shoe motion was constrained by contact between three shoe segments per foot and the treadmill surface. Six-axis springs constrained motion between the feet and shoe segments. Experimental motion capture data provided input to an inverse kinematics stage, and the final forward dynamics simulations tracked joint angle errors for the left leg and upper body and tracked muscle length errors for the right leg. The one cycle RMS errors between the predicted and measured tibia contact were 178 N and 168 N for the medial and lateral sides for the first gait cycle and 209 N and 228 N for the medial and lateral sides for the faster second gait cycle. One cycle RMS errors between predicted and measured ground reaction forces were 12 N, 13 N, and 65 N in the anterior-posterior, medial-lateral, and vertical directions for the first gait cycle and 43 N, 15 N, and 96 N in the anterior-posterior, medial-lateral, and vertical directions for the second gait cycle.
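
    The RMS error metric used above to compare predicted and measured contact forces over a gait cycle is straightforward to reproduce. The sketch below uses synthetic force traces, since the Grand Challenge data themselves are distributed separately; the waveform shape and noise level are invented.

    ```python
    import numpy as np

    def rms_error(predicted: np.ndarray, measured: np.ndarray) -> float:
        """Root-mean-square error between two sampled force traces."""
        return float(np.sqrt(np.mean((predicted - measured) ** 2)))

    # Hypothetical medial tibiofemoral contact force (N) sampled over one cycle.
    t = np.linspace(0.0, 1.0, 101)
    measured = 1200 * np.abs(np.sin(np.pi * t))      # stand-in measured trace
    rng = np.random.default_rng(0)
    predicted = measured + rng.normal(0, 150, t.size)  # model prediction with noise

    print(f"RMS error: {rms_error(predicted, measured):.0f} N")
    ```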

  4. Proceedings of the Twenty-Fourth Annual Conference of the Cognitive Science Society

    DTIC Science & Technology

    2002-01-01

    Walter Schneider (University of Pittsburgh), A Cognitive Approach to Designing Human Error...Experiment Design and Comparison of Human and Model Data: David Diller and Yvette Tenney (BBN Technologies), An EPIC-Soar Model of Concurrent...the Roles of Design History and Affordances in the HIPE Theory of Function

  5. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people, and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering, are important elements of the systematic planning and analysis needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
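
    As a minimal example of the Statistical Process Control techniques referenced above, the sketch below computes a Shewhart-style center line and three-sigma control limits. It estimates sigma from the sample standard deviation rather than the moving-range estimators used on proper control charts, and the measurements are invented.

    ```python
    import statistics

    def control_limits(samples):
        """Shewhart-style center line and 3-sigma control limits."""
        mean = statistics.fmean(samples)
        sigma = statistics.stdev(samples)  # simplified sigma estimate
        return mean - 3 * sigma, mean, mean + 3 * sigma

    # Hypothetical process measurements (e.g., a critical dimension in mm).
    data = [10.02, 9.98, 10.05, 10.01, 9.97, 10.03, 9.99, 10.04, 10.00, 9.96]
    lcl, center, ucl = control_limits(data)
    print(f"LCL={lcl:.3f}  CL={center:.3f}  UCL={ucl:.3f}")

    # Points outside the limits would signal special-cause variation.
    print("out-of-control points:", [x for x in data if not lcl <= x <= ucl])
    ```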

  6. Acquisition of choice in concurrent chains: Assessing the cumulative decision model.

    PubMed

    Grace, Randolph C

    2016-05-01

    Concurrent chains is widely used to study pigeons' choice between terminal links that can vary in delay, magnitude, or probability of reinforcement. We review research on the acquisition of choice in this procedure. Acquisition has been studied with a variety of research designs, and some studies have incorporated no-food trials to allow for timing and choice to be observed concurrently. Results show that: Choice can be acquired rapidly within sessions when terminal links change unpredictably; under steady-state conditions, acquisition depends on both initial- and terminal-link schedules; and initial-link responding is mediated by learning about the terminal-link stimulus-reinforcer relations. The cumulative decision model (CDM) proposed by Christensen and Grace (2010) and Grace and McLean (2006, 2015) provides a good description of within-session acquisition, and correctly predicts the effects of initial and terminal-link schedules in steady-state designs (Grace, 2002a). Questions for future research include how abrupt shifts in preference within individual sessions and temporal control of terminal-link responding can be modeled. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. 78 FR 38718 - Lists of Designated Primary Medical Care, Mental Health, and Dental Health Professional Shortage...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-27

    .... Requests that come from other sources are referred to the PCOs for their review and concurrence. In.... Annually, lists of designated HPSAs are made available to all PCOs, state medical and dental societies, and...

  8. Residential Interior Design as Complex Composition: A Case Study of a High School Senior's Composing Process

    ERIC Educational Resources Information Center

    Smagorinsky, Peter; Zoss, Michelle; Reed, Patty M.

    2006-01-01

    This research analyzed the composing processes of one high school student as she designed the interiors of homes for a course in interior design. Data included field notes, an interview with the teacher, artifacts from the class, and the focal student's concurrent and retrospective protocols in relation to her design of home interiors. The…

  9. Live-cell imaging of cell signaling using genetically encoded fluorescent reporters.

    PubMed

    Ni, Qiang; Mehta, Sohum; Zhang, Jin

    2018-01-01

    Synergistic advances in fluorescent protein engineering and live-cell imaging techniques in recent years have fueled the concurrent development and application of genetically encoded fluorescent reporters that are tailored for tracking signaling dynamics in living systems over multiple length and time scales. These biosensors are uniquely suited for this challenging task, owing to their specificity, sensitivity, and versatility, as well as to the noninvasive and nondestructive nature of fluorescence and the power of genetic encoding. Over the past 10 years, a growing number of fluorescent reporters have been developed for tracking a wide range of biological signals in living cells and animals, including second messenger and metabolite dynamics, enzyme activation and activity, and cell cycle progression and neuronal activity. Many of these biosensors are gaining wide use and are proving to be indispensable for unraveling the complex biological functions of individual signaling molecules in their native environment, the living cell, shedding new light on the structural and molecular underpinnings of cell signaling. In this review, we highlight recent advances in protein engineering that are likely to help expand and improve the design and application of these valuable tools. We then turn our focus to specific examples of live-cell imaging using genetically encoded fluorescent reporters as an important platform for advancing our understanding of G protein-coupled receptor signaling and neuronal activity. © 2017 Federation of European Biochemical Societies.

  10. A Physics-Based Vibrotactile Feedback Library for Collision Events.

    PubMed

    Park, Gunhyuk; Choi, Seungmoon

    2017-01-01

    We present PhysVib: a software solution on the mobile platform that extends an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially-decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods in terms of perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while greatly reducing application development time.
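
    The exponentially-decaying sinusoidal model named in the abstract is easy to sketch. The mapping below from a collision impulse to amplitude, frequency, and decay rate is an assumption for illustration; PhysVib's actual parameterization from the physics engine's contact data is not given here.

    ```python
    import math

    def collision_vibration(t, amplitude, freq_hz, decay_rate):
        """Exponentially decaying sinusoid: a(t) = A * exp(-d*t) * sin(2*pi*f*t)."""
        return amplitude * math.exp(-decay_rate * t) * math.sin(2 * math.pi * freq_hz * t)

    # Hypothetical mapping from a collision impulse to model parameters.
    impulse = 0.8                   # contact impulse from the physics step (invented)
    A, f, d = 1.0 * impulse, 180.0, 40.0  # amplitude, frequency (Hz), decay rate (1/s)

    # Sample 100 ms of the waveform at 1 kHz, as a high-rate renderer might.
    samples = [collision_vibration(n / 1000.0, A, f, d) for n in range(100)]
    print(max(samples), min(samples))
    ```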

  11. Liquid Rocket Booster (LRB) for the Space Transportation System (STS) systems study, volume 2, addendum 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The feasibility of developing and producing a launch vehicle from an external tank (ET) and an engine module that mounts inline to the tankage at the aft end and contains six space transportation main engines (STMEs) was assessed. The primary mission of this launch vehicle would be to place a personnel launch system (PLS) into low Earth orbit (LEO). The vehicle tankage and the assembly of the engine module were evaluated to determine what, if any, manufacturing/production impacts would be incurred if this vehicle were built alongside the current ET at the Michoud Assembly Facility. It was determined that there would be no significant impacts from producing seven of these vehicles per year while concurrently producing 12 ETs per year. Preliminary estimates of both nonrecurring and recurring costs for this vehicle concept were made.

  12. Non-invasive lightweight integration engine for building EHR from autonomous distributed systems.

    PubMed

    Angulo, Carlos; Crespo, Pere; Maldonado, José A; Moner, David; Pérez, Daniel; Abad, Irene; Mandingorra, Jesús; Robles, Montserrat

    2007-12-01

    In this paper we describe Pangea-LE, a message-oriented lightweight data integration engine that allows homogeneous and concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and passes it to the requesting client applications in a flexible XML format. The XML response message can be formatted on demand by appropriate Extensible Stylesheet Language (XSL) transformations in order to meet the needs of client applications. We also present a real deployment in a hospital where Pangea-LE collects and generates an XML view of all the available patient clinical information. The information is presented to healthcare professionals in an Electronic Health Record (EHR) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real setting has been a success due to the non-invasive nature of Pangea-LE, which respects the existing information systems.
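
    The on-demand XSL formatting described above can be illustrated in a few lines of Python using the third-party lxml library. The XML response shape and the stylesheet below are invented stand-ins, not Pangea-LE's actual message formats.

    ```python
    from lxml import etree  # third-party dependency: pip install lxml

    # Invented stand-ins for an engine response and a client-specific stylesheet.
    xml_response = etree.XML(
        "<patient><id>123</id><episode date='2007-01-15'>admission</episode></patient>"
    )
    stylesheet = etree.XML("""
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/patient">
        <summary><xsl:value-of select="episode"/> on <xsl:value-of select="episode/@date"/></summary>
      </xsl:template>
    </xsl:stylesheet>
    """)

    # Compile the stylesheet once, then reformat responses on demand.
    transform = etree.XSLT(stylesheet)
    print(transform(xml_response))  # serializes roughly as: <summary>admission on 2007-01-15</summary>
    ```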

  13. Non-invasive light-weight integration engine for building EHR from autonomous distributed systems.

    PubMed

    Crespo Molina, Pere; Angulo Fernández, Carlos; Maldonado Segura, José A; Moner Cano, David; Robles Viejo, Montserrat

    2006-01-01

    Pangea-LE is a message-oriented lightweight integration engine allowing concurrent access to clinical information from dispersed and heterogeneous data sources. The engine extracts the information and serves it to the requesting client applications in a flexible XML format. This XML response message can be formatted on demand by an appropriate XSL (Extensible Stylesheet Language) transformation in order to fit client application needs. In this article we present a real use case in which Pangea-LE collects and generates "on the fly" a structured view of all the patient clinical information available in a healthcare organisation. This information is presented to healthcare professionals in an EHR (Electronic Health Record) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real environment has been a notable success due to the non-invasive approach, which fully respects the existing information systems.

  14. NASA Engine Icing Research Overview: Aeronautics Evaluation and Test Capabilities (AETC) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2015-01-01

    The occurrence of ice accretion within commercial high bypass aircraft turbine engines has been reported by airlines under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion by the engine. The ice crystals can result in degraded engine performance, loss of thrust control, compressor surge or stall, and flameout of the combustor. The Aviation Safety Program at NASA has taken on the technical challenge of turbofan engine icing caused by ice crystals, which can exist in high altitude convective clouds. The NASA engine icing project consists of an integrated approach with four concurrent and ongoing research elements, each of which feeds critical information to the next element. The project objective is to gain understanding of high altitude ice crystals by developing knowledge bases and test facilities for testing full engines and engine components. The first element is to utilize a highly instrumented aircraft to characterize the high altitude convective cloud environment. The second element is the enhancement of the Propulsion Systems Laboratory altitude test facility for gas turbine engines to include the addition of an ice crystal cloud. The third element is basic research of the fundamental physics associated with ice crystal ice accretion. The fourth and final element is the development of computational tools with the goal of simulating the effects of ice crystal ingestion on compressor and gas turbine engine performance. The NASA goal is to provide knowledge to the engine and aircraft manufacturing communities to help mitigate or eliminate turbofan engine interruptions, engine damage, and failures due to ice crystal ingestion.

  15. Concurrent simulation of a parallel jaw end effector

    NASA Technical Reports Server (NTRS)

    Bynum, Bill

    1985-01-01

    A system of programs developed to aid in the design and development of the command/response protocol between a parallel jaw end effector and the strategic planner program controlling it is presented. The system executes concurrently with the LISP controlling program to generate a graphical image of the end effector that moves in approximately real time in response to commands sent from the controlling program. Concurrent execution of the simulation program is useful for revealing flaws in the communication command structure arising from the asynchronous nature of the message traffic between the end effector and the strategic planner. Software simulation helps to minimize the number of hardware changes necessary to the microprocessor driving the end effector because of changes in the communication protocol. The simulation of other actuator devices can be easily incorporated into the system of programs by using the underlying support that was developed for the concurrent execution of the simulation process and the communication between it and the controlling program.
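
    The asynchronous command/response traffic described above is easy to reproduce in miniature. The following Python sketch is offered only as an illustration (the original system used LISP and a microprocessor-driven end effector, and the message names here are invented): a simulator runs concurrently with a controller and the two exchange messages over queues.

      import queue
      import threading

      commands, responses = queue.Queue(), queue.Queue()

      def simulator():
          """Stand-in for the end-effector simulation process."""
          position = 0.0
          while True:
              cmd, arg = commands.get()
              if cmd == "quit":
                  break
              if cmd == "move_jaw":
                  position = arg
                  responses.put(("ack", position))  # response lags the command

      sim = threading.Thread(target=simulator)
      sim.start()

      # Controller side: several commands are queued before any response is
      # read, mimicking the asynchronous traffic that exposes protocol flaws.
      for target in (1.0, 2.5, 0.5):
          commands.put(("move_jaw", target))
      commands.put(("quit", None))

      for _ in range(3):  # one acknowledgement per move command
          print(responses.get())
      sim.join()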

  16. Algorithms and software for nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Gilbertsen, Noreen D.; Neal, Mark O.

    1989-01-01

    The objective of this research is to develop efficient methods for explicit time integration in nonlinear structural dynamics for computers which utilize both concurrency and vectorization. As a framework for these studies, the program WHAMS, which is described in Explicit Algorithms for the Nonlinear Dynamics of Shells (T. Belytschko, J. I. Lin, and C.-S. Tsay, Computer Methods in Applied Mechanics and Engineering, Vol. 42, 1984, pp. 225 to 251), is used. Two factors make the development of efficient concurrent explicit time integration a challenge in a structural dynamics program: (1) the need for a variety of element types, which complicates the scheduling-allocation problem; and (2) the need for different time steps in different parts of the mesh, here called mixed delta-t integration, so that a few stiff elements do not reduce the time steps throughout the mesh.
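
    The mixed delta-t idea can be made concrete with a small sketch. The following Python fragment is a simplified illustration rather than WHAMS's actual algorithm: it computes each element's stable explicit time step from the Courant limit and assigns stiff elements to subcycling groups that take 2**k substeps per master step.

      import math

      def stable_dt(length, youngs, density):
          """Courant limit for a rod element: dt <= L / c, c = sqrt(E / rho)."""
          return length / math.sqrt(youngs / density)

      def time_step_groups(elements, safety=0.9):
          """Assign each element to a subcycling group; group k takes 2**k
          substeps per master step, so a few stiff elements do not drag the
          whole mesh down to their small stable time step."""
          dts = [safety * stable_dt(*e) for e in elements]
          dt_master = max(dts)
          groups = {}
          for i, dt in enumerate(dts):
              k = max(0, math.ceil(math.log2(dt_master / dt)))
              groups.setdefault(k, []).append(i)
          return dt_master, groups

      # (length [m], Young's modulus [Pa], density [kg/m^3]); illustrative values
      elements = [(0.1, 210e9, 7800.0), (0.1, 210e9, 7800.0),
                  (0.01, 210e9, 7800.0)]
      print(time_step_groups(elements))  # the short, stiff element needs 16 substeps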

  17. Nuclei-mode particulate emissions and their response to fuel sulfur content and primary dilution during transient operations of old and modern diesel engines.

    PubMed

    Liu, Z Gerald; Vasys, Victoria N; Kittelson, David B

    2007-09-15

    The effects of fuel sulfur content and primary dilution on PM number emissions were investigated during transient operations of an old and a modern diesel engine. Emissions were also studied during steady-state operations in order to confirm consistency with previous findings. Testing methods were concurrent with those implemented by the EPA to regulate PM mass emissions, including the use of the Federal Transient Testing Procedure-Heavy Duty cycle to simulate transient conditions and the use of a Critical Flow Venturi-Constant Volume System to provide primary dilution. Steady-state results were found to be consistent with previous studies in that nuclei-mode particulate emissions were largely reduced when lower-sulfur content fuel was used in the newer engine, while the nuclei-mode PM emissions from the older engine were much less affected by fuel sulfur content. The transient results, however, show that the total number of nuclei-mode PM emissions from both engines increases with fuel sulfur content, although this effect is only seen under the higher primary dilution ratios with the older engine. Transient results further show that higher primary dilution ratios increase total nuclei-mode PM number emissions in both engines.

  18. Collaborative engineering-design support system

    NASA Technical Reports Server (NTRS)

    Lee, Dong HO; Decker, D. Richard

    1994-01-01

    Designing engineering objects requires knowledge from many engineers in different domains, and cooperative work among engineering designers is needed to complete a design. Revisions of a design are time consuming, especially if designers work at a distance and with different design description formats. In order to reduce the design cycle, there needs to be a sharable design description for the engineering community, one that is electronically transportable. Design is a process of integration that is not easy to define definitively. This paper presents Design Script, a generic engineering design knowledge representation scheme that can be applied in any engineering domain. The Design Script is developed through encapsulation of common design activities and basic design components based on problem decomposition. It is implemented using CLIPS with a Windows NT graphical user interface. The physical relationships between engineering objects and their subparts can be constructed in a hierarchical manner. The same design process is repeatedly applied at each given level of the hierarchy and recursively into lower levels of the hierarchy. Each class of the structure can be represented using the Design Script.

  19. Vapor port and groundwater sampling well

    DOEpatents

    Hubbell, Joel M.; Wylie, Allan H.

    1996-01-01

    A method and apparatus have been developed for combining groundwater monitoring wells with unsaturated-zone vapor sampling ports. The apparatus allows concurrent monitoring of both the unsaturated and the saturated zone from the same well at contaminated areas. The innovative well design allows for concurrent sampling of groundwater and volatile organic compounds (VOCs) in the vadose (unsaturated) zone from a single well, saving considerable time and money. The sample tubes are banded to the outer well casing during installation of the well casing.

  20. Vapor port and groundwater sampling well

    DOEpatents

    Hubbell, J.M.; Wylie, A.H.

    1996-01-09

    A method and apparatus have been developed for combining groundwater monitoring wells with unsaturated-zone vapor sampling ports. The apparatus allows concurrent monitoring of both the unsaturated and the saturated zone from the same well at contaminated areas. The innovative well design allows for concurrent sampling of groundwater and volatile organic compounds (VOCs) in the vadose (unsaturated) zone from a single well, saving considerable time and money. The sample tubes are banded to the outer well casing during installation of the well casing. 10 figs.

  1. Locality Aware Concurrent Start for Stencil Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Gao, Guang R.; Manzano Franco, Joseph B.

    Stencil computations are at the heart of many physical simulations used in scientific codes. Thus, there exists a plethora of optimization efforts for this family of computations. Among these techniques, tiling techniques that allow concurrent start have proven to be very efficient in providing better performance for these critical kernels. Nevertheless, with many-core designs being the norm, these optimization techniques might not be able to fully exploit locality (both spatial and temporal) on multiple levels of the memory hierarchy without compromising parallelism. It is no longer true that the machine can be seen as a homogeneous collection of nodes with caches, main memory and an interconnect network. New architectural designs exhibit complex grouping of nodes, cores, threads, caches and memory connected by an ever evolving network-on-chip design. These new designs may benefit greatly from carefully crafted schedules and groupings that encourage parallel actors (i.e., threads, cores or nodes) to be aware of the computational history of other actors in close proximity. In this paper, we provide an efficient tiling technique that allows hierarchical concurrent start for memory hierarchy aware tile groups. Each execution schedule and tile shape exploit the available parallelism, load balance and locality present in the given applications. We demonstrate our technique on the Intel Xeon Phi architecture with selected and representative stencil kernels. We show improvement ranging from 5.58% to 31.17% over existing state-of-the-art techniques.
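
    The record's hierarchical technique is considerably more elaborate than can be shown here, but the essence of concurrent start can be illustrated with classic overlapped (ghost-zone) tiling: each tile redundantly computes a halo so that every tile can begin at time step zero, independently of its neighbors. The 1D Python sketch below is an assumed, simplified stand-in, not the paper's scheme.

      import numpy as np

      def jacobi_step(u):
          """One step of a 3-point averaging stencil; endpoints held fixed."""
          v = u.copy()
          v[1:-1] = (u[:-2] + u[1:-1] + u[2:]) / 3.0
          return v

      def overlapped_tiles(u, steps, tile):
          """Advance `steps` time steps with ghost-zone tiling. Each tile owns
          `tile` cells plus a halo of width `steps`, so all tiles can start at
          t = 0 in parallel (concurrent start) at the cost of redundant work."""
          n, out = len(u), np.empty_like(u)
          for start in range(0, n, tile):
              lo, hi = max(0, start - steps), min(n, start + tile + steps)
              block = u[lo:hi].copy()
              for _ in range(steps):
                  block = jacobi_step(block)
              width = min(tile, n - start)
              out[start:start + width] = block[start - lo:start - lo + width]
          return out

      u = np.random.rand(64)
      reference = u.copy()
      for _ in range(4):
          reference = jacobi_step(reference)
      assert np.allclose(overlapped_tiles(u, steps=4, tile=16), reference)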

  2. 5 CFR 5502.105 - Agency procedures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ....105 Administrative Personnel DEPARTMENT OF HEALTH AND HUMAN SERVICES SUPPLEMENTAL FINANCIAL DISCLOSURE REQUIREMENTS FOR EMPLOYEES OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES § 5502.105 Agency procedures. (a) The designated agency ethics official or, with the concurrence of the designated agency ethics official, each of...

  3. 5 CFR 5502.105 - Agency procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....105 Administrative Personnel DEPARTMENT OF HEALTH AND HUMAN SERVICES SUPPLEMENTAL FINANCIAL DISCLOSURE REQUIREMENTS FOR EMPLOYEES OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES § 5502.105 Agency procedures. (a) The designated agency ethics official or, with the concurrence of the designated agency ethics official, each of...

  4. 5 CFR 5502.105 - Agency procedures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ....105 Administrative Personnel DEPARTMENT OF HEALTH AND HUMAN SERVICES SUPPLEMENTAL FINANCIAL DISCLOSURE REQUIREMENTS FOR EMPLOYEES OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES § 5502.105 Agency procedures. (a) The designated agency ethics official or, with the concurrence of the designated agency ethics official, each of...

  5. 5 CFR 5502.105 - Agency procedures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....105 Administrative Personnel DEPARTMENT OF HEALTH AND HUMAN SERVICES SUPPLEMENTAL FINANCIAL DISCLOSURE REQUIREMENTS FOR EMPLOYEES OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES § 5502.105 Agency procedures. (a) The designated agency ethics official or, with the concurrence of the designated agency ethics official, each of...

  6. 5 CFR 5502.105 - Agency procedures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ....105 Administrative Personnel DEPARTMENT OF HEALTH AND HUMAN SERVICES SUPPLEMENTAL FINANCIAL DISCLOSURE REQUIREMENTS FOR EMPLOYEES OF THE DEPARTMENT OF HEALTH AND HUMAN SERVICES § 5502.105 Agency procedures. (a) The designated agency ethics official or, with the concurrence of the designated agency ethics official, each of...

  7. PARTISN Research and FleCSI Updates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Womeldorff, Geoffrey Alan; Payne, Joshua Estes; Bergen, Benjamin Karl

    These are slides for a presentation on PARTISN Research and FleCSI Updates. The following topics are covered: SNAP vs PARTISN, Background Research, Production Code (structural design and changes, kernel design and implementation, lessons learned), NuT IMC Proxy, FleCSI Update (design and lessons learned). It can all be summarized in the following manner: Kokkos was shown to be effective in FY15 in implementing a C++ version of SNAP's kernel. This same methodology was applied to a production IC code, PARTISN. This was a much more complex endeavour than in FY15 for many reasons: a C++ kernel embedded in Fortran, overloading Fortran memory allocations, general language interoperability, and a fully fleshed out production code versus a simplified proxy code. Lessons learned are legion. In no particular order: interoperability between Fortran and C++ was really not that hard, and a useful engineering effort. Tracking down all necessary memory allocations for a kernel in a production code is pretty hard. Modifying a production code to work for more than a handful of use cases is also pretty hard. Figuring out the toolchain that will allow a successful implementation of design decisions is quite hard, if making use of "bleeding edge" design choices. In terms of performance, production code concurrency architecture can be a virtual showstopper, being too complex to easily rewrite and test in a short period of time, or depending on tool features which do not exist yet. Ultimately, while the tools used in this work were not successful in speeding up the production code, they helped to identify how work would be done, and provided requirements to tools.

  8. New Methods in Design Education: The Systemic Methodology and the Use of Sketch in the Conceptual Design Stage

    ERIC Educational Resources Information Center

    Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis

    2011-01-01

    This study describes the application of a new concurrent product design methodology in the context of industrial design education. The sketch has been utilized many times as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and a little out of the context of the reality needs that the…

  9. Space Transportation Main Engine

    NASA Technical Reports Server (NTRS)

    Monk, Jan C.

    1992-01-01

    The topics are presented in viewgraph form and include the following: Space Transportation Main Engine (STME) definition, design philosophy, robust design, maximum design condition, casting vs. machined and welded forgings, operability considerations, high reliability design philosophy, engine reliability enhancement, low cost design philosophy, engine systems requirements, STME schematic, fuel turbopump, liquid oxygen turbopump, main injector, and gas generator. The major engine components of the STME and the Space Shuttle Main Engine are compared.

  10. Linkographic Evidence for Concurrent Divergent and Convergent Thinking in Creative Design

    ERIC Educational Resources Information Center

    Goldschmidt, Gabriela

    2016-01-01

    For a long time, the creativity literature has stressed the role of divergent thinking in creative endeavor. More recently, it has been recognized that convergent thinking also has a role in creativity, and the design literature, which sees design as a creative activity a priori, has largely adopted this view: Divergent and convergent thinking are…

  11. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive understandings of the engineering process drive the social and emotional roles and skills used in that process. This comparison of mental models with the process that professional designers use defines a problem space for future studies that investigate how to incorporate engineering practices into elementary classrooms. Recommendations for engineering curriculum development and teacher professional development based on this study are presented.

  12. Orbit Transfer Vehicle (OTV) advanced expander cycle engine point design study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Mellish, J. A.

    1980-01-01

    Engine control techniques were established and new technology requirements were identified. The designs of the components and engine were prepared in sufficient depth to calculate engine and component weights and envelopes, turbopump efficiencies and recirculation leakage rates, and engine performance. Engine design assumptions are presented along with the structural design criteria.

  13. Wave rotor demonstrator engine assessment

    NASA Technical Reports Server (NTRS)

    Snyder, Philip H.

    1996-01-01

    The objective of the program was to determine a wave rotor demonstrator engine concept using the Allison 250 series engine. The results of the NASA LERC wave rotor effort were used as a basis for the wave rotor design. A wave rotor topped gas turbine engine was identified which incorporates five basic requirements of a successful demonstrator engine. Predicted performance maps of the wave rotor cycle were used along with maps of existing gas turbine hardware in a design point study. The effects of wave rotor topping on the engine cycle and the subsequent need to rematch compressor and turbine sections in the topped engine were addressed. Comparison of performance of the resulting engine is made on the basis of the wave rotor topped engine versus an appropriate baseline engine using common shaft compressor hardware. The topped engine design clearly demonstrates an impressive improvement in shaft horsepower (+11.4%) and SFC (-22%). Off-design, part-power engine performance for the wave rotor topped engine was similarly improved, including at engine idle conditions. Operation of the engine at off-design was closely examined, with wave rotor operation at less than design burner outlet temperatures and rotor speeds. Challenges identified in the development of a demonstrator engine are discussed. A preliminary design was made of the demonstrator engine, including wave rotor to engine transition ducts. Program cost and schedule for a wave rotor demonstrator engine fabrication and test program were developed.

  14. Predictive Design of Interfacial Functionality in Polymer Matrix Composites

    DTIC Science & Technology

    2017-05-24

    structural design criteria. Due to the poor accessibility of interfaces by experimental means, little is known about the molecular definition, defect... is designed to allow for concurrent light scattering measurements, which establishes a unique experimental resource. We were able to leverage this...

  15. Designing for Success: Developing Engineers Who Consider Universal Design Principles

    ERIC Educational Resources Information Center

    Bigelow, Kimberly Edginton

    2012-01-01

    Engineers must design for a diverse group of potential users of their products; however, engineering curricula rarely include an emphasis on universal design principles. This research article details the effectiveness of a design project implemented in a first-year engineering course in an effort to raise awareness of the need for engineers to be…

  16. Development and manufacture of visor for helmet-mounted display

    NASA Astrophysics Data System (ADS)

    Krevor, David H.; McNelly, Gregg; Skubon, John; Speirs, Robert

    2004-01-01

    The manufacturing design and process development for the Visor for the JHMCS (Joint Helmet Mounted Cueing System) are discussed. The JHMCS system is a Helmet Mounted Display (HMD) system currently flying on the F-15, F-16 and F/A-18 aircraft. The Visor manufacturing processes are essential to both system performance and economy. The Visor functions both as the system optical combiner and personal protective equipment for the pilot. The Visor material is optical polycarbonate. For a military HMD system, the mechanical and environmental properties of the Visor are as necessary as the optical properties. The visor must meet stringent dimensional requirements to assure adequate system optical performance. Injection molding can provide dimensional fidelity to the requirements, if done properly. Concurrent design of the visor and the tool (i.e., the injection mold) is essential. The concurrent design necessarily considers manufacturing operations and the use environment of the Visor. Computer modeling of the molding process is a necessary input to the mold design. With proper attention to product design and tool development, it is possible to improve upon published standard dimensional tolerances for molded polycarbonate articles.

  17. Longitudinal in vivo evaluation of bone regeneration by combined measurement of multi-pinhole SPECT and micro-CT for tissue engineering

    NASA Astrophysics Data System (ADS)

    Lienemann, Philipp S.; Metzger, Stéphanie; Kiveliö, Anna-Sofia; Blanc, Alain; Papageorgiou, Panagiota; Astolfo, Alberto; Pinzer, Bernd R.; Cinelli, Paolo; Weber, Franz E.; Schibli, Roger; Béhé, Martin; Ehrbar, Martin

    2015-05-01

    Over the last decades, great strides were made in the development of novel implants for the treatment of bone defects. The increasing versatility and complexity of these implant designs call for concurrent advances in the means to assess in vivo the course of induced bone formation in preclinical models. Since its discovery, micro-computed tomography (micro-CT) has excelled as a powerful high-resolution technique for non-invasive assessment of newly formed bone tissue. However, micro-CT fails to provide spatiotemporal information on biological processes ongoing during bone regeneration. Conversely, due to its versatile applicability and cost-effectiveness, single photon emission computed tomography (SPECT) would be an ideal technique for assessing such biological processes with high sensitivity and, for nuclear imaging, comparably high resolution (<1 mm). Herein, we employ modularly designed poly(ethylene glycol)-based hydrogels that release bone morphogenetic protein to guide the healing of critical-sized calvarial bone defects. By combined in vivo longitudinal multi-pinhole SPECT and micro-CT evaluations we determine the spatiotemporal course of bone formation and remodeling within this synthetic hydrogel implant. End point evaluations by high-resolution micro-CT and histological evaluation confirm the value of this approach to follow and optimize bone-inducing biomaterials.

  18. Fusion energy with lasers, direct drive targets, and dry wall chambers

    NASA Astrophysics Data System (ADS)

    Sethian, J. D.; Friedman, M.; Lehmberg, R. H.; Myers, M.; Obenschain, S. P.; Giuliani, J.; Kepple, P.; Schmitt, A. J.; Colombant, D.; Gardner, J.; Hegeler, F.; Wolford, M.; Swanekamp, S. B.; Weidenheimer, D.; Welch, D.; Rose, D.; Payne, S.; Bibeau, C.; Baraymian, A.; Beach, R.; Schaffers, K.; Freitas, B.; Skulina, K.; Meier, W.; Latkowski, J.; Perkins, L. J.; Goodin, D.; Petzoldt, R.; Stephens, E.; Najmabadi, F.; Tillack, M.; Raffray, R.; Dragojlovic, Z.; Haynes, D.; Peterson, R.; Kulcinski, G.; Hoffer, J.; Geller, D.; Schroen, D.; Streit, J.; Olson, C.; Tanaka, T.; Renk, T.; Rochau, G.; Snead, L.; Ghoneim, N.; Lucas, G.

    2003-12-01

    A coordinated, focused effort is underway to develop Laser Inertial Fusion Energy. The key components are developed in concert with one another and the science and engineering issues are addressed concurrently. Recent advances include: target designs have been evaluated that show it could be possible to achieve the high gains (>100) needed for a practical fusion system. These designs feature a low-density CH foam that is wicked with solid DT and over-coated with a thin high-Z layer. These results have been verified with three independent one-dimensional codes, and are now being evaluated with two- and three-dimensional codes. Two types of lasers are under development: Krypton Fluoride (KrF) gas lasers and Diode Pumped Solid State Lasers (DPSSL). Both have recently achieved repetitive 'first light', and both have made progress in meeting the fusion energy requirements for durability, efficiency, and cost. This paper also presents the advances in development of chamber operating windows (target survival plus no wall erosion), final optics (aluminium at grazing incidence has high reflectivity and exceeds the required laser damage threshold), target fabrication (demonstration of smooth DT ice layers grown over foams, batch production of foam shells, and appropriate high-Z overcoats), and target injection (new facility for target injection and tracking studies).

  19. Expert vs. novice: Problem decomposition/recomposition in engineering design

    NASA Astrophysics Data System (ADS)

    Song, Ting

    The purpose of this research was to investigate differences in the use of problem decomposition and problem recomposition among dyads of engineering experts, dyads of engineering seniors, and dyads of engineering freshmen. Fifty participants took part in this study. Ten were engineering design experts, 20 were engineering seniors, and 20 were engineering freshmen. Participants worked in dyads to complete an engineering design challenge within an hour. The entire design process was video and audio recorded. After the design session, members participated in a group interview. This study used protocol analysis as the methodology. Video and audio data were transcribed, segmented, and coded. Two coding systems, the FBS ontology and "levels of the problem," were used in this study. A series of statistical techniques were used to analyze the data. Interview data and participants' design sketches also served as supplemental data to help answer the research questions. By analyzing the quantitative and qualitative data, it was found that students used less problem decomposition and problem recomposition than expert engineers in engineering design. This result implies that engineering education should place more importance on teaching problem decomposition and problem recomposition. Students were found to spend less cognitive effort than expert engineers when considering the problem as a whole and the interactions between subsystems. In addition, students were also found to spend more cognitive effort when considering details of subsystems. These results showed that students tended to use depth-first decomposition and experts tended to use breadth-first decomposition in engineering design. The use of Function (F), Behavior (B), and Structure (S) among engineering experts, engineering seniors, and engineering freshmen was compared on three levels: Level 1 represents designers considering the problem as an integral whole, Level 2 interactions between subsystems, and Level 3 details of subsystems. The results showed that students used more S on Levels 1 and 3 but less F on Level 1 than engineering experts. The results imply that engineering curricula should improve the teaching of problem definition in engineering design, because students need to understand the problem before solving it.

  20. NREL: News - Solar Decathlon Engineering Design Results Announced

    Science.gov Websites

    Engineering Design Results Announced. Thursday, October 3, 2002. A distinguished panel picked the first-place university in the Engineering Design results announced today at the Department of Energy's (DOE) Solar Decathlon; the University of Maryland remains in third. The Engineering Design panel includes engineers prominent…

  1. Incorporating a Product Archaeology Paradigm across the Mechanical Engineering Curriculum

    ERIC Educational Resources Information Center

    Moore-Russo, Deborah; Cormier, Phillip; Lewis, Kemper; Devendorf, Erich

    2013-01-01

    Historically, the teaching of design theory in an engineering curriculum has been relegated to a senior capstone design experience. Presently, however, engineering design concepts and courses can be found through the entirety of most engineering programs. Educators have recognized that engineering design provides a foundational platform that can…

  2. Engineering Design Education Program for Graduate School

    NASA Astrophysics Data System (ADS)

    Ohbuchi, Yoshifumi; Iida, Haruhiko

    New educational methods for engineering design have been attempted to improve mechanical engineering education for graduate students through collaboration between engineers and designers in education. The education program is based on lectures and practical exercises concerning product design, and has engineering themes and design process themes, i.e., project management, QFD, TRIZ, robust design (Taguchi method), ergonomics, usability, marketing, conception, etc. In the final exercise, all students were able to design a new product related to their own research theme by applying the knowledge and techniques they had learned. Through this method of engineering design education, we have confirmed that graduate students are able to experience technological and creative interest.

  3. Control Synthesis for a Class of Hybrid Systems Subject to Configuration-Based Safety Constraints

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Lin, Feng; Meyer, George

    1997-01-01

    We examine a class of hybrid systems which we call Composite Hybrid Machines (CHM's) that consists of the concurrent (and partially synchronized) operation of Elementary Hybrid Machines (EHM's). Legal behavior, specified by a set of illegal configurations that the CHM may not enter, is to be achieved by the concurrent operation of the CHM with a suitably designed legal controller. In the present paper we focus on the problem of synthesizing a legal controller, whenever such a controller exists. More specifically, we address the problem of synthesizing the minimally restrictive legal controller. A controller is minimally restrictive if, when composed to operate concurrently with another legal controller, it will never interfere with the operation of the other controller and, therefore, can be composed to operate concurrently with any other controller that may be designed to achieve liveness specifications or optimality requirements without the need to reinvestigate or reverify legality of the composite controller. We confine our attention to a special class of CHM's where system dynamics is rate-limited and legal guards are conjunctions or disjunctions of atomic formulas in the dynamic variables (of the type x <= x0 or x >= x0). We present an algorithm for synthesis of the minimally restrictive legal controller. We demonstrate our approach by synthesizing a minimally restrictive controller for a steam boiler (the verification of which recently received a great deal of attention).
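
    The synthesis algorithm itself is beyond a short example, but the flavor of "minimally restrictive" control for rate-limited dynamics with guards of this type can be sketched: the controller intervenes only when one more step at the maximum rate could cross the legal boundary, never earlier. The discrete-time framing and all names below are illustrative assumptions, not the paper's construction.

      def must_intervene(x, x_max, rate_max, dt):
          """Return True when the controller must act now to keep x <= x_max.
          With |dx/dt| <= rate_max, the latest safe moment to intervene is when
          one more dt at full rate could cross the boundary; acting earlier
          would needlessly restrict any controller composed with this one."""
          return x + rate_max * dt > x_max

      # Illustrative run: a variable ramps toward the legal boundary x <= 10.
      x, dt, rate = 0.0, 0.1, 5.0
      while not must_intervene(x, x_max=10.0, rate_max=rate, dt=dt):
          x += rate * dt  # plant moves at its maximum rate
      print(f"controller intervenes at x = {x:.1f}")  # at the boundary, not before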

  4. Lightning safety of animals.

    PubMed

    Gomes, Chandima

    2012-11-01

    This paper addresses a concurrent multidisciplinary problem: animal safety against lightning hazards. In regions where lightning is prevalent, either seasonally or throughout the year, a considerable number of wild, captive and tame animals are injured due to lightning generated effects. The paper discusses all possible injury mechanisms, focusing mainly on animals with commercial value. A large number of cases from several countries have been analyzed. Economically and practically viable engineering solutions are proposed to address the issues related to the lightning threats discussed.

  5. Architectural Guidelines for Multimedia and Hypermedia Data Interchange: Computer Aided Acquisition and Logistics Support/Concurrent Engineering (CALS/ CE) and Electronic Commerce/Electronic Data Interchange (EC/EDI)

    DTIC Science & Technology

    1991-09-01

    other networks. For example, E-mail can be sent to an SNA network through a Softswitch gateway, but at a very slow rate. As discussed in Chapter III... [Table-of-contents fragments: Communication Protocols; New Infrastructures; CALS Test Network (CTN); Industrial Networks; FTS-2000 and ISDN; CALS Operational Resource.]

  6. New Technologies for Space Avionics, 1993

    NASA Technical Reports Server (NTRS)

    Aibel, David W.; Harris, David R.; Bartlett, Dave; Black, Steve; Campagna, Dave; Fernald, Nancy; Garbos, Ray

    1993-01-01

    The report reviews a 1993 effort that investigated issues associated with the development of requirements, with the practice of concurrent engineering and with rapid prototyping, in the development of a next-generation Reaction Jet Drive Controller. This report details lessons learned, the current status of the prototype, and suggestions for future work. The report concludes with a discussion of the vision of future avionics architectures based on the principles associated with open architectures and integrated vehicle health management.

  7. Coverage Metrics for Model Checking

    NASA Technical Reports Server (NTRS)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
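
    As a toy illustration of coverage-guided state-space search (an assumed simplification, not the group's actual tooling): track which transitions have been covered during a bounded exploration and prefer uncovered ones, so the coverage metric both measures progress and steers the search.

      from collections import deque

      def guided_search(transitions, start, budget=1000):
          """Explore a state graph, visiting successors whose (state, succ)
          edge is not yet covered first -- a toy coverage-guided heuristic."""
          covered, seen, frontier = set(), {start}, deque([start])
          while frontier and budget:
              state = frontier.popleft()
              budget -= 1
              # uncovered edges sort first (False < True)
              for nxt in sorted(transitions.get(state, []),
                                key=lambda s: (state, s) in covered):
                  covered.add((state, nxt))
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(nxt)
          return covered

      transitions = {"s0": ["s1", "s2"], "s1": ["s2"], "s2": ["s0"]}
      edges = guided_search(transitions, "s0")
      total = sum(len(v) for v in transitions.values())
      print(f"edge coverage: {len(edges)}/{total}")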

  8. Iteration in Early-Elementary Engineering Design

    NASA Astrophysics Data System (ADS)

    McFarland Kendall, Amber Leigh

    K-12 standards and curricula are beginning to include engineering design as a key practice within Science Technology Engineering and Mathematics (STEM) education. However, there is little research on how the youngest students engage in engineering design within the elementary classroom. This dissertation focuses on iteration as an essential aspect of engineering design, and because research at the college and professional level suggests iteration improves the designer's understanding of problems and the quality of design solutions. My research presents qualitative case studies of students in kindergarten and third-grade as they engage in classroom engineering design challenges which integrate with traditional curricula standards in mathematics, science, and literature. I discuss my results through the lens of activity theory, emphasizing practices, goals, and mediating resources. Through three chapters, I provide insight into how early-elementary students iterate upon their designs by characterizing the ways in which lesson design impacts testing and revision, by analyzing the plan-driven and experimentation-driven approaches that student groups use when solving engineering design challenges, and by investigating how students attend to constraints within the challenge. I connect these findings to teacher practices and curriculum design in order to suggest methods of promoting iteration within open-ended, classroom-based engineering design challenges. This dissertation contributes to the field of engineering education by providing evidence of productive engineering practices in young students and support for the value of engineering design challenges in developing students' participation and agency in these practices.

  9. Orbital transfer rocket engine technology 7.5K-LB thrust rocket engine preliminary design

    NASA Technical Reports Server (NTRS)

    Harmon, T. J.; Roschak, E.

    1993-01-01

    A preliminary design of an advanced LOX/LH2 expander cycle rocket engine producing 7,500 lbf thrust for Orbital Transfer Vehicle missions was completed. Engine system, component, and turbomachinery analyses at both on-design and off-design conditions were completed. The preliminary design analysis results showed that engine requirements and performance goals were met. Computer models are described and model outputs are presented. Engine system assembly layouts, component layouts, and valve and control system analysis are presented. Major design technologies were identified and remaining issues and concerns were listed.

  10. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.

  11. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.

  12. The Complex Dynamics of Student Engagement in Novel Engineering Design Activities

    NASA Astrophysics Data System (ADS)

    McCormick, Mary

    In engineering design, making sense of "messy" design situations is at the heart of the discipline (Schon, 1983); engineers in practice bring structure to design situations by organizing, negotiating, and coordinating multiple aspects (Bucciarelli, 1994; Stevens, Johri, & O'Connor, 2014). In classroom settings, however, students are more often given well-defined, content-focused engineering tasks (Jonassen, 2014). These tasks are based on the assumption that elementary students are unable to grapple with the complexity or open-endedness of engineering design (Crismond & Adams, 2012). The data I present in this dissertation suggest the opposite. I show that students are not only able to make sense of, or frame (Goffman, 1974), complex design situations, but that their framings dynamically involve their nascent abilities for engineering design. The context of this work is Novel Engineering, a larger research project that explores using children's literature as an access point for engineering design. Novel Engineering activities are inherently messy: there are characters with needs, settings with implicit constraints, and rich design situations. In a series of three studies, I show how students' framings of Novel Engineering design activities involve their reasoning and acting as beginning engineers. In the first study, I show two students whose caring for the story characters contributes to their stability in framing the task: they identify the needs of their fictional clients and iteratively design a solution to meet their clients' needs. In the second, I show how students' shifting and negotiating framings influence their engineering assumptions and evaluation criteria. In the third, I show how students' coordinating framings involve navigating a design process to meet clients' needs, classroom expectations, and technical requirements. Collectively, these studies contribute to the literature by documenting students' productive beginnings in engineering design. The implications span research and practice, specifically targeting how we attend to and support students as they engage in engineering design.

  13. Cell Division Synchronization

    DTIC Science & Technology

    The report summarizes the progress in the design and construction of automatic equipment for synchronizing cell division in culture by periodic...Concurrent experiments in hypothermic synchronization of algal cell division are reported.

  14. ALTERNATIVE EXPOSURE MEASUREMENT DESIGNS TO IMPROVE EPIDEMIOLOGICAL STUDY DESIGNS: DETERMINANTS OF TEMPORAL VARIABILITY IN ENVIRONMENTAL CONCENTRATIONS, EXPOSURES, AND BIOMARKERS

    EPA Science Inventory

    The National Human Exposure Assessment Survey in Maryland (NHEXAS-MD) was a longitudinal study of multimedia exposure to metals, pesticides, and polycyclic aromatic compounds (PAHs). Measurements were made and questionnaires were concurrently administered to identify sources o...

  15. 36 CFR 71.3 - Designation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Designation. 71.3 Section 71.3 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR RECREATION... conditions are found to exist concurrently: (1) The area is a unit of the National Park System administered...

  16. 36 CFR 71.3 - Designation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Designation. 71.3 Section 71.3 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR RECREATION... conditions are found to exist concurrently: (1) The area is a unit of the National Park System administered...

  17. Enabling and Enhancing Space Mission Success and Reduction of Risk through the Application of an Integrated Data Architecture

    NASA Technical Reports Server (NTRS)

    Brummett, Robert C.

    2008-01-01

    The engineering phases of design, development, test, and evaluation (DDT and E) and the subsequent planning, preparation, and operation (Ops) of space vehicles in a complex and distributed environment require massive and continuous flows of information across the enterprise and across temporal stages of the vehicle lifecycle. The resulting capabilities at each subsequent stage depend in part on the capture, preparation, storage, and subsequent provision of information from prior stages. The United States National Aeronautics and Space Administration (NASA) is currently designing a fleet of new vehicles that will replace the Space Shuttle and expand space operations and exploration capabilities. This includes the two-stage, human-rated lift vehicle Ares 1, its associated crew vehicle the Orion, and a service module; the heavy-lift cargo vehicle Ares 5 and an associated cargo stage known as the Earth Departure Stage; and a Lunar Lander vehicle that contains a descent stage, an ascent stage, and a habitation module. A variety of ground operations infrastructure, including software and facilities, is being developed concurrently; technology and assembly design and development for equipment such as EVA suits, life support systems, and command and control technologies are also in the pipeline. The development is occurring in a distributed manner, with project deliverables contributed by a large and diverse assortment of vendors and most space-faring nations. Critical information about all of the components, software, and procedures must be shared during the DDT and E phases, then made readily available to the mission operations staff during the planning, preparation, and operations phases, and must also be readily available for system-to-system interactions. The Constellation Data Systems Project (CxDS) is identifying these needs and designing and deploying systems and processes to support them. This paper details the steps and processes that NASA is applying within the Constellation Program to manage this data and information and to ensure that the correct information is available, correctly annotated, and can be provisioned digitally to enhance response times and support engineering analysis and anomaly resolution.

  18. DoD Key Technologies Plan

    DTIC Science & Technology

    1992-07-01

    methodologies; software performance analysis; software testing; and concurrent languages. Finally, efforts in algorithms, which are primarily designed to upgrade... These codes provide a powerful research tool for testing new concepts and designs prior to experimental implementation. DoE's laser program has also... development, and specially designed production facilities. World leadership in both non-fluorinated and fluorinated materials resides in the U.S., but Japan

  19. Computer Design Technology of the Small Thrust Rocket Engines Using CAE / CAD Systems

    NASA Astrophysics Data System (ADS)

    Ryzhkov, V.; Lapshin, E.

    2018-01-01

    The paper presents an algorithm for designing a liquid-propellant small-thrust rocket engine; the process consists of five aggregated stages with feedback. Three stages of the algorithm provide engineering support for the design, and two stages cover the actual engine design. A distinctive feature of the proposed approach is a deep study of the main technical solutions at the engineering analysis stage and interaction with the created knowledge (data) base, which accelerates the process and provides enhanced design quality. Using the multifunctional graphics package Siemens NX makes it possible to obtain the final product, a rocket engine, and a set of design documentation in a fairly short time; the engine design does not require lengthy experimental development.

  20. Development of Engineering Design Education in the Department of Mechanical Engineering at Kanazawa Technical College

    NASA Astrophysics Data System (ADS)

    Yamada, Hirofumi; Ten-Nichi, Michio; Mathui, Hirosi; Nakamura, Akizi

    This paper introduces a method of engineering design education for mechanical engineering students at a college of technology. In order to teach practical engineering design, the MIL-STD-499A process is adapted and improved for a mechatronics hands-on lesson used as the MOT method. The educational results over five years indicate that knowledge of engineering management is useful for college students learning engineering design. Portfolios for lessons and the hypothesis method also have positive effects on the understanding of the engineering specialty.

  1. Construction of an Engineer's Notebook Rubric

    ERIC Educational Resources Information Center

    Kelley, Todd R.

    2014-01-01

    It is evident that there is a need for assessment instruments that measure design and engineering design skills, knowledge, and ways of design thinking. These student assessments must be authentic to engineering design practices and measure key elements of the engineering design process. Kelley (2011) presented a rationale to include…

  2. Concurrent negotiation and coordination for grid resource coallocation.

    PubMed

    Sim, Kwang Mong; Shi, Benyun

    2010-06-01

    Bolstering resource coallocation is essential for realizing the Grid vision, because computationally intensive applications often require multiple computing resources from different administrative domains. Given that resource providers and consumers may have different requirements, successfully obtaining commitments through concurrent negotiations with multiple resource providers to simultaneously access several resources is a very challenging task for consumers. The impetus of this paper is that it is one of the earliest works that consider a concurrent negotiation mechanism for Grid resource coallocation. The concurrent negotiation mechanism is designed for 1) managing (de)commitment of contracts through one-to-many negotiations and 2) coordination of multiple concurrent one-to-many negotiations between a consumer and multiple resource providers. The novel contributions of this paper are devising 1) a utility-oriented coordination (UOC) strategy, 2) three classes of commitment management strategies (CMSs) for concurrent negotiation, and 3) the negotiation protocols of consumers and providers. Implementing these ideas in a testbed, three series of experiments were carried out in a variety of settings to compare the following: 1) the CMSs in this paper with the work of others in a single one-to-many negotiation environment for one resource where decommitment is allowed for both provider and consumer agents; 2) the performance of the three classes of CMSs in different resource market types; and 3) the UOC strategy with the work of others [e.g., the patient coordination strategy (PCS)] for coordinating multiple concurrent negotiations. Empirical results show the following: 1) the UOC strategy achieved higher utility, faster negotiation speed, and higher success rates than PCS for different resource market types; and 2) the CMS in this paper achieved higher final utility than the CMS in other works. Additionally, the properties of the three classes of CMSs in different kinds of resource markets are also verified.
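
    The UOC strategy and the commitment management strategies are specified in the paper itself; the toy Python sketch below only illustrates the underlying coordination problem, with an assumed utility function and penalty scheme: a consumer running several one-to-many negotiations keeps the best commitment per resource and decommits, paying a penalty, only when a strictly better offer arrives.

      from dataclasses import dataclass

      @dataclass
      class Offer:
          provider: str
          resource: str
          price: float

      def utility(offer, budget=100.0):
          """Toy utility: cheaper offers are better (an assumption)."""
          return budget - offer.price

      def coordinate(offers, decommit_penalty=5.0):
          """Keep the best commitment per resource across concurrent
          negotiations, decommitting only when the utility gain exceeds the
          decommitment penalty owed to the jilted provider."""
          committed, penalties = {}, 0.0
          for offer in offers:  # offers arrive over time from parallel threads
              current = committed.get(offer.resource)
              if current is None:
                  committed[offer.resource] = offer
              elif utility(offer) > utility(current) + decommit_penalty:
                  penalties += decommit_penalty
                  committed[offer.resource] = offer
          return committed, penalties

      offers = [Offer("A", "cpu", 40.0), Offer("B", "cpu", 38.0),  # gain too small
                Offer("C", "cpu", 20.0), Offer("D", "disk", 15.0)]
      print(coordinate(offers))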

  3. Collaboration between Industrial Designers and Design Engineers - Comparing the Understanding of Design Intent.

    PubMed

    Laursen, Esben Skov; Møller, Louise

    2015-01-01

    This paper describes a case study comparing the understanding of design intent between industrial designers and design engineers. The study is based on the hypothesis that it is not all aspects of the design intent that are equally difficult to share between industrial designers and design engineers in the product development process. The study builds on five semi-structured interviews, where two industrial designers and three design engineers were interviewed about different aspects of the design intent. Based on our results, there seem to be indications that the more complex and abstract elements of industrial design knowledge such as the meaning, semantics, values, emotions and social aspects of the product are less shared by the design engineers. Moreover, the results also indicate that the different aspects of the design intent are perceived separately, rather than as part of a whole by the design engineers. The connection between the different aspects of the design intent is not shared between the industrial designer and design engineer making the shared knowledge less meaningful to the design engineers. The results of this study cannot be claimed to be conclusive due to the limited empirical material. Further investigation and analytically richer data are required in order to verify and broaden the findings. More case studies have therefore been planned in order to understand the area better.

  4. Design of a miniature hydrogen fueled gas turbine engine

    NASA Technical Reports Server (NTRS)

    Burnett, M.; Lopiccolo, R. C.; Simonson, M. R.; Serovy, G. K.; Okiishi, T. H.; Miller, M. J.; Sisto, F.

    1973-01-01

    The design, development, and delivery of a miniature hydrogen-fueled gas turbine engine are discussed. The engine was to be sized to approximate a scaled-down lift engine such as the Teledyne CAE Model 376. As a result, the engine design emerged as a 445 N (100 lb)-thrust engine flowing 0.86 kg (1.9 lb) of air per second. A four-stage compressor was designed at a 4.0-to-1 pressure ratio for the above conditions, with a compressor tip diameter of 9.14 cm (3.60 in.). To improve overall engine performance, another compressor with a 4.75-to-1 pressure ratio at the same tip diameter was designed. A matching turbine for each compressor was also designed; the turbine tip diameter was 10.16 cm (4.0 in.). A combustion chamber was designed, built, and tested for this engine. A preliminary design of the mechanical rotating parts was also completed and is discussed. Three exhaust nozzle designs are presented.
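
    For orientation, the quoted figures imply a specific thrust of roughly 445/0.86 ≈ 517 N·s/kg. The back-of-envelope check below is ours; the inlet temperature, gas properties, and compressor efficiency are assumed values, not numbers from the report.

```python
# Back-of-envelope checks on the quoted engine figures (sketch only;
# inlet temperature, gamma, cp, and compressor efficiency are assumed
# values, not numbers from the report).
thrust_N = 445.0   # quoted design thrust, N
mdot = 0.86        # quoted airflow, kg/s
PR = 4.0           # quoted compressor pressure ratio
T1 = 288.15        # assumed sea-level inlet temperature, K
gamma = 1.4        # assumed ratio of specific heats for air
cp = 1005.0        # assumed specific heat of air, J/(kg K)
eta_c = 0.80       # assumed compressor adiabatic efficiency

specific_thrust = thrust_N / mdot                  # N s/kg
dT_ideal = T1 * (PR ** ((gamma - 1) / gamma) - 1)  # ideal temperature rise, K
compressor_power = mdot * cp * dT_ideal / eta_c    # W, with assumed efficiency

print(f"specific thrust ~ {specific_thrust:.0f} N s/kg")      # ~517
print(f"ideal compressor temperature rise ~ {dT_ideal:.0f} K")  # ~140
print(f"compressor power ~ {compressor_power / 1000:.0f} kW")   # ~150
```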

  5. 14 CFR 183.29 - Designated engineering representatives.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Designated engineering representatives. 183... § 183.29 Designated engineering representatives. (a) A structural engineering representative may approve structural engineering information and other structural considerations within limits prescribed by and under...

  6. 14 CFR 183.29 - Designated engineering representatives.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Designated engineering representatives. 183... § 183.29 Designated engineering representatives. (a) A structural engineering representative may approve structural engineering information and other structural considerations within limits prescribed by and under...

  7. 14 CFR 183.29 - Designated engineering representatives.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designated engineering representatives. 183... § 183.29 Designated engineering representatives. (a) A structural engineering representative may approve structural engineering information and other structural considerations within limits prescribed by and under...

  8. 14 CFR 183.29 - Designated engineering representatives.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Designated engineering representatives. 183... § 183.29 Designated engineering representatives. (a) A structural engineering representative may approve structural engineering information and other structural considerations within limits prescribed by and under...

  9. 14 CFR 183.29 - Designated engineering representatives.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Designated engineering representatives. 183... § 183.29 Designated engineering representatives. (a) A structural engineering representative may approve structural engineering information and other structural considerations within limits prescribed by and under...

  10. Engineering Design Handbook. Helicopter Engineering. Part One. Preliminary Design

    DTIC Science & Technology

    1974-08-30

    [Table-of-contents excerpt, Chapter 8 (propulsion subsystem): 8-1.3 Engine Replacement; 8-1.4 Engine Air Induction System; 8-2.2 Engine Air Induction System; 8-2.2.1 General Design; 8-2.2.2 Air Induction System Inlet Location; 8-2.2.3 Engine Air Induction System Pressure Losses.]

  11. Heterogeneous Concurrent Modeling and Design in Java (Volume 3: Ptolemy II Domains)

    DTIC Science & Technology

    2008-04-15

    [Table-of-contents excerpt: Starting the Model; Atomic Communication in Concurrent Execution; Detecting Deadlocks; Application to Resource Management (Resource Management Demo; ResourcePool); Threads in an Actor (Creating Extra Threads in an Actor; Manually Blocking); Local Time Management; Detecting Deadlock; Ending Execution; Example DDE Applications; PN Domain.]

  12. Confocal three dimensional tracking of a single nanoparticle with concurrent spectroscopic readouts

    NASA Astrophysics Data System (ADS)

    Cang, Hu; Wong, Chung M.; Xu, C. Shan; Rizvi, Abbas H.; Yang, Haw

    2006-05-01

    We present an apparatus that noninvasively tracks a moving nanoparticle in three dimensions while providing concurrent sequential spectroscopic measurements. The design, based on confocal microscopy, uses a near-infrared laser and a dark-field condenser for illumination of a gold nanoparticle. By monitoring the scattered light from the nanoparticle and using a piezoelectric stage, the system was able to continuously bring the diffusive particle in a glycerol/water solution back to the focal volume, with a spatial resolution of less than 210 nm and a response time of less than a millisecond.
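
    As a rough illustration of the closed-loop re-centering described above, the sketch below runs a proportional feedback loop that nudges a stage toward the measured particle offset each control period; the gain, time step, and stand-in position estimator are our assumptions, not details of the actual apparatus.

```python
# Minimal sketch of closed-loop particle re-centering (illustration only;
# the gain, control period, and the way the offset is estimated from the
# scattered-light signal are assumptions, not the actual apparatus).
import random

GAIN = 0.5  # assumed proportional gain
DT = 0.001  # assumed control period, s (~millisecond response)

def estimate_offset():
    """Stand-in for estimating the particle's offset from the focal
    volume via the scattered-light signal; here, random diffusion."""
    return [random.gauss(0.0, 50e-9) for _ in range(3)]  # metres, x/y/z

def track(steps=1000):
    stage = [0.0, 0.0, 0.0]  # piezo stage position, metres
    for _ in range(steps):
        offset = estimate_offset()
        # Move the stage a fraction of the measured offset each period,
        # pulling the particle back toward the focal volume.
        for axis in range(3):
            stage[axis] += GAIN * offset[axis]
    return stage

if __name__ == "__main__":
    final = track()
    print("final stage position (nm):", [f"{1e9 * s:.1f}" for s in final])
```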

  13. The effect of concurrent bandwidth feedback on learning the lane-keeping task in a driving simulator.

    PubMed

    de Groot, Stefan; de Winter, Joost C F; López García, José Manuel; Mulder, Max; Wieringa, Peter A

    2011-02-01

    The aim of this study was to investigate whether concurrent bandwidth feedback improves learning of the lane-keeping task in a driving simulator. Previous research suggests that bandwidth feedback improves learning and that off-target feedback is superior to on-target feedback. This study aimed to extend these findings for the lane-keeping task. Participants without a driver's license drove five 8-min lane-keeping sessions in a driver training simulator: three practice sessions, an immediate retention session, and a delayed retention session 1 day later. There were four experimental groups (n=15 per group): (a) on-target, receiving seat vibrations when the center of the car was within 0.5 m of the lane center; (b) off-target, receiving seat vibrations when the center of the car was more than 0.5 m away from the lane center; (c) control, receiving no vibrations; and (d) realistic, receiving seat vibrations depending on engine speed. During retention, all groups were provided with the realistic vibrations. During practice, the on-target and off-target groups had better lane-keeping performance than the nonaugmented groups, but this difference diminished in the retention phase. Furthermore, during late practice and retention, the off-target group outperformed the on-target group. The off-target group had a higher rate of steering reversal and higher steering entropy than the nonaugmented groups, whereas no clear group differences were found regarding mean speed, mental workload, or self-reported measures. Off-target feedback is superior to on-target feedback for learning the lane-keeping task. This research provides knowledge to researchers and designers of training systems about the value of feedback in simulator-based training of vehicular control.
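
    The feedback rule itself is simple to state: vibrate inside the 0.5 m band (on-target) or outside it (off-target). The sketch below encodes just that rule; only the 0.5 m bandwidth comes from the study, and the names are illustrative.

```python
# Minimal sketch of the on-target/off-target bandwidth-feedback rule
# (only the 0.5 m bandwidth is from the study; names are illustrative).
BANDWIDTH_M = 0.5  # distance from lane centre separating on/off target

def vibrate_seat(lateral_error_m: float, group: str) -> bool:
    """Return True if the seat should vibrate for this feedback group."""
    on_target = abs(lateral_error_m) <= BANDWIDTH_M
    if group == "on-target":
        return on_target       # vibrate while within 0.5 m of lane centre
    if group == "off-target":
        return not on_target   # vibrate when more than 0.5 m away
    return False               # control group: no augmented feedback

print(vibrate_seat(0.3, "on-target"))   # True
print(vibrate_seat(0.3, "off-target"))  # False
print(vibrate_seat(0.8, "off-target"))  # True
```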

  14. Expanded study of feasibility of measuring in-flight 747/JT9D loads, performance, clearance, and thermal data

    NASA Technical Reports Server (NTRS)

    Sallee, G. P.; Martin, R. L.

    1980-01-01

    The JT9D jet engine exhibits a TSFC loss of about 1 percent in the initial 50 flight cycles of a new engine. These early losses are caused by seal-wear induced opening of running clearances in the engine gas path. The causes of this seal wear have been identified as flight induced loads which deflect the engine cases and rotors, causing the rotating blades to rub against the seal surfaces, producing permanent clearance changes. The real level of flight loads encountered during airplane acceptance testing and revenue service and the engine's response in the dynamic flight environment were investigated. The feasibility of direct measurement of these flight loads and their effects by concurrent measurement of 747/JT9D propulsion system aerodynamic and inertia loads and the critical engine clearance and performance changes during 747 flight and ground operations was evaluated. A number of technical options were examined in relation to the total estimated program cost to facilitate selection of the most cost effective option. It is concluded that a flight test program meeting the overall objective of determining the levels of aerodynamic and inertia load levels to which the engine is exposed during the initial flight acceptance test and normal flight maneuvers is feasible and desirable. A specific recommended flight test program, based on the evaluation of cost effectiveness, is defined.

  15. Engine Development Design Margins Briefing Charts

    NASA Technical Reports Server (NTRS)

    Bentz, Chuck

    2006-01-01

    New engines experience durability problems after entering service. The most prevalent and costly are in the hot section, particularly the high-pressure turbine. The origin of durability problems can be traced back to: 1) the basic aero-mechanical design systems, assumptions, and design margins used by the engine designers; 2) the available materials systems; and 3) to a large extent, aggressive marketing in a highly competitive environment that pushes engine components beyond the demonstrated capability of the basic technology available for the hardware designs. Unfortunately, the engine must be operated in the service environment before the actual thrust loading and the time spent at maximum-effort takeoff conditions, which determine hot section life, become known. Several hundred thousand hours of operational service will be required before the demonstrated reliability of a fleet of engines, or the design deficiencies of the engine hot section parts, can be determined. It may also take three to four engine shop visits for heavy maintenance on the gas path hardware to establish cost-effective build standards. Spare parts drive the operator's engine maintenance costs, but they also generate substantial revenue for the engine manufacturer over the service life of an engine. Unless competition prevails for follow-on engine buys, there is little motivation for an OEM to spend internal money to improve parts durability and thereby reduce earnings derived from a lucrative spare parts business. If the hot section life falls below design goals or promised values, the OEM might argue that the engine is being operated beyond its basic design intent. On the other hand, the airframer and the operator will continue to remind the OEM that his engine was selected based on promises to deliver spec thrust with little impact on engine service life if higher thrust is used intermittently. In the end, a standoff prevails and nothing gets fixed. This briefing proposes ways to hold competing engine manufacturers more accountable for engine hot section design margins during the entire engine development process, as well as tools to assess the design temperature margins in the hot section parts of service engines.

  16. An anisotropic, hyperelastic model for skin: experimental measurements, finite element modelling and identification of parameters for human and murine skin.

    PubMed

    Groves, Rachel B; Coulman, Sion A; Birchall, James C; Evans, Sam L

    2013-02-01

    The mechanical characteristics of skin are extremely complex and have not been satisfactorily simulated by conventional engineering models. The ability to predict human skin behaviour and to evaluate changes in the mechanical properties of the tissue would inform engineering design and would prove valuable in a diversity of disciplines, for example the pharmaceutical and cosmetic industries, which currently rely upon experiments performed in animal models. The aim of this study was to develop a predictive anisotropic, hyperelastic constitutive model of human skin and to validate this model using laboratory data. As a corollary, the mechanical characteristics of human and murine skin have been compared. A novel experimental design, using tensile tests on circular skin specimens, and an optimisation procedure were adopted to identify the material parameters of the tissue. Uniaxial tensile tests were performed along three load axes on excised murine and human skin samples, using a single set of material parameters for each skin sample. A finite element model was developed using the transversely isotropic, hyperelastic constitutive model of Weiss et al. (1996), embedded within a Veronda-Westmann isotropic material matrix and using three fibre families to create anisotropic behaviour. The model was able to represent the nonlinear, anisotropic behaviour of the skin well. Additionally, examination of the optimal material coefficients and the experimental data permitted quantification of the mechanical differences between human and murine skin. Differences between the skin types, most notably the extension of the skin at low load, highlight some of the limitations of murine skin as a biomechanical model of the human tissue. The development of accurate, predictive computational models of human tissue, such as skin, to reduce, refine, or replace animal models and to inform developments in the medical, engineering, and cosmetic fields, is a significant challenge but is highly desirable. Concurrent advances in computer technology and our understanding of human physiology must be utilised to produce more accurate and accessible predictive models, such as the finite element model described in this study.
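
    For reference, a common way to write a model of this kind combines a Veronda-Westmann isotropic matrix with a strain-energy contribution per fibre family. The form below is a sketch of that general structure, not the coefficients or exact fibre term identified in the study.

```latex
% Sketch of the general form of such a model (not the study's fitted
% coefficients): a Veronda-Westmann isotropic matrix plus a fibre
% contribution for each of the three fibre families.
\[
W \;=\; \underbrace{c_1\!\left(e^{\,c_2 (I_1 - 3)} - 1\right)
        - \frac{c_1 c_2}{2}\left(I_2 - 3\right)}_{\text{Veronda--Westmann matrix}}
        \;+\; \sum_{i=1}^{3} \psi_f(\lambda_i),
\qquad \lambda_i^2 = \mathbf{a}_i \cdot \mathbf{C}\,\mathbf{a}_i ,
\]
```

    Here $\lambda_i$ is the stretch along fibre direction $\mathbf{a}_i$, $\mathbf{C}$ is the right Cauchy-Green deformation tensor, and $\psi_f$ stands for a Weiss-type transversely isotropic fibre energy.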

  17. Orbit transfer vehicle advanced expander cycle engine point design study. Volume 2: Study results

    NASA Technical Reports Server (NTRS)

    Diem, H. G.

    1980-01-01

    The design characteristics of the baseline engine configuration of the advanced expander cycle engine are described. Several aspects of engine optimization are considered which directly impact the design of the baseline thrust chamber. Four major areas of the power cycle optimization are emphasized: main turbine arrangement; cycle engine source; high pressure pump design; and boost pump drive.

  18. Shedding Light on Engineering Design

    ERIC Educational Resources Information Center

    Capobianco, Brenda M.; Nyquist, Chell; Tyrie, Nancy

    2013-01-01

    This article describes the steps incorporated to teach an engineering design process in a fifth-grade science classroom. The engineering design-based activity built on an existing scientific inquiry activity using UV light-detecting beads, purposefully creating a series of engineering design-based challenges around the investigation. The…

  19. The Implementation of a Multi-Backend Database System (MDBS). Part I. Software Engineering Strategies and Efforts Towards a Prototype MDBS.

    DTIC Science & Technology

    1983-06-01

    for DEC PDP-11 systems. MAINSAIL was developed and is marketed with a set of integrated tools for program development. The syntax of the language is...stack, and to test for stack-full and stack-empty conditions. This technique is useful in enforcing data integrity and in controlling concurrent...and market MAINSAIL. The language is distinguished by its portability. The same compiler and runtime system, both written in MAINSAIL, are the basis

  20. SEI Report on Graduate Software Engineering Education for 1991

    DTIC Science & Technology

    1991-04-01

    12, 12 (Dec. 1979), 85-94. Andrews83 Andrews, Gregory R. and Schneider, Fred B. “Concepts and Notations for Concurrent Programming.” ACM Computing...Barringer87 Barringer, H. “Up and Down the Temporal Way.” Computer J. 30, 2 (Apr. 1987), 134-148. Bjørner78 The Vienna Development Method: The Meta-Language...Lecture Notes in Computer Science. Bruns86 Bruns, Glenn R. Technology Assessment: PAISLEY. Tech. Rep. MCC TR STP-296-86, MCC, Austin, Texas, Sept
