Sample records for concurrent engineering process

  1. Probabilistic simulation of concurrent engineering of propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of engineering processes. An approach is outlined for computationally formalizing the concurrent engineering process from cradle-to-grave via discipline-dedicated workstations linked with a common database.

  2. Fuzzy simulation in concurrent engineering

    NASA Technical Reports Server (NTRS)

    Kraslawski, A.; Nystrom, L.

    1992-01-01

    Concurrent engineering is becoming a very important practice in manufacturing. A problem in concurrent engineering is the uncertainty associated with the values of the input variables and operating conditions. The problem discussed in this paper concerns the simulation of processes where the raw materials and the operational parameters possess fuzzy characteristics. The processing of fuzzy input information is performed by the vertex method and the commercial simulation packages POLYMATH and GEMS. Examples are presented to illustrate the usefulness of the method in the simulation of chemical engineering processes.
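
    A minimal sketch of the vertex method the abstract mentions, propagating fuzzy (interval-valued) inputs through a process model at discrete alpha-cuts; the triangular numbers and the toy yield model are invented for illustration and are not the paper's POLYMATH/GEMS examples.

    ```python
    # Vertex method sketch: evaluate the model at every corner of the
    # alpha-cut hypercube and keep the min/max as the output interval.
    from itertools import product

    def alpha_cut(tfn, alpha):
        """Interval of a triangular fuzzy number (low, peak, high) at level alpha."""
        low, peak, high = tfn
        return (low + alpha * (peak - low), high - alpha * (high - peak))

    def vertex_method(f, fuzzy_inputs, alphas=(0.0, 0.5, 1.0)):
        """Return the output interval of f for each alpha level."""
        out = {}
        for a in alphas:
            intervals = [alpha_cut(t, a) for t in fuzzy_inputs]
            values = [f(*corner) for corner in product(*intervals)]
            out[a] = (min(values), max(values))
        return out

    # Hypothetical reactor yield as a function of fuzzy temperature and feed rate.
    yield_model = lambda temp, feed: 0.8 * feed - 0.001 * (temp - 350.0) ** 2
    print(vertex_method(yield_model, [(340, 350, 365), (9.0, 10.0, 11.5)]))
    ```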

  3. Concurrent Software Engineering Project

    ERIC Educational Resources Information Center

    Stankovic, Nenad; Tillo, Tammam

    2009-01-01

    Concurrent engineering, or overlapping activities, is a business strategy for schedule compression on large development projects. Design parameters and tasks from every aspect of a product's development process, and their interdependencies, are overlapped and worked on in parallel. Concurrent engineering suffers from negative effects such as excessive…

  4. How Engineers Really Think About Risk: A Study of JPL Engineers

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Deb; Valerdi, Ricardo

    2011-01-01

    The objectives of this work are to improve risk assessment practices used during the mission design process by JPL's concurrent engineering teams by (1) developing effective ways to identify and assess mission risks, (2) providing a process for more effective dialog between stakeholders about the existence and severity of mission risks, and (3) enabling the analysis of interactions of risks across concurrent engineering roles.

  5. Integrating Thermal Tools Into the Mechanical Design Process

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations while reducing cycle time and cost. To this end, an integrated mechanical design process enables parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process, since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be redefined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement and, more importantly, for judging the viability and feasibility of the sub-process.

  6. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  7. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  8. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  9. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
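
    To make the first step concrete, here is a hedged sketch of extracting local trade-space sensitivities around a point design with central finite differences; the toy spacecraft model and its coefficients are assumptions for illustration, not the paper's Team X models.

    ```python
    # Central finite differences around a point design give the local trade
    # space: d(output)/d(parameter) for every output/parameter pair.
    def point_design_model(params):
        """Toy system model: spacecraft dry mass (kg) and cost ($M)."""
        dry_mass = 4.0 * params["payload_mass"] + 2.5 * params["data_rate"]
        return {"dry_mass": dry_mass, "cost": 0.8 * dry_mass ** 0.9}

    def sensitivities(model, params, rel_step=1e-3):
        base = model(params)
        sens = {}
        for name, value in params.items():
            h = rel_step * max(abs(value), 1.0)
            hi = model({**params, name: value + h})
            lo = model({**params, name: value - h})
            sens[name] = {out: (hi[out] - lo[out]) / (2.0 * h) for out in base}
        return sens

    point = {"payload_mass": 50.0, "data_rate": 8.0}
    print(sensitivities(point_design_model, point))
    ```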

  10. Concurrent engineering: Spacecraft and mission operations system design

    NASA Technical Reports Server (NTRS)

    Landshof, J. A.; Harvey, R. J.; Marshall, M. H.

    1994-01-01

    Despite our awareness of the mission design process, spacecraft historically have been designed and developed by one team and then turned over as a system to the Mission Operations organization to operate on-orbit. By applying concurrent engineering techniques and envisioning operability as an essential characteristic of spacecraft design, tradeoffs can be made in the overall mission design to minimize mission lifetime cost. Lessons learned from previous spacecraft missions will be described, as well as the implementation of concurrent mission operations and spacecraft engineering for the Near Earth Asteroid Rendezvous (NEAR) program.

  11. Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling

    NASA Technical Reports Server (NTRS)

    Schuman, Todd; de Weck, Olivier L.; Sobieski, Jaroslaw

    2005-01-01

    The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
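
    A hedged sketch of the framework's central mechanism under stated assumptions: an expensive subsystem analysis is replaced by a trained approximation (here scikit-learn's MLPRegressor standing in for the paper's parametric neural networks), which a system-level optimizer can then query cheaply. The tank-mass relation, bounds, and network size are invented.

    ```python
    # Sketch: train a cheap surrogate of a slow subsystem analysis, then let a
    # system-level optimizer search over the surrogate instead of the real model.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from scipy.optimize import minimize

    def expensive_subsystem(x):
        """Stand-in for a slow analysis: tank mass vs. radius (m), thickness (m)."""
        radius, thickness = x
        return 2700.0 * 4.0 * np.pi * radius**2 * thickness * (1.0 + 0.1 * radius)

    rng = np.random.default_rng(0)
    X = rng.uniform([1.0, 0.005], [4.0, 0.02], size=(300, 2))  # training designs
    y = np.array([expensive_subsystem(x) for x in X])

    surrogate = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 16),
                                           max_iter=5000, random_state=0)).fit(X, y)

    # The system-level optimizer only ever calls the fast approximation.
    res = minimize(lambda x: float(surrogate.predict(x.reshape(1, -1))[0]),
                   x0=[2.0, 0.01], bounds=[(1.0, 4.0), (0.005, 0.02)])
    print("surrogate optimum:", res.x, res.fun)
    ```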

  12. Studies on spatial modes and the correlation anisotropy of entangled photons generated from 2D quadratic nonlinear photonic crystals

    NASA Astrophysics Data System (ADS)

    Luo, X. W.; Xu, P.; Sun, C. W.; Jin, H.; Hou, R. J.; Leng, H. Y.; Zhu, S. N.

    2017-06-01

    Concurrent spontaneous parametric down-conversion (SPDC) processes have proved to be an appealing approach for engineering the path-entangled photonic state with designable and tunable spatial modes. In this work, we propose a general scheme to construct high-dimensional path entanglement and demonstrate the basic properties of concurrent SPDC processes from domain-engineered quadratic nonlinear photonic crystals, including the spatial modes and the photon flux, as well as the anisotropy of spatial correlation under noncollinear quasi-phase-matching geometry. The overall understanding of the performance of concurrent SPDC processes will provide valuable references for the construction of compact path entanglement and the development of new types of photonic quantum technologies.

  13. A general engineering scenario for concurrent engineering environments

    NASA Astrophysics Data System (ADS)

    Mucino, V. H.; Pavelic, V.

    The paper describes an engineering method scenario that categorizes the various activities and tasks into blocks, viewed as subjects that consume and produce data and information. These methods, tools, and associated utilities interact with other engineering tools by exchanging information in such a way that a relationship between customers and suppliers of engineering data is clearly established, while data-exchange consistency is maintained throughout the design process. The events and data transactions are presented in the form of flowcharts in which data transactions represent the connections between the various blocks, which in turn represent the engineering activities developed for the particular task required in the concurrent engineering environment.

  14. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  15. Processing multilevel secure test and evaluation information

    NASA Astrophysics Data System (ADS)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments, and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources using either a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members with various expertise and diverse backgrounds to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  16. Characterizing Distributed Concurrent Engineering Teams: A Descriptive Framework for Aerospace Concurrent Engineering Design Teams

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Debarati; Hihn, Jairus; Warfield, Keith

    2011-01-01

    As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades in a cost-efficient manner. To successfully accomplish these complex missions with limited funding, it is also essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies by increased cooperation between organizations. As a result, the existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. This paper is an extension of a recent white paper written by the Concurrent Engineering Working Group, which details the unique challenges of distributed collaborative concurrent engineering. This paper includes a short history of aerospace concurrent engineering, and defines the terms 'concurrent', 'collaborative' and 'distributed' in the context of aerospace concurrent engineering. In addition, a model for the levels of complexity of concurrent engineering teams is presented to provide a way to conceptualize information and data flow within these types of teams.

  17. GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.

    2010-01-01

    The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined, HyperText Transfer Protocol (HTTP)-based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
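
    The pattern of parameter sharing over an HTTP API can be sketched as below; note that the server URL, endpoint paths, and payload fields here are hypothetical stand-ins for illustration, not GLIDE's actual API.

    ```python
    # Illustrative sketch only: publish/read shared study parameters over HTTP.
    import requests

    BASE = "https://glide.example.nasa.gov/api"  # hypothetical server URL

    def publish_parameter(session, study, name, value, units):
        """Push one subsystem output so other disciplines see it immediately."""
        r = session.put(f"{BASE}/studies/{study}/parameters/{name}",
                        json={"value": value, "units": units})
        r.raise_for_status()

    def read_parameter(session, study, name):
        """Pull the latest value another discipline published."""
        r = session.get(f"{BASE}/studies/{study}/parameters/{name}")
        r.raise_for_status()
        return r.json()

    with requests.Session() as s:
        publish_parameter(s, "lunar-orbiter", "power.bus_voltage", 28.0, "V")
        print(read_parameter(s, "lunar-orbiter", "thermal.radiator_area"))
    ```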

  18. Software engineering aspects of real-time programming concepts

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1986-08-01

    Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real time systems. Software-quality and software engineering aspects are considered throughout the paper.
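
    As a concrete instance of the semaphore-based synchronization the survey compares across languages, here is the classic bounded-buffer producer/consumer pattern, written in Python rather than Concurrent PASCAL, MODULA, CHILL, or Ada.

    ```python
    import threading

    buffer = []
    mutex = threading.Lock()
    items = threading.Semaphore(0)   # counts filled slots
    slots = threading.Semaphore(4)   # counts free slots (bounded buffer)

    def producer():
        for i in range(8):
            slots.acquire()              # wait for a free slot
            with mutex:
                buffer.append(f"sample-{i}")
            items.release()              # signal: one more item available

    def consumer():
        for _ in range(8):
            items.acquire()              # wait for an item to be produced
            with mutex:
                item = buffer.pop(0)
            slots.release()              # free the slot for the producer
            print("processing", item)

    threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    ```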

  19. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the application of the process in experienced aerostructural designs.

  20. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.

  21. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable, and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The application of the KM tools has shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. Establishing this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, higher-quality space system designs.

  22. How to Quickly Import CAD Geometry into Thermal Desktop

    NASA Technical Reports Server (NTRS)

    Wright, Shonte; Beltran, Emilio

    2002-01-01

    There are several groups at JPL (Jet Propulsion Laboratory) that are committed to concurrent design efforts; two are featured here. The Center for Space Mission Architecture and Design (CSMAD) enables the practical application of advanced process technologies in JPL's mission architecture process. Team I functions as an incubator for projects that are in the Discovery, and even pre-Discovery, proposal stages. JPL's concurrent design environment is to a large extent centered on the CAD (Computer Aided Design) file. During concurrent design sessions, CAD geometry is ported to other, more specialized engineering design packages.

  23. Concurrency in product realization

    NASA Astrophysics Data System (ADS)

    Kelly, Michael J.

    1994-03-01

    Technology per se does not provide a competitive advantage. Timely exploitation of technology is what gives the competitive edge, and this demands a major shift in the product development process and management of the industrial enterprise. 'Teaming to win' is more than a management theme; it is the disciplined engineering practice that is essential to success in today's global marketplace. Teaming supports the concurrent engineering practices required to integrate the activities of the people responsible for product realization, achieving shorter development cycles, lower costs, and defect-free products.

  24. True Concurrent Thermal Engineering Integrating CAD Model Building with Finite Element and Finite Difference Methods

    NASA Technical Reports Server (NTRS)

    Panczak, Tim; Ring, Steve; Welch, Mark

    1999-01-01

    Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from, and have little useful connection, if any, to CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system, while fully integrated into a neutral, low-cost CAD system and utilizing both FEM and finite difference (FD) methods, does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric surface-based radiation and FD-based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.

  25. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  26. Product design for energy reduction in concurrent engineering: An Inverted Pyramid Approach

    NASA Astrophysics Data System (ADS)

    Alkadi, Nasr M.

    Energy factors in product design in concurrent engineering (CE) are becoming an emerging dimension for several reasons: (a) the rising interest in "green design and manufacturing"; (b) national energy security concerns and the dramatic increase in energy prices; (c) global competition in the marketplace and global climate change commitments, including carbon taxes and emission trading systems; and (d) the widespread recognition of the need for sustainable development. This research presents a methodology for introducing energy factors into the concurrent engineering product development process to significantly reduce the manufacturing energy requirement. The work presented here is the first attempt at integrating design for energy into the concurrent engineering framework. It adds an important tool to the DFX toolbox for evaluating the impact of design decisions on the product manufacturing energy requirement early during the design phase. The research hypothesis states that "Product Manufacturing Energy Requirement is a Function of Design Parameters". The hypothesis was tested through experimental work in machining and heat treating, conducted at the manufacturing lab of the Industrial and Management Systems Engineering Department (IMSE) at West Virginia University (WVU) and at a major U.S. steel manufacturing plant, respectively. The objective of the machining experiment was to study the effect of changing specific product design parameters (material type and diameter) and process design parameters (metal removal rate) on the input power requirement of a gear-head lathe through defined sets of machining experiments. The objective of the heat-treating experiment was to study the effect of varying product charging temperature on the fuel consumption of a walking-beam reheat furnace. The experimental work in both directions revealed important insights into energy utilization in machining and heat-treating processes and its variance with product, process, and system design parameters. An in-depth evaluation of how design and manufacturing normally happen in concurrent engineering provided a framework for developing energy system levels in machining within the concurrent engineering environment using the "Inverted Pyramid Approach" (IPA). The IPA features varying levels of energy-based output information depending on the input design parameters that are available during each stage (level) of product design. The experimental work, the in-depth evaluation of design and manufacturing in CE, and the developed energy system levels in machining provided a solid base for developing the model for design for energy reduction in CE. The model was used to analyze an example part in which 12 evolving designs were thoroughly reviewed to investigate the sensitivity of energy to design parameters in machining. The model allows product design teams to address manufacturing energy concerns early during the design stage. As a result, ranges for energy-sensitive design parameters impacting product manufacturing energy consumption were found at earlier levels. As the designer proceeds to deeper levels in the model, this range tightens and results in significant energy reductions.
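
    A back-of-envelope sketch of the kind of relation the hypothesis asserts, with machining input power driven by a material design choice and the metal removal rate; the specific-energy values below are textbook orders of magnitude, not the dissertation's measured data.

    ```python
    # Illustrative scaling only: input power ~ idle load + specific cutting
    # energy x metal removal rate (J/mm^3 x mm^3/s = W).
    SPECIFIC_ENERGY_J_PER_MM3 = {"aluminum": 0.7, "steel": 3.0}

    def lathe_input_power_w(material, mrr_mm3_per_s, idle_power_w=800.0):
        """Estimate lathe input power for a material and removal rate."""
        return idle_power_w + SPECIFIC_ENERGY_J_PER_MM3[material] * mrr_mm3_per_s

    # Same part, three candidate (material, removal-rate) design choices:
    for mat, mrr in [("aluminum", 2000.0), ("steel", 2000.0), ("steel", 800.0)]:
        print(f"{mat:>8} at {mrr:6.0f} mm^3/s -> {lathe_input_power_w(mat, mrr):7.0f} W")
    ```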

  27. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1992-05-01

    methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering ... and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be ... evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key

  28. Multidisciplinary optimization for engineering systems - Achievements and potential

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    The currently common sequential design process for engineering systems is likely to lead to suboptimal designs. Recently developed decomposition methods offer an alternative for coming closer to optimum by breaking the large task of system optimization into smaller, concurrently executed and, yet, coupled tasks, identified with engineering disciplines or subsystems. The hierarchic and non-hierarchic decompositions are discussed and illustrated by examples. An organization of a design process centered on the non-hierarchic decomposition is proposed.
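
    A toy sketch of the decomposition idea: instead of one monolithic optimization task, two coupled discipline analyses are executed separately and iterated to a consistent system state; the two-discipline model and its coefficients are invented.

    ```python
    # Two coupled discipline analyses iterated to consistency (Gauss-Seidel),
    # instead of one monolithic system analysis. Coefficients are invented.
    def aero(weight):        # discipline 1: drag grows with structural weight
        return 0.02 * weight + 50.0

    def structures(drag):    # discipline 2: weight grows with aero loads
        return 5.0 * drag + 900.0

    drag, weight = 60.0, 1000.0
    for _ in range(20):      # converges here because the coupling is weak
        drag = aero(weight)
        weight = structures(drag)
    print(f"consistent state: drag={drag:.2f}, weight={weight:.2f}")
    ```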

  29. Multidisciplinary optimization for engineering systems: Achievements and potential

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    The currently common sequential design process for engineering systems is likely to lead to suboptimal designs. Recently developed decomposition methods offer an alternative for coming closer to optimum by breaking the large task of system optimization into smaller, concurrently executed and, yet, coupled tasks, identified with engineering disciplines or subsystems. The hierarchic and non-hierarchic decompositions are discussed and illustrated by examples. An organization of a design process centered on the non-hierarchic decomposition is proposed.

  30. Use of Concurrent Engineering in Space Mission Design

    NASA Technical Reports Server (NTRS)

    Wall, S.

    2000-01-01

    In recent years, conceptual-phase (proposal level) design of space missions has been improved considerably. Team structures, tool linkage, specialized facilities known as design centers and scripted processes have been demonstrated to cut proposal-level engineering design time from a few months to a few weeks.

  31. Quality Function Deployment for Large Systems

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
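
    For readers unfamiliar with the underlying mechanics, a toy sketch of the QFD relationship-matrix arithmetic that such linked deployments build on, using the common 0/1/3/9 strength scale; the desires, characteristics, and weights are invented.

    ```python
    # QFD relationship matrix: weighted customer desires x relationship
    # strengths (0/1/3/9 scale) -> technical characteristic priorities.
    import numpy as np

    desires = {"low cost": 5, "high reliability": 4, "short schedule": 3}
    characteristics = ["mass margin", "redundancy", "parts count"]

    R = np.array([[3, 0, 9],    # how strongly each desire relates to each
                  [1, 9, 3],    # characteristic, one row per desire
                  [0, 1, 3]])

    priorities = np.array(list(desires.values())) @ R
    for c, p in sorted(zip(characteristics, priorities), key=lambda t: -t[1]):
        print(c, int(p))
    ```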

  32. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED; in addition to a point design, the Team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.

  33. Aerospace Concurrent Engineering Design Teams: Current State, Next Steps and a Vision for the Future

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Debarati; Karpati, Gabriel; McGuire, Melissa; Borden, Chester; Panek, John; Warfield, Keith

    2011-01-01

    Over the past sixteen years, government aerospace agencies and aerospace industry have developed and evolved operational concurrent design teams to create novel spaceflight mission concepts and designs. These capabilities and teams, however, have evolved largely independently. In today's environment of increasingly complex missions with limited budgets it is becoming readily apparent that both implementing organizations and today's concurrent engineering teams will need to interact more often than they have in the past. This will require significant changes in the current state of practice. This paper documents the findings from a concurrent engineering workshop held in August 2010 to identify the key near term improvement areas for concurrent engineering capabilities and challenges to the long-term advancement of concurrent engineering practice. The paper concludes with a discussion of a proposed vision for the evolution of these teams over the next decade.

  34. The Application of Concurrent Engineering Tools and Design Structure Matrix in Designing Tire

    NASA Astrophysics Data System (ADS)

    Ginting, Rosnani; Fachrozi Fitra Ramadhan, T.

    2016-02-01

    The automobile industry in Indonesia is growing rapidly. This phenomenon requires companies related to the automobile industry, such as the tire industry, to develop products based on customers' needs while considering the timeliness of delivering the product to the customer. This can be achieved by applying strategic planning in developing an integrated concept of product development. This research was held at PT. XYZ, which applied a sequential approach to designing and developing products. The need for improvement in one stage of product development could trigger re-design, which lengthens the development of a new product. This research is intended to obtain an integrated tire product design concept pertaining to the customer's needs using Concurrent Engineering tools by implementing the two phases of product development. The implementation of the Concurrent Engineering approach results in applying the stages of project planning, conceptual design, and product modules. The product modules consist of four modules derived using the Product Architecture - Design Structure Matrix to ease the design process of new product development.
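
    A minimal sketch of the Design Structure Matrix representation the approach relies on, marking which design task depends on which; the tire-design tasks and dependencies are invented for illustration.

    ```python
    # Design Structure Matrix: "X" in row r, column c means task r needs
    # output from task c. Marks above the diagonal reveal feedback/coupling.
    tasks = ["tread pattern", "compound", "carcass", "mold"]
    depends_on = {("tread pattern", "compound"),   # coupled with 'compound'
                  ("compound", "tread pattern"),   # -> iterate these together
                  ("mold", "tread pattern"),
                  ("mold", "carcass")}

    print(" " * 16 + " ".join(str(i) for i in range(len(tasks))))
    for i, r in enumerate(tasks):
        row = ["X" if (r, c) in depends_on else "." for c in tasks]
        print(f"{i} {r:>13} " + " ".join(row))
    ```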

  35. DREAMS and IMAGE: A Model and Computer Implementation for Concurrent, Life-Cycle Design of Complex Systems

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies the design requirements defined in DREAMS and incorporates enabling computational technologies.

  36. Integration of Design, Thermal, Structural, and Optical Analysis, Including Thermal Animation

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.

    1993-01-01

    In many industries there has recently been a concerted movement toward 'quality management' and the issue of how to accomplish work more efficiently. Part of this effort is focused on concurrent engineering: the idea of integrating the design and analysis processes so that they are not separate, sequential processes (often involving design rework due to analytical findings) but instead form an integrated system with smooth transfers of information. Presented herein are several specific examples of concurrent engineering methods being carried out at Langley Research Center (LaRC): integration of thermal, structural, and optical analyses to predict changes in optical performance based on thermal and structural effects; integration of the CAD design process with thermal and structural analyses; and integration of analysis and presentation by animating the thermal response of a system as an active color map, a highly effective visual indication of heat flow.

  37. Artificial concurrent catalytic processes involving enzymes.

    PubMed

    Köhler, Valentin; Turner, Nicholas J

    2015-01-11

    The concurrent operation of multiple catalysts can lead to enhanced reaction features, including (i) simultaneous linear multi-step transformations in a single reaction flask; (ii) the control of intermediate equilibria; (iii) stereoconvergent transformations; and (iv) rapid processing of labile reaction products. Enzymes occupy a prominent position for the development of such processes due to their high potential compatibility with other biocatalysts. Genes for different enzymes can be co-expressed to reconstruct natural or construct artificial pathways and applied in the form of engineered whole-cell biocatalysts to carry out complex transformations or, alternatively, the enzymes can be combined in vitro after isolation. Moreover, enzyme variants provide a wider substrate scope for a given reaction and often display altered selectivities and specificities. Man-made transition metal catalysts and engineered or artificial metalloenzymes also widen the range of reactivities and catalysed reactions that are potentially employable. Cascades for simultaneous cofactor or co-substrate regeneration or co-product removal are now firmly established. Many applications of more ambitious concurrent cascade catalysis are only just beginning to appear in the literature. The current review presents some of the most recent examples, with an emphasis on the combination of transition metal catalysis with enzymatic catalysis, and aims to encourage researchers to contribute to this emerging field.

  38. Concurrent Engineering Working Group White Paper Distributed Collaborative Design: The Next Step in Aerospace Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Debarati; Karpati, Gabriel; McGuire, Melissa; Panek, John; Warfield, Keith; Borden, Chester

    2011-01-01

    As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades of performance, cost, and schedule. To successfully accomplish these complex missions with limited funding, it is essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies through increased cooperation between organizations. As a result, existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. The purpose of this white paper is to identify a near-term vision for the future of distributed collaborative concurrent engineering design for aerospace missions, as well as to discuss the challenges to achieving that vision. The white paper also documents the advantages of creating a working group to investigate how to engage the expertise of different teams in joint design sessions while enabling organizations to maintain their competitive advantage.

  39. Identification and Classification of Common Risks in Space Science Missions

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Chattopadhyay, Debarati; Hanna, Robert A.; Port, Daniel; Eggleston, Sabrina

    2010-01-01

    Due to the highly constrained schedules and budgets that NASA missions must contend with, the identification and management of cost, schedule, and risk in the earliest stages of the lifecycle is critical. At the Jet Propulsion Laboratory (JPL) it is the concurrent engineering teams that first address these items in a systematic manner. Foremost of these concurrent engineering teams is Team X. Started in 1995, Team X has carried out over 1000 studies, dramatically reducing the time and cost involved, and has been the model for other concurrent engineering teams both within NASA and throughout the larger aerospace community. The ability to do integrated risk identification and assessment was first introduced into Team X in 2001. Since that time, the mission risks identified in each study have been kept in a database. In this paper we describe how the Team X risk process is evolving, highlighting the strengths and weaknesses of the different approaches. The paper especially focuses on the identification and classification of common risks that have arisen during Team X studies of space-based science missions.

  40. Advanced engineering environment pilot project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwegel, Jill; Pomplun, Alan R.; Abernathy, Rusty

    2006-10-01

    The Advanced Engineering Environment (AEE) is a concurrent engineering concept that enables real-time process tooling design and analysis, collaborative process flow development, automated document creation, and full process traceability throughout a product's life cycle. The AEE will enable NNSA's Design and Production Agencies to collaborate through a singular integrated process. Sandia National Laboratories and Parametric Technology Corporation (PTC) are working together on a prototype AEE pilot project to evaluate PTC's product collaboration tools relative to the needs of the NWC. The primary deliverable for the project is a set of validated criteria for defining a complete commercial off-the-shelf (COTS) solution to deploy the AEE across the NWC.

  41. A new nano-engineered hierarchical membrane for concurrent removal of surfactant and oil from oil-in-water nanoemulsion

    PubMed Central

    Qin, Detao; Liu, Zhaoyang; Bai, Hongwei; Sun, Darren Delai; Song, Xiaoxiao

    2016-01-01

    Surfactant-stabilized oil-in-water nanoemulsions pose a severe threat to both the environment and human health. Recent development of membrane filtration technology has enabled efficient oil removal from oil/water nanoemulsions; however, the concurrent removal of surfactant and oil remains unsolved because existing filtration membranes still suffer from low surfactant removal rates and serious surfactant-induced fouling. In this study, to realize the concurrent removal of surfactant and oil from nanoemulsion, a novel hierarchically structured membrane is designed with a nanostructured selective layer on top of a microstructured support layer. The physical and chemical properties of the overall membrane, including wettability, surface roughness, electric charge, thickness, and structure, are delicately tailored through a nano-engineered fabrication process, that is, graphene oxide (GO) nanosheet-assisted phase inversion coupled with surface functionalization. Compared with a membrane fabricated by conventional phase inversion, this novel membrane has four times higher water flux, significantly higher rejection of both oil (~99.9%) and surfactant (as high as 93.5%), and a two-thirds lower fouling ratio when treating surfactant-stabilized oil-in-water nanoemulsion. Due to its excellent performance and facile fabrication process, this nano-engineered membrane is expected to have wide practical applications in the oil/water separation fields of environmental protection and water purification. PMID:27087362

  42. Concurrent and Collaborative Engineering Implementation in an R and D Organization

    NASA Technical Reports Server (NTRS)

    DelRosario, Ruben; Davis, Jose M.; Keys, L. Ken

    2003-01-01

    Concurrent Engineering (CE) and Collaborative Engineering (or Collaborative Product Development, CPD) have emerged as new paradigms with significant impact on the development of new products and processes. With documented and substantiated success in the automotive and technology industries, CE and, most recently, CPD are being touted as innovative management philosophies for many other business sectors, including Research and Development. This paper introduces two independent research initiatives conducted at the NASA Glenn Research Center (GRC) in Cleveland, Ohio, investigating the application of CE and CPD in an R&D environment. Since little research has been conducted on the use of CE and CPD in sectors other than high mass-production manufacturing, the objective of these independent studies is to provide a systematic evaluation of the applicability of these paradigms (concurrent and collaborative) in a low/no-production, service environment, in particular R&D.

  43. The MEOW lunar project for education and science based on concurrent engineering approach

    NASA Astrophysics Data System (ADS)

    Roibás-Millán, E.; Sorribes-Palmer, F.; Chimeno-Manguán, M.

    2018-07-01

    The use of concurrent engineering in the design of space missions makes it possible to account, in an interrelated methodology, for the high level of coupling and iteration among mission subsystems in the preliminary conceptual phase. This work presents the result of applying concurrent engineering over a short time lapse to design the main elements of the preliminary design for a lunar exploration mission, developed within the ESA Academy Concurrent Engineering Challenge 2017. During this programme, students of the Master in Space Systems at the Technical University of Madrid designed a low-cost satellite to find water on the Moon's south pole as a prospect for a future human lunar base. The resulting mission, the Moon Explorer And Observer of Water/Ice (MEOW), comprises a 262 kg spacecraft to be launched into a Geostationary Transfer Orbit as a secondary payload in the 2023/2025 time frame. A three-month Weak Stability Boundary transfer via the Sun-Earth L1 Lagrange point allows for high launch-timeframe flexibility. The different aspects of the mission (orbit analysis, spacecraft design, and payload) and the possibilities of concurrent engineering are described.

  44. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems ... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to ... integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems

  45. Elements of Designing for Cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1992-01-01

    During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity-based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.

  46. Elements of designing for cost

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Unal, Resit

    1992-01-01

    During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity-based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.

  47. Multi-Attribute Tradespace Exploration in Space System Design

    NASA Astrophysics Data System (ADS)

    Ross, A. M.; Hastings, D. E.

    2002-01-01

    The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher-fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, managers). MATE with concurrent design couples decision makers more closely to the design and, most importantly, maintains their presence between formal reviews.
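
    A simplified sketch of the multi-attribute aggregation at the heart of MATE, using a plain additive weighting rather than the full Keeney-Raiffa multi-attribute utility the MIT work employs; the attributes, ranges, and weights are invented.

    ```python
    # Additive multi-attribute utility sketch: map each attribute to [0, 1],
    # then combine with stakeholder weights and rank the tradespace.
    def linear_utility(x, worst, best):
        """Single-attribute utility; worst > best handles 'smaller is better'."""
        u = (x - worst) / (best - worst)
        return min(max(u, 0.0), 1.0)

    WEIGHTS = {"coverage": 0.5, "latency": 0.3, "lifetime": 0.2}  # sum to 1

    def total_utility(d):
        u = {"coverage": linear_utility(d["coverage"], 0.2, 1.0),
             "latency":  linear_utility(d["latency"], 120.0, 5.0),   # minutes
             "lifetime": linear_utility(d["lifetime"], 1.0, 10.0)}   # years
        return sum(WEIGHTS[a] * u[a] for a in WEIGHTS)

    tradespace = [{"coverage": 0.8, "latency": 30.0, "lifetime": 5.0, "cost": 210},
                  {"coverage": 0.6, "latency": 10.0, "lifetime": 7.0, "cost": 160}]
    for d in sorted(tradespace, key=total_utility, reverse=True):
        print(f"utility={total_utility(d):.3f} at cost ${d['cost']}M")
    ```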

  48. Risk Identification and Visualization in a Concurrent Engineering Team Environment

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Chattopadhyay, Debarati; Shishko, Robert

    2010-01-01

    Incorporating risk assessment into the dynamic environment of a concurrent engineering team requires rapid response and adaptation. Generating consistent risk lists with inputs from all the relevant subsystems and presenting the results clearly to the stakeholders in a concurrent engineering environment is difficult because of the speed with which decisions are made. In this paper we describe the various approaches and techniques that have been explored for the point designs of JPL's Team X and the Trade Space Studies of the Rapid Mission Architecture Team. The paper also focuses on issues of the misuse of categorical and ordinal data that keep arising within current engineering risk approaches and in the applied risk literature.
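
    A worked toy example (not from the paper) of the ordinal-data misuse the authors flag: arithmetic on 1-5 likelihood and impact ranks carries no ratio information, so rank products can invert the ordering implied by the underlying expected losses.

    ```python
    # Two hypothetical risks described by their true probability and loss.
    risk = {"A": {"p": 0.50, "loss": 1.0},    # likely, minor ($M)
            "B": {"p": 0.01, "loss": 80.0}}   # rare, catastrophic ($M)

    expected_loss = {k: r["p"] * r["loss"] for k, r in risk.items()}
    print(expected_loss)   # {'A': 0.5, 'B': 0.8} -> B is the worse risk

    # The same risks mapped onto typical 5x5 ordinal scales and multiplied:
    ordinal_score = {"A": 4 * 2,   # likelihood rank 4, impact rank 2
                     "B": 1 * 5}   # likelihood rank 1, impact rank 5
    print(ordinal_score)   # {'A': 8, 'B': 5} -> the ordering flips
    ```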

  49. Implementation of a Three-Semester Concurrent Engineering Design Sequence for Lower-Division Engineering Students

    ERIC Educational Resources Information Center

    Bertozzi, N.; Hebert, C.; Rought, J.; Staniunas, C.

    2007-01-01

    Over the past decade the software products available for solid modeling, dynamic, stress, thermal, and flow analysis, and computer-aiding manufacturing (CAM) have become more powerful, affordable, and easier to use. At the same time it has become increasingly important for students to gain concurrent engineering design and systems integration…

  50. Concurrent Engineering for the Management of Research and Development

    NASA Technical Reports Server (NTRS)

    DelRosario, Ruben; Petersen, Paul F.; Keys, L. Ken; Chen, Injazz J.

    2004-01-01

    The management of Research and Development (R&D) is facing the challenges of reducing time from R&D to customer, reducing the cost of R&D, having higher accountability for results (improved quality), and increasing focus on customers. Concurrent engineering (CE) has shown great success in the automotive and technology industries, resulting in significant decreases in cycle time, reduction of total cost, and increases in quality and reliability. This philosophy of concurrency can have similar implications or benefits for the management of R&D organizations. Since most studies on the application of CE have been performed in manufacturing environments, research into the benefits of CE in other environments is needed. This paper presents research conducted at the NASA Glenn Research Center (GRC) investigating the application of CE in the management of an R&D organization. In particular, the paper emphasizes possible barriers and enhancers that this environment presents to the successful implementation of CE. Preliminary results and recommendations are based on a series of interviews and subsequent surveys, from which data has been gathered and analyzed as part of GRC's Continuous Improvement Process.

  51. Space Station logistics policy - Risk management from the top down

    NASA Technical Reports Server (NTRS)

    Paules, Granville; Graham, James L., Jr.

    1990-01-01

    Considerations are presented in the area of risk management specifically relating to logistics and system supportability. These considerations form a basis for confident application of concurrent engineering principles to a development program, aiming at simultaneous consideration of support and logistics requirements within the engineering process as the system concept and designs develop. It is shown that, by applying such a process, the chances of minimizing program logistics and supportability risk in the long term can be improved. The problem of analyzing and minimizing integrated logistics risk for the Space Station Freedom Program is discussed.

  12. Product development: the making of the Abbott ARCHITECT.

    PubMed

    Kisner, H J

    1997-01-01

    Many laboratorians have a limited perspective on what is involved in developing an instrument and bringing it to market. This article traces the product development process used by Abbott Diagnostics Division that resulted in Abbott being named the 1996 Concurrent Engineering Company of the Year for the design of the ARCHITECT.

  13. NASA's Planetary Science Summer School: Training Future Mission Leaders in a Concurrent Engineering Environment

    NASA Astrophysics Data System (ADS)

    Mitchell, K. L.; Lowes, L. L.; Budney, C. J.; Sohus, A.

    2014-12-01

    NASA's Planetary Science Summer School (PSSS) is an intensive program for postdocs and advanced graduate students in science and engineering fields with a keen interest in planetary exploration. The goal is to train the next generation of planetary science mission leaders in a hands-on environment involving a wide range of engineers and scientists. It was established in 1989, and has undergone several incarnations. Initially a series of seminars, it became a more formal mission design experience in 1999. Admission is competitive, with participants given financial support. The competitively selected trainees develop an early mission concept study in teams of 15-17, responsive to a typical NASA Science Mission Directorate Announcement of Opportunity. They select the mission concept from options presented by the course sponsors, based on high-priority missions as defined by the Decadal Survey, prepare a presentation for a proposal authorization review, present it to a senior review board, and receive critical feedback. Each participant assumes multiple roles, on science, instrument, and project teams. They develop an understanding of top-level science requirements and instrument priorities in advance through a series of reading assignments and webinars. Then, during the five-day session at the Jet Propulsion Laboratory, they work closely with concurrent engineers including JPL's Advanced Projects Design Team ("Team X"), a cross-functional multidisciplinary team of engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis and evaluation of mission concept designs. All are mentored and assisted directly by Team X members and course tutors in their assigned project roles. There is a strong emphasis on making difficult trades, simulating a real mission design process as accurately as possible. The process is intense and at times dramatic, with fast-paced design sessions and late evening study sessions. A survey of PSSS alumni administered in 2013 provides information on the program's impact on trainees' career choices and leadership roles as they pursue their employment in planetary science and related fields. Results will be presented during the session, along with highlights of topics and missions covered since the program's inception.

  14. Concurrent design of an RTP chamber and advanced control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spence, P.; Schaper, C.; Kermani, A.

    1995-12-31

    A concurrent-engineering approach is applied to the development of an axisymmetric rapid-thermal-processing (RTP) reactor and its associated temperature controller. Using a detailed finite-element thermal model as a surrogate for actual hardware, the authors have developed and tested a multi-input multi-output (MIMO) controller. Closed-loop simulations are performed by linking the control algorithm with the finite-element code. Simulations show that good temperature uniformity is maintained on the wafer during both steady and transient conditions. A numerical study shows the effect of ramp rate, feedback gain, sensor placement, and wafer-emissivity patterns on system performance.
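    The closed-loop-simulation idea of linking a plant model with a controller can be sketched compactly. The toy below (illustrative only: a two-zone linear model with hypothetical gains and limits, not the paper's finite-element model or MIMO controller) steps a thermal plant under PI control with simple anti-windup:

    ```python
    # Closed-loop sketch in the spirit of linking a thermal model with a
    # temperature controller; all numbers are hypothetical.
    import numpy as np

    A = np.array([[-0.10, 0.02],          # zone coupling and heat loss
                  [0.02, -0.10]])
    B = np.eye(2)                          # one lamp input per zone
    x = np.zeros(2)                        # zone temperatures above ambient
    setpoint = np.array([50.0, 50.0])
    integ = np.zeros(2)
    Kp, Ki, dt = 0.5, 0.05, 1.0

    for _ in range(200):
        err = setpoint - x
        u_raw = Kp * err + Ki * integ
        u = np.clip(u_raw, 0.0, 10.0)      # actuator (lamp power) limits
        integ += np.where(u == u_raw, err * dt, 0.0)   # anti-windup
        x = x + dt * (A @ x + B @ u)       # forward-Euler plant step

    print(x)   # both zones near 50; uniformity error = x[0] - x[1]
    ```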

  15. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  16. Integrating post-manufacturing issues into design and manufacturing decisions

    NASA Technical Reports Server (NTRS)

    Eubanks, Charles F.

    1996-01-01

    An investigation is conducted into some of the fundamental issues underlying design for manufacturing, service, and recycling that affect engineering decisions early in the conceptual design phase of mechanical systems. The investigation focuses on a system-based approach to material selection, manufacturing methods, and assembly processes related to overall product requirements, performance, and life-cycle costs. Particular emphasis is placed on concurrent engineering decision support for post-manufacturing issues such as serviceability, recyclability, and product retirement.

  17. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing.

    PubMed

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2014-10-01

    A push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels.
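    The occupancy-driven flavor of such a model can be sketched in a few lines. The following toy calculation (a constructed illustration, not the authors' model; all hardware limits and timings are hypothetical) estimates how many blocks of a compute-bound kernel are resident per streaming multiprocessor and scales runtime by the resulting number of execution waves:

    ```python
    # Toy occupancy-style performance estimate for a compute-bound kernel.

    def blocks_per_sm(threads_per_block, regs_per_thread, smem_per_block,
                      max_threads=2048, regs_per_sm=65536, smem_per_sm=49152):
        """Occupancy limit: the tightest of the thread, register, and
        shared-memory constraints (hardware limits here are illustrative)."""
        by_threads = max_threads // threads_per_block
        by_regs = regs_per_sm // (regs_per_thread * threads_per_block)
        by_smem = smem_per_sm // smem_per_block if smem_per_block else by_threads
        return min(by_threads, by_regs, by_smem)

    def predicted_time(total_blocks, threads_per_block, regs_per_thread,
                       smem_per_block, time_per_wave, num_sms=16):
        """Compute-bound kernels execute in 'waves' of resident blocks."""
        resident = blocks_per_sm(threads_per_block, regs_per_thread,
                                 smem_per_block) * num_sms
        waves = -(-total_blocks // resident)   # ceiling division
        return waves * time_per_wave

    print(predicted_time(total_blocks=4096, threads_per_block=256,
                         regs_per_thread=32, smem_per_block=4096,
                         time_per_wave=1.0))   # 32 waves -> 32.0 time units
    ```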

  18. Concurrent engineering research center

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    The projects undertaken by The Concurrent Engineering Research Center (CERC) at West Virginia University are reported and summarized. CERC's participation in the Department of Defense's Defense Advanced Research Project relating to technology needed to improve the product development process is described, particularly in the area of advanced weapon systems. The efforts committed to improving collaboration among the diverse and distributed health care providers are reported, along with the research activities for NASA in Independent Software Verification and Validation. CERC also takes part in the electronic respirator certification initiated by The National Institute for Occupational Safety and Health, as well as in the efforts to find a solution to the problem of producing environment-friendly end-products for product developers worldwide. The 3M Fiber Metal Matrix Composite Model Factory Program is discussed. CERC technologies, facilities, and personnel-related issues are described, along with its library and technical services and recent publications.

  19. Model-Based Systems Engineering in Concurrent Engineering Centers

    NASA Technical Reports Server (NTRS)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a focused design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  1. NASA's computer science research program

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  2. A Response Surface Methodology for Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Altus, Troy David; Sobieski, Jaroslaw (Technical Monitor)

    2002-01-01

    The report describes a new method for optimization of engineering systems such as aerospace vehicles whose design must harmonize a number of subsystems and various physical phenomena, each represented by a separate computer code, e.g., aerodynamics, structures, propulsion, and performance. To represent the system internal couplings, the codes receive output from other codes as part of their inputs. The system analysis and optimization task is decomposed into subtasks that can be executed concurrently, each subtask conducted using local state and design variables and holding constant a set of the system-level design variables. The subtask results are stored in the form of response surfaces (RS) fitted in the space of the system-level variables, to be used as subtask surrogates in a system-level optimization whose purpose is to optimize the system objective(s) and to reconcile the system internal couplings. By virtue of decomposition and execution concurrency, the method enables a broad work front in the organization of an engineering project involving a number of specialty groups that might be geographically dispersed, and it exploits the contemporary computing technology of massively concurrent and distributed processing. The report includes a demonstration test case of a supersonic business jet design.
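    As a schematic of the response-surface mechanism (a constructed toy, not the report's BLISS implementation; the two "disciplinary codes" and their quadratic forms are hypothetical stand-ins), the sketch below samples two subsystem outputs over a shared system-level variable, fits quadratic surfaces, and optimizes the system objective on the cheap surrogates alone:

    ```python
    # Response-surface surrogate sketch: sample, fit, optimize.
    import numpy as np

    def aero_drag(x):        # stand-in for an expensive aerodynamics code
        return (x - 2.0) ** 2 + 1.0

    def struct_mass(x):      # stand-in for an expensive structures code
        return 0.5 * (x - 3.0) ** 2 + 2.0

    xs = np.linspace(0.0, 5.0, 7)       # shared design-variable samples
    rs_drag = np.polyfit(xs, [aero_drag(x) for x in xs], 2)
    rs_mass = np.polyfit(xs, [struct_mass(x) for x in xs], 2)

    # System-level optimization over the fitted surrogates only.
    grid = np.linspace(0.0, 5.0, 501)
    objective = np.polyval(rs_drag, grid) + np.polyval(rs_mass, grid)
    x_star = grid[np.argmin(objective)]
    print(x_star)   # ~2.33: trades drag against mass without re-running codes
    ```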

  3. DARPA Concurrent Design/Concurrent Engineering Workshop Held in Key West, Florida on December 6-8, 1988

    DTIC Science & Technology

    1988-12-01

    engineering disciplines. (Here I refer to training in multifunction team management disciplines, quality engineering methods, experimental design by such...) Some issues: the view of strategic issues has been evolving - speed of design and product deployment - to accelerate experimentation with new...manufacturing process design. New technologies (e.g., composites) which can revolutionize product technical design in some cases. Issue still to be faced: "non...

  4. A Training Tool and Methodology to Allow Concurrent Multidisciplinary Experimental Projects in Engineering Education

    ERIC Educational Resources Information Center

    Maseda, F. J.; Martija, I.; Martija, I.

    2012-01-01

    This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE[subscript TT]), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…

  5. Space Station Freedom - Configuration management approach to supporting concurrent engineering and total quality management. [for NASA Space Station Freedom Program

    NASA Technical Reports Server (NTRS)

    Gavert, Raymond B.

    1990-01-01

    Some experiences of NASA configuration management in providing concurrent engineering support to the Space Station Freedom program for the achievement of life cycle benefits and total quality are discussed. Three change decision experiences involving tracing requirements and automated information systems of the electrical power system are described. The potential benefits of concurrent engineering and total quality management include improved operational effectiveness, reduced logistics and support requirements, prevention of schedule slippages, and life cycle cost savings. It is shown how configuration management can influence the benefits attained through disciplined approaches and innovations that compel consideration of all the technical elements of engineering and quality factors that apply to program development, the transition to operations, and operations. Configuration management experiences involving the Space Station program's tiered management structure, the work package contractors, international partners, and the participating NASA centers are discussed.

  6. Long-term health experience of jet engine manufacturing workers: VIII. glioblastoma incidence in relation to workplace experiences with parts and processes.

    PubMed

    Marsh, Gary M; Youk, Ada O; Buchanich, Jeanine M; Downing, Sarah; Kennedy, Kathleen J; Esmen, Nurtan A; Hancock, Roger P; Lacey, Steven E; Pierce, Jennifer S; Fleissner, Mary Lou

    2013-06-01

    To determine whether glioblastoma (GB) incidence rates among jet engine manufacturing workers were associated with workplace experiences with specific parts produced and processes performed. Subjects were 210,784 workers employed between 1952 and 2001. We conducted nested case-control and cohort incidence studies with focus on 277 GB cases. We estimated time experienced with 16 part families, 4 process categories, and 32 concurrent part-process combinations with 20 or more GB cases. In both the cohort and case-control studies, none of the part families, process categories, or part-process combinations considered was associated with increased GB risk. If not due to chance alone, the statistically nonsignificant elevation in GB rates at the North Haven plant may reflect external occupational factors or nonoccupational factors unmeasured in the current evaluation.

  7. Additive manufacturing: Toward holistic design

    DOE PAGES

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...

    2017-03-18

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  8. Additive manufacturing: Toward holistic design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  9. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing

    PubMed Central

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2015-01-01

    A push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels. PMID:26566545

  10. Concurrent Mission and Systems Design at NASA Glenn Research Center: The Origins of the COMPASS Team

    NASA Technical Reports Server (NTRS)

    McGuire, Melissa L.; Oleson, Steven R.; Sarver-Verhey, Timothy R.

    2012-01-01

    Established at the NASA Glenn Research Center (GRC) in 2006 to meet the need for rapid mission analysis and multi-disciplinary systems design for in-space and human missions, the Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team is a multidisciplinary, concurrent engineering group whose primary purpose is to perform integrated systems analysis, but it is also capable of designing any system that involves one or more of the disciplines present in the team. The authors were involved in the development of the COMPASS team and its design process, and are continuously making refinements and enhancements. The team was unofficially started in the early 2000s as part of the distributed team known as Team JIMO (Jupiter Icy Moons Orbiter) in support of the multi-center collaborative JIMO spacecraft design during Project Prometheus. This paper documents the origins of a concurrent mission and systems design team at GRC and how it evolved into the COMPASS team, including defining the process, gathering the team and tools, building the facility, and performing studies.

  11. Application of Concurrent Engineering Methods to the Design of an Autonomous Aerial Robot

    DTIC Science & Technology

    1991-12-01

    power within the system, either airborne or at a ground station, was left to the team's discretion. Data link from the aerial vehicle to the ground... [Figure 15: design freedom versus knowledge about the design over time in the design process, from conceptual through preliminary to detailed design] ...mission planning and control tasks was accomplished. Key system issues regarding power up and component initialization procedures began to be addressed

  12. Selected Current Acquisitions and Articles from Periodicals

    DTIC Science & Technology

    1994-06-01

    on Theater High Altitude Area Defense (THAAD) System: briefing report to the Chairman, Committee on Foreign Relations, U.S. Senate. [Washington, D.C....John Marshall Law School, 1993- LAW PERIODICALS. CONCURRENT ENGINEERING. Karbhari, Vistaspa Maneck. Concurrent engineering for composites...Postgraduate School, [1990] VC267.U6 H37 1991 United States. DOD FAR supplement: Department of Defense as of... Chicago, Ill.: Commerce Clearing House, 1994

  13. Improving generalized inverted index lock wait times

    NASA Astrophysics Data System (ADS)

    Borodin, A.; Mirvoda, S.; Porshnev, S.; Ponomareva, O.

    2018-01-01

    Concurrent operations on tree-like data structures are a cornerstone of any database system. Concurrent operations are intended to improve read/write performance and are usually implemented via some form of locking. Deadlock-free methods of concurrency control are known as tree locking protocols. These protocols provide basic operations (verbs) and algorithms (ways of invoking operations) for applying them to any tree-like data structure. These algorithms operate on data managed by a storage engine, and storage engines differ widely among RDBMS implementations. In this paper, we discuss a tree locking protocol implementation for the Generalized Inverted Index (GIN) applied to the multiversion concurrency control (MVCC) storage engine inside the PostgreSQL RDBMS. After that we introduce improvements to the locking protocol and provide usage statistics from an evaluation of our improvement in a very high load environment at one of the world's largest IT companies.
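    For orientation, the sketch below shows one classic deadlock-free tree-locking discipline, lock coupling ("crabbing"), in which a descent holds at most two locks at a time. It is a generic illustration, not the GIN/PostgreSQL protocol or the paper's improvement, and the node layout is invented:

    ```python
    # Lock coupling: take the child's lock before releasing the parent's,
    # so concurrent descents interleave safely and never deadlock.
    import threading

    class Node:
        def __init__(self, keys, children=None):
            self.keys = keys
            self.children = children or []
            self.lock = threading.Lock()

    def descend(root, key):
        """Find the leaf responsible for `key`, crabbing locks downward."""
        node = root
        node.lock.acquire()
        while node.children:
            # Pick the child covering `key` (simplified fan-out rule).
            idx = sum(1 for k in node.keys if key >= k)
            child = node.children[idx]
            child.lock.acquire()   # take the child lock first...
            node.lock.release()    # ...then release the parent: no gap
            node = child
        node.lock.release()        # a real operation would hold this lock
        return node                # while reading or writing the leaf

    leaf_a, leaf_b = Node([1, 2]), Node([10, 20])
    root = Node([10], [leaf_a, leaf_b])
    print(descend(root, 15) is leaf_b)   # True
    ```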

  14. Interdisciplinary and multilevel optimum design. [in aerospace structural engineering]

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1987-01-01

    Interactions among engineering disciplines and subsystems in engineering system design are surveyed and specific instances of such interactions are described. Examination of the interactions shows that a traditional design process, in which the numerical values of major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, enabling him to modify the system design so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.
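    A two-variable toy (constructed here, not one of the paper's examples) shows the effect: with a cross term coupling the variables, deciding them consecutively lands at a worse objective value than solving the coupled optimality conditions concurrently:

    ```python
    # Consecutive vs. concurrent decisions on a coupled objective.
    # The cross term x*y plays the role of interdisciplinary coupling.

    def f(x, y):
        return x**2 + y**2 + x*y - 4*x - 4*y

    # Consecutive (traditional) process: fix y = 0, pick the best x,
    # then pick the best y with x frozen.
    x_seq = 2.0            # argmin of f(x, 0) = x^2 - 4x
    y_seq = 1.0            # argmin of f(2, y) = y^2 - 2y - 4
    print(f(x_seq, y_seq))         # -5.0

    # Concurrent process: solve the coupled optimality conditions
    # 2x + y = 4 and 2y + x = 4 simultaneously.
    x_j = y_j = 4.0 / 3.0
    print(f(x_j, y_j))             # -5.33...: strictly better
    ```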

  15. Three gene expression vector sets for concurrently expressing multiple genes in Saccharomyces cerevisiae.

    PubMed

    Ishii, Jun; Kondo, Takashi; Makino, Harumi; Ogura, Akira; Matsuda, Fumio; Kondo, Akihiko

    2014-05-01

    Yeast has the potential to be used in bulk-scale fermentative production of fuels and chemicals due to its tolerance of low pH and robustness against autolysis. However, expression of multiple external genes in one host yeast strain is considerably labor-intensive due to the lack of polycistronic transcription. To promote the metabolic engineering of yeast, we generated systematic and convenient genetic engineering tools to express multiple genes in Saccharomyces cerevisiae. We constructed a series of multi-copy and integration vector sets for concurrently expressing two or three genes in S. cerevisiae by embedding three classical promoters. The comparative expression capabilities of the constructed vectors were monitored with green fluorescent protein, and the concurrent expression of genes was monitored with three different fluorescent proteins. Our multiple-gene expression tool will be helpful for the advanced construction of genetically engineered yeast strains in a variety of research fields other than metabolic engineering. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  16. Launch vehicle systems design analysis

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Verderaime, V.

    1993-01-01

    Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, the quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least cost can be realized through competent concurrent engineering teams and the brilliance of their technical leadership.

  17. Group Design Problems in Engineering Design Graphics.

    ERIC Educational Resources Information Center

    Kelley, David

    2001-01-01

    Describes group design techniques used within the engineering design graphics sequence at Western Washington University. Engineering and design philosophies such as concurrent engineering place an emphasis on group collaboration for the solving of design problems. (Author/DDR)

  18. Systems Engineering News | Wind | NREL

    Science.gov Websites

    The Wind Plant Optimization and Systems Engineering newsletter covers topics ranging from multidisciplinary design analysis and optimization of wind turbine sub-components, to wind plant optimization and uncertainty analysis, to concurrent engineering and financial engineering.

  19. Initiative in Concurrent Engineering (DICE). Phase 1.

    DTIC Science & Technology

    1990-02-09

    and power of commercial and military electronics systems. The continual evolution of HDE technology offers far greater flexibility in circuit design... powerful magnetic field of the permanent magnets in the Sawyer motors. This makes it possible to have multiple robots in the workcell and to have them...Controller. The Adept IC was chosen because of its extensive processing power, integrated grayscale vision, standard 28 industrial I/O control

  20. Concurrent Engineering Teams. Volume 2: Annotated Bibliography

    DTIC Science & Technology

    1990-11-01

    publishes. They normally embody results of major projects which (a) have a direct bearing on decisions affecting major programs, (b) address... D., "What Processes do You Own? How are They Doing?," Program Manager, Journal of the Defense Systems Management College, September-October 1989, pp... 216. The key ingredient to any successful TQM program is top management commitment and involvement. The early top management involvement reflects

  1. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

    This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry thought the study was important and provided the university access to innovative tools under cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allow them to monitor and track performance while it is in operation.

  2. Systematic and Scalable Testing of Concurrent Programs

    DTIC Science & Technology

    2013-12-16

    The evaluation of CHESS [107] checked eight different programs ranging from process management libraries to a distributed execution engine to a research... tool (§3.1) targets systematic testing of scheduling nondeterminism in multithreaded components of the Omega cluster management system [129], while... tool for systematic testing of multithreaded components of the Omega cluster management system [129]. In particular, §3.1.1 defines a model for

  3. Domain-specific languages and diagram customization for a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Cole, B.; Dubos, G.; Banazadeh, P.; Reh, J.; Case, K.; Wang, Y.; Jones, S.; Picha, F.

    A major open question for advocates of Model-Based Systems Engineering (MBSE) is the question of how system and subsystem engineers will work together. The Systems Modeling Language (SysML), like any language intended for a large audience, is in tension between the desires for simplicity and for expressiveness. In order to be more expressive, many specialized language elements may be introduced, which will unfortunately make a complete understanding of the language a more daunting task. While this may be acceptable for systems modelers, it will increase the challenge of including subsystem engineers in the modeling effort. One possible answer to this situation is the use of Domain-Specific Languages (DSL), which are fully supported by the Unified Modeling Language (UML). SysML is in fact a DSL for systems engineering. The expressive power of a DSL can be enhanced through the use of diagram customization. Various domains have already developed their own schematic vocabularies. Within the space engineering community, two excellent examples are the propulsion and telecommunication subsystems. A return to simple box-and-line diagrams (e.g., the SysML Internal Block Diagram) is in many ways a step backward. In order to allow subsystem engineers to contribute directly to the model, it is necessary to make a system modeling tool at least approximate in accessibility to drawing tools like Microsoft PowerPoint and Visio. The challenge is made more extreme in a concurrent engineering environment, where designs must often be drafted in an hour or two. In the case of the Jet Propulsion Laboratory's Team X concurrent design team, a subsystem is specified using a combination of PowerPoint for drawing and Excel for calculation. A pilot has been undertaken in order to meld the drawing portion and the production of master equipment lists (MELs) via a SysML authoring tool, MagicDraw. Team X currently interacts with its customers in a process of sharing presentations. There are several inefficiencies that arise from this situation. The first is that a customer team must wait two weeks to a month (which is 2-4 times the duration of most Team X studies themselves) for a finalized, detailed design description. Another is that this information must be re-entered by hand into the set of engineering artifacts and design tools that the mission concept team uses after a study is complete. Further, there is no persistent connection to Team X or institutionally shared formulation design tools and data after a given study, again reducing the direct reuse of designs created in a Team X study. This paper presents the underpinnings of subsystem DSLs as they were developed for this pilot. This includes specialized semantics for different domains as well as the process by which major categories of objects were derived in support of defining the DSLs. The feedback given to us by the domain experts on usability, along with a pilot study with the partial inclusion of these tools, is also discussed.

  4. Domain-Specific Languages and Diagram Customization for a Concurrent Engineering Environment

    NASA Technical Reports Server (NTRS)

    Cole, Bjorn; Dubos, Greg; Banazadeh, Payam; Reh, Jonathan; Case, Kelley; Wang, Yeou-Fang; Jones, Susan; Picha, Frank

    2013-01-01

    A major open question for advocates of Model-Based Systems Engineering (MBSE) is the question of how system and subsystem engineers will work together. The Systems Modeling Language (SysML), like any language intended for a large audience, is in tension between the desires for simplicity and for expressiveness. In order to be more expressive, many specialized language elements may be introduced, which will unfortunately make a complete understanding of the language a more daunting task. While this may be acceptable for systems modelers, it will increase the challenge of including subsystem engineers in the modeling effort. One possible answer to this situation is the use of Domain-Specific Languages (DSL), which are fully supported by the Unified Modeling Language (UML). SysML is in fact a DSL for systems engineering. The expressive power of a DSL can be enhanced through the use of diagram customization. Various domains have already developed their own schematic vocabularies. Within the space engineering community, two excellent examples are the propulsion and telecommunication subsystems. A return to simple box-and-line diagrams (e.g., the SysML Internal Block Diagram) is in many ways a step backward. In order to allow subsystem engineers to contribute directly to the model, it is necessary to make a system modeling tool at least approximate in accessibility to drawing tools like Microsoft PowerPoint and Visio. The challenge is made more extreme in a concurrent engineering environment, where designs must often be drafted in an hour or two. In the case of the Jet Propulsion Laboratory's Team X concurrent design team, a subsystem is specified using a combination of PowerPoint for drawing and Excel for calculation. A pilot has been undertaken in order to meld the drawing portion and the production of master equipment lists (MELs) via a SysML authoring tool, MagicDraw. Team X currently interacts with its customers in a process of sharing presentations. There are several inefficiencies that arise from this situation. The first is that a customer team must wait two weeks to a month (which is 2-4 times the duration of most Team X studies themselves) for a finalized, detailed design description. Another is that this information must be re-entered by hand into the set of engineering artifacts and design tools that the mission concept team uses after a study is complete. Further, there is no persistent connection to Team X or institutionally shared formulation design tools and data after a given study, again reducing the direct reuse of designs created in a Team X study. This paper presents the underpinnings of subsystem DSLs as they were developed for this pilot. This includes specialized semantics for different domains as well as the process by which major categories of objects were derived in support of defining the DSLs. The feedback given to us by the domain experts on usability, along with a pilot study with the partial inclusion of these tools, is also discussed.

  5. System software for the finite element machine

    NASA Technical Reports Server (NTRS)

    Crockett, T. W.; Knott, J. D.

    1985-01-01

    The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.
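    The communication and synchronization issues named above have familiar small-scale analogues. The sketch below is illustrative only (it is not the Finite Element Machine's system software, and the "local result" is a hypothetical stand-in for a finite element computation): workers synchronize at a barrier and then communicate results back to a host process.

    ```python
    # Barrier synchronization and result communication among workers.
    import multiprocessing as mp

    def worker(rank, barrier, queue):
        local = float(rank + 1) ** 2        # stand-in for a local FE result
        barrier.wait()                      # synchronize: all locals ready
        queue.put((rank, local))            # communicate result to host

    if __name__ == "__main__":
        n = 4
        barrier, queue = mp.Barrier(n), mp.Queue()
        procs = [mp.Process(target=worker, args=(r, barrier, queue))
                 for r in range(n)]
        for p in procs:
            p.start()
        results = dict(queue.get() for _ in range(n))
        for p in procs:
            p.join()
        print(sum(results.values()))        # 30.0
    ```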

  6. Interdisciplinary and multilevel optimum design

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.

    1986-01-01

    Interactions among engineering disciplines and subsystems in engineering system design are surveyed and specific instances of such interactions are described. Examination of the interactions shows that a traditional design process, in which the numerical values of major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, enabling him to modify the system design so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.

  7. Design for improved maintenance of the fiber-optic cable system (As carried out in a concurrent engineering environment)

    NASA Astrophysics Data System (ADS)

    Tremoulet, P. C.

    The author describes a number of maintenance improvements in the Fiber Optic Cable System (FOCS). They were achieved during a production phase pilot concurrent engineering program. Listed in order of importance (saved maintenance time and material) by maintenance level, they are: (1) organizational level: improved fiber optic converter (FOC) BITE; (2) Intermediate level: reduced FOC adjustments from 20 to 2; partitioned FOC into electrical and optical parts; developed cost-effective fault isolation test points and test using standard test equipment; improved FOC chassis to have lower mean time to repair; and (3) depot level: revised test requirements documents (TRDs) for common automatic test equipment and incorporated ATE testability into circuit and assemblies and application-specific integrated circuits. These improvements met this contract's tailored logistics MIL-STD 1388-1A requirements of monitoring the design for supportability and determining the most effective support equipment. Important logistics lessons learned while accomplishing these maintainability and supportability improvements on the pilot concurrent engineering program are also discussed.

  8. Utilization of CAD/CAE for concurrent design of structural aircraft components

    NASA Technical Reports Server (NTRS)

    Kahn, William C.

    1993-01-01

    The feasibility of installing the Stratospheric Observatory for Infrared Astronomy telescope (named SOFIA) into an aircraft for NASA astronomy studies is investigated using CAD/CAE equipment to either design or supply data for every facet of design engineering. The aircraft selected for the platform was a Boeing 747, chosen on the basis of its ability to meet the flight profiles required for the given mission and payload. CAD models of the fuselage of two of the aircraft models studied (747-200 and 747 SP) were developed, and models for the component parts of the telescope and subsystems were developed by the various concurrent engineering groups of the SOFIA program, to determine the requirements for the cavity opening and for design configuration. It is noted that, by developing a plan to use CAD/CAE for concurrent engineering at the beginning of the study, it was possible to produce results in about two-thirds of the time required using traditional methods.

  9. Theory of remote entanglement via quantum-limited phase-preserving amplification

    NASA Astrophysics Data System (ADS)

    Silveri, Matti; Zalys-Geller, Evan; Hatridge, Michael; Leghtas, Zaki; Devoret, Michel H.; Girvin, S. M.

    2016-06-01

    We show that a quantum-limited phase-preserving amplifier can act as a which-path information eraser when followed by heterodyne detection. This "beam splitter with gain" implements a continuous joint measurement on the signal sources. As an application, we propose heralded concurrent remote entanglement generation between two qubits coupled dispersively to separate cavities. Dissimilar qubit-cavity pairs can be made indistinguishable by simple engineering of the cavity driving fields providing further experimental flexibility and the prospect for scalability. Additionally, we find an analytic solution for the stochastic master equation, a quantum filter, yielding a thorough physical understanding of the nonlinear measurement process leading to an entangled state of the qubits. We determine the concurrence of the entangled states and analyze its dependence on losses and measurement inefficiencies.
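    For orientation, the textbook input-output relation for a quantum-limited phase-preserving amplifier (standard notation, not necessarily the paper's) is

    ```latex
    \hat{a}_{\mathrm{out}} = \sqrt{G}\,\hat{a}_{\mathrm{in}} + \sqrt{G-1}\,\hat{b}_{\mathrm{in}}^{\dagger}
    ```

    where G is the photon-number gain and b̂_in is the idler mode. The obligatory idler term carries the quantum-mechanically required added noise and is the origin of the "beam splitter with gain" picture invoked above.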

  10. An Example of Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney; Whitten, David; Cloyd, Richard; Coppens, Chris; Rodriguez, Pedro

    1998-01-01

    The Collaborative Engineering Design and Analysis Room (CEDAR) facility allows on-the-spot design review capability for any project during all phases of development. The required disciplines assemble in this facility to work on any problems (analysis, manufacturing, inspection, etc.) associated with a particular design. A small highly focused team of specialists can meet in this room to better expedite the process of developing a solution to an engineering task within the framework of the constraints that are unique to each discipline. This facility provides the engineering tools and translators to develop a concept within the confines of the room or with remote team members that could access the team's data from other locations. The CEDAR area is envisioned as excellent for failure investigation meetings to be conducted where the computer capabilities can be utilized in conjunction with the Smart Board display to develop failure trees, brainstorm failure modes, and evaluate possible solutions.

  11. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results. The 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond just design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.

  12. Transputer parallel processing at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1989-01-01

    The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.

  13. 78 FR 8596 - Committee on Equal Opportunities in Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-06

    ... NATIONAL SCIENCE FOUNDATION Committee on Equal Opportunities in Science and Engineering 1173... Science and Engineering (CEOSE). Dates/Time: February 25, 2013, 9:00 a.m.-5:30 p.m.; February 26, 2013, 9... participation in science and engineering. Agenda: Opening Statement by the CEOSE Chair Discussions: Concurrence...

  14. Using Life-Cycle Human Factors Engineering to Avoid $2.4 Million in Costs: Lessons Learned from NASA's Requirements Verification Process for Space Payloads

    NASA Technical Reports Server (NTRS)

    Carr, Daniel; Ellenberger, Rich

    2008-01-01

    The Human Factors Implementation Team (HFIT) process has been used to verify human factors requirements for NASA International Space Station (ISS) payloads since 2003, resulting in $2.4 million in avoided costs. This cost benefit has been realized by greatly reducing the need to process time-consuming formal waivers (exceptions) for individual requirements violations. The HFIT team, which includes astronauts and their technical staff, acts as the single source for human factors requirements integration of payloads. HFIT has the authority to provide inputs during early design phases, thus eliminating many potential requirements violations in a cost-effective manner. In those instances where it is not economically or technically feasible to meet the precise metric of a given requirement, HFIT can work with the payload engineers to develop common sense solutions and formally document that the resulting payload design does not materially affect the astronaut's ability to operate and interact with the payload. The HFIT process is fully ISO 9000 compliant and works concurrently with NASA's formal systems engineering work flow. Due to its success with payloads, the HFIT process is being adapted and extended to ISS systems hardware. Key aspects of this process are also being considered for NASA's Space Shuttle replacement, the Crew Exploration Vehicle.

  15. Ames Engineering Directorate

    NASA Technical Reports Server (NTRS)

    Phillips, Veronica J.

    2017-01-01

    The Ames Engineering Directorate is the principal engineering organization supporting aerospace systems and spaceflight projects at NASA's Ames Research Center in California's Silicon Valley. The Directorate supports all phases of engineering and project management for flight and mission projects, from R&D to close-out, by leveraging the capabilities of multiple divisions and facilities. The Mission Design Center (MDC) has full end-to-end mission design capability with sophisticated analysis and simulation tools in a collaborative concurrent design environment. Services include concept maturity level (CML) maturation, spacecraft design and trades, scientific instruments selection, feasibility assessments, and proposal support and partnerships. The Engineering Systems Division provides robust project management support as well as systems engineering, mechanical and electrical analysis and design, technical authority and project integration support to a variety of programs and projects across NASA centers. The Applied Manufacturing Division turns abstract ideas into tangible hardware for aeronautics, spaceflight and science applications, specializing in fabrication methods and management of complex fabrication projects. The Engineering Evaluation Lab (EEL) provides full satellite or payload environmental testing services including vibration, temperature, humidity, immersion, pressure/altitude, vacuum, high G centrifuge, shock impact testing and the Flight Processing Center (FPC), which includes cleanrooms, bonded stores and flight preparation resources. The Multi-Mission Operations Center (MMOC) is composed of the facilities, networks, IT equipment, software and support services needed by flight projects to effectively and efficiently perform all mission functions, including planning, scheduling, command, telemetry processing and science analysis.

  16. Design and Implementation of a Threaded Search Engine for Tour Recommendation Systems

    NASA Astrophysics Data System (ADS)

    Lee, Junghoon; Park, Gyung-Leen; Ko, Jin-Hee; Shin, In-Hye; Kang, Mikyung

    This paper implements a threaded scan engine for the O(n!) search space and measures its performance, aiming at providing a responsive tour recommendation and scheduling service. As a preliminary step toward integrating POI ontology, a mobile object database, and personalization profiles for the development of new vehicular telematics services, this implementation can give a useful guideline for designing a challenging and computation-intensive vehicular telematics service. The implemented engine allocates subtrees to the respective threads and makes them run concurrently, exploiting the primitives provided by the operating system and the underlying multiprocessor architecture. It also makes it easy to add a variety of constraints; for example, the search tree is pruned if the cost of a partial allocation already exceeds the current best. The performance measurement result shows that the service can run even on a low-power telematics device when the number of destinations does not exceed 15, with appropriate constraint processing.
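    The subtree-per-thread split and cost-bound pruning can be sketched compactly. The following sketch is illustrative only (the distance matrix is hypothetical, and the original engine is built on OS threading primitives for a telematics device): each first destination's subtree is assigned to its own thread, and partial tours are pruned against the shared best cost.

    ```python
    # Threaded branch-and-bound search over the O(n!) tour space.
    import threading

    DIST = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 3],
            [10, 4, 3, 0]]            # node 0 is the starting point

    best = {"cost": float("inf"), "tour": None}
    lock = threading.Lock()

    def search(prefix, cost, remaining):
        if cost >= best["cost"]:      # prune: partial cost already too high
            return
        if not remaining:
            with lock:                # re-check under the lock before update
                if cost < best["cost"]:
                    best["cost"], best["tour"] = cost, prefix
            return
        for nxt in remaining:
            search(prefix + [nxt], cost + DIST[prefix[-1]][nxt],
                   [r for r in remaining if r != nxt])

    # Split the search tree at depth one: one thread per first destination.
    threads = [threading.Thread(target=search,
                                args=([0, first], DIST[0][first],
                                      [d for d in (1, 2, 3) if d != first]))
               for first in (1, 2, 3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(best)   # {'cost': 9, 'tour': [0, 1, 3, 2]}
    ```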

  17. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    The Concurrent Image Processing Executive (CIPE) is a software system intended for developing and using image-processing application programs in a concurrent computing environment. Designed to shield the programmer from the complexities of concurrent-system architecture, it provides an interactive image-processing environment for the end user. CIPE utilizes the architectural characteristics of a particular concurrent system to maximize efficiency while preserving architectural independence from the user and programmer. CIPE runs on a Mark-IIIfp 8-node hypercube computer and an associated SUN-4 host computer.
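    The programming model such an executive presents can be suggested with a small data-parallel sketch (a generic illustration, not CIPE's actual interface; the function names and contrast-stretch operation are invented): the image is decomposed across workers, each applies the same operation concurrently, and the pieces are reassembled.

    ```python
    # Row-wise decomposition of an image across concurrent workers.
    from concurrent.futures import ProcessPoolExecutor
    import numpy as np

    def stretch(strip, lo, hi):
        """Pointwise contrast stretch of one image strip to [0, 1]."""
        return (strip - lo) / (hi - lo)

    def parallel_stretch(img, workers=4):
        lo, hi = float(img.min()), float(img.max())    # assumes hi > lo
        strips = np.array_split(img, workers)          # row-wise decomposition
        with ProcessPoolExecutor(workers) as pool:
            parts = list(pool.map(stretch, strips,
                                  [lo] * workers, [hi] * workers))
        return np.vstack(parts)

    if __name__ == "__main__":                         # required for processes
        image = np.arange(64.0).reshape(8, 8)
        print(parallel_stretch(image).max())           # 1.0
    ```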

  18. The Concurrent Engineering Design Paradigm Is Now Fully Functional for Graphics Education

    ERIC Educational Resources Information Center

    Krueger, Thomas J.; Barr, Ronald E.

    2007-01-01

    Engineering design graphics education has come a long way in the past two decades. The emergence of solid geometric modeling technology has become the focal point for the graphical development of engineering design ideas. The main attraction of this 3-D modeling approach is the downstream application of the data base to analysis and…

  19. Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Agte, Jeremy S.; Sandusky, Robert R., Jr.

    1998-01-01

    BLISS is a method for optimization of engineering systems by decomposition. It separates the system-level optimization, having a relatively small number of design variables, from the potentially numerous subsystem optimizations that may each have a large number of local design variables. The subsystem optimizations are autonomous and may be conducted concurrently. Subsystem and system optimizations alternate, linked by sensitivity data, producing a design improvement in each iteration. Starting from a best-guess initial design, the method improves that design in iterative cycles, each cycle comprising two steps. In step one, the system-level variables are frozen and the improvement is achieved by separate, concurrent, and autonomous optimizations in the local variable subdomains. In step two, further improvement is sought in the space of the system-level variables. Optimum sensitivity data link the second step to the first. The method prototype was implemented using MATLAB and iSIGHT programming software and tested on a simplified, conceptual-level supersonic business jet design, and a detailed design of an electronic device. Satisfactory convergence and favorable agreement with the benchmark results were observed. Modularity of the method is intended to fit the human organization and map well onto the computing technology of concurrent processing.
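    The two-step cycle can be illustrated on a toy problem (constructed here, not the business-jet or electronic-device testbeds; the quadratic subsystem objectives are hypothetical): step one optimizes each subsystem's local variable concurrently with the system variable frozen, and step two improves the system variable with the locals frozen.

    ```python
    # Toy two-step alternation. z is the shared system-level variable;
    # x1 and x2 are local variables of two subsystems whose quadratic
    # objectives sum to the system objective:
    #   F = (x1 - z)^2 + (z - 1)^2 + (x2 + z)^2 + 0.5*(z - 3)^2
    from concurrent.futures import ThreadPoolExecutor

    def local_opt(sub, z):
        """Step 1: each subsystem minimizes its own term with z frozen."""
        return z if sub == 1 else -z   # closed-form argmin of each quadratic

    z = 0.0
    for _ in range(25):
        with ThreadPoolExecutor() as pool:           # autonomous, concurrent
            x1, x2 = pool.map(local_opt, (1, 2), (z, z))
        # Step 2: improve z with the locals frozen (damped gradient step).
        grad = 2 * (z - x1) + 2 * (z - 1.0) + 2 * (x2 + z) + (z - 3.0)
        z -= 0.2 * grad

    print(round(z, 3))   # converges to 5/3, with x1 -> z and x2 -> -z
    ```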

  20. Materials technology assessment for stirling engines

    NASA Technical Reports Server (NTRS)

    Stephens, J. R.; Witzke, W. R.; Watson, G. K.; Johnston, J. R.; Croft, W. J.

    1977-01-01

    A materials technology assessment of high temperature components in the improved (metal) and advanced (ceramic) Stirling engines was undertaken to evaluate the current state-of-the-art of metals and ceramics, identify materials research and development required to support the development of automotive Stirling engines, and to recommend materials technology programs to assure material readiness concurrent with engine system development programs. The most critical component for each engine is identified and some of the material problem areas are discussed.

  1. Early Formulation Model-centric Engineering on NASA's Europa Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Bayer, Todd; Chung, Seung; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Chris; Gontijo, Ivair; Lewis, Kari; Moshir, Mehrdad; Rasmussen, Robert

    2012-01-01

    The proposed Jupiter Europa Orbiter and Jupiter Ganymede Orbiter missions were formulated using current state-of-the-art MBSE facilities: JPL's TeamX and Rapid Mission Architecting, ESA's Concurrent Design Facility, and APL's ACE Concurrent Engineering Facility. When JEO became an official "pre-project" in September 2010, we had already developed a strong partnership with JPL's Integrated Model Centric Engineering (IMCE) initiative, decided to apply architecting and SysML-based MBSE from the beginning, and begun laying these foundations to support work in Phase A. Release of the Planetary Science Decadal Survey and the FY12 President's Budget in March 2011 changed the landscape: JEO reverted to being a pre-Phase A study. A conscious choice was made to continue the application of MBSE on the Europa Study, refocused for early formulation. This presentation describes the approach, results, and lessons.

  2. Process and assembly plans for low cost commercial fuselage structure

    NASA Technical Reports Server (NTRS)

    Willden, Kurtis; Metschan, Stephen; Starkey, Val

    1991-01-01

    Cost and weight reduction for a composite structure is a result of selecting design concepts that can be built using efficient, low cost manufacturing and assembly processes. Since design and manufacturing are inherently cost dependent, concurrent engineering in the form of a Design-Build Team (DBT) is essential for low cost designs. Detailed cost analysis from DBT designs and hardware verification must be performed to identify the cost drivers and the relationships between design and manufacturing processes. Results from the global evaluation are used to quantitatively rank designs, identify cost centers for higher ranking design concepts, define and prioritize a list of technical/economic issues and barriers, and identify parameters that control concept response. These results are then used for final design optimization.

  3. Action Learning in Undergraduate Engineering Thesis Supervision

    ERIC Educational Resources Information Center

    Stappenbelt, Brad

    2017-01-01

    In the present action learning implementation, twelve action learning sets were conducted over eight years. The action learning sets consisted of students involved in undergraduate engineering research thesis work. The concurrent study accompanying this initiative investigated the influence of the action learning environment on student approaches…

  4. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  5. Parallel Algorithm Solves Coupled Differential Equations

    NASA Technical Reports Server (NTRS)

    Hayashi, A.

    1987-01-01

    Numerical methods are adapted to concurrent processing. The algorithm solves a set of coupled partial differential equations by numerical integration. Adapted to run on a hypercube computer, the algorithm separates the problem into smaller problems that are solved concurrently. The increase in computing speed with concurrent processing, over that achievable with conventional sequential processing, is appreciable, especially for large problems.
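
    As a concrete illustration of this divide-and-integrate idea (not the specific hypercube algorithm of this record), the sketch below advances a 1-D diffusion equation by splitting the grid into two subdomains that exchange ghost cells each step; Python threads stand in for hypercube nodes.

        # Illustrative sketch: explicit integration of a 1-D diffusion
        # equation, split into two subdomains advanced concurrently.
        from concurrent.futures import ThreadPoolExecutor

        N, STEPS, R = 16, 200, 0.25        # grid points, time steps, dt*D/dx^2
        u = [0.0] * N
        u[N // 2] = 1.0                    # initial spike

        def advance(chunk):
            # Advance one subdomain a single time step; 'chunk' carries its
            # own cells plus one ghost cell per side supplied by neighbors.
            left, cells, right = chunk
            padded = [left] + cells + [right]
            return [padded[i] + R * (padded[i - 1] - 2 * padded[i] + padded[i + 1])
                    for i in range(1, len(padded) - 1)]

        half = N // 2
        with ThreadPoolExecutor() as pool:
            for _ in range(STEPS):
                chunks = [(0.0, u[:half], u[half]),      # fixed wall | ghost
                          (u[half - 1], u[half:], 0.0)]  # ghost | fixed wall
                lo, hi = pool.map(advance, chunks)       # subdomains in parallel
                u = lo + hi

        print(f"peak after diffusion: {max(u):.4f}")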

  6. 7 CFR 1794.10 - Applicant responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... prepare the applicable environmental documentation concurrent with a proposed action's engineering... AGRICULTURE (CONTINUED) ENVIRONMENTAL POLICIES AND PROCEDURES Implementation of the National Environmental...

  7. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1996-01-01

    The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.

  8. A definition of high-level decisions in the engineering of systems

    NASA Astrophysics Data System (ADS)

    Powell, Robert Anthony

    The role of the systems engineer requires that he or she be proactive, guiding the program manager and the customers through their decisions to enhance the effectiveness of system development and produce faster, better, and cheaper systems. The present lack of coverage in the literature on what these decisions are and how they relate to each other may be a contributing factor to the high rate of failure among system projects. At the outset of the system development process, decisions have an integral role in the design of a system that meets stakeholders' needs. This is apparent during the design and qualification of both the Development System and the Operational System. The performance, cost, and schedule of the Development System affect the performance of the Operational System and are affected by decisions that influence physical elements of the Development System. Likewise, the performance, cost, and schedule of the Operational System are affected by decisions that influence physical elements of the Operational System. Traditionally, product and process have been designed using know-how and trial and error. However, the empiricism of engineers and program managers is limited, which can lead, and has led, to costly mistakes. To date, very little research has explored the decisions made in the engineering of a system. In government, literature exists on procurement processes for major system development; but, in general, literature on these decisions, how they relate to each other, and the key information requirements within each of the two systems and across the two systems is not readily available. This research aims to improve the processes inherent in the engineering of systems. The primary focus is on Department of Defense (DoD) military systems, specifically aerospace systems, though the results may generalize more broadly. The result of this research is a process tool, a Decision System Model, which can be used by systems engineers to guide the program manager and their customers through the decisions involved in concurrently designing and qualifying both the Development and Operational systems.

  9. A Design-Based Engineering Graphics Course for First-Year Students.

    ERIC Educational Resources Information Center

    Smith, Shana Shiang-Fong

    2003-01-01

    Describes the first-year Introduction to Design course at Iowa State University which incorporates design for manufacturing and concurrent engineering principles into the curriculum. Autodesk Inventor was used as the primary CAD tool for parametric solid modeling. Test results show that student spatial visualization skills were dramatically…

  10. The Development and Validation of a Life Experience Inventory for the Identification of Creative Electrical Engineers.

    ERIC Educational Resources Information Center

    Michael, William B.; Colson, Kenneth R.

    1979-01-01

    The construction and validation of the Life Experience Inventory (LEI) for the identification of creative electrical engineers are described. Using the number of patents held or pending as a criterion measure, the LEI was found to have high concurrent validity. (JKS)

  11. Barista: A Framework for Concurrent Speech Processing by USC-SAIL

    PubMed Central

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G.; Narayanan, Shrikanth S.

    2016-01-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista allows demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0. PMID:27610047
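
    Barista itself is C++ built on the libcppa actor library; the Python sketch below is only an analogue of the actor pattern the abstract describes, with a toy two-stage "feature extraction then decoding" pipeline and invented handler functions.

        # Python analogue of an actor pipeline: actors own a mailbox and
        # communicate only by message passing (not Barista's actual API).
        import queue, threading

        class Actor(threading.Thread):
            def __init__(self, handler, downstream=None):
                super().__init__(daemon=True)
                self.mailbox = queue.Queue()
                self.handler, self.downstream = handler, downstream
                self.start()

            def send(self, msg):
                self.mailbox.put(msg)

            def run(self):
                while True:
                    msg = self.mailbox.get()
                    if msg is None:                    # poison pill: shut down
                        if self.downstream:
                            self.downstream.send(None)
                        return
                    out = self.handler(msg)
                    if self.downstream:
                        self.downstream.send(out)

        # Toy two-stage pipeline: "feature extraction" then "decoding".
        sink = Actor(lambda m: print("decoded:", m))
        features = Actor(lambda m: m.upper(), downstream=sink)
        for utterance in ("hello", "world"):
            features.send(utterance)
        features.send(None)
        sink.join()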

  12. Barista: A Framework for Concurrent Speech Processing by USC-SAIL.

    PubMed

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G; Narayanan, Shrikanth S

    2014-05-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista allows demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0.

  13. Dissipative production of a maximally entangled steady state of two quantum bits.

    PubMed

    Lin, Y; Gaebler, J P; Reiter, F; Tan, T R; Bowler, R; Sørensen, A S; Leibfried, D; Wineland, D J

    2013-12-19

    Entangled states are a key resource in fundamental quantum physics, quantum cryptography and quantum computation. Introduction of controlled unitary processes--quantum gates--to a quantum system has so far been the most widely used method to create entanglement deterministically. These processes require high-fidelity state preparation and minimization of the decoherence that inevitably arises from coupling between the system and the environment, and imperfect control of the system parameters. Here we combine unitary processes with engineered dissipation to deterministically produce and stabilize an approximate Bell state of two trapped-ion quantum bits (qubits), independent of their initial states. Compared with previous studies that involved dissipative entanglement of atomic ensembles or the application of sequences of multiple time-dependent gates to trapped ions, we implement our combined process using trapped-ion qubits in a continuous time-independent fashion (analogous to optical pumping of atomic states). By continuously driving the system towards the steady state, entanglement is stabilized even in the presence of experimental noise and decoherence. Our demonstration of an entangled steady state of two qubits represents a step towards dissipative state engineering, dissipative quantum computation and dissipative phase transitions. Following this approach, engineered coupling to the environment may be applied to a broad range of experimental systems to achieve desired quantum dynamics or steady states. Indeed, concurrently with this work, an entangled steady state of two superconducting qubits was demonstrated using dissipation.

  14. Description of inpatient medication management using cognitive work analysis.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Sengstacke, L T C Daniel N

    2009-01-01

    The purpose of this article was to describe key elements of an inpatient medication system using the cognitive work analysis method of Rasmussen et al (Cognitive Systems Engineering. Wiley Series in Systems Engineering; 1994). The work of nurses and physicians was observed during routine care of inpatients on a medical-surgical unit and an attached ICU. Interaction with pharmacists was included. Preoperative, postoperative, and medical care was observed. Personnel were interviewed to obtain information not easily observable during routine work. Communication between healthcare workers was projected onto an abstraction/decomposition hierarchy. Decision ladders and information flow charts were developed. Results suggest that decision making in an inpatient medical-surgical unit or ICU setting is a parallel, distributed process. Personnel are highly mobile and often are working on multiple issues concurrently. In this setting, communication is key to maintaining organization and synchronization for effective care. Implications for research approaches to system and interface designs and decision support for personnel involved in the process are discussed.

  15. Troubleshooting crude vacuum tower overhead ejector systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, J.R.; Frens, L.L.

    1995-03-01

    Routinely surveying tower overhead vacuum systems can improve performance and product quality. These vacuum systems normally provide reliable and consistent operation. However, process conditions, supplied utilities, corrosion, erosion, and fouling all have an impact on ejector system performance. Refinery vacuum distillation towers use ejector systems to maintain tower top pressure and remove overhead gases. However, as with virtually all refinery equipment, performance may be affected by a number of variables, which may act independently or concurrently. It is important to understand the basic operating principles of vacuum systems and how performance is affected by utilities, corrosion and erosion, fouling, and process conditions. Reputable vacuum-system suppliers have service engineers who will come to a refinery to survey the system and troubleshoot performance or offer suggestions for improvement. A skilled vacuum-system engineer may be needed to diagnose and remedy system problems. The effect of these variables on performance is discussed, and a case history is described of a vacuum system on a crude tower in a South American refinery.

  16. Document Concurrence System

    NASA Technical Reports Server (NTRS)

    Muhsin, Mansour; Walters, Ian

    2004-01-01

    The Document Concurrence System is a combination of software modules for routing users' expressions of concurrence with documents. This system enables determination of the current status of concurrences and eliminates the need for the prior practice of manually delivering paper documents to all persons whose approvals were required. This system runs on a server, and participants gain access via personal computers equipped with Web-browser and electronic-mail software. A user can begin a concurrence routing process by logging onto an administration module, naming the approvers and stating the sequence for routing among them, and attaching documents. The server then sends a message to the first person on the list. Upon concurrence by the first person, the system sends a message to the second person, and so forth. A person on the list indicates approval, places the documents on hold, or indicates disapproval, via a Web-based module. When the last person on the list has concurred, a message is sent to the initiator, who can then finalize the process through the administration module. A background process running on the server identifies concurrence processes that are overdue and sends reminders to the appropriate persons.
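
    The routing behavior described above amounts to a small state machine. The following Python sketch is a hypothetical reconstruction of that logic; the class and method names are invented, and the actual NASA system is a server-side web and e-mail application.

        # Hypothetical sketch of sequential concurrence routing (invented
        # names; stand-in 'notify' replaces the real system's e-mail).
        from dataclasses import dataclass, field

        @dataclass
        class ConcurrenceProcess:
            initiator: str
            approvers: list
            documents: list
            position: int = 0
            status: str = "in progress"
            log: list = field(default_factory=list)

            def notify(self, person, message):
                self.log.append(f"mail to {person}: {message}")

            def respond(self, decision):
                person = self.approvers[self.position]
                if decision == "concur":
                    self.position += 1
                    if self.position == len(self.approvers):
                        self.status = "complete"
                        self.notify(self.initiator, "all approvers have concurred")
                    else:
                        self.notify(self.approvers[self.position],
                                    "your concurrence is requested")
                elif decision == "hold":
                    self.status = f"on hold at {person}"
                else:
                    self.status = f"disapproved by {person}"
                    self.notify(self.initiator, self.status)

        p = ConcurrenceProcess("initiator", ["alice", "bob"], ["design-memo.pdf"])
        p.notify(p.approvers[0], "your concurrence is requested")
        for _ in p.approvers:
            p.respond("concur")
        print(p.status, p.log, sep="\n")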

  17. Planning as an Iterative Process

    NASA Technical Reports Server (NTRS)

    Smith, David E.

    2012-01-01

    Activity planning for missions such as the Mars Exploration Rover mission presents many technical challenges, including oversubscription, consideration of time, concurrency, resources, preferences, and uncertainty. These challenges have all been addressed by the research community to varying degrees, but significant technical hurdles still remain. In addition, the integration of these capabilities into a single planning engine remains largely unaddressed. However, I argue that there is a deeper set of issues that needs to be considered, namely the integration of planning into an iterative process that begins before the goals, objectives, and preferences are fully defined. This introduces a number of technical challenges for planning, including the ability to more naturally specify and utilize constraints on the planning process, the ability to generate multiple qualitatively different plans, and the ability to provide deep explanation of plans.

  18. Chip-to-chip entanglement of transmon qubits using engineered measurement fields

    NASA Astrophysics Data System (ADS)

    Dickel, C.; Wesdorp, J. J.; Langford, N. K.; Peiter, S.; Sagastizabal, R.; Bruno, A.; Criger, B.; Motzoi, F.; DiCarlo, L.

    2018-02-01

    While the on-chip processing power in circuit QED devices is growing rapidly, an open challenge is to establish high-fidelity quantum links between qubits on different chips. Here, we show entanglement between transmon qubits on different cQED chips with 49% concurrence and 73% Bell-state fidelity. We engineer a half-parity measurement by successively reflecting a coherent microwave field off two nearly identical transmon-resonator systems. By ensuring the measured output field does not distinguish |01> from |10>, unentangled superposition states are probabilistically projected onto entangled states in the odd-parity subspace. We use in situ tunability and an additional weakly coupled driving field on the second resonator to overcome imperfect matching due to fabrication variations. To demonstrate the flexibility of this approach, we also produce an even-parity entangled state of similar quality, by engineering the matching of outputs for the |00> and |11> states. The protocol is characterized over a range of measurement strengths using quantum state tomography, showing good agreement with a comprehensive theoretical model.
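
    For reference, the concurrence figure quoted above is the standard Wootters entanglement measure for two qubits; its textbook definition (standard material, not restated in the abstract) is

        C(\rho) = \max\left\{ 0,\; \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4 \right\},

    where $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \lambda_4$ are the square roots of the eigenvalues of $\rho\,(\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y)$; $C = 0$ for separable states and $C = 1$ for a maximally entangled Bell state.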

  19. Analysis of Aurora's Performance Simulation Engine for Three Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  20. The Design of a Primary Flight Trainer using Concurrent Engineering Concepts

    NASA Technical Reports Server (NTRS)

    Ladesic, James G.; Eastlake, Charles N.; Kietzmann, Nicholas H.

    1993-01-01

    Concurrent Engineering (CE) concepts seek to coordinate the expertise of various disciplines from initial design configuration selection through product disposal so that cost-efficient design solutions may be achieved. Integrating this methodology into an undergraduate design course sequence may provide a needed enhancement to engineering education. The Advanced Design Program (ADP) project at Embry-Riddle Aeronautical University (ERAU) is focused on developing recommendations for the general aviation Primary Flight Trainer (PFT) of the twenty-first century using methods of CE. This project, over the next two years, will continue synthesizing the collective knowledge of teams composed of engineering students along with students from other degree programs, their faculty, and key industry representatives. During the past year (Phase I), conventional trainer configurations that comply with current regulations and existing technologies have been evaluated. Phase I efforts have resulted in two baseline concepts: a high-wing, conventional design named Triton and a low-wing, mid-engine configuration called Viper. In the second and third years (Phases II and III), applications of advanced propulsion, advanced materials, and unconventional airplane configurations, along with military and commercial technologies that are anticipated to be within the economic range of general aviation by the year 2000, will be considered.

  1. Integrated Engineering Information Technology, FY93 accommplishments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  2. Bi-Level Integrated System Synthesis (BLISS) for Concurrent and Distributed Processing

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Altus, Troy D.; Phillips, Matthew; Sandusky, Robert

    2002-01-01

    The paper introduces a new version of the Bi-Level Integrated System Synthesis (BLISS) method intended for optimization of engineering systems conducted by distributed specialty groups working concurrently and using a multiprocessor computing environment. The method decomposes the overall optimization task into subtasks associated with disciplines or subsystems, in which the local design variables are numerous, and a single system-level optimization, whose design variables are relatively few. The subtasks are fully autonomous as to their inner operations and decision making. Their purpose is to eliminate the local design variables and generate a wide spectrum of feasible designs whose behavior is represented by response surfaces to be accessed by the system-level optimization. It is shown that, if the problem is convex, the solution of the decomposed problem is the same as that obtained without decomposition. A simplified example of an aircraft design shows the method working as intended. The paper includes a discussion of the method's merits and demerits and recommendations for further research.

  3. Continuous integration for concurrent MOOSE framework and application development on GitHub

    DOE PAGES

    Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...

    2015-11-20

    For the past several years, Idaho National Laboratory's MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project's development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.

  4. Continuous integration for concurrent MOOSE framework and application development on GitHub

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.

    For the past several years, Idaho National Laboratory's MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project's development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.

  5. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.
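
    A minimal Python analogue of this multi-analysis pattern is sketched below: a parent result supplies boundary conditions, and each child scenario runs concurrently in its own process. The scenario fields and the placeholder "physics" are invented for illustration; ADCIRC++ itself is a C++ re-implementation of ADCIRC.

        # Illustrative analogue of adaptive subdomain modeling: the parent
        # run supplies boundary conditions; child scenarios run concurrently.
        from concurrent.futures import ProcessPoolExecutor

        def run_child(args):
            scenario, boundary = args
            # Placeholder physics: each child applies its scenario change; a
            # real child would also adaptively grow its spatial extent if the
            # altered hydrodynamics reached the subdomain edge.
            response = sum(boundary) * scenario["levee_height"]
            return scenario["name"], response

        boundary_conditions = [0.2, 0.5, 0.9]   # from the full-scale parent run
        scenarios = [{"name": f"levee-{h}", "levee_height": h}
                     for h in (1.0, 1.5, 2.0)]

        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                jobs = [(s, boundary_conditions) for s in scenarios]
                for name, resp in pool.map(run_child, jobs):
                    print(name, resp)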

  6. Compacted graphite iron: Cast iron makes a comeback

    NASA Astrophysics Data System (ADS)

    Dawson, S.

    1994-08-01

    Although compacted graphite iron has been known for more than four decades, the absence of a reliable mass-production technique has resulted in relatively little effort to exploit its operational benefits. However, a proven on-line process control technology developed by SinterCast allows for series production of complex components in high-quality CGI. The improved mechanical properties of compacted graphite iron relative to conventional gray iron allow for substantial weight reduction in gasoline and diesel engines or substantial increases in horsepower, or an optimal combination of both. Concurrent with these primary benefits, CGI also provides significant emissions and fuel efficiency benefits allowing automakers to meet legislated performance standards. The operational and environmental benefits of compacted graphite iron together with its low cost and recyclability reinforce cast iron as a prime engineering material for the future.

  7. Future requirements in surface modeling and grid generation

    NASA Technical Reports Server (NTRS)

    Cosner, Raymond R.

    1995-01-01

    The past ten years have seen steady progress in surface modeling procedures, and wholesale changes in grid generation technology. Today, it seems fair to state that a satisfactory grid can be developed to model nearly any configuration of interest. The issues at present focus on operational concerns such as cost and quality. Continuing evolution of the engineering process is placing new demands on the technologies of surface modeling and grid generation. In the evolution toward a multidisciplinary, analysis-based design environment, methods developed for Computational Fluid Dynamics are finding acceptance in many additional applications. These two trends, the normal evolution of the process and a watershed shift toward concurrent and multidisciplinary analysis, will be considered in assessing current capabilities and needed technological improvements.

  8. A Model-Driven Approach to Teaching Concurrency

    ERIC Educational Resources Information Center

    Carro, Manuel; Herranz, Angel; Marino, Julio

    2013-01-01

    We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…

  9. Key technologies for manufacturing and processing sheet materials: A global perspective

    NASA Astrophysics Data System (ADS)

    Demeri, Mahmoud Y.

    2001-02-01

    Modern industrial technologies continue to seek new materials and processes to produce products that meet design and functional requirements. Sheet materials made from ferrous and non-ferrous metals, laminates, composites, and reinforced plastics constitute a large percentage of today’s products, components, and systems. Major manufacturers of sheet products include automotive, aerospace, appliance, and food-packaging industries. The Second Global Symposium on Innovations in Materials Processing & Manufacturing: Sheet Materials is organized to provide a forum for presenting advances in sheet processing and manufacturing by worldwide researchers and engineers from industrial, research, and academic centers. The symposium, sponsored by the TMS Materials Processing & Manufacturing Division (MPMD), was planned for the 2001 TMS Annual Meeting, New Orleans, Louisiana, February 11-15, 2001. This article is a review of key papers submitted for publication in the concurrent volume. The selected papers present significant developments in the rapidly expanding areas of advanced sheet materials, innovative forming methods, industrial applications, primary and secondary processing, composite processing, and numerical modeling of manufacturing processes.

  10. Concurrence control for transactions with priorities

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith

    1989-01-01

    Priority inversion occurs when a process is delayed by the actions of another process with less priority. With atomic transactions, the concurrency control mechanism can cause delays, and without taking priorities into account can be a source of priority inversion. Three traditional concurrency control algorithms are extended so that they are free from unbounded priority inversion.
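
    To make the failure mode concrete, the sketch below shows one standard remedy, priority inheritance, applied to a single lock in Python; it illustrates the problem the paper addresses rather than the three extended algorithms themselves, and all names are invented.

        # Sketch of priority inheritance on a single lock: a low-priority
        # holder temporarily inherits the priority of the highest waiter,
        # bounding the inversion (illustrative; not the paper's algorithms).
        class Transaction:
            def __init__(self, name, priority):
                self.name, self.base, self.effective = name, priority, priority

        class InheritanceLock:
            def __init__(self):
                self.holder, self.waiters = None, []

            def acquire(self, txn):
                if self.holder is None:
                    self.holder = txn
                    return True
                self.waiters.append(txn)
                # Priority inheritance: boost the holder past the top waiter.
                top = max(w.effective for w in self.waiters)
                self.holder.effective = max(self.holder.effective, top)
                return False                       # caller blocks / retries

            def release(self):
                txn = self.holder
                txn.effective = txn.base           # drop the inherited boost
                self.waiters.sort(key=lambda w: -w.effective)
                self.holder = self.waiters.pop(0) if self.waiters else None
                return txn

        lock = InheritanceLock()
        low, high = Transaction("low", 1), Transaction("high", 10)
        lock.acquire(low)
        lock.acquire(high)                         # high must wait...
        print(low.effective)                       # ...so low runs at priority 10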

  11. Concurrent Engineering through Product Data Standards

    DTIC Science & Technology

    1991-05-01

    standards, represents the power of a new industrial revolution. The role of the NIST National PDES Testbed, which provides technical leadership and a testing-based foundation for the development of STEP, is described.

  12. Fuel quantity modulation in pilot ignited engines

    DOEpatents

    May, Andrew

    2006-05-16

    An engine system includes a first fuel regulator adapted to control the amount of a first fuel supplied to the engine, a second fuel regulator adapted to control the amount of a second fuel supplied to the engine concurrently with the first fuel, and a controller coupled to at least the second fuel regulator. The controller is adapted to determine the amount of the second fuel in a relationship to the amount of the first fuel such that the first fuel ignites at a specified time during steady-state engine operation, and to determine, during transient engine operation, the amount of the second fuel in a manner different from that steady-state relationship.

  13. Challenges and Opportunities in Interdisciplinary Materials Research Experiences for Undergraduates

    NASA Astrophysics Data System (ADS)

    Vohra, Yogesh; Nordlund, Thomas

    2009-03-01

    The University of Alabama at Birmingham (UAB) offers a broad range of interdisciplinary materials research experiences to undergraduate students with diverse backgrounds in physics, chemistry, applied mathematics, and engineering. The research projects offered cover a broad range of topics, including high-pressure physics, microelectronic materials, nano-materials, laser materials, bioceramics and biopolymers, cell-biomaterials interactions, planetary materials, and computer simulation of materials. The students welcome the opportunity to work with an interdisciplinary team of basic science, engineering, and biomedical faculty, but the challenge lies in learning the key vocabulary for interdisciplinary collaborations, mastering experimental tools, and working in an independent capacity. The career development workshops dealing with the graduate school application process and entrepreneurial business activities were found to be most effective. The interdisciplinary university-wide poster session helped students broaden their horizons in research careers. The synergy of the REU program with other concurrently running high school summer programs on the UAB campus will also be discussed.

  14. The Effects of Concurrent Cognitive Load on Phonological Processing in Adults Who Stutter

    ERIC Educational Resources Information Center

    Jones, Robin M.; Fox, Robert A.; Jacewicz, Ewa

    2012-01-01

    Purpose: To determine whether phonological processing in adults who stutter (AWS) is disrupted by increased amounts of cognitive load in a concurrent attention-demanding task. Method: Nine AWS and 9 adults who do not stutter (AWNS) participated. Using a dual-task paradigm, the authors presented word pairs for rhyme judgments and, concurrently,…

  15. Specifying the behavior of concurrent systems

    NASA Technical Reports Server (NTRS)

    Furtek, F. C.

    1984-01-01

    A framework for rigorously specifying the behavior of concurrent systems is proposed. It is based on the view of a concurrent system as a collection of interacting processes, but no assumptions are made about the mechanisms for process synchronization and communication. A formal language is described that permits the expression of a broad range of logical and timing dependencies.

  16. Concurrency control for transactions with priorities

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith

    1989-01-01

    Priority inversion occurs when a process is delayed by the actions of another process with less priority. With atomic transactions, the concurrency control mechanism can cause delays, and without taking priorities into account can be a source of priority inversion. In this paper, three traditional concurrency control algorithms are extended so that they are free from unbounded priority inversion.

  17. Designing Microstructures/Structures for Desired Functional Material and Local Fields

    DTIC Science & Technology

    2015-12-02

    utilized to engineer multifunctional soft materials for multi-sensing, multi-actuating, human-machine interfaces. [3] Establish a theoretical framework... model for surface elasticity, (ii) derived a new type of Maxwell stress in soft materials due to quantum mechanical-elasticity coupling and... elucidated its ramification in engineering multifunctional soft materials, and (iii) demonstrated the possibility of concurrent magnetoelectricity and

  18. Summer Research Program - 1997 Summer Faculty Research Program Volume 6 Arnold Engineering Development Center United States Air Force Academy Air Logistics Centers

    DTIC Science & Technology

    1997-12-01

    Excerpted fragments reference "Fracture Analysis of the F-5, 15%-Spar Bolt" (Dr. Devendra Kumar, SAALC/LD; CUNY-City College, New York, NY), a report on a simple, multiversion concurrency control scheme (San Antonio Air Logistics Center, August 1997), and AFGROW, the Air Force crack propagation analysis program, Version 3.82 (1997).

  19. Lessons learned for composite structures

    NASA Technical Reports Server (NTRS)

    Whitehead, R. S.

    1991-01-01

    Lessons learned for composite structures are presented in three technology areas: materials, manufacturing, and design. In addition, future challenges for composite structures are presented. Composite materials have long gestation periods from the developmental stage to fully matured production status, and many examples exist of unsuccessful attempts to accelerate this gestation period. Experience has shown that technology transition of a new material system to fully matured production status is time consuming, involves risk, is expensive, and should not be undertaken lightly. The future challenges for composite materials require an intensification of the science-based approach to material development, extension of the vendor/customer interaction process to include all engineering disciplines of the end user, reduced material costs because they are a significant factor in overall part cost, and improved batch-to-batch pre-preg physical property control.

    Historical manufacturing lessons learned are presented using current in-service production structure as examples. Most producibility problems for these structures can be traced to their sequential engineering design, which placed excessive emphasis on design-to-weight and schedule at the expense of design-to-cost. This resulted in expensive, performance-originated designs, which required costly tooling and led to non-producible parts. Historically these problems have been allowed to persist throughout the production run. The current and future approach for the production of affordable composite structures mandates concurrent engineering design, where equal emphasis is placed on product and process design. Design for simplified assembly is also emphasized, since assembly costs account for a major portion of total airframe costs. The future challenge for composite manufacturing is, therefore, to utilize concurrent engineering in conjunction with automated manufacturing techniques to build affordable composite structures.

    Composite design experience has shown that significant weight savings have been achieved, outstanding fatigue and corrosion resistance have been demonstrated, and in-service performance has been very successful. Currently no structural design showstoppers exist for composite structures. A major lesson learned is that the full-scale static test is the key test for composites, since it is the primary structural 'hot spot' indicator. The major durability issue is supportability of thin-skinned structure. Impact damage has been identified as the most significant issue for the damage tolerance control of composite structures; however, delaminations induced during assembly operations have demonstrated a significant nuisance value.

    The future challenges for composite structures are threefold. First, the composite airframe weight fraction should increase to 60 percent. At the same time, the cost of composite structures must be reduced by 50 percent to attain the goal of affordability. To support these challenges, it is essential to develop lower cost materials and processes.

  20. Advanced main combustion chamber program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The topics presented are covered in viewgraph form and include the following: investment of low cost castings; usage of SSME program; usage of MSFC personnel for design effort; and usage of concurrent engineering techniques.

  1. Concurrent engineering design and management knowledge capture

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The topics are presented in viewgraph form and include the following: real-time management, personnel management, project management, conceptual design and decision making; the SITRF design problem; and the electronic-design notebook.

  2. The Jupiter-Io connection - An Alfven engine in space

    NASA Technical Reports Server (NTRS)

    Belcher, John W.

    1987-01-01

    Much has been learned about the electromagnetic interaction between Jupiter and its satellite Io from in situ observations. Io, in its motion through the Io plasma torus at Jupiter, continuously generates an Alfven wing that carries two billion kilowatts of power into the jovian ionosphere. Concurrently, Io is acted upon by a J x B force tending to propel it out of the jovian system. The energy source for these processes is the rotation of Jupiter. This unusual planet-satellite coupling serves as an archetype for the interaction of a large moving conductor with a magnetized plasma, a problem of general space and astrophysical interest.

  3. Fully Integral, Flexible Composite Driveshaft

    NASA Technical Reports Server (NTRS)

    Lawrie, Duncan

    2014-01-01

    An all-composite driveshaft incorporating integral flexible diaphragms was developed for prime contractor testing. This new approach makes obsolete the split lines required to attach metallic flex elements and either metallic or composite spacing tubes in current solutions. Subcritical driveshaft weights can be achieved that are half those of incumbent technology for typical rotary-wing shaft lengths. Spacing tubes form an integral part of the initial tooling but remain part of the finished shaft and control natural frequencies and torsional stability. A concurrently engineered manufacturing process and performance-driven design compete with incumbent solutions at significantly lower weight and with the probability of improved damage tolerance and fatigue life.

  4. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
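
    JPAX consumes specifications written in Maude's temporal logic; the Python sketch below only illustrates the general idea of an observer checking a property over an emitted event stream. The property and the event names are invented for illustration.

        # Illustrative monitor for the invented property "every 'acquire' is
        # followed by a 'release' before 'shutdown'" over an execution trace.
        def monitor(trace):
            held = set()
            for event, thread in trace:
                if event == "acquire":
                    held.add(thread)
                elif event == "release":
                    held.discard(thread)
                elif event == "shutdown" and held:
                    return f"violation: locks still held by {sorted(held)}"
            return "ok"

        trace = [("acquire", "T1"), ("acquire", "T2"),
                 ("release", "T1"), ("shutdown", None)]
        print(monitor(trace))      # violation: locks still held by ['T2']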

  5. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
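
    The core idea, reordering a design structure matrix so that feedback couplings (dependencies on tasks scheduled later) are minimized, can be sketched with a simple greedy heuristic. The sketch below is an invented illustration of that idea, not DeMAID's actual algorithm, and the task names are hypothetical.

        # Sketch of design-structure-matrix resequencing: greedily order
        # tasks to minimize feedback couplings (invented heuristic, not
        # DeMAID's algorithm).
        deps = {                      # deps[a] = tasks whose output feeds a
            "loads": {"aero"},
            "aero": {"geometry"},
            "structure": {"loads", "geometry"},
            "geometry": set(),
        }

        order = []
        remaining = set(deps)
        while remaining:
            # Prefer a task whose inputs are all scheduled; otherwise pick
            # the one with the fewest unscheduled inputs (a feedback cycle).
            task = min(remaining, key=lambda t: len(deps[t] - set(order)))
            order.append(task)
            remaining.remove(task)

        feedbacks = sum(1 for i, t in enumerate(order)
                        for d in deps[t] if d in order[i + 1:])
        print(order, "feedback couplings:", feedbacks)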

  6. Innovative Approaches to Fuel-Air Mixing and Combustion in Airbreathing Hypersonic Engines

    NASA Astrophysics Data System (ADS)

    MacLeod, C.

    This paper describes some innovative methods for achieving enhanced fuel-air mixing and combustion in Scramjet-like spaceplane engines. A multimodal approach to the problem is discussed; this involves using several concurrent methods of forced mixing. The paper concentrates on Electromagnetic Activation (EMA) and Electrostatic Attraction as suitable techniques for this purpose - although several other potential methods are also discussed. Previously published empirical data is used to draw conclusions about the likely effectiveness of the system and possible engine topologies are outlined.

  7. Concurrently adjusting interrelated control parameters to achieve optimal engine performance

    DOEpatents

    Jiang, Li; Lee, Donghoon; Yilmaz, Hakan; Stefanopoulou, Anna

    2015-12-01

    Methods and systems for real-time engine control optimization are provided. A value of an engine performance variable is determined, a value of a first operating condition and a value of a second operating condition of a vehicle engine are detected, and initial values for a first engine control parameter and a second engine control parameter are determined based on the detected first operating condition and the detected second operating condition. The initial values for the first engine control parameter and the second engine control parameter are adjusted based on the determined value of the engine performance variable to cause the engine performance variable to approach a target engine performance variable. In order to cause the engine performance variable to approach the target engine performance variable, adjusting the initial value for the first engine control parameter necessitates a corresponding adjustment of the initial value for the second engine control parameter.
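
    A toy sketch of the coupled-adjustment idea follows: climb a measured performance variable with respect to one parameter while a second parameter is adjusted in lockstep through a coupling relationship. The parameter names, response surface, and gains are invented, not taken from the patent.

        # Toy sketch of coupled control-parameter adjustment (invented
        # dynamics): gradient ascent on the measured performance in p1,
        # with p2 adjusted correspondingly per a coupling relationship.
        def performance(p1, p2):            # stand-in for a measured response
            return 10.0 - (p1 - 3.0) ** 2 - (p2 - 2.0 * p1) ** 2

        p1, p2, lr, eps = 1.0, 1.0, 0.05, 1e-4
        for _ in range(500):
            # finite-difference gradient of performance with respect to p1
            grad = (performance(p1 + eps, p2) -
                    performance(p1 - eps, p2)) / (2 * eps)
            p1 += lr * grad
            p2 = 2.0 * p1   # adjusting p1 necessitates the coupled p2 change
        print(f"p1={p1:.2f}, p2={p2:.2f}, performance={performance(p1, p2):.2f}")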

  8. Cost-engineering modeling to support rapid concept development of an advanced infrared satellite system

    NASA Astrophysics Data System (ADS)

    Bell, Kevin D.; Dafesh, Philip A.; Hsu, L. A.; Tsuda, A. S.

    1995-12-01

    Current architectural and design trade techniques often carry unaffordable alternatives late into the decision process. Early decisions made during the concept exploration and development (CE&D) phase drive the cost of a program more than those of any other phase of development; thus, designers must be able to assess both the performance and cost impacts of their early choices. The Space Based Infrared System (SBIRS) cost engineering model (CEM) described in this paper is an end-to-end process integrating engineering and cost expertise through commonly available spreadsheet software, allowing for concurrent design engineering and cost estimation to identify and balance system drivers and reduce acquisition costs. The automated interconnectivity between subsystem models using spreadsheet software allows for quick and consistent assessment of system design impacts and relative cost impacts due to requirement changes. It differs from most CEM efforts attempted in the past in that it incorporates more detailed spacecraft and sensor payload models, and it has been applied to determine the cost drivers for an advanced infrared satellite system acquisition. The CEM comprises integrated, detailed engineering and cost estimating relationships describing performance, design, and cost parameters. Detailed models have been developed to evaluate design parameters for the spacecraft bus and sensor; both step-starer and scanner sensor types incorporate models of the focal plane array, optics, processing, thermal control, communications, and mission performance. The current CEM effort has provided visibility into the requirements, design, and cost drivers for system architects and decision makers to determine the configuration of an infrared satellite architecture that meets essential requirements cost effectively. In general, the methodology described in this paper consists of process building blocks that can be tailored to the needs of many applications. Descriptions of the spacecraft and payload subsystem models provide insight into The Aerospace Corporation's expertise and the scope of the SBIRS concept development effort.

  9. MEMS product engineering: methodology and tools

    NASA Astrophysics Data System (ADS)

    Ortloff, Dirk; Popp, Jens; Schmidt, Thilo; Hahn, Kai; Mielke, Matthias; Brück, Rainer

    2011-03-01

    The development of MEMS comprises the structural design as well as the definition of an appropriate manufacturing process. Technology constraints have a considerable impact on the device design, and vice versa. Product design and technology development are therefore concurrent tasks. Based on a comprehensive methodology, the authors introduce a software environment that links commercial design tools from both areas into a common design flow. In this paper, emphasis is put on automatic, low-threshold data acquisition. The intention is to collect and categorize development data for further developments with minimum overhead and minimum disturbance of established business processes. As a first step, software tools that automatically extract data from spreadsheets or file systems and put them in context with existing information are presented. The developments are currently being carried out in a European research project.

  10. Real engineering in a virtual world

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deitz, D.

    1995-07-01

    VR technology can be thought of as the next point on a continuum that leads from 1-D data (such as the text and numbers on a finite element analysis printout), through 2-D drawings and 3-D solid models, to 4-D digital prototypes that eventually will have texture and weight and can be held in one's hand. If it lives up to its potential, VR could become just another tool--like 3-D CAD/CAM systems and FEA software--that can be used to pursue continuous improvements in design and manufacturing processes. For example, VR could help manufacturers reduce the number of prototypes and engineering change orders (ECOs) generated during the product life cycle. Virtual reality could also be used to promote concurrent engineering. Because realistic virtual models are easier to interpret and interrogate than 2-D drawings or even 3-D solid models, they have the potential to simplify design reviews. They could also make it easier for non-engineers (such as salespeople and potential customers) to contribute to the design process. VR technology still has a way to go before it becomes a standard engineering tool, however. Peripheral devices are still being perfected, and engineers seem to agree that the jury's still out on which peripherals are most appropriate for which applications. Further, advanced VR applications are largely confined to research and development departments of large corporations or to public and private research centers. Finally, potential users will have to wait a few years before desktop computers are powerful enough to run such applications--and inexpensive enough to survive a cost-benefit analysis.

  11. Computational Infrastructure for Engine Structural Performance Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Select computer codes developed over the years to simulate specific aspects of engine structures are described. These codes include blade impact, integrated multidisciplinary analysis and optimization, progressive structural fracture, quantification of uncertainties for structural reliability and risk, benefits estimation of new technology insertion, and hierarchical simulation of engine structures made from metal matrix and ceramic matrix composites. Collectively these codes constitute a unique infrastructure with the readiness to credibly evaluate new and future engine structural concepts throughout the development cycle: from initial concept, to design and fabrication, to service performance, maintenance, and repairs, and to retirement for cause and even possible recycling. Stated differently, they provide 'virtual' concurrent engineering for the total life-cycle cost of engine structures.

  12. Update on Integrated Optical Design Analyzer

    NASA Technical Reports Server (NTRS)

    Moore, James D., Jr.; Troy, Ed

    2003-01-01

    Updated information on the Integrated Optical Design Analyzer (IODA) computer program has become available. IODA was described in Software for Multidisciplinary Concurrent Optical Design (MFS-31452), NASA Tech Briefs, Vol. 25, No. 10 (October 2001), page 8a. To recapitulate: IODA facilitates multidisciplinary concurrent engineering of highly precise optical instruments. The architecture of IODA was developed by reviewing design processes and software in an effort to automate design procedures. IODA significantly reduces design iteration cycle time and eliminates many potential sources of error. IODA integrates the modeling efforts of a team of experts in different disciplines (e.g., optics, structural analysis, and heat transfer) working at different locations and provides seamless fusion of data among thermal, structural, and optical models used to design an instrument. IODA is compatible with data files generated by the NASTRAN structural-analysis program and the Code V (Registered Trademark) optical-analysis program, and can be used to couple analyses performed by these two programs. IODA supports multiple-load-case analysis for quickly accomplishing trade studies. IODA can also model the transient response of an instrument under the influence of dynamic loads and disturbances.

  13. Reduction of Left Visual Field Lexical Decision Accuracy as a Result of Concurrent Nonverbal Auditory Stimulation

    ERIC Educational Resources Information Center

    Van Strien, Jan W.

    2004-01-01

    To investigate whether concurrent nonverbal sound sequences would affect visual-hemifield lexical processing, lexical-decision performance of 24 strongly right-handed students (12 men, 12 women) was measured in three conditions: baseline, concurrent neutral sound sequence, and concurrent emotional sound sequence. With the neutral sequence,…

  14. The DAB model of drawing processes

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry W.

    1989-01-01

    The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.

  15. Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities

    NASA Astrophysics Data System (ADS)

    Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi

    2017-04-01

    Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years, and recent research has increasingly focused on integrated solutions: collaborative optimisation of geographical, just-in-time (JIT), quality (customer demand/satisfaction), and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management, employing system tools such as enterprise resource planning (ERP) and SCM information technology (IT) enablers to support enterprise integrated product development and concurrent engineering principles. This article draws on three organisation-theory perspectives to position its assumptions and proposes a feasible industry-specific framework not currently covered by the implementation level (level 4) of the SCOR model or by other existing SCM integration reference models, such as the MIT Process Handbook's Process Interchange Format (PIF) and the TOVE project, and one that could be replicated in other supply chains. The wider contribution of this paper, however, is a framework complementary to the Supply Chain Council's SCOR reference model. Quantitative closed-ended questionnaires, together with the main data collected from a qualitative real-life industrial pilot case study and extensive literature reviews, were used to propose a conceptual concurrent enterprise framework for SCM network activities. The research adopts a design structure matrix simulation analysis to propose an optimal, customised master data-management platform/portal for efficient information exchange in the enterprise SCM network and an effective structure for supply-chain (SC) network systems-design teams. Furthermore, social network theory analysis is employed in a triangulation approach with statistical correlation analysis to assess the frequency, importance, degree of collaboration, mutual trust, and roles and responsibilities within the technical communication network of the enterprise SCM systems product development (PD) design teams.

  16. Visual scanning with or without spatial uncertainty and time-sharing performance

    NASA Technical Reports Server (NTRS)

    Liu, Yili; Wickens, Christopher D.

    1989-01-01

    An experiment is reported that examines the pattern of task interference between visual scanning as a sequential and selective attention process and other concurrent spatial or verbal processing tasks. A distinction is proposed between visual scanning with or without spatial uncertainty regarding the possible differential effects of these two types of scanning on interference with other concurrent processes. The experiment required the subject to perform a simulated primary tracking task, which was time-shared with a secondary spatial or verbal decision task. The relevant information needed to perform the decision tasks was displayed with or without spatial uncertainty. The experiment employed a 2 x 2 x 2 design with type of scanning (with or without spatial uncertainty), expected scanning distance (low/high), and code of concurrent processing (spatial/verbal) as the three experimental factors. The results provide strong evidence that visual scanning as a spatial exploratory activity produces greater task interference with concurrent spatial tasks than with concurrent verbal tasks. Furthermore, spatial uncertainty in visual scanning is identified as the crucial factor in producing this differential effect.

  17. NREL Advancements in Methane Conversion Lead to Cleaner Air, Useful Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-01

    Researchers at NREL leveraged the recent on-site development of gas fermentation capabilities and novel genetic tools to directly convert methane to lactic acid using an engineered methanotrophic bacterium. The results provide proof-of-concept data for a gas-to-liquids bioprocess that concurrently produces fuels and chemicals from methane. NREL researchers developed genetic tools to express heterologous genes in methanotrophic organisms, which have historically been difficult to genetically engineer. Using these tools, researchers demonstrated microbial conversion of methane to lactate, a high-volume biochemical precursor predominantly utilized for the production of bioplastics. Methane biocatalysis offers a means to concurrently liquefy and upgrade natural gas and renewable biogas, enabling their utilization in conventional transportation and industrial manufacturing infrastructure. Producing chemicals and fuels from methane expands the suite of products currently generated from biorefineries, municipalities, and agricultural operations, with the potential to increase revenue and significantly reduce greenhouse gas emissions.

  18. History of the Fluids Engineering Division

    DOE PAGES

    Cooper, Paul; Martin, C. Samuel; O'Hern, Timothy J.

    2016-08-03

    The 90th Anniversary of the Fluids Engineering Division (FED) of ASME will be celebrated on July 10–14, 2016 in Washington, DC. The venue is ASME's Summer Heat Transfer Conference (SHTC), Fluids Engineering Division Summer Meeting (FEDSM), and International Conference on Nanochannels and Microchannels (ICNMM). The occasion is an opportune time to celebrate and reflect on the origin of FED and its predecessor—the Hydraulic Division (HYD), which existed from 1926–1963. Furthermore, the FED Executive Committee decided that it would be appropriate to publish concurrently a history of the HYD/FED.

  20. The CIRP International Workshop on Concurrent Engineering for Product Realization (1st) Held in Tokyo, Japan on June 27 - 28, 1992

    DTIC Science & Technology

    1992-11-01

  1. A 20k Payload Launch Vehicle Fast Track Development Concept Using an RD-180 Engine and a Centaur Upper Stage

    NASA Technical Reports Server (NTRS)

    Toelle, Ronald (Compiler)

    1995-01-01

    A launch vehicle concept to deliver 20,000 lb of payload to a 100-nmi orbit has been defined. A new liquid oxygen/kerosene booster powered by an RD-180 engine was designed while using a slightly modified Centaur upper stage. The design, development, and test program met the imposed 40-month schedule by eliminating major structural testing through increased factors of safety and by applying concurrent engineering concepts. A growth path to attain 65,000 lb of payload is developed.

  2. Finite Element Analysis in Concurrent Processing: Computational Issues

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Watson, Brian; Vanderplaats, Garrett

    2004-01-01

    The purpose of this research is to investigate the potential application of new methods for solving large-scale static structural problems on concurrent computers. It is well known that traditional single-processor computational speed will be limited by inherent physical limits. The only path to achieve higher computational speeds lies through concurrent processing. Traditional factorization solution methods for sparse matrices are ill suited for concurrent processing because the null entries get filled, leading to high communication and memory requirements. The research reported herein investigates alternatives to factorization that promise a greater potential to achieve high concurrent computing efficiency. Two methods, and their variants, based on direct energy minimization are studied: a) minimization of the strain energy using the displacement method formulation; b) constrained minimization of the complementary strain energy using the force method formulation. Initial results indicated that in the context of the direct energy minimization the displacement formulation experienced convergence and accuracy difficulties while the force formulation showed promising potential.
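
    As a rough illustration of the factorization-free idea described above (a minimal Python sketch, not the authors' formulation or code): the displacement-method variant amounts to minimizing the strain energy pi(u) = 0.5 u'Ku - f'u, whose minimizer satisfies Ku = f. A conjugate-gradient minimizer needs only matrix-vector products, so the sparse stiffness matrix is never factorized and never fills in, and the products can be distributed across processors:

        # Matrix-free conjugate-gradient minimization of the strain energy.
        import numpy as np

        def minimize_strain_energy(K_matvec, f, tol=1e-8, max_iter=1000):
            u = np.zeros_like(f)
            r = f - K_matvec(u)        # residual = negative energy gradient
            p = r.copy()
            rs = r @ r
            for _ in range(max_iter):
                Kp = K_matvec(p)
                alpha = rs / (p @ Kp)  # exact line search along direction p
                u += alpha * p
                r -= alpha * Kp
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return u

        # Toy 2-DOF system; K must be symmetric positive definite.
        K = np.array([[4.0, 1.0], [1.0, 3.0]])
        f = np.array([1.0, 2.0])
        u = minimize_strain_energy(lambda v: K @ v, f)
        print(u, np.linalg.norm(K @ u - f))  # residual near zero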

  3. Composite chronicles: A study of the lessons learned in the development, production, and service of composite structures

    NASA Technical Reports Server (NTRS)

    Vosteen, Louis F.; Hadcock, Richard N.

    1994-01-01

    A study of past composite aircraft structures programs was conducted to determine the lessons learned during the programs. The study focused on finding major underlying principles and practices that experience showed have significant effects on the development process and should be recognized and understood by those responsible for the use of composites. Published information on programs was reviewed and interviews were conducted with personnel associated with current and past major development programs. In all, interviews were conducted with about 56 people representing 32 organizations. Most of the people interviewed have been involved in the engineering and manufacturing development of composites for the past 20 to 25 years. Although composites technology has made great advances over the past 30 years, the effective application of composites to aircraft is still a complex problem that requires experienced personnel with special knowledge. All disciplines involved in the development process must work together in real time to minimize risk and assure total product quality and performance at acceptable costs. The most successful programs have made effective use of integrated, collocated, concurrent engineering teams, and most often used well-planned, systematic development efforts wherein the design and manufacturing processes are validated in a step-by-step or 'building block' approach. Such approaches reduce program risk and are cost effective.

  4. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1991-01-01

    The main contribution of the effort in the last two years is the introduction of the MOPPS system. After an extensive literature search, we introduced the system, which is described next. MOPPS employs a new solution to the problem of managing programs that solve scientific and engineering applications in a distributed processing environment. Autonomous computers cooperate efficiently in solving large scientific problems with this solution. MOPPS has the advantage of not assuming the presence of any particular network topology or configuration, computer architecture, or operating system. It imposes little overhead on network and processor resources while efficiently managing programs concurrently. The core of MOPPS is an intelligent program manager that builds a knowledge base of the execution performance of the parallel programs it is managing under various conditions. The manager applies this knowledge to improve the performance of future runs. The program manager learns from experience.

  5. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    PubMed

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
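
    One widely used prospective design tool of the kind this mini-review surveys is flux balance analysis, which poses steady-state metabolism as a linear program. The sketch below is a toy, with an invented two-metabolite network solved via scipy rather than a dedicated package such as COBRApy, and is not a model from the article:

        # Toy flux balance analysis: maximize the biomass flux subject to
        # steady-state mass balance S v = 0 and flux bounds.
        import numpy as np
        from scipy.optimize import linprog

        # Columns: uptake (-> A), conversion (A -> B), biomass drain (B ->)
        S = np.array([[1, -1,  0],    # mass balance of metabolite A
                      [0,  1, -1]])   # mass balance of metabolite B
        c = [0, 0, -1]                # linprog minimizes, so negate biomass
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10

        res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
        print(res.x)   # optimal flux distribution: [10, 10, 10]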

  6. Fluorine and sulfur simultaneously co-doped suspended graphene

    NASA Astrophysics Data System (ADS)

    Struzzi, C.; Sezen, H.; Amati, M.; Gregoratti, L.; Reckinger, N.; Colomer, J.-F.; Snyders, R.; Bittencourt, C.; Scardamaglia, M.

    2017-11-01

    Suspended graphene flakes are exposed simultaneously to fluorine and sulfur ions produced by the μ-wave plasma discharge of the SF6 precursor gas. The microscopic and spectroscopic analyses, performed by Raman spectroscopy, scanning electron microscopy and photoelectron spectromicroscopy, show the homogeneity in functionalization yield over the graphene flakes with F and S atoms covalently bonded to the carbon lattice. This promising surface shows potential for several applications ranging from biomolecule immobilization to lithium battery and hydrogen storage devices. The present co-doping process is an optimal strategy to engineer the graphene surface with a concurrent hydrophobic character, thanks to the fluorine atoms, and a high affinity with metal nanoparticles due to the presence of sulfur atoms.

  7. Method of producing gaseous products using a downflow reactor

    DOEpatents

    Cortright, Randy D; Rozmiarek, Robert T; Hornemann, Charles C

    2014-09-16

    Reactor systems and methods are provided for the catalytic conversion of liquid feedstocks to synthesis gases and other noncondensable gaseous products. The reactor systems include a heat exchange reactor configured to allow the liquid feedstock and gas product to flow concurrently in a downflow direction. The reactor systems and methods are particularly useful for producing hydrogen and light hydrocarbons from biomass-derived oxygenated hydrocarbons using aqueous phase reforming. The generated gases may find use as a fuel source for energy generation via PEM fuel cells, solid-oxide fuel cells, internal combustion engines, or gas turbine gensets, or may be used in other chemical processes to produce additional products. The gaseous products may also be collected for later use or distribution.

  8. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

    This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the AirCraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students, who view it as a significant advance over prior methods.

  9. Differences in results of analyses of concurrent and split stream-water samples collected and analyzed by the US Geological Survey and the Illinois Environmental Protection Agency, 1985-91

    USGS Publications Warehouse

    Melching, C.S.; Coupe, R.H.

    1995-01-01

    During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of concurrent samples were compared to the precision of the laboratory or field method used, and the differences in concentration between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all concentrations and properties except pH, and that differences between split samples were significant for all concentrations and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
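
    The statistical screening described above is straightforward to reproduce on paired data. A minimal sketch with invented numbers (not the USGS/IEPA measurements), assuming scipy is available:

        # Paired comparison of two agencies' analyses of split samples.
        import numpy as np
        from scipy import stats

        usgs = np.array([7.10, 6.85, 7.40, 7.00, 6.90, 7.30, 7.20, 7.50])  # e.g., pH
        iepa = np.array([7.30, 7.00, 7.50, 7.25, 7.20, 7.35, 7.32, 7.68])

        w_stat, w_p = stats.wilcoxon(usgs, iepa)   # nonparametric paired test
        t_stat, t_p = stats.ttest_rel(usgs, iepa)  # parametric paired t-test

        # Mean relative percentage difference: one screening measure for
        # deciding whether a statistically significant difference matters.
        mrpd = 100 * np.mean((usgs - iepa) / ((usgs + iepa) / 2))
        print(f"Wilcoxon p={w_p:.3f}, paired-t p={t_p:.3f}, MRPD={mrpd:.1f}%")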

  10. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
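
    The action-density measure lends itself to a simple computation: given time-stamped event records of the kind an online event recorder produces, count how many coded actions overlap each sampling instant. A hypothetical sketch (the event record format is invented; the FIT-System's actual codes and formats may differ):

        from typing import List, Tuple

        def action_density(events: List[Tuple[float, float]],
                           t_end: float, step: float = 1.0) -> List[int]:
            """Number of concurrently active actions at each sampling instant.

            events: (start_s, end_s) pairs for each recorded action."""
            density = []
            t = 0.0
            while t <= t_end:
                density.append(sum(1 for s, e in events if s <= t < e))
                t += step
            return density

        # Three overlapping actions; density peaks where actions overlap.
        events = [(0.0, 5.0), (3.0, 9.0), (8.0, 12.0)]
        print(action_density(events, t_end=12.0))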

  11. Concurrent information affects response inhibition processes via the modulation of theta oscillations in cognitive control networks.

    PubMed

    Chmielewski, Witold X; Mückschel, Moritz; Dippel, Gabriel; Beste, Christian

    2016-11-01

    Inhibiting responses is a challenge, where the outcome (partly) depends on the situational context. In everyday situations, response inhibition performance might be altered when irrelevant input is presented simultaneously with the information relevant for response inhibition. More specifically, irrelevant concurrent information may either brace or interfere with response-relevant information, depending on whether these inputs are redundant or conflicting. The aim of this study is to investigate neurophysiological mechanisms and the network underlying such modulations using EEG beamforming as method. The results show that in comparison to a baseline condition without concurrent information, response inhibition performance can be aggravated or facilitated by manipulating the extent of conflict via concurrent input. This depends on whether the requirement for cognitive control is high, as in conflicting trials, or whether it is low, as in redundant trials. In line with this, the total theta frequency power decreases in a right hemispheric orbitofrontal response inhibition network including the SFG, MFG, and SMA, when concurrent redundant information facilitates response inhibition processes. Vice versa, theta activity in a left-hemispheric response inhibition network (i.e., SFG, MFG, and IFG) increases, when conflicting concurrent information compromises response inhibition processes. We conclude that concurrent information bi-directionally shifts response inhibition performance and modulates the network architecture underlying theta oscillations which are signaling different levels of the need for cognitive control.

  12. Catalog of Training and Education Sources in Concurrent Engineering

    DTIC Science & Technology

    1989-11-01

  13. Air Force Engineering Research Initiation Grant Program

    DTIC Science & Technology

    1994-06-21

  14. An infrared flash contemporaneous with the gamma-rays of GRB 041219a.

    PubMed

    Blake, C H; Bloom, J S; Starr, D L; Falco, E E; Skrutskie, M; Fenimore, E E; Duchêne, G; Szentgyorgyi, A; Hornstein, S; Prochaska, J X; McCabe, C; Ghez, A; Konopacky, Q; Stapelfeldt, K; Hurley, K; Campbell, R; Kassis, M; Chaffee, F; Gehrels, N; Barthelmy, S; Cummings, J R; Hullinger, D; Krimm, H A; Markwardt, C B; Palmer, D; Parsons, A; McLean, K; Tueller, J

    2005-05-12

    The explosion that results in a cosmic gamma-ray burst (GRB) is thought to produce emission from two physical processes: the central engine gives rise to the high-energy emission of the burst through internal shocking, and the subsequent interaction of the flow with the external environment produces long-wavelength afterglows. Although observations of afterglows continue to refine our understanding of GRB progenitors and relativistic shocks, gamma-ray observations alone have not yielded a clear picture of the origin of the prompt emission nor details of the central engine. Only one concurrent visible-light transient has been found and it was associated with emission from an external shock. Here we report the discovery of infrared emission contemporaneous with a GRB, beginning 7.2 minutes after the onset of GRB 041219a (ref. 8). We acquired 21 images during the active phase of the burst, yielding early multi-colour observations. Our analysis of the initial infrared pulse suggests an origin consistent with internal shocks.

  15. Decibel: The Relational Dataset Branching System

    PubMed Central

    Maddox, Michael; Goehring, David; Elmore, Aaron J.; Madden, Samuel; Parameswaran, Aditya; Deshpande, Amol

    2017-01-01

    As scientific endeavors and data analysis become increasingly collaborative, there is a need for data management systems that natively support the versioning or branching of datasets to enable concurrent analysis, cleaning, integration, manipulation, or curation of data across teams of individuals. Common practice for sharing and collaborating on datasets involves creating or storing multiple copies of the dataset, one for each stage of analysis, with no provenance information tracking the relationships between these datasets. This results not only in wasted storage, but also makes it challenging to track and integrate modifications made by different users to the same dataset. In this paper, we introduce the Relational Dataset Branching System, Decibel, a new relational storage system with built-in version control designed to address these shortcomings. We present our initial design for Decibel and provide a thorough evaluation of three versioned storage engine designs that focus on efficient query processing with minimal storage overhead. We also develop an exhaustive benchmark to enable the rigorous testing of these and future versioned storage engine designs. PMID:28149668
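
    The core idea, storing deltas along a branch's ancestry so that versions share storage, can be sketched in a few lines. This toy model is inspired by the paper's problem statement, not taken from Decibel's actual storage engine designs:

        # Toy version graph: each commit stores only the records added
        # relative to its parent, so branches share ancestral storage.
        class VersionedDataset:
            def __init__(self):
                self.commits = {"root": (None, [])}   # id -> (parent, records)

            def commit(self, cid, parent, records):
                self.commits[cid] = (parent, list(records))

            def view(self, cid):
                """Materialize a branch as the union along its ancestry."""
                records = []
                while cid is not None:
                    parent, recs = self.commits[cid]
                    records = recs + records
                    cid = parent
                return records

        db = VersionedDataset()
        db.commit("clean", "root", [{"id": 1, "x": 3.2}])
        db.commit("branchA", "clean", [{"id": 2, "x": 4.1}])  # one analyst
        db.commit("branchB", "clean", [{"id": 3, "x": 5.0}])  # another, concurrently
        print(db.view("branchA"))  # sees root + clean + branchA, not branchB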

  16. Heterogeneous concurrent computing with exportable services

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy

    1995-01-01

    Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.

  17. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.

  18. Concurrent Engineering for Composites

    DTIC Science & Technology

    1991-10-01

  19. (Re)engineering Earth System Models to Expose Greater Concurrency for Ultrascale Computing: Practice, Experience, and Musings

    NASA Astrophysics Data System (ADS)

    Mills, R. T.

    2014-12-01

    As the high performance computing (HPC) community pushes towards the exascale horizon, the importance and prevalence of fine-grained parallelism in new computer architectures is increasing. This is perhaps most apparent in the proliferation of so-called "accelerators" such as the Intel Xeon Phi or NVIDIA GPGPUs, but the trend also holds for CPUs, where serial performance has grown slowly and effective use of hardware threads and vector units are becoming increasingly important to realizing high performance. This has significant implications for weather, climate, and Earth system modeling codes, many of which display impressive scalability across MPI ranks but take relatively little advantage of threading and vector processing. In addition to increasing parallelism, next generation codes will also need to address increasingly deep hierarchies for data movement: NUMA/cache levels, on node vs. off node, local vs. wide neighborhoods on the interconnect, and even in the I/O system. We will discuss some approaches (grounded in experiences with the Intel Xeon Phi architecture) for restructuring Earth science codes to maximize concurrency across multiple levels (vectors, threads, MPI ranks), and also discuss some novel approaches for minimizing expensive data movement/communication.
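
    In Python terms, the restructuring argument can be miniaturized as follows: replace explicit scalar loops with vectorized kernels (which map onto hardware vector units in compiled codes) and distribute independent chunks across worker processes (standing in for MPI ranks). A hedged sketch, not drawn from any Earth-system code:

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def clip_scalar(values):                  # one element at a time
            return [min(max(x, 0.0), 1.0) for x in values]

        def clip_vectorized(values):              # whole array at once
            return np.clip(values, 0.0, 1.0)

        def chunk_sum(chunk):                     # per-"rank" work item
            return clip_vectorized(chunk).sum()

        if __name__ == "__main__":
            q = np.random.uniform(-1.0, 2.0, size=1_000_000)
            chunks = np.array_split(q, 4)         # coarse-grained concurrency
            with ProcessPoolExecutor(max_workers=4) as pool:
                total = sum(pool.map(chunk_sum, chunks))
            # Fine-grained vectorization inside each chunk gives the same
            # answer as a single vectorized pass over the whole array.
            assert np.isclose(total, clip_vectorized(q).sum())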

  20. Challenges in Teaching Modern Manufacturing Technologies

    ERIC Educational Resources Information Center

    Ngaile, Gracious; Wang, Jyhwen; Gau, Jenn-Terng

    2015-01-01

    Teaching of manufacturing courses for undergraduate engineering students has become a challenge due to industrial globalisation coupled with influx of new innovations, technologies, customer-driven products. This paper discusses development of a modern manufacturing course taught concurrently in three institutions where students collaborate in…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The U.S. Department of Energy's (DOE) Co-Optimization of Fuels & Engines (Co-Optima) initiative is accelerating the introduction of affordable, scalable, and sustainable fuels and high-efficiency, low-emission engines with a first-of-its-kind effort to simultaneously tackle fuel and engine research and development (R&D). This report summarizes accomplishments in the first year of the project. Co-Optima is conducting concurrent research to identify the fuel properties and engine design characteristics needed to maximize vehicle performance and affordability, while deeply cutting emissions. Nine national laboratories - the National Renewable Energy Laboratory and Argonne, Idaho, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest, and Sandia National Laboratories - are collaborating with industry and academia on this groundbreaking research.

  2. Method for reducing peak phase current and decreasing starting time for an internal combustion engine having an induction machine

    DOEpatents

    Amey, David L.; Degner, Michael W.

    2002-01-01

    A method for reducing the starting time and reducing the peak phase currents for an internal combustion engine that is started using an induction machine starter/alternator. The starting time is reduced by pre-fluxing the induction machine and the peak phase currents are reduced by reducing the flux current command after a predetermined period of time has elapsed and concurrent to the application of the torque current command. The method of the present invention also provides a strategy for anticipating the start command for an internal combustion engine and determines a start strategy based on the start command and the operating state of the internal combustion engine.
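
    The claimed sequence can be caricatured as a command schedule: a full flux command with zero torque command during a pre-flux interval, then a reduced flux command applied concurrently with the torque command. The sketch below is a hypothetical illustration with invented per-unit values and timings, not figures from the patent:

        def start_commands(t, preflux_time=0.2):
            """Return (flux_current_cmd, torque_current_cmd) in per-unit at time t."""
            if t < preflux_time:
                return 1.0, 0.0   # pre-flux the machine; no torque yet
            # Reduced flux command applied concurrently with the torque
            # command, limiting the peak phase current during cranking.
            return 0.6, 1.0

        for t in (0.0, 0.1, 0.2, 0.5):
            print(t, start_commands(t))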

  3. Examination of Various Methods Used in Support of Concurrent Engineering

    DTIC Science & Technology

    1990-03-01

  4. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subsystem optimizations that may be executed concurrently, plus a system-level optimization that coordinates the subsystem optimizations. The BLISS method and its variants are well suited to exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including local sensitivity analysis, local optimization, and response surface construction and updating, are ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
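
    The decomposition lends itself naturally to a task-parallel implementation. The sketch below is a generic stand-in for the idea (independent subsystem optimizations dispatched concurrently, coordinated by a system-level step), not the BLISS algorithm itself; the quadratic subproblems are invented:

        from concurrent.futures import ProcessPoolExecutor
        from scipy.optimize import minimize_scalar

        def subsystem(task):
            """Each subsystem minimizes its own objective for a fixed
            system-level parameter z."""
            z, coupling = task
            res = minimize_scalar(lambda x: (x - z) ** 2 + coupling * x)
            return res.x, res.fun

        if __name__ == "__main__":
            z = 1.0
            tasks = [(z, 0.5), (z, -0.3), (z, 0.1)]   # three subsystems
            with ProcessPoolExecutor() as pool:
                optima = list(pool.map(subsystem, tasks))
            # A system-level coordinator would now update z from the
            # subsystem optima and iterate until convergence.
            print(optima)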

  5. Uncovering the Problem-Solving Process: Cued Retrospective Reporting Versus Concurrent and Retrospective Reporting

    ERIC Educational Resources Information Center

    van Gog, Tamara; Paas, Fred; Merrienboer, Jeroen J. G.; Witte, Puk

    2005-01-01

    This study investigated the amounts of problem-solving process information ("action," "why," "how," and "metacognitive") elicited by means of concurrent, retrospective, and cued retrospective reporting. In a within-participants design, 26 participants completed electrical circuit troubleshooting tasks under different reporting conditions. The…

  6. Parametrically driven hybrid qubit-photon systems: Dissipation-induced quantum entanglement and photon production from vacuum

    NASA Astrophysics Data System (ADS)

    Remizov, S. V.; Zhukov, A. A.; Shapiro, D. S.; Pogosov, W. V.; Lozovik, Yu. E.

    2017-10-01

    We consider a dissipative evolution of a parametrically driven qubit-cavity system under the periodic modulation of coupling energy between two subsystems, which leads to the amplification of counter-rotating processes. We reveal a very rich dynamical behavior of this hybrid system. In particular, we find that the energy dissipation in one of the subsystems can enhance quantum effects in another subsystem. For instance, optimal cavity decay assists the stabilization of entanglement and quantum correlations between qubits even in the steady state and the compensation of finite qubit relaxation. On the contrary, energy dissipation in the qubit subsystem results in enhanced photon production from vacuum for strong modulation but destroys both quantum concurrence and quantum mutual information between qubits. Our results provide deeper insights into nonstationary cavity quantum electrodynamics in the context of quantum information processing and might be of importance for dissipative quantum state engineering.

  7. Improved Signal Processing Technique Leads to More Robust Self Diagnostic Accelerometer System

    NASA Technical Reports Server (NTRS)

    Tokars, Roger; Lekki, John; Jaros, Dave; Riggs, Terrence; Evans, Kenneth P.

    2010-01-01

    The self diagnostic accelerometer (SDA) is a sensor system designed to actively monitor the health of an accelerometer. In this case an accelerometer is considered healthy if it can be determined that it is operating correctly and its measurements may be relied upon. The SDA system accomplishes this by actively monitoring the accelerometer for a variety of failure conditions including accelerometer structural damage, an electrical open circuit, and most importantly accelerometer detachment. In recent testing of the SDA system in emulated engine operating conditions it has been found that a more robust signal processing technique was necessary. An improved accelerometer diagnostic technique and test results of the SDA system utilizing this technique are presented here. Furthermore, the real time, autonomous capability of the SDA system to concurrently compensate for effects from real operating conditions such as temperature changes and mechanical noise, while monitoring the condition of the accelerometer health and attachment, will be demonstrated.

  8. Effects of concurrent cognitive processing on the fluency of word repetition: comparison between persons who do and do not stutter.

    PubMed

    Bosshardt, Hans-Georg

    2002-01-01

    This study investigated how silent reading and word memorization may affect the fluency of concurrently repeated words. The words silently read or memorized were phonologically similar or dissimilar to the words of the repetition task. Fourteen adults who stutter and 16 who do not participated in the experiment. The two groups were matched for age, education, sex, forward and backward memory span, and vocabulary. It was found that the disfluencies of persons who stutter significantly increased during word repetition when similar words were read or memorized concurrently. In contrast, the disfluencies of persons who do not stutter were not significantly affected by either secondary task. These results indicate that the speech of persons who stutter is more sensitive to interference from concurrently performed cognitive processing than that of nonstuttering persons. It is proposed that the phonological and articulatory systems of persons who stutter are protected less efficiently from interference by attention-demanding processing within the central executive system. Alternative interpretations are also discussed. Readers will learn how modern speech production theories and the concept of modularity can account for stuttering, and will be able to explain the greater vulnerability of stutterers' speech fluency to concurrent cognitive processing.

  9. Methodologies and systems for heterogeneous concurrent computing

    NASA Technical Reports Server (NTRS)

    Sunderam, V. S.

    1994-01-01

    Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.

  10. CO2 fixation by anaerobic non-photosynthetic mixotrophy for improved carbon conversion.

    PubMed

    Jones, Shawn W; Fast, Alan G; Carlson, Ellinor D; Wiedel, Carrissa A; Au, Jennifer; Antoniewicz, Maciek R; Papoutsakis, Eleftherios T; Tracy, Bryan P

    2016-09-30

    Maximizing the conversion of biogenic carbon feedstocks into chemicals and fuels is essential for fermentation processes, as feedstock costs and processing are commonly the greatest operating expense. Unfortunately, for most fermentations, over one-third of sugar carbon is lost to CO2 due to the decarboxylation of pyruvate to acetyl-CoA and limitations in the reducing power of the bio-feedstock. Here we show that anaerobic, non-photosynthetic mixotrophy, defined as the concurrent utilization of organic (for example, sugars) and inorganic (for example, CO2) substrates in a single organism, can overcome these constraints to increase product yields and reduce overall CO2 emissions. As a proof-of-concept, Clostridium ljungdahlii was engineered to produce acetone and achieved a mass yield 138% of the previous theoretical maximum using a high cell density continuous fermentation process. In addition, when enough reductant (that is, H2) is provided, the fermentation emits no CO2. Finally, we show that mixotrophy is a general trait among acetogens.

  11. Reconciling pairs of concurrently used clinical practice guidelines using Constraint Logic Programming.

    PubMed

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
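
    A miniature of the idea, using brute-force constraint satisfaction rather than a CLP engine, and with deliberately simplified clinical content for illustration only: each guideline contributes constraints over shared treatment variables, an empty solution set signals a point of contention, and a non-empty one suggests a mitigation.

        from itertools import product

        # Invented toy encoding, not the authors' guideline models.
        variables = {"aspirin": (True, False), "ppi": (True, False)}

        def ulcer_ok(plan):   # duodenal ulcer CPG: aspirin only with a PPI
            return (not plan["aspirin"]) or plan["ppi"]

        def tia_ok(plan):     # TIA CPG: an antiplatelet agent is required
            return plan["aspirin"]

        plans = [dict(zip(variables, combo))
                 for combo in product(*variables.values())]
        solutions = [p for p in plans if ulcer_ok(p) and tia_ok(p)]
        print(solutions or "point of contention: no plan satisfies both CPGs")
        # -> [{'aspirin': True, 'ppi': True}]: the mitigation is to add a PPI.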

  12. Influences on Sexual Partnering Among African American Adolescents With Concurrent Sexual Relationships

    PubMed Central

    Reed, Sarah J.; Bangi, Audrey; Sheon, Nicolas; Harper, Gary W.; Catania, Joseph A.; Richards, Kimberly A. M.; Dolcini, M. Margaret; Boyer, Cherrie B.

    2012-01-01

    Adolescents often engage in concurrent sexual partnerships as part of a developmental process of gaining experience with sexuality. The authors qualitatively examined patterns of concurrency and variation in normative and motivational influences on this pattern of sexual partnering among African American adolescents (31 males; 20 females), ages 15 to 17 years. Using content analysis, gender and contextual differences in social norms and motivations for concurrency were explored. Findings describe the normative influences on adolescent males and females with regard to sexual concurrency and the transfer of these norms from one generation to the next. PMID:22505843

  13. Concurrent System Engineering and Risk Reduction for Dual-Band (RF/optical) Spacecraft Communications

    NASA Technical Reports Server (NTRS)

    Fielhauer, Karl, B.; Boone, Bradley, G.; Raible, Daniel, E.

    2012-01-01

    This paper describes a system engineering approach to examining the potential for combining elements of a deep-space RF and optical communications payload, for the purpose of reducing the size, weight and power burden on the spacecraft and the mission. Figures of merit and analytical methodologies are discussed to conduct trade studies, and several potential technology integration strategies are presented. Finally, the NASA Integrated Radio and Optical Communications (iROC) project is described, which directly addresses the combined RF and optical approach.

  14. Infusion of a Gaming Paradigm into Computer-Aided Engineering Design Tools

    DTIC Science & Technology

    2012-05-03

    Virtual Test Bed (VTB), and the gaming tool, Unity3D. This hybrid gaming environment coupled a three-dimensional (3D) multibody vehicle system model ... from Google Earth to the 3D visual front-end fabricated around Unity3D. The hybrid environment was sufficiently developed to support analyses of the ... The VTB simulation of the vehicle dynamics ran concurrently with, and interacted with, the gaming engine Unity3D.

  15. Stacking transgenes in forest trees.

    PubMed

    Halpin, Claire; Boerjan, Wout

    2003-08-01

    Huge potential exists for improving plant raw materials and foodstuffs via metabolic engineering. To date, progress has mostly been limited to modulating the expression of single genes of well-studied pathways, such as the lignin biosynthetic pathway, in model species. However, a recent report illustrates a new level of sophistication in metabolic engineering by overexpressing one lignin enzyme while simultaneously suppressing the expression of another lignin gene in a tree, aspen. This novel approach to multi-gene manipulation has succeeded in concurrently improving several wood-quality traits.

  16. Time takes space: selective effects of multitasking on concurrent spatial processing.

    PubMed

    Mäntylä, Timo; Coni, Valentina; Kubik, Veit; Todorov, Ivo; Del Missier, Fabio

    2017-08-01

    Many everyday activities require coordination and monitoring of complex relations of future goals and deadlines. Cognitive offloading may provide an efficient strategy for reducing control demands by representing future goals and deadlines as a pattern of spatial relations. We tested the hypothesis that multiple-task monitoring involves time-to-space transformational processes, and that these spatial effects are selective with greater demands on coordinate (metric) than categorical (nonmetric) spatial relation processing. Participants completed a multitasking session in which they monitored four series of deadlines, running on different time scales, while making concurrent coordinate or categorical spatial judgments. We expected and found that multitasking taxes concurrent coordinate, but not categorical, spatial processing. Furthermore, males showed a better multitasking performance than females. These findings provide novel experimental evidence for the hypothesis that efficient multitasking involves metric relational processing.

  17. Concurrent Data Elicitation Procedures, Processes, and the Early Stages of L2 Learning: A Critical Overview

    ERIC Educational Resources Information Center

    Leow, Ronald P.; Grey, Sarah; Marijuan, Silvia; Moorman, Colleen

    2014-01-01

    Given the current methodological interest in eliciting direct data on the cognitive processes L2 learners employ as they interact with L2 data during the early stages of the learning process, this article takes a critical and comparative look at three concurrent data elicitation procedures currently employed in the SLA literature: Think aloud (TA)…

  18. The nuclear thermal electric rocket: a proposed innovative propulsion concept for manned interplanetary missions

    NASA Astrophysics Data System (ADS)

    Dujarric, C.; Santovincenzo, A.; Summerer, L.

    2013-03-01

    Conventional propulsion technology (chemical and electric) currently limits the possibilities for human space exploration to the neighborhood of the Earth. If farther destinations (such as Mars) are to be reached with humans on board, a more capable interplanetary transfer engine featuring high thrust and high specific impulse is required. The source of energy which could in principle best meet these engine requirements is nuclear thermal. However, nuclear thermal rocket technology is not yet ready for flight application. The development of the new materials necessary for the nuclear core will require further ground testing of full-scale nuclear rocket engines. Such testing is a powerful inhibitor to nuclear rocket development, as the risks of nuclear contamination of the environment cannot be entirely avoided with current concepts. Alongside more mature activities in the field of space nuclear power sources for generating on-board power, a low-level investigation into nuclear propulsion has long been running within ESA, and innovative concepts were already proposed at an IAF conference in 1999 [1, 2]. Following a slow maturation process, a new concept was defined and submitted to a concurrent design exercise at ESTEC in 2007. Great care was taken in the selection of the design parameters to ensure that this quite innovative concept would in all respects likely be feasible with margins. However, a thorough feasibility demonstration will require a more detailed design, including the selection of appropriate materials and the verification that these can withstand the expected mechanical, thermal, and chemical environment. So far, the predefinition work made clear that, based on conservative technology assumptions, a specific impulse of 920 s could be obtained with a thrust of 110 kN. Despite the heavy engine dry mass, a preliminary mission analysis using conservative assumptions showed that the concept reduces the required Initial Mass in Low Earth Orbit compared to conventional nuclear thermal rockets for a human mission to Mars. Of course, the realization of this concept still requires proper engineering and the dimensioning of quite unconventional machinery. A patent was filed on the concept. Because of the operating parameters of the nuclear core, which are very specific to this type of concept, it seems possible to test this kind of engine on ground at full scale in closed loop, using a reasonably sized test facility with safe and clean conditions. Such tests can be conducted within a fully confined enclosure, which would substantially increase the associated inherent nuclear safety levels. This breakthrough removes a showstopper for nuclear rocket engine development. The present paper will disclose the NTER (Nuclear Thermal Electric Rocket) engine concept, present some of the results of the ESTEC concurrent engineering exercise, and explain the concept for the NTER on-ground testing facility. Regulations and safety issues related to the development and implementation of the NTER concept will be addressed as well.
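
    As a back-of-envelope check of the quoted performance (our arithmetic, not a figure from the paper): the effective exhaust velocity implied by the stated specific impulse is

        v_e = I_sp * g_0 = 920 s x 9.81 m/s^2 ≈ 9.0 km/s,

    roughly twice the ~4.4 km/s of a cryogenic chemical engine (I_sp ≈ 450 s), which is why the concept can reduce the Initial Mass in Low Earth Orbit for a Mars mission despite the heavy engine dry mass.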

  19. RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks

    DTIC Science & Technology

    2016-10-09

    Robotic tasks are becoming increasingly complex, and with them the robotic systems. This requires new tools to manage this complexity and to ... execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept ...

  20. Use of exocentric and egocentric representations in the concurrent planning of sequential saccades.

    PubMed

    Sharika, K M; Ramakrishnan, Arjun; Murthy, Aditya

    2014-11-26

    The concurrent planning of sequential saccades offers a simple model to study the nature of visuomotor transformations since the second saccade vector needs to be remapped to foveate the second target following the first saccade. Remapping is thought to occur through egocentric mechanisms involving an efference copy of the first saccade that is available around the time of its onset. In contrast, an exocentric representation of the second target relative to the first target, if available, can be used to directly code the second saccade vector. While human volunteers performed a modified double-step task, we examined the role of exocentric encoding in concurrent saccade planning by shifting the first target location well before the efference copy could be used by the oculomotor system. The impact of the first target shift on concurrent processing was tested by examining the end-points of second saccades following a shift of the second target during the first saccade. The frequency of second saccades to the old versus new location of the second target, as well as the propagation of first saccade localization errors, both indices of concurrent processing, were found to be significantly reduced in trials with the first target shift compared to those without it. A similar decrease in concurrent processing was obtained when we shifted the first target but kept constant the second saccade vector. Overall, these results suggest that the brain can use relatively stable visual landmarks, independent of efference copy-based egocentric mechanisms, for concurrent planning of sequential saccades.

  1. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers, which, when coupled to current CPU power, enable new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information, some global in character, while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

  2. Implementing model-based system engineering for the whole lifecycle of a spacecraft

    NASA Astrophysics Data System (ADS)

    Fischer, P. M.; Lüdtke, D.; Lange, C.; Roshani, F.-C.; Dannemann, F.; Gerndt, A.

    2017-09-01

    Design information of a spacecraft is collected over all phases in the lifecycle of a project. Much of this information is exchanged between different engineering tasks and business processes. In some lifecycle phases, model-based system engineering (MBSE) has introduced system models and databases that help to organize such information and to keep it consistent for everyone. Nevertheless, none of the existing databases has yet covered the whole lifecycle. Virtual Satellite is the MBSE database developed at DLR. It has been used for quite some time in Phase A studies and is currently being extended to cover the whole lifecycle of spacecraft projects. Since it is unforeseeable which future use cases such a database will need to support across all these different projects, the underlying data model has to provide tailoring and extension mechanisms for its conceptual data model (CDM). This paper explains these mechanisms as they are implemented in Virtual Satellite, which enable extending the CDM along the project without corrupting already stored information. As an upcoming major use case, Virtual Satellite will be implemented as the MBSE tool in the S2TEP project. This project provides a new satellite bus for internal research and several different payload missions in the future. This paper explains how Virtual Satellite will be used to manage the configuration control problems associated with such a multi-mission platform. It discusses how the S2TEP project starts using the software for collecting the first design information from concurrent engineering studies, then makes use of the extension mechanisms of the CDM to introduce further information artefacts such as a functional electrical architecture, thus linking more and more processes into an integrated MBSE approach.

  3. C formal verification with unix communication and concurrency

    NASA Technical Reports Server (NTRS)

    Hoover, Doug N.

    1990-01-01

    This paper presents the results of a NASA SBIR project in which CSP-Ariel, a verification system for C programs that use Unix system calls for concurrent programming, interprocess communication, and file input and output, was developed. The project builds on ORA's Ariel C verification system by using the formalism of Hoare's book, Communicating Sequential Processes (CSP), to model concurrency and communication. The system runs in ORA's Clio theorem proving environment. The use of CSP to model Unix concurrency is outlined, and the CSP semantics of a simple concurrent program is sketched. Plans for further development of CSP-Ariel are discussed. This paper is presented in viewgraph form.
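    As an illustrative aside, the kind of C/Unix pattern such a verifier targets can be sketched in a few lines; the example below is a minimal Python stand-in (no part of CSP-Ariel) showing a parent and child process communicating over a pipe, which CSP would model as two processes synchronizing on a channel.

```python
import os

# Minimal sketch of pipe-based interprocess communication (a Python stand-in
# for the C/Unix programs CSP-Ariel verifies; names are illustrative).
r, w = os.pipe()
pid = os.fork()                   # Unix-only, like the system calls in question
if pid == 0:                      # child: the CSP process "producer"
    os.close(r)
    os.write(w, b"42\n")          # channel output, roughly producer!42 in CSP
    os.close(w)
    os._exit(0)
else:                             # parent: the CSP process "consumer"
    os.close(w)
    data = os.read(r, 16)         # channel input, roughly consumer?x in CSP
    os.close(r)
    os.waitpid(pid, 0)
    print("received:", data.decode().strip())
```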

  4. Economical launching and accelerating control strategy for a single-shaft parallel hybrid electric bus

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Song, Jian; Li, Liang; Li, Shengbo; Cao, Dongpu

    2016-08-01

    This paper presents an economical launching and accelerating mode comprising four ordered phases: pure electrical driving, clutch engagement and engine start-up, engine active charging, and engine driving. The mode suits the alternating conditions of typical city-bus driving scenarios and improves the fuel economy of a hybrid electric bus (HEB). By utilizing the fast response of the electric motor (EM), an adaptive EM controller is designed to meet the power demand during the pure electrical driving, engine starting, and engine active charging modes. Concurrently, the smoothness issues induced by the sequential mode transitions are resolved with a coordinated control logic for the engine, EM, and clutch. Simulation and experimental results show that the proposed launching and accelerating mode and its control methods are effective in improving fuel economy and ensuring drivability during the fast transitions between the operating modes of the HEB.

  5. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
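    As a rough illustration of the token-game semantics underlying such models (the places and transitions below are invented, not taken from the paper), a Petri net can be simulated by repeatedly firing any transition whose input places all hold enough tokens:

```python
import random

# Toy Petri-net simulator: a transition is enabled when every input place
# holds the required tokens; firing consumes and produces tokens. Multiple
# enabled transitions model operations that could proceed concurrently.
marking = {"input_ready": 2, "proc_A_free": 1, "proc_B_free": 1, "done": 0}

transitions = {
    # name: (tokens consumed, tokens produced)
    "run_on_A": ({"input_ready": 1, "proc_A_free": 1},
                 {"done": 1, "proc_A_free": 1}),
    "run_on_B": ({"input_ready": 1, "proc_B_free": 1},
                 {"done": 1, "proc_B_free": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

while any(enabled(t) for t in transitions):
    t = random.choice([t for t in transitions if enabled(t)])
    fire(t)
    print(f"fired {t}, marking = {marking}")
```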

  6. Concurrent tailoring of fabrication process and interphase layer to reduce residual stresses in metal matrix composites

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.; Chamis, C. C.; Morel, M.

    1991-01-01

    A methodology is presented to reduce the residual matrix stresses in continuous fiber metal matrix composites (MMC) by optimizing the fabrication process and interphase layer characteristics. The response of the fabricated MMC was simulated based on nonlinear micromechanics. Application cases include fabrication tailoring, interphase tailoring, and concurrent fabrication-interphase optimization. Two composite systems, silicon carbide/titanium and graphite/copper, are considered. Results illustrate the merits of each approach, indicate that concurrent fabrication/interphase optimization produces significant reductions in the matrix residual stresses, and demonstrate the strong coupling between fabrication and interphase tailoring.

  7. The Rapid Response Radiation Survey (R3S) Mission Using the HISat Conformal Satellite Architecture

    NASA Technical Reports Server (NTRS)

    Miller, Nathanael

    2015-01-01

    The Rapid Response Radiation Survey (R3S) experiment, designed as a quick turnaround mission to make radiation measurements in LEO, will fly as a hosted payload in partnership with NovaWurks using their Hyper-integrated Satlet (HiSat) architecture. The need for the mission arises as the Nowcast of Atmospheric Ionization Radiation for Aviation Safety (NAIRAS) model moves from a research effort into an operational radiation assessment tool. The data collected by R3S, in addition to the complementary data from a NASA Langley Research Center (LaRC) atmospheric balloon mission entitled Radiation Dosimetry Experiment (RaDX), will validate exposure prediction capabilities of NAIRAS. This paper discusses the development of the R3S experiment as made possible by use of the HiSat architecture. The system design and operational modes of the experiment are described, as well as the experiment interfaces to the HiSat satellite via the user defined adapter (UDA) provided by NovaWurks. This paper outlines the steps taken by the project to execute the R3S mission in the 4 months of design, build, and test. Finally, description of the engineering process is provided, including the use of facilitated rapid/concurrent engineering sessions, the associated documentation, and the review process employed.

  8. Modeling of Broadband Liners Applied to the Advanced Noise Control Fan

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.

    2015-01-01

    The broadband component of fan noise has grown in relevance with an increase in bypass ratio and incorporation of advanced fan designs. Therefore, while the attenuation of fan tones remains a major factor in engine nacelle acoustic liner design, the simultaneous reduction of broadband fan noise levels has received increased interest. As such, a previous investigation focused on improvements to an established broadband acoustic liner optimization process using the Advanced Noise Control Fan (ANCF) rig as a demonstrator. Constant-depth, double-degree of freedom and variable-depth, multi-degree of freedom liner designs were carried through design, fabrication, and testing. This paper addresses a number of areas for further research identified in the initial assessment of the ANCF study. Specifically, incident source specification and uncertainty in some aspects of the predicted liner impedances are addressed. This information is incorporated in updated predictions of the liner performance and comparisons with measurement are greatly improved. Results illustrate the value of the design process in concurrently evaluating the relative costs/benefits of various liner designs. This study also provides further confidence in the integrated use of duct acoustic propagation/radiation and liner modeling tools in the design and evaluation of novel broadband liner concepts for complex engine configurations.

  9. Hands-On Teaching and Entrepreneurship Development.

    ERIC Educational Resources Information Center

    da Silveira, Marcos Azevedo; da Silva, Mauro Schwanke; Kelber, Christian R.; de Freitas, Manuel R.

    This paper presents the experiment being conducted in the Electric Circuits II course (ELE1103) at PUC-Rio's Electrical Engineering Department since March 1997. This experiment was held in both the fall and the spring semesters of 1997. The basis for the experiment was concurrent teaching methodology, to which the principles of entrepreneurship…

  10. 40 CFR 87.7 - Exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Exemptions. 87.7 Section 87.7... POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES General Provisions § 87.7 Exemptions. (a) Exemptions based on..., with the concurrence of the Administrator, that application of any standard under § 87.21 is not...

  11. Review of Electrocution Deaths in Iraq: Part 1 - Electrocution of Staff Sergeant Ryan D. Maseth, U.S. Army

    DTIC Science & Technology

    2009-07-24

    concurrently. Photographs of LSF-1 from before and after June 2006 are consistent with the believed installation date. A forensic engineering...other services such as refuse collection and disposal, entomology, etc. Starting in November 2003, Washington Group International/Black and Veatch

  12. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  13. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  14. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  15. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  16. 33 CFR 203.84 - Forms of local participation-cost sharing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...

  17. Integrated continuous processing of proteins expressed as inclusion bodies: GCSF as a case study.

    PubMed

    Kateja, Nikhil; Agarwal, Harshit; Hebbi, Vishwanath; Rathore, Anurag S

    2017-07-01

    Affordability of biopharmaceuticals continues to be a challenge, particularly in developing economies. This has fuelled advancements in manufacturing that can offer higher productivity and better economics without sacrificing product quality, in the form of an integrated continuous manufacturing platform. While platform processes for monoclonal antibodies have existed for more than a decade, development of an integrated continuous manufacturing process for bacterial proteins has received relatively scant attention. In this study, we propose an end-to-end integrated continuous downstream process (from inclusion bodies to unformulated drug substance) for a therapeutic protein expressed in Escherichia coli as an inclusion body. The final process consisted of continuous refolding in a coiled flow inverter reactor directly coupled to a three-column periodic counter-current chromatography for capture of the product, followed by a three-column concurrent chromatography for polishing. The continuous bioprocessing train was run uninterrupted for 26 h to demonstrate its capability, and the resulting output was analyzed for the various critical quality attributes, namely product purity (>99%), high molecular weight impurities (<0.5%), host cell proteins (<100 ppm), and host cell DNA (<10 ppb). All attributes were found to be consistent over the period of operation. The developed assembly offers a smaller facility footprint, higher productivity, fewer hold steps, and significantly higher equipment and resin utilization. The complexities of process integration in the context of continuous processing are highlighted. We hope that the study presented here will promote development of highly efficient, universal, end-to-end, fully continuous platforms for manufacturing of biotherapeutics.

  18. Size and emotion averaging: costs of dividing attention after all.

    PubMed

    Brand, John; Oriet, Chris; Tottenham, Laurie Sykes

    2012-03-01

    Perceptual averaging is a process by which sets of similar items are represented by summary statistics such as their average size, luminance, or orientation. Researchers have argued that this process is automatic, able to be carried out without interference from concurrent processing. Here, we challenge this conclusion and demonstrate a reliable cost of computing the mean size of circles distinguished by colour (Experiments 1 and 2) and the mean emotionality of faces distinguished by sex (Experiment 3). We also test the viability of two strategies that could have allowed observers to guess the correct response without computing the average size or emotionality of both sets concurrently. We conclude that although two means can be computed concurrently, doing so incurs a cost of dividing attention.

  19. In a quest for engineering acidophiles for biomining applications: challenges and opportunities.

    PubMed

    Gumulya, Yosephine; Boxall, Naomi J; Khaleque, Himel N; Santala, Ville; Carlson, Ross P; Kaksonen, Anna H

    2018-02-21

    Biomining with acidophilic microorganisms has been used at commercial scale for the extraction of metals from various sulfide ores. With metal demand and energy prices on the rise and the concurrent decline in quality and availability of mineral resources, there is an increasing interest in applying biomining technology, in particular for leaching metals from low grade minerals and wastes. However, bioprocessing is often hampered by the presence of inhibitory compounds that originate from complex ores. Synthetic biology could provide tools to improve the tolerance of biomining microbes to various stress factors that are present in biomining environments, which would ultimately increase bioleaching efficiency. This paper reviews the state-of-the-art tools to genetically modify acidophilic biomining microorganisms and the limitations of these tools. The first part of this review discusses resilience pathways that can be engineered in acidophiles to enhance their robustness and tolerance in harsh environments that prevail in bioleaching. The second part of the paper reviews the efforts that have been carried out towards engineering robust microorganisms and developing metabolic modelling tools. Novel synthetic biology tools have the potential to transform the biomining industry and facilitate the extraction of value from ores and wastes that cannot be processed with existing biomining microorganisms.

  20. In a Quest for Engineering Acidophiles for Biomining Applications: Challenges and Opportunities

    PubMed Central

    Gumulya, Yosephine; Boxall, Naomi J; Khaleque, Himel N; Santala, Ville; Carlson, Ross P; Kaksonen, Anna H

    2018-01-01

    Biomining with acidophilic microorganisms has been used at commercial scale for the extraction of metals from various sulfide ores. With metal demand and energy prices on the rise and the concurrent decline in quality and availability of mineral resources, there is an increasing interest in applying biomining technology, in particular for leaching metals from low grade minerals and wastes. However, bioprocessing is often hampered by the presence of inhibitory compounds that originate from complex ores. Synthetic biology could provide tools to improve the tolerance of biomining microbes to various stress factors that are present in biomining environments, which would ultimately increase bioleaching efficiency. This paper reviews the state-of-the-art tools to genetically modify acidophilic biomining microorganisms and the limitations of these tools. The first part of this review discusses resilience pathways that can be engineered in acidophiles to enhance their robustness and tolerance in harsh environments that prevail in bioleaching. The second part of the paper reviews the efforts that have been carried out towards engineering robust microorganisms and developing metabolic modelling tools. Novel synthetic biology tools have the potential to transform the biomining industry and facilitate the extraction of value from ores and wastes that cannot be processed with existing biomining microorganisms. PMID:29466321

  1. Electro-optic holography method for determination of surface shape and deformation

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-06-01

    Current demanding engineering analysis and design applications require effective experimental methodologies for characterization of surface shape and deformation. Such characterization is of primary importance in many applications because these quantities are related to the functionality, performance, and integrity of the objects of interest, especially in view of advances relating to concurrent engineering. In this paper, a new approach to characterization of surface shape and deformation using a simple optical setup is described. The approach employs a fiber-optic electro-optic holography (EOH) system built around an IR, temperature-tuned laser diode, a single-mode fiber-optic directional coupler assembly, and a video processing computer. The EOH system can be arranged in multiple configurations, including the three-camera, three-illumination, and speckle-correlation modes. The three-camera mode is described in particular, along with a brief description of the procedures for obtaining quantitative 3D shape and deformation information. A representative application of the three-camera EOH system demonstrates the viability of the approach as an effective engineering tool. A particular feature of this system and the procedure described in this paper is that the 3D quantitative data are written to data files which can be readily interfaced to commercial CAD/CAM environments.
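    For background, quantitative phase maps of the kind mentioned above are often obtained with the standard four-step phase-shifting formula; the sketch below shows that textbook calculation on synthetic fringes (generic background, not necessarily the processing chain of the EOH system described in the paper):

```python
import numpy as np

# Four interferograms with 90-degree phase shifts yield the wrapped optical
# phase, which maps to surface shape or deformation after unwrapping.
def wrapped_phase(I1, I2, I3, I4):
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic demonstration: recover a known linear phase ramp.
x = np.linspace(0, 4 * np.pi, 256)
true_phase = 0.5 * x
frames = [1 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]
phi = wrapped_phase(*frames)
print(phi[:4])   # wrapped into (-pi, pi]; unwrapping recovers true_phase
```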

  2. Control-structure-thermal interactions in analysis of lunar telescopes

    NASA Technical Reports Server (NTRS)

    Thompson, Roger C.

    1992-01-01

    The lunar telescope project was an excellent model for the CSTI study because a telescope is a very sensitive instrument, and thermal expansion or mechanical vibration of the mirror assemblies will rapidly degrade the resolution of the device. Consequently, the interactions are strongly coupled. The lunar surface experiences very large temperature variations that range from approximately -180 C to over 100 C. Although the optical assemblies of the telescopes will be well insulated, the temperature of the mirrors will inevitably fluctuate in a similar cycle, but of much smaller magnitude. In order to obtain images of high quality and clarity, allowable thermal deformations of any point on a mirror must be less than 1 micron. Initial estimates indicate that this corresponds to a temperature variation of much less than 1 deg through the thickness of the mirror. Therefore, a lunar telescope design will most probably include active thermal control, a means of controlling the shape of the mirrors, or a combination of both systems. Historically, the design of a complex vehicle was primarily a sequential process in which the basic structure was defined without concurrent detailed analyses or other subsystems. The basic configuration was then passed to the different teams responsible for each subsystem, and their task was to produce a workable solution without requiring major alterations to any principal components or subsystems. Consequently, the final design of the vehicle was not always the most efficient, owing to the fact that each subsystem design was partially constrained by the previous work. This procedure was necessary at the time because the analysis process was extremely time-consuming and had to be started over with each significant alteration of the vehicle. With recent advances in the power and capacity of small computers, and the parallel development of powerful software in structural, thermal, and control system analysis, it is now possible to produce very detailed analyses of intermediate designs in a much shorter period of time. The subsystems can thus be designed concurrently, and alterations in the overall design can be quickly adopted into each analysis; the design becomes an iterative process in which it is much easier to experiment with new ideas, configurations, and components. Concurrent engineering has the potential to produce efficient, highly capable designs because the effect of one subystem on another can be assessed in much more detail at a very early point in the program. The research program consisted of several tasks: scale a prototype telescope assembly to a 1 m aperture, develop a model of the telescope assembly by using finite element (FEM) codes that are available on site, determine structural deflections of the mirror surfaces due to the temperature variations, develop a prototype control system to maintain the proper shape of the optical elements, and most important of all, demonstrate the concurrent engineering approach with this example. In addition, the software used for the finite element models and thermal analysis was relatively new within the Program Development Office and had yet to be applied to systems this large or complex; understanding the software and modifying it for use with this project was also required. The I-DEAS software by Structural Dynamics Research Corporation (SDRC) was used to build the finite element models, and TMG developed by Maya Heat Transfer Technologies, Ltd. 
(which runs as an I-DEAS module) was used for the thermal model calculations. All control system development was accomplished with MATRIX(sub X) by Integrated Systems, Inc.

  3. Benchmarking the ATLAS software through the Kit Validation engine

    NASA Astrophysics Data System (ADS)

    De Salvo, Alessandro; Brasolin, Franco

    2010-04-01

    The performance of the experiment software is a very important metric for choosing the most effective resources to be used and for discovering the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, and the online analysis and display of the results will be presented. The results of the measurements on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization, and reconstruction of the most CPU-intensive channels. The impact of multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help define the performance metrics for High Energy Physics applications, based on the real experiment software.
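    The shape of such a scaling measurement is easy to sketch (the workload below is an invented CPU-bound kernel, not the ATLAS software): run n concurrent copies of the same job and watch the wall-clock time, which ideally stays flat until cores, caches, or memory bandwidth saturate.

```python
import multiprocessing as mp
import time

def kernel(_):
    # arbitrary floating-point work standing in for generation/reconstruction
    s = 0.0
    for i in range(1, 2_000_000):
        s += 1.0 / i
    return s

if __name__ == "__main__":
    for n in (1, 2, 4, 8):
        start = time.perf_counter()
        with mp.Pool(n) as pool:
            pool.map(kernel, range(n))      # n concurrent copies of the job
        print(f"{n} concurrent copies: {time.perf_counter() - start:.2f} s")
```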

  4. Using SFOC to fly the Magellan Venus mapping mission

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.; Leonard, Robert E., Jr.; Short, Owen G.

    1993-01-01

    Traditionally, spacecraft flight operations at the Jet Propulsion Laboratory (JPL) have been performed by teams of spacecraft experts utilizing ground software designed specifically for the current mission. The Jet Propulsion Laboratory set out to reduce the cost of spacecraft mission operations by designing ground data processing software that could be used by multiple spacecraft missions, either sequentially or concurrently. The Space Flight Operations Center (SFOC) System was developed to provide the ground data system capabilities needed to monitor several spacecraft simultaneously and provide enough flexibility to meet the specific needs of individual projects. The Magellan Spacecraft Team utilizes the SFOC hardware and software designed for engineering telemetry analysis, both real-time and non-real-time. The flexibility of the SFOC System has allowed the spacecraft team to integrate their own tools with SFOC tools to perform the tasks required to operate a spacecraft mission. This paper describes how the Magellan Spacecraft Team is utilizing the SFOC System in conjunction with their own software tools to perform the required tasks of spacecraft event monitoring as well as engineering data analysis and trending.

  5. An asynchronous traversal engine for graph-based rich metadata management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Carns, Philip; Ross, Robert B.

    Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format. We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.

  6. An asynchronous traversal engine for graph-based rich metadata management

    DOE PAGES

    Dai, Dong; Carns, Philip; Ross, Robert B.; ...

    2016-06-23

    Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format. We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.
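    The straggler argument above can be made concrete with a toy traversal in which workers pull vertices from a shared queue with no per-level barrier, so a slow vertex delays only the worker holding it; this is a simplifying sketch, not the engine described in the records:

```python
from collections import deque
import threading

# Tiny in-memory graph; in the papers' setting this would be a distributed
# property graph of users, jobs, and files.
graph = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5], 5: []}

def async_traverse(start, workers=4):
    visited, lock = {start}, threading.Lock()
    work = deque([start])

    def worker():
        while True:
            with lock:
                if not work:
                    return            # worker quits once the shared queue drains
                v = work.popleft()
            for u in graph[v]:        # no barrier: neighbors are queued immediately
                with lock:
                    if u not in visited:
                        visited.add(u)
                        work.append(u)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return visited

print(sorted(async_traverse(0)))      # [0, 1, 2, 3, 4, 5]
```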

  7. Satellite Imagery Production and Processing Using Apache Hadoop

    NASA Astrophysics Data System (ADS)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework and can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large-scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations, and finally challenges and ongoing activities. It will also present how the LSRD project built a 102-core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full-time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
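    For readers unfamiliar with the pattern, a Hadoop Streaming job reduces to a script that reads records from stdin and emits tab-separated key/value pairs; the sketch below (the scene-ID slicing and the fan-out key are assumptions for illustration, not LSRD's code) would let Hadoop distribute per-scene science processing across a cluster:

```python
#!/usr/bin/env python3
# mapper.py: each input line names a Landsat scene; emit path/row as the key
# so scenes fan out across reducers, where per-scene science code would run.
import sys

for line in sys.stdin:
    scene_id = line.strip()
    if not scene_id:
        continue
    path_row = scene_id[3:9]          # assumes IDs like LT50290302005119..., chars 4-9 = path/row
    print(f"{path_row}\t{scene_id}")  # tab-separated key/value, as streaming expects
```

    A job of this shape would be launched with the stock streaming jar, e.g. `hadoop jar hadoop-streaming.jar -input scenes.txt -output ecv_out -mapper mapper.py` (a generic invocation, not the LSRD configuration).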

  8. Modelling and simulating decision processes of linked lives: An approach based on concurrent processes and stochastic race.

    PubMed

    Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M

    2017-10-01

    Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
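    The stochastic-race mechanism itself is compact enough to sketch: every enabled alternative draws an exponential waiting time from its (possibly age-dependent) rate, and the smallest draw determines the next event. The rates and behaviours below are invented for illustration and are not taken from ML3:

```python
import random

def race(alternatives, age):
    # Each enabled alternative draws a waiting time; the minimum wins the race.
    draws = []
    for name, rate_fn in alternatives.items():
        rate = rate_fn(age)
        if rate > 0:
            draws.append((random.expovariate(rate), name))
    return min(draws)                 # (waiting time, winning alternative)

alternatives = {
    "migrate":      lambda age: 0.05 if age < 40 else 0.01,  # age-dependent behaviour
    "change_job":   lambda age: 0.20,
    "join_partner": lambda age: 0.02,
}

t, agent_age = 0.0, 25.0
for _ in range(5):
    dt, event = race(alternatives, agent_age)
    t += dt
    agent_age += dt
    print(f"t={t:6.2f}  age={agent_age:5.2f}  event={event}")
```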

  9. Concurrency-Induced Transitions in Epidemic Dynamics on Temporal Networks.

    PubMed

    Onaga, Tomokatsu; Gleeson, James P; Masuda, Naoki

    2017-09-08

    Social contact networks underlying epidemic processes in humans and animals are highly dynamic. The spreading of infections on such temporal networks can differ dramatically from spreading on static networks. We theoretically investigate the effects of concurrency, the number of neighbors that a node has at a given time point, on the epidemic threshold in the stochastic susceptible-infected-susceptible dynamics on temporal network models. We show that network dynamics can suppress epidemics (i.e., yield a higher epidemic threshold) when the node's concurrency is low, but can also enhance epidemics when the concurrency is high. We analytically determine different phases of this concurrency-induced transition, and confirm our results with numerical simulations.

  10. Concurrency-Induced Transitions in Epidemic Dynamics on Temporal Networks

    NASA Astrophysics Data System (ADS)

    Onaga, Tomokatsu; Gleeson, James P.; Masuda, Naoki

    2017-09-01

    Social contact networks underlying epidemic processes in humans and animals are highly dynamic. The spreading of infections on such temporal networks can differ dramatically from spreading on static networks. We theoretically investigate the effects of concurrency, the number of neighbors that a node has at a given time point, on the epidemic threshold in the stochastic susceptible-infected-susceptible dynamics on temporal network models. We show that network dynamics can suppress epidemics (i.e., yield a higher epidemic threshold) when the node's concurrency is low, but can also enhance epidemics when the concurrency is high. We analytically determine different phases of this concurrency-induced transition, and confirm our results with numerical simulations.
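    A toy simulation conveys the flavor of the result (the rewiring rule and all parameters below are invented, not the paper's model): hold the total contact rate per node roughly fixed by splitting it across k concurrent partners, then vary k and observe whether the infection persists:

```python
import random

def sis_temporal(n=200, k=2, beta_total=0.4, mu=0.1, tau=5, steps=2000, seed=1):
    rng = random.Random(seed)
    beta = beta_total / k                       # per-partner transmission probability
    infected = set(rng.sample(range(n), 10))
    partners = {}
    for t in range(steps):
        if t % tau == 0:                        # partnerships are re-drawn periodically
            partners = {i: rng.sample([j for j in range(n) if j != i], k)
                        for i in range(n)}
        new_inf, recovered = set(), set()
        for i in infected:
            for j in partners[i]:
                if j not in infected and rng.random() < beta:
                    new_inf.add(j)
            if rng.random() < mu:
                recovered.add(i)
        infected = (infected | new_inf) - recovered
        if not infected:
            return 0.0                          # epidemic died out
    return len(infected) / n                    # endemic fraction

for k in (1, 2, 4, 8):
    print(f"concurrency k={k}: endemic fraction ~ {sis_temporal(k=k):.2f}")
```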

  11. The role of aging in intra-item and item-context binding processes in visual working memory.

    PubMed

    Peterson, Dwight J; Naveh-Benjamin, Moshe

    2016-11-01

    Aging is accompanied by declines in both working memory and long-term episodic memory processes. Specifically, important age-related memory deficits are characterized by performance impairments exhibited by older relative to younger adults when binding distinct components into a single integrated representation, despite relatively intact memory for the individual components. While robust patterns of age-related binding deficits are prevalent in studies of long-term episodic memory, observations of such deficits in visual working memory (VWM) may depend on the specific type of binding process being examined. For instance, a number of studies indicate that processes involved in item-context binding of items to occupied spatial locations within visual working memory are impaired in older relative to younger adults. Other findings suggest that intra-item binding of visual surface features (e.g., color, shape), compared to memory for single features, within visual working memory, remains relatively intact. Here, we examined each of these binding processes in younger and older adults under both optimal conditions (i.e., no concurrent load) and concurrent load (e.g., articulatory suppression, backward counting). Experiment 1 revealed an age-related intra-item binding deficit for surface features under no concurrent load but not when articulatory suppression was required. In contrast, in Experiments 2 and 3, we observed an age-related item-context binding deficit regardless of the level of concurrent load. These findings reveal that the influence of concurrent load on distinct binding processes within VWM, potentially those supported by rehearsal, is an important factor mediating the presence or absence of age-related binding deficits within VWM.

  12. Resource Management and Contingencies in Aerospace Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Karpati, Gabe; Hyde, Tupper; Peabody, Hume; Garrison, Matthew

    2012-01-01

    A significant concern in designing complex systems implementing new technologies is that while knowledge about the system is acquired incrementally, substantial financial commitments, even make-or-break decisions, must be made upfront, essentially in the unknown. One practice that helps in dealing with this dichotomy is the smart embedding of contingencies and margins in the design to serve as buffers against surprises. This issue presents itself in full force in the aerospace industry, where unprecedented systems are formulated and committed to as a matter of routine. As more and more aerospace mission concepts are generated by concurrent design laboratories, it is imperative that such laboratories apply well-thought-out contingency and margin structures to their designs. The first part of this publication provides an overview of resource management techniques and standards used in the aerospace industry, followed by a thought-provoking treatise on margin policies. The exposition presents actual flight telemetry data recorded by the thermal discipline during several recent NASA Goddard Space Flight Center missions. The margins actually achieved in flight are compared against pre-flight predictions, and the appropriateness and the ramifications of having designed with rigid margins to bounding, stacked worst-case conditions are assessed. The second half of the paper examines the particular issues associated with the application of contingencies and margins in the concurrent engineering environment. In closure, a discipline-by-discipline disclosure of the contingency and margin policies in use at the Integrated Design Center at NASA's Goddard Space Flight Center is made.

  13. Options for making concurrency more multimodal, phase II

    DOT National Transportation Integrated Search

    2007-04-01

    Current Washington State Growth Management regulations state that jurisdictions in Growth Management counties must define a Concurrency process that ensures that adequate transportation facilities are present or will be present within three years bef...

  14. Options for making concurrency more multimodal, phase II revised

    DOT National Transportation Integrated Search

    2007-08-01

    Current Washington State Growth Management regulations state that jurisdictions in Growth Management counties must define a Concurrency process that ensures that adequate transportation facilities are present or will be present within three years bef...

  15. Options for making concurrency more multimodal, phase I

    DOT National Transportation Integrated Search

    2007-03-01

    Current Washington State Growth Management regulations state that jurisdictions in Growth Management counties must define a Concurrency process that ensures that adequate transportation facilities are present or will be present within three years bef...

  16. Systems Engineering Model for ART Energy Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendez Cruz, Carmen Margarita; Rochau, Gary E.; Wilson, Mollye C.

    The near-term objective of the EC team is to establish an operating, commercially scalable Recompression Closed Brayton Cycle (RCBC) to be constructed for the NE - STEP demonstration system (demo) with the lowest risk possible. A systems engineering approach is recommended to ensure adequate requirements gathering, documentation, and modeling that supports technology development relevant to advanced reactors while supporting crosscut interests in potential applications. A holistic systems engineering model was designed for the ART Energy Conversion program by leveraging Concurrent Engineering, Balance Model, Simplified V Model, and Project Management principles. The resulting model supports the identification and validation of lifecycle Brayton systems requirements, and allows designers to detail system-specific components relevant to the current stage in the lifecycle, while maintaining a holistic view of all system elements.

  17. Uncovering the problem-solving process: cued retrospective reporting versus concurrent and retrospective reporting.

    PubMed

    van Gog, Tamara; Paas, Fred; van Merriënboer, Jeroen J G; Witte, Puk

    2005-12-01

    This study investigated the amounts of problem-solving process information ("action," "why," "how," and "metacognitive") elicited by means of concurrent, retrospective, and cued retrospective reporting. In a within-participants design, 26 participants completed electrical circuit troubleshooting tasks under different reporting conditions. The method of cued retrospective reporting used the original computer-based task and a superimposed record of the participant's eye fixations and mouse-keyboard operations as a cue for retrospection. Cued retrospective reporting (with the exception of why information) and concurrent reporting (with the exception of metacognitive information) resulted in a higher number of codes on the different types of information than did retrospective reporting.

  18. On the operation of machines powered by quantum non-thermal baths

    DOE PAGES

    Niedenzu, Wolfgang; Gelbwaser-Klimovsky, David; Kofman, Abraham G.; ...

    2016-08-02

    Diverse models of engines energised by quantum-coherent, hence non-thermal, baths allow the engine efficiency to transgress the standard thermodynamic Carnot bound. These transgressions call for an elucidation of the underlying mechanisms. Here we show that non-thermal baths may impart not only heat, but also mechanical work to a machine. The Carnot bound is inapplicable to such a hybrid machine. Intriguingly, it may exhibit dual action, concurrently as engine and refrigerator, with up to 100% efficiency. Here, we conclude that even though a machine powered by a quantum bath may exhibit an unconventional performance, it still abides by the traditional principles of thermodynamics.

  19. 33 CFR 385.5 - Guidance memoranda.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Secretary of the Army. The Corps of Engineers and the South Florida Water Management District shall also... achieving the goals and purposes of the Plan. (2) The Secretary of the Army shall afford the public an... concurrence of the Secretary of the Interior and the Governor. Within 180 days after being provided with the...

  20. A game-based decision support methodology for competitive systems design

    NASA Astrophysics Data System (ADS)

    Briceno, Simon Ignacio

    This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk makes this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information. The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and the uncertainty associated with competitive reactions. A normal-form matrix is created to enumerate players, their moves and payoffs, and to formulate a process by which an optimal decision can be achieved. The non-cooperative model is tested using the concept of a Nash equilibrium to identify potential strategies that are robust to uncertain market fluctuations (e.g., uncertainty in airline demand, airframe requirements, and competitor positioning). A first/second-mover advantage parameter is used as a scenario dial to adjust market rewards and firms' payoffs. The methodology is applied to a commercial aircraft engine selection study where engine firms must select an optimal engine project for development. An engine modeling and simulation framework is developed to generate a broad engine project portfolio. The creation of a customer value model enables designers to incorporate airline operation characteristics into the engine modeling and simulation process to improve the accuracy of engine/customer matching. Several key findings are made that provide recommendations on project selection strategies for firms uncertain as to when they will enter the market. The proposed study demonstrates that within a technical design environment, a rational and analytical means of modeling project development strategies is beneficial in high market risk situations.
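    In the simplest normal-form setting such a methodology builds on, screening for pure-strategy Nash equilibria is a cell-by-cell best-response check; the payoff matrices below are invented placeholders for two firms' project choices, not values from the dissertation:

```python
import numpy as np

# Rows: firm A's project choice; columns: firm B's. A cell is a pure-strategy
# Nash equilibrium when neither firm can gain by deviating unilaterally.
payoff_A = np.array([[3, 1],
                     [2, 2]])
payoff_B = np.array([[2, 3],
                     [1, 2]])

def pure_nash(pa, pb):
    eq = []
    for i in range(pa.shape[0]):
        for j in range(pa.shape[1]):
            if pa[i, j] == pa[:, j].max() and pb[i, j] == pb[i, :].max():
                eq.append((i, j))
    return eq

print(pure_nash(payoff_A, payoff_B))   # -> [(1, 1)] for these payoffs
```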

  1. The Enhancement of Concurrent Processing through Functional Programming Languages.

    DTIC Science & Technology

    1984-06-01

    functional programming languages allow us to harness the processing power of computers with hundreds or even thousands of...that it might be the best way to make imperative programs into functional ones which are well suited to concurrent processing...statements in their code. We assert that functional programming languages allow us to harness the processing power of computers with hundreds or even

  2. Reconciling Pairs of Concurrently Used Clinical Practice Guidelines Using Constraint Logic Programming

    PubMed Central

    Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken

    2011-01-01

    This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack. PMID:22195153
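    The reconciliation idea can be miniaturized as a constraint-satisfaction filter over joint treatment plans; the drug options and the single interaction rule below are invented stand-ins (the paper encodes real guideline logic in CLP):

```python
from itertools import product

ulcer_options = ["PPI", "PPI+antibiotics"]
tia_options   = ["aspirin", "clopidogrel"]

def no_contention(ulcer_rx, tia_rx):
    # Invented point of contention: treat aspirin as incompatible with the
    # plain-PPI ulcer regimen for the sake of the example.
    return not (tia_rx == "aspirin" and ulcer_rx == "PPI")

feasible = [(u, t) for u, t in product(ulcer_options, tia_options)
            if no_contention(u, t)]
print(feasible)   # joint plans free of the modeled contention
```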

  3. Engineered Nanomaterials, Sexy New Technology and Potential Hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaulieu, R A

    Engineered nanomaterials enhance exciting new applications that can greatly benefit society in areas of cancer treatments, solar energy, energy storage, and water purification. While nanotechnology shows incredible promise in these and other areas by exploiting nanomaterials' unique properties, these same properties can potentially cause adverse health effects to workers who may be exposed during work. Dispersed nanoparticles in air can cause adverse health effects to animals not merely due to their chemical properties but due to their size, structure, shape, surface chemistry, solubility, carcinogenicity, reproductive toxicity, mutagenicity, dermal toxicity, and parent material toxicity. Nanoparticles have a greater likelihood of lung deposition and blood absorption than larger particles due to their size. Nanomaterials can also pose physical hazards due to their unusually high reactivity, which makes them useful as catalysts, but has the potential to cause fires and explosions. Characterization of the hazards (and potential for exposures) associated with nanomaterial development and incorporation in other products is an essential step in the development of nanotechnologies. Developing controls for these hazards is equally important. Engineered controls should be integrated into nanomaterial manufacturing process design according to 10CFR851, DOE Policy 456.1, and DOE Notice 456.1 as safety-related hardware or administrative controls for worker safety. Nanomaterial hazards in a nuclear facility must also meet control requirements per DOE standards 3009, 1189, and 1186. Integration of safe designs into manufacturing processes for new applications concurrent with the developing technology is essential for worker safety. This paper presents a discussion of nanotechnology, nanomaterial properties/hazards and controls.

  4. A minimum cost tolerance allocation method for rocket engines and robust rocket engine design

    NASA Technical Reports Server (NTRS)

    Gerth, Richard J.

    1993-01-01

    Rocket engine design follows three phases: systems design, parameter design, and tolerance design. Systems design and parameter design are most effectively conducted in a concurrent engineering (CE) environment that utilizes methods such as Quality Function Deployment and Taguchi methods. However, tolerance allocation remains an art driven by experience, handbooks, and rules of thumb. It was desirable to develop an optimization approach to tolerancing. The case study engine was the STME gas generator cycle. The design of the major components had been completed and the functional relationship between the component tolerances and system performance had been computed using the Generic Power Balance model. The system performance nominals (thrust, MR, and Isp) and tolerances were already specified, as were an initial set of component tolerances. However, the question was whether there existed an optimal combination of tolerances that would result in the minimum cost without any degradation in system performance.
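    Framed as optimization, minimum-cost tolerance allocation has a generic shape: minimize total manufacturing cost subject to the tolerance stack-up staying within the system allocation. The reciprocal cost model and all numbers below are invented for illustration, not the STME data:

```python
import numpy as np
from scipy.optimize import minimize

a = np.array([4.0, 1.0, 2.0])          # assumed cost sensitivity per component
T_sys = 0.05                           # assumed system-level tolerance budget

cost = lambda t: np.sum(a / t)                          # cost rises as tolerances tighten
rss_margin = lambda t: T_sys - np.sqrt(np.sum(t**2))    # RSS stack-up must fit the budget

res = minimize(cost, x0=np.full(3, 0.02), method="SLSQP",
               bounds=[(1e-4, T_sys)] * 3,
               constraints=[{"type": "ineq", "fun": rss_margin}])
print("tolerances:", res.x, " total cost:", res.fun)
```

    Within the fixed system budget, the optimizer assigns looser tolerances to the components whose cost is most sensitive to tightening, which is the trade the abstract describes.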

  5. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in a way consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system with the task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
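
    The announce-bid-award cycle described above can be sketched in a few lines; the machine names, capability sets, and cost-then-due-date scoring rule below are invented placeholders for the contract-net idea, not the authors' implementation.

```python
class Machine:
    def __init__(self, name, capabilities, busy_until, rate):
        self.name, self.capabilities = name, capabilities
        self.busy_until, self.rate = busy_until, rate     # current schedule and cost rate

    def bid(self, feature, hours):
        """Return (completion_time, cost), or None if the process is unsupported."""
        if feature not in self.capabilities:
            return None
        return (self.busy_until + hours, self.rate * hours)

def broadcast(machines, feature, hours, due):
    bids = [(m.bid(feature, hours), m) for m in machines]
    valid = [(b, m) for b, m in bids if b and b[0] <= due]
    if not valid:
        return None
    (finish, cost), winner = min(valid, key=lambda x: (x[0][1], x[0][0]))  # negotiate: cheapest, then earliest
    winner.busy_until = finish        # the winning machine updates its schedule
    return winner.name, finish, cost

shop = [Machine("mill_1", {"slot", "pocket"}, busy_until=2.0, rate=55.0),
        Machine("mill_2", {"slot"}, busy_until=0.0, rate=70.0)]
print(broadcast(shop, "slot", hours=3.0, due=8.0))
```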

  6. A multiprocessing architecture for real-time monitoring

    NASA Technical Reports Server (NTRS)

    Schmidt, James L.; Kao, Simon M.; Read, Jackson Y.; Weitzenkamp, Scott M.; Laffey, Thomas J.

    1988-01-01

    A multitasking architecture for performing real-time monitoring and analysis using knowledge-based problem solving techniques is described. To handle asynchronous inputs and perform in real time, the system consists of three or more distributed processes which run concurrently and communicate via a message passing scheme. The Data Management Process acquires, compresses, and routes the incoming sensor data to other processes. The Inference Process consists of a high performance inference engine that performs a real-time analysis on the state and health of the physical system. The I/O Process receives sensor data from the Data Management Process and status messages and recommendations from the Inference Process, updates its graphical displays in real time, and acts as the interface to the console operator. The distributed architecture has been interfaced to an actual spacecraft (NASA's Hubble Space Telescope) and is able to process the incoming telemetry in real time (i.e., several hundred data changes per second). The system is being used in two locations for different purposes: (1) in Sunnyvale, California, at the Space Telescope Test Control Center, it is used in preflight testing of the vehicle; and (2) in Greenbelt, Maryland, at NASA/Goddard, it is being used on an experimental basis in flight operations for health and safety monitoring.
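
    A schematic rendering of the three-process, message-passing layout described above, using Python multiprocessing queues (the flight system itself was not Python); process logic is reduced to trivial stand-ins.

```python
from multiprocessing import Process, Queue

def data_management(samples, to_inf, to_io):
    for s in samples:
        to_inf.put(s)                 # route telemetry to the inference process
        to_io.put(("data", s))        # and to the operator display process
    to_inf.put(None)                  # sentinel: end of telemetry
    to_io.put(None)

def inference(inbox, to_io):
    while (s := inbox.get()) is not None:
        if s > 100.0:                 # stand-in for knowledge-based health analysis
            to_io.put(("alert", s))
    to_io.put(None)

def io_process(inbox, producers=2):
    done = 0
    while done < producers:           # wait for both upstream processes to finish
        msg = inbox.get()
        if msg is None:
            done += 1
        else:
            print("console:", msg)    # stand-in for a real-time display update

if __name__ == "__main__":
    q_inf, q_io = Queue(), Queue()
    telemetry = [12.0, 140.5, 98.2]
    procs = [Process(target=data_management, args=(telemetry, q_inf, q_io)),
             Process(target=inference, args=(q_inf, q_io)),
             Process(target=io_process, args=(q_io,))]
    for p in procs: p.start()
    for p in procs: p.join()
```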

  7. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
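
    The four standard PROforma objects can be pictured roughly as follows; the attribute names are illustrative guesses and do not reproduce the PROforma language's actual syntax or semantics.

```python
from dataclasses import dataclass, field

@dataclass
class Enquiry:            # requests data from the user or patient record
    items: list

@dataclass
class Action:             # a procedure to be executed in the world
    procedure: str

@dataclass
class Decision:           # a choice among candidates, backed by argument rules
    candidates: list
    arguments: dict       # candidate -> list of for/against arguments

@dataclass
class Plan:               # container that sequences other tasks
    components: list = field(default_factory=list)

workup = Plan(components=[
    Enquiry(items=["symptom_duration"]),
    Decision(candidates=["refer", "treat"],
             arguments={"refer": ["+ neurological deficit"], "treat": ["+ low risk"]}),
    Action(procedure="issue_prescription"),
])
```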

  8. Improved triacylglycerol production in Acinetobacter baylyi ADP1 by metabolic engineering.

    PubMed

    Santala, Suvi; Efimova, Elena; Kivinen, Virpi; Larjo, Antti; Aho, Tommi; Karp, Matti; Santala, Ville

    2011-05-18

    Triacylglycerols are used for various purposes, including food applications, cosmetics, oleochemicals, and biofuels. Currently the main sources of triacylglycerol are vegetable oils, and microbial triacylglycerol has been suggested as an alternative to these. Due to the low production rates and yields of microbial processes, the role of metabolic engineering has become more significant. As a robust model organism for genetic and metabolic studies, with a natural capability to produce triacylglycerol, Acinetobacter baylyi ADP1 serves as an excellent organism for modelling the effects of metabolic engineering on energy molecule biosynthesis. Beneficial gene deletions regarding triacylglycerol production were screened by computational means exploiting the metabolic model of ADP1. Four deletions, acr1, poxB, dgkA, and a triacylglycerol lipase, were chosen to be studied experimentally, both separately and concurrently, by constructing a knock-out strain (MT) with three of the deletions. Improvements in triacylglycerol production were observed: the strain MT produced 5.6-fold more triacylglycerol (mg/g cell dry weight) compared to the wild type strain, and the proportion of triacylglycerol in total lipids was increased 8-fold. In silico predictions of beneficial gene deletions were verified experimentally. The chosen single and multiple gene deletions beneficially affected the natural triacylglycerol metabolism of A. baylyi ADP1. This study demonstrates the importance of single gene deletions in triacylglycerol metabolism and proposes Acinetobacter sp. ADP1 as a model system for bioenergetic studies regarding metabolic engineering.

  9. Reducing Concurrent Sexual Partnerships Among Blacks in the Rural Southeastern United States: Development of Narrative Messages for a Radio Campaign.

    PubMed

    Cates, Joan R; Francis, Diane B; Ramirez, Catalina; Brown, Jane D; Schoenbach, Victor J; Fortune, Thierry; Powell Hammond, Wizdom; Adimora, Adaora A

    2015-01-01

    In the United States, heterosexual transmission of HIV infection is dramatically higher among Blacks than among Whites. Overlapping (concurrent) sexual partnerships promote HIV transmission. The authors describe their process for developing a radio campaign (Escape the Web) to raise awareness among 18-34-year-old Black adults of the effect of concurrency on HIV transmission in the rural South. Radio is a powerful channel for the delivery of narrative-style health messages. Through six focus groups (n = 51) and 42 intercept interviews, the authors explored attitudes toward concurrency and solicited feedback on sample messages. Men were advised to (a) end concurrent partnerships and not to begin new ones; (b) use condoms consistently with all partners; and (c) tell others about the risks of concurrency and benefits of ending concurrent partnerships. The narrative portrayed risky behaviors that trigger initiation of casual partnerships. Women were advised to (a) end partnerships in which they are not their partner's only partner; (b) use condoms consistently with all partners; and (c) tell others about the risks of concurrency and benefits of ending concurrent partnerships. Messages for all advised better modeling for children.

  11. Fluid front displacement dynamics affecting pressure fluctuations and phase entrapment in porous media

    NASA Astrophysics Data System (ADS)

    Moebius, F.; Or, D.

    2012-04-01

    Many natural and engineering processes involve the motion of fluid fronts in porous media, from infiltration and drainage in hydrology to reservoir management in petroleum engineering. Macroscopically smooth and continuous motion of displacement fronts involves numerous rapid interfacial jumps and local reconfigurations. Detailed observations of displacement processes in micromodels illustrate the wide array of fluid interfacial dynamics, ranging from irregular jumping-pinning motions to gradual pore-scale invasions. The pressure fluctuations associated with interfacial motions reflect not only pore geometry (as traditionally hypothesized) but also a strong influence of boundary conditions (e.g., mean drainage rate). The time scales associated with the waiting time distribution of individual invasion events and the decay time of inertial oscillations (following a rapid interfacial jump) provide a means for distinguishing between displacement regimes. Direct observations using a high-speed camera combined with concurrent pressure signal measurements were instrumental in clarifying the influences of flow rate, pore size, and gravity on burst size distribution and waiting times. We compared our results with the early experimental and theoretical study of burst size and waiting time distribution during slow drainage by Måløy et al. [1992]. The results provide insights on critical invasion events that exert strong influence on macroscopic phenomena such as front morphology and residual phase entrapment behind the front, leading to hysteresis. Måløy, K. J., L. Furuberg, J. Feder, and T. Jossang (1992), Dynamics of Slow Drainage in Porous Media, Phys. Rev. Lett., 68(14), 2161-2164.

  12. Concurrent analysis: towards generalisable qualitative research.

    PubMed

    Snowden, Austyn; Martin, Colin R

    2011-10-01

    This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts, and case study data from the literature. Ten research articles and 13 semi-structured interviews were sampled purposively, and then theoretically, and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner, supporting the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently, raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.

  13. Concurrent Validity and Diagnostic Accuracy of the Dynamic Indicators of Basic Early Literacy Skills and the Comprehensive Test of Phonological Processing

    ERIC Educational Resources Information Center

    Hintze, John M.; Ryan, Amanda L.; Stoner, Gary

    2003-01-01

    The purpose of this study was to (a) examine the concurrent validity of the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) with the Comprehensive Test of Phonological Processing (CTOPP), and (b) explore the diagnostic accuracy of the DIBELS in predicting CTOPP performance using suggested and alternative cut-scores. Eighty-six students…

  14. Evaluation of interaction dynamics of concurrent processes

    NASA Astrophysics Data System (ADS)

    Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas

    2017-03-01

    The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions of concurrent processes. In particular, the determination of interaction coherence of time-varying signals is achieved using a complex continuous wavelet transform. The paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals; hence they can reveal their correlation in phase and amplitude, as well as their non-linear interconnections. The user-friendly GUI enables the user to load the two processes under investigation, choose the required processing parameters, and then perform the analysis. The software is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
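
    A minimal sketch of the underlying computation, complex Morlet CWTs of two concurrent signals and their cross-wavelet phase, using synthetic stand-ins for ECG/SCG; it assumes scipy.signal.cwt and morlet2 are available (they are deprecated or removed in the newest SciPy releases) and is not the paper's MATLAB GUI.

```python
import numpy as np
from scipy.signal import cwt, morlet2

fs = 250.0
t = np.arange(0, 4, 1 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t)                    # synthetic stand-ins for the data set
scg_like = np.sin(2 * np.pi * 1.2 * t + 0.8) + 0.1 * np.random.randn(t.size)

w = 6.0                                                    # Morlet shape parameter
freqs = np.linspace(0.5, 5.0, 40)
widths = w * fs / (2 * np.pi * freqs)                      # scale <-> frequency mapping

W1 = cwt(ecg_like, morlet2, widths, w=w)                   # complex CWT of each signal
W2 = cwt(scg_like, morlet2, widths, w=w)
cross = W1 * np.conj(W2)                                   # cross-wavelet transform
phase_diff = np.angle(cross)                               # relative phase per scale and time
print("mean phase lag near 1.2 Hz:", phase_diff[np.argmin(abs(freqs - 1.2))].mean())
```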

  15. The Effectiveness of Concurrent Design on the Cost and Schedule Performance of Defense Weapons System Acquisitions

    NASA Astrophysics Data System (ADS)

    Robertson, Randolph B.

    This study investigates the impact of concurrent design on the cost growth and schedule growth of US Department of Defense Major Defense Acquisition Programs (MDAPs). It is motivated by the question of whether employing concurrent design in the development of a major weapon system produces better cost and schedule results than traditional serial development methods. Selected Acquisition Reports were used to determine the cost and schedule growth of MDAPs as well as the degree of concurrency employed. Two simple linear regression analyses were used to determine the degree to which cost growth and schedule growth vary with concurrency. The results were somewhat surprising: for major weapon systems, the utilization of concurrency as implemented in the programs under study was shown to have no effect on cost performance, while performance to development schedule, one of the purported benefits of concurrency, was actually shown to deteriorate with increases in concurrency. These results, while not an indictment of the concept of concurrency, indicate that better practices and methods are needed in the implementation of concurrency in major weapon systems. The findings are instructive to stakeholders in the weapons acquisition process in considering whether and how to employ concurrent design strategies when planning new weapons acquisition programs.
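
    The statistical core of the study, two simple linear regressions of growth measures on a concurrency measure, reduces to a few lines; the numbers below are invented placeholders to show the computation, not Selected Acquisition Report data.

```python
from scipy.stats import linregress

concurrency  = [0.10, 0.25, 0.40, 0.55, 0.70]   # e.g., overlap fraction per program (hypothetical)
cost_growth  = [0.22, 0.18, 0.25, 0.20, 0.23]   # placeholder values only
sched_growth = [0.05, 0.12, 0.18, 0.30, 0.41]

for name, y in [("cost", cost_growth), ("schedule", sched_growth)]:
    fit = linregress(concurrency, y)             # one regression per outcome
    print(f"{name}: slope={fit.slope:.3f}, r^2={fit.rvalue**2:.3f}, p={fit.pvalue:.3f}")
```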

  16. DeMAID/GA an Enhanced Design Manager's Aid for Intelligent Decomposition

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1996-01-01

    Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One tool is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial public release of DeMAID in 1989, much research has been done in the areas of decomposition, concurrent engineering, parallel processing, and process management; many new tools and techniques have emerged. Based on these recent research and development efforts, numerous enhancements have been added to DeMAID to further aid the design manager in saving both cost and time in a design cycle. The key enhancement, a genetic algorithm (GA), will be available in the next public release called DeMAID/GA. The GA sequences the design processes to minimize the cost and time in converging a solution. The major enhancements in the upgrade of DeMAID to DeMAID/GA are discussed in this paper. A sample conceptual design project is used to show how these enhancements can be applied to improve the design cycle.
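
    A toy sketch of the GA sequencing idea: order design tasks so that as few couplings as possible point backward (feedback marks above the diagonal of a design structure matrix). The coupling set and the keep-best-and-mutate loop are illustrative, not DeMAID/GA itself.

```python
import random

COUPLING = {(0, 1), (1, 2), (2, 0), (3, 1), (2, 3)}   # hypothetical: task i feeds task j

def feedbacks(order):
    pos = {task: k for k, task in enumerate(order)}
    return sum(1 for i, j in COUPLING if pos[i] > pos[j])   # j scheduled before its input i

def mutate(order):
    a, b = random.sample(range(len(order)), 2)        # swap two positions
    child = order[:]
    child[a], child[b] = child[b], child[a]
    return child

random.seed(1)
pop = [random.sample(range(4), 4) for _ in range(20)] # initial population of sequences
for _ in range(200):                                  # evolve: keep the best, mutate the rest
    pop.sort(key=feedbacks)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]
print("best order:", pop[0], "feedbacks:", feedbacks(pop[0]))
```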

  17. Regenerative life support system research

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report contains sections on modeling, experimental activities during the grant period, and topics under consideration for the future. The sections contain discussions of: four concurrent modeling approaches that were being integrated near the end of the period (knowledge-based modeling support infrastructure and database management, object-oriented steady-state simulations for three concepts, steady-state mass-balance engineering tradeoff studies, and object-oriented time-step, quasidynamic simulations of generic concepts); interdisciplinary research activities, beginning with a discussion of RECON lab development and use, followed by discussions of waste processing research, algae studies and subsystem modeling, low-pressure growth testing of plants, subsystem modeling of plants, control of plant growth using lighting and CO2 supply as variables, the search for and development of lunar soil simulants, preliminary design parameters for a lunar base life support system, and research considerations for food processing in space; and appendix materials, including a discussion of the CELSS Conference, detailed analytical equations for mass-balance modeling, plant modeling equations, and parametric data on existing life support systems for use in modeling.

  18. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol is suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development. This is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.

  19. Developing Concurrency Messages for the Black Community in Seattle, Washington

    PubMed Central

    Chapman, Caitlin Hughes; Clad, Rachel; Murray, Kate; Foster, Jennifer; Morris, Martina; Parks, Malcolm R.; Kurth, Ann Elizabeth

    2013-01-01

    In the United States, Blacks are disproportionately impacted by HIV/AIDS. Sexual networks and concurrent relationships have emerged as important contributors to the heterosexual transmission of HIV. To date, Africa is the only continent where an understanding of the impact of sexual concurrency has been conveyed in HIV prevention messaging. This project was developed by researchers and members of the Seattle, WA African American and African-Born communities, using the principles of community-based participatory research (CBPR). Interest in developing concurrency messaging came from the community and resulted in the successful submission of a community-academic partnership proposal to develop and disseminate HIV prevention messaging around concurrency. We describe: (a) the development of concurrency messaging through the integration of collected formative data and findings from the scientific literature; (b) the process of disseminating the message in the local Black community; and (c) important factors to consider in the development of similar campaigns. PMID:23206202

  20. Concurrent working memory load can facilitate selective attention: evidence for specialized load.

    PubMed

    Park, Soojin; Kim, Min-Shik; Chun, Marvin M

    2007-10-01

    Load theory predicts that concurrent working memory load impairs selective attention and increases distractor interference (N. Lavie, A. Hirst, J. W. de Fockert, & E. Viding). Here, the authors present new evidence that the type of concurrent working memory load determines whether load impairs selective attention or not. Working memory load was paired with a same/different matching task that required focusing on targets while ignoring distractors. When working memory items shared the same limited-capacity processing mechanisms with targets in the matching task, distractor interference increased. However, when working memory items shared processing with distractors in the matching task, distractor interference decreased, facilitating target selection. A specialized load account is proposed to describe the dissociable effects of working memory load on selective processing depending on whether the load overlaps with targets or with distractors. (c) 2007 APA

  1. Area 2. Use Of Engineered Nanoparticle-Stabilized CO2 Foams To Improve Volumetric Sweep Of CO2 EOR Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCarlo, David; Huh, Chun; Johnston, Keith P.

    2015-01-31

    The goal of this project was to develop a new CO2 injection enhanced oil recovery (CO2-EOR) process using engineered nanoparticles with optimized surface coatings that has better volumetric sweep efficiency and a wider application range than conventional CO2-EOR processes. The main objectives of this project were to (1) identify the characteristics of the optimal nanoparticles that generate extremely stable CO2 foams in situ in reservoir regions without oil; (2) develop a novel method of mobility control using “self-guiding” foams with smart nanoparticles; and (3) extend the applicability of the new method to reservoirs having a wide range of salinity, temperature, and heterogeneity. Concurrent with our experimental effort to understand the foam generation and transport processes and foam-induced mobility reduction, we also developed mathematical models to explain the underlying processes and mechanisms that govern the fate of nanoparticle-stabilized CO2 foams in porous media and applied these models to (1) simulate the results of foam generation and transport experiments conducted in beadpack and sandstone core systems, (2) analyze CO2 injection data received from a field operator, and (3) aid with the design of a foam injection pilot test. Our simulator is applicable to near-injection-well field-scale foam injection problems and accounts for the effects of layered heterogeneity in the permeability field, foam-stabilizing agents, oil presence, and shear thinning on the generation and transport of nanoparticle-stabilized C/W foams. This report presents the details of our experimental and numerical modeling work and outlines the highlights of our findings.

  2. Computer Program Re-layers Engineering Drawings

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  3. Human transinformation rates during one-to-four axis tracking with a concurrent audio task

    NASA Technical Reports Server (NTRS)

    Baty, D. L.

    1972-01-01

    The information processing rates of six subjects performing one-, two-, three-, and four-axis compensatory tracking tasks, with and without a concurrent four-choice auditory task, were determined. The purpose was to obtain further evidence concerning the nature of a hypothesized ceiling on human transinformation rates. Interference was found among tasks, but the evidence concerning a ceiling on information processing rates was inconclusive.

  4. The effects of concurrent cognitive load on phonological processing in adults who stutter.

    PubMed

    Jones, Robin M; Fox, Robert A; Jacewicz, Ewa

    2012-12-01

    To determine whether phonological processing in adults who stutter (AWS) is disrupted by increased amounts of cognitive load in a concurrent attention-demanding task. Nine AWS and 9 adults who do not stutter (AWNS) participated. Using a dual-task paradigm, the authors presented word pairs for rhyme judgments and, concurrently, letter strings for memory recall. The rhyme judgment task manipulated rhyming type (rhyming/nonrhyming) and orthographic representation (similar/dissimilar). The memory recall task varied stimulus complexity (no letters, 3 letters, 5 letters). Rhyme judgment accuracy and reaction time (RT) were used to assess phonological processing, and letter recall accuracy was used to measure memory recall. For rhyme judgments, AWS were as accurate as AWNS, and the increase in the cognitive load did not affect rhyme judgment accuracy of either group. Significant group differences were found in RTs (delays by AWS were 241 ms greater). RTs of AWS were also slower in the most demanding rhyme condition and varied with the complexity of the memory task. Accuracy of letter recall of AWS was comparatively worse in the most demanding 5-letter condition. Phonological and cognitive processing of AWS is more vulnerable to disruptions caused by increased amounts of cognitive load in concurrent attention-demanding tasks.

  5. Coarse-grained component concurrency in Earth system modeling: parallelizing atmospheric radiative transfer in the GFDL AM3 model using the Flexible Modeling System coupling framework

    NASA Astrophysics Data System (ADS)

    Balaji, V.; Benson, Rusty; Wyman, Bruce; Held, Isaac

    2016-10-01

    Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by. We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath. We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models. We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
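
    The scheduling idea can be sketched as follows, with radiation submitted concurrently with a composite "rest of atmosphere" component and merged at the coupling step; the physics stand-ins are trivial placeholders, not FMS code.

```python
from concurrent.futures import ThreadPoolExecutor

def radiative_transfer(state):
    return {"heating_rate": 0.01 * state["T"]}        # placeholder physics

def rest_of_atmosphere(state):
    return {"T": state["T"] + state["dyn_tend"]}      # dynamics plus all other physics

state = {"T": 288.0, "dyn_tend": 0.2}
with ThreadPoolExecutor(max_workers=2) as pool:
    rad = pool.submit(radiative_transfer, state)      # radiation uses the prior step's state
    rest = pool.submit(rest_of_atmosphere, state)     # runs concurrently with radiation
    new_state = {**state, **rest.result()}
    new_state["T"] += rad.result()["heating_rate"]    # apply radiation at the coupling step
print(new_state)
```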

  6. Metric integration architecture for product development

    NASA Astrophysics Data System (ADS)

    Sieger, David B.

    1997-06-01

    Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations that are involved, they provide no guidance facilities. The significance of this feature, and the role it plays in the time required to develop products, is increasing in importance due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, discrete event system specification (DEVS), and multidimensional state variable based metrics. This approach is unique in its capability to quantify designer's actions throughout product development, provide recommendations about subsequent activity selection, and coordinate distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision making process.

  7. Modeling of dialogue regimes of distance robot control

    NASA Astrophysics Data System (ADS)

    Larkin, E. V.; Privalov, A. N.

    2017-02-01

    The process of distance control of mobile robots is investigated. A Petri-Markov net for modeling the dialogue regime is worked out. It is shown that the sequence of operations of the following subjects: a human operator, a dialogue computer, and an onboard computer, may be simulated using the theory of semi-Markov processes. From the semi-Markov process of the general form, a Markov process was obtained that includes only states of transaction generation. It is shown that a real transaction flow is the result of "concurrency" in the states of the Markov process. An iteration procedure for evaluating transaction flow parameters, which takes the effect of "concurrency" into account, is proposed.
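
    A Monte Carlo sketch of the "concurrency" effect: each subject generates its next transaction after a random sojourn time, and the observed flow is the sequence of winners of that race. Exponential sojourn times and the rates below are assumptions for illustration; the paper's semi-Markov model admits general distributions.

```python
import random

RATES = {"operator": 0.5, "dialogue_pc": 2.0, "onboard_pc": 4.0}  # assumed transactions per second

def next_event(clock):
    # winner of the race among the concurrent subjects' sojourn times
    waits = {s: random.expovariate(r) for s, r in RATES.items()}
    src = min(waits, key=waits.get)
    return clock + waits[src], src

random.seed(0)
clock, counts = 0.0, {s: 0 for s in RATES}
while clock < 1000.0:
    clock, src = next_event(clock)
    counts[src] += 1
print({s: n / clock for s, n in counts.items()})   # empirical transaction flow rates
```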

  8. NASA Planetary Science Summer School: Longitudinal Study

    NASA Astrophysics Data System (ADS)

    Giron, Jennie M.; Sohus, A.

    2006-12-01

    NASA’s Planetary Science Summer School is a program designed to prepare the next generation of scientists and engineers to participate in future missions of solar system exploration. The opportunity is advertised to science and engineering post-doctoral and graduate students with a strong interest in careers in planetary exploration. Preference is given to U.S. citizens. The “school” consists of a one-week intensive team exercise learning the process of developing a robotic mission concept into reality through concurrent engineering, working with JPL’s Advanced Project Design Team (Team X). This program benefits the students by providing them with skills, knowledge and the experience of collaborating with a concept mission design. A longitudinal study was conducted to assess the impact of the program on the past participants of the program. Data collected included their current contact information, if they are currently part of the planetary exploration community, if participation in the program contributed to any career choices, if the program benefited their career paths, etc. Approximately 37% of 250 past participants responded to the online survey. Of these, 83% indicated that they are actively involved in planetary exploration or aerospace in general; 78% said they had been able to apply what they learned in the program to their current job or professional career; 100% said they would recommend this program to a colleague.

  9. Concurrent micromechanical tailoring and fabrication process optimization for metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, Christos C.

    1990-01-01

    A method is presented to minimize the residual matrix stresses in metal matrix composites. Fabrication parameters such as temperature and consolidation pressure are optimized concurrently with the characteristics (i.e., modulus, coefficient of thermal expansion, strength, and interphase thickness) of a fiber-matrix interphase. By including the interphase properties in the fabrication process, lower residual stresses are achievable. Results for an ultra-high modulus graphite (P100)/copper composite show a reduction of 21 percent for the maximum matrix microstress when optimizing the fabrication process alone. Concurrent optimization of the fabrication process and interphase properties show a 41 percent decrease in the maximum microstress. Therefore, this optimization method demonstrates the capability of reducing residual microstresses by altering the temperature and consolidation pressure histories and tailoring the interphase properties for an improved composite material. In addition, the results indicate that the consolidation pressures are the most important fabrication parameters, and the coefficient of thermal expansion is the most critical interphase property.

  10. The emotional startle effect is disrupted by a concurrent working memory task.

    PubMed

    King, Rosemary; Schaefer, Alexandre

    2011-02-01

    Working memory (WM) processes are often thought to play an important role in the cognitive regulation of negative emotions. However, little is known about how they influence emotional processing. We report two experiments that tested whether a concurrent working memory task could modulate the emotional startle eyeblink effect, a well-known index of emotional processing. In both experiments, emotionally negative and neutral pictures were viewed in two conditions: a "cognitive load" (CL) condition, in which participants had to actively maintain information in working memory (WM) while viewing the pictures, and a control "no load" (NL) condition. Picture-viewing instructions were identical across CL and NL. In both experiments, results showed a significant reduction of the emotional modulation of the startle eyeblink reflex in the CL condition compared to the NL condition. These findings suggest that a concurrent WM task disrupts emotional processing even when participants are directing visual focus on emotionally relevant information. Copyright © 2010 Society for Psychophysiological Research.

  11. Interaction of attentional and motor control processes in handwriting.

    PubMed

    Brown, T L; Donnenwirth, E E

    1990-01-01

    The interaction between attentional capacity, motor control processes, and strategic adaptations to changing task demands was investigated in handwriting, a continuous (rather than discrete) skilled performance. Twenty-four subjects completed 12 two-minute handwriting samples under instructions stressing speeded handwriting, normal handwriting, or highly legible handwriting. For half of the writing samples, a concurrent auditory monitoring task was imposed. Subjects copied either familiar (English) or unfamiliar (Latin) passages. Writing speed, legibility ratings, errors in writing and in the secondary auditory task, and a derived measure of the average number of characters held in short-term memory during each sample ("planning unit size") were the dependent variables. The results indicated that the ability to adapt to instructions stressing speed or legibility was substantially constrained by the concurrent listening task and by text familiarity. Interactions between instructions, task concurrence, and text familiarity in the legibility ratings, combined with further analyses of planning unit size, indicated that information throughput from temporary storage mechanisms to motor processes mediated the loss of flexibility effect. Overall, the results suggest that strategic adaptations of a skilled performance to changing task circumstances are sensitive to concurrent attentional demands and that departures from "normal" or "modal" performance require attention.

  12. Geo-information processing service composition for concurrent tasks: A QoS-aware game theory approach

    NASA Astrophysics Data System (ADS)

    Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong

    2012-10-01

    Remote sensing applications such as disaster rapid response are typically characterized by concurrent tasks. The existing approach to composing geographical information processing service chains searches for an optimal solution for each chain in what can be deemed a "selfish" way. This leads to conflicts among concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model to analyse the competitive relationships between tasks is proposed. A best response function is used to ensure that each task maintains utility optimisation by considering the composition strategies of other tasks and quantifying conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and maximise the utility of all tasks under concurrent task conditions. Theoretical analyses and experiments showed that the newly proposed method, compared to existing service composition methods, has better practical utility for all tasks.
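
    The best-response iteration can be illustrated with a two-task, two-service toy in which a shared service becomes congested; the payoff numbers are invented, and the sequential-update scheme is one simple way to reach a fixed point (Nash equilibrium), not the paper's QoS model.

```python
def cost(choice, other):
    base = {"svc_A": 1.0, "svc_B": 1.5}[choice]      # invented base QoS costs
    return base * (2.0 if choice == other else 1.0)  # sharing a service doubles its cost

choices = ["svc_A", "svc_A"]                         # both tasks start "selfishly" on the cheap service
for _ in range(10):
    changed = False
    for i in (0, 1):                                 # sequential best responses
        best = min(("svc_A", "svc_B"), key=lambda c: cost(c, choices[1 - i]))
        if best != choices[i]:
            choices[i], changed = best, True
    if not changed:                                  # fixed point reached: Nash equilibrium
        break
print("equilibrium assignment:", choices)            # the tasks end up on different services
```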

  13. The multiple subfunctions of attention: differential developmental gateways to literacy and numeracy.

    PubMed

    Steele, Ann; Karmiloff-Smith, Annette; Cornish, Kim; Scerif, Gaia

    2012-11-01

    Attention is construed as multicomponential, but the roles of its distinct subfunctions in shaping the broader developing cognitive landscape are poorly understood. The current study assessed 3- to 6-year-olds (N=83) to: (a) trace developmental trajectories of attentional processes and their structure in early childhood and (b) measure the impact of distinct attention subfunctions on concurrent and longitudinal abilities related to literacy and numeracy. Distinct trajectories across attention measures revealed the emergence of 2 attentional factors, encompassing "executive" and "sustained-selective" processes. Executive attention predicted concurrent abilities across domains at Time 1, whereas sustained-selective attention predicted basic numeracy 1 year later. These concurrent and longitudinal constraints cast a broader light on the unfolding relations between domain-general and domain-specific processes over early childhood. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.

  14. Multitasking-Pascal extensions solve concurrency problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackie, P.H.

    1982-09-29

    To avoid deadlock (one process waiting for a resource that another process can't release) and indefinite postponement (one process being continually denied a resource request) in a multitasking-system application, it is possible to use a high-level development language with built-in concurrency handlers. Parallel Pascal is one such language; it extends standard Pascal via special task synchronizers: a new data type called signal, new system procedures called wait and send, and a Boolean function termed awaited. To illustrate the language's use, the author examines the problems it helps solve.
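
    A rough analogue of the signal/wait/send/awaited synchronizers, built on a condition variable; the semantics here are simplified for illustration and are not a faithful port of Parallel Pascal.

```python
import threading

class Signal:
    """Rough analogue of Parallel Pascal's signal type (simplified semantics)."""
    def __init__(self):
        self._cond = threading.Condition()
        self._waiters = 0

    def wait(self):                       # block the calling task until a send
        with self._cond:
            self._waiters += 1
            self._cond.wait()
            self._waiters -= 1

    def send(self):                       # release one waiting task
        with self._cond:
            self._cond.notify()

    def awaited(self):                    # Boolean: is any task currently waiting?
        with self._cond:
            return self._waiters > 0

ready = Signal()
worker = threading.Thread(target=lambda: (ready.wait(), print("resource granted")))
worker.start()
while not ready.awaited():                # crude spin, just for the sketch
    pass
ready.send()
worker.join()
```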

  15. Enabling communication concurrency through flexible MPI endpoints

    DOE PAGES

    Dinan, James; Grant, Ryan E.; Balaji, Pavan; ...

    2014-09-23

    MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. Our paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Also, endpoints enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. Furthermore, these characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.
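
    For contrast, the sketch below shows the pre-endpoints baseline: threads within one MPI process share a single rank and must disambiguate their traffic by tag. It assumes an MPI build providing MPI_THREAD_MULTIPLE and the mpi4py bindings, and should be launched with at least two ranks.

```python
# run with e.g.: mpiexec -n 2 python endpoints_baseline.py
import threading
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
assert MPI.Query_thread() == MPI.THREAD_MULTIPLE, "MPI_THREAD_MULTIPLE required"

def thread_comm(tid):
    peer = (rank + 1) % size                          # ring exchange between ranks
    comm.send(f"hello from rank {rank}, thread {tid}", dest=peer, tag=tid)
    print(comm.recv(source=MPI.ANY_SOURCE, tag=tid))  # the tag stands in for an endpoint

threads = [threading.Thread(target=thread_comm, args=(t,)) for t in range(2)]
for t in threads: t.start()
for t in threads: t.join()
```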

  18. NSTX-U Advances in Real-Time C++11 on Linux

    NASA Astrophysics Data System (ADS)

    Erickson, Keith G.

    2015-08-01

    Programming languages like C and Ada combined with proprietary embedded operating systems have dominated the real-time application space for decades. The new C++11 standard includes native, language-level support for concurrency, a required feature for any nontrivial event-oriented real-time software. Threads, Locks, and Atomics now exist to provide the necessary tools to build the structures that make up the foundation of a complex real-time system. The National Spherical Torus Experiment Upgrade (NSTX-U) at the Princeton Plasma Physics Laboratory (PPPL) is breaking new ground with the language as applied to the needs of fusion devices. A new Digital Coil Protection System (DCPS) will serve as the main protection mechanism for the magnetic coils, and it is written entirely in C++11 running on Concurrent Computer Corporation's real-time operating system, RedHawk Linux. It runs over 600 algorithms in a 5 kHz control loop that determine whether or not to shut down operations before physical damage occurs. To accomplish this, NSTX-U engineers developed software tools that do not currently exist elsewhere, including real-time atomic synchronization, real-time containers, and a real-time logging framework. Together with a recent (and carefully configured) version of the GCC compiler, these tools enable data acquisition, processing, and output using a conventional operating system to meet a hard real-time deadline (that is, missing one period is a failure) of 200 microseconds.
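
    The "missing one period is a failure" rule can be illustrated, in Python and with a deliberately relaxed period since a desktop interpreter cannot hold 200 microseconds, by a periodic loop that raises on any deadline miss; this is a concept sketch, not the C++11 DCPS.

```python
import time

PERIOD = 0.05                        # 50 ms here; the DCPS period is 200 microseconds
deadline = time.perf_counter() + PERIOD
for cycle in range(100):
    # ... protection algorithms would run here ...
    now = time.perf_counter()
    if now > deadline:               # hard real-time rule: one late cycle is a failure
        raise RuntimeError(f"deadline miss on cycle {cycle}: {now - deadline:.6f} s late")
    time.sleep(deadline - now)       # sleep out the remainder of the period
    deadline += PERIOD
print("all cycles met their deadlines")
```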

  19. From Desktop to Teraflop: Exploiting the U.S. Lead in High Performance Computing. NSF Blue Ribbon Panel on High Performance Computing.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while boosting the American economy as business firms learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…

  20. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    § 165.768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  1. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    § 165.768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  2. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    § 165.768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  3. 33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    § 165.768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...

  4. Proceedings of the Department of Defense Environmental Technology Workshop

    DTIC Science & Technology

    1995-05-01

    Fabrication Laboratory Results in Waste Elimination (William J. Kelso, Parsons Engineering Science, Inc.; Susan H. Errett, Lt. Col. Ronald D. Fancher)... Williams, Ocean City Research Corporation... NDCEE Reduces Risk in Technology Transfer (Jack H. Cavanaugh, Concurrent...)... Ecological Receptors (William R. Alsop, Mark E. Stelljes, Elizabeth T. Hawkins, Harding Lawson Associates; William Collins, U.S. Department of the Army)

  5. Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report

    DTIC Science & Technology

    1995-06-01

    technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge ...resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that

  6. A Primer for DoD Reliability, Maintainability and Safety Standards

    DTIC Science & Technology

    1988-03-02

    the project engineer and the concurrence of their respective managers. The primary consideration in such cases is the thoroughness of the ...basic approaches to the application of environmental stress screening. In one approach, the government explicitly specifies the screens and screening...TO USE DOD-HDBK-344 (USAF) There are two basic approaches to the application of environmental stress

  7. 76 FR 31958 - Information Collection Being Submitted for Review and Approval to the Office of Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... enhance the quality, utility, and clarity of the information collected; (d) ways to minimize the burden of... frequencies. Section 90.545(c)(1) requires that public safety applicants select one of three ways to meet TV... engineering study to justify other separations; or (3) obtain concurrence from the applicable TV/DTV station(s...

  8. A Model-based Approach to Reactive Self-Configuring Systems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Nayak, P. Pandurang

    1996-01-01

    This paper describes Livingstone, an implemented kernel for a self-reconfiguring autonomous system, that is reactive and uses component-based declarative models. The paper presents a formal characterization of the representation formalism used in Livingstone, and reports on our experience with the implementation in a variety of domains. Livingstone's representation formalism achieves broad coverage of hybrid software/hardware systems by coupling the concurrent transition system models underlying concurrent reactive languages with the discrete qualitative representations developed in model-based reasoning. We achieve a reactive system that performs significant deductions in the sense/response loop by drawing on our past experience at building fast propositional conflict-based algorithms for model-based diagnosis, and by framing a model-based configuration manager as a propositional, conflict-based feedback controller that generates focused, optimal responses. Livingstone automates all these tasks using a single model and a single core deductive engine, thus making significant progress towards achieving a central goal of model-based reasoning. Livingstone, together with the HSTS planning and scheduling engine and the RAPS executive, has been selected as the core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.
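
    A tiny consistency-based diagnosis sketch in the spirit of propositional, conflict-based model-based reasoning: components assumed healthy must be consistent with observations, and minimal sets of faulty components explain the discrepancy. The two-valve model is invented, and Livingstone's engine is far richer.

```python
from itertools import combinations

def predict(broken):
    """Hypothetical two-valve fuel line: flow passes only if both valves work."""
    v1 = "valve1" not in broken
    v2 = "valve2" not in broken
    return v1 and v2                       # predicted thrust present?

OBSERVED_THRUST = False                    # observation contradicts the all-OK assumption
COMPONENTS = ["valve1", "valve2"]

# candidate diagnoses: fault sets whose predictions match the observation
diagnoses = [set(c) for k in range(len(COMPONENTS) + 1)
             for c in combinations(COMPONENTS, k)
             if predict(set(c)) == OBSERVED_THRUST]
minimal = [d for d in diagnoses if not any(o < d for o in diagnoses)]
print("minimal diagnoses:", minimal)       # here: {valve1} or {valve2}
```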

  9. Engineering S. equi subsp. zooepidemicus towards concurrent production of hyaluronic acid and chondroitin biopolymers of biomedical interest.

    PubMed

    Cimini, Donatella; Iacono, Ileana Dello; Carlino, Elisabetta; Finamore, Rosario; Restaino, Odile F; Diana, Paola; Bedini, Emiliano; Schiraldi, Chiara

    2017-12-01

    Glycosaminoglycans such as hyaluronic acid and chondroitin sulphate are increasingly required not only as main ingredients in cosmeceutical and nutraceutical preparations but also as active principles in medical devices and pharmaceutical products. However, while biotechnological production of hyaluronic acid is industrially established through fermentation of Streptococcus spp. and, recently, Bacillus subtilis, biotechnological chondroitin is not yet on the market. A non-hemolytic and hyaluronidase-negative S. equi subsp. zooepidemicus mutant strain was engineered in this work by the addition of two E. coli K4 genes, namely kfoA and kfoC, involved in the biosynthesis of a chondroitin-like polysaccharide. Chondroitin is the precursor of chondroitin sulphate, a nutraceutical present on the market as an anti-arthritic drug that is lately being studied for its intrinsic bioactivity. In small-scale bioreactor batch experiments, the production of about 1.46 ± 0.38 g/L hyaluronic acid and 300 ± 28 mg/L of chondroitin, with average molecular weights of 1750 and 25 kDa, respectively, was demonstrated, providing an approach to the concurrent production of both biopolymers in a single fermentation.

  10. Functional assembly of engineered myocardium by electrical stimulation of cardiac myocytes cultured on scaffolds.

    PubMed

    Radisic, Milica; Park, Hyoungshin; Shing, Helen; Consi, Thomas; Schoen, Frederick J; Langer, Robert; Freed, Lisa E; Vunjak-Novakovic, Gordana

    2004-12-28

    The major challenge of tissue engineering is directing the cells to establish the physiological structure and function of the tissue being replaced across different hierarchical scales. To engineer myocardium, biophysical regulation of the cells needs to recapitulate multiple signals present in the native heart. We hypothesized that excitation-contraction coupling, critical for the development and function of a normal heart, determines the development and function of engineered myocardium. To induce synchronous contractions of cultured cardiac constructs, we applied electrical signals designed to mimic those in the native heart. Over only 8 days in vitro, electrical field stimulation induced cell alignment and coupling, increased the amplitude of synchronous construct contractions by a factor of 7, and resulted in a remarkable level of ultrastructural organization. Development of conductive and contractile properties of cardiac constructs was concurrent, with strong dependence on the initiation and duration of electrical stimulation.

  11. Europa Explorer Operational Scenarios Development

    NASA Technical Reports Server (NTRS)

    Lock, Robert E.; Pappalardo, Robert T.; Clark, Karla B.

    2008-01-01

    In 2007, NASA conducted four advanced mission concept studies for outer planets targets: Europa, Ganymede, Titan and Enceladus. The studies were conducted in close cooperation with the planetary science community. Of the four, the Europa Explorer Concept Study focused on refining mission options, science trades and implementation details for a potential flagship mission to Europa in the 2015 timeframe. A science definition team (SDT) was appointed by NASA to guide the study. A JPL-led engineering team worked closely with the science team to address three major focus areas: 1) credible cost estimates, 2) rationale and logical discussion of radiation risk and mitigation approaches, and 3) better definition and exploration of the science operational scenario trade space. This paper addresses the methods and results of the collaborative process used to develop Europa Explorer operations scenarios. Working in concert with the SDT, and in parallel with the SDT's development of a science value matrix, the science and engineering members of the team challenged key mission capabilities and constraints. Science goals were advanced and options were considered for observation scenarios. Data collection and return strategies were tested via simulation, and mission performance was estimated and balanced against flight and ground system resources and science priorities. The key to this successful collaboration was a concurrent development environment in which all stakeholders could rapidly assess the feasibility of strategies for their success in the full system context. Issues of science and instrument compatibility, system constraints, and mission opportunities were treated analytically and objectively, leading to complementary strategies for observation and data return. Current plans are that this approach, as part of the systems engineering process, will continue as the Europa Explorer Concept Study moves toward becoming a development project.

  12. Ares I-X Roll Control System Development

    NASA Technical Reports Server (NTRS)

    Unger, Ronald J.; Massey, Edmund C.

    2009-01-01

    Project managers often face challenging technical, schedule and budget issues. This presentation explores how the Ares I-X Roll Control System Integrated Product Team (IPT) mitigated challenges such as concurrent engineering requirements and environments, and evolving program processes, while successfully managing an aggressive project schedule and tight budget. IPT challenges also included communications and negotiations among and within government agencies, including the US Air Force, NASA/MSFC Propulsion Engineering, LaRC, GRC, KSC, WSTF, and the Constellation Program. In order to meet these challenges it was essential that the IPT define the items that most affected the schedule critical path, define early mitigation strategies to reduce technical, schedule, and budget risks, and maintain the end-product focus of an "unmanned test flight" context for the flight hardware. The makeup of the IPT and how it would function were also important considerations. The IPT consisted of NASA/MSFC (project management, engineering, and safety/quality) and contractors (Teledyne Brown Engineering and Pratt & Whitney Rocketdyne, who supplied heritage hardware experience). The early decision to have a small, focused IPT working "badgelessly" across functional lines to eliminate functional stove-piping allowed many more tasks to be done by fewer people. It also enhanced a sense of ownership of the products, while still allowing members to revert to traditional roles in order to provide the required technical independence in design reviews and verification closures. This presentation highlights several prominent issues, discusses how they were mitigated, and presents the resulting lessons learned that might benefit other projects.

  13. ARTEMIS: a collaborative framework for health care.

    PubMed

    Reddy, R; Jagannathan, V; Srinivas, K; Karinthi, R; Reddy, S M; Gollapudy, C; Friedman, S

    1993-01-01

    Patient-centered healthcare delivery is an inherently collaborative process. It involves a wide range of individuals and organizations with diverse perspectives: primary care physicians, hospital administrators, labs, clinics, and insurers. The key to cost reduction and quality improvement in health care is effective management of this collaborative process. The use of multi-media collaboration technology can facilitate timely delivery of patient care and reduce cost at the same time. During the last five years, the Concurrent Engineering Research Center (CERC), under the sponsorship of DARPA (Defense Advanced Research Projects Agency, recently renamed ARPA), developed a number of generic key subsystems of a comprehensive collaboration environment. These subsystems are intended to overcome the barriers that inhibit the collaborative process. Three subsystems developed under this program are: MONET (Meeting On the Net), to provide consultation over a computer network; ISS (Information Sharing Server), to provide access to multi-media information; and PCB (Project Coordination Board), to better coordinate focused activities. These systems have been integrated into an open environment to enable collaborative processes. This environment is being used to create a wide-area (geographically distributed) research testbed under DARPA sponsorship, ARTEMIS (Advanced Research Testbed for Medical Informatics), to explore collaborative health care processes. We believe this technology will play a key role in the current national thrust to reengineer the present health-care delivery system.

  14. Program Correctness, Verification and Testing for Exascale (Corvette)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Koushik; Iancu, Costin; Demmel, James W

    The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large-scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low-overhead, automated, and precise detection of concurrency bugs at scale. 2. Using low-overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.
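
    For readers unfamiliar with the bug class targeted here, the following toy Python sketch (not one of the project's tools) exhibits the kind of lost-update data race such detectors hunt for:

      import threading

      counter = 0

      def worker():
          global counter
          for _ in range(100_000):
              tmp = counter      # read
              counter = tmp + 1  # write: a thread switch here loses increments

      threads = [threading.Thread(target=worker) for _ in range(4)]
      for t in threads: t.start()
      for t in threads: t.join()
      print(counter)  # often < 400000: a lost-update race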

  15. An algebra of discrete event processes

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Meyer, George

    1991-01-01

    This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.
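
    A flavor of the CSP-style concurrency surveyed in the first part can be given in a few lines. The following is a minimal sketch (an invented example, not the report's formal algebra) of synchronous parallel composition of two labelled transition systems that must agree on shared events:

      def compose(ts1, ts2, shared, init=("s0", "t0")):
          # ts: dict mapping state -> {event: next_state}
          states, trans = {init}, {}
          frontier = [init]
          while frontier:
              s, t = frontier.pop()
              moves = {}
              for e, s2 in ts1.get(s, {}).items():
                  if e in shared:
                      if e in ts2.get(t, {}):   # both sides take shared events
                          moves[e] = (s2, ts2[t][e])
                  else:
                      moves[e] = (s2, t)        # ts1 moves alone
              for e, t2 in ts2.get(t, {}).items():
                  if e not in shared:
                      moves[e] = (s, t2)        # ts2 moves alone
              trans[(s, t)] = moves
              for nxt in moves.values():
                  if nxt not in states:
                      states.add(nxt)
                      frontier.append(nxt)
          return trans

      machine = {"s0": {"coin": "s1"}, "s1": {"coffee": "s0"}}
      user = {"t0": {"coin": "t1"}, "t1": {"coffee": "t0"}}
      print(compose(machine, user, shared={"coin", "coffee"}))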

  16. The Caltech Concurrent Computation Program - Project description

    NASA Technical Reports Server (NTRS)

    Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.

    1985-01-01

    The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines were constructed. A major goal of the program is to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory, and multigigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high energy physics and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.

  17. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
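
    The following is a minimal sketch of the general encoding idea (a toy lost-update example using the z3 SMT solver's Python bindings, which is an assumption on my part; it is not the paper's MCAPI encoding). Schedule variables order the operations, and the solver searches for an interleaving that reaches a bad final state:

      from z3 import Int, Ints, Solver, Distinct, And, Implies, sat

      # Shared variable x over five memory states; two threads each do a
      # read-modify-write: rdN reads x into rN, wrN writes rN + 1 back.
      x = [Int(f"x{i}") for i in range(5)]
      r1, r2 = Ints("r1 r2")
      o = {name: Int(f"o_{name}") for name in ["rd1", "wr1", "rd2", "wr2"]}

      s = Solver()
      s.add([And(0 <= v, v <= 3) for v in o.values()])
      s.add(Distinct(*o.values()))                     # one op per time slot
      s.add(o["rd1"] < o["wr1"], o["rd2"] < o["wr2"])  # program order
      s.add(x[0] == 0)

      effect = {"rd1": lambda t: And(r1 == x[t], x[t + 1] == x[t]),
                "wr1": lambda t: x[t + 1] == r1 + 1,
                "rd2": lambda t: And(r2 == x[t], x[t + 1] == x[t]),
                "wr2": lambda t: x[t + 1] == r2 + 1}
      for name, var in o.items():
          for t in range(4):
              s.add(Implies(var == t, effect[name](t)))

      s.add(x[4] == 1)  # "bad" final state: one increment was lost
      if s.check() == sat:
          m = s.model()
          print({n: m[v].as_long() for n, v in o.items()})  # a racy schedule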

  18. Selective impairment of auditory selective attention under concurrent cognitive load.

    PubMed

    Dittrich, Kerstin; Stahl, Christoph

    2012-06-01

    Load theory predicts that concurrent cognitive load impairs selective attention. For visual stimuli, it has been shown that this impairment can be selective: Distraction was specifically increased when the stimulus material used in the cognitive load task matches that of the selective attention task. Here, we report four experiments that demonstrate such selective load effects for auditory selective attention. The effect of two different cognitive load tasks on two different auditory Stroop tasks was examined, and selective load effects were observed: Interference in a nonverbal-auditory Stroop task was increased under concurrent nonverbal-auditory cognitive load (compared with a no-load condition), but not under concurrent verbal-auditory cognitive load. By contrast, interference in a verbal-auditory Stroop task was increased under concurrent verbal-auditory cognitive load but not under nonverbal-auditory cognitive load. This double-dissociation pattern suggests the existence of different and separable verbal and nonverbal processing resources in the auditory domain.

  19. A Concurrent Multiple Negotiation Protocol Based on Colored Petri Nets.

    PubMed

    Niu, Lei; Ren, Fenghui; Zhang, Minjie; Bai, Quan

    2017-11-01

    Concurrent multiple negotiation (CMN) provides a mechanism for an agent to simultaneously conduct more than one negotiation. Different interdependency relationships may exist among these negotiations, and these relationships can affect their outcomes. The outcomes of the concurrent negotiations together contribute to the agent's overall negotiation goal. Handling a CMN while considering interdependency relationships among multiple negotiations is a challenging research problem. This paper: 1) comprehensively highlights research problems of negotiations at the concurrent negotiation level; 2) provides a graph-based CMN model that accounts for the interdependency relationships; and 3) proposes a colored Petri net-based negotiation protocol for conducting CMNs. With the proposed protocol, a CMN can be processed efficiently and concurrently, and negotiation agreements can be achieved efficiently. Experimental results indicate the effectiveness and efficiency of the proposed protocol in terms of negotiation success rate, negotiation time and negotiation outcome.
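
    A colored Petri net attaches data ("colors") to tokens and guards transitions on that data. A minimal sketch of a single guarded firing (an invented example, not the paper's protocol) might look like:

      from collections import defaultdict

      marking = defaultdict(list)           # place -> list of colored tokens
      marking["proposals"] = [{"issue": "price", "offer": 90},
                              {"issue": "price", "offer": 120}]

      def fire(marking, src, dst, guard, action):
          # Move one enabled token from src to dst, transforming its color.
          candidates = [tok for tok in marking[src] if guard(tok)]
          if not candidates:
              return False
          tok = candidates[0]
          marking[src].remove(tok)
          marking[dst].append(action(tok))
          return True

      # Guard: accept offers at or below 100; action: stamp the agreement.
      fire(marking, "proposals", "agreements",
           guard=lambda tok: tok["offer"] <= 100,
           action=lambda tok: {**tok, "status": "accepted"})
      print(dict(marking))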

  20. Concurrent Image Processing Executive (CIPE). Volume 1: Design overview

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1990-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are described. The target machine for this software is a JPL/Caltech Mark 3fp Hypercube hosted by a MASSCOMP 5600, a Sun-3, or a Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local-memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: user interface, host-resident executive, hypercube-resident executive, and application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented. The data management also allows data sharing among application programs. The CIPE software architecture provides a flexible environment for scientific analysis of complex remote sensing image data, such as planetary data and imaging spectrometry, utilizing state-of-the-art concurrent computation capabilities.

  1. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering, are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
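
    As a concrete example of the SPC toolkit mentioned above, a minimal X-bar control chart computation could look like the following (illustrative only, with invented data; production charts use tabulated constants such as A2 or c4 corrections rather than this simple estimator):

      import statistics

      subgroups = [[5.01, 4.99, 5.02], [4.98, 5.00, 5.01], [5.03, 5.02, 4.99]]
      xbars = [statistics.mean(g) for g in subgroups]
      grand_mean = statistics.mean(xbars)
      # Rough within-subgroup standard deviation (one simple estimator).
      sigma = statistics.mean(statistics.stdev(g) for g in subgroups)
      n = len(subgroups[0])
      ucl = grand_mean + 3 * sigma / n ** 0.5   # upper control limit
      lcl = grand_mean - 3 * sigma / n ** 0.5   # lower control limit
      print(f"center={grand_mean:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")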

  2. DARPA Initiative in Concurrent Engineering (DICE). Phase 2

    DTIC Science & Technology

    1990-07-31

    XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); Data... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: a prototype of the wrapper concepts... for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for...

  3. Implementing Set Based Design into Department of Defense Acquisition

    DTIC Science & Technology

    2016-12-01

    ...challenges for the DOD. This report identifies the original SBD principles and characteristics based on Toyota Motor Corporation's Set Based Concurrent Engineering Model. Additionally, the team reviewed DOD case studies that implemented SBD. The SBD principles, along with the common themes from the...

  4. Adaptive intensity modulated radiotherapy for advanced prostate cancer

    NASA Astrophysics Data System (ADS)

    Ludlum, Erica Marie

    The purpose of this research is to develop and evaluate improvements in intensity modulated radiotherapy (IMRT) for concurrent treatment of the prostate and pelvic lymph nodes. The first objective is to decrease delivery time while maintaining treatment quality, and to evaluate the effectiveness and efficiency of novel one-step optimization compared to conventional two-step optimization. Both planning methods are examined at multiple levels of complexity by comparing the number of beam apertures, or segments, the amount of radiation delivered as measured by monitor units (MUs), and delivery time. One-step optimization is demonstrated to simplify IMRT planning and to reduce segments (from 160 to 40), MUs (from 911 to 746), and delivery time (from 22 to 7 min) with comparable plan quality. The second objective is to examine the capability of three commercial dose calculation engines, employing different levels of accuracy and efficiency, to handle high-Z materials, such as metallic hip prostheses, included in the treatment field. Pencil beam, convolution superposition, and Monte Carlo dose calculation engines are compared by examining the dose differences for patient plans with unilateral and bilateral hip prostheses, and for phantom plans with a metal insert for comparison with film measurements. Convolution superposition and Monte Carlo methods calculate doses that are 1.3% and 34.5% less than the pencil beam method, respectively. Film results demonstrate that Monte Carlo most closely represents actual radiation delivery, but none of the three engines accurately predicts the dose distribution when high-Z heterogeneities exist in the treatment fields. The final objective is to improve the accuracy of IMRT delivery by accounting for independent organ motion during concurrent treatment of the prostate and pelvic lymph nodes. A leaf-shifting algorithm is developed to track daily prostate position without requiring online dose calculation. Compared to conventional methods of adjusting patient position, adjusting the multileaf collimator (MLC) leaves associated with the prostate in each segment significantly improves lymph node dose coverage (maintaining 45 Gy, compared to 42.7, 38.3, and 34.0 Gy for iso-shifts of 0.5, 1 and 1.5 cm). Altering the MLC portal shape is demonstrated as a new and effective solution to independent prostate movement during concurrent treatment.
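
    The leaf-shifting idea can be pictured with a small sketch (a deliberate simplification, not the dissertation's algorithm; the row indices and shift value are invented): shift only the leaf pairs covering the prostate by the measured daily displacement, leaving the nodal portion of each segment untouched.

      def shift_prostate_leaves(segment, prostate_rows, shift_cm):
          # segment: list of (left_edge_cm, right_edge_cm) per leaf-pair row
          return [(l + shift_cm, r + shift_cm) if row in prostate_rows else (l, r)
                  for row, (l, r) in enumerate(segment)]

      segment = [(-3.0, 3.0), (-2.5, 2.8), (-2.0, 2.0), (-4.0, 4.0)]
      print(shift_prostate_leaves(segment, prostate_rows={1, 2}, shift_cm=0.5))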

  5. Threaded Cognition: An Integrated Theory of Concurrent Multitasking

    ERIC Educational Resources Information Center

    Salvucci, Dario D.; Taatgen, Niels A.

    2008-01-01

    The authors propose the idea of threaded cognition, an integrated theory of concurrent multitasking--that is, performing 2 or more tasks at once. Threaded cognition posits that streams of thought can be represented as threads of processing coordinated by a serial procedural resource and executed across other available resources (e.g., perceptual…

  6. Aging and Concurrent Task Performance: Cognitive Demand and Motor Control

    ERIC Educational Resources Information Center

    Albinet, Cedric; Tomporowski, Phillip D.; Beasman, Kathryn

    2006-01-01

    A motor task requiring fine control of upper limb movements and a cognitive task requiring executive processing were performed by 18 young and 18 older adults--first separately and then concurrently. The motor task required participants to tap alternately on two targets, the sizes of which varied systematically. The…

  7. What Lies Beneath?: Verbal Report in Interlanguage Requests in English

    ERIC Educational Resources Information Center

    Woodfield, Helen

    2010-01-01

    The present study investigates the role of concurrent and retrospective verbal report in exploring the cognitive processes of six pairs of advanced ESL learners engaged in a written discourse completion task eliciting status-unequal requests in English. Qualitative analysis of the concurrent data indicates that (i) social contextual aspects of the…

  8. Appraisal and coping styles account for the effects of temperament on preadolescent adjustment

    PubMed Central

    Thompson, Stephanie F.; Zalewski, Maureen; Lengua, Liliana J.

    2014-01-01

    Temperament, appraisal, and coping are known to underlie emotion regulation, yet less is known about how these processes relate to each other across time. We examined temperamental fear, frustration, effortful control, and impulsivity, positive and threat appraisals, and active and avoidant coping as processes underpinning the emotion regulation of pre-adolescent children managing stressful events. Appraisal and coping styles were tested as mediators of the longitudinal effects of temperamental emotionality and self-regulation on adjustment using a community sample (N=316) of preadolescent children (8–12 years at T1) studied across one year. High threat appraisals were concurrently related to high fear and impulsivity, whereas effortful control predicted relative decreases in threat appraisal. High fear was concurrently related to high positive appraisal, and impulsivity predicted increases in positive appraisal. Fear was concurrently related to greater avoidant coping, and impulsivity predicted increases in avoidance. Frustration predicted decreases in active coping. These findings suggest temperament, or dispositional aspects of reactivity and regulation, relates to concurrent appraisal and coping processes and additionally predicts change in these processes. Significant indirect effects indicated that appraisal and coping mediated the effects of temperament on adjustment. Threat appraisal mediated the effects of fear and effortful control on internalizing and externalizing problems, and avoidant coping mediated the effect of impulsivity on internalizing problems. These mediated effects suggest that one pathway through which temperament influences adjustment is pre-adolescents’ appraisal and coping. Findings highlight temperament, appraisal and coping as emotion regulation processes relevant to children’s adjustment in response to stress. PMID:25821237

  9. Head-up displays: Effect of information location on the processing of superimposed symbology

    NASA Technical Reports Server (NTRS)

    Sanford, Beverly D.; Foyle, David C.; Mccann, Robert S.; Jordan, Kevin

    1993-01-01

    Head-up display (HUD) symbology superimposes vehicle status information onto the external terrain, providing simultaneous visual access to both sources of information. Relative to a baseline condition in which the superimposed altitude indicator was omitted, altitude maintenance was improved by the presence of the altitude indicator, and this improvement was of the same magnitude regardless of the position of the altitude indicator on the screen. However, a concurrent deficit in heading maintenance was observed only when the altitude indicator was proximal to the path information. These results did not support a model of the concurrent processing deficit based on an inability to attend to multiple locations in parallel. They are consistent with previous claims that the deficit is the product of attentional limits on subjects' ability to process two separate objects (HUD symbology and terrain information) concurrently. The absence of a performance tradeoff when the HUD and the path information were less proximal is attributed to a breaking of attentional tunneling on the HUD, possibly due to eye movements.

  10. More visual mind wandering occurrence during visual task performance: Modality of the concurrent task affects how the mind wanders.

    PubMed

    Choi, HeeSun; Geden, Michael; Feng, Jing

    2017-01-01

    Mind wandering has been considered as a mental process that is either independent from the concurrent task or regulated like a secondary task. These accounts predict that the form of mind wandering (i.e., images or words) should be either unaffected by or different from the modality form (i.e., visual or auditory) of the concurrent task. Findings from this study challenge these accounts. We measured the rate and the form of mind wandering in three task conditions: fixation, visual 2-back, and auditory 2-back. Contrary to the general expectation, we found that mind wandering was more likely in the same form as the task. This result can be interpreted in light of recent findings on overlapping brain activations during internally- and externally-oriented processes. Our result highlights the importance to consider the unique interplay between the internal and external mental processes and to measure mind wandering as a multifaceted rather than a unitary construct.

  11. More visual mind wandering occurrence during visual task performance: Modality of the concurrent task affects how the mind wanders

    PubMed Central

    Choi, HeeSun; Geden, Michael

    2017-01-01

    Mind wandering has been considered as a mental process that is either independent from the concurrent task or regulated like a secondary task. These accounts predict that the form of mind wandering (i.e., images or words) should be either unaffected by or different from the modality form (i.e., visual or auditory) of the concurrent task. Findings from this study challenge these accounts. We measured the rate and the form of mind wandering in three task conditions: fixation, visual 2-back, and auditory 2-back. Contrary to the general expectation, we found that mind wandering was more likely in the same form as the task. This result can be interpreted in light of recent findings on overlapping brain activations during internally- and externally-oriented processes. Our result highlights the importance to consider the unique interplay between the internal and external mental processes and to measure mind wandering as a multifaceted rather than a unitary construct. PMID:29240817

  12. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful for programming individual processors. However, they are obviously insufficient for programming a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution to this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. The 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (which is useful for checking relationships among large numbers of processes or processors) and the time chart (which is useful for checking precise timing for synchronization) into a single 3D space. The 3D representation gives a capability for direct and intuitive planning and understanding of complicated relationships among many concurrent processes. To realize the 3D representation, a technology enabling easy handling of virtual 3D objects is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), a prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction in programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D environment has considerable potential in the field of software engineering.

  13. Ceramic applications in turbine engines

    NASA Technical Reports Server (NTRS)

    Byrd, J. A.; Janovicz, M. A.; Thrasher, S. R.

    1981-01-01

    Development testing activities on the 1900 F-configuration ceramic parts were completed, 2070 F-configuration ceramic component rig and engine testing was initiated, and the conceptual design for the 2265 F-configuration engine was identified. Fabrication of the 2070 F-configuration ceramic parts continued, along with burner rig development testing of the 2070 F-configuration metal combustor in preparation for 1132 C (2070 F) qualification test conditions. Shakedown testing of the hot engine simulator (HES) rig was also completed in preparation for testing of a spin rig-qualified ceramic-bladed rotor assembly at 1132 C (2070 F) test conditions. Concurrently, ceramics from new sources and alternate materials continued to be evaluated, and fabrication of 2070 F-configuration ceramic components from these new sources continued. Cold spin testing of the critical 2070 F-configuration blade continued in the spin test rig to qualify a set of ceramic blades at 117% engine speed for the gasifier turbine rotor. Rig testing of the ceramic-bladed gasifier turbine rotor assembly at 108% engine speed was also performed, which resulted in the failure of one blade. The new three-piece hot seal with the nickel oxide/calcium fluoride wearface composition was qualified in the regenerator rig and introduced to engine operation with marginal success.

  14. Addressing Research Design Problem in Mixed Methods Research

    NASA Astrophysics Data System (ADS)

    Alavi, Hamed; Hąbek, Patrycja

    2016-03-01

    Alongside other disciplines in the social sciences, management researchers increasingly use mixed methods research in the conduct of their scientific investigations. The mixed methods approach can also be used in the field of production engineering. The reasons behind the increasing popularity of mixed methods research in management science, in comparison with traditional quantitative and qualitative methods, can be traced to several factors. First, any particular discipline in management can be theoretically related to it. Second, the concurrent approach of the mixed method to inductive and deductive research logic gives researchers the opportunity to generate theory and test hypotheses in one study simultaneously. In addition, it provides a better justification for the chosen method of investigation and higher validity for the obtained answers to research questions. Despite the increasing popularity of mixed research methods among management scholars, there is still a need for a comprehensive approach to research design typology and process in mixed methods research from the perspective of management science. In this paper the authors explain the fundamental principles of the mixed methods approach, its typology and the different steps in its design process.

  15. Gray Bananas and a Red Letter A - From Synesthetic Sensation to Memory Colors.

    PubMed

    Weiss, Franziska; Greenlee, Mark W; Volberg, Gregor

    2018-01-01

    Grapheme-color synesthesia is a condition in which objectively achromatic graphemes induce concurrent color experiences. While it was long thought that the colors emerge during perception, there is growing support for the view that colors are integral to synesthetes' cognitive representations of graphemes. In this work, we review evidence for two opposing theories positing either a perceptual or cognitive origin of concurrent colors: the cross-activation theory and the conceptual-mediation model. The review covers results on inducer and concurrent color processing as well as findings concerning the brain structure and grapheme-color mappings in synesthetes and trained mappings in nonsynesthetes. The results support different aspects of both theories. Finally, we discuss how research on memory colors could provide a new perspective in the debate about the level of processing at which the synesthetic colors occur.

  16. Collaborative enterprise and virtual prototyping (CEVP): a product-centric approach to distributed simulation

    NASA Astrophysics Data System (ADS)

    Saunders, Vance M.

    1999-06-01

    The downsizing of the Department of Defense (DoD) and the associated reduction in budgets have re-emphasized the need for commonality, reuse, and standards with respect to the way DoD does business. DoD has implemented significant changes in how it buys weapon systems. The new emphasis is on concurrent engineering with Integrated Product and Process Development and collaboration with Integrated Product Teams. The new DoD vision includes Simulation Based Acquisition (SBA), a process supported by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. This paper discusses the Air Force Research Laboratory's efforts to use Modeling and Simulation (M&S) resources within a Collaborative Enterprise Environment to support SBA and other Collaborative Enterprise and Virtual Prototyping (CEVP) applications. The paper discusses four technology areas: (1) a processing ontology that defines a hierarchically nested set of collaboration contexts needed to organize and support multi-disciplinary collaboration using M&S, (2) a partial taxonomy of intelligent agents needed to manage different M&S resource contributions to advancing the state of product development, (3) an agent-based process for interfacing disparate M&S resources into a CEVP framework, and (4) a Model-View-Control based approach to defining 'a new way of doing business' for users of CEVP frameworks/systems.

  17. Extendable supervised dictionary learning for exploring diverse and concurrent brain activities in task-based fMRI.

    PubMed

    Zhao, Shijie; Han, Junwei; Hu, Xintao; Jiang, Xi; Lv, Jinglei; Zhang, Tuo; Zhang, Shu; Guo, Lei; Liu, Tianming

    2018-06-01

    Recently, a growing body of studies has demonstrated the simultaneous existence of diverse brain activities, e.g., task-evoked dominant response activities, delayed response activities and intrinsic brain activities, under specific task conditions. However, the current dominant task-based functional magnetic resonance imaging (tfMRI) analysis approach, i.e., the general linear model (GLM), might have difficulty in discovering those diverse and concurrent brain responses sufficiently. This subtraction-based, model-driven approach focuses on the brain activities evoked directly by the task paradigm, and thus likely overlooks other concurrent brain activities evoked during information processing. To deal with this problem, in this paper we propose a novel hybrid framework, called extendable supervised dictionary learning (E-SDL), to explore diverse and concurrent brain activities under task conditions. A critical difference between the E-SDL framework and previous methods is that we systematically extend the basic task paradigm regressor into meaningful regressor groups to account for possible regressor variation during the information processing procedure in the brain. Applications of the proposed framework on five independent and publicly available tfMRI datasets from the human connectome project (HCP) simultaneously revealed more meaningful group-wise consistent task-evoked networks and common intrinsic connectivity networks (ICNs). These results demonstrate the advantage of the proposed framework in identifying the diversity of concurrent brain activities in tfMRI datasets.
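
    The regressor-group idea, i.e., extending one task paradigm regressor into several plausible response shapes before fitting, can be sketched as follows (a numpy toy with an invented Gaussian response shape and synthetic data, not the E-SDL pipeline):

      import numpy as np

      tr, n = 2.0, 120
      t = np.arange(n) * tr
      boxcar = ((t % 40) < 20).astype(float)          # toy block paradigm
      hrf = lambda lag: np.exp(-(np.arange(0, 24, tr) - 6 - lag) ** 2 / 18)

      # Regressor group: canonical plus two delayed response variants.
      group = np.column_stack([np.convolve(boxcar, hrf(lag))[:n]
                               for lag in (0.0, 4.0, 8.0)])
      rng = np.random.default_rng(1)
      voxel = group @ np.array([1.0, 0.5, 0.0]) + rng.normal(0, 0.3, n)
      betas, *_ = np.linalg.lstsq(group, voxel, rcond=None)
      print(betas)  # approximately recovers dominant vs. delayed weights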

  18. Concurrent Validity of the Stanford-Binet: Fourth Edition and Kaufman Assessment Battery for Children with Learning-Disabled Students.

    ERIC Educational Resources Information Center

    Knight, B. Caleb; And Others

    1990-01-01

    Examined the concurrent validity of the composite and area scores of the Stanford-Binet Intelligence Scale: Fourth Edition (SBIV) and the Mental Processing Composite and global scale scores of the Kaufman Assessment Battery for Children in Black, learning-disabled elementary school students (N=30). Findings demonstrated adequate concurrent…

  19. The Psychotherapy of Parenthood: Towards a Formulation and Valuation of Concurrent Work with Parents

    ERIC Educational Resources Information Center

    Sutton, Adrian; Hughes, Lynette

    2005-01-01

    This paper explores the process and value of concurrent work with parents when their child is being treated in individual psychotherapy. The position taken is that psychoanalytic understanding generally and the specific formulations presented in this paper have a broader applicability in other aspects and approaches in child and adolescent mental…

  20. The Effect of Enhanced Visualization Instruction on First Grade Students' Scores on the North Carolina Standard Course Assessment

    ERIC Educational Resources Information Center

    Thompson, Amber Cole

    2012-01-01

    Visualization was once thought to be an important skill for professions only related to engineering, but due to the realization of concurrent design and the fast pace of technology, it is now desirable in other professions as well. The importance of learning basic knowledge of geometrical concepts has a greater impact than it did prior to the 21st…

  1. An Annotated Reading List for Concurrent Engineering

    DTIC Science & Technology

    1989-07-01

    The seven tools are sometimes referred to as the seven old tools.) Ishikawa, Kaoru, What is Total Quality Control? The Japanese Way, Prentice-Hall... some solutions. Ishikawa (1982) presents a practical guide (with easy-to-use tools) for implementing quality control at the working level... study of ... engineering for the last two years. Ishikawa, Kaoru, Guide to Quality Control, Kraus International Publications, White Plains, NY, 1982. The...

  2. The Role of Concurrent Engineering in Weapons System Acquisition

    DTIC Science & Technology

    1988-12-01

    check sheets, Pareto diagrams, graphs, control charts, and scatter diagrams. Kaoru Ishikawa, Guide to Quality Control, Asian Productivity Organization... Deming [13], Juran [14], and Ishikawa [15]). Managers in the United States and Japan have used techniques of statistics to measure performance and they have... New York (1962). 15. Kaoru Ishikawa, Guide to Quality Control, KRAUS International Publications, White Plains, NY (1982). 16. Robert H. Hayes. St...

  3. Semiannual Report for Contract NAS1-19480 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1994-06-01

    algorithms for large, irreducibly coupled systems iteratively solve concurrent problems within different subspaces of a Hilbert space, or within different... effective on problems amenable to SIMD solution. Together with researchers at AT&T Bell Labs (Boris Lubachevsky, Albert Greenberg) we have developed... reasonable measurement. In the study of different speedups, various causes of superlinear speedup are also presented. Greenberg, Albert G., Boris D...

  4. Interphase layer optimization for metal matrix composites with fabrication considerations

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, C. C.

    1991-01-01

    A methodology is presented to reduce the final matrix microstresses for metal matrix composites by concurrently optimizing the interphase characteristics and fabrication process. Application cases include interphase tailoring with and without fabrication considerations for two material systems, graphite/copper and silicon carbide/titanium. Results indicate that concurrent interphase/fabrication optimization produces significant reductions in the matrix residual stresses and strong coupling between interphase and fabrication tailoring. The interphase coefficient of thermal expansion and the fabrication consolidation pressure are the most important design parameters and must be concurrently optimized to further reduce the microstresses to more desirable magnitudes.
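
    The notion of concurrent interphase/fabrication tailoring can be illustrated with a toy joint minimization (an assumed quadratic surrogate with invented coefficients and units, not the paper's micromechanics model), here using scipy as a third-party dependency:

      from scipy.optimize import minimize

      def microstress(params):
          # Toy surrogate for residual matrix microstress; the cross term
          # stands in for coupling between interphase and fabrication choices.
          cte, pressure = params
          return ((cte - 6.5) ** 2 + 0.5 * (pressure - 40.0) ** 2
                  + 0.2 * cte * (pressure - 40.0))

      best = minimize(microstress, x0=[10.0, 20.0],
                      bounds=[(2.0, 12.0), (10.0, 80.0)])
      print(best.x)  # jointly optimized (interphase CTE, consolidation pressure)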

  5. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.
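
    One such lower bound is the classical cycle-time bound for marked graphs: the achievable iteration period is at least the maximum, over directed cycles, of total compute time on the cycle divided by the tokens it carries. A brute-force sketch for a toy graph (invented delays and markings, not ATAMM itself):

      from itertools import permutations

      delay = {"A": 2.0, "B": 3.0, "C": 1.0}          # node compute times
      edges = {("A", "B"): 0, ("B", "C"): 0,          # edge -> initial tokens
               ("C", "A"): 1, ("B", "A"): 1}

      def cycles(nodes, edges):
          # Brute-force simple cycles; fine for a tiny example graph.
          for k in range(1, len(nodes) + 1):
              for perm in permutations(nodes, k):
                  ring = list(zip(perm, perm[1:] + perm[:1]))
                  if all(e in edges for e in ring):
                      yield ring

      # Live marked graphs keep at least one token on every cycle.
      bound = max(sum(delay[u] for u, _ in ring) / tokens
                  for ring in cycles(list(delay), edges)
                  if (tokens := sum(edges[e] for e in ring)) > 0)
      print(bound)  # lower bound on the achievable iteration period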

  6. Nonclassical features of trimodal excited coherent Greenberger-Horne-Zeilinger (GHZ)-type state

    NASA Astrophysics Data System (ADS)

    Merlin, J.; Ahmed, A. B. M.; Mohammed, S. Naina

    2017-06-01

    We examine the influence of photon excitation on each mode of the Glauber coherent GHZ-type tripartite state. Concurrence is adopted as the entanglement measure between bipartite entangled states. The pairwise concurrence is calculated and used as a quantifier of intermodal entanglement. The entanglement distribution among the three modes is investigated using the tangle as a measure, and the residual entanglement is also calculated. The effect of the photon addition process on quadrature squeezing is investigated. The higher-order squeezing capacity of the photon addition process is also shown.
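
    For two-qubit reductions, pairwise concurrence is conventionally computed with Wootters' spin-flip formula; a short numpy sketch (standard formula, with an invented Bell-state test case):

      import numpy as np

      def concurrence(rho):
          # Wootters: C = max(0, l1 - l2 - l3 - l4), where li are the square
          # roots of the eigenvalues of rho * (sy x sy) rho* (sy x sy).
          sy = np.array([[0, -1j], [1j, 0]])
          yy = np.kron(sy, sy)
          R = rho @ yy @ rho.conj() @ yy
          lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
          return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

      # Bell state (|00> + |11>)/sqrt(2): maximally entangled, C = 1.
      psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
      print(concurrence(np.outer(psi, psi.conj())))  # ~1.0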

  7. Integrated design and manufacturing for the high speed civil transport

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In June 1992, Georgia Tech's School of Aerospace Engineering was awarded a NASA University Space Research Association (USRA) Advanced Design Program (ADP) to address 'Integrated Design and Manufacturing for the High Speed Civil Transport (HSCT)' in its graduate aerospace systems design courses. This report summarizes the results of the five courses incorporated into the Georgia Tech's USRA ADP program. It covers AE8113: Introduction to Concurrent Engineering, AE4360: Introduction to CAE/CAD, AE4353: Design for Life Cycle Cost, AE6351: Aerospace Systems Design One, and AE6352: Aerospace Systems Design Two. AE8113: Introduction to Concurrent Engineering was an introductory course addressing the basic principles of concurrent engineering (CE) or integrated product development (IPD). The design of a total system was not the objective of this course. The goal was to understand and define the 'up-front' customer requirements, their decomposition, and determine the value objectives for a complex product, such as the high speed civil transport (HSCT). A generic CE methodology developed at Georgia Tech was used for this purpose. AE4353: Design for Life Cycle Cost addressed the basic economic issues for an HSCT using a robust design technique, Taguchi's parameter design optimization method (PDOM). An HSCT economic sensitivity assessment was conducted using a Taguchi PDOM approach to address the robustness of the basic HSCT design. AE4360: Introduction to CAE/CAD permitted students to develop and utilize CAE/CAD/CAM knowledge and skills using CATIA and CADAM as the basic geometric tools. AE6351: Aerospace Systems Design One focused on the conceptual design refinement of a baseline HSCT configuration as defined by Boeing, Douglas, and NASA in their system studies. It required the use of NASA's synthesis codes FLOPS and ACSYNT. A criterion called the productivity index (P.I.) was used to evaluate disciplinary sensitivities and provide refinements of the baseline HSCT configuration. AE6352: Aerospace Systems Design Two was a continuation of Aerospace Systems Design One in which wing concepts were researched and analyzed in more detail. FLOPS and ACSYNT were again used at the system level while other off-the-shelf computer codes were used for more detailed wing disciplinary analysis and optimization. The culmination of all efforts and submission of this report conclude the first year's efforts of Georgia Tech's NASA USRA ADP. It will hopefully provide the foundation for next year's efforts concerning continuous improvement of integrated design and manufacturing for the HSCT.

  8. Investigating grounded conceptualization: motor system state-dependence facilitates familiarity judgments of novel tools.

    PubMed

    Matheson, Heath E; Familiar, Ariana M; Thompson-Schill, Sharon L

    2018-03-02

    Theories of embodied cognition propose that we recognize tools in part by reactivating sensorimotor representations of tool use in a process of simulation. If motor simulations play a causal role in tool recognition, then performing a concurrent motor task should differentially modulate recognition of experienced vs. non-experienced tools. We sought to test the hypothesis that an incompatible concurrent motor task modulates conceptual processing of learned vs. non-learned objects by directly manipulating the embodied experience of participants. We trained one group to use a set of novel, 3-D printed tools under the pretense that they were preparing for an archeological expedition to Mars (manipulation group); we trained a second group to report declarative information about how the tools are stored (storage group). With this design, familiarity and visual attention to different object parts were similar for both groups, though their qualitative interactions differed. After learning, participants made familiarity judgments of auditorily presented tool names while performing a concurrent motor task or simply sitting at rest. We showed that familiarity judgments were facilitated by motor state-dependence; specifically, in the manipulation group, familiarity was facilitated by a concurrent motor task, whereas in the storage group familiarity was facilitated while sitting at rest. These results are the first to show directly that manipulation experience differentially modulates conceptual processing of familiar vs. unfamiliar objects, suggesting that embodied representations contribute to recognizing tools.

  9. NSTX-U Advances in Real-Time C++11 on Linux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Keith G.

    Programming languages like C and Ada combined with proprietary embedded operating systems have dominated the real-time application space for decades. The new C++11 standard includes native, language-level support for concurrency, a required feature for any nontrivial event-oriented real-time software. Threads, Locks, and Atomics now exist to provide the necessary tools to build the structures that make up the foundation of a complex real-time system. The National Spherical Torus Experiment Upgrade (NSTX-U) at the Princeton Plasma Physics Laboratory (PPPL) is breaking new ground with the language as applied to the needs of fusion devices. A new Digital Coil Protection System (DCPS) will serve as the main protection mechanism for the magnetic coils, and it is written entirely in C++11 running on Concurrent Computer Corporation's real-time operating system, RedHawk Linux. It runs over 600 algorithms in a 5 kHz control loop that determine whether or not to shut down operations before physical damage occurs. To accomplish this, NSTX-U engineers developed software tools that do not currently exist elsewhere, including real-time atomic synchronization, real-time containers, and a real-time logging framework. Together with a recent (and carefully configured) version of the GCC compiler, these tools enable data acquisition, processing, and output using a conventional operating system to meet a hard real-time deadline (that is, missing one periodic deadline is a failure) of 200 microseconds.
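
    The shape of such a hard periodic loop can be sketched generically (plain Python with a relaxed 10 ms period, purely illustrative; the DCPS itself is C++11 on RedHawk Linux and its internals are not reproduced here):

      import time

      PERIOD = 0.010                       # seconds; the DCPS runs at 200 us
      next_release = time.perf_counter()
      for cycle in range(100):
          next_release += PERIOD
          _ = sum(i * i for i in range(1000))        # stand-in for algorithms
          remaining = next_release - time.perf_counter()
          if remaining < 0:
              # Hard real time: a single overrun is treated as failure.
              raise RuntimeError(f"deadline miss on cycle {cycle}")
          time.sleep(remaining)            # wait for the next release point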

  10. NSTX-U Advances in Real-Time C++11 on Linux

    DOE PAGES

    Erickson, Keith G.

    2015-08-14

    Programming languages like C and Ada combined with proprietary embedded operating systems have dominated the real-time application space for decades. The new C++11 standard includes native, language-level support for concurrency, a required feature for any nontrivial event-oriented real-time software. Threads, Locks, and Atomics now exist to provide the necessary tools to build the structures that make up the foundation of a complex real-time system. The National Spherical Torus Experiment Upgrade (NSTX-U) at the Princeton Plasma Physics Laboratory (PPPL) is breaking new ground with the language as applied to the needs of fusion devices. A new Digital Coil Protection System (DCPS) will serve as the main protection mechanism for the magnetic coils, and it is written entirely in C++11 running on Concurrent Computer Corporation's real-time operating system, RedHawk Linux. It runs over 600 algorithms in a 5 kHz control loop that determine whether or not to shut down operations before physical damage occurs. To accomplish this, NSTX-U engineers developed software tools that do not currently exist elsewhere, including real-time atomic synchronization, real-time containers, and a real-time logging framework. Together with a recent (and carefully configured) version of the GCC compiler, these tools enable data acquisition, processing, and output using a conventional operating system to meet a hard real-time deadline (that is, missing one periodic deadline is a failure) of 200 microseconds.

  11. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint from which to support Model Based Mission Assurance (MBMA).

  12. Extensive Reading Program Which Changes Reluctant Engineering Students into Autonomous Learners of English

    NASA Astrophysics Data System (ADS)

    Nishizawa, Hitoshi; Yoshioka, Takayoshi; Itoh, Kazuaki

    This article introduces extensive reading (ER) as an approach to improving the fundamental English communication skills of reluctant EFL learners: average Japanese engineering students. It is distinct from the concurrent translation approach in that learners use English instead of Japanese to grasp the meaning of what they read, and enjoy reading. In the ER program at Toyota National College of Technology, many students developed a more positive attitude toward English, increased their reading speed, and achieved higher TOEIC scores compared with those of students before the ER program was introduced. A comparison between three groups of students showed a strong correlation between their TOEIC scores and the amount they read.

  13. Concurrent design of quasi-random photonic nanostructures

    PubMed Central

    Lee, Won-Kyu; Yu, Shuangcheng; Engel, Clifford J.; Reese, Thaddeus; Rhee, Dongjoon; Chen, Wei

    2017-01-01

    Nanostructured surfaces with quasi-random geometries can manipulate light over broadband wavelengths and wide ranges of angles. Optimization and realization of stochastic patterns have typically relied on serial, direct-write fabrication methods combined with real-space design. However, this approach is not suitable for customizable features or scalable nanomanufacturing. Moreover, trial-and-error processing cannot guarantee fabrication feasibility because processing–structure relations are not included in conventional designs. Here, we report wrinkle lithography integrated with concurrent design to produce quasi-random nanostructures in amorphous silicon at wafer scales that achieved over 160% light absorption enhancement from 800 to 1,200 nm. The quasi-periodicity of patterns, materials filling ratio, and feature depths could be independently controlled. We statistically represented the quasi-random patterns by Fourier spectral density functions (SDFs) that could bridge the processing–structure and structure–performance relations. Iterative search of the optimal structure via the SDF representation enabled concurrent design of nanostructures and processing. PMID:28760975
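
    The SDF representation itself is straightforward to estimate. A sketch using numpy (an invented binary test pattern; the radial binning choices are arbitrary, and this is not the authors' code):

      import numpy as np

      def radial_sdf(pattern, nbins=32):
          # Radially averaged power spectrum of a 2-D (e.g., binary) pattern.
          f = np.fft.fftshift(np.fft.fft2(pattern - pattern.mean()))
          power = np.abs(f) ** 2 / pattern.size
          y, x = np.indices(pattern.shape)
          cy, cx = np.array(pattern.shape) // 2
          r = np.hypot(y - cy, x - cx)
          bins = np.linspace(0, r.max(), nbins + 1)
          which = np.digitize(r.ravel(), bins)
          return np.array([power.ravel()[which == i].mean()
                           if (which == i).any() else 0.0
                           for i in range(1, nbins + 1)])

      rng = np.random.default_rng(0)
      pattern = (rng.random((128, 128)) < 0.5).astype(float)
      print(radial_sdf(pattern)[:5])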

  14. A decrease in brain activation associated with driving when listening to someone speak.

    PubMed

    Just, Marcel Adam; Keller, Timothy A; Cynkar, Jacquelyn

    2008-04-18

    Behavioral studies have shown that engaging in a secondary task, such as talking on a cellular telephone, disrupts driving performance. This study used functional magnetic resonance imaging (fMRI) to investigate the impact of concurrent auditory language comprehension on the brain activity associated with a simulated driving task. Participants steered a vehicle along a curving virtual road, either undisturbed or while listening to spoken sentences that they judged as true or false. The dual-task condition produced a significant deterioration in driving accuracy caused by the processing of the auditory sentences. At the same time, the parietal lobe activation associated with spatial processing in the undisturbed driving task decreased by 37% when participants concurrently listened to sentences. The findings show that language comprehension performed concurrently with driving draws mental resources away from the driving and produces deterioration in driving performance, even when it does not require holding or dialing a phone.

  15. Threaded cognition: an integrated theory of concurrent multitasking.

    PubMed

    Salvucci, Dario D; Taatgen, Niels A

    2008-01-01

    The authors propose the idea of threaded cognition, an integrated theory of concurrent multitasking--that is, performing 2 or more tasks at once. Threaded cognition posits that streams of thought can be represented as threads of processing coordinated by a serial procedural resource and executed across other available resources (e.g., perceptual and motor resources). The theory specifies a parsimonious mechanism that allows for concurrent execution, resource acquisition, and resolution of resource conflicts, without the need for specialized executive processes. By instantiating this mechanism as a computational model, threaded cognition provides explicit predictions of how multitasking behavior can result in interference, or lack thereof, for a given set of tasks. The authors illustrate the theory in model simulations of several representative domains ranging from simple laboratory tasks such as dual-choice tasks to complex real-world domains such as driving and driver distraction. (c) 2008 APA, all rights reserved

  16. A Decrease in Brain Activation Associated with Driving When Listening to Someone Speak

    PubMed Central

    Just, Marcel Adam; Keller, Timothy A.; Cynkar, Jacquelyn

    2009-01-01

    Behavioral studies have shown that engaging in a secondary task, such as talking on a cellular telephone, disrupts driving performance. This study used functional magnetic resonance imaging (fMRI) to investigate the impact of concurrent auditory language comprehension on the brain activity associated with a simulated driving task. Participants steered a vehicle along a curving virtual road, either undisturbed or while listening to spoken sentences that they judged as true or false. The dual task condition produced a significant deterioration in driving accuracy caused by the processing of the auditory sentences. At the same time, the parietal lobe activation associated with spatial processing in the undisturbed driving task decreased by 37% when participants concurrently listened to sentences. The findings show that language comprehension performed concurrently with driving draws mental resources away from the driving and produces deterioration in driving performance, even when it does not require holding or dialing a phone. PMID:18353285

  17. Concurrent Image Processing Executive (CIPE). Volume 2: Programmer's guide

    NASA Technical Reports Server (NTRS)

    Williams, Winifred I.

    1990-01-01

    This manual is intended as a guide for application programmers using the Concurrent Image Processing Executive (CIPE). CIPE is intended to become the support system software for a prototype high-performance science analysis workstation. In its current configuration, CIPE utilizes a JPL/Caltech Mark IIIfp Hypercube with a Sun-4 host, although its design can incorporate other concurrent architectures as well. CIPE provides a programming environment that shields application programmers from the various user interfaces, file transactions, and architectural complexities. A programmer may choose to write applications that use only the Sun-4 or that use the Sun-4 with the hypercube. A hypercube program uses the hypercube's data processors and, optionally, the Weitek floating-point accelerators. The CIPE programming environment provides a simple set of subroutines to activate user interface functions, specify data distributions, activate hypercube-resident applications, and communicate parameters to and from the hypercube.
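
    As an illustration of the data-distribution bookkeeping described above (and emphatically not the CIPE subroutine interface, which the manual itself documents), the following Python sketch block-distributes an image's rows across the nodes of a d-dimensional hypercube:

        import numpy as np

        def distribute_rows(image, dimension):
            """Split an image's rows across the 2**dimension nodes of a hypercube."""
            nodes = 2 ** dimension
            blocks = np.array_split(image, nodes, axis=0)
            return {node: block for node, block in enumerate(blocks)}

        parts = distribute_rows(np.zeros((512, 512)), dimension=5)   # 32-node cube
        assert sum(block.shape[0] for block in parts.values()) == 512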

  18. Gray Bananas and a Red Letter A — From Synesthetic Sensation to Memory Colors

    PubMed Central

    Weiss, Franziska; Volberg, Gregor

    2018-01-01

    Grapheme–color synesthesia is a condition in which objectively achromatic graphemes induce concurrent color experiences. While it was long thought that the colors emerge during perception, there is growing support for the view that colors are integral to synesthetes’ cognitive representations of graphemes. In this work, we review evidence for two opposing theories positing either a perceptual or cognitive origin of concurrent colors: the cross-activation theory and the conceptual-mediation model. The review covers results on inducer and concurrent color processing as well as findings concerning the brain structure and grapheme–color mappings in synesthetes and trained mappings in nonsynesthetes. The results support different aspects of both theories. Finally, we discuss how research on memory colors could provide a new perspective in the debate about the level of processing at which the synesthetic colors occur. PMID:29899968

  19. Simplex turbopump design

    NASA Technical Reports Server (NTRS)

    Marsh, Matt; Cowan, Penny

    1994-01-01

    Turbomachinery used in liquid rocket engines is typically composed of complex geometries made from high strength-to-weight superalloys and has long design and fabrication cycle times (3 to 5 years). A simple, low-cost turbopump is being designed in-house to demonstrate the ability to reduce the overall cost to $500K and compress the life cycle time to 18 months. The Simplex turbopump was designed to deliver liquid oxygen at 90 lbm/s with a discharge pressure of 1500 psia. The turbine will be powered by gaseous oxygen, which eliminates the need for an inter-propellant seal typically required to separate fuel-rich turbine gases from the liquid oxygen pump components. The turbine flow paths will use existing metals characterized at 800 deg R that are compatible with a warm-oxygen environment. This turbopump design would be suitable for integration with a 40,000-lbf-thrust hybrid motor that provides warm oxygen from a tap-off location to power the turbine. The preliminary and detailed analysis was completed in a year by a multiple-discipline, concurrent engineering team. Manpower, schedule, and cost data were tracked during the process for comparison to the initial goals. The Simplex hardware is in the procurement cycle, with the first test expected to occur approximately 1.5 months behind the original schedule goal.

  20. Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.

    1997-01-01

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques, including intermediate responses, linking variables, and compatibility constraints, are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigation and modeling of partitioned subproblems. A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation when fitting models in a large number of factors. Noise modeling techniques are compared, and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. The method and the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan propulsion system. The case study was developed in collaboration with Allison Engine Company, Rolls-Royce Aerospace, and is based on the existing Allison AE3007 engine designed for midsize commercial and regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low-pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and the solutions identified are verified by comparison with the AE3007 engine: the solutions obtained are similar to the AE3007 cycle and configuration, but better with respect to many of the requirements.
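
    The response surface step is the most mechanical part of the method and is easy to make concrete. The Python sketch below fits a second-order polynomial surrogate to synthetic designed-experiment data by least squares; the factor count, data, and names are invented for illustration and do not reproduce the dissertation's modified composite experiment.

        import numpy as np

        rng = np.random.default_rng(1)
        x = rng.uniform(-1, 1, size=(30, 2))             # two design factors
        y = (1.0 + 2.0 * x[:, 0] - x[:, 1]
             + 0.5 * x[:, 0] * x[:, 1] + 0.1 * rng.standard_normal(30))

        # Second-order model: intercept, linear, interaction, and quadratic terms.
        X = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                             x[:, 0] * x[:, 1], x[:, 0] ** 2, x[:, 1] ** 2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        def surrogate(x1, x2):
            """Cheap stand-in for the expensive subsystem analysis."""
            return coef @ np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])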

  1. Multilevel Factor Structure, Concurrent Validity, and Test-Retest Reliability of the High School Teacher Version of the Authoritative School Climate Survey

    ERIC Educational Resources Information Center

    Huang, Francis L.; Cornell, Dewey G.

    2016-01-01

    Although school climate has long been recognized as an important factor in the school improvement process, there are few psychometrically supported measures based on teacher perspectives. The current study replicated and extended the factor structure, concurrent validity, and test-retest reliability of the teacher version of the Authoritative…

  2. Multicenter evaluation of signalment and comorbid conditions associated with aortic thrombotic disease in dogs.

    PubMed

    Winter, Randolph L; Budke, Christine M

    2017-08-15

    OBJECTIVE To assess signalment and concurrent disease processes in dogs with aortic thrombotic disease (ATD). DESIGN Retrospective case-control study. ANIMALS Dogs examined at North American veterinary teaching hospitals from 1985 through 2011 with medical records submitted to the Veterinary Medical Database. PROCEDURES Medical records were reviewed to identify dogs with a diagnosis of ATD (case dogs). Five control dogs without a diagnosis of ATD were then identified for every case dog. Data were collected regarding dog age, sex, breed, body weight, and concurrent disease processes. RESULTS ATD was diagnosed in 291 of the 984,973 (0.03%) dogs included in the database. The odds of a dog having ATD did not differ significantly by sex, age, or body weight. Compared with mixed-breed dogs, Shetland Sheepdogs had significantly higher odds of ATD (OR, 2.59). Protein-losing nephropathy (64/291 [22%]) was the most commonly recorded concurrent disease in dogs with ATD. CONCLUSIONS AND CLINICAL RELEVANCE Dogs with ATD did not differ significantly from dogs without ATD in most signalment variables. Contrary to previous reports, cardiac disease was not a common concurrent diagnosis in dogs with ATD.

  3. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    NASA Astrophysics Data System (ADS)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real-time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium-grain distributed-memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from multiple-purpose modules. In the last part of the thesis a well known optimization method (the Broyden-Fletcher-Goldfarb-Shanno memoryless quasi-Newton method) is applied to simple classification problems and shown to be superior to the "error back-propagation" algorithm for numerical stability, automatic selection of parameters, and convergence properties.
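
    The efficiency claim above follows from a perimeter-to-area argument that is simple to reproduce. In the Python sketch below, per-processor work scales with the area of the assigned block while halo communication scales with its linear dimension; the cost constants are illustrative, not measured values from the thesis.

        def efficiency(block_side, t_compute=1.0, t_comm=10.0):
            work = t_compute * block_side ** 2     # interior updates (area)
            comm = t_comm * 4 * block_side         # boundary exchange (perimeter)
            return work / (work + comm)

        for side in (16, 64, 256, 1024):
            print(f"{side:5d}-pixel block: efficiency = {efficiency(side):.1%}")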

  4. AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading

    NASA Astrophysics Data System (ADS)

    Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration

    2017-10-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single-threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run 2. After concluding a rigorous requirements phase, in which many design components were examined in detail, ATLAS has begun the migration to a new data-flow-driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread-unsafe legacy Algorithms; cloned Algorithms that execute concurrently in their own threads with different Event contexts; and fully re-entrant, thread-safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event and time dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, concurrent I/O, as well as ensuring thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
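
    The distinction between cloned and re-entrant Algorithms can be sketched outside the framework itself. The Python fragment below (a schematic, not ATLAS code) processes several events concurrently, each carrying its own event context, through a re-entrant algorithm that holds no mutable shared state:

        from concurrent.futures import ThreadPoolExecutor
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class EventContext:
            event_id: int
            slot: int

        def reentrant_algorithm(ctx: EventContext) -> str:
            # All inputs arrive through the per-event context, so concurrent
            # calls on different events cannot interfere with one another.
            return f"event {ctx.event_id} processed in slot {ctx.slot}"

        with ThreadPoolExecutor(max_workers=4) as pool:
            contexts = [EventContext(event_id=i, slot=i % 4) for i in range(8)]
            for result in pool.map(reentrant_algorithm, contexts):
                print(result)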

  5. ARTEMIS: a collaborative framework for health care.

    PubMed Central

    Reddy, R.; Jagannathan, V.; Srinivas, K.; Karinthi, R.; Reddy, S. M.; Gollapudy, C.; Friedman, S.

    1993-01-01

    Patient centered healthcare delivery is an inherently collaborative process. This involves a wide range of individuals and organizations with diverse perspectives: primary care physicians, hospital administrators, labs, clinics, and insurers. The key to cost reduction and quality improvement in health care is effective management of this collaborative process. The use of multi-media collaboration technology can facilitate timely delivery of patient care and reduce cost at the same time. During the last five years, the Concurrent Engineering Research Center (CERC), under the sponsorship of DARPA (Defense Advanced Research Projects Agency, recently renamed ARPA), developed a number of generic key subsystems of a comprehensive collaboration environment. These subsystems are intended to overcome the barriers that inhibit the collaborative process. Three subsystems developed under this program are: MONET (Meeting On the Net), to provide consultation over a computer network; ISS (Information Sharing Server), to provide access to multi-media information; and PCB (Project Coordination Board), to better coordinate focused activities. These systems have been integrated into an open environment to enable collaborative processes. This environment is being used to create a wide-area (geographically distributed) research testbed under DARPA sponsorship, ARTEMIS (Advanced Research Testbed for Medical Informatics), to explore collaborative health care processes. We believe this technology will play a key role in the current national thrust to reengineer the present health-care delivery system. PMID:8130536

  6. Materials and Process Activities for NASA's Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Polis, Daniel L.

    2012-01-01

    In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). The overall goal of the CCM project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. The materials and process activities were prioritized based on a rapid prototype approach. This approach focused developmental activities on design details with greater risk and uncertainty, such as out-of-autoclave joining, over some of the more traditional lamina and laminate building block levels. While process development and associated building block testing were performed, several anomalies were still observed at the full-scale level due to interactions between process robustness and manufacturing scale-up. This paper describes the process anomalies that were encountered during the CCM development and the subsequent root cause investigations that led to the final design solutions. These investigations highlight the importance of full-scale developmental work early in the schedule of a complex composite design/build project.

  7. Effect of a concurrent auditory task on visual search performance in a driving-related image-flicker task.

    PubMed

    Richard, Christian M; Wright, Richard D; Ee, Cheryl; Prime, Steven L; Shimizu, Yujiro; Vavrik, John

    2002-01-01

    The effect of a concurrent auditory task on visual search was investigated using an image-flicker technique. Participants were undergraduate university students with normal or corrected-to-normal vision who searched for changes in images of driving scenes that involved either driving-related (e.g., traffic light) or driving-unrelated (e.g., mailbox) scene elements. The results indicated that response times were significantly slower if the search was accompanied by a concurrent auditory task. In addition, slower overall responses to scenes involving driving-unrelated changes suggest that the underlying process affected by the concurrent auditory task is strategic in nature. These results were interpreted in terms of their implications for using a cellular telephone while driving. Actual or potential applications of this research include the development of safer in-vehicle communication devices.

  8. Concurrent Image Processing Executive (CIPE)

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

    1988-01-01

    The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high-performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local-memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented.

  9. Flight Testing Surfaces Engineered for Mitigating Insect Adhesion on a Falcon HU-25C

    NASA Technical Reports Server (NTRS)

    Shanahan, Michelle; Wohl, Chris J.; Smith, Joseph G., Jr.; Connell, John W.; Siochi, Emilie J.; Doss, Jereme R.; Penner, Ronald K.

    2015-01-01

    Insect residue contamination on aircraft wings can decrease fuel efficiency in aircraft designed for natural laminar flow: the residues can cause a premature transition to turbulent flow, increasing fuel burn and making the aircraft less environmentally friendly. Surfaces designed to minimize insect residue adhesion were evaluated through flight testing on a Falcon HU-25C aircraft flown along the coast of Virginia and North Carolina. The surfaces were affixed to the wing leading edge, and the aircraft remained at altitudes lower than 1,000 feet throughout the flights to ensure high insect density. The number of strikes on the engineered surfaces was compared with, and found to be lower than, that on untreated aluminum control surfaces flown concurrently. Optical profilometry was used to determine insect residue height and areal coverage. Differences in results between flight and laboratory tests suggest the importance of testing in realistic use environments to evaluate the effectiveness of engineered surface designs.

  10. A Physics-Based Vibrotactile Feedback Library for Collision Events.

    PubMed

    Park, Gunhyuk; Choi, Seungmoon

    2017-01-01

    We present PhysVib: a software solution on the mobile platform extending an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods with respect to perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while reducing application development time to a great extent.
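
    The vibration model named in the abstract is compact enough to state directly. The Python sketch below generates an exponentially decaying sinusoid for a collision event; the sample rate, frequency, decay constant, and impulse-to-amplitude mapping are assumptions for illustration, not PhysVib's calibrated values.

        import numpy as np

        def collision_vibration(impulse, duration=0.1, rate=8000,
                                freq_hz=150.0, decay=60.0):
            """Samples of a(t) = A * exp(-decay * t) * sin(2*pi*f*t)."""
            t = np.arange(0.0, duration, 1.0 / rate)
            amplitude = min(1.0, impulse)     # assumed impulse-to-amplitude map
            return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * freq_hz * t)

        samples = collision_vibration(impulse=0.7)  # stream to a vibrotactile actuator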

  11. Liquid Rocket Booster (LRB) for the Space Transportation System (STS) systems study, volume 2, addendum 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The feasibility was assessed of developing and producing a launch vehicle from an external tank (ET) and an engine module that mounts inline to the tankage at the aft end and contains six Space Transportation Main Engines (STMEs). The primary mission of this launch vehicle would be to place a personnel launch system (PLS) into low Earth orbit (LEO). The vehicle tankage and the assembly of the engine module were evaluated to determine what, if any, manufacturing/production impacts would be incurred if this vehicle were built alongside the current ET at the Michoud Assembly Facility. It was determined that there would be no significant impacts from producing seven of these vehicles per year while concurrently producing 12 ETs per year. Preliminary estimates of both nonrecurring and recurring costs for this vehicle concept were made.

  12. Non-invasive lightweight integration engine for building EHR from autonomous distributed systems.

    PubMed

    Angulo, Carlos; Crespo, Pere; Maldonado, José A; Moner, David; Pérez, Daniel; Abad, Irene; Mandingorra, Jesús; Robles, Montserrat

    2007-12-01

    In this paper we describe Pangea-LE, a message-oriented lightweight data integration engine that allows homogeneous and concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and passes it to the requesting client applications in a flexible XML format. The XML response message can be formatted on demand by appropriate Extensible Stylesheet Language (XSL) transformations in order to meet the needs of client applications. We also present a real deployment in a hospital, where Pangea-LE collects and generates an XML view of all the available patient clinical information. The information is presented to healthcare professionals in an Electronic Health Record (EHR) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real setting has been a success due to the non-invasive nature of Pangea-LE, which respects the existing information systems.
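
    The response-shaping step described above is a standard XSLT pattern. The Python sketch below applies an on-demand stylesheet to an XML message using the third-party lxml package; the XML and XSL content are invented for the example and are not Pangea-LE message formats.

        from lxml import etree

        xml = etree.XML("<patient><name>DOE, JANE</name><lab>HbA1c 6.1%</lab></patient>")
        xslt = etree.XML("""
        <xsl:stylesheet version="1.0"
                        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:template match="/patient">
            <summary><xsl:value-of select="name"/>: <xsl:value-of select="lab"/></summary>
          </xsl:template>
        </xsl:stylesheet>""")

        transform = etree.XSLT(xslt)
        print(transform(xml))   # serialized <summary> document shaped for the client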

  13. Non-invasive light-weight integration engine for building EHR from autonomous distributed systems.

    PubMed

    Crespo Molina, Pere; Angulo Fernández, Carlos; Maldonado Segura, José A; Moner Cano, David; Robles Viejo, Montserrat

    2006-01-01

    Pangea-LE is a message-oriented, lightweight integration engine allowing concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and serves it to requesting client applications in a flexible XML format. This XML response message can be formatted on demand by an appropriate XSL (Extensible Stylesheet Language) transformation in order to fit client application needs. In this article we present a real use case in which Pangea-LE collects and generates, on the fly, a structured view of all the patient clinical information available in a healthcare organisation. This information is presented to healthcare professionals in an EHR (Electronic Health Record) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real environment has been a notable success due to the non-invasive approach, which fully respects the existing information systems.

  14. NASA Engine Icing Research Overview: Aeronautics Evaluation and Test Capabilities (AETC) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2015-01-01

    The occurrence of ice accretion within commercial high-bypass aircraft turbine engines has been reported by airlines under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion by the engine. The ice crystals can result in degraded engine performance, loss of thrust control, compressor surge or stall, and flameout of the combustor. The Aviation Safety Program at NASA has taken on the technical challenge of turbofan engine icing caused by ice crystals, which can exist in high-altitude convective clouds. The NASA engine icing project consists of an integrated approach with four concurrent and ongoing research elements, each of which feeds critical information to the next element. The project objective is to gain understanding of high-altitude ice crystals by developing knowledge bases and test facilities for testing full engines and engine components. The first element is to utilize a highly instrumented aircraft to characterize the high-altitude convective cloud environment. The second element is the enhancement of the Propulsion Systems Laboratory altitude test facility for gas turbine engines to include the addition of an ice crystal cloud. The third element is basic research of the fundamental physics associated with ice crystal accretion. The fourth and final element is the development of computational tools with the goal of simulating the effects of ice crystal ingestion on compressor and gas turbine engine performance. The NASA goal is to provide knowledge to the engine and aircraft manufacturing communities to help mitigate or eliminate turbofan engine interruptions, engine damage, and failures due to ice crystal ingestion.

  15. Algorithms and software for nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Gilbertsen, Noreen D.; Neal, Mark O.

    1989-01-01

    The objective of this research is to develop efficient methods for explicit time integration in nonlinear structural dynamics on computers which utilize both concurrency and vectorization. As a framework for these studies, the program WHAMS is used, which is described in "Explicit Algorithms for the Nonlinear Dynamics of Shells" (T. Belytschko, J. I. Lin, and C.-S. Tsay, Computer Methods in Applied Mechanics and Engineering, Vol. 42, 1984, pp. 225-251). Two factors make the development of an efficient concurrent explicit time integration program a challenge in structural dynamics: (1) the need for a variety of element types, which complicates the scheduling-allocation problem; and (2) the need for different time steps in different parts of the mesh, here called mixed delta-t integration, so that a few stiff elements do not reduce the time steps throughout the mesh.
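
    The motivation for mixed delta-t integration is easy to quantify. In the Python sketch below, the stable explicit step of each element is taken as its length divided by the material wave speed, so a single small element throttles a uniform-step mesh; the mesh and wave speed are invented numbers, not data from WHAMS.

        import numpy as np

        lengths = np.array([1.0] * 99 + [0.01])   # one element 100x smaller
        wave_speed = 1.0
        dt_elem = lengths / wave_speed            # per-element stable steps

        uniform = len(lengths) / dt_elem.min()    # element updates per unit time
        mixed = np.sum(1.0 / dt_elem)             # each element at its own step
        print(f"uniform dt: {uniform:.0f} updates; mixed dt: {mixed:.0f} updates")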

  16. Multiprocessor smalltalk: Implementation, performance, and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pallas, J.I.

    1990-01-01

    Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes, and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools--serialization, replication, and reorganization--adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
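
    Two of the parallel control abstractions mentioned, futures and parallel iterators, have close analogues in most modern languages. The Python sketch below is such an analogue (built on concurrent.futures, not on Smalltalk blocks and semaphores) and is offered only to make the abstractions concrete:

        from concurrent.futures import ThreadPoolExecutor

        pool = ThreadPoolExecutor(max_workers=5)

        def parallel_collect(items, fn):
            """Parallel analogue of Smalltalk's collect: iterator."""
            return list(pool.map(fn, items))

        future = pool.submit(sum, range(1_000_000))      # starts computing at once
        squares = parallel_collect(range(8), lambda i: i * i)
        print(squares, future.result())                  # result() blocks until ready
        pool.shutdown()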

  17. Nuclei-mode particulate emissions and their response to fuel sulfur content and primary dilution during transient operations of old and modern diesel engines.

    PubMed

    Liu, Z Gerald; Vasys, Victoria N; Kittelson, David B

    2007-09-15

    The effects of fuel sulfur content and primary dilution on PM number emissions were investigated during transient operations of an old and a modern diesel engine. Emissions were also studied during steady-state operations in order to confirm consistency with previous findings. Testing methods were concurrent with those implemented by the EPA to regulate PM mass emissions, including the use of the Federal Transient Testing Procedure-Heavy Duty cycle to simulate transient conditions and the use of a Critical Flow Venturi-Constant Volume System to provide primary dilution. Steady-state results were found to be consistent with previous studies in that nuclei-mode particulate emissions were largely reduced when lower-sulfur content fuel was used in the newer engine, while the nuclei-mode PM emissions from the older engine were much less affected by fuel sulfur content. The transient results, however, show that the total number of nuclei-mode PM emissions from both engines increases with fuel sulfur content, although this effect is only seen under the higher primary dilution ratios with the older engine. Transient results further show that higher primary dilution ratios increase total nuclei-mode PM number emissions in both engines.

  18. Manufacturing process and material selection in concurrent collaborative design of MEMS devices

    NASA Astrophysics Data System (ADS)

    Zha, Xuan F.; Du, H.

    2003-09-01

    In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. Fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, manufacturing process and material hierarchies, and the selection strategy, are first addressed. Then, a fuzzy decision support scheme for a multi-criteria decision-making problem is proposed for estimating, ranking, and selecting possible manufacturing processes, materials, and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed on a client/knowledge-server architecture and framework to help the designer find good processes and materials for MEMS devices. The system, one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool that can be used as a standalone application or a Java applet via the Web. The running sessions of the system are inter-linked with Web pages of tutorials and reference pages that explain the facets, fabrication processes, and material choices; calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and an interactive knowledge base that can be maintained and updated via the Internet. The use of the developed system, including an operation scenario, user support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
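
    The ranking step of such a fuzzy decision scheme can be sketched briefly. In the Python fragment below, candidate process/material combinations are rated with triangular fuzzy numbers, aggregated with criterion weights, and defuzzified by centroid to produce a ranking; the criteria, weights, candidates, and ratings are all invented, and this simplified centroid scheme stands in for the paper's full method.

        def centroid(tfn):
            """Defuzzify a triangular fuzzy number (low, mode, high)."""
            low, mode, high = tfn
            return (low + mode + high) / 3.0

        weights = [0.5, 0.3, 0.2]   # e.g., feasibility, cost, tolerance
        candidates = {
            "bulk micromachining + single-crystal Si": [(7, 8, 9), (5, 6, 7), (6, 7, 8)],
            "surface micromachining + polysilicon":    [(6, 7, 8), (7, 8, 9), (4, 5, 6)],
        }
        scores = {name: sum(w * centroid(r) for w, r in zip(weights, ratings))
                  for name, ratings in candidates.items()}
        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{score:5.2f}  {name}")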

  19. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) a sparse matrix iterative solver in PISCES Fortran; (5) an image processing application of PISCES; and (6) a formal model of concurrent computation under development.

  20. The Raid distributed database system

    NASA Technical Reports Server (NTRS)

    Bhargava, Bharat; Riedl, John

    1989-01-01

    Raid, a robust and adaptable distributed database system for transaction processing (TP), is described. Raid is a message-passing system, with server processes on each site to manage concurrent processing, consistent replicated copies during site failures, and atomic distributed commitment. A high-level layered communications package provides a clean location-independent interface between servers. The latest design of the package delivers messages via shared memory in a configuration with several servers linked into a single process. Raid provides the infrastructure to investigate various methods for supporting reliable distributed TP. Measurements on TP and server CPU time are presented, along with data from experiments on communications software, consistent replicated copy control during site failures, and concurrent distributed checkpointing. A software tool for evaluating the implementation of TP algorithms in an operating-system kernel is proposed.
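
    The location-independent server interface described above can be caricatured in a few lines. In the Python sketch below, servers are addressed by (site, role) names and a registry hides the transport behind send; the names and the single-process queue transport are invented stand-ins for Raid's layered communications package.

        import queue

        registry = {}                                   # (site, role) -> mailbox

        def register(site, role):
            registry[(site, role)] = queue.Queue()

        def send(site, role, message):                  # caller never sees transport
            registry[(site, role)].put(message)

        register("site-A", "concurrency-controller")
        send("site-A", "concurrency-controller", {"txn": 42, "op": "commit?"})
        print(registry[("site-A", "concurrency-controller")].get())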

  1. Multidisciplinary Design Technology Development: A Comparative Investigation of Integrated Aerospace Vehicle Design Tools

    NASA Technical Reports Server (NTRS)

    Renaud, John E.; Batill, Stephen M.; Brockman, Jay B.

    1998-01-01

    This research effort is a joint program between the Departments of Aerospace and Mechanical Engineering and the Computer Science and Engineering Department at the University of Notre Dame, directed by three Principal Investigators: Drs. Renaud, Brockman, and Batill. During the four-and-a-half-year grant period, six Aerospace and Mechanical Engineering Ph.D. students and one Masters student received full or partial support, while four Computer Science and Engineering Ph.D. students and one Masters student were supported. During each of the summers, up to four undergraduate students were involved in related research activities. The purpose of the project was to develop a framework and systematic methodology to facilitate the application of Multidisciplinary Design Optimization (MDO) to a diverse class of system design problems. For all practical aerospace systems, the design of a system is a complex sequence of events which integrates the activities of a variety of discipline "experts" and their associated "tools". The development, archiving, and exchange of information between these individual experts is central to the design task, and it is this information which provides the basis for these experts to make coordinated design decisions (i.e., compromises and trade-offs), resulting in the final product design. Grant efforts focused on developing and evaluating frameworks for effective design coordination within an MDO environment. Central to these research efforts was the concept that the individual discipline "expert", using the most appropriate "tools" available and the most complete description of the system, should be empowered to have the greatest impact on the design decisions and final design. This means that the overall process must be highly interactive and efficiently conducted if the resulting design is to be developed in a manner consistent with cost and time requirements. The methods developed as part of this research effort include extensions to a sensitivity-based Concurrent Subspace Optimization (CSSO) MDO algorithm, the development of a neural network response surface based CSSO-MDO algorithm, and the integration of distributed computing and process scheduling into the MDO environment. This report overviews research efforts in each of these focus areas. A complete bibliography of research produced with support of this grant is attached.

  2. Patterns of and Motivations for Concurrent Use of Video Games and Substances

    PubMed Central

    Ream, Geoffrey L.; Elliott, Luther C.; Dunlap, Eloise

    2011-01-01

    “Behavioral addictions” share biological mechanisms with substance dependence, and “drug interactions” have been observed between certain substances and self-reinforcing behaviors. This study examines correlates of patterns of and motivations for playing video games while using or feeling the effects of a substance (concurrent use). Data were drawn from a nationally-representative survey of adult Americans who “regularly” or “occasionally” played video games and had played for at least one hour in the past seven days (n = 3,380). Only recent concurrent users’ data were included in analyses (n = 1,196). Independent variables included demographics, substance use frequency and problems, game genre of concurrent use (identified by looking titles up in an industry database), and general game playing variables including problem video game play (PVP), consumer involvement, enjoyment, duration, and frequency of play. Exploratory factor analysis identified the following dimensions underlying patterns of and motivations for concurrent use: pass time or regulate negative emotion, enhance an already enjoyable or positive experience, and use of video games and substances to remediate each other’s undesirable effects. Multivariate regression analyses indicated PVP and hours/day of video game play were associated with most patterns/motivations, as were caffeine, tobacco, alcohol, marijuana, and painkiller use problems. This suggests that concurrent use with some regular situational pattern or effect-seeking motivation is part of the addictive process underlying both PVP and substance dependence. Various demographic, game playing, game genre of concurrent use, and substance use variables were associated with specific motivations/patterns, indicating that all are important in understanding concurrent use. PMID:22073024

  3. Patterns of and motivations for concurrent use of video games and substances.

    PubMed

    Ream, Geoffrey L; Elliott, Luther C; Dunlap, Eloise

    2011-10-01

    "Behavioral addictions" share biological mechanisms with substance dependence, and "drug interactions" have been observed between certain substances and self-reinforcing behaviors. This study examines correlates of patterns of and motivations for playing video games while using or feeling the effects of a substance (concurrent use). Data were drawn from a nationally-representative survey of adult Americans who "regularly" or "occasionally" played video games and had played for at least one hour in the past seven days (n = 3,380). Only recent concurrent users' data were included in analyses (n = 1,196). Independent variables included demographics, substance use frequency and problems, game genre of concurrent use (identified by looking titles up in an industry database), and general game playing variables including problem video game play (PVP), consumer involvement, enjoyment, duration, and frequency of play. Exploratory factor analysis identified the following dimensions underlying patterns of and motivations for concurrent use: pass time or regulate negative emotion, enhance an already enjoyable or positive experience, and use of video games and substances to remediate each other's undesirable effects. Multivariate regression analyses indicated PVP and hours/day of video game play were associated with most patterns/motivations, as were caffeine, tobacco, alcohol, marijuana, and painkiller use problems. This suggests that concurrent use with some regular situational pattern or effect-seeking motivation is part of the addictive process underlying both PVP and substance dependence. Various demographic, game playing, game genre of concurrent use, and substance use variables were associated with specific motivations/patterns, indicating that all are important in understanding concurrent use.

  4. When Content Matters: The Role of Processing Code in Tactile Display Design.

    PubMed

    Ferris, Thomas K; Sarter, Nadine

    2010-01-01

    The distribution of tasks and stimuli across multiple modalities has been proposed as a means to support multitasking in data-rich environments. Recently, the tactile channel and, more specifically, communication via the use of tactile/haptic icons have received considerable interest. Past research has examined primarily the impact of concurrent task modality on the effectiveness of tactile information presentation. However, it is not well known to what extent the interpretation of iconic tactile patterns is affected by another attribute of information: the information processing codes of concurrent tasks. In two driving simulation studies (n = 25 for each), participants decoded icons composed of either spatial or nonspatial patterns of vibrations (engaging spatial and nonspatial processing code resources, respectively) while concurrently interpreting spatial or nonspatial visual task stimuli. As predicted by Multiple Resource Theory, performance was significantly worse (approximately 5-10 percent worse) when the tactile icons and visual tasks engaged the same processing code, with the overall worst performance in the spatial-spatial task pairing. The findings from these studies contribute to an improved understanding of information processing and can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that competition for processing code resources warrants consideration, alongside other factors such as the naturalness of signal-message mapping, when designing iconic tactile displays. Nonspatially encoded tactile icons may be preferable in environments which already rely heavily on spatial processing, such as car cockpits.

  5. Design and Implementation of the Boundary Layer Transition Flight Experiment on Space Shuttle Discovery

    NASA Technical Reports Server (NTRS)

    Spanos, Theodoros A.; Micklos, Ann

    2010-01-01

    In an effort to better understand high-speed aerodynamics, a series of flight experiments was installed on Space Shuttle Discovery during the STS-119 and STS-128 missions. This experiment, known as the Boundary Layer Transition Flight Experiment (BLTFE), provided the technical community with actual entry flight data from a known-height protuberance at Mach numbers at and above Mach 15; any such data above Mach 15 are irreproducible in a laboratory setting. Years of effort were invested in obtaining this valuable data, and many obstacles had to be overcome to ensure the success of implementing an Orbiter modification. Many Space Shuttle systems were involved in the installation of the appropriate components, which revealed that 'concurrent engineering' was a key integration tool: it allowed the coordination of the various parts and pieces, which had to be sequenced appropriately and installed at the right time. Several of the issues encountered include Orbiter configuration and access, design requirements versus the current layout, implementing the modification versus typical processing timelines, and optimizing the engineering design cycles and changes. Open lines of communication within the entire modification team were essential to project success, as the team was spread out across the United States, from NASA Kennedy Space Center in Florida, to NASA Johnson Space Center in Texas, to Boeing Huntington Beach, California, among others. This forum permits the discussion of processing concerns from the design phase to the implementation phase, which eventually saw the successful flights and data acquisition on STS-119 in March 2009 and on STS-128 in September 2009.

  6. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale the processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Traditional methods are therefore becoming increasingly inefficient in coping with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate the feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence, and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering, model-based design environment.

  7. Modulation of task demands suggests that semantic processing interferes with the formation of episodic associations

    PubMed Central

    Long, Nicole M.; Kahana, Michael J.

    2016-01-01

    Although episodic and semantic memory share overlapping neural mechanisms, it remains unclear how our pre-existing semantic associations modulate the formation of new, episodic associations. When freely recalling recently studied words, people rely on both episodic and semantic associations, shown through temporal and semantic clustering of responses. We asked whether orienting participants toward semantic associations interferes with or facilitates the formation of episodic associations. We compared electroencephalographic (EEG) activity recorded during the encoding of subsequently recalled words that were either temporally or semantically clustered. Participants studied words with or without a concurrent semantic orienting task. We identified a neural signature of successful episodic association formation whereby high frequency EEG activity (HFA, 44 – 100 Hz) overlying left prefrontal regions increased for subsequently temporally clustered words, but only for those words studied without a concurrent semantic orienting task. To confirm that this disruption in the formation of episodic associations was driven by increased semantic processing, we measured the neural correlates of subsequent semantic clustering. We found that HFA increased for subsequently semantically clustered words only for lists with a concurrent semantic orienting task. This dissociation suggests that increased semantic processing of studied items interferes with the neural processes that support the formation of novel episodic associations. PMID:27617775

  8. Modulation of task demands suggests that semantic processing interferes with the formation of episodic associations.

    PubMed

    Long, Nicole M; Kahana, Michael J

    2017-02-01

    Although episodic and semantic memory share overlapping neural mechanisms, it remains unclear how our pre-existing semantic associations modulate the formation of new, episodic associations. When freely recalling recently studied words, people rely on both episodic and semantic associations, shown through temporal and semantic clustering of responses. We asked whether orienting participants toward semantic associations interferes with or facilitates the formation of episodic associations. We compared electroencephalographic (EEG) activity recorded during the encoding of subsequently recalled words that were either temporally or semantically clustered. Participants studied words with or without a concurrent semantic orienting task. We identified a neural signature of successful episodic association formation whereby high-frequency EEG activity (HFA, 44-100 Hz) overlying left prefrontal regions increased for subsequently temporally clustered words, but only for those words studied without a concurrent semantic orienting task. To confirm that this disruption in the formation of episodic associations was driven by increased semantic processing, we measured the neural correlates of subsequent semantic clustering. We found that HFA increased for subsequently semantically clustered words only for lists with a concurrent semantic orienting task. This dissociation suggests that increased semantic processing of studied items interferes with the neural processes that support the formation of novel episodic associations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. New technologies for space avionics

    NASA Technical Reports Server (NTRS)

    Aibel, David W.; Dingus, Peter; Lanciault, Mark; Hurdlebrink, Debra; Gurevich, Inna; Wenglar, Lydia

    1994-01-01

    This report reviews a 1994 effort that continued 1993 investigations into issues associated with the definition of requirements and with the practice of concurrent engineering and rapid prototyping, in the context of developing a prototype next-generation reaction jet driver controller. The report discusses lessons learned, the testing of the current prototype, the details of the current design, and the nature and performance of a mathematical model of the life cycle of a pilot-operated valve solenoid.

  10. Lightning safety of animals.

    PubMed

    Gomes, Chandima

    2012-11-01

    This paper addresses a concurrent multidisciplinary problem: animal safety against lightning hazards. In regions where lightning is prevalent, either seasonally or throughout the year, a considerable number of wild, captive and tame animals are injured due to lightning generated effects. The paper discusses all possible injury mechanisms, focusing mainly on animals with commercial value. A large number of cases from several countries have been analyzed. Economically and practically viable engineering solutions are proposed to address the issues related to the lightning threats discussed.

  11. Architectural Guidelines for Multimedia and Hypermedia Data Interchange: Computer Aided Acquisition and Logistics Support/Concurrent Engineering (CALS/ CE) and Electronic Commerce/Electronic Data Interchange (EC/EDI)

    DTIC Science & Technology

    1991-09-01

    [Record excerpt garbled in extraction; only table-of-contents fragments survive, referencing e-mail gateways from other networks into SNA via a Softswitch gateway, the CALS Test Network (CTN), industrial networks, and FTS-2000/ISDN infrastructure.]

  12. New Technologies for Space Avionics, 1993

    NASA Technical Reports Server (NTRS)

    Aibel, David W.; Harris, David R.; Bartlett, Dave; Black, Steve; Campagna, Dave; Fernald, Nancy; Garbos, Ray

    1993-01-01

    The report reviews a 1993 effort that investigated issues associated with the development of requirements, with the practice of concurrent engineering and with rapid prototyping, in the development of a next-generation Reaction Jet Drive Controller. This report details lessons learned, the current status of the prototype, and suggestions for future work. The report concludes with a discussion of the vision of future avionics architectures based on the principles associated with open architectures and integrated vehicle health management.

  13. Coverage Metrics for Model Checking

    NASA Technical Reports Server (NTRS)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
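
    The underlying idea, measuring how much of a state space a bounded exploration actually touched, fits in a short sketch. The Python fragment below depth-bounds a search over a two-thread toy system and reports a simple state-coverage figure; the toy system and the metric are our illustration, not the group's metrics.

        def step(state, thread):            # two threads each increment a counter
            counters = list(state)
            counters[thread] += 1
            return tuple(counters)

        visited = set()

        def explore(state, depth):          # depth-bounded model checking run
            if depth == 0 or state in visited:
                return
            visited.add(state)
            for thread in (0, 1):
                explore(step(state, thread), depth - 1)

        explore((0, 0), depth=2)
        reachable = {(a, b) for a in range(3) for b in range(3) if a + b <= 2}
        print(f"state coverage: {len(visited)}/{len(reachable)}")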

  14. Developmental changes in distinguishing concurrent auditory objects.

    PubMed

    Alain, Claude; Theunissen, Eef L; Chevalier, Hélène; Batty, Magali; Taylor, Margot J

    2003-04-01

    Children have considerable difficulties in identifying speech in noise. In the present study, we examined age-related differences in central auditory functions that are crucial for parsing co-occurring auditory events using behavioral and event-related brain potential measures. Seventeen pre-adolescent children and 17 adults were presented with complex sounds containing multiple harmonics, one of which could be 'mistuned' so that it was no longer an integer multiple of the fundamental. Both children and adults were more likely to report hearing the mistuned harmonic as a separate sound with an increase in mistuning. However, children were less sensitive in detecting mistuning across all levels as revealed by lower d' scores than adults. The perception of two concurrent auditory events was accompanied by a negative wave that peaked at about 160 ms after sound onset. In both age groups, the negative wave, referred to as the 'object-related negativity' (ORN), increased in amplitude with mistuning. The ORN was larger in children than in adults despite a lower d' score. Together, the behavioral and electrophysiological results suggest that concurrent sound segregation is probably adult-like in pre-adolescent children, but that children are inefficient in processing the information following the detection of mistuning. These findings also suggest that processes involved in distinguishing concurrent auditory objects continue to mature during adolescence.

  15. A testing platform for durability studies of polymers and fiber-reinforced polymer composites under concurrent hygrothermo-mechanical stimuli.

    PubMed

    Gomez, Antonio; Pires, Robert; Yambao, Alyssa; La Saponara, Valeria

    2014-12-11

    The durability of polymers and fiber-reinforced polymer composites under service conditions is a critical aspect to be addressed for their robust design and condition-based maintenance. These materials are adopted in a wide range of engineering applications, from aircraft and ship structures to bridges, wind turbine blades, biomaterials, and biomedical implants. Polymers are viscoelastic materials, and their response may be highly nonlinear, which makes it challenging to predict and monitor their in-service performance. The laboratory-scale testing platform presented herein assists the investigation of the influence of concurrent mechanical loadings and environmental conditions on these materials. The platform was designed to be low-cost and user-friendly. Its chemically resistant materials make the platform adaptable to studies of chemical degradation due to in-service exposure to fluids. An example experiment was conducted at room temperature (RT) on closed-cell polyurethane foam samples loaded with a weight corresponding to ~50% of their ultimate static and dry load. Results show that the testing apparatus is appropriate for these studies. Results also highlight the greater vulnerability of the polymer under concurrent loading, based on the higher mid-point displacements and lower residual failure loads. Recommendations are made for additional improvements to the testing apparatus.

  16. A Testing Platform for Durability Studies of Polymers and Fiber-reinforced Polymer Composites under Concurrent Hygrothermo-mechanical Stimuli

    PubMed Central

    Gomez, Antonio; Pires, Robert; Yambao, Alyssa; La Saponara, Valeria

    2014-01-01

    The durability of polymers and fiber-reinforced polymer composites under service conditions is a critical aspect to be addressed for their robust design and condition-based maintenance. These materials are adopted in a wide range of engineering applications, from aircraft and ship structures to bridges, wind turbine blades, biomaterials, and biomedical implants. Polymers are viscoelastic materials, and their response may be highly nonlinear, which makes it challenging to predict and monitor their in-service performance. The laboratory-scale testing platform presented herein assists the investigation of the influence of concurrent mechanical loadings and environmental conditions on these materials. The platform was designed to be low-cost and user-friendly. Its chemically resistant materials make the platform adaptable to studies of chemical degradation due to in-service exposure to fluids. An example experiment was conducted at room temperature (RT) on closed-cell polyurethane foam samples loaded with a weight corresponding to ~50% of their ultimate static and dry load. Results show that the testing apparatus is appropriate for these studies. Results also highlight the greater vulnerability of the polymer under concurrent loading, based on the higher mid-point displacements and lower residual failure loads. Recommendations are made for additional improvements to the testing apparatus. PMID:25548950

  17. Information processing biases concurrently and prospectively predict depressive symptoms in adolescents: Evidence from a self-referent encoding task.

    PubMed

    Connolly, Samantha L; Abramson, Lyn Y; Alloy, Lauren B

    2016-01-01

    Negative information processing biases have been hypothesised to serve as precursors for the development of depression. The current study examined negative self-referent information processing and depressive symptoms in a community sample of adolescents (N = 291, mean age at baseline = 12.34 ± 0.61 years; 53% female; 47.4% African-American, 49.5% Caucasian, and 3.1% biracial). Participants completed a computerised self-referent encoding task (SRET) and a measure of depressive symptoms at baseline and completed an additional measure of depressive symptoms nine months later. Several negative information processing biases on the SRET were associated with concurrent depressive symptoms and predicted increases in depressive symptoms at follow-up. Findings partially support the hypothesis that negative information processing biases are associated with depressive symptoms in a nonclinical sample of adolescents, and provide preliminary evidence that these biases prospectively predict increases in depressive symptoms.

  18. Relationships among cognitive deficits and component skills of reading in younger and older students with developmental dyslexia.

    PubMed

    Park, Heeyoung; Lombardino, Linda J

    2013-09-01

    Processing speed deficits, along with phonological awareness deficits, have been identified as risk factors for dyslexia. This study was designed to examine the behavioral profiles of two groups, a younger (6-8 years) and an older (10-15 years) group of dyslexic children, for the purposes of (1) evaluating the degree to which phonological awareness and processing speed deficits occur in the two developmental cohorts; (2) determining the strength of relationships between the groups' respective mean scores on cognitive tasks of phonological awareness and processing speed and their scores on component skills of reading; and (3) evaluating the degree to which phonological awareness and processing speed serve as concurrent predictors of component reading skills for each group. The mean scaled scores for both groups were similar on all but one processing speed task: the older group scored significantly lower on a visual matching test of attention, scanning, and speed. Correlations between reading skills and the cognitive constructs were very similar for both age groups. Neither of the two phonological awareness tasks correlated with either of the two processing speed tasks or with any of the three measures of reading. One of the two processing speed measures served as a concurrent predictor of word- and text-level reading in the younger group; however, only the rapid naming measure functioned as a concurrent predictor of word reading in the older group. Conversely, phonological processing measures did not serve as concurrent predictors of word-level or text-level reading in either group. Descriptive analyses of individual subjects' deficits in the domains of phonological awareness and processing speed revealed that (1) both linguistic and nonlinguistic processing speed deficits occurred at higher rates in the younger dyslexic children than deficits in phonological awareness, and (2) cognitive deficits within and across these two domains were greater in the older dyslexic children. Our findings underscore the importance of using rapid naming measures when testing school-age children suspected of having a reading disability and suggest that processing speed measures that do not rely on verbal responses may serve as predictors of reading disability in young children prior to their development of naming automaticity. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    NASA Technical Reports Server (NTRS)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters from basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model with regard to past COMPASS designs, including an assessment of outlying spacecraft, and compares the results to historical data from completed NASA missions.
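
    A minimal sketch of such a top-down estimate (illustrative only; the variables and numbers below are invented, not COMPASS data): fit a linear model from basic mission characteristics to a parameter such as dry mass, then apply it to a new concept.

        import numpy as np

        # Hypothetical design records: [payload mass (kg), delta-v (km/s),
        # duration (yr)] and observed dry mass (kg); values are placeholders.
        X = np.array([
            [100.0, 3.0, 2.0],
            [250.0, 5.5, 4.0],
            [80.0,  2.0, 1.5],
            [400.0, 7.0, 6.0],
        ])
        y = np.array([450.0, 1100.0, 320.0, 1900.0])  # dry mass (kg)

        # Append an intercept column and fit by ordinary least squares.
        A = np.hstack([X, np.ones((X.shape[0], 1))])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Estimate a new concept: payload 150 kg, delta-v 4 km/s, 3 yr.
        new_mission = np.array([150.0, 4.0, 3.0, 1.0])
        print(f"estimated dry mass: {new_mission @ coeffs:.0f} kg")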

  20. Nanopatterned carbon films with engineered morphology by direct carbonization of UV-stabilized block copolymer films.

    PubMed

    Wang, Yong; Liu, Jinquan; Christiansen, Silke; Kim, Dong Ha; Gösele, Ulrich; Steinhart, Martin

    2008-11-01

    Nanopatterned thin carbon films were prepared by direct and expeditious carbonization of the block copolymer polystyrene-block-poly(2-vinylpyridine) (PS-b-P2VP), without the need for slow heating to the process temperature or for the addition of further carbon precursors. Carbonaceous films having an ordered "dots-on-film" surface topology were obtained from reverse micelle monolayers. The regular nanoporous morphology of PS-b-P2VP films, obtained by subjecting reverse micelle monolayers to swelling-induced surface reconstruction, could likewise be transferred to carbon films, which were thus characterized by ordered nanopit arrays. Stabilization of PS-b-P2VP by UV irradiation and the concurrent carbonization of both blocks were key to the conservation of the film topography. The approach reported here may enable the realization of a broad range of nanoscaled architectures for carbonaceous materials using a block copolymer ideally suited as a template because of the pronounced repulsion between its blocks and its capability to form highly ordered microdomain structures.

  1. Fracture of Human Femur Tissue Monitored by Acoustic Emission Sensors

    PubMed Central

    Aggelis, Dimitrios G.; Strantza, Maria; Louis, Olivia; Boulpaep, Frans; Polyzos, Demosthenes; van Hemelrijck, Danny

    2015-01-01

    The study describes the acoustic emission (AE) activity during human femur tissue fracture. The specimens were fractured in a bending-torsion loading pattern with concurrent monitoring by two AE sensors. The number of recorded signals correlates well with the applied load, placing the onset of micro-fracture at approximately one sixth of the maximum load. Furthermore, waveform frequency content and rise time are related to the different modes of fracture (bending of the femur neck or torsion of the diaphysis). The importance of the study lies mainly in two disciplines. First, although femurs are typically subjects of surgical repair in humans, detailed monitoring of the fracture with AE will enrich the understanding of the process in ways that cannot be achieved using only the mechanical data. Additionally, from the point of view of monitoring techniques, applying sensors intended for engineering materials and interpreting the obtained data pose additional difficulties due to the uniqueness of the bone structure. PMID:25763648

  2. Modeling and implementation of concurrent logic controllers with use of Petri nets, LSMs, and sequent calculus

    NASA Astrophysics Data System (ADS)

    Tkacz, J.; Bukowiec, A.; Doligalski, M.

    2017-08-01

    The paper presents a method for the modeling and implementation of concurrent controllers. Concurrent controllers are specified by Petri nets, which are then decomposed using a symbolic deduction method of analysis. Formal methods, namely a sequent calculus system incorporating elements of Thelen's algorithm, are used here. As a result, linked state machines (LSMs) are obtained. Each FSM is implemented using methods of structural decomposition during the logic synthesis process. The method of multiple encoding of microinstructions has been applied, which decreases the number of Boolean functions realized by the combinational part of each FSM. The additional decoder can be implemented with the use of memory blocks.
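
    To illustrate the flavor of multiple encoding of microinstructions (a schematic sketch only, not the authors' implementation; the two tiny machines and their tables are invented): each FSM emits a short output code, and a separate decoder, which would map to a memory block in hardware, expands the code into the full microinstruction.

        # code -> microinstruction (full output vector); the decoder would be
        # a memory block in hardware, so the FSM logic only produces codes.
        DECODER = {
            0: (0, 0, 0, 0),
            1: (1, 0, 1, 0),
            2: (0, 1, 0, 1),
        }

        class LinkedFSM:
            def __init__(self, start, table):
                # table: state -> {input symbol: (next state, output code)}
                self.state, self.table = start, table

            def step(self, symbol):
                self.state, code = self.table[self.state][symbol]
                return DECODER[code]  # decoded microinstruction

        # Two linked machines; m2 is stepped with the first bit of m1's output.
        m1 = LinkedFSM("s0", {"s0": {0: ("s1", 1), 1: ("s0", 0)},
                              "s1": {0: ("s0", 2), 1: ("s1", 1)}})
        m2 = LinkedFSM("q0", {"q0": {0: ("q0", 0), 1: ("q1", 2)},
                              "q1": {0: ("q0", 1), 1: ("q1", 0)}})

        out1 = m1.step(0)
        out2 = m2.step(out1[0])
        print(out1, out2)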

  3. Visual attention and emotional memory: recall of aversive pictures is partially mediated by concurrent task performance.

    PubMed

    Pottage, Claire L; Schaefer, Alexandre

    2012-02-01

    The emotional enhancement of memory is often thought to be determined by attention. However, recent evidence using divided attention paradigms suggests that attention does not play a significant role in the formation of memories for aversive pictures. We report a study that investigated this question using a paradigm in which participants had to encode lists of randomly intermixed negative and neutral pictures under conditions of full attention and divided attention (DA), followed by a free recall test. Attention was divided by a highly demanding concurrent task tapping visual processing resources. Results showed that the advantage in recall for aversive pictures was still present in the DA condition. However, mediation analyses also revealed that concurrent task performance significantly mediated the emotional enhancement of memory under divided attention. This finding suggests that visual attentional processes play a significant role in the formation of emotional memories. PsycINFO Database Record (c) 2012 APA, all rights reserved

  4. Concurrent evolution of feature extractors and modular artificial neural networks

    NASA Astrophysics Data System (ADS)

    Hannak, Victor; Savakis, Andreas; Yang, Shanchieh Jay; Anderson, Peter

    2009-05-01

    This paper presents a new approach for the design of feature-extracting recognition networks that do not require expert knowledge in the application domain. Feature-Extracting Recognition Networks (FERNs) are composed of interconnected functional nodes (feurons), which serve as feature extractors, and are followed by a subnetwork of traditional neural nodes (neurons) that act as classifiers. A concurrent evolutionary process (CEP) is used to search the space of feature extractors and neural networks in order to obtain an optimal recognition network that simultaneously performs feature extraction and recognition. By constraining the hill-climbing search functionality of the CEP on specific parts of the solution space, i.e., individually limiting the evolution of feature extractors and neural networks, it was demonstrated that concurrent evolution is a necessary component of the system. Application of this approach to a handwritten digit recognition task illustrates that the proposed methodology is capable of producing recognition networks that perform in line with other methods without the need for expert knowledge in image processing.

  5. Foveal Processing Under Concurrent Peripheral Load in Profoundly Deaf Adults

    PubMed Central

    2016-01-01

    Development of the visual system typically proceeds in concert with the development of audition. One result is that the visual system of profoundly deaf individuals differs from that of those with typical auditory systems. While past research has suggested deaf people have enhanced attention in the visual periphery, it is still unclear whether or not this enhancement entails deficits in central vision. Profoundly deaf and typically hearing adults were administered a variant of the useful field of view task that independently assessed performance on concurrent central and peripheral tasks. Identification of a foveated target was impaired by a concurrent selective peripheral attention task, more so in profoundly deaf adults than in the typically hearing. Previous findings of enhanced performance on the peripheral task were not replicated. These data are discussed in terms of flexible allocation of spatial attention targeted towards perceived task demands, and support a modified “division of labor” hypothesis whereby attentional resources co-opted to process peripheral space result in reduced resources in the central visual field. PMID:26657078

  6. Peer deviance, parenting and disruptive behavior among young girls.

    PubMed

    Miller, Shari; Loeber, Rolf; Hipwell, Alison

    2009-02-01

    This study examined concurrent and longitudinal associations between peer deviance, parenting practices, and conduct and oppositional problems among young girls ages 7 and 8. Participants were 588 African American and European American girls who were part of a population-based study of the development of conduct problems and delinquency among girls. Affiliations with problem-prone peers were apparent among a sizeable minority of the girls, and these associations included both males and females. Although peer delinquency concurrently predicted disruptive behaviors, the gender of these peers did not contribute to girls' behavior problems. Harsh parenting and low parental warmth showed both concurrent and prospective associations with girls' disruptive behaviors. Similar patterns of association were seen for African American and European American girls. The findings show that peer and parent risk processes are important contributors to the early development of young girls' conduct and oppositional behaviors. These data contribute to our understanding of girls' aggression and antisocial behaviors and further inform our understanding of risk processes for these behaviors among young girls in particular.

  7. Peer Deviance, Parenting and Disruptive Behavior among Young Girls

    PubMed Central

    Miller, Shari; Loeber, Rolf; Hipwell, Alison

    2009-01-01

    This study examined concurrent and longitudinal associations between peer deviance, parenting practices, and conduct and oppositional problems among young girls ages 7 and 8. Participants were 588 African American and European American girls who were part of a population-based study of the development of conduct problems and delinquency among girls. Affiliations with problem-prone peers were apparent among a sizeable minority of the girls, and these associations included both males and females. Although peer delinquency concurrently predicted disruptive behaviors, the gender of these peers did not contribute to girls’ behavior problems. Harsh parenting and low parental warmth showed both concurrent and prospective associations with girls’ disruptive behaviors. Similar patterns of association were seen for African American and European American girls. The findings show that peer and parent risk processes are important contributors to the early development of young girls’ conduct and oppositional behaviors. These data contribute to our understanding of girls’ aggression and antisocial behaviors and further inform our understanding of risk processes for these behaviors among young girls in particular. PMID:18777132

  8. Validation of Reverse-Engineered and Additive-Manufactured Microsurgical Instrument Prototype.

    PubMed

    Singh, Ramandeep; Suri, Ashish; Anand, Sneh; Baby, Britty

    2016-12-01

    With advancements in imaging techniques, neurosurgical procedures are becoming highly precise and minimally invasive, thus demanding development of new ergonomically aesthetic instruments. Conventionally, neurosurgical instruments are manufactured using subtractive manufacturing methods. Such a process is complex, time-consuming, and impractical for prototype development and validation of new designs. Therefore, an alternative design process has been used, utilizing blue-light scanning, computer-aided design, and additive manufacturing by direct metal laser sintering (DMLS), for microsurgical instrument prototype development. Deviations of the DMLS-fabricated instrument were studied by superimposing scan data of the fabricated instrument onto the computer-aided design model. Content and concurrent validity of the fabricated prototypes were assessed by a group of 15 neurosurgeons performing sciatic nerve anastomosis in small laboratory animals. Comparative scoring was obtained for the control and study instruments. A t test was applied to the individual parameters, and the P values for force (P < .0001) and surface roughness (P < .01) were found to be statistically significant. These two parameters were further analyzed using objective measures. The results show that additive manufacturing by DMLS provides an effective method for prototype development. However, direct application of these additive-manufactured instruments in the operating room requires further validation. © The Author(s) 2016.

  9. Charged Water Droplets can Melt Metallic Electrodes

    NASA Astrophysics Data System (ADS)

    Elton, Eric; Rosenberg, Ethan; Ristenpart, William

    2016-11-01

    A water drop, when immersed in an insulating fluid, acquires charge when it contacts an energized electrode. Provided the electric field is strong enough, the drop will move away to the opposite electrode, acquire the opposite charge, and repeat the process, effectively 'bouncing' back and forth between the electrodes. A key implicit assumption, dating back to Maxwell, has been that the electrode remains unaltered by the charging process. Here we demonstrate that the electrode is physically deformed during each charge transfer event with an individual water droplet or other conducting object. We used optical, electron, and atomic force microscopy to characterize a variety of different metallic electrodes before and after drops were electrically bounced on them. Although the electrodes appear unchanged to the naked eye, the microscopy reveals that each charge transfer event yielded a crater approximately 1 micron wide and 50 nm deep, with the exact dimensions proportional to the applied field strength. We present evidence that the craters are formed by localized melting of the electrodes via Joule heating in the metal and concurrent dielectric breakdown of the surrounding fluid, suggesting that the electrode locally achieves temperatures exceeding 3400°C.

  10. Experimental clean combustor program, phase 2

    NASA Technical Reports Server (NTRS)

    Roberts, R.; Peduzzi, A.; Vitti, G. E.

    1976-01-01

    Combustor pollution reduction technology for commercial CTOL engines was generated, and this technology was demonstrated in a full-scale JT9D engine in 1976. Component rig refinement tests of the two best combustor concepts were conducted. These concepts are the vorbix combustor and a hybrid combustor, which combines the pilot zone of the staged premix combustor and the main zone of the swirl-can combustor. Both concepts significantly reduced all pollutant emissions relative to the JT9D-7 engine combustor; however, neither concept met all program goals. The hybrid combustor met the pollution goals for unburned hydrocarbons and carbon monoxide but did not achieve the oxides-of-nitrogen goal, and it had significant performance deficiencies. The vorbix combustor met the goals for unburned hydrocarbons and oxides of nitrogen but did not achieve the carbon monoxide goal; its performance approached the engine requirements. On the basis of these results, the vorbix combustor was selected for the engine demonstration program. A control study was conducted to establish fuel control requirements imposed by the low-emission combustor concepts and to identify conceptual control system designs. Concurrent efforts were also completed on two addenda: an alternate fuels addendum and a combustion noise addendum.

  11. Motor-cognitive dual-task performance: effects of a concurrent motor task on distinct components of visual processing capacity.

    PubMed

    Künstler, E C S; Finke, K; Günther, A; Klingner, C; Witte, O; Bublak, P

    2018-01-01

    Dual tasking, or the simultaneous execution of two continuous tasks, is frequently associated with a performance decline that can be explained within a capacity sharing framework. In this study, we assessed the effects of a concurrent motor task on the efficiency of visual information uptake based on the 'theory of visual attention' (TVA). TVA provides parameter estimates reflecting distinct components of visual processing capacity: perceptual threshold, visual processing speed, and visual short-term memory (VSTM) storage capacity. Moreover, goodness-of-fit values and bootstrapping estimates were derived to test whether the TVA model is validly applicable under dual-task conditions, and whether the robustness of parameter estimates is comparable between single- and dual-task conditions. Twenty-four subjects of middle to older age performed a continuous tapping task and a visual processing task (whole report of briefly presented letter arrays) under both single- and dual-task conditions. Results suggest a decline of both visual processing speed and VSTM storage capacity under dual-task conditions, while the perceptual threshold remained unaffected by a concurrent motor task. In addition, goodness-of-fit values and bootstrapping estimates support the notion that participants processed the visual task in a qualitatively comparable, although quantitatively less efficient, way under dual-task conditions. The results support a capacity sharing account of motor-cognitive dual tasking and suggest that even performing a relatively simple motor task relies on central attentional capacity that is necessary for efficient visual information uptake.
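
    For orientation, these TVA parameters are commonly formalized as an exponential race (a standard formulation after Bundesen's TVA, not equations quoted from this paper; the notation is ours):

        v_i = C \,\frac{w_i}{\sum_j w_j},
        \qquad
        P\bigl(\text{item } i \text{ encoded by exposure } \tau\bigr)
          = 1 - e^{-v_i(\tau - t_0)}, \quad \tau > t_0,

    where C is the visual processing speed, t_0 the perceptual threshold, w_i the attentional weight of item i, and at most K items (the VSTM storage capacity) can be retained.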

  12. A sociocultural analysis of Latino high school students' funds of knowledge and implications for culturally responsive engineering education

    NASA Astrophysics Data System (ADS)

    Mejia, Joel Alejandro

    Previous studies have suggested that, when funds of knowledge are incorporated into science and mathematics curricula, students are more engaged and often develop richer understandings of scientific concepts. While there has been a growing body of research addressing how teachers may integrate students' linguistic, social, and cultural practices with science and mathematics instruction, very little research has been conducted on how the same can be accomplished with Latino and Latina students in engineering. The purpose of this study was to address this gap in the literature by investigating how fourteen Latino and Latina high school adolescents used their funds of knowledge to address engineering design challenges. This project was intended to enhance the educational experience of underrepresented minorities whose social and cultural practices have been traditionally undervalued in schools. This ethnographic study investigated the funds of knowledge of fourteen Latino and Latina high school adolescents and how they used these funds of knowledge in engineering design. Participant observation, bi-monthly group discussion, retrospective and concurrent protocols, and monthly one-on-one interviews were conducted during the study. A constant comparative analysis suggested that Latino and Latina adolescents, although profoundly underrepresented in engineering, bring a wealth of knowledge and experiences that are relevant to engineering design thinking and practice.

  13. Space Transportation Engine Program (STEP), phase B

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Space Transportation Engine Program (STEP) Phase 2 effort includes preliminary design and activities plan preparation that will allow a smooth and timely transition into a Prototype Phase and then into Phases 3, 4, and 5. A Concurrent Engineering approach using Total Quality Management (TQM) techniques is being applied to define an oxygen-hydrogen engine. The baseline from Phase 1/1' studies was used as a point of departure for trade studies and analyses. Existing STME system models are being enhanced as more detailed module/component characteristics are determined. Preliminary designs for the open expander, closed expander, and gas generator cycles were prepared, and recommendations for cycle selection were made at the Design Concept Review (DCR). As a result of the July 1990 DCR and information subsequently supplied to the Technical Review Team, a gas generator cycle was selected. Results of the various Advanced Development Programs (ADPs) for the Advanced Launch Systems (ALS) contributed to this effort. An active vehicle integration effort is supplying the NASA, Air Force, and vehicle contractors with engine parameters and data, and flowing down appropriate vehicle requirements. Engine design and analysis trade studies are being documented in a database that was developed and is being used to organize information. To date, seventy-four trade studies have been entered into the database.

  14. Toward Genome-Based Metabolic Engineering in Bacteria.

    PubMed

    Oesterle, Sabine; Wuethrich, Irene; Panke, Sven

    2017-01-01

    Prokaryotes modified stably on the genome are of great importance for production of fine and commodity chemicals. Traditional methods for genome engineering have long suffered from imprecision and low efficiencies, making construction of suitable high-producer strains laborious. Here, we review the recent advances in discovery and refinement of molecular precision engineering tools for genome-based metabolic engineering in bacteria for chemical production, with focus on the λ-Red recombineering and the clustered regularly interspaced short palindromic repeats/Cas9 nuclease systems. In conjunction, they enable the integration of in vitro-synthesized DNA segments into specified locations on the chromosome and allow for enrichment of rare mutants by elimination of unmodified wild-type cells. Combination with concurrently developing improvements in important accessory technologies such as DNA synthesis, high-throughput screening methods, regulatory element design, and metabolic pathway optimization tools has resulted in novel efficient microbial producer strains and given access to new metabolic products. These new tools have made and will likely continue to make a big impact on the bioengineering strategies that transform the chemical industry. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Concurrent growth of InSe wires and In2O3 tulip-like structures in the Au-catalytic vapour-liquid-solid process

    NASA Astrophysics Data System (ADS)

    Taurino, A.; Signore, M. A.

    2015-06-01

    In this work, the concurrent growth of InSe and In2O3 nanostructures, obtained by thermal evaporation of InSe powders on Au-covered Si substrates, has been investigated by scanning and transmission electron microscopy. The vapour-solid and Au-catalytic vapour-liquid-solid growth mechanisms, responsible for the simultaneous development of the two different types of nanostructures (InSe wires and In2O3 tulip-like structures, respectively), are discussed in detail. The thermodynamic processes giving rise to the obtained morphologies and materials are explained.

  16. Coherence number as a discrete quantum resource

    NASA Astrophysics Data System (ADS)

    Chin, Seungbeom

    2017-10-01

    We introduce a discrete coherence monotone named the coherence number, which is a generalization of the coherence rank to mixed states. After defining the coherence number in a manner similar to that of the Schmidt number in entanglement theory, we present a necessary and sufficient condition on the coherence number for a coherent state to be convertible to an entangled state of nonzero k-concurrence (a member of the generalized concurrence family with 2 ≤ k ≤ d). As an application of the coherence number to a practical quantum system, Grover's search algorithm over N items is considered. We show that the coherence number remains N and falls abruptly when the success probability of the searching process becomes maximal. This phenomenon motivates us to analyze the depletion pattern of Cc(N) (the last member of the generalized coherence concurrence, which is nonzero when the coherence number is N), which turns out to be an optimal resource for the process since it is completely consumed to finish the searching task. The generalization of the original Grover algorithm to arbitrary (mixed) initial states is also discussed, which reveals the boundary condition for the coherence to be monotonically decreasing under the process.
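
    For orientation (our notation, following the Schmidt-number analogy the abstract invokes): the coherence rank of a pure state counts its nonzero coefficients in the incoherent basis, and the coherence number extends this to mixed states through a convex-roof construction, roughly:

        r_C\bigl(|\psi\rangle\bigr) = \#\{\, i : c_i \neq 0 \,\}
        \quad \text{for } |\psi\rangle = \sum_i c_i |i\rangle,
        \qquad
        n_C(\rho) = \min_{\{p_k,\,|\psi_k\rangle\}} \; \max_k \; r_C\bigl(|\psi_k\rangle\bigr),

    where the minimum runs over all pure-state decompositions \rho = \sum_k p_k |\psi_k\rangle\langle\psi_k|.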

  17. Concurrent processing simulation of the space station

    NASA Technical Reports Server (NTRS)

    Gluck, R.; Hale, A. L.; Sunkel, John W.

    1989-01-01

    The development of a new capability for the time-domain simulation of multibody dynamic systems and its application to the study of large-angle rotational maneuvers of the Space Station is described. The effort was divided into three sequential tasks, each of which required significant advancement of the state of the art: (1) the development of an explicit mathematical model, via symbol manipulation, of a flexible multibody dynamic system; (2) the development of a methodology for balancing the computational load of an explicit mathematical model for concurrent processing; and (3) the implementation and successful simulation of the above on a prototype Custom Architectured Parallel Processing System (CAPPS) containing eight processors. The throughput rate achieved by the CAPPS, operating at only 70 percent efficiency, was 3.9 times greater than that obtained sequentially by the IBM 3090 supercomputer simulating the same problem. More significantly, analysis of the results leads to the conclusion that the relative cost effectiveness of concurrent versus sequential digital computation will grow substantially as the computational load is increased. This is a welcome development in an era when very complex and cumbersome mathematical models of large space vehicles must be used as substitutes for full-scale testing, which has become impractical.

  18. Parallel effects of memory set activation and search on timing and working memory capacity.

    PubMed

    Schweickert, Richard; Fortin, Claudette; Xi, Zhuangzhuang; Viau-Quesnel, Charles

    2014-01-01

    Accurately estimating a time interval is required in everyday activities such as driving or cooking. Estimating time is relatively easy, provided a person attends to it, but a brief shift of attention to another task usually interferes with timing. Most processes carried out concurrently with timing interfere with it; curiously, some do not. Literature on a few processes suggests a general proposition, the Timing and Complex-Span Hypothesis: a process interferes with concurrent timing if and only if process performance is related to complex span. Complex span is the number of items correctly recalled in order when each item presented for study is followed by a brief activity. Literature on task switching, visual search, memory search, word generation, and mental time travel supports the hypothesis. Previous work found that another process, activation of a memory set in long-term memory, is not related to complex span. If the Timing and Complex-Span Hypothesis is true, activation should not interfere with concurrent timing in dual-task conditions. We tested such activation in single-task memory search conditions and in dual-task conditions where memory search was executed with concurrent timing. In Experiment 1, activating a memory set increased reaction time, with no significant effect on time production. In Experiment 2, set size and memory set activation were manipulated; activation and set size had a puzzling interaction for time productions, perhaps due to difficult conditions, leading us to use a related but easier task in Experiment 3. In Experiment 3, increasing set size lengthened time production, but memory activation had no significant effect. Results here and in the previous literature on the whole support the Timing and Complex-Span Hypothesis. Results also support a sequential organization of activation and search of memory. This organization predicts that activation and set size have additive effects on reaction time and multiplicative effects on percent correct, which is what was found.

  19. ACSYNT inner loop flight control design study

    NASA Technical Reports Server (NTRS)

    Bortins, Richard; Sorensen, John A.

    1993-01-01

    The NASA Ames Research Center developed the Aircraft Synthesis (ACSYNT) computer program to synthesize conceptual future aircraft designs and to evaluate critical performance metrics early in the design process before significant resources are committed and cost decisions made. ACSYNT uses steady-state performance metrics, such as aircraft range, payload, and fuel consumption, and static performance metrics, such as the control authority required for the takeoff rotation and for landing with an engine out, to evaluate conceptual aircraft designs. It can also optimize designs with respect to selected criteria and constraints. Many modern aircraft have stability provided by the flight control system rather than by the airframe. This may allow the aircraft designer to increase combat agility, or decrease trim drag, for increased range and payload. This strategy requires concurrent design of the airframe and the flight control system, making trade-offs of performance and dynamics during the earliest stages of design. ACSYNT presently lacks means to implement flight control system designs but research is being done to add methods for predicting rotational degrees of freedom and control effector performance. A software module to compute and analyze the dynamics of the aircraft and to compute feedback gains and analyze closed loop dynamics is required. The data gained from these analyses can then be fed back to the aircraft design process so that the effects of the flight control system and the airframe on aircraft performance can be included as design metrics. This report presents results of a feasibility study and the initial design work to add an inner loop flight control system (ILFCS) design capability to the stability and control module in ACSYNT. The overall objective is to provide a capability for concurrent design of the aircraft and its flight control system, and enable concept designers to improve performance by exploiting the interrelationships between aircraft and flight control system design parameters.

  20. 8 CFR 204.3 - Orphan cases under section 101(b)(1)(F) of the Act (non-Convention cases).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... advanced processing application (or the advanced processing application concurrently with the orphan... home study preparer and/or fingerprint check. Advanced processing application means Form I-600A (Application for Advanced Processing of Orphan Petition) completed in accordance with the form's instructions...

  1. 8 CFR 204.3 - Orphan cases under section 101(b)(1)(F) of the Act (non-Convention cases).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... advanced processing application (or the advanced processing application concurrently with the orphan... home study preparer and/or fingerprint check. Advanced processing application means Form I-600A (Application for Advanced Processing of Orphan Petition) completed in accordance with the form's instructions...

  2. 8 CFR 204.3 - Orphan cases under section 101(b)(1)(F) of the Act (non-Convention cases).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... advanced processing application (or the advanced processing application concurrently with the orphan... home study preparer and/or fingerprint check. Advanced processing application means Form I-600A (Application for Advanced Processing of Orphan Petition) completed in accordance with the form's instructions...

  3. 8 CFR 204.3 - Orphan cases under section 101(b)(1)(F) of the Act (non-Convention cases).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... advanced processing application (or the advanced processing application concurrently with the orphan... home study preparer and/or fingerprint check. Advanced processing application means Form I-600A (Application for Advanced Processing of Orphan Petition) completed in accordance with the form's instructions...

  4. 8 CFR 204.3 - Orphan cases under section 101(b)(1)(F) of the Act (non-Convention cases).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... advanced processing application (or the advanced processing application concurrently with the orphan... home study preparer and/or fingerprint check. Advanced processing application means Form I-600A (Application for Advanced Processing of Orphan Petition) completed in accordance with the form's instructions...

  5. Expanded study of feasibility of measuring in-flight 747/JT9D loads, performance, clearance, and thermal data

    NASA Technical Reports Server (NTRS)

    Sallee, G. P.; Martin, R. L.

    1980-01-01

    The JT9D jet engine exhibits a TSFC loss of about 1 percent in the initial 50 flight cycles of a new engine. These early losses are caused by seal-wear induced opening of running clearances in the engine gas path. The causes of this seal wear have been identified as flight induced loads which deflect the engine cases and rotors, causing the rotating blades to rub against the seal surfaces, producing permanent clearance changes. The real level of flight loads encountered during airplane acceptance testing and revenue service and the engine's response in the dynamic flight environment were investigated. The feasibility of direct measurement of these flight loads and their effects by concurrent measurement of 747/JT9D propulsion system aerodynamic and inertia loads and the critical engine clearance and performance changes during 747 flight and ground operations was evaluated. A number of technical options were examined in relation to the total estimated program cost to facilitate selection of the most cost effective option. It is concluded that a flight test program meeting the overall objective of determining the levels of aerodynamic and inertia load levels to which the engine is exposed during the initial flight acceptance test and normal flight maneuvers is feasible and desirable. A specific recommended flight test program, based on the evaluation of cost effectiveness, is defined.

  6. Feasibility study for convertible engine torque converter

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The feasibility study has shown that a dump/fill type torque converter has excellent potential for the convertible fan/shaft engine. The torque converter space requirement permits internal housing within the normal flow path of a turbofan engine at acceptable engine weight. The unit permits operating the engine in the turboshaft mode by decoupling the fan. To convert to turbofan mode, the torque converter's overdrive capability brings the fan speed up to the power turbine speed to permit engagement of a mechanical lockup device when the shaft speeds are synchronized. The conversion to turbofan mode can be made in less than 10 seconds without a drop in power turbine speed. Total thrust delivered to the aircraft by the proprotor, fan, and engine during the transient can be controlled to prevent loss of airspeed or altitude. Heat rejection to the oil is low, and additional oil cooling capacity is not required. The turbofan engine aerodynamic design is basically uncompromised by convertibility and allows proper fan design for quiet and efficient cruise operation. Although the results of the feasibility study are exceedingly encouraging, it must be noted that they are based on extrapolation of limited existing data on torque converters. A component test program with three trial torque converter designs and concurrent computer modeling for fluid flow, stress, and dynamics, updated with test results from each unit, is recommended.

  7. Refinement and Further Validation of the Decisional Process Inventory.

    ERIC Educational Resources Information Center

    Hartung, Paul J.; Marco, Cynthia D.

    1998-01-01

    The Decisional Process Inventory is a Gestalt theory-based measure of career decision-making and level of career indecision. Results from a sample of 183 undergraduates supported its content, construct, and concurrent validity. (SK)

  8. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
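
    As a rough illustration of the scheduler/worker split described above (a minimal sketch; the module names, priorities, and file names are invented for the example, not taken from the SpeechBot system), a prioritized task queue feeding a pool of worker threads might look like:

        import queue
        import threading

        tasks = queue.PriorityQueue()  # lower number = higher priority

        def worker():
            """Pull (priority, id, fn, arg) tuples and run them."""
            while True:
                priority, task_id, fn, arg = tasks.get()
                try:
                    fn(arg)
                finally:
                    tasks.task_done()

        def transcode(item):  # placeholder application-specific module
            print(f"transcoding {item}")

        def index(item):      # placeholder indexing module
            print(f"indexing {item}")

        for _ in range(4):    # four worker "nodes"
            threading.Thread(target=worker, daemon=True).start()

        tasks.put((0, "t1", transcode, "feed-001.wav"))  # content processing first
        tasks.put((1, "i1", index, "feed-001.txt"))      # indexing at lower priority
        tasks.join()  # wait until all queued work is done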

  9. Strategies for the Successful Implementation of Viral Laboratory Automation

    PubMed Central

    Avivar, Cristóbal

    2012-01-01

    It has been estimated that more than 70% of all medical activity is directly related to information providing analytical data. Substantial technological advances have taken place recently, which have allowed a previously unimagined number of analytical samples to be processed while offering high quality results. Concurrently, yet more new diagnostic determinations have been introduced - all of which has led to a significant increase in the prescription of analytical parameters. This increased workload has placed great pressure on the laboratory with respect to health costs. The present manager of the Clinical Laboratory (CL) has had to examine cost control as well as rationing - meaning that the CL’s focus has not been strictly metrological, as if it were purely a system producing results, but instead has had to concentrate on its efficiency and efficacy. By applying re-engineering criteria, an emphasis has had to be placed on improved organisation and operating practice within the CL, focussing on the current criteria of the Integrated Management Areas where the technical and human resources are brought together. This re-engineering has been based on the concepts of consolidating and integrating the analytical platforms, while differentiating the production areas (CORE Laboratory) from the information areas. With these present concepts in mind, automation and virological treatment, along with serology in general, follow the same criteria as the rest of the operating methodology in the Clinical Laboratory. PMID:23248733

  10. Strategies for the successful implementation of viral laboratory automation.

    PubMed

    Avivar, Cristóbal

    2012-01-01

    It has been estimated that more than 70% of all medical activity is directly related to information providing analytical data. Substantial technological advances have taken place recently, which have allowed a previously unimagined number of analytical samples to be processed while offering high quality results. Concurrently, yet more new diagnostic determinations have been introduced - all of which has led to a significant increase in the prescription of analytical parameters. This increased workload has placed great pressure on the laboratory with respect to health costs. The present manager of the Clinical Laboratory (CL) has had to examine cost control as well as rationing - meaning that the CL's focus has not been strictly metrological, as if it were purely a system producing results, but instead has had to concentrate on its efficiency and efficacy. By applying re-engineering criteria, an emphasis has had to be placed on improved organisation and operating practice within the CL, focussing on the current criteria of the Integrated Management Areas where the technical and human resources are brought together. This re-engineering has been based on the concepts of consolidating and integrating the analytical platforms, while differentiating the production areas (CORE Laboratory) from the information areas. With these present concepts in mind, automation and virological treatment, along with serology in general, follow the same criteria as the rest of the operating methodology in the Clinical Laboratory.

  11. Collaborative Mission Design at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Gough, Kerry M.; Allen, B. Danette; Amundsen, Ruth M.

    2005-01-01

    NASA Langley Research Center (LaRC) has developed and tested two facilities dedicated to increasing efficiency in key mission design processes, including payload design, mission planning, and implementation plan development, among others. The Integrated Design Center (IDC) is a state-of-the-art concurrent design facility which allows scientists and spaceflight engineers to produce project designs and mission plans in a real-time collaborative environment, using industry-standard physics-based development tools and the latest communication technology. The Mission Simulation Lab (MiSL), a virtual reality (VR) facility focused on payload and project design, permits engineers to quickly translate their design and modeling output into enhanced three-dimensional models and then examine them in a realistic full-scale virtual environment. The authors were responsible for envisioning both facilities and turning those visions into fully operational mission design resources at LaRC with multiple advanced capabilities and applications. In addition, the authors have created a synergistic interface between these two facilities. This combined functionality is the Interactive Design and Simulation Center (IDSC), a meta-facility which offers project teams a powerful array of highly advanced tools, permitting them to rapidly produce project designs while maintaining the integrity of the input from every discipline expert on the project. The concept-to-flight mission support provided by IDSC has shown improved inter- and intra-team communication and a reduction in the resources required for proposal development, requirements definition, and design effort.

  12. Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications

    DTIC Science & Technology

    1992-09-01

    Indexed excerpt: ...STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a... contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down... Fault Tree Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment, utilizing a pictorial approach...

  13. The Implementation of a Multi-Backend Database System (MDBS). Part I. Software Engineering Strategies and Efforts Towards a Prototype MDBS.

    DTIC Science & Technology

    1983-06-01

    Indexed excerpt: ...for DEC PDP-11 systems. MAINSAIL was developed and is marketed with a set of integrated tools for program development. The syntax of the language is... stack, and to test for stack-full and stack-empty conditions. This technique is useful in enforcing data integrity and in controlling concurrent... and market MAINSAIL. The language is distinguished by its portability. The same compiler and runtime system, both written in MAINSAIL, are the basis...

  14. SEI Report on Graduate Software Engineering Education for 1991

    DTIC Science & Technology

    1991-04-01

    Indexed excerpt (bibliography fragments): ...12, 12 (Dec. 1979), 85-94. Andrews83: Andrews, Gregory R. and Schneider, Fred B. "Concepts and Notations for Concurrent Programming." ACM Computing... Barringer87: Barringer, H. "Up and Down the Temporal Way." Computer J. 30, 2 (Apr. 1987), 134-148. Bjørner78: The Vienna Development Method: The Meta-Language... Lecture Notes in Computer Science. Bruns86: Bruns, Glenn R. Technology Assessment: PAISLEY. Tech. Rep. MCC TR STP-296-86, MCC, Austin, Texas, Sept...

  15. The development of internet based ship design support system for small and medium sized shipyards

    NASA Astrophysics Data System (ADS)

    Shin, Sung-Chul; Lee, Soon-Sup; Kang, Dong-Hoon; Lee, Kyung-Ho

    2012-03-01

    In this paper, a prototype ship basic planning system is implemented for small and medium sized shipyards based on internet technology and the concurrent engineering concept. The system is designed from user requirements; accordingly, a standardized development environment and tools are selected. These tools are used in the system development to define and evaluate core application technologies. The system will contribute to increasing the competitiveness of small and medium sized shipyards in the 21st-century industrial environment.

  16. Planetary exploration with nanosatellites: a space campus for future technology development

    NASA Astrophysics Data System (ADS)

    Drossart, P.; Mosser, B.; Segret, B.

    2017-09-01

    Planetary exploration is on the eve of a revolution through nanosatellites accompanying larger missions or freely cruising in the solar system, providing a man-made cosmic web for in situ or remote-sensing exploration of the Solar System. A first step is to build a dedicated place for nanosatellite development. In this context, the CCERES PSL space campus provides an environment for nanosatellite testing and integration, a concurrent engineering facility for project analysis, and a science environment dedicated to this task.

  17. Using reinforcement learning to examine dynamic attention allocation during reading.

    PubMed

    Liu, Yanping; Reichle, Erik D; Gao, Ding-Guo

    2013-01-01

    A fundamental question in reading research concerns whether attention is allocated strictly serially, supporting lexical processing of one word at a time, or in parallel, supporting concurrent lexical processing of two or more words (Reichle, Liversedge, Pollatsek, & Rayner, 2009). The origins of this debate are reviewed. We then report three simulations to address this question using artificial reading agents (Liu & Reichle, 2010; Reichle & Laurent, 2006) that learn to dynamically allocate attention to 1-4 words to "read" as efficiently as possible. These simulation results indicate that the agents strongly preferred serial word processing, although they occasionally attended to more than one word concurrently. The reason for this preference is discussed, along with implications for the debate about how humans allocate attention during reading. Copyright © 2013 Cognitive Science Society, Inc.

  18. Concurrent removal of elemental mercury and SO2 from flue gas using a thiol-impregnated CaCO3-based adsorbent: a full factorial design study.

    PubMed

    Balasundaram, Karthik; Sharma, Mukesh

    2018-06-01

    Mercury (Hg) emitted from coal-based thermal power plants (CTPPs) can accumulate and bio-magnify in the food chain, thereby posing a risk to humans and wildlife. The central idea of this study was to develop an adsorbent which can concurrently remove elemental mercury (Hg⁰) and SO₂ emitted from CTPPs in a single unit operation. Specifically, a composite adsorbent of CaCO₃ impregnated with 2-mercaptobenzimidazole (2-MBI) (referred to as modified calcium carbonate (MCC)) was developed. While 2-MBI, having a sulfur functional group, could selectively adsorb Hg⁰, CaCO₃ could remove SO₂. Performance of the adsorbent was evaluated in terms of (i) removal (%) of Hg⁰ and SO₂, (ii) adsorption mechanism, (iii) adsorption kinetics, and (iv) leaching potential of mercury from the spent adsorbent. The adsorption studies were performed using a 2² full factorial design of experiments with 15 ppbV of Hg⁰ and 600 ppmV of SO₂. Two factors, (i) reaction temperature (80 and 120 °C; the temperature range in flue gas) and (ii) mass of 2-MBI (10 and 15 wt%), were investigated for the removal of Hg⁰ and SO₂ (as %). The maximum Hg⁰ and SO₂ removal was 86 and 93%, respectively. The results of XPS characterization showed that chemisorption is the predominant mechanism of Hg⁰ and SO₂ adsorption on MCC. The Hg⁰ adsorption on MCC followed the Elovich kinetic model, which is also indicative of chemisorption on a heterogeneous surface. The toxicity characteristic leaching procedure (TCLP) and synthetic precipitation leaching procedure (SPLP) levels of mercury leached from the spent adsorbent were within the acceptable limits defined in these tests. The engineering significance of this study is that the 2-MBI-modified CaCO₃-based adsorbent has potential for concurrent removal of Hg⁰ and SO₂ in a single unit operation. With only minor process modifications, the newly developed adsorbent can replace CaCO₃ in the flue-gas desulfurization (FGD) system.
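
    For readers unfamiliar with the 2² full factorial design used above, the sketch below shows how main effects are estimated from the four factor-level combinations. The removal values are hypothetical placeholders, not the study's measurements:

      # Illustrative 2x2 (i.e., 2^2) full factorial analysis. The removal
      # values below are hypothetical placeholders, not the study's data.
      runs = {
          # (temperature level, 2-MBI level) coded as -1 (low) / +1 (high)
          (-1, -1): 70.0,  # 80 C, 10 wt%
          (+1, -1): 78.0,  # 120 C, 10 wt%
          (-1, +1): 80.0,  # 80 C, 15 wt%
          (+1, +1): 86.0,  # 120 C, 15 wt%
      }

      def main_effect(runs, factor):
          """Mean response at the high level minus mean at the low level."""
          high = [y for lv, y in runs.items() if lv[factor] == +1]
          low = [y for lv, y in runs.items() if lv[factor] == -1]
          return sum(high) / len(high) - sum(low) / len(low)

      print("temperature main effect:", main_effect(runs, 0))    # 7.0
      print("2-MBI loading main effect:", main_effect(runs, 1))  # 9.0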

  19. Local concurrent error detection and correction in data structures using virtual backpointers

    NASA Technical Reports Server (NTRS)

    Li, C. C.; Chen, P. P.; Fuchs, W. K.

    1987-01-01

    A new technique, based on virtual backpointers, for local concurrent error detection and correction in linked data structures is presented. Two new data structures, the Virtual Double Linked List and the B-tree with Virtual Backpointers, are described. For these structures, double errors can be detected in O(1) time and errors detected during forward moves can be corrected in O(1) time. The application of a concurrent auditor process to data structure error detection and correction is analyzed, and an implementation is described, to determine the effect on mean time to failure of a multi-user shared database system. The implementation utilizes a Sequent shared memory multiprocessor system operating on a shared database of Virtual Double Linked Lists.
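
    As a rough illustration of how an implicit (virtual) backpointer can expose pointer corruption during a forward traversal in O(1) time per step, the sketch below stores each node's virtual backpointer as the XOR of its neighbours' indices. This encoding is an assumption made for illustration and is not necessarily the exact scheme of Li, Chen, and Fuchs:

      # Detecting pointer corruption during a forward traversal with
      # implicit backpointers. Each node stores prev XOR next, so a stored
      # forward pointer can be cross-checked in O(1) per step. This is a
      # stand-in encoding, not the paper's exact Virtual Double Linked List.

      class Node:
          def __init__(self, index):
              self.index = index      # stands in for a memory address
              self.next = -1          # forward pointer (index, -1 = null)
              self.virtual = -1       # prev XOR next

      def build(n):
          nodes = [Node(i) for i in range(n)]
          for i, node in enumerate(nodes):
              prev_i = i - 1 if i > 0 else -1
              next_i = i + 1 if i < n - 1 else -1
              node.next = next_i
              node.virtual = prev_i ^ next_i
          return nodes

      def traverse_checked(nodes):
          """Walk forward; at each step verify the node's virtual
          backpointer against the node actually visited before it."""
          prev_i, cur_i = -1, 0
          while cur_i != -1:
              node = nodes[cur_i]
              if (node.virtual ^ node.next) != prev_i:  # O(1) check
                  raise RuntimeError(f"pointer error detected at node {cur_i}")
              prev_i, cur_i = cur_i, node.next

      nodes = build(5)
      traverse_checked(nodes)   # clean list: no error
      nodes[2].next = 4         # corrupt a forward pointer
      try:
          traverse_checked(nodes)
      except RuntimeError as e:
          print(e)              # error is caught during the forward move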

  20. Local concurrent error detection and correction in data structures using virtual backpointers

    NASA Technical Reports Server (NTRS)

    Li, Chung-Chi Jim; Chen, Paul Peichuan; Fuchs, W. Kent

    1989-01-01

    A new technique, based on virtual backpointers, for local concurrent error detection and correction in linked data structures is presented. Two new data structures, the Virtual Double Linked List and the B-tree with Virtual Backpointers, are described. For these structures, double errors can be detected in O(1) time and errors detected during forward moves can be corrected in O(1) time. The application of a concurrent auditor process to data structure error detection and correction is analyzed, and an implementation is described, to determine the effect on mean time to failure of a multi-user shared database system. The implementation utilizes a Sequent shared memory multiprocessor system operating on a shared database of Virtual Double Linked Lists.

  1. Case of concurrent Riedel's thyroiditis, acute suppurative thyroiditis, and micropapillary carcinoma.

    PubMed

    Hong, Ji Taek; Lee, Jung Hwan; Kim, So Hun; Hong, Seong Bin; Nam, Moonsuk; Kim, Yong Seong; Chu, Young Chae

    2013-03-01

    Riedel's thyroiditis (RT) is a rare chronic inflammatory disease of the thyroid gland. It is characterized by a fibroinflammatory process that partially destroys the gland and extends into adjacent neck structures. Its clinical manifestation can mask an accompanying thyroid neoplasm and can mimic invasive thyroid carcinoma. Therefore, diagnosis can be difficult prior to surgical removal of the thyroid, and histopathologic examination of the thyroid is necessary for a definite diagnosis. The concurrent presence of RT and other thyroid diseases has been reported. However, to our knowledge, the association of RT with acute suppurative thyroiditis and micropapillary carcinoma has not been reported. We report a rare case of concurrent RT, acute suppurative thyroiditis, and micropapillary carcinoma in a 48-year-old patient.

  2. Temporal texture of associative encoding modulates recall processes.

    PubMed

    Tibon, Roni; Levy, Daniel A

    2014-02-01

    Binding aspects of an experience that are distributed over time is an important element of episodic memory. In the current study, we examined how the temporal complexity of an experience may govern the processes required for its retrieval. We recorded event-related potentials during episodic cued recall following pair associate learning of concurrently and sequentially presented object-picture pairs. Cued recall success effects over anterior and posterior areas were apparent in several time windows. In anterior locations, these recall success effects were similar for concurrently and sequentially encoded pairs. However, in posterior sites clustered over parietal scalp the effect was larger for the retrieval of sequentially encoded pairs. We suggest that anterior aspects of the mid-latency recall success effects may reflect working-with-memory operations or direct access recall processes, while more posterior aspects reflect recollective processes which are required for retrieval of episodes of greater temporal complexity. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Concurrent simulation of a parallel jaw end effector

    NASA Technical Reports Server (NTRS)

    Bynum, Bill

    1985-01-01

    A system of programs developed to aid in the design and development of the command/response protocol between a parallel jaw end effector and the strategic planner program controlling it is presented. The system executes concurrently with the LISP controlling program to generate a graphical image of the end effector that moves in approximately real time in response to commands sent from the controlling program. Concurrent execution of the simulation program is useful for revealing flaws in the communication command structure arising from the asynchronous nature of the message traffic between the end effector and the strategic planner. Software simulation helps to minimize the number of hardware changes necessary to the microprocessor driving the end effector because of changes in the communication protocol. The simulation of other actuator devices can be easily incorporated into the system of programs by using the underlying support that was developed for the concurrent execution of the simulation process and the communication between it and the controlling program.

  4. Giant energy density and high efficiency achieved in bismuth ferrite-based film capacitors via domain engineering.

    PubMed

    Pan, Hao; Ma, Jing; Ma, Ji; Zhang, Qinghua; Liu, Xiaozhi; Guan, Bo; Gu, Lin; Zhang, Xin; Zhang, Yu-Jun; Li, Liangliang; Shen, Yang; Lin, Yuan-Hua; Nan, Ce-Wen

    2018-05-08

    Developing high-performance film dielectrics for capacitive energy storage has been a great challenge for modern electrical devices. Despite good results obtained in lead titanate-based dielectrics, lead-free alternatives are strongly desirable due to environmental concerns. Here we demonstrate that giant energy densities of ~70 J cm⁻³, together with high efficiency as well as excellent cycling and thermal stability, can be achieved in lead-free bismuth ferrite-strontium titanate solid-solution films through domain engineering. It is revealed that the incorporation of strontium titanate transforms the ferroelectric micro-domains of bismuth ferrite into highly-dynamic polar nano-regions, resulting in a ferroelectric to relaxor-ferroelectric transition with concurrently improved energy density and efficiency. Additionally, the introduction of strontium titanate greatly improves the electrical insulation and breakdown strength of the films by suppressing the formation of oxygen vacancies. This work opens up a feasible and propagable route, i.e., domain engineering, to systematically develop new lead-free dielectrics for energy storage.

  5. Introducing concurrency in the Gaudi data processing framework

    NASA Astrophysics Data System (ADS)

    Clemencic, Marco; Hegner, Benedikt; Mato, Pere; Piparo, Danilo

    2014-06-01

    In the past, the increasing demands for HEP processing resources could be fulfilled by ever-increasing clock frequencies and by distributing the work to more and more physical machines. Limitations in the power consumption of both CPUs and entire data centres are bringing an end to this era of easy scalability. To get the most CPU performance per watt, future hardware will be characterised by less and less memory per processor, as well as thinner, more specialized and more numerous cores per die, and rather heterogeneous resources. To fully exploit the potential of the many cores, HEP data processing frameworks need to allow for parallel execution of reconstruction or simulation algorithms on several events simultaneously. We describe our experience in introducing concurrency-related capabilities into Gaudi, a generic data processing software framework which is currently used by several HEP experiments, including the ATLAS and LHCb experiments at the LHC. After a description of the concurrent framework and the most relevant design choices driving its development, we describe the behaviour of the framework in a more realistic environment, using a subset of the real LHCb reconstruction workflow, and present our strategy and the tools used to validate the physics outcome of the parallel framework against the results of the present, purely sequential LHCb software. We then summarize the measurement of the code performance of the multithreaded application in terms of memory and CPU usage.
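
    A minimal sketch of the inter-event parallelism described above, processing several events simultaneously with a worker pool; it illustrates the scheduling idea only and does not use Gaudi's actual interfaces:

      # Several events are "reconstructed" concurrently by a pool of
      # workers. The event payload and algorithm are made-up stand-ins.
      from concurrent.futures import ThreadPoolExecutor

      def reconstruct(event):
          """Stand-in for a chain of reconstruction algorithms."""
          return {"event_id": event["event_id"], "tracks": len(event["hits"])}

      events = [{"event_id": i, "hits": list(range(i % 7))} for i in range(100)]

      with ThreadPoolExecutor(max_workers=4) as pool:
          results = list(pool.map(reconstruct, events))

      print(results[:3])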

  6. Cognitive processing in the aftermath of relationship dissolution: Associations with concurrent and prospective distress and posttraumatic growth.

    PubMed

    Del Palacio-González, Adriana; Clark, David A; O'Sullivan, Lucia F

    2017-12-01

    Non-marital romantic relationship dissolution is amongst the most stressful life events experienced by young adults. Yet, some individuals experience posttraumatic growth following relationship dissolution. Little is known about the specific and differential contribution of trait-like and event-specific cognitive processing styles to each of these outcomes. A longitudinal design was employed in which trait-like (brooding and reflection) and dissolution-specific (intrusive and deliberate) cognitive processing was examined as predictors of growth (Posttraumatic Growth Inventory) and distress (Breakup Distress Scale) following a recent relationship dissolution. Initially, 148 participants completed measures of trait-like and dissolution-specific cognitive processing, growth, and distress (T1). A subsample completed a seven-month follow-up (T2). Higher frequency of relationship-dissolution intrusive thoughts predicted concurrent distress after accounting for brooding and relationship characteristics. Further, higher brooding and lower reflection predicted higher distress prospectively. Concurrent growth was predicted by both higher brooding and more deliberate relationship-dissolution thoughts. Prospectively, T1 dissolution intrusive thoughts predicted higher T2 deliberate thoughts, and the interaction between these two constructs predicted higher T2 growth. Therefore, deliberately thinking of the dissolution was related to positive psychological outcomes. In contrast, intrusive dissolution cognitions and a tendency for brooding had a mixed (paradoxical) association with psychological adjustment. Copyright © 2016 John Wiley & Sons, Ltd.

  7. The Television Generation, Television Literacy, and Television Trends.

    ERIC Educational Resources Information Center

    Cohen, Jodi R.

    Unlike the linear, serial process of reading books, learning to "read" television is a parallel process in which multiple pieces of information are simultaneously received. Perceiving images, only one aspect of understanding television, requires the concurrent processing of information that is compounded within a symbol system. The…

  8. Information Processing in Memory Tasks.

    ERIC Educational Resources Information Center

    Johnston, William A.

    The intensity of information processing engendered in different phases of standard memory tasks was examined in six experiments. Processing intensity was conceptualized as system capacity consumed, and was measured via a divided-attention procedure in which subjects performed a memory task and a simple reaction-time (RT) task concurrently. The…

  9. Extensible packet processing architecture

    DOEpatents

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.

    2013-08-20

    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
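
    A rough sketch of the flow-claiming idea: engines are chained in series, and the first engine with spare capacity marks a packet's flow as claimed so that downstream engines pass that flow through. The capacity rule and the names here are illustrative assumptions, not the patented design:

      # Engines in series; a flow is processed only by the engine that
      # claimed it, as signalled by a mark carried on the packet.

      class Engine:
          def __init__(self, name, capacity):
              self.name = name
              self.capacity = capacity   # max flows this engine will take
              self.claimed = set()

          def process(self, packet):
              flow = packet["flow_id"]
              if flow in self.claimed:
                  packet["claimed_by"] = self.name
              elif packet.get("claimed_by") is None and \
                      len(self.claimed) < self.capacity:
                  self.claimed.add(flow)
                  packet["claimed_by"] = self.name   # mark: flow claimed
              if packet.get("claimed_by") == self.name:
                  packet["payload"] = packet["payload"].upper()  # stand-in work
              return packet

      engines = [Engine("e0", 1), Engine("e1", 2)]
      packets = [{"flow_id": f, "payload": "data"} for f in (0, 1, 0, 2)]

      for pkt in packets:               # flow-through bus: engines in series
          for eng in engines:
              pkt = eng.process(pkt)
          print(pkt["flow_id"], "claimed by", pkt["claimed_by"])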

  10. Concurrent Overexpression of Arabidopsis thaliana Cystathionine γ-Synthase and Silencing of Endogenous Methionine γ-Lyase Enhance Tuber Methionine Content in Solanum tuberosum.

    PubMed

    Kumar, Pavan; Jander, Georg

    2017-04-05

    Potatoes (Solanum tuberosum) are deficient in methionine, an essential amino acid in human and animal diets. Higher methionine levels increase the nutritional quality and promote the typically pleasant aroma associated with baked and fried potatoes. Several attempts have been made to elevate tuber methionine levels by genetic engineering of methionine biosynthesis and catabolism. Overexpressing Arabidopsis thaliana cystathionine γ-synthase (AtCGS) in S. tuberosum up-regulates a rate-limiting step of methionine biosynthesis and increases tuber methionine levels. Alternatively, silencing S. tuberosum methionine γ-lyase (StMGL), which causes decreased degradation of methionine into 2-ketobutyrate, also increases methionine levels. Concurrently enhancing biosynthesis and reducing degradation were predicted to provide further increases in tuber methionine content. Here we report that S. tuberosum cv. Désirée plants with AtCGS overexpression and StMGL silenced by RNA interference are morphologically normal and accumulate higher free methionine levels than either single-transgenic line.

  11. Toward ubiquitous healthcare services with a novel efficient cloud platform.

    PubMed

    He, Chenguang; Fan, Xiaomao; Li, Ye

    2013-01-01

    Ubiquitous healthcare services are becoming more and more popular, especially under the urgent demand of the global aging issue. Cloud computing has a pervasive, on-demand, service-oriented nature that fits the characteristics of healthcare services very well. However, the ability to deal with multimodal, heterogeneous, and nonstationary physiological signals to provide persistent personalized services, while sustaining highly concurrent online analysis for the public, is a challenge for the general cloud. In this paper, we propose a private cloud platform architecture which includes six layers according to the specific requirements. This platform utilizes a message queue as the cloud engine, and each layer thereby achieves relative independence through this loosely coupled means of communication with a publish/subscribe mechanism. Furthermore, a plug-in algorithm framework is also presented, and massive semistructured or unstructured medical data are accessed adaptively by this cloud architecture. As the test results show, this proposed cloud platform, with robust, stable, and efficient features, can satisfy high concurrent requests from ubiquitous healthcare services.
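
    A minimal sketch of the loosely coupled publish/subscribe layering described above: layers never call each other directly but exchange messages through a broker. The topic names and handlers are illustrative, not the platform's actual interfaces:

      # Layers communicate only through topic-based publish/subscribe,
      # so each layer stays relatively independent of the others.
      from collections import defaultdict

      class Broker:
          def __init__(self):
              self.subscribers = defaultdict(list)

          def subscribe(self, topic, handler):
              self.subscribers[topic].append(handler)

          def publish(self, topic, message):
              for handler in self.subscribers[topic]:
                  handler(message)

      broker = Broker()
      # "Analysis layer" reacts to raw signals without knowing the producer.
      broker.subscribe("ecg.raw", lambda m: broker.publish(
          "ecg.analyzed", {"patient": m["patient"], "hr": len(m["samples"])}))
      # "Presentation layer" reacts to analysis results.
      broker.subscribe("ecg.analyzed", lambda m: print("alert check:", m))

      broker.publish("ecg.raw", {"patient": "p01", "samples": [0.1] * 72})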

  12. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, John M.; Coffin, Peter; Robbins, Brian A.

    Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.
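
    A hedged sketch of the two-stage idea described above: a cheap engineering-scale model screens Monte Carlo samples, and only samples near the failure threshold are refined with an expensive higher-fidelity model. Both models and the threshold are made-up stand-ins, not the report's actual models:

      # Two-stage Monte Carlo: cheap screening, selective refinement.
      import random

      def low_fidelity(load):            # fast engineering-scale estimate
          return 1.5 * load

      def high_fidelity(load):           # stand-in for a multiscale run
          return 1.5 * load + 0.2 * random.gauss(0.0, 1.0)

      random.seed(0)
      CAPACITY, BAND = 10.0, 1.0
      failures = trials = 0
      for _ in range(10000):
          load = random.uniform(4.0, 8.0)
          est = low_fidelity(load)
          # refine only near-threshold samples with the expensive model
          stress = high_fidelity(load) if abs(est - CAPACITY) < BAND else est
          failures += stress > CAPACITY
          trials += 1
      print("estimated failure probability:", failures / trials)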

  13. A Survey of Applications and Research in Integrated Design Systems Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The initial part of the study was begun with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of related topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle. An important aspect of integrated design systems is the extent to which they integrate existing "islands of automation" into a comprehensive design and product realization environment. As the study progressed, we relied increasingly upon a networking approach to lead us to new information. The departure point for such searches often was a government-sponsored project or a company initiative. The advantage of this approach was that short conversations with knowledgeable persons would usually cut through confusion over differences of terminology, thereby somewhat reducing the search space of the study. Even so, it was not until late in our eight-month inquiry that we began to see signs of convergence of the search, in the sense that a number of the latest inquiries began to turn up references to earlier contacts. As suggested above, this convergence often occurred with respect to particular government or company projects.

  14. Acquisition of Programming Skills

    DTIC Science & Technology

    1990-04-01

    skills (e.g., arithmetic reasoning, word knowledge, information processing speed); and c) passive versus active learning style. Ability measures...concurrent storage and processing an individual was capable of doing), and an active learning style. Implications of the findings for the development of

  15. Multidisciplinary Design Technology Development: A Comparative Investigation of Integrated Aerospace Vehicle Design Tools

    NASA Technical Reports Server (NTRS)

    Renaud, John E.; Batill, Stephen M.; Brockman, Jay B.

    1999-01-01

    This research effort is a joint program between the Departments of Aerospace and Mechanical Engineering and the Computer Science and Engineering Department at the University of Notre Dame. The purpose of the project was to develop a framework and systematic methodology to facilitate the application of Multidisciplinary Design Optimization (MDO) to a diverse class of system design problems. For all practical aerospace systems, the design of a system is a complex sequence of events which integrates the activities of a variety of discipline "experts" and their associated "tools". The development, archiving and exchange of information between these individual experts is central to the design task, and it is this information which provides the basis for these experts to make coordinated design decisions (i.e., compromises and trade-offs), resulting in the final product design. Grant efforts focused on developing and evaluating frameworks for effective design coordination within an MDO environment. Central to these research efforts was the concept that the individual discipline "expert", using the most appropriate "tools" available and the most complete description of the system, should be empowered to have the greatest impact on the design decisions and final design. This means that the overall process must be highly interactive and efficiently conducted if the resulting design is to be developed in a manner consistent with cost and time requirements. The methods developed as part of this research effort include: extensions to a sensitivity-based Concurrent Subspace Optimization (CSSO) MDO algorithm; the development of a neural network response surface based CSSO-MDO algorithm; and the integration of distributed computing and process scheduling into the MDO environment. This report overviews research efforts in each of these focus areas. A complete bibliography of research produced with support of this grant is attached.

  16. Engineering low phorbol ester Jatropha curcas seed by intercepting casbene biosynthesis.

    PubMed

    Li, Chunhong; Ng, Ailing; Xie, Lifen; Mao, Huizhu; Qiu, Chengxiang; Srinivasan, Ramachandran; Yin, Zhongchao; Hong, Yan

    2016-01-01

    Casbene is a precursor to phorbol esters, and down-regulating casbene synthase effectively reduces phorbol ester biosynthesis. Seed-specific reduction of phorbol ester (PE) helps develop Jatropha seed cake for animal nutrition. Phorbol esters (PEs) are diterpenoids present in some Euphorbiaceae family members like Jatropha curcas L. (Jatropha), a tropical shrub yielding high-quality oil suitable as feedstock for biodiesel and bio-jet fuel. Jatropha seed contains up to 40% oil and can produce oil together with cake containing high-quality proteins. However, skin-irritating and cancer-promoting PEs make Jatropha cake meal unsuitable for animal nutrition and also raise safety and environmental concerns about its planting and processing. Two casbene synthase gene homologues (JcCASA163 and JcCASD168) were cloned from the Jatropha genome, and both genes were highly expressed during seed development. In vitro functional analysis proved the casbene synthase activity of JcCASA163 in converting geranylgeranyl diphosphate into casbene, which has been speculated to be the precursor to PEs. A seed-specific promoter driving inverted repeats for RNA interference (RNAi) targeting either JcCASA163 or both genes could effectively down-regulate casbene synthase gene expression, with a concurrent marked reduction of PE levels (by as much as 85%) in seeds and no pleiotropic effects observed. This engineered reduction of seed PE was heritable and co-segregated with the transgene. Our work implicated casbene synthase in Jatropha PE biosynthesis and provided evidence for casbene being the precursor for PEs. The success in reducing seed PE content through down-regulation of casbene synthase demonstrates the feasibility of intercepting PE biosynthesis in Jatropha seed to help address safety concerns about Jatropha plantation and seed processing and to facilitate use of its seed protein for animal nutrition.

  17. Work Coordination Engine

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    The Work Coordination Engine (WCE) is a Java application integrated into the Service Management Database (SMDB), which coordinates the dispatching and monitoring of a work order system. WCE de-queues work orders from SMDB and orchestrates the dispatching of work to a registered set of software worker applications distributed over a set of local, or remote, heterogeneous computing systems. WCE monitors the execution of work orders once dispatched, and accepts the results of the work order by storing them in the SMDB persistent store. The software leverages a relational database, the Java Message Service (JMS), and Web Services using Simple Object Access Protocol (SOAP) technologies to implement an efficient work-order dispatching mechanism capable of coordinating the work of multiple computer servers on various platforms working concurrently on different, or similar, types of data or algorithmic processing. Existing (legacy) applications can be wrapped with a proxy object so that no changes to the application are needed to make them available for integration into the work order system as "workers." WCE automatically reschedules work orders that fail to be executed by one server to a different server if available. From initiation to completion, the system manages the execution state of work orders and workers via a well-defined set of events, states, and actions. It allows for configurable work-order execution timeouts by work-order type. This innovation eliminates a current processing bottleneck by providing a highly scalable, distributed work-order system used to quickly generate products needed by the Deep Space Network (DSN) to support space flight operations. WCE is driven by asynchronous messages delivered via JMS indicating the availability of new work or workers. It runs completely unattended in support of the lights-out operations concept in the DSN.
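
    A minimal sketch of a work-order lifecycle with dispatch, a configurable timeout, and automatic rescheduling, in the spirit of the description above; the state names and timeout policy are assumptions, not WCE's actual definitions:

      # Work-order lifecycle: QUEUED -> DISPATCHED -> COMPLETED/FAILED,
      # with a per-type timeout that returns stalled orders to the queue.
      import time

      class WorkOrder:
          def __init__(self, order_id, timeout_s):
              self.order_id = order_id
              self.timeout_s = timeout_s  # configurable per work-order type
              self.state = "QUEUED"
              self.dispatched_at = None

      def dispatch(order, workers):
          worker = next(iter(workers), None)
          if worker is not None:
              order.state, order.dispatched_at = "DISPATCHED", time.monotonic()
          return worker

      def monitor(order):
          """Requeue an order whose worker failed to report in time."""
          if order.state == "DISPATCHED" and \
                  time.monotonic() - order.dispatched_at > order.timeout_s:
              order.state = "QUEUED"      # another worker may pick it up

      order = WorkOrder("wo-1", timeout_s=0.01)
      dispatch(order, workers=["worker-a"])
      time.sleep(0.02)
      monitor(order)
      print(order.state)                  # QUEUED again: will be re-dispatched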

  18. Martin Marietta, Y-12 Plant Laboratory Partnership Program Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koger, J.

    1995-02-10

    The Y-12 Plant currently embraces three mission areas: stockpile surveillance, maintaining production capability, and storage of special nuclear materials. The Y-12 Plant also contributes to the nation's economic strength by partnering with industry in deploying technology. This partnering has been supported to a great extent through the Technology Transfer Initiative (TTI) directed by DOE/Defense Programs (DP-14). The Oak Ridge Centers for Manufacturing Technology (ORCMT) was established to draw upon the manufacturing and fabrication capabilities at the Y-12 Plant to coordinate and support collaborative efforts, between DP and the domestic industrial sector, toward the development of technologies which offer mutual benefit to both DOE/DP programs and the private sector. Most of the needed technologies for the "Factory of the Future" (FOF) are being pursued as core areas at the Y-12 Plant. As a result, 85% of DP-14 projects already support the FOF. The unique capabilities of ORCMT can be applied to a wide range of manufacturing problems to enhance the capabilities of the US industrial base and its economic outcome. The ORCMT has an important role to play in DOE's Technology Transfer Initiative because its capabilities are focused on applied manufacturing and technology deployment, which has a more near-term impact on private sector competitiveness. The Y-12 Plant uses the ORCMT to help maintain its own core competencies for the FOF by challenging its engineers and capabilities with technical problems from industry. Areas of strength at the Y-12 Plant that could impact the FOF include modeling of processes and advanced materials; intelligent inspection systems with standardized operator interfaces, analysis software, and part programming language; electronic transfer of designs and features; existing computer-based concurrent engineering; and knowledge-based forming process.

  19. Standing at the crossroads: Identity and recognition of the Applied Science Technologist in British Columbia

    NASA Astrophysics Data System (ADS)

    Roemer, Thomas

    Modern technical education in British Columbia has been affected by two societal trends: in industry, engineering technology evolved as a discipline to bridge the increasing chasm between the process-oriented skill sets of tradespersons/technicians, and the declarative knowledge focus of engineering; in education, the provincial college and institute system was created to address the need for a new post-secondary credential situated between trades certificates and university degrees. The Applied Science Technologist arguably forms the intersection of these two concepts. Almost forty years after its inception, it is timely to ask if the original model has matured into a distinct occupational category in industry, education, and in the public mind. The thesis proposes three environments, the Formative, Market and Public Domain, respectively. Interviews, surveys and personal experience afforded insights into the dynamics of these domains with respect to a fledgling occupational category, while the socio-philosophical concepts of culture, habitus and social imaginary provide the tools to interpret the findings. The thesis postulates that an emerging occupational category will not only challenge existing cultures and habitus, but that over time it will influence the imaginaries of each domain and society as a whole. Ultimately, the occupational category will be truly successful only when the general public is able to distinguish it from related disciplines. Charles Taylor's writings on multiculturalism are used to discuss identity and recognition of the Applied Science Technologist in each domain while Pierre Bourdieu's perspectives on the existence of habitus and self-proliferating elites form the framework to examine the relationships between technologists and engineers. Taylor's theory of multiple concurrent social imaginaries guides the comparison of divergent expectations among academic, career and vocational instructors at British Columbia's colleges. The thesis concludes with recommendations for the sustainability of the Applied Science Technologist as a distinct occupational category. Keywords: engineering technology; community college; diploma; recognition; identity; social imaginaries

  20. A Centaur Reconnaissance Mission: a NASA JPL Planetary Science Summer Seminar mission design experience

    NASA Astrophysics Data System (ADS)

    Chou, L.; Howell, S. M.; Bhattaru, S.; Blalock, J. J.; Bouchard, M.; Brueshaber, S.; Cusson, S.; Eggl, S.; Jawin, E.; Marcus, M.; Miller, K.; Rizzo, M.; Smith, H. B.; Steakley, K.; Thomas, N. H.; Thompson, M.; Trent, K.; Ugelow, M.; Budney, C. J.; Mitchell, K. L.

    2017-12-01

    The NASA Planetary Science Summer Seminar (PSSS), sponsored by the Jet Propulsion Laboratory (JPL), offers advanced graduate students and recent doctoral graduates the unique opportunity to develop a robotic planetary exploration mission that answers NASA's Science Mission Directorate's Announcement of Opportunity for the New Frontiers Program. Preceded by a series of 10 weekly webinars, the seminar is an intensive one-week exercise at JPL, where students work directly with JPL's project design team "TeamX" on the process behind developing mission concepts through concurrent engineering, project design sessions, instrument selection, science traceability matrix development, and risk and cost management. The 2017 NASA PSSS team included 18 participants from various U.S. institutions with diverse backgrounds in science and engineering. We proposed a Centaur Reconnaissance Mission, named CAMILLA, designed to investigate the geologic state, surface evolution, composition, and ring systems through a flyby and impact of Chariklo. Centaurs are defined as minor planets with semi-major axes lying between the orbits of Jupiter and Neptune. Chariklo is both the largest Centaur and the only known minor planet with rings. CAMILLA was designed to address high-priority cross-cutting themes defined in the National Research Council's Vision and Voyages for Planetary Science in the Decade 2013-2022. At the end of the seminar, a final presentation was given by the participants to a review board of JPL scientists and engineers as well as NASA headquarters executives. The feedback received on the strengths and weaknesses of our proposal provided a rich and valuable learning experience in how to design a successful NASA planetary exploration mission and generate a successful New Frontiers proposal. The NASA PSSS is an educational experience that trains the next generation of NASA's planetary explorers by bridging the gap between scientists and engineers, allowing participants to learn how to design a mission and build a spacecraft in a collaborative and fast-paced environment.

  1. Developmental problems and their solution for the Space Shuttle main engine alternate liquid oxygen high-pressure turbopump: Anomaly or failure investigation the key

    NASA Astrophysics Data System (ADS)

    Ryan, R.; Gross, L. A.

    1995-05-01

    The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering, multidiscipline team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly in both money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper uses these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the interacting turbomachinery technical disciplines and their sensitivities, and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.

  2. Developmental problems and their solution for the Space Shuttle main engine alternate liquid oxygen high-pressure turbopump: Anomaly or failure investigation the key

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Gross, L. A.

    1995-01-01

    The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering, multidiscipline team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly in both money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper uses these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the interacting turbomachinery technical disciplines and their sensitivities, and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.

  3. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
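
    A brief sketch of the polynomial response-surface approach reviewed above: sample a design space, fit a second-order polynomial by least squares, then query the inexpensive surrogate in place of the simulation. The objective function here is a made-up stand-in for an expensive analysis:

      # Quadratic response surface fitted by least squares over a
      # two-variable design space; the "simulation" data are synthetic.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(-1.0, 1.0, size=(30, 2))       # design of experiments
      y = 1.0 + 2.0 * x[:, 0] - x[:, 1] + 0.5 * x[:, 0] * x[:, 1] \
          + 0.1 * rng.normal(size=30)                # noisy stand-in response

      # Basis: 1, x1, x2, x1^2, x2^2, x1*x2
      A = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                           x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])
      coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

      def surrogate(x1, x2):
          """Cheap polynomial evaluation replacing the simulation."""
          return coeffs @ np.array([1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

      print(surrogate(0.2, -0.3))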

  4. Information Processing in the Cerebral Hemispheres: Selective Hemispheric Activation and Capacity Limitations.

    ERIC Educational Resources Information Center

    Hellige, Joseph B.; And Others

    1979-01-01

    Five experiments are reported concerning the effect on visual information processing of concurrently maintaining verbal information. The results suggest that the left cerebral hemisphere functions as a typical limited-capacity information processing system that can be influenced somewhat separately from the right hemisphere system. (Author/CTM)

  5. Foveal Processing Under Concurrent Peripheral Load in Profoundly Deaf Adults.

    PubMed

    Dye, Matthew W G

    2016-04-01

    Development of the visual system typically proceeds in concert with the development of audition. One result is that the visual system of profoundly deaf individuals differs from that of those with typical auditory systems. While past research has suggested deaf people have enhanced attention in the visual periphery, it is still unclear whether or not this enhancement entails deficits in central vision. Profoundly deaf and typically hearing adults were administered a variant of the useful field of view task that independently assessed performance on concurrent central and peripheral tasks. Identification of a foveated target was impaired by a concurrent selective peripheral attention task, more so in profoundly deaf adults than in the typically hearing. Previous findings of enhanced performance on the peripheral task were not replicated. These data are discussed in terms of flexible allocation of spatial attention targeted towards perceived task demands, and support a modified "division of labor" hypothesis whereby attentional resources co-opted to process peripheral space result in reduced resources in the central visual field. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Radioisotope Power Systems Reference Book for Mission Designers and Planners

    NASA Technical Reports Server (NTRS)

    Lee, Young; Bairstow, Brian

    2015-01-01

    The RPS Program's Program Planning and Assessment (PPA) Office commissioned the Mission Analysis team to develop the Radioisotope Power Systems (RPS) Reference Book for Mission Planners and Designers to define a baseline of RPS technology capabilities with specific emphasis on performance parameters and technology readiness. The main objective of this book is to provide RPS technology information that could be utilized by future mission concept studies and concurrent engineering practices. A progress summary from the major branches of RPS technology research provides mission analysis teams with a vital tool for assessing the RPS trade space, and provides concurrent engineering centers with a consistent set of guidelines for RPS performance characteristics. This book will be iterated when substantial new information becomes available to ensure continued relevance, serving as one of the cornerstone products of the RPS PPA Office. This book updates the original 2011 internal document, using data from the relevant publicly released RPS technology references and consultations with RPS technologists. Each performance parameter and RPS product subsection has been reviewed and cleared by at least one subject matter representative. A virtual workshop was held to reach consensus on the scope and contents of the book, and the definitions and assumptions that should be used. The subject matter experts then reviewed and updated the appropriate sections of the book. The RPS Mission Analysis Team then performed further updates and crosschecked the book for consistency. Finally, a second virtual workshop was held to ensure all subject matter experts and stakeholders concurred on the contents.

  7. Advances in river ice hydrology 1999-2003

    NASA Astrophysics Data System (ADS)

    Morse, Brian; Hicks, Faye

    2005-01-01

    In the period 1999 to 2003, river ice continued to have important socio-economic impacts in Canada and other Nordic countries. Concurrently, there were many important advances in all areas of Canadian research into river ice engineering and hydrology. For example: (1) River ice processes were highlighted in two special journal issues (Canadian Journal of Civil Engineering in 2003 and Hydrological Processes in 2002) and at five conferences (Canadian Committee on River Ice Processes and the Environment in 1999, 2001 and 2003, and International Association of Hydraulic Research in 2000 and 2002). (2) A number of workers have clearly advanced our understanding of river ice processes by bringing together disparate information in comprehensive review articles. (3) There have been significant advances in river ice modelling. For example, both one-dimensional (e.g. RIVICE, RIVJAM, ICEJAM, HEC-RAS, etc.) and two-dimensional (2-D; www.river2d.ca) public-domain ice-jam models are now available. Work is ongoing to improve RIVER2D, and a commercial 2-D ice-process model is being developed. (4) The 1999-2003 period is notable for the number of distinctly hydrological and ecological studies. On the quantitative side, many are making efforts to determine streamflow during the winter period. On the ecological side, some new publications have addressed the link to water quality (temperature, dissolved oxygen, nutrients and pollutants), and others have dealt with sediment transport and geomorphology (particularly as it relates to break-up), stream ecology (plants, food cycle, etc.) and fish habitat. There is growing recognition that these types of study require collaborative efforts. In our view, the main areas requiring further work are: (1) to interface geomorphological and habitat models with quantitative river ice hydrodynamic models; (2) to develop a manager's toolbox (database management, remote sensing, forecasting, intervention methodologies, etc.) to enable agencies to intervene better at the time of ice-jam-induced floods; and (3) to finalize ice-jam prevention methods on the St Lawrence River to safeguard its $2 billion commercial navigation industry.

  8. Comparison of laser Doppler and laser speckle contrast imaging using a concurrent processing system

    NASA Astrophysics Data System (ADS)

    Sun, Shen; Hayes-Gill, Barrie R.; He, Diwei; Zhu, Yiqun; Huynh, Nam T.; Morgan, Stephen P.

    2016-08-01

    Full field laser Doppler imaging (LDI) and single-exposure laser speckle contrast imaging (LSCI) are directly compared using a novel instrument which can concurrently image blood flow using both LDI and LSCI signal processing. Incorporating a commercial CMOS camera chip and a field programmable gate array (FPGA), the instrument simultaneously computes the flow images of LDI and the contrast maps of LSCI from the same detected optical signals. The comparison was carried out by imaging a rotating diffuser. LDI has a linear response to velocity. In contrast, LSCI is exposure-time dependent and does not provide a linear response in the presence of static speckle. It is also demonstrated that LDI and LSCI can be related through a power law which depends on the exposure time of LSCI.
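
    LSCI contrast is conventionally computed as K = sigma/mean over a small sliding window of the raw speckle image; lower contrast indicates more motion blurring and hence more flow. A sketch of that standard computation follows; the window size and synthetic image are illustrative choices, not the instrument's settings:

      # Local speckle contrast map K = std / mean over window x window
      # regions of a raw speckle image.
      import numpy as np

      def speckle_contrast(image, window=7):
          h, w = image.shape
          k = np.zeros((h - window + 1, w - window + 1))
          for i in range(k.shape[0]):
              for j in range(k.shape[1]):
                  patch = image[i:i + window, j:j + window]
                  k[i, j] = patch.std() / patch.mean()
          return k

      rng = np.random.default_rng(1)
      raw = rng.exponential(scale=100.0, size=(64, 64))  # synthetic speckle
      K = speckle_contrast(raw)
      print(K.mean())   # lower K suggests more motion blurring, i.e. more flow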

  9. Software-safety and software quality assurance in real-time applications Part 2: Real-time structures and languages

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1988-07-01

    Our society depends more and more on the reliability of embedded (real-time) computer systems, even in everyday life. Considering the complexity of the real world, this might become a severe threat. Real-time programming is a discipline important not only in process control and data acquisition systems, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other and with respect to their potential for quality and safety.
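
    As a minimal sketch of the semaphore-based synchronization constructs discussed above (shown in Python rather than Concurrent PASCAL, MODULA, CHILL or ADA), a bounded-buffer producer/consumer pair:

      # Classic producer/consumer synchronization with two counting
      # semaphores and a mutex guarding the critical region.
      import threading
      from collections import deque

      buffer = deque()
      empty_slots = threading.Semaphore(4)   # bounded buffer of size 4
      full_slots = threading.Semaphore(0)
      mutex = threading.Lock()               # critical region on the buffer

      def producer():
          for item in range(10):
              empty_slots.acquire()          # wait if the buffer is full
              with mutex:
                  buffer.append(item)
              full_slots.release()           # signal: one more item available

      def consumer():
          for _ in range(10):
              full_slots.acquire()           # wait if the buffer is empty
              with mutex:
                  item = buffer.popleft()
              empty_slots.release()
              print("consumed", item)

      threads = [threading.Thread(target=producer),
                 threading.Thread(target=consumer)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()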

  10. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  11. X-33 Attitude Control Using the XRS-2200 Linear Aerospike Engine

    NASA Technical Reports Server (NTRS)

    Hall, Charles E.; Panossian, Hagop V.

    1999-01-01

    The Vehicle Control Systems Team at Marshall Space Flight Center, Structures and Dynamics Laboratory, Guidance and Control Systems Division is designing, under a cooperative agreement with Lockheed Martin Skunkworks, the Ascent, Transition, and Entry flight attitude control systems for the X-33 experimental vehicle. Test flights, while suborbital, will achieve sufficient altitudes and Mach numbers to test Single Stage To Orbit, Reusable Launch Vehicle technologies. Ascent flight control phase, the focus of this paper, begins at liftoff and ends at linear aerospike main engine cutoff (MECO). The X-33 attitude control system design is confronted by a myriad of design challenges: a short design cycle, the X-33 incremental test philosophy, the concurrent design philosophy chosen for the X-33 program, and the fact that the attitude control system design is, as usual, closely linked to many other subsystems and must deal with constraints and requirements from these subsystems. Additionally, however, and of special interest, the use of the linear aerospike engine is a departure from the gimbaled engines traditionally used for thrust vector control (TVC) in launch vehicles and poses certain design challenges. This paper discusses the unique problem of designing the X-33 attitude control system with the linear aerospike engine, requirements development, modeling and analyses that verify the design.

  12. Enhanced identification of eligibility for depression research using an electronic medical record search engine.

    PubMed

    Seyfried, Lisa; Hanauer, David A; Nease, Donald; Albeiruti, Rashad; Kavanagh, Janet; Kales, Helen C

    2009-12-01

    Electronic medical records (EMRs) have become part of daily practice for many physicians. Attempts have been made to apply electronic search engine technology to speed EMR review. This was a prospective, observational study to compare the speed and clinical accuracy of a medical record search engine vs. manual review of the EMR. Three raters reviewed 49 cases in the EMR to screen for eligibility in a depression study using the electronic medical record search engine (EMERSE). One week later raters received a scrambled set of the same patients including 9 distractor cases, and used manual EMR review to determine eligibility. For both methods, accuracy was assessed for the original 49 cases by comparison with a gold standard rater. Use of EMERSE resulted in considerable time savings; chart reviews using EMERSE were significantly faster than traditional manual review (p=0.03). The percent agreement of raters with the gold standard (e.g. concurrent validity) using either EMERSE or manual review was not significantly different. Using a search engine optimized for finding clinical information in the free-text sections of the EMR can provide significant time savings while preserving clinical accuracy. The major power of this search engine is not from a more advanced and sophisticated search algorithm, but rather from a user interface designed explicitly to help users search the entire medical record in a way that protects health information.
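
    The agreement statistics reported above can be computed as simple percent agreement with the gold-standard rater, optionally chance-corrected with Cohen's kappa. A short illustration follows; the eligibility labels are hypothetical, not the study's data:

      # Percent agreement and Cohen's kappa against a gold-standard rater.
      def percent_agreement(rater, gold):
          return sum(r == g for r, g in zip(rater, gold)) / len(gold)

      def cohens_kappa(rater, gold):
          n = len(gold)
          p_o = percent_agreement(rater, gold)
          # chance agreement from the marginal label frequencies
          p_e = sum((rater.count(c) / n) * (gold.count(c) / n)
                    for c in set(gold) | set(rater))
          return (p_o - p_e) / (1.0 - p_e)

      gold = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = eligible for the study
      rater = [1, 0, 1, 0, 0, 0, 1, 1]
      print(percent_agreement(rater, gold))  # 0.75
      print(cohens_kappa(rater, gold))       # 0.5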

  13. Engineering of Surface Chemistry for Enhanced Sensitivity in Nanoporous Interferometric Sensing Platforms.

    PubMed

    Law, Cheryl Suwen; Sylvia, Georgina M; Nemati, Madieh; Yu, Jingxian; Losic, Dusan; Abell, Andrew D; Santos, Abel

    2017-03-15

    We explore new approaches to engineering the surface chemistry of interferometric sensing platforms based on nanoporous anodic alumina (NAA) and reflectometric interference spectroscopy (RIfS). Two surface engineering strategies are presented, namely (i) selective chemical functionalization of the inner surface of NAA pores with amine-terminated thiol molecules and (ii) selective chemical functionalization of the top surface of NAA with dithiol molecules. The strong molecular interaction of Au³⁺ ions with thiol-containing functional molecules of alkane chain or peptide character provides a model sensing system with which to assess the sensitivity of these NAA platforms by both molecular feature and surface engineering. Changes in the effective optical thickness of the functionalized NAA photonic films (i.e., the sensing principle), in response to gold ions, are monitored in real-time by RIfS. 6-Amino-1-hexanethiol (inner surface) and 1,6-hexanedithiol (top surface), the most sensitive functional molecules from approaches i and ii, respectively, were combined into a third sensing strategy whereby the NAA platforms are functionalized on both the top and inner surfaces concurrently. Engineering of the surface according to this approach resulted in an additive enhancement in sensitivity of up to 5-fold compared to previously reported systems. This study advances the rational engineering of surface chemistry for interferometric sensing on nanoporous platforms with potential applications for real-time monitoring of multiple analytes in dynamic environments.

  14. Effects of biodiesel, engine load and diesel particulate filter on nonvolatile particle number size distributions in heavy-duty diesel engine exhaust.

    PubMed

    Young, Li-Hao; Liou, Yi-Jyun; Cheng, Man-Ting; Lu, Jau-Huai; Yang, Hsi-Hsien; Tsai, Ying I; Wang, Lin-Chi; Chen, Chung-Bang; Lai, Jim-Shoung

    2012-01-15

    Diesel engine exhaust contains large numbers of submicrometer particles that degrade air quality and human health. This study examines the number emission characteristics of 10-1000 nm nonvolatile particles from a heavy-duty diesel engine, operating with various waste cooking oil biodiesel blends (B2, B10 and B20), engine loads (0%, 25%, 50% and 75%) and a diesel oxidation catalyst plus diesel particulate filter (DOC+DPF) under steady modes. For a given load, the total particle number concentrations (N_TOT) decrease slightly, while the mode diameters show negligible changes with increasing biodiesel blends. For a given biodiesel blend, both N_TOT and the mode diameters increase modestly with increasing load above 25%. N_TOT is highest at idle, and the corresponding size distributions are strongly affected by condensation and possible nucleation of semivolatile materials. Nonvolatile cores of diameters less than 16 nm are observed only at idle mode. The DOC+DPF shows remarkable filtration efficiency for both the core and soot particles, irrespective of the biodiesel blend and engine load under study. N_TOT downstream of the DOC+DPF is comparable to typical ambient levels of ≈10⁴ cm⁻³. This implies that, without concurrent reductions of semivolatile materials, the formation of semivolatile nucleation-mode particles downstream of the aftertreatment is highly favored. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Enhanced Identification of Eligibility for Depression Research Using an Electronic Medical Record Search Engine

    PubMed Central

    Seyfried, Lisa; Hanauer, David; Nease, Donald; Albeiruti, Rashad; Kavanagh, Janet; Kales, Helen C.

    2009-01-01

    Purpose Electronic medical records (EMRs) have become part of daily practice for many physicians. Attempts have been made to apply electronic search engine technology to speed EMR review. This was a prospective, observational study to compare the speed and accuracy of an electronic search engine vs. manual review of the EMR. Methods Three raters reviewed 49 cases in the EMR to screen for eligibility in a depression study using the electronic search engine (EMERSE). One week later, raters received a scrambled set of the same patients, including 9 distractor cases, and used manual EMR review to determine eligibility. For both methods, accuracy was assessed for the original 49 cases by comparison with a gold-standard rater. Results Use of EMERSE resulted in considerable time savings; chart reviews using EMERSE were significantly faster than traditional manual review (p=0.03). The percent agreement of raters with the gold standard (i.e., concurrent validity) using either EMERSE or manual review was not significantly different. Conclusions Using a search engine optimized for finding clinical information in the free-text sections of the EMR can provide significant time savings while preserving reliability. The major power of this search engine comes not from a more advanced and sophisticated search algorithm, but rather from a user interface designed explicitly to help users search the entire medical record in a way that protects health information. PMID:19560962

  16. [Development of Web-based multimedia content for a physical examination and health assessment course].

    PubMed

    Oh, Pok-Ja; Kim, Il-Ok; Shin, Sung-Rae; Jung, Hoe-Kyung

    2004-10-01

    This study was conducted to develop Web-based multimedia content for a Physical Examination and Health Assessment course. The multimedia content was developed based on Jung's teaching and learning structure plan model, using the following five processes: 1) Analysis Stage, 2) Planning Stage, 3) Storyboard Framing and Production Stage, 4) Program Operation Stage, and 5) Final Evaluation Stage. The Web-based multimedia content consisted of an intro movie, a main page and sub pages. On the main page, there were six menu bars consisting of Announcement center, Information of professors, Lecture guide, Cyber lecture, Q&A, and Data centers, and a site map which introduced the 15 weekly lectures. In the operation of the Web-based multimedia content, HTML, JavaScript, Flash, and multimedia technology (audio and video) were utilized, and the content consisted of text content, interactive content, animation, and audio & video. Consultation with experts in content, computer engineering, and educational technology was utilized in the development of these processes. Web-based multimedia content is expected to offer individualized and tailored learning opportunities that maximize and facilitate the effectiveness of the teaching and learning process. Therefore, multimedia content should be utilized concurrently with the lecture in Physical Examination and Health Assessment classes, as a vital teaching aid to make up for the weaknesses of the face-to-face teaching-learning method.

  17. Nondestructive evaluations

    NASA Astrophysics Data System (ADS)

    Kulkarni, S.

    1993-03-01

    This report discusses the Nondestructive Evaluation (NDE) thrust area, which supports initiatives that advance inspection science and technology. The goal of the NDE thrust area is to provide cutting-edge technologies that show promise of becoming inspection tools three to five years in the future. In selecting projects, the thrust area anticipates the needs of existing and future Lawrence Livermore National Laboratory (LLNL) programs. NDE provides materials characterization and inspections of finished parts and complex objects to find flaws and fabrication defects and to determine their physical and chemical characteristics. NDE also encompasses process monitoring and control sensors and the monitoring of in-service damage. For concurrent engineering, NDE becomes a frontline technology and strongly impacts issues of certification and of life prediction and extension. In FY-92, in addition to supporting LLNL programs and the activities of nuclear weapons contractors, NDE initiated several projects with government agencies and private industries to study aging infrastructures and to advance manufacturing processes. Examples of these projects are (1) the Aging Airplanes Inspection Program for the Federal Aviation Administration, (2) Signal Processing of Acoustic Signatures of Heart Valves for Shiley, Inc., and (3) Turbine Blade Inspection for the Air Force, jointly with Southwest Research Institute and Garrett. In FY-92, the primary contributions of the NDE thrust area, described in this report, were in fieldable chemical sensor systems, computed tomography, and laser generation and detection of ultrasonic energy.

  18. Improved Broadband Liner Optimization Applied to the Advanced Noise Control Fan

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Jones, Michael G.; Sutliff, Daniel L.; Ayle, Earl; Ichihashi, Fumitaka

    2014-01-01

    The broadband component of fan noise has grown in relevance with the utilization of increased bypass ratio and advanced fan designs. Thus, while the attenuation of fan tones remains paramount, the ability to simultaneously reduce broadband fan noise levels has become more desirable. This paper describes improvements to a previously established broadband acoustic liner optimization process using the Advanced Noise Control Fan rig as a demonstrator. Specifically, in-duct attenuation predictions with a statistical source model are used to obtain optimum impedance spectra over the conditions of interest. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners aimed at producing impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increased weighting to specific frequencies and/or operating conditions. Constant-depth, double-degree-of-freedom and variable-depth, multi-degree-of-freedom designs are carried through design, fabrication, and testing to validate the efficacy of the design process. Results illustrate the value of the design process in concurrently evaluating the relative costs/benefits of these liner designs. This study also provides an application for demonstrating the integrated use of duct acoustic propagation/radiation and liner modeling tools in the design and evaluation of novel broadband liner concepts for complex engine configurations.
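
    The acceptance criterion described above amounts to a weighted mismatch score between each candidate liner's impedance spectrum and the predicted optimum. A minimal sketch of such a criterion follows; the score function, the weighting scheme, and all numbers are our assumptions, not NASA's design code.

      # Rank candidate liner designs by weighted impedance mismatch against the
      # predicted optimum over a (condition x frequency) grid (illustrative only).
      import numpy as np

      def score(candidate_z, optimum_z, weights):
          """Weighted squared impedance mismatch, summed over the grid."""
          return float(np.sum(weights * np.abs(candidate_z - optimum_z) ** 2))

      rng = np.random.default_rng(0)
      z_opt = rng.normal(size=(2, 3)) + 1j * rng.normal(size=(2, 3))  # 2 conditions x 3 freqs
      w = np.array([[1.0, 2.0, 1.0],       # extra weight on one frequency/condition
                    [1.0, 1.0, 1.0]])
      candidates = {"2DOF": z_opt + 0.3, "MDOF": z_opt + 0.1j}
      best = min(candidates, key=lambda name: score(candidates[name], z_opt, w))
      print(best)                          # "MDOF": smaller weighted mismatch here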

  19. Dynamic MRI to quantify musculoskeletal motion: A systematic review of concurrent validity and reliability, and perspectives for evaluation of musculoskeletal disorders.

    PubMed

    Borotikar, Bhushan; Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain

    2017-01-01

    To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. The search was conducted on articles published in Web of Science, PubMed, Scopus, Academic Search Premier, and the Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Twenty articles fulfilled the inclusion criteria, with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven of the eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques for evaluating joint motion, and spin tag for muscle motion. Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however, results should be interpreted with caution, since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions.

  20. Spectral composition of concurrent noise affects neuronal sensitivity to interaural time differences of tones in the dorsal nucleus of the lateral lemniscus.

    PubMed

    Siveke, Ida; Leibold, Christian; Grothe, Benedikt

    2007-11-01

    We are regularly exposed to several concurrent sounds, producing a mixture of binaural cues. The neuronal mechanisms underlying the localization of concurrent sounds are not well understood. The major binaural cues for localizing low-frequency sounds in the horizontal plane are interaural time differences (ITDs). Auditory brain stem neurons encode ITDs by firing maximally in response to "favorable" ITDs and weakly or not at all in response to "unfavorable" ITDs. We recorded from ITD-sensitive neurons in the dorsal nucleus of the lateral lemniscus (DNLL) while presenting pure tones at different ITDs embedded in noise. We found that increasing levels of concurrent white noise suppressed the maximal response rate to tones with favorable ITDs and slightly enhanced the response rate to tones with unfavorable ITDs. Nevertheless, most of the neurons maintained ITD sensitivity to tones even for noise intensities equal to that of the tone. Using concurrent noise with a spectral composition from which the neuron's excitatory frequencies were omitted reduced the maximal response in a manner similar to that obtained with concurrent white noise. This finding indicates that the decrease of the maximal rate is mediated by suppressive cross-frequency interactions, which we also observed during monaural stimulation with additional white noise. In contrast, the enhancement of the firing rate to tones at unfavorable ITDs might be due to early binaural interactions (e.g., at the level of the superior olive). A simple simulation corroborates this interpretation. Taken together, these findings suggest that the spectral composition of a concurrent sound strongly influences the spatial processing of ITD-sensitive DNLL neurons.

  1. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform-tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instrument classes: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the velocity of the bedforms is estimated with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings, and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps obtained from repeated multibeam echo sounder (MBES) surveys served as input data. Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from another non-intrusive technique (due to the lack of direct samplings), ISSDOTv2, developed by the US Army Corps of Engineers. The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
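
    For migrating triangular bedforms, the Exner equation reduces to the classical dune-tracking relation q_b = (1 - p)·α·h·V_c, with p the bed porosity, α a shape factor (≈0.5 for triangular dunes), h the bedform height, and V_c the streamwise migration speed. The sketch below applies that relation pointwise, in the spirit of the AMV post-processing; it is our illustration, and the porosity, shape factor, and input values are assumptions, not the authors' data.

      # Bedload transport per unit width from bedform height and migration speed,
      # via the dune-tracking form of the Exner equation (illustrative values).
      import numpy as np

      def bedload_rate(height_m, v_stream_m_s, porosity=0.4, alpha=0.5):
          """Volumetric bedload transport rate per unit width (m^2/s)."""
          return (1.0 - porosity) * alpha * height_m * v_stream_m_s

      h = np.full(10, 0.2)               # bedform heights along a section (m)
      vc = np.full(10, 1.0 / 3600.0)     # streamwise migration speed: 1 m/h in m/s
      print(bedload_rate(h, vc).mean())  # ~1.7e-5 m^2/s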

  2. Incidental category learning and cognitive load in a multisensory environment across childhood.

    PubMed

    Broadbent, H J; Osborne, T; Rea, M; Peng, A; Mareschal, D; Kirkham, N Z

    2018-06-01

    Multisensory information has been shown to facilitate learning (Bahrick & Lickliter, 2000; Broadbent, White, Mareschal, & Kirkham, 2017; Jordan & Baker, 2011; Shams & Seitz, 2008). However, although research has examined the modulating effect of unisensory and multisensory distractors on multisensory processing, the extent to which a concurrent unisensory or multisensory cognitive load task would interfere with or support multisensory learning remains unclear. This study examined the role of concurrent task modality on incidental category learning in 6- to 10-year-olds. Participants were engaged in a multisensory learning task while also performing either a unisensory (visual or auditory only) or multisensory (audiovisual) concurrent task (CT). We found that engaging in an auditory CT led to poorer performance on incidental category learning compared with an audiovisual or visual CT, across groups. In 6-year-olds, category test performance was at chance in the auditory-only CT condition, suggesting auditory concurrent tasks may interfere with learning in younger children, but the addition of visual information may serve to focus attention. These findings provide novel insight into the use of multisensory concurrent information on incidental learning. Implications for the deployment of multisensory learning tasks within education across development and developmental changes in modality dominance and ability to switch flexibly across modalities are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Concurrence of lower jaw skeletal anomalies in triploid Atlantic salmon (Salmo salar L.) and the effect on growth in freshwater.

    PubMed

    Amoroso, G; Cobcroft, J M; Adams, M B; Ventura, T; Carter, C G

    2016-12-01

    Triploid Atlantic salmon populations are associated with a higher prevalence of lower jaw skeletal anomalies, which deleteriously affect fish performance, welfare and value. An anomalous lower jaw can be curved downward (LJD), shortened (SJ) or misaligned (MA). Two separate groups of triploid Atlantic salmon (~12 g) with either a normal lower jaw (NOR) or SJ were visually assessed four times over three months for the presence and concurrence of jaw anomalies (with severity classified) and opercular shortening, to understand the relatedness of these anomalous developmental processes. The prevalence of jaw anomalies increased in both groups over time (NOR group - SJ, LJD and MA combined 0-24.5%; SJ group - LJD and MA combined 17-31%). SJ and LJD occurred both independently and concurrently, whereas MA exclusively concurred with them. All three anomalies could be concurrent. Severity of both LJD and SJ increased in the SJ group only. Opercular shortening recovery was observed in both groups, but at a slower rate in the SJ group. The SJ group specific growth rate (SGR) was significantly (P < 0.05) lower than that of the NOR group. This study demonstrated the concurrence of SJ, LJD and MA and showed possible deleterious consequences deriving from these conditions. © 2016 John Wiley & Sons Ltd.

  4. Peroxide Propulsion at the Turn of the Century

    NASA Technical Reports Server (NTRS)

    Anderson, William E.; Butler, Kathy; Crocket, Dave; Lewis, Tim; McNeal, Curtis

    2000-01-01

    A resurgence of interest in peroxide propulsion has occurred in the last years of the 20th Century. This interest is driven by the need for lower cost propulsion systems and the need for storable, reusable propulsion systems to meet future space transportation system architectures. NASA and the Air Force are jointly developing two propulsion systems for flight demonstration early in the 21st Century. One system will be a development of Boeing's AR2-3 engine, which was successfully fielded in the 1960s. The other is a new pressure-fed design by Orbital Sciences Corporation for expendable mission requirements. Concurrently, NASA and industry are pursuing the key peroxide technologies needed to design, fabricate, and test advanced peroxide engines to meet mission needs beyond 2005. This paper will present a description of the AR2-3, report the status of its current test program, and describe its intended flight demonstration. It will then describe the Orbital 10K engine, the status of its test program, and its planned flight demonstration. Finally, the paper will present a plan, or technology roadmap, for the development of an advanced peroxide engine for the 21st Century.

  5. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest links in the process. While some of the more academic methods, e.g., mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost-effective means to assure safe software within a safe system.

  6. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.

    1984-01-01

    A method for systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by an analysis of its sensitivity to the inputs received from other subtasks, to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.
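
    The alternation described here (lower-level optimizations for fixed coupling inputs, followed by a system-level update, iterated until convergence) can be illustrated with a deliberately tiny two-subsystem example. This is our toy construction, not the paper's method; the objectives and the system-level update rule are arbitrary.

      # Toy two-level scheme: each subsystem minimizes its own objective given the
      # shared variable z; the system level updates z from the subsystem results
      # and iterates until the coupled design stops changing.
      from scipy.optimize import minimize_scalar

      def subsystem_a(z):
          return minimize_scalar(lambda x: (x - z) ** 2 + 0.1 * (x - 2.0) ** 2).x

      def subsystem_b(z):
          return minimize_scalar(lambda y: (y + z) ** 2 + 0.2 * (y - 1.0) ** 2).x

      z = 1.0                                    # shared system-level variable
      for _ in range(200):
          x, y = subsystem_a(z), subsystem_b(z)  # lower-level optimizations (could run concurrently)
          z_new = 0.5 * (x - y)                  # arbitrary system-level update
          if abs(z_new - z) < 1e-10:
              break
          z = z_new
      print(round(z, 4), round(x, 4), round(y, 4))  # converged coupled design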

  7. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  8. Local Working Agreements and the Tennessee SOPs

    EPA Pesticide Factsheets

    A TN interagency workgroup convened to improve communication and coordination, identify permit requirements, implement concurrent reviews, reduce permit revisions, and develop a coordinated JD/Pre-App process.

  9. FIBER AND INTEGRATED OPTICS. OPTOELECTRONICS: Some characteristics of formation of volume dynamic holograms by concurrent waves propagating in resonant atomic media

    NASA Astrophysics Data System (ADS)

    Kirilenko, A. K.

    1989-07-01

    An investigation was made of the transient process of formation of volume dynamic holograms by light within the spectral limits of the D2 resonant absorption line of sodium. The observed asymmetry of the spectral distribution of the gain of the signal waves in the case of a concurrent interaction between four beams was attributed to different mechanisms of the interaction, the main ones being a four-wave interaction in the long-wavelength wing and transient two-beam energy transfer in the short-wavelength wing. The results obtained were used to recommend an experimental method for determining the relative contributions of these processes to the amplification of signal waves.

  10. 75 FR 22047 - Intent To Initiate Consultation and Coordinate the National Oceanic and Atmospheric...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-27

    ... purposes and policies of the NMSA (73 FR 53161). The management plan review process occurs concurrently... management plan was written as part of the sanctuary designation process and published in the Final... Environmental Impact Statement under the authority of NEPA (73 FR 53161). The management plan review process is...

  11. Spatial and temporal modifications of multitalker speech can improve speech perception in older adults.

    PubMed

    Gygi, Brian; Shafiro, Valeriy

    2014-04-01

    Speech perception in multitalker environments often requires listeners to divide attention among several concurrent talkers before focusing on one talker with pertinent information. Such attentionally demanding tasks are particularly difficult for older adults, due both to age-related hearing loss (presbycusis) and to general declines in attentional processing and associated cognitive abilities. This study investigated two signal-processing techniques that have been suggested as a means of improving the speech perception accuracy of older adults: time stretching and spatial separation of target talkers. Stimuli in each experiment comprised 2-4 fixed-form utterances in which listeners were asked to consecutively 1) detect concurrently spoken keywords in the beginning of the utterance (divided attention); and 2) identify additional keywords from only one talker at the end of the utterance (selective attention). In Experiment 1, the overall tempo of each utterance was unaltered or slowed down by 25%; in Experiment 2, the concurrent utterances were spatially coincident or separated across a 180-degree hemifield. Both manipulations improved performance on both tasks for elderly adults with age-appropriate hearing. Increasing the divided attention load by attending to more concurrent keywords had a marked negative effect on performance of the selective attention task only when the target talker was identified by a keyword, but not by spatial location. These findings suggest that the temporal and spatial modifications improved perception of multitalker speech primarily by reducing competition among the cognitive resources required to perform attentionally demanding tasks. Published by Elsevier B.V.

  12. What You Don't Notice Can Harm You: Age-Related Differences in Detecting Concurrent Visual, Auditory, and Tactile Cues.

    PubMed

    Pitts, Brandon J; Sarter, Nadine

    2018-06-01

    Objective This research sought to determine whether people can perceive and process three nonredundant (and unrelated) signals in vision, hearing, and touch at the same time and how aging and concurrent task demands affect this ability. Background Multimodal displays have been shown to improve multitasking and attention management; however, their potential limitations are not well understood. The majority of studies on multimodal information presentation have focused on the processing of only two concurrent and, most often, redundant cues by younger participants. Method Two experiments were conducted in which younger and older adults detected and responded to a series of singles, pairs, and triplets of visual, auditory, and tactile cues in the absence (Experiment 1) and presence (Experiment 2) of an ongoing simulated driving task. Detection rates, response times, and driving task performance were measured. Results Compared to younger participants, older adults showed longer response times and higher error rates in response to cues/cue combinations. Older participants often missed the tactile cue when three cues were combined. They sometimes falsely reported the presence of a visual cue when presented with a pair of auditory and tactile signals. Driving performance suffered most in the presence of cue triplets. Conclusion People are more likely to miss information if more than two concurrent nonredundant signals are presented to different sensory channels. Application The findings from this work help inform the design of multimodal displays and ensure their usefulness across different age groups and in various application domains.

  13. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory and; (10) performance analyzer for the pisces system.

  14. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  15. Nacelle Aerodynamic and Inertial Loads (NAIL) project. Appendix B

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The testing was conducted on the Boeing-owned 747 RA001 test bed airplane during the concurrent 767/JT9D-7R4 engine development program. Following a functional check flight conducted from Boeing Field International (BFI) on 3 October 1980, the airplane and test personnel were ferried to Valley Industrial Park (GSG) near Glasgow, Montana, on 7 October 1980. The combined NAIL and 767/JT9D-7R4 test flights were conducted at the Glasgow remote test site, and the airplane was returned to Seattle on 26 October 1980.

  16. Measurements of multi-scalar mixing in a turbulent coaxial jet

    NASA Astrophysics Data System (ADS)

    Hewes, Alais; Mydlarski, Laurent

    2017-11-01

    There are relatively few studies of turbulent multi-scalar mixing, despite the occurrence of this phenomenon in common processes (e.g. chemically reacting flows, oceanic mixing). In the present work, we simultaneously measure the evolution of two passive scalars (temperature and helium concentration) and velocity in a coaxial jet. Such a flow is particularly relevant, as coaxial jets are regularly employed in applications of turbulent non-premixed combustion, which relies on multi-scalar mixing. The coaxial jet used in the current experiment is based on the work of Cai et al. (J. Fluid Mech., 2011), and consists of a vertically oriented central jet of helium and air, surrounded by an annular flow of (unheated) pure air, emanating into a slow co-flow of (pure) heated air. The simultaneous two-scalar and velocity measurements are made using a 3-wire hot-wire anemometry probe. The first two wires of this probe form an interference (or Way-Libby) probe, and measure velocity and concentration. The third wire, a hot-wire operating at a low overheat ratio, measures temperature. The 3-wire probe is used to obtain concurrent velocity, concentration, and temperature statistics to characterize the mixing process by way of single and multivariable/joint statistics. Supported by the Natural Sciences and Engineering Research Council of Canada (Grant 217184).

  17. Enhancing proliferation and optimizing the culture condition for human bone marrow stromal cells using hypoxia and fibroblast growth factor-2.

    PubMed

    Lee, Jung-Seok; Kim, Seul Ki; Jung, Byung-Joo; Choi, Seong-Bok; Choi, Eun-Young; Kim, Chang-Sung

    2018-04-01

    This study aimed to determine the cellular characteristics and behaviors of human bone marrow stromal cells (hBMSCs) expanded in media in a hypoxic or normoxic condition and with or without fibroblast growth factor-2 (FGF-2) treatment. hBMSCs isolated from the vertebral body and expanded in these four groups were evaluated for cellular proliferation/migration, colony-forming units, cell-surface characterization, in vitro differentiation, in vivo transplantation, and gene expression. Culturing hBMSCs using a particular environmental factor (hypoxia) and with the addition of FGF-2 increased the cellular proliferation rate while enhancing the regenerative potential, modulated the multipotency-related processes (enhanced chondrogenesis-related processes/osteogenesis, but reduced adipogenesis), and increased cellular migration and collagen formation. The gene expression levels in the experimental samples showed activation of the hypoxia-inducible factor-1 pathway and glycolysis in the hypoxic condition, with this not being affected by the addition of FGF-2. The concurrent application of hypoxia and FGF-2 could provide a favorable condition for culturing hBMSCs to be used in clinical applications associated with bone tissue engineering, due to the enhancement of cellular proliferation and regenerative potential. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Integrating Design and Manufacturing for a High Speed Civil Transport Wing

    NASA Technical Reports Server (NTRS)

    Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.

    1994-01-01

    The aerospace industry is currently addressing the problem of integrating design and manufacturing. Because of the difficulties associated with using conventional, procedural techniques and algorithms, it is the authors' belief that the only feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors propose a methodology for an aircraft producibility assessment, including a KBS, that addresses both procedural and heuristic aspects of integrating design and manufacturing of a High Speed Civil Transport (HSCT) wing. The HSCT was chosen as the focus of this investigation since it is a current NASA/aerospace industry initiative full of technological challenges involving many disciplines. The paper gives a brief background of selected previous supersonic transport studies followed by descriptions of key relevant design and manufacturing methodologies. Georgia Tech's Concurrent Engineering/Integrated Product and Process Development methodology is discussed with reference to this proposed conceptual producibility assessment. Evaluation criteria are presented that relate pertinent product and process parameters to overall product producibility. In addition, the authors' integration methodology and reasons for selecting a KBS to integrate design and manufacturing are presented in this paper. Finally, a proposed KBS is given, as well as statements of future work and overall investigation objectives.

  19. Longitudinal in vivo evaluation of bone regeneration by combined measurement of multi-pinhole SPECT and micro-CT for tissue engineering

    NASA Astrophysics Data System (ADS)

    Lienemann, Philipp S.; Metzger, Stéphanie; Kiveliö, Anna-Sofia; Blanc, Alain; Papageorgiou, Panagiota; Astolfo, Alberto; Pinzer, Bernd R.; Cinelli, Paolo; Weber, Franz E.; Schibli, Roger; Béhé, Martin; Ehrbar, Martin

    2015-05-01

    Over the last decades, great strides have been made in the development of novel implants for the treatment of bone defects. The increasing versatility and complexity of these implant designs call for concurrent advances in means to assess in vivo the course of induced bone formation in preclinical models. Since its discovery, micro-computed tomography (micro-CT) has excelled as a powerful high-resolution technique for the non-invasive assessment of newly formed bone tissue. However, micro-CT fails to provide spatiotemporal information on the biological processes ongoing during bone regeneration. Conversely, owing to its versatile applicability and cost-effectiveness, single photon emission computed tomography (SPECT) would be an ideal technique for assessing such biological processes with high sensitivity and, for nuclear imaging, comparably high resolution (<1 mm). Herein, we employ modularly designed poly(ethylene glycol)-based hydrogels that release bone morphogenetic protein to guide the healing of critical-sized calvarial bone defects. By combined in vivo longitudinal multi-pinhole SPECT and micro-CT evaluations, we determine the spatiotemporal course of bone formation and remodeling within this synthetic hydrogel implant. End-point evaluations by high-resolution micro-CT and histological evaluation confirm the value of this approach for following and optimizing bone-inducing biomaterials.

  20. Programming Models for Concurrency and Real-Time

    NASA Astrophysics Data System (ADS)

    Vitek, Jan

    Modern real-time applications are increasingly large, complex and concurrent systems which must meet stringent performance and predictability requirements. Programming those systems requires fundamental advances in programming languages and runtime systems. This talk presents our work on Flexotasks, a programming model for concurrent, real-time systems inspired by stream processing and concurrent active objects. Among the key innovations in Flexotasks is that it supports both real-time garbage collection and region-based memory with an ownership type system for static safety. Communication between tasks is performed by channels with a linear type discipline to avoid copying messages, and by a non-blocking transactional memory facility. We have evaluated our model empirically within two distinct implementations, one based on Purdue's Ovm research virtual machine framework and the other on WebSphere, IBM's production real-time virtual machine. We have written a number of small programs, as well as a 30 KLOC avionics collision detector application. We show that Flexotasks are capable of executing periodic threads at 10 KHz with a standard deviation of 1.2 µs and have performance competitive with hand-coded C programs.
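
    The 10 KHz periodic-execution result quoted above is a property of the real-time virtual machines used; plain Python cannot approach it, but the measurement itself (release a task at a fixed rate and report the standard deviation of the observed periods) is easy to sketch. Everything below is our illustration, unrelated to the Flexotasks implementation.

      # Run a task at a fixed period and measure inter-release jitter.
      import time, statistics

      def run_periodic(task, period_s, iterations):
          releases = []
          next_release = time.perf_counter()
          for _ in range(iterations):
              now = time.perf_counter()
              if now < next_release:
                  time.sleep(next_release - now)   # wait for the release point
              releases.append(time.perf_counter())
              task()
              next_release += period_s             # fixed-rate, drift-free schedule
          periods = [b - a for a, b in zip(releases, releases[1:])]
          return statistics.stdev(periods)

      jitter = run_periodic(lambda: None, period_s=0.01, iterations=200)
      print(f"period std dev: {jitter * 1e6:.1f} us")  # orders of magnitude above 1.2 us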

  1. Phonological similarity effect in complex span task.

    PubMed

    Camos, Valérie; Mora, Gérôme; Barrouillet, Pierre

    2013-01-01

    The aim of our study was to test the hypothesis that two systems are involved in verbal working memory; one is specifically dedicated to the maintenance of phonological representations through verbal rehearsal while the other would maintain multimodal representations through attentional refreshing. This theoretical framework predicts that phonologically related phenomena such as the phonological similarity effect (PSE) should occur when the domain-specific system is involved in maintenance, but should disappear when concurrent articulation hinders its use. Impeding maintenance in the domain-general system by a concurrent attentional demand should impair recall performance without affecting PSE. In three experiments, we manipulated the concurrent articulation and the attentional demand induced by the processing component of complex span tasks in which participants had to maintain lists of either similar or dissimilar words. Confirming our predictions, PSE affected recall performance in complex span tasks. Although both the attentional demand and the articulatory requirement of the concurrent task impaired recall, only the induction of an articulatory suppression during maintenance made the PSE disappear. These results suggest a duality in the systems devoted to verbal maintenance in the short term, constraining models of working memory.

  2. NASA Planetary Science Summer School: Preparing the Next Generation of Planetary Mission Leaders

    NASA Astrophysics Data System (ADS)

    Budney, C. J.; Lowes, L. L.; Sohus, A.; Wheeler, T.; Wessen, A.; Scalice, D.

    2010-12-01

    Sponsored by NASA’s Planetary Science Division, and managed by the Jet Propulsion Laboratory, the Planetary Science Summer School prepares the next generation of engineers and scientists to participate in future solar system exploration missions. Participants learn the mission life cycle, roles of scientists and engineers in a mission environment, mission design interconnectedness and trade-offs, and the importance of teamwork. For this professional development opportunity, applicants are sought who have a strong interest and experience in careers in planetary exploration, and who are science and engineering post-docs, recent PhDs, and doctoral students, and faculty teaching such students. Disciplines include planetary science, geoscience, geophysics, environmental science, aerospace engineering, mechanical engineering, and materials science. Participants are selected through a competitive review process, with selections based on the strength of the application and advisor’s recommendation letter. Under the mentorship of a lead engineer (Dr. Charles Budney), students select, design, and develop a mission concept in response to the NASA New Frontiers Announcement of Opportunity. They develop their mission in the JPL Advanced Projects Design Team (Team X) environment, which is a cross-functional multidisciplinary team of professional engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis and evaluation of mission concept designs. About 36 students participate each year, divided into two summer sessions. In advance of an intensive week-long session in the Project Design Center at JPL, students select the mission and science goals during a series of six weekly WebEx/telecons, and develop a preliminary suite of instrumentation and a science traceability matrix. Students assume both a science team and a mission development role with JPL Team X mentors. Once at JPL, students participate in a series of Team X project design sessions, during which their mentors aid them in finalizing their mission design and instrument suite, and in making the necessary trade-offs to stay within the cost cap. Tours of JPL facilities highlight the end-to-end life cycle of a mission. At week’s end, students present their Concept Study to a “proposal review board” of JPL scientists and engineers and NASA Headquarters executives, who feed back the strengths and weaknesses of their proposal and mission design. The majority of students come from top US universities with planetary science or engineering programs, such as Brown University, MIT, Georgia Tech, University of Colorado, Caltech, Stanford, University of Arizona, UCLA, and University of Michigan. Almost a third of Planetary Science Summer School alumni from the last 10 years of the program are currently employed by NASA or JPL. The Planetary Science Summer School is implemented by the JPL Education Office in partnership with JPL’s Team X Project Design Center.

  3. Examining the Effect of the Die Angle on Tool Load and Wear in the Extrusion Process

    NASA Astrophysics Data System (ADS)

    Nowotyńska, Irena; Kut, Stanisław

    2014-04-01

    Tool durability is a crucial factor in every manufacturing process, including extrusion. Striving for higher product quality should be accompanied by long tool life and reduced production costs. This article presents comparative research on die load and wear at various working-cone angles during concurrent extrusion. Numerical calculations of the tool load during concurrent extrusion were performed with the MSC MARC software, using the finite element method (FEM). The Archard model, implemented in the software via the FEM, was used to determine and compare die wear. Tool deformations and stress distributions were determined from the analyses performed, and the die wear depth at various working-cone angles was established. A properly shaped die not only influences the properties of the extruded material, but also controls the loads, elastic deformations, and service life of the tool.
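
    The Archard relation the study builds on predicts a wear volume V = K·F·s/H, or equivalently a local wear depth h = K·p·s/H for contact pressure p, sliding distance s, hardness H and dimensionless wear coefficient K. A minimal sketch follows; the numbers are illustrative assumptions, not the paper's data.

      # Local die wear depth from the Archard model (illustrative values only).
      def archard_wear_depth(k, pressure_pa, sliding_m, hardness_pa):
          """Wear depth (m) accumulated over a sliding distance."""
          return k * pressure_pa * sliding_m / hardness_pa

      # e.g. K = 1e-4, p = 500 MPa, s = 2 m of sliding, H = 6 GPa tool steel
      print(archard_wear_depth(1e-4, 500e6, 2.0, 6e9))   # ~1.7e-5 m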

  4. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data-driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such algorithms are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, which communicates with a common global data memory. A new graph-theoretic model, called ATAMM, which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture, is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
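
    The core idea of data-driven execution, in which a node of a decision-free algorithm graph fires as soon as all of its input tokens are available, can be sketched compactly. The following is our simplification for illustration; ATAMM itself defines a richer rule set (token timing, resource binding, diagnostics) than this level-by-level executor.

      # Data-driven execution of a decision-free algorithm graph: fire every
      # node whose inputs are ready, concurrently, until the graph is exhausted.
      from concurrent.futures import ThreadPoolExecutor

      GRAPH = {                      # node -> (function, input nodes)
          "a": (lambda: 2, []),
          "b": (lambda: 3, []),
          "c": (lambda a, b: a + b, ["a", "b"]),
          "d": (lambda c: c * 10, ["c"]),
      }

      def execute(graph, workers=4):
          done, pending = {}, dict(graph)
          with ThreadPoolExecutor(max_workers=workers) as pool:
              while pending:
                  ready = [n for n, (_, deps) in pending.items()
                           if all(d in done for d in deps)]
                  futures = {n: pool.submit(pending[n][0],
                                            *[done[d] for d in pending[n][1]])
                             for n in ready}         # fire all enabled nodes
                  for n in ready:
                      del pending[n]
                      done[n] = futures[n].result()  # collect output tokens
          return done

      print(execute(GRAPH)["d"])     # 50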

  5. Acute physical exercise affected processing efficiency in an auditory attention task more than processing effectiveness.

    PubMed

    Dutke, Stephan; Jaitner, Thomas; Berse, Timo; Barenberg, Jonathan

    2014-02-01

    Research on effects of acute physical exercise on performance in a concurrent cognitive task has generated equivocal evidence. Processing efficiency theory predicts that concurrent physical exercise can increase resource requirements for sustaining cognitive performance even when the level of performance is unaffected. This hypothesis was tested in a dual-task experiment. Sixty young adults worked on a primary auditory attention task and a secondary interval production task while cycling on a bicycle ergometer. Physical load (cycling) and cognitive load of the primary task were manipulated. Neither physical nor cognitive load affected primary task performance, but both factors interacted on secondary task performance. Sustaining primary task performance under increased physical and/or cognitive load increased resource consumption as indicated by decreased secondary task performance. Results demonstrated that physical exercise effects on cognition might be underestimated when only single task performance is the focus.

  6. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both a geospatial raster database management system and a raster data processing platform, from a domain-specific perspective as well as from a computing point of view. It also discusses the need for tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global-scale and high-performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model and supports image compression, data manipulation, general and spatial indices, content- and context-based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, the HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  7. Combined micromechanical and fabrication process optimization for metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, C. C.

    1991-01-01

    A method is presented to minimize the residual matrix stresses in metal matrix composites. Fabrication parameters such as temperature and consolidation pressure are optimized concurrently with the characteristics (i.e., modulus, coefficient of thermal expansion, strength, and interphase thickness) of a fiber-matrix interphase. By including the interphase properties in the fabrication process, lower residual stresses are achievable. Results for an ultra-high modulus graphite (P100)/copper composite show a reduction of 21 percent for the maximum matrix microstress when optimizing the fabrication process alone. Concurrent optimization of the fabrication process and interphase properties show a 41 percent decrease in the maximum microstress. Therefore, this optimization method demonstrates the capability of reducing residual microstresses by altering the temperature and consolidation pressure histories and tailoring the interphase properties for an improved composite material. In addition, the results indicate that the consolidation pressures are the most important fabrication parameters, and the coefficient of thermal expansion is the most critical interphase property.

  8. Flaws in the Flow: The Weakness of Unstructured Business Process Modeling Languages Dealing with Data

    NASA Astrophysics Data System (ADS)

    Combi, Carlo; Gambini, Mauro

    Process-Aware Information Systems (PAISs) need more flexibility for supporting complex and varying human activities. PAISs usually support business process design by means of graphical, graph-oriented business process modeling languages (BPMLs) in conjunction with textual executable specifications. In this paper we discuss the flexibility of such BPMLs, which are the main interface for users who need to change the behavior of PAISs. In particular, we show how common BPML features, which seem good when considered alone, have a negative impact on flexibility when they are combined to provide a complete executable specification. A model has to be understood before being changed, and a change is made only when the benefits outweigh the effort. Two main factors have a great impact on comprehensibility and ease of change: concurrency and modularity. We show why BPMLs usually offer a limited concurrency model and lack modularity; finally, we discuss how to overcome these problems.

  9. Distributed Parallel Processing and Dynamic Load Balancing Techniques for Multidisciplinary High Speed Aircraft Design

    NASA Technical Reports Server (NTRS)

    Krasteva, Denitza T.

    1998-01-01

    Multidisciplinary design optimization (MDO) for large-scale engineering problems poses many challenges (e.g., the design of an efficient concurrent paradigm for global optimization based on disciplinary analyses, expensive computations over vast data sets, etc.). This work focuses on the application of distributed schemes for massively parallel architectures to MDO problems, as a tool for reducing computation time and solving larger problems. The specific problem considered here is configuration optimization of a high speed civil transport (HSCT), and the efficient parallelization of the embedded paradigm for reasonable design space identification. Two distributed dynamic load balancing techniques (random polling and global round robin with message combining) and two necessary termination detection schemes (global task count and token passing) were implemented and evaluated in terms of effectiveness and scalability to large problem sizes and up to a thousand processors. The effect of certain parameters on execution time was also inspected. Empirical results demonstrated stable performance and effectiveness for all schemes, and the parametric study showed that the selected algorithmic parameters have a negligible effect on performance.
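
    Of the two load-balancing techniques studied, random polling is the simpler: an idle worker picks a victim at random and tries to steal work, and termination can be detected with a global task count. The sketch below is our much-simplified shared-memory rendering of that idea; the thesis targets distributed-memory message passing, and all structure here is assumed for illustration.

      # Random-polling dynamic load balancing with global-task-count termination.
      import random, threading
      from collections import deque

      NUM_WORKERS, TASKS = 4, 1000
      queues = [deque() for _ in range(NUM_WORKERS)]
      locks = [threading.Lock() for _ in range(NUM_WORKERS)]
      queues[0].extend(range(TASKS))         # deliberately imbalanced initial load
      done, done_lock = [], threading.Lock()

      def worker(me):
          while True:
              task = None
              with locks[me]:                # take local work first
                  if queues[me]:
                      task = queues[me].popleft()
              if task is None:               # idle: poll a random victim
                  victim = random.randrange(NUM_WORKERS)
                  with locks[victim]:
                      if queues[victim]:
                          task = queues[victim].pop()
              if task is not None:
                  with done_lock:
                      done.append((me, task))   # "execute" the task
              with done_lock:
                  if len(done) == TASKS:     # global task count: all work finished
                      return

      threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM_WORKERS)]
      for t in threads: t.start()
      for t in threads: t.join()
      print(len(done), "tasks done;", len({m for m, _ in done}), "workers participated")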

  10. Skeletal biology: Where matrix meets mineral

    PubMed Central

    Young, Marian F.

    2017-01-01

    The skeleton is unique among the tissues of the body because of its ability to mineralize. The incorporation of mineral into bones and teeth is essential to give them the strength and structure for body support and function. For years, researchers have wondered how mineralized tissues form and repair. A major focus in this context has been the role of the extracellular matrix, which harbors key regulators of the mineralization process. In this introductory minireview, we review some key concepts of matrix biology as it relates to mineralized tissues. Concurrently, we highlight the subjects of this special issue, covering many aspects of mineralized tissues, including bones and teeth and their associated structures, cartilage and tendon. Areas of emphasis are the generation and analysis of new animal models with permutations of matrix components, as well as the development of new approaches to tissue engineering for the repair of damaged hard tissue. In assembling key topics on mineralized tissues written by leaders in our field, we hope the reader will get a broad view of the topic and all of its fascinating complexities. PMID:27131884

  11. IMAGE: A Design Integration Framework Applied to the High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.

    1993-01-01

    Effective design of the High Speed Civil Transport requires the systematic application of design resources throughout a product's life-cycle. Information obtained from the use of these resources is used for the decision-making processes of Concurrent Engineering. Integrated computing environments facilitate the acquisition, organization, and use of required information. State-of-the-art computing technologies provide the basis for the Intelligent Multi-disciplinary Aircraft Generation Environment (IMAGE) described in this paper. IMAGE builds upon existing agent technologies by adding a new component called a model. With the addition of a model, the agent can provide accountable resource utilization in the presence of increasing design fidelity. The development of a zeroth-order agent is used to illustrate agent fundamentals. Using a CATIA(TM)-based agent from previous work, a High Speed Civil Transport visualization system linking CATIA, FLOPS, and ASTROS will be shown. These examples illustrate the important role of the agent technologies used to implement IMAGE, and together they demonstrate that IMAGE can provide an integrated computing environment for the design of the High Speed Civil Transport.

  12. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Weiss, David M.

    1983-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
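
    The chain from change data to an answerable question can be made concrete with a small sketch. The record layout, categories, and sample data below are illustrative assumptions, not the paper's actual forms: each change is captured as a structured record when it is made, and a validation flag distinguishes records confirmed by interview, so the fraction of erroneous raw records can be measured.

```python
# An illustrative sketch of goal-directed change-data collection; the fields,
# categories, and sample records are invented, not the methodology's forms.
from dataclasses import dataclass

@dataclass
class ChangeRecord:
    change_id: int
    category: str          # e.g., "error correction", "planned enhancement"
    effort_hours: float
    validated: bool        # confirmed in an interview with the data supplier
    corrected: bool        # record had to be corrected during validation

records = [
    ChangeRecord(1, "error correction", 2.0, True, False),
    ChangeRecord(2, "planned enhancement", 5.5, True, True),
    ChangeRecord(3, "error correction", 1.0, True, True),
]

# Question of interest: what fraction of raw records were erroneous?
error_rate = sum(r.corrected for r in records if r.validated) / len(records)
print(f"{error_rate:.0%} of records needed correction")   # 67% here
```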

  13. Recurrent Neural Network for Computing the Drazin Inverse.

    PubMed

    Stanimirović, Predrag S; Zivković, Ivan S; Wei, Yimin

    2015-11-01

    This paper presents a recurrent neural network (RNN) for computing the Drazin inverse of a real matrix in real time. The network is composed of n independent parts (subnetworks), where n is the order of the input matrix. These subnetworks can operate concurrently, so parallel and distributed processing can be achieved. In this way, computational advantages over existing sequential algorithms can be attained in real-time applications. The RNN defined in this paper is well suited to implementation in an electronic circuit. The number of neurons in the network equals the number of elements in the output matrix, which represents the Drazin inverse. The difference between the proposed RNN and existing ones for Drazin inverse computation lies in their network architecture and dynamics. The conditions that ensure the stability of the defined RNN, as well as its convergence toward the Drazin inverse, are considered. In addition, illustrative examples and applications to practical engineering problems are discussed to show the efficacy of the proposed neural network.
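
    The general idea behind such networks can be sketched compactly. The code below is not the paper's Drazin-inverse dynamics; as a simpler, verifiable stand-in it integrates the classical gradient neural network for the ordinary inverse of a nonsingular matrix, dV/dt = -γ Aᵀ(AV - I). The gain γ, step size, and iteration count are illustrative. Each column of V obeys its own decoupled ODE, loosely mirroring the n concurrent subnetworks described in the abstract.

```python
# A minimal sketch of the gradient-neural-network idea, applied (for easy
# verification) to the ordinary inverse rather than the Drazin inverse
# treated in the paper. Forward-Euler integration of
#     dV/dt = -gamma * A.T @ (A @ V - I)
# drives V toward inv(A); each column of V is an independent subnetwork.
import numpy as np

def gnn_inverse(A, gamma=10.0, dt=1e-3, steps=20000):
    n = A.shape[0]
    V = np.zeros((n, n))                         # initial network state
    I = np.eye(n)
    for _ in range(steps):
        V += dt * (-gamma) * A.T @ (A @ V - I)   # Euler step of the dynamics
    return V

A = np.array([[4.0, 1.0], [2.0, 3.0]])
V = gnn_inverse(A)
print(np.allclose(V, np.linalg.inv(A), atol=1e-4))   # True
```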

  14. Capturing flight system test engineering expertise: Lessons learned

    NASA Technical Reports Server (NTRS)

    Woerner, Irene Wong

    1991-01-01

    Within a few years, JPL will be challenged by the most active mission set in its history. Concurrently, flight systems are becoming increasingly complex. Presently, the knowledge needed to conduct integration and test of spacecraft and large instruments is held by a few key people, each with many years of experience. JPL is in danger of losing a significant amount of this critical expertise, through retirement, during a period when demand for it is rapidly increasing. The most critical issue at hand is to collect and retain this expertise and to develop tools that would ensure the ability to successfully perform the integration and test of future spacecraft and large instruments. The proposed solution was to capture and codify a subset of existing knowledge, and to utilize this captured expertise in knowledge-based systems. First-year results and activities planned for the second year of this ongoing effort are described. Topics discussed include lessons learned in knowledge acquisition and elicitation techniques, life-cycle paradigms, and rapid prototyping of a knowledge-based advisor (Spacecraft Test Assistant) and a hypermedia browser (Test Engineering Browser). The prototype Spacecraft Test Assistant supports a subset of integration and test activities for flight systems. The Test Engineering Browser is a hypermedia tool that allows users to easily peruse spacecraft test topics. A knowledge acquisition tool called ConceptFinder, developed to search through large volumes of data for related concepts, is also described; it has been modified to semi-automate the process of creating hypertext links.
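
    One way such a concept-linking pass might work is sketched below. This is speculative: the abstract does not describe ConceptFinder's actual method, and the snippets, scoring rule, and rarity threshold here are invented purely to illustrate proposing hypertext links from shared distinctive terms.

```python
# A speculative sketch of a ConceptFinder-style pass: propose hypertext links
# between test-procedure snippets that share distinctive terms. The snippets
# and the rarity threshold are invented; the tool's real method is unknown.
from collections import Counter
from itertools import combinations

docs = {
    "power-on": "verify bus voltage telemetry before power on",
    "thermal": "monitor bus current and heater telemetry during thermal soak",
    "comm": "command uplink check downlink telemetry verification",
}

# Document frequency of each term; ubiquitous terms (e.g., "telemetry")
# are treated as too generic to justify a link.
freq = Counter(w for text in docs.values() for w in set(text.split()))

def shared_rare_terms(a, b):
    common = set(docs[a].split()) & set(docs[b].split())
    return [w for w in common if freq[w] <= 2]

for a, b in combinations(docs, 2):
    terms = shared_rare_terms(a, b)
    if terms:
        print(f"proposed link: {a} <-> {b} via {terms}")
```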

  15. The James Webb Telescope Instrument Suite Layout: Optical System Engineering Considerations for a Large, Deployable Space Telescope

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Davila, Pam; Jurotich, Matthew; Hobbs, Gurnie; Lightsey, Paul; Contreras, Jim; Whitman, Tony

    2003-01-01

    The James Webb Space Telescope (JWST) is a space-based, infrared observatory designed to study the early stages of galaxy formation in the Universe. The telescope will be launched into an elliptical orbit about the second Lagrange point and passively cooled to 30-50 K to enable astronomical observations from 0.6 to 28 microns. A group from the NASA Goddard Space Flight Center and the Northrop Grumman Space Technology prime contractor team has developed an optical and mechanical layout for the science instruments within the JWST field of view that satisfies the telescope's high-level performance requirements. Four instruments required accommodation within the telescope's field of view: a Near-Infrared Camera (NIRCam) provided by the University of Arizona; a Near-Infrared Spectrograph (NIRSpec) provided by the European Space Agency; a Mid-Infrared Instrument (MIRI) provided by the Jet Propulsion Laboratory and a European consortium; and a Fine Guidance Sensor (FGS) with a tunable filter module provided by the Canadian Space Agency. The size and position of each instrument's field of view allocation were developed through an iterative, concurrent engineering process involving the key observatory stakeholders. While some of the system design considerations were those typically encountered during the development of an infrared observatory, others were unique to the deployable and controllable nature of JWST. This paper describes the optical and mechanical issues considered during the field of view layout development, as well as the supporting modeling and analysis activities.

  16. Stress, Allostatic Load, Catecholamines, and Other Neurotransmitters in Neurodegenerative Diseases

    PubMed Central

    2016-01-01

    As populations age, the prevalence of geriatric neurodegenerative diseases will increase. These diseases generally are multifactorial, arising from complex interactions among genes, environment, concurrent morbidities, treatments, and time. This essay provides a concept for the pathogenesis of Lewy body diseases such as Parkinson disease, by considering them in the context of allostasis and allostatic load. Allostasis reflects active, adaptive processes that maintain apparent steady states, via multiple, interacting effectors regulated by homeostatic comparators—“homeostats.” Stress can be defined as a condition or state in which a sensed discrepancy between afferent information and a setpoint for response leads to activation of effectors, reducing the discrepancy. “Allostatic load” refers to the consequences of sustained or repeated activation of mediators of allostasis. From the analogy of an idling car, the revolutions per minute of the engine can be maintained at any of a variety of levels (allostatic states). Just as allostatic load (cumulative wear and tear) reflects design and manufacturing variations, byproducts of combustion, and time, eventually leading to engine breakdown, allostatic load in catecholaminergic neurons might eventually lead to Lewy body diseases. Central to the argument is that catecholaminergic neurons leak vesicular contents into the cytoplasm continuously during life and that catecholamines in the neuronal cytoplasm are autotoxic. These neurons therefore depend on vesicular sequestration to limit autotoxicity of cytosolic transmitter. Parkinson disease might be a disease of the elderly because of allostatic load, which depends on genetic predispositions, environmental exposures, repeated stress-related catecholamine release, and time. PMID:22297542

  17. How Static is the Statics Classroom? An investigation into how innovations, specifically Research-Based Instructional Strategies, are adopted into the Statics Classroom

    NASA Astrophysics Data System (ADS)

    Cutler, Stephanie Leigh

    The purpose of this dissertation is to investigate how educational research, specifically Research-Based Instructional Strategies (RBIS), is adopted by education practice, specifically within the engineering Statics classroom. Using a systematic approach, changes in classroom teaching practices were investigated from the instructors' perspective. Both researchers and practitioners are included in the process, combining efforts to improve student learning, which is a critical goal for engineering education. The study is divided into three stages, each discussed in an individual manuscript. Manuscript 1 provides an assessment of current teaching practices; Manuscript 2 explores RBIS use by Statics instructors and perceived barriers to adoption; and Manuscript 3 evaluates adoption using Fidelity of Implementation. A common set of concurrent mixed methods was used for each stage of this study. A quantitative national survey of Statics instructors (n = 166) and 18 qualitative interviews were conducted to examine activities used in the Statics classroom and familiarity with nine RBIS. The results of this study show that lecturing is the most common activity in Statics classrooms, but it is not the only one. Other common activities included working examples and having students work on problems individually and in groups. As discussed by the interview participants, each of Rogers' characteristics influenced adoption for different reasons; for example, Complexity (the level of difficulty of implementing an RBIS) was most commonly identified as a barrier. This study also evaluated the Fidelity of Implementation for each RBIS and found it to be higher for RBIS that were less complex (in terms of the number of critical components). Many of the critical components (i.e., activities required for implementation, as described in the literature) were found to statistically distinguish RBIS users from non-users. This dissertation offers four contributions: (1) an understanding of current practices in Statics; (2) the instructor perspective on the barriers to using RBIS in the classroom; (3) the use of Fidelity of Implementation as a unique evaluation of RBIS adoption, which can be used by future engineering education researchers; and (4) a systematic approach to exploring change in the classroom, which offers new perspectives and approaches to accelerate the adoption process.
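
    The Fidelity of Implementation construct lends itself to a simple quantitative sketch: score each RBIS by the fraction of its critical components an instructor reports using. The component lists and survey responses below are invented for illustration; the dissertation's actual instruments are not reproduced here.

```python
# A hypothetical sketch of a Fidelity of Implementation score: the fraction
# of an RBIS's critical components an instructor reports using. Component
# lists and responses are invented, not taken from the dissertation.
CRITICAL_COMPONENTS = {
    "peer instruction": ["concept questions", "peer discussion", "revote"],
    "think-pair-share": ["individual think time", "pairing", "report out"],
}

reported_use = {
    "peer instruction": {"concept questions", "peer discussion"},
    "think-pair-share": {"individual think time", "pairing", "report out"},
}

for rbis, components in CRITICAL_COMPONENTS.items():
    used = reported_use[rbis]
    fidelity = sum(c in used for c in components) / len(components)
    print(f"{rbis}: fidelity = {fidelity:.2f}")
```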

  18. Dynamic MRI to quantify musculoskeletal motion: A systematic review of concurrent validity and reliability, and perspectives for evaluation of musculoskeletal disorders

    PubMed Central

    Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain

    2017-01-01

    Purpose: To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. Materials and methods: The search was conducted on articles published in Web of Science, PubMed, Scopus, Academic Search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Results: Twenty articles fulfilled the inclusion criteria, with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven of the eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques for evaluating joint motion, and spin tagging for muscle motion. Conclusion: Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however, results should be interpreted with caution, since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions. PMID:29232401

  19. Cooperative optimization of reconfigurable machine tool configurations and production process plan

    NASA Astrophysics Data System (ADS)

    Xie, Nan; Li, Aiping; Xue, Wei

    2012-09-01

    The production process plan design and the configurations of a reconfigurable machine tool (RMT) interact with each other: reasonable process plans with suitable RMT configurations help to improve product quality and reduce production cost. A cooperative strategy is therefore needed to solve both problems concurrently. In this paper, a cooperative optimization model for RMT configurations and the production process plan is presented, with objectives that account for the impacts of both the process and the configuration. A novel genetic algorithm is developed to provide optimal or near-optimal solutions. First, its chromosome is redesigned to comprise three parts: the operations, the process plan, and the RMT configurations. Second, new selection, crossover, and mutation operators are developed to respect the process constraints imposed by the operation processes (OP) graph, which standard operators would otherwise violate, generating illegal solutions. The optimal RMT configurations under the optimal process plan design can then be obtained. Finally, the method is applied to a manufacturing line composed of three RMTs. In this case the optimal process plan and RMT configurations are obtained concurrently, production cost decreases by 6.28%, and nonmonetary performance increases by 22%. The proposed method determines both the RMT configurations and the production process plan, improving production capacity, functionality, and equipment utilization for RMTs.
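
    A toy version of such a constraint-aware genetic algorithm is sketched below. It is a minimal illustration, not the authors' algorithm: the OP graph, cost model, machine and configuration counts, and operators are all invented, and a precedence-repair step stands in for the paper's constraint-handling operators (crossover is omitted for brevity).

```python
# A minimal, self-contained sketch of a constraint-aware GA with a three-part
# chromosome (operation sequence, machine assignment, RMT configurations).
# All data and the cost model are invented for illustration.
import random

OPS = range(6)
PRECEDENCE = {(0, 2), (1, 2), (2, 4), (3, 5)}     # (a, b): a must precede b
N_MACHINES, N_CONFIGS = 3, 4

def repair(seq):
    """Reorder a candidate operation sequence so it respects the OP graph."""
    placed, legal, pending = set(), [], list(seq)
    while pending:
        for op in pending:
            if all(a in placed for (a, b) in PRECEDENCE if b == op):
                legal.append(op); placed.add(op); pending.remove(op)
                break
    return legal

def random_individual():
    seq = repair(random.sample(list(OPS), len(OPS)))
    asg = [random.randrange(N_MACHINES) for _ in OPS]      # op -> machine
    cfg = [random.randrange(N_CONFIGS) for _ in range(N_MACHINES)]
    return (seq, asg, cfg)

def cost(ind):
    seq, asg, cfg = ind
    # Invented cost: position-weighted processing plus a configuration penalty.
    proc = sum((pos + 1) * (1 + (op * 7 + cfg[asg[op]] * 3) % 5)
               for pos, op in enumerate(seq))
    return proc + sum(cfg) * 2

def mutate(ind):
    seq, asg, cfg = list(ind[0]), list(ind[1]), list(ind[2])
    i, j = random.sample(range(len(seq)), 2)
    seq[i], seq[j] = seq[j], seq[i]          # swap, then repair legality
    asg[random.randrange(len(asg))] = random.randrange(N_MACHINES)
    cfg[random.randrange(len(cfg))] = random.randrange(N_CONFIGS)
    # (A precedence-preserving crossover would be added similarly.)
    return (repair(seq), asg, cfg)

def tournament(pop):
    return min(random.sample(pop, 3), key=cost)

pop = [random_individual() for _ in range(40)]
for _ in range(200):
    pop = [mutate(tournament(pop)) for _ in range(len(pop))]
best = min(pop, key=cost)
print("best cost:", cost(best), "plan:", best)
```

    The repair step is the essential ingredient: by re-legalizing every offspring against the OP graph, the operators can never emit a sequence that violates precedence, which is one simple way to realize the constraint handling the abstract describes.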
