Probabilistic simulation of concurrent engineering of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results from initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including the uncertainties inherent in various facets of engineering processes. An approach is outlined for computationally formalizing the concurrent engineering process from cradle to grave via discipline-dedicated workstations linked through a common database.
Computational simulation of concurrent engineering for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1992-01-01
Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties (fundamental to developing such methods), is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.
Computational simulation for concurrent engineering of aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties (fundamental to developing such methods), is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.
Computational simulation for concurrent engineering of aerospace propulsion systems
NASA Astrophysics Data System (ADS)
Chamis, C. C.; Singhal, S. N.
1993-02-01
Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties (fundamental to developing such methods), is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.
Risk Identification and Visualization in a Concurrent Engineering Team Environment
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Chattopadhyay, Debarati; Shishko, Robert
2010-01-01
Incorporating risk assessment into the dynamic environment of a concurrent engineering team requires rapid response and adaptation. Generating consistent risk lists with inputs from all the relevant subsystems, and presenting the results clearly to stakeholders, is difficult in a concurrent engineering environment because of the speed with which decisions are made. In this paper we describe the various approaches and techniques that have been explored for the point designs of JPL's Team X and the Trade Space Studies of the Rapid Mission Architecture Team. The paper also focuses on issues with the misuse of categorical and ordinal data that keep arising within current engineering risk approaches and in the applied risk literature.
A Software Tool for Integrated Optical Design Analysis
NASA Technical Reports Server (NTRS)
Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)
2001-01-01
Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural, and optical characteristics of the hardware must be accurately understood in order to design a system capable of meeting the performance requirements. The interactions between the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.
NASA Technical Reports Server (NTRS)
Gavert, Raymond B.
1990-01-01
Some experiences of NASA configuration management in providing concurrent engineering support to the Space Station Freedom program for the achievement of life cycle benefits and total quality are discussed. Three change decision experiences involving tracing requirements and automated information systems of the electrical power system are described. The potential benefits of concurrent engineering and total quality management include improved operational effectiveness, reduced logistics and support requirements, prevention of schedule slippages, and life cycle cost savings. It is shown how configuration management can influence the benefits attained through disciplined approaches and innovations that compel consideration of all the technical elements of engineering and quality factors that apply to program development, the transition to operations, and operations themselves. Configuration management experiences involving the Space Station program's tiered management structure, the work package contractors, international partners, and the participating NASA centers are discussed.
The MEOW lunar project for education and science based on concurrent engineering approach
NASA Astrophysics Data System (ADS)
Roibás-Millán, E.; Sorribes-Palmer, F.; Chimeno-Manguán, M.
2018-07-01
The use of concurrent engineering in the design of space missions allows the high level of coupling and iteration among mission subsystems to be accounted for, in an interrelated methodology, during the preliminary conceptual phase. This work presents the result of applying concurrent engineering over a short time lapse to design the main elements of the preliminary design for a lunar exploration mission, developed within the ESA Academy Concurrent Engineering Challenge 2017. During this program, students of the Master in Space Systems at the Technical University of Madrid designed a low-cost satellite to find water at the Moon's south pole as a prospect for a future human lunar base. The resulting mission, the Moon Explorer And Observer of Water/Ice (MEOW), comprises a 262 kg spacecraft to be launched into a Geostationary Transfer Orbit as a secondary payload in the 2023/2025 time frame. A three-month Weak Stability Boundary transfer via the Sun-Earth L1 Lagrange point allows for high launch-timeframe flexibility. The different aspects of the mission (orbit analysis, spacecraft design, and payload) and the possibilities of concurrent engineering are described.
The Application of Concurrent Engineering Tools and Design Structure Matrix in Designing Tire
NASA Astrophysics Data System (ADS)
Ginting, Rosnani; Fachrozi Fitra Ramadhan, T.
2016-02-01
The automobile industry in Indonesia is growing rapidly. This growth requires companies in related industries, such as tire manufacturing, to develop products based on customers' needs while delivering those products to the customer on time. This can be achieved by applying strategic planning to develop an integrated concept of product development. This research was conducted at PT. XYZ, which applied a sequential approach to designing and developing products; under that approach, the need for improvement at one stage of product development can trigger redesign that lengthens the development of a new product. The research is intended to obtain an integrated tire design concept that addresses customer needs, using concurrent engineering tools and implementing two phases of product development. The implementation of the concurrent engineering approach results in stages of project planning, conceptual design, and product modules. The product architecture consists of four modules that use a Design Structure Matrix to ease the design process of new product development.
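The Design Structure Matrix bookkeeping described above is straightforward to sketch. The snippet below is a minimal illustration with invented module names and dependencies (not taken from the study): it flags mutually dependent (coupled) modules, which must be designed iteratively or concurrently, and sequences the modules whose inputs are already available.

```python
# Hypothetical 4-module tire DSM (names and dependencies are
# illustrative only). dsm[i][j] = 1 means module i needs
# information from module j.
modules = ["tread", "carcass", "bead", "sidewall"]
dsm = [
    [0, 1, 0, 0],   # tread depends on carcass
    [0, 0, 1, 1],   # carcass depends on bead and sidewall
    [0, 0, 0, 0],   # bead depends on nothing
    [0, 1, 0, 0],   # sidewall depends on carcass -> coupled pair
]

# Coupled pairs: modules with a mutual information dependency.
coupled = [(modules[i], modules[j])
           for i in range(len(dsm)) for j in range(i + 1, len(dsm))
           if dsm[i][j] and dsm[j][i]]

# Simple sequencing: repeatedly schedule any module whose remaining
# dependencies are satisfied. Coupled modules block each other, so
# they (and anything downstream of them) are left for concurrent work.
order, done = [], set()
changed = True
while changed:
    changed = False
    for i, name in enumerate(modules):
        if name in done:
            continue
        if all(dsm[i][j] == 0 or modules[j] in done for j in range(len(dsm))):
            order.append(name)
            done.add(name)
            changed = True
remaining = [m for m in modules if m not in done]
```

Here `coupled` comes out as `[("carcass", "sidewall")]`, only `bead` can be scheduled up front, and the coupled block plus its downstream module remain for concurrent iteration.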
Model-Based Systems Engineering in Concurrent Engineering Centers
NASA Technical Reports Server (NTRS)
Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman
2015-01-01
Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a focused design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.
NASA Technical Reports Server (NTRS)
Chattopadhyay, Debarati; Hihn, Jairus; Warfield, Keith
2011-01-01
As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades in a cost-efficient manner. To successfully accomplish these complex missions with limited funding, it is also essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies by increased cooperation between organizations. As a result, the existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. This paper is an extension of a recent white paper written by the Concurrent Engineering Working Group, which details the unique challenges of distributed collaborative concurrent engineering. This paper includes a short history of aerospace concurrent engineering, and defines the terms 'concurrent', 'collaborative' and 'distributed' in the context of aerospace concurrent engineering. In addition, a model for the levels of complexity of concurrent engineering teams is presented to provide a way to conceptualize information and data flow within these types of teams.
Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.
Fong, Stephen S
2014-08-01
Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
NASA Astrophysics Data System (ADS)
Luo, X. W.; Xu, P.; Sun, C. W.; Jin, H.; Hou, R. J.; Leng, H. Y.; Zhu, S. N.
2017-06-01
Concurrent spontaneous parametric down-conversion (SPDC) processes have proved to be an appealing approach for engineering the path-entangled photonic state with designable and tunable spatial modes. In this work, we propose a general scheme to construct high-dimensional path entanglement and demonstrate the basic properties of concurrent SPDC processes from domain-engineered quadratic nonlinear photonic crystals, including the spatial modes and the photon flux, as well as the anisotropy of spatial correlation under noncollinear quasi-phase-matching geometry. The overall understanding about the performance of concurrent SPDC processes will give valuable references to the construction of compact path entanglement and the development of new types of photonic quantum technologies.
The Concurrent Engineering Design Paradigm Is Now Fully Functional for Graphics Education
ERIC Educational Resources Information Center
Krueger, Thomas J.; Barr, Ronald E.
2007-01-01
Engineering design graphics education has come a long way in the past two decades. The emergence of solid geometric modeling technology has become the focal point for the graphical development of engineering design ideas. The main attraction of this 3-D modeling approach is the downstream application of the data base to analysis and…
Innovative Approaches to Fuel-Air Mixing and Combustion in Airbreathing Hypersonic Engines
NASA Astrophysics Data System (ADS)
MacLeod, C.
This paper describes some innovative methods for achieving enhanced fuel-air mixing and combustion in Scramjet-like spaceplane engines. A multimodal approach to the problem is discussed; this involves using several concurrent methods of forced mixing. The paper concentrates on Electromagnetic Activation (EMA) and Electrostatic Attraction as suitable techniques for this purpose - although several other potential methods are also discussed. Previously published empirical data is used to draw conclusions about the likely effectiveness of the system and possible engine topologies are outlined.
Next-generation concurrent engineering: developing models to complement point designs
NASA Technical Reports Server (NTRS)
Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian
2006-01-01
Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a 'next-generation CED': in addition to a point design, the Team develops a model of the local trade space. The process is a balance between the power of model-development tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefit of this approach.
Concurrently adjusting interrelated control parameters to achieve optimal engine performance
Jiang, Li; Lee, Donghoon; Yilmaz, Hakan; Stefanopoulou, Anna
2015-12-01
Methods and systems for real-time engine control optimization are provided. A value of an engine performance variable is determined, a value of a first operating condition and a value of a second operating condition of a vehicle engine are detected, and initial values for a first engine control parameter and a second engine control parameter are determined based on the detected first operating condition and the detected second operating condition. The initial values for the first engine control parameter and the second engine control parameter are adjusted based on the determined value of the engine performance variable to cause the engine performance variable to approach a target engine performance variable. In order to cause the engine performance variable to approach the target engine performance variable, adjusting the initial value for the first engine control parameter necessitates a corresponding adjustment of the initial value for the second engine control parameter.
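As a rough illustration of the coordinated-adjustment idea in this abstract, here is a minimal sketch in which two interrelated parameters are initialized from a stand-in calibration lookup and then nudged concurrently so a performance variable climbs toward its target. The parameter names, the lookup, and the performance surface are all invented for this example; the patent does not specify them.

```python
def performance(p1, p2):
    # Toy performance surface with an interior optimum; the cross term
    # (p2 - 0.1 * p1) makes the two parameters interrelated, so moving
    # p1 shifts the best value of p2.
    return 100.0 - (p1 - 20.0) ** 2 - (p2 - 0.1 * p1) ** 2

def initial_values(speed_rpm, load_frac):
    # Stand-in for a calibration-table lookup on operating conditions.
    return 15.0 + load_frac * 5.0, 1.5

target = 99.5
p1, p2 = initial_values(2500, 0.6)
step, eps = 0.05, 1e-4
for _ in range(2000):
    if performance(p1, p2) >= target:
        break
    # Finite-difference gradient ascent: because the parameters are
    # coupled, both are adjusted on every iteration.
    g1 = (performance(p1 + eps, p2) - performance(p1, p2)) / eps
    g2 = (performance(p1, p2 + eps) - performance(p1, p2)) / eps
    p1 += step * g1
    p2 += step * g2
```

After the loop, the pair (p1, p2) has moved together until the toy performance variable reaches its target, mirroring the claim that adjusting one parameter necessitates a corresponding adjustment of the other.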
Concurrent Software Engineering Project
ERIC Educational Resources Information Center
Stankovic, Nenad; Tillo, Tammam
2009-01-01
Concurrent engineering or overlapping activities is a business strategy for schedule compression on large development projects. Design parameters and tasks from every aspect of a product's development process and their interdependencies are overlapped and worked on in parallel. Concurrent engineering suffers from negative effects such as excessive…
NASA Technical Reports Server (NTRS)
Fielhauer, Karl B.; Boone, Bradley G.; Raible, Daniel E.
2012-01-01
This paper describes a system engineering approach to examining the potential for combining elements of a deep-space RF and optical communications payload, for the purpose of reducing the size, weight and power burden on the spacecraft and the mission. Figures of merit and analytical methodologies are discussed to conduct trade studies, and several potential technology integration strategies are presented. Finally, the NASA Integrated Radio and Optical Communications (iROC) project is described, which directly addresses the combined RF and optical approach.
Identification and Classification of Common Risks in Space Science Missions
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Chattopadhyay, Debarati; Hanna, Robert A.; Port, Daniel; Eggleston, Sabrina
2010-01-01
Due to the highly constrained schedules and budgets that NASA missions must contend with, the identification and management of cost, schedule, and risk in the earliest stages of the lifecycle is critical. At the Jet Propulsion Laboratory (JPL) it is the concurrent engineering teams that first address these items in a systematic manner. Foremost of these concurrent engineering teams is Team X. Started in 1995, Team X has carried out over 1000 studies, dramatically reducing the time and cost involved, and has been the model for other concurrent engineering teams both within NASA and throughout the larger aerospace community. The ability to do integrated risk identification and assessment was first introduced into Team X in 2001. Since that time the mission risks identified in each study have been kept in a database. In this paper we describe how the Team X risk process is evolving, highlighting the strengths and weaknesses of the different approaches. The paper especially focuses on the identification and classification of common risks that have arisen during Team X studies of space-based science missions.
Aerospace Concurrent Engineering Design Teams: Current State, Next Steps and a Vision for the Future
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Chattopadhyay, Debarati; Karpati, Gabriel; McGuire, Melissa; Borden, Chester; Panek, John; Warfield, Keith
2011-01-01
Over the past sixteen years, government aerospace agencies and aerospace industry have developed and evolved operational concurrent design teams to create novel spaceflight mission concepts and designs. These capabilities and teams, however, have evolved largely independently. In today's environment of increasingly complex missions with limited budgets it is becoming readily apparent that both implementing organizations and today's concurrent engineering teams will need to interact more often than they have in the past. This will require significant changes in the current state of practice. This paper documents the findings from a concurrent engineering workshop held in August 2010 to identify the key near term improvement areas for concurrent engineering capabilities and challenges to the long-term advancement of concurrent engineering practice. The paper concludes with a discussion of a proposed vision for the evolution of these teams over the next decade.
A Primer for DoD Reliability, Maintainability and Safety Standards
1988-03-02
the project engineer and the concurrence of their respective managers. The primary consideration in such cases is the thoroughness of the ...basic approaches to the application of environmental stress screening. In one approach, the government explicitly specifies the screens and screening...TO USE DOD-HDBK-344 (USAF) There are two basic approaches to the application of environmental stress
Early Formulation Model-centric Engineering on NASA's Europa Mission Concept Study
NASA Technical Reports Server (NTRS)
Bayer, Todd; Chung, Seung; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Chris; Gontijo, Ivair; Lewis, Kari; Moshir, Mehrdad; Rasmussen, Robert;
2012-01-01
The proposed Jupiter Europa Orbiter and Jupiter Ganymede Orbiter missions were formulated using current state-of-the-art MBSE facilities: JPL's Team X and Rapid Mission Architecting, ESA's Concurrent Design Facility, and APL's ACE Concurrent Engineering Facility. When JEO became an official "pre-project" in September 2010, we had already developed a strong partnership with JPL's Integrated Model Centric Engineering (IMCE) initiative, decided to apply architecting and SysML-based MBSE from the beginning, and begun laying these foundations to support work in Phase A. Release of the Planetary Science Decadal Survey and the FY12 President's Budget in March 2011 changed the landscape: JEO reverted to being a pre-Phase A study. A conscious choice was made to continue the application of MBSE on the Europa Study, refocused for early formulation. This presentation describes the approach, results, and lessons learned.
Product design for energy reduction in concurrent engineering: An Inverted Pyramid Approach
NASA Astrophysics Data System (ADS)
Alkadi, Nasr M.
Energy factors in product design in concurrent engineering (CE) are becoming an emerging dimension for several reasons: (a) the rising interest in "green design and manufacturing", (b) national energy security concerns and the dramatic increase in energy prices, (c) global competition in the marketplace and global climate change commitments, including carbon taxes and emission trading systems, and (d) the widespread recognition of the need for sustainable development. This research presents a methodology for introducing energy factors into the concurrent engineering product development process to significantly reduce the manufacturing energy requirement. The work presented here is the first attempt at integrating design for energy into a concurrent engineering framework. It adds an important tool to the DFX toolbox for evaluating the impact of design decisions on the product manufacturing energy requirement early during the design phase. The research hypothesis states that "Product Manufacturing Energy Requirement is a Function of Design Parameters". The hypothesis was tested by conducting experimental work in machining and heat treating, which took place at the manufacturing lab of the Industrial and Management Systems Engineering Department (IMSE) at West Virginia University (WVU) and at a major U.S. steel manufacturing plant, respectively. The objective of the machining experiment was to study the effect of changing specific product design parameters (material type and diameter) and process design parameters (metal removal rate) on a gear-head lathe's input power requirement through defined sets of machining experiments. The objective of the heat treating experiment was to study the effect of varying the product charging temperature on the fuel consumption of a walking-beam reheat furnace.
The experimental work in both directions has revealed important insights into energy utilization in machining and heat-treating processes and its variance with product, process, and system design parameters. An in-depth evaluation of how design and manufacturing normally happen in concurrent engineering provided a framework for developing energy system levels in machining within the concurrent engineering environment, using the "Inverted Pyramid Approach" (IPA). The IPA features varying levels of energy-based output information depending on the input design parameters that are available during each stage (level) of the product design. The experimental work, the in-depth evaluation of design and manufacturing in CE, and the developed energy system levels in machining provided a solid base for developing the model for design for energy reduction in CE. The model was used to analyze an example part in which 12 evolving designs were thoroughly reviewed to investigate the sensitivity of energy to design parameters in machining. The model allows product design teams to address manufacturing energy concerns early during the design stage. As a result, ranges for energy-sensitive design parameters impacting product manufacturing energy consumption are found at earlier levels; as the designer proceeds to deeper levels in the model, this range tightens, resulting in significant energy reductions.
Processing multilevel secure test and evaluation information
NASA Astrophysics Data System (ADS)
Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa
1994-07-01
The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Chattopadhyay, Debarati; Karpati, Gabriel; McGuire, Melissa; Panek, John; Warfield, Keith; Borden, Chester
2011-01-01
As aerospace missions grow larger and more technically complex in the face of ever tighter budgets, it will become increasingly important to use concurrent engineering methods in the development of early conceptual designs because of their ability to facilitate rapid assessments and trades of performance, cost, and schedule. To successfully accomplish these complex missions with limited funding, it is essential to effectively leverage the strengths of individuals and teams across government, industry, academia, and international agencies through increased cooperation between organizations. As a result, the existing concurrent engineering teams will need to increasingly engage in distributed collaborative concurrent design. The purpose of this white paper is to identify a near-term vision for the future of distributed collaborative concurrent engineering design for aerospace missions, as well as to discuss the challenges to achieving that vision. The white paper also documents the advantages of creating a working group to investigate how to engage the expertise of different teams in joint design sessions while enabling each organization to maintain its competitive advantage.
Interdisciplinary and multilevel optimum design. [in aerospace structural engineering
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.
1987-01-01
Interactions among engineering disciplines and subsystems in engineering system design are surveyed and specific instances of such interactions are described. Examination of these interactions shows that a traditional design process, in which the numerical values of major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced here, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, enabling the system design to be modified so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.
Fuzzy simulation in concurrent engineering
NASA Technical Reports Server (NTRS)
Kraslawski, A.; Nystrom, L.
1992-01-01
Concurrent engineering is becoming a very important practice in manufacturing. A problem in concurrent engineering is the uncertainty associated with the values of the input variables and operating conditions. The problem discussed in this paper concerns the simulation of processes in which the raw materials and the operational parameters possess fuzzy characteristics. The processing of fuzzy input information is performed by the vertex method and the commercial simulation packages POLYMATH and GEMS. Examples are presented to illustrate the usefulness of the method in the simulation of chemical engineering processes.
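The vertex method referenced above is easy to sketch: at each alpha-cut the fuzzy inputs reduce to intervals, the model is evaluated at every combination of interval endpoints, and the extremes of those evaluations bound the output interval (exact when the model is monotonic in each input over the cut). The outlet-temperature model below is invented purely for illustration.

```python
from itertools import product

def vertex_method(f, intervals):
    """Propagate interval (alpha-cut) uncertainty through f by
    evaluating f at every combination of interval endpoints and
    returning (min, max) of the results. Valid when f is monotonic
    in each variable over the given intervals."""
    values = [f(*v) for v in product(*intervals)]
    return min(values), max(values)

# Hypothetical process model: outlet temperature as a linear function
# of fuzzy feed temperature [K] and fuzzy heat duty (illustrative).
def outlet_temp(feed_temp, duty):
    return feed_temp + 0.8 * duty

lo, hi = vertex_method(outlet_temp, [(300.0, 310.0), (40.0, 60.0)])
# With this monotonic model the bounds come from the corner points:
# lo uses (300, 40), hi uses (310, 60).
```

For two fuzzy inputs the method evaluates the model at four vertices; for n inputs it needs 2**n evaluations per alpha-cut, which is why it suits the small input sets typical of early process simulation.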
Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling
NASA Technical Reports Server (NTRS)
Schuman, Todd; DeWeck, Oliver L.; Sobieski, Jaroslaw
2005-01-01
The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
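The core loop of surrogate-assisted system optimization can be sketched as follows. This is an illustrative toy, not the paper's ISLOCE implementation: a cheap analytic stand-in replaces the neural-network subsystem approximation, the system-level search is a coarse grid, and all models and numbers are hypothetical.

```python
# "Expensive" subsystem model (stand-in for a full tank structural analysis).
def tank_mass(thickness):
    return 50.0 + 400.0 * thickness ** 2

# Cheap approximation of the same model (stand-in for the trained surrogate).
def surrogate_mass(thickness):
    return 50.0 + 390.0 * thickness ** 2 + 2.0 * thickness

# System-level constraint: minimum wall thickness for the stress case.
def stress_ok(thickness):
    return thickness >= 0.02

# The system-level search touches only the surrogate ...
candidates = [i / 1000.0 for i in range(1, 101)]          # 0.001 .. 0.100 m
feasible = [t for t in candidates if stress_ok(t)]
t_star = min(feasible, key=surrogate_mass)

# ... and the expensive model is called once to verify the candidate optimum.
verified_mass = tank_mass(t_star)
```

The design intent is the same as in ISLOCE: the optimizer iterates only against inexpensive approximations, and the expensive subsystem analysis is consulted sparingly, here exactly once, to verify the candidate optimum.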
Concurrent design of an RTP chamber and advanced control system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spence, P.; Schaper, C.; Kermani, A.
1995-12-31
A concurrent-engineering approach is applied to the development of an axisymmetric rapid-thermal-processing (RTP) reactor and its associated temperature controller. Using a detailed finite-element thermal model as a surrogate for actual hardware, the authors have developed and tested a multi-input multi-output (MIMO) controller. Closed-loop simulations are performed by linking the control algorithm with the finite-element code. Simulations show that good temperature uniformity is maintained on the wafer during both steady and transient conditions. A numerical study shows the effect of ramp rate, feedback gain, sensor placement, and wafer-emissivity patterns on system performance.
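The closed-loop simulation pattern the authors describe, linking a control algorithm to a thermal model and stepping both through time, can be sketched with a lumped-parameter stand-in for the finite-element code. All parameters here are illustrative, not from the study:

```python
def simulate(t_final=10.0, dt=0.01, k_p=50.0, ramp_rate=30.0):
    """Step a lumped thermal 'plant' and a proportional controller together.

    The plant stands in for the finite-element model; the controller drives
    wafer temperature along a ramp-and-hold setpoint (temperatures in K,
    time in s; all numbers illustrative).
    """
    temp = 300.0          # initial wafer temperature
    tau = 2.0             # thermal time constant of the lumped model
    history = []
    for i in range(int(t_final / dt)):
        t = i * dt
        setpoint = 300.0 + min(ramp_rate * t, 150.0)    # ramp, then hold
        power = max(0.0, k_p * (setpoint - temp))       # lamp power, >= 0
        temp += dt * (power - (temp - 300.0)) / tau     # lumped heat balance
        history.append((t, setpoint, temp))
    return history

hist = simulate()
final_setpoint, final_temp = hist[-1][1], hist[-1][2]
```

A real RTP study would replace the one-line heat balance with the finite-element solve and the proportional law with the MIMO controller, but the coupling structure per time step (setpoint, control, plant update) is the same.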
Interdisciplinary and multilevel optimum design
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Haftka, Raphael T.
1986-01-01
Interactions among engineering disciplines and subsystems in engineering system design are surveyed, and specific instances of such interactions are described. Examination of these interactions shows that a traditional design process, in which the numerical values of the major design variables are decided consecutively, is likely to lead to a suboptimal design. Supporting numerical examples are a glider and a space antenna. Under an alternative approach introduced here, the design and its sensitivity data from the subsystems and disciplines are generated concurrently and then made available to the system designer, who can modify the system design so as to improve its performance. Examples of a framework structure and an airliner wing illustrate that approach.
Stacking transgenes in forest trees.
Halpin, Claire; Boerjan, Wout
2003-08-01
Huge potential exists for improving plant raw materials and foodstuffs via metabolic engineering. To date, progress has mostly been limited to modulating the expression of single genes of well-studied pathways, such as the lignin biosynthetic pathway, in model species. However, a recent report illustrates a new level of sophistication in metabolic engineering by overexpressing one lignin enzyme while simultaneously suppressing the expression of another lignin gene in a tree, aspen. This novel approach to multi-gene manipulation has succeeded in concurrently improving several wood-quality traits.
Integrating post-manufacturing issues into design and manufacturing decisions
NASA Technical Reports Server (NTRS)
Eubanks, Charles F.
1996-01-01
An investigation is conducted on research into some of the fundamental issues underlying the design for manufacturing, service and recycling that affect engineering decisions early in the conceptual design phase of mechanical systems. The investigation focuses on a system-based approach to material selection, manufacturing methods and assembly processes related to overall product requirements, performance and life-cycle costs. Particular emphasis is placed on concurrent engineering decision support for post-manufacturing issues such as serviceability, recyclability, and product retirement.
Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle
NASA Technical Reports Server (NTRS)
Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.
2004-01-01
This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivity models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility of generating such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
How Engineers Really Think About Risk: A Study of JPL Engineers
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Chattopadhyay, Deb; Valerdi, Ricardo
2011-01-01
The objectives of this work are to improve risk assessment practices used during the mission design process by JPL's concurrent engineering teams by (1) developing effective ways to identify and assess mission risks, (2) providing a process for more effective dialog between stakeholders about the existence and severity of mission risks, and (3) enabling the analysis of interactions of risks across concurrent engineering roles.
Concurrent engineering: Spacecraft and mission operations system design
NASA Technical Reports Server (NTRS)
Landshof, J. A.; Harvey, R. J.; Marshall, M. H.
1994-01-01
Despite our awareness of the mission design process, spacecraft historically have been designed and developed by one team and then turned over as a system to the Mission Operations organization to operate on-orbit. By applying concurrent engineering techniques and envisioning operability as an essential characteristic of spacecraft design, tradeoffs can be made in the overall mission design to minimize mission lifetime cost. Lessons learned from previous spacecraft missions will be described, as well as the implementation of concurrent mission operations and spacecraft engineering for the Near Earth Asteroid Rendezvous (NEAR) program.
ERIC Educational Resources Information Center
Bertozzi, N.; Hebert, C.; Rought, J.; Staniunas, C.
2007-01-01
Over the past decade the software products available for solid modeling, dynamic, stress, thermal, and flow analysis, and computer-aiding manufacturing (CAM) have become more powerful, affordable, and easier to use. At the same time it has become increasingly important for students to gain concurrent engineering design and systems integration…
NASA Astrophysics Data System (ADS)
Nishizawa, Hitoshi; Yoshioka, Takayoshi; Itoh, Kazuaki
This article introduces extensive reading (ER) as an approach to improving the fundamental English communication skills of reluctant EFL learners: average Japanese engineering students. It is distinct from the concurrent-translation approach in that learners use English, rather than Japanese, to grasp the meaning of what they read, and enjoy the reading itself. In the ER program at Toyota National College of Technology, many students developed a more positive attitude toward English, increased their reading speed, and achieved higher TOEIC scores than students tested before the ER program was introduced. A comparison between three groups of students showed a strong correlation between their TOEIC scores and the amount they read.
Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications
1992-09-01
STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a ... contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down ... Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment, utilizing a pictorial approach
Wilk, Szymon; Michalowski, Martin; Michalowski, Wojtek; Hing, Marisela Mainegra; Farion, Ken
2011-01-01
This paper describes a new methodological approach to reconciling adverse and contradictory activities (called points of contention) occurring when a patient is managed according to two or more concurrently used clinical practice guidelines (CPGs). The need to address these inconsistencies occurs when a patient with more than one disease, each of which is a comorbid condition, has to be managed according to different treatment regimens. We propose an automatic procedure that constructs a mathematical guideline model using the Constraint Logic Programming (CLP) methodology, uses this model to identify and mitigate encountered points of contention, and revises the considered CPGs accordingly. The proposed procedure is used as an alerting mechanism and coupled with a guideline execution engine warns the physician about potential problems with the concurrent application of two or more guidelines. We illustrate the operation of our procedure in a clinical scenario describing simultaneous use of CPGs for duodenal ulcer and transient ischemic attack.
A general engineering scenario for concurrent engineering environments
NASA Astrophysics Data System (ADS)
Mucino, V. H.; Pavelic, V.
The paper describes an engineering method scenario which categorizes the various activities and tasks into blocks seen as subjects which consume and produce data and information. These methods, tools, and associated utilities interact with other engineering tools by exchanging information in such a way that a relationship between customers and suppliers of engineering data is established clearly, while data exchange consistency is maintained throughout the design process. The events and data transactions are presented in the form of flowcharts in which data transactions represent the connection between the various bricks, which in turn represent the engineering activities developed for the particular task required in the concurrent engineering environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emery, John M.; Coffin, Peter; Robbins, Brian A.
Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development; they are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation, because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.
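The "low-fidelity prediction refined by multiscale simulation" strategy can be illustrated as a two-fidelity Monte Carlo estimator: many cheap coarse-model samples estimate the mean response, and a few expensive fine-model samples estimate the coarse-to-fine discrepancy. Both models below are hypothetical stand-ins, not the report's aluminum simulations:

```python
import random

random.seed(0)

def fine_model(x):       # stand-in for the concurrent multiscale simulation
    return x ** 2 + 0.1 * x ** 3

def coarse_model(x):     # stand-in for the engineering-scale prediction
    return x ** 2

# A few expensive paired samples estimate the coarse-to-fine discrepancy ...
paired = [random.gauss(0.0, 1.0) for _ in range(100)]
discrepancy = sum(fine_model(x) - coarse_model(x) for x in paired) / len(paired)

# ... while many cheap samples estimate the coarse-model mean response.
cheap = [random.gauss(0.0, 1.0) for _ in range(100_000)]
coarse_mean = sum(coarse_model(x) for x in cheap) / len(cheap)

# Two-fidelity estimate of the fine-model mean over a standard normal input
# (the exact value here is 1.0, since E[x^2] = 1 and E[x^3] = 0).
estimate = coarse_mean + discrepancy
```

The fine model is called only 100 times against 100,000 coarse evaluations, which is the point: the expensive multiscale simulation corrects, rather than replaces, the engineering-scale prediction.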
GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.
2010-01-01
The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well-defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
Electronic Design Automation: Integrating the Design and Manufacturing Functions
NASA Technical Reports Server (NTRS)
Bachnak, Rafic; Salkowski, Charles
1997-01-01
As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.
Systems Engineering Model for ART Energy Conversion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendez Cruz, Carmen Margarita; Rochau, Gary E.; Wilson, Mollye C.
The near-term objective of the EC team is to establish an operating, commercially scalable Recompression Closed Brayton Cycle (RCBC) to be constructed for the NE-STEP demonstration system (demo) with the lowest risk possible. A systems engineering approach is recommended to ensure adequate requirements gathering, documentation, and modeling that supports technology development relevant to advanced reactors while supporting crosscut interests in potential applications. A holistic systems engineering model was designed for the ART Energy Conversion program by leveraging Concurrent Engineering, Balance Model, Simplified V Model, and Project Management principles. The resulting model supports the identification and validation of lifecycle Brayton systems requirements, and allows designers to detail system-specific components relevant to the current stage in the lifecycle, while maintaining a holistic view of all system elements.
1988-12-01
engineering disciplines. (Here I refer to training in multifunction team management disciplines, quality engineering methods, experimental design by such ... SOME ISSUES: View of strategic issues has been evolving - speed of design and product deployment - to accelerate experimentation with new ... manufacturing process design. New technologies (e.g., composites) which can revolutionize product technical design in some cases. Issue still to be faced: ...
A minimum cost tolerance allocation method for rocket engines and robust rocket engine design
NASA Technical Reports Server (NTRS)
Gerth, Richard J.
1993-01-01
Rocket engine design follows three phases: systems design, parameter design, and tolerance design. Systems design and parameter design are most effectively conducted in a concurrent engineering (CE) environment that utilizes methods such as Quality Function Deployment and Taguchi methods. However, tolerance allocation remains an art driven by experience, handbooks, and rules of thumb, so it was desirable to develop an optimization approach to tolerancing. The case-study engine was the STME gas generator cycle. The design of the major components had been completed, and the functional relationship between the component tolerances and system performance had been computed using the Generic Power Balance model. The system performance nominals (thrust, MR, and Isp) and tolerances were already specified, as was an initial set of component tolerances. The question was whether there existed an optimal combination of tolerances that would result in the minimum cost without any degradation in system performance.
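The optimization view of tolerance allocation has a classical closed form under a reciprocal cost model: minimizing total cost sum(c_i/t_i) subject to a root-sum-square stack-up limit gives t_i proportional to c_i**(1/3). A sketch with hypothetical cost coefficients; this is the textbook model, not the STME study's cost data:

```python
import math

def allocate_tolerances(costs, system_tol):
    """Minimize total cost sum(c_i / t_i) subject to the root-sum-square
    stack-up sqrt(sum(t_i**2)) == system_tol. The Lagrange condition
    -c_i/t_i**2 + 2*lam*t_i = 0 gives t_i proportional to c_i**(1/3)."""
    weights = [c ** (1.0 / 3.0) for c in costs]
    scale = system_tol / math.sqrt(sum(w * w for w in weights))
    return [scale * w for w in weights]

costs = [4.0, 1.0, 0.5]      # hypothetical cost coefficients per component
tols = allocate_tolerances(costs, system_tol=0.010)
rss = math.sqrt(sum(t * t for t in tols))
```

Components that are expensive to hold tight (large c_i) receive the looser tolerances, while the stack-up budget is met exactly by construction.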
Law, Cheryl Suwen; Sylvia, Georgina M; Nemati, Madieh; Yu, Jingxian; Losic, Dusan; Abell, Andrew D; Santos, Abel
2017-03-15
We explore new approaches to engineering the surface chemistry of interferometric sensing platforms based on nanoporous anodic alumina (NAA) and reflectometric interference spectroscopy (RIfS). Two surface engineering strategies are presented, namely (i) selective chemical functionalization of the inner surface of NAA pores with amine-terminated thiol molecules and (ii) selective chemical functionalization of the top surface of NAA with dithiol molecules. The strong molecular interaction of Au 3+ ions with thiol-containing functional molecules of alkane chain or peptide character provides a model sensing system with which to assess the sensitivity of these NAA platforms by both molecular feature and surface engineering. Changes in the effective optical thickness of the functionalized NAA photonic films (i.e., sensing principle), in response to gold ions, are monitored in real-time by RIfS. 6-Amino-1-hexanethiol (inner surface) and 1,6-hexanedithiol (top surface), the most sensitive functional molecules from approaches i and ii, respectively, were combined into a third sensing strategy whereby the NAA platforms are functionalized on both the top and inner surfaces concurrently. Engineering of the surface according to this approach resulted in an additive enhancement in sensitivity of up to 5-fold compared to previously reported systems. This study advances the rational engineering of surface chemistry for interferometric sensing on nanoporous platforms with potential applications for real-time monitoring of multiple analytes in dynamic environments.
Selected Current Acquisitions and Articles from Periodicals
1994-06-01
on Theater High Altitude Area Defense (THAAD) System: briefing report to the Chairman, Committee on Foreign Relations, U.S. Senate. [Washington, D.C. ... John Marshall Law School, 1993- LAW PERIODICALS Main PAGE 6 CONCURRENT ENGINEERING. Karbhari, Vistaspa Maneck. Concurrent engineering for composites ... Postgraduate School, [1990] VC267.U6 H37 1991 United States. DOD FAR supplement: Department of Defense as of ... Chicago, Ill.: Commerce Clearing House, 1994
Improving generalized inverted index lock wait times
NASA Astrophysics Data System (ADS)
Borodin, A.; Mirvoda, S.; Porshnev, S.; Ponomareva, O.
2018-01-01
Concurrent operations on tree-like data structures are a cornerstone of any database system. Concurrent operations are intended to improve read/write performance and are usually implemented via some form of locking. Deadlock-free methods of concurrency control are known as tree locking protocols. These protocols provide basic operations (verbs) and algorithms (ways of invoking those operations) for applying them to any tree-like data structure. The algorithms operate on data managed by a storage engine, and storage engines differ greatly among RDBMS implementations. In this paper, we discuss a tree locking protocol implementation for the Generalized Inverted Index (GIN) applied to the multiversion concurrency control (MVCC) storage engine inside the PostgreSQL RDBMS. We then introduce improvements to the locking protocol and provide usage statistics from an evaluation of our improvement in a very-high-load environment at one of the world's largest IT companies.
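The basic idea behind deadlock-free tree locking protocols can be sketched as hand-over-hand ("crabbing") locking: every thread acquires locks strictly top-down and holds at most a parent and one child at a time, so no cycle of waits can form. A minimal single-tree illustration, not PostgreSQL's GIN implementation:

```python
import threading

class Node:
    """Binary search tree node with a per-node lock."""
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right
        self.lock = threading.Lock()

def contains(root, key):
    """Hand-over-hand descent: lock the child before releasing the parent,
    always moving top-down, so concurrent readers/writers cannot deadlock."""
    node = root
    node.lock.acquire()
    try:
        while True:
            if key == node.key:
                return True
            child = node.left if key < node.key else node.right
            if child is None:
                return False
            child.lock.acquire()    # briefly hold parent and child together
            node.lock.release()     # then "crab" forward to the child
            node = child
    finally:
        node.lock.release()

root = Node(8, Node(4, Node(2), Node(6)), Node(12, Node(10), Node(14)))
found = contains(root, 6)
```

Because a descending thread never re-acquires a lock above its current position, any wait-for relation points strictly downward in the tree, which rules out wait cycles.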
NASA Technical Reports Server (NTRS)
Panczak, Tim; Ring, Steve; Welch, Mark
1999-01-01
Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from and have little useful connection, if any, to CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system, while fully integrated into a neutral, low cost CAD system, and which utilizes both FEM and FD methods, does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric surface based radiation and FD based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.
Program Correctness, Verification and Testing for Exascale (Corvette)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Koushik; Iancu, Costin; Demmel, James W
The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large-scale parallel programs, which our research aims to change. In this project we have demonstrated novel approaches in several areas: 1. Low-overhead, automated, and precise detection of concurrency bugs at scale. 2. Using low-overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.
Agile rediscovering values: Similarities to continuous improvement strategies
NASA Astrophysics Data System (ADS)
Díaz de Mera, P.; Arenas, J. M.; González, C.
2012-04-01
Research in the late 1980s on technological companies that develop high-value innovative products, with sufficient speed and flexibility to adapt quickly to changing market conditions, gave rise to the new set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we considered it very interesting to study the similarities between these Agile methodologies and other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing, and Concurrent Engineering are analyzed to check the values they have in common with the Agile approach.
Integrating Thermal Tools Into the Mechanical Design Process
NASA Technical Reports Server (NTRS)
Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.
1999-01-01
The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.
Ishii, Jun; Kondo, Takashi; Makino, Harumi; Ogura, Akira; Matsuda, Fumio; Kondo, Akihiko
2014-05-01
Yeast has the potential to be used in bulk-scale fermentative production of fuels and chemicals due to its tolerance of low pH and robustness against autolysis. However, expression of multiple external genes in one host yeast strain is considerably labor-intensive due to the lack of polycistronic transcription. To promote the metabolic engineering of yeast, we generated systematic and convenient genetic engineering tools to express multiple genes in Saccharomyces cerevisiae. We constructed a series of multi-copy and integration vector sets for concurrently expressing two or three genes in S. cerevisiae by embedding three classical promoters. The comparative expression capabilities of the constructed vectors were monitored with green fluorescent protein, and the concurrent expression of genes was monitored with three different fluorescent proteins. Our multiple-gene expression tool will be helpful for the advanced construction of genetically engineered yeast strains in a variety of research fields beyond metabolic engineering. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
Group Design Problems in Engineering Design Graphics.
ERIC Educational Resources Information Center
Kelley, David
2001-01-01
Describes group design techniques used within the engineering design graphics sequence at Western Washington University. Engineering and design philosophies such as concurrent engineering place an emphasis on group collaboration for the solving of design problems. (Author/DDR)
Systems Engineering News | Wind | NREL
The Wind Plant Optimization and Systems Engineering newsletter covers topics ranging from multi-disciplinary design analysis and optimization of wind turbine sub-components, to wind plant optimization and uncertainty analysis, to concurrent engineering and financial engineering.
Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems
NASA Technical Reports Server (NTRS)
Koch, Patrick N.
1997-01-01
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis; Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and Noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques including intermediate responses, linking variables, and compatibility constraints are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigations and modeling of partitioned subproblems. 
A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared, and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. The method and the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan propulsion system. The case study was developed in collaboration with Allison Engine Company, Rolls-Royce Aerospace, and is based on Allison's existing AE3007 engine, designed for midsize commercial regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design, with a compressor module integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low-pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. The method and the solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.
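The response-surface step, sampling an expensive subsystem analysis at a few design points and fitting a cheap polynomial surrogate for use in system-level synthesis, can be sketched in one dimension. The subsystem function below is a hypothetical stand-in, not the AE3007 model:

```python
# Hypothetical "expensive" subsystem analysis: specific fuel consumption as a
# function of bypass ratio (illustrative numbers only).
def subsystem_sfc(bpr):
    return 0.60 - 0.04 * bpr + 0.004 * bpr ** 2

def fit_quadratic(f, x0, h):
    """Three-point central design: fit a local quadratic surrogate from
    samples at x0 - h, x0, and x0 + h using finite differences."""
    f_lo, f_mid, f_hi = f(x0 - h), f(x0), f(x0 + h)
    curvature = (f_hi - 2.0 * f_mid + f_lo) / (2.0 * h * h)
    slope = (f_hi - f_lo) / (2.0 * h)
    return lambda x: f_mid + slope * (x - x0) + curvature * (x - x0) ** 2

# Sample the expensive analysis three times, then query the cheap surrogate
# as often as the system-level search needs.
surrogate = fit_quadratic(subsystem_sfc, x0=5.0, h=1.0)
```

Real response-surface work uses many factors and least-squares fits over a designed experiment, but the economics are the same: a handful of expensive analyses buys an arbitrarily re-queryable approximation.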
Next-generation concurrent engineering: developing models to complement point designs
NASA Technical Reports Server (NTRS)
Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian
2006-01-01
Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED team; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.
A Model-based Approach to Reactive Self-Configuring Systems
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Nayak, P. Pandurang
1996-01-01
This paper describes Livingstone, an implemented kernel for a self-reconfiguring autonomous system that is reactive and uses component-based declarative models. The paper presents a formal characterization of the representation formalism used in Livingstone, and reports on our experience with the implementation in a variety of domains. Livingstone's representation formalism achieves broad coverage of hybrid software/hardware systems by coupling the concurrent transition system models underlying concurrent reactive languages with the discrete qualitative representations developed in model-based reasoning. We achieve a reactive system that performs significant deductions in the sense/response loop by drawing on our past experience at building fast propositional conflict-based algorithms for model-based diagnosis, and by framing a model-based configuration manager as a propositional, conflict-based feedback controller that generates focused, optimal responses. Livingstone automates all these tasks using a single model and a single core deductive engine, thus making significant progress towards achieving a central goal of model-based reasoning. Livingstone, together with the HSTS planning and scheduling engine and the RAPS executive, has been selected as the core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.
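The propositional, conflict-based style of model-based diagnosis mentioned above can be illustrated with a toy example. The sketch below is an assumption for illustration (a brute-force miniature, not Livingstone's actual algorithm): components have nominal and faulty modes, a faulty component's output is unconstrained, and diagnoses are the minimal sets of faulty components consistent with the observation.

```python
from itertools import combinations

def inverter(inp, faulty):
    # a faulty inverter's output is unconstrained; model it as "unknown"
    return None if faulty else (not inp)

def consistent(faults, x, observed_out):
    a = inverter(x, 'A' in faults)                              # first inverter
    b = inverter(a, 'B' in faults) if a is not None else None   # second inverter
    # an unknown value cannot contradict the observation
    return b is None or b == observed_out

def diagnoses(x, observed_out, components=('A', 'B')):
    # enumerate smallest fault sets first; keep only minimal consistent ones
    found = []
    for k in range(len(components) + 1):
        for faults in combinations(components, k):
            if any(set(d) <= set(faults) for d in found):
                continue  # superset of a known diagnosis: not minimal
            if consistent(set(faults), x, observed_out):
                found.append(faults)
    return found

# Input True through two inverters should yield True; observing False
# implicates at least one of the two components.
print(diagnoses(True, False))  # → [('A',), ('B',)]
```

A conflict-directed engine reaches the same answer without enumeration, by extracting conflicts from failed consistency checks and searching only their hitting sets — which is what makes the deduction fast enough for a sense/response loop.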
Cimini, Donatella; Iacono, Ileana Dello; Carlino, Elisabetta; Finamore, Rosario; Restaino, Odile F; Diana, Paola; Bedini, Emiliano; Schiraldi, Chiara
2017-12-01
Glycosaminoglycans, such as hyaluronic acid and chondroitin sulphate, are increasingly required not only as main ingredients in cosmeceutical and nutraceutical preparations, but also as active principles in medical devices and pharmaceutical products. However, while biotechnological production of hyaluronic acid is industrially established through fermentation of Streptococcus spp. and, recently, Bacillus subtilis, biotechnological chondroitin is not yet on the market. A non-hemolytic and hyaluronidase-negative S. equi subsp. zooepidemicus mutant strain was engineered in this work by the addition of two E. coli K4 genes, namely kfoA and kfoC, involved in the biosynthesis of a chondroitin-like polysaccharide. Chondroitin is the precursor of chondroitin sulphate, a nutraceutical on the market as an anti-arthritic agent that is lately being studied for its intrinsic bioactivity. In small-scale bioreactor batch experiments, production of about 1.46 ± 0.38 g/L of hyaluronic acid and 300 ± 28 mg/L of chondroitin, with average molecular weights of 1750 and 25 kDa, respectively, was demonstrated, providing an approach to the concurrent production of both biopolymers in a single fermentation.
Concurrency in product realization
NASA Astrophysics Data System (ADS)
Kelly, Michael J.
1994-03-01
Technology per se does not provide a competitive advantage. Timely exploitation of technology is what gives the competitive edge, and this demands a major shift in the product development process and management of the industrial enterprise. 'Teaming to win' is more than a management theme; it is the disciplined engineering practice that is essential to success in today's global marketplace. Teaming supports the concurrent engineering practices required to integrate the activities of people responsible for product realization through achievement of shorter development cycles, lower costs, and defect-free products.
NASA Astrophysics Data System (ADS)
Tremoulet, P. C.
The author describes a number of maintenance improvements in the Fiber Optic Cable System (FOCS). They were achieved during a production-phase pilot concurrent engineering program. Listed in order of importance (saved maintenance time and material) by maintenance level, they are: (1) organizational level: improved fiber optic converter (FOC) BITE; (2) intermediate level: reduced FOC adjustments from 20 to 2; partitioned the FOC into electrical and optical parts; developed cost-effective fault isolation test points and tests using standard test equipment; improved the FOC chassis to have a lower mean time to repair; and (3) depot level: revised test requirements documents (TRDs) for common automatic test equipment and incorporated ATE testability into circuits, assemblies, and application-specific integrated circuits. These improvements met this contract's tailored logistics MIL-STD 1388-1A requirements of monitoring the design for supportability and determining the most effective support equipment. Important logistics lessons learned while accomplishing these maintainability and supportability improvements on the pilot concurrent engineering program are also discussed.
Utilization of CAD/CAE for concurrent design of structural aircraft components
NASA Technical Reports Server (NTRS)
Kahn, William C.
1993-01-01
The feasibility of installing the Stratospheric Observatory for Infrared Astronomy telescope (named SOFIA) into an aircraft for NASA astronomy studies is investigated using CAD/CAE equipment to either design or supply data for every facet of design engineering. The aircraft selected for the platform was a Boeing 747, chosen on the basis of its ability to meet the flight profiles required for the given mission and payload. CAD models of the fuselage of two of the aircraft models studied (747-200 and 747 SP) were developed, and models for the component parts of the telescope and subsystems were developed by the various concurrent engineering groups of the SOFIA program, to determine the requirements for the cavity opening and for design configuration. It is noted that, by developing a plan to use CAD/CAE for concurrent engineering at the beginning of the study, it was possible to produce results in about two-thirds of the time required using traditional methods.
Nutritional strategies to support concurrent training.
Perez-Schindler, Joaquin; Hamilton, D Lee; Moore, Daniel R; Baar, Keith; Philp, Andrew
2015-01-01
Concurrent training (the combination of endurance and resistance training) is a common practice for athletes looking to maximise strength and endurance. Over 20 years ago, it was first observed that performing endurance exercise after resistance exercise could have detrimental effects on strength gains. At the cellular level, specific protein candidates have been suggested to mediate this training interference; however, at present, the physiological reason(s) behind the concurrent training effect remain largely unknown. Even less is known regarding the optimal nutritional strategies to support concurrent training and whether unique nutritional approaches are needed to support endurance and resistance exercise during concurrent training. In this review, we discuss the importance of protein supplementation for both endurance and resistance training adaptation and highlight additional nutritional strategies that may support concurrent training. Finally, we attempt to synthesise current understanding of the interaction between physiological responses and nutritional approaches into practical recommendations for concurrent training.
78 FR 8596 - Committee on Equal Opportunities in Science and Engineering #1173; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-06
... NATIONAL SCIENCE FOUNDATION Committee on Equal Opportunities in Science and Engineering 1173... Science and Engineering (CEOSE). Dates/Time: February 25, 2013, 9:00 a.m.-5:30 p.m.; February 26, 2013, 9... participation in science and engineering. Agenda: Opening Statement by the CEOSE Chair Discussions: Concurrence...
Electro-optic holography method for determination of surface shape and deformation
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-06-01
Current demanding engineering analysis and design applications require effective experimental methodologies for characterization of surface shape and deformation. Such characterization is of primary importance in many applications because these quantities are related to the functionality, performance, and integrity of the objects of interest, especially in view of advances relating to concurrent engineering. In this paper, a new approach to characterization of surface shape and deformation using a simple optical setup is described. The approach consists of a fiber-optic-based electro-optic holography (EOH) system built around an IR, temperature-tuned laser diode, a single-mode fiber optic directional coupler assembly, and a video processing computer. The EOH system can be arranged in multiple configurations, which include the three-camera, three-illumination, and speckle correlation modes. In particular, the three-camera mode is described, along with a brief description of the procedures for obtaining quantitative 3D shape and deformation information. A representative application of the three-camera EOH system demonstrates the viability of the approach as an effective engineering tool. A particular feature of this system and the procedure described in this paper is that the 3D quantitative data are written to data files which can be readily interfaced to commercial CAD/CAM environments.
NASA Technical Reports Server (NTRS)
Chien, Andrew A.; Karamcheti, Vijay; Plevyak, John; Sahrawat, Deepak
1993-01-01
Concurrent object-oriented languages, particularly fine-grained approaches, reduce the difficulty of large scale concurrent programming by providing modularity through encapsulation while exposing large degrees of concurrency. Despite these programmability advantages, such languages have historically suffered from poor efficiency. This paper describes the Concert project whose goal is to develop portable, efficient implementations of fine-grained concurrent object-oriented languages. Our approach incorporates aggressive program analysis and program transformation with careful information management at every stage from the compiler to the runtime system. The paper discusses the basic elements of the Concert approach along with a description of the potential payoffs. Initial performance results and specific plans for system development are also detailed.
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
NASA Astrophysics Data System (ADS)
Mills, R. T.
2014-12-01
As the high performance computing (HPC) community pushes towards the exascale horizon, the importance and prevalence of fine-grained parallelism in new computer architectures is increasing. This is perhaps most apparent in the proliferation of so-called "accelerators" such as the Intel Xeon Phi or NVIDIA GPGPUs, but the trend also holds for CPUs, where serial performance has grown slowly and effective use of hardware threads and vector units is becoming increasingly important to realizing high performance. This has significant implications for weather, climate, and Earth system modeling codes, many of which display impressive scalability across MPI ranks but take relatively little advantage of threading and vector processing. In addition to increasing parallelism, next-generation codes will also need to address increasingly deep hierarchies for data movement: NUMA/cache levels, on-node vs. off-node, local vs. wide neighborhoods on the interconnect, and even in the I/O system. We will discuss some approaches (grounded in experiences with the Intel Xeon Phi architecture) for restructuring Earth science codes to maximize concurrency across multiple levels (vectors, threads, MPI ranks), and also discuss some novel approaches for minimizing expensive data movement/communication.
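The vectorization step in such restructuring can be sketched in miniature. The example below, a NumPy analogy rather than code from any particular Earth system model, rewrites a serial 1-D stencil loop as a whole-array expression — the form that exposes the computation to vector hardware (and, in compiled codes, to SIMD units).

```python
import numpy as np

def smooth_scalar(u):
    # serial form: one element at a time
    v = u.copy()
    for i in range(1, len(u) - 1):
        v[i] = 0.25 * u[i - 1] + 0.5 * u[i] + 0.25 * u[i + 1]
    return v

def smooth_vector(u):
    # vector form: the whole interior updated in one array expression
    v = u.copy()
    v[1:-1] = 0.25 * u[:-2] + 0.5 * u[1:-1] + 0.25 * u[2:]
    return v

u = np.sin(np.linspace(0.0, 3.14, 1000))
print(np.allclose(smooth_scalar(u), smooth_vector(u)))  # → True
```

The two forms are numerically identical; the point is that the second has no loop-carried dependence visible to the runtime, which is the property compilers and vector units need.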
NASA Astrophysics Data System (ADS)
Yun, Wanying; Lu, Zhenzhou; Jiang, Xian
2018-06-01
To efficiently execute variance-based global sensitivity analysis, the law of total variance over successive non-overlapping intervals is first proved, and on this basis an efficient space-partition sampling-based approach is proposed in this paper. By partitioning the sample points of the output into different subsets according to each input, the proposed approach can efficiently evaluate all the main effects concurrently from a single group of sample points. In addition, there is no need to optimize the partition scheme in the proposed approach. The maximum length of the subintervals decreases as the number of sample points of the model input variables increases, which guarantees the convergence condition of the space-partition approach. Furthermore, a new interpretation of the partition idea is given from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
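The core idea — estimating every first-order (main-effect) index from one sample set by binning outputs according to each input — can be sketched as follows. The Ishigami-style test function below is a standard benchmark chosen for illustration, not necessarily one of the paper's examples, and the equal-width binning is a simple stand-in for the paper's partition scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_bins = 200_000, 50
X = rng.uniform(-np.pi, np.pi, size=(n, 3))
Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1])**2 + 0.1 * X[:, 2]**4 * np.sin(X[:, 0])

def main_effect(x, y, n_bins):
    # S_i = Var(E[Y | X_i]) / Var(Y), with the conditional expectation
    # estimated by the mean of y within each bin (subinterval) of x
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    means = np.array([y[idx == b].mean() for b in range(n_bins)])
    weights = np.bincount(idx, minlength=n_bins) / len(x)
    grand = np.sum(weights * means)
    return np.sum(weights * (means - grand)**2) / y.var()

# one sample set yields all three main effects concurrently
S = [main_effect(X[:, i], Y, n_bins) for i in range(3)]
print([round(s, 2) for s in S])  # analytic values ≈ 0.314, 0.442, 0.0
```

No separate re-sampling per input is needed, which is the efficiency gain the abstract describes; refining the bins as the sample grows drives the estimate toward the true conditional-variance ratio.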
Knowledge Management tools integration within DLR's concurrent engineering facility
NASA Astrophysics Data System (ADS)
Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.
The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The application of the KM tools has shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. The establishment of this practice will result in a much more extensive exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher-quality space system designs from the studies.
Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering
NASA Astrophysics Data System (ADS)
Cavlazoglu, Baki; Stuessy, Carol
2018-02-01
The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.
Chip-to-chip entanglement of transmon qubits using engineered measurement fields
NASA Astrophysics Data System (ADS)
Dickel, C.; Wesdorp, J. J.; Langford, N. K.; Peiter, S.; Sagastizabal, R.; Bruno, A.; Criger, B.; Motzoi, F.; DiCarlo, L.
2018-02-01
While the on-chip processing power in circuit QED devices is growing rapidly, an open challenge is to establish high-fidelity quantum links between qubits on different chips. Here, we show entanglement between transmon qubits on different cQED chips with 49% concurrence and 73% Bell-state fidelity. We engineer a half-parity measurement by successively reflecting a coherent microwave field off two nearly identical transmon-resonator systems. By ensuring the measured output field does not distinguish |01> from |10>, unentangled superposition states are probabilistically projected onto entangled states in the odd-parity subspace. We use in situ tunability and an additional weakly coupled driving field on the second resonator to overcome imperfect matching due to fabrication variations. To demonstrate the flexibility of this approach, we also produce an even-parity entangled state of similar quality, by engineering the matching of outputs for the |00> and |11> states. The protocol is characterized over a range of measurement strengths using quantum state tomography, showing good agreement with a comprehensive theoretical model.
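The "concurrence" figure of merit quoted above has a closed form for pure two-qubit states: for a|00> + b|01> + c|10> + d|11>, the concurrence is C = 2|ad - bc|. The sketch below evaluates it for illustrative pure states (the experiment's states are mixed, so the quoted 49% comes from tomography, not this formula).

```python
import numpy as np

def concurrence_pure(psi):
    # psi = (a, b, c, d) amplitudes over |00>, |01>, |10>, |11>
    a, b, c, d = np.asarray(psi, dtype=complex)
    return 2 * abs(a * d - b * c)

bell_odd = np.array([0, 1, 1, 0]) / np.sqrt(2)  # (|01> + |10>)/sqrt(2)
product = np.array([1.0, 0.0, 0.0, 0.0])        # |00>, a product state

print(concurrence_pure(bell_odd))  # → 1.0 (maximally entangled)
print(concurrence_pure(product))   # → 0.0 (no entanglement)
```

Concurrence runs from 0 (separable) to 1 (maximally entangled), so a measured 49% places the chip-to-chip state roughly halfway between the two extremes.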
Experimental clean combustor program, phase 2
NASA Technical Reports Server (NTRS)
Roberts, R.; Peduzzi, A.; Vitti, G. E.
1976-01-01
Combustor pollution reduction technology for commercial CTOL engines was generated, and this technology was demonstrated in a full-scale JT9D engine in 1976. Component rig refinement tests of the two best combustor concepts were conducted. These concepts are the Vorbix combustor, and a hybrid combustor which combines the pilot zone of the staged premix combustor and the main zone of the swirl-can combustor. Both concepts significantly reduced all pollutant emissions relative to the JT9D-7 engine combustor. However, neither concept met all program goals. The hybrid combustor met pollution goals for unburned hydrocarbons and carbon monoxide but did not achieve the oxides of nitrogen goal. This combustor had significant performance deficiencies. The Vorbix combustor met goals for unburned hydrocarbons and oxides of nitrogen but did not achieve the carbon monoxide goal. Performance of the Vorbix combustor approached the engine requirements. On the basis of these results, the Vorbix combustor was selected for the engine demonstration program. A control study was conducted to establish fuel control requirements imposed by the low-emission combustor concepts and to identify conceptual control system designs. Concurrent efforts were also completed on two addendums: an alternate fuels addendum and a combustion noise addendum.
1992-05-01
Subject terms: methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged as organizations evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources.
How to Quickly Import CAD Geometry into Thermal Desktop
NASA Technical Reports Server (NTRS)
Wright, Shonte; Beltran, Emilio
2002-01-01
There are several groups at JPL (Jet Propulsion Laboratory) that are committed to concurrent design efforts; two are featured here. The Center for Space Mission Architecture and Design (CSMAD) enables the practical application of advanced process technologies in JPL's mission architecture process. Team I functions as an incubator for projects that are in the Discovery, and even pre-Discovery, proposal stages. JPL's concurrent design environment is to a large extent centered on the CAD (Computer Aided Design) file. During concurrent design sessions, CAD geometry is ported to other, more specialized engineering design packages.
Space Transportation Engine Program (STEP), phase B
NASA Technical Reports Server (NTRS)
1990-01-01
The Space Transportation Engine Program (STEP) Phase 2 effort includes preliminary design and activities plan preparation that will allow a smooth and timely transition into a Prototype Phase and then into Phases 3, 4, and 5. A Concurrent Engineering approach using Total Quality Management (TQM) techniques is being applied to define an oxygen-hydrogen engine. The baseline from Phase 1/1' studies was used as a point of departure for trade studies and analyses. Existing STME system models are being enhanced as more detailed module/component characteristics are determined. Preliminary designs for the open expander, closed expander, and gas generator cycles were prepared, and recommendations for cycle selection were made at the Design Concept Review (DCR). As a result of the July '90 DCR, and information subsequently supplied to the Technical Review Team, a gas generator cycle was selected. Results of the various Advanced Development Programs (ADPs) for the Advanced Launch System (ALS) were contributive to this effort. An active vehicle integration effort is supplying NASA, the Air Force, and vehicle contractors with engine parameters and data, and flowing down appropriate vehicle requirements. Engine design and analysis trade studies are being documented in a database that was developed and is being used to organize information. To date, seventy-four trade studies have been input to the database.
An Integrated Approach to Risk Assessment for Concurrent Design
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Voss, Luke; Feather, Martin; Cornford, Steve
2005-01-01
This paper describes an approach to risk assessment and analysis suited to the early phase, concurrent design of a space mission. The approach integrates an agile, multi-user risk collection tool, a more in-depth risk analysis tool, and repositories of risk information. A JPL developed tool, named RAP, is used for collecting expert opinions about risk from designers involved in the concurrent design of a space mission. Another in-house developed risk assessment tool, named DDP, is used for the analysis.
Materials technology assessment for stirling engines
NASA Technical Reports Server (NTRS)
Stephens, J. R.; Witzke, W. R.; Watson, G. K.; Johnston, J. R.; Croft, W. J.
1977-01-01
A materials technology assessment of high temperature components in the improved (metal) and advanced (ceramic) Stirling engines was undertaken to evaluate the current state-of-the-art of metals and ceramics, identify materials research and development required to support the development of automotive Stirling engines, and to recommend materials technology programs to assure material readiness concurrent with engine system development programs. The most critical component for each engine is identified and some of the material problem areas are discussed.
NASA Engine Icing Research Overview: Aeronautics Evaluation and Test Capabilities (AETC) Project
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
2015-01-01
The occurrence of ice accretion within commercial high-bypass aircraft turbine engines has been reported by airlines under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion by the engine. The ice crystals can result in degraded engine performance, loss of thrust control, compressor surge or stall, and flameout of the combustor. The Aviation Safety Program at NASA has taken on the technical challenge of turbofan engine icing caused by ice crystals, which can exist in high-altitude convective clouds. The NASA engine icing project consists of an integrated approach with four concurrent and ongoing research elements, each of which feeds critical information to the next element. The project objective is to gain understanding of high-altitude ice crystals by developing knowledge bases and test facilities for testing full engines and engine components. The first element is to utilize a highly instrumented aircraft to characterize the high-altitude convective cloud environment. The second element is the enhancement of the Propulsion Systems Laboratory altitude test facility for gas turbine engines to include the addition of an ice crystal cloud. The third element is basic research of the fundamental physics associated with ice crystal ice accretion. The fourth and final element is the development of computational tools with the goal of simulating the effects of ice crystal ingestion on compressor and gas turbine engine performance. The NASA goal is to provide knowledge to the engine and aircraft manufacturing communities to help mitigate or eliminate turbofan engine interruptions, engine damage, and failures due to ice crystal ingestion.
Action Learning in Undergraduate Engineering Thesis Supervision
ERIC Educational Resources Information Center
Stappenbelt, Brad
2017-01-01
In the present action learning implementation, twelve action learning sets were conducted over eight years. The action learning sets consisted of students involved in undergraduate engineering research thesis work. The concurrent study accompanying this initiative investigated the influence of the action learning environment on student approaches…
Use of Concurrent Engineering in Space Mission Design
NASA Technical Reports Server (NTRS)
Wall, S.
2000-01-01
In recent years, conceptual-phase (proposal level) design of space missions has been improved considerably. Team structures, tool linkage, specialized facilities known as design centers and scripted processes have been demonstrated to cut proposal-level engineering design time from a few months to a few weeks.
7 CFR 1794.10 - Applicant responsibilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... prepare the applicable environmental documentation concurrent with a proposed action's engineering... AGRICULTURE (CONTINUED) ENVIRONMENTAL POLICIES AND PROCEDURES Implementation of the National Environmental...
Software engineering aspects of real-time programming concepts
NASA Astrophysics Data System (ADS)
Schoitsch, Erwin
1986-08-01
Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics, and operating systems development. General concepts of concurrent programming and constructs for process synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt and timeout handling in systems based on semaphores, signals, or conditional critical regions, or on real-time languages like Concurrent PASCAL, MODULA, CHILL, and ADA, are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real-time systems. Software quality and software engineering aspects are considered throughout the paper.
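The semaphore-based synchronization the abstract refers to is classically illustrated by the bounded buffer. The sketch below uses Python's threading primitives as a stand-in for the language constructs surveyed in the paper: two counting semaphores track free and filled slots, and a mutex guards the critical region.

```python
import threading
from collections import deque

class BoundedBuffer:
    def __init__(self, capacity):
        self.buf = deque()
        self.mutex = threading.Lock()
        self.empty = threading.Semaphore(capacity)  # counts free slots
        self.full = threading.Semaphore(0)          # counts filled slots

    def put(self, item):
        self.empty.acquire()          # block while the buffer is full
        with self.mutex:              # critical region
            self.buf.append(item)
        self.full.release()           # signal one more item available

    def get(self):
        self.full.acquire()           # block while the buffer is empty
        with self.mutex:
            item = self.buf.popleft()
        self.empty.release()          # signal one more free slot
        return item

buf = BoundedBuffer(4)
results = []

def producer():
    for i in range(100):
        buf.put(i)

def consumer():
    for _ in range(100):
        results.append(buf.get())

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results == list(range(100)))  # → True: FIFO order with one producer
```

Monitor- and rendezvous-based languages (Concurrent PASCAL, ADA) express the same coordination with blocking conditions built into the language rather than explicit semaphore pairs.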
Fully Integral, Flexible Composite Driveshaft
NASA Technical Reports Server (NTRS)
Lawrie, Duncan
2014-01-01
An all-composite driveshaft incorporating integral flexible diaphragms was developed for prime contractor testing. This new approach makes obsolete the split lines required to attach metallic flex elements and either metallic or composite spacing tubes in current solutions. Subcritical driveshaft weights can be achieved that are half that of incumbent technology for typical rotary wing shaft lengths. Spacing tubes compose an integral part of the initial tooling but remain part of the finished shaft and control natural frequencies and torsional stability. A concurrently engineered manufacturing process and design for performance competes with incumbent solutions at significantly lower weight and with the probability of improved damage tolerance and fatigue life.
Concurrent and Collaborative Engineering Implementation in an R and D Organization
NASA Technical Reports Server (NTRS)
DelRosario, Ruben; Davis, Jose M.; Keys, L. Ken
2003-01-01
Concurrent Engineering (CE) and Collaborative Engineering (or Collaborative Product Development, CPD) have emerged as new paradigms with significant impact on the development of new products and processes. With documented and substantiated success in the automotive and technology industries, CE and, most recently, CPD are being touted as innovative management philosophies for many other business sectors, including Research and Development (R&D). This paper introduces two independent research initiatives conducted at the NASA Glenn Research Center (GRC) in Cleveland, Ohio, investigating the application of CE and CPD in an R&D environment. Since little research has been conducted on the use of CE and CPD in sectors other than high-mass-production manufacturing, the objective of these independent studies is to provide a systematic evaluation of the applicability of these paradigms (concurrent and collaborative) in a low/no-production, service environment, in particular R&D.
A Design-Based Engineering Graphics Course for First-Year Students.
ERIC Educational Resources Information Center
Smith, Shana Shiang-Fong
2003-01-01
Describes the first-year Introduction to Design course at Iowa State University which incorporates design for manufacturing and concurrent engineering principles into the curriculum. Autodesk Inventor was used as the primary CAD tool for parametric solid modeling. Test results show that student spatial visualization skills were dramatically…
ERIC Educational Resources Information Center
Michael, William B.; Colson, Kenneth R.
1979-01-01
The construction and validation of the Life Experience Inventory (LEI) for the identification of creative electrical engineers are described. Using the number of patents held or pending as a criterion measure, the LEI was found to have high concurrent validity. (JKS)
Decision support and disease management: a logic engineering approach.
Fox, J; Thomson, R
1998-12-01
This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
Maher, Dermot; Waswa, Laban; Karabarinde, Alex; Baisley, Kathy
2011-08-17
Although concurrent sexual partnerships may play an important role in HIV transmission in Africa, the lack of an agreed definition of concurrency and of standard methodological approaches has hindered studies. In a long-standing general population cohort in rural Uganda we assessed the prevalence of concurrency and investigated its association with sociodemographic and behavioural factors and with HIV prevalence, using the new recommended standard definition and methodological approaches. As part of the 2010 annual cohort HIV serosurvey among adults, we used a structured questionnaire to collect information on sociodemographic and behavioural factors and to measure standard indicators of concurrency using the recommended method of obtaining sexual-partner histories. We used logistic regression to build a multivariable model of factors independently associated with concurrency. Among those eligible, 3,291 (66%) males and 4,052 (72%) females participated in the survey. Among currently married participants, 11% of men and 25% of women reported being in a polygynous union. Among those with a sexual partner in the past year, the proportion reporting at least one concurrent partnership was 17% in males and 0.5% in females. Polygyny accounted for a third of concurrency in men and was not associated with increased HIV risk. Among men there was no evidence of an association between concurrency and HIV prevalence (but too few women reported concurrency to assess this after adjusting for confounding). Regarding sociodemographic factors associated with concurrency, females were significantly more likely to be younger, unmarried, and of lower socioeconomic status than males. Behavioural factors associated with concurrency were young age at first sex, increasing lifetime partners, and a casual partner in the past year (among men and women) and problem drinking (only men). 
Our findings based on the new standard definition and methodological approaches provide a baseline for measuring changes in concurrency and HIV incidence in future surveys, and a benchmark for other studies. As campaigns are now widely conducted against concurrency, such surveys and studies are important in evaluating their effectiveness in decreasing HIV transmission.
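The multivariable analysis summarized above was built on logistic regression; its simplest building block, the crude odds ratio from a 2x2 table with a Wald confidence interval, can be sketched as follows (the counts below are illustrative only, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% Wald CI from a 2x2 table:
    a/b = outcome present/absent among exposed, c/d = same among unexposed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative counts only: problem drinking among men reporting
# vs not reporting concurrency
or_, lo, hi = odds_ratio_ci(40, 160, 60, 740)
```

A full analysis of the kind the study performed would adjust for confounders in a multivariable logistic model rather than report crude ratios like this one.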
The Design of a Primary Flight Trainer using Concurrent Engineering Concepts
NASA Technical Reports Server (NTRS)
Ladesic, James G.; Eastlake, Charles N.; Kietzmann, Nicholas H.
1993-01-01
Concurrent Engineering (CE) concepts seek to coordinate the expertise of various disciplines from initial design configuration selection through product disposal so that cost-efficient design solutions may be achieved. Integrating this methodology into an undergraduate design course sequence may provide a needed enhancement to engineering education. The Advanced Design Program (ADP) project at Embry-Riddle Aeronautical University (ERAU) is focused on developing recommendations for the general aviation Primary Flight Trainer (PFT) of the twenty-first century using methods of CE. This project, over the next two years, will continue synthesizing the collective knowledge of teams composed of engineering students along with students from other degree programs, their faculty, and key industry representatives. During the past year (Phase I), conventional trainer configurations that comply with current regulations and existing technologies have been evaluated. Phase I efforts have resulted in two baseline concepts, a high-wing, conventional design named Triton and a low-wing, mid-engine configuration called Viper. In the second and third years (Phases II and III), applications of advanced propulsion, advanced materials, and unconventional airplane configurations along with military and commercial technologies which are anticipated to be within the economic range of general aviation by the year 2000, will be considered.
Integrated Engineering Information Technology, FY93 accomplishments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, R.N.; Miller, D.K.; Neugebauer, G.L.
1994-03-01
The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.
ERIC Educational Resources Information Center
Milk, Robert D.
This study analyzes how two bilingual classroom language distribution approaches affect classroom language use patterns. The two strategies, separate instruction in the two languages vs. the new concurrent language usage approach (NCA) allowing use of both languages with strict guidelines for language alternation, are observed on videotapes of a…
Integrated design and manufacturing for the high speed civil transport
NASA Technical Reports Server (NTRS)
1993-01-01
In June 1992, Georgia Tech's School of Aerospace Engineering was awarded a NASA/Universities Space Research Association (USRA) Advanced Design Program (ADP) to address 'Integrated Design and Manufacturing for the High Speed Civil Transport (HSCT)' in its graduate aerospace systems design courses. This report summarizes the results of the five courses incorporated into Georgia Tech's USRA ADP program. It covers AE8113: Introduction to Concurrent Engineering, AE4360: Introduction to CAE/CAD, AE4353: Design for Life Cycle Cost, AE6351: Aerospace Systems Design One, and AE6352: Aerospace Systems Design Two. AE8113: Introduction to Concurrent Engineering was an introductory course addressing the basic principles of concurrent engineering (CE) or integrated product development (IPD). The design of a total system was not the objective of this course. The goal was to understand and define the 'up-front' customer requirements and their decomposition, and to determine the value objectives for a complex product, such as the high speed civil transport (HSCT). A generic CE methodology developed at Georgia Tech was used for this purpose. AE4353: Design for Life Cycle Cost addressed the basic economic issues for an HSCT using a robust design technique, Taguchi's parameter design optimization method (PDOM). An HSCT economic sensitivity assessment was conducted using a Taguchi PDOM approach to address the robustness of the basic HSCT design. AE4360: Introduction to CAE/CAD permitted students to develop and utilize CAE/CAD/CAM knowledge and skills using CATIA and CADAM as the basic geometric tools. AE6351: Aerospace Systems Design One focused on the conceptual design refinement of a baseline HSCT configuration as defined by Boeing, Douglas, and NASA in their system studies. It required the use of NASA's synthesis codes FLOPS and ACSYNT. A criterion called the productivity index (P.I.) was used to evaluate disciplinary sensitivities and provide refinements of the baseline HSCT configuration. AE6352: Aerospace Systems Design Two was a continuation of Aerospace Systems Design One in which wing concepts were researched and analyzed in more detail. FLOPS and ACSYNT were again used at the system level while other off-the-shelf computer codes were used for more detailed wing disciplinary analysis and optimization. The culmination of all efforts and submission of this report conclude the first year's efforts of Georgia Tech's NASA USRA ADP. It will hopefully provide the foundation for next year's efforts concerning continuous improvement of integrated design and manufacturing for the HSCT.
Concurrent Engineering for the Management of Research and Development
NASA Technical Reports Server (NTRS)
DelRosario, Ruben; Petersen, Paul F.; Keys, L. Ken; Chen, Injazz J.
2004-01-01
The Management of Research and Development (R&D) is facing the challenges of reducing time from R&D to customer, reducing the cost of R&D, having higher accountability for results (improved quality), and increasing focus on customers. Concurrent engineering (CE) has shown great success in the automotive and technology industries resulting in significant decreases in cycle time, reduction of total cost, and increases in quality and reliability. This philosophy of concurrency can have similar implications or benefits for the management of R&D organizations. Since most studies on the application of CE have been performed in manufacturing environments, research into the benefits of CE into other environments is needed. This paper presents research conducted at the NASA Glenn Research Center (GRC) investigating the application of CE in the management of an R&D organization. In particular the paper emphasizes possible barriers and enhancers that this environment presents to the successful implementation of CE. Preliminary results and recommendations are based on a series of interviews and subsequent surveys, from which data has been gathered and analyzed as part of the GRC's Continuous Improvement Process.
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determining the concurrent validity of computer-graphic models is real-time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Qin, Detao; Liu, Zhaoyang; Bai, Hongwei; Sun, Darren Delai; Song, Xiaoxiao
2016-01-01
Surfactant stabilized oil-in-water nanoemulsions pose a severe threat to both the environment and human health. Recent development of membrane filtration technology has enabled efficient oil removal from oil/water nanoemulsion; however, the concurrent removal of surfactant and oil remains unsolved because the existing filtration membranes still suffer from low surfactant removal rate and serious surfactant-induced fouling issues. In this study, to realize the concurrent removal of surfactant and oil from nanoemulsion, a novel hierarchically-structured membrane is designed with a nanostructured selective layer on top of a microstructured support layer. The physical and chemical properties of the overall membrane, including wettability, surface roughness, electric charge, thickness and structures, are delicately tailored through a nano-engineered fabrication process, that is, graphene oxide (GO) nanosheet assisted phase inversion coupled with surface functionalization. Compared with the membrane fabricated by conventional phase inversion, this novel membrane has four times higher water flux, significantly higher rejections of both oil (~99.9%) and surfactant (as high as 93.5%), and a two-thirds lower fouling ratio when treating surfactant stabilized oil-in-water nanoemulsion. Due to its excellent performance and facile fabrication process, this nano-engineered membrane is expected to have wide practical applications in the oil/water separation fields of environmental protection and water purification. PMID:27087362
Concurrency-based approaches to parallel programming
NASA Technical Reports Server (NTRS)
Kale, L.V.; Chrisochoides, N.; Kohl, J.; Yelick, K.
1995-01-01
The inevitable transition to parallel programming can be facilitated by appropriate tools, including languages and libraries. After describing the needs of applications developers, this paper presents three specific approaches aimed at development of efficient and reusable parallel software for irregular and dynamic-structured problems. A salient feature of all three approaches is their exploitation of concurrency within a processor. Benefits of individual approaches such as these can be leveraged by an interoperability environment which permits modules written using different approaches to co-exist in single applications.
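The "concurrency within a processor" that all three approaches exploit is typified by message-driven execution, in which a scheduler interleaves the pending messages of many objects instead of blocking on any one of them. A minimal sketch follows; the class and method names are illustrative inventions, not any particular library's API:

```python
from collections import deque

class Chare:
    """A message-driven object: it does work only when a message arrives."""
    def __init__(self, name):
        self.name = name
        self.log = []

    def recv(self, msg, scheduler):
        self.log.append(msg)
        if msg.startswith("ping"):
            # An entry method may send further messages rather than block.
            scheduler.send(self, "pong:" + self.name)

class Scheduler:
    """Single-processor loop that interleaves all objects' pending messages."""
    def __init__(self):
        self.queue = deque()

    def send(self, target, msg):
        self.queue.append((target, msg))

    def run(self):
        while self.queue:
            target, msg = self.queue.popleft()
            target.recv(msg, self)

sched = Scheduler()
a, b = Chare("a"), Chare("b")
sched.send(a, "ping")
sched.send(b, "ping")  # b's message is not delayed behind a's follow-up work
sched.run()
```

Because no object ever waits on another, useful work from one module can fill the idle time of another, which is the intra-processor concurrency the abstract refers to.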
Multidisciplinary optimization for engineering systems - Achievements and potential
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
The currently common sequential design process for engineering systems is likely to lead to suboptimal designs. Recently developed decomposition methods offer an alternative for coming closer to optimum by breaking the large task of system optimization into smaller, concurrently executed and, yet, coupled tasks, identified with engineering disciplines or subsystems. The hierarchic and non-hierarchic decompositions are discussed and illustrated by examples. An organization of a design process centered on the non-hierarchic decomposition is proposed.
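The decomposition idea can be illustrated in miniature: two coupled "disciplines" are iterated Jacobi-style, so that each could execute concurrently, underneath a system-level search over the design variable. All functions and coefficients below are invented for illustration:

```python
def aero(x, y_struct):
    """Toy 'discipline 1' analysis: depends on the design variable and on a
    coupling variable produced by the other discipline."""
    return 0.5 * x + 0.2 * y_struct

def struct_(x, y_aero):
    """Toy 'discipline 2' analysis."""
    return 1.0 - 0.1 * x + 0.3 * y_aero

def coupled_analysis(x, tol=1e-10, max_iter=100):
    """Jacobi-style iteration: both disciplines consume the previous
    iteration's coupling values, so they could run concurrently."""
    y1 = y2 = 0.0
    for _ in range(max_iter):
        y1_new, y2_new = aero(x, y2), struct_(x, y1)
        if abs(y1_new - y1) + abs(y2_new - y2) < tol:
            break
        y1, y2 = y1_new, y2_new
    return y1, y2

def objective(x):
    y1, y2 = coupled_analysis(x)
    return (y1 - 1.0) ** 2 + 0.1 * y2  # toy system-level objective

# System-level search over the design variable (a coarse grid stands in
# for a real optimizer)
best_x = min((x / 100 for x in range(0, 301)), key=objective)
```

The point of the decomposition is visible even here: the two disciplinary evaluations inside each iteration are independent of one another, while the coupling is reconciled by the iteration and the system objective is handled above both.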
Controlled ecological life-support system - Use of plants for human life-support in space
NASA Technical Reports Server (NTRS)
Chamberland, D.; Knott, W. M.; Sager, J. C.; Wheeler, R.
1992-01-01
Scientists and engineers within NASA are conducting research which will lead to development of advanced life-support systems that utilize higher plants in a unique approach to solving long-term life-support problems in space. This biological solution to life-support, Controlled Ecological Life-Support System (CELSS), is a complex, extensively controlled, bioengineered system that relies on plants to provide the principal elements from gas exchange and food production to potable water reclamation. Research at John F. Kennedy Space Center (KSC) is proceeding with a comprehensive investigation of the individual parts of the CELSS system at a one-person scale in an approach called the Breadboard Project. Concurrently a relatively new NASA sponsored research effort is investigating plant growth and metabolism in microgravity, innovative hydroponic nutrient delivery systems, and use of highly efficient light emitting diodes for artificial plant illumination.
Development of a forestry government agency enterprise GIS system: a disconnected editing approach
NASA Astrophysics Data System (ADS)
Zhu, Jin; Barber, Brad L.
2008-10-01
The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This Enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters, since most field offices operate with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically uploaded into a central master geodatabase stored in a Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work off-line when editing data and requires connecting to the central geodatabase only when needed.
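The disconnected-editing workflow described above amounts to accumulating edits locally and reconciling them with the central geodatabase during a periodic check-in. A minimal sketch, assuming a simple last-writer-wins policy and a shared logical clock (both simplifications: ArcSDE's actual versioned reconciliation is considerably richer, and disconnected offices cannot share a clock):

```python
import itertools

class FieldOffice:
    """Local geodatabase: edits accumulate offline as (feature_id, value, seq)."""
    def __init__(self, clock):
        self.clock = clock      # shared logical clock, orders edits across offices
        self.pending = []

    def edit(self, feature_id, value):
        self.pending.append((feature_id, value, next(self.clock)))

    def check_in(self, central):
        """Periodic upload: reconcile pending edits, then clear the local log."""
        for fid, value, seq in self.pending:
            if fid not in central or central[fid][1] < seq:
                central[fid] = (value, seq)  # last-writer-wins (an assumption)
        self.pending.clear()

clock = itertools.count()
central = {}
office_a, office_b = FieldOffice(clock), FieldOffice(clock)
office_a.edit("oak_17", "treated")    # edited offline at office A
office_b.edit("oak_17", "monitored")  # later edit, offline at office B
office_a.check_in(central)
office_b.check_in(central)
```

The feature name "oak_17" and the win-by-sequence rule are hypothetical; the structure they illustrate, work offline, connect only to check in, is the one the abstract describes.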
Concurrent Engineering through Product Data Standards
1991-05-01
standards, represents the power of a new industrial revolution. The role of the NIST National PDES Testbed, providing technical leadership and a testing-based foundation for the development of STEP, is described.
Fuel quantity modulation in pilot ignited engines
May, Andrew
2006-05-16
An engine system includes a first fuel regulator adapted to control an amount of a first fuel supplied to the engine, a second fuel regulator adapted to control an amount of a second fuel supplied to the engine concurrently with the first fuel being supplied to the engine, and a controller coupled to at least the second fuel regulator. The controller is adapted to determine the amount of the second fuel supplied to the engine in a relationship to the amount of the first fuel, such that the second fuel ignites the first fuel at a specified time during steady-state engine operation, and to determine the amount of the second fuel in a manner different from that steady-state relationship during transient engine operation.
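The claimed control policy can be sketched as a function of the main-fuel command and its previous value; the ratio, boost factor, and transient threshold below are purely illustrative numbers of my own, not values from the patent:

```python
def pilot_fuel_quantity(main_fuel, main_fuel_prev,
                        steady_ratio=0.05, transient_boost=1.5,
                        transient_threshold=0.10):
    """Sketch: in steady state the pilot (second) fuel tracks a fixed fraction
    of the main (first) fuel; a large step in the main-fuel command is treated
    as a transient, and the relationship is modified (here, enriched)."""
    steady_amount = steady_ratio * main_fuel
    relative_change = abs(main_fuel - main_fuel_prev) / max(main_fuel_prev, 1e-9)
    if relative_change > transient_threshold:
        return steady_amount * transient_boost
    return steady_amount

steady = pilot_fuel_quantity(100.0, 100.0)     # steady-state schedule
transient = pilot_fuel_quantity(130.0, 100.0)  # 30% step: modified schedule
```

The two branches mirror the two determinations the claim distinguishes: one relationship held in steady state, a different one applied in transients.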
Description of inpatient medication management using cognitive work analysis.
Pingenot, Alleene Anne; Shanteau, James; Sengstacke, L T C Daniel N
2009-01-01
The purpose of this article was to describe key elements of an inpatient medication system using the cognitive work analysis method of Rasmussen et al (Cognitive Systems Engineering. Wiley Series in Systems Engineering; 1994). The work of nurses and physicians was observed in routine care of inpatients on a medical-surgical unit and attached ICU. Interaction with pharmacists was included. Preoperative, postoperative, and medical care was observed. Personnel were interviewed to obtain information not easily observable during routine work. Communication between healthcare workers was projected onto an abstraction/decomposition hierarchy. Decision ladders and information flow charts were developed. Results suggest that decision making on an inpatient medical/surgical unit or ICU setting is a parallel, distributed process. Personnel are highly mobile and often are working on multiple issues concurrently. In this setting, communication is key to maintaining organization and synchronization for effective care. Implications for research approaches to system and interface designs and decision support for personnel involved in the process are discussed.
Multi-Organization Multi-Discipline Effort Developing a Mitigation Concept for Planetary Defense
NASA Technical Reports Server (NTRS)
Leung, Ronald Y.; Barbee, Brent W.; Seery, Bernard D.; Bambacus, Myra; Finewood, Lee; Greenaugh, Kevin C.; Lewis, Anthony; Dearborn, David; Miller, Paul L.; Weaver, Robert P.;
2017-01-01
There have been significant recent efforts in addressing mitigation approaches to neutralize Potentially Hazardous Asteroids (PHA). One such research effort was performed in 2015 by an integrated, inter-disciplinary team of asteroid scientists, energy deposition modeling scientists, payload engineers, orbital dynamics engineers, spacecraft discipline engineers, and a systems architecture engineer from NASA's Goddard Space Flight Center (GSFC) and the Department of Energy (DoE) National Nuclear Security Administration (NNSA) laboratories (Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories). The study team collaborated with GSFC's Integrated Design Center's Mission Design Lab (MDL), which engaged a team of GSFC flight hardware discipline engineers to work with GSFC, LANL, and LLNL NEA-related subject matter experts during a one-week intensive concept formulation study in an integrated concurrent engineering environment. This team has analyzed the first of several distinct study cases for a multi-year NASA research grant. This Case 1 study references the Near-Earth Asteroid (NEA) named Bennu as the notional target due to the availability of a very detailed Design Reference Asteroid (DRA) model for its orbit and physical characteristics (courtesy of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) mission team). The research involved the formulation and optimization of spacecraft trajectories to intercept Bennu, overall mission and architecture concepts, and high-fidelity modeling of both kinetic impact (spacecraft collision to change a NEA's momentum and orbit) and nuclear detonation effects on Bennu, for purposes of deflecting Bennu.
Monogamy relations of concurrence for any dimensional quantum systems
NASA Astrophysics Data System (ADS)
Zhu, Xue-Na; Li-Jost, Xianqing; Fei, Shao-Ming
2017-11-01
We study monogamy relations for arbitrary dimensional multipartite systems. Monogamy relations based on concurrence and concurrence of assistance for arbitrary dimensional m_1 ⊗ m_2 ⊗ ... ⊗ m_N quantum states are derived, which give rise to restrictions on the entanglement distributions among the subsystems. In addition, we give a lower bound of concurrence for four-partite mixed states. The approach can be readily generalized to arbitrary multipartite systems.
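For reference, in the two-qubit case the monogamy relation for squared concurrence (the Coffman-Kundu-Wootters inequality) takes the form below; results of the kind summarized above extend such bounds to arbitrary dimensions:

```latex
C^2_{A|BC} \;\ge\; C^2_{AB} + C^2_{AC},
\qquad
C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
```

where the $\lambda_i$ are the square roots, in decreasing order, of the eigenvalues of $\rho\,(\sigma_y\otimes\sigma_y)\,\rho^{*}\,(\sigma_y\otimes\sigma_y)$.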
Computer Sciences and Data Systems, volume 1
NASA Technical Reports Server (NTRS)
1987-01-01
Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.
Designing Microstructures/Structures for Desired Functional Material and Local Fields
2015-12-02
utilized to engineer multifunctional soft materials for multi-sensing, multi-actuating, human-machine interfaces. [3] Established a theoretical framework ... model for surface elasticity, (ii) derived a new type of Maxwell stress in soft materials due to quantum mechanical-elasticity coupling and ... elucidated its ramification in engineering multifunctional soft materials, and (iii) demonstrated the possibility of concurrent magnetoelectricity and
1997-12-01
Fracture Analysis of the F-5, 15%-Spar Bolt, Dr. Devendra Kumar, SAALC/LD, CUNY-City College, New York, NY. A Simple, Multiversion Concurrency Control ... Program, University of Dayton, Dayton, OH. [3] AFGROW, Air Force Crack Propagation Analysis Program, Version 3.82 (1997). A Simple, Multiversion Concurrency Control ... Air Force Office of Scientific Research, Bolling Air Force Base, DC, and San Antonio Air Logistics Center, August 1997.
A systems-based approach for integrated design of materials, products and design process chains
NASA Astrophysics Data System (ADS)
Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh
2007-12-01
The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. 
Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
Advanced main combustion chamber program
NASA Technical Reports Server (NTRS)
1991-01-01
The topics presented are covered in viewgraph form and include the following: use of low-cost investment castings; use of SSME program experience; use of MSFC personnel for the design effort; and use of concurrent engineering techniques.
Concurrent engineering design and management knowledge capture
NASA Technical Reports Server (NTRS)
1990-01-01
The topics are presented in viewgraph form and include the following: real-time management, personnel management, project management, conceptual design and decision making; the SITRF design problem; and the electronic-design notebook.
Lessons learned for composite structures
NASA Technical Reports Server (NTRS)
Whitehead, R. S.
1991-01-01
Lessons learned for composite structures are presented in three technology areas: materials, manufacturing, and design. In addition, future challenges for composite structures are presented. Composite materials have long gestation periods from the developmental stage to fully matured production status. Many examples exist of unsuccessful attempts to accelerate this gestation period. Experience has shown that technology transition of a new material system to fully matured production status is time consuming, involves risk, is expensive and should not be undertaken lightly. The future challenges for composite materials require an intensification of the science based approach to material development, extension of the vendor/customer interaction process to include all engineering disciplines of the end user, reduced material costs because they are a significant factor in overall part cost, and improved batch-to-batch pre-preg physical property control. Historical manufacturing lessons learned are presented using current in-service production structure as examples. Most producibility problems for these structures can be traced to their sequential engineering design. This caused an excessive emphasis on design-to-weight and schedule at the expense of design-to-cost. This resulted in expensive performance originated designs, which required costly tooling and led to non-producible parts. Historically these problems have been allowed to persist throughout the production run. The current/future approach for the production of affordable composite structures mandates concurrent engineering design where equal emphasis is placed on product and process design. Design for simplified assembly is also emphasized, since assembly costs account for a major portion of total airframe costs. The future challenge for composite manufacturing is, therefore, to utilize concurrent engineering in conjunction with automated manufacturing techniques to build affordable composite structures. 
Composite design experience has shown that significant weight savings have been achieved, outstanding fatigue and corrosion resistance have been demonstrated, and in-service performance has been very successful. Currently no structural design show stoppers exist for composite structures. A major lesson learned is that the full scale static test is the key test for composites, since it is the primary structural 'hot spot' indicator. The major durability issue is supportability of thin skinned structure. Impact damage has been identified as the most significant issue for the damage tolerance control of composite structures. However, delaminations induced during assembly operations have demonstrated a significant nuisance value. The future challenges for composite structures are threefold. Firstly, composite airframe weight fraction should increase to 60 percent. At the same time, the cost of composite structures must be reduced by 50 percent to attain the goal of affordability. To support these challenges it is essential to develop lower cost materials and processes.
Computational Infrastructure for Engine Structural Performance Simulation
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1997-01-01
Select computer codes developed over the years to simulate specific aspects of engine structures are described. These codes include blade impact integrated multidisciplinary analysis and optimization, progressive structural fracture, quantification of uncertainties for structural reliability and risk, benefits estimation of new technology insertion, and hierarchical simulation of engine structures made from metal matrix and ceramic matrix composites. Collectively, these codes constitute a unique infrastructure ready to credibly evaluate new and future engine structural concepts throughout the development cycle from initial concept, to design and fabrication, to service performance and maintenance and repairs, and to retirement for cause and even to possible recycling. Stated differently, they provide 'virtual' concurrent engineering for the total life-cycle cost of engine structures.
Simulink-Based Simulation Architecture for Evaluating Controls for Aerospace Vehicles (SAREC-ASV)
NASA Technical Reports Server (NTRS)
Christhilf, David M.; Bacon, Barton J.
2006-01-01
The Simulation Architecture for Evaluating Controls for Aerospace Vehicles (SAREC-ASV) is a Simulink-based approach to providing an engineering quality desktop simulation capability for finding trim solutions, extracting linear models for vehicle analysis and control law development, and generating open-loop and closed-loop time history responses for control system evaluation. It represents a useful level of maturity rather than a finished product. The layout is hierarchical and supports concurrent component development and validation, with support from the Concurrent Versions System (CVS) software management tool. Real Time Workshop (RTW) is used to generate pre-compiled code for substantial component modules, and templates permit switching seamlessly between original Simulink and code compiled for various platforms. Two previous limitations are addressed. Turn around time for incorporating tabular model components was improved through auto-generation of required Simulink diagrams based on data received in XML format. The layout was modified to exploit a Simulink "compile once, evaluate multiple times" capability for zero elapsed time for use in trimming and linearizing. Trim is achieved through a Graphical User Interface (GUI) with a narrow, script definable interface to the vehicle model which facilitates incorporating new models.
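Trim solving of the kind the GUI automates reduces to driving the vehicle's state derivatives to zero with respect to the free trim variables. A one-variable Newton iteration with a finite-difference derivative gives the flavor; the dynamics and coefficients below are invented for illustration, not taken from SAREC-ASV:

```python
import math

def newton_trim(residual, u0, tol=1e-10, max_iter=50, h=1e-6):
    """Drive a scalar trim residual (a state derivative) to zero by Newton
    iteration with a finite-difference derivative; a toy stand-in for the
    script-driven trim loop wrapped around a vehicle simulation."""
    u = u0
    for _ in range(max_iter):
        f = residual(u)
        if abs(f) < tol:
            break
        dfdu = (residual(u + h) - f) / h
        u -= f / dfdu
    return u

# Toy pitch-rate dynamics: qdot(delta_e) at a fixed angle of attack
# (all coefficients invented for illustration)
alpha = math.radians(4.0)
qdot = lambda delta_e: -8.0 * alpha - 12.0 * delta_e + 0.5 * delta_e ** 2
delta_trim = newton_trim(qdot, u0=0.0)
```

In the real architecture the "compile once, evaluate multiple times" capability matters precisely because a loop like this calls the vehicle model many times per trim solution.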
A Response Surface Methodology for Bi-Level Integrated System Synthesis (BLISS)
NASA Technical Reports Server (NTRS)
Altus, Troy David; Sobieski, Jaroslaw (Technical Monitor)
2002-01-01
The report describes a new method for optimization of engineering systems, such as aerospace vehicles, whose design must harmonize a number of subsystems and various physical phenomena, each represented by a separate computer code, e.g., aerodynamics, structures, propulsion, performance, etc. To represent the system's internal couplings, the codes receive output from other codes as part of their inputs. The system analysis and optimization task is decomposed into subtasks that can be executed concurrently, each subtask conducted using local state and design variables and holding constant a set of the system-level design variables. The subtask results are stored in the form of Response Surfaces (RS) fitted in the space of the system-level variables, to be used as subtask surrogates in a system-level optimization whose purpose is to optimize the system objective(s) and to reconcile the system's internal couplings. By virtue of decomposition and execution concurrency, the method enables a broad work front in the organization of an engineering project involving a number of specialty groups that might be geographically dispersed, and it exploits the contemporary computing technology of massively concurrent and distributed processing. The report includes a demonstration test case of a supersonic business jet design.
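The surrogate idea at the heart of this method can be sketched in a few lines. The subtask function, sample points, and grid below are hypothetical stand-ins, not the actual BLISS codes: each "subtask run" is an independent evaluation at a fixed system-level variable (and so could execute concurrently), and the system-level optimizer then queries only the fitted response surface.

```python
def subtask_optimum(z):
    # Stand-in for an expensive discipline-level optimization executed
    # at a fixed system-level variable z; returns the optimized local
    # objective. Hypothetical toy function, minimum at z = 2.
    return (z - 2.0) ** 2 + 1.0

def surrogate(z, pts):
    # Quadratic response surface through three (z, f) samples,
    # evaluated by Lagrange interpolation.
    total = 0.0
    for i, (xi, yi) in enumerate(pts):
        li = 1.0
        for j, (xj, _) in enumerate(pts):
            if i != j:
                li *= (z - xj) / (xi - xj)
        total += yi * li
    return total

# Sample the system-level design space; these runs are independent of
# one another and could execute concurrently on separate machines.
pts = [(z, subtask_optimum(z)) for z in (0.0, 2.0, 4.0)]

# The system-level optimization queries the cheap surrogate, never the
# expensive subtask itself.
z_grid = [i * 0.01 for i in range(401)]
z_star = min(z_grid, key=lambda z: surrogate(z, pts))
print(round(z_star, 2))  # recovers the subtask optimum at z = 2.0
```

A real application would fit surfaces over many system-level variables at once and refit as the optimizer moves through the design space.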
Artificial concurrent catalytic processes involving enzymes.
Köhler, Valentin; Turner, Nicholas J
2015-01-11
The concurrent operation of multiple catalysts can lead to enhanced reaction features, including (i) simultaneous linear multi-step transformations in a single reaction flask, (ii) control of intermediate equilibria, (iii) stereoconvergent transformations, and (iv) rapid processing of labile reaction products. Enzymes occupy a prominent position for the development of such processes, due to their high potential compatibility with other biocatalysts. Genes for different enzymes can be co-expressed to reconstruct natural or construct artificial pathways and applied in the form of engineered whole-cell biocatalysts to carry out complex transformations; alternatively, the enzymes can be combined in vitro after isolation. Moreover, enzyme variants provide a wider substrate scope for a given reaction and often display altered selectivities and specificities. Man-made transition metal catalysts and engineered or artificial metalloenzymes also widen the range of reactivities and catalysed reactions that are potentially employable. Cascades for simultaneous cofactor or co-substrate regeneration or co-product removal are now firmly established, whereas many applications of more ambitious concurrent cascade catalysis are only just beginning to appear in the literature. The current review presents some of the most recent examples, with an emphasis on the combination of transition metal catalysis with enzymatic catalysis, and aims to encourage researchers to contribute to this emerging field.
Working on the Boundaries: Philosophies and Practices of the Design Process
NASA Technical Reports Server (NTRS)
Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.
1996-01-01
While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting system and discipline analyses and integrations, and illustrates the process application in experienced aerostructural designs.
NASA Astrophysics Data System (ADS)
Mitchell, K. L.; Lowes, L. L.; Budney, C. J.; Sohus, A.
2014-12-01
NASA's Planetary Science Summer School (PSSS) is an intensive program for postdocs and advanced graduate students in science and engineering fields with a keen interest in planetary exploration. The goal is to train the next generation of planetary science mission leaders in a hands-on environment involving a wide range of engineers and scientists. It was established in 1989, and has undergone several incarnations. Initially a series of seminars, it became a more formal mission design experience in 1999. Admission is competitive, with participants given financial support. The competitively selected trainees develop an early mission concept study in teams of 15-17, responsive to a typical NASA Science Mission Directorate Announcement of Opportunity. They select the mission concept from options presented by the course sponsors, based on high-priority missions as defined by the Decadal Survey, prepare a presentation for a proposal authorization review, present it to a senior review board, and receive critical feedback. Each participant assumes multiple roles on science, instrument, and project teams. Trainees develop an understanding of top-level science requirements and instrument priorities in advance through a series of reading assignments and webinars. Then, during the five-day session at the Jet Propulsion Laboratory, they work closely with concurrent engineers, including JPL's Advanced Projects Design Team ("Team X"), a cross-functional multidisciplinary team of engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis, and evaluation of mission concept designs. All are mentored and assisted directly by Team X members and course tutors in their assigned project roles. There is a strong emphasis on making difficult trades, simulating a real mission design process as accurately as possible. The process is intense and at times dramatic, with fast-paced design sessions and late evening study sessions.
A survey of PSSS alumni administered in 2013 provides information on the program's impact on trainees' career choices and leadership roles as they pursue their employment in planetary science and related fields. Results will be presented during the session, along with highlights of topics and missions covered since the program's inception.
NASA Astrophysics Data System (ADS)
Balaji, V.; Benson, Rusty; Wyman, Bruce; Held, Isaac
2016-10-01
Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by. We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath. We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models. We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. 
Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
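The coarse-grained component concurrency (CCC) scheme described above can be sketched minimally, assuming a toy single-variable atmospheric state and made-up tendency functions (this is not FMS code): both components read the same start-of-step state and run in parallel, and their tendencies are summed afterwards.

```python
from concurrent.futures import ThreadPoolExecutor

def radiation(state):
    # Stand-in for the radiative-transfer component: computes a heating
    # tendency from the state as it was at the start of the step.
    return 0.1 * state["T"]

def other_physics_and_dynamics(state):
    # Composite component: atmospheric dynamics plus all remaining
    # physics, lumped into one tendency for this sketch.
    return -0.02 * state["T"] + 1.0

def step(state, dt=1.0):
    # Coarse-grained component concurrency: both components receive the
    # identical start-of-step state and execute in parallel; their
    # tendencies are combined afterwards, tolerating the one-step-lagged
    # coupling this implies.
    with ThreadPoolExecutor(max_workers=2) as pool:
        rad = pool.submit(radiation, state)
        rest = pool.submit(other_physics_and_dynamics, state)
        tendency = rad.result() + rest.result()
    return {"T": state["T"] + dt * tendency}

state = {"T": 250.0}
state = step(state)
print(state["T"])
```

The scalability benefit comes from the two submits occupying separate processor sets; the algorithmic cost, as the abstract notes, is that radiation sees a state that is one step stale relative to a sequential update.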
Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities
NASA Astrophysics Data System (ADS)
Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi
2017-04-01
Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years. Recently, research has increasingly focused on the development of integrated solutions: collaborative optimisation of geographical, just-in-time (JIT), quality (customer demand/satisfaction) and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as enterprise resource planning (ERP) and SCM information technology (IT) enablers to enhance enterprise integrated product development/concurrent engineering principles. This article draws on three main organisation theory applications in positioning its assumptions, proposing a feasible industry-specific framework not currently included within the SCOR model's level-four implementation level, nor in other existing SCM integration reference models such as the MIT Process Handbook's Process Interchange Format (PIF) and the TOVE project, and one that could be replicated in other supply chains. The wider contribution of this paper, however, is a complementary framework proposed for the SCC's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to the main data collected from a qualitative empirical real-life industrial pilot case study, were used to propose a conceptual concurrent enterprise framework for SCM network activities. This research adopts a design structure matrix simulation approach to propose an optimal enterprise SCM-networked value-adding, customised master data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design team structure.
Furthermore, social network theory analysis is employed in a triangulation approach with statistical correlation analysis to assess the frequency, importance, level of collaboration, and mutual trust, as well as the roles and responsibilities, within the enterprise SCM network's systems product development (PD) design teams' technical communication network, complemented by extensive literature reviews.
ERIC Educational Resources Information Center
Arce-Ferrer, Alvaro J.; Bulut, Okan
2017-01-01
This study examines separate and concurrent approaches to combine the detection of item parameter drift (IPD) and the estimation of scale transformation coefficients in the context of the common item nonequivalent groups design with the three-parameter item response theory equating. The study uses real and synthetic data sets to compare the two…
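The scale transformation coefficients this abstract refers to can be illustrated with the mean-sigma method, one standard choice for placing common-item difficulty (b) parameters from one form onto the scale of another. The numbers below are synthetic; an operational separate-approach equating would additionally screen common items for IPD and re-estimate the coefficients after removing drifting items.

```python
import statistics as st

def mean_sigma(b_x, b_y):
    # Mean-sigma scale transformation from common-item b-parameters:
    # rescale Form X difficulties onto the Form Y scale via
    # b* = A * b_x + B, with A the ratio of standard deviations and B
    # chosen so the means match. A minimal illustrative version.
    A = st.pstdev(b_y) / st.pstdev(b_x)
    B = st.mean(b_y) - A * st.mean(b_x)
    return A, B

# Synthetic common-item difficulties where Form Y is an exact linear
# rescaling of Form X (A = 1.2, B = 0.5), so the method should recover
# the coefficients exactly.
b_x = [-1.0, -0.5, 0.0, 0.5, 1.0]
b_y = [1.2 * b + 0.5 for b in b_x]
A, B = mean_sigma(b_x, b_y)
print(round(A, 3), round(B, 3))  # → 1.2 0.5
```

In the concurrent approach, by contrast, both forms are calibrated together in a single estimation run and no explicit (A, B) step is needed, which is precisely the design contrast the study examines.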
ERIC Educational Resources Information Center
Koskey, Kristin L. K.; Stewart, Victoria C.
2014-01-01
This small "n" observational study used a concurrent mixed methods approach to address a void in the literature with regard to the qualitative meaningfulness of the data yielded by absolute magnitude estimation scaling (MES) used to rate subjective stimuli. We investigated whether respondents' scales progressed from less to more and…
Implementing model-based system engineering for the whole lifecycle of a spacecraft
NASA Astrophysics Data System (ADS)
Fischer, P. M.; Lüdtke, D.; Lange, C.; Roshani, F.-C.; Dannemann, F.; Gerndt, A.
2017-09-01
Design information of a spacecraft is collected over all phases in the lifecycle of a project. Much of this information is exchanged between different engineering tasks and business processes. In some lifecycle phases, model-based system engineering (MBSE) has introduced system models and databases that help to organize such information and to keep it consistent for everyone. Nevertheless, none of the existing databases has yet addressed the whole lifecycle. Virtual Satellite is the MBSE database developed at DLR. It has been used for quite some time in Phase A studies and is currently being extended to cover the whole lifecycle of spacecraft projects. Since it is unforeseeable which future use cases such a database will need to support across all these different projects, the underlying data model has to provide tailoring and extension mechanisms for its conceptual data model (CDM). This paper explains the mechanisms as they are implemented in Virtual Satellite, which enable extending the CDM along the project without corrupting already stored information. As an upcoming major use case, Virtual Satellite will be implemented as the MBSE tool in the S2TEP project. This project provides a new satellite bus for internal research and several different payload missions in the future. This paper explains how Virtual Satellite will be used to manage configuration control problems associated with such a multi-mission platform. It discusses how the S2TEP project starts using the software for collecting the first design information from concurrent engineering studies, then makes use of the extension mechanisms of the CDM to introduce further information artefacts such as the functional electrical architecture, thus linking more and more processes into an integrated MBSE approach.
NASA Technical Reports Server (NTRS)
Vosteen, Louis F.; Hadcock, Richard N.
1994-01-01
A study of past composite aircraft structures programs was conducted to determine the lessons learned during the programs. The study focused on finding major underlying principles and practices that experience showed have significant effects on the development process and should be recognized and understood by those responsible for the use of composites. Published information on programs was reviewed and interviews were conducted with personnel associated with current and past major development programs. In all, interviews were conducted with about 56 people representing 32 organizations. Most of the people interviewed have been involved in the engineering and manufacturing development of composites for the past 20 to 25 years. Although composites technology has made great advances over the past 30 years, the effective application of composites to aircraft is still a complex problem that requires experienced personnel with special knowledge. All disciplines involved in the development process must work together in real time to minimize risk and assure total product quality and performance at acceptable costs. The most successful programs have made effective use of integrated, collocated, concurrent engineering teams, and most often used well-planned, systematic development efforts wherein the design and manufacturing processes are validated in a step-by-step or 'building block' approach. Such approaches reduce program risk and are cost effective.
Concurrent engineering research center
NASA Technical Reports Server (NTRS)
Callahan, John R.
1995-01-01
The projects undertaken by the Concurrent Engineering Research Center (CERC) at West Virginia University are reported and summarized. CERC's participation in a Defense Advanced Research Projects Agency (DARPA) effort relating to technology needed to improve the product development process is described, particularly in the area of advanced weapon systems. The efforts committed to improving collaboration among diverse and distributed health care providers are reported, along with the research activities for NASA in Independent Software Verification and Validation. CERC also takes part in the electronic respirator certification initiated by the National Institute for Occupational Safety and Health, as well as in efforts to find a solution to the problem of producing environment-friendly end-products for product developers worldwide. The 3M Fiber Metal Matrix Composite Model Factory Program is discussed. CERC technologies, facilities, and personnel-related issues are described, along with its library and technical services and recent publications.
NREL Advancements in Methane Conversion Lead to Cleaner Air, Useful Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-06-01
Researchers at NREL leveraged the recent on-site development of gas fermentation capabilities and novel genetic tools to directly convert methane to lactic acid using an engineered methanotrophic bacterium. The results provide proof-of-concept data for a gas-to-liquids bioprocess that concurrently produces fuels and chemicals from methane. NREL researchers developed genetic tools to express heterologous genes in methanotrophic organisms, which have historically been difficult to genetically engineer. Using these tools, researchers demonstrated microbial conversion of methane to lactate, a high-volume biochemical precursor predominantly utilized for the production of bioplastics. Methane biocatalysis offers a means to concurrently liquefy and upgrade natural gas and renewable biogas, enabling their utilization in conventional transportation and industrial manufacturing infrastructure. Producing chemicals and fuels from methane expands the suite of products currently generated from biorefineries, municipalities, and agricultural operations, with the potential to increase revenue and significantly reduce greenhouse gas emissions.
Integration of Design, Thermal, Structural, and Optical Analysis, Including Thermal Animation
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.
1993-01-01
In many industries there has recently been a concerted movement toward 'quality management' and the issue of how to accomplish work more efficiently. Part of this effort is focused on concurrent engineering; the idea of integrating the design and analysis processes so that they are not separate, sequential processes (often involving design rework due to analytical findings) but instead form an integrated system with smooth transfers of information. Presented herein are several specific examples of concurrent engineering methods being carried out at Langley Research Center (LaRC): integration of thermal, structural and optical analyses to predict changes in optical performance based on thermal and structural effects; integration of the CAD design process with thermal and structural analyses; and integration of analysis and presentation by animating the thermal response of a system as an active color map -- a highly effective visual indication of heat flow.
History of the Fluids Engineering Division
Cooper, Paul; Martin, C. Samuel; O'Hern, Timothy J.
2016-08-03
The 90th Anniversary of the Fluids Engineering Division (FED) of ASME will be celebrated on July 10–14, 2016 in Washington, DC. The venue is ASME's Summer Heat Transfer Conference (SHTC), Fluids Engineering Division Summer Meeting (FEDSM), and International Conference on Nanochannels and Microchannels (ICNMM). The occasion is an opportune time to celebrate and reflect on the origin of FED and its predecessor—the Hydraulic Division (HYD), which existed from 1926–1963. Furthermore, the FED Executive Committee decided that it would be appropriate to publish concurrently a history of the HYD/FED.
NASA Technical Reports Server (NTRS)
Toelle, Ronald (Compiler)
1995-01-01
A launch vehicle concept to deliver 20,000 lb of payload to a 100-nmi orbit has been defined. A new liquid oxygen/kerosene booster powered by an RD-180 engine was designed, using a slightly modified Centaur upper stage. The design, development, and test program met the imposed 40-month schedule by eliminating major structural testing through increased factors of safety and by applying concurrent engineering concepts. A growth path to attain 65,000 lb of payload is developed.
A Concurrent Distributed System for Aircraft Tactical Decision Generation
NASA Technical Reports Server (NTRS)
McManus, John W.
1990-01-01
A research program investigating the use of artificial intelligence (AI) techniques to aid in the development of a Tactical Decision Generator (TDG) for Within Visual Range (WVR) air combat engagements is discussed. The application of AI programming and problem solving methods in the development and implementation of a concurrent version of the Computerized Logic For Air-to-Air Warfare Simulations (CLAWS) program, a second generation TDG, is presented. Concurrent computing environments and programming approaches are discussed and the design and performance of a prototype concurrent TDG system are presented.
Addressing Research Design Problem in Mixed Methods Research
NASA Astrophysics Data System (ADS)
Alavi, Hamed; Hąbek, Patrycja
2016-03-01
Alongside other disciplines in the social sciences, management researchers increasingly use mixed methods research in their scientific investigations. The mixed methods approach can also be used in the field of production engineering. Compared with traditional quantitative and qualitative research methods, the increasing popularity of mixed methods research in management science can be traced to several factors. First, it can be theoretically related to any particular management discipline. Second, its concurrent approach to inductive and deductive research logic allows researchers to generate theory and test hypotheses in a single study simultaneously. In addition, it provides a better justification for the chosen method of investigation and higher validity for the answers obtained to the research questions. Despite the increasing popularity of mixed methods among management scholars, there is still a need for a comprehensive approach to research design typology and process in mixed methods research from the perspective of management science. In this paper, the authors explain the fundamental principles of mixed methods research, its typology, and the steps in its design process.
Melching, C.S.; Coupe, R.H.
1995-01-01
During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of the concurrent samples were compared to the precision of the laboratory or field method used.
The differences in concentration between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all constituents and properties except pH, and that differences between split samples were significant for all constituents and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
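The Wilcoxon signed ranks test applied to the paired agency analyses can be sketched in pure Python. The pH readings below are hypothetical, for illustration only; a production analysis would also compute a p-value from the statistic.

```python
def wilcoxon_signed_rank(x, y):
    # Paired Wilcoxon signed-ranks statistic: rank the absolute paired
    # differences, then compare the rank sums of positive and negative
    # differences. Differences are rounded to suppress float noise so
    # ties rank correctly; zero differences are dropped, and tied
    # |differences| receive average ranks.
    d = [round(a - b, 9) for a, b in zip(x, y)]
    d = [v for v in d if v != 0]
    ranked = sorted((abs(v), v > 0) for v in d)
    n = len(ranked)
    ranks = [0.0] * n
    i = 0
    while i < n:  # assign average ranks across runs of tied |d|
        j = i
        while j + 1 < n and ranked[j + 1][0] == ranked[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg_rank
        i = j + 1
    w_plus = sum(r for r, (_, pos) in zip(ranks, ranked) if pos)
    w_minus = n * (n + 1) / 2 - w_plus
    return min(w_plus, w_minus)

# Hypothetical paired pH readings (USGS vs. IEPA), illustration only.
usgs = [7.1, 7.3, 6.9, 7.0, 7.4, 7.2]
iepa = [7.0, 7.1, 6.9, 6.8, 7.2, 7.1]
print(wilcoxon_signed_rank(usgs, iepa))  # → 0.0 (one agency always reads higher)
```

A statistic of zero, as here, is the most extreme one-sided outcome: every nonzero difference has the same sign, which is the pattern that made the pH comparison stand out in the study.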
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are key techniques for addressing the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning on concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework, and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
NASA Astrophysics Data System (ADS)
Qi, Xianfei; Gao, Ting; Yan, Fengli
2017-01-01
Concurrence, as one of the entanglement measures, is a useful tool to characterize quantum entanglement in various quantum systems. However, computing the concurrence involves difficult optimizations, and an exact formula is known only for the two-qubit case. We investigate the concurrence of four-qubit quantum states and derive an analytical lower bound on the concurrence using the multiqubit monogamy inequality. It is shown that this lower bound improves the existing bounds. This approach can be generalized to arbitrary qubit systems. We present an exact formula of concurrence for some mixed quantum states. For even-qubit states, we derive an improved lower bound of concurrence using a monogamy equality for qubit systems. At the same time, we show that a multipartite state is k-nonseparable if the multipartite concurrence is larger than a constant related to the value of k, the qudit number, and the dimension of the subsystems. Our results can be applied to detect multipartite k-nonseparable states.
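The exact two-qubit formula alluded to above is Wootters' concurrence; for a pure state |psi> = a|00> + b|01> + c|10> + d|11> it reduces to C = 2|ad - bc| (the mixed-state case requires the spin-flip eigenvalue construction). A minimal check:

```python
import math

def concurrence_pure(a, b, c, d):
    # Wootters' concurrence for a pure two-qubit state
    # |psi> = a|00> + b|01> + c|10> + d|11>:  C = 2|a*d - b*c|.
    # C = 0 for product states, C = 1 for maximally entangled states.
    return 2.0 * abs(a * d - b * c)

s = 1.0 / math.sqrt(2.0)
print(round(concurrence_pure(s, 0.0, 0.0, s), 6))  # Bell state → 1.0
print(concurrence_pure(1.0, 0.0, 0.0, 0.0))        # product state → 0.0
```

The lower bounds discussed in the abstract are valuable precisely because no such closed form survives beyond two qubits.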
Chou, Eva; Liu, Jun; Seaworth, Cathleen; Furst, Meredith; Amato, Malena M; Blaydon, Sean M; Durairaj, Vikram D; Nakra, Tanuj; Shore, John W
To compare revision rates for ptosis surgery between posterior-approach and anterior-approach ptosis repair techniques, a retrospective, consecutive cohort study was conducted of all patients undergoing ptosis surgery at a high-volume oculofacial plastic surgery practice over a 4-year period. A retrospective chart review was conducted of all patients undergoing posterior-approach and anterior-approach ptosis surgery for all etiologies of ptosis between 2011 and 2014. Etiology of ptosis, concurrent oculofacial surgeries, revisions, and complications were analyzed. The main outcome measure was the ptosis revision rate. A total of 1519 patients were included in this study. The mean age was 63 ± 15.4 years. A total of 1056 (70%) of the patients were female, 1451 (95%) had involutional ptosis, and 1129 (74.3%) had concurrent upper blepharoplasty. Five hundred thirteen (33.8%) underwent posterior-approach ptosis repair, and 1006 (66.2%) underwent anterior-approach ptosis repair. The degree of ptosis was greater in the anterior-approach group. The overall revision rate for all patients was 8.7%. Of the posterior group, 6.8% required ptosis revision; of the anterior group, 9.5% required revision surgery. The main reason for ptosis revision surgery was undercorrection of one or both eyelids. Concurrent brow lifting was associated with a decreased, but not statistically significant, rate of revision surgery. Patients who underwent unilateral ptosis surgery had a 5.1% rate of Hering's phenomenon requiring ptosis repair in the contralateral eyelid. Multivariable logistic regression for predictive factors shows that, when adjusted for gender and concurrent blepharoplasty, the revision rate in anterior-approach ptosis surgery is higher than in posterior-approach ptosis surgery (odds ratio = 2.08; p = 0.002). The overall revision rate in patients undergoing ptosis repair via posterior-approach or anterior-approach techniques was 8.7%.
There is a statistically higher rate of revision with anterior-approach ptosis repair.
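For comparison with the reported adjusted odds ratio of 2.08, a crude (unadjusted) odds ratio can be recomputed from the published rates. The event counts below are reconstructed approximately from the percentages; the crude value differs from the adjusted one because the study's regression controlled for gender and concurrent blepharoplasty.

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    # Unadjusted odds ratio of an event in group A vs. group B from a
    # 2x2 table: (a / (n_a - a)) / (b / (n_b - b)).
    return (events_a / (n_a - events_a)) / (events_b / (n_b - events_b))

# Counts reconstructed approximately from the reported rates:
# anterior 9.5% of 1006 ≈ 96 revisions, posterior 6.8% of 513 ≈ 35.
print(round(odds_ratio(96, 1006, 35, 513), 2))  # → 1.44 (crude)
```

The gap between the crude 1.44 and the adjusted 2.08 illustrates why the authors adjusted: concurrent blepharoplasty and gender are unevenly distributed across the two approach groups.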
Concurrent Design used in the Design of Space Instruments
NASA Technical Reports Server (NTRS)
Oxnevad, Knut I.
1998-01-01
At the Project Design Center at the Jet Propulsion Laboratory, a concurrent design environment is under development for supporting the development and analysis of space instruments in the early, conceptual design phases. This environment is utilized by Team I, a multidisciplinary group of experts providing study and proposal support. To provide the required support, the Team I concurrent design environment features effectively interconnected high-end optics, CAD, and thermal design and analysis tools. Innovative approaches for linking tools and for transferring files between applications have been implemented. These approaches, together with effective sharing of geometry between the optics, CAD, and thermal tools, are already showing significant time savings.
Catalog of Training and Education Sources in Concurrent Engineering
1989-11-01
Air Force Engineering Research Initiation Grant Program
1994-06-21
MISFET Structures for High-Frequency Device Applications" RI-B-91-13 Prof. John W. Silvestro Clemson University "The Effect of Scattering by a Near...Synthesis Method for Concurrent Engineering Applications" RI-B-92-03 Prof. Steven H. Collicott Purdue University "An Experimental Study of the Effect of a ...beams is studied. The effect of interply delaminations on natural frequencies and mode shapes is evaluated analytically. A generalized variational
Wang, Gang; Yang, Luhan; Grishin, Dennis; Rios, Xavier; Ye, Lillian Y; Hu, Yong; Li, Kai; Zhang, Donghui; Church, George M; Pu, William T
2017-01-01
Genome editing of human induced pluripotent stem cells (hiPSCs) offers unprecedented opportunities for in vitro disease modeling and personalized cell replacement therapy. The introduction of Cas9-directed genome editing has expanded adoption of this approach. However, marker-free genome editing using standard protocols remains inefficient, yielding desired targeted alleles at a rate of ∼1-5%. We developed a protocol based on a doxycycline-inducible Cas9 transgene carried on a piggyBac transposon to enable robust and highly efficient Cas9-directed genome editing, so that a parental line can be expeditiously engineered to harbor many separate mutations. Treatment with doxycycline and transfection with guide RNA (gRNA), donor DNA and piggyBac transposase resulted in efficient, targeted genome editing and concurrent scarless transgene excision. Using this approach, in 7 weeks it is possible to efficiently obtain genome-edited clones with minimal off-target mutagenesis and with indel mutation frequencies of 40-50% and homology-directed repair (HDR) frequencies of 10-20%.
Future Issues and Approaches to Health Monitoring and Failure Prevention for Oil-Free Gas Turbines
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2004-01-01
Recent technology advances in foil air bearings, high temperature solid lubricants, and computer based modeling have enabled the development of small Oil-Free gas turbines. These turbomachines are currently commercialized as small (<100 kW) microturbine generators, and larger machines are being developed. Based upon these successes and the high potential payoffs offered by Oil-Free systems, NASA, industry, and other government entities are anticipating Oil-Free gas turbine propulsion systems to proliferate future markets. Since an Oil-Free engine has no oil system, traditional approaches to health monitoring and diagnostics, such as chip detection, oil analysis, and possibly vibration signature analyses (e.g., ball pass frequency) will be unavailable. As such, new approaches will need to be considered. These could include shaft orbit analyses, foil bearing temperature measurements, embedded wear sensors, and start-up/coast down speed analysis. In addition, novel, as yet undeveloped techniques may emerge based upon concurrent developments in MEMS technology. This paper introduces Oil-Free technology, reviews the current state of the art and potential for future turbomachinery applications, and discusses possible approaches to health monitoring, diagnostics, and failure prevention.
Bioinspired magnetoreception and navigation using magnetic signatures as waypoints.
Taylor, Brian K
2018-05-15
Diverse taxa use Earth's magnetic field in conjunction with other sensory modalities to accomplish navigation tasks ranging from local homing to long-distance migration across continents and ocean basins. However, despite extensive research, the mechanisms that underlie animal magnetoreception are not clearly understood, and how animals use Earth's magnetic field to navigate is an active area of investigation. Concurrently, Earth's magnetic field offers a signal that engineered systems can leverage for navigation in environments where man-made systems such as GPS are unavailable or unreliable. Using a proxy for Earth's magnetic field, and inspired by migratory animal behavior, this work implements a behavioral strategy that uses combinations of magnetic field properties as rare or unique signatures that mark specific locations. Using a discrete number of these signatures as goal waypoints, the strategy navigates through a closed set of points several times in a variety of environmental conditions, and with various levels of sensor noise. The results from this engineering/quantitative biology approach support existing notions that some animals may use combinations of magnetic properties as navigational markers, and provides insights into features and constraints that would enable navigational success or failure. The findings also offer insights into how autonomous engineered platforms might be designed to leverage the magnetic field as a navigational resource.
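The signature-based homing strategy described above can be sketched as a toy simulation. The linear field gradients, gain, and waypoint below are illustrative assumptions for a minimal sketch, not the paper's actual field model or behavioral strategy:

```python
def field(x, y):
    """Toy stand-in for Earth's field: intensity and inclination both
    vary smoothly with position, so their combination forms a (locally)
    unique magnetic signature."""
    intensity = 30.0 + 0.5 * y      # microtesla-like gradient
    inclination = 10.0 + 0.3 * x    # degrees-like gradient
    return intensity, inclination

def step_toward(pos, target_sig, gain=0.5):
    """Move downhill on the signature mismatch by probing the local
    gradient with finite differences (no map, only field samples)."""
    x, y = pos
    def err(px, py):
        i, d = field(px, py)
        return (i - target_sig[0]) ** 2 + (d - target_sig[1]) ** 2
    h = 1e-3
    gx = (err(x + h, y) - err(x - h, y)) / (2 * h)
    gy = (err(x, y + h) - err(x, y - h)) / (2 * h)
    return x - gain * gx, y - gain * gy

# Home to the waypoint whose signature matches position (10, 20).
target = field(10.0, 20.0)
pos = (0.0, 0.0)
for _ in range(200):
    pos = step_toward(pos, target)
print(round(pos[0], 1), round(pos[1], 1))
```

Because the two field properties have independent gradients, descending on the combined mismatch drives the agent to the unique location carrying the target signature.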
Concurrent Engineering for Composites
1991-10-01
1990), 44. Cooper, R.G. and Kleinschmidt, E.J., Journal of Product Innovation Management, 3(2), (1986), 71. Drucker, P.F., Harvard Business Review...Journal of Product Innovation Management, 6(1), (1989), 43. Hollins, B. and Pugh, S., Successful Product Design, Butterworths, London, 1990. Johnson
A Concurrent Product-Development Approach for Friction-Stir Welded Vehicle-Underbody Structures
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.
2012-04-01
High-strength aluminum and titanium alloys with superior blast/ballistic resistance against armor piercing (AP) threats and with high vehicle light-weighting potential are being increasingly used as military-vehicle armor. Due to the complex structure of these vehicles, they are commonly constructed through joining (mainly welding) of the individual components. Unfortunately, these alloys are not very amenable to conventional fusion-based welding technologies [e.g., gas metal arc welding (GMAW)], and to obtain high-quality welds, solid-state joining technologies such as friction-stir welding (FSW) have to be employed. However, since FSW is a relatively new and fairly complex joining technology, its introduction into advanced military-vehicle-underbody structures is not straightforward and entails a comprehensive multi-pronged approach that addresses concurrently and interactively all the aspects associated with the components/vehicle-underbody design, fabrication, and testing. One such approach is developed and applied in this study. The approach consists of a number of well-defined steps taking place concurrently and relies on two-way interactions between the various steps. The approach is critically assessed using a strengths, weaknesses, opportunities, and threats (SWOT) analysis.
The Effect of Basepair Mismatch on DNA Strand Displacement.
Broadwater, D W Bo; Kim, Harold D
2016-04-12
DNA strand displacement is a key reaction in DNA homologous recombination and DNA mismatch repair and is also heavily utilized in DNA-based computation and locomotion. Despite its ubiquity in science and engineering, sequence-dependent effects on displacement kinetics have not been extensively characterized. Here, we measured toehold-mediated strand displacement kinetics using single-molecule fluorescence in the presence of a single basepair mismatch. The apparent displacement rate varied significantly when the mismatch was introduced in the invading DNA strand. The rate generally decreased as the mismatch in the invader was encountered earlier in displacement. Our data indicate that a single base pair mismatch in the invader stalls branch migration, and displacement occurs via direct dissociation of the destabilized incumbent strand from the substrate strand. We combined both branch migration and direct dissociation into a model, which we term the concurrent displacement model, and used the first passage time approach to quantitatively explain the salient features of the observed relationship. We also introduce the concept of splitting probabilities to justify that the concurrent model can be simplified into a three-step sequential model in the presence of an invader mismatch. We expect our model to become a powerful tool to design DNA-based reaction schemes with broad functionality. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
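The splitting-probability idea admits a back-of-the-envelope sketch. This is not the authors' quantitative model: here branch migration is simply treated as a biased one-dimensional random walk, and the classic gambler's-ruin formula gives the probability of completing displacement before the invader falls off; the walk length and step bias are illustrative:

```python
def splitting_probability(i, n, p):
    """Probability that a biased random walk starting at position i
    reaches n (displacement complete) before 0 (invader dissociates),
    with per-step forward probability p (gambler's-ruin formula)."""
    q = 1.0 - p
    if abs(p - q) < 1e-12:      # unbiased walk: linear in position
        return i / n
    r = q / p
    return (1.0 - r ** i) / (1.0 - r ** n)

# A mismatch encountered early in displacement stalls the walk close to
# the dissociation boundary, so completion is far less likely than for a
# late mismatch.
for stall in (2, 10, 18):
    print(stall, round(splitting_probability(stall, 20, 0.55), 3))
```

The monotone dependence on the stall position mirrors the observed trend that earlier invader mismatches slow apparent displacement the most.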
Challenges in Teaching Modern Manufacturing Technologies
ERIC Educational Resources Information Center
Ngaile, Gracious; Wang, Jyhwen; Gau, Jenn-Terng
2015-01-01
Teaching of manufacturing courses for undergraduate engineering students has become a challenge due to industrial globalisation coupled with an influx of new innovations, technologies, and customer-driven products. This paper discusses development of a modern manufacturing course taught concurrently in three institutions where students collaborate in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
The U.S. Department of Energy's (DOE) Co-Optimization of Fuels & Engines (Co-Optima) initiative is accelerating the introduction of affordable, scalable, and sustainable fuels and high-efficiency, low-emission engines with a first-of-its-kind effort to simultaneously tackle fuel and engine research and development (R&D). This report summarizes accomplishments in the first year of the project. Co-Optima is conducting concurrent research to identify the fuel properties and engine design characteristics needed to maximize vehicle performance and affordability, while deeply cutting emissions. Nine national laboratories - the National Renewable Energy Laboratory and Argonne, Idaho, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest, and Sandia National Laboratories - are collaborating with industry and academia on this groundbreaking research.
Amey, David L.; Degner, Michael W.
2002-01-01
A method for reducing the starting time and the peak phase currents of an internal combustion engine that is started using an induction machine starter/alternator. The starting time is reduced by pre-fluxing the induction machine, and the peak phase currents are reduced by reducing the flux current command after a predetermined period of time has elapsed, concurrently with the application of the torque current command. The method of the present invention also provides a strategy for anticipating the start command for an internal combustion engine and determines a start strategy based on the start command and the operating state of the internal combustion engine.
Examination of Various Methods Used in Support of Concurrent Engineering
1990-03-01
1989. F.Y.I. Drawing a2Ther Productivity. Industrial Engineering 21: 80. Ishi82 Ishikawa, Kaoru. 1982. Guide to Quality Control. White Plains, NY: Kraus...observe it in practice have an easier time identifying the different methods or techniques (such as the Ishikawa tools) used than understanding the...simple histogram to show what problems should be attacked first. Cause and Effect Diagrams Sometimes called the fishbone or Ishikawa diagrams - a kind
Substantial increase in concurrent droughts and heatwaves in the United States
Mazdiyasni, Omid; AghaKouchak, Amir
2015-01-01
A combination of climate events (e.g., low precipitation and high temperatures) may cause a significant impact on the ecosystem and society, although individual events involved may not be severe extremes themselves. Analyzing historical changes in concurrent climate extremes is critical to preparing for and mitigating the negative effects of climatic change and variability. This study focuses on the changes in concurrences of heatwaves and meteorological droughts from 1960 to 2010. Despite an apparent hiatus in rising temperature and no significant trend in droughts, we show a substantial increase in concurrent droughts and heatwaves across most parts of the United States, and a statistically significant shift in the distribution of concurrent extremes. Although commonly used trend analysis methods do not show any trend in concurrent droughts and heatwaves, a unique statistical approach discussed in this study exhibits a statistically significant change in the distribution of the data. PMID:26324927
Substantial increase in concurrent droughts and heatwaves in the United States.
Mazdiyasni, Omid; AghaKouchak, Amir
2015-09-15
A combination of climate events (e.g., low precipitation and high temperatures) may cause a significant impact on the ecosystem and society, although individual events involved may not be severe extremes themselves. Analyzing historical changes in concurrent climate extremes is critical to preparing for and mitigating the negative effects of climatic change and variability. This study focuses on the changes in concurrences of heatwaves and meteorological droughts from 1960 to 2010. Despite an apparent hiatus in rising temperature and no significant trend in droughts, we show a substantial increase in concurrent droughts and heatwaves across most parts of the United States, and a statistically significant shift in the distribution of concurrent extremes. Although commonly used trend analysis methods do not show any trend in concurrent droughts and heatwaves, a unique statistical approach discussed in this study exhibits a statistically significant change in the distribution of the data.
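The notion of a concurrent extreme (a period that is simultaneously hot and dry) can be made concrete with percentile thresholds. A minimal sketch; the 85th/15th percentile cutoffs and the toy series are illustrative assumptions, not the study's definitions or data:

```python
def quantile(xs, q):
    """Empirical quantile by linear interpolation on the sorted sample."""
    s = sorted(xs)
    pos = q * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)

def concurrent_extremes(temp, precip, t_q=0.85, p_q=0.15):
    """Flag periods that are simultaneously hot (temperature above its
    t_q quantile) and dry (precipitation below its p_q quantile)."""
    t_thr = quantile(temp, t_q)
    p_thr = quantile(precip, p_q)
    return [t > t_thr and p < p_thr for t, p in zip(temp, precip)]

# Toy record: warming late periods with one simultaneously hot-and-dry one.
temp   = [20, 21, 22, 21, 30, 31, 23, 32, 33, 34]
precip = [90, 80, 85, 70, 20, 15, 75, 10, 12,  8]
flags = concurrent_extremes(temp, precip)
print(sum(flags), "concurrent hot-and-dry periods")
```

Counting such flags in successive halves of a long record is the simplest way to expose the kind of distributional shift the study detects with more careful statistics.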
Multigrid methods with space–time concurrency
Falgout, R. D.; Friedhoff, S.; Kolev, Tz. V.; ...
2017-10-06
Here, we consider the comparison of multigrid methods for parabolic partial differential equations that allow space–time concurrency. With current trends in computer architectures leading towards systems with more, but not faster, processors, space–time concurrency is crucial for speeding up time-integration simulations. In contrast, traditional time-integration techniques impose serious limitations on parallel performance due to the sequential nature of the time-stepping approach, allowing spatial concurrency only. This paper considers the three basic options of multigrid algorithms on space–time grids that allow parallelism in space and time: coarsening in space and time, semicoarsening in the spatial dimensions, and semicoarsening in the temporal dimension. We develop parallel software and performance models to study the three methods at scales of up to 16K cores and introduce an extension of one of them for handling multistep time integration. We then discuss advantages and disadvantages of the different approaches and their benefit compared to traditional space-parallel algorithms with sequential time stepping on modern architectures.
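The three coarsening options can be illustrated by the grid hierarchies they generate. This toy sketch only enumerates grid sizes per level (an assumption-free bookkeeping exercise) and says nothing about the actual multigrid cycling or parallel performance:

```python
def hierarchy(nx, nt, coarsen_space=True, coarsen_time=True, min_pts=4):
    """List the (nx, nt) grid sizes produced by repeatedly halving the
    chosen dimension(s) of an nx-by-nt space-time grid."""
    levels = [(nx, nt)]
    while (coarsen_space and nx > min_pts) or (coarsen_time and nt > min_pts):
        if coarsen_space and nx > min_pts:
            nx //= 2
        if coarsen_time and nt > min_pts:
            nt //= 2
        levels.append((nx, nt))
    return levels

# The three options from the paper, on a 64x64 space-time grid:
full   = hierarchy(64, 64, True, True)    # coarsen in space and time
semi_x = hierarchy(64, 64, True, False)   # semicoarsen in space only
semi_t = hierarchy(64, 64, False, True)   # semicoarsen in time only
print(full)
print(semi_x)
print(semi_t)
```

Semicoarsening keeps one dimension fine on every level, which is exactly what preserves concurrency in that dimension at the cost of larger coarse grids.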
A concurrent distributed system for aircraft tactical decision generation
NASA Technical Reports Server (NTRS)
Mcmanus, John W.
1990-01-01
A research program investigating the use of AI techniques to aid in the development of a tactical decision generator (TDG) for within visual range (WVR) air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of a concurrent version of the computerized logic for air-to-air warfare simulations (CLAWS) program, a second-generation TDG, is presented. Concurrent computing environments and programming approaches are discussed, and the design and performance of prototype concurrent TDG system (Cube CLAWS) are presented. It is concluded that the Cube CLAWS has provided a useful testbed to evaluate the development of a distributed blackboard system. The project has shown that the complexity of developing specialized software on a distributed, message-passing architecture such as the Hypercube is not overwhelming, and that reasonable speedups and processor efficiency can be achieved by a distributed blackboard system. The project has also highlighted some of the costs of using a distributed approach to designing a blackboard system.
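A blackboard architecture of the general kind evaluated in the Cube CLAWS project can be sketched minimally, with threads standing in for distributed knowledge sources posting to a shared data store. The source names and messages are invented for illustration; this is a sketch of the pattern, not the CLAWS implementation:

```python
import threading

class Blackboard:
    """Minimal shared blackboard: knowledge sources post hypotheses
    under a lock, and a controller can drain them in arrival order."""
    def __init__(self):
        self._lock = threading.Lock()
        self.entries = []

    def post(self, source, hypothesis):
        with self._lock:
            self.entries.append((source, hypothesis))

def knowledge_source(board, name, observations):
    # Each source reasons independently and posts what it concludes.
    for obs in observations:
        board.post(name, f"{name} proposes maneuver for {obs}")

board = Blackboard()
threads = [
    threading.Thread(target=knowledge_source,
                     args=(board, ks, ["contact-1", "contact-2"]))
    for ks in ("geometry", "threat", "energy")
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(board.entries), "hypotheses posted")
```

On a message-passing machine such as the Hypercube, the lock-protected list would be replaced by messages to the node owning the blackboard, but the control flow is the same.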
Multigrid methods with space–time concurrency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Falgout, R. D.; Friedhoff, S.; Kolev, Tz. V.
Here, we consider the comparison of multigrid methods for parabolic partial differential equations that allow space–time concurrency. With current trends in computer architectures leading towards systems with more, but not faster, processors, space–time concurrency is crucial for speeding up time-integration simulations. In contrast, traditional time-integration techniques impose serious limitations on parallel performance due to the sequential nature of the time-stepping approach, allowing spatial concurrency only. This paper considers the three basic options of multigrid algorithms on space–time grids that allow parallelism in space and time: coarsening in space and time, semicoarsening in the spatial dimensions, and semicoarsening in the temporal dimension. We develop parallel software and performance models to study the three methods at scales of up to 16K cores and introduce an extension of one of them for handling multistep time integration. We then discuss advantages and disadvantages of the different approaches and their benefit compared to traditional space-parallel algorithms with sequential time stepping on modern architectures.
NASA Astrophysics Data System (ADS)
Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong
2012-10-01
Typical characteristics of remote sensing applications are concurrent tasks, such as those found in disaster rapid response. The existing approach to composing geographical information processing service chains searches for an optimal solution in what can be deemed a "selfish" way, which leads to conflicts amongst concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model to analyse the competitive relationships between tasks is proposed. A best response function is used to assure that each task maintains utility optimisation by considering the composition strategies of other tasks and quantifying conflicts between tasks. Based on this, an iterative algorithm that converges to Nash equilibrium is presented, the aim being to provide good convergence and maximise the utilisation of all tasks under concurrent task conditions. Theoretical analyses and experiments showed that the newly proposed method, when compared to existing service composition methods, has better practical utility in all tasks.
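The best-response iteration can be sketched in miniature. The linear congestion cost below is an illustrative stand-in for the paper's utility and quantified-conflict model, with two tasks choosing between two services:

```python
def best_response(costs, others_load, task_demand):
    """Pick the service minimizing a congestion-sensitive cost: the
    base cost grows with the load already placed by the other tasks."""
    return min(range(len(costs)),
               key=lambda s: costs[s] * (1 + others_load[s]))

def nash_iterate(costs, demands, max_iter=50):
    """Iterate best responses until no task wants to switch services."""
    choice = [0] * len(demands)
    for _ in range(max_iter):
        changed = False
        for i, d in enumerate(demands):
            load = [0.0] * len(costs)
            for j, c in enumerate(choice):
                if j != i:
                    load[c] += demands[j]
            new = best_response(costs, load, d)
            if new != choice[i]:
                choice[i] = new
                changed = True
        if not changed:
            break
    return choice

# Two equal tasks and two services: at equilibrium they spread out
# rather than "selfishly" piling onto the cheapest service.
print(nash_iterate(costs=[1.0, 1.2], demands=[1.0, 1.0]))
```

The fixed point reached when no task changes its choice is exactly a (pure-strategy) Nash equilibrium of the congestion game.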
Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi
2016-01-01
In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS) considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due date, material handling time, operation sequence, processing time, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved by the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, two approaches, sequential and concurrent, are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm is designed. As a consequence, computational results of this study indicate that the best solutions found by the GA are better than those found by B&B in much less time for both the sequential and concurrent approaches. Moreover, the comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement is on average around 17 % by GA and 14 % by B&B.
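A minimal order-based genetic algorithm of the general kind described (not the authors' specific operators or their integrated layout-plus-scheduling objective) can be sketched on a toy single-cell tardiness problem; all data below are invented:

```python
import random

def cost(seq, proc, due):
    """Total tardiness of a part sequence on one cell: parts finish in
    order, and each pays for finishing after its due date."""
    t, penalty = 0, 0
    for part in seq:
        t += proc[part]
        penalty += max(0, t - due[part])
    return penalty

def genetic_algorithm(proc, due, pop=30, gens=60, seed=1):
    """Order-based GA: tournament selection, one-point "cut and fill"
    crossover that keeps the child a valid permutation, swap mutation."""
    rng = random.Random(seed)
    n = len(proc)
    popn = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(popn, 2), key=lambda s: cost(s, proc, due))
            b = min(rng.sample(popn, 2), key=lambda s: cost(s, proc, due))
            cut = rng.randrange(1, n)
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if rng.random() < 0.2:          # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        popn = nxt
    return min(popn, key=lambda s: cost(s, proc, due))

proc = [4, 2, 7, 3, 1]      # processing times
due  = [5, 4, 20, 9, 2]     # due dates
best = genetic_algorithm(proc, due)
print(best, cost(best, proc, due))
```

The cut-and-fill crossover is what lets a GA search permutation spaces (layouts, sequences) that defeat exact methods like B&B as the instance grows.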
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.
Assessing risk-adjustment approaches under non-random selection.
Luft, Harold S; Dudley, R Adams
2004-01-01
Various approaches have been proposed to adjust for differences in enrollee risk in health plans. Because risk-selection strategies may have different effects on enrollment, we simulated three types of selection--dumping, skimming, and stinting. Concurrent diagnosis-based risk adjustment, and a hybrid using concurrent adjustment for about 8% of the cases and prospective adjustment for the rest, perform markedly better than prospective or demographic adjustments, both in terms of R2 and the extent to which plans experience unwarranted gains or losses. The simulation approach offers a valuable tool for analysts in assessing various risk-adjustment strategies under different selection situations.
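The gap between concurrent and prospective adjustment can be illustrated with a toy simulation: both signals proxy a latent severity that drives cost, but the concurrent signal (this year's diagnoses) is less noisy than the prospective one (last year's). The noise levels and cost model are illustrative assumptions, not the study's simulation design:

```python
import random

def r_squared(x, y):
    """R^2 of the one-predictor least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

rng = random.Random(0)
# Toy enrollees: latent severity drives current-year cost.
severity = [rng.gauss(0, 1) for _ in range(2000)]
cost = [s + rng.gauss(0, 0.5) for s in severity]
concurrent = [s + rng.gauss(0, 0.3) for s in severity]   # low-noise proxy
prospective = [s + rng.gauss(0, 1.5) for s in severity]  # high-noise proxy
print(round(r_squared(concurrent, cost), 2),
      round(r_squared(prospective, cost), 2))
```

A hybrid scheme, per the abstract, applies the concurrent (accurate, gameable) adjustment only to the small high-cost tail and the prospective one elsewhere.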
Assessment of a new web-based sexual concurrency measurement tool for men who have sex with men.
Rosenberg, Eli S; Rothenberg, Richard B; Kleinbaum, David G; Stephenson, Rob B; Sullivan, Patrick S
2014-11-10
Men who have sex with men (MSM) are the most affected risk group in the United States' human immunodeficiency virus (HIV) epidemic. Sexual concurrency, the overlapping of partnerships in time, accelerates HIV transmission in populations and has been documented at high levels among MSM. However, concurrency is challenging to measure empirically, and variations in the assessment techniques used (primarily the date overlap and direct question approaches) and the outcomes derived from them have led to heterogeneity and questionable validity of estimates among MSM and other populations. The aim was to evaluate a novel Web-based and interactive partnership-timing module designed for measuring concurrency among MSM, and to compare outcomes measured by the partnership-timing module to those of typical approaches in an online study of MSM. In an online study of MSM aged ≥18 years, we assessed concurrency by using the direct question method and by gathering the dates of first and last sex, with enhanced programming logic, for each reported partner in the previous 6 months. From these methods, we computed multiple concurrency cumulative prevalence outcomes: direct question, day resolution / date overlap, and month resolution / date overlap including both 1-month ties and excluding ties. We additionally computed variants of the UNAIDS point prevalence outcome. The partnership-timing module was also administered. It uses an interactive month resolution calendar to improve recall and follow-up questions to resolve temporal ambiguities, combining elements of the direct question and date overlap approaches. The agreement between the partnership-timing module and other concurrency outcomes was assessed with percent agreement, kappa statistic (κ), and matched odds ratios at the individual, dyad, and triad levels of analysis.
Among 2737 MSM who completed the partnership section of the partnership-timing module, 41.07% (1124/2737) of individuals had concurrent partners in the previous 6 months. The partnership-timing module had the highest degree of agreement with the direct question. Agreement was lower with date overlap outcomes (agreement range 79%-81%, κ range .55-.59) and lowest with the UNAIDS outcome at 5 months before interview (65% agreement, κ=.14, 95% CI .12-.16). All agreements declined after excluding individuals with 1 sex partner (always classified as not engaging in concurrency), although the highest agreement was still observed with the direct question technique (81% agreement, κ=.59, 95% CI .55-.63). Similar patterns in agreement were observed with dyad- and triad-level outcomes. The partnership-timing module showed strong concurrency detection ability and agreement with previous measures. These levels of agreement were greater than others have reported among previous measures. The partnership-timing module may be well suited to quantifying concurrency among MSM at multiple levels of analysis.
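The date overlap approach (with and without ties) reduces to an interval-intersection test over reported partnership dates. A minimal sketch with invented example dates; the day-resolution tie rule below is an illustrative analogue of the month-resolution "including ties / excluding ties" outcomes:

```python
from datetime import date

def overlaps(p1, p2, tie_is_concurrent=True):
    """Date-overlap rule: two partnerships are concurrent if one starts
    on or before the other ends. A 'tie' (first sex with one partner on
    the same day as last sex with another) is counted or excluded via
    tie_is_concurrent."""
    (s1, e1), (s2, e2) = p1, p2
    latest_start, earliest_end = max(s1, s2), min(e1, e2)
    if tie_is_concurrent:
        return latest_start <= earliest_end
    return latest_start < earliest_end

def any_concurrency(partnerships, **kw):
    """True if any pair of a respondent's partnerships overlaps."""
    ps = list(partnerships)
    return any(overlaps(ps[i], ps[j], **kw)
               for i in range(len(ps)) for j in range(i + 1, len(ps)))

# One invented 6-month history where the second partnership begins the
# day the first ends: concurrent with ties included, not without.
history = [(date(2014, 1, 1), date(2014, 3, 15)),
           (date(2014, 3, 15), date(2014, 6, 1))]
print(any_concurrency(history),
      any_concurrency(history, tie_is_concurrent=False))
```

Respondents with a single partner always evaluate to False, which is why the paper's agreement statistics drop when such individuals are excluded.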
Infusion of a Gaming Paradigm into Computer-Aided Engineering Design Tools
2012-05-03
Virtual Test Bed (VTB), and the gaming tool, Unity3D. This hybrid gaming environment coupled a three-dimensional (3D) multibody vehicle system model...from Google Earth to the 3D visual front-end fabricated around Unity3D. The hybrid environment was sufficiently developed to support analyses of the...The VTB simulation of the vehicle dynamics ran concurrently with and interacted with the gaming engine, Unity3D, which
Quality Function Deployment for Large Systems
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1992-01-01
Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.
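The customer-to-characteristic roll-up at the heart of a QFD matrix is a weighted sum. A minimal sketch: the 9/3/1/0 relationship scale is the common QFD convention, and the weights and matrix below are invented, not from the paper:

```python
def technical_importance(customer_weights, relationship):
    """House-of-quality roll-up: the importance of each engineering
    characteristic is the sum over customer desires of
    (desire weight x relationship strength, conventionally 9/3/1/0)."""
    n_tech = len(relationship[0])
    return [sum(w * row[j] for w, row in zip(customer_weights, relationship))
            for j in range(n_tech)]

# Three customer desires weighted 5/3/1 against two characteristics.
weights = [5, 3, 1]
rel = [[9, 1],   # strong link to characteristic A, weak to B
       [3, 3],
       [0, 9]]
print(technical_importance(weights, rel))
```

Chaining such matrices (customers to desires, desires to characteristics, characteristics to processes) is what extends QFD across the linked project-management process the paper describes.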
Composite Crew Module: Primary Structure
NASA Technical Reports Server (NTRS)
Kirsch, Michael T.
2011-01-01
In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center to design, build, and test a full-scale crew module primary structure, using carbon fiber reinforced epoxy based composite materials. The overall goal of the Composite Crew Module project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. This report discusses the project management aspects of the project including team organization, decision making, independent technical reviews, and cost and schedule management approach.
RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks
2016-10-09
Robotic tasks are becoming increasingly complex, and with them the robotic systems themselves. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept
Relative Reinforcer Rates and Magnitudes Do Not Control Concurrent Choice Independently
ERIC Educational Resources Information Center
Elliffe, Douglas; Davison, Michael; Landon, Jason
2008-01-01
One assumption of the matching approach to choice is that different independent variables control choice independently of each other. We tested this assumption for reinforcer rate and magnitude in an extensive parametric experiment. Five pigeons responded for food reinforcement on switching-key concurrent variable-interval variable-interval…
Elements of Designing for Cost
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Unal, Resit
1992-01-01
During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.
Elements of designing for cost
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Unal, Resit
1992-01-01
During recent history in the United States, government systems development has been performance driven. As a result, systems within a class have experienced exponentially increasing cost over time in fixed year dollars. Moreover, little emphasis has been placed on reducing cost. This paper defines designing for cost and presents several tools which, if used in the engineering process, offer the promise of reducing cost. Although other potential tools exist for designing for cost, this paper focuses on rules of thumb, quality function deployment, Taguchi methods, concurrent engineering, and activity-based costing. Each of these tools has been demonstrated to reduce cost if used within the engineering process.
NASA Astrophysics Data System (ADS)
Davendralingam, Navindran
Conceptual design of aircraft and of the airline network (routes) on which aircraft fly are inextricably linked to passenger-driven demand. Many factors influence passenger demand for various Origin-Destination (O-D) city pairs, including demographics, geographic location, seasonality, socio-economic factors and, naturally, the operations of directly competing airlines. The expansion of airline operations involves the identification of appropriate aircraft to meet projected future demand. The decisions made in incorporating and subsequently allocating these new aircraft to serve air travel demand affect the inherent risk and profit potential as predicted through the airline revenue management systems. Competition between airlines then translates to latent passenger observations of the routes served between O-D pairs and ticket pricing - this in effect reflexively drives future states of demand. This thesis addresses the integrated nature of aircraft design, airline operations, and passenger demand, in order to maximize future expected profits as new aircraft are brought into service. The goal of this research is to develop an approach that utilizes aircraft design, airline network design, and passenger demand as a unified framework to provide better integrated design solutions in order to maximize the expected profits of an airline. This is investigated through two approaches. The first is a static model that poses the concurrent engineering paradigm above as an investment portfolio problem. Modern financial portfolio optimization techniques are used to leverage the risk of serving future projected demand using a 'yet to be introduced' aircraft against potentially generated future profits. Robust optimization methodologies are incorporated to mitigate model sensitivity and address estimation risks associated with such optimization techniques. The second extends the portfolio approach to include dynamic effects of an airline's operations.
A dynamic programming approach is employed to simulate the reflexive nature of airline supply-demand interactions by modeling the aggregate changes in demand that would result from tactical allocations of aircraft to maximize profit. The best yet-to-be-introduced aircraft maximizes profit by minimizing the long term fleetwide direct operating costs.
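The static portfolio framing described above can be illustrated with a toy mean-variance allocation of a new aircraft's capacity across candidate routes. The route names, profit figures, and risk-aversion parameter below are invented for illustration; the thesis's actual models are far richer.

```python
# Toy mean-variance "portfolio" of routes for a yet-to-be-introduced
# aircraft: maximize expected profit minus a risk penalty.
# All numbers are hypothetical.
from itertools import product

routes = ["ORD-LAX", "ORD-JFK", "ORD-DFW"]
mean_profit = [0.12, 0.09, 0.07]      # expected profit per unit of capacity
var_profit = [0.020, 0.008, 0.004]    # demand-driven profit variance
risk_aversion = 2.0

def objective(weights):
    ret = sum(w * m for w, m in zip(weights, mean_profit))
    # assume uncorrelated route demands in this toy example
    risk = sum(w * w * v for w, v in zip(weights, var_profit))
    return ret - risk_aversion * risk

# brute-force search over capacity allocations in 5% steps that sum to 1
steps = [i / 20 for i in range(21)]
best = max(
    (w for w in product(steps, repeat=3) if abs(sum(w) - 1.0) < 1e-9),
    key=objective,
)
print(best, round(objective(best), 4))
```

Even this crude search exhibits the core trade: the highest-mean route is not allocated all capacity, because its variance is penalized.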
NASA Astrophysics Data System (ADS)
Piras, Annamaria; Malucchi, Giovanni
2012-08-01
In the design and development phase of a new program, one of the critical aspects is the integration of all the functional requirements of the system and the control of the overall consistency between the identified needs on one side and the available resources on the other side, especially when both the required needs and available resources are not yet consolidated but are evolving as the program maturity increases. The Integrated Engineering Harness Avionics and Software database (IDEHAS) is a tool that has been developed to support this process in the frame of the Avionics and Software disciplines through the different phases of the program. The tool is in fact designed to allow an incremental build-up of the avionics and software systems, from the description of the high-level architectural data (available in the early stages of the program), to the definition of the pin-to-pin connectivity information (typically consolidated in the design finalization stages), and finally to the construction and validation of the detailed telemetry parameters and commands to be used in the test phases and in the Mission Control Centre. The key feature of this approach and of the associated tool is that it allows the definition and the maintenance/update of all these data in a single, consistent environment. On one side, a system-level and concurrent approach requires the ability to easily integrate and update the best data available since the early stages of a program, in order to improve confidence in the consistency and to control the design information. On the other side, the amount of information of different typologies and the cross-relationships among the data imply highly consolidated structures requiring many checks to guarantee data-content consistency, with negative effects on simplicity and flexibility, often limiting the attention to special needs and to the interfaces with other disciplines.
NASA Astrophysics Data System (ADS)
Ryan, R.; Gross, L. A.
1995-05-01
The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering, multidiscipline team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly, both in money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper will use these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the turbomachinery technical discipline interactions/sensitivities and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.
NASA Technical Reports Server (NTRS)
Ryan, R.; Gross, L. A.
1995-01-01
The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering, multidiscipline team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly, both in money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper will use these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the turbomachinery technical discipline interactions/sensitivities and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.
Europa Explorer Operational Scenarios Development
NASA Technical Reports Server (NTRS)
Lock, Robert E.; Pappalardo, Robert T.; Clark, Karla B.
2008-01-01
In 2007, NASA conducted four advanced mission concept studies for outer planets targets: Europa, Ganymede, Titan and Enceladus. The studies were conducted in close cooperation with the planetary science community. Of the four, the Europa Explorer Concept Study focused on refining mission options, science trades and implementation details for a potential flagship mission to Europa in the 2015 timeframe. A science definition team (SDT) was appointed by NASA to guide the study. A JPL-led engineering team worked closely with the science team to address 3 major focus areas: 1) credible cost estimates, 2) rationale and logical discussion of radiation risk and mitigation approaches, and 3) better definition and exploration of science operational scenario trade space. This paper will address the methods and results of the collaborative process used to develop Europa Explorer operations scenarios. Working in concert with the SDT, and in parallel with the SDT's development of a science value matrix, key mission capabilities and constraints were challenged by the science and engineering members of the team. Science goals were advanced and options were considered for observation scenarios. Data collection and return strategies were tested via simulation, and mission performance was estimated and balanced with flight and ground system resources and science priorities. The key to this successful collaboration was a concurrent development environment in which all stakeholders could rapidly assess the feasibility of strategies for their success in the full system context. Issues of science and instrument compatibility, system constraints, and mission opportunities were treated analytically and objectively leading to complementary strategies for observation and data return. Current plans are that this approach, as part of the system engineering process, will continue as the Europa Explorer Concept Study moves toward becoming a development project.
Maru, Biniam T; Munasinghe, Pradeep C; Gilary, Hadar; Jones, Shawn W; Tracy, Bryan P
2018-04-01
Biological CO2 fixation is an important technology that can assist in combating climate change. Here, we show an approach called anaerobic, non-photosynthetic mixotrophy can result in net CO2 fixation when using a reduced feedstock. This approach uses microbes called acetogens that are capable of concurrent utilization of both organic and inorganic substrates. In this study, we investigated the substrate utilization of 17 different acetogens, both mesophilic and thermophilic, on a variety of different carbohydrates and gases. Compared to most model acetogen strains, several non-model mesophilic strains displayed greater substrate flexibility, including the ability to utilize disaccharides, glycerol and an oligosaccharide, as well as greater growth rates. Three of these non-model strains (Blautia producta, Clostridium scatologenes and Thermoanaerobacter kivui) were chosen for further characterization under a variety of conditions, including H2- or syngas-fed sugar fermentations and a CO2-fed glycerol fermentation. In all cases, CO2 was fixed and carbon yields approached 100%. Finally, the model acetogen C. ljungdahlii was engineered to utilize glucose, a non-preferred sugar, while maintaining mixotrophic behavior. This work demonstrates the flexibility and robustness of anaerobic, non-photosynthetic mixotrophy as a technology to help reduce CO2 emissions.
NASA Astrophysics Data System (ADS)
Yang, Chao; Song, Jian; Li, Liang; Li, Shengbo; Cao, Dongpu
2016-08-01
This paper presents an economical launching and accelerating mode, comprising four ordered phases: pure electrical driving, clutch engagement and engine start-up, engine active charging, and engine driving, which suits the alternating conditions and improves the fuel economy of a hybrid electric bus (HEB) during typical city-bus driving scenarios. By utilizing the fast response of the electric motor (EM), an adaptive controller for the EM is designed to realize the power demand during the pure electrical driving mode, the engine starting mode and the engine active charging mode. Concurrently, the smoothness issue induced by the sequential mode transitions is solved with a coordinated control logic for the engine, EM and clutch. Simulation and experimental results show that the proposed launching and accelerating mode and its control methods are effective in improving fuel economy and ensuring drivability during the fast transitions between the operation modes of the HEB.
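The four ordered phases above can be sketched as a simple sequential state machine. The speed and state-of-charge thresholds below are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch of the HEB launch sequence as a state machine,
# advancing on vehicle speed (km/h) and battery state of charge (SOC).
PHASES = ["pure_electric", "clutch_engage_engine_start",
          "engine_active_charging", "engine_driving"]

def next_phase(phase, speed_kmh, soc):
    """Advance through the launch sequence; thresholds are notional."""
    if phase == "pure_electric" and speed_kmh >= 20:
        return "clutch_engage_engine_start"
    if phase == "clutch_engage_engine_start" and speed_kmh >= 25:
        # skip active charging when the battery is already full enough
        return "engine_active_charging" if soc < 0.6 else "engine_driving"
    if phase == "engine_active_charging" and soc >= 0.6:
        return "engine_driving"
    return phase

# walk one launch-and-accelerate episode through all four phases
phase = "pure_electric"
for speed, soc in [(10, 0.8), (21, 0.8), (26, 0.5), (30, 0.65)]:
    phase = next_phase(phase, speed, soc)
print(phase)   # -> engine_driving
```

In the paper itself the transitions are governed by a coordinated controller for engine, EM and clutch rather than fixed thresholds; the sketch only shows the ordering of the modes.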
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
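The idea of exportable, thread-backed services can be loosely illustrated in modern terms. The `ServiceHost` API below is an invented stand-in for illustration only; it is not the actual TPVM interface.

```python
# Minimal sketch (names assumed): a host "exports" named services
# backed by lightweight threads, and callers invoke them asynchronously.
from concurrent.futures import ThreadPoolExecutor

class ServiceHost:
    def __init__(self, workers=4):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._services = {}

    def export(self, name, fn):
        # register a callable under a service name
        self._services[name] = fn

    def invoke(self, name, *args):
        # dispatch to a worker thread; returns a Future immediately
        return self._pool.submit(self._services[name], *args)

host = ServiceHost()
host.export("square", lambda x: x * x)
future = host.invoke("square", 7)
print(future.result())   # -> 49
```

The key contrast with a process-oriented model is that the unit of dispatch is a named service on a lightweight thread, not a heavyweight spawned process.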
Multi-Attribute Tradespace Exploration in Space System Design
NASA Astrophysics Data System (ADS)
Ross, A. M.; Hastings, D. E.
2002-01-01
The complexity inherent in space systems necessarily requires intense expenditures of resources both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, managers). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.
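One common way to operationalize this kind of utility analysis is an additive multi-attribute utility function that scores each design in the tradespace. The attributes, weights, and designs below are hypothetical, chosen only to show the mechanics.

```python
# Rank a toy tradespace with an additive multi-attribute utility
# function. Weights come from stakeholder interviews in a MATE-style
# process; here they are invented. Each single-attribute utility is
# assumed pre-scaled to [0, 1].
weights = {"resolution": 0.5, "coverage": 0.3, "lifetime": 0.2}

def utility(attrs):
    return sum(weights[k] * attrs[k] for k in weights)

tradespace = {
    "design_A": {"resolution": 0.9, "coverage": 0.4, "lifetime": 0.6},
    "design_B": {"resolution": 0.6, "coverage": 0.8, "lifetime": 0.7},
}

ranked = sorted(tradespace, key=lambda d: utility(tradespace[d]), reverse=True)
print(ranked)   # -> ['design_A', 'design_B']
```

The additive form is only one option; multiplicative Keeney-Raiffa forms are also used when attributes interact.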
Hands-On Teaching and Entrepreneurship Development.
ERIC Educational Resources Information Center
da Silveira, Marcos Azevedo; da Silva, Mauro Schwanke; Kelber, Christian R.; de Freitas, Manuel R.
This paper presents the experiment being conducted in the Electric Circuits II course (ELE1103) at PUC-Rio's Electrical Engineering Department since March 1997. This experiment was held in both the fall and the spring semesters of 1997. The basis for the experiment was concurrent teaching methodology, to which the principles of entrepreneurship…
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Exemptions. 87.7 Section 87.7... POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES General Provisions § 87.7 Exemptions. (a) Exemptions based on..., with the concurrence of the Administrator, that application of any standard under § 87.21 is not...
Product development: the making of the Abbott ARCHITECT.
Kisner, H J
1997-01-01
Many laboratorians have a limited perspective on what is involved in developing an instrument and bringing it to market. This article traces the product development process used by Abbott Diagnostics Division that resulted in Abbott being named the 1996 Concurrent Engineering Company of the Year for the design of the ARCHITECT.
2009-07-24
concurrently. Photographs of LSF-1 from before and after June 2006 are consistent with the believed installation date. A forensic engineering...other services such as refuse collection and disposal, entomology, etc. Starting in November 2003, Washington Group International/Black and Veatch
33 CFR 203.84 - Forms of local participation-cost sharing.
Code of Federal Regulations, 2011 CFR
2011-07-01
... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...
33 CFR 203.84 - Forms of local participation-cost sharing.
Code of Federal Regulations, 2013 CFR
2013-07-01
... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...
33 CFR 203.84 - Forms of local participation-cost sharing.
Code of Federal Regulations, 2012 CFR
2012-07-01
... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...
33 CFR 203.84 - Forms of local participation-cost sharing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...
33 CFR 203.84 - Forms of local participation-cost sharing.
Code of Federal Regulations, 2014 CFR
2014-07-01
... sharing. 203.84 Section 203.84 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE EMERGENCY EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Local...; and/or accomplishment of work either concurrently or within a specified reasonable period of time. The...
Dissipative production of a maximally entangled steady state of two quantum bits.
Lin, Y; Gaebler, J P; Reiter, F; Tan, T R; Bowler, R; Sørensen, A S; Leibfried, D; Wineland, D J
2013-12-19
Entangled states are a key resource in fundamental quantum physics, quantum cryptography and quantum computation. Introduction of controlled unitary processes--quantum gates--to a quantum system has so far been the most widely used method to create entanglement deterministically. These processes require high-fidelity state preparation and minimization of the decoherence that inevitably arises from coupling between the system and the environment, and imperfect control of the system parameters. Here we combine unitary processes with engineered dissipation to deterministically produce and stabilize an approximate Bell state of two trapped-ion quantum bits (qubits), independent of their initial states. Compared with previous studies that involved dissipative entanglement of atomic ensembles or the application of sequences of multiple time-dependent gates to trapped ions, we implement our combined process using trapped-ion qubits in a continuous time-independent fashion (analogous to optical pumping of atomic states). By continuously driving the system towards the steady state, entanglement is stabilized even in the presence of experimental noise and decoherence. Our demonstration of an entangled steady state of two qubits represents a step towards dissipative state engineering, dissipative quantum computation and dissipative phase transitions. Following this approach, engineered coupling to the environment may be applied to a broad range of experimental systems to achieve desired quantum dynamics or steady states. Indeed, concurrently with this work, an entangled steady state of two superconducting qubits was demonstrated using dissipation.
Multifidelity, multidisciplinary optimization of turbomachines with shock interaction
NASA Astrophysics Data System (ADS)
Joly, Michael Marie
Research on high-speed air-breathing propulsion aims at developing aircraft with antipodal range and space access. Before reaching high speed at high altitude, the flight vehicle needs to accelerate from takeoff to scramjet takeover. Air turbo rocket engines combine turbojet and rocket engine cycles to provide the necessary thrust in the so-called low-speed regime. Challenges related to turbomachinery components are multidisciplinary, since both the high compression ratio compressor and the powering high-pressure turbine operate in the transonic regime in compact environments with strong shock interactions. Besides, low weight is vital to avoid hindering the scramjet operation. Recent progress in evolutionary computing provides aerospace engineers with robust and efficient optimization algorithms to address concurrent objectives. The present work investigates Multidisciplinary Design Optimization (MDO) of innovative transonic turbomachinery components. Inter-stage aerodynamic shock interactions in turbomachines are known to generate high-cycle fatigue on the rotor blades, compromising their structural integrity. A soft-computing strategy is proposed to mitigate the vane downstream distortion, and shown to successfully attenuate the unsteady forcing on the rotor of a high-pressure turbine. Counter-rotation offers promising prospects to reduce the weight of the machine, with fewer stages and increased load per row. An integrated approach based on increasing levels of fidelity and aero-structural coupling is then presented and allows achieving a highly loaded, compact counter-rotating compressor.
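The evolutionary optimizers referred to above can be illustrated with a toy (1+1) evolution strategy. The quadratic objective below is a stand-in for an actual aero-structural model, and all parameters are invented.

```python
# Toy (1+1) evolution strategy: mutate the current design, keep the
# mutant only if it improves the objective. A real MDO study would
# replace f with a coupled aero-structural evaluation.
import random

def es_minimize(f, x0, sigma=0.5, iters=200, seed=1):
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0, sigma) for xi in x]
        fc = f(cand)
        if fc < fx:          # greedy acceptance of improving mutants
            x, fx = cand, fc
    return x, fx

best, val = es_minimize(lambda v: sum(xi * xi for xi in v), [3.0, -2.0])
print(round(val, 3))
```

Population-based variants (genetic algorithms, multi-objective evolutionary algorithms) extend this same mutate-and-select loop to handle the concurrent objectives mentioned in the abstract.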
Domain-specific languages and diagram customization for a concurrent engineering environment
NASA Astrophysics Data System (ADS)
Cole, B.; Dubos, G.; Banazadeh, P.; Reh, J.; Case, K.; Wang, Y.; Jones, S.; Picha, F.
A major open question for advocates of Model-Based Systems Engineering (MBSE) is the question of how system and subsystem engineers will work together. The Systems Modeling Language (SysML), like any language intended for a large audience, is in tension between the desires for simplicity and for expressiveness. In order to be more expressive, many specialized language elements may be introduced, which will unfortunately make a complete understanding of the language a more daunting task. While this may be acceptable for systems modelers, it will increase the challenge of including subsystem engineers in the modeling effort. One possible answer to this situation is the use of Domain-Specific Languages (DSL), which are fully supported by the Unified Modeling Language (UML). SysML is in fact a DSL for systems engineering. The expressive power of a DSL can be enhanced through the use of diagram customization. Various domains have already developed their own schematic vocabularies. Within the space engineering community, two excellent examples are the propulsion and telecommunication subsystems. A return to simple box-and-line diagrams (e.g., the SysML Internal Block Diagram) is in many ways a step backward. In order to allow subsystem engineers to contribute directly to the model, it is necessary to make a system modeling tool at least approximate in accessibility to drawing tools like Microsoft PowerPoint and Visio. The challenge is made more extreme in a concurrent engineering environment, where designs must often be drafted in an hour or two. In the case of the Jet Propulsion Laboratory's Team X concurrent design team, a subsystem is specified using a combination of PowerPoint for drawing and Excel for calculation. A pilot has been undertaken in order to meld the drawing portion and the production of master equipment lists (MELs) via a SysML authoring tool, MagicDraw. Team X currently interacts with its customers in a process of sharing presentations. 
There are several inefficiencies that arise from this situation. The first is that a customer team must wait two weeks to a month (which is 2-4 times the duration of most Team X studies themselves) for a finalized, detailed design description. Another is that this information must be re-entered by hand into the set of engineering artifacts and design tools that the mission concept team uses after a study is complete. Further, there is no persistent connection to Team X or institutionally shared formulation design tools and data after a given study, again reducing the direct reuse of designs created in a Team X study. This paper presents the underpinnings of subsystem DSLs as they were developed for this pilot. This includes specialized semantics for different domains as well as the process by which major categories of objects were derived in support of defining the DSLs. The feedback given to us by the domain experts on usability, along with a pilot study with the partial inclusion of these tools is also discussed.
Domain-Specific Languages and Diagram Customization for a Concurrent Engineering Environment
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Dubos, Greg; Banazadeh, Payam; Reh, Jonathan; Case, Kelley; Wang, Yeou-Fang; Jones, Susan; Picha, Frank
2013-01-01
A major open question for advocates of Model-Based Systems Engineering (MBSE) is the question of how system and subsystem engineers will work together. The Systems Modeling Language (SysML), like any language intended for a large audience, is in tension between the desires for simplicity and for expressiveness. In order to be more expressive, many specialized language elements may be introduced, which will unfortunately make a complete understanding of the language a more daunting task. While this may be acceptable for systems modelers, it will increase the challenge of including subsystem engineers in the modeling effort. One possible answer to this situation is the use of Domain-Specific Languages (DSL), which are fully supported by the Unified Modeling Language (UML). SysML is in fact a DSL for systems engineering. The expressive power of a DSL can be enhanced through the use of diagram customization. Various domains have already developed their own schematic vocabularies. Within the space engineering community, two excellent examples are the propulsion and telecommunication subsystems. A return to simple box-and-line diagrams (e.g., the SysML Internal Block Diagram) is in many ways a step backward. In order to allow subsystem engineers to contribute directly to the model, it is necessary to make a system modeling tool at least approximate in accessibility to drawing tools like Microsoft PowerPoint and Visio. The challenge is made more extreme in a concurrent engineering environment, where designs must often be drafted in an hour or two. In the case of the Jet Propulsion Laboratory's Team X concurrent design team, a subsystem is specified using a combination of PowerPoint for drawing and Excel for calculation. A pilot has been undertaken in order to meld the drawing portion and the production of master equipment lists (MELs) via a SysML authoring tool, MagicDraw. Team X currently interacts with its customers in a process of sharing presentations. 
There are several inefficiencies that arise from this situation. The first is that a customer team must wait two weeks to a month (which is 2-4 times the duration of most Team X studies themselves) for a finalized, detailed design description. Another is that this information must be re-entered by hand into the set of engineering artifacts and design tools that the mission concept team uses after a study is complete. Further, there is no persistent connection to Team X or institutionally shared formulation design tools and data after a given study, again reducing the direct reuse of designs created in a Team X study. This paper presents the underpinnings of subsystem DSLs as they were developed for this pilot. This includes specialized semantics for different domains as well as the process by which major categories of objects were derived in support of defining the DSLs. The feedback given to us by the domain experts on usability, along with a pilot study with the partial inclusion of these tools is also discussed.
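The master equipment list (MEL) roll-up that such a pilot automates can be sketched as follows. The component names, masses, and contingency percentages below are invented; a real pipeline would pull them from the SysML model rather than a hard-coded list.

```python
# Hedged sketch of a MEL roll-up: each component's current best
# estimate (CBE) mass is grown by a per-item contingency, then summed.
components = [
    {"name": "thruster",  "qty": 4, "unit_mass_kg": 1.2, "contingency": 0.10},
    {"name": "prop_tank", "qty": 1, "unit_mass_kg": 8.0, "contingency": 0.15},
]

def mel_rows(components):
    for c in components:
        cbe = c["qty"] * c["unit_mass_kg"]          # current best estimate
        yield c["name"], cbe, cbe * (1 + c["contingency"])

total = sum(row[2] for row in mel_rows(components))
print(round(total, 2))   # 4*1.2*1.10 + 8.0*1.15 = 5.28 + 9.20 = 14.48
```

Generating this table directly from the model is what removes the hand re-entry step the paper complains about.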
Gender sensitive education in watershed management to support environmental friendly city
NASA Astrophysics Data System (ADS)
Asteria, D.; Budidarmono; Herdiansyah, H.; Ni’mah, N. L.
2018-03-01
This study is about a gender-sensitive perspective in a watershed management education program, as one form of capacity building for citizens in watershed management with a community-based strategy, to support environmentally friendly cities and security for women from flood disasters. Involving women and increasing women’s active participation in sustainable watershed management is essential in urban areas. In global warming and climate change situations, city management should integrate social aspects with environmental planning. This study used mixed methods (concurrent embedded type, with quantitative as the primary method) with a descriptive-explanatory research design. The result of this study is that education strategies with gender approaches and affirmative action, through an emancipation approach and local knowledge from women’s experiences, can increase women’s participation. Women’s empowerment efforts need integrated intervention and collaboration among government, NGOs, and other stakeholders to optimize women’s role in watershed management in support of an environmentally friendly city. The implication of this study is an educational strategy on watershed conservation with a gender perspective, offering social engineering alternatives for decision makers in sustainable watershed management policy in urban areas related to flood mitigation efforts.
Resource Management and Contingencies in Aerospace Concurrent Engineering
NASA Technical Reports Server (NTRS)
Karpati, Gabe; Hyde, Tupper; Peabody, Hume; Garrison, Matthew
2012-01-01
A significant concern in designing complex systems implementing new technologies is that while knowledge about the system is acquired incrementally, substantial financial commitments, even make-or-break decisions, must be made upfront, essentially in the unknown. One practice that helps in dealing with this dichotomy is the smart embedding of contingencies and margins in the design to serve as buffers against surprises. This issue presents itself in full force in the aerospace industry, where unprecedented systems are formulated and committed to as a matter of routine. As more and more aerospace mission concepts are generated by concurrent design laboratories, it is imperative that such laboratories apply well-thought-out contingency and margin structures to their designs. The first part of this publication provides an overview of resource management techniques and standards used in the aerospace industry. That is followed by a thought-provoking treatise on margin policies. The exposé presents the actual flight telemetry data recorded by the thermal discipline during several recent NASA Goddard Space Flight Center missions. The margins actually achieved in flight are compared against pre-flight predictions, and the appropriateness and the ramifications of having designed with rigid margins to bounding stacked worst-case conditions are assessed. The second half of the paper examines the particular issues associated with the application of contingencies and margins in the concurrent engineering environment. In closing, a discipline-by-discipline disclosure of the contingency and margin policies in use at the Integrated Design Center at NASA's Goddard Space Flight Center is made.
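The margin bookkeeping discussed above can be illustrated with one common convention: a current best estimate (CBE) grown by contingency gives a maximum expected value (MEV), and margin is what remains between the MEV and the allocated resource. Terminology and percentages vary by center; the numbers here are invented.

```python
# Illustrative margin calculation (conventions and numbers assumed):
# margin is the fraction of the allocation left over once the
# contingency-loaded estimate (MEV) is accounted for.
def margin(cbe, contingency_frac, allocation):
    mev = cbe * (1 + contingency_frac)      # maximum expected value
    return (allocation - mev) / allocation  # fractional margin

# e.g. a 100 kg mass allocation, 80 kg CBE carrying 15% contingency
print(round(margin(80.0, 0.15, 100.0), 3))   # -> 0.08
```

A negative result signals that the contingency-loaded design has already broken its allocation, which is exactly the situation the embedded buffers are meant to flag early.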
A novel approach to quality improvement in a safety-net practice: concurrent peer review visits.
Fiscella, Kevin; Volpe, Ellen; Winters, Paul; Brown, Melissa; Idris, Amna; Harren, Tricia
2010-12-01
Concurrent peer review visits are structured office visits conducted by clinician peers of the primary care clinician that are specifically designed to reduce competing demands, clinical inertia, and bias. We assessed whether a single concurrent peer review visit reduced clinical inertia and improved control of hypertension, hyperlipidemia, and diabetes among underserved patients. We conducted a randomized encouragement trial to evaluate concurrent peer review visits within a community health center. Seven hundred twenty-seven patients with hypertension, hyperlipidemia, and/or diabetes who were not at goal for systolic blood pressure (SBP), low-density lipoprotein cholesterol (LDL-C), and/or glycated hemoglobin (A1c) were randomly assigned to an invitation to participate in a concurrent peer review visit or to usual care. We compared change in these measures using mixed models and rates of therapeutic intensification during concurrent peer review visits with control visits. One hundred seventy-one patients completed a concurrent peer review visit. SBP improved significantly (p < .01) more among those completing concurrent peer review visits than among those who failed to respond to a concurrent peer review invitation or those randomized to usual care. There were no differences seen for changes in LDL-C or A1c. Concurrent peer review visits were associated with significantly greater clinician intensification of blood pressure (p < .001), lipid (p < .001), and diabetes (p < .005) treatment than control visits for patients in either the non-response group or the usual care group. Concurrent peer review visits represent a promising strategy for improving blood pressure control and improving therapeutic intensification in community health centers.
Tendon Reconstruction with Tissue Engineering Approach--A Review.
Verdiyeva, Gunay; Koshy, Kiron; Glibbery, Natalia; Mann, Haroon; Seifalian, Alexander M
2015-09-01
Tendon injuries are a common and rising occurrence, associated with significant impairment to quality of life and financial burden to the healthcare system. Clinically, they represent an unresolved problem, due to poor natural tendon healing and the inability of current treatment strategies to restore the tendon to its native state. Tissue engineering offers a promising alternative, with the incorporation of scaffolds, cells and growth factors to support the complete regeneration of the tendon. The materials used in tendon engineering to date have provided significant advances in structural integrity and biological compatibility and in many cases the results obtained are superior to those observed in natural healing. However, grafts fail to reproduce the qualities of the pre-injured tendon and each has weaknesses subject to its constituent parts. Furthermore, many materials and cell types are being investigated concurrently, with seemingly little association or comparison between research results. In this review the properties of the most-investigated and effective components have been appraised in light of the surrounding literature, with research from early in-vitro experiments to clinical trials being discussed. Extensive comparisons have been made between scaffolds, cell types and growth factors used, listing strengths and weaknesses to provide a stable platform for future research. Promising future endeavours are also described in the field of nanocomposite material science, stem cell sources and growth factors, which may bypass weaknesses found in individual elements. The future of tendon engineering looks bright, with growing understanding in material technology, cell and growth factor application and encouraging recent advances bringing us ever closer to regenerating the native tendon.
Berenson, Daniel F; Weiss, Allison R; Wan, Zhu-Li; Weiss, Michael A
2011-12-01
The engineering of insulin analogs represents a triumph of structure-based protein design. A framework has been provided by structures of insulin hexamers. Containing a zinc-coordinated trimer of dimers, such structures represent a storage form of the active insulin monomer. Initial studies focused on destabilization of subunit interfaces. Because disassembly facilitates capillary absorption, such targeted destabilization enabled development of rapid-acting insulin analogs. Converse efforts were undertaken to stabilize the insulin hexamer and promote higher-order self-assembly within the subcutaneous depot toward the goal of enhanced basal glycemic control with reduced risk of hypoglycemia. Current products either operate through isoelectric precipitation (insulin glargine, the active component of Lantus(®); Sanofi-Aventis) or employ an albumin-binding acyl tether (insulin detemir, the active component of Levemir(®); Novo-Nordisk). To further improve pharmacokinetic properties, modified approaches are presently under investigation. Novel strategies have recently been proposed based on subcutaneous supramolecular assembly coupled to (a) large-scale allosteric reorganization of the insulin hexamer (the TR transition), (b) pH-dependent binding of zinc ions to engineered His-X(3)-His sites at hexamer surfaces, or (c) the long-range vision of glucose-responsive polymers for regulated hormone release. Such designs share with wild-type insulin and current insulin products a susceptibility to degradation above room temperature, and so their delivery, storage, and use require the infrastructure of an affluent society. Given the global dimensions of the therapeutic supply chain, we envisage that concurrent engineering of ultra-stable protein analog formulations would benefit underprivileged patients in the developing world.
MESA: Message-Based System Analysis Using Runtime Verification
NASA Technical Reports Server (NTRS)
Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter
2017-01-01
In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
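As a minimal illustration of the kind of trace properties checked here (duplicate and out-of-order messages), the sketch below is a hand-rolled monitor over an assumed message format; TraceContract itself is a Scala-based DSL, and the tuple representation used here is purely hypothetical.

```python
# Hypothetical trace monitor: each message is a (msg_id, sequence_number) pair.
# Reports two property violations: duplicate message IDs and decreasing
# sequence numbers (out-of-order delivery).

def check_trace(messages):
    """Scan a message trace and return a list of property violations."""
    seen = set()
    last_seq = None
    violations = []
    for msg_id, seq in messages:
        if msg_id in seen:
            violations.append(("duplicate", msg_id))
        seen.add(msg_id)
        if last_seq is not None and seq < last_seq:
            violations.append(("out-of-order", msg_id))
        last_seq = seq
    return violations
```

A real monitor of this style would subscribe to the message queue nonintrusively and evaluate such properties online rather than over a recorded list.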
Diffusion phenomena of cells and biomolecules in microfluidic devices.
Yildiz-Ozturk, Ece; Yesil-Celiktas, Ozlem
2015-09-01
Biomicrofluidics is an emerging field at the crossroads of microfluidics and life sciences which requires intensive research efforts in terms of introducing appropriate designs, production techniques, and analysis. The ultimate goal is to deliver innovative and cost-effective microfluidic devices to biotech, biomedical, and pharmaceutical industries. Therefore, creating an in-depth understanding of the transport phenomena of cells and biomolecules becomes vital and concurrently poses significant challenges. The present article outlines the recent advancements in diffusion phenomena of cells and biomolecules by highlighting transport principles from an engineering perspective, cell responses in microfluidic devices with emphases on diffusion- and flow-based microfluidic gradient platforms, macroscopic and microscopic approaches for investigating the diffusion phenomena of biomolecules, microfluidic platforms for the delivery of these molecules, as well as the state of the art in biological applications of mammalian cell responses and diffusion of biomolecules.
Plans for the extreme ultraviolet explorer data base
NASA Technical Reports Server (NTRS)
Marshall, Herman L.; Dobson, Carl A.; Malina, Roger F.; Bowyer, Stuart
1988-01-01
The paper presents an approach for storage and fast access to data that will be obtained by the Extreme Ultraviolet Explorer (EUVE), a satellite payload scheduled for launch in 1991. The EUVE telescopes will be operated remotely from the EUVE Science Operation Center (SOC) located at the University of California, Berkeley. The EUVE science payload consists of three scanning telescopes carrying out an all-sky survey in the 80-800 A spectral region and a Deep Survey/Spectrometer telescope performing a deep survey in the 80-250 A spectral region. Guest Observers will remotely access the EUVE spectrometer database at the SOC. The EUVE database will consist of about 2 x 10^10 bytes of information in a very compact form, very similar to the raw telemetry data. A history file will be built concurrently giving telescope parameters, command history, attitude summaries, engineering summaries, anomalous events, and ephemeris summaries.
Crystal Plasticity Model of Reactor Pressure Vessel Embrittlement in GRIZZLY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakraborty, Pritam; Biner, Suleyman Bulent; Zhang, Yongfeng
2015-07-01
The integrity of reactor pressure vessels (RPVs) is of utmost importance to ensure safe operation of nuclear reactors under extended lifetime. Microstructure-scale models at various length and time scales, coupled concurrently or through homogenization methods, can play a crucial role in understanding and quantifying irradiation-induced defect production, growth and their influence on mechanical behavior of RPV steels. A multi-scale approach, involving atomistic, meso- and engineering-scale models, is currently being pursued within the GRIZZLY project to understand and quantify irradiation-induced embrittlement of RPV steels. Within this framework, a dislocation-density based crystal plasticity model has been developed in GRIZZLY that captures the effect of irradiation-induced defects on the flow stress behavior and is presented in this report. The present formulation accounts for the interaction between self-interstitial loops and matrix dislocations. The model predictions have been validated with experiments and dislocation dynamics simulation.
On the operation of machines powered by quantum non-thermal baths
Niedenzu, Wolfgang; Gelbwaser-Klimovsky, David; Kofman, Abraham G.; ...
2016-08-02
Diverse models of engines energised by quantum-coherent, hence non-thermal, baths allow the engine efficiency to transgress the standard thermodynamic Carnot bound. These transgressions call for an elucidation of the underlying mechanisms. Here we show that non-thermal baths may impart not only heat, but also mechanical work to a machine. The Carnot bound is inapplicable to such a hybrid machine. Intriguingly, it may exhibit dual action, concurrently as engine and refrigerator, with up to 100% efficiency. Here, we conclude that even though a machine powered by a quantum bath may exhibit an unconventional performance, it still abides by the traditional principles of thermodynamics.
Advanced engineering environment pilot project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwegel, Jill; Pomplun, Alan R.; Abernathy, Rusty
2006-10-01
The Advanced Engineering Environment (AEE) is a concurrent engineering concept that enables real-time process tooling design and analysis, collaborative process flow development, automated document creation, and full process traceability throughout a product's life cycle. The AEE will enable NNSA's Design and Production Agencies to collaborate through a singular integrated process. Sandia National Laboratories and Parametric Technology Corporation (PTC) are working together on a prototype AEE pilot project to evaluate PTC's product collaboration tools relative to the needs of the NWC. The primary deliverable for the project is a set of validated criteria for defining a complete commercial off-the-shelf (COTS) solution to deploy the AEE across the NWC.
Visualization of Concurrent Program Executions
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Honiden, Shinichi
2007-01-01
Various program analysis techniques are efficient at discovering failures and properties. However, it is often difficult to evaluate results, such as program traces. This calls for abstraction and visualization tools. We propose an approach based on UML sequence diagrams, addressing shortcomings of such diagrams for concurrency. The resulting visualization is expressive and provides all the necessary information at a glance.
Allaf, Mohamad E; Hsu, Thomas H; Sullivan, Wendy; Su, Li-Ming
2003-12-01
Concurrent repair of inguinal hernias during open radical retropubic prostatectomy is well described and commonly practiced. With the advent of the laparoscopic approach to radical prostatectomy, the possibility of concurrent laparoscopic hernia repair merits investigation. We present a case of simultaneous prosthetic mesh onlay hernia repair for bilateral inguinal hernias during laparoscopic transperitoneal radical prostatectomy.
Functional language and data flow architectures
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Patel, D. R.; Lang, T.
1983-01-01
This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.
ERIC Educational Resources Information Center
Aucoin, Jennifer Mangrum
2013-01-01
The purpose of this mixed methods concurrent triangulation study was to examine the program evaluation practices of high school counselors. A total of 294 high school counselors in Texas were assessed using a mixed methods concurrent triangulation design. A researcher-developed survey, the School Counseling Program Evaluation Questionnaire…
The Psychotherapy of Parenthood: Towards a Formulation and Valuation of Concurrent Work with Parents
ERIC Educational Resources Information Center
Sutton, Adrian; Hughes, Lynette
2005-01-01
This paper explores the process and value of concurrent work with parents when their child is being treated in individual psychotherapy. The position taken is that psychoanalytic understanding generally and the specific formulations presented in this paper have a broader applicability in other aspects and approaches in child and adolescent mental…
33 CFR 385.5 - Guidance memoranda.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Secretary of the Army. The Corps of Engineers and the South Florida Water Management District shall also... achieving the goals and purposes of the Plan. (2) The Secretary of the Army shall afford the public an... concurrence of the Secretary of the Interior and the Governor. Within 180 days after being provided with the...
NASA Astrophysics Data System (ADS)
Semiatin, S. L.; Fagin, P. N.; Goetz, R. L.; Furrer, D. U.; Dutton, R. E.
2015-09-01
The plastic-flow behavior which controls the formation of bulk residual stresses during final heat treatment of powder-metallurgy (PM), nickel-base superalloys was quantified using conventional (isothermal) stress-relaxation (SR) tests and a novel approach which simulates concurrent temperature and strain transients during cooling following solution treatment. The concurrent cooling/straining test involves characterization of the thermal compliance of the test sample. In turn, this information is used to program the ram-displacement-vs-time profile to impose a constant plastic strain rate during cooling. To demonstrate the efficacy of the new approach, SR tests (in both tension and compression) and concurrent cooling/tension-straining experiments were performed on two PM superalloys, LSHR and IN-100. The isothermal SR experiments were conducted at a series of temperatures between 1144 K and 1436 K (871 °C and 1163 °C) on samples that had been supersolvus solution treated and cooled slowly or rapidly to produce starting microstructures comprising coarse gamma grains and coarse or fine secondary gamma-prime precipitates, respectively. The concurrent cooling/straining tests comprised supersolvus solution treatment and various combinations of subsequent cooling rate and plastic strain rate. Comparison of flow-stress data from the SR and concurrent cooling/straining tests showed some similarities and some differences which were explained in the context of the size of the gamma-prime precipitates and the evolution of dislocation substructure. The magnitude of the effect of concurrent deformation during cooling on gamma-prime precipitation was also quantified experimentally and theoretically.
Concurrent analysis: towards generalisable qualitative research.
Snowden, Austyn; Martin, Colin R
2011-10-01
This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.
2017-12-01
To ensure structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal to better understand and characterize extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions.
The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the ensemble approach may be used in concert with one another to manage risk and enhance resiliency in the midst of uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.
Zhao, Shijie; Han, Junwei; Hu, Xintao; Jiang, Xi; Lv, Jinglei; Zhang, Tuo; Zhang, Shu; Guo, Lei; Liu, Tianming
2018-06-01
Recently, a growing body of studies have demonstrated the simultaneous existence of diverse brain activities, e.g., task-evoked dominant response activities, delayed response activities and intrinsic brain activities, under specific task conditions. However, current dominant task-based functional magnetic resonance imaging (tfMRI) analysis approach, i.e., the general linear model (GLM), might have difficulty in discovering those diverse and concurrent brain responses sufficiently. This subtraction-based model-driven approach focuses on the brain activities evoked directly from the task paradigm, thus likely overlooks other possible concurrent brain activities evoked during the information processing. To deal with this problem, in this paper, we propose a novel hybrid framework, called extendable supervised dictionary learning (E-SDL), to explore diverse and concurrent brain activities under task conditions. A critical difference between E-SDL framework and previous methods is that we systematically extend the basic task paradigm regressor into meaningful regressor groups to account for possible regressor variation during the information processing procedure in the brain. Applications of the proposed framework on five independent and publicly available tfMRI datasets from human connectome project (HCP) simultaneously revealed more meaningful group-wise consistent task-evoked networks and common intrinsic connectivity networks (ICNs). These results demonstrate the advantage of the proposed framework in identifying the diversity of concurrent brain activities in tfMRI datasets.
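One ingredient of the framework described above is extending the basic task paradigm regressor into a group of regressors that also capture delayed responses. The sketch below illustrates that idea only in a toy form, with purely hypothetical function and parameter names; the actual E-SDL construction (HRF convolution, dictionary learning, etc.) is considerably more elaborate.

```python
# Toy illustration: turn one task time course into a regressor group by adding
# delayed copies, so delayed brain responses can be modeled alongside the
# dominant task-evoked response. Delays are in time points (hypothetical values).

def extend_regressor(base, delays=(0, 2, 4)):
    """Return a list of delayed copies of a task time course."""
    group = []
    for d in delays:
        shifted = [0.0] * d + base[: len(base) - d]
        group.append(shifted)
    return group
```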
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
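The per-requirement verification plan structure named above (Verification Requirement, Success Criteria, Method, Level, Owner) can be pictured in plain data-structure terms roughly as follows; this is only a sketch mirroring the paper's field names, not the SysML/Enterprise Architect representation.

```python
# Sketch of a per-requirement verification plan record. Field names follow the
# terminology in the abstract; the example values are hypothetical.

from dataclasses import dataclass

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: list   # e.g. ["Test", "Analysis", "Inspection", "Demonstration"]
    level: str      # e.g. "Subsystem"
    owner: str

plan = VerificationPlan(
    requirement_id="REQ-001",
    verification_requirement="Verify image throughput",
    success_criteria="Sustained rate meets the specified threshold",
    methods=["Test"],
    level="Subsystem",
    owner="Systems Engineering",
)
```

In the modeled approach, each method of each such plan becomes a Verification Activity, grouped into Verification Events that can be scheduled as tasks in the project management system.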
NASA Technical Reports Server (NTRS)
Saravanos, D. A.; Chamis, C. C.; Morel, M.
1991-01-01
A methodology is presented to reduce the residual matrix stresses in continuous fiber metal matrix composites (MMC) by optimizing the fabrication process and interphase layer characteristics. The response of the fabricated MMC was simulated based on nonlinear micromechanics. Application cases include fabrication tailoring, interphase tailoring, and concurrent fabrication-interphase optimization. Two composite systems, silicon carbide/titanium and graphite/copper, are considered. Results illustrate the merits of each approach, indicate that concurrent fabrication/interphase optimization produces significant reductions in the matrix residual stresses and demonstrate the strong coupling between fabrication and interphase tailoring.
System Software Framework for System of Systems Avionics
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.
2005-01-01
Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol is suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development. This is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
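The publish/subscribe communication style recommended above can be illustrated with a minimal broker sketch. This is not the Real Time Publish/Subscribe protocol or any ARINC 653 API; all class and method names are hypothetical, and the point is only the decoupling of heterogeneous systems via topics.

```python
# Minimal publish/subscribe broker sketch: systems register interest in topics
# and receive messages without knowing who published them.

from collections import defaultdict

class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive all messages on a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        """Deliver a message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(payload)
```

This decoupling is what lets independently developed ("Lego style") systems be connected without each knowing the others' interfaces.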
NASA Astrophysics Data System (ADS)
Kargarian, M.; Jafari, R.; Langari, A.
2007-12-01
We have combined the ideas of the renormalization group and quantum-information theory. We have shown how the entanglement, or concurrence, evolves as the size of the system becomes large, i.e., the finite-size scaling is obtained. Moreover, we introduce how the renormalization-group approach can be implemented to obtain the quantum-information properties of a many-body system. We have obtained the concurrence as a measure of entanglement, its derivatives, and their scaling behavior versus the size of the system for the one-dimensional Ising model in a transverse field. We have found that the derivative of the concurrence between two blocks, each containing half of the system, diverges at the critical point with an exponent that is directly associated with the divergence of the correlation length.
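For reference, the concurrence used here as the entanglement measure is the standard Wootters formula for a two-qubit state $\rho$ (a textbook definition, not specific to this paper):

```latex
C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
```

where $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \lambda_4$ are the square roots of the eigenvalues of $\rho\tilde{\rho}$, with the spin-flipped state $\tilde{\rho} = (\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y)$ and $\rho^{*}$ the complex conjugate of $\rho$ in the standard basis.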
ERIC Educational Resources Information Center
Maseda, F. J.; Martija, I.; Martija, I.
2012-01-01
This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE_TT), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…
Computer Program Re-layers Engineering Drawings
NASA Technical Reports Server (NTRS)
Crosby, Dewey C., III
1990-01-01
RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.
On the feasibility of concurrent human TMS-EEG-fMRI measurements
Reithler, Joel; Schuhmann, Teresa; de Graaf, Tom; Uludağ, Kâmil; Goebel, Rainer; Sack, Alexander T.
2013-01-01
Simultaneously combining the complementary assets of EEG, functional MRI (fMRI), and transcranial magnetic stimulation (TMS) within one experimental session provides synergetic results, offering insights into brain function that go beyond the scope of each method when used in isolation. The steady increase of concurrent EEG-fMRI, TMS-EEG, and TMS-fMRI studies further underlines the added value of such multimodal imaging approaches. Whereas concurrent EEG-fMRI enables monitoring of brain-wide network dynamics with high temporal and spatial resolution, the combination with TMS provides insights in causal interactions within these networks. Thus the simultaneous use of all three methods would allow studying fast, spatially accurate, and distributed causal interactions in the perturbed system and its functional relevance for intact behavior. Concurrent EEG-fMRI, TMS-EEG, and TMS-fMRI experiments are already technically challenging, and the three-way combination of TMS-EEG-fMRI might yield additional difficulties in terms of hardware strain or signal quality. The present study explored the feasibility of concurrent TMS-EEG-fMRI studies by performing safety and quality assurance tests based on phantom and human data combining existing commercially available hardware. Results revealed that combined TMS-EEG-fMRI measurements were technically feasible, safe in terms of induced temperature changes, allowed functional MRI acquisition with comparable image quality as during concurrent EEG-fMRI or TMS-fMRI, and provided artifact-free EEG before and from 300 ms after TMS pulse application. Based on these empirical findings, we discuss the conceptual benefits of this novel complementary approach to investigate the working human brain and list a number of precautions and caveats to be heeded when setting up such multimodal imaging facilities with current hardware. PMID:23221407
Using Histories to Implement Atomic Objects
NASA Technical Reports Server (NTRS)
Ng, Pui
1987-01-01
In this paper we describe an approach to implementing atomicity. Atomicity requires that computations appear to be all-or-nothing and executed in a serialization order. The approach we describe has three characteristics. First, it utilizes the semantics of an application to improve concurrency. Second, it reduces the complexity of application-dependent synchronization code by analyzing the process of writing it. In fact, the process can be automated with logic programming. Third, our approach hides the protocol used to arrive at a serialization order from the applications. As a result, different protocols can be used without affecting the applications. Our approach uses a history tree abstraction. The history tree captures the ordering relationship among concurrent computations. By determining what types of computations exist in the history tree and their parameters, a computation can determine whether it can proceed.
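A toy illustration of one element of the approach above, exploiting application semantics to improve concurrency: two operations may proceed concurrently when they commute. The operation names and the commutativity table below are hypothetical; the paper's history-tree protocol is far more general.

```python
# Semantics-aware concurrency check: an operation may proceed only if it
# commutes with every currently active operation. The commutativity table
# here (reads commute with reads, increments with increments) is illustrative.

COMMUTING_PAIRS = {("read", "read"), ("inc", "inc")}

def commute(op1, op2):
    """True if the two operations can be reordered without changing results."""
    return (op1, op2) in COMMUTING_PAIRS or (op2, op1) in COMMUTING_PAIRS

def can_proceed(pending_op, active_ops):
    """True if pending_op commutes with all concurrently active operations."""
    return all(commute(pending_op, a) for a in active_ops)
```

In the paper's terms, such semantic checks are made against the computations recorded in the history tree rather than a flat set of active operations.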
A Principled Approach to the Specification of System Architectures for Space Missions
NASA Technical Reports Server (NTRS)
McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad
2015-01-01
Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.
Space Station logistics policy - Risk management from the top down
NASA Technical Reports Server (NTRS)
Paules, Granville; Graham, James L., Jr.
1990-01-01
Considerations are presented in the area of risk management specifically relating to logistics and system supportability. These considerations form a basis for confident application of concurrent engineering principles to a development program, aiming at simultaneous consideration of support and logistics requirements within the engineering process as the system concept and designs develop. It is shown that, by applying such a process, the chances of minimizing program logistics and supportability risk in the long term can be improved. The problem of analyzing and minimizing integrated logistics risk for the Space Station Freedom Program is discussed.
Concurrent Learning of Control in Multi agent Sequential Decision Tasks
2018-04-17
The overall objective of this project was to develop multi-agent reinforcement learning (MARL) approaches for intelligent agents to autonomously learn distributed control policies in decentralized partially observable...
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis, (2) Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) Noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
Exascale computing and what it means for shock physics
NASA Astrophysics Data System (ADS)
Germann, Timothy
2015-06-01
The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10^18 operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.
Electrical features of eighteen automated external defibrillators: a systematic evaluation.
Kette, Fulvio; Locatelli, Aldo; Bozzola, Marcella; Zoli, Alberto; Li, Yongqin; Salmoiraghi, Marco; Ristagno, Giuseppe; Andreassi, Aida
2013-11-01
Assessment and comparison of the electrical parameters (energy, current, first and second phase waveform duration) among eighteen AEDs. Engineering bench tests for a descriptive systematic evaluation of commercially available AEDs. AEDs were tested through an ECG simulator, an impedance simulator, an oscilloscope and a measuring device detecting energy delivered, peak and average current, and duration of first and second phase of the biphasic waveforms. All tests were performed at the engineering facility of the Lombardia Regional Emergency Service (AREU). Large variations in the energy delivered at the first shock were observed. Delivered current declined progressively as impedance increased. First and second phase duration varied substantially among the AEDs using the exponential biphasic waveform, unlike rectilinear waveform AEDs in which phase duration remained relatively constant. There is a large variability in the electrical features of the AEDs tested. Energy is likely not the best indicator for strength dose selection. Current and shock duration should both be considered when approaching the technical features of AEDs. These findings may prompt further investigations to define the optimal current and duration of the shock waves to increase the success rate in the clinical setting. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Leskiw, Donald M.; Zhau, Junmei
2000-06-01
This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
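The Kalman Update equations the abstract refers to are standard; the scalar sketch below (our own, with illustrative numbers; the paper's multi-level-of-detail formulation is more general) shows the mechanism: a coarse, high-uncertainty model state is refined by a more detailed "measurement", and the posterior variance quantifies the resulting accuracy:

```python
def kalman_update(x_prior, P_prior, z, H, R):
    """Scalar Kalman update: refine a prior estimate with one measurement."""
    y = z - H * x_prior               # innovation (measurement residual)
    S = H * P_prior * H + R           # innovation covariance
    K = P_prior * H / S               # Kalman gain
    x_post = x_prior + K * y          # refined estimate
    P_post = (1.0 - K * H) * P_prior  # reduced uncertainty
    return x_post, P_post

# A coarse model state (large variance P) refined by a detailed "measurement":
x, P = kalman_update(x_prior=1.0, P_prior=4.0, z=1.8, H=1.0, R=1.0)
print(round(x, 2), round(P, 2))  # 1.64 0.8
```

In the level-of-detail reading, `H` plays the role of the transformation between a coarse abstraction and a finer one, and `P` is the explicit model-accuracy bookkeeping the paper describes.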
NASA Astrophysics Data System (ADS)
Cutler, Stephanie Leigh
The purpose of this dissertation is to investigate how educational research, specifically Research-Based Instructional Strategies (RBIS), is adopted by education practice, specifically within the engineering Statics classroom. Using a systematic approach, changes in classroom teaching practices were investigated from the instructors' perspective. Both researchers and practitioners are included in the process, combining efforts to improve student learning, which is a critical goal for engineering education. The study is divided into 3 stages and each is discussed in an individual manuscript. Manuscript 1 provides an assessment of current teaching practices; Manuscript 2 explores RBIS use by Statics instructors and perceived barriers to adoption; and Manuscript 3 evaluates adoption using Fidelity of Implementation. A common set of concurrent mixed methods was used for each stage of this study. A quantitative national survey of Statics instructors (n = 166) and 18 qualitative interviews were conducted to examine activities used in the Statics classroom and familiarity with nine RBIS. The results of this study show that lecturing is the most common activity throughout Statics classrooms, but is not the only activity. Other common activities included working examples and students working on problems individually and in groups. As discussed by the interview participants, each of Rogers' characteristics influenced adoption for different reasons. For example, Complexity (level of difficulty with implementation of an RBIS) was most commonly identified as a barrier. This study also evaluated the Fidelity of Implementation for each RBIS and found it to be higher for RBIS that were less complex (in terms of the number of critical components). Many of the critical components (i.e. activities required for implementation, as described in the literature) were found to statistically distinguish RBIS users and non-users. 
This dissertation offers four contributions: (1) an understanding of current practices in Statics; (2) the instructor perspective of the barriers to using RBIS in the classroom; (3) the use of Fidelity of Implementation as a unique evaluation of RBIS adoption, which can be used by future engineering education researchers; and (4) a systematic approach of exploring change in the classroom, which offers new perspectives and approaches to accelerate the adoption process.
The Complexities of Teachers' Commitment to Environmental Education: A Mixed Methods Approach
ERIC Educational Resources Information Center
Sosu, Edward M.; McWilliam, Angus; Gray, Donald S.
2008-01-01
This article argues that a mixed methods approach is useful in understanding the complexity that underlies teachers' commitment to environmental education. Using sequential and concurrent procedures, the authors demonstrate how different methodological approaches highlighted different aspects of teacher commitment. The quantitative survey examined…
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms also learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…
33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...
Proceedings of the Department of Defense Environmental Technology Workshop
1995-05-01
Fabrication Laboratory Results in Waste Elimination: William J. Kelso, Parsons Engineering Science, Inc.; Susan H. Errett, Lt. Col. Ronald D. Fancher... Williams, Ocean City Research Corporation. NDCEE Reduces Risk in Technology Transfer: Jack H. Cavanaugh, Concurrent... Ecological Receptors: William R. Alsop, Mark E. Stelljes, Elizabeth T. Hawkins, Harding Lawson Associates; William Collins, U.S. Department of the Army
1995-09-01
vital processes of a business. Keywords: process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to... integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems
Information Integration for Concurrent Engineering (IICE) Compendium of Methods Report
1995-06-01
technological, economic, and strategic benefits can be attained through the effective capture, control, and management of information and knowledge... resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to achieve... integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems that
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... enhance the quality, utility, and clarity of the information collected; (d) ways to minimize the burden of... frequencies. Section 90.545(c)(1) requires that public safety applicants select one of three ways to meet TV... engineering study to justify other separations; or (3) obtain concurrence from the applicable TV/DTV station(s...
A cognitive approach to classifying perceived behaviors
NASA Astrophysics Data System (ADS)
Benjamin, Dale Paul; Lyons, Damian
2010-04-01
This paper describes our work on integrating distributed, concurrent control in a cognitive architecture, and using it to classify perceived behaviors. We are implementing the Robot Schemas (RS) language in Soar. RS is a CSP-type programming language for robotics that controls a hierarchy of concurrently executing schemas. The behavior of every RS schema is defined using port automata. This provides precision to the semantics and also a constructive means of reasoning about the behavior and meaning of schemas. Our implementation uses Soar operators to build, instantiate and connect port automata as needed. Our approach is to use comprehension through generation (similar to NLSoar) to search for ways to construct port automata that model perceived behaviors. The generality of RS permits us to model dynamic, concurrent behaviors. A virtual world (Ogre) is used to test the accuracy of these automata. Soar's chunking mechanism is used to generalize and save these automata. In this way, the robot learns to recognize new behaviors.
Concurrent evolution of feature extractors and modular artificial neural networks
NASA Astrophysics Data System (ADS)
Hannak, Victor; Savakis, Andreas; Yang, Shanchieh Jay; Anderson, Peter
2009-05-01
This paper presents a new approach for the design of feature-extracting recognition networks that do not require expert knowledge in the application domain. Feature-Extracting Recognition Networks (FERNs) are composed of interconnected functional nodes (feurons), which serve as feature extractors, and are followed by a subnetwork of traditional neural nodes (neurons) that act as classifiers. A concurrent evolutionary process (CEP) is used to search the space of feature extractors and neural networks in order to obtain an optimal recognition network that simultaneously performs feature extraction and recognition. By constraining the hill-climbing search functionality of the CEP on specific parts of the solution space, i.e., individually limiting the evolution of feature extractors and neural networks, it was demonstrated that concurrent evolution is a necessary component of the system. Application of this approach to a handwritten digit recognition task illustrates that the proposed methodology is capable of producing recognition networks that perform in-line with other methods without the need for expert knowledge in image processing.
Ó Ciardha, Caoilte; Attard-Johnson, Janice; Bindemann, Markus
2018-04-01
Latency-based measures of sexual interest require additional evidence of validity, as do newer pupil dilation approaches. A total of 102 community men completed six latency-based measures of sexual interest. Pupillary responses were recorded during three of these tasks and in an additional task where no participant response was required. For adult stimuli, there was a high degree of intercorrelation between measures, suggesting that tasks may be measuring the same underlying construct (convergent validity). In addition to being correlated with one another, measures also predicted participants' self-reported sexual interest, demonstrating concurrent validity (i.e., the ability of a task to predict a more validated, simultaneously recorded, measure). Latency-based and pupillometric approaches also showed preliminary evidence of concurrent validity in predicting both self-reported interest in child molestation and viewing pornographic material containing children. Taken together, the study findings build on the evidence base for the validity of latency-based and pupillometric measures of sexual interest.
Radisic, Milica; Park, Hyoungshin; Shing, Helen; Consi, Thomas; Schoen, Frederick J; Langer, Robert; Freed, Lisa E; Vunjak-Novakovic, Gordana
2004-12-28
The major challenge of tissue engineering is directing the cells to establish the physiological structure and function of the tissue being replaced across different hierarchical scales. To engineer myocardium, biophysical regulation of the cells needs to recapitulate multiple signals present in the native heart. We hypothesized that excitation-contraction coupling, critical for the development and function of a normal heart, determines the development and function of engineered myocardium. To induce synchronous contractions of cultured cardiac constructs, we applied electrical signals designed to mimic those in the native heart. Over only 8 days in vitro, electrical field stimulation induced cell alignment and coupling, increased the amplitude of synchronous construct contractions by a factor of 7, and resulted in a remarkable level of ultrastructural organization. Development of conductive and contractile properties of cardiac constructs was concurrent, with strong dependence on the initiation and duration of electrical stimulation.
An Example of Concurrent Engineering
NASA Technical Reports Server (NTRS)
Rowe, Sidney; Whitten, David; Cloyd, Richard; Coppens, Chris; Rodriguez, Pedro
1998-01-01
The Collaborative Engineering Design and Analysis Room (CEDAR) facility allows on-the-spot design review capability for any project during all phases of development. The required disciplines assemble in this facility to work on any problems (analysis, manufacturing, inspection, etc.) associated with a particular design. A small highly focused team of specialists can meet in this room to better expedite the process of developing a solution to an engineering task within the framework of the constraints that are unique to each discipline. This facility provides the engineering tools and translators to develop a concept within the confines of the room or with remote team members that could access the team's data from other locations. The CEDAR area is envisioned as excellent for failure investigation meetings to be conducted where the computer capabilities can be utilized in conjunction with the Smart Board display to develop failure trees, brainstorm failure modes, and evaluate possible solutions.
Concurrent Mission and Systems Design at NASA Glenn Research Center: The Origins of the COMPASS Team
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Oleson, Steven R.; Sarver-Verhey, Timothy R.
2012-01-01
Established at the NASA Glenn Research Center (GRC) in 2006 to meet the need for rapid mission analysis and multi-disciplinary systems design for in-space and human missions, the Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team is a multidisciplinary, concurrent engineering group whose primary purpose is to perform integrated systems analysis, but it is also capable of designing any system that involves one or more of the disciplines present in the team. The authors were involved in the development of the COMPASS team and its design process, and are continuously making refinements and enhancements. The team was unofficially started in the early 2000s as part of the distributed team known as Team JIMO (Jupiter Icy Moons Orbiter) in support of the multi-center collaborative JIMO spacecraft design during Project Prometheus. This paper documents the origins of a concurrent mission and systems design team at GRC and how it evolved into the COMPASS team, including defining the process, gathering the team and tools, building the facility, and performing studies.
ERIC Educational Resources Information Center
Zhao, Yue; Huen, Jenny M. Y.; Prosser, Michael
2017-01-01
Purpose: Hong Kong has undergone extensive curriculum reform and shifted from a three-year to a four-year university system. With a nuanced look at the impact of the curriculum reform, the purpose of the present study was to compare two concurrent cohorts by examining the extent to which the students in each cohort perceived their learning…
Lopez, Nicole E; Peterson, Carrie Y; Ramamoorthy, Sonia L; McLemore, Elisabeth C; Sedrak, Michael F; Lowy, Andrew M; Horgan, Santiago; Talamini, Mark A; Sicklick, Jason K
2015-02-01
Single-incision laparoscopic surgery (SILS) is gaining popularity for a wide variety of surgical operations and capitalizes on the benefits of traditional laparoscopic surgery without incurring multiple incision sites. Traditionally, SILS is performed by a midline periumbilical approach. However, such a minimally invasive approach may be utilized in patients who already have an abdominal incision. Our series retrospectively reviews 7 cases in which we utilized the fascial defect at the time of ostomy reversal as our SILS incision site. In turn, we performed a variety of concurrent intra-abdominal procedures with excellent technical success and outcomes. Our study is the largest single-institution case series of this novel approach and suggests that utilizing an existing ostomy-site abdominal incision is a safe and effective location for SILS port placement and should be considered in patients undergoing concurrent procedures.
A One-Versus-All Class Binarization Strategy for Bearing Diagnostics of Concurrent Defects
Ng, Selina S. Y.; Tse, Peter W.; Tsui, Kwok L.
2014-01-01
In bearing diagnostics using a data-driven modeling approach, a concern is the need for data from all possible scenarios to build a practical model for all operating conditions. This paper is a study on bearing diagnostics with the concurrent occurrence of multiple defect types. The authors are not aware of any work in the literature that studies this practical problem. A strategy based on one-versus-all (OVA) class binarization is proposed to improve fault diagnostics accuracy while reducing the number of scenarios for data collection, by predicting concurrent defects from training data of normal and single defects. The proposed OVA diagnostic approach is evaluated with empirical analysis using support vector machine (SVM) and C4.5 decision tree, two popular classification algorithms frequently applied to system health diagnostics and prognostics. Statistical features are extracted from the time domain and the frequency domain. Prediction performance of the proposed strategy is compared with that of a simple multi-class classification, as well as that of random guess and worst-case classification. We have verified the potential of the proposed OVA diagnostic strategy in performance improvements for single-defect diagnosis and predictions of BPFO plus BPFI concurrent defects using two laboratory-collected vibration data sets. PMID:24419162
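The OVA binarization idea can be sketched in miniature: run one binary detector per single-defect class, and label a sample that fires several detectors at once as a concurrent defect, with no concurrent-defect training data needed. The threshold detectors and feature indices below are invented stand-ins for the paper's SVM/C4.5 classifiers and time/frequency-domain statistical features:

```python
def make_detector(feature_index, threshold):
    # One binary "this defect vs. rest" detector, here a fixed toy rule.
    return lambda x: x[feature_index] > threshold

detectors = {
    "BPFO": make_detector(0, 0.5),  # outer-race defect feature (invented)
    "BPFI": make_detector(1, 0.5),  # inner-race defect feature (invented)
}

def diagnose(x):
    # Run every one-versus-all detector; several firing together signals a
    # concurrent defect, even though each detector saw only single defects.
    fired = sorted(name for name, detect in detectors.items() if detect(x))
    return "+".join(fired) if fired else "normal"

print(diagnose([0.9, 0.1]))  # BPFO
print(diagnose([0.8, 0.7]))  # BPFI+BPFO  (predicted concurrent defect)
print(diagnose([0.1, 0.1]))  # normal
```

This is why the strategy reduces data-collection burden: only normal and single-defect scenarios need labeled training data, and concurrent combinations fall out of the detector ensemble.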
NASA Astrophysics Data System (ADS)
Siripatana, Chairat; Thongpan, Hathaikarn; Promraksa, Arwut
2017-03-01
This article explores a volumetric approach in formulating differential equations for a class of engineering flow problems involving component transfer within or between two phases. In contrast to conventional formulation, which is based on linear velocities, this work proposed a slightly different approach based on volumetric flow-rate, which is essentially constant in many industrial processes. In effect, many multi-dimensional flow problems found industrially can be simplified into multi-component or multi-phase but one-dimensional flow problems. The formulation is largely generic, covering counter-current, concurrent or batch, fixed and fluidized bed arrangements. It is also intended for use in start-up, shut-down, control and steady state simulation. Since many realistic industrial operations are dynamic, with variable velocity and porosity in relation to position, analytical solutions are rare and limited to only very simple cases. Thus we also provide a numerical solution using the Crank-Nicolson finite difference scheme. This solution is inherently stable, as tested against a few cases published in the literature. However, it is anticipated that, for unconfined flow or non-constant flow-rate, traditional formulation should be applied.
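For reference, a minimal Crank-Nicolson step of the kind the abstract reports as unconditionally stable, here for 1-D diffusion with fixed ends (the grid sizes, diffusivity, and boundary treatment are our illustrative assumptions, not the paper's volumetric flow formulation):

```python
import numpy as np

def crank_nicolson_step(u, D, dx, dt):
    """One Crank-Nicolson step for u_t = D u_xx with fixed (Dirichlet) ends."""
    n = len(u)
    r = D * dt / (2.0 * dx * dx)
    A = np.zeros((n, n))  # implicit (next-step) operator
    B = np.zeros((n, n))  # explicit (current-step) operator
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
        B[i, i - 1], B[i, i], B[i, i + 1] = r, 1 - 2 * r, r
    A[0, 0] = A[-1, -1] = 1.0  # hold boundary values fixed
    B[0, 0] = B[-1, -1] = 1.0
    return np.linalg.solve(A, B @ u)

u = np.zeros(11)
u[5] = 1.0                                  # initial concentration spike
u = crank_nicolson_step(u, D=1.0, dx=0.1, dt=0.001)
print(bool(u[5] < 1.0 and u[4] > 0.0))      # True: the spike diffuses outward
```

Averaging the explicit and implicit operators is what gives the scheme its stability for any time step, which is the property the article relies on for dynamic start-up and shut-down simulation.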
SNOMED CT module-driven clinical archetype management.
Allones, J L; Taboada, M; Martinez, D; Lozano, R; Sobrido, M J
2013-06-01
To explore semantic search to improve management and user navigation in clinical archetype repositories. In order to support semantic searches across archetypes, an automated method based on SNOMED CT modularization is implemented to transform clinical archetypes into SNOMED CT extracts. Concurrently, query terms are converted into SNOMED CT concepts using the search engine Lucene. Retrieval is then carried out by matching query concepts with the corresponding SNOMED CT segments. A test collection of 16 clinical archetypes, including over 250 terms, and a subset of 55 clinical terms from two medical dictionaries, MediLexicon and MedlinePlus, were used to test our method. The keyword-based service supported by the OpenEHR repository offered us a benchmark to evaluate the enhancement of performance. In total, our approach reached 97.4% precision and 69.1% recall, providing a substantial improvement of recall (more than 70%) compared to the benchmark. Exploiting medical domain knowledge from ontologies such as SNOMED CT may overcome some limitations of keyword-based systems and thus improve the search experience of repository users. An automated approach based on ontology segmentation is an efficient and feasible way of supporting modeling, management and user navigation in clinical archetype repositories. Copyright © 2013 Elsevier Inc. All rights reserved.
Metric integration architecture for product development
NASA Astrophysics Data System (ADS)
Sieger, David B.
1997-06-01
Present-day product development endeavors utilize the concurrent engineering philosophy as a logical means for incorporating a variety of viewpoints into the design of products. Since this approach provides no explicit procedural provisions, it is necessary to establish at least a mental coupling with a known design process model. The central feature of all such models is the management and transformation of information. While these models assist in structuring the design process, characterizing the basic flow of operations that are involved, they provide no guidance facilities. The significance of this feature, and the role it plays in the time required to develop products, is increasing in importance due to the inherent process dynamics, system/component complexities, and competitive forces. The methodology presented in this paper involves the use of a hierarchical system structure, discrete event system specification (DEVS), and multidimensional state variable based metrics. This approach is unique in its capability to quantify designer's actions throughout product development, provide recommendations about subsequent activity selection, and coordinate distributed activities of designers and/or design teams across all design stages. Conceptual design tool implementation results are used to demonstrate the utility of this technique in improving the incremental decision making process.
Issa, Ziad F
2007-09-01
Atrioventricular junction (AVJ) ablation combined with permanent pacemaker implantation (the "ablate and pace" approach) remains an acceptable alternative treatment strategy for symptomatic, drug-refractory atrial fibrillation (AF) with rapid ventricular response. This case series describes the feasibility and safety of catheter ablation of the AVJ via a superior vena caval approach performed during concurrent dual-chamber pacemaker implantation. A total of 17 consecutive patients with symptomatic, drug-refractory, paroxysmal AF underwent combined AVJ ablation and dual-chamber pacemaker implantation procedure using a left axillary venous approach. Two separate introducer sheaths were placed into the axillary vein. The first sheath was used for implantation of the pacemaker ventricular lead, which was then connected to the pulse generator. Subsequently, a standard ablation catheter was introduced through the second axillary venous sheath and used for radiofrequency (RF) ablation of the AVJ. After successful ablation, the catheter was withdrawn and the pacemaker atrial lead was advanced through that same sheath and implanted in the right atrium. Catheter ablation of the AVJ was successfully achieved in all patients. The median number of RF applications required to achieve complete AV block was three (range 1-10). In one patient, AV conduction recovered within the first hour after completion of the procedure, and AVJ ablation was then performed using the conventional femoral venous approach. There were no procedural complications. Catheter ablation of the AVJ can be performed successfully and safely via a superior vena caval approach in patients undergoing concurrent dual-chamber pacemaker implantation.
Theory of remote entanglement via quantum-limited phase-preserving amplification
NASA Astrophysics Data System (ADS)
Silveri, Matti; Zalys-Geller, Evan; Hatridge, Michael; Leghtas, Zaki; Devoret, Michel H.; Girvin, S. M.
2016-06-01
We show that a quantum-limited phase-preserving amplifier can act as a which-path information eraser when followed by heterodyne detection. This "beam splitter with gain" implements a continuous joint measurement on the signal sources. As an application, we propose heralded concurrent remote entanglement generation between two qubits coupled dispersively to separate cavities. Dissimilar qubit-cavity pairs can be made indistinguishable by simple engineering of the cavity driving fields providing further experimental flexibility and the prospect for scalability. Additionally, we find an analytic solution for the stochastic master equation, a quantum filter, yielding a thorough physical understanding of the nonlinear measurement process leading to an entangled state of the qubits. We determine the concurrence of the entangled states and analyze its dependence on losses and measurement inefficiencies.
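The concurrence mentioned at the end of the abstract is the standard Wootters measure of two-qubit entanglement; for reference (this is textbook background, not a formula reproduced from the paper):

```latex
% Wootters concurrence of a two-qubit density matrix \rho
C(\rho) = \max\{0,\; \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
\qquad
\tilde{\rho} = (\sigma_y \otimes \sigma_y)\, \rho^{*} \,(\sigma_y \otimes \sigma_y),
```

where the $\lambda_i$ are the square roots of the eigenvalues of $\rho\tilde{\rho}$ in decreasing order; $C = 0$ for separable states and $C = 1$ for maximally entangled states such as the Bell states targeted by the remote entanglement protocol.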
Huet, Michaël; Jacobs, David M; Camachon, Cyril; Goulon, Cedric; Montagne, Gilles
2009-12-01
This study (a) compares the effectiveness of different types of feedback for novices who learn to land a virtual aircraft in a fixed-base flight simulator and (b) analyzes the informational variables that learners come to use after practice. An extensive body of research exists concerning the informational variables that allow successful landing. In contrast, few studies have examined how the attention of pilots can be directed toward these sources of information. In this study, 15 participants were asked to land a virtual Cessna 172 on 245 trials while trying to follow the glide-slope area as accurately as possible. Three groups of participants practiced under different feedback conditions: with self-controlled concurrent feedback (the self-controlled group), with imposed concurrent feedback (the yoked group), or without concurrent feedback (the control group). The self-controlled group outperformed the yoked group, which in turn outperformed the control group. Removing or manipulating specific sources of information during transfer tests had different effects for different individuals. However, removing the cockpit from the visual scene had a detrimental effect on the performance of the majority of the participants. Self-controlled concurrent feedback helps learners to more quickly attune to the informational variables that allow them to control the aircraft during the approach phase. Knowledge concerning feedback schedules can be used for the design of optimal practice methods for student pilots, and knowledge about the informational variables used by expert performers has implications for the design of cockpits and runways that facilitate the detection of these variables.
NASA Astrophysics Data System (ADS)
Knaster, J.; Ibarra, A.; Abal, J.; Abou-Sena, A.; Arbeiter, F.; Arranz, F.; Arroyo, J. M.; Bargallo, E.; Beauvais, P.-Y.; Bernardi, D.; Casal, N.; Carmona, J. M.; Chauvin, N.; Comunian, M.; Delferriere, O.; Delgado, A.; Diaz-Arocas, P.; Fischer, U.; Frisoni, M.; Garcia, A.; Garin, P.; Gobin, R.; Gouat, P.; Groeschel, F.; Heidinger, R.; Ida, M.; Kondo, K.; Kikuchi, T.; Kubo, T.; Le Tonqueze, Y.; Leysen, W.; Mas, A.; Massaut, V.; Matsumoto, H.; Micciche, G.; Mittwollen, M.; Mora, J. C.; Mota, F.; Nghiem, P. A. P.; Nitti, F.; Nishiyama, K.; Ogando, F.; O'hira, S.; Oliver, C.; Orsini, F.; Perez, D.; Perez, M.; Pinna, T.; Pisent, A.; Podadera, I.; Porfiri, M.; Pruneri, G.; Queral, V.; Rapisarda, D.; Roman, R.; Shingala, M.; Soldaini, M.; Sugimoto, M.; Theile, J.; Tian, K.; Umeno, H.; Uriot, D.; Wakai, E.; Watanabe, K.; Weber, M.; Yamamoto, M.; Yokomine, T.
2015-08-01
The International Fusion Materials Irradiation Facility (IFMIF), presently in its Engineering Validation and Engineering Design Activities (EVEDA) phase under the frame of the Broader Approach Agreement between Europe and Japan, completed its EDA phase on schedule in summer 2013 with the release of the engineering design report of the IFMIF plant, which is described here. Many improvements of the design from former phases are implemented, particularly a reduction of beam losses and operational costs thanks to the superconducting accelerator concept, the re-location of the quench tank outside the test cell (TC) with a reduction of tritium inventory and a simplification of its replacement in case of failure, the separation of the irradiation modules from the shielding block gaining irradiation flexibility, enhancement of the remote handling equipment reliability and cost reduction, and the water cooling of the liner and biological shielding of the TC, enhancing the efficiency and economy of the related sub-systems. In addition, the maintenance strategy has been modified to allow a shorter yearly stop of the irradiation operations and a more careful management of the irradiated samples. The design of the IFMIF plant is intimately linked with the EVA phase carried out since the entry into force of IFMIF/EVEDA in June 2007. These last activities and their on-going accomplishment have been thoroughly described elsewhere (Knaster J et al [19]), which, combined with the present paper, allows a clear understanding of the maturity of the European-Japanese international efforts. This released IFMIF Intermediate Engineering Design Report (IIEDR), which could be complemented if required concurrently with the outcome of the on-going EVA, will allow decision making on its construction and/or serve as the basis for the definition of the next step, aligned with the evolving needs of our fusion community.
A Methodology for Formal Hardware Verification, with Application to Microprocessors.
1993-08-29
concurrent programming languages. Proceedings of the NATO Advanced Study Institute on Logics and Models of Concurrent Systems (Colle-sur-Loup, France, 8-19...restricted class of formulas. Bose and Fisher [26] developed a symbolic model checker based on a Cosmos switch-level model. Their modeling approach...verification using SDVS - the method and a case study. 17th Annual Microprogramming Workshop (New Orleans, LA, 30 October-2 November 1984). Published as
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
DARPA Initiative in Concurrent Engineering (DICE). Phase 2
1990-07-31
• XS spreadsheet tool • Q-Calc spreadsheet tool • TAE+ outer wrapper for XS • Framemaker-based formal EDN (Electronic Design Notebook) • Data...shared global object space and object persistence. Technical Results - Module Development - XS Integration Environment: A prototype of the wrapper concepts...for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for
Initiative in Concurrent Engineering (DICE). Phase 1.
1990-02-09
and power of commercial and military electronics systems. The continual evolution of HDE technology offers far greater flexibility in circuit design...powerful magnetic field of the permanent magnets in the Sawyer motors. This makes it possible to have multiple robots in the workcell and to have them...Controller. The Adept IC was chosen because of its extensive processing power, integrated grayscale vision, and standard 28 industrial I/O control
Concurrent Engineering Teams. Volume 2: Annotated Bibliography
1990-11-01
publishes. They normally embody results of major projects which (a) have a direct bearing on decisions affecting major programs, (b) address...D., "What Processes do You Own? How are They Doing?," Program Manager, Journal of the Defense Systems Management College, September-October 1989, pp...216. The key ingredient to any successful TQM program is top management commitment and involvement. The early top management involvement reflects
Implementing Set Based Design into Department of Defense Acquisition
2016-12-01
challenges for the DOD. This report identifies the original SBD principles and characteristics based on Toyota Motor Corporation's Set Based Concurrent...Engineering Model. Additionally, the team reviewed DOD case studies that implemented SBD. The SBD principles, along with the common themes from the...
Adaptive intensity modulated radiotherapy for advanced prostate cancer
NASA Astrophysics Data System (ADS)
Ludlum, Erica Marie
The purpose of this research is to develop and evaluate improvements in intensity modulated radiotherapy (IMRT) for concurrent treatment of prostate and pelvic lymph nodes. The first objective is to decrease delivery time while maintaining treatment quality, and evaluate the effectiveness and efficiency of novel one-step optimization compared to conventional two-step optimization. Both planning methods are examined at multiple levels of complexity by comparing the number of beam apertures, or segments, the amount of radiation delivered as measured by monitor units (MUs), and delivery time. One-step optimization is demonstrated to simplify IMRT planning and reduce segments (from 160 to 40), MUs (from 911 to 746), and delivery time (from 22 to 7 min) with comparable plan quality. The second objective is to examine the capability of three commercial dose calculation engines employing different levels of accuracy and efficiency to handle high-Z materials, such as metallic hip prostheses, included in the treatment field. Pencil beam, convolution superposition, and Monte Carlo dose calculation engines are compared by examining the dose differences for patient plans with unilateral and bilateral hip prostheses, and for phantom plans with a metal insert for comparison with film measurements. Convolution superposition and Monte Carlo methods calculate doses that are 1.3% and 34.5% less than the pencil beam method, respectively. Film results demonstrate that Monte Carlo most closely represents actual radiation delivery, but none of the three engines accurately predict the dose distribution when high-Z heterogeneities exist in the treatment fields. The final objective is to improve the accuracy of IMRT delivery by accounting for independent organ motion during concurrent treatment of the prostate and pelvic lymph nodes. A leaf-shifting algorithm is developed to track daily prostate position without requiring online dose calculation.
Compared to conventional methods of adjusting patient position, adjusting the multileaf collimator (MLC) leaves associated with the prostate in each segment significantly improves lymph node dose coverage (maintains 45 Gy compared to 42.7, 38.3, and 34.0 Gy for iso-shifts of 0.5, 1 and 1.5 cm). Altering the MLC portal shape is demonstrated as a new and effective solution to independent prostate movement during concurrent treatment.
Concurrent Transmission Based on Channel Quality in Ad Hoc Networks: A Game Theoretic Approach
NASA Astrophysics Data System (ADS)
Chen, Chen; Gao, Xinbo; Li, Xiaoji; Pei, Qingqi
In this paper, a decentralized concurrent transmission strategy for a shared channel in Ad Hoc networks is proposed based on game theory. Firstly, a static concurrent transmission game is used to determine the candidates for transmitting by a channel quality threshold and to maximize the overall throughput with consideration of channel quality variation. To achieve the NES (Nash Equilibrium Solution), the selfish behavior of nodes attempting to improve their channel gain unilaterally is evaluated. This game therefore allows each node to operate in a distributed fashion and to decide whether to transmit concurrently with others depending on the NES. Secondly, as there are always some nodes with lower channel gain than the NES, defined as hunger nodes in this paper, a hunger suppression scheme is proposed by adjusting the price function with interference reservation and forward relay, to fairly give hunger nodes transmission opportunities. Finally, inspired by stock trading, a dynamic concurrent transmission threshold determination scheme is implemented to make the static game practical. Numerical results show that the proposed scheme increases concurrent transmission opportunities for active nodes, and at the same time, the number of hunger nodes is greatly reduced with the least increase of threshold by interference reservation. The results also show good performance in network goodput for the proposed model.
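The threshold-based admission step can be sketched in a few lines (a simplified illustration: the gain values and threshold are hypothetical, and the game-theoretic price adjustment and relay mechanics are omitted):

```python
def transmission_decisions(gains, threshold):
    """Partition nodes by channel quality.

    Nodes whose channel gain meets the threshold transmit
    concurrently; nodes below it are the 'hunger' nodes that the
    suppression scheme would later assist via interference
    reservation or forward relay.
    """
    transmitters = [i for i, g in enumerate(gains) if g >= threshold]
    hunger = [i for i, g in enumerate(gains) if g < threshold]
    return transmitters, hunger

# Hypothetical per-node channel gains and an equilibrium threshold.
gains = [0.9, 0.4, 0.7, 0.2, 0.8]
tx, hungry = transmission_decisions(gains, threshold=0.5)
# Nodes 0, 2, 4 transmit concurrently; nodes 1 and 3 are hunger nodes.
```

In the paper's dynamic scheme, the threshold itself would be re-derived each round from the evolving channel state rather than fixed as here.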
Regenerative life support system research
NASA Technical Reports Server (NTRS)
1988-01-01
Sections on modeling, experimental activities during the grant period, and topics under consideration for the future are contained. The sessions contain discussions of: four concurrent modeling approaches that were being integrated near the end of the period (knowledge-based modeling support infrastructure and data base management, object-oriented steady state simulations for three concepts, steady state mass-balance engineering tradeoff studies, and object-oriented time-step, quasidynamic simulations of generic concepts); interdisciplinary research activities, beginning with a discussion of RECON lab development and use, and followed with discussions of waste processing research, algae studies and subsystem modeling, low pressure growth testing of plants, subsystem modeling of plants, control of plant growth using lighting and CO2 supply as variables, search for and development of lunar soil simulants, preliminary design parameters for a lunar base life support system, and research considerations for food processing in space; and appendix materials, including a discussion of the CELSS Conference, detailed analytical equations for mass-balance modeling, plant modeling equations, and parametric data on existing life support systems for use in modeling.
NASA Technical Reports Server (NTRS)
Gerberich, Matthew W.; Oleson, Steven R.
2013-01-01
The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database, to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate the overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters through basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model in regards to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data of completed NASA missions.
Decoupling the Functional Pleiotropy of Stem Cell Factor by Tuning c-Kit Signaling
Ho, Chia Chi M.; Chhabra, Akanksha; Starkl, Philipp; Schnorr, Peter-John; Wilmes, Stephan; Moraga, Ignacio; Kwon, Hye-Sook; Gaudenzio, Nicolas; Sibilano, Riccardo; Wehrman, Tom S.; Gakovic, Milica; Sockolosky, Jonathan T.; Tiffany, Matthew R.; Ring, Aaron M.; Piehler, Jacob; Weissman, Irving L.; Galli, Stephen J.; Shizuru, Judith A.; Garcia, K. Christopher
2017-01-01
Most secreted growth factors and cytokines are functionally pleiotropic because their receptors are expressed on diverse cell types. While important for normal mammalian physiology, pleiotropy limits the efficacy of cytokines and growth factors as therapeutics. Stem cell factor (SCF) is a growth factor that acts through the c-Kit receptor tyrosine kinase to elicit hematopoietic progenitor expansion, but can be toxic when administered in vivo because it concurrently activates mast cells. We engineered a mechanism-based SCF partial agonist that impaired c-Kit dimerization, truncating downstream signaling amplitude. This SCF variant elicited biased activation of hematopoietic progenitors over mast cells in vitro and in vivo. Mouse models of SCF-mediated anaphylaxis, radioprotection, and hematopoietic expansion revealed that this SCF partial agonist retained therapeutic efficacy while exhibiting virtually no anaphylactic off-target effects. The approach of biasing cell activation by tuning signaling thresholds and outputs has applications to many dimeric receptor-ligand systems. PMID:28283060
Wang, Yong; Liu, Jinquan; Christiansen, Silke; Kim, Dong Ha; Gösele, Ulrich; Steinhart, Martin
2008-11-01
Nanopatterned thin carbon films were prepared by direct and expeditious carbonization of the block copolymer polystyrene-block-poly(2-vinylpyridine) (PS-b-P2VP) without the necessity of slow heating to the process temperature and of addition of further carbon precursors. Carbonaceous films having an ordered "dots-on-film" surface topology were obtained from reverse micelle monolayers. The regular nanoporous morphology of PS-b-P2VP films obtained by subjecting reverse micelle monolayers to swelling-induced surface reconstruction could likewise be transferred to carbon films thus characterized by ordered nanopit arrays. Stabilization of PS-b-P2VP by UV irradiation and the concurrent carbonization of both blocks were key to the conservation of the film topography. The approach reported here may enable the realization of a broad range of nanoscaled architectures for carbonaceous materials using a block copolymer ideally suited as a template because of the pronounced repulsion between its blocks and its capability to form highly ordered microdomain structures.
Solving Partial Differential Equations in a data-driven multiprocessor environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaudiot, J.L.; Lin, C.M.; Hosseiniyar, M.
1988-12-31
Partial differential equations can be found in a host of engineering and scientific problems. The emergence of new parallel architectures has spurred research in the definition of parallel PDE solvers. Concurrently, highly programmable systems such as data-flow architectures have been proposed for the exploitation of large scale parallelism. The implementation of some Partial Differential Equation solvers (such as the Jacobi method) on a tagged token data-flow graph is demonstrated here. Asynchronous methods (chaotic relaxation) are studied and new scheduling approaches (the Token No-Labeling scheme) are introduced in order to support the implementation of the asynchronous methods in a data-driven environment. New high-level data-flow language program constructs are introduced in order to handle chaotic operations. Finally, the performance of the program graphs is demonstrated by a deterministic simulation of a message passing data-flow multiprocessor. An analysis of the overhead in the data-flow graphs is undertaken to demonstrate the limits of parallel operations in dataflow PDE program graphs.
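The Jacobi method named above is the classic synchronous relaxation scheme. A minimal serial sketch for a 1-D Poisson problem follows (grid size and boundary values are illustrative; the paper's contribution is mapping such sweeps onto token-driven data-flow graph nodes, which this sketch does not attempt):

```python
def jacobi_step(u, f, h):
    """One synchronous Jacobi sweep for -u'' = f on a uniform grid
    with spacing h and fixed (Dirichlet) endpoint values."""
    new = u[:]  # every update reads only the previous iterate
    for i in range(1, len(u) - 1):
        new[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return new

def jacobi_solve(u, f, h, iters):
    """Iterate the sweep a fixed number of times."""
    for _ in range(iters):
        u = jacobi_step(u, f, h)
    return u

# Laplace problem (f = 0) with boundary values 0 and 1:
# the solution converges to a straight line between the endpoints.
n = 5
u = [0.0] * n
u[-1] = 1.0
sol = jacobi_solve(u, [0.0] * n, 1.0 / (n - 1), 500)
```

Because each sweep reads only the previous iterate, the inner updates are independent, which is exactly what makes the method attractive for the data-flow execution model; the chaotic-relaxation variant studied in the paper relaxes even the synchronization between sweeps.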
Ceramic applications in turbine engines
NASA Technical Reports Server (NTRS)
Byrd, J. A.; Janovicz, M. A.; Thrasher, S. R.
1981-01-01
Development testing activities on the 1900 F-configuration ceramic parts were completed, 2070 F-configuration ceramic component rig and engine testing was initiated, and the conceptual design for the 2265 F-configuration engine was identified. Fabrication of the 2070 F-configuration ceramic parts continued, along with burner rig development testing of the 2070 F-configuration metal combustor in preparation for 1132 C (2070 F) qualification test conditions. Shakedown testing of the hot engine simulator (HES) rig was also completed in preparation for testing of a spin rig-qualified ceramic-bladed rotor assembly at 1132 C (2070 F) test conditions. Concurrently, ceramics from new sources and alternate materials continued to be evaluated, and fabrication of 2070 F-configuration ceramic components from these new sources continued. Cold spin testing of the critical 2070 F-configuration blade continued in the spin test rig to qualify a set of ceramic blades at 117% engine speed for the gasifier turbine rotor. Rig testing of the ceramic-bladed gasifier turbine rotor assembly at 108% engine speed was also performed, which resulted in the failure of one blade. The new three-piece hot seal with the nickel oxide/calcium fluoride wearface composition was qualified in the regenerator rig and introduced to engine operation with marginal success.
Rethinking Regenerative Medicine: A Macrophage-Centered Approach
Brown, Bryan N.; Sicari, Brian M.; Badylak, Stephen F.
2014-01-01
Regenerative medicine, a multi-disciplinary approach that seeks to restore form and function to damaged or diseased tissues and organs, has evolved significantly during the past decade. By adapting and integrating fundamental knowledge from cell biology, polymer science, and engineering, coupled with an increasing understanding of the mechanisms which underlie the pathogenesis of specific diseases, regenerative medicine has the potential for innovative and transformative therapies for heretofore unmet medical needs. However, the translation of novel technologies from the benchtop to animal models and clinical settings is non-trivial and requires an understanding of the mechanisms by which the host will respond to these novel therapeutic approaches. The role of the innate immune system, especially the role of macrophages, in the host response to regenerative medicine based strategies has recently received considerable attention. Macrophage phenotype and function have been suggested as critical and determinant factors in downstream outcomes. The constructive and regulatory, and in fact essential, role of macrophages in positive outcomes represents a significant departure from the classical paradigms of host–biomaterial interactions, which typically consider activation of the host immune system as a detrimental event. It appears desirable that emerging regenerative medicine approaches should not only accommodate but also promote the involvement of the immune system to facilitate positive outcomes. Herein, we describe the current understanding of macrophage phenotype as it pertains to regenerative medicine and suggest that improvement of our understanding of context-dependent macrophage polarization will lead to concurrent improvement in outcomes. PMID:25408693
Source-space ICA for MEG source imaging.
Jonmohamadi, Yaqub; Jones, Richard D
2016-02-01
One of the most widely used approaches in electroencephalography/magnetoencephalography (EEG/MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) on the component extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer a high spatial resolution. However, in order to have both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA both in simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli were also used to compare performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.
Advanced Technology Lifecycle Analysis System (ATLAS)
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Mankins, John C.
2004-01-01
Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history based parametric analysis cost model to determine the costs. Also, the integrator estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions.
When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry where a songwriter creates the lyrics and music before entering a recording studio.
The interplay of representations and patterns of classroom discourse in science teaching sequences
NASA Astrophysics Data System (ADS)
Tang, Kok-Sing
2016-09-01
The purpose of this study is to examine the relationship between the communicative approach of classroom talk and the modes of representations used by science teachers. Based on video data from two physics classrooms in Singapore, a recurring pattern in the relationship was observed as the teaching sequence of a lesson unfolded. It was found that as the mode of representation shifted from enactive (action based) to iconic (image based) to symbolic (language based), there was a concurrent and coordinated shift in the classroom communicative approach from interactive-dialogic to interactive-authoritative to non-interactive-authoritative. Specifically, the shift from enactive to iconic to symbolic representations occurred mainly within the interactive-dialogic approach, while the shift towards the interactive-authoritative and non-interactive-authoritative approaches occurred when symbolic modes of representation were used. This concurrent and coordinated shift has implications for how we conceive the use of representations in conjunction with the co-occurring classroom discourse, both theoretically and pedagogically.
A Survey of Applications and Research in Integrated Design Systems Technology
NASA Technical Reports Server (NTRS)
1998-01-01
The initial part of the study was begun with a combination of literature searches, World Wide Web searches, and contacts with individuals and companies who were known to members of our team to have an interest in topics that seemed to be related to our study. There is a long list of such topics, such as concurrent engineering, design for manufacture, life-cycle engineering, systems engineering, systems integration, systems design, design systems, integrated product and process approaches, enterprise integration, integrated product realization, and similar terms. These all capture, at least in part, the flavor of what we describe here as integrated design systems. An inhibiting factor in this inquiry was the absence of agreed terminology for the study of integrated design systems. It is common for the term to be applied to what are essentially augmented Computer-Aided Design (CAD) systems, which are integrated only to the extent that agreements have been reached to attach proprietary extensions to proprietary CAD programs. It is also common for some to use the term integrated design systems to mean a system that applies only, or mainly, to the design phase of a product life cycle. It is likewise common for many of the terms listed earlier to be used as synonyms for integrated design systems. We tried to avoid this ambiguity by adopting the definition of integrated design systems that is implied in the introductory notes that we provided to our contacts, cited earlier. We thus arrived at this definition: Integrated Design Systems refers to the integration of the different tools and processes that comprise the engineering of complex systems. It takes a broad view of the engineering of systems, to include consideration of the entire product realization process and the product life cycle.
An important aspect of integrated design systems is the extent to which they integrate existing, "islands of automation" into a comprehensive design and product realization environment. As the study progressed, we relied increasingly upon a networking approach to lead us to new information. The departure point for such searches often was a government-sponsored project or a company initiative. The advantage of this approach was that short conversations with knowledgeable persons would usually cut through confusion over differences of terminology, thereby somewhat reducing the search space of the study. Even so, it was not until late in our eight-month inquiry that we began to see signs of convergence of the search, in the sense that a number of the latest inquiries began to turn up references to earlier contacts. As suggested above, this convergence often occurred with respect to particular government or company projects.
ERIC Educational Resources Information Center
Thompson, Amber Cole
2012-01-01
Visualization was once thought to be an important skill for professions only related to engineering, but due to the realization of concurrent design and the fast pace of technology, it is now desirable in other professions as well. The importance of learning basic knowledge of geometrical concepts has a greater impact than it did prior to the 21st…
An Annotated Reading List for Concurrent Engineering
1989-07-01
(The seven tools are sometimes referred to as the seven old tools.) Ishikawa, Kaoru, What is Total Quality Control? The Japanese Way, Prentice-Hall... some solutions. Ishikawa (1982) presents a practical guide (with easy-to-use tools) for implementing quality control at the working level... study of engineering for the last two years. Ishikawa, Kaoru, Guide to Quality Control, Kraus International Publications, White Plains, NY, 1982.
The Role of Concurrent Engineering in Weapons System Acquisition
1988-12-01
check sheets, Pareto diagrams, graphs, control charts, and scatter diagrams. Kaoru Ishikawa, Guide to Quality Control, Asian Productivity Organization... Deming [3], Juran [14], and Ishikawa [1]. Managers in the United States and Japan have used techniques of statistics to measure performance and they have... New York (1962). 15. Kaoru Ishikawa, Guide to Quality Control, KRAUS International Publications, White Plains, NY (1982). 16. Robert H. Hayes...
Systematic and Scalable Testing of Concurrent Programs
2013-12-16
The evaluation of CHESS [107] checked eight different programs ranging from process management libraries to a distributed execution engine to a research... tool (§3.1) targets systematic testing of scheduling nondeterminism in multithreaded components of the Omega cluster management system [129], while... tool for systematic testing of multithreaded components of the Omega cluster management system [129]. In particular, §3.1.1 defines a model for...
1994-06-01
algorithms for large, irreducibly coupled systems iteratively solve concurrent problems within different subspaces of a Hilbert space, or within different... effective on problems amenable to SIMD solution. Together with researchers at AT&T Bell Labs (Boris Lubachevsky, Albert Greenberg) we have developed... reasonable measurement. In the study of different speedups, various causes of superlinear speedup are also presented. Greenberg, Albert G., Boris D...
ERIC Educational Resources Information Center
Esmaily, Hamideh M.; Silver, Ivan; Shiva, Shadi; Gargani, Alireza; Maleki-Dizaji, Nasrin; Al-Maniri, Abdullah; Wahlstrom, Rolf
2010-01-01
Introduction: An outcome-based education approach has been proposed to develop more effective continuing medical education (CME) programs. We have used this approach in developing an outcome-based educational intervention for general physicians working in primary care (GPs) and evaluated its effectiveness compared with a concurrent CME program in…
ERIC Educational Resources Information Center
Kelly, Michael; Humphrey, Charlotte
2013-01-01
Background: Care for clients with mental health problems and concurrent intellectual disability (dual diagnosis) is currently expected to be provided through the care programme approach (CPA), an approach to provide care to people with mental health problems in secondary mental health services. When CPA was originally introduced into UK mental…
Bi-Level Integrated System Synthesis (BLISS)
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Agte, Jeremy S.; Sandusky, Robert R., Jr.
1998-01-01
BLISS is a method for optimization of engineering systems by decomposition. It separates the system level optimization, having a relatively small number of design variables, from the potentially numerous subsystem optimizations that may each have a large number of local design variables. The subsystem optimizations are autonomous and may be conducted concurrently. Subsystem and system optimizations alternate, linked by sensitivity data, producing a design improvement in each iteration. Starting from a best guess initial design, the method improves that design in iterative cycles, each cycle comprised of two steps. In step one, the system level variables are frozen and the improvement is achieved by separate, concurrent, and autonomous optimizations in the local variable subdomains. In step two, further improvement is sought in the space of the system level variables. Optimum sensitivity data link the second step to the first. The method prototype was implemented using MATLAB and iSIGHT programming software and tested on a simplified, conceptual level supersonic business jet design, and a detailed design of an electronic device. Satisfactory convergence and favorable agreement with the benchmark results were observed. Modularity of the method is intended to fit the human organization and map well on the computing technology of concurrent processing.
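The alternating two-step cycle can be illustrated on a toy coupled problem. This is a hypothetical coordinate-descent sketch, not the MATLAB/iSIGHT prototype: the quadratic objective, the closed-form subsystem optima, and the grid search stand in for the real subsystem and system optimizations, and the optimum-sensitivity linkage between the steps is omitted.

```python
# Toy BLISS-style alternation: two subsystems with local variables x[0], x[1]
# coupled through one system-level variable z.

def objective(z, x):
    # Coupled toy objective: one system-level term plus two subsystem terms.
    return (z - 1.0) ** 2 + (x[0] - z) ** 2 + (x[1] + z) ** 2

def subsystem_opt(i, z):
    # Step 1: with z frozen, each subsystem minimizes its own term.
    # (Closed form here; these optimizations are independent and could
    # run concurrently, as in BLISS.)
    return z if i == 0 else -z

def system_opt(x, grid):
    # Step 2: seek further improvement in the system-level variable space.
    return min(grid, key=lambda z: objective(z, x))

z = 5.0                                            # best-guess initial design
for cycle in range(20):
    x = [subsystem_opt(i, z) for i in range(2)]    # step one
    z = system_opt(x, [k * 0.01 for k in range(-200, 201)])  # step two

# The iterates approach the coupled optimum z ≈ 1 for this toy objective.
print(round(z, 2), [round(v, 2) for v in x])
```

Each cycle freezes the system variable for the concurrent subsystem step, then improves the design in the system-level space, mirroring the two-step iteration described in the abstract.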
Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing.
Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng
2014-10-01
A push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels.
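The abstract does not reproduce the model itself; the following is a hedged sketch of a common occupancy-based "wave" model for compute-bound kernels, in the spirit of the connection between performance and resource occupancy described above. The function names are invented, and the per-SM resource limits and SM count are illustrative values, not those of any particular GPU or of the paper's model.

```python
import math

def blocks_per_sm(threads_per_block, regs_per_thread, smem_per_block,
                  max_threads=2048, regs=65536, smem=98304, max_blocks=32):
    # Occupancy: the tightest of the thread, register, shared-memory,
    # and block-count limits on one streaming multiprocessor (SM).
    return min(max_threads // threads_per_block,
               regs // (regs_per_thread * threads_per_block),
               smem // max(smem_per_block, 1),
               max_blocks)

def predicted_time(grid_blocks, threads_per_block, regs_per_thread,
                   smem_per_block, wave_time, num_sms=80):
    # Blocks execute in "waves": each wave fills the GPU with as many
    # resident blocks as occupancy allows, so for a compute-bound kernel
    # the runtime is roughly (number of waves) * (time per wave).
    resident = num_sms * blocks_per_sm(threads_per_block, regs_per_thread,
                                       smem_per_block)
    waves = math.ceil(grid_blocks / resident)
    return waves * wave_time

t = predicted_time(grid_blocks=10000, threads_per_block=256,
                   regs_per_thread=32, smem_per_block=0, wave_time=1.0)
```

Raising register use per thread lowers occupancy, increases the number of waves, and lengthens the predicted runtime, which is the qualitative behavior such models capture.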
Safety effects of exclusive and concurrent signal phasing for pedestrian crossing.
Zhang, Yaohua; Mamun, Sha A; Ivan, John N; Ravishanker, Nalini; Haque, Khademul
2015-10-01
This paper describes the estimation of pedestrian crash count and vehicle interaction severity prediction models for a sample of signalized intersections in Connecticut with either concurrent or exclusive pedestrian phasing. With concurrent phasing, pedestrians cross at the same time as motor vehicle traffic in the same direction receives a green phase, while with exclusive phasing, pedestrians cross during their own phase when all motor vehicle traffic on all approaches is stopped. Pedestrians crossing at each intersection were observed and classified according to the severity of interactions with motor vehicles. Observation intersections were selected to represent both types of signal phasing while controlling for other physical characteristics. In the nonlinear mixed models for interaction severity, pedestrians crossing on the walk signal at an exclusive signal experienced lower interaction severity compared to those crossing on the green light with concurrent phasing; however, pedestrians crossing on a green light where an exclusive phase was available experienced higher interaction severity. Intersections with concurrent phasing have fewer total pedestrian crashes than those with exclusive phasing but more crashes at higher severity levels. It is recommended that exclusive pedestrian phasing only be used at locations where pedestrians are more likely to comply. Copyright © 2015. Published by Elsevier Ltd.
Software fault tolerance for real-time avionics systems
NASA Technical Reports Server (NTRS)
Anderson, T.; Knight, J. C.
1983-01-01
Avionics systems have very high reliability requirements and are therefore prime candidates for the inclusion of fault tolerance techniques. In order to provide tolerance to software faults, some form of state restoration is usually advocated as a means of recovery. State restoration can be very expensive for systems which utilize concurrent processes. The concurrency present in most avionics systems and the further difficulties introduced by timing constraints imply that providing tolerance for software faults may be inordinately expensive or complex. A straightforward pragmatic approach to software fault tolerance which is believed to be applicable to many real-time avionics systems is proposed. A classification system for software errors is presented together with approaches to recovery and continued service for each error type.
Biocorrosion rate and mechanism of metallic magnesium in model arterial environments
NASA Astrophysics Data System (ADS)
Bowen, Patrick K.
A new paradigm in biomedical engineering calls for biologically active implants that are absorbed by the body over time. One popular application for this concept is in the engineering of endovascular stents that are delivered concurrently with balloon angioplasty. These devices enable the injured vessels to remain patent during healing, but are not needed for more than a few months after the procedure. Early studies of iron- and magnesium-based stents have concluded that magnesium is a potentially suitable base material for such a device; alloys can achieve acceptable mechanical properties and do not seem to harm the artery during degradation. Research prior to that contained in this dissertation, for the most part, failed to define realistic physiological corrosion mechanisms and failed to correlate degradation rates between in vitro and in vivo environments. Six previously published works form the basis of this dissertation. The topics of these papers include (1) a method by which tensile testing may be applied to evaluate biomaterial degradation; (2) a suite of approaches that can be used to screen candidate absorbable magnesium biomaterials; (3) in vivo-in vitro environmental correlations based on mechanical behavior; (4) a similar correlation on the basis of penetration rate; (5) a mid-to-late stage physiological corrosion mechanism for magnesium in an arterial environment; and (6) the identification of corrosion products in degradable magnesium using transmission electron microscopy.
NASA Technical Reports Server (NTRS)
Reed, John A.; Afjeh, Abdollah A.
1995-01-01
A major difficulty in designing aeropropulsion systems is that of identifying and understanding the interactions between the separate engine components and disciplines (e.g., fluid mechanics, structural mechanics, heat transfer, material properties, etc.). The traditional analysis approach is to decompose the system into separate components, with the interaction between components being evaluated by the application of each of the single disciplines in a sequential manner. Here, one discipline uses information from the calculation of another discipline to determine the effects of component coupling. This approach, however, may not properly identify the consequences of these effects during the design phase, leaving the interactions to be discovered and evaluated during engine testing. This contributes to the time and cost of developing new propulsion systems as, typically, several design-build-test cycles are needed to fully identify multidisciplinary effects and reach the desired system performance. The alternative to sequential isolated component analysis is to use multidisciplinary coupling at a more fundamental level. This approach has been made more plausible by recent advancements in computational simulation along with the application of concurrent engineering concepts. Computer simulation systems designed to provide an environment capable of integrating the various disciplines into a single simulation system have been proposed and are currently being developed. One such system is being developed by the Numerical Propulsion System Simulation (NPSS) project. The NPSS project, being developed at the Interdisciplinary Technology Office at the NASA Lewis Research Center, is a 'numerical test cell' designed to provide for comprehensive computational design and analysis of aerospace propulsion systems.
It will provide multi-disciplinary analyses on a variety of computational platforms, and a user interface consisting of expert systems, database management and visualization tools, to allow the designer to investigate the complex interactions inherent in these systems. An interactive programming software system, known as the Application Visualization System (AVS), was utilized for the development of the propulsion system simulation. The modularity of this system provides the ability to couple propulsion system components, as well as disciplines, and to integrate existing, well-established analysis codes into the overall system simulation. This feature allows the user to customize the simulation model by inserting desired analysis codes. The prototypical simulation environment for multidisciplinary analysis, called Turbofan Engine System Simulation (TESS), which incorporates many of the characteristics of the simulation environment proposed herein, is detailed.
Proof of Concept for the Rewrite Rule Machine: Interensemble Studies
1994-02-23
[Figure 1: Concurrent Rewriting of Fibonacci Expressions] ...exploit a problem's parallelism at several levels. We call this property multigrain concurrency; it makes the RRM very well suited for solving not only homogeneous problems, but also complex, locally homogeneous but... interprocessor message passing over a network is not well suited to data parallelism. A key goal of the RRM is to combine the best of these two approaches in a...
Concurrent substance-related disorders and mental illness: the North American experience
el-Guebaly, Nady
2004-01-01
Ingredients of the evolving North American experience in addressing the management of patients with concurrent substance-related disorders and mental illness are presented. This experience as well as select data from Europe and Australia indicate a growing empirically-based consensus to provide an integrated approach to the care of these patients. It also highlights the necessity to conduct local surveys of needs and resources and adapt the published clinical experience to the local system of care, resources and culture. PMID:16633492
Model Based Mission Assurance: Emerging Opportunities for Robotic Systems
NASA Technical Reports Server (NTRS)
Evans, John W.; DiVenti, Tony
2016-01-01
The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).
Advani, Poonam; Joseph, Blessy; Ambre, Premlata; Pissurlenkar, Raghuvir; Khedkar, Vijay; Iyer, Krishna; Gabhe, Satish; Iyer, Radhakrishnan P; Coutinho, Evans
2016-01-01
The present work exploits the potential of in silico approaches for minimizing attrition of leads in the later stages of drug development. We propose a theoretical approach, wherein 'parallel' information is generated to simultaneously optimize the pharmacokinetics (PK) and pharmacodynamics (PD) of lead candidates. β-blockers, though in use for many years, have suboptimal PKs; hence are an ideal test series for the 'parallel progression approach'. This approach utilizes molecular modeling tools viz. hologram quantitative structure activity relationships, homology modeling, docking, predictive metabolism, and toxicity models. Validated models have been developed for PK parameters such as volume of distribution (log Vd) and clearance (log Cl), which together influence the half-life (t1/2) of a drug. Simultaneously, models for PD in terms of inhibition constant pKi have been developed. Thus, PK and PD properties of β-blockers were concurrently analyzed and after iterative cycling, modifications were proposed that lead to compounds with optimized PK and PD. We report some of the resultant re-engineered β-blockers with improved half-lives and pKi values comparable with marketed β-blockers. These were further analyzed by the docking studies to evaluate their binding poses. Finally, metabolic and toxicological assessment of these molecules was done through in silico methods. The strategy proposed herein has potential universal applicability, and can be used in any drug discovery scenario; provided that the data used is consistent in terms of experimental conditions, endpoints, and methods employed. Thus the 'parallel progression approach' helps to simultaneously fine-tune various properties of the drug and would be an invaluable tool during the drug development process.
Bidding-based autonomous process planning and scheduling
NASA Astrophysics Data System (ADS)
Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.
1995-08-01
Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedules. Each machine has its own process planner and responds to the product's request in a different way that is consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
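The broadcast-bid-award cycle described above can be sketched as a toy contract net. This is a hypothetical illustration: the class and function names, the hourly rates, and the cost-then-completion-time tie-break are invented for this example, not taken from the paper.

```python
# Toy contract-net bidding: a shop-floor manager broadcasts a requirement,
# capable machines bid, and the best feasible bid wins the contract.

class Machine:
    def __init__(self, name, processes, rate, queue_hours=0.0):
        self.name, self.processes = name, processes
        self.rate, self.queue_hours = rate, queue_hours  # cost/hour, backlog

    def bid(self, process, hours, due_hours):
        if process not in self.processes:
            return None                      # cannot offer this process
        finish = self.queue_hours + hours
        if finish > due_hours:
            return None                      # would miss the due date
        return (finish, hours * self.rate)   # (completion time, cost)

def award(machines, process, hours, due_hours):
    # Collect feasible bids; lowest cost wins, earliest finish breaks ties.
    bids = [(m.bid(process, hours, due_hours), m) for m in machines]
    bids = [(b, m) for b, m in bids if b is not None]
    if not bids:
        return None
    (finish, cost), best = min(bids, key=lambda bm: (bm[0][1], bm[0][0]))
    best.queue_hours = finish                # winner updates its schedule
    return best.name

shop = [Machine("mill-1", {"milling"}, rate=40.0),
        Machine("mill-2", {"milling"}, rate=55.0, queue_hours=2.0),
        Machine("lathe-1", {"turning"}, rate=30.0)]
winner = award(shop, "milling", hours=3.0, due_hours=8.0)  # mill-1 wins on cost
```

A second milling request with a tight due date would then go to mill-2, since the winning machine's updated backlog can make it miss the deadline, which is the negotiation behavior the abstract describes.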
NASA Astrophysics Data System (ADS)
Saunders, Vance M.
1999-06-01
The downsizing of the Department of Defense (DoD) and the associated reduction in budgets has re-emphasized the need for commonality, reuse, and standards with respect to the way DoD does business. DoD has implemented significant changes in how it buys weapon systems. The new emphasis is on concurrent engineering with Integrated Product and Process Development and collaboration with Integrated Product Teams. The new DoD vision includes Simulation Based Acquisition (SBA), a process supported by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. This paper discusses the Air Force Research Laboratory's efforts to use Modeling and Simulation (M&S) resources within a Collaborative Enterprise Environment to support SBA and other Collaborative Enterprise and Virtual Prototyping (CEVP) applications. The paper will discuss four technology areas: (1) a Processing Ontology that defines a hierarchically nested set of collaboration contexts needed to organize and support multi-disciplinary collaboration using M&S, (2) a partial taxonomy of intelligent agents needed to manage different M&S resource contributions to advancing the state of product development, (3) an agent-based process for interfacing disparate M&S resources into a CEVP framework, and (4) a Model-View-Control based approach to defining 'a new way of doing business' for users of CEVP frameworks/systems.
The Hierarchical Database Decomposition Approach to Database Concurrency Control.
1984-12-01
approach, we postulate a model of transaction behavior under two-phase locking as shown in Figure 39(a) and a model of that under multiversion... transaction put in the block queue until it is reactivated. Under multiversion timestamping, however, the request is always granted. Once the request...
ERIC Educational Resources Information Center
Wilson, Keithia; Fowler, Jane
2005-01-01
This study investigated whether students' approaches to learning were influenced by the design of university courses. Pre- and post-evaluations of the approaches to learning of the same group of students concurrently enrolled in a conventional course (lectures and tutorials) and an action learning-based course (project work, learning groups) were…
Flight Testing Surfaces Engineered for Mitigating Insect Adhesion on a Falcon HU-25C
NASA Technical Reports Server (NTRS)
Shanahan, Michelle; Wohl, Chris J.; Smith, Joseph G., Jr.; Connell, John W.; Siochi, Emilie J.; Doss, Jereme R.; Penner, Ronald K.
2015-01-01
Insect residue contamination on aircraft wings can decrease fuel efficiency in aircraft designed for natural laminar flow. Insect residues can cause a premature transition to turbulent flow, increasing fuel burn and making the aircraft less environmentally friendly. Surfaces, designed to minimize insect residue adhesion, were evaluated through flight testing on a Falcon HU-25C aircraft flown along the coast of Virginia and North Carolina. The surfaces were affixed to the wing leading edge and the aircraft remained at altitudes lower than 1000 feet throughout the flight to assure high insect density. The number of strikes on the engineered surfaces was compared to, and found to be lower than, untreated aluminum control surfaces flown concurrently. Optical profilometry was used to determine insect residue height and areal coverage. Differences in results between flight and laboratory tests suggest the importance of testing in realistic use environments to evaluate the effectiveness of engineered surface designs.
A Physics-Based Vibrotactile Feedback Library for Collision Events.
Park, Gunhyuk; Choi, Seungmoon
2017-01-01
We present PhysVib: a software solution on the mobile platform extending an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods in terms of perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while reducing application development time to a great extent.
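A minimal sketch of the exponentially decaying sinusoidal model named in the abstract: a collision at t = 0 excites a vibration a(t) = A·exp(-d·t)·sin(2πf·t). The function name, parameter values, and sampling rate below are illustrative, not those used by PhysVib.

```python
import math

def collision_vibration(amplitude, decay, freq_hz, duration, rate_hz=2000):
    # Exponentially decaying sinusoid: a(t) = A * exp(-d*t) * sin(2*pi*f*t),
    # sampled at rate_hz to produce a vibrotactile command sequence.
    n = int(duration * rate_hz)
    return [amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * freq_hz * t)
            for t in (i / rate_hz for i in range(n))]

# One burst: the collision impulse sets the amplitude, then the vibration
# rings down; stronger impacts would map to larger initial amplitudes.
wave = collision_vibration(amplitude=1.0, decay=30.0, freq_hz=150.0,
                           duration=0.1)
```

Because the envelope is a simple exponential, each collision event can be rendered at a high update rate from just three parameters, which suits the multi-rate architecture the abstract describes.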
NASA Technical Reports Server (NTRS)
1991-01-01
The feasibility of developing and producing a launch vehicle from an external tank (ET) and an engine module that mounts inline to the tankage at the aft end and contains six space transportation main engines (STME) was assessed. The primary mission of this launch vehicle would be to place a PLS (personnel launch vehicle) into a low earth orbit (LEO). The vehicle tankage and the assembly of the engine module were evaluated to determine what, if any, manufacturing/production impacts would be incurred if this vehicle were built alongside the current ET at the Michoud Assembly Facility. It was determined that there would be no significant impacts to produce seven of these vehicles per year while concurrently producing 12 ETs per year. Preliminary estimates of both nonrecurring and recurring costs for this vehicle concept were made.
Design and Implementation of a Threaded Search Engine for Tour Recommendation Systems
NASA Astrophysics Data System (ADS)
Lee, Junghoon; Park, Gyung-Leen; Ko, Jin-Hee; Shin, In-Hye; Kang, Mikyung
This paper implements a threaded search engine for the O(n!) search space and measures its performance, aiming at providing a responsive tour recommendation and scheduling service. As a preliminary step toward integrating POI ontology, a mobile object database, and personalization profiles for the development of new vehicular telematics services, this implementation can give a useful guideline for designing a challenging and computation-intensive vehicular telematics service. The implemented engine allocates subtrees to the respective threads and makes them run concurrently, exploiting the primitives provided by the operating system and the underlying multiprocessor architecture. It also makes it easy to add a variety of constraints; for example, the search tree is pruned if the cost of a partial allocation already exceeds the current best. The performance measurement results show that the service can run even on a low-power telematics device when the number of destinations does not exceed 15, with appropriate constraint processing.
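The pruning rule mentioned above (abandon a partial tour once its cost already exceeds the best complete tour found so far) can be sketched as a branch-and-bound over permutations. This sequential sketch omits the threading: in the paper's engine each top-level subtree would be handed to a separate thread. The distance matrix is made up for illustration.

```python
# Pruned O(n!) tour search: extend partial tours destination by destination
# and cut a branch as soon as its cost reaches the incumbent best.

def best_tour(dist, start=0):
    n = len(dist)
    best = {"cost": float("inf"), "tour": None}

    def extend(tour, cost):
        if cost >= best["cost"]:          # prune: partial cost already too high
            return
        if len(tour) == n:                # complete tour: new incumbent
            best["cost"], best["tour"] = cost, tour
            return
        here = tour[-1]
        for nxt in range(n):              # each subtree could go to a thread
            if nxt not in tour:
                extend(tour + (nxt,), cost + dist[here][nxt])

    extend((start,), 0)
    return best["tour"], best["cost"]

dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
print(best_tour(dist))   # → ((0, 1, 2, 3), 16)
```

Additional constraints slot in naturally as extra early-return tests inside `extend`, which is the extensibility point the abstract highlights.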
Non-invasive lightweight integration engine for building EHR from autonomous distributed systems.
Angulo, Carlos; Crespo, Pere; Maldonado, José A; Moner, David; Pérez, Daniel; Abad, Irene; Mandingorra, Jesús; Robles, Montserrat
2007-12-01
In this paper we describe Pangea-LE, a message-oriented lightweight data integration engine that allows homogeneous and concurrent access to clinical information from dispersed and heterogeneous data sources. The engine extracts the information and passes it to the requesting client applications in a flexible XML format. The XML response message can be formatted on demand by appropriate Extensible Stylesheet Language (XSL) transformations in order to meet the needs of client applications. We also present a real deployment in a hospital where Pangea-LE collects and generates an XML view of all the available patient clinical information. The information is presented to healthcare professionals in an Electronic Health Record (EHR) viewer Web application with patient search and EHR browsing capabilities. Implementation in a real setting has been a success due to the non-invasive nature of Pangea-LE, which respects the existing information systems.
Non-invasive light-weight integration engine for building EHR from autonomous distributed systems.
Crespo Molina, Pere; Angulo Fernández, Carlos; Maldonado Segura, José A; Moner Cano, David; Robles Viejo, Montserrat
2006-01-01
Pangea-LE is a message-oriented light-weight integration engine, allowing concurrent access to clinical information from dispersed and heterogeneous data sources. The engine extracts the information and serves it to the requesting client applications in a flexible XML format. This XML response message can be formatted on demand by an appropriate XSL (Extensible Stylesheet Language) transformation in order to fit client application needs. In this article we present a real use case where Pangea-LE collects and generates "on the fly" a structured view of all the patient clinical information available in a healthcare organisation. This information is presented to healthcare professionals in an EHR (Electronic Health Record) viewer Web application with patient search and EHR browsing capabilities. Implementation in a real environment has been a notable success due to the non-invasive approach, which fully respects the existing information systems.
Algorithms and software for nonlinear structural dynamics
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Gilbertsen, Noreen D.; Neal, Mark O.
1989-01-01
The objective of this research is to develop efficient methods for explicit time integration in nonlinear structural dynamics for computers which utilize both concurrency and vectorization. As a framework for these studies, the program WHAMS, which is described in Explicit Algorithms for the Nonlinear Dynamics of Shells (T. Belytschko, J. I. Lin, and C.-S. Tsay, Computer Methods in Applied Mechanics and Engineering, Vol. 42, 1984, pp 225 to 251), is used. There are two factors which make the development of efficient concurrent explicit time integration programs a challenge in a structural dynamics program: (1) the need for a variety of element types, which complicates the scheduling-allocation problem; and (2) the need for different time steps in different parts of the mesh, which is here called mixed delta t integration, so that a few stiff elements do not reduce the time steps throughout the mesh.
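The mixed delta-t idea (advance stiff elements with a smaller, locally stable step that is an integer fraction of the master step, so a few stiff elements do not throttle the whole mesh) can be sketched on independent spring-mass "elements". This is a hypothetical illustration using a symplectic-Euler explicit update, not the WHAMS implementation; the 0.9 safety factor and the toy parameters are invented.

```python
import math

def stable_dt(k, m):
    # Critical step of the explicit update for one undamped spring-mass:
    # dt_crit = 2 / omega, with omega = sqrt(k / m).
    return 2.0 / math.sqrt(k / m)

def integrate(k, m, x0, master_dt, t_end):
    # Mixed delta-t: pick enough substeps that the local step stays safely
    # below critical, then subcycle within each master step.
    sub = max(1, math.ceil(master_dt / (0.9 * stable_dt(k, m))))
    dt = master_dt / sub
    x, v, t = x0, 0.0, 0.0
    while t < t_end - 1e-12:
        for _ in range(sub):              # subcycle: stiff elements step more
            a = -k / m * x                # explicit acceleration
            v += a * dt                   # symplectic-Euler velocity update
            x += v * dt                   # then displacement update
        t += master_dt                    # groups resynchronize here
    return x

# A soft element runs at the master step; a 400x stiffer one subcycles,
# yet both stay stable and synchronized at every master step.
soft = integrate(k=1.0, m=1.0, x0=1.0, master_dt=0.1, t_end=6.28)
stiff = integrate(k=400.0, m=1.0, x0=1.0, master_dt=0.1, t_end=6.28)
```

Without subcycling, the stiff element's critical step (0.1 for k = 400, m = 1) would force the entire mesh to that step; here only the stiff group pays for it.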
Liu, Z Gerald; Vasys, Victoria N; Kittelson, David B
2007-09-15
The effects of fuel sulfur content and primary dilution on PM number emissions were investigated during transient operations of an old and a modern diesel engine. Emissions were also studied during steady-state operations in order to confirm consistency with previous findings. Testing methods were concurrent with those implemented by the EPA to regulate PM mass emissions, including the use of the Federal Transient Testing Procedure-Heavy Duty cycle to simulate transient conditions and the use of a Critical Flow Venturi-Constant Volume System to provide primary dilution. Steady-state results were found to be consistent with previous studies in that nuclei-mode particulate emissions were largely reduced when lower-sulfur content fuel was used in the newer engine, while the nuclei-mode PM emissions from the older engine were much less affected by fuel sulfur content. The transient results, however, show that the total number of nuclei-mode PM emissions from both engines increases with fuel sulfur content, although this effect is only seen under the higher primary dilution ratios with the older engine. Transient results further show that higher primary dilution ratios increase total nuclei-mode PM number emissions in both engines.
A Novel Human Tissue-Engineered 3-D Functional Vascularized Cardiac Muscle Construct
Valarmathi, Mani T.; Fuseler, John W.; Davis, Jeffrey M.; Price, Robert L.
2017-01-01
Organ tissue engineering, including cardiovascular tissues, has been an area of intense investigation. The major challenge to these approaches has been the inability to vascularize and perfuse the in vitro engineered tissue constructs. Attempts to provide oxygen and nutrients to the cells contained in the biomaterial constructs have had varying degrees of success. The aim of the current study is to develop a three-dimensional (3-D) model of vascularized cardiac tissue to examine the concurrent temporal and spatial regulation of cardiomyogenesis in the context of postnatal de novo vasculogenesis during stem cell cardiac regeneration. In order to achieve the above aim, we have developed an in vitro 3-D functional vascularized cardiac muscle construct using human induced pluripotent stem cell-derived embryonic cardiac myocytes (hiPSC-ECMs) and human mesenchymal stem cells (hMSCs). First, to generate the prevascularized scaffold, human cardiac microvascular endothelial cells (hCMVECs) and hMSCs were co-cultured onto a 3-D collagen cell carrier (CCC) for 7 days under vasculogenic culture conditions. In this milieu, hCMVECs/hMSCs underwent maturation, differentiation, and morphogenesis characteristic of microvessels, and formed extensive plexuses of vascular networks. Next, the hiPSC-ECMs and hMSCs were co-cultured onto these prevascularized CCCs for a further 7 or 14 days under myogenic culture conditions. Finally, the vascular and cardiac phenotypic inductions were analyzed at the morphological, immunological, biochemical, molecular, and functional levels. Expression and functional analyses of the differentiated cells revealed neo-angiogenesis and neo-cardiomyogenesis. Thus, our unique 3-D co-culture system provided a functional vascularized 3-D cardiac patch in vitro that can be utilized for cellular cardiomyoplasty. PMID:28194397
Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses
NASA Astrophysics Data System (ADS)
Boehm, Barry; Port, Dan; Winsor Brown, A.
2002-09-01
For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model-Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines call for many documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are moving toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”
Aristotle and Autism: Reconsidering a Radical Shift to Virtue Ethics in Engineering.
Furey, Heidi
2017-04-01
Virtue-based approaches to engineering ethics have recently received considerable attention within the field of engineering education. Proponents of virtue ethics in engineering argue that the approach is practically and pedagogically superior to traditional approaches to engineering ethics, including the study of professional codes of ethics and normative theories of behavior. This paper argues that a virtue-based approach, as interpreted in the current literature, is neither practically nor pedagogically effective for a significant subpopulation within engineering: engineers with high functioning autism spectrum disorder (ASD). Because the main argument for adopting a character-based approach is that it could be more successfully applied to engineering than traditional rule-based or algorithmic ethical approaches, this oversight is problematic for the proponents of the virtue-based view. Furthermore, without addressing these concerns, the wide adoption of a virtue-based approach to engineering ethics has the potential to isolate individuals with ASD and to devalue their contributions to moral practice. In the end, this paper gestures towards a way of incorporating important insights from virtue ethics in engineering that would be more inclusive of those with ASD.
A systematic approach to engineering ethics education.
Li, Jessica; Fu, Shengli
2012-06-01
Engineering ethics education is a complex field characterized by dynamic topics and diverse students, which results in significant challenges for engineering ethics educators. The purpose of this paper is to introduce a systematic approach to determine what to teach and how to teach in an ethics curriculum. This is a topic that has not been adequately addressed in the engineering ethics literature. This systematic approach provides a method to: (1) develop a context-specific engineering ethics curriculum using the Delphi technique, a process-driven research method; and (2) identify appropriate delivery strategies and instructional strategies using an instructional design model. This approach considers the context-specific needs of different engineering disciplines in ethics education and leverages the collaboration of engineering professors, practicing engineers, engineering graduate students, ethics scholars, and instructional design experts. The proposed approach is most suitable for a department, a discipline/field or a professional society. The approach helps to enhance learning outcomes and to facilitate ethics education curriculum development as part of the regular engineering curriculum.
Department of Transportation's intelligent transportation systems (ITS) projects book
DOT National Transportation Integrated Search
2000-01-01
Intelligent Transportation Systems (ITS), formerly Intelligent Vehicle-Highway Systems (IVHS), provide the technology applications helping the nation address current surface transportation problems while concurrently providing approaches for dealing ...
A Community Mental Health Approach to Drug Addiction.
ERIC Educational Resources Information Center
Brotman, Richard; Freedman, Alfred
The nature of the historical changes in the presumed stereotypes of drug users in the United States, and the associated policy changes, are described in this report, which takes a community health viewpoint of drug use while concurrently dealing with the individual. Eight case histories illustrate the community mental health approach in action.…
ERIC Educational Resources Information Center
Parent, Veronique; Birtwell, Kirstin B.; Lambright, Nathan; DuBard, Melanie
2016-01-01
This article presents an individual intervention combining cognitive-behavioral and behavior-analytic approaches to target severe emotion dysregulation in verbal youth with autism spectrum disorder (ASD) concurrent with intellectual disability (ID). The article focuses on two specific individuals who received the treatment within a therapeutic…
ERIC Educational Resources Information Center
Parette, Howard P.; Blum, Craig; Boeckmann, Nichole M.
2009-01-01
As assistive technology applications are increasingly implemented in early childhood settings for children who are at risk or who have disabilities, it is critical that teachers utilize observational approaches to determine whether targeted assistive technology-supported interventions make a difference in children's learning. One structured…
Department of Transportation's intelligent transportation systems (ITS) projects book
DOT National Transportation Integrated Search
1999-01-01
Intelligent Transportation Systems (ITS), formerly Intelligent Vehicle-Highway Systems (IVHS), provide the technology applications helping the nation address current surface transportation problems while concurrently providing approaches for deal...
Elmi, Maryam; Azin, Arash; Elnahas, Ahmad; McCready, David R; Cil, Tulin D
2018-05-14
Patients with genetic susceptibility to breast and ovarian cancer are eligible for risk-reduction surgery. Surgical morbidity of risk-reduction mastectomy (RRM) with concurrent bilateral salpingo-oophorectomy (BSO) is unknown. Outcomes in these patients were compared to patients undergoing RRM without BSO using a large multi-institutional database. A retrospective cohort analysis was conducted using the American College of Surgeons' National Surgical Quality Improvement Program (NSQIP) 2007-2016 datasets, comparing postoperative morbidity between patients undergoing RRM with patients undergoing RRM with concurrent BSO. Patients with genetic susceptibility to breast/ovarian cancer undergoing risk-reduction surgery were identified. The primary outcome was 30-day postoperative major morbidity. Secondary outcomes included surgical site infections, reoperations, readmissions, length of stay, and venous thromboembolic events. A multivariate analysis was performed to determine predictors of postoperative morbidity and the adjusted effect of concurrent BSO on morbidity. Of the 5470 patients undergoing RRM, 149 (2.7%) underwent concurrent BSO. The overall rate of major morbidity and postoperative infections was 4.5% and 4.6%, respectively. There was no significant difference in the rate of postoperative major morbidity (4.5% vs 4.7%, p = 0.91) or any of the secondary outcomes between patients undergoing RRM without BSO vs. those undergoing RRM with concurrent BSO. Multivariable analysis showed body mass index (OR 1.05; p < 0.001) and smoking (OR 1.78; p = 0.003) to be the only predictors associated with major morbidity. Neither immediate breast reconstruction (OR 1.02; p = 0.93) nor concurrent BSO (OR 0.94; p = 0.89) was associated with increased postoperative major morbidity. This study demonstrated that RRM with concurrent BSO was not associated with significant additional morbidity when compared to RRM without BSO.
Therefore, this joint approach may be considered for select patients at risk for both breast and ovarian cancer.
New technologies for space avionics
NASA Technical Reports Server (NTRS)
Aibel, David W.; Dingus, Peter; Lanciault, Mark; Hurdlebrink, Debra; Gurevich, Inna; Wenglar, Lydia
1994-01-01
This report reviews a 1994 effort that continued 1993 investigations into issues associated with the definition of requirements and with the practice of concurrent engineering and rapid prototyping, in the context of the development of a prototype of a next-generation reaction jet driver controller. This report discusses lessons learned, the testing of the current prototype, the details of the current design, and the nature and performance of a mathematical model of the life cycle of a pilot-operated valve solenoid.
Gomes, Chandima
2012-11-01
This paper addresses a concurrent multidisciplinary problem: animal safety against lightning hazards. In regions where lightning is prevalent, either seasonally or throughout the year, a considerable number of wild, captive and tame animals are injured due to lightning generated effects. The paper discusses all possible injury mechanisms, focusing mainly on animals with commercial value. A large number of cases from several countries have been analyzed. Economically and practically viable engineering solutions are proposed to address the issues related to the lightning threats discussed.
Application of Concurrent Engineering Methods to the Design of an Autonomous Aerial Robot
1991-12-01
power within the system, either airborne or at a ground station, was left to the team’s discretion. Data link from the aerial vehicle to the ground... [Figure 15: design freedom versus knowledge about the design over time into the design process, through the conceptual, preliminary, and detailed design phases.] ...mission planning and control tasks was accomplished. Key system issues regarding power-up and component initialization procedures began to be addressed
1991-09-01
other networks. For example, E-mail can be sent to an SNA network through a Softswitch gateway, but at a very slow rate. As discussed in Chapter III... [Table of contents excerpt: 6. Communication Protocols; D. New Infrastructures; 1. CALS Test Network (CTN); 2. Industrial Networks; 3. FTS-2000 and ISDN; 4. CALS Operational Resource]
New Technologies for Space Avionics, 1993
NASA Technical Reports Server (NTRS)
Aibel, David W.; Harris, David R.; Bartlett, Dave; Black, Steve; Campagna, Dave; Fernald, Nancy; Garbos, Ray
1993-01-01
The report reviews a 1993 effort that investigated issues associated with the development of requirements, with the practice of concurrent engineering and with rapid prototyping, in the development of a next-generation Reaction Jet Drive Controller. This report details lessons learned, the current status of the prototype, and suggestions for future work. The report concludes with a discussion of the vision of future avionics architectures based on the principles associated with open architectures and integrated vehicle health management.
Coverage Metrics for Model Checking
NASA Technical Reports Server (NTRS)
Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)
2001-01-01
When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
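The notion of measuring partial coverage of a state space can be illustrated with a toy sketch: a depth-bounded explicit-state search over a small transition system, with transition coverage reported as a fraction, in the spirit of applying test-coverage metrics to program model checking. The transition system and depth bound are invented for illustration; this is not the NASA Ames tooling.

```python
from collections import deque

# Toy labeled transition system: state -> [(label, successor), ...].
TRANSITIONS = {
    0: [("a", 1), ("b", 2)],
    1: [("c", 3)],
    2: [("d", 3), ("e", 4)],
    3: [],
    4: [("f", 0)],
}

def bounded_search(start, depth_bound):
    """Breadth-first exploration up to depth_bound; returns the set of
    (state, label) transitions actually exercised. An incomplete search
    like this is what a coverage metric quantifies."""
    covered = set()
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        state, depth = frontier.popleft()
        if depth == depth_bound:
            continue  # budget exhausted: successors stay unexplored
        for label, nxt in TRANSITIONS[state]:
            covered.add((state, label))
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return covered

total = sum(len(v) for v in TRANSITIONS.values())
covered = bounded_search(0, depth_bound=1)
print(len(covered) / total)  # fraction of transitions exercised by the partial search
```

The coverage fraction rises as the depth bound grows, which is the kind of signal that can heuristically guide where to spend further search effort.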
Control-structure-thermal interactions in analysis of lunar telescopes
NASA Technical Reports Server (NTRS)
Thompson, Roger C.
1992-01-01
The lunar telescope project was an excellent model for the CSTI study because a telescope is a very sensitive instrument, and thermal expansion or mechanical vibration of the mirror assemblies will rapidly degrade the resolution of the device. Consequently, the interactions are strongly coupled. The lunar surface experiences very large temperature variations that range from approximately -180 C to over 100 C. Although the optical assemblies of the telescopes will be well insulated, the temperature of the mirrors will inevitably fluctuate in a similar cycle, but of much smaller magnitude. In order to obtain images of high quality and clarity, allowable thermal deformations of any point on a mirror must be less than 1 micron. Initial estimates indicate that this corresponds to a temperature variation of much less than 1 deg through the thickness of the mirror. Therefore, a lunar telescope design will most probably include active thermal control, a means of controlling the shape of the mirrors, or a combination of both systems. Historically, the design of a complex vehicle was primarily a sequential process in which the basic structure was defined without concurrent detailed analyses of the other subsystems. The basic configuration was then passed to the different teams responsible for each subsystem, and their task was to produce a workable solution without requiring major alterations to any principal components or subsystems. Consequently, the final design of the vehicle was not always the most efficient, owing to the fact that each subsystem design was partially constrained by the previous work. This procedure was necessary at the time because the analysis process was extremely time-consuming and had to be started over with each significant alteration of the vehicle.
With recent advances in the power and capacity of small computers, and the parallel development of powerful software in structural, thermal, and control system analysis, it is now possible to produce very detailed analyses of intermediate designs in a much shorter period of time. The subsystems can thus be designed concurrently, and alterations in the overall design can be quickly adopted into each analysis; the design becomes an iterative process in which it is much easier to experiment with new ideas, configurations, and components. Concurrent engineering has the potential to produce efficient, highly capable designs because the effect of one subsystem on another can be assessed in much more detail at a very early point in the program. The research program consisted of several tasks: scale a prototype telescope assembly to a 1 m aperture, develop a model of the telescope assembly by using finite element (FEM) codes that are available on site, determine structural deflections of the mirror surfaces due to the temperature variations, develop a prototype control system to maintain the proper shape of the optical elements, and most important of all, demonstrate the concurrent engineering approach with this example. In addition, the software used for the finite element models and thermal analysis was relatively new within the Program Development Office and had yet to be applied to systems this large or complex; understanding the software and modifying it for use with this project was also required. The I-DEAS software by Structural Dynamics Research Corporation (SDRC) was used to build the finite element models, and TMG developed by Maya Heat Transfer Technologies, Ltd. (which runs as an I-DEAS module) was used for the thermal model calculations. All control system development was accomplished with MATRIXx by Integrated Systems, Inc.
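The quoted budget (mirror deformations under 1 micron for through-thickness temperature variations well under 1 degree) can be sanity-checked with a standard thermal-bending estimate: a through-thickness temperature difference dT bends a blank into a cap of curvature kappa = alpha*dT/t, giving a sagitta of roughly delta = alpha*dT*D**2/(8*t) over an aperture D. The formula choice and material constants below are illustrative assumptions, not figures from the report.

```python
# Back-of-envelope thermal-deformation check for a 1 m aperture mirror.

def thermal_sagitta(alpha, dT, aperture, thickness):
    """Approximate surface deflection (m) of a mirror blank of the given
    aperture and thickness under a through-thickness temperature
    difference dT (K), for coefficient of thermal expansion alpha (1/K)."""
    return alpha * dT * aperture**2 / (8.0 * thickness)

# 1 m aperture, 0.1 m thick blank, 1 K through-thickness difference;
# the CTE values are generic textbook-order numbers, assumed here.
delta_silica = thermal_sagitta(alpha=0.5e-6, dT=1.0, aperture=1.0, thickness=0.1)
delta_boro = thermal_sagitta(alpha=3.3e-6, dT=1.0, aperture=1.0, thickness=0.1)

print(delta_silica)  # low-expansion glass: below the 1-micron budget
print(delta_boro)    # ordinary borosilicate: several microns, over budget
```

Even a low-expansion material sits near the 1-micron budget at a 1 K gradient, which is consistent with the abstract's conclusion that the allowable variation must be much less than 1 degree.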
Gomez, Antonio; Pires, Robert; Yambao, Alyssa; La Saponara, Valeria
2014-12-11
The durability of polymers and fiber-reinforced polymer composites under service condition is a critical aspect to be addressed for their robust designs and condition-based maintenance. These materials are adopted in a wide range of engineering applications, from aircraft and ship structures, to bridges, wind turbine blades, biomaterials and biomedical implants. Polymers are viscoelastic materials, and their response may be highly nonlinear and thus make it challenging to predict and monitor their in-service performance. The laboratory-scale testing platform presented herein assists the investigation of the influence of concurrent mechanical loadings and environmental conditions on these materials. The platform was designed to be low-cost and user-friendly. Its chemically resistant materials make the platform adaptable to studies of chemical degradation due to in-service exposure to fluids. An example experiment was conducted at room temperature on closed-cell polyurethane foam samples loaded with a weight corresponding to ~50% of their ultimate static and dry load. Results show that the testing apparatus is appropriate for these studies. Results also highlight the larger vulnerability of the polymer under concurrent loading, based on the higher mid-point displacements and lower residual failure loads. Recommendations are made for additional improvements to the testing apparatus.
Gomez, Antonio; Pires, Robert; Yambao, Alyssa; La Saponara, Valeria
2014-01-01
The durability of polymers and fiber-reinforced polymer composites under service condition is a critical aspect to be addressed for their robust designs and condition-based maintenance. These materials are adopted in a wide range of engineering applications, from aircraft and ship structures, to bridges, wind turbine blades, biomaterials and biomedical implants. Polymers are viscoelastic materials, and their response may be highly nonlinear and thus make it challenging to predict and monitor their in-service performance. The laboratory-scale testing platform presented herein assists the investigation of the influence of concurrent mechanical loadings and environmental conditions on these materials. The platform was designed to be low-cost and user-friendly. Its chemically resistant materials make the platform adaptable to studies of chemical degradation due to in-service exposure to fluids. An example experiment was conducted at room temperature on closed-cell polyurethane foam samples loaded with a weight corresponding to ~50% of their ultimate static and dry load. Results show that the testing apparatus is appropriate for these studies. Results also highlight the larger vulnerability of the polymer under concurrent loading, based on the higher mid-point displacements and lower residual failure loads. Recommendations are made for additional improvements to the testing apparatus. PMID:25548950
Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing
Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng
2015-01-01
A push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA’s CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels. PMID:26566545
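The link between resource occupancy and performance that the abstract describes can be sketched with a minimal occupancy calculation: resident blocks per streaming multiprocessor (SM) are bounded by thread, register, and shared-memory limits, and occupancy is the fraction of maximum resident threads achieved. This is not G-SDMS's actual model, and the resource limits are assumed example values, not queried from real hardware.

```python
# Minimal occupancy sketch: how per-kernel resource demands cap the
# number of resident blocks per SM, and hence occupancy. All hardware
# limits below are illustrative assumptions.

def blocks_per_sm(threads_per_block, regs_per_thread, smem_per_block,
                  max_threads=2048, max_regs=65536, max_smem=49152,
                  max_blocks=16):
    """Resident blocks per SM: the tightest of the per-resource limits."""
    by_threads = max_threads // threads_per_block
    by_regs = max_regs // (regs_per_thread * threads_per_block)
    by_smem = max_smem // smem_per_block if smem_per_block else max_blocks
    return min(by_threads, by_regs, by_smem, max_blocks)

def occupancy(threads_per_block, regs_per_thread, smem_per_block):
    """Fraction of the SM's maximum resident threads that are active."""
    blocks = blocks_per_sm(threads_per_block, regs_per_thread, smem_per_block)
    return blocks * threads_per_block / 2048

# A register-heavy kernel achieves lower occupancy than a lean one,
# leaving spare SM capacity for a concurrent kernel in another stream:
print(occupancy(256, 32, 0))   # lean kernel
print(occupancy(256, 128, 0))  # register-bound kernel
```

A performance model along the abstract's lines would then map such occupancy figures (plus instruction mix) to predicted throughput for kernels sharing the device via streams.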
An asynchronous traversal engine for graph-based rich metadata management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Dong; Carns, Philip; Ross, Robert B.
Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format.
We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.
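The straggler argument above can be made concrete with a toy traversal: in a level-synchronous BFS, a global barrier ends each level, so one slow vertex (a stand-in for a slow server or step) delays everything in later levels; an asynchronous traversal keeps processing whatever is ready. The graph, the per-vertex "costs", and the cheapest-available-first ordering are invented for illustration and are not the paper's engine.

```python
import heapq

# Toy graph; vertex B's high cost stands in for a straggler.
GRAPH = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": [], "E": ["F"], "F": []}
COST = {"A": 1, "B": 9, "C": 1, "D": 1, "E": 1, "F": 1}

def level_sync_bfs(start):
    """Visit order under level-synchronous BFS: a barrier ends each
    level, so no level-(k+1) vertex starts before every level-k vertex
    (including the straggler) has been processed."""
    order, seen, level = [], {start}, [start]
    while level:
        order.extend(level)
        nxt = []
        for u in level:
            for v in GRAPH[u]:
                if v not in seen:
                    seen.add(v)
                    nxt.append(v)
        level = nxt
    return order

def async_traversal(start):
    """Visit order when ready vertices are processed as soon as
    capacity frees up (cheapest-available-first for determinism):
    progress continues past the straggler instead of waiting on it."""
    order, seen = [], {start}
    heap = [(COST[start], start)]
    while heap:
        _, u = heapq.heappop(heap)
        order.append(u)
        for v in GRAPH[u]:
            if v not in seen:
                seen.add(v)
                heapq.heappush(heap, (COST[v], v))
    return order

print(level_sync_bfs("A"))   # ['A', 'B', 'C', 'D', 'E', 'F']
print(async_traversal("A"))  # ['A', 'C', 'D', 'E', 'F', 'B'] - straggler deferred
```

Both traversals visit the same vertex set, but the asynchronous order reaches F long before the straggler B completes, which is the throughput advantage the abstract reports for long, imbalanced traversals.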
An asynchronous traversal engine for graph-based rich metadata management
Dai, Dong; Carns, Philip; Ross, Robert B.; ...
2016-06-23
Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format.
We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert; Ang, James; Bergman, Keren
2014-02-10
Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.
Skeletal biology: Where matrix meets mineral
Young, Marian F.
2017-01-01
The skeleton is unique from all other tissues in the body because of its ability to mineralize. The incorporation of mineral into bones and teeth is essential to give them strength and structure for body support and function. For years, researchers have wondered how mineralized tissues form and repair. A major focus in this context has been on the role of the extracellular matrix, which harbors key regulators of the mineralization process. In this introductory minireview, we will review some key concepts of matrix biology as it relates to mineralized tissues. Concurrently, we will highlight the subject of this special issue covering many aspects of mineralized tissues, including bones and teeth and their associated structures, cartilage and tendon. Areas of emphasis are on the generation and analysis of new animal models with permutations of matrix components as well as the development of new approaches for tissue engineering for repair of damaged hard tissue. In assembling key topics on mineralized tissues written by leaders in our field, we hope the reader will get a broad view of the topic and all of its fascinating complexities. PMID:27131884
NASA Astrophysics Data System (ADS)
Mejia, Joel Alejandro
Previous studies have suggested that, when funds of knowledge are incorporated into science and mathematics curricula, students are more engaged and often develop richer understandings of scientific concepts. While there has been a growing body of research addressing how teachers may integrate students' linguistic, social, and cultural practices with science and mathematics instruction, very little research has been conducted on how the same can be accomplished with Latino and Latina students in engineering. The purpose of this study was to address this gap in the literature by investigating how fourteen Latino and Latina high school adolescents used their funds of knowledge to address engineering design challenges. This project was intended to enhance the educational experience of underrepresented minorities whose social and cultural practices have been traditionally undervalued in schools. This ethnographic study investigated the funds of knowledge of fourteen Latino and Latina high school adolescents and how they used these funds of knowledge in engineering design. Participant observation, bi-monthly group discussion, retrospective and concurrent protocols, and monthly one-on-one interviews were conducted during the study. A constant comparative analysis suggested that Latino and Latina adolescents, although profoundly underrepresented in engineering, bring a wealth of knowledge and experiences that are relevant to engineering design thinking and practice.
Toward Genome-Based Metabolic Engineering in Bacteria.
Oesterle, Sabine; Wuethrich, Irene; Panke, Sven
2017-01-01
Prokaryotes with stable genome modifications are of great importance for the production of fine and commodity chemicals. Traditional methods for genome engineering have long suffered from imprecision and low efficiency, making construction of suitable high-producer strains laborious. Here, we review recent advances in the discovery and refinement of molecular precision-engineering tools for genome-based metabolic engineering in bacteria for chemical production, with a focus on λ-Red recombineering and the clustered regularly interspaced short palindromic repeats/Cas9 nuclease systems. In conjunction, they enable the integration of in vitro-synthesized DNA segments into specified locations on the chromosome and allow enrichment of rare mutants by elimination of unmodified wild-type cells. Combination with concurrently developing improvements in important accessory technologies such as DNA synthesis, high-throughput screening methods, regulatory element design, and metabolic pathway optimization tools has resulted in novel efficient microbial producer strains and given access to new metabolic products. These new tools have made, and will likely continue to make, a big impact on the bioengineering strategies that transform the chemical industry. Copyright © 2017 Elsevier Inc. All rights reserved.
Michalowski, Martin; Wilk, Szymon; Tan, Xing; Michalowski, Wojtek
2014-01-01
Clinical practice guidelines (CPGs) implement evidence-based medicine designed to help generate a therapy for a patient suffering from a single disease. When applied to a comorbid patient, the concurrent combination of treatment steps from multiple CPGs is susceptible to adverse interactions in the resulting combined therapy (i.e., a therapy established according to all considered CPGs). This inability to concurrently apply CPGs has been shown to be one of the key shortcomings of CPG uptake in a clinical setting [1]. Several research efforts are underway to address this issue, such as the K4CARE [2] and GuideLine INteraction Detection Assistant (GLINDA) [3] projects and our previous research on applying constraint logic programming to developing a consistent combined therapy for a comorbid patient [4]. However, there is no generalized framework for mitigation that effectively captures general characteristics of the problem while handling nuances such as time and ordering requirements imposed by specific CPGs. In this paper we propose a first-order logic-based (FOL) approach for developing a generalized framework of mitigation. This approach uses a meta-algorithm and entailment properties to mitigate (i.e., identify and address) adverse interactions introduced by concurrently applied CPGs. We use an illustrative case study of a patient suffering from type 2 diabetes being treated for an onset of severe rheumatoid arthritis to show the expressiveness and robustness of our proposed FOL-based approach, and we discuss its appropriateness as the basis for the generalized theory.
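Stripped of the first-order-logic machinery, the core mitigation check is whether concurrently applied therapies introduce known adverse pairs. A much-simplified, hypothetical sketch of that check (the drug names and the ADVERSE table are illustrative inventions, not from the paper, and a flat set of pairs stands in for the paper's FOL axioms and meta-algorithm):

```python
# Hypothetical knowledge base: the real framework encodes interaction
# knowledge as first-order-logic axioms, not a flat table of pairs.
ADVERSE = {frozenset({"NSAID", "anticoagulant"})}

def combined_therapy_conflicts(therapy_a, therapy_b):
    """Return the adverse pairs introduced when two guideline-derived
    therapies (each modeled as a set of treatment steps) are combined."""
    return {frozenset({a, b})
            for a in therapy_a
            for b in therapy_b
            if frozenset({a, b}) in ADVERSE}

conflicts = combined_therapy_conflicts({"NSAID", "metformin"}, {"anticoagulant"})
```

Each flagged pair would then be addressed by revising one of the therapies, which is where the real framework's time and ordering constraints come into play.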
NASA Technical Reports Server (NTRS)
Sherwood, Brent; McCleese, Daniel
2012-01-01
Space science missions are increasingly challenged today: in ambition, by the increasingly sophisticated hypotheses tested; in development, by the increasing complexity of advanced technologies; in budgeting, by the decline of flagship-class mission opportunities; in management, by expectations for breakthrough science despite a risk-averse programmatic climate; and in planning, by increasing competition for scarce resources. How are the space-science missions of tomorrow being formulated? The paper describes the JPL Innovation Foundry, created in 2011 to respond to this evolving context. The Foundry integrates methods, tools, and experts that span the mission concept lifecycle. Grounded in JPL's heritage of missions, flight instruments, mission proposals, and concept innovation, the Foundry seeks to provide continuity of support and cost-effective, on-call access to the right domain experts at the right time, as science definition teams and Principal Investigators mature mission ideas from "cocktail napkin" to PDR. The Foundry blends JPL capabilities in proposal development and concurrent engineering, including Team X, with new approaches for open-ended concept exploration in earlier, cost-constrained phases, and with ongoing research and technology projects. It applies complexity and cost models, project-formulation lessons learned, and strategy analyses appropriate to each level of concept maturity. The Foundry is organizationally integrated with JPL formulation program offices; staffed by JPL's line organizations for engineering, science, and costing; and overseen by senior Laboratory leaders to assure experienced coordination and review. Incubation of each concept is tailored to its maturity, its proposal history, and its highest-leverage modeling and analysis needs.
Progress in Integrative Biomaterial Systems to Approach Three-Dimensional Cell Mechanotransduction
Zhang, Ying; Liao, Kin; Li, Chuan; Lai, Alvin C.K.; Foo, Ji-Jinn
2017-01-01
Mechanotransduction between cells and the extracellular matrix regulates major cellular functions in physiological and pathological situations. The effect of mechanical cues on biochemical signaling triggered by cell–matrix and cell–cell interactions on model biomimetic surfaces has been extensively investigated by a combination of fabrication, biophysical, and biological methods. To simulate the in vivo physiological microenvironment in vitro, three dimensional (3D) microstructures with tailored bio-functionality have been fabricated on substrates of various materials. However, less attention has been paid to the design of 3D biomaterial systems with geometric variances, such as the possession of precise micro-features and/or bio-sensing elements for probing the mechanical responses of cells to the external microenvironment. Such precisely engineered 3D model experimental platforms pave the way for studying the mechanotransduction of multicellular aggregates under controlled geometric and mechanical parameters. Concurrently with the progress in 3D biomaterial fabrication, cell traction force microscopy (CTFM) developed in the field of cell biophysics has emerged as a highly sensitive technique for probing the mechanical stresses exerted by cells onto the opposing deformable surface. In the current work, we first review the recent advances in the fabrication of 3D micropatterned biomaterials which enable the seamless integration with experimental cell mechanics in a controlled 3D microenvironment. Then, we discuss the role of collective cell–cell interactions in the mechanotransduction of engineered tissue equivalents determined by such integrative biomaterial systems under simulated physiological conditions. PMID:28952551
X-33 Attitude Control System Design for Ascent, Transition, and Entry Flight Regimes
NASA Technical Reports Server (NTRS)
Hall, Charles E.; Gallaher, Michael W.; Hendrix, Neal D.
1998-01-01
The Vehicle Control Systems Team at Marshall Space Flight Center, Systems Dynamics Laboratory, Guidance and Control Systems Division is designing, under a cooperative agreement with Lockheed Martin Skunkworks, the Ascent, Transition, and Entry flight attitude control system for the X-33 experimental vehicle. Ascent flight control begins at liftoff and ends at linear aerospike main engine cutoff (MECO), while Transition and Entry flight control begins at MECO and concludes at the terminal area energy management (TAEM) interface. TAEM occurs at approximately Mach 3.0. This task includes not only the design of the vehicle attitude control systems but also the development of requirements for attitude control system components and subsystems. The X-33 attitude control system design is challenged by a short design cycle, the design environment (Mach 0 to about Mach 15), and the X-33 incremental test philosophy. The X-33 design-to-launch cycle of less than 3 years requires a concurrent design approach, while the test philosophy requires design adaptation to vehicle variations that are a function of Mach number and mission profile. The flight attitude control system must deal with the mixing of aerosurfaces, reaction control thrusters, and linear aerospike engine control effectors, and handle parasitic effects such as vehicle flexibility and propellant sloshing from the uniquely shaped propellant tanks. The attitude control system design is, as usual, closely linked to many other subsystems and must deal with constraints and requirements from these subsystems.
ERIC Educational Resources Information Center
Bryant, Doug
This paper, titled "The Components of Emotional Intelligence and the Relationship to Sales Performance," presents two general approaches to studying emotional intelligence. The first is a broad model approach that considers abilities as well as a series of personality traits. The second is based on ability models. The possible correlation between…
Anthropological Approach and Activity Theory: Culture, Communities and Institutions
ERIC Educational Resources Information Center
Lagrange, Jean-Baptiste
2013-01-01
The goal of this paper is to evaluate the contribution of the anthropological approach (AA) concurrently to Activity Theory (AT) in view of overarching questions about classroom use of technology for teaching and learning mathematics. I will do it first from a philosophical point of view, presenting the main notions of AA that have been used to…
NASA Technical Reports Server (NTRS)
Sallee, G. P.; Martin, R. L.
1980-01-01
The JT9D jet engine exhibits a thrust specific fuel consumption (TSFC) loss of about 1 percent in the initial 50 flight cycles of a new engine. These early losses are caused by seal-wear-induced opening of running clearances in the engine gas path. The causes of this seal wear have been identified as flight-induced loads which deflect the engine cases and rotors, causing the rotating blades to rub against the seal surfaces and producing permanent clearance changes. The actual levels of flight loads encountered during airplane acceptance testing and revenue service, and the engine's response in the dynamic flight environment, were investigated. The feasibility of direct measurement of these flight loads and their effects, by concurrent measurement of 747/JT9D propulsion system aerodynamic and inertia loads and the critical engine clearance and performance changes during 747 flight and ground operations, was evaluated. A number of technical options were examined in relation to the total estimated program cost to facilitate selection of the most cost-effective option. It is concluded that a flight test program meeting the overall objective of determining the levels of aerodynamic and inertia loads to which the engine is exposed during the initial flight acceptance test and normal flight maneuvers is feasible and desirable. A specific recommended flight test program, based on the evaluation of cost effectiveness, is defined.
Feasibility study for convertible engine torque converter
NASA Technical Reports Server (NTRS)
1985-01-01
The feasibility study has shown that a dump/fill type torque converter has excellent potential for the convertible fan/shaft engine. The torque converter space requirement permits internal housing within the normal flow path of a turbofan engine at acceptable engine weight. The unit permits operating the engine in the turboshaft mode by decoupling the fan. To convert to turbofan mode, the torque converter's overdrive capability brings the fan speed up to the power turbine speed, permitting engagement of a mechanical lockup device when the shaft speeds are synchronized. The conversion to turbofan mode can be made without a drop in power turbine speed in less than 10 sec. Total thrust delivered to the aircraft by the proprotor, fan, and engine during the transient can be controlled to prevent loss of air speed or altitude. Heat rejection to the oil is low, and additional oil cooling capacity is not required. The turbofan engine aerodynamic design is basically uncompromised by convertibility and allows proper fan design for quiet and efficient cruise operation. Although the results of the feasibility study are exceedingly encouraging, it must be noted that they are based on extrapolation of limited existing data on torque converters. A component test program with three trial torque converter designs and concurrent computer modeling for fluid flow, stress, and dynamics, updated with test results from each unit, is recommended.
NASA Technical Reports Server (NTRS)
Phillips, Veronica J.
2017-01-01
The Ames Engineering Directorate is the principal engineering organization supporting aerospace systems and spaceflight projects at NASA's Ames Research Center in California's Silicon Valley. The Directorate supports all phases of engineering and project management for flight and mission projects, from R&D to close-out, by leveraging the capabilities of multiple divisions and facilities. The Mission Design Center (MDC) has full end-to-end mission design capability with sophisticated analysis and simulation tools in a collaborative concurrent design environment. Services include concept maturity level (CML) maturation, spacecraft design and trades, scientific instrument selection, feasibility assessments, and proposal support and partnerships. The Engineering Systems Division provides robust project management support as well as systems engineering, mechanical and electrical analysis and design, technical authority, and project integration support to a variety of programs and projects across NASA centers. The Applied Manufacturing Division turns abstract ideas into tangible hardware for aeronautics, spaceflight, and science applications, specializing in fabrication methods and management of complex fabrication projects. The Engineering Evaluation Lab (EEL) provides full satellite or payload environmental testing services including vibration, temperature, humidity, immersion, pressure/altitude, vacuum, high-G centrifuge, and shock impact testing, as well as the Flight Processing Center (FPC), which includes cleanrooms, bonded stores, and flight preparation resources. The Multi-Mission Operations Center (MMOC) is composed of the facilities, networks, IT equipment, software, and support services needed by flight projects to effectively and efficiently perform all mission functions, including planning, scheduling, command, telemetry processing, and science analysis.
Ruglass, Lesia M; Shevorykin, Alina; Brezing, Christina; Hu, Mei-Chen; Hien, Denise A
2017-09-01
While the detrimental effects of concurrent substance use disorders (SUDs) are now well documented, very few studies have examined this comorbidity among women with posttraumatic stress disorder (PTSD). Data for these analyses were derived from the "Women and Trauma" study conducted within the National Drug Abuse Treatment Clinical Trials Network. Women with full or subthreshold PTSD and co-occurring cannabis use disorder (CUD) and cocaine use disorder (COD; N=99) were compared to their counterparts with co-occurring CUD only (N=26) and co-occurring COD only (N=161) on rates of trauma exposure, psychiatric disorders, psychosocial problems, and other substance use, utilizing a set of multivariate logistic regressions. In models adjusted for age and race/ethnicity, women with PTSD and COD only were significantly older than their counterparts with CUD only and concurrent CUD+COD. Relative to those with CUD only, women with concurrent CUD+COD had higher odds of adult sexual assault. Relative to those with COD only, women with concurrent CUD+COD had higher odds of alcohol use disorder in the past 12 months. Finally, relative to those with CUD only, women with COD only had higher odds of ever being arrested/convicted and of adult sexual assault. The higher rates of adult sexual assault and alcohol use disorder among those with concurrent CUD+COD suggest the need for trauma-informed approaches that can respond to the needs of this dually diagnosed population. Moreover, the causal link between repeated traumatic stress exposure and polysubstance use requires further examination. Copyright © 2017 Elsevier Inc. All rights reserved.
Concurrent design of quasi-random photonic nanostructures
Lee, Won-Kyu; Yu, Shuangcheng; Engel, Clifford J.; Reese, Thaddeus; Rhee, Dongjoon; Chen, Wei
2017-01-01
Nanostructured surfaces with quasi-random geometries can manipulate light over broadband wavelengths and wide ranges of angles. Optimization and realization of stochastic patterns have typically relied on serial, direct-write fabrication methods combined with real-space design. However, this approach is not suitable for customizable features or scalable nanomanufacturing. Moreover, trial-and-error processing cannot guarantee fabrication feasibility because processing–structure relations are not included in conventional designs. Here, we report wrinkle lithography integrated with concurrent design to produce quasi-random nanostructures in amorphous silicon at wafer scales that achieved over 160% light absorption enhancement from 800 to 1,200 nm. The quasi-periodicity of patterns, materials filling ratio, and feature depths could be independently controlled. We statistically represented the quasi-random patterns by Fourier spectral density functions (SDFs) that could bridge the processing–structure and structure–performance relations. Iterative search of the optimal structure via the SDF representation enabled concurrent design of nanostructures and processing. PMID:28760975
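The statistical representation described above can be illustrated with a radially averaged power spectral density computed by FFT. This is a generic sketch of an SDF for a 2-D pattern, with an assumed uniform binning scheme, not the authors' exact definition:

```python
import numpy as np

def radial_sdf(pattern, nbins=32):
    """Radially averaged power spectral density of a 2-D pattern: a compact
    1-D statistical summary of a quasi-random geometry."""
    h, w = pattern.shape
    # Power spectrum of the mean-removed pattern, zero frequency centered.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(pattern - pattern.mean()))) ** 2
    fy = np.fft.fftshift(np.fft.fftfreq(h))
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    r = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))  # radial frequency
    idx = np.digitize(r.ravel(), np.linspace(0, r.max(), nbins + 1)) - 1
    idx = idx.clip(0, nbins - 1)
    power = np.bincount(idx, weights=spec.ravel(), minlength=nbins)
    counts = np.bincount(idx, minlength=nbins)
    return power / np.maximum(counts, 1)

rng = np.random.default_rng(0)
sdf = radial_sdf(rng.random((64, 64)) > 0.5)  # SDF of a random binary pattern
```

Two patterns with very different real-space layouts but similar SDFs are statistically equivalent under this representation, which is what lets the SDF bridge processing-structure and structure-performance relations.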
Thirteen years and counting: Outcomes of a concurrent ASN/BSN enrollment program.
Heglund, Stephen; Simmons, Jessica; Wink, Diane; D'Meza Leuner, Jean
In their 2011 report, The Future of Nursing, the Institute of Medicine called for 80% of the nursing workforce to be composed of baccalaureate-prepared Registered Nurses by the year 2020. One suggested approach to achieving this goal is the creation of programs that allow students to progress through associate and baccalaureate nursing preparation simultaneously. This paper describes the University of Central Florida's 13-year experience after implementing a Concurrent Enrollment Program. The development and structure of the program, advisement and curriculum details, and facilitators and barriers are described. Data on National Council Licensure Examination for Registered Nurses pass rates, completion rates, comparison with traditional RN-BSN students, and progression to graduate school are also included. The Concurrent Program model described here, between a specific university and state college partners, demonstrated positive outcomes that support achievement of the Institute of Medicine's goals. Copyright © 2017 Elsevier Inc. All rights reserved.
Design, Development and Utilization Perspectives on Database Management Systems
ERIC Educational Resources Information Center
Shneiderman, Ben
1977-01-01
This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)
Launch vehicle systems design analysis
NASA Technical Reports Server (NTRS)
Ryan, Robert; Verderaime, V.
1993-01-01
Current launch vehicle design emphasis is on low life-cycle cost. This paper applies total quality management (TQM) principles to a conventional systems design analysis process to provide low-cost, high-reliability designs. Suggested TQM techniques include Steward's systems information flow matrix method, the quality leverage principle, quality through robustness and function deployment, Pareto's principle, Pugh's selection and enhancement criteria, and other design process procedures. TQM quality performance at least cost can be realized through competent concurrent engineering teams and the brilliance of their technical leadership.
1983-06-01
...for DEC PDP-11 systems. MAINSAIL was developed and is marketed with a set of integrated tools for program development. The syntax of the language is... stack, and to test for stack-full and stack-empty conditions. This technique is useful in enforcing data integrity and in controlling concurrent... and market MAINSAIL. The language is distinguished by its portability. The same compiler and runtime system, both written in MAINSAIL, are the basis...
SEI Report on Graduate Software Engineering Education for 1991
1991-04-01
12, 12 (Dec. 1979), 85-94. Andrews83 Andrews, Gregory R. and Schneider, Fred B. “Concepts and Notations for Concurrent Programming.” ACM Computing... Barringer87 Barringer, H. “Up and Down the Temporal Way.” Computer J. 30, 2 (Apr. 1987), 134-148. Bjørner78 The Vienna Development Method: The Meta-Language... Lecture Notes in Computer Science. Bruns86 Bruns, Glenn R. Technology Assessment: PAISLEY. Tech. Rep. MCC TR STP-296-86, MCC, Austin, Texas, Sept...
The development of internet based ship design support system for small and medium sized shipyards
NASA Astrophysics Data System (ADS)
Shin, Sung-Chul; Lee, Soon-Sup; Kang, Dong-Hoon; Lee, Kyung-Ho
2012-03-01
In this paper, a prototype ship basic planning system is implemented for small and medium sized shipyards based on internet technology and the concurrent engineering concept. The system was designed from user requirements; accordingly, a standardized development environment and tools were selected. These tools are used for the system development to define and evaluate core application technologies. The system will contribute to increasing the competitiveness of small and medium sized shipyards in the 21st-century industrial environment.
Planetary exploration with nanosatellites: a space campus for future technology development
NASA Astrophysics Data System (ADS)
Drossart, P.; Mosser, B.; Segret, B.
2017-09-01
Planetary exploration is on the eve of a revolution through nanosatellites accompanying larger missions, or freely cruising in the solar system, providing a man-made cosmic web for in situ or remote sensing exploration of the Solar System. A first step is to build a specific place dedicated to nanosatellite development. The CCERES PSL space campus provides an environment for nanosatellite testing and integration, a concurrent engineering facility for project analysis, and a science environment dedicated to this task.
An Element-Based Concurrent Partitioner for Unstructured Finite Element Meshes
NASA Technical Reports Server (NTRS)
Ding, Hong Q.; Ferraro, Robert D.
1996-01-01
A concurrent partitioner for partitioning unstructured finite element meshes on distributed memory architectures is developed. The partitioner uses an element-based partitioning strategy. Its main advantage over the more conventional node-based partitioning strategy is its modular programming approach to the development of parallel applications. The partitioner first partitions element centroids using a recursive inertial bisection algorithm. Elements and nodes then migrate according to the partitioned centroids, using a data request communication template for unpredictable incoming messages. Our scalable implementation is contrasted with a non-scalable implementation, which is a straightforward parallelization of a sequential partitioner.
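The recursive inertial bisection step described above can be sketched as follows. This is a minimal serial illustration of the idea (split along the principal inertial axis at the median projection), not the authors' distributed implementation:

```python
import numpy as np

def inertial_bisection(centroids, depth):
    """Recursively bisect element centroids along the principal inertial
    axis, returning a part label (0 .. 2**depth - 1) for each element."""
    labels = np.zeros(len(centroids), dtype=int)
    if depth == 0 or len(centroids) < 2:
        return labels
    # Principal axis: eigenvector of the covariance with the largest eigenvalue.
    centered = centroids - centroids.mean(axis=0)
    _, eigvecs = np.linalg.eigh(centered.T @ centered)
    proj = centered @ eigvecs[:, -1]
    right = proj > np.median(proj)  # median split keeps the halves balanced
    labels[~right] = inertial_bisection(centroids[~right], depth - 1)
    labels[right] = inertial_bisection(centroids[right], depth - 1) + 2 ** (depth - 1)
    return labels

rng = np.random.default_rng(0)
parts = inertial_bisection(rng.random((1000, 2)), depth=3)  # 8 balanced parts
```

The median split guarantees balanced parts; in the concurrent setting each half would be handled by a subset of processors, with elements and nodes migrating afterwards.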
Approaches to Integrating Engineering in STEM Units and Student Achievement Gains
ERIC Educational Resources Information Center
Crotty, Elizabeth A.; Guzey, Selcen S.; Roehrig, Gillian H.; Glancy, Aran W.; Ring-Whalen, Elizabeth A.
2017-01-01
This study examined different approaches to integrating engineering practices in science, technology, engineering, and mathematics (STEM) curriculum units. These various approaches were correlated with student outcomes on engineering assessment items. There are numerous reform documents in the USA and around the world that emphasize the need to…
Concurrent initialization for Bearing-Only SLAM.
Munguía, Rodrigo; Grau, Antoni
2010-01-01
Simultaneous Localization and Mapping (SLAM) is perhaps the most fundamental problem to solve in robotics in order to build truly autonomous mobile robots. The sensors have a large impact on the algorithm used for SLAM. Early SLAM approaches focused on the use of range sensors such as sonar rings or lasers. However, cameras have become more and more widely used, because they yield a lot of information and are well adapted to embedded systems: they are light, cheap, and power saving. Unlike range sensors, which provide range and angular information, a camera is a projective sensor that measures the bearing of image features. Therefore depth information (range) cannot be obtained in a single step. This fact has prompted the emergence of a new family of SLAM algorithms: the Bearing-Only SLAM methods, which mainly rely on special techniques for feature initialization in order to enable the use of bearing sensors (such as cameras) in SLAM systems. In this work a novel and robust method, called Concurrent Initialization, is presented, inspired by combining the complementary advantages of the Undelayed and Delayed methods that represent the most common approaches for addressing the problem. The key is to use concurrently two kinds of feature representations for both the undelayed and delayed stages of the estimation. The simulation results show that the proposed method surpasses the performance of previous schemes.
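To see why bearing-only features need special initialization, note that a single bearing measurement fixes a landmark's direction but not its depth; only a second viewpoint makes the position observable. A minimal 2-D triangulation sketch of that observation (an illustration of the underlying geometry, not the Concurrent Initialization algorithm itself):

```python
import numpy as np

def triangulate_2d(p1, b1, p2, b2):
    """Intersect two bearing rays: angles b1, b2 measured from poses p1, p2.
    One bearing fixes direction but not depth; the second viewpoint makes
    the landmark position observable."""
    d1 = np.array([np.cos(b1), np.sin(b1)])
    d2 = np.array([np.cos(b2), np.sin(b2)])
    # Solve p1 + t1*d1 == p2 + t2*d2 for the ray parameters (t1, t2).
    t1, _ = np.linalg.solve(np.column_stack([d1, -d2]),
                            np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t1 * d1

# A landmark at (2, 1) observed from two poses on the x-axis.
landmark = triangulate_2d((0, 0), np.arctan2(1, 2), (1, 0), np.arctan2(1, 1))
```

Delayed methods wait until such a second viewpoint exists before adding the feature; undelayed methods add it immediately with an uninformative depth; the paper's contribution is to run both representations concurrently.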
Concurrent profiling of polar metabolites and lipids in human plasma using HILIC-FTMS
NASA Astrophysics Data System (ADS)
Cai, Xiaoming; Li, Ruibin
2016-11-01
Blood plasma is the most widely used sample matrix for metabolite profiling studies, which aim to achieve global metabolite profiling and biomarker discovery. However, most current studies on plasma metabolite profiling have focused on either the polar metabolites or the lipids. In this study, a comprehensive analysis approach based on HILIC-FTMS was developed to concurrently examine polar metabolites and lipids. The HILIC-FTMS method was developed using mixed standards of polar metabolites and lipids, whose separation efficiency is better in HILIC mode than in C5 and C18 reversed phase (RP) chromatography. This method exhibits good reproducibility in retention times (CVs < 3.43%) and high mass accuracy (<3.5 ppm). In addition, we found that a MeOH/ACN/acetone (1:1:1, v/v/v) extraction cocktail achieved desirable recovery of the required extracts from plasma samples. We further integrated the MeOH/ACN/acetone extraction with the HILIC-FTMS method for metabolite profiling and smoking-related biomarker discovery in human plasma samples. Heavy smokers could be successfully distinguished from nonsmokers by univariate and multivariate statistical analysis of the profiling data, and 62 biomarkers of cigarette smoke exposure were found. These results indicate that our concurrent analysis approach could potentially be used for clinical biomarker discovery, metabolite-based diagnosis, etc.
The approach to engineering tasks composition on knowledge portals
NASA Astrophysics Data System (ADS)
Novogrudska, Rina; Globa, Larysa; Schill, Alexsander; Romaniuk, Ryszard; Wójcik, Waldemar; Karnakova, Gaini; Kalizhanova, Aliya
2017-08-01
The paper presents an approach to engineering task composition on engineering knowledge portals. The specific features of engineering tasks are highlighted, and their analysis forms the basis for partial engineering task integration. A formal algebraic system for engineering task composition is proposed, allowing context-independent formal structures for the elements of engineering tasks to be defined. A method of engineering task composition is developed that integrates partial calculation tasks into general calculation tasks on engineering portals, executed on user request. The real-world scenario «Calculation of the strength for the power components of magnetic systems» is presented, demonstrating the applicability and efficiency of the proposed approach.
Olson, Robert; Karam, Irene; Wilson, Gavin; Bowman, Angela; Lee, Christopher; Wong, Frances
2013-12-01
The purpose of this study is to compare patient outcomes between a therapeutic versus a prophylactic gastrostomy tube (GT) placement approach in patients treated with concurrent systemic and radiation therapy (SRT) for head and neck cancer (HNC). Outcomes were compared between all HNC patients treated with concurrent SRT from January 2001 to June 2009 from a center that only places GTs therapeutically when clinically necessary (center A) versus a center that generally places them prophylactically (center B). A total of 445 patients with HNC were identified, with 63 % from center A. As anticipated, GTs were placed less commonly in center A compared to B (31 versus 88 %; p < 0.001). Center B had a significantly higher number of GT complications (p < 0.001), including infection (16 versus 5 %), leakage (10 versus 2 %), and blockage (3 versus 1 %). Conversely, center A had a higher admission rate (27 versus 13 %, p = 0.001), most prominently for GT-related issues (15 versus 6 %). Center B had higher GT dependence at 90 days post-radiation therapy (34 versus 12 %; p < 0.001), but not at 1 year (11 versus 10 %; p = 0.74). There was no significant difference in the proportion of head and neck patients who had a 10 % weight loss at 1 year (compared to baseline) between centers A and B (42 versus 53 %, p = 0.07). There was no significant difference in overall survival (A versus B, HR = 0.99; p = 0.96). A prophylactic GT approach exposes a higher number of patients to GT complications. The higher rate of hospitalizations with a therapeutic approach suggests that patients are sicker when GTs are required. Given the similar weight loss and survival, a therapeutic approach at an earlier stage of need may be preferable, when access to prompt GT placement is available.
Pan, Hao; Ma, Jing; Ma, Ji; Zhang, Qinghua; Liu, Xiaozhi; Guan, Bo; Gu, Lin; Zhang, Xin; Zhang, Yu-Jun; Li, Liangliang; Shen, Yang; Lin, Yuan-Hua; Nan, Ce-Wen
2018-05-08
Developing high-performance film dielectrics for capacitive energy storage has been a great challenge for modern electrical devices. Despite good results obtained in lead titanate-based dielectrics, lead-free alternatives are strongly desirable due to environmental concerns. Here we demonstrate that giant energy densities of ~70 J cm⁻³, together with high efficiency as well as excellent cycling and thermal stability, can be achieved in lead-free bismuth ferrite-strontium titanate solid-solution films through domain engineering. It is revealed that the incorporation of strontium titanate transforms the ferroelectric micro-domains of bismuth ferrite into highly dynamic polar nano-regions, resulting in a ferroelectric to relaxor-ferroelectric transition with concurrently improved energy density and efficiency. Additionally, the introduction of strontium titanate greatly improves the electrical insulation and breakdown strength of the films by suppressing the formation of oxygen vacancies. This work opens up a feasible and propagable route, i.e., domain engineering, to systematically develop new lead-free dielectrics for energy storage.
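The energy density and efficiency figures quoted above are conventionally extracted from polarization-electric-field loops via the standard relations (general dielectric energy-storage formulas, not specific to this work): the recoverable energy density is the area between the discharge branch of the loop and the polarization axis, and the efficiency compares it to the total stored energy.

```latex
U_{\mathrm{rec}} = \int_{P_r}^{P_{\max}} E \, \mathrm{d}P,
\qquad
\eta = \frac{U_{\mathrm{rec}}}{U_{\mathrm{rec}} + U_{\mathrm{loss}}}
```

Slim, slanted loops (small remanent polarization $P_r$, large maximum polarization $P_{\max}$, high breakdown field) therefore favor both $U_{\mathrm{rec}}$ and $\eta$, which is why the relaxor-ferroelectric transition improves both concurrently.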
The BOEING 777 - concurrent engineering and digital pre-assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abarbanel, B.
The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15,000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results. The 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.
Analysis of Aurora's Performance Simulation Engine for Three Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, Janine; Simon, Joseph
2015-07-07
Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.
Integrating reliability and maintainability into a concurrent engineering environment
NASA Astrophysics Data System (ADS)
Phillips, Clifton B.; Peterson, Robert R.
1993-02-01
This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry. Private industry thought the study was important and provided the university access to innovative tools under cooperative agreement. The current capability of reliability and maintainability tools and how they fit into the design process is investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life cycle performance improvement information to engineers and their managers before systems are deployed, and allow them to monitor and track performance while it is in operation.
Materials and Process Activities for NASA's Composite Crew Module
NASA Technical Reports Server (NTRS)
Polis, Daniel L.
2012-01-01
In January 2007, the NASA Administrator and Associate Administrator for the Exploration Systems Mission Directorate chartered the NASA Engineering and Safety Center (NESC) to design, build, and test a full-scale Composite Crew Module (CCM). The overall goal of the CCM project was to develop a team from the NASA family with hands-on experience in composite design, manufacturing, and testing in anticipation of future space exploration systems being made of composite materials. The CCM project was planned to run concurrently with the Orion project's baseline metallic design within the Constellation Program so that features could be compared and discussed without inducing risk to the overall Program. The materials and process activities were prioritized based on a rapid prototype approach. This approach focused developmental activities on design details with greater risk and uncertainty, such as out-of-autoclave joining, over some of the more traditional lamina and laminate building block levels. While process development and associated building block testing were performed, several anomalies were still observed at the full-scale level due to interactions between process robustness and manufacturing scale-up. This paper describes the process anomalies that were encountered during the CCM development and the subsequent root cause investigations that led to the final design solutions. These investigations highlight the importance of full-scale developmental work early in the schedule of a complex composite design/build project.
Short-course versus long-course chemoradiation in rectal cancer--time to change strategies?
Palta, Manisha; Willett, Christopher G; Czito, Brian G
2014-09-01
There is significant debate regarding the optimal neoadjuvant regimen for resectable rectal cancer patients. Short-course radiotherapy, a standard approach throughout most of northern Europe, is generally defined as 25 Gy in 5 fractions over the course of 1 week without the concurrent administration of chemotherapy. Long-course radiotherapy is typically defined as 45 to 50.4 Gy in 25-28 fractions with the administration of concurrent 5-fluoropyrimidine-based chemotherapy and is the standard approach in other parts of Europe and the United States. At present, two randomized trials have compared outcomes for short-course radiotherapy with long-course chemoradiation, showing no difference in respective study endpoints. Late toxicity data are lacking given limited follow-up. Although the ideal neoadjuvant regimen is controversial, our current bias is long-course chemoradiation to treat patients with locally advanced, resectable rectal cancer.
Hybrid Parallelism for Volume Rendering on Large-, Multi-, and Many-Core Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howison, Mark; Bethel, E. Wes; Childs, Hank
2012-01-01
With the computing industry trending towards multi- and many-core processors, we study how a standard visualization algorithm, ray-casting volume rendering, can benefit from a hybrid parallelism approach. Hybrid parallelism provides the best of both worlds: using distributed-memory parallelism across a large number of nodes increases available FLOPs and memory, while exploiting shared-memory parallelism among the cores within each node ensures that each node performs its portion of the larger calculation as efficiently as possible. We demonstrate results from weak and strong scaling studies, at levels of concurrency ranging up to 216,000, and with datasets as large as 12.2 trillion cells. The greatest benefit from hybrid parallelism lies in the communication portion of the algorithm, the dominant cost at higher levels of concurrency. We show that reducing the number of participants with a hybrid approach significantly improves performance.
Case Study: Developing Graduate Engineers at Kentz Engineers & Constructors
ERIC Educational Resources Information Center
O'Donnell, Hugh; Karallis, Takis; Sandelands, Eric; Cassin, James; O'Neill, Donal
2008-01-01
Purpose: The aim of this paper is to outline the approach and process in place within Kentz Engineers & Constructors to develop graduate engineers on an international basis. Design/methodology/approach: The approach adopted is that of a case study which describes activities and processes within the organization and the rationale behind them,…
Bi-Level Integrated System Synthesis (BLISS) for Concurrent and Distributed Processing
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Altus, Troy D.; Phillips, Matthew; Sandusky, Robert
2002-01-01
The paper introduces a new version of the Bi-Level Integrated System Synthesis (BLISS) methods intended for optimization of engineering systems conducted by distributed specialty groups working concurrently and using a multiprocessor computing environment. The method decomposes the overall optimization task into subtasks associated with disciplines or subsystems, in which the local design variables are numerous, and a single system-level optimization whose design variables are relatively few. The subtasks are fully autonomous as to their inner operations and decision making. Their purpose is to eliminate the local design variables and generate a wide spectrum of feasible designs whose behavior is represented by response surfaces to be accessed by a system-level optimization. It is shown that, if the problem is convex, the solution of the decomposed problem is the same as that obtained without decomposition. A simplified example of an aircraft design shows the method working as intended. The paper includes a discussion of the method's merits and demerits and recommendations for further research.
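The bi-level idea described in this abstract can be illustrated with a deliberately tiny sketch: a subsystem optimizer eliminates its local variable for each sampled value of the system variable, and the system-level optimizer then searches only the resulting response surface. This is a hypothetical toy, not the actual BLISS algorithm; the function names and the quadratic objective are invented for illustration.

```python
# Toy sketch of bi-level decomposition (hypothetical, not the BLISS code):
# each subsystem eliminates its local variable for a sampled system variable,
# returning a response-surface value that the system-level optimizer searches.

def subsystem_best(x_sys):
    # Subtask: minimize over the local variable x_loc for a fixed system
    # variable x_sys. The objective (x_loc - x_sys)^2 + x_sys^2 is convex,
    # so the local optimum is x_loc = x_sys, leaving the value x_sys**2.
    candidates = [i / 100.0 for i in range(-200, 201)]
    return min((x_loc - x_sys) ** 2 + x_sys ** 2 for x_loc in candidates)

# Build a "response surface" by sampling the subsystem over system variables.
samples = {x / 10.0: subsystem_best(x / 10.0) for x in range(-10, 11)}

# System-level optimization now involves only the few system variables.
x_best = min(samples, key=samples.get)
print(x_best)  # for this convex problem, the undecomposed optimum x = 0
```

Because the toy problem is convex, the decomposed search recovers the same optimum as minimizing over both variables jointly, mirroring the convexity result stated in the abstract.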
Kumar, Pavan; Jander, Georg
2017-04-05
Potatoes (Solanum tuberosum) are deficient in methionine, an essential amino acid in human and animal diets. Higher methionine levels increase the nutritional quality and promote the typically pleasant aroma associated with baked and fried potatoes. Several attempts have been made to elevate tuber methionine levels by genetic engineering of methionine biosynthesis and catabolism. Overexpressing Arabidopsis thaliana cystathionine γ-synthase (AtCGS) in S. tuberosum up-regulates a rate-limiting step of methionine biosynthesis and increases tuber methionine levels. Alternatively, silencing S. tuberosum methionine γ-lyase (StMGL), which causes decreased degradation of methionine into 2-ketobutyrate, also increases methionine levels. Concurrently enhancing biosynthesis and reducing degradation were predicted to provide further increases in tuber methionine content. Here we report that S. tuberosum cv. Désirée plants with AtCGS overexpression and StMGL silenced by RNA interference are morphologically normal and accumulate higher free methionine levels than either single-transgenic line.
Toward ubiquitous healthcare services with a novel efficient cloud platform.
He, Chenguang; Fan, Xiaomao; Li, Ye
2013-01-01
Ubiquitous healthcare services are becoming more and more popular, especially under the urgent demand of the global aging issue. Cloud computing owns the pervasive and on-demand service-oriented natures, which can fit the characteristics of healthcare services very well. However, the abilities in dealing with multimodal, heterogeneous, and nonstationary physiological signals to provide persistent personalized services, meanwhile keeping high concurrent online analysis for the public, are challenges to the general cloud. In this paper, we propose a private cloud platform architecture which includes six layers according to the specific requirements. This platform utilizes a message queue as a cloud engine, and each layer thereby achieves relative independence through this loosely coupled means of communication with a publish/subscribe mechanism. Furthermore, a plug-in algorithm framework is also presented, and massive semistructured or unstructured medical data are accessed adaptively by this cloud architecture. As the testing results show, this proposed cloud platform, with robust, stable, and efficient features, can satisfy high concurrent requests from ubiquitous healthcare services.
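The loosely coupled publish/subscribe layering this abstract describes can be sketched with Python's standard `queue` and `threading` modules. The class and topic names below are invented for illustration (the paper's platform is not public); the point is only that layers interact through topics on a broker, never directly with each other.

```python
import queue
import threading

# Minimal publish/subscribe broker sketch (hypothetical names). Layers
# communicate only through topics, so each layer stays loosely coupled.

class Broker:
    def __init__(self):
        self.subscribers = {}          # topic -> list of per-subscriber queues

    def subscribe(self, topic):
        q = queue.Queue()
        self.subscribers.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, message):
        # Fan the message out to every queue subscribed to this topic.
        for q in self.subscribers.get(topic, []):
            q.put(message)

broker = Broker()
inbox = broker.subscribe("ecg/raw")    # e.g. an analysis layer listening

def sensor_layer():
    # e.g. a data-access layer pushing one physiological sample
    broker.publish("ecg/raw", {"patient": 1, "sample": 0.42})

t = threading.Thread(target=sensor_layer)
t.start()
t.join()
print(inbox.get(timeout=1))  # the analysis layer receives the sample
```

A plug-in algorithm, in this picture, is simply another subscriber queue on the relevant topic, which is why layers can be added or replaced without touching the rest of the platform.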
Continuous integration for concurrent MOOSE framework and application development on GitHub
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...
2015-11-20
For the past several years, Idaho National Laboratory's MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project's development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.
Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models
NASA Astrophysics Data System (ADS)
Altuntas, Alper; Baugh, John
2017-07-01
Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.
ERIC Educational Resources Information Center
Cooper, Joseph N.; Hall, Jori
2016-01-01
The purpose of this article is to describe how a mixed methods approach was employed to acquire a better understanding of Black male student athletes' experiences at a historically Black college/university in the southeastern United States. A concurrent triangulation design was incorporated to allow different data sources to be collected and…
ERIC Educational Resources Information Center
Huet, Michael; Jacobs, David M.; Camachon, Cyril; Missenard, Olivier; Gray, Rob; Montagne, Gilles
2011-01-01
The present study reports two experiments in which a total of 20 participants without prior flight experience practiced the final approach phase in a fixed-base simulator. All participants received self-controlled concurrent feedback during 180 practice trials. Experiment 1 shows that participants learn more quickly under variable practice…
A Unified Approach to IRT Scale Linking and Scale Transformations. Research Report. RR-04-09
ERIC Educational Resources Information Center
von Davier, Matthias; von Davier, Alina A.
2004-01-01
This paper examines item response theory (IRT) scale transformations and IRT scale linking methods used in the Non-Equivalent Groups with Anchor Test (NEAT) design to equate two tests, X and Y. It proposes a unifying approach to the commonly used IRT linking methods: mean-mean, mean-var linking, concurrent calibration, Stocking and Lord and…
Emergence of Scaffold-free Approaches for Tissue Engineering Musculoskeletal Cartilages
DuRaine, Grayson D.; Brown, Wendy E.; Hu, Jerry C.; Athanasiou, Kyriacos A.
2014-01-01
This review explores scaffold-free methods as an additional paradigm for tissue engineering. Musculoskeletal cartilages – for example, articular cartilage, meniscus, temporomandibular joint disc, and intervertebral disc – are characterized by low vascularity and cellularity, and are amenable to scaffold-free tissue engineering approaches. Scaffold-free approaches, particularly the self-assembling process, mimic elements of developmental processes underlying these tissues. Discussed are various scaffold-free approaches for musculoskeletal cartilage tissue engineering, such as cell sheet engineering, aggregation, and the self-assembling process, as well as the availability and variety of cells used. Immunological considerations are of particular importance as engineered tissues are frequently of allogeneic, if not xenogeneic, origin. Factors that enhance the matrix production and mechanical properties of these engineered cartilages are also reviewed, as the fabrication of biomimetically suitable tissues is necessary to replicate function and ensure graft survival in vivo. The concept of combining scaffold-free and scaffold-based tissue engineering methods to address clinical needs is also discussed. Inasmuch as scaffold-based musculoskeletal tissue engineering approaches have been employed as a paradigm to generate engineered cartilages with appropriate functional properties, scaffold-free approaches are emerging as promising elements of a translational pathway not only for musculoskeletal cartilages but for other tissues as well. PMID:25331099
NASA Astrophysics Data System (ADS)
Dujarric, C.; Santovincenzo, A.; Summerer, L.
2013-03-01
Conventional propulsion technology (chemical and electric) currently limits the possibilities for human space exploration to the neighborhood of the Earth. If farther destinations (such as Mars) are to be reached with humans on board, a more capable interplanetary transfer engine featuring high thrust and high specific impulse is required. The source of energy which could in principle best meet these engine requirements is nuclear thermal. However, nuclear thermal rocket technology is not yet ready for flight application. The development of new materials, which is necessary for the nuclear core, will require further ground testing of full-scale nuclear rocket engines. Such testing is a powerful inhibitor to nuclear rocket development, as the risks of nuclear contamination of the environment cannot be entirely avoided with current concepts. Alongside more mature activities in the field of space nuclear power sources for generating on-board power, a low-level investigation into nuclear propulsion has long been running within ESA, and innovative concepts were already proposed at an IAF conference in 1999 [1, 2]. Following a slow maturation process, a new concept was defined and submitted to a concurrent design exercise at ESTEC in 2007. Great care was taken in the selection of the design parameters to ensure that this quite innovative concept would in all respects likely be feasible with margins. However, a thorough feasibility demonstration will require a more detailed design, including the selection of appropriate materials and the verification that these can withstand the expected mechanical, thermal, and chemical environment. So far, the predefinition work made clear that, based on conservative technology assumptions, a specific impulse of 920 s could be obtained with a thrust of 110 kN.
Despite the heavy engine dry mass, a preliminary mission analysis using conservative assumptions showed that the concept reduced the required Initial Mass in Low Earth Orbit compared to conventional nuclear thermal rockets for a human mission to Mars. Of course, the realization of this concept still requires proper engineering and the dimensioning of quite unconventional machinery. A patent was filed on the concept. Because of the operating parameters of the nuclear core, which are very specific to this type of concept, it seems possible to ground-test this kind of engine at full scale, in a closed loop, using a reasonably sized test facility under safe and clean conditions. Such tests can be conducted within a fully confined enclosure, which would substantially increase the associated inherent nuclear safety levels. This breakthrough removes a showstopper for nuclear rocket engine development. The present paper will disclose the NTER (Nuclear Thermal Electric Rocket) engine concept, present some of the results of the ESTEC concurrent engineering exercise, and explain the concept for the NTER on-ground testing facility. Regulations and safety issues related to the development and implementation of the NTER concept will be addressed as well.
The rise of concurrent care for veterans with advanced cancer at the end of life.
Mor, Vincent; Joyce, Nina R; Coté, Danielle L; Gidwani, Risha A; Ersek, Mary; Levy, Cari R; Faricy-Anderson, Katherine E; Miller, Susan C; Wagner, Todd H; Kinosian, Bruce P; Lorenz, Karl A; Shreve, Scott T
2016-03-01
Unlike Medicare, the Veterans Health Administration (VA) health care system does not require veterans with cancer to make the "terrible choice" between receipt of hospice services or disease-modifying chemotherapy/radiation therapy. For this report, the authors characterized the VA's provision of concurrent care, defined as days in the last 6 months of life during which veterans simultaneously received hospice services and chemotherapy or radiation therapy. This retrospective cohort study included veteran decedents with cancer during 2006 through 2012 who were identified from claims with cancer diagnoses. Hospice and cancer treatment were identified using VA and Medicare administrative data. Descriptive statistics were used to characterize the changes in concurrent care, hospice, palliative care, and chemotherapy or radiation treatment. The proportion of veterans receiving chemotherapy or radiation therapy remained stable at approximately 45%, whereas the proportion of veterans who received hospice increased from 55% to 68%. The receipt of concurrent care also increased during this time from 16.2% to 24.5%. The median time between hospice initiation and death remained stable at around 21 days. Among veterans who received chemotherapy or radiation therapy in their last 6 months of life, the median time between treatment termination and death ranged from 35 to 40 days. There was considerable variation between VA medical centers in the use of concurrent care (interquartile range, 16%-34% in 2012). Concurrent receipt of hospice and chemotherapy or radiation therapy increased among veterans dying from cancer without reductions in the receipt of cancer therapy. This approach reflects the expansion of hospice services in the VA with VA policy allowing the concurrent receipt of hospice and antineoplastic therapies. Cancer 2016;122:782-790. © 2015 American Cancer Society. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S
2013-11-09
In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts, which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts from concurrently recorded EEG datasets. It consists of an Analysis, a Correction, and an Evaluation framework allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared with different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three modalities (data analysis, artifact correction, and evaluation and documentation of the results) but also offers an easily extendable framework for the development and evaluation of new approaches.
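The Averaged Artifact Subtraction (AAS) approach mentioned in this abstract rests on the observation that the gradient artifact repeats nearly identically in every slice-acquisition epoch, so averaging epochs yields an artifact template that can be subtracted from each one. The pure-Python toy below illustrates only that principle; it is not the FACET or FMRIB implementation, and the synthetic numbers are invented.

```python
# Toy illustration of Averaged Artifact Subtraction (AAS), the principle
# behind Allen et al. (2000); not the FACET/FMRIB implementation.

def aas_correct(epochs):
    n = len(epochs)
    length = len(epochs[0])
    # Artifact template: sample-wise average across epochs. The EEG signal
    # is assumed uncorrelated with the scanner timing, so it averages
    # toward zero while the repeating artifact survives.
    template = [sum(e[i] for e in epochs) / n for i in range(length)]
    return [[e[i] - template[i] for i in range(length)] for e in epochs]

artifact = [100.0, -100.0, 50.0]                  # repeats in every epoch
eeg      = [[1.0, 0.0, -1.0], [-1.0, 0.0, 1.0]]   # small, varying signal
epochs   = [[a + s for a, s in zip(artifact, e)] for e in eeg]

corrected = aas_correct(epochs)
print(corrected[0])  # artifact removed, up to the residual mean EEG
```

In real data the template is usually computed over a sliding window of epochs so that slow drifts in the artifact shape are tracked, which is one of the pre-/post-processing choices a toolbox like FACET lets the user vary.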
From non-preemptive to preemptive scheduling using synchronization synthesis.
Černý, Pavol; Clarke, Edmund M; Henzinger, Thomas A; Radhakrishna, Arjun; Ryzhyk, Leonid; Samanta, Roopsha; Tarrach, Thorsten
2017-01-01
We present a computer-aided programming approach to concurrency. The approach allows programmers to program assuming a friendly, non-preemptive scheduler, and our synthesis procedure inserts synchronization to ensure that the final program works even with a preemptive scheduler. The correctness specification is implicit, inferred from the non-preemptive behavior. Let us consider sequences of calls that the program makes to an external interface. The specification requires that any such sequence produced under a preemptive scheduler should be included in the set of sequences produced under a non-preemptive scheduler. We guarantee that our synthesis does not introduce deadlocks and that the synchronization inserted is optimal w.r.t. a given objective function. The solution is based on a finitary abstraction, an algorithm for bounded language inclusion modulo an independence relation, and generation of a set of global constraints over synchronization placements. Each model of the global constraints set corresponds to a correctness-ensuring synchronization placement. The placement that is optimal w.r.t. the given objective function is chosen as the synchronization solution. We apply the approach to device-driver programming, where the driver threads call the software interface of the device and the API provided by the operating system. Our experiments demonstrate that our synthesis method is precise and efficient. The implicit specification helped us find one concurrency bug previously missed when model-checking using an explicit, user-provided specification. We implemented objective functions for coarse-grained and fine-grained locking and observed that different synchronization placements are produced for our experiments, favoring a minimal number of synchronization operations or maximum concurrency, respectively.
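The effect of the synthesized synchronization described in this abstract can be miniaturized as follows: under a non-preemptive scheduler each thread's critical region runs to completion, so the external call sequence is always "begin, end, begin, end, ...". Inserting a lock restores that guarantee under a preemptive scheduler. This is a hand-written sketch of the idea using Python threads; the names are illustrative and not taken from the paper's tool.

```python
import threading

# Toy sketch: the lock plays the role of the synthesized synchronization.
# Without it, a preemptive scheduler could interleave begin/end calls in
# orders that the non-preemptive scheduler never produces.

trace = []
lock = threading.Lock()   # the "inserted" synchronization

def driver_thread(tid):
    with lock:            # critical region now runs atomically
        trace.append(("begin", tid))
        trace.append(("end", tid))

threads = [threading.Thread(target=driver_thread, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every "begin" is immediately followed by its matching "end": a call
# sequence the non-preemptive scheduler could have produced.
ok = all(trace[i][0] == "begin" and trace[i + 1] == ("end", trace[i][1])
         for i in range(0, len(trace), 2))
print(ok)
```

The paper's contribution is doing this placement automatically and optimally with respect to an objective function (e.g. coarse- vs. fine-grained locking), rather than by hand as here.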
Radioisotope Power Systems Reference Book for Mission Designers and Planners
NASA Technical Reports Server (NTRS)
Lee, Young; Bairstow, Brian
2015-01-01
The RPS Program's Program Planning and Assessment (PPA) Office commissioned the Mission Analysis team to develop the Radioisotope Power Systems (RPS) Reference Book for Mission Planners and Designers to define a baseline of RPS technology capabilities with specific emphasis on performance parameters and technology readiness. The main objective of this book is to provide RPS technology information that could be utilized by future mission concept studies and concurrent engineering practices. A progress summary from the major branches of RPS technology research provides mission analysis teams with a vital tool for assessing the RPS trade space, and provides concurrent engineering centers with a consistent set of guidelines for RPS performance characteristics. This book will be iterated when substantial new information becomes available to ensure continued relevance, serving as one of the cornerstone products of the RPS PPA Office. This book updates the original 2011 internal document, using data from the relevant publicly released RPS technology references and consultations with RPS technologists. Each performance parameter and RPS product subsection has been reviewed and cleared by at least one subject matter representative. A virtual workshop was held to reach consensus on the scope and contents of the book, and the definitions and assumptions that should be used. The subject matter experts then reviewed and updated the appropriate sections of the book. The RPS Mission Analysis Team then performed further updates and crosschecked the book for consistency. Finally, a second virtual workshop was held to ensure all subject matter experts and stakeholders concurred on the contents.
A Practical Approach to Implementing Real-Time Semantics
NASA Technical Reports Server (NTRS)
Luettgen, Gerald; Bhat, Girish; Cleaveland, Rance
1999-01-01
This paper investigates implementations of process algebras which are suitable for modeling concurrent real-time systems. It suggests an approach for efficiently implementing real-time semantics using dynamic priorities. For this purpose a process algebra with dynamic priority is defined, whose semantics corresponds one-to-one to traditional real-time semantics. The advantage of the dynamic-priority approach is that it drastically reduces the state-space sizes of the systems in question while preserving all properties of their functional and real-time behavior. The utility of the technique is demonstrated by a case study which deals with the formal modeling and verification of the SCSI-2 bus protocol. The case study is carried out in the Concurrency Workbench of North Carolina, an automated verification tool in which the process algebra with dynamic priority is implemented. It turns out that the state space of the bus-protocol model is about an order of magnitude smaller than the one resulting from real-time semantics. The accuracy of the model is proved by applying model checking to verify several mandatory properties of the bus protocol.
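The state-space reduction claimed in this abstract comes from priorities pruning interleavings: at each state, only maximal-priority enabled transitions are explored. The toy below is an invented two-process transition system, not the paper's process algebra or the SCSI-2 model, but it shows how a priority rule shrinks the reachable state count.

```python
from collections import deque

# Toy illustration of priority-based pruning: two concurrent counters that
# can each step up to 3. A priority rule that always prefers process 0
# while it can still move eliminates most interleavings.

def reachable(prioritized):
    start = (0, 0)
    seen = {start}
    frontier = deque([start])
    while frontier:
        a, b = frontier.popleft()
        moves = []
        if a < 3:
            moves.append((a + 1, b))
        if b < 3 and not (prioritized and a < 3):
            # Under the priority rule, process 1 may move only once the
            # higher-priority process 0 has finished.
            moves.append((a, b + 1))
        for s in moves:
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return len(seen)

print(reachable(False), reachable(True))  # full vs. priority-pruned count
```

Even on this 16-state toy the pruned exploration visits fewer than half the states; the paper reports roughly an order-of-magnitude reduction on the bus-protocol model, while its semantics guarantees the pruning preserves the properties of interest.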
Architecture independent environment for developing engineering software on MIMD computers
NASA Technical Reports Server (NTRS)
Valimohamed, Karim A.; Lopez, L. A.
1990-01-01
Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.
X-33 Attitude Control Using the XRS-2200 Linear Aerospike Engine
NASA Technical Reports Server (NTRS)
Hall, Charles E.; Panossian, Hagop V.
1999-01-01
The Vehicle Control Systems Team at Marshall Space Flight Center, Structures and Dynamics Laboratory, Guidance and Control Systems Division is designing, under a cooperative agreement with Lockheed Martin Skunkworks, the Ascent, Transition, and Entry flight attitude control systems for the X-33 experimental vehicle. Test flights, while suborbital, will achieve sufficient altitudes and Mach numbers to test Single Stage To Orbit, Reusable Launch Vehicle technologies. Ascent flight control phase, the focus of this paper, begins at liftoff and ends at linear aerospike main engine cutoff (MECO). The X-33 attitude control system design is confronted by a myriad of design challenges: a short design cycle, the X-33 incremental test philosophy, the concurrent design philosophy chosen for the X-33 program, and the fact that the attitude control system design is, as usual, closely linked to many other subsystems and must deal with constraints and requirements from these subsystems. Additionally, however, and of special interest, the use of the linear aerospike engine is a departure from the gimbaled engines traditionally used for thrust vector control (TVC) in launch vehicles and poses certain design challenges. This paper discusses the unique problem of designing the X-33 attitude control system with the linear aerospike engine, requirements development, modeling and analyses that verify the design.
Design Approaches to Myocardial and Vascular Tissue Engineering.
Akintewe, Olukemi O; Roberts, Erin G; Rim, Nae-Gyune; Ferguson, Michael A H; Wong, Joyce Y
2017-06-21
Engineered tissues represent an increasingly promising therapeutic approach for correcting structural defects and promoting tissue regeneration in cardiovascular diseases. One of the challenges associated with this approach has been the necessity for the replacement tissue to promote sufficient vascularization to maintain functionality after implantation. This review highlights a number of promising prevascularization design approaches for introducing vasculature into engineered tissues. Although we focus on encouraging blood vessel formation within myocardial implants, we also discuss techniques developed for other tissues that could eventually become relevant to engineered cardiac tissues. Because the ultimate solution to engineered tissue vascularization will require collaboration between wide-ranging disciplines such as developmental biology, tissue engineering, and computational modeling, we explore contributions from each field.
Seyfried, Lisa; Hanauer, David A; Nease, Donald; Albeiruti, Rashad; Kavanagh, Janet; Kales, Helen C
2009-12-01
Electronic medical records (EMRs) have become part of daily practice for many physicians. Attempts have been made to apply electronic search engine technology to speed EMR review. This was a prospective, observational study to compare the speed and clinical accuracy of a medical record search engine vs. manual review of the EMR. Three raters reviewed 49 cases in the EMR to screen for eligibility in a depression study using the electronic medical record search engine (EMERSE). One week later raters received a scrambled set of the same patients including 9 distractor cases, and used manual EMR review to determine eligibility. For both methods, accuracy was assessed for the original 49 cases by comparison with a gold standard rater. Use of EMERSE resulted in considerable time savings; chart reviews using EMERSE were significantly faster than traditional manual review (p=0.03). The percent agreement of raters with the gold standard (e.g. concurrent validity) using either EMERSE or manual review was not significantly different. Using a search engine optimized for finding clinical information in the free-text sections of the EMR can provide significant time savings while preserving clinical accuracy. The major power of this search engine is not from a more advanced and sophisticated search algorithm, but rather from a user interface designed explicitly to help users search the entire medical record in a way that protects health information.
Young, Li-Hao; Liou, Yi-Jyun; Cheng, Man-Ting; Lu, Jau-Huai; Yang, Hsi-Hsien; Tsai, Ying I; Wang, Lin-Chi; Chen, Chung-Bang; Lai, Jim-Shoung
2012-01-15
Diesel engine exhaust contains large numbers of submicrometer particles that degrade air quality and human health. This study examines the number emission characteristics of 10-1000 nm nonvolatile particles from a heavy-duty diesel engine, operating with various waste cooking oil biodiesel blends (B2, B10 and B20), engine loads (0%, 25%, 50% and 75%) and a diesel oxidation catalyst plus diesel particulate filter (DOC+DPF) under steady modes. For a given load, the total particle number concentrations (N_TOT) decrease slightly, while the mode diameters show negligible changes with increasing biodiesel blends. For a given biodiesel blend, both the N_TOT and mode diameters increase modestly with increasing load above 25%. The N_TOT at idle are highest and their size distributions are strongly affected by condensation and possible nucleation of semivolatile materials. Nonvolatile cores of diameters less than 16 nm are only observed at idle mode. The DOC+DPF shows remarkable filtration efficiency for both the core and soot particles, irrespective of the biodiesel blend and engine load under study. The N_TOT downstream of the DOC+DPF are comparable to typical ambient levels of ≈10^4 cm^-3. This implies that, without concurrent reductions of semivolatile materials, the formation of semivolatile nucleation-mode particles downstream of the aftertreatment is highly favored. Copyright © 2011 Elsevier B.V. All rights reserved.
Seyfried, Lisa; Hanauer, David; Nease, Donald; Albeiruti, Rashad; Kavanagh, Janet; Kales, Helen C.
2009-01-01
Purpose Electronic medical records (EMR) have become part of daily practice for many physicians. Attempts have been made to apply electronic search engine technology to speed EMR review. This was a prospective, observational study to compare the speed and accuracy of electronic search engine vs. manual review of the EMR. Methods Three raters reviewed 49 cases in the EMR to screen for eligibility in a depression study using the electronic search engine (EMERSE). One week later raters received a scrambled set of the same patients including 9 distractor cases, and used manual EMR review to determine eligibility. For both methods, accuracy was assessed for the original 49 cases by comparison with a gold standard rater. Results Use of EMERSE resulted in considerable time savings; chart reviews using EMERSE were significantly faster than traditional manual review (p=0.03). The percent agreement of raters with the gold standard (e.g. concurrent validity) using either EMERSE or manual review was not significantly different. Conclusions Using a search engine optimized for finding clinical information in the free-text sections of the EMR can provide significant time savings while preserving reliability. The major power of this search engine is not from a more advanced and sophisticated search algorithm, but rather from a user interface designed explicitly to help users search the entire medical record in a way that protects health information. PMID:19560962
Systems metabolic engineering strategies for the production of amino acids.
Ma, Qian; Zhang, Quanwei; Xu, Qingyang; Zhang, Chenglin; Li, Yanjun; Fan, Xiaoguang; Xie, Xixian; Chen, Ning
2017-06-01
Systems metabolic engineering is a multidisciplinary area that integrates systems biology, synthetic biology and evolutionary engineering. It is an efficient approach for strain improvement and process optimization, and has been successfully applied in the microbial production of various chemicals including amino acids. In this review, systems metabolic engineering strategies including pathway-focused approaches, systems biology-based approaches, evolutionary approaches and their applications in two major amino acid producing microorganisms: Corynebacterium glutamicum and Escherichia coli, are summarized.
NASA Astrophysics Data System (ADS)
Fiedler, Lorenz; Wöstmann, Malte; Graversen, Carina; Brandmeyer, Alex; Lunner, Thomas; Obleser, Jonas
2017-06-01
Objective. Conventional, multi-channel scalp electroencephalography (EEG) allows the identification of the attended speaker in concurrent-listening (‘cocktail party’) scenarios. This implies that EEG might provide valuable information to complement hearing aids with some form of EEG and to install a level of neuro-feedback. Approach. To investigate whether a listener’s attentional focus can be detected from single-channel hearing-aid-compatible EEG configurations, we recorded EEG from three electrodes inside the ear canal (‘in-Ear-EEG’) and additionally from 64 electrodes on the scalp. In two different, concurrent listening tasks, participants (n = 7) were fitted with individualized in-Ear-EEG pieces and were either asked to attend to one of two dichotically-presented, concurrent tone streams or to one of two diotically-presented, concurrent audiobooks. A forward encoding model was trained to predict the EEG response at single EEG channels. Main results. Each individual participants’ attentional focus could be detected from single-channel EEG response recorded from short-distance configurations consisting only of a single in-Ear-EEG electrode and an adjacent scalp-EEG electrode. The differences in neural responses to attended and ignored stimuli were consistent in morphology (i.e. polarity and latency of components) across subjects. Significance. In sum, our findings show that the EEG response from a single-channel, hearing-aid-compatible configuration provides valuable information to identify a listener’s focus of attention.
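A forward encoding model of the kind described above can be sketched as a lagged linear regression from the stimulus to a single EEG channel (a temporal response function). The data, lag range, and ridge penalty below are our own synthetic choices, not the study's.

```python
import numpy as np

# Synthetic single-channel example: the "EEG" is the stimulus envelope
# convolved with an assumed impulse response, plus noise.
rng = np.random.default_rng(3)
n, lags = 2000, 16
stim = rng.standard_normal(n)                      # stimulus envelope
true_trf = np.exp(-np.arange(lags) / 4.0)          # assumed impulse response
eeg = np.convolve(stim, true_trf)[:n] + 0.1 * rng.standard_normal(n)

# Lagged design matrix: column j holds the stimulus delayed by j samples.
X = np.zeros((n, lags))
for j in range(lags):
    X[j:, j] = stim[:n - j]

# Ridge-regularized least squares for the estimated response function.
lam = 1.0
trf_hat = np.linalg.solve(X.T @ X + lam * np.eye(lags), X.T @ eeg)

corr = np.corrcoef(X @ trf_hat, eeg)[0, 1]         # prediction accuracy
assert corr > 0.9
```

In an attention-decoding setting, two such models (one per speech stream) would be compared by which one predicts the recorded channel better.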
Concurrent Validity of the International Family Quality of Life Survey.
Samuel, Preethy S; Pociask, Fredrick D; DiZazzo-Miller, Rosanne; Carrellas, Ann; LeRoy, Barbara W
2016-01-01
The measurement of the social construct of Family Quality of Life (FQOL) is a parsimonious alternative to the current approach of measuring familial outcomes using a battery of tools related to individual-level outcomes. The purpose of this study was to examine the internal consistency and concurrent validity of the International FQOL Survey (FQOLS-2006), using cross-sectional data collected from 65 family caregivers of children with developmental disabilities. Results showed a moderate correlation between the total FQOL scores of the FQOLS-2006 and the Beach Center's FQOL scale. The validity of five FQOLS-2006 domains was supported by the correlations between conceptually related domains.
Peroxide Propulsion at the Turn of the Century
NASA Technical Reports Server (NTRS)
Anderson, William E.; Butler, Kathy; Crocket, Dave; Lewis, Tim; McNeal, Curtis
2000-01-01
A resurgence of interest in peroxide propulsion occurred in the last years of the 20th century, driven by the need for lower cost propulsion systems and for storable, reusable propulsion systems to meet future space transportation system architectures. NASA and the Air Force are jointly developing two propulsion systems for flight demonstration early in the 21st century. One system will be a development of Boeing's AR2-3 engine, which was successfully fielded in the 1960s. The other is a new pressure-fed design by Orbital Sciences Corporation for expendable mission requirements. Concurrently, NASA and industry are pursuing the key peroxide technologies needed to design, fabricate, and test advanced peroxide engines to meet mission needs beyond 2005. This paper presents a description of the AR2-3, reports the status of its current test program, and describes its intended flight demonstration. It then describes the Orbital 10K engine, the status of its test program, and its planned flight demonstration. Finally, the paper presents a plan, or technology roadmap, for the development of an advanced peroxide engine for the 21st century.
Reflexive Principlism as an Effective Approach for Developing Ethical Reasoning in Engineering.
Beever, Jonathan; Brightman, Andrew O
2016-02-01
An important goal of teaching ethics to engineering students is to enhance their ability to make well-reasoned ethical decisions in their engineering practice: a goal in line with the stated ethical codes of professional engineering organizations. While engineering educators have explored a wide range of methodologies for teaching ethics, a satisfying model for developing ethical reasoning skills has not been adopted broadly. In this paper we argue that a principlist-based approach to ethical reasoning is uniquely suited to engineering ethics education. Reflexive Principlism is an approach to ethical decision-making that focuses on internalizing a reflective and iterative process of specification, balancing, and justification of four core ethical principles in the context of specific cases. In engineering, that approach provides structure to ethical reasoning while allowing the flexibility for adaptation to varying contexts through specification. Reflexive Principlism integrates well with the prevalent and familiar methodologies of reasoning within the engineering disciplines as well as with the goals of engineering ethics education.
Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT
NASA Technical Reports Server (NTRS)
Dryer, David A.
2002-01-01
This paper describes a system-of-systems, or metasystems, approach and models developed to help prepare engineering organizations for distributed engineering environments. Changes in engineering enterprises include competition in increasingly global environments, new partnering opportunities created by advances in information and communication technologies, and virtual collaboration issues associated with dispersed teams. To help address challenges and needs in this environment, a framework is proposed that can be customized and adapted for NASA to assist in improved engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by learning and applying e-engineering methods and tools to a real-world engineering development scenario. It consists of two phases: an e-engineering basics phase, which addresses the skills required for e-engineering, and an e-engineering application phase, which applies these skills to system development projects in a distributed collaborative environment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions... two value engineering approaches: (1) The first is an incentive approach in which contractor...
New High Throughput Methods to Estimate Chemical Exposure
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...
Compacted graphite iron: Cast iron makes a comeback
NASA Astrophysics Data System (ADS)
Dawson, S.
1994-08-01
Although compacted graphite iron has been known for more than four decades, the absence of a reliable mass-production technique has resulted in relatively little effort to exploit its operational benefits. However, a proven on-line process control technology developed by SinterCast allows for series production of complex components in high-quality CGI. The improved mechanical properties of compacted graphite iron relative to conventional gray iron allow for substantial weight reduction in gasoline and diesel engines or substantial increases in horsepower, or an optimal combination of both. Concurrent with these primary benefits, CGI also provides significant emissions and fuel efficiency benefits allowing automakers to meet legislated performance standards. The operational and environmental benefits of compacted graphite iron together with its low cost and recyclability reinforce cast iron as a prime engineering material for the future.
Aerospace engineering design by systematic decomposition and multilevel optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.
1984-01-01
A method for systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks, to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.
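The decomposition idea can be sketched with a minimal two-level example (quadratic subtasks and a single shared coupling variable z, all invented for illustration, not the paper's structural test cases): each subtask optimizes its local variable for fixed z and reports its optimal value plus the sensitivity d(f*)/dz, and the system level uses the summed sensitivities to update z.

```python
def subtask1(z):
    x1 = z                      # argmin over x1 of (x1 - z)^2 + (z - 1)^2
    fstar = (z - 1.0) ** 2      # optimal local objective
    sens = 2.0 * (z - 1.0)      # sensitivity d(f*)/dz reported upward
    return x1, fstar, sens

def subtask2(z):
    x2 = -z                     # argmin over x2 of (x2 + z)^2 + (z + 2)^2
    fstar = (z + 2.0) ** 2
    sens = 2.0 * (z + 2.0)
    return x2, fstar, sens

z, lr = 0.0, 0.2
for _ in range(100):            # top-level coordination loop
    _, f1, s1 = subtask1(z)
    _, f2, s2 = subtask2(z)
    z -= lr * (s1 + s2)         # move z against the coupled sensitivity

assert abs(z - (-0.5)) < 1e-6   # analytic optimum of (z-1)^2 + (z+2)^2
```

The sensitivities play the role the abstract describes: they let the upper level account for inter-subtask coupling without re-solving the subtasks symbolically.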
Control system software, simulation, and robotic applications
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminant muscular load sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capacity is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.
Marsh, Gary M; Youk, Ada O; Buchanich, Jeanine M; Downing, Sarah; Kennedy, Kathleen J; Esmen, Nurtan A; Hancock, Roger P; Lacey, Steven E; Pierce, Jennifer S; Fleissner, Mary Lou
2013-06-01
To determine whether glioblastoma (GB) incidence rates among jet engine manufacturing workers were associated with workplace experiences with specific parts produced and processes performed. Subjects were 210,784 workers employed between 1952 and 2001. We conducted nested case-control and cohort incidence studies with focus on 277 GB cases. We estimated time experienced with 16 part families, 4 process categories, and 32 concurrent part-process combinations with 20 or more GB cases. In both the cohort and case-control studies, none of the part families, process categories, or both considered was associated with increased GB risk. If not due to chance alone, the not statistically significantly elevated GB rates in the North Haven plant may reflect external occupational factors or nonoccupational factors unmeasured in the current evaluation.
2015-06-12
Measuring the Immeasurable: An Approach to Assessing the Effectiveness of Engineering Civic Assistance Projects Towards Achieving National...
...increasing reliance on Humanitarian and Civic Assistance (HCA), specifically engineering civic assistance projects (ENCAPs), as a way to shape the...
Concurrent Initialization for Bearing-Only SLAM
Munguía, Rodrigo; Grau, Antoni
2010-01-01
Simultaneous Localization and Mapping (SLAM) is perhaps the most fundamental problem to solve in robotics in order to build truly autonomous mobile robots. The sensors have a large impact on the algorithm used for SLAM. Early SLAM approaches focused on the use of range sensors such as sonar rings or lasers. However, cameras have become more and more widely used, because they yield a lot of information and are well adapted for embedded systems: they are light, cheap and power saving. Unlike range sensors, which provide range and angular information, a camera is a projective sensor that measures the bearing of image features. Therefore depth information (range) cannot be obtained in a single step. This fact has prompted the emergence of a new family of SLAM algorithms: the Bearing-Only SLAM methods, which mainly rely on special techniques for feature initialization in order to enable the use of bearing sensors (such as cameras) in SLAM systems. In this work a novel and robust method, called Concurrent Initialization, is presented, inspired by combining the complementary advantages of the Undelayed and Delayed methods that represent the most common approaches for addressing the problem. The key is to use concurrently two kinds of feature representations for both the undelayed and delayed stages of the estimation. The simulation results show that the proposed method surpasses the performance of previous schemes. PMID:22294884
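The core difficulty can be shown in a few lines (a toy of our own, not the paper's estimator): a single bearing fixes only a ray, so a delayed scheme waits for a second bearing from a different pose and triangulates, while undelayed schemes carry a range hypothesis from the first sighting.

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Intersect the ray from p1 at angle theta1 with the ray from p2 at theta2."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via a 2x2 linear system (Cramer's rule).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    t = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Noise-free check: two known poses observe the same landmark.
landmark = (4.0, 3.0)
p1, p2 = (0.0, 0.0), (2.0, 0.0)
b1 = math.atan2(landmark[1] - p1[1], landmark[0] - p1[0])
b2 = math.atan2(landmark[1] - p2[1], landmark[0] - p2[0])
est = intersect_bearings(p1, b1, p2, b2)
assert abs(est[0] - 4.0) < 1e-9 and abs(est[1] - 3.0) < 1e-9
```

With measurement noise and nearly parallel rays the system above becomes ill-conditioned, which is why delayed and undelayed representations are worth combining, as the Concurrent Initialization method does.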
A Systematic Approach to Sensor Selection for Aircraft Engine Health Estimation
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Garg, Sanjay
2009-01-01
A systematic approach for selecting an optimal suite of sensors for on-board aircraft gas turbine engine health estimation is presented. The methodology optimally chooses the engine sensor suite and the model tuning parameter vector to minimize the Kalman filter mean squared estimation error in the engine's health parameters or other unmeasured engine outputs. This technique specifically addresses the underdetermined estimation problem where there are more unknown system health parameters representing degradation than available sensor measurements. This paper presents the theoretical estimation error equations, and describes the optimization approach that is applied to select the sensors and model tuning parameters to minimize these errors. Two different model tuning parameter vector selection approaches are evaluated: the conventional approach of selecting a subset of health parameters to serve as the tuning parameters, and an alternative approach that selects tuning parameters as a linear combination of all health parameters. Results from the application of the technique to an aircraft engine simulation are presented, and compared to those from an alternative sensor selection strategy.
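A static, linear-Gaussian stand-in for this selection criterion can be sketched as follows (all matrices and dimensions below are invented, and the posterior-covariance trace stands in for the paper's Kalman-filter mean squared error): score each candidate sensor subset by the estimation error it leaves in the health parameters, and pick the subset with the smallest score.

```python
import numpy as np
from itertools import combinations

# Health parameters h with prior h ~ N(0, P0); a sensor subset S measures
# y = C_S h + v, v ~ N(0, R_S). More health parameters than sensors is fine:
# the prior keeps the posterior well defined (the underdetermined case).
rng = np.random.default_rng(0)
n_health, n_sensors, budget = 4, 6, 2
P0 = np.eye(n_health)                       # prior covariance of health params
C = rng.standard_normal((n_sensors, n_health))
R = np.diag(rng.uniform(0.1, 1.0, n_sensors))

def posterior_mse(subset):
    idx = list(subset)
    Cs, Rs = C[idx], R[np.ix_(idx, idx)]
    # Posterior covariance: (P0^-1 + Cs^T Rs^-1 Cs)^-1; its trace is the MSE.
    P = np.linalg.inv(np.linalg.inv(P0) + Cs.T @ np.linalg.inv(Rs) @ Cs)
    return np.trace(P)

# Exhaustive search over sensor pairs (feasible for small suites).
best = min(combinations(range(n_sensors), budget), key=posterior_mse)
print("best sensor pair:", best)
```

The paper's method additionally optimizes the tuning parameter vector and works with the full engine dynamics, but the subset-scoring structure is the same.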
Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M
2017-10-01
Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
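The "stochastic race" between concurrent alternatives can be sketched in a few lines (alternative names and rates below are invented; ML3 itself is a full modelling language): each alternative draws a random waiting time from its own rate, which may depend on agent attributes, and the alternative whose time elapses first wins.

```python
import random

def race(alternatives, agent):
    """alternatives: name -> rate function of the agent. Returns (winner, time)."""
    draws = {name: random.expovariate(rate(agent))
             for name, rate in alternatives.items()}
    winner = min(draws, key=draws.get)
    return winner, draws[winner]

# Hypothetical migration-decision alternatives; rates depend on attributes.
alternatives = {
    "migrate": lambda a: 0.02 + 0.001 * a["ties_abroad"],  # more ties, sooner
    "stay": lambda a: 0.05,
    "consult_network": lambda a: 0.10,
}

random.seed(1)
agent = {"age": 30, "ties_abroad": 5}
counts = {name: 0 for name in alternatives}
for _ in range(10_000):
    winner, _ = race(alternatives, agent)
    counts[winner] += 1
# With rates (0.025, 0.05, 0.10) each alternative wins in proportion to its rate.
assert counts["consult_network"] > counts["stay"] > counts["migrate"]
```

Because exponential waiting times are memoryless, the winner of such a race is distributed in proportion to the rates, which is what makes concurrent competing processes a clean encoding of probabilistic choice in continuous time.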
NASA Astrophysics Data System (ADS)
Lange, Rense
2015-02-01
An extension of concurrent validity is proposed that uses qualitative data for the purpose of validating quantitative measures. The approach relies on Latent Semantic Analysis (LSA), which places verbal (written) statements in a high-dimensional semantic space. Using data from a medical/psychiatric domain as a case study - Near Death Experiences, or NDE - we established concurrent validity by connecting NDErs' qualitative (written) experiential accounts with their locations on a Rasch-scalable measure of NDE intensity. Concurrent validity received strong empirical support, since the variance in the Rasch measures could be predicted reliably from the coordinates of their accounts in the LSA-derived semantic space (R^2 = 0.33). These coordinates also predicted NDErs' age with considerable precision (R^2 = 0.25). Both estimates are probably artificially low due to the small available data samples (n = 588). It appears that Rasch scalability of NDE intensity is a prerequisite for these findings, as each intensity level is associated (at least probabilistically) with a well-defined pattern of item endorsements.
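The validation pipeline can be sketched on synthetic data (entirely our own toy, not the NDE accounts): place documents in a low-dimensional semantic space via truncated SVD of a term-document matrix, which is the core of LSA, then check how much variance in an external quantitative measure the coordinates explain.

```python
import numpy as np

rng = np.random.default_rng(42)
n_docs, n_terms, k = 200, 50, 5
X = rng.poisson(1.0, size=(n_docs, n_terms)).astype(float)   # toy term counts

# LSA step: truncated SVD gives each document k semantic coordinates.
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
coords = U[:, :k] * s[:k]

# A synthetic "intensity" measure that genuinely depends on the space, plus noise.
w = rng.standard_normal(k)
y = coords @ w + 0.5 * rng.standard_normal(n_docs)

# Ordinary least squares from coordinates to the measure; report R^2.
design = np.c_[coords, np.ones(n_docs)]
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
pred = design @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(f"R^2 = {r2:.2f}")
```

In the study the regression target is the Rasch intensity measure of each account; here the target is manufactured, so the R^2 is only a demonstration of the mechanics.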
Fransson, Mari; Granqvist, Pehr; Bohlin, Gunilla; Hagekull, Berit
2013-01-01
In this paper, we examine concurrent and prospective links between attachment and the Five-Factor Model (FFM) of personality from middle childhood to young adulthood (n = 66). At age 8.5 years, attachment was measured with the Separation Anxiety Test and at 21 years with the Adult Attachment Interview, whereas the personality dimensions were assessed with questionnaires at both time points. The results showed that attachment and personality dimensions are meaningfully related, concurrently and longitudinally. Attachment security in middle childhood was positively related to extraversion and openness, both concurrently and prospectively. Unresolved/disorganized (U/d) attachment was negatively related to conscientiousness and positively related to openness in young adulthood. U/d attachment showed a unique contribution to openness above the observed temporal stability of openness. As attachment security was also associated with openness, the duality of this factor is discussed together with other theoretical considerations regarding attachment theory in relation to the FFM.
Multi-agent framework for negotiation in a closed environment
NASA Astrophysics Data System (ADS)
Cretan, Adina; Coutinho, Carlos; Bratu, Ben; Jardim-Goncalves, Ricardo
2013-10-01
The goal of this paper is to offer support for small and medium enterprises that cannot, or do not want to, fulfill a big contract alone. Each organization has limited resources, and in order to meet higher external demand, managers are forced to outsource parts of their contracts, even to competing organizations. In this competitive environment each enterprise wants to preserve its decision autonomy and to disclose as little as possible of its business information. To describe this interaction, our approach is to define a framework for managing parallel and concurrent negotiations among independent organizations acting in the same industrial market. The complexity of the negotiation framework stems from the dynamic environment, in which multi-attribute, multi-participant negotiations race over the same set of resources. Moreover, the proposed framework helps the organizations within the collaborative networked environment to augment their efficiency and ability to react to unforeseen situations, thus improving their market competitiveness.
Additive manufacturing: Toward holistic design
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...
2017-03-18
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1984-01-01
Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of a parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory; and (10) a performance analyzer for the PISCES system.
NASA Technical Reports Server (NTRS)
1982-01-01
An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
Nacelle Aerodynamic and Inertial Loads (NAIL) project. Appendix B
NASA Technical Reports Server (NTRS)
1981-01-01
The testing was conducted on the Boeing-owned 747 RA001 test bed airplane during the concurrent 767/JT9D-7R4 engine development program. Following a functional check flight conducted from Boeing Field International (BFI) on 3 October 1980, the airplane and test personnel were ferried to Valley Industrial Park (GSG) near Glasgow, Montana, on 7 October 1980. The combined NAIL and 767/JT9D-7R4 test flights were conducted at the Glasgow remote test site, and the airplane was returned to Seattle on 26 October 1980.
System software for the finite element machine
NASA Technical Reports Server (NTRS)
Crockett, T. W.; Knott, J. D.
1985-01-01
The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
Zhu, Haihao; Woolfenden, Steve; Bronson, Roderick T; Jaffer, Zahara M; Barluenga, Sofia; Winssinger, Nicolas; Rubenstein, Allan E; Chen, Ruihong; Charest, Al
2010-09-01
Glioblastoma multiforme (GBM) has an abysmal prognosis. We now know that the epidermal growth factor receptor (EGFR) signaling pathway and the loss of function of the tumor suppressor genes p16Ink4a/p19ARF and PTEN play a crucial role in GBM pathogenesis: initiating the early stages of tumor development, sustaining tumor growth, promoting infiltration, and mediating resistance to therapy. We have recently shown that this genetic combination is sufficient to promote the development of GBM in adult mice. Therapeutic agents raised against single targets of the EGFR signaling pathway have proven rather inefficient in GBM therapy, showing the need for combinatorial therapeutic approaches. An effective strategy for concurrent disruption of multiple signaling pathways is via the inhibition of the molecular chaperone heat shock protein 90 (Hsp90). Hsp90 inhibition leads to the degradation of so-called client proteins, many of which are key effectors of GBM pathogenesis. NXD30001 is a novel second generation Hsp90 inhibitor that shows improved pharmacokinetic parameters. Here we show that NXD30001 is a potent inhibitor of GBM cell growth in vitro consistent with its capacity to inhibit several key targets and regulators of GBM biology. We also show the efficacy of NXD30001 in vivo in an EGFR-driven genetically engineered mouse model of GBM. Our findings establish that the Hsp90 inhibitor NXD30001 is a therapeutically multivalent molecule, whose actions strike GBM at the core of its drivers of tumorigenesis and represent a compelling rationale for its use in GBM treatment.
Salmon, Paul Matthew; Goode, Natassia; Spiertz, Antje; Thomas, Miles; Grant, Eryn; Clacy, Amanda
2017-06-01
Questions have been raised regarding the impact that providing concurrent verbal protocols has on task performance in various settings; however, there has been little empirical testing of this in road transport. The aim of this study was to examine the impact of providing concurrent verbal protocols on driving performance. Participants drove an instrumented vehicle around a set route, twice whilst providing a concurrent verbal protocol, and twice without. A comparison revealed no differences in behaviour related to speed, braking and steering wheel angle when driving mid-block, but a significant difference in aspects of braking and acceleration at roundabouts. When not providing a verbal protocol, participants were found to brake harder on approach to a roundabout and accelerate more heavily coming out of roundabouts. It is concluded that providing verbal protocols may have a positive effect on braking and accelerating. Practical implications related to driver training and future research are discussed. Practitioner Summary: Verbal protocol analysis is used by ergonomists to understand aspects of cognition and decision-making during complex tasks such as driving and control room operation. This study examines the impact that it has on driving performance, providing evidence to support its continued use in ergonomics applications.
Automatic Management of Parallel and Distributed System Resources
NASA Technical Reports Server (NTRS)
Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.
1990-01-01
Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.
A Future of Leadership Development
ERIC Educational Resources Information Center
Williams, Ken
2009-01-01
Leadership and leadership development are popular topics today. Concurrent with the construction of leadership theory, leadership development has emerged as a practice, with programs, consultants, reports, and networking opportunities proliferating. Given the reality of limited resources, it is critical that investments in and approaches to…
HERMIES-I: a mobile robot for navigation and manipulation experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weisbin, C.R.; Barhen, J.; de Saussure, G.
1985-01-01
The purpose of this paper is to report the current status of investigations ongoing at the Center for Engineering Systems Advanced Research (CESAR) in the areas of navigation and manipulation in unstructured environments. The HERMIES-I mobile robot, a prototype of a series which contains many of the major features needed for remote work in hazardous environments, is discussed. Initial experimental work at CESAR has begun in the area of navigation. It briefly reviews some of the ongoing research in autonomous navigation and describes initial research with HERMIES-I and associated graphic simulation. Since the HERMIES robots will generally be composed of a variety of asynchronously controlled hardware components (such as manipulator arms, digital image sensors, sonars, etc.), it seems appropriate to consider future development of the HERMIES brain as a hypercube ensemble machine with concurrent computation and associated message passing. The basic properties of such a hypercube architecture are presented. Decision-making under uncertainty eventually permeates all of our work. Following a survey of existing analytical approaches, it was decided that a stronger theoretical basis is required. As such, this paper presents the framework for a recently developed hybrid uncertainty theory. 21 refs., 2 figs.
Chronobiology of mood disorders.
Malhi, G S; Kuiper, S
2013-01-01
As part of a series of papers examining chronobiology ['Getting depression clinical guidelines right: time for change?' Kuiper et al. Acta Psychiatr Scand 2013;128(Suppl. 444):24-30; and 'Manipulating melatonin in managing mood' Boyce & Hopwood. Acta Psychiatr Scand 2013;128(Suppl. 444):16-23], in this article, we review and synthesise the extant literature pertaining to the chronobiology of depression and provide a preliminary model for understanding the neural systems involved. A selective literature search was conducted using search engines such as MEDLINE/PubMed, combining terms associated with chronobiology and mood disorders. We propose that understanding of sleep-wake function and mood can be enhanced by simultaneously considering the circadian system, the sleep homoeostat and the core stress system, all of which are likely to be simultaneously disrupted in major mood disorders. This integrative approach is likely to allow flexible modelling of a much broader range of mood disorder presentations and phenomenology. A preliminary multifaceted model is presented, which will require further development and testing. Future depression research should aim to examine multiple systems concurrently in order to derive a more sophisticated understanding of the underlying neurobiology. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
High-Frequency Testing of Composite Fan Vanes With Erosion-Resistant Coating Conducted
NASA Technical Reports Server (NTRS)
Bowman, Cheryl L.; Sutter, James K.; Naik, Subhash; Otten, Kim D.; Perusek, Gail P.
2003-01-01
The mechanical integrity of hard, erosion-resistant coatings was tested using the Structural Dynamics Laboratory at the NASA Glenn Research Center. Under the guidance of Structural Mechanics and Dynamics Branch personnel, fixturing and test procedures were developed at Glenn to simulate engine vibratory conditions on coated polymer-matrix-composite bypass vanes using a slip table in the Structural Dynamics Laboratory. Results from the high-frequency mechanical bench testing, along with concurrent erosion testing of coupons and vanes, provided sufficient confidence to engine-endurance test similarly coated vane segments. The knowledge gained from this program will be applied to the development of oxidation- and erosion-resistant coatings for polymer matrix composite blades and vanes in future advanced turbine engines. Fan bypass vanes from the AE3007 (Rolls Royce America, Indianapolis, IN) gas turbine engine were coated by Engelhard (Windsor, CT) with compliant bond coatings and hard ceramic coatings. The coatings were developed collaboratively by Glenn and Allison Advanced Development Corporation (AADC)/Rolls Royce America through research sponsored by the High-Temperature Engine Materials Technology Project (HITEMP) and the Higher Operating Temperature Propulsion Components (HOTPC) project. High-cycle fatigue was performed through high-frequency vibratory testing on a shaker table. Vane resonant frequency modes were surveyed from 50 to 3000 Hz at input loads from 1g to 55g on both uncoated production vanes and vanes with the erosion-resistant coating. Vanes were instrumented with both lightweight accelerometers and strain gauges to establish resonance, mode shape, and strain amplitudes. Two high-frequency dwell conditions were chosen to excite two strain levels: one approaching the vane's maximum allowable design strain and another near the expected maximum strain during engine operation. Six specimens were tested per dwell condition.
Pretest and posttest inspections were performed optically at up to 60× magnification and using a fluorescent-dye penetrant. Accumulation of 10 million cycles at a strain amplitude of two to three times that expected in the engine (approximately 670 Hz and 20g) led to the development of multiple cracks in the coating that were only detectable using fluorescent-dye penetrant inspection. Cracks were prevalent on the trailing edge and on the convex side of the midsection. No cracking or spalling was evident using standard optical inspection at up to 60× magnification. Further inspection may reveal whether these fine cracks penetrated the coating or were strictly on the surface. The dwell condition that simulated actual engine conditions produced no obvious surface flaws even after up to 80 million cycles had been accumulated at strain amplitudes produced at approximately 1500 Hz and 45g.
A game-based decision support methodology for competitive systems design
NASA Astrophysics Data System (ADS)
Briceno, Simon Ignacio
This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk make this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information.
The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and the uncertainty associated with competitive reactions. A normal-form matrix is created to enumerate players, their moves and payoffs, and to formulate a process by which an optimal decision can be achieved. The non-cooperative model is tested using the concept of a Nash equilibrium to identify potential strategies that are robust to uncertain market fluctuations (e.g., uncertainty in airline demand, airframe requirements, and competitor positioning). A first/second-mover advantage parameter is used as a scenario dial to adjust market rewards and firms' payoffs. The methodology is applied to a commercial aircraft engine selection study where engine firms must select an optimal engine project for development. An engine modeling and simulation framework is developed to generate a broad engine project portfolio. The creation of a customer value model enables designers to incorporate airline operation characteristics into the engine modeling and simulation process to improve the accuracy of engine/customer matching. Several key findings are made that provide recommendations on project selection strategies for firms uncertain as to when they will enter the market. The proposed study demonstrates that within a technical design environment, a rational and analytical means of modeling project development strategies is beneficial in high-market-risk situations.
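The normal-form, non-cooperative setup described in this abstract can be illustrated with a minimal sketch that enumerates pure-strategy Nash equilibria of a two-firm project-selection game. The payoff numbers and project labels below are hypothetical placeholders, not values from the dissertation:

```python
from itertools import product

# Hypothetical 2-firm project-selection game. Each firm picks one of two
# projects; payoffs[(i, j)] = (payoff to firm A, payoff to firm B).
payoffs = {
    (0, 0): (3, 3),   # both develop similar engine cores: split the market
    (0, 1): (5, 2),   # A moves first on project 0, B differentiates
    (1, 0): (2, 5),
    (1, 1): (1, 1),   # both pick the riskier project
}

def pure_nash(payoffs, n_a=2, n_b=2):
    """Return all cells (i, j) where neither firm gains by deviating unilaterally."""
    eq = []
    for i, j in product(range(n_a), range(n_b)):
        a_best = all(payoffs[(i, j)][0] >= payoffs[(k, j)][0] for k in range(n_a))
        b_best = all(payoffs[(i, j)][1] >= payoffs[(i, k)][1] for k in range(n_b))
        if a_best and b_best:
            eq.append((i, j))
    return eq

print(pure_nash(payoffs))  # only (0, 0) survives unilateral-deviation checks
```

Checking every cell of the matrix against unilateral deviations is the textbook Nash condition; scenario dials such as a first-mover advantage would simply reweight the payoff entries before the search.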
Quantitative Tracking of Combinatorially Engineered Populations with Multiplexed Binary Assemblies.
Zeitoun, Ramsey I; Pines, Gur; Grau, William C; Gill, Ryan T
2017-04-21
Advances in synthetic biology and genomics have enabled full-scale genome engineering efforts on laboratory time scales. However, the absence of sufficient approaches for mapping engineered genomes at system-wide scales onto performance has limited the adoption of more sophisticated algorithms for engineering complex biological systems. Here we report on the development and application of a robust approach to quantitatively map combinatorially engineered populations at scales up to several dozen target sites. This approach works by assembling genome-engineered sites with cell-specific barcodes into a format compatible with high-throughput sequencing technologies. This approach, called barcoded-TRACE (bTRACE), was applied to assess E. coli populations engineered by recursive multiplex recombineering across both 6-target sites and 31-target sites. The 31-target library was then tracked throughout growth selections in the presence and absence of isopentenol (a potential next-generation biofuel). We also use the resolution of bTRACE to compare the influence of technical and biological noise on genome engineering efforts.
Advanced General Aviation Turbine Engine (GATE) study
NASA Technical Reports Server (NTRS)
Smith, R.; Benstein, E. H.
1979-01-01
The small engine technology requirements suitable for general aviation service in the 1987 to 1988 time frame were defined. The market analysis showed potential United States engine sales of 31,500 per year, provided that the turbine engine sales price approaches current reciprocating engine prices. An optimum engine design was prepared for four categories of fixed-wing aircraft and for rotary-wing applications. A common-core approach was derived from the optimum engines that maximizes engine commonality over the power spectrum, with a projected price competitive with reciprocating piston engines. The advanced technology features reduced engine cost by approximately 50 percent compared with current technology.
An Integrated Neuroscience and Engineering Approach to Classifying Human Brain-States
2015-12-22
AFRL-AFOSR-VA-TR-2016-0037. Adrian Lee, University of Washington. ...specific cognitive states remains elusive, owing perhaps to limited crosstalk between the fields of neuroscience and engineering. Here, we report a
Use of Traffic Displays for General Aviation Approach Spacing: A Human Factors Study
2007-12-01
engine rated pilots participated. Eight flew approaches in a twin-engine Piper Aztec originating in Sanford, ME, and eight flew approaches in the same aircraft originating in Atlantic City. The plane was equipped with a Horizontal Situation Indicator (HSI). The Garmin International MX-20™ multifunction traffic display or "Basic
Chitapanarux, Imjai; Tungkasamit, Tharatorn; Petsuksiri, Janjira; Kannarunimit, Danita; Katanyoo, Kanyarat; Chakkabat, Chakkapong; Setakornnukul, Jiraporn; Wongsrita, Somying; Jirawatwarakul, Naruemon; Lertbusayanukul, Chawalit; Sripan, Patumrat; Traisathit, Patrinee
2018-03-01
The purpose of the study is to compare the efficacy of benzydamine HCl with sodium bicarbonate in the prevention of concurrent chemoradiation-induced oral mucositis in head and neck cancer patients. Sixty locally advanced head and neck cancer patients treated with high-dose radiotherapy concurrently with platinum-based chemotherapy were randomly assigned to receive either benzydamine HCl or sodium bicarbonate from the first day of treatment to 2 weeks after the completion of treatment. The total score for mucositis, based on the Oral Mucositis Assessment Scale (OMAS), was used for the assessment, conducted weekly during the treatment period and at the fourth week of follow-up. Pain score, all prescribed medications, and tube feeding needs were also recorded and compared. The median total OMAS score was statistically significantly lower in patients who received benzydamine HCl during concurrent chemo-radiotherapy (CCRT) than in those who received sodium bicarbonate (p value < 0.001). There was no difference in median pain score (p value = 0.52). Nineteen percent of patients in the sodium bicarbonate arm needed oral antifungal agents, whereas none in the benzydamine HCl arm required such medications (p value = 0.06). Tube feeding needs and compliance with CCRT were not different between the two study arms. For patients undergoing high-dose radiotherapy concurrently with platinum-based chemotherapy, using benzydamine HCl mouthwash as a preventive approach was superior to basic oral care using sodium bicarbonate mouthwash in terms of reducing the severity of oral mucositis, with an encouraging trend toward less need for oral antifungal drugs.
Wang, Shu-Lian; Liao, Zhongxing; Liu, Helen; Ajani, Jaffer; Swisher, Stephen; Cox, James D; Komaki, Ritsuko
2006-09-14
To evaluate the dosimetry, efficacy and toxicity of intensity-modulated radiation therapy (IMRT) and concurrent chemotherapy for patients with locally advanced cervical and upper thoracic esophageal cancer. A retrospective study was performed on 7 patients who were definitively treated with IMRT and concurrent chemotherapy. Patients who did not receive IMRT radiation and concurrent chemotherapy were not included in this analysis. IMRT plans were evaluated to assess the tumor coverage and normal tissue avoidance. Treatment response was evaluated and toxicities were assessed. Five- to nine-beam IMRT was used to deliver a total dose of 59.4-66 Gy (median: 64.8 Gy) to the primary tumor with 6-MV photons. The minimum dose received by the planning tumor volume (PTV) of the gross tumor volume boost was 91.2%-98.2% of the prescription dose (standard deviation [SD]: 3.7%-5.7%). The minimum dose received by the PTV of the clinical tumor volume was 93.8%-104.8% (SD: 4.3%-11.1%) of the prescribed dose. With a median follow-up of 15 mo (range: 3-21 mo), all 6 evaluable patients achieved complete response. Of them, 2 developed local recurrences, 2 had distant metastases, and 3 survived with no evidence of disease. After treatment, 2 patients developed esophageal stricture requiring frequent dilation and 1 patient developed a tracheal-esophageal fistula. Concurrent IMRT and chemotherapy resulted in an excellent early response in patients with locally advanced cervical and upper thoracic esophageal cancer. However, local and distant recurrence and toxicity remain a problem. Innovative approaches are needed to improve the outcome.
ERIC Educational Resources Information Center
Fleischmann, Corinna; Nakagawa, Elizabeth; Kelley, Tyler
2016-01-01
As the National Science Foundation and engineers throughout the world seek to strengthen the future of the engineering profession, the Civil Engineering (CE) program at the United States Coast Guard Academy embodies this initiative with a student focused approach. One course in particular, Materials for Civil and Construction Engineers (CE…
Multi-Agent Systems Design for Novices
ERIC Educational Resources Information Center
Lynch, Simon; Rajendran, Keerthi
2005-01-01
Advanced approaches to the construction of software systems can present difficulties to learners. This is true for multi-agent systems (MAS), which exhibit concurrency, non-determinacy of structure and composition, and sometimes emergent behavior characteristics. Additional barriers exist for learners because mainstream MAS technology is young and…
Social Goals, Social Behavior, and Social Status in Middle Childhood
ERIC Educational Resources Information Center
Rodkin, Philip C.; Ryan, Allison M.; Jamison, Rhonda; Wilson, Travis
2013-01-01
This study examines motivational precursors of social status and the applicability of a dual-component model of social competence to middle childhood. Concurrent and longitudinal relationships between self-reported social goals (social development, demonstration-approach, demonstration-avoid goal orientations), teacher-rated prosocial and…
DOT National Transportation Integrated Search
2013-11-01
The Highway Capacity Manual (HCM) has had a delay-based level of service methodology for signalized intersections since 1985. : The 2010 HCM has revised the method for calculating delay. This happened concurrent with such jurisdictions as NYC reviewi...
A hierarchical, automated target recognition algorithm for a parallel analog processor
NASA Technical Reports Server (NTRS)
Woodward, Gail; Padgett, Curtis
1997-01-01
A hierarchical approach is described for an automated target recognition (ATR) system, VIGILANTE, that uses a massively parallel, analog processor (3DANN). The 3DANN processor is capable of performing 64 concurrent inner products of size 1x4096 every 250 nanoseconds.
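As a software analogy only (the 3DANN is an analog processor, and the data here are random placeholders, not VIGILANTE imagery), the operation described above — 64 concurrent inner products over 4096-element vectors — maps onto a single matrix-vector product:

```python
import numpy as np

# Software analogy of the 3DANN operation: 64 inner products of size 1x4096
# expressed as one matrix-vector product. The template bank and the input
# vector are random stand-ins; 4096 is taken as a flattened 64x64 window.
rng = np.random.default_rng(0)
templates = rng.standard_normal((64, 4096))   # 64 stored template vectors
window = rng.standard_normal(4096)            # one flattened input window

scores = templates @ window                   # 64 "concurrent" inner products
best_match = int(np.argmax(scores))           # a hierarchical ATR keeps the winner
print(scores.shape, best_match)
```

In the hierarchical scheme, such a bank of correlations would be applied repeatedly, with each level narrowing the candidate target classes passed to the next.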
USDA-ARS's Scientific Manuscript database
Most hosts are concurrently or sequentially infected with multiple parasites, thus fully understanding interactions between individual parasite species and their hosts depends on accurate characterization of the parasite community. For parasitic nematodes, non-invasive methods for obtaining quantita...
This memorandum addresses the approach EPA should use in determining whether to concur that a parcel has been properly identified by a military service as 'uncontaminated' and therefore transferrable pursuant to CERCLA Section 120 (h)(4).
Killeen, Peter R
2015-07-01
The generalized matching law (GML) is reconstructed as a logistic regression equation that privileges no particular value of the sensitivity parameter, a. That value will often approach 1 due to the feedback that drives switching that is intrinsic to most concurrent schedules. A model of that feedback reproduced some features of concurrent data. The GML is a law only in the strained sense that any equation that maps data is a law. The machine under the hood of matching is in all likelihood the very law that was displaced by the Matching Law. It is now time to return the Law of Effect to centrality in our science. © Society for the Experimental Analysis of Behavior.
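The generalized matching law discussed in this abstract is conventionally written as log(B1/B2) = a·log(R1/R2) + log b, where a is sensitivity and b is bias. A minimal sketch, using synthetic data rather than anything from Killeen's analysis, shows how the regression form recovers both parameters:

```python
import numpy as np

# Generalized matching law: log(B1/B2) = a*log(R1/R2) + log(b).
# Synthetic concurrent-schedule data with true sensitivity a = 0.9 and
# bias b = 1.2, plus small normal noise on the behavior ratios.
rng = np.random.default_rng(1)
log_r = np.linspace(-2, 2, 50)                              # log reinforcement ratios
log_b_ratio = 0.9 * log_r + np.log(1.2) + rng.normal(0, 0.05, 50)

# Least-squares fit: slope estimates sensitivity, intercept estimates log bias.
a_hat, log_bias = np.polyfit(log_r, log_b_ratio, 1)
print(round(float(a_hat), 2), round(float(np.exp(log_bias)), 2))
```

With orderly data like this, the fitted sensitivity lands close to the generating value of 0.9, consistent with the abstract's point that feedback intrinsic to concurrent schedules tends to drive estimates of a toward 1.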
Concurrent systems and time synchronization
NASA Astrophysics Data System (ADS)
Burgin, Mark; Grathoff, Annette
2018-05-01
In the majority of scientific fields, system dynamics is described assuming the existence of a unique time for the whole system. However, it is established theoretically, for example, in relativity theory or in the system theory of time, and validated experimentally, that there are different times and time scales in a variety of real systems - physical, chemical, biological, social, etc. In spite of this, there are no wide-ranging scientific approaches to the exploration of such systems. Therefore, the goal of this paper is to study systems with this property. We call them concurrent systems because processes in them can run, events can happen, and actions can be performed in different time scales. The problem of time synchronization is specifically explored.
Determinants of overweight with concurrent stunting among Ghanaian children.
Atsu, Benedicta K; Guure, Chris; Laar, Amos K
2017-07-27
Malnutrition (undernutrition and overnutrition) is a major public health problem in Ghana, affecting the growth and development of individuals and the nation. Stunting and overweight are of particular interest, as recent national surveys show a rising trend of overnutrition and a stubbornly high burden of stunting among Ghanaian children. There are currently no data on the simultaneous occurrence of overweight and stunting within individuals in Ghana. This paper presents the burden, the individual-level, and the contextual determinants of overweight with concurrent stunting among Ghanaian children. This study analyzed the data set of the fourth round of the Ghana Multiple Indicator Cluster Survey (MICS4). Bivariate analyses were used to describe selected characteristics of survey respondents and their children. A hierarchical modelling approach facilitated identification of significant distal, intermediate, and proximal factors/determinants of concurrent stunting and overweight. Both crude and adjusted prevalence ratios via a multivariable Poisson regression model, with their corresponding 95% Confidence Intervals (CI), are reported. Variables with p ≤ 0.25 at the bivariate level were included in the multivariable analysis. An alpha value of 5% was used to indicate significance. Of the 7550 cases (children) analyzed, the prevalence of stunting was 27.5%; underweight was 17.3%; and wasting was 7.7%. The prevalence of overweight and of concurrent overweight and stunting were 2.4% and 1.2%, respectively. Children who belonged to the fourth wealth quintile were more likely to be overweight and concurrently stunted than children belonging to the poorest quintile (aPR = 1.010; 95% CI, 1.003-1.017). Compared with religious (Christian/Muslim/Traditionalist) household heads, children whose household heads did not belong to any religion had twice the rate of overweight with concurrent stunting (PR = 2.024; 95% CI, 1.016-4.034).
Children with mothers aged 20-34 and 35-49 had increased though nonsignificant prevalence ratios (aPR = 1.001; 95% CI, 0.994-1.005 and aPR = 1.001; 95% CI, 0.998-1.012, respectively). This analysis determined the prevalence of concurrent stunting and overweight among Ghanaian children to be 1.2%. Four contextual variables (breastfeeding status, religion, geographic region, and wealth index quintile) were associated with overweight with concurrent stunting. We conclude that only contextual factors are predictive of the double burden of malnutrition (DBM) among children under five living in Ghana.
Aerospace engineering design by systematic decomposition and multilevel optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Giles, G. L.; Barthelemy, J.-F. M.
1984-01-01
This paper describes a method for systematic analysis and optimization of large engineering systems, e.g., aircraft, by decomposition of a large task into a set of smaller, self-contained subtasks that can be solved concurrently. The subtasks may be arranged in many hierarchical levels with the assembled system at the top level. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization. It is pointed out that the method is intended to be compatible with the typical engineering organization and the modern technology of distributed computing.
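As a toy caricature of the bottom-up scheme in this abstract (a scalar quadratic allocation problem, not the paper's structural or aircraft formulations), a top level can iteratively re-balance an allocation between two self-contained subtasks using their cost sensitivities:

```python
# Toy two-level decomposition: a system total is split between two subtasks;
# each subtask's cost depends only on its own allocation, and the top level
# updates the split using subtask sensitivities (derivatives of cost w.r.t.
# the allocation), mimicking the sensitivity coupling described above.
def optimize(weights=(1.0, 3.0), total=10.0, iters=100, step=0.05):
    t = total / 2                           # initial allocation to subtask 1
    for _ in range(iters):
        g1 = 2 * weights[0] * t             # sensitivity of subtask 1 cost w1*t^2
        g2 = 2 * weights[1] * (total - t)   # sensitivity of subtask 2 cost w2*(T-t)^2
        t -= step * (g1 - g2)               # top level balances marginal costs
    return t

t_star = optimize()
print(round(t_star, 2))
```

The loop converges to the allocation that equalizes the subtasks' marginal costs (t = 7.5 for these weights), which is the sense in which sensitivity information couples the levels and drives the iteration to a consistent system design.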
Experience Engineering: An Engineering Course for Non-Majors
ERIC Educational Resources Information Center
Hargrove-Leak, Sirena
2012-01-01
The engineering profession continues to struggle to attract new talent, in part because it is not well understood by the general public and is often viewed in a negative light. Therefore, engineering professionals have called for new approaches to promote better understanding and change negative perceptions. One suggested approach is for engineering…
Investigating Engineering Practice Is Valuable for Mathematics Learning
ERIC Educational Resources Information Center
Goold, Eileen
2015-01-01
While engineering mathematics curricula often prescribe a fixed body of mathematical knowledge, this study takes a different approach; second-year engineering students are additionally required to investigate and document an aspect of mathematics used in engineering practice. A qualitative approach is used to evaluate the impact that students'…
NASA Technical Reports Server (NTRS)
Shuai, Yanmin; Masek, Jeffrey G.; Gao, Feng; Schaaf, Crystal B.; He, Tao
2014-01-01
Land surface albedo has been recognized by the Global Terrestrial Observing System (GTOS) as an essential climate variable crucial for accurate modeling and monitoring of the Earth's radiative budget. While global climate studies can leverage albedo datasets from MODIS, VIIRS, and other coarse-resolution sensors, many applications in heterogeneous environments can benefit from higher-resolution albedo products derived from Landsat. We previously developed a "MODIS-concurrent" approach for 30-meter albedo estimation which relied on combining post-2000 Landsat data with MODIS Bidirectional Reflectance Distribution Function (BRDF) information. Here we present a "pre-MODIS era" approach to extend 30-m surface albedo generation back in time to the 1980s, through an a priori anisotropy Look-Up Table (LUT) built up from high-quality MCD43A BRDF estimates over representative homogeneous regions. Each entry in the LUT reflects a unique combination of land cover, seasonality, terrain information, disturbance age and type, and Landsat optical spectral bands. An initial conceptual LUT was created for the Pacific Northwest (PNW) of the United States and provides BRDF shapes estimated from MODIS observations for undisturbed and disturbed surface types (including recovery trajectories of burned areas and non-fire disturbances). By accepting the assumption of a generally invariant BRDF shape for similar land surface structures as a priori information, spectral white-sky and black-sky albedos are derived through albedo-to-nadir reflectance ratios as a bridge between the Landsat and MODIS scales.
A further narrow-to-broadband conversion based on radiative transfer simulations is adopted to produce broadband albedos at visible, near infrared, and shortwave regimes. We evaluate the accuracy of the resultant Landsat albedo using available field measurements at forested AmeriFlux stations in the PNW region, and examine the consistency of the surface albedo generated by this approach with that from the "concurrent" approach and with the coincident MODIS operational surface albedo products. Using the tower measurements as reference, the derived Landsat 30-m snow-free shortwave broadband albedo yields an absolute accuracy of 0.02 with a root mean square error less than 0.016 and a bias of no more than 0.007. A further cross-comparison over individual scenes shows that the retrieved white-sky shortwave albedo from the "pre-MODIS era" LUT approach is highly consistent (R^2 = 0.988, with a low scene-averaged RMSE = 0.009 and bias = -0.005) with that generated by the earlier "concurrent" approach. The Landsat albedo also exhibits more detailed landscape texture and a wider dynamic range of albedo values than the coincident 500-m MODIS operational products (MCD43A3), especially in heterogeneous regions. Collectively, the "pre-MODIS" LUT and "concurrent" approaches provide a practical way to retrieve long-term Landsat albedo from the historic Landsat archives as far back as the 1980s, as well as the current Landsat-8 mission, and thus support investigations into the evolution of the albedo of terrestrial biomes at fine resolution.
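The albedo-to-nadir ratio bridging described above can be sketched as follows. The LUT keys, class names, and reflectance values here are hypothetical stand-ins, not entries from the actual MCD43A-derived table; the sketch only shows how a class-specific ratio transfers coarse-scale BRDF information to fine-scale reflectance:

```python
import numpy as np

# Hypothetical LUT entry: for a (land cover, season) class, the MODIS BRDF
# model yields a white-sky albedo and a nadir reflectance; their ratio is
# assumed scale-invariant and so transfers to Landsat resolution.
lut = {
    ("conifer", "summer"): {"ws_albedo": 0.12, "nadir_refl": 0.10},
    ("burned",  "summer"): {"ws_albedo": 0.06, "nadir_refl": 0.055},
}

def landsat_white_sky_albedo(landsat_nadir_refl, cover, season):
    """Scale 30-m Landsat nadir reflectance to white-sky albedo using the
    class-specific albedo-to-nadir ratio from the coarse-scale LUT."""
    entry = lut[(cover, season)]
    ratio = entry["ws_albedo"] / entry["nadir_refl"]
    return landsat_nadir_refl * ratio

pixels = np.array([0.08, 0.11, 0.09])   # Landsat band reflectance (made up)
albedo = landsat_white_sky_albedo(pixels, "conifer", "summer")
```

The key design choice is that the ratio, not the albedo itself, is assumed invariant for a given surface-structure class, which is what lets a pre-2000 Landsat scene borrow anisotropy information from post-2000 MODIS observations.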
NASA Astrophysics Data System (ADS)
Yang, Jia-Yue; Cheng, Long; Hu, Ming
2017-12-01
Intermetallic clathrates, one class of guest-host systems with perfectly crystalline structures, hold great potential to be "phonon-glass electron-crystal" thermoelectric materials. Previous studies focus on revealing the atomistic origins of blocked phononic transport, yet little attention has been drawn to the enhanced electronic transport. In this work, we investigate the binary type-I M8Si46 (M = Sr, Ba, Tl, and Pb) clathrates and unravel how rattlers concurrently block phononic transport and enhance electronic transport from first principles. Comparing the empty and filled clathrates, the lattice thermal conductivity is greatly reduced, by a factor of 21, owing to a decrease in the relaxation time of propagative phonons over 0-6 THz by 1.5 orders of magnitude. On the other hand, rattlers bridge charge gaps among cages by donating electrons and thus drastically increase electrical conductivity. The concurrent realization of blocked phononic transport and enhanced electronic transport boosts the figure of merit (ZT) of the empty clathrate by 4 orders of magnitude. Furthermore, by manipulating metallic rattlers and n-type doping, the power factor is markedly improved and ZT can reach 0.55 at 800 K. These results provide a quantitative description of the guest-host interaction and coupling dynamics from first principles. The proposed strategy of manipulating rattling atoms and in-situ doping offers important guidance for engineering clathrates with high thermoelectric performance.
Blois, Hélène; Iris, François
2010-01-01
Natural outbreaks of multidrug-resistant microorganisms can cause widespread devastation, and several can be used or engineered as agents of bioterrorism. From a biosecurity standpoint, the capacity to detect and then efficiently control, within hours, the spread and the potential pathological effects of an emergent outbreak, for which there may be no effective antibiotics or vaccines, become key challenges that must be met. We turned to phage engineering as a potentially highly flexible and effective means to both detect and eradicate threats originating from emergent (uncharacterized) bacterial strains. To this end, we developed technologies allowing us to (1) concurrently modify multiple regions within the coding sequence of a gene while conserving intact the remainder of the gene, (2) reversibly interrupt the lytic cycle of an obligate virulent phage (T4) within its host, (3) carry out efficient insertion, by homologous recombination, of any number of engineered genes into the deactivated genomes of a T4 wild-type phage population, and (4) reactivate the lytic cycle, leading to the production of engineered infective virulent recombinant progeny. This allows the production of very large, genetically engineered lytic phage banks containing, in an E. coli host, a very wide spectrum of variants for any chosen phage-associated function, including phage host-range. Screening of such a bank should allow the rapid isolation of recombinant T4 particles capable of detecting (ie, diagnosing), infecting, and destroying hosts belonging to gram-negative bacterial species far removed from the original E. coli host. PMID:20569057
Stylianopoulos, Triantafyllos; Bashur, Chris A.; Goldstein, Aaron S.; Guelcher, Scott A.; Barocas, Victor H.
2008-01-01
The mechanical properties of biomaterial scaffolds are crucial for their efficacy in tissue engineering and regenerative medicine. At the microscopic scale, the scaffold must be sufficiently rigid to support cell adhesion, spreading, and normal extracellular matrix deposition. Concurrently, at the macroscopic scale the scaffold must have mechanical properties that closely match those of the target tissue. The achievement of both goals may be possible by careful control of the scaffold architecture. Recently, electrospinning has emerged as an attractive means to form fused fiber scaffolds for tissue engineering. The diameter and relative orientation of fibers affect cell behavior, but their impact on the tensile properties of the scaffolds has not been rigorously characterized. To examine the structure-property relationship, electrospun meshes were made from a polyurethane elastomer with different fiber diameters and orientations and mechanically tested to determine the dependence of the elastic modulus on the mesh architecture. Concurrently, a multiscale modeling strategy developed for type I collagen networks was employed to predict the mechanical behavior of the polyurethane meshes. Experimentally, the measured elastic modulus of the meshes varied from 0.56 to 3.0 MPa depending on fiber diameter and the degree of fiber alignment. Model predictions for tensile loading parallel to fiber orientation agreed well with experimental measurements for a wide range of conditions when a fitted fiber modulus of 18 MPa was used. Although the model predictions were less accurate in transverse loading of anisotropic samples, these results indicate that computational modeling can assist in design of electrospun artificial tissue scaffolds. PMID:19627797
Current Approaches to Bone Tissue Engineering: The Interface between Biology and Engineering.
Li, Jiao Jiao; Ebied, Mohamed; Xu, Jen; Zreiqat, Hala
2018-03-01
The successful regeneration of bone tissue to replace areas of bone loss in large defects or at load-bearing sites remains a significant clinical challenge. Over the past few decades, major progress has been achieved in the field of bone tissue engineering to provide alternative therapies, particularly through approaches at the interface of biology and engineering. To satisfy the diverse regenerative requirements of bone tissue, the field has moved toward highly integrated approaches incorporating knowledge and techniques from multiple disciplines, typically involving the use of biomaterials as an essential element for supporting or inducing bone regeneration. This review summarizes the types of approaches currently used in bone tissue engineering, beginning with those primarily based on biology or engineering and moving on to integrated approaches in the areas of biomaterial development, biomimetic design, and scalable methods for treating large or load-bearing bone defects, while highlighting potential areas for collaboration and providing an outlook on future developments. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Perturbing engine performance measurements to determine optimal engine control settings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Li; Lee, Donghoon; Yilmaz, Hakan
Methods and systems for optimizing the performance of a vehicle engine are provided. The method includes determining an initial value for a first engine control parameter based on one or more detected operating conditions of the vehicle engine, determining a value of an engine performance variable, and artificially perturbing the determined value of the engine performance variable. The initial value for the first engine control parameter is then adjusted based on the perturbed engine performance variable, causing the engine performance variable to approach a target engine performance variable. Operation of the vehicle engine is controlled based on the adjusted initial value for the first engine control parameter. These acts are repeated until the engine performance variable approaches the target engine performance variable.
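The perturb-measure-adjust loop described in this abstract resembles extremum-seeking tuning. A minimal sketch, with a hypothetical engine performance map and made-up gains (not the patented method itself):

```python
def measure_performance(u):
    # Stand-in engine performance map (hypothetical): peaks at u = 2.0.
    return 10.0 - (u - 2.0) ** 2

def tune_parameter(u0, target, step=0.05, delta=0.01, n_iter=500):
    """Repeatedly perturb the control parameter, estimate the local
    sensitivity of the performance variable from the perturbation, and
    nudge the parameter so measured performance approaches the target."""
    u = u0
    for _ in range(n_iter):
        y = measure_performance(u)
        if abs(y - target) < 1e-9:
            break
        # Artificial perturbation gives a finite-difference sensitivity.
        slope = (measure_performance(u + delta) - y) / delta
        # Step the parameter in the direction that reduces the target error.
        u += step * slope * (target - y)
    return u

u = tune_parameter(u0=0.0, target=9.75)
```

Starting from u = 0, the loop climbs the performance curve and settles where the measured performance matches the target, without ever needing an explicit model of the engine map.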
A hardware/software environment to support R&D in intelligent machines and mobile robotic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, R.C.
1990-01-01
The Center for Engineering Systems Advanced Research (CESAR) serves as a focal point at the Oak Ridge National Laboratory (ORNL) for basic and applied research in intelligent machines. R&D at CESAR addresses issues related to autonomous systems, unstructured (i.e. incompletely known) operational environments, and multiple performing agents. Two mobile robot prototypes (HERMIES-IIB and HERMIES-III) are being used to test new developments in several robot component technologies. This paper briefly introduces the computing environment at CESAR, which includes three hypercube concurrent computers (two on-board the mobile robots), a graphics workstation, a VAX, and multiple VME-based systems (several on-board the mobile robots). The current software environment at CESAR is intended to satisfy several goals, e.g.: code portability; re-usability in different experimental scenarios; modularity; concurrent computer hardware transparent to the applications programmer; future support for multiple mobile robots; support for human-machine interface modules; and support for integration of software from other, geographically disparate laboratories with different hardware set-ups. 6 refs., 1 fig.
Update on Integrated Optical Design Analyzer
NASA Technical Reports Server (NTRS)
Moore, James D., Jr.; Troy, Ed
2003-01-01
Updated information on the Integrated Optical Design Analyzer (IODA) computer program has become available. IODA was described in Software for Multidisciplinary Concurrent Optical Design (MFS-31452), NASA Tech Briefs, Vol. 25, No. 10 (October 2001), page 8a. To recapitulate: IODA facilitates multidisciplinary concurrent engineering of highly precise optical instruments. The architecture of IODA was developed by reviewing design processes and software in an effort to automate design procedures. IODA significantly reduces design iteration cycle time and eliminates many potential sources of error. IODA integrates the modeling efforts of a team of experts in different disciplines (e.g., optics, structural analysis, and heat transfer) working at different locations and provides seamless fusion of data among thermal, structural, and optical models used to design an instrument. IODA is compatible with data files generated by the NASTRAN structural-analysis program and the Code V (Registered Trademark) optical-analysis program, and can be used to couple analyses performed by these two programs. IODA supports multiple-load-case analysis for quickly accomplishing trade studies. IODA can also model the transient response of an instrument under the influence of dynamic loads and disturbances.
Transputer parallel processing at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1989-01-01
The transputer parallel processing lab at NASA Lewis Research Center (LeRC) consists of 69 processors (transputers) that can be connected into various networks for use in general purpose concurrent processing applications. The main goal of the lab is to develop concurrent scientific and engineering application programs that will take advantage of the computational speed increases available on a parallel processor over the traditional sequential processor. Current research involves the development of basic programming tools. These tools will help standardize program interfaces to specific hardware by providing a set of common libraries for applications programmers. The thrust of the current effort is in developing a set of tools for graphics rendering/animation. The applications programmer currently has two options for on-screen plotting. One option can be used for static graphics displays and the other can be used for animated motion. The option for static display involves the use of 2-D graphics primitives that can be called from within an application program. These routines perform the standard 2-D geometric graphics operations in real-coordinate space as well as allowing multiple windows on a single screen.
ERIC Educational Resources Information Center
Fang, Jun
2012-01-01
Two critical issues are of great concern in engineering education today: the increasingly broad requirements for 21st-century engineers and the lack of effective instructional approaches needed to produce students who meet the requirements. However, pedagogical approaches in engineering have remained relatively unchanged for the last 40 years and…
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition- whether functional or physical or discipline-based-that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
In Search of Search Engine Marketing Strategy Amongst SME's in Ireland
NASA Astrophysics Data System (ADS)
Barry, Chris; Charleton, Debbie
Researchers have identified the Web as a searcher's first port of call for locating information. Search Engine Marketing (SEM) strategies have been noted as a key consideration when developing, maintaining and managing Websites. A study presented here of the SEM practices of Irish small to medium enterprises (SMEs) reveals that they plan to spend more resources on SEM in the future. Most firms follow an informal SEM strategy, in which Website optimization is perceived as most effective in attracting traffic. Respondents cite the use of ‘keywords in title and description tags’ as the most used SEM technique, followed by the use of ‘keywords throughout the whole Website’, while ‘Pay for Placement’ was the most widely used Paid Search technique. In concurrence with the literature, measuring SEM performance remains a significant challenge, with many firms unsure whether they measure it effectively. An encouraging finding is that Irish SMEs adopt a positive ethical posture when undertaking SEM.
Challenges and Opportunities in Interdisciplinary Materials Research Experiences for Undergraduates
NASA Astrophysics Data System (ADS)
Vohra, Yogesh; Nordlund, Thomas
2009-03-01
The University of Alabama at Birmingham (UAB) offers a broad range of interdisciplinary materials research experiences to undergraduate students with diverse backgrounds in physics, chemistry, applied mathematics, and engineering. The research projects offered cover a broad range of topics including high-pressure physics, microelectronic materials, nano-materials, laser materials, bioceramics and biopolymers, cell-biomaterials interactions, planetary materials, and computer simulation of materials. The students welcome the opportunity to work with an interdisciplinary team of basic science, engineering, and biomedical faculty, but the challenge lies in learning the key vocabulary for interdisciplinary collaborations, mastering experimental tools, and working in an independent capacity. The career development workshops dealing with the graduate school application process and entrepreneurial business activities were found to be most effective. The interdisciplinary university-wide poster session helped students broaden their horizons in research careers. The synergy of the REU program with other concurrently running high school summer programs on the UAB campus will also be discussed.
Modular co-culture engineering, a new approach for metabolic engineering.
Zhang, Haoran; Wang, Xiaonan
2016-09-01
With the development of metabolic engineering, employment of a selected microbial host for accommodation of a designed biosynthetic pathway to produce a target compound has achieved tremendous success in the past several decades. Yet, increasing requirements for sophisticated microbial biosynthesis call for establishment and application of more advanced metabolic engineering methodologies. Recently, important progress has been made towards employing more than one engineered microbial strains to constitute synthetic co-cultures and modularizing the biosynthetic labor between the co-culture members in order to improve bioproduction performance. This emerging approach, referred to as modular co-culture engineering in this review, presents a valuable opportunity for expanding the scope of the broad field of metabolic engineering. We highlight representative research accomplishments using this approach, especially those utilizing metabolic engineering tools for microbial co-culture manipulation. Key benefits and major challenges associated with modular co-culture engineering are also presented and discussed. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
"Bridging" Engineering & Art: An Outreach Approach for Middle and High School Students
ERIC Educational Resources Information Center
Asiabanpour, Bahram; DesChamps-Benke, Nicole; Wilson, Thomas; Loerwald, Matthew; Gourgey, Hannah
2010-01-01
This paper describes a novel outreach approach to high school and middle school students to familiarize them with engineering functions and methods. In this approach students participated in a seven-day summer research camp and learned many engineering skills and tools such as CAD solid modeling, finite element analysis, rapid prototyping,…
Coming to Terms with Engineering Design as Content
ERIC Educational Resources Information Center
Lewis, Theodore
2005-01-01
This article addresses the challenges posed by engineering design as a content area of technology education. What adjustments will technology teachers have to make in their approach to teaching and learning when they teach design as engineering in response to the new standards? How faithful to engineering as practiced must their approach be? There…
A TAPS Interactive Multimedia Package to Solve Engineering Dynamics Problem
ERIC Educational Resources Information Center
Sidhu, S. Manjit; Selvanathan, N.
2005-01-01
Purpose: To expose engineering students to using modern technologies, such as multimedia packages, to learn, visualize and solve engineering problems, such as in mechanics dynamics. Design/methodology/approach: A multimedia problem-solving prototype package is developed to help students solve an engineering problem in a step-by-step approach. A…
Chromosomal aberrations in peripheral lymphocytes of train engine drivers.
Nordenson, I; Mild, K H; Järventaus, H; Hirvonen, A; Sandström, M; Wilén, J; Blix, N; Norppa, H
2001-07-01
Studies of Swedish railway employees have indicated that railroad engine drivers have an increased cancer morbidity and incidence of chronic lymphatic leukemia. The drivers are exposed to relatively high magnetic fields (MF), ranging from a few to over a hundred microT. Although the possible genotoxic potential of MF is unclear, some earlier studies have indicated that occupational exposure to MF may increase chromosome aberrations in blood lymphocytes. Since an increased level of chromosomal aberrations has been suggested to predict elevated cancer risk, we performed a cytogenetic analysis on cultured (48 h) peripheral lymphocytes of Swedish train engine drivers. A pilot study of 18 engine drivers indicated a significant difference in the frequency of cells with chromosomal aberrations (gaps included or excluded) in comparison with seven concurrent referents (train dispatchers) and a control group of 16 office workers. The engine drivers had about four times higher frequency of cells with chromosome-type aberrations (excluding gaps) than the office workers (P < 0.01) and the dispatchers (P < 0.05). Seventy-eight percent of the engine drivers showed at least one cell per 100 with chromosome-type aberrations compared with 29% among the dispatchers and 31% among the office workers. In a follow-up study, another 30 engine drivers showed an increase (P < 0.05) in the frequency of cells with chromosome-type aberrations (gaps excluded) as compared with 30 referent policemen. Sixty percent of the engine drivers had one or more cells (per 100 cells) with chromosome-type aberrations compared with 30% among the policemen. In conclusion, the results of the two studies support the hypothesis that exposure to MF at mean intensities of 2-15 microT can induce chromosomal damage. Copyright 2001 Wiley-Liss, Inc.
2012-04-09
…signatures (RSS), in particular despeckling, superresolution, and convergence rate, for a variety of admissible imaging array sensor… attain the superresolution performances in the resulting SSP estimates (3.4), we propose the VA-inspired approach [13], [14] to specify the POCS…
Tribal-federal collaboration in resource management
Ellen M. Donoghue; Sara A. Thompson; John C. Bliss
2010-01-01
The increase in collaborative projects involving American Indian tribes and natural resource management agencies in the United States reflects two emergent trends: 1) the use of collaborative approaches between agencies and groups in managing natural resources; and 2) the concurrent increased recognition of American Indian rights, institutionalization of consultation...
Mixed-Methods Research Methodologies
ERIC Educational Resources Information Center
Terrell, Steven R.
2012-01-01
Mixed-Method studies have emerged from the paradigm wars between qualitative and quantitative research approaches to become a widely used mode of inquiry. Depending on choices made across four dimensions, mixed-methods can provide an investigator with many design choices which involve a range of sequential and concurrent strategies. Defining…
Multilingual Rehabilitation Terminology. A Preliminary Study.
ERIC Educational Resources Information Center
Wagner, Elizabeth M.
In rehabilitation, the team approach demands harmonious communication among practitioners of many professions and occupations at many different levels. Technical terminology needs to be concurrently understood or able to be explained easily. Although rehabilitation terminology is still somewhat in a state of flux, enough terms have been identified…
On a New Approach to Education about Ethics for Engineers at Meijou University
NASA Astrophysics Data System (ADS)
Fukaya, Minoru; Morimoto, Tsukasa; Kimura, Noritsugu
We propose a new approach to the teaching of so-called “engineering ethics”. This approach has two important elements in its teaching system: “problem-solving learning” and “discussion ability”. Traditionally, engineering ethics has started from the ethical standpoint; we instead place problem-solving learning at the educational base of engineering ethics. Because many problems have complicated structures, solving them requires discussion with others. Problem-solving ability and discussion ability together help engineers solve the complex problems of their everyday social lives. Therefore, Meijo University names engineering ethics “ethics for engineers”. At Meijou University, about 1300 students take classes in both ethics for engineers and environmental ethics over one year.
Execution of Multidisciplinary Design Optimization Approaches on Common Test Problems
NASA Technical Reports Server (NTRS)
Balling, R. J.; Wilkinson, C. A.
1997-01-01
A class of synthetic problems for testing multidisciplinary design optimization (MDO) approaches is presented. These test problems are easy to reproduce because all functions are given as closed-form mathematical expressions. They are constructed in such a way that the optimal value of all variables and the objective is unity. The test problems involve three disciplines and allow the user to specify the number of design variables, state variables, coupling functions, design constraints, controlling design constraints, and the strength of coupling. Several MDO approaches were executed on two sample synthetic test problems. These approaches included single-level optimization approaches, collaborative optimization approaches, and concurrent subspace optimization approaches. Execution results are presented, and the robustness and efficiency of these approaches are evaluated for these sample problems.
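A toy problem in the same spirit, with the optimum constructed so that every variable and the objective equal unity, can be written down directly. The functions below are illustrative stand-ins, not those of the paper, and a plain gradient descent stands in for the MDO approaches being compared:

```python
import numpy as np

def objective(x):
    # Closed-form objective with a known optimum: minimum value 1.0
    # attained at x = (1, 1, ..., 1); the second sum couples neighboring
    # variables, mimicking coupling functions between "disciplines".
    return 1.0 + np.sum((x - 1.0) ** 2) + 0.1 * np.sum((x[:-1] - x[1:]) ** 2)

def grad(x):
    g = 2.0 * (x - 1.0)
    d = x[:-1] - x[1:]            # coupling terms between adjacent variables
    g[:-1] += 0.2 * d
    g[1:] -= 0.2 * d
    return g

x = np.zeros(6)
for _ in range(2000):             # baseline single-level gradient descent
    x -= 0.1 * grad(x)
```

Because the optimum is known in closed form (all ones, objective value one), any candidate optimization approach can be scored simply by how closely and how cheaply it reaches it, which is exactly the utility of such synthetic test problems.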
NASA Technical Reports Server (NTRS)
Kemeny, Sabrina E.
1994-01-01
Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom-VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware implementable parallel algorithms, low-power and high-speed VLSI designs and building block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high throughput map-data classification using feedforward neural networks, terrain based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is also being customized for a variety of commercial applications (in collaboration with industrial partners), and is being transferred to U.S. industries. This viewgraph presentation focuses on two application-specific processors which solve the computation intensive tasks of resource allocation (weapon-target assignment) and terrain based tactical movement planning using two extremely different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network.
Hardware realization leads to a two to four order of magnitude speed-up over conventional techniques and enables multiple (many-to-many) assignments not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array. By exploiting the natural parallel decomposition of the problem in silicon, a four order of magnitude speed-up over optimized software approaches has been demonstrated.
Nonrecursive formulations of multibody dynamics and concurrent multiprocessing
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Menon, Ramesh
1993-01-01
Since the late 1980's, research in recursive formulations of multibody dynamics has flourished. Historically, much of this research can be traced to applications of low dimensionality in mechanism and vehicle dynamics. Indeed, there is little doubt that recursive order N methods are the method of choice for this class of systems. This approach has the advantage that a minimal number of coordinates are utilized, parallelism can be induced for certain system topologies, and the method is of order N computational cost for systems of N rigid bodies. Despite the fact that many authors have dismissed redundant coordinate formulations as being of order N^3, and hence less attractive than recursive formulations, we present recent research that demonstrates that at least three distinct classes of redundant, nonrecursive multibody formulations consistently achieve order N computational cost for systems of rigid and/or flexible bodies. These formulations are as follows: (1) the preconditioned range space formulation; (2) penalty methods; and (3) augmented Lagrangian methods for nonlinear multibody dynamics. The first method can be traced to its foundation in equality constrained quadratic optimization, while the last two methods have been studied extensively in the context of coercive variational boundary value problems in computational mechanics. Until recently, however, they have not been investigated in the context of multibody simulation, and present theoretical questions unique to nonlinear dynamics. All of these nonrecursive methods have additional advantages with respect to recursive order N methods: (1) the formalisms retain the highly desirable order N computational cost; (2) the techniques are amenable to concurrent simulation strategies; (3) the approaches do not depend upon system topology to induce concurrency; and (4) the methods can be derived to balance the computational load automatically on concurrent multiprocessors.
In addition to the presentation of the fundamental formulations, this paper presents new theoretical results regarding the rate of convergence of order N constraint stabilization schemes associated with the newly introduced class of methods.
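The penalty idea behind the second class of formulations can be illustrated with a minimal sketch: a planar pendulum kept in redundant Cartesian coordinates, where the rigid-rod constraint is enforced by a stiff penalty force rather than eliminated by a recursive joint coordinate. The integrator, penalty stiffness, and step size below are illustrative assumptions, not values from the paper.

```python
import math

def penalty_pendulum(steps=20000, dt=1e-4, k_pen=1e5, L=1.0, g=9.81):
    """Planar pendulum of unit mass in redundant coordinates (x, y):
    the constraint x^2 + y^2 = L^2 is enforced by a penalty force
    directed along the constraint gradient (2x, 2y)."""
    x, y = L, 0.0              # released from the horizontal
    vx, vy = 0.0, 0.0
    for _ in range(steps):
        c = x * x + y * y - L * L      # constraint violation
        fx = -k_pen * c * 2.0 * x      # penalty force
        fy = -k_pen * c * 2.0 * y - g  # penalty force plus gravity
        vx += fx * dt                  # semi-implicit Euler keeps the
        vy += fy * dt                  # stiff penalty mode stable
        x += vx * dt
        y += vy * dt
    return x, y
```

The penalty stiffness trades constraint accuracy against allowable step size; augmented Lagrangian methods add a multiplier update so that comparable accuracy is reached with a much smaller penalty parameter.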
Broadening ethics teaching in engineering: beyond the individualistic approach.
Conlon, Eddie; Zandvoort, Henk
2011-06-01
There is a widespread approach to the teaching of ethics to engineering students in which the exclusive focus is on engineers as individual agents and the broader context in which they do their work is ignored. Although this approach has frequently been criticised in the literature, it persists on a wide scale, as can be inferred from accounts in the educational literature and from the contents of widely used textbooks in engineering ethics. In this contribution we intend to: (1) Restate why the individualistic approach to the teaching of ethics to engineering students is inadequate in view of preparing them for ethical, professional and social responsibility; (2) Examine the existing literature regarding the possible contribution of Science, Technology and Society (STS) scholarship in addressing the inadequacies of the individualistic approach; and (3) Assess this possible contribution of STS in order to realise desired learning outcomes regarding the preparation of students for ethical and social responsibility.
Non-genetic engineering of cells for drug delivery and cell-based therapy.
Wang, Qun; Cheng, Hao; Peng, Haisheng; Zhou, Hao; Li, Peter Y; Langer, Robert
2015-08-30
Cell-based therapy is a promising modality to address many unmet medical needs. In addition to genetic engineering, material-based, biochemical, and physical science-based approaches have emerged as novel approaches to modify cells. Non-genetic engineering of cells has been applied in delivering therapeutics to tissues, homing of cells to the bone marrow or inflammatory tissues, cancer imaging, immunotherapy, and remotely controlling cellular functions. This new strategy has unique advantages in disease therapy and is complementary to existing gene-based cell engineering approaches. A better understanding of cellular systems and different engineering methods will allow us to better exploit engineered cells in biomedicine. Here, we review non-genetic cell engineering techniques and applications of engineered cells, discuss the pros and cons of different methods, and provide our perspectives on future research directions. Copyright © 2014 Elsevier B.V. All rights reserved.
Expanding the metabolic engineering toolbox with directed evolution.
Abatemarco, Joseph; Hill, Andrew; Alper, Hal S
2013-12-01
Cellular systems can be engineered into factories that produce high-value chemicals from renewable feedstock. Such an approach requires an expanded toolbox for metabolic engineering. Recently, protein engineering and directed evolution strategies have started to play a growing and critical role within metabolic engineering. This review focuses on the various ways in which directed evolution can be applied in conjunction with metabolic engineering to improve product yields. Specifically, we discuss the application of directed evolution on both catalytic and non-catalytic traits of enzymes, on regulatory elements, and on whole genomes in a metabolic engineering context. We demonstrate how the goals of metabolic pathway engineering can be achieved in part through evolving cellular parts as opposed to traditional approaches that rely on gene overexpression and deletion. Finally, we discuss the current limitations in screening technology that hinder the full implementation of a metabolic pathway-directed evolution approach. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Gallaway, Glen R.
1987-01-01
Human Engineering in many projects is at best a limited support function. In this Navy project the Human Engineering function is an integral component of the systems design and development process. Human Engineering is a member of the systems design organization. This ensures that people considerations are: (1) identified early in the project; (2) accounted for in the specifications; (3) incorporated into the design; and (4) the tested product meets the needs and expectations of the people while meeting the overall systems requirements. The project exemplifies achievements that can be made by the symbiosis between systems designers, engineers and Human Engineering. This approach increases Human Engineering's effectiveness and value to a project because it becomes an accepted, contributing team member. It is an approach to doing Human Engineering that should be considered for most projects. The functional and organizational issues giving this approach strength are described.
Update - Concept of Operations for Integrated Model-Centric Engineering at JPL
NASA Technical Reports Server (NTRS)
Bayer, Todd J.; Bennett, Matthew; Delp, Christopher L.; Dvorak, Daniel; Jenkins, Steven J.; Mandutianu, Sanda
2011-01-01
The increasingly ambitious requirements levied on JPL's space science missions, and the development pace of such missions, challenge our current engineering practices. All the engineering disciplines face this growth in complexity to some degree, but the challenges are greatest in systems engineering where numerous competing interests must be reconciled and where complex system-level interactions must be identified and managed. Undesired system-level interactions are increasingly a major risk factor that cannot be reliably exposed by testing, and natural-language single-viewpoint specifications are inadequate to capture and expose system-level interactions and characteristics. Systems engineering practices must improve to meet these challenges, and the most promising approach today is the movement toward a more integrated and model-centric approach to mission conception, design, implementation and operations. This approach elevates engineering models to a principal role in systems engineering, gradually replacing traditional document-centric engineering practices.
Educating the humanitarian engineer.
Passino, Kevin M
2009-12-01
The creation of new technologies that serve humanity holds the potential to help end global poverty. Unfortunately, relatively little is done in engineering education to support engineers' humanitarian efforts. Here, various strategies are introduced to augment the teaching of engineering ethics with the goal of encouraging engineers to serve as effective volunteers for community service. First, codes of ethics, moral frameworks, and comparative analysis of professional service standards lay the foundation for expectations for voluntary service in the engineering profession. Second, standard coverage of global issues in engineering ethics educates humanitarian engineers about aspects of the community that influence technical design constraints encountered in practice. Sample assignments on volunteerism are provided, including a prototypical design problem that integrates community constraints into a technical design problem in a novel way. Third, it is shown how extracurricular engineering organizations can provide a theory-practice approach to education in volunteerism. Sample completed projects are described for both undergraduates and graduate students. The student organization approach is contrasted with the service-learning approach. Finally, long-term goals for establishing better infrastructure are identified for educating the humanitarian engineer in the university, and supporting life-long activities of humanitarian engineers.
ERIC Educational Resources Information Center
Garcia, Oscar N.; Varanasi, Murali R.; Acevedo, Miguel F.; Guturu, Parthasarathy
2011-01-01
We analyze and study the beginning of a new Electrical Engineering Department, supported by an NSF Departmental Level Reform award, within a new College of Engineering in the 21st Century and also describe the academic approach and influences of an innovative cognitive-based approach to curriculum development. In addition, the approach taken…
ERIC Educational Resources Information Center
Asiabanpour, Bahram
2010-01-01
In this paper a novel outreach approach to high school students to familiarize them with engineering functions and methods is explained. In this approach students participated in a seven days research camp and learned many engineering skills and tools such as CAD solid modeling, finite element analysis, rapid prototyping, mechanical tests, team…
ERIC Educational Resources Information Center
Garces, Andres; Sanchez-Barba, Luis Fernando
2011-01-01
We describe an alternative educational approach for an inorganic chemistry laboratory module named "Experimentation in Chemistry", which is included in Industrial Engineering and Chemical Engineering courses. The main aims of the new approach were to reduce the high levels of failure and dropout on the module and to make the content match the…
Lee, Kuei-Hua; Tsai, Yueh-Ting; Lai, Jung-Nien; Lin, Shun-Ku
2013-01-01
Background. The increased practice of traditional Chinese medicine (TCM) worldwide has raised concerns regarding herb-drug interactions. The purpose of our study is to analyze the concurrent use of Chinese herbal products (CHPs) among Taiwanese insomnia patients taking hypnotic drugs. Methods. The usage, frequency of services, and CHP prescribed among 53,949 insomnia sufferers were evaluated from a random sample of 1 million beneficiaries in the National Health Insurance Research Database. A logistic regression method was used to identify the factors that were associated with the coprescription of a CHP and a hypnotic drug. Cox proportional hazards regressions were performed to calculate the hazard ratios (HRs) of hip fracture between the two groups. Results. More than 1 of every 3 hypnotic users also used a CHP concurrently. Jia-Wei-Xiao-Yao-San (Augmented Rambling Powder) and Suan-Zao-Ren-Tang (Zizyphus Combination) were the 2 most commonly used CHPs that were coadministered with hypnotic drugs. The HR of hip fracture for hypnotic-drug users who used a CHP concurrently was 0.57-fold (95% CI = 0.47-0.69) that of hypnotic-drug users who did not use a CHP. Conclusion. Exploring potential CHP-drug interactions and integrating both healthcare approaches might be beneficial for the overall health and quality of life of insomnia sufferers.
ERIC Educational Resources Information Center
Kelley, Todd; Brenner, Daniel C.; Pieper, Jon T.
2010-01-01
A comparative study was conducted to compare two approaches to engineering design curriculum between different schools (inter-school) and between two curricular approaches, "Project Lead the Way" (PLTW) and "Engineering Projects in Community Service" (EPIC High) (inter-curricular). The researchers collected curriculum…
ERIC Educational Resources Information Center
Ford, Julie Dyke
2012-01-01
This program profile describes a new approach towards integrating communication within Mechanical Engineering curricula. The author, who holds a joint appointment between Technical Communication and Mechanical Engineering at New Mexico Institute of Mining and Technology, has been collaborating with Mechanical Engineering colleagues to establish a…
ERIC Educational Resources Information Center
Loch, Birgit; Lamborn, Julia
2016-01-01
Many approaches to make mathematics relevant to first-year engineering students have been described. These include teaching practical engineering applications, or a close collaboration between engineering and mathematics teaching staff on unit design and teaching. In this paper, we report on a novel approach where we gave higher year engineering…
NASA Technical Reports Server (NTRS)
Rinehart, Aidan W.; Simon, Donald L.
2015-01-01
This paper presents a model-based architecture for performance trend monitoring and gas path fault diagnostics designed for analyzing streaming transient aircraft engine measurement data. The technique analyzes residuals between sensed engine outputs and model-predicted outputs for fault detection and isolation purposes. Diagnostic results from the application of the approach to test data acquired from an aircraft turbofan engine are presented. The approach is found to avoid false alarms when presented with nominal, fault-free data. Additionally, the approach is found to successfully detect and isolate gas path seeded faults under steady-state operating scenarios, although some fault misclassifications are noted during engine transients. Recommendations for follow-on maturation and evaluation of the technique are also presented.
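The residual-thresholding logic described above can be sketched as follows. The threshold, persistence window, and function name are hypothetical, since the abstract does not publish the detection parameters; the persistence requirement is one common way to suppress false alarms during brief transients.

```python
def detect_faults(sensed, predicted, threshold=5.0, persistence=3):
    """Flag a fault when the residual between a sensed engine output
    and the model-predicted output exceeds `threshold` for
    `persistence` consecutive samples."""
    flags, run = [], 0
    for s, p in zip(sensed, predicted):
        run = run + 1 if abs(s - p) > threshold else 0
        flags.append(run >= persistence)
    return flags
```

On nominal data the residuals stay below the threshold and no flag is raised; a persistent sensor bias larger than the threshold is flagged once it has lasted for the full persistence window.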
A top-down approach in control engineering third-level teaching: The case of hydrogen-generation
NASA Astrophysics Data System (ADS)
Setiawan, Eko; Habibi, M. Afnan; Fall, Cheikh; Hodaka, Ichijo
2017-09-01
This paper presents a top-down approach to control engineering third-level teaching, using a practical implementation issue to motivate students. The proposed strategy focuses on a single control engineering technique so as to guide students correctly. The proposed teaching steps are: (1) defining the problem; (2) listing the required knowledge and skills; (3) selecting one control engineering technique; (4) arranging the order of teaching: problem introduction, implementation of the control engineering technique, explanation of the system block diagram, model derivation, and controller design; and (5) enriching knowledge with other control techniques. The approach presented highlights hardware implementation and the use of software simulation as a self-learning tool for students.
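A self-learning simulation exercise of the kind step (4) recommends might look like the sketch below: one technique (PI control), one simple plant, simulated step by step so students can see the transient and the zero steady-state error. The plant model and gains are illustrative choices, not taken from the paper.

```python
def simulate_pi(setpoint=1.0, kp=2.0, ki=1.0, dt=0.01, steps=2000):
    """Discrete PI control of the first-order plant dy/dt = -y + u."""
    y, integ = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - y          # tracking error
        integ += e * dt           # integral action removes offset
        u = kp * e + ki * integ   # PI control law
        y += (-y + u) * dt        # explicit Euler plant update
    return y
```

Students can vary the gains and watch the closed-loop behavior change, which is exactly the kind of enrichment step (5) calls for.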
NASA Technical Reports Server (NTRS)
Rinehart, Aidan W.; Simon, Donald L.
2014-01-01
This paper presents a model-based architecture for performance trend monitoring and gas path fault diagnostics designed for analyzing streaming transient aircraft engine measurement data. The technique analyzes residuals between sensed engine outputs and model-predicted outputs for fault detection and isolation purposes. Diagnostic results from the application of the approach to test data acquired from an aircraft turbofan engine are presented. The approach is found to avoid false alarms when presented with nominal, fault-free data. Additionally, the approach is found to successfully detect and isolate gas path seeded faults under steady-state operating scenarios, although some fault misclassifications are noted during engine transients. Recommendations for follow-on maturation and evaluation of the technique are also presented.
Regeneratively cooled rocket engine for space storable propellants
NASA Technical Reports Server (NTRS)
Wagner, W. R.
1973-01-01
Analysis, design, fabrication, and test efforts were performed for the existing OF2/B2H6 regeneratively cooled 1K (4448 N) thrust chamber to illustrate simultaneous B2H6 fuel and OF2 oxidizer cooling and to provide results for a gaseous propellant condition injected into the combustion chamber. Data derived from performance, thermal and flow measurements confirmed predictions derived from previous test work and from concurrent analytical study. Development data derived from the experimental study were indicated to be sufficient to develop a preflight thrust chamber demonstrator prototype for future space mission objectives.
Proceedings of the NASA Conference on Space Telerobotics, volume 3
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo (Editor); Seraji, Homayoun (Editor)
1989-01-01
The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required for application of telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest which can be dealt with by future research.
Overview of Computer Simulation Modeling Approaches and Methods
Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett
2005-01-01
The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...
Digital Native Students: Gender Differences in Mathematics and Gaming
ERIC Educational Resources Information Center
Yong, Su-Ting
2017-01-01
The purpose of this study was to explore gender differences among digital native students in mathematics learning and gaming. A quantitative dominant mixed methods approach was employed in which quantitative surveys [174 students] and qualitative interviews [eight students, eight parents and six teachers] were administered concurrently. Data…
Post-Positivist Research: Two Examples of Methodological Pluralism.
ERIC Educational Resources Information Center
Wildemuth, Barbara M.
1993-01-01
Discussion of positivist and interpretive approaches to research and postpositivism focuses on two studies that apply interpretive research in different ways: an exploratory study of user-developed computing applications conducted prior to a positivist study and a study of end-user searching behaviors conducted concurrently with a positivist…
ERIC Educational Resources Information Center
Venezia, Andrea; Bracco, Kathy Reeves; Nodine, Thad
2010-01-01
There is substantial work being done--in California and nationwide--to develop college readiness standards; expand concurrent enrollment programs; communicate clearly about the key cognitive strategies necessary for postsecondary success (e.g., analytical thinking); improve student supports; and implement other approaches to improve students'…
Hydrologic calibration of paired watersheds using a MOSUM approach
H. Ssegane; Devendra Amatya; A. Muwamba; G. M. Chescheir; T. Appelboom; E. W. Tollner; J. E. Nettles; M. A. Youssef; F. Birgand; R. W. Skaggs
2015-01-01
Paired watershed studies have historically been used to quantify hydrologic effects of land use and management practices by concurrently monitoring two neighboring watersheds (a control and a treatment) during the calibration (pre-treatment) and post-treatment periods. This study characterizes seasonal water table and flow response to rainfall during the...
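The core of a MOSUM (moving sums) analysis can be sketched simply: moving sums of the calibration-model residuals hover near zero while the control-treatment relationship is stable, and a sustained departure signals a change after treatment. This is an illustrative sketch of the idea only, not the authors' exact test statistic, which would also involve a variance normalization and critical boundaries.

```python
def mosum(residuals, window):
    """Moving sums of calibration-model residuals over a sliding
    window; sums far from zero indicate a shift in the paired
    control-treatment relationship."""
    return [sum(residuals[i:i + window])
            for i in range(len(residuals) - window + 1)]
```
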
Belisle, Hannah Appelbaum; Hennink, Monique; Ordóñez, Claudia E.; John, Sally; Ngubane-Joye, Eunephacia; Hampton, Jane; Sunpath, Henry; Preston-Whyte, Eleanor; Marconi, Vincent C.
2014-01-01
The concurrent use of traditional African medicine (TAM) and allopathic medicine is not well understood for people living with HIV (PLHIV) in the era of antiretroviral therapy (ART). This cross-sectional, qualitative study examines perceptions of the concurrent use of TAM and ART among: i) patients receiving ART at the Sinikithemba HIV Clinic of McCord Hospital, in Durban, South Africa; ii) allopathic medical providers (doctors, nurses, HIV counsellors) from Sinikithemba; and iii) local traditional healers. Data were collected through in-depth interviews and focus group discussions with 26 participants between July and October, 2011. Patients in this study did not view TAM as an alternative to ART; rather, results show that patients employ TAM and ART for distinctly different needs. More research is needed to further understand the relationship between traditional and allopathic approaches to health care in South Africa, to improve cultural relevance in the provision and delivery of care for PLHIV, and to pragmatically address the concerns of healthcare providers and public health officials managing this intersection in South Africa and elsewhere. PMID:25346069
Engineering design activities and conceptual change in middle school science
NASA Astrophysics Data System (ADS)
Schnittka, Christine G.
The purpose of this research was to investigate the impact of engineering design classroom activities on conceptual change in science, and on attitudes toward and knowledge about engineering. Students were given a situated learning context and a rationale for learning science in an active, inquiry-based method, and worked in small collaborative groups. One eighth-grade physical science teacher and her students participated in a unit on heat transfer and thermal energy. One class served as the control while two others received variations of an engineering design treatment. Data were gathered from teacher and student entrance and exit interviews, audio recordings of student dialog during group work, video recordings and observations of all classes, pre- and posttests on science content and engineering attitudes, and artifacts and all assignments completed by students. Qualitative and quantitative data were collected concurrently, but analysis took place in two phases. Qualitative data were analyzed in an ongoing manner so that the researcher could explore emerging theories and trends as the study progressed. These results were compared to and combined with the results of the quantitative data analysis. Analysis of the data was carried out in the interpretive framework of analytic induction. Findings indicated that students overwhelmingly possessed alternative conceptions about heat transfer, thermal energy, and engineering prior to the interventions. While all three classes made statistically significant gains in their knowledge about heat and energy, students in the engineering design class with the targeted demonstrations made the most significant gains over the other two classes. Engineering attitudes changed significantly in the two classes that received the engineering design intervention. Implications from this study can inform teachers' use of engineering design activities in science classrooms.
These implications are: (1) Alternative conceptions will persist when not specifically addressed. (2) Engineering design activities are not enough to promote conceptual change. (3) A middle school teacher can successfully implement an engineering design-based curriculum in a science class. (4) Results may also be of interest to science curriculum developers and engineering educators involved in developing engineering outreach curricula for middle school students.
Ghosh, Saptarshi; Rao, Pamidimukkala Brahmananda; Kumar, P Ravindra; Manam, Surendra
2015-01-01
The organ preservation approach of choice for the treatment of locally advanced head and neck cancers is concurrent chemoradiation with three weekly high doses of cisplatin. Although this is an efficacious treatment policy, it has high acute systemic and mucosal toxicities, which lead to frequent treatment breaks and increased overall treatment time. Hence, the current study was undertaken to evaluate the efficacy of concurrent chemoradiation using 40 mg/m2 weekly cisplatin. This is a single institutional retrospective study including the data of 266 locally advanced head and neck cancer patients who were treated with concurrent chemoradiation using 40 mg/m2 weekly cisplatin from January 2012 to January 2014. A p-value of < 0.05 was considered statistically significant for all analyses in the study. The mean age of the study patients was 48.8 years. Some 36.1% of the patients had oral cavity primary tumors. The mean overall treatment time was 57.2 days. With a mean follow-up of 15.2 months for all study patients and 17.5 months for survivors, 3-year local control, locoregional control and disease-free survival were seen in 62.8%, 42.8% and 42.1% of the study patients. Primary tumor site, nodal stage of disease, AJCC stage of the disease and number of cycles of weekly cisplatin demonstrated statistically significant correlations with 3-year local control, locoregional control and disease-free survival. Concurrent chemoradiotherapy with moderate-dose weekly cisplatin is an efficacious treatment regimen for locally advanced head and neck cancers with tolerable toxicity which can be used in developing countries with limited resources.
Concurrent approach for evolving compact decision rule sets
NASA Astrophysics Data System (ADS)
Marmelstein, Robert E.; Hammack, Lonnie P.; Lamont, Gary B.
1999-02-01
The induction of decision rules from data is important to many disciplines, including artificial intelligence and pattern recognition. To improve the state of the art in this area, we introduced the genetic rule and classifier construction environment (GRaCCE). It was previously shown that GRaCCE consistently evolved decision rule sets from data, which were significantly more compact than those produced by other methods (such as decision tree algorithms). The primary disadvantage of GRaCCE, however, is its relatively poor run-time execution performance. In this paper, a concurrent version of the GRaCCE architecture is introduced, which improves the efficiency of the original algorithm. A prototype of the algorithm is tested on an in-house parallel processor configuration and the results are discussed.
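One natural place to introduce concurrency in a genetic rule-induction system is the fitness evaluation of the population, which dominates run time. The sketch below is a hypothetical, much-simplified stand-in for GRaCCE's actual architecture: rules are scalar thresholds, and the fitness rewards accuracy while penalizing rule-set size, reflecting GRaCCE's preference for compact rule sets.

```python
from concurrent.futures import ThreadPoolExecutor

def classify(rules, x):
    # a rule is (threshold, label): fire the first rule whose
    # threshold the sample meets or exceeds; default label is 0
    for threshold, label in rules:
        if x >= threshold:
            return label
    return 0

def fitness(rules, data):
    # accuracy minus a small penalty per rule, so evolution
    # favors compact rule sets
    correct = sum(1 for x, y in data if classify(rules, x) == y)
    return correct / len(data) - 0.01 * len(rules)

def evaluate_population(population, data, workers=4):
    # the expensive per-individual evaluations run concurrently;
    # this is where a concurrent architecture recovers run time
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda r: fitness(r, data), population))
```

Because the individuals are evaluated independently, the same structure maps directly onto a parallel processor configuration like the in-house one mentioned in the abstract.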
The common engine concept for ALS application - A cost reduction approach
NASA Technical Reports Server (NTRS)
Bair, E. K.; Schindler, C. M.
1989-01-01
Future launch systems require the application of propulsion systems which have been designed and developed to meet mission model needs while providing high degrees of reliability and cost effectiveness. Vehicle configurations which utilize different propellant combinations for booster and core stages can benefit from a common engine approach, in which a single engine design can be configured to operate on either set of propellants and thus serve as either a booster or core engine. Engine design concepts and mission application for a vehicle employing a common engine are discussed. Engine program cost estimates were made, and the cost savings relative to designing and developing two unique engines were estimated.
A correlational approach to predicting operator status
NASA Technical Reports Server (NTRS)
Shingledecker, Clark A.
1988-01-01
This paper discusses a research approach for identifying and validating candidate physiological and behavioral parameters which can be used to predict the performance capabilities of aircrew and other system operators. In this methodology, concurrent and advance correlations are computed between predictor values and criterion performance measures. Continuous performance and sleep loss are used as stressors to promote performance variation. Preliminary data are presented which suggest dependence of prediction capability on the resource allocation policy of the operator.
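The concurrent and advance correlations described here can be sketched with plain Pearson correlations at lag zero and at a lead of k samples; the function names and the choice of a single lead are illustrative assumptions, not the paper's protocol.

```python
import math

def pearson(a, b):
    # standard Pearson correlation coefficient
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def concurrent_and_advance(predictor, performance, lead=1):
    # concurrent: predictor vs. performance at the same time step;
    # advance: predictor vs. performance `lead` samples later,
    # i.e. how well the measure predicts upcoming performance
    r0 = pearson(predictor, performance)
    rk = pearson(predictor[:-lead], performance[lead:])
    return r0, rk
```

A predictor that is useful for forecasting operator status should show an advance correlation at the chosen lead that is at least as strong as its concurrent correlation.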
NASA Astrophysics Data System (ADS)
Ehlen, Judy
2005-04-01
Weathered mantle comprises the materials above bedrock and below the soil. It can vary in thickness from millimeters to hundreds of meters, depending primarily on climate and parent material. Study of the weathered mantle comes within the realms of four disciplines: geology, geomorphology, soil science, and civil engineering, each of which uses a different approach to describe and classify the material. The approaches of engineers, geomorphologists, and geologists are contrasted and compared using example papers from the published literature. Soil scientists rarely study the weathering profile as such, and instead concentrate upon soil-forming processes and spatial distribution primarily in the solum. Engineers, including engineering geologists, study the stability and durability of the weathered mantle and the strength of the materials using sophisticated procedures to classify weathered materials, but their approach tends to be one-dimensional. Furthermore, they believe that the study of mineralogy and chemistry is not useful. Geomorphologists deal with weathering in terms of process—how the weathered mantle is formed—and with respect to landform evolution using a spatial approach. Geologists tend to ignore the weathered mantle because it is not bedrock, or to study its mineralogy and/or chemistry in the laboratory. I recommend that the approaches of the various disciplines be integrated—geomorphologists and geologists should consider using engineering weathering classifications, and geologists should adopt a spatial perspective to weathering, as should engineers and engineering geologists.
Aircraft engine and auxiliary power unit emissions from combusting JP-8 fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimm, L.T.; Sylvia, D.A.; Gerstle, T.C.
1997-12-31
Due to safety considerations and in an effort to standardize Department of Defense fuels, the US Air Force (USAF) replaced the naphtha-based JP-4, MIL-T-5624, with the kerosene-based JP-8, MIL-T-83133, as the standard turbine fuel. Although engine emissions from combustion of JP-4 are well documented for criteria pollutants, little information exists for criteria and hazardous air pollutants from combustion of JP-8 fuel. Due to intrinsic differences between these two raw fuels, their combustion products were expected to differ. As part of a broader engine testing program, the Air Force, through the Human Systems Center at Brooks AFB, TX, has contracted to have the emissions characterized from aircraft engines and auxiliary power units (APUs). Criteria pollutant and targeted HAP emissions of selected USAF aircraft engines were quantified during the test program. Emission test results will be used to develop emission factors for the tested aircraft engines and APUs. The Air Force intends to develop a mathematical relationship, using the data collected during this series of tests and from previous tests, to extrapolate existing JP-4 emission factors to representative JP-8 emission factors for other engines. This paper reports sampling methodologies for the following aircraft engine emissions tests: F110-GE-100, F101-GE-102, TF33-P-102, F108-CF-100, T56-A-15, and T39-GE-1A/C. The UH-60A helicopter engine, T700-GE-700, and the C-5A/B and C-130H auxiliary power units (GTCP165-1 and GTCP85-180, respectively) were also tested. Testing was performed at various engine settings to determine emissions of particulate matter, carbon monoxide, nitrogen oxides, sulfur oxides, total hydrocarbon, and selected hazardous air pollutants. Ambient monitoring was conducted concurrently to establish background pollutant concentrations for data correction.
NASA Astrophysics Data System (ADS)
Clark, Robin; Andrews, Jane
2014-11-01
This paper begins with the argument that within modern-day society, engineering has shifted from being the scientific and technical mainstay of industrial, and more recently digital, change to become the most vital driver of future advancement. In order to meet the inevitable challenges resulting from this role, the nature of engineering is constantly evolving, and as such engineering education has to change. The paper argues that what is needed is a fresh approach to engineering education - one that is sufficiently flexible so as to capture the fast-changing needs of engineering education as a discipline, whilst being pedagogically suitable for use with a range of engineering epistemologies. It provides an overview of a case study in which a new approach to engineering education has been developed and evaluated. The approach, which is based on the concept of scholarship, is described in detail. This is followed by a discussion of how the approach has been put into practice and evaluated. The paper concludes by arguing that within today's market-driven university world, the need for effective learning and teaching practice, based in good scholarship, is fundamental to student success.
NASA Astrophysics Data System (ADS)
Rolland, Colette; Yu, Eric; Salinesi, Camille; Castro, Jaelson
The use of intentional concepts, the notion of "goal" in particular, has been prominent in recent approaches to requirement engineering (RE). Goal-oriented frameworks and methods for requirements engineering (GORE) have been keynote topics in requirements engineering, conceptual modelling, and more generally in software engineering. What are the conceptual modelling foundations in these approaches? RIGiM (Requirements Intentions and Goals in Conceptual Modelling) aims to provide a forum for discussing the interplay between requirements engineering and conceptual modelling, and in particular, to investigate how goal- and intention-driven approaches help in conceptualising purposeful systems. What are the fundamental objectives and premises of requirements engineering and conceptual modelling respectively, and how can they complement each other? What are the demands on conceptual modelling from the standpoint of requirements engineering? What conceptual modelling techniques can be further taken advantage of in requirements engineering? What are the upcoming modelling challenges and issues in GORE? What are the unresolved open questions? What lessons are there to be learnt from industrial experiences? What empirical data are there to support the cost-benefit analysis when adopting GORE methods? Are there application domains or types of project settings for which goals and intentional approaches are particularly suitable or not suitable? What degree of formalization and automation, or interactivity is feasible and appropriate for what types of participants during requirements engineering?