Sample records for requirements engineering framework

  1. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems, together with cost and schedule constraints, requires a new systems engineering paradigm to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability, and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition that uses MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is presented, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally performing a weighted decision analysis to optimize system objectives.
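
    The weighted decision analysis mentioned above can be illustrated with a minimal sketch. The criteria, weights, and candidate architecture scores below are hypothetical placeholders, not values from the paper; in practice the scores would come from the domain-specific simulations.

    ```python
    # Minimal sketch of a weighted decision analysis for a trade study.
    # Criteria, weights, candidates, and scores are hypothetical illustrations.

    criteria_weights = {"performance": 0.4, "cost": 0.3, "risk": 0.2, "reuse": 0.1}

    # Normalized scores (0-1) for each candidate architecture, e.g. derived
    # from domain-specific simulation results.
    candidates = {
        "arch_A": {"performance": 0.8, "cost": 0.6, "risk": 0.7, "reuse": 0.5},
        "arch_B": {"performance": 0.6, "cost": 0.9, "risk": 0.8, "reuse": 0.4},
    }

    def weighted_score(scores, weights):
        """Sum of weight * score over all criteria."""
        return sum(weights[c] * scores[c] for c in weights)

    ranked = sorted(candidates,
                    key=lambda a: weighted_score(candidates[a], criteria_weights),
                    reverse=True)
    print(ranked)  # architecture with the highest weighted score first
    ```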

  2. Engineering Change Management Method Framework in Mechanical Engineering

    NASA Astrophysics Data System (ADS)

    Stekolschik, Alexander

    2016-11-01

    Engineering changes affect different process chains both inside and outside the company, and they account for most error costs and time shifts. In fact, 30 to 50 per cent of development costs result from technical changes. Controlling engineering change processes can help avoid errors and risks, and contributes to cost optimization and a shorter time to market. This paper presents a method framework for controlling engineering changes at mechanical engineering companies. The developed classification of engineering changes and the corresponding process requirements form the basis for the method framework. The method framework comprises two main areas: special data objects managed in different engineering IT tools, and a process framework. Objects from both areas are building blocks that can be selected into the overall business process based on the engineering process type and change classification. The process framework contains steps for the creation of change objects (both for the overall change and for parts), change implementation, and release. Companies can select single process building blocks from the framework, depending on the product development process and change impact. The developed change framework has been implemented at a division (10,000 employees) of a large German mechanical engineering company.

  3. An exploration of the professional competencies required in engineering asset management

    NASA Astrophysics Data System (ADS)

    Bish, Adelle J.; Newton, Cameron J.; Browning, Vicky; O'Connor, Peter; Anibaldi, Renata

    2014-07-01

    Engineering asset management (EAM) is a rapidly growing and developing field. However, efforts to select and develop engineers in this area are complicated by our lack of understanding of the full range of competencies required to perform. This exploratory study sought to clarify and categorise the professional competencies required of individuals at different hierarchical levels within EAM. Data from 14 field interviews, 61 online surveys, and 10 expert panel interviews were used to develop an initial professional competency framework. Overall, nine competency clusters were identified. These clusters indicate that engineers working in this field need to be able to collaborate and influence others, complete objectives within organisational guidelines, and be able to manage themselves effectively. Limitations and potential uses of this framework in engineering education and research are discussed.

  4. The development of Sustainability Graduate Community (SGC) as a learning pathway for sustainability education - a framework for engineering programmes in Malaysia Technical Universities Network (MTUN)

    NASA Astrophysics Data System (ADS)

    Johan, Kartina; Mohd Turan, Faiz

    2016-11-01

    ‘Environment and sustainability’ is one of the Programme Outcomes (PO) designated by the Board of Engineers Malaysia (BEM) as an accreditation requirement for engineering programmes. However, to date the implementation of sustainability elements in engineering programmes at the technical universities in Malaysia sits within individual faculties' curriculum plans and lacks a university-level structured learning pathway that would give all students access to an education in sustainability across all disciplines. The Sustainability Graduate Community (SGC) is a framework designed to provide a learning pathway in the curriculum of engineering programmes to inculcate sustainability education among engineering graduates. This paper aims to study the attributes required in the SGC framework to produce graduates who are not just engineers but are also skilful in sustainability competencies, using the Green Project Management (GPM) P5 Standard for Sustainability. The conceptual framework is developed to provide a constructive teaching and learning plan that educators and policy makers can work on together to develop Sustainability Graduates (SG), a new kind of graduate from the Malaysia Technical Universities Network (MTUN) who is literate in sustainability practices. The framework also supports the call for developing holistic students in the Malaysian Education Blueprint (Higher Education) and addresses the gap between the status of engineering qualifications and the competencies expected by industries in Malaysia, in particular by achieving the SG attributes outlined in the framework.

  5. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  6. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  7. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  8. Mississippi Curriculum Framework for Small Engine Repair (Program CIP: 47.0606--Small Engine Mechanic and Repairer). Secondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for small engine repair I and II. Presented first are a program description…

  9. An Exploratory Study of Cost Engineering in Axiomatic Design: Creation of the Cost Model Based on an FR-DP Map

    NASA Technical Reports Server (NTRS)

    Lee, Taesik; Jeziorek, Peter

    2004-01-01

    Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, credible estimation of the project cost, quick assessment of the cost of making changes, and management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to achieve effective cost engineering. The framework is built upon the Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation for closely tying engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions (DPs) from the cost of the physical entities in the system (CUs). The task/process model describes the iterative process of developing each of the CUs and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.
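
    A toy sketch of the FR-DP-CU linkage described above is given below. The mappings and cost figures are invented for illustration; the paper's actual cost model and task/process-based estimation are richer than this simple roll-up.

    ```python
    # Illustrative sketch (not the paper's implementation) of rolling up cost
    # from cost units (CUs) to design parameters (DPs) to functional
    # requirements (FRs) via the FR-DP and DP-CU mappings.

    # Hypothetical mappings: each FR is satisfied by DPs, each DP is realized
    # by physical cost units.
    fr_to_dps = {"FR1": ["DP1", "DP2"], "FR2": ["DP3"]}
    dp_to_cus = {"DP1": ["CU_housing"],
                 "DP2": ["CU_sensor", "CU_harness"],
                 "DP3": ["CU_software"]}
    cu_cost = {"CU_housing": 12_000, "CU_sensor": 30_000,
               "CU_harness": 5_000, "CU_software": 80_000}

    def dp_cost(dp):
        return sum(cu_cost[cu] for cu in dp_to_cus[dp])

    def fr_cost(fr):
        # Cost traceable to a functional requirement through its design parameters.
        return sum(dp_cost(dp) for dp in fr_to_dps[fr])

    print({fr: fr_cost(fr) for fr in fr_to_dps})  # {'FR1': 47000, 'FR2': 80000}
    ```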

  10. openSE: a Systems Engineering Framework Particularly Suited to Particle Accelerator Studies and Development Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnal, P.; Féral, B.; Kershaw, K.

    Particle accelerator projects share many characteristics with industrial projects. However, experience has shown that best practice in industrial project management is not always well suited to particle accelerator projects. Major differences include the number and complexity of technologies involved, the importance of collaborative work, development phases that can last more than a decade, and the importance of telerobotics and remote handling for addressing future preventive and corrective maintenance requirements due to induced radioactivity, to cite just a few. The openSE framework is a systems engineering and project management framework specifically designed for studies and development projects for scientific facilities' systems and equipment. Best practices in project management, in systems and requirements engineering, in telerobotics and remote handling, and in radiation safety management were used as sources of inspiration, together with analysis of current practices surveyed at CERN, GSI and ESS.

  11. A Methodological Framework for Enterprise Information System Requirements Derivation

    NASA Astrophysics Data System (ADS)

    Caplinskas, Albertas; Paškevičiūtė, Lina

    Current information systems (IS) are enterprise-wide systems that support the strategic goals of the enterprise and meet its operational business needs. They are supported by information and communication technologies (ICT) and other software that should be fully integrated. To develop software that responds to real business needs, we need a requirements engineering (RE) methodology that ensures the alignment of requirements across all levels of the enterprise system. The main contribution of this chapter is a requirements-oriented methodological framework that allows business requirements to be transformed, level by level, into software requirements. The structure of the proposed framework reflects the structure of Zachman's framework. However, it has other intentions and is intended to support not design but RE issues.
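
    As a rough illustration of level-by-level requirements derivation (not the chapter's actual metamodel), the sketch below traces a hypothetical software requirement back through an information-system requirement to the originating business requirement.

    ```python
    # Illustrative traceability sketch: business requirements refined level by
    # level into software requirements. Levels and example items are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        rid: str
        text: str
        level: str                      # "business", "information_system", "software"
        derived_from: list = field(default_factory=list)  # parent requirement ids

    reqs = [
        Requirement("BR-1", "Reduce order processing time by 30%", "business"),
        Requirement("ISR-1", "Orders shall be captured electronically at point of sale",
                    "information_system", ["BR-1"]),
        Requirement("SR-1", "The order service shall validate an order in under 2 s",
                    "software", ["ISR-1"]),
    ]

    def trace_up(rid, reqs):
        """Return the chain of parent requirements up to the business level."""
        index = {r.rid: r for r in reqs}
        chain, current = [], index[rid]
        while current.derived_from:
            current = index[current.derived_from[0]]
            chain.append(current.rid)
        return chain

    print(trace_up("SR-1", reqs))  # ['ISR-1', 'BR-1']
    ```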

  12. A Framework for Performing V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  13. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.
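
    A hedged sketch of the stakeholder quantification idea is shown below. The attributes, weights, and cut-off are invented for illustration; the StakeMeter paper defines its own metrics, quantification criteria, and application procedure.

    ```python
    # Sketch of weighted stakeholder quantification and prioritization.
    # Attribute names, weights, ratings, and threshold are hypothetical.

    attribute_weights = {"interest": 0.3, "influence": 0.3,
                         "domain_knowledge": 0.2, "availability": 0.2}

    stakeholders = {
        "product_owner": {"interest": 0.9, "influence": 0.8, "domain_knowledge": 0.7, "availability": 0.6},
        "end_user_rep":  {"interest": 0.8, "influence": 0.4, "domain_knowledge": 0.9, "availability": 0.5},
        "regulator":     {"interest": 0.5, "influence": 0.9, "domain_knowledge": 0.6, "availability": 0.3},
    }

    def stake_score(ratings):
        """Weighted sum of the stakeholder's attribute ratings."""
        return sum(attribute_weights[a] * ratings[a] for a in attribute_weights)

    # Select the critical stakeholders whose score exceeds a chosen cut-off.
    critical = {s: round(stake_score(r), 2)
                for s, r in stakeholders.items() if stake_score(r) >= 0.6}
    print(critical)
    ```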

  14. Framework to Delay Corn Rootworm Resistance

    EPA Pesticide Factsheets

    This proposed framework is intended to delay the corn rootworm pest becoming resistant to corn genetically engineered to produce Bt proteins, which kill corn rootworms but do not affect people or wildlife. It includes requirements on Bt corn manufacturers.

  15. Integrating Innovation Skills in an Introductory Engineering Design-Build Course

    ERIC Educational Resources Information Center

    Liebenberg, Leon; Mathews, Edward Henry

    2012-01-01

    Modern engineering curricula have started to emphasize design, mostly in the form of design-build experiences. Apart from instilling important problem-solving skills, such pedagogical frameworks address the critical social skill aspects of engineering education due to their team-based, project-based nature. However, it is required of the…

  16. Mississippi Curriculum Framework for Diesel Equipment Repair & Service (Program CIP: 47.0605--Diesel Engine Mechanic & Repairer). Secondary Programs.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for diesel engine mechanics I and II. Presented first are a program…

  17. Committee on Earth Observation Satellites (CEOS) Systems Engineering Office (SEO). Ocean Surface Topography (OST) Workshop, Ruedesheim an Rhein, Germany. [CEOS SEO Status Report

    NASA Technical Reports Server (NTRS)

    Killough, Brian D., Jr.

    2008-01-01

    The CEOS Systems Engineering Office will present a 2007 status report on the CEOS constellation process, a new systems engineering framework, and analysis results from the GEO Societal Benefit Area (SBA) assessment and the OST constellation requirements assessment.

  18. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490

  19. Engineering evidence for carbon monoxide toxicity cases.

    PubMed

    Galatsis, Kosmas

    2016-07-01

    Unintentional carbon monoxide poisonings and fatalities lead to many toxicity cases. Given the unusual physical properties of carbon monoxide, namely that the gas is odorless and invisible, unorganized and erroneous methods of obtaining engineering evidence during the discovery process often occur. Such evidence gathering spans domains that include building construction, appliance installation, industrial hygiene, mechanical engineering, combustion and physics. In this paper, we attempt to provide a systematic framework that is relevant to key aspects of engineering evidence gathering for unintentional carbon monoxide poisoning cases. Such a framework aims to increase awareness of this process and the relevant issues, to help guide legal counsel and expert witnesses. © The Author(s) 2015.

  20. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.

    PubMed

    Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel

    2018-02-20

    Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
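
    To make the scheduling viewpoint concrete, the sketch below runs the classic Liu and Layland rate-monotonic utilization test on a hypothetical task set. This standard sufficient test is a stand-in for illustration only; it is not the novel schedulability analysis or the CPAL tooling described in the paper.

    ```python
    # Simplified schedulability check for periodic control tasks using the
    # Liu & Layland rate-monotonic utilization bound. Task parameters are
    # hypothetical and do not come from the paper's cruise-control case study.

    tasks = [  # (worst-case execution time in ms, period in ms)
        (2.0, 10.0),   # e.g. control law
        (5.0, 40.0),   # e.g. logging / monitoring
        (1.0, 5.0),    # e.g. sensor acquisition
    ]

    utilization = sum(c / t for c, t in tasks)
    n = len(tasks)
    bound = n * (2 ** (1 / n) - 1)  # sufficient bound for rate-monotonic scheduling

    print(f"U = {utilization:.3f}, bound = {bound:.3f}")
    print("schedulable (sufficient test)" if utilization <= bound
          else "bound inconclusive; run an exact analysis")
    ```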

  1. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints

    PubMed Central

    Navet, Nicolas; Havet, Lionel

    2018-01-01

    Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489

  2. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    NASA Astrophysics Data System (ADS)

    Sikkandar Basha, Nazareen

    The design and the development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams and numerous levels of the organization and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interactions are required to elicit the requirements of the system within the organization. Since LSCES involves people and interactions between the teams and interdisciplinary departments, it should be socio-technical in nature. The elicitation of the requirements of most large-scale system projects are subjected to creep in time and cost due to the uncertainty and ambiguity of requirements during the design and development. In an organization structure, the cost and time overrun can occur at any level and iterate back and forth thus increasing the cost and time. To avoid such creep past researches have shown that rigorous approaches such as value based designing can be used to control it. But before the rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and the state of the system when the creep occurs. Sensemaking is used to understand the state of system when the creep occurs and provide a guidance to decision maker. This research proposes the use of the Cynefin framework, sensemaking framework which can be used in the design and development of LSCES. It can aide in understanding the system and decision making to minimize the value gap due to requirements creep by eliminating ambiguity which occurs during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These trials are continued for different requirements and at different sub-system level. The results obtained show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. The decision makers can use these findings and use rigorous approaches and improve the design of Large Scale Complex Engineered Systems.

  3. Preface to RIGiM 2009

    NASA Astrophysics Data System (ADS)

    Rolland, Colette; Yu, Eric; Salinesi, Camille; Castro, Jaelson

    The use of intentional concepts, the notion of "goal" in particular, has been prominent in recent approaches to requirements engineering (RE). Goal-oriented frameworks and methods for requirements engineering (GORE) have been keynote topics in requirements engineering, conceptual modelling, and more generally in software engineering. What are the conceptual modelling foundations in these approaches? RIGiM (Requirements Intentions and Goals in Conceptual Modelling) aims to provide a forum for discussing the interplay between requirements engineering and conceptual modelling, and in particular, to investigate how goal- and intention-driven approaches help in conceptualising purposeful systems. What are the fundamental objectives and premises of requirements engineering and conceptual modelling respectively, and how can they complement each other? What are the demands on conceptual modelling from the standpoint of requirements engineering? What conceptual modelling techniques can be further taken advantage of in requirements engineering? What are the upcoming modelling challenges and issues in GORE? What are the unresolved open questions? What lessons are there to be learnt from industrial experiences? What empirical data are there to support the cost-benefit analysis when adopting GORE methods? Are there application domains or types of project settings for which goals and intentional approaches are particularly suitable or not suitable? What degree of formalization and automation, or interactivity is feasible and appropriate for what types of participants during requirements engineering?

  4. AdaFF: Adaptive Failure-Handling Framework for Composite Web Services

    NASA Astrophysics Data System (ADS)

    Kim, Yuna; Lee, Wan Yeon; Kim, Kyong Hoon; Kim, Jong

    In this paper, we propose a novel Web service composition framework which dynamically accommodates various failure recovery requirements. In the proposed framework, called Adaptive Failure-handling Framework (AdaFF), failure-handling submodules are prepared during the design of a composite service, and some of them are systematically selected and automatically combined with the composite Web service at service instantiation, in accordance with the requirements of individual users. In contrast, existing frameworks cannot adapt failure-handling behaviors to users' requirements. AdaFF rapidly delivers a composite service supporting requirement-matched failure handling without manual development, and contributes to flexible composite Web service design in that service architects need not be concerned with failure handling or the variable requirements of users. For proof of concept, we implement a prototype system of AdaFF, which automatically generates a composite service instance in Web Services Business Process Execution Language (WS-BPEL) according to the user's requirements specified in XML format and executes the generated instance on the ActiveBPEL engine.
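
    The requirement-driven selection of failure-handling submodules can be pictured with the sketch below. The XML requirement format, handler names, and generated WS-BPEL fragments are hypothetical placeholders, not the actual AdaFF specification.

    ```python
    # Conceptual sketch of selecting failure-handling submodules from a user
    # requirement at service instantiation. Format and names are hypothetical.
    import xml.etree.ElementTree as ET

    user_requirement_xml = """
    <failureHandling>
      <strategy service="payment">retry</strategy>
      <strategy service="shipping">compensate</strategy>
    </failureHandling>
    """

    # Library of prepared failure-handling submodules (placeholder fragments).
    HANDLERS = {
        "retry":      lambda svc: f"<!-- WS-BPEL retry scope for {svc} -->",
        "compensate": lambda svc: f"<!-- WS-BPEL compensation handler for {svc} -->",
        "ignore":     lambda svc: f"<!-- no-op handler for {svc} -->",
    }

    def compose(requirement_xml):
        """Combine the selected handlers with the composite service at instantiation."""
        root = ET.fromstring(requirement_xml)
        return [HANDLERS[s.text](s.attrib["service"]) for s in root.findall("strategy")]

    print(compose(user_requirement_xml))
    ```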

  5. Toward Engineering Synthetic Microbial Metabolism

    PubMed Central

    McArthur, George H.; Fong, Stephen S.

    2010-01-01

    The generation of well-characterized parts and the formulation of biological design principles in synthetic biology are laying the foundation for more complex and advanced microbial metabolic engineering. Improvements in de novo DNA synthesis and codon-optimization alone are already contributing to the manufacturing of pathway enzymes with improved or novel function. Further development of analytical and computer-aided design tools should accelerate the forward engineering of precisely regulated synthetic pathways by providing a standard framework for the predictable design of biological systems from well-characterized parts. In this review we discuss the current state of synthetic biology within a four-stage framework (design, modeling, synthesis, analysis) and highlight areas requiring further advancement to facilitate true engineering of synthetic microbial metabolism. PMID:20037734

  6. A Measurement Framework for Team Level Assessment of Innovation Capability in Early Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik

    When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.

  7. Integration Framework of Process Planning based on Resource Independent Operation Summary to Support Collaborative Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo

    2004-06-01

    In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
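
    The sketch below illustrates, in a highly simplified form, how an AND/OR process graph can be linearized into a single operation sequence. It uses a greedy cheapest-alternative heuristic on an invented graph; the actual framework combines heuristics with linear programming over RIOS data.

    ```python
    # Toy sketch of linearizing an AND/OR process graph by greedily choosing
    # the cheapest OR alternative. Graph structure and costs are hypothetical.

    graph = {
        "make_part": ("AND", ["rough_cut", "finish"]),
        "rough_cut": ("OR",  ["mill", "turn"]),
        "finish":    ("OR",  ["grind", "polish"]),
        "mill": ("LEAF", 40), "turn": ("LEAF", 55),
        "grind": ("LEAF", 30), "polish": ("LEAF", 20),
    }

    def linearize(node):
        """Return (ordered operation list, total cost) for a node."""
        kind, body = graph[node]
        if kind == "LEAF":
            return [node], body
        if kind == "AND":                       # all children required, in sequence
            ops, cost = [], 0
            for child in body:
                c_ops, c_cost = linearize(child)
                ops += c_ops
                cost += c_cost
            return ops, cost
        # OR node: pick the cheapest alternative (greedy heuristic)
        return min((linearize(child) for child in body), key=lambda r: r[1])

    print(linearize("make_part"))  # (['mill', 'polish'], 60)
    ```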

  8. The Flow Engine Framework: A Cognitive Model of Optimal Human Experience

    PubMed Central

    Šimleša, Milija; Guegan, Jérôme; Blanchard, Edouard; Tarpin-Bernard, Franck; Buisine, Stéphanie

    2018-01-01

    Flow is a well-known concept in the fields of positive and applied psychology. Examination of a large body of flow literature suggests there is a need for a conceptual model rooted in a cognitive approach to explain how this psychological phenomenon works. In this paper, we propose the Flow Engine Framework, a theoretical model explaining dynamic interactions between rearranged flow components and fundamental cognitive processes. Using an IPO framework (Inputs – Processes – Outputs) including a feedback process, we organize flow characteristics into three logically related categories describing the process of flow: inputs (requirements for flow), mediating and moderating cognitive processes (attentional and motivational mechanisms), and outputs (subjective and objective outcomes). Comparing flow with an engine, inputs are depicted as the fuel, core processes as the cylinder strokes, and outputs as the power created to provide motion. PMID:29899807

  9. A Sociotechnical Framework for Governing Climate Engineering

    PubMed Central

    2015-01-01

    Proposed ways of governing climate engineering have most often been supported by narrowly framed and unreflexive appraisals and processes. This article explores the governance implications of a Deliberative Mapping project that, unlike other governance principles, has emerged from an extensive process of reflection and reflexivity. In turn, the project has made significant advances in addressing the current deficit of responsibly defined criteria for shaping governance propositions. Three such propositions argue that (1) reflexive foresight of the imagined futures in which climate engineering proposals might reside is required; (2) the performance and acceptance of climate engineering proposals should be decided in terms of robustness, not optimality; and (3) climate engineering proposals should be satisfactorily opened up before they can be considered legitimate objects of governance. Taken together, these propositions offer a sociotechnical framework not simply for governing climate engineering but for governing responses to climate change at large. PMID:26973363

  10. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering, civil engineering, and structural engineering as a profession within civil engineering have faced, and continue to face, an emerging need for "raising the bar" of preparedness of young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge should require young structural engineers to have at least a Masters-level education. This study focuses on Masters-level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of the young engineer's preparation, beyond the undergraduate program, for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer which identifies and recognizes the needs of the profession, along with the profession's expectations of how those needs can be met in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the first five years of experience after completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge which will be useful to several groups seeking to better ensure the preparedness of future young structural engineers at the Masters level.

  11. NASA's Systems Engineering Approaches for Addressing Public Health Surveillance Requirements

    NASA Technical Reports Server (NTRS)

    Vann, Timi

    2003-01-01

    NASA's systems engineering has its heritage in space mission analysis and design, including the end-to-end approach to managing every facet of the extreme engineering required for successful space missions. NASA sensor technology, understanding of remote sensing, and knowledge of Earth system science can be powerful new tools for improved disease surveillance and environmental public health tracking. NASA's systems engineering framework facilitates the match between partner needs and decision support requirements in the areas of 1) Science/Data; 2) Technology; 3) Integration. Partnerships between NASA and other Federal agencies are diagrammed in this viewgraph presentation. NASA's role in these partnerships is to provide systemic and sustainable solutions that contribute to the measurable enhancement of a partner agency's disease surveillance efforts.

  12. An MBSE Approach to Space Suit Development

    NASA Technical Reports Server (NTRS)

    Cordova, Lauren; Kovich, Christine; Sargusingh, Miriam

    2012-01-01

    The EVA/Space Suit Development Office (ESSD) Systems Engineering and Integration (SE&I) team has utilized MBSE in multiple programs. After developing operational and architectural models, the MBSE framework was expanded to link the requirements space to the system models through functional analysis and interfaces definitions. By documenting all the connections within the technical baseline, ESSD experienced significant efficiency improvements in analysis and identification of change impacts. One of the biggest challenges presented to the MBSE structure was a program transition and restructuring effort, which was completed successfully in 4 months culminating in the approval of a new EVA Technical Baseline. During this time three requirements sets spanning multiple DRMs were streamlined into one NASA-owned Systems Requirement Document (SRD) that successfully identified requirements relevant to the current hardware development effort while remaining extensible to support future hardware developments. A capability-based hierarchy was established to provide a more flexible framework for future space suit development that can support multiple programs with minimal rework of basic EVA/Space Suit requirements. This MBSE approach was most recently applied for generation of an EMU Demonstrator technical baseline being developed for an ISS DTO. The relatively quick turnaround of operational concepts, architecture definition, and requirements for this new suit development has allowed us to test and evolve the MBSE process and framework in an extremely different setting while still offering extensibility and traceability throughout ESSD projects. The ESSD MBSE framework continues to be evolved in order to support integration of all products associated with the SE&I engine.

  13. A comprehensive framework for evaluating the environmental health and safety implications of engineered nanomaterials

    EPA Science Inventory

    Engineered nanomaterials (ENM) are a growing aspect of the global economy, and their safe and sustainable development, use and eventual disposal requires the capability to forecast and avoid potential problems. This review is concerned with the releases of ENM into the environmen...

  14. Industrial training approach using GPM P5 Standard for Sustainability in Project Management: a framework for sustainability competencies in the 21st century

    NASA Astrophysics Data System (ADS)

    Johan, Kartina; Mohd Turan, Faiz

    2016-11-01

    Malaysian engineering accreditation requirements (Engineering Programme Accreditation Manual, 2007) require all bachelor degree engineering programmes to incorporate a minimum of two months of industrial training in order for the programme to be accredited by the council. The industrial training has the objective of giving students insight into being an engineer at the workplace, hence increasing their knowledge of employability skills prior to graduation. However, the current structure of industrial training is not able to inculcate good leadership ability and prepare students with the sustainability competencies needed in the era of Sustainable Development (SD). This paper studies project management methodology as a framework for creating a training pathway within industrial training for students in engineering programmes, using the Green Project Management (GPM) P5 standard for sustainability in project management. The framework involves students as interns, supervisors from both university and industry, and participation from a Non-Profit Organisation (NPO). The framework focuses on the development of the student's competency in employability skills, lean leadership, and sustainability competencies using an experiential learning approach. Deliverables of the framework include an internship report, a professional sustainability report using the GPM P5 standard, and a competency assessment. The post-industrial phase of the framework is constructed so that students are assessed collaboratively by the university, the industry, and sustainability practitioners in the country. The ability of the interns to act as change agents in sustainability practices is measured by the competency assessment and the quality of the sustainability report. The framework supports the call for developing holistic students in the Malaysian Education Blueprint (Higher Education) 2015-2025 and addresses the gap between the status of engineering qualifications and the sustainability competencies needed in the 21st century, in particular by achieving the Sustainability Graduate (SG) attributes outlined in the framework.

  15. A Theory of Information Quality and a Framework for its Implementation in the Requirements Engineering Process

    NASA Astrophysics Data System (ADS)

    Grenn, Michael W.

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. H_R may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on H_R, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in H_R and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of ΔE.
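
    The following numerical sketch conveys the flavor of the requirements entropy idea: count the arrangements P of R requirements over N quality levels and take the entropy as ln P, which falls to zero when only the desired level remains. The Bose-Einstein-style counting formula and the example numbers are assumptions for illustration, not the dissertation's exact derivation.

    ```python
    # Hedged numerical sketch of requirements entropy. The counting formula
    # and example values are assumptions, not the REF's exact equations.
    from math import comb, log

    def arrangements(R, N):
        # Bose-Einstein style count of ways to distribute R indistinguishable
        # requirements among N quality levels (assumption).
        return comb(R + N - 1, R)

    def requirements_entropy(R, N):
        return log(arrangements(R, N))  # Boltzmann-style H = ln P

    R = 120                     # total requirements (hypothetical)
    for N in (6, 4, 2, 1):      # fewer accessible levels as quality converges
        print(N, round(requirements_entropy(R, N), 2))
    # Entropy decreases toward 0 as all requirements reach the single desired level.
    ```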

  16. Design Standards for Engineered Tissues

    PubMed Central

    Nawroth, Janna C.; Parker, Kevin Kit

    2013-01-01

    Traditional technologies are required to meet specific, quantitative standards of safety and performance. In tissue engineering, similar standards will have to be developed to enable routine clinical use and customized tissue fabrication. In this essay, we discuss a framework of concepts leading towards general design standards for tissue-engineering, focusing in particular on systematic design strategies, control of cell behavior, physiological scaling, fabrication modes and functional evaluation. PMID:23267860

  17. Design of the Curriculum for a Second-Cycle Course in Civil Engineering in the Context of the Bologna Framework

    ERIC Educational Resources Information Center

    Gavin, K. G.

    2010-01-01

    This paper describes the design of the curriculum for a Master of Engineering programme in civil engineering at University College Dublin. The revised programme was established to meet the requirements of the Bologna process and this paper specifically considers the design of a new, second-cycle master's component of the programme. In addition to…

  18. How can systems engineering inform the methods of programme evaluation in health professions education?

    PubMed

    Rojas, David; Grierson, Lawrence; Mylopoulos, Maria; Trbovich, Patricia; Bagli, Darius; Brydges, Ryan

    2018-04-01

    We evaluate programmes in health professions education (HPE) to determine their effectiveness and value. Programme evaluation has evolved from use of reductionist frameworks to those addressing the complex interactions between programme factors. Researchers in HPE have recently suggested a 'holistic programme evaluation' aiming to better describe and understand the implications of 'emergent processes and outcomes'. We propose a programme evaluation framework informed by principles and tools from systems engineering. Systems engineers conceptualise complexity and emergent elements in unique ways that may complement and extend contemporary programme evaluations in HPE. We demonstrate how the abstract decomposition space (ADS), an engineering knowledge elicitation tool, provides the foundation for a systems engineering informed programme evaluation designed to capture both planned and emergent programme elements. We translate the ADS tool to use education-oriented language, and describe how evaluators can use it to create a programme-specific ADS through iterative refinement. We provide a conceptualisation of emergent elements and an equation that evaluators can use to identify the emergent elements in their programme. Using our framework, evaluators can analyse programmes not as isolated units with planned processes and planned outcomes, but as unfolding, complex interactive systems that will exhibit emergent processes and emergent outcomes. Subsequent analysis of these emergent elements will inform the evaluator as they seek to optimise and improve the programme. Our proposed systems engineering informed programme evaluation framework provides principles and tools for analysing the implications of planned and emergent elements, as well as their potential interactions. We acknowledge that our framework is preliminary and will require application and constant refinement. We suggest that our framework will also advance our understanding of the construct of 'emergence' in HPE research. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  19. Jupiter Europa Orbiter Architecture Definition Process

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Shishko, Robert

    2011-01-01

    The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.

  20. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
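
    The pattern Pycycle relies on can be shown with a small OpenMDAO example (assuming OpenMDAO 3.x): a component supplies analytic partial derivatives via compute_partials, so a gradient-based driver never resorts to finite differencing. The paraboloid below is a stand-in for an engine cycle model, not Pycycle code.

    ```python
    # Gradient-based optimization in OpenMDAO with analytic partials.
    import openmdao.api as om

    class Paraboloid(om.ExplicitComponent):
        def setup(self):
            self.add_input("x", val=0.0)
            self.add_input("y", val=0.0)
            self.add_output("f", val=0.0)

        def setup_partials(self):
            self.declare_partials("f", ["x", "y"])  # analytic, not finite-differenced

        def compute(self, inputs, outputs):
            x, y = inputs["x"], inputs["y"]
            outputs["f"] = (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

        def compute_partials(self, inputs, partials):
            x, y = inputs["x"], inputs["y"]
            partials["f", "x"] = 2.0 * (x - 3.0) + y
            partials["f", "y"] = x + 2.0 * (y + 4.0)

    prob = om.Problem()
    prob.model.add_subsystem("parab", Paraboloid(), promotes=["*"])
    prob.model.add_design_var("x", lower=-50.0, upper=50.0)
    prob.model.add_design_var("y", lower=-50.0, upper=50.0)
    prob.model.add_objective("f")

    prob.driver = om.ScipyOptimizeDriver()
    prob.driver.options["optimizer"] = "SLSQP"

    prob.setup()
    prob.run_driver()
    print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))
    ```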

  1. Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization

    PubMed Central

    Marai, G. Elisabeta

    2018-01-01

    Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process with the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550

  2. A software framework for developing measurement applications under variable requirements.

    PubMed

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under rapidly and widely varying requirements is proposed. The framework allows software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and better focused by relieving the test engineer of low-level development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework in a magnet-testing measurement scenario at the European Organization for Nuclear Research (CERN) are reported.
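
    As a rough illustration of the kind of configuration-driven structure such a framework might use (not the authors' actual design), the sketch below assembles a measurement application from a list of registered step names; the step classes and their behavior are hypothetical stubs.

```python
from abc import ABC, abstractmethod

class MeasurementStep(ABC):
    """Base class: each test step declares a name and an execute() method."""
    name = "step"

    @abstractmethod
    def execute(self, context: dict) -> None: ...

class PowerSupplyRamp(MeasurementStep):
    name = "ramp"
    def execute(self, context):
        context["current_A"] = context.get("target_A", 100.0)   # stub for hardware I/O

class FieldAcquisition(MeasurementStep):
    name = "acquire"
    def execute(self, context):
        context["field_T"] = 0.1 * context["current_A"]          # stub measurement

REGISTRY = {cls.name: cls for cls in (PowerSupplyRamp, FieldAcquisition)}

def run_application(step_names, context=None):
    """Assemble a measurement application from a configuration list of step names."""
    context = dict(context or {})
    for name in step_names:
        REGISTRY[name]().execute(context)
    return context

print(run_application(["ramp", "acquire"], {"target_A": 250.0}))
```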

  3. Engineering Values into Genetic Engineering: A Proposed Analytic Framework for Scientific Social Responsibility

    PubMed Central

    Cho, Mildred K.

    2016-01-01

    Recent experiments have been used to “edit” genomes of various plant, animal and other species, including humans, with unprecedented precision. Furthermore, inserting the Cas9 endonuclease gene, together with a gene encoding the desired guide RNA, into an organism adjacent to an altered gene could create a “gene drive” that could spread a trait through an entire population of organisms. These experiments represent advances along a spectrum of technological abilities that genetic engineers have been working on since the advent of recombinant DNA techniques. The scientific and bioethics communities have built substantial literatures about the ethical and policy implications of genetic engineering, especially in the age of bioterrorism. However, recent CRISPR/Cas experiments have triggered a rehashing of previous policy discussions, suggesting that the scientific community requires guidance on how to think about social responsibility. We propose a framework to enable analysis of social responsibility, using two examples of genetic engineering experiments. PMID:26632356

  4. Engineering Values Into Genetic Engineering: A Proposed Analytic Framework for Scientific Social Responsibility.

    PubMed

    Sankar, Pamela L; Cho, Mildred K

    2015-01-01

    Recent experiments have been used to "edit" genomes of various plant, animal and other species, including humans, with unprecedented precision. Furthermore, inserting the Cas9 endonuclease gene, together with a gene encoding the desired guide RNA, into an organism adjacent to an altered gene could create a "gene drive" that could spread a trait through an entire population of organisms. These experiments represent advances along a spectrum of technological abilities that genetic engineers have been working on since the advent of recombinant DNA techniques. The scientific and bioethics communities have built substantial literatures about the ethical and policy implications of genetic engineering, especially in the age of bioterrorism. However, recent CRISPR/Cas experiments have triggered a rehashing of previous policy discussions, suggesting that the scientific community requires guidance on how to think about social responsibility. We propose a framework to enable analysis of social responsibility, using two examples of genetic engineering experiments.

  5. A Theory of Information Quality and a Framework for Its Implementation in the Requirements Engineering Process

    ERIC Educational Resources Information Center

    Grenn, Michael W.

    2013-01-01

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of…

  6. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  7. A Decision Fusion Framework for Treatment Recommendation Systems.

    PubMed

    Mei, Jing; Liu, Haifeng; Li, Xiang; Xie, Guotong; Yu, Yiqin

    2015-01-01

    Treatment recommendation is a nontrivial task--it requires not only domain knowledge from evidence-based medicine, but also data insights from descriptive, predictive and prescriptive analysis. A single treatment recommendation system is usually trained or modeled with a source of limited size or quality. This paper proposes a decision fusion framework, combining both knowledge-driven and data-driven decision engines for treatment recommendation. End users (e.g. using the clinician workstation or mobile apps) could have a comprehensive view of the various engines' opinions, as well as the final decision after fusion. For implementation, we leverage several well-known fusion algorithms, such as decision templates and meta-classifiers (e.g. logistic regression and SVM). Using an outcome-driven evaluation metric, we compare the fusion engine with the base engines, and our experimental results show that decision fusion is a promising way towards more valuable treatment recommendations.
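
    A minimal sketch of meta-classifier fusion in the spirit described here, using scikit-learn's StackingClassifier with a logistic-regression combiner over two base learners standing in for the knowledge-driven and data-driven engines. The synthetic data and model choices are illustrative assumptions, not the authors' system or dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for patient features and treatment outcomes
X, y = make_classification(n_samples=500, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two base "engines": one stands in for the knowledge-driven recommender,
# the other for the data-driven recommender.
engines = [("svm", SVC(probability=True, random_state=0)),
           ("forest", RandomForestClassifier(random_state=0))]

# Meta-classifier fusion: a logistic-regression combiner learns how much
# to trust each engine's prediction.
fusion = StackingClassifier(estimators=engines, final_estimator=LogisticRegression())
fusion.fit(X_tr, y_tr)
print("fused accuracy:", fusion.score(X_te, y_te))
```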

  8. Neutron imaging data processing using the Mantid framework

    NASA Astrophysics Data System (ADS)

    Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried

    2016-09-01

    Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.

  9. A Human Factors Framework for Payload Display Design

    NASA Technical Reports Server (NTRS)

    Dunn, Mariea C.; Hutchinson, Sonya L.

    1998-01-01

    During missions to space, one charge of the astronaut crew is to conduct research experiments. These experiments, referred to as payloads, typically are controlled by computers. Crewmembers interact with payload computers by using visual interfaces or displays. To enhance the safety, productivity, and efficiency of crewmember interaction with payload displays, particular attention must be paid to the usability of these displays. Enhancing display usability requires adoption of a design process that incorporates human factors engineering principles at each stage. This paper presents a proposed framework for incorporating human factors engineering principles into the payload display design process.

  10. WFIRST: Coronagraph Systems Engineering and Performance Budgets

    NASA Astrophysics Data System (ADS)

    Poberezhskiy, Ilya; cady, eric; Frerking, Margaret A.; Kern, Brian; Nemati, Bijan; Noecker, Martin; Seo, Byoung-Joon; Zhao, Feng; Zhou, Hanying

    2018-01-01

    The WFIRST coronagraph instrument (CGI) will be the first in-space coronagraph using active wavefront control to directly image and characterize mature exoplanets and zodiacal disks in reflected starlight. For CGI systems engineering, including requirements development, CGI performance is predicted using a hierarchy of performance budgets to estimate various noise components — spatial and temporal flux variations — that obscure exoplanet signals in direct imaging and spectroscopy configurations. These performance budgets are validated through robust integrated modeling and testbed model validation efforts. We present the performance budgeting framework used by WFIRST for the flow-down of coronagraph science requirements, mission constraints, and observatory interfaces to measurable instrument engineering parameters.

  11. Re-engineering the Federal planning process: A total Federal planning strategy, integrating NEPA with modern management tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eccleston, C.H.

    1997-09-05

    The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications for the way federal planning is approached.

  12. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed in different administrative domains. However, efficient job submission and management continue to be far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows an easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision or remote resource failure. The report also includes experimental results of the behavior of our framework on the TRGP testbed.

  13. Systematic Curriculum Integration of Sustainable Development Using Life Cycle Approaches: The Case of the Civil Engineering Department at the Université de Sherbrooke

    ERIC Educational Resources Information Center

    Roure, Bastien; Anand, Chirjiv; Bisaillon, Véronique; Amor, Ben

    2018-01-01

    Purpose: The purpose of this paper is to provide a consistent and systematic integration framework of sustainable development (SD) in a civil engineering (CE) curriculum, given the connection between the two. Curriculum integration is a challenging project and requires the development of certain protocols to ensure success.…

  14. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.

  15. Analysis of airframe/engine interactions in integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Schmidt, David K.

    1991-01-01

    An analysis framework for the assessment of dynamic cross-coupling between airframe and engine systems from the perspective of integrated flight/propulsion control is presented. This analysis involves determining the significance of the interactions with respect to deterioration in stability robustness and performance, as well as critical frequency ranges where problems may occur due to these interactions. The analysis illustrated here investigates both the airframe's effects on the engine control loops and the engine's effects on the airframe control loops in two case studies. The second case study involves a multi-input/multi-output analysis of the airframe. Sensitivity studies are performed on critical interactions to examine the degradations in the system's stability robustness and performance. Magnitudes of the interactions required to cause instabilities, as well as the frequencies at which the instabilities occur, are recorded. Finally, the analysis framework is expanded to include control laws which contain cross-feeds between the airframe and engine systems.

  16. A concept ideation framework for medical device design.

    PubMed

    Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar

    2015-06-01

    Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.
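
    A toy sketch of the function-based semantic search idea, assuming an RDF triple store queried with SPARQL via rdflib; the tiny ontology, namespace, and property names below are hypothetical and are not the CIFMeDD vocabulary.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/device-design#")   # hypothetical namespace
g = Graph()

# A medical tool and an unrelated patented invention share an engineering function.
g.add((EX.Forceps, RDF.type, EX.MedicalDevice))
g.add((EX.Forceps, EX.performsFunction, EX.GraspSolid))
g.add((EX.RoboticGripperPatent, RDF.type, EX.Invention))
g.add((EX.RoboticGripperPatent, EX.performsFunction, EX.GraspSolid))

# Function-based search: everything, from any domain, that performs "grasp solid".
query = """
PREFIX ex: <http://example.org/device-design#>
SELECT ?thing WHERE { ?thing ex:performsFunction ex:GraspSolid . }
"""
for row in g.query(query):
    print(row.thing)
```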

  17. A hierarchical-multiobjective framework for risk management

    NASA Technical Reports Server (NTRS)

    Haimes, Yacov Y.; Li, Duan

    1991-01-01

    A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. United into the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.

  18. [Innovation guidelines and strategies for pharmaceutical engineering of Chinese medicine and their industrial translation].

    PubMed

    Cheng, Yi-Yu; Qu, Hai-Bin; Zhang, Bo-Li

    2013-01-01

    This paper briefly analyzes the bottlenecks and major technical requirements for pharmaceutical industry of Chinese medicine, providing current status of pharmaceutical engineering of Chinese medicine. The innovation directions and strategies of the pharmaceutical engineering for manufacturing Chinese medicine are proposed along with the framework of their core technology. As a consequence, the development of the third-generation pharmaceutical technology for Chinese medicine, featured as "precision, digital and intelligent", is recommended. The prospects of the pharmaceutical technology are also forecasted.

  19. System Engineering Concept Demonstration, Effort Summary. Volume 1

    DTIC Science & Technology

    1992-12-01

    involve only the system software, user frameworks, and user tools (the Catalyst user tools and external computer system frameworks) ... analysis, synthesis, optimization, and conceptual design of Catalyst. The paper discusses the definition, design, test, and evaluation, and the operational concept ... This approach will allow system engineering practitioners to recognize and tailor the model. The conceptual requirements for the Process Model ...

  20. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  1. A Systems Engineering Approach to Architecture Development

    NASA Technical Reports Server (NTRS)

    Di Pietro, David A.

    2014-01-01

    Architecture development is conducted prior to system concept design when there is a need to determine the best-value mix of systems that works collectively in specific scenarios and time frames to accomplish a set of mission area objectives. While multiple architecture frameworks exist, they often require use of unique taxonomies and data structures. In contrast, this presentation characterizes architecture development using terminology widely understood within the systems engineering community. Using a notional civil space architecture example, it employs a multi-tier framework to describe the enterprise level architecture and illustrates how results of lower tier, mission area architectures integrate into the enterprise architecture. It also presents practices for conducting effective mission area architecture studies, including establishing the trade space, developing functions and metrics, evaluating the ability of potential design solutions to meet the required functions, and expediting study execution through the use of iterative design cycles.

  2. A Systems Engineering Approach to Architecture Development

    NASA Technical Reports Server (NTRS)

    Di Pietro, David A.

    2015-01-01

    Architecture development is often conducted prior to system concept design when there is a need to determine the best-value mix of systems that works collectively in specific scenarios and time frames to accomplish a set of mission area objectives. While multiple architecture frameworks exist, they often require use of unique taxonomies and data structures. In contrast, this paper characterizes architecture development using terminology widely understood within the systems engineering community. Using a notional civil space architecture example, it employs a multi-tier framework to describe the enterprise level architecture and illustrates how results of lower tier, mission area architectures integrate into the enterprise architecture. It also presents practices for conducting effective mission area architecture studies, including establishing the trade space, developing functions and metrics, evaluating the ability of potential design solutions to meet the required functions, and expediting study execution through the use of iterative design cycles.

  4. Enterprise and system of systems capability development life-cycle processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, David Franklin

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  5. Quantum state engineering using one-dimensional discrete-time quantum walks

    NASA Astrophysics Data System (ADS)

    Innocenti, Luca; Majury, Helena; Giordani, Taira; Spagnolo, Nicolò; Sciarrino, Fabio; Paternostro, Mauro; Ferraro, Alessandro

    2017-12-01

    Quantum state preparation in high-dimensional systems is an essential requirement for many quantum-technology applications. The engineering of an arbitrary quantum state is, however, typically strongly dependent on the experimental platform chosen for implementation, and a general framework is still missing. Here we show that coined quantum walks on a line, which represent a framework general enough to encompass a variety of different platforms, can be used for quantum state engineering of arbitrary superpositions of the walker's sites. We achieve this goal by identifying a set of conditions that fully characterize the reachable states in the space comprising walker and coin and providing a method to efficiently compute the corresponding set of coin parameters. We assess the feasibility of our proposal by identifying a linear optics experiment based on photonic orbital angular momentum technology.
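
    A minimal numerical sketch of the underlying dynamics, assuming a Hadamard-coined discrete-time quantum walk on a line with a localized initial state. The paper's state-engineering scheme instead chooses a different coin operation at each step to steer the walker toward a target superposition, which this fragment does not attempt.

```python
import numpy as np

N = 101                      # number of lattice sites (odd, walker starts at the centre)
steps = 30
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

# State indexed as psi[position, coin]; start at the centre with coin (|0> + i|1>)/sqrt(2)
psi = np.zeros((N, 2), dtype=complex)
psi[N // 2] = np.array([1.0, 1.0j]) / np.sqrt(2)

for _ in range(steps):
    psi = psi @ H.T                    # coin operation applied at every site
    shifted = np.zeros_like(psi)
    shifted[:-1, 0] = psi[1:, 0]       # coin |0> component moves left
    shifted[1:, 1] = psi[:-1, 1]       # coin |1> component moves right
    psi = shifted

prob = np.sum(np.abs(psi) ** 2, axis=1)
print(int(prob.argmax()), float(prob.sum()))   # most likely site; total probability ~ 1
```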

  6. Requirements Development for the NASA Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.

    2003-01-01

    The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, a configuration management tool, and an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation and system-level design, development, implementation and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.
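
    As a hedged illustration of the kind of relational requirements store with traceability the abstract mentions (not the AEE project's actual schema), a minimal sqlite3 sketch might look like this; the table, column, and requirement identifiers are hypothetical.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE requirement (
    req_id       TEXT PRIMARY KEY,
    text         TEXT NOT NULL,
    parent_id    TEXT REFERENCES requirement(req_id),  -- trace to governing requirement
    verif_method TEXT CHECK (verif_method IN ('test','analysis','inspection','demonstration'))
);
""")
con.executemany(
    "INSERT INTO requirement VALUES (?, ?, ?, ?)",
    [("PRG-001", "Support independent analysis of space transportation architectures", None, "demonstration"),
     ("SYS-010", "Provide distributed, collaborative analysis sessions", "PRG-001", "test"),
     ("SYS-011", "Trace every system requirement to a governing Program requirement", "PRG-001", "inspection")],
)

# Traceability report: each system requirement with its governing parent and verification method.
for row in con.execute("""
    SELECT child.req_id, child.verif_method, parent.req_id
    FROM requirement AS child
    LEFT JOIN requirement AS parent ON child.parent_id = parent.req_id
"""):
    print(row)
```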

  7. Utilizing the National Research Council's (NRC) Conceptual Framework for the Next Generation Science Standards (NGSS): A Self-Study in My Science, Engineering, and Mathematics Classroom

    NASA Astrophysics Data System (ADS)

    Corvo, Arthur Francis

    Given the reality that active and competitive participation in the 21st century requires American students to deepen their scientific and mathematical knowledge base, the National Research Council (NRC) proposed a new conceptual framework for K--12 science education. The framework consists of an integration of what the NRC report refers to as the three dimensions: scientific and engineering practices, crosscutting concepts, and core ideas in four disciplinary areas (physical, life and earth/space sciences, and engineering/technology). The Next Generation Science Standards (NGSS), which are derived from this new framework, were released in April 2013 and have implications for teacher learning and development in Science, Technology, Engineering, and Mathematics (STEM). Given the NGSS's recent introduction, there is little research on how teachers can prepare for its release. To meet this research need, I implemented a self-study aimed at examining my teaching practices and classroom outcomes through the lens of the NRC's conceptual framework and the NGSS. The self-study employed design-based research (DBR) methods to investigate what happened in my secondary classroom when I designed, enacted, and reflected on units of study for my science, engineering, and mathematics classes. I utilized various best practices including Learning for Use (LfU) and Understanding by Design (UbD) models for instructional design, talk moves as a tool for promoting discourse, and modeling instruction for these designed units of study. The DBR strategy was chosen to promote reflective cycles, which are consistent with and in support of the self-study framework. A multiple case, mixed-methods approach was used for data collection and analysis. The findings in the study are reported by study phase in terms of unit planning, unit enactment, and unit reflection. The findings have implications for science teaching, teacher professional development, and teacher education.

  8. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  9. A Framework for Automating Cost Estimates in Assembly Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calton, T.L.; Peters, R.R.

    1998-12-09

    When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success; lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.

  10. Abstracted Workflow Framework with a Structure from Motion Application

    NASA Astrophysics Data System (ADS)

    Rossi, Adam J.

    In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural mindset initially is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of the cases, it turns out the software persists for many years, and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiencies and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features from the images are extracted and correspondences are established. This provides a sufficient amount of information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are established via triangulation algorithms. The parameters for the camera models for all views / images are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense point cloud through patch based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques. In some cases the point cloud is "draped" with original imagery in order to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easily leverageable and extensible by multiple users. Like many processes in scientific and engineering domains, the workflow described for SfM is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages / environments and without knowledge of how the component fits into the larger system. In practice, this generally leads to issues interfacing the components, building the software for desired platforms, understanding its concept of operations, and how it can be manipulated in order to fit the desired function for a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and establish new functionality based on current research. This requires a framework whereby new components can be easily plugged in without affecting the current implemented functionality. The need for a universal programming environment establishes the motivation for the development of the abstracted workflow framework. 
    This software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. The derivation mandates that requirements be satisfied in order to provide a complete implementation. Additionally, the developer must provide documentation of the component in terms of its overall function and inputs. The interface input and output values corresponding to the component must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send the values. This process requires the developer to componentize their algorithm rather than implement it monolithically. Although the requirements placed on the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead and result in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation. The benefits are also illustrated using a detailed examination of the SfM process as an example application.
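
    A generic two-view sketch of the SfM steps summarized above (feature extraction, correspondence, relative pose from the essential matrix, and triangulation of a sparse cloud), written directly against OpenCV rather than Catena; the image file names and intrinsic matrix K are placeholder assumptions, and bundle adjustment and densification are omitted.

```python
import cv2
import numpy as np

# Hypothetical inputs: two overlapping views and an assumed intrinsic matrix K.
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

# 1) Feature extraction and correspondence
orb = cv2.ORB_create(4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float64([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float64([kp2[m.trainIdx].pt for m in matches])

# 2) Relative pose from the essential matrix (RANSAC rejects outlier matches)
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

# 3) Triangulate a sparse point cloud (homogeneous -> Euclidean)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
X = (X_h[:3] / X_h[3]).T
print(X.shape)   # N x 3 sparse structure; bundle adjustment / densification would follow
```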

  11. Sustainable water management under future uncertainty with eco-engineering decision scaling

    NASA Astrophysics Data System (ADS)

    Poff, N. Leroy; Brown, Casey M.; Grantham, Theodore E.; Matthews, John H.; Palmer, Margaret A.; Spence, Caitlin M.; Wilby, Robert L.; Haasnoot, Marjolijn; Mendoza, Guillermo F.; Dominique, Kathleen C.; Baeza, Andres

    2016-01-01

    Managing freshwater resources sustainably under future climatic and hydrological uncertainty poses novel challenges. Rehabilitation of ageing infrastructure and construction of new dams are widely viewed as solutions to diminish climate risk, but attaining the broad goal of freshwater sustainability will require expansion of the prevailing water resources management paradigm beyond narrow economic criteria to include socially valued ecosystem functions and services. We introduce a new decision framework, eco-engineering decision scaling (EEDS), that explicitly and quantitatively explores trade-offs in stakeholder-defined engineering and ecological performance metrics across a range of possible management actions under unknown future hydrological and climate states. We illustrate its potential application through a hypothetical case study of the Iowa River, USA. EEDS holds promise as a powerful framework for operationalizing freshwater sustainability under future hydrological uncertainty by fostering collaboration across historically conflicting perspectives of water resource engineering and river conservation ecology to design and operate water infrastructure for social and environmental benefits.

  12. A reliability as an independent variable (RAIV) methodology for optimizing test planning for liquid rocket engines

    NASA Astrophysics Data System (ADS)

    Strunz, Richard; Herrmann, Jeffrey W.

    2011-12-01

    The hot fire test strategy for liquid rocket engines has always been a concern of space industry and agencies alike because no recognized standard exists. Previous hot fire test plans focused on the verification of performance requirements but did not explicitly include reliability as a dimensioning variable. The stakeholders are, however, concerned about a hot fire test strategy that balances reliability, schedule, and affordability. A multiple-criteria test planning model is presented that provides a framework to optimize the hot fire test strategy with respect to stakeholder concerns. The Staged Combustion Rocket Engine Demonstrator, a program of the European Space Agency, is used as an example to provide a quantitative answer to the claim that a reduced-thrust-scale demonstrator is cost-beneficial for a subsequent flight engine development. Scalability aspects of major subsystems are considered in the prior information definition inside the Bayesian framework. The model is also applied to assess the impact of an increase of the demonstrated reliability level on schedule and affordability.
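
    One Bayesian ingredient of such a test-planning model can be sketched with a Beta-Binomial update: heritage or subscale-demonstrator evidence enters through the prior, and the hot-fire campaign outcome yields a lower credible bound on demonstrated reliability. The prior parameters and test counts below are illustrative assumptions, not values from the demonstrator programme or the authors' full multiple-criteria model.

```python
from scipy.stats import beta

def demonstrated_reliability(successes, failures, confidence=0.90, a0=1.0, b0=1.0):
    """Lower one-sided credible bound on engine reliability after a hot-fire campaign.

    The prior Beta(a0, b0) can encode scaled-demonstrator or heritage information;
    the posterior after the campaign is Beta(a0 + successes, b0 + failures).
    """
    post = beta(a0 + successes, b0 + failures)
    return post.ppf(1.0 - confidence)

# Example: 60 successful hot-fire tests, 1 failure, weak (uniform) prior
print(demonstrated_reliability(60, 1))          # reliability demonstrated at 90% confidence
print(demonstrated_reliability(60, 1, 0.50))    # median posterior reliability
```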

  13. Sustainable water management under future uncertainty with eco-engineering decision scaling

    USGS Publications Warehouse

    Poff, N LeRoy; Brown, Casey M; Grantham, Theodore E.; Matthews, John H; Palmer, Margaret A.; Spence, Caitlin M; Wilby, Robert L.; Haasnoot, Marjolijn; Mendoza, Guillermo F; Dominique, Kathleen C; Baeza, Andres

    2015-01-01

    Managing freshwater resources sustainably under future climatic and hydrological uncertainty poses novel challenges. Rehabilitation of ageing infrastructure and construction of new dams are widely viewed as solutions to diminish climate risk, but attaining the broad goal of freshwater sustainability will require expansion of the prevailing water resources management paradigm beyond narrow economic criteria to include socially valued ecosystem functions and services. We introduce a new decision framework, eco-engineering decision scaling (EEDS), that explicitly and quantitatively explores trade-offs in stakeholder-defined engineering and ecological performance metrics across a range of possible management actions under unknown future hydrological and climate states. We illustrate its potential application through a hypothetical case study of the Iowa River, USA. EEDS holds promise as a powerful framework for operationalizing freshwater sustainability under future hydrological uncertainty by fostering collaboration across historically conflicting perspectives of water resource engineering and river conservation ecology to design and operate water infrastructure for social and environmental benefits.

  14. Development of a robust framework for controlling high performance turbofan engines

    NASA Astrophysics Data System (ADS)

    Miklosovic, Robert

    This research involves the development of a robust framework for controlling complex and uncertain multivariable systems. Where mathematical modeling is often tedious or inaccurate, the new method uses an extended state observer (ESO) to estimate and cancel dynamic information in real time and dynamically decouple the system. As a result, controller design and tuning become transparent as the number of required model parameters is reduced. Much research has been devoted towards the application of modern multivariable control techniques on aircraft engines. However, few, if any, have been implemented on an operational aircraft, partially due to the difficulty in tuning the controller for satisfactory performance. The new technique is applied to a modern two-spool, high-pressure ratio, low-bypass turbofan with mixed-flow afterburning. A realistic Modular Aero-Propulsion System Simulation (MAPSS) package, developed by NASA, is used to demonstrate the new design process and compare its performance with that of a supplied nominal controller. This approach is expected to reduce gain scheduling over the full operating envelope of the engine and allow a controller to be tuned for engine-to-engine variations.
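
    A minimal first-order sketch of the extended state observer idea (in the style of active disturbance rejection control, not the actual MAPSS turbofan controller): the observer lumps unknown dynamics and disturbances into an extra state, which the control law cancels so that tuning reduces to a few gains. The plant, gains, and setpoint below are illustrative assumptions.

```python
import numpy as np

# First-order plant with unknown dynamics/disturbance f:  y_dot = f(y, t) + b0 * u
b0, dt, T = 2.0, 0.001, 5.0
l1, l2 = 200.0, 10000.0          # observer gains (both observer poles at -100)
kp = 20.0                        # proportional gain on the decoupled plant
r = 1.0                          # setpoint

y, z1, z2, u = 0.0, 0.0, 0.0, 0.0
for k in range(int(T / dt)):
    t = k * dt
    f = -3.0 * y + 0.5 * np.sin(2.0 * t)        # "unknown" internal dynamics + disturbance
    y += dt * (f + b0 * u)                      # plant (Euler step)

    e = y - z1                                  # extended state observer
    z1 += dt * (z2 + b0 * u + l1 * e)           # z1 tracks y
    z2 += dt * (l2 * e)                         # z2 tracks the total disturbance f

    u = (kp * (r - z1) - z2) / b0               # cancel estimated disturbance, then control

print(round(y, 3), round(z2, 3))                # y near the setpoint, z2 tracking f
```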

  15. Engineers' Responsibilities for Global Electronic Waste: Exploring Engineering Student Writing Through a Care Ethics Lens.

    PubMed

    Campbell, Ryan C; Wilson, Denise

    2017-04-01

    This paper provides an empirically informed perspective on the notion of responsibility using an ethical framework that has received little attention in the engineering-related literature to date: ethics of care. In this work, we ground conceptual explorations of engineering responsibility in empirical findings from engineering student's writing on the human health and environmental impacts of "backyard" electronic waste recycling/disposal. Our findings, from a purposefully diverse sample of engineering students in an introductory electrical engineering course, indicate that most of these engineers of tomorrow associated engineers with responsibility for the electronic waste (e-waste) problem in some way. However, a number of responses suggested attempts to deflect responsibility away from engineers towards, for example, the government or the companies for whom engineers work. Still other students associated both engineers and non-engineers with responsibility, demonstrating the distributed/collective nature of responsibility that will be required to achieve a solution to the global problem of excessive e-waste. Building upon one element of a framework for care ethics adopted from the wider literature, these empirical findings are used to facilitate a preliminary, conceptual exploration of care-ethical responsibility within the context of engineering and e-waste recycling/disposal. The objective of this exploration is to provide a first step toward understanding how care-ethical responsibility applies to engineering. We also hope to seed dialogue within the engineering community about its ethical responsibilities on the issue. We conclude the paper with a discussion of its implications for engineering education and engineering ethics that suggests changes for educational policy and the practice of engineering.

  16. Robust Decision-making Applied to Model Selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2012-08-06

    The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework anchored in info-gap decision theory is adopted. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.
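
    A toy numerical sketch of the info-gap robustness comparison described here: for each candidate model, find the largest fractional parameter deviation under which the worst-case prediction error still meets a tolerance. The models, data, and tolerance are synthetic placeholders, unrelated to the structural applications in the report.

```python
import numpy as np

def robustness(predict, theta0, x_obs, y_obs, tol, h_grid=np.linspace(0.0, 1.0, 2001)):
    """Info-gap robustness: largest fractional deviation h such that the worst-case
    prediction error over theta in [theta0*(1-h), theta0*(1+h)] stays within tol.
    (Predictions here are linear in theta, so the worst case sits at the endpoints.)"""
    h_hat = 0.0
    for h in h_grid:
        thetas = (theta0 * (1.0 - h), theta0 * (1.0 + h))
        worst = max(np.max(np.abs(predict(th, x_obs) - y_obs)) for th in thetas)
        if worst <= tol:
            h_hat = h
        else:
            break
    return h_hat

x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 0.05 * np.random.default_rng(0).normal(size=x.size)   # synthetic data

linear    = lambda th, x: th * x            # simple candidate model
quadratic = lambda th, x: th * x ** 2       # nominally "richer" but mismatched here

for name, model, th0 in [("linear", linear, 2.0), ("quadratic", quadratic, 2.6)]:
    print(name, robustness(model, th0, x, y, tol=0.5))
```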

  17. The International Safety Framework for nuclear power source applications in outer space-Useful and substantial guidance

    NASA Astrophysics Data System (ADS)

    Summerer, L.; Wilcox, R. E.; Bechtel, R.; Harbison, S.

    2015-06-01

    In 2009, the International Safety Framework for Nuclear Power Source Applications in Outer Space was adopted, following a multi-year process that involved all major spacefaring nations under the auspices of a partnership between the UN Committee on the Peaceful Uses of Outer Space and the International Atomic Energy Agency. The Safety Framework reflects an international consensus on best practices to achieve safety. Following the 1992 UN Principles Relevant to the Use of Nuclear Power Sources in Outer Space, it is the second attempt by the international community to draft guidance promoting the safety of applications of nuclear power sources in space missions. NPS applications in space have unique safety considerations compared with terrestrial applications. Mission launch and outer space operational requirements impose size, mass and other space environment limitations not present for many terrestrial nuclear facilities. Potential accident conditions could expose nuclear power sources to extreme physical conditions. The Safety Framework is structured to provide guidance for both the programmatic and technical aspects of safety. In addition to sections containing specific guidance for governments and for management, it contains technical guidance pertinent to the design, development and all mission phases of space NPS applications. All sections of the Safety Framework contain elements directly relevant to engineers and space mission designers for missions involving space nuclear power sources. The challenge for organisations and engineers involved in the design and development processes of space nuclear power sources and applications is to implement the guidance provided in the Safety Framework by integrating it into the existing standard space mission infrastructure of design, development and operational requirements, practices and processes. This adds complexity to the standard space mission and launch approval processes. The Safety Framework is deliberately generic to remain relatively independent of technological progress, of national organisational setups and of space mission types. Implementing its guidance therefore leaves room for interpretation and adaptation. Relying on reported practices, we analyse the guidance particularly relevant to engineers and space mission designers.

  18. An environmental decision framework applied to marine engine control technologies.

    PubMed

    Corbett, James J; Chapman, David

    2006-06-01

    This paper develops a decision framework for considering emission control technologies on marine engines, informed by standard decision theory, with an open structure that may be adapted by operators with specific vessel and technology attributes different from those provided here. Attributes relate objectives important to choosing control technologies with specific alternatives that may meet several of the objectives differently. The transparent framework enables multiple stakeholders to understand how different subjective judgments and varying attribute properties may result in different technology choices. Standard scoring techniques ensure that attributes are not biased by subjective scoring and that weights are the primary quantitative input where subjective preferences are exercised. An expected value decision structure is adopted that considers probabilities (likelihood) that a given alternative can meet its claims; alternative decision criteria are discussed. Capital and annual costs are combined using a net present value approach. An iterative approach is advocated that allows for screening and disqualifying alternatives that do not meet minimum conditions for acceptance, such as engine warranty or U.S. Coast Guard requirements. This decision framework assists vessel operators in considering explicitly important attributes and in representing choices clearly to other stakeholders concerned about reducing air pollution from vessels. This general decision structure may also be applied similarly to other environmental controls in marine applications.
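
    A hedged sketch of the expected-value scoring with net-present-value costs that the framework describes: each alternative's attribute scores are weighted by stakeholder preferences and discounted by the probability that the technology meets its claims. The attributes, weights, costs, and probabilities below are invented placeholders, not values from the paper.

```python
def npv(capital_cost, annual_cost, years=10, rate=0.07):
    """Combine capital and annual costs into a single net-present-value figure."""
    return capital_cost + sum(annual_cost / (1.0 + rate) ** t for t in range(1, years + 1))

def expected_score(alternative, weights):
    """Probability-weighted, attribute-weighted score for one control technology."""
    return alternative["p_meets_claims"] * sum(
        weights[attr] * alternative["scores"][attr] for attr in weights)

weights = {"NOx_reduction": 0.5, "cost": 0.3, "operability": 0.2}   # stakeholder weights

alternatives = {
    "SCR": {
        "p_meets_claims": 0.90,
        "scores": {"NOx_reduction": 0.9, "cost": 1.0 - npv(9e5, 5e4) / 2e6, "operability": 0.6}},
    "water_injection": {
        "p_meets_claims": 0.75,
        "scores": {"NOx_reduction": 0.5, "cost": 1.0 - npv(2e5, 8e4) / 2e6, "operability": 0.8}},
}

for name, alt in alternatives.items():
    print(name, round(expected_score(alt, weights), 3))
```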

  19. Identifying 21st Century STEM Competencies Using Workplace Data

    NASA Astrophysics Data System (ADS)

    Jang, Hyewon

    2016-04-01

    Gaps between science, technology, engineering, and mathematics (STEM) education and required workplace skills have been identified in industry, academia, and government. Educators acknowledge the need to reform STEM education to better prepare students for their future careers. We pursue this growing interest in the skills needed for STEM disciplines and ask whether frameworks for 21st century skills and engineering education cover all of the important STEM competencies. In this study, we identify important STEM competencies and evaluate the relevance of current frameworks applied in education using the standardized job-specific database operated and maintained by the US Department of Labor. Our analysis of the importance of 109 skills, types of knowledge, and work activities revealed 18 skills, seven categories of knowledge, and 27 work activities important for STEM workers. We investigate the perspectives of STEM and non-STEM job incumbents, comparing the importance of each skill, knowledge, and work activity for the two groups. We aimed to condense dimensions of the 52 key areas by categorizing them according to the Katz and Kahn (1978) framework and testing for inter-rater reliability. Our findings show that frameworks for 21st century skills and engineering education do not encompass all important STEM competencies. Implications for STEM education programs are discussed, including how they can bridge gaps between education and important workplace competencies.

  20. Design Patterns for Learning and Assessment: Facilitating the Introduction of a Complex Simulation-Based Learning Environment into a Community of Instructors

    ERIC Educational Resources Information Center

    Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.

    2010-01-01

    Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are…

  1. Design Patterns for Learning and Assessment: Facilitating the Introduction of a Complex Simulation-Based Learning Environment into a Community of Instructors

    NASA Astrophysics Data System (ADS)

    Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.

    2010-04-01

    Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.

  2. CEOS SEO and GISS Meeting

    NASA Technical Reports Server (NTRS)

    Killough, Brian; Stover, Shelley

    2008-01-01

    The Committee on Earth Observation Satellites (CEOS) provides a brief to the Goddard Institute for Space Studies (GISS) regarding the CEOS Systems Engineering Office (SEO) and current work on climate requirements and analysis. A "system framework" is provided for the Global Earth Observation System of Systems (GEOSS). SEO climate-related tasks are outlined including the assessment of essential climate variable (ECV) parameters, use of the "systems framework" to determine relevant informational products and science models and the performance of assessments and gap analyses of measurements and missions for each ECV. Climate requirements, including instruments and missions, measurements, knowledge and models, and decision makers, are also outlined. These requirements would establish traceability from instruments to products and services allowing for benefit evaluation of instruments and measurements. Additionally, traceable climate requirements would provide a better understanding of global climate models.

  3. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.
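
    The Actor Framework named above is a LabVIEW library, so its code cannot be shown here directly. Purely as an illustration of the underlying actor pattern (message-driven workers that share nothing and communicate only through queues), the following Python sketch shows the idea; the RecorderActor class, its messages, and the channel name are hypothetical and are not taken from NDAS.

    ```python
    import threading
    import queue

    class Actor:
        """Minimal actor: a worker thread that processes messages from its own queue."""
        def __init__(self):
            self.inbox = queue.Queue()
            self._thread = threading.Thread(target=self._run, daemon=True)
            self._thread.start()

        def send(self, message):
            self.inbox.put(message)

        def _run(self):
            while True:
                message = self.inbox.get()
                if message is None:          # poison pill stops the actor
                    break
                self.handle(message)

        def handle(self, message):
            raise NotImplementedError

        def stop(self):
            self.inbox.put(None)
            self._thread.join()

    class RecorderActor(Actor):
        """Hypothetical actor that records incoming measurement samples."""
        def __init__(self):
            self.samples = []
            super().__init__()

        def handle(self, message):
            self.samples.append(message)

    if __name__ == "__main__":
        recorder = RecorderActor()
        for value in (1.0, 2.5, 3.7):
            recorder.send(("pressure_sensor_01", value))   # hypothetical channel name
        recorder.stop()
        print(recorder.samples)
    ```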

  4. Designing flexible engineering systems utilizing embedded architecture options

    NASA Astrophysics Data System (ADS)

    Pierce, Jeff G.

    This dissertation develops and applies an integrated framework for embedding flexibility in an engineered system architecture. Systems are constantly faced with unpredictability in the operational environment, threats from competing systems, obsolescence of technology, and general uncertainty in future system demands. Current systems engineering and risk management practices have focused almost exclusively on mitigating or preventing the negative consequences of uncertainty. This research recognizes that high uncertainty also presents an opportunity to design systems that can flexibly respond to changing requirements and capture additional value throughout the design life. However, there is as yet no formalized approach to designing appropriately flexible systems. This research develops a three-stage integrated flexibility framework based on the concept of architecture options embedded in the system design. Stage One defines an eight-step systems engineering process to identify candidate architecture options. This process encapsulates the operational uncertainty through scenario development, traces new functional requirements to the affected design variables, and clusters the variables most sensitive to change. The resulting clusters can generate insight into the most promising regions in the architecture to embed flexibility in the form of architecture options. Stage Two develops a quantitative option valuation technique, grounded in real options theory, which is able to value embedded architecture options that exhibit variable expiration behavior. Stage Three proposes a portfolio optimization algorithm, for both discrete and continuous options, to select the optimal subset of architecture options, subject to budget and risk constraints. Finally, the feasibility, extensibility and limitations of the framework are assessed by its application to a reconnaissance satellite system development problem. Detailed technical data, performance models, and cost estimates were compiled for the Tactical Imaging Constellation Architecture Study and leveraged to complete a realistic proof-of-concept.
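
    The dissertation's own valuation technique for architecture options with variable expiration is not reproduced here. As a point of reference only, the sketch below values a generic option on a standard Cox-Ross-Rubinstein binomial lattice, one common building block of real-options analysis; all parameter values are illustrative assumptions.

    ```python
    import math

    def binomial_option_value(s0, strike, rate, sigma, maturity, steps=200):
        """Value a European call-style option on a CRR binomial lattice.

        s0       -- present value of the underlying (e.g., expected project payoff)
        strike   -- cost of exercising the option (e.g., cost of the design change)
        rate     -- risk-free rate per year
        sigma    -- volatility of the underlying value per year
        maturity -- time to expiration in years
        """
        dt = maturity / steps
        u = math.exp(sigma * math.sqrt(dt))       # up factor
        d = 1.0 / u                               # down factor
        p = (math.exp(rate * dt) - d) / (u - d)   # risk-neutral up probability
        disc = math.exp(-rate * dt)

        # terminal payoffs
        values = [max(s0 * u**j * d**(steps - j) - strike, 0.0) for j in range(steps + 1)]

        # roll the lattice back to the present
        for step in range(steps, 0, -1):
            values = [disc * (p * values[j + 1] + (1 - p) * values[j]) for j in range(step)]
        return values[0]

    if __name__ == "__main__":
        # illustrative numbers only
        print(round(binomial_option_value(s0=100.0, strike=110.0, rate=0.03,
                                          sigma=0.35, maturity=2.0), 2))
    ```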

  5. Requirements for psychological models to support design: Towards ecological task analysis

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex

    1991-01-01

    Cognitive engineering is largely concerned with creating environmental designs to support skillful and effective human activity. A set of necessary conditions are proposed for psychological models capable of supporting this enterprise. An analysis of the psychological nature of the design product is used to identify a set of constraints that models must meet if they can usefully guide design. It is concluded that cognitive engineering requires models with resources for describing the integrated human-environment system, and that these models must be capable of describing the activities underlying fluent and effective interaction. These features are required in order to be able to predict the cognitive activity that will be required given various design concepts, and to design systems that promote the acquisition of fluent, skilled behavior. These necessary conditions suggest that an ecological approach can provide valuable resources for psychological modeling to support design. Relying heavily on concepts from Brunswik's and Gibson's ecological theories, ecological task analysis is proposed as a framework in which to predict the types of cognitive activity required to achieve productive behavior, and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The framework is described in terms, and illustrated with an example from the previous research on modeling skilled human-environment interaction.

  6. Dynamic optimization of chemical processes using ant colony framework.

    PubMed

    Rajesh, J; Gupta, K; Kusumakar, H S; Jayaraman, V K; Kulkarni, B D

    2001-11-01

    The ant colony framework is illustrated by considering the dynamic optimization of six important benchmark examples. This new computational tool is simple to implement and can tackle problems with state as well as terminal constraints in a straightforward fashion. It requires fewer grid points to reach the global optimum at relatively low computational effort. The examples analyzed here, with varying degrees of complexity, illustrate its potential for solving a large class of process optimization problems in chemical engineering.
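
    The abstract does not give the paper's exact formulation, so the sketch below only illustrates the generic ant-colony loop (probabilistic construction of candidate solutions biased by pheromone, followed by deposition and evaporation) on a toy discretized control profile; the grid, objective, and parameter values are invented for illustration.

    ```python
    import random

    def ant_colony_minimize(objective, grid, n_ants=20, n_iters=100,
                            evaporation=0.1, seed=0):
        """Pick one value per decision stage from `grid` to minimize `objective`.

        grid      -- list of candidate values for each stage, e.g. [[0, 1, 2], [0, 1, 2], ...]
        objective -- function mapping a list of chosen values to a cost
        """
        rng = random.Random(seed)
        pheromone = [[1.0] * len(choices) for choices in grid]
        best_cost, best_path = float("inf"), None

        for _ in range(n_iters):
            for _ in range(n_ants):
                # each ant builds a solution stage by stage, biased by pheromone
                path = []
                for stage, choices in enumerate(grid):
                    idx = rng.choices(range(len(choices)), weights=pheromone[stage])[0]
                    path.append(idx)
                cost = objective([grid[s][i] for s, i in enumerate(path)])
                if cost < best_cost:
                    best_cost, best_path = cost, path
                # deposit pheromone inversely proportional to cost
                for stage, idx in enumerate(path):
                    pheromone[stage][idx] += 1.0 / (1.0 + cost)
            # evaporate
            for stage in range(len(grid)):
                pheromone[stage] = [(1 - evaporation) * t for t in pheromone[stage]]

        return best_cost, [grid[s][i] for s, i in enumerate(best_path)]

    if __name__ == "__main__":
        # toy control profile: choose u(t) on a 5-point time grid to minimize sum of (u - 0.3)^2
        grid = [[0.0, 0.1, 0.2, 0.3, 0.4, 0.5]] * 5
        print(ant_colony_minimize(lambda u: sum((x - 0.3) ** 2 for x in u), grid))
    ```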

  7. Integrating International Engineering Organizations For Successful ISS Operations

    NASA Technical Reports Server (NTRS)

    Blome, Elizabeth; Duggan, Matt; Patten, L.; Pieterek, Hhtrud

    2006-01-01

    The International Space Station (ISS) is a multinational orbiting space laboratory built in cooperation with 16 nations. The design and sustaining engineering expertise is spread worldwide. As the number of Partners with orbiting elements on the ISS grows, the challenge NASA faces as the ISS integrator is to make engineering expertise and data accessible in a timely fashion to ensure ongoing operations and mission success. Integrating international engineering teams requires definition of and agreement on common processes and responsibilities, joint training, and the emergence of a unique engineering team culture. ISS engineers face daunting logistical and political challenges regarding data sharing requirements. To assure systematic information sharing and the resolution of integrated anomalies, the ISS Partners are developing multilateral engineering interface procedures. Data sharing and individual responsibility are key aspects of this plan. This paper describes several examples of successful multilateral anomaly resolution. These successes were used to form the framework of the Partner-to-Partner engineering interface procedures, and this paper describes those currently documented multilateral engineering processes. Furthermore, it addresses the challenges experienced to date, and the forward work expected in establishing a successful working relationship with Partners as their hardware is launched.

  8. Ex-ante assessment of the safety effects of intelligent transport systems.

    PubMed

    Kulmala, Risto

    2010-07-01

    There is a need to develop a comprehensive framework for the safety assessment of Intelligent Transport Systems (ITS). This framework should: (1) cover all three dimensions of road safety: exposure, crash risk, and consequence; (2) cover, in addition to the engineering effect, the effects due to behavioural adaptation; and (3) be compatible with the other aspects of state-of-the-art road safety theories. A framework based on nine ITS safety mechanisms is proposed and discussed with regard to the requirements set for the framework. In order to illustrate the application of the framework in practice, the paper presents a method based on the framework and the results from applying that method for twelve intelligent vehicle systems in Europe. The framework is also compared to two recent frameworks applied in the safety assessment of intelligent vehicle safety systems. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. LIFE CYCLE DESIGN GUIDANCE MANUAL - ENVIRONMENTAL REQUIREMENTS AND THE PRODUCT SYSTEM

    EPA Science Inventory

    The U.S. Environmental Protection Agency's (EPA) Risk Reduction Engineering Laboratory and the University of Michigan are cooperating in a project to reduce environmental impacts and health risks through product system design. The resulting framework for life cycle design is pr...

  10. Towards a Framework for Evolvable Network Design

    NASA Astrophysics Data System (ADS)

    Hassan, Hoda; Eltarras, Ramy; Eltoweissy, Mohamed

    The layered Internet architecture that had long guided network design and protocol engineering was an “interconnection architecture” defining a framework for interconnecting networks rather than a model for generic network structuring and engineering. We claim that the approach of abstracting the network in terms of an internetwork hinders a thorough understanding of the network's salient characteristics and emergent behavior, impeding the design evolution required to address extreme scale, heterogeneity, and complexity. This paper reports on our work in progress that aims to: 1) Investigate the problem space in terms of the factors and decisions that influenced the design and development of computer networks; 2) Sketch the core principles for designing complex computer networks; and 3) Propose a model and related framework for building evolvable, adaptable and self-organizing networks. We will adopt a bottom-up strategy primarily focusing on the building unit of the network model, which we call the “network cell”. The model is inspired by natural complex systems. A network cell is intrinsically capable of specialization, adaptation and evolution. Subsequently, we propose CellNet, a framework for evolvable network design. We outline scenarios for using the CellNet framework to enhance the legacy Internet protocol stack.

  11. Engineers’ Responsibilities for Global Electronic Waste: Exploring Engineering Student Writing Through a Care Ethics Lens

    PubMed Central

    Campbell, Ryan C.; Wilson, Denise

    2016-01-01

    This paper provides an empirically informed perspective on the notion of responsibility using an ethical framework that has received little attention in the engineering-related literature to date: ethics of care. In this work, we ground conceptual explorations of engineering responsibility in empirical findings from engineering students’ writing on the human health and environmental impacts of “backyard” electronic waste recycling/disposal. Our findings, from a purposefully diverse sample of engineering students in an introductory electrical engineering course, indicate that most of these engineers of tomorrow associated engineers with responsibility for the electronic waste (e-waste) problem in some way. However, a number of responses suggested attempts to deflect responsibility away from engineers towards, for example, the government or the companies for whom engineers work. Still other students associated both engineers and non-engineers with responsibility, demonstrating the distributed/collective nature of responsibility that will be required to achieve a solution to the global problem of excessive e-waste. Building upon one element of a framework for care ethics adopted from the wider literature, these empirical findings are used to facilitate a preliminary, conceptual exploration of care-ethical responsibility within the context of engineering and e-waste recycling/disposal. The objective of this exploration is to provide a first step toward understanding how care-ethical responsibility applies to engineering. We also hope to seed dialogue within the engineering community about its ethical responsibilities on the issue. We conclude the paper with a discussion of its implications for engineering education and engineering ethics that suggests changes for educational policy and the practice of engineering. PMID:27368195

  12. Reusable rocket engine intelligent control system framework design, phase 2

    NASA Technical Reports Server (NTRS)

    Nemeth, ED; Anderson, Ron; Ols, Joe; Olsasky, Mark

    1991-01-01

    Elements of an advanced functional framework for reusable rocket engine propulsion system control are presented for the Space Shuttle Main Engine (SSME) demonstration case. Functional elements of the baseline functional framework are defined in detail. The SSME failure modes are evaluated and specific failure modes identified for inclusion in the advanced functional framework diagnostic system. Active control of the SSME start transient is investigated, leading to the identification of a promising approach to mitigating start transient excursions. Key elements of the functional framework are simulated and demonstration cases are provided. Finally, the advanced functional framework for control of reusable rocket engines is presented.

  13. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on the transient engine performance earlier in the design cycle.
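
    TTECTrA itself is a MATLAB/Simulink tool, so nothing below is its actual code. As a loose illustration of the kind of artifacts the abstract describes (a setpoint map, a schedule of controller gains, and a limiter) acting in a closed loop, here is a Python sketch with a hypothetical first-order plant standing in for the engine; all maps and numbers are invented.

    ```python
    import numpy as np

    # Hypothetical design artifacts of the kind described above:
    power_levels  = np.array([0.2, 0.5, 0.8, 1.0])       # normalized power command
    setpoint_map  = np.array([0.30, 0.55, 0.80, 1.00])   # fan-speed setpoint vs. power
    gain_schedule = np.array([2.0, 1.5, 1.2, 1.0])       # proportional gain vs. power
    rate_limit    = 0.05                                  # max setpoint change per step

    def closed_loop_response(power_cmd, n_steps=200, dt=0.05, tau=1.0):
        """Simulate a P-controlled, rate-limited first-order 'engine' (illustrative only)."""
        setpoint_target = np.interp(power_cmd, power_levels, setpoint_map)
        kp = np.interp(power_cmd, power_levels, gain_schedule)
        setpoint, state = 0.0, 0.0
        history = []
        for _ in range(n_steps):
            # limiter: slew the setpoint toward its target
            setpoint += np.clip(setpoint_target - setpoint, -rate_limit, rate_limit)
            u = kp * (setpoint - state)                   # proportional control
            state += dt * (-state + u) / tau              # first-order plant
            history.append(state)
        return history

    if __name__ == "__main__":
        print(round(closed_loop_response(power_cmd=0.8)[-1], 3))
    ```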

  14. Complex Adaptive Systems of Systems (CASoS) engineering and foundations for global design.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brodsky, Nancy S.; Finley, Patrick D.; Beyeler, Walter Eugene

    2012-01-01

    Complex Adaptive Systems of Systems, or CASoS, are vastly complex ecological, sociological, economic and/or technical systems which must be recognized and reckoned with to design a secure future for the nation and the world. Design within CASoS requires the fostering of a new discipline, CASoS Engineering, and the building of capability to support it. Towards this primary objective, we created the Phoenix Pilot as a crucible from which systemization of the new discipline could emerge. Using a wide range of applications, Phoenix has begun building both theoretical foundations and capability for: the integration of Applications to continuously build common understanding and capability; a Framework for defining problems, designing and testing solutions, and actualizing these solutions within the CASoS of interest; and an engineering Environment required for 'the doing' of CASoS Engineering. In a secondary objective, we applied CASoS Engineering principles to begin to build a foundation for design in the context of Global CASoS.

  15. Introducing the CERT (Trademark) Resiliency Engineering Framework: Improving the Security and Sustainability Processes

    DTIC Science & Technology

    2007-05-01

    [Excerpt fragments only: table-of-contents entries covering Organizational Structure, Funding Model, Role of Information Technology, and Considering Process Improvement, plus a passage noting that adopting an enterprise view of operational resiliency and a process improvement approach requires the funding model to evolve.]

  16. Reference Model for Project Support Environments Version 1.0

    DTIC Science & Technology

    1993-02-28

    [Excerpt fragments only: the reference model's services are described in relation to the framework's Process Support services and the Lifecycle Process Engineering services, with example tools including ORCA (Object-based Requirements Capture and Analysis) and RETRAC (REquirements TRACeability); further fragments list audio and video processing operations such as creating, modifying, and deleting sound and video data.]

  17. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    NASA Astrophysics Data System (ADS)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation based on metrics adapted from the literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  18. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  19. Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    2002-01-01

    This paper describes a system-of-systems, or metasystems, approach and models developed to help prepare engineering organizations for distributed engineering environments. These changes in engineering enterprises include competition in increasingly global environments, new partnering opportunities created by advances in information and communication technologies, and virtual collaboration issues associated with dispersed teams. To help address challenges and needs in this environment, a framework is proposed that can be customized and adapted for NASA to assist in improved engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by learning and applying e-engineering methods and tools to a real-world engineering development scenario. The approach consists of two phases: an e-engineering basics phase and an e-engineering application phase. The e-engineering basics phase addresses skills required for e-engineering. The e-engineering application phase applies these skills in a distributed collaborative environment to system development projects.

  20. Methods for automated semantic definition of manufacturing structures (mBOM) in mechanical engineering companies

    NASA Astrophysics Data System (ADS)

    Stekolschik, Alexander, Prof.

    2017-10-01

    The bill of materials (BOM), which involves all parts and assemblies of a product, is the core of any mechanical or electronic product. The flexible and integrated management of engineering (Engineering Bill of Materials [eBOM]) and manufacturing (Manufacturing Bill of Materials [mBOM]) structures is the key to the creation of modern products in mechanical engineering companies. This paper presents a method framework for the creation and control of the eBOM and, especially, the mBOM. The requirements, which differ between companies producing serialized products and those producing engineered-to-order products, are considered in the analysis phase. The main part of the paper describes different approaches to the fully or partly automated creation of the mBOM. The first approach is the definition of part selection rules in generic mBOM templates; using this method, the mBOM can be derived from the eBOM for partly standardized products. Another approach is the simultaneous use of semantic rules, options, and parameters in both structures. The implementation of the method framework (selection of use cases) in a standard product lifecycle management (PLM) system is part of the research.
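
    The paper's implementation lives inside a PLM system and is not reproduced here. As a rough illustration of the first approach, part selection rules attached to a generic mBOM template, the Python sketch below treats each rule as a predicate over eBOM item attributes; the items, attributes, and rules are hypothetical.

    ```python
    # Hypothetical eBOM items with a few attributes each
    ebom = [
        {"part": "HOUSING-01", "make_or_buy": "make", "variant": "standard"},
        {"part": "SEAL-17",    "make_or_buy": "buy",  "variant": "standard"},
        {"part": "LABEL-EU",   "make_or_buy": "buy",  "variant": "EU"},
        {"part": "LABEL-US",   "make_or_buy": "buy",  "variant": "US"},
    ]

    # Generic mBOM template: each position carries a selection rule (a predicate)
    mbom_template = [
        {"position": "10", "rule": lambda item: item["make_or_buy"] == "make"},
        {"position": "20", "rule": lambda item: item["make_or_buy"] == "buy"
                                                and item["variant"] in ("standard", "EU")},
    ]

    def derive_mbom(ebom_items, template):
        """Derive mBOM positions from the eBOM by evaluating each template rule."""
        mbom = []
        for position in template:
            for item in ebom_items:
                if position["rule"](item):
                    mbom.append({"position": position["position"], "part": item["part"]})
        return mbom

    if __name__ == "__main__":
        for line in derive_mbom(ebom, mbom_template):
            print(line)
    ```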

  1. Students' Developing Understanding of Water in Environmental Systems

    ERIC Educational Resources Information Center

    Covitt, Beth A.; Gunckel, Kristin L.; Anderson, Charles W.

    2009-01-01

    The authors developed a framework of empirically grounded curricular goals for water-science literacy and documented the challenges that students face in achieving these goals. Water-related environmental science literacy requires an understanding of connected natural and human-engineered systems at multiple scales ranging from atomic-molecular…

  2. The European water framework directive: water quality classification and implications to engineering planning.

    PubMed

    Achleitner, Stefan; De Toffol, Sara; Engelhard, Carolina; Rauch, Wolfgang

    2005-04-01

    The European Water framework directive (WFD) is probably the most important environmental management directive that has been enacted over the last decade in the European Union. The directive aims at achieving an overall good ecological status in all European water bodies. In this article, we discuss the implementation steps of the WFD and their implications for environmental engineering practice while focusing on rivers as the main receiving waters. Arising challenges for engineers and scientists are seen in the quantitative assessment of water quality, where standardized systems are needed to estimate the biological status. This is equally of concern in engineering planning, where the prediction of ecological impacts is required. Studies dealing with both classification and prediction of the ecological water quality are reviewed. Further, the combined emission-water quality approach is discussed. Common understanding of this combined approach is to apply the most stringent of either water quality or emission standard to a certain case. In contrast, for example, the Austrian water act enables the application of only the water quality based approach--at least on a temporary basis.

  3. Reengineering Biomedical Translational Research with Engineering Ethics.

    PubMed

    Sunderland, Mary E; Nayak, Rahul Uday

    2015-08-01

    It is widely accepted that translational research practitioners need to acquire special skills and knowledge that will enable them to anticipate, analyze, and manage a range of ethical issues. While there is a small but growing literature that addresses the ethics of translational research, there is a dearth of scholarship regarding how this might apply to engineers. In this paper we examine engineers as key translators and argue that they are well positioned to ask transformative ethical questions. Asking engineers to both broaden and deepen their consideration of ethics in their work, however, requires a shift in the way ethics is often portrayed and perceived in science and engineering communities. Rather than interpreting ethics as a roadblock to the success of translational research, we suggest that engineers should be encouraged to ask questions about the socio-ethical dimensions of their work. This requires expanding the conceptual framework of engineering beyond its traditional focus on "how" and "what" questions to also include "why" and "who" questions to facilitate the gathering of normative, socially-situated information. Empowering engineers to ask "why" and "who" questions should spur the development of technologies and practices that contribute to improving health outcomes.

  4. An Engineering Technology Skills Framework that Reflects Workforce Needs on Maui and the Big Island of Hawai'i

    NASA Astrophysics Data System (ADS)

    Seagroves, S.; Hunter, L.

    2010-12-01

    The Akamai Workforce Initiative (AWI) is an interdisciplinary effort to improve science/engineering education in the state of Hawai'i, and to train a diverse population of local students in the skills needed for a high-tech economy. In 2009, the AWI undertook a survey of industry partners on Maui and the Big Island of Hawai'i to develop an engineering technology skills framework that will guide curriculum development at the U. of Hawai'i - Maui (formerly Maui Community College). This engineering skills framework builds directly on past engineering-education developments within the Center for Adaptive Optics Professional Development Program, and draws on curriculum development frameworks and engineering skills standards from the literature. Coupling that previous work with reviews of past Akamai Internship projects and information from previous conversations with the local high-tech community led to a structured-interview format where engineers and managers could contribute meaningful commentary to this framework. By incorporating these local high-tech companies' needs for entry-level engineers and technicians, a skills framework emerges that is unique and illuminating. Two surprising features arise in this framework: (1) "technician-like" skills of making existing technology work are on similar footing with "engineer-like" skills of creating new technology; in fact, both engineers and technicians at these workplaces use both sets of skills; and (2) project management skills are emphasized by employers even for entry-level positions.

  5. Seismic risk management of non-engineered buildings

    NASA Astrophysics Data System (ADS)

    Winar, Setya

    Earthquakes have long been feared as one of nature's most terrifying and devastating events. Although seismic codes clearly exist in countries with a high seismic risk to save lives and reduce human suffering, earthquakes still continue to cause tragic events with high death tolls, particularly due to the collapse of widespread non-engineered buildings lacking seismic resistance in developing countries such as Indonesia. The implementation of seismic codes in non-engineered construction is the key to ensuring earthquake safety. In fact, such implementation is not simple, because it comprises all forms of cross-disciplinary and cross-sectoral linkages at different levels of understanding, commitment, and skill. This fact suggests that a widely agreed framework can help to harmonise the various perspectives. Hence, this research is aimed at developing an integrated framework for guiding and monitoring seismic risk reduction of non-engineered buildings in Indonesia via a risk management method. Primarily, the proposed framework for the study has drawn heavily on the wider literature, on three existing frameworks from around the world, and on the contribution of various stakeholders who participated in the study. A postal questionnaire survey, selected interviews, and a workshop event constituted the primary data collection methods. To achieve a robust framework, two subsequent workshop events, conducted in Yogyakarta City and Bengkulu City in Indonesia, were carried out to assess practicality and validity and to identify any improvement requirements. The data collected was analysed with the assistance of the SPSS and NVivo software programmes. This research found that the content of the proposed framework comprises 63 pairs of characteristic-indicators complemented by (a) three important factors of effective seismic risk management of non-engineered buildings, (b) three guiding principles for sustainable dissemination to grassroots communities, and (c) a map of agents of change. Among the 63 pairs, there are 19 technical interventions and 44 non-technical interventions. These findings contribute to the wider knowledge in the domain of the seismic risk management of non-engineered buildings, in order to: (a) provide a basis for effective political advocacy, (b) reflect the multidimensional and inter-disciplinary nature of seismic risk reduction, (c) assist a wide range of users in determining roles, responsibilities, and accountabilities, and (d) provide the basis for setting goals and targets.

  6. A Framework of Working Across Disciplines in Early Design and R&D of Large Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.

    2015-01-01

    This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.

  7. Comparison of Physics Frameworks for WebGL-Based Game Engine

    NASA Astrophysics Data System (ADS)

    Yogya, Resa; Kosala, Raymond

    2014-03-01

    Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, much potential in the game development area remains unexplored. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open-source physics frameworks: Bullet, Cannon, and JigLib into a WebGL-based game engine. Using experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness, and compatibility. The results show that it is possible to integrate open-source physics frameworks into a WebGL-based game engine, and that Bullet is the best physics framework to be integrated into the WebGL-based game engine.

  8. In the soft-to-hard technical spectrum: Where is software engineering?

    NASA Technical Reports Server (NTRS)

    Leibfried, Theodore F.; Macdonald, Robert B.

    1992-01-01

    In the computer journals and tabloids, there has been a plethora of articles written about the software engineering field. But while many authors advocate the need for an engineering approach to software development, it is striking how many have treated the subject of software engineering without adequately addressing the fundamentals of what engineering as a discipline consists of. A discussion is presented of the various related facets of this issue in a logical framework to advance the thesis that the software development process is necessarily an engineering process. The purpose is to examine in more detail the issue of whether or not the design and development of software for digital computer processing systems should be both viewed and treated as a legitimate field of professional engineering. Also, the type of academic and professional-level education programs that would be required to support a software engineering discipline is examined.

  9. Bacterial computing with engineered populations.

    PubMed

    Amos, Martyn; Axmann, Ilka Maria; Blüthgen, Nils; de la Cruz, Fernando; Jaramillo, Alfonso; Rodriguez-Paton, Alfonso; Simmel, Friedrich

    2015-07-28

    We describe strategies for the construction of bacterial computing platforms, drawing on a number of results from the recently completed bacterial computing with engineered populations project. In general, the implementation of such systems requires a framework containing various components such as intracellular circuits, single-cell input/output and cell-cell interfacing, as well as extensive analysis. In this overview paper, we describe our approach to each of these, and suggest possible areas for future research. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  10. BIOMedical Search Engine Framework: Lightweight and customized implementation of domain-specific biomedical search engines.

    PubMed

    Jácome, Alberto G; Fdez-Riverola, Florentino; Lourenço, Anália

    2016-07-01

    Text mining and semantic analysis approaches can be applied to the construction of biomedical domain-specific search engines and provide an attractive alternative to create personalized and enhanced search experiences. Therefore, this work introduces the new open-source BIOMedical Search Engine Framework for the fast and lightweight development of domain-specific search engines. The rationale behind this framework is to incorporate core features typically available in search engine frameworks with flexible and extensible technologies to retrieve biomedical documents, annotate meaningful domain concepts, and develop highly customized Web search interfaces. The BIOMedical Search Engine Framework integrates taggers for major biomedical concepts, such as diseases, drugs, genes, proteins, compounds and organisms, and enables the use of domain-specific controlled vocabulary. Technologies from the Typesafe Reactive Platform, the AngularJS JavaScript framework and the Bootstrap HTML/CSS framework support the customization of the domain-oriented search application. Moreover, the RESTful API of the BIOMedical Search Engine Framework allows the integration of the search engine into existing systems or a complete web interface personalization. The construction of the Smart Drug Search is described as proof-of-concept of the BIOMedical Search Engine Framework. This public search engine catalogs scientific literature about antimicrobial resistance, microbial virulence and topics alike. The keyword-based queries of the users are transformed into concepts and search results are presented and ranked accordingly. The semantic graph view portrays all the concepts found in the results, and the researcher may look into the relevance of different concepts, the strength of direct relations, and non-trivial, indirect relations. The number of occurrences of the concept shows its importance to the query, and the frequency of concept co-occurrence is indicative of biological relations meaningful to that particular scope of research. Conversely, indirect concept associations, i.e. concepts related by other intermediary concepts, can be useful to integrate information from different studies and look into non-trivial relations. The BIOMedical Search Engine Framework supports the development of domain-specific search engines. The key strengths of the framework are modularity and extensibility in terms of software design, the use of open-source consolidated Web technologies, and the ability to integrate any number of biomedical text mining tools and information resources. Currently, the Smart Drug Search keeps over 1,186,000 documents, containing more than 11,854,000 annotations for 77,200 different concepts. The Smart Drug Search is publicly accessible at http://sing.ei.uvigo.es/sds/. The BIOMedical Search Engine Framework is freely available for non-commercial use at https://github.com/agjacome/biomsef. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Integrated System-Level Optimization for Concurrent Engineering With Parametric Subsystem Modeling

    NASA Technical Reports Server (NTRS)

    Schuman, Todd; DeWeck, Oliver L.; Sobieski, Jaroslaw

    2005-01-01

    The introduction of concurrent design practices to the aerospace industry has greatly increased the productivity of engineers and teams during design sessions as demonstrated by JPL's Team X. Simultaneously, advances in computing power have given rise to a host of potent numerical optimization methods capable of solving complex multidisciplinary optimization problems containing hundreds of variables, constraints, and governing equations. Unfortunately, such methods are tedious to set up and require significant amounts of time and processor power to execute, thus making them unsuitable for rapid concurrent engineering use. This paper proposes a framework for Integration of System-Level Optimization with Concurrent Engineering (ISLOCE). It uses parametric neural-network approximations of the subsystem models. These approximations are then linked to a system-level optimizer that is capable of reaching a solution quickly due to the reduced complexity of the approximations. The integration structure is described in detail and applied to the multiobjective design of a simplified Space Shuttle external fuel tank model. Further, a comparison is made between the new framework and traditional concurrent engineering (without system optimization) through an experimental trial with two groups of engineers. Each method is evaluated in terms of optimizer accuracy, time to solution, and ease of use. The results suggest that system-level optimization, running as a background process during integrated concurrent engineering sessions, is potentially advantageous as long as it is judiciously implemented.
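
    The paper uses parametric neural-network approximations inside a collaborative optimization setup; the sketch below shows only the core surrogate idea in simplified form, fitting a cheap quadratic response surface to a few samples of an invented "expensive" subsystem and optimizing against the surrogate instead of the subsystem itself.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def expensive_subsystem(x):
        """Stand-in for a slow disciplinary analysis (hypothetical)."""
        return (x - 1.7) ** 2 + 0.3 * np.sin(5 * x)

    # 1) Sample the subsystem offline
    samples_x = np.linspace(0.0, 3.0, 15)
    samples_y = expensive_subsystem(samples_x)

    # 2) Fit a cheap surrogate (quadratic response surface here; the paper uses neural nets)
    coeffs = np.polyfit(samples_x, samples_y, deg=2)
    surrogate = np.poly1d(coeffs)

    # 3) System-level optimization runs against the surrogate, not the expensive model
    result = minimize(lambda x: surrogate(x[0]), x0=[0.5], bounds=[(0.0, 3.0)])

    if __name__ == "__main__":
        x_star = result.x[0]
        print(f"surrogate optimum at x = {x_star:.3f}, "
              f"true subsystem value = {expensive_subsystem(x_star):.3f}")
    ```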

  12. Advanced Information Technology in Simulation Based Life Cycle Design

    NASA Technical Reports Server (NTRS)

    Renaud, John E.

    2003-01-01

    In this research, a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision-based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research, the decision-based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single-level optimization strategy that combines engineering decisions with business decisions in a single-level optimization. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO), the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.

  13. Flight deck engine advisor

    NASA Technical Reports Server (NTRS)

    Shontz, W. D.; Records, R. M.; Antonelli, D. R.

    1992-01-01

    The focus of this project is on alerting pilots to impending events in such a way as to provide the additional time required for the crew to make critical decisions concerning non-normal operations. The project addresses pilots' need for support in diagnosis and trend monitoring of faults as they affect decisions that must be made within the context of the current flight. Monitoring and diagnostic modules developed under the NASA Faultfinder program were restructured and enhanced using input data from an engine model and real engine fault data. Fault scenarios were prepared to support knowledge base development activities on the MONITAUR and DRAPhyS modules of Faultfinder. An analysis of the information requirements for fault management was included in each scenario. A conceptual framework was developed for systematic evaluation of the impact of context variables on pilot action alternatives as a function of event/fault combinations.

  14. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS

    PubMed Central

    Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2016-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971

  15. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    PubMed

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  16. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework

    PubMed Central

    2012-01-01

    Background: For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. Results: We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. Conclusion: The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources. PMID:23216909
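
    Hydra's actual implementation runs the K-score algorithm on Hadoop and is not shown here. The pure-Python sketch below only imitates the map/shuffle/reduce structure of such a search, with a toy mass-difference score standing in for K-score; the spectra, peptides, and tolerance are invented.

    ```python
    from itertools import groupby

    # Toy inputs: observed spectra and candidate peptide sequences (all hypothetical)
    spectra  = [("spectrum_1", 842.4), ("spectrum_2", 1024.9)]
    peptides = [("PEPTIDE", 842.5), ("ENGINEER", 1025.0), ("FRAMEWORK", 500.0)]

    def map_phase(spectrum):
        """Map: emit (spectrum_id, (candidate, score)) for every plausible candidate."""
        spectrum_id, observed_mass = spectrum
        for sequence, mass in peptides:
            if abs(mass - observed_mass) < 1.0:                  # crude precursor-mass filter
                score = 1.0 / (1.0 + abs(mass - observed_mass))  # toy stand-in for K-score
                yield spectrum_id, (sequence, score)

    def reduce_phase(spectrum_id, candidates):
        """Reduce: keep the best-scoring candidate per spectrum."""
        best = max(candidates, key=lambda c: c[1])
        return spectrum_id, best

    if __name__ == "__main__":
        # emulate the shuffle step by sorting mapper output on the key
        mapped = sorted(pair for spectrum in spectra for pair in map_phase(spectrum))
        for key, group in groupby(mapped, key=lambda kv: kv[0]):
            print(reduce_phase(key, [value for _, value in group]))
    ```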

  17. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    PubMed

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.

  18. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continues to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development for such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  19. Engineering Encounters: Engineer It, Learn It--Science and Engineering Practices in Action

    ERIC Educational Resources Information Center

    Lachapelle, Cathy P.; Sargianis, Kristin; Cunningham, Christine M.

    2013-01-01

    Engineering is prominently included in the "Next Generation Science Standards" (Achieve Inc. 2013), as it was in "A Framework for K-12 Science Education" (NRC 2012). The National Research Council, authors of the "Framework," write, "Engineering and technology are featured alongside the natural sciences (physical…

  20. Framework for adaptive interoperability of manufacturing enterprises (FAIME): a case study

    NASA Astrophysics Data System (ADS)

    Sims, John E.; Chu, Bei Tseng B.; Long, Junshen; Matthews, Mike; Barnes, Johnny G.; Jones, Chris H.; Anderson, Rayne A.; Lambert, Russ; Drake, Doug C.; Hamilton, Mark A.; Connard, Mark

    1997-01-01

    In today's global economy, manufacturing industries need to connect disparate applications seamlessly. They must not only exchange data and transactions but also present a single business process image to their employees in the office, at headquarters, and on the plant floor. Also, it is imperative that small and medium-sized manufacturing companies deploy manufacturing execution system applications in conjunction with modern enterprise resource planning programs for cycle time reduction and better quality. This paper presents experiences and reflections on a project that created a tool set to help the above be accomplished in a shorter cycle time, with more predictable quality, and with an object-oriented framework, while still allowing the manufacturer to use legacy applications. This framework has plug-and-play capability so that future migrations and re-engineering of processes are more productive.

  1. Building a Framework for Engineering Design Experiences in High School

    ERIC Educational Resources Information Center

    Denson, Cameron D.; Lammi, Matthew

    2014-01-01

    In this article, Denson and Lammi put forth a conceptual framework that will help promote the successful infusion of engineering design experiences into high school settings. When considering a conceptual framework of engineering design in high school settings, it is important to consider the complex issue at hand. For the purposes of this…

  2. Integrated Life-Cycle Framework for Maintenance, Monitoring and Reliability of Naval Ship Structures

    DTIC Science & Technology

    2012-08-15

    [Excerpt fragments only: because the ship hull must be analyzed a large number of times, a fast and accurate method for analyzing the ship hull is required; the work references the Naval Engineers Fleet Maintenance & Modernization Symposium (FMMS 2011) [8] and the Eleventh International Conference on Fast Sea Transportation (FAST); further fragments concern the probabilistic strength of the ship hull and note that, first, a novel deterministic method for the fast and accurate calculation of hull strength is presented.]

  3. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements, and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one), were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
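
    The paper couples AHP with QFD; the sketch below illustrates only the standard AHP step of turning a pairwise-comparison matrix into criterion weights via the principal eigenvector and checking the consistency ratio. The criteria and comparison values are illustrative, not taken from the paper.

    ```python
    import numpy as np

    # Illustrative pairwise comparisons of three framework criteria
    # (e.g. usability vs. extensibility vs. tool integration), on Saaty's 1-9 scale.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    def ahp_weights(matrix):
        """Principal-eigenvector weights and consistency ratio for an AHP comparison matrix."""
        eigenvalues, eigenvectors = np.linalg.eig(matrix)
        k = np.argmax(eigenvalues.real)
        weights = np.abs(eigenvectors[:, k].real)
        weights /= weights.sum()
        n = matrix.shape[0]
        consistency_index = (eigenvalues[k].real - n) / (n - 1)
        random_index = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random-index table
        return weights, consistency_index / random_index

    if __name__ == "__main__":
        weights, cr = ahp_weights(A)
        print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
    ```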

  4. Achieving Maximum Integration Utilizing Requirements Flow Down

    NASA Technical Reports Server (NTRS)

    Archiable, Wes; Askins, Bruce

    2011-01-01

    A robust and experienced systems engineering team is essential for a successful program. It is often a challenge to build a core systems engineering team early enough in a program to maximize integration and assure a common path for all supporting teams in a project. Ares I was no exception. During the planning of IVGVT, the team faced many challenges, including a lack of early identification of stakeholders, of team training in NASA's systems engineering practices, of solid requirements flow down, and of a top-down documentation strategy. The IVGVT team started test planning early in the program, before the systems engineering framework had been matured, due to an aggressive schedule. Therefore the IVGVT team increased their involvement in the Constellation systems engineering effort. Program-level requirements were established that flowed down to IVGVT, aligning all stakeholders to a common set of goals. The IVGVT team utilized the APPEL REQ Development Management course, which provided the team with a NASA-focused model to follow. The IVGVT team engaged directly with the model verification and validation process to assure that a solid set of requirements drove the need for the test event. The IVGVT team looked at the initial planning state, analyzed the current state, and then produced recommendations for the ideal future state of a wide range of systems engineering functions and processes. Based on this analysis, the IVGVT team was able to produce a set of lessons learned and to provide suggestions for future programs or tests to use in their initial planning phase.

  5. A proposal for a computer-based framework of support for public health in the management of biological incidents: the Czech Republic experience.

    PubMed

    Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel

    2012-11-01

    Biological incidents jeopardising public health require decision-making with one dominant feature: complexity. Therefore, public health decision-makers need appropriate support. Based on an analogy with business intelligence (BI) principles, a contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, an analysis of potential inputs to the framework is conducted, and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts; their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, a multi-agent model focusing on simulation of the environment, and an expert-system prototype were created. All prototypes proved to be usable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public-health-focused researchers and practitioners.
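
    Of the prototypes listed, only the expert-system idea is easy to illustrate compactly. The Python sketch below is a minimal forward-chaining rule engine; the facts and rules are invented for illustration and are not taken from the Czech prototypes.

    ```python
    # Hypothetical facts describing an incident
    facts = {"agent_contagious", "cases_rising"}

    # Each rule: (set of required facts, fact to assert)
    rules = [
        ({"agent_contagious", "cases_rising"}, "outbreak_suspected"),
        ({"outbreak_suspected"}, "notify_public_health_authority"),
        ({"outbreak_suspected", "vaccine_available"}, "start_vaccination_campaign"),
    ]

    def forward_chain(facts, rules):
        """Apply rules repeatedly until no new facts can be derived."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conditions <= derived and conclusion not in derived:
                    derived.add(conclusion)
                    changed = True
        return derived

    if __name__ == "__main__":
        print(sorted(forward_chain(facts, rules) - facts))   # newly derived recommendations
    ```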

  6. Identifying Threshold Concepts: Case Study of an Open Catchment Hydraulics Course

    ERIC Educational Resources Information Center

    Knight, D. B.; Callaghan, D. P.; Baldock, T. E.; Meyer, J. H. F.

    2014-01-01

    The Threshold Concept Framework is used to initiate a dialogue on an empirically supported pedagogy that focuses on students' conceptual understanding required for solving application-based problems. The present paper uses a triangulation approach to identify the threshold concept in a third-year undergraduate civil engineering course on open…

  7. Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations

    DOE PAGES

    Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting; ...

    2018-03-28

    Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.
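
    For orientation, a minimal sketch of the classical Gillespie direct method on an explicitly enumerated network (the algorithm the paper generalizes); the two-reaction A -> B -> C system and rate constants are illustrative only, and the sketch does not implement the network-free generalization.

```python
import random

# Gillespie direct method for an explicitly enumerated reaction network.
def gillespie(x, stoich, rate_fns, t_end):
    """x: dict of species counts; stoich: list of per-reaction count changes;
    rate_fns: list of propensity functions of the current state."""
    t, trajectory = 0.0, [(0.0, dict(x))]
    while t < t_end:
        a = [f(x) for f in rate_fns]          # propensities
        a0 = sum(a)
        if a0 == 0:
            break
        t += random.expovariate(a0)           # exponential waiting time to next event
        r = random.uniform(0, a0)             # pick which reaction fires
        j, acc = 0, a[0]
        while acc < r and j < len(a) - 1:
            j += 1
            acc += a[j]
        for species, change in stoich[j].items():
            x[species] += change
        trajectory.append((t, dict(x)))
    return trajectory

# Illustrative network: A -> B (rate 0.5*A), B -> C (rate 0.3*B).
x0 = {"A": 100, "B": 0, "C": 0}
stoich = [{"A": -1, "B": +1}, {"B": -1, "C": +1}]
rates = [lambda s: 0.5 * s["A"], lambda s: 0.3 * s["B"]]
print(gillespie(x0, stoich, rates, t_end=20.0)[-1])
```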

  8. Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting

    Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.

  9. Parallel scalability and efficiency of vortex particle method for aeroelasticity analysis of bluff bodies

    NASA Astrophysics Data System (ADS)

    Tolba, Khaled Ibrahim; Morgenthal, Guido

    2018-01-01

    This paper presents an analysis of the scalability and efficiency of a simulation framework based on the vortex particle method. The code is applied to the numerical aerodynamic analysis of line-like structures. The numerical code runs on multicore CPU and GPU architectures using the OpenCL framework. The focus of this paper is the analysis of the parallel efficiency and scalability of the method applied to an engineering test case, specifically the aeroelastic response of a long-span bridge girder at the construction stage. The target is to assess the optimal configuration and the required computer architecture, such that it becomes feasible to efficiently utilise the method within the computational resources available to a regular engineering office. The simulations and the scalability analysis are performed on a regular gaming-type computer.
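
    A minimal sketch of the speedup and parallel-efficiency metrics such a scalability study reports; the wall-time figures below are hypothetical, not results from the paper.

```python
# Hypothetical wall times (seconds) for the same simulation on p workers.
timings = {1: 3600.0, 2: 1900.0, 4: 1020.0, 8: 600.0}

t1 = timings[1]
for p, tp in sorted(timings.items()):
    speedup = t1 / tp
    efficiency = speedup / p        # parallel efficiency in [0, 1]
    print(f"p={p:2d}  speedup={speedup:5.2f}  efficiency={efficiency:5.2f}")
```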

  10. GoldenBraid 2.0: A Comprehensive DNA Assembly Framework for Plant Synthetic Biology

    PubMed Central

    Sarrion-Perdigones, Alejandro; Vazquez-Vilar, Marta; Palací, Jorge; Castelijns, Bas; Forment, Javier; Ziarsolo, Peio; Blanca, José; Granell, Antonio; Orzaez, Diego

    2013-01-01

    Plant synthetic biology aims to apply engineering principles to plant genetic design. One strategic requirement of plant synthetic biology is the adoption of common standardized technologies that facilitate the construction of increasingly complex multigene structures at the DNA level while enabling the exchange of genetic building blocks among plant bioengineers. Here, we describe GoldenBraid 2.0 (GB2.0), a comprehensive technological framework that aims to foster the exchange of standard DNA parts for plant synthetic biology. GB2.0 relies on the use of type IIS restriction enzymes for DNA assembly and proposes a modular cloning schema with positional notation that resembles the grammar of natural languages. Apart from providing an optimized cloning strategy that generates fully exchangeable genetic elements for multigene engineering, the GB2.0 toolkit offers an ever-growing open collection of DNA parts, including a group of functionally tested, premade genetic modules to build frequently used modules like constitutive and inducible expression cassettes, endogenous gene silencing and protein-protein interaction tools, etc. Use of the GB2.0 framework is facilitated by a number of Web resources that include a publicly available database, tutorials, and a software package that provides in silico simulations and laboratory protocols for GB2.0 part domestication and multigene engineering. In short, GB2.0 provides a framework to exchange both information and physical DNA elements among bioengineers to help implement plant synthetic biology projects. PMID:23669743

  11. GoldenBraid 2.0: a comprehensive DNA assembly framework for plant synthetic biology.

    PubMed

    Sarrion-Perdigones, Alejandro; Vazquez-Vilar, Marta; Palací, Jorge; Castelijns, Bas; Forment, Javier; Ziarsolo, Peio; Blanca, José; Granell, Antonio; Orzaez, Diego

    2013-07-01

    Plant synthetic biology aims to apply engineering principles to plant genetic design. One strategic requirement of plant synthetic biology is the adoption of common standardized technologies that facilitate the construction of increasingly complex multigene structures at the DNA level while enabling the exchange of genetic building blocks among plant bioengineers. Here, we describe GoldenBraid 2.0 (GB2.0), a comprehensive technological framework that aims to foster the exchange of standard DNA parts for plant synthetic biology. GB2.0 relies on the use of type IIS restriction enzymes for DNA assembly and proposes a modular cloning schema with positional notation that resembles the grammar of natural languages. Apart from providing an optimized cloning strategy that generates fully exchangeable genetic elements for multigene engineering, the GB2.0 toolkit offers an ever-growing open collection of DNA parts, including a group of functionally tested, premade genetic modules to build frequently used modules like constitutive and inducible expression cassettes, endogenous gene silencing and protein-protein interaction tools, etc. Use of the GB2.0 framework is facilitated by a number of Web resources that include a publicly available database, tutorials, and a software package that provides in silico simulations and laboratory protocols for GB2.0 part domestication and multigene engineering. In short, GB2.0 provides a framework to exchange both information and physical DNA elements among bioengineers to help implement plant synthetic biology projects.

  12. Mobius Assembly: A versatile Golden-Gate framework towards universal DNA assembly.

    PubMed

    Andreou, Andreas I; Nakayama, Naomi

    2018-01-01

    Synthetic biology builds upon the foundation of engineering principles, prompting innovation and improvement in biotechnology via a design-build-test-learn cycle. A community-wide standard in DNA assembly would enable bio-molecular engineering at the levels of predictivity and universality in design and construction that are comparable to other engineering fields. Golden Gate Assembly technology, with its robust capability to unidirectionally assemble numerous DNA fragments in a one-tube reaction, has the potential to deliver a universal standard framework for DNA assembly. While current Golden Gate Assembly frameworks (e.g. MoClo and Golden Braid) render either high cloning capacity or vector toolkit simplicity, the technology can be made more versatile: simple, streamlined, and cost/labor-efficient, without compromising capacity. Here we report the development of a new Golden Gate Assembly framework named Mobius Assembly, which combines vector toolkit simplicity with high cloning capacity. It is based on a two-level, hierarchical approach and utilizes a low-frequency cutter to reduce domestication requirements. Mobius Assembly embraces the standard overhang designs designated by MoClo, Golden Braid, and Phytobricks and is largely compatible with already available Golden Gate part libraries. In addition, dropout cassettes encoding chromogenic proteins were implemented for cost-free visible cloning screening that color-code different cloning levels. As proofs of concept, we have successfully assembled up to 16 transcriptional units of various pigmentation genes in both operon and multigene arrangements. Taken together, Mobius Assembly delivers enhanced versatility and efficiency in DNA assembly, facilitating improved standardization and automation.

  13. Protein design in systems metabolic engineering for industrial strain development.

    PubMed

    Chen, Zhen; Zeng, An-Ping

    2013-05-01

    Accelerating the process of industrial bacterial host strain development, aimed at increasing productivity, generating new bio-products or utilizing alternative feedstocks, requires the integration of complementary approaches to manipulate cellular metabolism and regulatory networks. Systems metabolic engineering extends the concept of classical metabolic engineering to the systems level by incorporating the techniques used in systems biology and synthetic biology, and offers a framework for the development of the next generation of industrial strains. As one of the most useful tools of systems metabolic engineering, protein design allows us to design and optimize cellular metabolism at a molecular level. Here, we review the current strategies of protein design for engineering cellular synthetic pathways, metabolic control systems and signaling pathways, and highlight the challenges of this subfield within the context of systems metabolic engineering. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Professional Ethics of Software Engineers: An Ethical Framework.

    PubMed

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD). EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions that the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can, and should, make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  15. An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)

    NASA Astrophysics Data System (ADS)

    van den Heever, Lize; Marais, Neilen; Slabber, Martin

    2016-08-01

    This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal Qualification Testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT system engineers are extremely happy with the AQF results, and even more so with the approach and process it enforces.
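
    A hypothetical sketch of the general idea of requirement-annotated tests from which a qualification report can be generated; the decorator, requirement IDs and test are illustrative inventions, not the MeerKAT AQF code or API.

```python
import functools

# Registry of (requirement ids, test name, annotated steps) for report generation.
REGISTRY = []

def verifies(*requirement_ids):
    """Annotate an integrated test with the requirement(s) it verifies."""
    def wrap(test_fn):
        REGISTRY.append((requirement_ids, test_fn.__name__, test_fn.__doc__))
        @functools.wraps(test_fn)
        def inner(*args, **kwargs):
            return test_fn(*args, **kwargs)
        return inner
    return wrap

@verifies("CAM-REQ-012", "CAM-VER-012a")     # hypothetical requirement IDs
def test_antenna_pointing_command():
    """Step: send pointing command. Evaluation: monitored azimuth within tolerance."""
    assert True  # placeholder for the real integrated test logic

def qualification_report():
    # Collect the annotations into a simple procedure/report listing.
    for reqs, name, doc in REGISTRY:
        print(", ".join(reqs), "->", name, ":", doc)

qualification_report()
```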

  16. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.
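
    To make the idea of an executable device prototype concrete, here is a toy state-machine fragment of the kind such models capture; the states, events and lockout rule are hypothetical illustrations, not the PCA pump logic from the paper.

```python
# Toy executable model of a PCA-pump bolus-request workflow. All states,
# events and the lockout rule below are hypothetical, for illustration only.
LOCKOUT_S = 600  # illustrative safety rule: minimum seconds between boluses

class PcaPumpModel:
    def __init__(self):
        self.state = "IDLE"
        self.last_bolus_t = None

    def on_event(self, event, t):
        if self.state == "IDLE" and event == "BOLUS_REQUEST":
            if self.last_bolus_t is None or t - self.last_bolus_t >= LOCKOUT_S:
                self.state, self.last_bolus_t = "DELIVERING", t
            else:
                self.state = "LOCKED_OUT"
        elif self.state == "DELIVERING" and event == "DELIVERY_DONE":
            self.state = "IDLE"
        elif self.state == "LOCKED_OUT" and event == "LOCKOUT_EXPIRED":
            self.state = "IDLE"
        return self.state

pump = PcaPumpModel()
print(pump.on_event("BOLUS_REQUEST", t=0))      # DELIVERING
print(pump.on_event("DELIVERY_DONE", t=30))     # IDLE
print(pump.on_event("BOLUS_REQUEST", t=120))    # LOCKED_OUT (within lockout window)
```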

  17. Cell-based therapeutics: the next pillar of medicine.

    PubMed

    Fischbach, Michael A; Bluestone, Jeffrey A; Lim, Wendell A

    2013-04-03

    Two decades ago, the pharmaceutical industry-long dominated by small-molecule drugs-was revolutionized by the advent of biologics. Today, biomedicine sits on the cusp of a new revolution: the use of microbial and human cells as versatile therapeutic engines. Here, we discuss the promise of this "third pillar" of therapeutics in the context of current scientific, regulatory, economic, and perceptual challenges. History suggests that the advent of cellular medicines will require the development of a foundational cellular engineering science that provides a systematic framework for safely and predictably altering and regulating cellular behaviors.

  18. Engineering education in 21st century

    NASA Astrophysics Data System (ADS)

    Alam, Firoz; Sarkar, Rashid; La Brooy, Roger; Chowdhury, Harun

    2016-07-01

    The internationalization of engineering curricula and engineering practices has begun in Europe, Anglosphere (English-speaking) nations and emerging Asian economies through the Bologna Process and the International Engineering Alliance (Washington Accord). Both the Bologna Process and the Washington Accord have introduced standardized outcome-based engineering competencies and frameworks for attaining these competencies, by restructuring existing measures and undertaking new ones for an intelligent adaptation of the engineering curriculum and pedagogy. Graduates educated under such a standardized outcome-based curriculum can thus move freely as professional engineers, with mutual recognition among member nations. Despite having a similar or nearly similar curriculum, Bangladeshi engineering graduates currently cannot obtain mutual recognition in Washington Accord and Bologna Process nations because their curriculum and pedagogy do not comply with the outcome-based approach. This paper emphasizes the steps that engineering educational institutions and the professional body in Bangladesh must undertake to make their engineering competencies, curriculum and pedagogy compliant with the global engineering alliances. Achieving such compliance will usher in a new era of global mobility and global engagement for Bangladesh-trained engineering graduates.

  19. Development of the Functional Flow Block Diagram for the J-2X Rocket Engine System

    NASA Technical Reports Server (NTRS)

    White, Thomas; Stoller, Sandra L.; Greene, WIlliam D.; Christenson, Rick L.; Bowen, Barry C.

    2007-01-01

    The J-2X program calls for the upgrade of the Apollo-era Rocketdyne J-2 engine to higher power levels, using new materials and manufacturing techniques, and with more restrictive safety and reliability requirements than prior human-rated engines in NASA history. Such requirements demand a comprehensive systems engineering effort to ensure success. Pratt & Whitney Rocketdyne system engineers performed a functional analysis of the engine to establish the functional architecture. J-2X functions were captured in six major operational blocks. Each block was divided into sub-blocks or states. In each sub-block, the functions necessary to perform each state were determined. A functional engine schematic consistent with the fidelity of the system model was defined for this analysis. The blocks, sub-blocks, and functions were sequentially numbered to differentiate the states in which the functions were performed and to indicate the sequence of events. The Engine System was functionally partitioned to provide separate and unique functional operators. Establishing unique functional operators as work output of the System Architecture process is novel in Liquid Propulsion Engine design. Each functional operator was described such that its unique functionality was identified. The decomposed functions were then allocated to the functional operators, both of which were inputs to the subsystem and component performance specifications. PWR also used a novel approach to identify and map the engine functional requirements to customer-specified functions. The final result was a comprehensive Functional Flow Block Diagram (FFBD) for the J-2X Engine System, decomposed to the component level and mapped to all functional requirements. This FFBD greatly facilitates component specification development, providing a well-defined trade space for functional trades at the subsystem and component level. It also provides a framework for function-based failure modes and effects analysis (FMEA) and a rigorous baseline for the functional architecture.
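
    A minimal sketch of the kind of numbered functional decomposition and operator allocation described above; the block numbers, function descriptions and operator names are illustrative placeholders, not the J-2X decomposition.

```python
from collections import defaultdict

# Numbered functions (block.sub-block.function) allocated to functional operators.
# All numbers, descriptions and operator names below are illustrative only.
functions = [
    ("1.1.1", "Receive engine start command", "Engine Controller"),
    ("1.1.2", "Pressurize propellant feed lines", "Propellant Feed Operator"),
    ("1.2.1", "Ignite gas generator", "Ignition Operator"),
    ("2.1.1", "Maintain commanded thrust", "Engine Controller"),
]

# Grouping the decomposed functions by operator is the input to the
# subsystem/component performance specifications.
by_operator = defaultdict(list)
for number, desc, operator in functions:
    by_operator[operator].append((number, desc))

for operator, funcs in by_operator.items():
    print(operator)
    for number, desc in sorted(funcs):
        print("  ", number, desc)
```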

  20. IT Requirements Integration in High-Rise Construction Design Projects

    NASA Astrophysics Data System (ADS)

    Levina, Anastasia; Ilin, Igor; Esedulaev, Rustam

    2018-03-01

    The paper discusses the growing role of IT support in the operation of modern high-rise buildings, driven by the complexity of managing buildings' engineering systems and by consumers' requirements for IT infrastructure. The existing regulatory framework for developing design documentation for construction, including high-rise buildings, is analyzed, and a lack of coherence between this documentation and the requirements for creating an automated management system and the corresponding IT infrastructure is identified. The lack of integration between these areas causes delays and inefficiencies both at the design stage and when putting the building into operation. The paper proposes an approach for coordinating the IT infrastructure requirements of high-rise buildings with the design documentation for construction. This problem can be solved within the enterprise architecture concept by coordinating the requirements of the IT and technological layers at the design stage of construction.

  1. Development and Application of a Systems Engineering Framework to Support Online Course Design and Delivery

    ERIC Educational Resources Information Center

    Bozkurt, Ipek; Helm, James

    2013-01-01

    This paper develops a systems engineering-based framework to assist in the design of an online engineering course. Specifically, the purpose of the framework is to provide a structured methodology for the design, development and delivery of a fully online course, either brand new or modified from an existing face-to-face course. The main strength…

  2. The engineering of cybernetic systems

    NASA Astrophysics Data System (ADS)

    Fry, Robert L.

    2002-05-01

    This tutorial develops a logical basis for the engineering of systems that operate cybernetically. The term cybernetic system has a clear quantitative definition. It is a system that dynamically matches acquired information to selected actions relative to a computational issue that defines the essential purpose of the system or machine. This notion requires that information and control be further quantified. The logic of questions and assertions as developed by Cox provides one means of doing this. The design and operation of cybernetic systems can be understood by contrasting these kinds of systems with communication systems and information theory as developed by Shannon. The joint logic of questions and assertions can be seen to underlie and be common to both information theory as applied to the design of discrete communication systems and to a theory of discrete general systems. The joint logic captures a natural complementarity between systems that transmit and receive information and those that acquire and act on it. Specific comparisons and contrasts are made between the source rate and channel capacity of a communication system and the acquisition rate and control capacity of a general system. An overview is provided of the joint logic of questions and assertions and the ties that this logic has to both conventional information theory and to a general theory of systems. I-diagrams, the interrogative complement of Venn diagrams, are described as providing valuable reasoning tools. An initial framework is suggested for the design of cybernetic systems. Two examples are given to illustrate this framework as applied to discrete cybernetic systems. These examples include a predator-prey problem as illustrated through "The Dog Chrysippus Pursuing its Prey," and the derivation of a single-neuron system that operates cybernetically and is biologically plausible. Future areas of research are highlighted which require development for a mature engineering framework.
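
    For readers unfamiliar with the communication-theoretic half of the analogy drawn above, the standard Shannon quantities are recalled below; the cybernetic counterparts (acquisition rate, control capacity) are defined in the tutorial itself, and this block only restates well-known formulas.

```latex
% Standard Shannon quantities referenced by the source-rate / channel-capacity
% side of the comparison; the general-system counterparts are by analogy.
\begin{align*}
  H(X)   &= -\sum_{x} p(x)\log_2 p(x)   && \text{source entropy (information rate per symbol)}\\
  I(X;Y) &= H(X) - H(X\mid Y)           && \text{information conveyed across the channel}\\
  C      &= \max_{p(x)} I(X;Y)          && \text{channel capacity}
\end{align*}
```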

  3. Software Engineering Frameworks: Textbooks vs. Student Perceptions

    ERIC Educational Resources Information Center

    McMaster, Kirby; Hadfield, Steven; Wolthuis, Stuart; Sambasivam, Samuel

    2012-01-01

    This research examines the frameworks used by Computer Science and Information Systems students at the conclusion of their first semester of study of Software Engineering. A questionnaire listing 64 Software Engineering concepts was given to students upon completion of their first Software Engineering course. This survey was given to samples of…

  4. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and seeks to harness their best features while simplifying the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, owing to the well-developed methods available for predicting these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
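
    A minimal sketch of the recommended process: Monte Carlo sampling on a load-resistance model to estimate a failure probability, with a rough expected-cost figure of the kind fed into life cycle cost optimization. The distributions, parameters and cost value are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative load-resistance model: failure when load exceeds resistance.
load = rng.normal(loc=300.0, scale=40.0, size=n)                      # e.g. stress [MPa]
resistance = rng.lognormal(mean=np.log(420.0), sigma=0.08, size=n)    # e.g. strength [MPa]

failures = load > resistance
p_f = failures.mean()                         # Monte Carlo failure probability
se = np.sqrt(p_f * (1 - p_f) / n)             # standard error, to judge sample size

print(f"estimated failure probability: {p_f:.2e} +/- {se:.1e}")

# Expected failure cost per unit (hypothetical cost figure), the kind of
# quantity that enters a life cycle cost optimization.
cost_of_failure = 50_000.0
print("expected failure cost per unit:", p_f * cost_of_failure)
```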

  5. Scaling up Three-Dimensional Science Learning through Teacher-Led Study Groups across a State

    ERIC Educational Resources Information Center

    Reiser, Brian J.; Michaels, Sarah; Moon, Jean; Bell, Tara; Dyer, Elizabeth; Edwards, Kelsey D.; McGill, Tara A. W.; Novak, Michael; Park, Aimee

    2017-01-01

    The vision for science teaching in the Framework for K-12 Science Education and the Next Generation Science Standards requires a radical departure from traditional science teaching. Science literacy is defined as three-dimensional (3D), in which students engage in science and engineering practices to develop and apply science disciplinary ideas…

  6. The Large Synoptic Survey Telescope OCS and TCS models

    NASA Astrophysics Data System (ADS)

    Schumacher, German; Delgado, Francisco

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) is a project envisioned as a system of systems with demanding science, technical, and operational requirements that must perform as a fully integrated unit. The design and implementation of such a system poses significant engineering challenges in requirements analysis, detailed interface definition, and studies of operational modes and control strategies. The OMG Systems Modeling Language (SysML) has been selected as the framework for the systems engineering analysis and documentation for the LSST. Models for the overall system architecture and the different observatory subsystems have been built, describing requirements, structure, interfaces and behavior. In this paper we show the models for the Observatory Control System (OCS) and the Telescope Control System (TCS), and how this methodology has helped clarify the design and requirements. In one common language, the relationships of the OCS, TCS, Camera and Data Management subsystems are captured with models of their structure, behavior, requirements and the traceability between them.

  7. Fast and Efficient Feature Engineering for Multi-Cohort Analysis of EHR Data.

    PubMed

    Ozery-Flato, Michal; Yanover, Chen; Gottlieb, Assaf; Weissbrod, Omer; Parush Shear-Yashuv, Naama; Goldschmidt, Yaara

    2017-01-01

    We present a framework for feature engineering, tailored for longitudinal structured data, such as electronic health records (EHRs). To fast-track feature engineering and extraction, the framework combines general-use plug-in extractors, a multi-cohort management mechanism, and modular memoization. Using this framework, we rapidly extracted thousands of features from diverse and large healthcare data sources in multiple projects.
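
    A minimal sketch of the three ingredients named above (plug-in extractors, a cohort abstraction, and memoization so repeated extraction is not recomputed); the registration decorator, extractor name and toy cohort are illustrative, not the framework's actual API.

```python
import functools
import json

# Registry of plug-in feature extractors.
EXTRACTORS = {}

def extractor(name):
    """Register a plug-in feature extractor under a name."""
    def wrap(fn):
        EXTRACTORS[name] = fn
        return fn
    return wrap

@functools.lru_cache(maxsize=None)      # memoization keyed on (extractor, cohort)
def extract(name, cohort_key):
    cohort = json.loads(cohort_key)     # cohort passed as a hashable serialized key
    return EXTRACTORS[name](cohort)

@extractor("n_encounters")
def n_encounters(cohort):
    # Toy longitudinal feature: number of encounters per patient.
    return {pid: len(records) for pid, records in cohort.items()}

cohort = {"patient_1": ["2013-01-02", "2013-06-11"], "patient_2": ["2014-03-09"]}
key = json.dumps(cohort, sort_keys=True)
print(extract("n_encounters", key))      # computed
print(extract("n_encounters", key))      # served from the memo cache
```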

  8. Overview of NASA MSFC IEC Federated Engineering Collaboration Capability

    NASA Technical Reports Server (NTRS)

    Moushon, Brian; McDuffee, Patrick

    2005-01-01

    The MSFC IEC project is currently developing a single collaborative engineering framework across independent NASA centers. The federated approach allows NASA centers to maintain their diversity and uniqueness while providing interoperability: individual systems are integrated in a federated framework without compromising individual center capabilities. MSFC IEC's Federation Framework will have a direct effect on how engineering data are managed across the Agency. The approach is a direct response to Columbia Accident Investigation Board (CAIB) finding F7.4-11, which states that the Space Shuttle Program has a wealth of data tucked away in multiple databases without a convenient way to integrate and use the data for management, engineering, or safety decisions. IEC's federated capability is further supported by OneNASA recommendation 6, which identifies the need to enhance cross-Agency collaboration by putting in place common engineering and collaborative tools and databases, processes, and knowledge-sharing structures. MSFC's IEC Federated Framework is loosely connected to other engineering applications and can provide users with the integration needed to achieve an Agency view of the entire product definition and development process, while allowing work to be distributed across NASA centers and contractors. The IEC DDMS federation framework eliminates the need to develop a single, enterprise-wide data model, since a common data model shared between NASA centers and contractors is very difficult to achieve.

  9. Control of fluxes in metabolic networks

    PubMed Central

    Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu

    2016-01-01

    Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. PMID:27197218

  10. Massachusetts Science and Technology Engineering Curriculum Framework

    ERIC Educational Resources Information Center

    Massachusetts Department of Education, 2006

    2006-01-01

    This 2006 "Massachusetts Science and Technology/Engineering Curriculum Framework" provides a guide for teachers and curriculum coordinators regarding specific content to be taught from PreK through high school. Following this "Organization" chapter, the "Framework" contains the following sections: (1) Philosophy and…

  11. The influence of interdisciplinary collaboration on decision making: a framework to analyse stakeholder coalitions, evolution and learning in strategic delta planning

    NASA Astrophysics Data System (ADS)

    Vermoolen, Myrthe; Hermans, Leon

    2015-04-01

    The sustained development of urbanizing deltas requires that conflicting interests are reconciled, in an environment characterized by technical complexity and knowledge limitations. However, integrating ideas and establishing cooperation between actors with different backgrounds and roles still proves a challenge. Agreeing on strategic choices is difficult, and implementation of agreed plans may lead to unanticipated and unintended outcomes. How can individual disciplinary perspectives come together and establish a broadly supported and well-informed plan, the implementation of which contributes to sustainable delta development? The growing recognition of this need to bring together different stakeholders and different disciplinary perspectives runs parallel to a paradigm shift from 'hard' hydrological engineering to multi-functional and more 'soft' hydrological engineering in water management. As a result, there is now more attention to interdisciplinary collaboration that not only takes the physical characteristics of water systems into account, but also the interaction between the physical and societal components of these systems. Thus, it is important to study interdisciplinary collaboration and how it influences decision-making. Our research looks into this connection, using a case in delta planning in the Netherlands, where there have been several attempts at integrating spatial planning and flood risk/water management, e.g. in the Dutch Delta Programme. This means that spatial designers and their designs, alongside civil engineers and other disciplines, play an important role in the strategic delta planning process. This study explores the roles of stakeholders, experts and policy makers in interdisciplinary decision-making in dynamic delta planning processes, using theories and methods that focus on coalitions, learning and changes over time in policy and planning processes. This requires an expansion of the existing frameworks used to study interdisciplinary collaboration. The question here is how to combine policy science frameworks (e.g. the Advocacy Coalition Framework) and social network methods (e.g. Social Network Analysis) with frameworks that allow a connection with the physical delta system. This will result in a new framework for analysing interdisciplinary stakeholder coalitions, evolution and learning in strategic delta planning. The use of this framework will be illustrated with an example from strategic delta planning in the Dutch Southwest Delta. With this, we want to see how the spatial planning and water management disciplines have combined into new policies for delta management in the Netherlands over the past 25 years.

  12. An Australian study of possible selves perceived by undergraduate engineering students

    NASA Astrophysics Data System (ADS)

    Bennett, Dawn; Male, Sally A.

    2017-11-01

    In this study, we worked with second-year engineering students at an Australian university to examine previously identified threshold concepts within the theoretical framework of Possible Selves. Using workshops as the context for intensive work with students, students were encouraged to consider their future lives and work, including their engineering fears, expectations, and aspirations. The findings revealed many students to have a poor understanding of the realities of engineering work. Moreover, perceived gaps between self-efficacy and the requirements of engineering work appeared to be motivating if students deemed it possible to reduce the gap, but demotivating if they identified a characteristic over which there was perceived to be no control. The study suggests that these engineering students needed more opportunities to explore both the roles of engineers and their own possible selves. Overall, the findings indicate that higher education students may need encouragement and support to explore potential future roles, and they strengthen calls for further research in this area.

  13. A machine learning-based framework to identify type 2 diabetes through electronic health records

    PubMed Central

    Zheng, Tao; Xie, Wei; Xu, Liling; He, Xiaoying; Zhang, Ya; You, Mingrong; Yang, Gong; Chen, You

    2016-01-01

    Objective: To discover diverse genotype-phenotype associations affiliated with Type 2 Diabetes Mellitus (T2DM) via genome-wide association study (GWAS) and phenome-wide association study (PheWAS), more cases (T2DM subjects) and controls (subjects without T2DM) need to be identified (e.g., via Electronic Health Records (EHR)). However, existing expert-based identification algorithms often suffer from a low recall rate and can miss a large number of valuable samples under conservative filtering standards. The goal of this work is to develop, as a pilot study, a semi-automated machine learning-based framework that liberalizes the filtering criteria to improve the recall rate while keeping a low false positive rate. Materials and methods: We propose a data-informed framework for identifying subjects with and without T2DM from EHR via feature engineering and machine learning. We evaluate and contrast the identification performance of widely used machine learning models within our framework, including k-Nearest-Neighbors, Naïve Bayes, Decision Tree, Random Forest, Support Vector Machine and Logistic Regression. Our framework was evaluated on 300 patient samples (161 cases, 60 controls and 79 unconfirmed subjects), randomly selected from a 23,281-subject diabetes-related cohort retrieved from a regional distributed EHR repository covering 2012 to 2014. Results: We apply the top-performing machine learning algorithms to the engineered features. We benchmark and contrast the accuracy, precision, AUC, sensitivity and specificity of the classification models against the state-of-the-art expert algorithm for identification of T2DM subjects. Our results indicate that the framework achieved high identification performance (∼0.98 average AUC), much higher than that of the state-of-the-art algorithm (0.71 AUC). Discussion: Expert algorithm-based identification of T2DM subjects from EHR is often hampered by high miss rates due to conservative selection criteria. Our framework leverages machine learning and feature engineering to loosen such selection criteria and achieve a high identification rate of cases and controls. Conclusions: Our proposed framework demonstrates a more accurate and efficient approach for identifying subjects with and without T2DM from EHR. PMID:27919371

  14. A machine learning-based framework to identify type 2 diabetes through electronic health records.

    PubMed

    Zheng, Tao; Xie, Wei; Xu, Liling; He, Xiaoying; Zhang, Ya; You, Mingrong; Yang, Gong; Chen, You

    2017-01-01

    To discover diverse genotype-phenotype associations affiliated with Type 2 Diabetes Mellitus (T2DM) via genome-wide association study (GWAS) and phenome-wide association study (PheWAS), more cases (T2DM subjects) and controls (subjects without T2DM) need to be identified (e.g., via Electronic Health Records (EHR)). However, existing expert-based identification algorithms often suffer from a low recall rate and can miss a large number of valuable samples under conservative filtering standards. The goal of this work is to develop, as a pilot study, a semi-automated machine learning-based framework that liberalizes the filtering criteria to improve the recall rate while keeping a low false positive rate. We propose a data-informed framework for identifying subjects with and without T2DM from EHR via feature engineering and machine learning. We evaluate and contrast the identification performance of widely used machine learning models within our framework, including k-Nearest-Neighbors, Naïve Bayes, Decision Tree, Random Forest, Support Vector Machine and Logistic Regression. Our framework was evaluated on 300 patient samples (161 cases, 60 controls and 79 unconfirmed subjects), randomly selected from a 23,281-subject diabetes-related cohort retrieved from a regional distributed EHR repository covering 2012 to 2014. We apply the top-performing machine learning algorithms to the engineered features. We benchmark and contrast the accuracy, precision, AUC, sensitivity and specificity of the classification models against the state-of-the-art expert algorithm for identification of T2DM subjects. Our results indicate that the framework achieved high identification performance (∼0.98 average AUC), much higher than that of the state-of-the-art algorithm (0.71 AUC). Expert algorithm-based identification of T2DM subjects from EHR is often hampered by high miss rates due to conservative selection criteria. Our framework leverages machine learning and feature engineering to loosen such selection criteria and achieve a high identification rate of cases and controls. Our proposed framework demonstrates a more accurate and efficient approach for identifying subjects with and without T2DM from EHR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
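
    A minimal sketch of benchmarking a few of the classifiers named above with cross-validated AUC, using synthetic data in place of the real cohort and engineered EHR features; models, parameters and data are stand-ins, not the study's configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for engineered EHR features (300 samples, binary label).
X, y = make_classification(n_samples=300, n_features=40, n_informative=10,
                           random_state=0)

models = {
    "k-NN": KNeighborsClassifier(),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name:20s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```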

  15. Coordination and Cooperation to Achieve the GEOSS Space Segment: A Systems Approach

    NASA Technical Reports Server (NTRS)

    Killough, Brian D., Jr.

    2007-01-01

    Established in April 2007, the CEOS Systems Engineering Office (SEO) has made significant contributions in support of CEOS and the virtual constellations. These accomplishments include (1) constellation trade studies for Atmospheric Composition and Land Surface Imaging, (2) a new engineering framework for requirements definition, assessment and architecture planning, (3) completion of a draft requirements document and gap analysis for the Atmospheric Composition Virtual Constellation, and (4) the development of a DVD video highlighting CEOS and the Virtual Constellation concept.

  16. THE NUCLEAR RAMJET PROPULSION SYSTEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkle, T.C.

    1959-06-30

    The most practical nuclear ramjet systems consist of a suitable inlet diffuser system followed by a single-pass, straight-through heat exchanger (reactor) which couples into a typical exhaust nozzle. Within this framework, the possibilities are governed by the aerodynamic requirements of flight, the nuclear requirements of the reactor, the chemical problems associated with breathing air, and the mechanical properties of materials at elevated temperatures. The major research and development areas which must be entered in the actual production of such an engine are discussed. (W.D.M.)

  17. A Generic Software Architecture For Prognostics

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Daigle, Matthew J.; Sankararaman, Shankar; Goebel, Kai; Watkins, Jason

    2017-01-01

    Prognostics is a systems engineering discipline focused on predicting end-of-life of components and systems. As a relatively new and emerging technology, there are few fielded implementations of prognostics, due in part to practitioners perceiving a large hurdle in developing the models, algorithms, architecture, and integration pieces. As a result, no open software frameworks for applying prognostics currently exist. This paper introduces the Generic Software Architecture for Prognostics (GSAP), an open-source, cross-platform, object-oriented software framework and support library for creating prognostics applications. GSAP was designed to make prognostics more accessible and enable faster adoption and implementation by industry, by reducing the effort and investment required to develop, test, and deploy prognostics. This paper describes the requirements, design, and testing of GSAP. Additionally, a detailed case study involving battery prognostics demonstrates its use.

  18. Services supporting collaborative alignment of engineering networks

    NASA Astrophysics Data System (ADS)

    Jansson, Kim; Uoti, Mikko; Karvonen, Iris

    2015-08-01

    Large-scale facilities such as power plants, process factories, ships and communication infrastructures are often engineered and delivered through geographically distributed operations. The competencies required are usually distributed across several contributing organisations. In these complicated projects, it is of key importance that all partners work coherently towards a common goal. VTT and a number of industrial organisations in the marine sector have participated in a national collaborative research programme addressing these needs. The main output of this programme was development of the Innovation and Engineering Maturity Model for Marine-Industry Networks. The recently completed European Union Framework Programme 7 project COIN developed innovative solutions and software services for enterprise collaboration and enterprise interoperability. One area of focus in that work was services for collaborative project management. This article first addresses a number of central underlying research themes and previous research results that have influenced the development work mentioned above. This article presents two approaches for the development of services that support distributed engineering work. Experience from use of the services is analysed, and potential for development is identified. This article concludes with a proposal for consolidation of the two above-mentioned methodologies. This article outlines the characteristics and requirements of future services supporting collaborative alignment of engineering networks.

  19. Mobius Assembly: A versatile Golden-Gate framework towards universal DNA assembly

    PubMed Central

    Andreou, Andreas I.

    2018-01-01

    Synthetic biology builds upon the foundation of engineering principles, prompting innovation and improvement in biotechnology via a design-build-test-learn cycle. A community-wide standard in DNA assembly would enable bio-molecular engineering at the levels of predictivity and universality in design and construction that are comparable to other engineering fields. Golden Gate Assembly technology, with its robust capability to unidirectionally assemble numerous DNA fragments in a one-tube reaction, has the potential to deliver a universal standard framework for DNA assembly. While current Golden Gate Assembly frameworks (e.g. MoClo and Golden Braid) render either high cloning capacity or vector toolkit simplicity, the technology can be made more versatile—simple, streamlined, and cost/labor-efficient, without compromising capacity. Here we report the development of a new Golden Gate Assembly framework named Mobius Assembly, which combines vector toolkit simplicity with high cloning capacity. It is based on a two-level, hierarchical approach and utilizes a low-frequency cutter to reduce domestication requirements. Mobius Assembly embraces the standard overhang designs designated by MoClo, Golden Braid, and Phytobricks and is largely compatible with already available Golden Gate part libraries. In addition, dropout cassettes encoding chromogenic proteins were implemented for cost-free visible cloning screening that color-code different cloning levels. As proofs of concept, we have successfully assembled up to 16 transcriptional units of various pigmentation genes in both operon and multigene arrangements. Taken together, Mobius Assembly delivers enhanced versatility and efficiency in DNA assembly, facilitating improved standardization and automation. PMID:29293531

  20. Design of an embedded inverse-feedforward biomolecular tracking controller for enzymatic reaction processes.

    PubMed

    Foo, Mathias; Kim, Jongrae; Sawlekar, Rucha; Bates, Declan G

    2017-04-06

    Feedback control is widely used in chemical engineering to improve the performance and robustness of chemical processes. Feedback controllers require a 'subtractor' that is able to compute the error between the process output and the reference signal. In the case of embedded biomolecular control circuits, subtractors designed using standard chemical reaction network theory can only realise one-sided subtraction, rendering standard controller design approaches inadequate. Here, we show how a biomolecular controller that allows tracking of required changes in the outputs of enzymatic reaction processes can be designed and implemented within the framework of chemical reaction network theory. The controller architecture employs an inversion-based feedforward controller that compensates for the limitations of the one-sided subtractor that generates the error signals for a feedback controller. The proposed approach requires significantly fewer chemical reactions to implement than alternative designs, and should have wide applicability throughout the fields of synthetic biology and biological engineering.
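
    A conceptual sketch of the control architecture described (inversion-based feedforward combined with feedback) applied to a generic first-order process; the process model, gains and reference are hypothetical, and the sketch does not attempt the paper's chemical-reaction-network realisation.

```python
# Inverse-model feedforward plus proportional feedback on a first-order
# process y' = -a*y + b*u, integrated with Euler steps. Parameters hypothetical.
a, b = 0.5, 1.0      # process parameters
kp = 2.0             # proportional feedback gain
dt, steps = 0.01, 2000

y = 0.0
for k in range(steps):
    t = k * dt
    r = 1.0 if t < 10.0 else 2.0       # reference signal to track
    u_ff = (a / b) * r                 # inverse-model feedforward (steady-state input for y = r)
    e = r - y                          # tracking error (two-sided here; the paper uses a one-sided subtractor)
    u = u_ff + kp * e                  # feedforward plus feedback
    y += dt * (-a * y + b * u)         # Euler step of the process model
    if k % 500 == 0:
        print(f"t={t:5.2f}  r={r:.2f}  y={y:.3f}")
```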

  1. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increase are insufficient to meet future projected food demand. Furthermore, with projected reductions in arable land, decreases in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole-plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers working toward a realistic virtual plant. The framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable form that makes use of current advances in high-performance and parallel computing. We are currently designing a user-friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much-needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole-plant behavior, and improving prediction of plant and ecosystem responses to the environment.

  2. Characterizing the impact of spatiotemporal variations in stormwater infrastructure on hydrologic conditions

    NASA Astrophysics Data System (ADS)

    Jovanovic, T.; Mejia, A.; Hale, R. L.; Gironas, J. A.

    2015-12-01

    Urban stormwater infrastructure design has evolved over time, reflecting changes in stormwater policy and regulations, and in engineering design. This evolution makes urban basins heterogeneous socio-ecological-technological systems. We hypothesize that this heterogeneity creates unique impact trajectories in time and impact hotspots in space within and across cities. To explore this, we develop and implement a network hydro-engineering modeling framework based on high-resolution digital elevation and stormwater infrastructure data. The framework also accounts for climatic, soil, land use, and vegetation conditions in an urban basin, thus making it useful for studying the impacts of stormwater infrastructure across cities. Here, to evaluate the framework, we apply it to urban basins in the metropolitan area of Phoenix, Arizona. We use it to estimate different metrics that characterize the storm-event hydrologic response. We estimate both traditional metrics (e.g., peak flow, time to peak, and runoff volume) and new metrics (e.g., basin-scale dispersion mechanisms). We also use the dispersion mechanisms to assess the scaling characteristics of urban basins. Ultimately, we find that the proposed framework can be used to understand and characterize the impacts of stormwater infrastructure on hydrologic conditions within a basin. Additionally, we find that the scaling approach helps in synthesizing information but requires further validation using additional urban basins.
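
    A minimal sketch of the traditional storm-event metrics mentioned (peak flow, time to peak, runoff volume) computed from a hydrograph; the hydrograph below is synthetic, not output of the network hydro-engineering model.

```python
import numpy as np

dt = 60.0                                              # time step [s]
t = np.arange(0.0, 6 * 3600.0, dt)                     # 6-hour event window
flow = 5.0 * np.exp(-(((t - 5400.0) / 1800.0) ** 2))   # discharge [m^3/s], synthetic pulse

peak_flow = float(flow.max())                          # [m^3/s]
time_to_peak = float(t[flow.argmax()]) / 60.0          # [min]
runoff_volume = float((flow * dt).sum())               # [m^3], rectangle-rule integral

print(f"peak flow     = {peak_flow:.2f} m^3/s")
print(f"time to peak  = {time_to_peak:.0f} min")
print(f"runoff volume = {runoff_volume:.0f} m^3")
```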

  3. Framework for a space shuttle main engine health monitoring system

    NASA Technical Reports Server (NTRS)

    Hawman, Michael W.; Galinaitis, William S.; Tulpule, Sharayu; Mattedi, Anita K.; Kamenetz, Jeffrey

    1990-01-01

    A framework developed for a health management system (HMS) which is directed at improving the safety of operation of the Space Shuttle Main Engine (SSME) is summarized. An emphasis was placed on near term technology through requirements to use existing SSME instrumentation and to demonstrate the HMS during SSME ground tests within five years. The HMS framework was developed through an analysis of SSME failure modes, fault detection algorithms, sensor technologies, and hardware architectures. A key feature of the HMS framework design is that a clear path from the ground test system to a flight HMS was maintained. Fault detection techniques based on time series, nonlinear regression, and clustering algorithms were developed and demonstrated on data from SSME ground test failures. The fault detection algorithms exhibited 100 percent detection of faults, had an extremely low false alarm rate, and were robust to sensor loss. These algorithms were incorporated into a hierarchical decision making strategy for overall assessment of SSME health. A preliminary design for a hardware architecture capable of supporting real time operation of the HMS functions was developed. Utilizing modular, commercial off-the-shelf components produced a reliable low cost design with the flexibility to incorporate advances in algorithm and sensor technology as they become available.
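
    A minimal sketch of one of the fault-detection ideas mentioned (comparing a sensor signal against a model prediction and flagging sustained residual excursions); the model, thresholds, persistence rule and synthetic data are illustrative, not the SSME algorithms or data.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 10.0, 0.01)
nominal = 3000.0 + 50.0 * np.sin(0.5 * t)          # hypothetical model prediction
measured = nominal + rng.normal(0.0, 5.0, t.size)  # synthetic sensor signal
measured[t > 7.0] += 120.0                         # injected fault-like shift

residual = measured - nominal
threshold = 6.0 * residual[t <= 7.0].std()         # threshold set from "healthy" data
persist = 10                                       # require N consecutive exceedances

exceed = np.abs(residual) > threshold
window_sums = np.convolve(exceed.astype(int), np.ones(persist, dtype=int), "valid")
alarm = window_sums == persist
first = np.argmax(alarm) if alarm.any() else None
print("fault declared at t =",
      None if first is None else round(t[first + persist - 1], 2), "s")
```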

  4. Harnessing cell–biomaterial interactions for osteochondral tissue regeneration.

    PubMed

    Kim, Kyobum; Yoon, Diana M; Mikos, Antonios; Kasper, F Kurtis

    2012-01-01

    Articular cartilage that is damaged or diseased often requires surgical intervention to repair the tissue; therefore, tissue engineering strategies have been developed to aid in cartilage regeneration. Tissue engineering approaches often require the integration of cells, biomaterials, and growth factors to direct and support tissue formation. A variety of cell types have been isolated from adipose, bone marrow, muscle, and skin tissue to promote cartilage regeneration. The interaction of cells with each other and with their surrounding environment has been shown to play a key role in cartilage engineering. In tissue engineering approaches, biomaterials are commonly used to provide an initial framework for cell recruitment and proliferation and tissue formation. Modifications of the properties of biomaterials, such as creating sites for cell binding, altering their physicochemical characteristics, and regulating the delivery of growth factors, can have a significant influence on chondrogenesis. Overall, the goal is to completely restore healthy cartilage within an articular cartilage defect. This chapter aims to provide information about the importance of cell–biomaterial interactions for the chondrogenic differentiation of various cell populations that can eventually produce functional cartilage matrix that is indicative of healthy cartilage tissue.

  5. Enterprise resource planning (ERP) implementation using the value engineering methodology and Six Sigma tools

    NASA Astrophysics Data System (ADS)

    Leu, Jun-Der; Lee, Larry Jung-Hsing

    2017-09-01

    Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.

  6. Experimentation in software engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Selby, R. W.; Hutchens, D. H.

    1986-01-01

    Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework is described and their contribution to the software engineering discipline is discussed. Some useful recommendations for the application of the experimental process in software engineering are included.

  7. Cell-Based Therapeutics: The Next Pillar of Medicine

    PubMed Central

    Fischbach, Michael A.; Bluestone, Jeffrey A.; Lim, Wendell A.

    2013-01-01

Two decades ago, the pharmaceutical industry—long dominated by small-molecule drugs—was revolutionized by the advent of biologics. Today, biomedicine sits on the cusp of a new revolution: the use of microbial and human cells as versatile therapeutic engines. Here, we discuss the promise of this “third pillar” of therapeutics in the context of current scientific, regulatory, economic, and perceptual challenges. History suggests that the advent of cellular medicines will require the development of a foundational cellular engineering science that provides a systematic framework for safely and predictably altering and regulating cellular behaviors. PMID:23552369

  8. Integrated health monitoring and controls for rocket engines

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Musgrave, J. L.; Guo, T. H.

    1992-01-01

Current research in intelligent control systems at the Lewis Research Center is described in the context of a functional framework. The framework is applicable to a variety of reusable space propulsion systems for existing and future launch vehicles. It provides a 'road map' for technology development to enable enhanced engine performance with increased reliability, durability, and maintainability. The framework hierarchy consists of a mission coordination level, a propulsion system coordination level, and an engine control level. Each level is described in the context of the Space Shuttle Main Engine. The concept of integrating diagnostics with control is discussed within the context of the functional framework. A distributed real-time simulation testbed is used to realize and evaluate the functionalities in closed loop.

  9. A Requirement Engineering Framework for Electronic Data Sharing of Health Care Data Between Organizations

    NASA Astrophysics Data System (ADS)

    Liu, Xia; Peyton, Liam; Kuziemsky, Craig

Health care is increasingly provided to citizens by a network of collaboration that includes multiple providers and locations. Typically, that collaboration is on an ad-hoc basis via phone calls, faxes, and paper-based documentation. Internet and wireless technologies provide an opportunity to improve this situation via electronic data sharing. These new technologies make possible new ways of working and collaborating, but it can be difficult for health care organizations to understand how to use the new technologies while still ensuring that their policies and objectives are being met. It is also important to have a systematic approach to validate that e-health processes deliver the performance improvements that are expected. Using a case study of a palliative care patient receiving home care from a team of collaborating health organizations, we introduce a framework based on requirements engineering. Key concerns and objectives are identified and modeled (privacy, security, quality of care, and timeliness of service). Proposed business processes that use the new technologies are then modeled in terms of these concerns and objectives to assess their impact and to ensure that electronic data sharing is well regulated.

  10. NASA System Safety Handbook. Volume 2: System Safety Concepts, Guidelines, and Implementation Examples

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Feather, Martin; Rutledge, Peter; Sen, Dev; Youngblood, Robert

    2015-01-01

This is the second of two volumes that collectively comprise the NASA System Safety Handbook. Volume 1 (NASA/SP-2010-580) was prepared for the purpose of presenting the overall framework for System Safety and for providing the general concepts needed to implement the framework. Volume 2 provides guidance for implementing these concepts as an integral part of systems engineering and risk management. This guidance addresses the following functional areas: 1. The development of objectives that collectively define adequate safety for a system, and the safety requirements derived from these objectives that are levied on the system. 2. The conduct of system safety activities, performed to meet the safety requirements, with specific emphasis on the conduct of integrated safety analysis (ISA) as a fundamental means by which systems engineering and risk management decisions are risk-informed. 3. The development of a risk-informed safety case (RISC) at major milestone reviews to argue that the system safety objectives are satisfied (and therefore that the system is adequately safe). 4. The evaluation of the RISC (including supporting evidence) using a defined set of evaluation criteria, to assess the veracity of the claims made therein in order to support risk acceptance decisions.

  11. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  12. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.

  13. Environmental engineering education: examples of accreditation and quality assurance

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Catelani, M.; Manfrida, G.; Valdiserri, J.

    2013-12-01

Environmental engineers respond to the challenges posed by a growing population, intensifying land-use pressures, natural resource exploitation, and rapidly evolving technology. The environmental engineer must develop technically sound solutions within the framework of maintaining or improving environmental quality, complying with public policy, and optimizing the utilization of resources. The engineer provides system and component design, serves as a technical advisor in policy making and legal deliberations, develops management schemes for resources, and provides technical evaluations of systems. Through the current work of environmental engineers, individuals and businesses are able to understand how to coordinate society's interaction with the environment. There will always be a need for engineers who are able to integrate the latest technologies into systems to respond to the needs for food and energy while protecting natural resources. In general, environment-related challenges and problems need to be faced at a global level, leading to the globalization of the engineering profession, which requires not only the capacity to communicate in a common technical language, but also the assurance of an adequate and common level of technical competences, knowledge, and understanding. In this framework, the Europe-based EUR-ACE (European Accreditation of Engineering Programmes) system, currently operated by ENAEE (European Network for Accreditation of Engineering Education), can provide the proper framework and accreditation system, with a set of measures to assess the quality of engineering degree programmes in Europe and abroad. The application of the EUR-ACE accreditation model, and of the Italian national degree course accreditation system promoted by the Italian National Agency for the Evaluation of Universities and Research Institutes (ANVUR), to the Environmental Engineering degree courses at the University of Firenze is presented. In particular, the accreditation models of the multidisciplinary first-cycle degree in Civil, Building and Environmental Engineering and of the more specific second-cycle degree in Environmental Engineering are discussed. Critical issues in assuring the quality and status of environmental engineering graduates, in terms of their capacity to apply knowledge and their technical and innovative competences, are also compared against the more engineering-focused EUR-ACE skill descriptors as well as the Dublin descriptors, at local and global scales. The involvement of the professional world in defining skill goals and typical expectations of achievement and ability is also described. The approach to educating engineers in communicating knowledge and understanding, making informed judgements and choices, and developing the capacity for lifelong learning is assessed as well. Finally, the promotion of innovative aspects of environmental engineering education, and of the role that science and technology can play in it, is also considered.

  14. A Design for Computationally Enabled Analyses Supporting the Pre-Intervention Analytical Framework (PIAF)

    DTIC Science & Technology

    2015-06-01

Approved for public release; distribution is unlimited. The report, prepared by Timothy K. Perkins and Chris C. Rewerts of the Construction Engineering Research Laboratory, U.S. Army Engineer Research and Development Center (ERDC), for the U.S. Army Corps of Engineers under Project P2 335530, "Cultural Reasoning and Ethnographic Analysis for the ...", presents a design for computationally enabled analyses supporting the Pre-Intervention Analytical Framework (PIAF).

  15. Synthetic biology and regulatory networks: where metabolic systems biology meets control engineering

    PubMed Central

    He, Fei; Murabito, Ettore; Westerhoff, Hans V.

    2016-01-01

    Metabolic pathways can be engineered to maximize the synthesis of various products of interest. With the advent of computational systems biology, this endeavour is usually carried out through in silico theoretical studies with the aim to guide and complement further in vitro and in vivo experimental efforts. Clearly, what counts is the result in vivo, not only in terms of maximal productivity but also robustness against environmental perturbations. Engineering an organism towards an increased production flux, however, often compromises that robustness. In this contribution, we review and investigate how various analytical approaches used in metabolic engineering and synthetic biology are related to concepts developed by systems and control engineering. While trade-offs between production optimality and cellular robustness have already been studied diagnostically and statically, the dynamics also matter. Integration of the dynamic design aspects of control engineering with the more diagnostic aspects of metabolic, hierarchical control and regulation analysis is leading to the new, conceptual and operational framework required for the design of robust and productive dynamic pathways. PMID:27075000

  16. High-Heat-Flux Cyclic Durability of Thermal and Environmental Barrier Coatings

    NASA Technical Reports Server (NTRS)

    Zhu, Dongming; Ghosn, Louis L.; Miller, Robert A.

    2007-01-01

Advanced ceramic thermal and environmental barrier coatings will play an increasingly important role in future gas turbine engines because of their ability to protect the engine components and further raise engine temperatures. For the supersonic vehicles currently envisioned in the NASA fundamental aeronautics program, advanced gas turbine engines will be used to provide high power density thrust during the extended supersonic flight of the aircraft, while meeting stringent low emission requirements. Advanced ceramic coating systems are critical to the performance, life, and durability of the hot-section components of the engine systems. In this work, laser- and burner-rig-based high-heat-flux testing approaches were developed to investigate the coating cyclic response and failure mechanisms under a simulated supersonic long-duration cruise mission. The accelerated coating cracking and delamination mechanisms under engine high-heat-flux and extended supersonic cruise time conditions will be addressed. A coating life prediction framework may be realized by examining crack initiation and propagation in conjunction with environmental degradation under high-heat-flux test conditions.

  17. System level airworthiness tool: A comprehensive approach to small unmanned aircraft system airworthiness

    NASA Astrophysics Data System (ADS)

    Burke, David A.

One of the pillars of aviation safety is assuring sound engineering practices through airworthiness certification. As Unmanned Aircraft Systems (UAS) grow in popularity, the need for airworthiness standards and verification methods tailored for UAS becomes critical. While airworthiness practices for large UAS may be similar to manned aircraft, it is clear that small UAS require a paradigm shift from the airworthiness practices of manned aircraft. Although small in comparison to manned aircraft, these aircraft are not merely remote-controlled toys. Small UAS may be complex aircraft flying in the National Airspace System (NAS) over populated areas for extended durations and beyond line of sight of the operators. A comprehensive systems engineering framework for certifying small UAS at the system level is needed. This work presents a point-based tool that evaluates small UAS by rewarding good engineering practices in design, analysis, and testing. The airworthiness requirements scale with vehicle size and operational area, while allowing flexibility for new technologies and unique configurations.
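
    As a loose illustration of a point-based evaluation whose passing threshold scales with vehicle size and operational exposure, the sketch below uses invented categories, weights, and thresholds; it is not the tool described in the record.

```python
# Illustrative sketch of a point-based airworthiness evaluation: practice
# scores are summed and compared against a threshold that grows with vehicle
# mass and operational exposure. All numbers here are hypothetical.
from dataclasses import dataclass

@dataclass
class Evaluation:
    design_points: int      # awarded for documented design practices
    analysis_points: int    # awarded for analyses (loads, reliability, ...)
    test_points: int        # awarded for ground/flight test evidence

def required_points(mass_kg: float, over_populated_area: bool) -> int:
    base = 20 + int(2 * mass_kg)          # heavier vehicles need more evidence
    return base * (2 if over_populated_area else 1)

def airworthy(ev: Evaluation, mass_kg: float, over_populated_area: bool) -> bool:
    total = ev.design_points + ev.analysis_points + ev.test_points
    return total >= required_points(mass_kg, over_populated_area)

if __name__ == "__main__":
    ev = Evaluation(design_points=30, analysis_points=25, test_points=20)
    print(airworthy(ev, mass_kg=10, over_populated_area=False))  # True (75 >= 40)
    print(airworthy(ev, mass_kg=10, over_populated_area=True))   # False (75 < 80)
```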

  18. The necessity of a theory of biology for tissue engineering: metabolism-repair systems.

    PubMed

    Ganguli, Suman; Hunt, C Anthony

    2004-01-01

    Since there is no widely accepted global theory of biology, tissue engineering and bioengineering lack a theoretical understanding of the systems being engineered. By default, tissue engineering operates with a "reductionist" theoretical approach, inherited from traditional engineering of non-living materials. Long term, that approach is inadequate, since it ignores essential aspects of biology. Metabolism-repair systems are a theoretical framework which explicitly represents two "functional" aspects of living organisms: self-repair and self-replication. Since repair and replication are central to tissue engineering, we advance metabolism-repair systems as a potential theoretical framework for tissue engineering. We present an overview of the framework, and indicate directions to pursue for extending it to the context of tissue engineering. We focus on biological networks, both metabolic and cellular, as one such direction. The construction of these networks, in turn, depends on biological protocols. Together these concepts may help point the way to a global theory of biology appropriate for tissue engineering.

  19. Socialization Experiences Resulting from Engineering Teaching Assistantships at Purdue University

    ERIC Educational Resources Information Center

    Mena, Irene B.

    2010-01-01

    The purpose of this study was to explore and understand the types of socialization experiences that result from engineering teaching assistantships. Using situated learning as the theoretical framework and phenomenology as the methodological framework, this study highlights the experiences of 28 engineering doctoral students who worked as…

  20. An approach in building a chemical compound search engine in oracle database.

    PubMed

    Wang, H; Volarath, P; Harrison, R

    2005-01-01

Searching for and identifying chemical compounds are important processes in drug design and chemistry research. An efficient search engine involves a close coupling of the search algorithm and database implementation. The database must process chemical structures, which demands approaches to represent, store, and retrieve structures in a database system. In this paper, a general database framework for serving as a chemical compound search engine in an Oracle database is described. The framework is devoted to eliminating data type constraints for potential search algorithms, which is a crucial step toward building a domain-specific query language on top of SQL. A search engine implementation based on the database framework is also demonstrated. The convenience of the implementation emphasizes the efficiency and simplicity of the framework.
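
    The record describes an Oracle-based framework; the sketch below only illustrates the general idea of storing canonical structure strings and pre-filtering candidates by substring match, using Python's bundled sqlite3 as a stand-in and invented table and column names.

```python
# Illustrative stand-in for the record's Oracle-based compound search engine:
# canonical structure strings (here, SMILES-like text) are stored in a table,
# and a crude substring match serves as a substructure pre-filter. sqlite3 is
# used only because it ships with Python; table and column names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE compound (id INTEGER PRIMARY KEY, name TEXT, smiles TEXT)")
conn.executemany(
    "INSERT INTO compound (name, smiles) VALUES (?, ?)",
    [("ethanol", "CCO"), ("acetic acid", "CC(=O)O"), ("benzene", "c1ccccc1")],
)

def search_substructure(fragment: str):
    """Return compounds whose stored string contains the query fragment.
    A real engine would use a chemistry-aware index instead of LIKE."""
    cur = conn.execute(
        "SELECT name, smiles FROM compound WHERE smiles LIKE ?", (f"%{fragment}%",)
    )
    return cur.fetchall()

print(search_substructure("C(=O)"))   # -> [('acetic acid', 'CC(=O)O')]
```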

  1. A Novel Design Framework for Structures/Materials with Enhanced Mechanical Performance

    PubMed Central

    Liu, Jie; Fan, Xiaonan; Wen, Guilin; Qing, Qixiang; Wang, Hongxin; Zhao, Gang

    2018-01-01

A structure/material requires simultaneous consideration of both its design and manufacturing processes to dramatically enhance its manufacturability, assembly, and maintainability. In this work, a novel design framework for structures/materials with a desired mechanical performance and compelling topological design properties, achieved using origami techniques, is presented. The framework comprises four procedures: topological design, unfold, reduction manufacturing, and fold. The topological design method, i.e., the solid isotropic material penalization (SIMP) method, serves to optimize the structure in order to achieve the preferred mechanical characteristics, and the origami technique is exploited to allow the structure to be rapidly and easily fabricated. The topological design and unfold procedures can be conveniently completed in a computer; then, reduction manufacturing, i.e., cutting, is performed to remove material from the unfolded flat plate; the final structure is obtained by folding the plate from the previous procedure. A series of cantilevers, consisting of origami parallel creases and Miura-ori (usually regarded as a metamaterial) and made of paperboard, are designed with the least weight and the required stiffness by using the proposed framework. The findings here furnish an alternative design framework for engineering structures that could be better than the 3D-printing technique, especially for large structures made of thin metal materials. PMID:29642555
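
    For the topological design step, the record names the SIMP method. The sketch below shows the standard optimality-criteria density update commonly used with SIMP, with fabricated sensitivities standing in for a finite-element compliance analysis; it is not the paper's implementation.

```python
# Hedged sketch of the density-update step used in SIMP-style topology
# optimization: an optimality-criteria update with a bisection on the Lagrange
# multiplier to hold the volume fraction. Sensitivities here are fabricated
# placeholders; a real run would obtain them from a finite-element analysis.
import numpy as np

def oc_update(x, dc, dv, vol_frac, move=0.2):
    """One optimality-criteria update of the element densities x."""
    l1, l2 = 1e-9, 1e9
    while (l2 - l1) / (l1 + l2) > 1e-4:
        lmid = 0.5 * (l1 + l2)
        x_new = np.clip(x * np.sqrt(-dc / (dv * lmid)),
                        np.maximum(0.001, x - move),
                        np.minimum(1.0, x + move))
        if x_new.mean() > vol_frac:
            l1 = lmid        # too much material: raise the multiplier
        else:
            l2 = lmid
    return x_new

rng = np.random.default_rng(0)
x = np.full(100, 0.5)                   # initial uniform densities
dc = -rng.uniform(0.1, 1.0, size=100)   # placeholder compliance sensitivities (<= 0)
dv = np.ones(100)                       # volume sensitivities
x = oc_update(x, dc, dv, vol_frac=0.5)
print(round(x.mean(), 3))               # stays near the 0.5 volume fraction
```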

  2. A Novel Design Framework for Structures/Materials with Enhanced Mechanical Performance.

    PubMed

    Liu, Jie; Fan, Xiaonan; Wen, Guilin; Qing, Qixiang; Wang, Hongxin; Zhao, Gang

    2018-04-09

A structure/material requires simultaneous consideration of both its design and manufacturing processes to dramatically enhance its manufacturability, assembly, and maintainability. In this work, a novel design framework for structures/materials with a desired mechanical performance and compelling topological design properties, achieved using origami techniques, is presented. The framework comprises four procedures: topological design, unfold, reduction manufacturing, and fold. The topological design method, i.e., the solid isotropic material penalization (SIMP) method, serves to optimize the structure in order to achieve the preferred mechanical characteristics, and the origami technique is exploited to allow the structure to be rapidly and easily fabricated. The topological design and unfold procedures can be conveniently completed in a computer; then, reduction manufacturing, i.e., cutting, is performed to remove material from the unfolded flat plate; the final structure is obtained by folding the plate from the previous procedure. A series of cantilevers, consisting of origami parallel creases and Miura-ori (usually regarded as a metamaterial) and made of paperboard, are designed with the least weight and the required stiffness by using the proposed framework. The findings here furnish an alternative design framework for engineering structures that could be better than the 3D-printing technique, especially for large structures made of thin metal materials.

  3. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks, which are not generally available in existing frameworks, and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES, an open source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with themselves, such as a regional and global model of mantle convection and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.

  4. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information-based problem solving environments/frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: the scientist/design engineer, whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments) and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  5. A Systems Engineering Framework for Implementing a Security and Critical Patch Management Process in Diverse Environments (Academic Departments' Workstations)

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hadi

    Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to create and design a Security and Critical Patch Management Process (SCPMP) framework based on Systems Engineering (SE) principles. This framework will assist Information Technology Department Staff (ITDS) to reduce IT operating time and costs and mitigate the risk of security and vulnerability attacks. Further, this study evaluates implementation of the SCPMP in the networked computing systems of an academic environment in order to: 1. Meet patch management requirements by applying SE principles. 2. Reduce the cost of IT operations and PVM cycles. 3. Improve the current PVM methodologies to prevent networked computing systems from becoming the targets of security vulnerability attacks. 4. Embed a Maintenance Optimization Tool (MOT) in the proposed framework. The MOT allows IT managers to make the most practicable choice of methods for deploying and installing released patches and vulnerability remediation. In recent years, there has been a variety of frameworks for security practices in every networked computing system to protect computer workstations from becoming compromised or vulnerable to security attacks, which can expose important information and critical data. I have developed a new mechanism for implementing PVM for maximizing security-vulnerability maintenance, protecting OS and software packages, and minimizing SCPMP cost. To increase computing system security in any diverse environment, particularly in academia, one must apply SCPMP. I propose an optimal maintenance policy that will allow ITDS to measure and estimate the variation of PVM cycles based on their department's requirements. My results demonstrate that MOT optimizes the process of implementing SCPMP in academic workstations.
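
    As a purely hypothetical illustration of the kind of trade-off a maintenance-optimization tool could evaluate (all costs and probabilities below are invented, not the dissertation's model), one can pick a patch-cycle length that minimizes expected yearly cost:

```python
# Hypothetical illustration only: choose a patch-cycle length (days) that
# minimizes expected yearly cost, trading labor per cycle against exposure
# risk that grows with patch staleness. All numbers are invented.
def expected_cost(cycle_days, labor_per_cycle=400.0,
                  breach_cost=50000.0, daily_incident_prob=0.0004):
    # expected incidents in one cycle, assuming risk grows with days since patching
    expected_incidents = daily_incident_prob * cycle_days * (cycle_days + 1) / 2
    cycles_per_year = 365.0 / cycle_days
    return cycles_per_year * (labor_per_cycle + breach_cost * expected_incidents)

best = min(range(1, 91), key=expected_cost)
print(best, round(expected_cost(best), 2))   # cycle length with the lowest yearly cost
```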

  6. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher mass robotic and human missions to Mars. The appendices to the original report are contained in this document.

  7. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  8. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher mass robotic and human missions to Mars. The findings of the assessment are contained in this report.

  9. The Development of a Conceptual Framework for New K-12 Science Education Standards (Invited)

    NASA Astrophysics Data System (ADS)

    Keller, T.

    2010-12-01

    The National Academy of Sciences has created a committee of 18 National Academy of Science and Engineering members, academic scientists, cognitive and learning scientists, and educators, educational policymakers and researchers to develop a framework to guide new K-12 science education standards. The committee began its work in January, 2010, released a draft of the framework in July, 2010, and intends to have the final framework in the first quarter of 2011. The committee was helped in early phases of the work by consultant design teams. The framework is designed to help realize a vision for science and engineering education in which all students actively engage in science and engineering practices in order to deepen their understanding of core ideas in science over multiple years of school. These three dimensions - core disciplinary ideas, science and engineering practices, and cross-cutting elements - must blend together to build an exciting, relevant, and forward looking science education. The framework will be used as a base for development of next generation K-12 science education standards.

  10. Dynamics simulations for engineering macromolecular interactions

    NASA Astrophysics Data System (ADS)

    Robinson-Mosher, Avi; Shinar, Tamar; Silver, Pamela A.; Way, Jeffrey

    2013-06-01

    The predictable engineering of well-behaved transcriptional circuits is a central goal of synthetic biology. The artificial attachment of promoters to transcription factor genes usually results in noisy or chaotic behaviors, and such systems are unlikely to be useful in practical applications. Natural transcriptional regulation relies extensively on protein-protein interactions to insure tightly controlled behavior, but such tight control has been elusive in engineered systems. To help engineer protein-protein interactions, we have developed a molecular dynamics simulation framework that simplifies features of proteins moving by constrained Brownian motion, with the goal of performing long simulations. The behavior of a simulated protein system is determined by summation of forces that include a Brownian force, a drag force, excluded volume constraints, relative position constraints, and binding constraints that relate to experimentally determined on-rates and off-rates for chosen protein elements in a system. Proteins are abstracted as spheres. Binding surfaces are defined radially within a protein. Peptide linkers are abstracted as small protein-like spheres with rigid connections. To address whether our framework could generate useful predictions, we simulated the behavior of an engineered fusion protein consisting of two 20 000 Da proteins attached by flexible glycine/serine-type linkers. The two protein elements remained closely associated, as if constrained by a random walk in three dimensions of the peptide linker, as opposed to showing a distribution of distances expected if movement were dominated by Brownian motion of the protein domains only. We also simulated the behavior of fluorescent proteins tethered by a linker of varying length, compared the predicted Förster resonance energy transfer with previous experimental observations, and obtained a good correspondence. Finally, we simulated the binding behavior of a fusion of two ligands that could simultaneously bind to distinct cell-surface receptors, and explored the landscape of linker lengths and stiffnesses that could enhance receptor binding of one ligand when the other ligand has already bound to its receptor, thus, addressing potential mechanisms for improving targeted signal transduction proteins. These specific results have implications for the design of targeted fusion proteins and artificial transcription factors involving fusion of natural domains. More broadly, the simulation framework described here could be extended to include more detailed system features such as non-spherical protein shapes and electrostatics, without requiring detailed, computationally expensive specifications. This framework should be useful in predicting behavior of engineered protein systems including binding and dissociation reactions.
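
    A minimal sketch of the kind of overdamped Brownian dynamics described above, with two spheres joined by a harmonic "linker" spring, is given below. The parameters are rough illustrative values and the simple spring stands in for the rigid linker chain; this is not the study's framework.

```python
# Minimal overdamped Brownian-dynamics sketch in the spirit of the framework
# described above (spheres + drag + random force + a linker constraint).
# Parameters are arbitrary illustrative values, not those of the paper.
import numpy as np

rng = np.random.default_rng(0)
kT, gamma, dt = 4.11e-21, 1e-10, 1e-9        # J, kg/s, s (rough orders of magnitude)
k_spring, rest_len = 1e-3, 5e-9              # N/m, m

def step(r1, r2):
    """Advance both spheres one time step: spring force + Brownian kick."""
    d = r2 - r1
    dist = max(np.linalg.norm(d), 1e-12)
    f = k_spring * (dist - rest_len) * d / dist   # pulls sphere 1 toward 2 when stretched
    noise = np.sqrt(2 * kT * dt / gamma)          # per-axis displacement std. dev.
    r1 = r1 + f * dt / gamma + noise * rng.standard_normal(3)
    r2 = r2 - f * dt / gamma + noise * rng.standard_normal(3)
    return r1, r2

r1, r2 = np.zeros(3), np.array([rest_len, 0.0, 0.0])
distances = []
for _ in range(20000):
    r1, r2 = step(r1, r2)
    distances.append(np.linalg.norm(r2 - r1))
print(f"mean separation: {np.mean(distances):.2e} m")
```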

  11. Dynamics simulations for engineering macromolecular interactions.

    PubMed

    Robinson-Mosher, Avi; Shinar, Tamar; Silver, Pamela A; Way, Jeffrey

    2013-06-01

    The predictable engineering of well-behaved transcriptional circuits is a central goal of synthetic biology. The artificial attachment of promoters to transcription factor genes usually results in noisy or chaotic behaviors, and such systems are unlikely to be useful in practical applications. Natural transcriptional regulation relies extensively on protein-protein interactions to insure tightly controlled behavior, but such tight control has been elusive in engineered systems. To help engineer protein-protein interactions, we have developed a molecular dynamics simulation framework that simplifies features of proteins moving by constrained Brownian motion, with the goal of performing long simulations. The behavior of a simulated protein system is determined by summation of forces that include a Brownian force, a drag force, excluded volume constraints, relative position constraints, and binding constraints that relate to experimentally determined on-rates and off-rates for chosen protein elements in a system. Proteins are abstracted as spheres. Binding surfaces are defined radially within a protein. Peptide linkers are abstracted as small protein-like spheres with rigid connections. To address whether our framework could generate useful predictions, we simulated the behavior of an engineered fusion protein consisting of two 20,000 Da proteins attached by flexible glycine/serine-type linkers. The two protein elements remained closely associated, as if constrained by a random walk in three dimensions of the peptide linker, as opposed to showing a distribution of distances expected if movement were dominated by Brownian motion of the protein domains only. We also simulated the behavior of fluorescent proteins tethered by a linker of varying length, compared the predicted Förster resonance energy transfer with previous experimental observations, and obtained a good correspondence. Finally, we simulated the binding behavior of a fusion of two ligands that could simultaneously bind to distinct cell-surface receptors, and explored the landscape of linker lengths and stiffnesses that could enhance receptor binding of one ligand when the other ligand has already bound to its receptor, thus, addressing potential mechanisms for improving targeted signal transduction proteins. These specific results have implications for the design of targeted fusion proteins and artificial transcription factors involving fusion of natural domains. More broadly, the simulation framework described here could be extended to include more detailed system features such as non-spherical protein shapes and electrostatics, without requiring detailed, computationally expensive specifications. This framework should be useful in predicting behavior of engineered protein systems including binding and dissociation reactions.

  12. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
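
    As a generic illustration of an executable error-propagation model (not the framework described in the record), a failure mode can be propagated over a component dependency graph; the architecture below is invented.

```python
# Minimal, hypothetical sketch of an executable error-propagation model:
# components form a directed dependency graph, and the effect of a failure
# mode is the set of components reachable from the failed one.
from collections import deque

# component -> components that consume its output (hypothetical architecture)
DEPENDENTS = {
    "sensor": ["filter"],
    "filter": ["controller"],
    "controller": ["actuator", "logger"],
    "actuator": [],
    "logger": [],
}

def affected_components(failed: str) -> set[str]:
    """Breadth-first propagation of a failure through the dependency graph."""
    seen, queue = {failed}, deque([failed])
    while queue:
        current = queue.popleft()
        for downstream in DEPENDENTS.get(current, []):
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return seen - {failed}

print(sorted(affected_components("filter")))  # ['actuator', 'controller', 'logger']
```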

  13. Framework for Implementing Engineering Senior Design Capstone Courses and Design Clinics

    ERIC Educational Resources Information Center

    Franchetti, Matthew; Hefzy, Mohamed Samir; Pourazady, Mehdi; Smallman, Christine

    2012-01-01

    Senior design capstone projects for engineering students are essential components of an undergraduate program that enhances communication, teamwork, and problem solving skills. Capstone projects with industry are well established in management, but not as heavily utilized in engineering. This paper outlines a general framework that can be used by…

  14. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 5: Design of the IPAD system. Part 2: System design. Part 3: General purpose utilities, phase 1, task 2

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.

    1973-01-01

Viable designs are presented for various elements of the IPAD framework software, data base management system, and required new languages in relation to the capabilities of operating systems software. A thorough evaluation was made of the basic systems functions to be provided by each software element, its requirements as defined in the conceptual design, the operating systems features affecting its design, and the engineering/design functions which it was intended to enhance.

  15. Engineering Analysis Using a Web-based Protocol

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.; Claus, Russell W.

    2002-01-01

    This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
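
    A minimal sketch of encapsulating analysis inputs in XML with Python's standard library follows; the element and attribute names are invented for illustration and are not the schema used by the LAPIN web framework.

```python
# Sketch of the kind of XML encapsulation described above, using Python's
# standard library; element and attribute names are illustrative only.
import xml.etree.ElementTree as ET

def encode_case(case_name: str, inputs: dict) -> str:
    """Wrap analysis inputs in an XML document suitable for storage/transport."""
    root = ET.Element("analysisCase", name=case_name)
    for key, value in inputs.items():
        ET.SubElement(root, "input", name=key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def decode_case(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {e.get("name"): float(e.text) for e in root.findall("input")}

doc = encode_case("inlet_study_01", {"mach": 3.0, "altitude_m": 18000.0})
print(doc)
print(decode_case(doc))   # {'mach': 3.0, 'altitude_m': 18000.0}
```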

  16. Using Self-Determination Theory to build communities of support to aid in the retention of women in engineering

    NASA Astrophysics Data System (ADS)

    Dell, Elizabeth M.; Verhoeven, Yen; Christman, Jeanne W.; Garrick, Robert D.

    2018-05-01

    Diverse perspectives are required to address the technological problems facing our world. Although women perform as well as their male counterparts in math and science prior to entering college, the numbers of women students entering and completing engineering programmes are far below their representation in the workforce. This paper reports on a qualitative, multiyear study of the experiences of women students in an Engineering Technology programme. The project addressed some of the unique, fundamental challenges that female students face within their programmes, and the authors describe a programmatic framework based on Self-Determination Theory as an intervention for the recruitment and retention of female engineering students. Data from focus groups and interviews show how students were supported in their undergraduate experiences and how inclusive learning environments are needed to further improve outcomes. Conceptual issues and methodological considerations of our outcomes are presented.

  17. An Example of Concurrent Engineering

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney; Whitten, David; Cloyd, Richard; Coppens, Chris; Rodriguez, Pedro

    1998-01-01

The Collaborative Engineering Design and Analysis Room (CEDAR) facility provides on-the-spot design review capability for any project during all phases of development. The required disciplines assemble in this facility to work on any problems (analysis, manufacturing, inspection, etc.) associated with a particular design. A small, highly focused team of specialists can meet in this room to better expedite the process of developing a solution to an engineering task within the framework of the constraints that are unique to each discipline. This facility provides the engineering tools and translators to develop a concept within the confines of the room or with remote team members who can access the team's data from other locations. The CEDAR area is envisioned as an excellent venue for failure investigation meetings, where the computer capabilities can be utilized in conjunction with the Smart Board display to develop failure trees, brainstorm failure modes, and evaluate possible solutions.

  18. A Conceptual Framework for Virtual Geographic Environments Knowledge Engineering

    NASA Astrophysics Data System (ADS)

    You, Lan; Lin, Hui

    2016-06-01

VGE geographic knowledge refers to the abstract and repeatable geo-information related to geo-science problems, geographical phenomena, and geographical laws supported by VGE. This includes expert experience, evolution rules, simulation processes, and prediction results in VGE. This paper proposes a conceptual framework for VGE knowledge engineering in order to effectively manage and use geographic knowledge in VGE. Our approach relies on previously well-established theories of knowledge engineering and VGE. The main contributions of this report are the following: (1) clearly defined concepts of VGE knowledge and VGE knowledge engineering; (2) the features that distinguish VGE knowledge from common knowledge; (3) a geographic knowledge evolution process that helps users rapidly acquire knowledge in VGE; and (4) a conceptual framework for VGE knowledge engineering that provides a supporting methodological system for building an intelligent VGE. This conceptual framework systematically describes the related VGE knowledge theories and key technologies. It will promote the rapid transformation from geodata to geographic knowledge and further reduce the gap between the data explosion and the absence of knowledge.

  19. Developing a framework for qualitative engineering: Research in design and analysis of complex structural systems

    NASA Technical Reports Server (NTRS)

    Franck, Bruno M.

    1990-01-01

    The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.

  20. Design process of the nanofluid injection mechanism in nuclear power plants

    NASA Astrophysics Data System (ADS)

    Kang, Myoung-Suk; Jee, Changhyun; Park, Sangjun; Bang, In Choel; Heo, Gyunyoung

    2011-04-01

Nanofluids, which are engineered suspensions of nanoparticles in a solvent such as water, have been found to show enhanced coolant properties such as higher critical heat flux and surface wettability at modest concentrations, which is a useful characteristic in nuclear power plants (NPPs). This study attempted to provide an example of engineering applications in NPPs using nanofluid technology. From these motivations, conceptual designs of emergency core cooling systems (ECCSs) assisted by a nanofluid injection mechanism were proposed by following a design framework for developing complex engineering systems. We focused on the analysis of functional requirements for integrating the conventional ECCSs and the nanofluid injection mechanism without loss of performance and reliability. Three candidate nanofluid-engineered ECCSs proposed in previous research were investigated by applying axiomatic design (AD) in the manner of reverse engineering, which enabled the compatibility of functional requirements and potential design vulnerabilities to be identified. Methods to address such vulnerabilities were drawn from TRIZ and concretized for the ECCS of the Korean nuclear power plant. The results show a method to decouple the ECCS designs by installing a separate nanofluid injection tank adjacent to the safety injection tanks, such that a low pH environment for the nanofluids can be maintained at atmospheric pressure, which is favorable for their injection in a passive manner.

  1. Design process of the nanofluid injection mechanism in nuclear power plants

    PubMed Central

    2011-01-01

Nanofluids, which are engineered suspensions of nanoparticles in a solvent such as water, have been found to show enhanced coolant properties such as higher critical heat flux and surface wettability at modest concentrations, which is a useful characteristic in nuclear power plants (NPPs). This study attempted to provide an example of engineering applications in NPPs using nanofluid technology. From these motivations, conceptual designs of emergency core cooling systems (ECCSs) assisted by a nanofluid injection mechanism were proposed by following a design framework for developing complex engineering systems. We focused on the analysis of functional requirements for integrating the conventional ECCSs and the nanofluid injection mechanism without loss of performance and reliability. Three candidate nanofluid-engineered ECCSs proposed in previous research were investigated by applying axiomatic design (AD) in the manner of reverse engineering, which enabled the compatibility of functional requirements and potential design vulnerabilities to be identified. Methods to address such vulnerabilities were drawn from TRIZ and concretized for the ECCS of the Korean nuclear power plant. The results show a method to decouple the ECCS designs by installing a separate nanofluid injection tank adjacent to the safety injection tanks, such that a low pH environment for the nanofluids can be maintained at atmospheric pressure, which is favorable for their injection in a passive manner. PMID:21711896

  2. Design process of the nanofluid injection mechanism in nuclear power plants.

    PubMed

    Kang, Myoung-Suk; Jee, Changhyun; Park, Sangjun; Bang, In Choel; Heo, Gyunyoung

    2011-04-27

Nanofluids, which are engineered suspensions of nanoparticles in a solvent such as water, have been found to show enhanced coolant properties such as higher critical heat flux and surface wettability at modest concentrations, which is a useful characteristic in nuclear power plants (NPPs). This study attempted to provide an example of engineering applications in NPPs using nanofluid technology. From these motivations, conceptual designs of emergency core cooling systems (ECCSs) assisted by a nanofluid injection mechanism were proposed by following a design framework for developing complex engineering systems. We focused on the analysis of functional requirements for integrating the conventional ECCSs and the nanofluid injection mechanism without loss of performance and reliability. Three candidate nanofluid-engineered ECCSs proposed in previous research were investigated by applying axiomatic design (AD) in the manner of reverse engineering, which enabled the compatibility of functional requirements and potential design vulnerabilities to be identified. Methods to address such vulnerabilities were drawn from TRIZ and concretized for the ECCS of the Korean nuclear power plant. The results show a method to decouple the ECCS designs by installing a separate nanofluid injection tank adjacent to the safety injection tanks, such that a low pH environment for the nanofluids can be maintained at atmospheric pressure, which is favorable for their injection in a passive manner.

  3. Simulation Modeling to Compare High-Throughput, Low-Iteration Optimization Strategies for Metabolic Engineering

    PubMed Central

    Heinsch, Stephen C.; Das, Siba R.; Smanski, Michael J.

    2018-01-01

    Increasing the final titer of a multi-gene metabolic pathway can be viewed as a multivariate optimization problem. While numerous multivariate optimization algorithms exist, few are specifically designed to accommodate the constraints posed by genetic engineering workflows. We present a strategy for optimizing expression levels across an arbitrary number of genes that requires few design-build-test iterations. We compare the performance of several optimization algorithms on a series of simulated expression landscapes. We show that optimal experimental design parameters depend on the degree of landscape ruggedness. This work provides a theoretical framework for designing and executing numerical optimization on multi-gene systems. PMID:29535690
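
    A hedged illustration of a high-throughput, low-iteration search on a simulated expression landscape follows; the landscape, candidate levels, and batch size below are invented and are not the study's models or algorithms.

```python
# Illustration only: many designs are "built and tested" in parallel per round,
# and only two rounds are used (broad screen, then local refinement).
import itertools, random

random.seed(1)
LEVELS = [0.25, 0.5, 1.0, 2.0, 4.0]          # candidate expression levels per gene
N_GENES = 4

def titer(design):
    """Simulated rugged landscape: smooth optimum plus design-specific noise."""
    smooth = -sum((x - 1.5) ** 2 for x in design)
    rugged = random.Random(hash(design)).uniform(-0.5, 0.5)
    return smooth + rugged

def run_round(candidates, batch_size=48):
    tested = random.sample(candidates, min(batch_size, len(candidates)))
    return max(tested, key=titer)

all_designs = list(itertools.product(LEVELS, repeat=N_GENES))
best1 = run_round(all_designs)                             # round 1: broad screen
neighbors = [d for d in all_designs
             if sum(a != b for a, b in zip(d, best1)) <= 1]
best2 = run_round(neighbors)                               # round 2: local refinement
print(best1, round(titer(best1), 2))
print(best2, round(titer(best2), 2))
```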

  4. Ecological requirements for pallid sturgeon reproduction and recruitment in the Lower Missouri River: Annual report 2010

    USGS Publications Warehouse

    DeLonay, Aaron J.; Jacobson, Robert B.; Papoulias, Diana M.; Wildhaber, Mark L.; Chojnacki, Kimberly A.; Pherigo, Emily K.; Haas, Justin D.; Mestl, Gerald E.

    2012-01-01

    The Comprehensive Sturgeon Research Project is a multiyear, multiagency collaborative research framework developed to provide information to support pallid sturgeon recovery and Missouri River management decisions. The project strategy integrates field and laboratory studies of sturgeon reproductive ecology, early life history, habitat requirements, and physiology. The project scope of work is developed annually with cooperating research partners and in collaboration with the U.S. Army Corps of Engineers, Missouri River Recovery—Integrated Science Program. The research consists of several interdependent and complementary tasks that engage multiple disciplines. The research tasks in the 2010 scope of work primarily address spawning as a probable factor limiting pallid sturgeon survival and recovery, although limited pilot studies also have been initiated to examine the requirements of early life stages. The research is designed to inform management decisions affecting channel re-engineering, flow modification, and pallid sturgeon population augmentation on the Missouri River, and throughout the range of the species. Research and progress made through this project are reported to the U.S. Army Corps of Engineers annually. This annual report details the research effort and progress made by the Comprehensive Sturgeon Research Project during 2010.

  5. Configurable analog-digital conversion using the neural engineering framework

    PubMed Central

    Mayr, Christian G.; Partzsch, Johannes; Noack, Marko; Schüffny, Rene

    2014-01-01

    Efficient Analog-Digital Converters (ADC) are one of the mainstays of mixed-signal integrated circuit design. Besides the conventional ADCs used in mainstream ICs, there have been various attempts in the past to utilize neuromorphic networks to accomplish an efficient crossing between analog and digital domains, i.e., to build neurally inspired ADCs. Generally, these have suffered from the same problems as conventional ADCs, that is they require high-precision, handcrafted analog circuits and are thus not technology portable. In this paper, we present an ADC based on the Neural Engineering Framework (NEF). It carries out a large fraction of the overall ADC process in the digital domain, i.e., it is easily portable across technologies. The analog-digital conversion takes full advantage of the high degree of parallelism inherent in neuromorphic networks, making for a very scalable ADC. In addition, it has a number of features not commonly found in conventional ADCs, such as a runtime reconfigurability of the ADC sampling rate, resolution and transfer characteristic. PMID:25100933
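
    A small sketch of the NEF-style encode/decode idea behind such an ADC is given below, using least-squares decoders over rectified-linear tuning curves; the neuron model, population size, and sampling choices are illustrative assumptions, not the paper's circuit.

```python
# Sketch of NEF-style representation: a population of simple rate "neurons"
# encodes a scalar, and least-squares decoders recover it from the activities.
import numpy as np

rng = np.random.default_rng(42)
n_neurons = 64
encoders = rng.choice([-1.0, 1.0], size=n_neurons)      # preferred directions
gains = rng.uniform(0.5, 2.0, size=n_neurons)
biases = rng.uniform(-1.0, 1.0, size=n_neurons)

def rates(x):
    """Rectified-linear tuning curves for a scalar input x in [-1, 1]."""
    return np.maximum(0.0, gains * (encoders * x) + biases)

# Solve for decoders d minimizing ||A d - x|| over sampled inputs.
xs = np.linspace(-1, 1, 200)
A = np.array([rates(x) for x in xs])                     # shape (200, n_neurons)
decoders, *_ = np.linalg.lstsq(A, xs, rcond=None)

x_test = 0.37
x_hat = rates(x_test) @ decoders
print(f"input {x_test:.3f} decoded as {x_hat:.3f}")
```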

  6. Integrating computational methods to retrofit enzymes to synthetic pathways.

    PubMed

    Brunk, Elizabeth; Neri, Marilisa; Tavernelli, Ivano; Hatzimanikatis, Vassily; Rothlisberger, Ursula

    2012-02-01

    Microbial production of desired compounds provides an efficient framework for the development of renewable energy resources. To be competitive with traditional chemistry, one requirement is to utilize the full capacity of the microorganism to produce target compounds with high yields and turnover rates. We use integrated computational methods to generate and quantify the performance of novel biosynthetic routes that contain highly optimized catalysts. Engineering a novel reaction pathway entails addressing feasibility on multiple levels, which involves handling the complexity of large-scale biochemical networks while respecting the critical chemical phenomena at the atomistic scale. To pursue this multi-layer challenge, our strategy merges knowledge-based metabolic engineering methods with computational chemistry methods. By bridging multiple disciplines, we provide an integral computational framework that could accelerate the discovery and implementation of novel biosynthetic production routes. Using this approach, we have identified and optimized a novel biosynthetic route for the production of 3HP from pyruvate. Copyright © 2011 Wiley Periodicals, Inc.

  7. Control of fluxes in metabolic networks.

    PubMed

    Basler, Georg; Nikoloski, Zoran; Larhlimi, Abdelhalim; Barabási, Albert-László; Liu, Yang-Yu

    2016-07-01

    Understanding the control of large-scale metabolic networks is central to biology and medicine. However, existing approaches either require specifying a cellular objective or can only be used for small networks. We introduce new coupling types describing the relations between reaction activities, and develop an efficient computational framework, which does not require any cellular objective for systematic studies of large-scale metabolism. We identify the driver reactions facilitating control of 23 metabolic networks from all kingdoms of life. We find that unicellular organisms require a smaller degree of control than multicellular organisms. Driver reactions are under complex cellular regulation in Escherichia coli, indicating their preeminent role in facilitating cellular control. In human cancer cells, driver reactions play pivotal roles in malignancy and represent potential therapeutic targets. The developed framework helps us gain insights into regulatory principles of diseases and facilitates design of engineering strategies at the interface of gene regulation, signaling, and metabolism. © 2016 Basler et al.; Published by Cold Spring Harbor Laboratory Press.
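
    To make the notion of a reaction whose activity constrains others concrete, the toy sketch below (not the authors' coupling framework) uses a linear program over a three-reaction pathway at steady state to show that blocking the uptake reaction forces the secretion flux to zero; the stoichiometry, bounds, and the function max_flux are illustrative assumptions.

        # Toy illustration: test whether blocking reaction i also blocks reaction j
        # at steady state (S v = 0), a simple form of reaction coupling.
        import numpy as np
        from scipy.optimize import linprog

        # Linear pathway:  -> A -> B ->   (R0 uptake, R1 conversion, R2 secretion)
        S = np.array([[ 1, -1,  0],      # metabolite A balance
                      [ 0,  1, -1]])     # metabolite B balance
        n_rxns = S.shape[1]
        bounds = [(0, 10)] * n_rxns      # irreversible reactions with an upper bound

        def max_flux(j, blocked=None):
            """Maximize v_j subject to S v = 0, optionally forcing v_blocked = 0."""
            b = list(bounds)
            if blocked is not None:
                b[blocked] = (0, 0)
            c = np.zeros(n_rxns)
            c[j] = -1.0                                   # linprog minimizes, so negate
            res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=b, method="highs")
            return -res.fun

        # Blocking the uptake reaction R0 forces the secretion reaction R2 to zero,
        # so R0 behaves like a "driver" for R2 in this toy network.
        print("max v_R2 normally       :", max_flux(2))
        print("max v_R2 with R0 blocked:", max_flux(2, blocked=0))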

  8. Evolution of Students' Varied Conceptualizations About Socially Responsible Engineering: A Four Year Longitudinal Study.

    PubMed

    Rulifson, Greg; Bielefeldt, Angela R

    2018-03-20

    Engineers should learn how to act on their responsibility to society during their education. At present, however, it is unknown what students think about the meaning of socially responsible engineering. This paper synthesizes 4 years of longitudinal interviews with engineering students as they progressed through college. The interviews revolved broadly around how students saw the connections between engineering and social responsibility, and what influenced these ideas. Using the Weidman Input-Environment-Output model as a framework, this research found that influences included required classes such as engineering ethics, capstone design, and some technical courses, pre-college volunteering and familial values, co-curricular groups such as Engineers Without Borders and the Society of Women Engineers, as well as professional experiences through internships. Further, some experiences such as technical courses and engineering internships tended to narrow students' understanding of an engineer's social responsibility. Overall, students who stayed in engineering tended to converge on basic responsibilities such as safety and bettering society as a whole, but became less concerned with improving the lives of the marginalized and disadvantaged. Company loyalty also became important for some students. These results offer valuable, transferable insights, providing guidance to foster students' ideas on socially responsible engineering.

  9. Development Education and Engineering: A Framework for Incorporating Reality of Developing Countries into Engineering Studies

    ERIC Educational Resources Information Center

    Perez-Foguet, A.; Oliete-Josa, S.; Saz-Carranza, A.

    2005-01-01

    Purpose: To show the key points of a development education program for engineering studies fitted within the framework of the human development paradigm. Design/methodology/approach: The bases of the concept of technology for human development are presented, and the relationship with development education analysed. Special attention is dedicated…

  10. FRIEND Engine Framework: a real time neurofeedback client-server system for neuroimaging studies

    PubMed Central

    Basilio, Rodrigo; Garrido, Griselda J.; Sato, João R.; Hoefle, Sebastian; Melo, Bruno R. P.; Pamplona, Fabricio A.; Zahn, Roland; Moll, Jorge

    2015-01-01

    In this methods article, we present a new implementation of a recently reported FSL-integrated neurofeedback tool, the standalone version of “Functional Real-time Interactive Endogenous Neuromodulation and Decoding” (FRIEND). We will refer to this new implementation as the FRIEND Engine Framework. The framework comprises a client-server cross-platform solution for real time fMRI and fMRI/EEG neurofeedback studies, enabling flexible customization or integration of graphical interfaces, devices, and data processing. This implementation allows a fast setup of novel plug-ins and frontends, which can be shared with the user community at large. The FRIEND Engine Framework is freely distributed for non-commercial, research purposes. PMID:25688193

  11. Green engineering education through a U.S. EPA/academia collaboration.

    PubMed

    Shonnard, David R; Allen, David T; Nguyen, Nhan; Austin, Sharon Weil; Hesketh, Robert

    2003-12-01

    The need to use resources efficiently and reduce environmental impacts of industrial products and processes is becoming increasingly important in engineering design; therefore, green engineering principles are gaining prominence within engineering education. This paper describes a general framework for incorporating green engineering design principles into engineering curricula, with specific examples for chemical engineering. The framework for teaching green engineering discussed in this paper mirrors the 12 Principles of Green Engineering proposed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), especially in methods for estimating the hazardous nature of chemicals, strategies for pollution prevention, and approaches leading to efficient energy and material utilization. The key elements in green engineering education, which enlarge the "box" for engineering design, are environmental literacy, environmentally conscious design, and beyond-the-plant boundary considerations.

  12. Virtue ethics, positive psychology, and a new model of science and engineering ethics education.

    PubMed

    Han, Hyemin

    2015-04-01

    This essay develops a new conceptual framework of science and engineering ethics education based on virtue ethics and positive psychology. Virtue ethicists and positive psychologists have argued that current rule-based moral philosophy, psychology, and education cannot effectively promote students' moral motivation for actual moral behavior and may even lead to negative outcomes, such as moral schizophrenia. They have suggested that their own theoretical framework of virtue ethics and positive psychology can contribute to the effective promotion of motivation for self-improvement by connecting the notion of morality and eudaimonic happiness. Thus this essay attempts to apply virtue ethics and positive psychology to science and engineering ethics education and to develop a new conceptual framework for more effective education. In addition to the conceptual-level work, this essay suggests two possible educational methods: moral modeling and involvement in actual moral activity in science and engineering ethics classes, based on the conceptual framework.

  13. Adaptive Acquisition: An Evolving Framework for Tailoring Engineering and Procurement of Defense Systems

    DTIC Science & Technology

    2017-01-31

    The report describes how mapping critical business workflows and then optimizing them with appropriate evolutionary technology choices is often called "Product Line Architecture"; technologies, products, services, and processes are evaluated by the USG against its 360-degree requirements objectives and refined as appropriate, providing clarity in rapidly evolving technological domains (e.g., by applying best commercial practices for open-standard product line architecture).

  14. A Framework for Advancing Career and Technical Education: Recommendations for the Reauthorization of the Carl D. Perkins Act. Policy Brief

    ERIC Educational Resources Information Center

    Alliance for Excellent Education, 2012

    2012-01-01

    The nation's economy is only as strong as the educational foundation that supports it. Economic success in the twenty-first century requires a labor force capable of demonstrating an advanced level of both knowledge and skill. To be a true engine of growth, the nation's education system must be aligned with these demands. This is why the…

  15. Scientific and Engineering Practices in K-12 Classrooms: Understanding "A Framework for K-12 Science Education"

    ERIC Educational Resources Information Center

    Bybee, Rodger W.

    2011-01-01

    In this article, the author presents the science and engineering practices from the recently released "A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" (NRC 2011). The author recognizes the changes implied by the new framework, and eventually a new generation of science education standards will present new…

  16. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  17. V&V framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.

  18. Developing a Comprehensive Model of Intensive Care Unit Processes: Concept of Operations.

    PubMed

    Romig, Mark; Tropello, Steven P; Dwyer, Cindy; Wyskiel, Rhonda M; Ravitz, Alan; Benson, John; Gropper, Michael A; Pronovost, Peter J; Sapirstein, Adam

    2015-04-23

    This study aimed to use a systems engineering approach to improve performance and stakeholder engagement in the intensive care unit to reduce several different patient harms. We developed a conceptual framework or concept of operations (ConOps) to analyze different types of harm that included 4 steps as follows: risk assessment, appropriate therapies, monitoring and feedback, as well as patient and family communications. This framework used a transdisciplinary approach to inventory the tasks and work flows required to eliminate 7 common types of harm experienced by patients in the intensive care unit. The inventory gathered both implicit and explicit information about how the system works or should work and converted the information into a detailed specification that clinicians could understand and use. Using the ConOps document, we created highly detailed work flow models to reduce harm and offer an example of its application to deep venous thrombosis. In the deep venous thrombosis model, we identified tasks that were synergistic across different types of harm. We will use a system of systems approach to integrate the variety of subsystems and coordinate processes across multiple types of harm to reduce the duplication of tasks. Through this process, we expect to improve efficiency and demonstrate synergistic interactions that ultimately can be applied across the spectrum of potential patient harms and patient locations. Engineering health care to be highly reliable will first require an understanding of the processes and work flows that comprise patient care. The ConOps strategy provided a framework for building complex systems to reduce patient harm.

  19. Model Wind Turbine Design in a Project-Based Middle School Engineering Curriculum Built on State Frameworks

    ERIC Educational Resources Information Center

    Cogger, Steven D.; Miley, Daniel H.

    2012-01-01

    This paper proposes that project-based active learning is a key part of engineering education at the middle school level. One project from a comprehensive middle school engineering curriculum developed by the authors is described to show how active learning and state frameworks can coexist. The theoretical basis for learning and assessment in a…

  20. A Legal Perspective on Business: Modeling the Impact of Law

    NASA Astrophysics Data System (ADS)

    Ghanavati, Sepideh; Siena, Alberto; Perini, Anna; Amyot, Daniel; Peyton, Liam; Susi, Angelo

    Modern goal-oriented requirements engineering frameworks use modeling as a means of better understanding a domain, leading to an overall improvement in the quality of the requirements. Regulations and laws impose additional context and constraints on software goals and can limit the satisfaction of stakeholder needs. Organizations and software developers need modeling tools that can properly address the potential deep impact legal issues can have on the effectiveness of business strategies. In this paper, we perform a preliminary study into the development of a modeling framework able to support the analysis of legal prescriptions alongside business strategies. We demonstrate, via an example drawn from a case study of the Health Insurance Portability and Accountability Act (HIPAA), how models of this law can be built with the GRL modeling language and how they can be evaluated as part of the business goal models.

  1. The Need for V&V in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to an entire domain or product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for these activities.

  2. Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration

    NASA Technical Reports Server (NTRS)

    Lin, Risheng; Afjeh, Abdollah A.

    2003-01-01

    This paper discusses the detailed design of an XML databinding framework for aircraft engine simulation. The framework provides an object interface to access and use engine data, while at the same time preserving the meaning of the original data. The language-independent representation of engine component data enables users to move around XML data using HTTP through disparate networks. The application of this framework is demonstrated via a web-based turbofan propulsion system simulation using the World Wide Web (WWW). A Java Servlet-based web component architecture is used for rendering XML engine data into HTML format and dealing with input events from the user, which allows users to interact with simulation data from a web browser. The simulation data can also be saved to a local disk for archiving or to restart the simulation at a later time.
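
    The core databinding idea, mapping XML descriptions of engine components onto typed objects, can be sketched compactly. The example below is a hedged Python analogue (the paper's framework is Java-based); the element names, attributes, and the Component class are invented for illustration.

        # Minimal sketch of XML-to-object binding for engine component data.
        from dataclasses import dataclass
        import xml.etree.ElementTree as ET

        XML = """
        <engine name="demo-turbofan">
          <component type="fan" pressureRatio="1.7" efficiency="0.92"/>
          <component type="compressor" pressureRatio="12.0" efficiency="0.88"/>
        </engine>
        """

        @dataclass
        class Component:
            type: str
            pressure_ratio: float
            efficiency: float

        def bind(xml_text):
            """Map each <component> element to a typed Component object."""
            root = ET.fromstring(xml_text)
            return [Component(c.get("type"),
                              float(c.get("pressureRatio")),
                              float(c.get("efficiency")))
                    for c in root.findall("component")]

        for comp in bind(XML):
            print(comp)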

  3. Synthetic biology and regulatory networks: where metabolic systems biology meets control engineering.

    PubMed

    He, Fei; Murabito, Ettore; Westerhoff, Hans V

    2016-04-01

    Metabolic pathways can be engineered to maximize the synthesis of various products of interest. With the advent of computational systems biology, this endeavour is usually carried out through in silico theoretical studies with the aim to guide and complement further in vitro and in vivo experimental efforts. Clearly, what counts is the result in vivo, not only in terms of maximal productivity but also robustness against environmental perturbations. Engineering an organism towards an increased production flux, however, often compromises that robustness. In this contribution, we review and investigate how various analytical approaches used in metabolic engineering and synthetic biology are related to concepts developed by systems and control engineering. While trade-offs between production optimality and cellular robustness have already been studied diagnostically and statically, the dynamics also matter. Integration of the dynamic design aspects of control engineering with the more diagnostic aspects of metabolic, hierarchical control and regulation analysis is leading to the new, conceptual and operational framework required for the design of robust and productive dynamic pathways. © 2016 The Author(s).

  4. An algorithm for designing minimal microbial communities with desired metabolic capacities

    PubMed Central

    Eng, Alexander; Borenstein, Elhanan

    2016-01-01

    Motivation: Recent efforts to manipulate various microbial communities, such as fecal microbiota transplant and bioreactor systems’ optimization, suggest a promising route for microbial community engineering with numerous medical, environmental and industrial applications. However, such applications are currently restricted in scale and often rely on mimicking or enhancing natural communities, calling for the development of tools for designing synthetic communities with specific, tailored, desired metabolic capacities. Results: Here, we present a first step toward this goal, introducing a novel algorithm for identifying minimal sets of microbial species that collectively provide the enzymatic capacity required to synthesize a set of desired target product metabolites from a predefined set of available substrates. Our method integrates a graph theoretic representation of network flow with the set cover problem in an integer linear programming (ILP) framework to simultaneously identify possible metabolic paths from substrates to products while minimizing the number of species required to catalyze these metabolic reactions. We apply our algorithm to successfully identify minimal communities both in a set of simple toy problems and in more complex, realistic settings, and to investigate metabolic capacities in the gut microbiome. Our framework adds to the growing toolset for supporting informed microbial community engineering and for ultimately realizing the full potential of such engineering efforts. Availability and implementation: The algorithm source code, compilation, usage instructions and examples are available under a non-commercial research use only license at https://github.com/borenstein-lab/CoMiDA. Contact: elbo@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153571
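
    The set-cover side of the formulation can be illustrated in a few lines. The sketch below is a simplified stand-in, not the CoMiDA algorithm (which also models network flow from substrates to products); it selects the fewest species whose capability sets jointly cover a list of target products, assuming SciPy's mixed-integer solver is available, and the species names and capabilities are hypothetical.

        # Toy set-cover ILP: pick the fewest species covering all target products.
        import numpy as np
        from scipy.optimize import Bounds, LinearConstraint, milp

        targets = ["butyrate", "propionate", "B12"]
        species_capabilities = {                 # hypothetical capabilities
            "sp1": {"butyrate"},
            "sp2": {"propionate", "B12"},
            "sp3": {"butyrate", "propionate"},
        }
        species = list(species_capabilities)

        # A[t, s] = 1 if species s can produce target t
        A = np.array([[1 if t in species_capabilities[s] else 0 for s in species]
                      for t in targets])

        c = np.ones(len(species))                # objective: number of species used
        res = milp(c,
                   constraints=LinearConstraint(A, lb=1, ub=np.inf),  # cover every target
                   integrality=np.ones(len(species)),                 # binary variables
                   bounds=Bounds(0, 1))

        chosen = [s for s, x in zip(species, res.x) if x > 0.5]
        print("one minimal community:", chosen)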

  5. The importance of meta-ethics in engineering education.

    PubMed

    Haws, David R

    2004-04-01

    Our shared moral framework is negotiated as part of the social contract. Some elements of that framework are established (tell the truth under oath), but other elements lack an overlapping consensus (just when can an individual lie to protect his or her privacy?). The tidy bits of our accepted moral framework have been codified, becoming the subject of legal rather than ethical consideration. Those elements remaining in the realm of ethics seem fragmented and inconsistent. Yet, our engineering students will need to navigate the broken ground of this complex moral landscape. A minimalist approach would leave our students with formulated dogma--principles of right and wrong such as the National Society for Professional Engineers (NSPE) Code of Ethics for Engineers--but without any insight into the genesis of these principles. A slightly deeper, micro-ethics approach would teach our students to solve ethical problems by applying heuristics--giving our students a rational process to manipulate ethical dilemmas using the same principles simply referenced a priori by dogma. A macro-ethics approach--helping students to inductively construct a posteriori principles from case studies--goes beyond the simple statement or manipulation of principles, but falls short of linking personal moral principles to the larger, social context. Ultimately, it is this social context that requires both the application of ethical principles, and the negotiation of moral values--from an understanding of meta-ethics. The approaches to engineering ethics instruction (dogma, heuristics, case studies, and meta-ethics) can be associated with stages of moral development. If we leave our students with only a dogmatic reaction to ethical dilemmas, they will be dependent on the ethical decisions of others (a denial of their fundamental potential for moral autonomy). Heuristics offers a tool to deal independently with moral questions, but a tool that too frequently reduces to casuistry when rigidly applied to "simplified" dilemmas. Case studies, while providing a context for engineering ethics, can encourage the premature analysis of specific moral conduct rather than the development of broad moral principles--stifling our students' facility with meta-ethics. Clearly, if a moral sense is developmental, ethics instruction should lead our students from lower to higher stages of moral development.

  6. Strategic science: new frameworks to bring scientific expertise to environmental disaster response

    USGS Publications Warehouse

    Stoepler, Teresa Michelle; Ludwig, Kristin A.

    2015-01-01

    Science is critical to society’s ability to prepare for, respond to, and recover from environmental crises. Natural and technological disasters such as disease outbreaks, volcanic eruptions, hurricanes, oil spills, and tsunamis require coordinated scientific expertise across a range of disciplines to shape effective policies and protocols. Five years after the Deepwater Horizon oil spill, new organizational frameworks have arisen for scientists and engineers to apply their expertise to disaster response and recovery in a variety of capacities. Here, we describe examples of these opportunities, including an exciting new collaboration between the Association for the Sciences of Limnology and Oceanography (ASLO) and the Department of the Interior’s (DOI) Strategic Sciences Group (SSG).

  7. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
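
    The basic surrogate-management loop can be sketched as follows (a schematic under simplifying assumptions, not the authors' provably convergent framework): fit a cheap surrogate to the evaluations gathered so far, minimize the surrogate within the simple bounds, and spend one expensive evaluation at the proposed point. The function expensive and all settings are placeholders.

        # Schematic surrogate-assisted optimization loop.
        import numpy as np
        from scipy.optimize import minimize

        def expensive(x):
            """Stand-in for a costly simulation (cheap here, expensive in practice)."""
            return (x - 0.7) ** 2 + 0.1 * np.sin(8 * x)

        lo, hi = 0.0, 2.0
        X = list(np.linspace(lo, hi, 5))         # initial design points
        Y = [expensive(x) for x in X]

        for it in range(5):
            coeffs = np.polyfit(X, Y, deg=2)     # cheap quadratic surrogate of the data

            def surrogate(z):
                return float(np.polyval(coeffs, z[0]))

            x0 = [X[int(np.argmin(Y))]]          # start the search at the best point so far
            res = minimize(surrogate, x0=x0, bounds=[(lo, hi)])
            x_new = float(res.x[0])
            X.append(x_new)
            Y.append(expensive(x_new))           # one new expensive evaluation per iteration
            print(f"iteration {it}: x = {x_new:.4f}, f(x) = {Y[-1]:.5f}")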

  8. Estimating unknown input parameters when implementing the NGA ground-motion prediction equations in engineering practice

    USGS Publications Warehouse

    Kaklamanos, James; Baise, Laurie G.; Boore, David M.

    2011-01-01

    The ground-motion prediction equations (GMPEs) developed as part of the Next Generation Attenuation of Ground Motions (NGA-West) project in 2008 are becoming widely used in seismic hazard analyses. However, these new models are considerably more complicated than previous GMPEs, and they require several more input parameters. When employing the NGA models, users routinely face situations in which some of the required input parameters are unknown. In this paper, we present a framework for estimating the unknown source, path, and site parameters when implementing the NGA models in engineering practice, and we derive geometrically-based equations relating the three distance measures found in the NGA models. Our intent is for the content of this paper not only to make the NGA models more accessible, but also to help with the implementation of other present or future GMPEs.
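
    As a flavor of the geometric relations involved (a hedged illustration, not the paper's general derivation for dipping faults), for the special case of a vertical fault and a site at the ground surface the rupture distance follows from the Joyner-Boore distance R_JB and the depth to the top of rupture Z_TOR; the function name and inputs below are illustrative.

        # Vertical-fault special case: R_RUP = sqrt(R_JB^2 + Z_TOR^2)
        import math

        def rupture_distance_vertical(r_jb_km, z_tor_km):
            """Approximate rupture distance (km) for a vertical fault."""
            return math.hypot(r_jb_km, z_tor_km)

        print(rupture_distance_vertical(10.0, 3.0))   # ~10.44 km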

  9. A Decision Support Framework for Evaluation of Engineered ...

    EPA Pesticide Factsheets

    Engineered nanomaterials (ENM) are currently being developed and applied at rates that far exceed our ability to evaluate their potential for environmental or human health risks. The gap between material development and capacity for assessment grows wider every day. Transformative approaches are required that enhance our ability to forecast potential exposure and adverse health risks based on limited information such as the physical and chemical parameters of ENM, their proposed uses, and functional assays reflective of key ENM-environmental interactions. We are developing a framework that encompasses the potential for release of nanomaterials across a product life cycle, environmental transport, transformations and fate, exposure to sensitive species, including humans, and the potential for causing adverse effects. Each component of the framework is conceived of as a sequential segmented model depicting the movement, transformations and actions of ENM through environmental or biological compartments, and along which targeted functional assays can be developed that are indicative of projected rates of ENM movement or action. The eventual goal is to allow simple predictive models to be built that incorporate the data from key functional assays and thereby allow rapid screening of the projected margin of exposure for proposed applications of ENM-enabled products. In this way, cases where a substantially safe margin of exposure is forecast can be reduced in

  10. Holographic heat engine within the framework of massive gravity

    NASA Astrophysics Data System (ADS)

    Mo, Jie-Xiong; Li, Gu-Qiang

    2018-05-01

    Heat engine models are constructed within the framework of massive gravity in this paper. For the four-dimensional charged black holes in massive gravity, it is shown that the existence of graviton mass improves the heat engine efficiency significantly. The situation is more complicated for the five-dimensional neutral black holes since the constant which corresponds to the third massive potential also contributes to the efficiency. It is also shown that the existence of graviton mass can improve the heat engine efficiency. Moreover, we probe how the massive gravity influences the behavior of the heat engine efficiency approaching the Carnot efficiency.
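
    For reference, the efficiencies being compared are the standard thermodynamic ones (a notational reminder, not a result of the paper):

        \eta = \frac{W}{Q_H}, \qquad \eta_C = 1 - \frac{T_C}{T_H},

    where W is the net work per cycle, Q_H the heat absorbed from the hot reservoir, and T_C, T_H the cold- and hot-reservoir temperatures; the paper examines how closely \eta approaches \eta_C as the graviton mass parameters vary.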

  11. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    DTIC Science & Technology

    2017-01-01

    Describes the Modular Algorithm Testbed Suite (MATS), a software framework for automatic target recognition (ATR) intended to support naval mine countermeasures (MCM) operations by automating a large portion of the data analysis. Sponsored by the Office of Naval Research, Mine Warfare and Ocean Engineering Programs. Keywords: Modular Algorithm Testbed Suite; MATS; Mine Countermeasures Operations.

  12. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFTbased systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and its linkage to design. As a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  13. Dynamics simulations for engineering macromolecular interactions

    PubMed Central

    Robinson-Mosher, Avi; Shinar, Tamar; Silver, Pamela A.; Way, Jeffrey

    2013-01-01

    The predictable engineering of well-behaved transcriptional circuits is a central goal of synthetic biology. The artificial attachment of promoters to transcription factor genes usually results in noisy or chaotic behaviors, and such systems are unlikely to be useful in practical applications. Natural transcriptional regulation relies extensively on protein-protein interactions to insure tightly controlled behavior, but such tight control has been elusive in engineered systems. To help engineer protein-protein interactions, we have developed a molecular dynamics simulation framework that simplifies features of proteins moving by constrained Brownian motion, with the goal of performing long simulations. The behavior of a simulated protein system is determined by summation of forces that include a Brownian force, a drag force, excluded volume constraints, relative position constraints, and binding constraints that relate to experimentally determined on-rates and off-rates for chosen protein elements in a system. Proteins are abstracted as spheres. Binding surfaces are defined radially within a protein. Peptide linkers are abstracted as small protein-like spheres with rigid connections. To address whether our framework could generate useful predictions, we simulated the behavior of an engineered fusion protein consisting of two 20 000 Da proteins attached by flexible glycine/serine-type linkers. The two protein elements remained closely associated, as if constrained by a random walk in three dimensions of the peptide linker, as opposed to showing a distribution of distances expected if movement were dominated by Brownian motion of the protein domains only. We also simulated the behavior of fluorescent proteins tethered by a linker of varying length, compared the predicted Förster resonance energy transfer with previous experimental observations, and obtained a good correspondence. Finally, we simulated the binding behavior of a fusion of two ligands that could simultaneously bind to distinct cell-surface receptors, and explored the landscape of linker lengths and stiffnesses that could enhance receptor binding of one ligand when the other ligand has already bound to its receptor, thus, addressing potential mechanisms for improving targeted signal transduction proteins. These specific results have implications for the design of targeted fusion proteins and artificial transcription factors involving fusion of natural domains. More broadly, the simulation framework described here could be extended to include more detailed system features such as non-spherical protein shapes and electrostatics, without requiring detailed, computationally expensive specifications. This framework should be useful in predicting behavior of engineered protein systems including binding and dissociation reactions. PMID:23822508
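
    The kind of update rule such a framework relies on can be sketched with an overdamped (Brownian) dynamics step for two tethered spheres. This is an illustrative toy, not the authors' simulation code; the linker stiffness, time step, sphere size, and the harmonic "linker" itself are assumed values standing in for the framework's constraint terms.

        # Overdamped Brownian dynamics: drift from deterministic force + random kick.
        import numpy as np

        rng = np.random.default_rng(2)
        kT = 4.11e-21          # J, thermal energy near room temperature
        radius = 2.5e-9        # m, ~20 kDa protein abstracted as a sphere
        eta = 1e-3             # Pa*s, viscosity of water
        gamma = 6 * np.pi * eta * radius          # Stokes drag coefficient
        D = kT / gamma                            # diffusion coefficient
        dt = 1e-9              # s

        def step(pos, force):
            """Euler-Maruyama update: deterministic drift plus Brownian kick."""
            drift = force / gamma * dt
            kick = np.sqrt(2 * D * dt) * rng.standard_normal(3)
            return pos + drift + kick

        # Two spheres joined by a harmonic "linker" (assumed stiffness k_link).
        k_link = 1e-4          # N/m, hypothetical linker stiffness
        rest = 5e-9            # m, linker rest length
        x1, x2 = np.zeros(3), np.array([6e-9, 0.0, 0.0])
        for _ in range(10000):
            d = x2 - x1
            r = np.linalg.norm(d)
            f = k_link * (r - rest) * d / r       # spring force acting on sphere 1
            x1, x2 = step(x1, f), step(x2, -f)

        print("separation after 10 microseconds:", np.linalg.norm(x2 - x1))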

  14. An Illustrative Guide to the Minerva Framework

    NASA Astrophysics Data System (ADS)

    Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration

    2017-10-01

    Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model-friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.

  15. Modeling patient safety incidents knowledge with the Categorial Structure method.

    PubMed

    Souvignet, Julien; Bousquet, Cédric; Lewalle, Pierre; Trombert-Paviot, Béatrice; Rodrigues, Jean Marie

    2011-01-01

    Following the WHO initiative named World Alliance for Patient Safety (PS), launched in 2004, a conceptual framework developed by PS national reporting experts has summarized the knowledge available. As a second step, the Department of Public Health of the University of Saint Etienne team elaborated a Categorial Structure (a semi-formal structure not related to an upper-level ontology) identifying the elements of the semantic structure underpinning the broad concepts contained in the framework for patient safety. This knowledge engineering method has been developed to enable modeling patient safety information as a prerequisite for subsequent full ontology development. The present article describes the semantic dissection of the concepts, the elicitation of the ontology requirements and the domain constraints of the conceptual framework. This ontology includes 134 concepts and 25 distinct relations and will serve as the basis for an Information Model for Patient Safety.

  16. Deep Borehole Disposal Safety Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, Geoffrey A.; Stein, Emily; Price, Laura L.

    This report presents a preliminary safety analysis for the deep borehole disposal (DBD) concept, using a safety case framework. A safety case is an integrated collection of qualitative and quantitative arguments, evidence, and analyses that substantiate the safety, and the level of confidence in the safety, of a geologic repository. This safety case framework for DBD follows the outline of the elements of a safety case, and identifies the types of information that will be required to satisfy these elements. At this very preliminary phase of development, the DBD safety case focuses on the generic feasibility of the DBD concept. It is based on potential system designs, waste forms, engineering, and geologic conditions; however, no specific site or regulatory framework exists. It will progress to a site-specific safety case as the DBD concept advances into a site-specific phase, progressing through consent-based site selection and site investigation and characterization.

  17. Product design for energy reduction in concurrent engineering: An Inverted Pyramid Approach

    NASA Astrophysics Data System (ADS)

    Alkadi, Nasr M.

    Energy factors in product design in concurrent engineering (CE) are becoming an emerging dimension for several reasons: (a) the rising interest in "green design and manufacturing", (b) the national energy security concerns and the dramatic increase in energy prices, (c) the global competition in the marketplace and global climate change commitments including carbon tax and emission trading systems, and (d) the widespread recognition of the need for sustainable development. This research presents a methodology for incorporating energy factors into the concurrent engineering product development process to significantly reduce the manufacturing energy requirement. The work presented here is the first attempt at integrating design for energy into a concurrent engineering framework. It adds an important tool to the DFX toolbox for evaluation of the impact of design decisions on the product manufacturing energy requirement early during the design phase. The research hypothesis states that "Product Manufacturing Energy Requirement is a Function of Design Parameters". The hypothesis was tested by conducting experimental work in machining and heat treating that took place at the manufacturing lab of the Industrial and Management Systems Engineering Department (IMSE) at West Virginia University (WVU) and at a major U.S. steel manufacturing plant, respectively. The objective of the machining experiment was to study the effect of changing specific product design parameters (material type and diameter) and process design parameters (metal removal rate) on the input power requirement of a gear-head lathe through defined sets of machining experiments. The objective of the heat treating experiment was to study the effect of varying product charging temperature on the fuel consumption of a walking-beam reheat furnace. The experimental work in both directions has revealed important insights into energy utilization in machining and heat-treating processes and its variation with product, process, and system design parameters. An in-depth evaluation of how design and manufacturing normally happen in concurrent engineering provided a framework for developing energy system levels in machining within the concurrent engineering environment using the "Inverted Pyramid Approach" (IPA). The IPA features varying levels of energy-based output information depending on the input design parameters that are available during each stage (level) of the product design. The experimental work, the in-depth evaluation of design and manufacturing in CE, and the developed energy system levels in machining provided a solid base for the development of the model for the design for energy reduction in CE. The model was used to analyze an example part where 12 evolving designs were thoroughly reviewed to investigate the sensitivity of energy to design parameters in machining. The model allowed product design teams to address manufacturing energy concerns early during the design stage. As a result, ranges for energy-sensitive design parameters impacting product manufacturing energy consumption were found in earlier levels. As the designer proceeds to deeper levels in the model, these ranges tighten, resulting in significant energy reductions.
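
    The hypothesis that manufacturing energy is a function of design parameters can be illustrated with the standard specific-cutting-energy estimate for machining. This is a back-of-the-envelope sketch: the energy values below are typical handbook figures, not the dissertation's measurements, and the function turning_energy is invented for illustration.

        # Link between design parameters (material, geometry) and machining energy:
        # energy ~ specific cutting energy * removed volume.
        import math

        SPECIFIC_CUTTING_ENERGY = {   # J/mm^3, approximate handbook values
            "aluminum": 0.7,
            "mild_steel": 3.0,
        }

        def turning_energy(material, diameter_mm, length_mm, depth_of_cut_mm):
            """Energy to turn a cylinder down by one depth-of-cut pass (thin-shell approx.)."""
            removed_volume = math.pi * diameter_mm * depth_of_cut_mm * length_mm  # mm^3
            return SPECIFIC_CUTTING_ENERGY[material] * removed_volume             # J

        # Same geometry, different material choice made at design time:
        for m in SPECIFIC_CUTTING_ENERGY:
            e = turning_energy(m, diameter_mm=50, length_mm=100, depth_of_cut_mm=1)
            print(f"{m}: {e:,.0f} J")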

  18. Designing Computer Learning Environments for Engineering and Computer Science: The Scaffolded Knowledge Integration Framework.

    ERIC Educational Resources Information Center

    Linn, Marcia C.

    1995-01-01

    Describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering: the LISP Knowledge Integration Environment and the spatial reasoning environment. (101 references) (Author/MKR)

  19. Strategies for Maximizing Successful Drug Substance Technology Transfer Using Engineering, Shake-Down, and Wet Test Runs.

    PubMed

    Abraham, Sushil; Bain, David; Bowers, John; Larivee, Victor; Leira, Francisco; Xie, Jasmina

    2015-01-01

    The technology transfer of biological products is a complex process requiring control of multiple unit operations and parameters to ensure product quality and process performance. To achieve product commercialization, the technology transfer sending unit must successfully transfer knowledge about both the product and the process to the receiving unit. A key strategy for maximizing successful scale-up and transfer efforts is the effective use of engineering and shake-down runs to confirm operational performance and product quality prior to embarking on good manufacturing practice runs such as process performance qualification runs. We discuss key factors to consider in making the decision to perform shake-down or engineering runs. We also present industry benchmarking results of how engineering runs are used in drug substance technology transfers alongside the main themes and best practices that have emerged. Our goal is to provide companies with a framework for ensuring "right first time" technology transfers with effective deployment of resources within increasingly aggressive timeline constraints. © PDA, Inc. 2015.

  20. Consistent design schematics for biological systems: standardization of representation in biological engineering

    PubMed Central

    Matsuoka, Yukiko; Ghosh, Samik; Kitano, Hiroaki

    2009-01-01

    The discovery-by-design paradigm driving research in synthetic biology entails the engineering of de novo biological constructs with well-characterized input–output behaviours and interfaces. The construction of biological circuits requires iterative phases of design, simulation and assembly, leading to the fabrication of a biological device. In order to represent engineered models in a consistent visual format and further simulate them in silico, standardization of representation and model formalism is imperative. In this article, we review different efforts for standardization, particularly standards for graphical visualization and simulation/annotation schemata adopted in systems biology. We identify the importance of integrating the different standardization efforts and provide insights into potential avenues for developing a common framework for model visualization, simulation and sharing across various tools. We envision that such a synergistic approach would lead to the development of global, standardized schemata in biology, empowering deeper understanding of molecular mechanisms as well as engineering of novel biological systems. PMID:19493898

  1. A Multidisciplinary Approach to Mixer-Ejector Analysis and Design

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Seidel, Jonathan A.

    2012-01-01

    The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise while remaining quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge. However, designing a mixer-ejector which will successfully address both of these concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. From the developed process, sample results are given for a notional mixer-ejector design, thereby demonstrating the capabilities of the method.
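
    A minimal sketch of how a component is wired into OpenMDAO is shown below, assuming a recent OpenMDAO 3.x API; the component's equations are invented placeholder trends, not NASA's mixer-ejector performance or acoustics models.

        # Schematic OpenMDAO wiring with placeholder physics.
        import openmdao.api as om

        class ToyMixerEjector(om.ExplicitComponent):
            def setup(self):
                self.add_input("ejector_area_ratio", val=1.2)
                self.add_output("thrust_loss_pct", val=0.0)
                self.add_output("noise_reduction_db", val=0.0)
                self.declare_partials("*", "*", method="fd")

            def compute(self, inputs, outputs):
                ar = inputs["ejector_area_ratio"]            # valid for ar >= 1 here
                outputs["thrust_loss_pct"] = 2.0 * (ar - 1.0)        # placeholder trend
                outputs["noise_reduction_db"] = 6.0 * (ar - 1.0) ** 0.5

        prob = om.Problem()
        prob.model.add_subsystem("mixer", ToyMixerEjector(), promotes=["*"])
        prob.setup()
        prob.set_val("ejector_area_ratio", 1.5)
        prob.run_model()
        print(prob.get_val("thrust_loss_pct"), prob.get_val("noise_reduction_db"))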

  2. Method Engineering: A Service-Oriented Approach

    NASA Astrophysics Data System (ADS)

    Cauvet, Corine

    In the past, a large variety of methods have been published, ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering. The meta-modeling approach provides means for building methods by instantiation, while the component-based approach aims at supporting the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of service dynamic composition support method construction and method adaptation to different development contexts.

  3. A systems engineering perspective on the human-centered design of health information systems.

    PubMed

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering, over the past five decades, has used a structured, systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method, but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. Given that most health information systems have safety implications, we argue that the design and development of such systems would benefit from implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists in the interdisciplinary development of health information systems.

  4. Multidisciplinary model-based-engineering for laser weapon systems: recent progress

    NASA Astrophysics Data System (ADS)

    Coy, Steve; Panthaki, Malcolm

    2013-09-01

    We are working to develop a comprehensive, integrated software framework and toolset to support model-based engineering (MBE) of laser weapons systems. MBE has been identified by the Office of the Director, Defense Science and Engineering as one of four potentially "game-changing" technologies that could bring about revolutionary advances across the entire DoD research and development and procurement cycle. To be effective, however, MBE requires robust underlying modeling and simulation technologies capable of modeling all the pertinent systems, subsystems, components, effects, and interactions at any level of fidelity that may be required in order to support crucial design decisions at any point in the system development lifecycle. Very often the greatest technical challenges are posed by systems involving interactions that cut across two or more distinct scientific or engineering domains; even in cases where there are excellent tools available for modeling each individual domain, generally none of these domain-specific tools can be used to model the cross-domain interactions. In the case of laser weapons systems R&D, these tools need to be able to support modeling of systems involving combined interactions among structures, thermal, and optical effects, including both ray optics and wave optics, controls, atmospheric effects, target interaction, computational fluid dynamics, and spatiotemporal interactions between lasing light and the laser gain medium. To address this problem, we are working to extend Comet™ to add the additional modeling and simulation capabilities required for this particular application area. In this paper we will describe our progress to date.

  5. Data Integration Framework Data Management Plan Remote Sensing Dataset

    DTIC Science & Technology

    2016-07-01

    Documents the Data Management Plan for the Remote Sensing Dataset under the Coastal Ocean Data Systems Program's Data Integration Framework (report ERDC/CHL SR-16-2). The work was performed by the Coastal Observations and Analysis Branch (CEERD-HFA) of the Flood and Storm Protection Division (CEERD-HF), U.S. Army Engineer Research and Development Center, in coordination with the U.S. Army Corps of Engineers, Mobile District (CESAM).

  6. A survey of the neuroscience resource landscape: perspectives from the neuroscience information framework.

    PubMed

    Cachat, Jonathan; Bandrowski, Anita; Grethe, Jeffery S; Gupta, Amarnath; Astakhov, Vadim; Imam, Fahim; Larson, Stephen D; Martone, Maryann E

    2012-01-01

    The number of available neuroscience resources (databases, tools, materials, and networks) available via the Web continues to expand, particularly in light of newly implemented data sharing policies required by funding agencies and journals. However, the nature of dense, multifaceted neuroscience data and the design of classic search engine systems make efficient, reliable, and relevant discovery of such resources a significant challenge. This challenge is especially pertinent for online databases, whose dynamic content is largely opaque to contemporary search engines. The Neuroscience Information Framework was initiated to address this problem of finding and utilizing neuroscience-relevant resources. Since its first production release in 2008, NIF has been surveying the resource landscape for the neurosciences, identifying relevant resources and working to make them easily discoverable by the neuroscience community. In this chapter, we provide a survey of the resource landscape for neuroscience: what types of resources are available, how many there are, what they contain, and most importantly, ways in which these resources can be utilized by the research community to advance neuroscience research. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Life-cycle and cost of goods assessment of fed-batch and perfusion-based manufacturing processes for mAbs.

    PubMed

    Bunnak, Phumthep; Allmendinger, Richard; Ramasamy, Sri V; Lettieri, Paola; Titchener-Hooker, Nigel J

    2016-09-01

    Life-cycle assessment (LCA) is an environmental assessment tool that quantifies the environmental impact associated with a product or a process (e.g., water consumption, energy requirements, and solid waste generation). While LCA is a standard approach in many commercial industries, its application has not been exploited widely in the bioprocessing sector. To contribute toward the design of more cost-efficient, robust and environmentally-friendly manufacturing processes for monoclonal antibodies (mAbs), a framework consisting of an LCA and economic analysis combined with a sensitivity analysis of manufacturing process parameters and a production scale-up study is presented. The efficiency of the framework is demonstrated using a comparative study of the two most commonly used upstream configurations for mAb manufacture, namely fed-batch (FB) and perfusion-based processes. Results obtained by the framework are presented using a range of visualization tools, and indicate that a standard perfusion process (with a pooling duration of 4 days) has a similar cost of goods to a FB process but a larger environmental footprint because it consumed 35% more water, demanded 17% more energy, and emitted 17% more CO2 than the FB process. Water consumption was the most important impact category, especially when scaling-up the processes, as energy was required to produce process water and water-for-injection, while CO2 was emitted from energy generation. The sensitivity analysis revealed that the perfusion process can be made more environmentally-friendly than the FB process if the pooling duration is extended to 8 days. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1324-1335, 2016. © 2016 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.

  8. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks (GRNs) are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review existing models for various aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions, and the connections and differences among the algorithms are discussed. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885
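
    As a toy illustration of the "point solution" style of network inference mentioned in this record, the Python sketch below scores candidate edges by absolute Pearson correlation across expression samples and keeps those above a threshold. This is only a baseline heuristic for illustration; it is not one of the statistical models reviewed in the survey, and all gene names are made up.

      import numpy as np

      def infer_edges(expression, gene_names, threshold=0.8):
          """expression: (n_samples, n_genes) array; return (gene_i, gene_j, score) edges."""
          corr = np.corrcoef(expression, rowvar=False)
          edges = []
          for i in range(len(gene_names)):
              for j in range(i + 1, len(gene_names)):
                  score = abs(corr[i, j])
                  if score >= threshold:
                      edges.append((gene_names[i], gene_names[j], round(float(score), 3)))
          return edges

      rng = np.random.default_rng(0)
      regulator = rng.normal(size=(50, 1))
      data = np.hstack([regulator,
                        0.9 * regulator + 0.1 * rng.normal(size=(50, 1)),   # strongly coupled target
                        rng.normal(size=(50, 1))])                          # unrelated gene
      print(infer_edges(data, ["geneA", "geneB", "geneC"]))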

  9. NASA/DoD Aerospace Knowledge Diffusion Research Project. Paper 31: The information-seeking behavior of engineers

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.

    1993-01-01

    Engineers are an extraordinarily diverse group of professionals, but an attribute common to all engineers is their use of information. Engineering can be conceptualized as an information processing system that must deal with work-related uncertainty through patterns of technical communications. Throughout the process, data, information, and tacit knowledge are being acquired, produced, transferred, and utilized. While acknowledging that other models exist, we have chosen to view the information-seeking behavior of engineers within a conceptual framework of the engineer as an information processor. This article uses the chosen framework to discuss information-seeking behavior of engineers, reviewing selected literature and empirical studies from library and information science, management, communications, and sociology. The article concludes by proposing a research agenda designed to extend our current, limited knowledge of the way engineers process information.

  10. USE Efficiency: an innovative educational programme for energy efficiency in buildings

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Theofilos A.; Christoforidis, Georgios C.; Papagiannis, Grigoris K.

    2017-10-01

    Power engineers are expected to play a pivotal role in transforming buildings into smart and energy-efficient structures, which is necessary since buildings are responsible for a considerable amount of the total energy consumption. To fulfil this role, a holistic approach in education is required, tackling subjects traditionally related to other engineering disciplines. In this context, USE Efficiency is an inter-institutional and interdisciplinary educational programme implemented in nine European Universities targeting energy efficiency in buildings. The educational programme effectively links professors, students, engineers and industry experts, creating a unique learning environment. The scope of the paper is to present the methodology and the general framework followed in the USE Efficiency programme. The proposed methodology can be adopted for the design and implementation of educational programmes on energy efficiency and sustainable development in higher education. End-of-course survey results showed positive feedback from the participating students, indicating the success of the programme.

  11. An asynchronous traversal engine for graph-based rich metadata management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Carns, Philip; Ross, Robert B.

    Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format. We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.
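
    To make the level-synchronous versus asynchronous distinction concrete, the Python sketch below runs both traversal styles over a small in-memory property graph. The vertex names and the single-process work queue are illustrative stand-ins only; the engine described in this record operates on a distributed graph store with many servers.

      from collections import deque

      GRAPH = {  # toy "rich metadata" graph: jobs, files, users as vertices
          "job1": ["fileA", "fileB"], "fileA": ["user1"], "fileB": ["user1"],
          "user1": ["job2"], "job2": ["fileC"], "fileC": [],
      }

      def level_synchronous_bfs(start):
          """Process the graph level by level; every level ends at a global barrier."""
          visited, frontier, levels = {start}, [start], []
          while frontier:
              levels.append(list(frontier))        # barrier: the whole level must finish first
              next_frontier = []
              for v in frontier:
                  for w in GRAPH[v]:
                      if w not in visited:
                          visited.add(w)
                          next_frontier.append(w)
              frontier = next_frontier
          return levels

      def asynchronous_traversal(start):
          """Process vertices as soon as they are discovered; no per-level barrier."""
          visited, queue, order = {start}, deque([start]), []
          while queue:
              v = queue.popleft()                  # in a distributed engine, any idle server
              order.append(v)                      # could pick up this pending work item
              for w in GRAPH[v]:
                  if w not in visited:
                      visited.add(w)
                      queue.append(w)
          return order

      print(level_synchronous_bfs("job1"))
      print(asynchronous_traversal("job1"))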

  12. An asynchronous traversal engine for graph-based rich metadata management

    DOE PAGES

    Dai, Dong; Carns, Philip; Ross, Robert B.; ...

    2016-06-23

    Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format. We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.

  13. A system for environmental model coupling and code reuse: The Great Rivers Project

    NASA Astrophysics Data System (ADS)

    Eckman, B.; Rice, J.; Treinish, L.; Barford, C.

    2008-12-01

    As part of the Great Rivers Project, IBM is collaborating with The Nature Conservancy and the Center for Sustainability and the Global Environment (SAGE) at the University of Wisconsin, Madison to build a Modeling Framework and Decision Support System (DSS) designed to help policy makers and a variety of stakeholders (farmers, fish & wildlife managers, hydropower operators, et al.) assess, come to consensus, and act on land use decisions representing effective compromises between human use and ecosystem preservation/restoration. Initially focused on Brazil's Paraguay-Parana, China's Yangtze, and the Mississippi Basin in the US, the DSS integrates data and models from a wide variety of environmental sectors, including water balance, water quality, carbon balance, crop production, hydropower, and biodiversity. In this presentation we focus on the modeling framework aspect of this project. In our approach to these and other environmental modeling projects, we see a flexible, extensible modeling framework infrastructure for defining and running multi-step analytic simulations as critical. In this framework, we divide monolithic models into atomic components with clearly defined semantics encoded via rich metadata representation. Once models and their semantics and composition rules have been registered with the system by their authors or other experts, non-expert users may construct simulations as workflows of these atomic model components. A model composition engine enforces rules/constraints for composing model components into simulations, to avoid the creation of Frankenmodels, models that execute but produce scientifically invalid results. A common software environment and common representations of data and models are required, as well as an adapter strategy for code written in, e.g., Fortran or Python, that still enables efficient simulation runs, including parallelization. Since each new simulation, as a new composition of model components, requires calibration of parameters (fudge factors) to produce scientifically valid results, we are also developing an autocalibration engine. Finally, visualization is a key element of this modeling framework strategy, both to convey complex scientific data effectively and to enable non-expert users to make full use of the relevant features of the framework. We are developing a visualization environment with a strong data model, to enable visualizations, model results, and data all to be handled similarly.
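
    A minimal Python sketch of the composition-rule idea described in this record (rejecting "Frankenmodels"): each component declares the quantities it consumes and produces, and a workflow is accepted only if every input is satisfied by an external forcing or an upstream output. Component names, quantities, and units below are hypothetical, not the project's actual metadata.

      COMPONENTS = {
          "water_balance":   {"inputs": {"precip_mm_day"},  "outputs": {"runoff_mm_day"}},
          "crop_production": {"inputs": {"runoff_mm_day"},  "outputs": {"yield_t_ha"}},
          "hydropower":      {"inputs": {"discharge_m3_s"}, "outputs": {"power_MW"}},
      }

      def check_workflow(steps, external_forcings):
          """Reject a simulation whose components are wired to missing or mismatched quantities."""
          available = set(external_forcings)
          for name in steps:
              missing = COMPONENTS[name]["inputs"] - available
              if missing:
                  raise ValueError(f"{name} is missing inputs {sorted(missing)}; composition rejected")
              available |= COMPONENTS[name]["outputs"]
          return True

      print(check_workflow(["water_balance", "crop_production"], {"precip_mm_day"}))   # valid chain
      try:
          check_workflow(["water_balance", "hydropower"], {"precip_mm_day"})           # incompatible quantities
      except ValueError as err:
          print(err)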

  14. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes-based engineering of domains, requirements, and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain-specific models that are utilized by ABD. The focus of ABD is on software factories: family-based development artefacts (domain-specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group at the Leeds Institute of Molecular Medicine, University of Leeds.

  15. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2014-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink(R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
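
    As a simple illustration of the quantization effect noted in this record, the sketch below compares an ideal analogue feedback signal with the same signal digitized by a networked "smart" transducer. The signal, full-scale range, and resolution are illustrative values only, not C-MAPSS40k parameters.

      import math

      def quantize(value, full_scale=100.0, bits=12):
          """Round a reading to the nearest ADC code, as a networked transducer would."""
          step = full_scale / (2 ** bits)
          return round(value / step) * step

      time_s = [0.01 * k for k in range(5)]
      true_signal = [50.0 + 10.0 * math.sin(2 * math.pi * 5 * t) for t in time_s]
      for t, x in zip(time_s, true_signal):
          print(f"t={t:.2f}s  analogue={x:.4f}  quantized={quantize(x):.4f}")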

  16. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink(R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.

  17. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia Mae; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2014-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (40,000 pound force thrust) (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink (R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.

  18. Modular extracellular sensor architecture for engineering mammalian cell-based devices.

    PubMed

    Daringer, Nichole M; Dudek, Rachel M; Schwarz, Kelly A; Leonard, Joshua N

    2014-12-19

    Engineering mammalian cell-based devices that monitor and therapeutically modulate human physiology is a promising and emerging frontier in clinical synthetic biology. However, realizing this vision will require new technologies enabling engineered circuitry to sense and respond to physiologically relevant cues. No existing technology enables an engineered cell to sense exclusively extracellular ligands, including proteins and pathogens, without relying upon native cellular receptors or signal transduction pathways that may be subject to crosstalk with native cellular components. To address this need, we here report a technology we term a Modular Extracellular Sensor Architecture (MESA). This self-contained receptor and signal transduction platform is maximally orthogonal to native cellular processes and comprises independent, tunable protein modules that enable performance optimization and straightforward engineering of novel MESA that recognize novel ligands. We demonstrate ligand-inducible activation of MESA signaling, optimization of receptor performance using design-based approaches, and generation of MESA biosensors that produce outputs in the form of either transcriptional regulation or transcription-independent reconstitution of enzymatic activity. This systematic, quantitative platform characterization provides a framework for engineering MESA to recognize novel ligands and for integrating these sensors into diverse mammalian synthetic biology applications.

  19. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol are suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development. This is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
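
    The sketch below illustrates the publish/subscribe pattern in its simplest in-process form: components exchange data by topic rather than through direct point-to-point calls. It is only a sketch of the idea; it is not the Real Time Publish/Subscribe protocol or an ARINC 653 partition interface, and the topic names and message fields are invented.

      from collections import defaultdict

      class Bus:
          """Toy in-process message bus keyed by topic name."""
          def __init__(self):
              self._subscribers = defaultdict(list)

          def subscribe(self, topic, callback):
              self._subscribers[topic].append(callback)

          def publish(self, topic, message):
              for callback in self._subscribers[topic]:
                  callback(message)

      bus = Bus()
      bus.subscribe("nav.state", lambda msg: print("guidance received:", msg))
      bus.subscribe("nav.state", lambda msg: print("telemetry logged:", msg))
      bus.publish("nav.state", {"x_km": 1.2, "v_mps": 7.6})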

  20. Conflict Management in Collaborative Engineering Design: Basic Research in Fundamental Theory, Modeling Framework, and Computer Support for Collaborative Engineering Activities

    DTIC Science & Technology

    2002-01-01

    behaviors are influenced by social interactions, and to how modern IT systems should be designed to support these group technical activities. The...engineering disciplines to behavior, decision, psychology, organization, and the social sciences. "Conflict management activity in collaborative...Researchers instead began to search for an entirely new paradigm, starting from a theory in social science, to construct a conceptual framework to describe

  1. Understanding the Role of Academic Language on Conceptual Understanding in an Introductory Materials Science and Engineering Course

    NASA Astrophysics Data System (ADS)

    Kelly, Jacquelyn

    Students may use technical engineering terms without knowing what these words mean. This creates a language barrier in engineering that influences student learning. Previous research has been conducted to characterize the difference between colloquial and scientific language. Since this research had not yet been applied explicitly to engineering, conclusions from the area of science education were used instead. Various researchers outlined strategies for helping students acquire scientific language. However, few examined and quantified its relationship to student learning. A systemic functional linguistics framework, which has not previously been used in engineering education research, was adopted for this dissertation. This study investigated how engineering language proficiency influenced conceptual understanding of introductory materials science and engineering concepts. To answer the research questions about engineering language proficiency, a convenience sample of forty-one undergraduate students in an introductory materials science and engineering course was used. All data collected was integrated with the course. Measures included the Materials Concept Inventory, a written engineering design task, and group observations. Both systemic functional linguistics and mental models frameworks were utilized to interpret data and guide analysis. A series of regression analyses were conducted to determine if engineering language proficiency predicts group engineering term use, if conceptual understanding predicts group engineering term use, and if conceptual understanding predicts engineering language proficiency. Engineering academic language proficiency was found to be strongly linked to conceptual understanding in the context of introductory materials engineering courses. As the semester progressed, this relationship became even stronger. The more engineering concepts students are expected to learn, the more important it is that they are proficient in engineering language. However, exposure to engineering terms did not influence engineering language proficiency. These results stress the importance of engineering language proficiency for learning, but warn that simply exposing students to engineering terms does not promote engineering language proficiency.

  2. Combinatorial Fusion Analysis for Meta Search Information Retrieval

    NASA Astrophysics Data System (ADS)

    Hsu, D. Frank; Taksa, Isak

    Leading commercial search engines are built as single event systems. In response to a particular search query, the search engine returns a single list of ranked search results. To find more relevant results the user must frequently try several other search engines. A meta search engine was developed to enhance the process of multi-engine querying. The meta search engine queries several engines at the same time and fuses individual engine results into a single search results list. The fusion of multiple search results has been shown (mostly experimentally) to be highly effective. However, the question of why and how the fusion should be done still remains largely unanswered. In this chapter, we utilize the combinatorial fusion analysis proposed by Hsu et al. to analyze combination and fusion of multiple sources of information. A rank/score function is used in the design and analysis of our framework. The framework provides a better understanding of the fusion phenomenon in information retrieval. For example, to improve the performance of the combined multiple scoring systems, it is necessary that each of the individual scoring systems has relatively high performance and the individual scoring systems are diverse. Additionally, we illustrate various applications of the framework using two examples from the information retrieval domain.
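
    To illustrate the basic idea of combining ranked result lists, the Python sketch below fuses two hypothetical engines' results once by average score and once by average rank. The rank/score-function analysis described in this record is considerably richer than this; the document IDs and scores here are made up.

      engine_a = {"doc1": 0.9, "doc2": 0.7, "doc3": 0.2}   # normalized relevance scores
      engine_b = {"doc2": 0.8, "doc3": 0.6, "doc4": 0.5}

      def ranks(scores):
          """Map each document to its 1-based rank within one engine's list."""
          ordered = sorted(scores, key=scores.get, reverse=True)
          return {doc: i + 1 for i, doc in enumerate(ordered)}

      def fuse(systems, by="score"):
          docs = set().union(*systems)
          if by == "score":
              combined = {d: sum(s.get(d, 0.0) for s in systems) / len(systems) for d in docs}
              return sorted(docs, key=combined.get, reverse=True)    # higher average score first
          rank_maps = [ranks(s) for s in systems]
          worst = max(len(s) for s in systems) + 1                   # penalty rank for missing documents
          combined = {d: sum(r.get(d, worst) for r in rank_maps) / len(rank_maps) for d in docs}
          return sorted(docs, key=combined.get)                      # smaller average rank first

      print("score fusion:", fuse([engine_a, engine_b], by="score"))
      print("rank fusion: ", fuse([engine_a, engine_b], by="rank"))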

  3. Aspects, Wrappers and Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2003-01-01

    This viewgraph presentation provides information on the Object Infrastructure Framework (OIF), an Aspect-Oriented Programming (AOP) system. The presentation begins with an introduction to the difficulties and requirements of distributed computing, including functional and non-functional requirements (ilities). The architecture of Distributed Object Technology includes stubs (proxies for implementation objects) and skeletons (proxies for client applications). The key OIF ideas (injecting behavior, annotated communications, thread contexts, and pragma) are discussed. OIF is an AOP mechanism; AOP is centered on: 1) Separate expression of crosscutting concerns; 2) Mechanisms to weave the separate expressions into a unified system. AOP is software engineering technology for separately expressing systematic properties while nevertheless producing running systems that embody these properties.
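
    A small Python sketch of the "injecting behavior" idea: crosscutting concerns (logging and a retry policy) are written separately and woven around a business method by a proxy-style decorator. This mirrors the AOP notion described in the presentation but is not the OIF mechanism itself; all names below are invented for illustration.

      import functools, time

      def injected(*aspects):
          """Weave the given aspect wrappers around a method, outermost first."""
          def decorate(func):
              wrapped = func
              for aspect in reversed(aspects):
                  wrapped = aspect(wrapped)
              return wrapped
          return decorate

      def logging_aspect(call):
          @functools.wraps(call)
          def wrapper(*args, **kwargs):
              print(f"calling {call.__name__} with args={args[1:]}")   # args[0] is self
              return call(*args, **kwargs)
          return wrapper

      def retry_aspect(call, attempts=3):
          @functools.wraps(call)
          def wrapper(*args, **kwargs):
              for delay in range(attempts):
                  try:
                      return call(*args, **kwargs)
                  except ConnectionError:
                      time.sleep(0.01 * (delay + 1))
              raise ConnectionError(f"{call.__name__} failed after {attempts} attempts")
          return wrapper

      class RemoteService:
          @injected(logging_aspect, retry_aspect)
          def fetch(self, key):
              return {"key": key, "value": 42}   # stand-in for a remote call

      print(RemoteService().fetch("alpha"))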

  4. Augmenting endogenous repair of soft tissues with nanofibre scaffolds

    PubMed Central

    Snelling, Sarah; Dakin, Stephanie; Carr, Andrew

    2018-01-01

    As our ability to engineer nanoscale materials has developed, we can now influence endogenous cellular processes with increasing precision. Consequently, the use of biomaterials to induce and guide the repair and regeneration of tissues is a rapidly developing area. This review focuses on soft tissue engineering; it discusses the types of biomaterial scaffolds available before exploring physical, chemical, and biological modifications to synthetic scaffolds. We will consider how these properties, in combination, can provide a precise design process, with the potential to meet the requirements of the injured and diseased soft tissue niche. Finally, we frame our discussions within clinical trial design and the regulatory framework, the consideration of which is fundamental to the successful translation of new biomaterials. PMID:29695606

  5. Review: bioprinting: a beginning.

    PubMed

    Mironov, Vladimir; Reis, Nuno; Derby, Brian

    2006-04-01

    An increasing demand for directed assembly of biologically relevant materials, with prescribed three-dimensional hierarchical organizations, is stimulating technology developments with the ultimate goal of re-creating multicellular tissues and organs de novo. Existing techniques, mostly adapted from other applications or fields of research, are capable of independently meeting partial requirements for engineering biological or biomimetic structures, but their integration toward organ engineering is proving difficult. Inspired by recent developments in material transfer processes operating at all relevant length scales--from nano to macro--which are amenable to biological elements, a new research field of bioprinting and biopatterning has emerged. Here we present a short review regarding the framework, state of the art, and perspectives of this new field, based on the findings presented at a recent international workshop.

  6. Making a Good Group Decision (Low Risk) in Singapore Under an Environment That Has Time and Cost Constraints

    DTIC Science & Technology

    2014-09-01

    decision-making framework to eliminate bias and promote effective communication. Using a collaborative approach built on systems engineering and decision-making...

  7. Outcomes-Based Assessment and Learning: Trialling Change in a Postgraduate Civil Engineering Course

    ERIC Educational Resources Information Center

    El-Maaddawy, Tamer; Deneen, Christopher

    2017-01-01

    This paper aims to demonstrate how assessment tasks can function within an outcomes-based learning framework to evaluate student attainment of learning outcomes. An outcomes-based learning framework designed to integrate teaching, learning, and assessment activities was developed and implemented in a civil engineering master-level course. The…

  8. Envisioning engineering education and practice in the coming intelligence convergence era — a complex adaptive systems approach

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2013-12-01

    Some of the recent attempts to improve and transform engineering education are reviewed. The attempts aim at providing entry-level engineers with the skills needed to address the challenges of future large-scale complex systems and projects. Some of the frontier sectors and future challenges for engineers are outlined. The major characteristics of the coming intelligence convergence era (the post-information age) are identified. These include the prevalence of smart devices and environments, the widespread applications of anticipatory computing and predictive/prescriptive analytics, as well as a symbiotic relationship between humans and machines. Devices and machines will be able to learn from, and with, humans in a natural collaborative way. The recent game changers in learnscapes (learning paradigms, technologies, platforms, spaces, and environments) that can significantly impact engineering education in the coming era are identified. Among these are open educational resources, knowledge-rich classrooms, immersive interactive 3D learning, augmented reality, reverse instruction/flipped classroom, gamification, robots in the classroom, and adaptive personalized learning. Significant transformative changes in, and mass customization of, learning are envisioned to emerge from the synergistic combination of the game changers and other technologies. The realization of the aforementioned vision requires the development of a new multidisciplinary framework of emergent engineering for relating innovation, complexity and cybernetics, within the future learning environments. The framework can be used to treat engineering education as a complex adaptive system, with dynamically interacting and communicating components (instructors, individual, small, and large groups of learners). The emergent behavior resulting from the interactions can produce a progressively better, continuously improving learning environment. As a first step towards the realization of the vision, intelligent adaptive cyber-physical ecosystems need to be developed to facilitate collaboration between the various stakeholders of engineering education, and to accelerate the development of a skilled engineering workforce. The major components of the ecosystems include integrated knowledge discovery and exploitation facilities, blended learning and research spaces, novel ultra-intelligent software agents, multimodal and autonomous interfaces, and networked cognitive and tele-presence robots.

  9. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    NASA Technical Reports Server (NTRS)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  10. A reusable rocket engine intelligent control

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Lorenzo, Carl F.

    1988-01-01

    An intelligent control system for reusable space propulsion systems for future launch vehicles is described. The system description includes a framework for the design. The framework consists of an execution level with high-speed control and diagnostics, and a coordination level which marries expert system concepts with traditional control. A comparison is made between air breathing and rocket engine control concepts to assess the relative levels of development and to determine the applicability of air breathing control concepts to future reusable rocket engine systems.

  11. A reusable rocket engine intelligent control

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.; Lorenzo, Carl F.

    1988-01-01

    An intelligent control system for reusable space propulsion systems for future launch vehicles is described. The system description includes a framework for the design. The framework consists of an execution level with high-speed control and diagnostics, and a coordination level which marries expert system concepts with traditional control. A comparison is made between air breathing and rocket engine control concepts to assess the relative levels of development and to determine the applicability of air breathing control concepts to future reusable rocket engine systems.

  12. Extending enterprise architecture modelling with business goals and requirements

    NASA Astrophysics Data System (ADS)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available, however, for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  13. A Framework for RFID Survivability Requirement Analysis and Specification

    NASA Astrophysics Data System (ADS)

    Zuo, Yanjun; Pimple, Malvika; Lande, Suhas

    Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID is used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable, and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if utilized, can assist the system in resisting devastating attacks and recovering quickly from damage. This paper proposes the techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, the survivability requirement is the important first step in survivability specification, compliance formulation, and proof verification.

  14. Towards a Framework for Modeling Space Systems Architectures

    NASA Technical Reports Server (NTRS)

    Shames, Peter; Skipper, Joseph

    2006-01-01

    Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.

  15. Mars aerobrake assembly simulation

    NASA Technical Reports Server (NTRS)

    Filatovs, G. J.; Lee, Gordon K. F.; Garvey, John

    1992-01-01

    On-orbit assembly operation simulations in neutral buoyancy conditions are presently undertaken with a partial/full-scale Mars mission aerobrake mockup, whose design, conducted in the framework of an engineering senior students' design project, involved several levels of constraints for critical physical and operational features. Allowances had to be made for the auxiliary constraints introduced by underwater testing, as well as the subsegmenting required for overland shipment to the neutral-buoyancy testing facility. The fidelity of this aerobrake mockup is determined by the numerous competing design objectives.

  16. Introduction to the Security Engineering Risk Analysis (SERA) Framework

    DTIC Science & Technology

    2014-11-01

    military aircraft has increased from 8% to 80%. At the same time, the size of software in military aircraft has grown from 1,000 lines of code in the F-4A to 1.7 million lines of code in the F-22. This growth trend is expected to continue over time [NASA 2009]. As software exerts more control of...their root causes can be traced to the software's requirements, architecture, design, or code. Studies have shown that the cost of addressing a software

  17. Data center networks and network architecture

    NASA Astrophysics Data System (ADS)

    Esaki, Hiroshi

    2014-02-01

    This paper discusses and proposes an architectural framework for data center networks. Data center networks pose new technical challenges and offer a good opportunity to revisit functions that are not needed in current and future networks. Based on observations and considerations of data center networks, this paper proposes: (i) a broadcast-free layer 2 network (i.e., emulation of broadcast at the end-node), (ii) full-mesh point-to-point pipes, and (iii) IRIDES (Invitation Routing aDvertisement for path Engineering System).

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kornreich, Drew E; Vaidya, Rajendra U; Ammerman, Curtt N

    Integrated Computational Materials Engineering (ICME) is a novel overarching approach to bridge length and time scales in computational materials science and engineering. This approach integrates all elements of multi-scale modeling (including various empirical and science-based models) with materials informatics to provide users the opportunity to tailor material selections based on stringent application needs. Typically, materials engineering has focused on structural requirements (stress, strain, modulus, fracture toughness, etc.) while multi-scale modeling has been science focused (mechanical threshold strength model, grain-size models, solid-solution strengthening models, etc.). Materials informatics (mechanical property inventories), on the other hand, is extensively data focused. All of these elements are combined within the framework of ICME to create an architecture for the development, selection, and design of new composite materials for challenging environments. We propose development of the foundations for applying ICME to composite materials development for nuclear and high-radiation environments (including nuclear-fusion energy reactors, nuclear-fission reactors, and accelerators). We expect to combine all elements of current material models (including thermo-mechanical and finite-element models) into the ICME framework. This will be accomplished through the use of various mathematical modeling constructs. These constructs will allow the integration of constituent models, which in turn would allow us to use the adaptive strengths of a combinatorial scheme (fabrication and computational) for creating new composite materials. A sample problem where these concepts are used is provided in this summary.

  19. Technology Benefit Estimator (T/BEST): User's Manual

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib

    1994-01-01

    The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical database that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN-based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analysis output files. In order to make the communications between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security. T/BEST's software framework, status, beginner-to-expert operation, interface architecture, analysis module addition, and key analysis modules are discussed. Representative examples of T/BEST benefit analyses are shown.
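
    The record above names only the two commands, PUT and GET; the Python sketch below shows the general pattern of exchanging named parameters through a shared neutral file. The JSON format, file name, and function signatures are invented for illustration and are not the actual T/BEST interface.

      import json, os

      NEUTRAL_FILE = "neutral_file.json"   # assumed location for this sketch

      def put(name, value):
          """Write one named parameter into the shared neutral file."""
          data = {}
          if os.path.exists(NEUTRAL_FILE):
              with open(NEUTRAL_FILE) as fh:
                  data = json.load(fh)
          data[name] = value
          with open(NEUTRAL_FILE, "w") as fh:
              json.dump(data, fh, indent=2)

      def get(name, default=None):
          """Read one named parameter back out of the shared neutral file."""
          if not os.path.exists(NEUTRAL_FILE):
              return default
          with open(NEUTRAL_FILE) as fh:
              return json.load(fh).get(name, default)

      put("number_of_blades", 24)        # e.g., published by an engine-cycle module
      put("hub_diameter_m", 0.45)
      print(get("number_of_blades"), get("hub_diameter_m"))   # later read by a cost module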

  20. Technology Benefit Estimator (T/BEST): User's manual

    NASA Astrophysics Data System (ADS)

    Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib

    1994-12-01

    The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical database that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural, fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN-based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analysis output files. In order to make the communications between T/BEST's neutral file and attached analysis codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analysis codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security.

  1. Multidisciplinary Environments: A History of Engineering Framework Development

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Gillian, Ronnie E.

    2006-01-01

    This paper traces the history of engineering frameworks and their use by Multidisciplinary Design Optimization (MDO) practitioners. The approach is to reference papers that have been presented at one of the ten previous Multidisciplinary Analysis and Optimization (MA&O) conferences. By limiting the search to MA&O papers, the authors can (1) identify the key ideas that led to general purpose MDO frameworks and (2) uncover roadblocks that delayed the development of these ideas. The authors make no attempt to assign credit for revolutionary ideas or to assign blame for missed opportunities. Rather, the goal is to trace the various threads of computer architecture and software framework research and to observe how these threads contributed to the commercial framework products available today.

  2. An Agent-Based Optimization Framework for Engineered Complex Adaptive Systems with Application to Demand Response in Electricity Markets

    NASA Astrophysics Data System (ADS)

    Haghnevis, Moeed

    The main objective of this research is to develop an integrated method to study emergent behavior and consequences of evolution and adaptation in engineered complex adaptive systems (ECASs). A multi-layer conceptual framework and modeling approach including behavioral and structural aspects is provided to describe the structure of a class of engineered complex systems and predict their future adaptive patterns. The approach allows the examination of complexity in the structure and the behavior of components as a result of their connections and in relation to their environment. This research describes and uses the major differences between natural complex adaptive systems (CASs) and artificial/engineered CASs to build a framework and platform for ECASs. While this framework focuses on the critical factors of an engineered system, it also enables one to synthetically employ engineering and mathematical models to analyze and measure complexity in such systems. In this way, concepts of complex systems science are adapted to management science and system of systems engineering. In particular, an integrated consumer-based optimization and agent-based modeling (ABM) platform is presented that enables managers to predict and partially control patterns of behaviors in ECASs. Demonstrated on the U.S. electricity markets, ABM is integrated with normative and subjective decision behavior recommended by the U.S. Department of Energy (DOE) and Federal Energy Regulatory Commission (FERC). The approach integrates social networks, social science, complexity theory, and diffusion theory. Furthermore, it makes a unique and significant contribution by exploring and representing concrete managerial insights for ECASs and offering new optimized actions and modeling paradigms in agent-based simulation.
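
    A toy agent-based sketch of the demand-response application mentioned in this record: consumer agents individually decide whether to curtail load at a given price, and the aggregate demand curve emerges from those local rules. The thresholds, prices, and curtailment rule below are invented for illustration, not the dissertation's model.

      import random

      random.seed(1)

      class Consumer:
          def __init__(self):
              self.base_load_kw = random.uniform(1.0, 3.0)
              self.price_threshold = random.uniform(0.10, 0.30)   # $/kWh at which the agent curtails

          def demand(self, price):
              """Agent rule: halve consumption when the price exceeds its comfort threshold."""
              return self.base_load_kw * (0.5 if price > self.price_threshold else 1.0)

      agents = [Consumer() for _ in range(1000)]
      for price in (0.08, 0.15, 0.25, 0.40):
          total = sum(a.demand(price) for a in agents)
          print(f"price ${price:.2f}/kWh -> aggregate demand {total:,.0f} kW")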

  3. Light-optimized growth of cyanobacterial cultures: Growth phases and productivity of biomass and secreted molecules in light-limited batch growth.

    PubMed

    Clark, Ryan L; McGinley, Laura L; Purdy, Hugh M; Korosh, Travis C; Reed, Jennifer L; Root, Thatcher W; Pfleger, Brian F

    2018-03-27

    Cyanobacteria are photosynthetic microorganisms whose metabolism can be modified through genetic engineering for production of a wide variety of molecules directly from CO2, light, and nutrients. Diverse molecules have been produced in small quantities by engineered cyanobacteria to demonstrate the feasibility of photosynthetic biorefineries. Consequently, there is interest in engineering these microorganisms to increase titer and productivity to meet industrial metrics. Unfortunately, differing experimental conditions and cultivation techniques confound comparisons of strains and metabolic engineering strategies. In this work, we discuss the factors governing photoautotrophic growth and demonstrate nutritionally replete conditions in which a model cyanobacterium can be grown to stationary phase with light as the sole limiting substrate. We introduce a mathematical framework for understanding the dynamics of growth and product secretion in light-limited cyanobacterial cultures. Using this framework, we demonstrate how cyanobacterial growth in differing experimental systems can be easily scaled by the volumetric photon delivery rate using the model organisms Synechococcus sp. strain PCC7002 and Synechococcus elongatus strain UTEX2973. We use this framework to predict scaled-up growth and product secretion in 1-L photobioreactors of two strains of Synechococcus PCC7002 engineered for production of L-lactate or L-lysine. The analytical framework developed in this work serves as a guide for future metabolic engineering studies of cyanobacteria to allow better comparison of experiments performed in different experimental systems and to further investigate the dynamics of growth and product secretion. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
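
    A simplified numerical sketch of the scaling idea in this record: once a batch culture is light-limited, biomass accumulates roughly linearly at a rate set by the volumetric photon delivery rate, so two vessels with different geometry but the same photons delivered per litre follow the same predicted trajectory. The yield coefficient, ceiling, and delivery rate below are placeholders, not values from the study.

      def biomass_trajectory(photon_rate_mmol_per_L_h, yield_g_per_mmol=0.0005,
                             x0_g_per_L=0.05, x_max_g_per_L=5.0, hours=120, dt=1.0):
          """Light-limited (linear) biomass accumulation up to a stationary-phase ceiling."""
          t, x, series = 0.0, x0_g_per_L, []
          while t <= hours:
              series.append((t, round(x, 3)))
              x = min(x + yield_g_per_mmol * photon_rate_mmol_per_L_h * dt, x_max_g_per_L)
              t += dt
          return series

      # Same volumetric photon delivery rate -> same predicted trajectory, regardless of vessel.
      small_flask = biomass_trajectory(photon_rate_mmol_per_L_h=50.0)
      photobioreactor = biomass_trajectory(photon_rate_mmol_per_L_h=50.0)
      print(small_flask[-1], photobioreactor[-1])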

  4. Active stability augmentation of large space structures: A stochastic control problem

    NASA Technical Reports Server (NTRS)

    Balakrishnan, A. V.

    1987-01-01

    A problem in SCOLE is that of slewing an offset antenna on a long flexible beam-like truss attached to the space shuttle, with rather stringent pointing accuracy requirements. The relevant methodology aspects of robust feedback-control design for stability augmentation of the beam using on-board sensors are examined. The problem is framed as a stochastic control problem: boundary control of a distributed parameter system described by partial differential equations. While the framework is mathematical, the emphasis is still on an engineering solution. An abstract mathematical formulation is developed as a nonlinear wave equation in a Hilbert space. The system is shown to be controllable, and a feedback control law is developed that is robust in the sense that it does not require quantitative knowledge of system parameters. The stochastic control problem that arises in instrumenting this law using appropriate sensors is treated. Using an engineering first approximation that is valid for small damping, formulas for the optimal choice of the control gain are developed.
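
    The abstract notes a feedback law that is robust in the sense of not requiring quantitative knowledge of the system parameters; a standard example of such a law is collocated rate (velocity) feedback. The sketch below applies it to a single flexural mode with assumed, illustrative parameters (modal mass, frequency, gain); it is not the SCOLE formulation itself.

        # Illustrative collocated rate-feedback damping of one flexible mode.
        # Modal mass, frequency and gain are assumed for demonstration only.
        import math

        m = 10.0                   # modal mass, kg (assumed)
        w = 2.0 * math.pi * 0.3    # modal frequency, rad/s (assumed 0.3 Hz beam mode)
        k = m * w * w              # modal stiffness
        g = 8.0                    # rate-feedback gain (assumed); u = -g * velocity

        x, v, dt = 0.05, 0.0, 0.001   # initial modal displacement, velocity, time step
        for _ in range(int(60.0 / dt)):
            u = -g * v                  # control force: needs only a rate sensor;
            a = (-k * x + u) / m        # stability does not depend on knowing m or k
            v += a * dt
            x += v * dt
        zeta = g / (2.0 * m * w)        # damping ratio added by the feedback
        print(f"added damping ratio ~ {zeta:.3f}, displacement after 60 s: {x:.2e} m")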

  5. Biofunctionalized Plants as Diverse Biomaterials for Human Cell Culture.

    PubMed

    Fontana, Gianluca; Gershlak, Joshua; Adamski, Michal; Lee, Jae-Sung; Matsumoto, Shion; Le, Hau D; Binder, Bernard; Wirth, John; Gaudette, Glenn; Murphy, William L

    2017-04-01

    The commercial success of tissue engineering products requires efficacy, cost effectiveness, and the possibility of scale-up. Advances in tissue engineering require increased sophistication in the design of biomaterials, often challenging the current manufacturing techniques. Interestingly, several of the properties that are desirable for biomaterial design are embodied in the structure and function of plants. This study demonstrates that decellularized plant tissues can be used as adaptable scaffolds for culture of human cells. With a simple biofunctionalization technique, it is possible to enable adhesion of human cells on a diverse set of plant tissues. The elevated hydrophilicity and excellent water transport abilities of plant tissues allow cell expansion over prolonged periods of culture. Moreover, cells are able to conform to the microstructure of the plant frameworks, resulting in cell alignment and pattern registration. In conclusion, the current study shows that it is feasible to use plant tissues as an alternative feedstock of scaffolds for mammalian cells. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Industrial Complex for Solid Radwaste Management at Chernobyl Nuclear Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahner, S.; Fomin, V. V.

    2002-02-26

    In the framework of the preparation for the decommissioning of the Chernobyl Nuclear Power Plant (ChNPP), an Industrial Complex for Solid Radwaste Management (ICSRM) will be built under the EC TACIS Program in the vicinity of ChNPP. The paper will present the proposed concepts and their integration into existing buildings and installations. Further, the paper will consider the safety cases, as well as the integration of Western and Ukrainian organizations into a cohesive project team and the requirement to guarantee the fulfillment of both Western standards and Ukrainian regulations and licensing requirements. The paper will provide information on the status of the interim design and the effects of value engineering on the output of the basic design phase. The paper therefore summarizes the design results of the involved design engineers of the Design and Process Providers BNFL (LOT 1), RWE NUKEM GmbH (LOT 2 and General) and INITEC (LOT 3).

  7. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.

  8. Women Engineers and the Influence of Childhood Technologic Environment

    ERIC Educational Resources Information Center

    Mazdeh, Shahla

    2011-01-01

    This phenomenological multi-case study investigated the influence of women engineers' childhood exposure to engineering concepts on their preparation for an engineering profession. An ecologic model (Bronfenbrenner, 1979) was used as the conceptual framework of this research. Twelve professional women engineers from various age and…

  9. Conceptual Framework to Help Promote Retention and Transfer in the Introductory Chemical Engineering Course

    ERIC Educational Resources Information Center

    Hanyak, Michael E., Jr.

    2015-01-01

    In an introductory chemical engineering course, the conceptual framework of a holistic problem-solving methodology in conjunction with a problem-based learning approach has been shown to create a learning environment that nurtures deep learning rather than surface learning. Based on exam scores, student grades are either the same or better than…

  10. Clinical engineering and risk management in healthcare technological process using architecture framework.

    PubMed

    Signori, Marcos R; Garcia, Renato

    2010-01-01

    This paper presents a model that helps Clinical Engineering deal with risk management in the healthcare technological process. The healthcare technological setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An enterprise architecture framework - MODAF (Ministry of Defence Architecture Framework) - was used to model this process for risk management. A new model was thus created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve the decision-making process of Clinical Engineering for risk management in the healthcare technological process.

  11. Engineering Design Skills Coverage in K-12 Engineering Program Curriculum Materials in the USA

    ERIC Educational Resources Information Center

    Chabalengula, Vivien M.; Mumba, Frackson

    2017-01-01

    The current "K-12 Science Education framework" and "Next Generation Science Standards" (NGSS) in the United States emphasise the integration of engineering design in science instruction to promote scientific literacy and engineering design skills among students. As such, many engineering education programmes have developed…

  12. Engineering Design Tools for Shape Memory Alloy Actuators: CASMART Collaborative Best Practices and Case Studies

    NASA Technical Reports Server (NTRS)

    Wheeler, Robert W.; Benafan, Othmane; Gao, Xiujie; Calkins, Frederick T; Ghanbari, Zahra; Hommer, Garrison; Lagoudas, Dimitris; Petersen, Andrew; Pless, Jennifer M.; Stebner, Aaron P.; hide

    2016-01-01

    The primary goal of the Consortium for the Advancement of Shape Memory Alloy Research and Technology (CASMART) is to enable the design of revolutionary applications based on shape memory alloy (SMA) technology. In order to help realize this goal and reduce the development time and required experience for the fabrication of SMA actuation systems, several modeling tools have been developed for common actuator types and are discussed herein along with case studies, which highlight the capabilities and limitations of these tools. Due to their ability to sustain high stresses and recover large deformations, SMAs have many potential applications as reliable, lightweight, solid-state actuators. Their advantage over classical actuators can also be further improved when the actuator geometry is modified to fit the specific application. In this paper, three common actuator designs are studied: wires, which are lightweight, low-profile, and easily implemented; springs, which offer actuation strokes upwards of 200 at reduced mechanical loads; and torque tubes, which can provide large actuation forces in small volumes and develop a repeatable zero-load actuation response (known as the two-way shape memory effect). The modeling frameworks, which have been implemented in the design tools, are developed for each of these frequently used SMA actuator types. In order to demonstrate the versatility and flexibility of the presented design tools, as well as validate their modeling framework, several design challenges were completed. These case studies include the design and development of an active hinge for the deployment of a solar array or foldable space structure, an adaptive solar array deployment and positioning system, a passive air temperature controller for regulating flow temperatures inside of a jet engine, and a redesign of the Corvette active hatch, which allows for pressure equalization of the car interior. For each of the presented case studies, a prototype or proof-of-concept was fabricated and the experimental results and lessons learned are discussed. This analysis presents a collection of CASMART collaborative best practices in order to allow readers to utilize the available design tools and understand their modeling principles. These design tools, which are based on engineering models, can provide first-order optimal designs and are a basic and efficient method for either demonstrating design feasibility or refining design parameters. Although the design and integration of an SMA-based actuation system always requires application- and environment-specific engineering considerations, common modeling tools can significantly reduce the investment required for actuation system development and provide valuable engineering insight.
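
    As a hedged first-order illustration of the kind of sizing such design tools automate, the snippet below estimates the force and stroke of a single SMA wire actuator from its diameter, length, working stress, and recoverable strain; the numerical values are generic textbook-level assumptions, not CASMART tool outputs.

        # First-order sizing of an SMA wire actuator (illustrative values only).
        import math

        diameter = 0.5e-3           # m, wire diameter (assumed)
        length = 0.30               # m, wire length (assumed)
        working_stress = 200e6      # Pa, allowable recovery stress (assumed)
        recoverable_strain = 0.03   # 3% usable transformation strain (assumed)

        area = math.pi * diameter**2 / 4.0
        force = working_stress * area          # N, force the wire can exert
        stroke = recoverable_strain * length   # m, free recovery stroke

        print(f"cross-section: {area*1e6:.3f} mm^2")
        print(f"estimated force: {force:.1f} N")
        print(f"estimated stroke: {stroke*1e3:.1f} mm")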

  13. Improving TOGAF ADM 9.1 Migration Planning Phase by ITIL V3 Service Transition

    NASA Astrophysics Data System (ADS)

    Hanum Harani, Nisa; Akhmad Arman, Arry; Maulana Awangga, Rolly

    2018-04-01

    Planning a business transformation that involves new technology requires a transition and migration planning process, and planning the system migration activity is the most important part. The migration process includes complex elements such as business re-engineering, transition scheme mapping, data transformation, application development, individual involvement by computer, and trial interaction. TOGAF ADM is a framework and method for enterprise architecture implementation, and it provides guidance for architecture and migration planning. The planning includes an implementation solution, in this case an IT solution, but once the solution moves into IT operational planning, TOGAF does not cover it. This paper presents a new framework model that details the transition process by integrating TOGAF and ITIL. We evaluated our model in a field study at a private university.

  14. Towards a Decision Support System for Space Flight Operations

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Hogle, Charles; Ruszkowski, James

    2013-01-01

    The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model-based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of a spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. A discrepancy between the anticipated and the observed system behavior may be due to the internal processing of tasks, or to external factors such as changes in program requirements or conditions associated with other organizations that are outside of MOD. The paper provides a roadmap for the three increments of this vision. These increments include (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management, and (3) re-planning and automated execution. Each of these increments provides value independently, and some also enable the building of a subsequent increment.

  15. Comparing Freshman and doctoral engineering students in design: mapping with a descriptive framework

    NASA Astrophysics Data System (ADS)

    Carmona Marques, P.

    2017-11-01

    This paper reports the results of a study of engineering students' approaches to an open-ended design problem. To carry this out, sketches and interviews were collected from 9 freshman (first-year) and 10 doctoral engineering students as they designed solutions for orange squeezers. Sketches and interviews were analysed and mapped with a descriptive 'ideation framework' (IF) of the design process, to document and compare their design creativity (Carmona Marques, P., A. Silva, E. Henriques, and C. Magee. 2014. "A Descriptive Framework of the Design Process from a Dual Cognitive Engineering Perspective." International Journal of Design Creativity and Innovation 2 (3): 142-164). The results show that the designers worked in a manner largely consistent with the IF for generalisation and specialisation loops. Doctoral students produced more alternative solutions during the ideation process and, compared to the freshmen, made greater use of the generalisation loop of the IF, working at higher levels of abstraction. The iterative nature of design is highlighted throughout this study - a potential contribution to decreasing the gap between both groups in engineering education.

  16. Engineering of In Vitro 3D Capillary Beds by Self-Directed Angiogenic Sprouting

    PubMed Central

    Chan, Juliana M.; Zervantonakis, Ioannis K.; Rimchala, Tharathorn; Polacheck, William J.; Whisler, Jordan; Kamm, Roger D.

    2012-01-01

    In recent years, microfluidic systems have been used to study fundamental aspects of angiogenesis through the patterning of single-layered, linear or geometric vascular channels. In vivo, however, capillaries exist in complex, three-dimensional (3D) networks, and angiogenic sprouting occurs with a degree of unpredictability in all x,y,z planes. The ability to generate capillary beds in vitro that can support thick, biological tissues remains a key challenge to the regeneration of vital organs. Here, we report the engineering of 3D capillary beds in an in vitro microfluidic platform that is comprised of a biocompatible collagen I gel supported by a mechanical framework of alginate beads. The engineered vessels have patent lumens, form robust ∼1.5 mm capillary networks across the devices, and support the perfusion of 1 µm fluorescent beads through them. In addition, the alginate beads offer a modular method to encapsulate and co-culture cells that either promote angiogenesis or require perfusion for cell viability in engineered tissue constructs. This laboratory-constructed vascular supply may be clinically significant for the engineering of capillary beds and higher order biological tissues in a scalable and modular manner. PMID:23226527

  17. Explore-create-share study: An evaluation of teachers as curriculum innovators in engineering education

    NASA Astrophysics Data System (ADS)

    Berry, Ayora

    The purpose of this study was to investigate the effects of a curriculum design-based (CDB) professional development model on K-12 teachers' capacity to integrate engineering education in the classroom. This teacher professional development approach differs from other training programs where teachers learn how to use a standard curriculum and adopt it in their classrooms. In a CDB professional development model teachers actively design lessons, student resources, and assessments for their classroom instruction. In other science, technology, engineering and mathematics (STEM) disciplines, CDB professional development has been reported to (a) position teachers as architects of change, (b) provide a professional learning vehicle for educators to reflect on instructional practices and develop content knowledge, (c) inspire a sense of ownership in curriculum decision-making among teachers, and (d) use an instructional approach that is coherent with teachers' interests and professional goals. The CDB professional development program in this study used the Explore-Create-Share (ECS) framework as an instructional model to support teacher-led curriculum design and implementation. To evaluate the impact of the CDB professional development and associated ECS instructional model, three research studies were conducted. In each study, the participants completed a six-month CDB professional development program, the PTC STEM Certificate Program, that included sixty-two instructional contact hours. Participants learned about industry and education engineering concepts, tested engineering curricula, collaborated with K-12 educators and industry professionals, and developed project-based engineering curricula using the ECS framework. The first study evaluated the impact of the CDB professional development program on teachers' engineering knowledge, self-efficacy in designing engineering curriculum, and instructional practice in developing project-based engineering units. The study included twenty-six teachers and data was collected pre-, mid-, and post-program using teacher surveys and a curriculum analysis instrument. The second study evaluated teachers' perceptions of the ECS model as a curriculum authoring tool and the quality of the curriculum units they developed. The study included sixty-two participants and data was collected post-program using teacher surveys and a curriculum analysis instrument. The third study evaluated teachers' experiences implementing ECS units in the classroom with a focus on identifying the benefits, challenges and solutions associated with project-based engineering in the classroom. The study included thirty-one participants and data was collected using an open-ended survey instrument after teachers completed implementation of the ECS curriculum unit. Results of these three studies indicate that teachers can be prepared to integrate engineering in the classroom using a CDB professional development model. Teachers reported an increase in engineering content knowledge, improved their self-efficacy in curriculum planning, and developed high quality instructional units that were aligned to engineering design practices and STEM educational standards. The ECS instructional model was acknowledged as a valuable tool for developing and implementing engineering education in the classroom. 
Teachers reported that ECS curriculum design aligned with their teaching goals, provided a framework to integrate engineering with other subject-area concepts, and incorporated innovative teaching strategies. After implementing ECS units in the classroom, teachers reported that the ECS model engaged students in engineering design challenges that were situated in a real world context and required the application of interdisciplinary content knowledge and skills. Teachers also reported a number of challenges related to scheduling, content alignment, and access to resources. In the face of these obstacles, teachers presented a number of solutions that included optimization of one's teaching practice, being resource savvy, and adopting a growth mindset.

  18. Requirements management for Gemini Observatory: a small organization with big development projects

    NASA Astrophysics Data System (ADS)

    Close, Madeline; Serio, Andrew; Cordova, Martin; Hardie, Kayla

    2016-08-01

    Gemini Observatory is an astronomical observatory operating two premier 8m-class telescopes, one in each hemisphere. As an operational facility, Gemini spends the majority of its resources on operations; however, the observatory undertakes major development projects as well. Current projects include new facility science instruments, an operational paradigm shift to full remote operations, and new operations tools for planning, configuration and change control. Three years ago, Gemini determined that a specialized requirements management tool was needed. Over the next year, the Gemini Systems Engineering Group investigated several tools, selected one for a trial period and configured it for use. Configuration activities included definition of systems engineering processes, development of a requirements framework, and assignment of project roles to tool roles. Test projects were implemented in the tool. At the conclusion of the trial, the group determined that Gemini could meet its requirements management needs without a specialized requirements management tool, and it identified a number of lessons learned, which are described in the last major section of this paper. These lessons learned include how to conduct an organizational needs analysis prior to pursuing a tool; caveats concerning tool criteria and the selection process; the prerequisites and sequence of activities necessary to achieve an optimum configuration of the tool; the need for adequate staff resources and staff training; and a special note regarding organizations in transition and the archiving of requirements.

  19. AAL service development loom--from the idea to a marketable business model.

    PubMed

    Kriegel, Johannes; Auinger, Klemens

    2015-01-01

    The Ambient Assisted Living (AAL) market is still in an early stage of development. Previous approaches to comprehensive AAL services have mostly been supply-side driven and focused on hardware and software. Usually this type of AAL solution does not lead to sustainable success on the market. Research and development increasingly focus on demand and customer requirements in addition to the social and legal framework. The question is: how can a systematic performance measurement strategy along a service development process support the market-ready design of a concrete business model for an AAL service? Within the EU-funded research project DALIA (Assistant for Daily Life Activities at Home), an iterative service development process uses an adapted Osterwalder business model canvas. The application of a performance measurement index (PMI) to support the process has been developed and tested. The result is an iterative service development model using a supporting PMI. The PMI framework is developed throughout the engineering of a virtual assistant (AVATAR) as a modular interface to connect informal carers with necessary and useful services. Future research should seek to ensure that the PMI enables meaningful transparency regarding targeting (e.g. an innovative AAL service), design (e.g. a functional hybrid AAL service) and implementation (e.g. marketable AAL support services). To this end, further testing in practice is required. The aim must be to develop a weighted PMI in the context of further research, which supports both the service engineering and the subsequent service management process.
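
    A minimal sketch of how a weighted performance measurement index could be computed from individual indicator scores; the indicator names, scores, and weights below are hypothetical placeholders, not the DALIA PMI itself.

        # Hypothetical weighted performance measurement index (PMI) sketch.
        indicators = {
            # name: (score on a 0-1 scale, weight) -- all values are placeholders
            "user_acceptance":       (0.70, 0.30),
            "technical_maturity":    (0.55, 0.25),
            "market_readiness":      (0.40, 0.25),
            "regulatory_compliance": (0.80, 0.20),
        }

        total_weight = sum(weight for _, weight in indicators.values())
        pmi = sum(score * weight for score, weight in indicators.values()) / total_weight

        for name, (score, weight) in indicators.items():
            print(f"{name:22s} score={score:.2f} weight={weight:.2f}")
        print(f"weighted PMI = {pmi:.2f}")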

  20. Teaching for adaptive expertise in biomedical engineering ethics.

    PubMed

    Martin, Taylor; Rayne, Karen; Kemp, Nate J; Hart, Jack; Diller, Kenneth R

    2005-04-01

    This paper considers an approach to teaching ethics in bioengineering based on the How People Learn (HPL) framework. Curricula based on this framework have been effective in mathematics and science instruction from the kindergarten to the college levels. This framework is well suited to teaching bioengineering ethics because it helps learners develop "adaptive expertise". Adaptive expertise refers to the ability to use knowledge and experience in a domain to learn in unanticipated situations. It differs from routine expertise, which requires using knowledge appropriately to solve routine problems. Adaptive expertise is an important educational objective for bioengineers because the regulations and knowledge base in the discipline are likely to change significantly over the course of their careers. This study compares the performance of undergraduate bioengineering students who learned about ethics for stem cell research using the HPL method of instruction to the performance of students who learned following a standard lecture sequence. Both groups learned the factual material equally well, but the HPL group was more prepared to act adaptively when presented with a novel situation.

  1. The power of simplicity: a fast-and-frugal heuristics approach to performance science.

    PubMed

    Raab, Markus; Gigerenzer, Gerd

    2015-01-01

    Performance science is a fairly new multidisciplinary field that integrates performance domains such as sports, medicine, business, and the arts. To give its many branches a structure and its research a direction, it requires a theoretical framework. We demonstrate the applications of this framework with examples from sport and medicine. Because performance science deals mainly with situations of uncertainty rather than known risks, the needed framework can be provided by the fast-and-frugal heuristics approach. According to this approach, experts learn to rely on heuristics in an adaptive way in order to make accurate decisions. We investigate the adaptive use of heuristics in three ways: the descriptive study of the heuristics in the cognitive "adaptive toolbox;" the prescriptive study of their "ecological rationality," that is, the characterization of the situations in which a given heuristic works; and the engineering study of "intuitive design," that is, the design of transparent aids for making better decisions.

  2. The power of simplicity: a fast-and-frugal heuristics approach to performance science

    PubMed Central

    Raab, Markus; Gigerenzer, Gerd

    2015-01-01

    Performance science is a fairly new multidisciplinary field that integrates performance domains such as sports, medicine, business, and the arts. To give its many branches a structure and its research a direction, it requires a theoretical framework. We demonstrate the applications of this framework with examples from sport and medicine. Because performance science deals mainly with situations of uncertainty rather than known risks, the needed framework can be provided by the fast-and-frugal heuristics approach. According to this approach, experts learn to rely on heuristics in an adaptive way in order to make accurate decisions. We investigate the adaptive use of heuristics in three ways: the descriptive study of the heuristics in the cognitive “adaptive toolbox;” the prescriptive study of their “ecological rationality,” that is, the characterization of the situations in which a given heuristic works; and the engineering study of “intuitive design,” that is, the design of transparent aids for making better decisions. PMID:26579051

  3. Emergent mechanics of biological structures

    PubMed Central

    Dumont, Sophie; Prakash, Manu

    2014-01-01

    Mechanical force organizes life at all scales, from molecules to cells and tissues. Although we have made remarkable progress unraveling the mechanics of life's individual building blocks, our understanding of how they give rise to the mechanics of larger-scale biological structures is still poor. Unlike the engineered macroscopic structures that we commonly build, biological structures are dynamic and self-organize: they sculpt themselves and change their own architecture, and they have structural building blocks that generate force and constantly come on and off. A description of such structures defies current traditional mechanical frameworks. It requires approaches that account for active force-generating parts and for the formation of spatial and temporal patterns utilizing a diverse array of building blocks. In this Perspective, we term this framework “emergent mechanics.” Through examples at molecular, cellular, and tissue scales, we highlight challenges and opportunities in quantitatively understanding the emergent mechanics of biological structures and the need for new conceptual frameworks and experimental tools on the way ahead. PMID:25368421

  4. The Diamond Beamline Controls and Data Acquisition Software Architecture

    NASA Astrophysics Data System (ADS)

    Rees, N.

    2010-06-01

    The software for the Diamond Light Source beamlines[1] is based on two complementary software frameworks: low level control is provided by the Experimental Physics and Industrial Control System (EPICS) framework[2][3] and the high level user interface is provided by the Java based Generic Data Acquisition or GDA[4][5]. EPICS provides a widely used, robust, generic interface across a wide range of hardware where the user interfaces are focused on serving the needs of engineers and beamline scientists to obtain detailed low level views of all aspects of the beamline control systems. The GDA system provides a high-level system that combines an understanding of scientific concepts, such as reciprocal lattice coordinates, a flexible python syntax scripting interface for the scientific user to control their data acquisition, and graphical user interfaces where necessary. This paper describes the beamline software architecture in more detail, highlighting how these complementary frameworks provide a flexible system that can accommodate a wide range of requirements.

  5. Production of biofuels and biochemicals: in need of an ORACLE.

    PubMed

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

    The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.
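
    To make the notion of control coefficients under uncertainty concrete, here is a hedged sketch for a toy two-step pathway (reversible first step, irreversible second step): flux control coefficients are estimated by finite-difference perturbation of each enzyme level, while uncertain kinetic parameters are sampled Monte Carlo-style. The pathway, rate laws, and parameter ranges are illustrative assumptions, not the ORACLE method itself.

        # Toy two-enzyme pathway: S -(e1)-> X -(e2)-> P, with v1 = e1*(k1*S - km1*X)
        # and v2 = e2*k2*X.  The steady-state flux has a closed form, so flux control
        # coefficients C_i = (e_i/J) * dJ/de_i can be estimated by finite differences.
        # All kinetic values and their uncertainty ranges are placeholders.
        import random

        S = 1.0  # external substrate concentration, held constant

        def steady_state_flux(e1, e2, k1, km1, k2):
            x = e1 * k1 * S / (e1 * km1 + e2 * k2)   # from v1 = v2 at steady state
            return e2 * k2 * x

        def control_coefficients(e1, e2, k1, km1, k2, h=1e-6):
            J = steady_state_flux(e1, e2, k1, km1, k2)
            c1 = (steady_state_flux(e1 * (1 + h), e2, k1, km1, k2) - J) / (J * h)
            c2 = (steady_state_flux(e1, e2 * (1 + h), k1, km1, k2) - J) / (J * h)
            return c1, c2

        random.seed(0)
        samples = []
        for _ in range(1000):
            k1 = random.uniform(1.0, 5.0)    # uncertain kinetic parameters (assumed ranges)
            km1 = random.uniform(0.1, 1.0)
            k2 = random.uniform(1.0, 5.0)
            samples.append(control_coefficients(1.0, 1.0, k1, km1, k2))

        mean_c1 = sum(c1 for c1, _ in samples) / len(samples)
        mean_c2 = sum(c2 for _, c2 in samples) / len(samples)
        # The summation theorem (C1 + C2 = 1) provides a built-in sanity check.
        print(f"mean C1 = {mean_c1:.2f}, mean C2 = {mean_c2:.2f}, sum ~ {mean_c1 + mean_c2:.2f}")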

  6. Waste IPSC : Thermal-Hydrologic-Chemical-Mechanical (THCM) modeling and simulation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, Geoffrey A.; Wang, Yifeng; Arguello, Jose Guadalupe, Jr.

    2010-10-01

    The Waste IPSC objective is to develop an integrated suite of high performance computing capabilities to simulate radionuclide movement through the engineered components and geosphere of a radioactive waste storage or disposal system: (1) with robust thermal-hydrologic-chemical-mechanical (THCM) coupling; (2) for a range of disposal system alternatives (concepts, waste form types, engineered designs, geologic settings); (3) for long time scales and associated large uncertainties; (4) at multiple model fidelities (sub-continuum, high-fidelity continuum, PA); and (5) in accordance with V&V and software quality requirements. THCM Modeling collaborates with: (1) other Waste IPSC activities: Sub-Continuum Processes (and FMM), Frameworks and Infrastructure (and VU, ECT, and CT); (2) the Waste Form Campaign; (3) the Used Fuel Disposition (UFD) Campaign; and (4) ASCEM.

  7. Design of composite scaffolds and three-dimensional shape analysis for tissue-engineered ear

    PubMed Central

    Cervantes, Thomas M.; Bassett, Erik K.; Tseng, Alan; Kimura, Anya; Roscioli, Nick; Randolph, Mark A.; Vacanti, Joseph P.; Hadlock, Theresa A.; Gupta, Rajiv; Pomerantseva, Irina; Sundback, Cathryn A.

    2013-01-01

    Engineered cartilage is a promising option for auricular reconstruction. We have previously demonstrated that a titanium wire framework within a composite collagen ear-shaped scaffold helped to maintain the gross dimensions of the engineered ear after implantation, resisting the deformation forces encountered during neocartilage maturation and wound healing. The ear geometry was redesigned to achieve a more accurate aesthetic result when implanted subcutaneously in a nude rat model. A non-invasive method was developed to assess size and shape changes of the engineered ear in three dimensions. Computer models of the titanium framework were obtained from CT scans before and after implantation. Several parameters were measured including the overall length, width and depth, the minimum intrahelical distance and overall curvature values for each beam section within the framework. Local curvature values were measured to gain understanding of the bending forces experienced by the framework structure in situ. Length and width changed by less than 2%, whereas the depth decreased by approximately 8% and the minimum intrahelical distance changed by approximately 12%. Overall curvature changes identified regions most susceptible to deformation. Eighty-nine per cent of local curvature measurements experienced a bending moment less than 50 µN-m owing to deformation forces during implantation. These quantitative shape analysis results have identified opportunities to improve shape fidelity of engineered ear constructs. PMID:23904585

  8. Design of composite scaffolds and three-dimensional shape analysis for tissue-engineered ear.

    PubMed

    Cervantes, Thomas M; Bassett, Erik K; Tseng, Alan; Kimura, Anya; Roscioli, Nick; Randolph, Mark A; Vacanti, Joseph P; Hadlock, Theresa A; Gupta, Rajiv; Pomerantseva, Irina; Sundback, Cathryn A

    2013-10-06

    Engineered cartilage is a promising option for auricular reconstruction. We have previously demonstrated that a titanium wire framework within a composite collagen ear-shaped scaffold helped to maintain the gross dimensions of the engineered ear after implantation, resisting the deformation forces encountered during neocartilage maturation and wound healing. The ear geometry was redesigned to achieve a more accurate aesthetic result when implanted subcutaneously in a nude rat model. A non-invasive method was developed to assess size and shape changes of the engineered ear in three dimensions. Computer models of the titanium framework were obtained from CT scans before and after implantation. Several parameters were measured including the overall length, width and depth, the minimum intrahelical distance and overall curvature values for each beam section within the framework. Local curvature values were measured to gain understanding of the bending forces experienced by the framework structure in situ. Length and width changed by less than 2%, whereas the depth decreased by approximately 8% and the minimum intrahelical distance changed by approximately 12%. Overall curvature changes identified regions most susceptible to deformation. Eighty-nine per cent of local curvature measurements experienced a bending moment less than 50 µN-m owing to deformation forces during implantation. These quantitative shape analysis results have identified opportunities to improve shape fidelity of engineered ear constructs.
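
    For context on the bending moments reported above, the following sketch applies the standard Euler-Bernoulli relation M = E·I·Δκ to a round titanium wire; the wire diameter and Young's modulus are assumed values for illustration only, not the dimensions used in the study.

        # Euler-Bernoulli estimate of the bending moment implied by a curvature change
        # in a round titanium wire.  Diameter and modulus are assumed, not from the paper.
        import math

        E = 110e9                  # Pa, approximate Young's modulus of titanium
        d = 0.4e-3                 # m, wire diameter (assumed)
        I = math.pi * d**4 / 64.0  # second moment of area of a round section

        delta_kappa = 0.35         # 1/m, example change in local curvature
        moment = E * I * delta_kappa

        print(f"EI = {E*I:.3e} N*m^2")
        print(f"bending moment for delta_kappa = {delta_kappa} 1/m: {moment*1e6:.1f} uN*m")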

  9. Third cycle university studies in Europe in the field of agricultural engineering and in the emerging discipline of biosystems engineering.

    PubMed

    Ayuga, F; Briassoulis, D; Aguado, P; Farkas, I; Griepentrog, H; Lorencowicz, E

    2010-01-01

    The main objective of the European Thematic Network entitled 'Education and Research in Agricultural for Biosystems Engineering in Europe (ERABEE-TN)' is to initiate and contribute to the structural development and the quality assurance of the emerging discipline of Biosystems Engineering in Europe. ERABEE is co-financed by the European Community in the framework of the LLP Programme. The partnership consists of 35 participants from 27 Erasmus countries, of which 33 are Higher Education Area Institutions (EDU) and 2 are Student Associations (ASS). 13 Erasmus participants (e.g. Thematic Networks, Professional Associations, and Institutions from Brazil, Croatia, Russia and Serbia) are also involved in the Thematic Network through synergies. To date, very few Biosystems Engineering programmes exist in Europe, and those that have been initiated are at a very early stage of development. The innovative and novel goal of the Thematic Network is to promote this critical transition, which requires major restructuring in Europe, exploiting along this direction the outcomes accomplished by its predecessor, the USAEE-TN (University Studies in Agricultural Engineering in Europe). It also aims at enhancing the compatibility among the new programmes of Biosystems Engineering, aiding their recognition and accreditation at European and international level and facilitating greater mobility of skilled personnel, researchers and students. One of the technical objectives of ERABEE deals with mapping and promoting third-cycle studies (including European PhDs) and supporting the integration of research at the 1st and 2nd cycles of European Biosystems Engineering university studies. During the winter 2008 - spring 2009 period, members of ERABEE conducted a survey on the contemporary status of doctoral studies in Europe and on a possible scheme for promoting cooperation and synergies in the framework of third-cycle studies and the European Doctorate in Biosystems Engineering. This paper presents the results of the survey. The legal regulations and their extent in the different countries concerning the third cycle are presented, along with the current structure of third-cycle studies. The evolution and adaptation to the new EHEA in each country is also considered. Information was also gathered on the emerging topics of the Biosystems Engineering field and how these topics could be addressed by the new doctoral programmes at the European level.

  10. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and an existing BPEL execution engine is adopted for execution. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and the influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
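
    As a loose illustration (in Python rather than BPEL) of the abstract-model-to-concrete-workflow idea, the sketch below chains retrieval, processing, and presentation steps into a reusable pipeline; the step names, payloads, and URL are hypothetical placeholders, not the actual Sensor Web service interfaces.

        # Hypothetical sketch of a reusable geo-processing workflow as a chain of steps.
        from typing import Callable, List

        Step = Callable[[dict], dict]

        def retrieve_sensor_data(ctx: dict) -> dict:
            ctx["raw"] = f"raw scene from {ctx['sensor']}"             # stands in for a data service call
            return ctx

        def classify_fire(ctx: dict) -> dict:
            ctx["product"] = f"fire classification of ({ctx['raw']})"  # stands in for a processing service
            return ctx

        def publish_map(ctx: dict) -> dict:
            ctx["map_url"] = "http://example.invalid/maps/fire"        # stands in for a presentation service
            return ctx

        def run_workflow(steps: List[Step], ctx: dict) -> dict:
            # The "abstract model" is the ordered list of steps; instantiating it
            # means binding each step to a concrete service before execution.
            for step in steps:
                ctx = step(ctx)
            return ctx

        result = run_workflow([retrieve_sensor_data, classify_fire, publish_map], {"sensor": "EO-1"})
        print(result["product"], "->", result["map_url"])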

  11. A novel medical image data-based multi-physics simulation platform for computational life sciences.

    PubMed

    Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels

    2013-04-06

    Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.

  12. Becoming an Engineer: Toward a Three Dimensional View of Engineering Learning. Research Brief

    ERIC Educational Resources Information Center

    Stevens, Reed; O'Connor, Kevin; Garrison, Lari; Jocuns, Andrew; Amos, Daniel M.

    2008-01-01

    In this paper, the authors develop an analytical framework referred to as "Becoming an Engineer" that focuses upon changes occurring over time as students traverse their undergraduate educations in engineering. This paper discusses three conceptual dimensions used to follow the engineering students' educational pathways: the development of…

  13. What Engineering Sophomores Know and Would Like to Know about Engineering Information Sources and Access

    ERIC Educational Resources Information Center

    Ercegovac, Zorana

    2009-01-01

    This exploratory study reports on what engineering undergraduate students know and would like to learn about engineering information sources and access. Responses were obtained on selected performance measures within the framework of "Information Literacy Standards for Science and Engineering/Technology" (ACRL/ALA/STS 2006). The results are based…

  14. "Scholarship of Impact" Framework in Engineering Education Research: Learnings from the Institute for Scholarship on Engineering Education. Research Brief

    ERIC Educational Resources Information Center

    Lande, Micah; Adams, Robin; Chen, Helen; Currano, Becky; Leifer, Larry

    2007-01-01

    The Institute for Scholarship on Engineering Education (ISEE) program is one element of the NSF-sponsored Center for the Advancement of Engineering Education (CAEE). Its primary goal is to build a community of engineering education scholars who can think and work across disciplines with an ultimate aim of improving the engineering student…

  15. Socialization Experiences Resulting from Doctoral Engineering Teaching Assistantships

    ERIC Educational Resources Information Center

    Mena, Irene B.; Diefes-Dux, Heidi A.; Capobianco, Brenda M.

    2013-01-01

    The purpose of this study was to explore and characterize the types of socialization experiences that result from engineering teaching assistantships. Using situated learning and communities of practice as the theoretical framework, this study highlights the experiences of 28 engineering doctoral students who worked as engineering teaching…

  16. An Object-Oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2009-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn Research Center (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within the NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model which provides component flow data such as airflows, temperatures, and pressures, etc., that are required for sizing the components and weight calculations. The tighter integration between the NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It also would facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results as should be the case.
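
    To illustrate the kind of structure an object-oriented conversion enables (this is a generic sketch, not the WATE++ class hierarchy), each engine component can own its sizing inputs and report its own weight, with an engine object rolling the totals up; the class names and weight correlations below are invented for illustration.

        # Generic object-oriented weight roll-up sketch; correlations are placeholders.
        from abc import ABC, abstractmethod

        class Component(ABC):
            def __init__(self, name: str):
                self.name = name

            @abstractmethod
            def weight(self) -> float:
                """Return component weight in kg."""

        class Fan(Component):
            def __init__(self, diameter_m: float):
                super().__init__("fan")
                self.diameter_m = diameter_m

            def weight(self) -> float:
                return 135.0 * self.diameter_m ** 2.7              # placeholder correlation

        class Compressor(Component):
            def __init__(self, stages: int, mass_flow_kg_s: float):
                super().__init__("compressor")
                self.stages = stages
                self.mass_flow_kg_s = mass_flow_kg_s

            def weight(self) -> float:
                return 8.0 * self.stages * self.mass_flow_kg_s ** 0.8   # placeholder correlation

        class Engine:
            def __init__(self, components):
                self.components = components

            def total_weight(self) -> float:
                return sum(c.weight() for c in self.components)

        engine = Engine([Fan(diameter_m=3.1), Compressor(stages=10, mass_flow_kg_s=70.0)])
        for c in engine.components:
            print(f"{c.name:12s} {c.weight():8.1f} kg")
        print(f"{'total':12s} {engine.total_weight():8.1f} kg")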

  17. Towards a Framework to Improve the Quality of Teaching and Learning: Consciousness and Validation in Computer Engineering Science, UCT

    ERIC Educational Resources Information Center

    Lévano, Marcos; Albornoz, Andrea

    2016-01-01

    This paper aims to propose a framework to improve the quality in teaching and learning in order to develop good practices to train professionals in the career of computer engineering science. To demonstrate the progress and achievements, our work is based on two principles for the formation of professionals, one based on the model of learning…

  18. Bridging the Gap between Engineering Design and PK-12 Curriculum Development through the Use of the STEM Education Quality Framework

    ERIC Educational Resources Information Center

    Pinnell, Margaret; Rowly, James; Preiss, Sandi; Franco, Suzanne; Blust, Rebecca; Beach, Renee

    2013-01-01

    This paper will describe a unique partnership among the Department of Teacher Education and School of Engineering at the University of Dayton (UD) and the Dayton Regional STEM Center (DRSC). This partnership resulted in the development of the STEM Education Quality Framework (SQF), a tool to guide educators in teaching, learning and refining STEM…

  19. Evaluation of Online Teacher and Student Materials for the Framework for K-12 Science Education Science and Engineering Crosscutting Concepts

    ERIC Educational Resources Information Center

    Schwab, Patrick

    2013-01-01

    The National Research Council developed and published the "Framework for K-12 Science Education," a new set of concepts that many states were planning on adopting. Part of this new endeavor included a set of science and engineering crosscutting concepts to be incorporated into science materials and activities, a first in science…

  20. Active Learning session based on Didactical Engineering framework for conceptual change in students' equilibrium and stability understanding

    NASA Astrophysics Data System (ADS)

    Canu, Michael; Duque, Mauricio; de Hosson, Cécile

    2017-01-01

    Engineering students on control courses lack a deep understanding of equilibrium and stability that are crucial concepts in this discipline. Several studies have shown that students find it difficult to understand simple familiar or academic static equilibrium cases as well as dynamic ones from mechanics even if they know the discipline's criteria and formulae. Our aim is to study the impact of a specific and innovative classroom session, containing well-chosen situations that address students' misconceptions. We propose an example of Active Learning experiment based both on the Didactical Engineering methodology and the Conceptual Fields Theory that aims at promoting a conceptual change in students. The chosen methodology allows, at the same time, a proper design of the student learning activities, an accurate monitoring of the students' rational use during the tasks and provides an internal tool for the evaluation of the session's efficiency. Although the expected starting conceptual change was detected, it would require another activity in order to be reinforced.

  1. Development of a Multi-Disciplinary Computing Environment (MDICE)

    NASA Technical Reports Server (NTRS)

    Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.

    1999-01-01

    The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal, and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment to enable coupling between various flow analysis codes at several levels of fidelity. This has been referred to as the Visual Computing Environment (VCE), and is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations. The chosen initial focus is on aeroelastic aircraft applications. The developed software is referred to as MDICE-AE, an extensible system suitable for integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.
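
    A minimal sketch of the loose (semi-implicit) coupling idea: two single-purpose "codes" exchange only interface data once per coupling iteration until the exchanged quantities stop changing. The one-degree-of-freedom fluid and structure models and the relaxation factor are invented for illustration and are not part of MDICE.

        # Loose (semi-implicit) coupling sketch between a toy "fluid" and "structure" code.
        # Each function stands in for a legacy solver; only interface data are exchanged.

        def fluid_load(deflection: float) -> float:
            # Toy aerodynamic model: load decreases as the surface deflects (assumed).
            return 1000.0 - 400.0 * deflection

        def structural_deflection(load: float) -> float:
            # Toy linear structure: deflection proportional to load (assumed stiffness).
            return load / 2000.0

        deflection = 0.0
        relaxation = 0.7          # under-relaxation keeps the staggered iteration stable
        for it in range(50):
            load = fluid_load(deflection)                    # "fluid code" call
            new_deflection = structural_deflection(load)     # "structure code" call
            if abs(new_deflection - deflection) < 1e-8:
                break
            deflection += relaxation * (new_deflection - deflection)

        print(f"converged after {it} iterations: load = {load:.1f}, deflection = {deflection:.4f}")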

  2. Problems of standardizing and technical regulation in the electric power industry

    NASA Astrophysics Data System (ADS)

    Grabchak, E. P.

    2016-12-01

    A mandatory condition for ensuring normal operation of a power system and efficiency in the sector is the standardization and legal regulation of the technological activities of electric power engineering entities and consumers. In contrast to the Soviet era, present-day technical guidance documents are in most cases not mandatory, being of an advisory nature, and new ones are lacking. During the last five years, the industry has shown a deterioration in reliability and engineering controllability as a result of the dominant impact of short-term market stimuli and differences in basic technological policies. In the absence of clear requirements regarding the engineering aspects of such activities, production operation does not contribute to preserving the technical integrity of the Russian power system, which leads to a loss of performance capability and controllability and causes disturbances in the power supply to consumers. The result of this problem is a high rate of accidents. The dynamics of accidents by type of equipment are given, indicating a persisting growth trend in the number of accidents, which are of a systematic nature. Several problematic aspects of the engineering activities of electric power entities that require standardization and legal regulation are pointed out: in the domestic power system, a large amount of electrotechnical and generating equipment, together with its regulation systems, does not comply with the principles and technical rules that form the framework on which the Energy System of Russia is built and functions.

  3. Model-Based Engineering Design for Trade Space Exploration throughout the Design Cycle

    NASA Technical Reports Server (NTRS)

    Lamassoure, Elisabeth S.; Wall, Stephen D.; Easter, Robert W.

    2004-01-01

    This paper presents ongoing work to standardize model-based system engineering as a complement to point design development in the conceptual design phase of deep space missions. It summarizes two first steps towards practical application of this capability within the framework of concurrent engineering design teams and their customers. The first step is standard generation of system sensitivities models as the output of concurrent engineering design sessions, representing the local trade space around a point design. A review of the chosen model development process, and the results of three case study examples, demonstrate that a simple update to the concurrent engineering design process can easily capture sensitivities to key requirements. It can serve as a valuable tool to analyze design drivers and uncover breakpoints in the design. The second step is development of rough-order-of-magnitude, broad-range-of-validity design models for rapid exploration of the trade space, before selection of a point design. At least one case study demonstrated the feasibility to generate such models in a concurrent engineering session. The experiment indicated that such a capability could yield valid system-level conclusions for a trade space composed of understood elements. Ongoing efforts are assessing the practicality of developing end-to-end system-level design models for use before even convening the first concurrent engineering session, starting with modeling an end-to-end Mars architecture.
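
    As a hedged sketch of what a "system sensitivities model" around a point design can look like, the snippet below uses finite differences to estimate how a top-level metric (here, total mass from an invented sizing function) responds to small changes in key requirements; the sizing relation, requirement names, and values are placeholders, not models from the paper.

        # Finite-difference local sensitivities of a system metric around a point design.
        # The sizing function and requirement values are invented placeholders.

        def system_mass(requirements: dict) -> float:
            # Toy sizing relation: mass grows with data volume and pointing accuracy demands.
            return (50.0
                    + 2.0 * requirements["data_volume_gbit_per_day"]
                    + 30.0 / requirements["pointing_accuracy_arcsec"])

        point_design = {"data_volume_gbit_per_day": 20.0, "pointing_accuracy_arcsec": 5.0}
        baseline = system_mass(point_design)

        sensitivities = {}
        for req, value in point_design.items():
            perturbed = dict(point_design)
            perturbed[req] = value * 1.01                 # +1% perturbation of one requirement
            dm = system_mass(perturbed) - baseline
            sensitivities[req] = (dm / baseline) / 0.01   # % mass change per % requirement change

        print(f"baseline mass: {baseline:.1f} kg")
        for req, s in sensitivities.items():
            print(f"d(mass)% / d({req})% = {s:+.2f}")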

  4. The Framework of Intervention Engine Based on Learning Analytics

    ERIC Educational Resources Information Center

    Sahin, Muhittin; Yurdugül, Halil

    2017-01-01

    Learning analytics primarily deals with the optimization of learning environments and the ultimate goal of learning analytics is to improve learning and teaching efficiency. Studies on learning analytics seem to have been made in the form of adaptation engine and intervention engine. Adaptation engine studies are quite widespread, but intervention…

  5. Human Factors Principles in Information Dashboard Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques V.; St. Germain, Shawn

    When planning for control room upgrades, nuclear power plants have to deal with a multitude of engineering and operational impacts. This will inevitably include several human factors considerations, including physical ergonomics of workstations, viewing angles, lighting, seating, new communication requirements, and new concepts of operation. In helping nuclear power utilities to deal with these challenges, the Idaho National Laboratory (INL) has developed effective methods to manage the various phases of the upgrade life cycle. These methods focus on integrating human factors engineering processes with the plant’s systems engineering process, a large part of which is the development of end-state concepts for control room modernization. Such an end-state concept is a description of a set of required conditions that define the achievement of the plant’s objectives for the upgrade. Typically, the end-state concept describes the transition of a conventional control room, over time, to a facility that employs advanced digital automation technologies in a way that significantly improves system reliability, reduces human and control room-related hazards, reduces system and component obsolescence, and significantly improves operator performance. To make the various upgrade phases as concrete and as visible as possible, an end-state concept would include a set of visual representations of the control room before and after various upgrade phases to provide the context and a framework within which to consider the various options in the upgrade. This includes the various control systems, human-system interfaces to be replaced, and possible changes to operator workstations. This paper describes how this framework helps to ensure an integrated and cohesive outcome that is consistent with human factors engineering principles and also provides substantial improvement in operator performance. The paper further describes the application of this integrated approach in the strategic modernization program at a nuclear power plant where legacy systems are upgraded to advanced digital technologies through a systematic process that links human factors principles to the systems engineering process. This approach will help to create an integrated control room architecture beyond what is possible for individual subsystem upgrades alone. In addition, several human factors design and evaluation methods were used to develop the end-state concept, including interactive sessions with operators in INL’s Human System Simulation Laboratory, three-dimensional modeling to visualize control board changes.

  6. A surety engineering framework to reduce cognitive systems risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caudell, Thomas P.; Peercy, David Eugene; Caldera, Eva O.

    Cognitive science research investigates the advancement of human cognition and neuroscience capabilities. Addressing risks associated with these advancements can counter potential program failures, legal and ethical issues, constraints to scientific research, and product vulnerabilities. Survey results, focus group discussions, cognitive science experts, and surety researchers concur that technical risks exist that could impact cognitive science research in areas such as medicine, privacy, human enhancement, law and policy, military applications, and national security (SAND2006-6895). This SAND report documents a surety engineering framework and a process for identifying cognitive system technical, ethical, legal and societal risks and applying appropriate surety methods to reduce such risks. The framework consists of several models: Specification, Design, Evaluation, Risk, and Maturity. Two detailed case studies are included to illustrate the use of the process and framework. Several appendices provide detailed information on existing cognitive system architectures; ethical, legal, and societal risk research; surety methods and technologies; and educing information research with a case study vignette. The process and framework provide a model for how cognitive systems research and full-scale product development can apply surety engineering to reduce perceived and actual risks.

  7. U.S. Army Corps of Engineers Needs to Improve Contract Oversight of Military Construction Projects at Bagram Airfield, Afghanistan

    DTIC Science & Technology

    2012-11-26

    This record contains fragments of tables and references extracted from the report. Recoverable details: a table compares the number of contractor meeting minutes required (380 in total) with the number available; for the work reviewed, the contractors should have prepared a total of 190 preparatory and 190 initial meeting minutes. Cited references include DoD IG Report No. D-2010-059, "Contingency Contracting: A Framework for Reform" (May 14, 2010), and DoD IG Report No. SPO-2009-005, "Assessment…".

  8. A Generic Multibody Parachute Simulation Model

    NASA Technical Reports Server (NTRS)

    Neuhaus, Jason Richard; Kenney, Patrick Sean

    2006-01-01

    Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations, which were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.

  9. NASTRAN interfacing modules within the Integrated Analysis Capability (IAC) Program

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1986-01-01

    The IAC program provides the framework required for the development of an extensive multidisciplinary analysis capability. Several NASTRAN related capabilities were developed which can all be expanded in a routine manner to meet in-house unique needs. Plans are to complete the work discussed herein and to provide it to the engineering community through COSMIC. Release is to be after the current IAC Level 2 contract work on the IAC executive system is completed and meshed with the interfacing modules and analysis capabilities under development at the GSFC.

  10. The SLH framework for modeling quantum input-output networks

    DOE PAGES

    Combes, Joshua; Kerckhoff, Joseph; Sarovar, Mohan

    2017-09-04

    Many emerging quantum technologies demand precise engineering and control over networks consisting of quantum mechanical degrees of freedom connected by propagating electromagnetic fields, or quantum input-output networks. Here we review recent progress in theory and experiment related to such quantum input-output networks, with a focus on the SLH framework, a powerful modeling framework for networked quantum systems that is naturally endowed with properties such as modularity and hierarchy. We begin by explaining the physical approximations required to represent any individual node of a network, e.g., atoms in a cavity or a mechanical oscillator, and its coupling to quantum fields by an operator triple (S, L, H). Then we explain how these nodes can be composed into a network with arbitrary connectivity, including coherent feedback channels, using algebraic rules, and how to derive the dynamics of network components and output fields. The second part of the review discusses several extensions to the basic SLH framework that expand its modeling capabilities, and the prospects for modeling integrated implementations of quantum input-output networks. In addition to summarizing major results and recent literature, we discuss the potential applications and limitations of the SLH framework and quantum input-output networks, with the intention of providing context to a reader unfamiliar with the field.
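
    For readers unfamiliar with the algebraic rules mentioned above, the sketch below implements the standard single-channel SLH series product (cascading the output of one node into another) for finite-dimensional matrix representations; the cavity example, Fock truncation, and parameter values are illustrative assumptions, not taken from the review.

    ```python
    import numpy as np

    def destroy(n):
        """Truncated annihilation operator on an n-level Fock space."""
        return np.diag(np.sqrt(np.arange(1, n)), k=1)

    def series_product(G2, G1):
        """Cascade two single-channel SLH triples: the output of G1 drives G2.

        Each G = (S, L, H) with S a complex scattering phase and L, H matrices.
        Single-channel series-product rule:
            S = S2 * S1
            L = L2 + S2 * L1
            H = H1 + H2 + (S2 * L2^dag L1 - conj(S2) * L1^dag L2) / (2i)
        """
        S2, L2, H2 = G2
        S1, L1, H1 = G1
        S = S2 * S1
        L = L2 + S2 * L1
        H = H1 + H2 + (S2 * (L2.conj().T @ L1)
                       - np.conj(S2) * (L1.conj().T @ L2)) / 2j
        return S, L, H

    # Example: two cavity nodes cascaded on a joint (5 x 5)-level Fock space.
    n = 5
    a = destroy(n)
    a1, a2 = np.kron(a, np.eye(n)), np.kron(np.eye(n), a)
    G1 = (1.0 + 0j, np.sqrt(1.0) * a1, -0.3 * a1.conj().T @ a1)   # (S, L, H) of node 1
    G2 = (1.0 + 0j, np.sqrt(2.0) * a2,  0.5 * a2.conj().T @ a2)   # (S, L, H) of node 2
    S, L, H = series_product(G2, G1)
    ```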

  12. Designing computer learning environments for engineering and computer science: The scaffolded knowledge integration framework

    NASA Astrophysics Data System (ADS)

    Linn, Marcia C.

    1995-06-01

    Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.

  13. A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.

    2004-12-01

    The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve, and the overall conceptual framework is refined. The development of the conceptual framework becomes an on-going process. We will describe the current state of this framework and the open questions that have to be addressed in the future.

  14. 20th Annual Systems Engineering Conference, Thursday, Volume 4

    DTIC Science & Technology

    2017-10-26

    This record contains fragments of the conference agenda. Recoverable session titles include "Physics Based Modeling & Simulation for Shock and Vulnerability Assessments - Navy Enhanced Sierra" (Daniel Dault, Air Force Research Lab), "Version 1.0 of the New INCOSE Competency Framework" (Mr. Don Gelosh), and "A Proposed Engineering Training Framework and Competency Methodology," along with bullet points on modeling capabilities (nonlinearity; QEV, transient, and frequency-domain analysis; inverse methods; coupled physics; fluid codes nemo, aero, and sigma; and the thermal code fuego).

  15. Space debris mitigation - engineering strategies

    NASA Astrophysics Data System (ADS)

    Taylor, E.; Hammond, M.

    The problem of space debris pollution is acknowledged to be of growing concern by space agencies, leading to recent activities in the field of space debris mitigation. A review of the current (and near-future) mitigation guidelines, handbooks, standards and licensing procedures has identified a number of areas where further work is required. In order for space debris mitigation to be implemented in spacecraft manufacture and operation, the authors suggest that debris-related criteria need to become design parameters (following the same process as applied to reliability and radiation). To meet these parameters, spacecraft manufacturers and operators will need processes (supported by design tools and databases and implementation standards). A particular aspect of debris mitigation, as compared with conventional requirements (e.g. radiation and reliability) is the current and near-future national and international regulatory framework and associated liability aspects. A framework for these implementation standards is presented, in addition to results of in-house research and development on design tools and databases (including collision avoidance in GTO and SSTO and evaluation of failure criteria on composite and aluminium structures).

  16. Explicit formulation of second and third order optical nonlinearity in the FDTD framework

    NASA Astrophysics Data System (ADS)

    Varin, Charles; Emms, Rhys; Bart, Graeme; Fennel, Thomas; Brabec, Thomas

    2018-01-01

    The finite-difference time-domain (FDTD) method is a flexible and powerful technique for rigorously solving Maxwell's equations. However, three-dimensional optical nonlinearity in current commercial and research FDTD software requires iteratively solving an implicit form of Maxwell's equations over the entire numerical space at each time step. Reaching numerical convergence demands significant computational resources, and practical implementation often requires major modifications to the core FDTD engine. In this paper, we present an explicit method to include second- and third-order optical nonlinearity in the FDTD framework based on a nonlinear generalization of the Lorentz dispersion model. A formal derivation of the nonlinear Lorentz dispersion equation is also provided, starting from the quantum mechanical equations describing nonlinear optics in the two-level approximation. With the proposed approach, numerical integration of optical nonlinearity and dispersion in FDTD is intuitive, transparent, and fully explicit. A strong-field formulation is also proposed, which opens an interesting avenue for FDTD-based modelling of the extreme nonlinear optics phenomena involved in laser filamentation and femtosecond micromachining of dielectrics.
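
    The sketch below illustrates the explicit, non-iterative flavor of such an approach: a leapfrog update of a Lorentz polarization driven by second- and third-order powers of the local field. The equation form, material constants, and time step are illustrative assumptions and are not the paper's exact formulation.

    ```python
    import numpy as np

    # Illustrative only: explicit leapfrog update of a nonlinearly driven Lorentz
    # polarization P,
    #   d2P/dt2 + gamma*dP/dt + w0^2*P = eps0*w0^2*(chi1*E + chi2*E**2 + chi3*E**3),
    # which avoids any implicit or iterative solve at each FDTD time step.
    eps0, w0, gamma = 8.854e-12, 2 * np.pi * 4e14, 1e13
    chi1, chi2, chi3 = 1.0, 0.0, 1e-22
    dt = 1e-17

    def step_polarization(P, P_old, E):
        """Advance P by one time step given the current field E (all same-shaped arrays)."""
        drive = eps0 * w0**2 * (chi1 * E + chi2 * E**2 + chi3 * E**3)
        P_new = ((2.0 - (w0 * dt)**2) * P
                 - (1.0 - 0.5 * gamma * dt) * P_old
                 + dt**2 * drive) / (1.0 + 0.5 * gamma * dt)
        return P_new, P   # return (new, previous) for the next call

    # Toy usage on a 1D grid with a static test field.
    P = P_old = np.zeros(200)
    E = np.full(200, 1e8)
    for _ in range(100):
        P, P_old = step_polarization(P, P_old, E)
    ```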

  17. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition.

    PubMed

    Ordóñez, Francisco Javier; Roggen, Daniel

    2016-01-18

    Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing these temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; and (iv) explicitly models the temporal dynamics of feature activations. We evaluate our framework on two datasets, one of which has been used in a public activity recognition challenge. Our results show that our framework outperforms competing deep non-recurrent networks on the challenge dataset by 4% on average, and outperforms some of the previously reported results by up to 9%. Our results show that the framework can be applied to homogeneous sensor modalities, but can also fuse multimodal sensors to improve performance. We characterise key architectural hyperparameters' influence on performance to provide insights about their optimisation.
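
    A minimal sketch of a convolutional-plus-LSTM recognizer in this spirit is shown below (PyTorch); the layer counts, channel sizes, window length, and class count are illustrative assumptions rather than the exact architecture reported in the paper.

    ```python
    import torch
    import torch.nn as nn

    # Minimal conv + LSTM activity-recognition network in the spirit of the
    # framework described (layer sizes here are illustrative).
    class ConvLSTMHAR(nn.Module):
        def __init__(self, n_channels=113, n_classes=18, hidden=128):
            super().__init__()
            self.features = nn.Sequential(            # temporal convolutions per window
                nn.Conv1d(n_channels, 64, kernel_size=5), nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=5), nn.ReLU(),
            )
            self.lstm = nn.LSTM(64, hidden, num_layers=2, batch_first=True)
            self.classifier = nn.Linear(hidden, n_classes)

        def forward(self, x):                          # x: (batch, channels, time)
            h = self.features(x)                       # (batch, 64, time')
            h = h.transpose(1, 2)                      # (batch, time', 64)
            out, _ = self.lstm(h)                      # model the temporal dynamics
            return self.classifier(out[:, -1])         # classify from the last time step

    model = ConvLSTMHAR()
    logits = model(torch.randn(8, 113, 24))            # e.g. 24-sample sliding windows
    ```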

  18. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  19. An Object-oriented Computer Code for Aircraft Engine Weight Estimation

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Naylor, Bret A.

    2008-01-01

    Reliable engine-weight estimation at the conceptual design stage is critical to the development of new aircraft engines. It helps to identify the best engine concept amongst several candidates. At NASA Glenn (GRC), the Weight Analysis of Turbine Engines (WATE) computer code, originally developed by Boeing Aircraft, has been used to estimate the engine weight of various conceptual engine designs. The code, written in FORTRAN, was originally developed for NASA in 1979. Since then, substantial improvements have been made to the code to improve the weight calculations for most of the engine components. Most recently, to improve the maintainability and extensibility of WATE, the FORTRAN code has been converted into an object-oriented version. The conversion was done within NASA's NPSS (Numerical Propulsion System Simulation) framework. This enables WATE to interact seamlessly with the thermodynamic cycle model, which provides the component flow data (airflows, temperatures, pressures, etc.) required for sizing the components and for the weight calculations. The tighter integration between NPSS and WATE would greatly enhance system-level analysis and optimization capabilities. It would also facilitate the enhancement of the WATE code for next-generation aircraft and space propulsion systems. In this paper, the architecture of the object-oriented WATE code (or WATE++) is described. Both the FORTRAN and object-oriented versions of the code are employed to compute the dimensions and weight of a 300-passenger aircraft engine (GE90 class). Both versions of the code produce essentially identical results, as should be the case. Keywords: NASA, aircraft engine, weight, object-oriented
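
    The sketch below shows, in a generic way, the kind of component-based weight roll-up that an object-oriented structure makes natural; the class names, weight correlation, and numbers are invented for illustration and are not the WATE++ or NPSS interfaces.

    ```python
    # Illustrative component-based weight roll-up; attribute names and the toy
    # correlation are assumptions, not the actual WATE++/NPSS code.
    class Component:
        def __init__(self, name, children=None):
            self.name = name
            self.children = children or []

        def weight(self):
            """Total weight: this component plus all of its children, recursively."""
            return self.self_weight() + sum(c.weight() for c in self.children)

        def self_weight(self):
            return 0.0

    class Fan(Component):
        def __init__(self, airflow_kg_s, tip_speed_m_s):
            super().__init__("fan")
            self.airflow, self.tip_speed = airflow_kg_s, tip_speed_m_s

        def self_weight(self):
            # Toy correlation: fan weight scales with airflow and tip speed.
            return 2.5 * self.airflow * (self.tip_speed / 400.0) ** 0.8

    engine = Component("engine", children=[Fan(airflow_kg_s=1200.0, tip_speed_m_s=420.0)])
    print(engine.weight())
    ```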

  20. Environmental engineering education for developing countries: framework for the future.

    PubMed

    Ujang, Z; Henze, M; Curtis, T; Schertenleib, R; Beal, L L

    2004-01-01

    This paper presents the existing philosophy, approach, criteria and delivery of environmental engineering education (E3) for developing countries. In general, environmental engineering is being taught in almost all major universities in developing countries, mostly under civil engineering degree programmes. There is an urgent need to address specific inputs that are particularly important for developing countries with respect to the reality of urbanisation and industrialisation. The main component of E3 in the near future will remain basic sanitation in most developing countries, with special emphasis on the consumer-demand approach. In order to substantially overcome environmental problems in developing countries, E3 should include integrated urban water management, sustainable sanitation, appropriate technology, cleaner production, wastewater minimisation and financial frameworks.

  1. A Social Cognitive Approach to Understanding Engineering Career Interest and Expectations among Underrepresented Students in School-Based Clubs

    ERIC Educational Resources Information Center

    Dika, Sandra L.; Alvarez, Jaquelina; Santos, Jeannette; Suárez, Oscar Marcelo

    2016-01-01

    Interest in engineering at early stages of the educational career is one important precursor to choosing to study engineering in college, and engineering-related clubs are designed to foster such interest and diversify the engineering pipeline. In this study, the researchers employed a social cognitive career theory framework to examine level of…

  2. Sixth-Grade Students' Views of the Nature of Engineering and Images of Engineers

    ERIC Educational Resources Information Center

    Karatas, Faik O.; Micklos, Amy; Bodner, George M.

    2011-01-01

    This study investigated the views of the nature of engineering held by 6th-grade students to provide a baseline upon which activities or curriculum materials might be developed to introduce middle-school students to the work of engineers and the process of engineering design. A phenomenographic framework was used to guide the analysis of data…

  3. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
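
    A generic sketch of the pattern described (a numerical optimizer wrapped around an integrated multi-physics evaluation) is given below; the toy structural/thermal/optical models, variable names, and requirement value are assumptions for illustration and are unrelated to the actual LLIMAS interfaces.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Generic sketch: wrap a numerical optimizer around an integrated
    # structural/thermal/optical evaluation of a design vector.
    def integrated_performance(x):
        thickness_mm, coolant_flow = x
        mass = 4.0 * thickness_mm + 1.5 * coolant_flow          # structural model (toy)
        gradient_k = 30.0 / (1.0 + coolant_flow)                # thermal model (toy)
        wavefront_error_nm = 5.0 * gradient_k / thickness_mm    # optical model (toy)
        return mass, wavefront_error_nm

    def objective(x):
        mass, wfe = integrated_performance(x)
        penalty = max(0.0, wfe - 50.0) ** 2                     # requirement: WFE <= 50 nm
        return mass + 100.0 * penalty                           # minimize mass, meet WFE

    result = minimize(objective, x0=[5.0, 2.0],
                      bounds=[(1.0, 20.0), (0.1, 10.0)], method="L-BFGS-B")
    print(result.x, objective(result.x))
    ```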

  4. How NASA Expanded its Innovation Framework to Find New Solutions to Old Problems

    NASA Technical Reports Server (NTRS)

    Davis, Jeffrey R.

    2010-01-01

    A radio frequency engineer from rural New Hampshire contributed the best solution to a public challenge issued by NASA's Space Life Sciences Directorate. This is a clear example of what Aneesh Chopra, the US Federal Chief Technology Officer, describes as the notion that "in our society, knowledge is widely dispersed. And if it's widely dispersed, how do we capture the insights from the American people?" Chopra later said, to a live audience at the 2010 Rethinking Government event: "A semi-retired radio frequency engineer was able to share his idea about how to solve this problem, and it so blew away other ideas that NASA said it exceeded their requirements! No complicated RFP, the need for lobbyists, some convoluted processes, etc. Just a smart person who was paid a modest fee for his insight."

  5. Photogrammetry and remote sensing education subjects

    NASA Astrophysics Data System (ADS)

    Lazaridou, Maria A.; Karagianni, Aikaterini Ch.

    2017-09-01

    The rapid technological advances in the scientific areas of photogrammetry and remote sensing require continuous readjustment of educational programs and their implementation. The teaching team must deal with the challenge of offering the full volume of knowledge without preventing the understanding of principles and methods, and of introducing "new" knowledge (advances, trends) followed by evaluation and presentation of relevant applications. This is of particular importance for a Civil Engineering Faculty such as that of the Aristotle University of Thessaloniki, as the framework of Photogrammetry and Remote Sensing is closely connected with applications in the four educational Divisions of the Faculty. This paper addresses the above and covers the organization of the courses in photogrammetry and remote sensing in the Civil Engineering Faculty of the Aristotle University of Thessaloniki. A scheme of the general curriculum, as well as the teaching aims and methods, is also presented.

  6. Ontology based log content extraction engine for a posteriori security control.

    PubMed

    Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou

    2012-01-01

    In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required by some legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the needed data to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the healthcare enterprise-Audit Trail and Node Authentication) as log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.
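
    The sketch below shows the general shape of such an ontology-backed extraction step: log entries lifted into RDF are filtered with a SPARQL query. The ontology terms and file name are placeholders, not the authors' IHE-ATNA ontology.

    ```python
    from rdflib import Graph

    # Hedged sketch: query an RDF model of audit-trail events with SPARQL.
    # The terms ex:AuditEvent, ex:actor, ex:action, ex:timestamp and the file
    # name are illustrative placeholders, not the published IHE-ATNA ontology.
    g = Graph()
    g.parse("atna_log.ttl", format="turtle")   # log entries already lifted into RDF

    query = """
    PREFIX ex: <http://example.org/atna#>
    SELECT ?actor ?action ?time WHERE {
        ?event a ex:AuditEvent ;
               ex:actor ?actor ;
               ex:action ?action ;
               ex:timestamp ?time .
        FILTER (?action = "read-patient-record")
    }
    ORDER BY ?time
    """
    for actor, action, time in g.query(query):
        print(actor, action, time)
    ```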

  7. Teaching smartphone and microcontroller systems using "Android Java"

    NASA Astrophysics Data System (ADS)

    Tigrek, Seyitriza

    Mobile devices are becoming indispensable tools for many students and educators, and mobile technology is starting a new era in the computing methodologies of many engineering disciplines and laboratories. Microcontroller extensions that communicate with mobile devices will take data acquisition and control to a new level in sensing and communication technology. The purpose of this thesis is to develop a framework that incorporates the new mobile platform, together with robust embedded systems, into the engineering curriculum. For this purpose, course material titled "Introduction to Programming Java on a Mobile Platform" was developed to teach novice programmers how to create applications, specifically on Android. Combining an introductory-level programming class with the Android platform can appeal to non-programming individuals in multiple disciplines. The proposed course curriculum reduces learning time and allows senior engineering students to use the new framework for their specific needs in the labs, such as mobile data acquisition and control projects. This work provides techniques for instructors with a modest programming background to teach cutting-edge technology, namely smartphone programming. Techniques developed in this work minimize the unnecessary information carried into current teaching approaches and emphasize hands-on practice. They also help students with minimal background overcome the barriers that have evolved around computer programming. The motivation of this thesis is to create a tailored introductory programming course that teaches Java programming on Android by incorporating selected efficient methods from the extant literature. The mechanism proposed in this thesis is to keep students motivated through an active approach based on student-centered learning with collaborative work; teamwork through pair programming is adopted in this teaching process. Bloom's taxonomy, along with a knowledge survey, is used as a guide to classify the information and exercise problems. A prototype curriculum, suitable for novice programmers such as engineering freshmen, is a deliverable of this research. It also contains advanced material that allows senior students to use a mobile phone and a microcontroller system to enhance engineering laboratories.

  8. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    In this study, a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters such as probable maximum precipitation that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulations generally benefit from finer resolutions up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics and the Kain-Fritsch cumulus scheme. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands for extreme storm event forecasting and analysis for design, operations and risk assessment of large water infrastructures.
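
    For readers who want to try a comparable setup, the dictionary below maps the recommended configuration onto commonly used WRF namelist options; this mapping is our assumption (in particular the physics option indices) and should be verified against the WRF version in use.

    ```python
    # Assumed mapping of the recommended configuration onto WRF namelist options;
    # the physics indices (mp_physics = 10 for Morrison 2-moment, cu_physics = 1
    # for Kain-Fritsch) and other values should be checked for the WRF version
    # and case being run.
    recommended_setup = {
        "max_dom": 2,                   # 15 km outer grid with a 5 km nest
        "dx": [15000, 5000],            # grid spacing in metres, per domain
        "dy": [15000, 5000],
        "parent_grid_ratio": [1, 3],
        "mp_physics": [10, 10],         # Morrison 2-moment microphysics
        "cu_physics": [1, 1],           # Kain-Fritsch cumulus scheme
        "ic_bc": "NAM or NCEP2",        # IC/BC choice (prepared in WPS, not the namelist)
    }
    ```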

  9. Evaluation of Frameworks for HSCT Design Optimization

    NASA Technical Reports Server (NTRS)

    Krishnan, Ramki

    1998-01-01

    This report is an evaluation of engineering frameworks that could be used to augment, supplement, or replace the existing FIDO 3.5 (Framework for Interdisciplinary Design and Optimization Version 3.5) framework. The report begins with the motivation for this effort, followed by a description of an "ideal" multidisciplinary design and optimization (MDO) framework. The discussion then turns to how each candidate framework stacks up against this ideal. This report ends with recommendations as to the "best" frameworks that should be down-selected for detailed review.

  10. Program Facilitates CMMI Appraisals

    NASA Technical Reports Server (NTRS)

    Sweetser, Wesley

    2005-01-01

    A computer program has been written to facilitate appraisals according to the methodology of Capability Maturity Model Integration (CMMI). [CMMI is a government/industry standard, maintained by the Software Engineering Institute at Carnegie Mellon University, for objectively assessing the engineering capability and maturity of an organization (especially, an organization that produces software)]. The program assists in preparation for a CMMI appraisal by providing drop-down lists suggesting required artifacts or evidence. It identifies process areas for which similar evidence is required and includes a copy feature that reduces or eliminates repetitive data entry. It generates reports to show the entire framework for reference, the appraisal artifacts to determine readiness for an appraisal, and lists of interviewees and questions to ask them during the appraisal. During an appraisal, the program provides screens for entering observations and ratings, and reviewing evidence provided thus far. Findings concerning strengths and weaknesses can be exported for use in a report or a graphical presentation. The program generates a chart showing capability level ratings of the organization. A context-sensitive Windows help system enables a novice to use the program and learn about the CMMI appraisal process.

  11. Study on utilization of advanced composites in commercial aircraft wing structures. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Sakata, I. F.; Ostrom, R. B.; Cardinale, S. V.

    1978-01-01

    The effort required by commercial transport manufacturers to accomplish the transition from current construction materials and practices to extensive use of composites in aircraft wings was investigated. The engineering and manufacturing disciplines which normally participate in the design, development, and production of an aircraft were employed to ensure that all of the factors that would enter a decision to commit to production of a composite wing structure were addressed. A conceptual design of an advanced technology reduced energy aircraft provided the framework for identifying and investigating unique design aspects. A plan development effort defined the essential technology needs and formulated approaches for effecting the required wing development. The wing development program plans, resource needs, and recommendations are summarized.

  12. Tissue Engineering Whole Bones Through Endochondral Ossification: Regenerating the Distal Phalanx.

    PubMed

    Sheehy, Eamon J; Mesallati, Tariq; Kelly, Lara; Vinardell, Tatiana; Buckley, Conor T; Kelly, Daniel J

    2015-01-01

    Novel strategies are urgently required to facilitate regeneration of entire bones lost due to trauma or disease. In this study, we present a novel framework for the regeneration of whole bones by tissue engineering anatomically shaped hypertrophic cartilaginous grafts in vitro that subsequently drive endochondral bone formation in vivo. To realize this, we first fabricated molds from digitized images to generate mesenchymal stem cell-laden alginate hydrogels in the shape of different bones (the temporomandibular joint [TMJ] condyle and the distal phalanx). These constructs could be stimulated in vitro to generate anatomically shaped hypertrophic cartilaginous tissues that had begun to calcify around their periphery. Constructs were then formed into the shape of the distal phalanx to create the hypertrophic precursor of the osseous component of an engineered long bone. A layer of cartilage engineered through self-assembly of chondrocytes served as the articular surface of these constructs. Following chondrogenic priming and subcutaneous implantation, the hypertrophic phase of the engineered phalanx underwent endochondral ossification, leading to the generation of a vascularized bone integrated with a covering layer of stable articular cartilage. Furthermore, spatial bone deposition within the construct could be modulated by altering the architecture of the osseous component before implantation. These findings open up new horizons to whole limb regeneration by recapitulating key aspects of normal bone development.

  13. Developing Energy Technology Course for Undergraduate Engineering Management Study Program in Lake Toba Area with Particular Focus to Sustainable Energy Systems in Development Context

    NASA Astrophysics Data System (ADS)

    Manik, Yosef; Sinaga, Rizal; Saragi, Hadi

    2018-02-01

    The Undergraduate Engineering Management Study Program of Institut Teknologi Del is one of the pioneers in its field in Indonesia. Located in the Lake Toba area, the study program has a mission to provide high-quality Engineering Management education that produces globally competitive graduates who will in turn contribute to local development. Framing the Energy Technology course—one of the core subjects in the Engineering Management Body of Knowledge—in the context of the sustainable development of the Lake Toba area is therefore essential. One particular focus in this course is sustainable energy systems in the local development context, which incorporates the identification and analysis of locally available energy resources. In this paper we present our experience in designing such a course. We introduce the domains that shape the Engineering Management Body of Knowledge, explain the results of our evaluation of the key considerations needed to meet the rapidly changing needs of society in the local context, and present the framework of learning outcomes and the syllabus that resulted from mapping the course road map against these requirements. Finally, a summary of the first two semesters of delivering this course, in the academic years 2015/2016 and 2016/2017, is reported.

  14. IMAGINE: Interstellar MAGnetic field INference Engine

    NASA Astrophysics Data System (ADS)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.
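
    The sketch below shows the generic Bayesian loop the abstract describes (simulate observables from model parameters, score the mismatch against measured data, sample the posterior), using a simple Metropolis sampler and a toy one-parameter model instead of IMAGINE's actual MultiNest/NIFTy machinery.

    ```python
    import numpy as np

    # Generic Bayesian-inference loop, not the IMAGINE API: a toy "field model"
    # with one parameter, a Gaussian likelihood, and a Metropolis sampler.
    rng = np.random.default_rng(0)
    data = rng.normal(3.0, 0.5, size=50)           # stand-in for measured observables

    def simulate_observables(params, n=50):
        amplitude, = params                         # toy one-parameter model
        return np.full(n, amplitude)

    def log_likelihood(params, sigma=0.5):
        model = simulate_observables(params)
        return -0.5 * np.sum((data - model) ** 2) / sigma**2

    def metropolis(log_like, start, steps=5000, scale=0.1):
        x, lx, chain = np.array(start, dtype=float), log_like(start), []
        for _ in range(steps):
            prop = x + scale * rng.normal(size=x.size)
            lp = log_like(prop)
            if np.log(rng.uniform()) < lp - lx:     # accept/reject step
                x, lx = prop, lp
            chain.append(x.copy())
        return np.array(chain)

    chain = metropolis(log_likelihood, start=[1.0])
    print(chain[2500:].mean(axis=0))                # posterior mean of the parameter
    ```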

  15. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., realtime interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment (instead of just with instrument control); (5) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert who uses this data to index into detailed design databases, and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g. the rotocraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization, for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g. 
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provides a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.

  16. A relativistic type Ibc supernova without a detected gamma-ray burst.

    PubMed

    Soderberg, A M; Chakraborti, S; Pignata, G; Chevalier, R A; Chandra, P; Ray, A; Wieringa, M H; Copete, A; Chaplin, V; Connaughton, V; Barthelmy, S D; Bietenholz, M F; Chugai, N; Stritzinger, M D; Hamuy, M; Fransson, C; Fox, O; Levesque, E M; Grindlay, J E; Challis, P; Foley, R J; Kirshner, R P; Milne, P A; Torres, M A P

    2010-01-28

    Long duration gamma-ray bursts (GRBs) mark the explosive death of some massive stars and are a rare sub-class of type Ibc supernovae. They are distinguished by the production of an energetic and collimated relativistic outflow powered by a central engine (an accreting black hole or neutron star). Observationally, this outflow is manifested in the pulse of gamma-rays and a long-lived radio afterglow. Until now, central-engine-driven supernovae have been discovered exclusively through their gamma-ray emission, yet it is expected that a larger population goes undetected because of limited satellite sensitivity or beaming of the collimated emission away from our line of sight. In this framework, the recovery of undetected GRBs may be possible through radio searches for type Ibc supernovae with relativistic outflows. Here we report the discovery of luminous radio emission from the seemingly ordinary type Ibc SN 2009bb, which requires a substantial relativistic outflow powered by a central engine. A comparison with our radio survey of type Ibc supernovae reveals that the fraction harbouring central engines is low, about one per cent, measured independently from, but consistent with, the inferred rate of nearby GRBs. Independently, a second mildly relativistic supernova has been reported.

  17. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamics simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  18. The Use of Executive Control Processes in Engineering Design by Engineering Students and Professional Engineers

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Johnson, Scott D.

    2012-01-01

    A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…

  19. Transnational Discourses of Knowledge and Learning in Professional Work: Examples from Computer Engineering

    ERIC Educational Resources Information Center

    Nerland, Monika

    2010-01-01

    Taking a Foucauldian framework as its point of departure, this paper discusses how transnational discourses of knowledge and learning operate in the profession of computer engineering and form a certain logic through which modes of being an engineer are regulated. Both the knowledge domain of computer engineering and its related labour market is…

  20. Engineering Design for Engineering Design: Benefits, Models, and Examples from Practice

    ERIC Educational Resources Information Center

    Turner, Ken L., Jr.; Kirby, Melissa; Bober, Sue

    2016-01-01

    Engineering design, a framework for studying and solving societal problems, is a key component of STEM education. It is also the area of greatest challenge within the Next Generation Science Standards, NGSS. Many teachers feel underprepared to teach or create activities that feature engineering design, and integrating a lesson plan of core content…

  1. Building a Framework for Engineering Design Experiences in STEM: A Synthesis

    ERIC Educational Resources Information Center

    Denson, Cameron D.

    2011-01-01

    Since the inception of the National Center for Engineering and Technology Education in 2004, educators and researchers have struggled to identify the necessary components of a "good" engineering design challenge for high school students. In reading and analyzing the position papers on engineering design many themes emerged that may begin to form a…

  2. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, machines are usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine, and the reliable results produced by OEE can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; however, the how factor has received far less attention, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework to implement OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
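
    The OEE calculation itself is simple enough to state directly; the sketch below uses the common availability × performance × quality decomposition, with invented example figures.

    ```python
    # Minimal OEE calculation using the standard decomposition
    #   OEE = availability * performance * quality.
    # Input names and the example figures are illustrative.
    def oee(planned_time_min, downtime_min, ideal_cycle_time_s, total_count, good_count):
        run_time_min = planned_time_min - downtime_min
        availability = run_time_min / planned_time_min
        performance = (ideal_cycle_time_s * total_count) / (run_time_min * 60.0)
        quality = good_count / total_count
        return availability * performance * quality

    # Example: 480 min shift, 45 min downtime, 3 s ideal cycle, 7500 parts, 7350 good.
    print(round(oee(480, 45, 3.0, 7500, 7350), 3))   # prints 0.766
    ```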

  3. Eigenspace perturbations for uncertainty estimation of single-point turbulence closures

    NASA Astrophysics Data System (ADS)

    Iaccarino, Gianluca; Mishra, Aashwin Ananda; Ghili, Saman

    2017-02-01

    Reynolds-averaged Navier-Stokes (RANS) models represent the workhorse for predicting turbulent flows in complex industrial applications. However, RANS closures introduce a significant degree of epistemic uncertainty in predictions due to the potential lack of validity of the assumptions utilized in model formulation. Estimating this uncertainty is a fundamental requirement for building confidence in such predictions. We outline a methodology to estimate this structural uncertainty, incorporating perturbations to the eigenvalues and the eigenvectors of the modeled Reynolds stress tensor. The mathematical foundations of this framework are derived and explicated. Thence, this framework is applied to a set of separated turbulent flows, while compared to numerical and experimental data and contrasted against the predictions of the eigenvalue-only perturbation methodology. It is exhibited that for separated flows, this framework is able to yield significant enhancement over the established eigenvalue perturbation methodology in explaining the discrepancy against experimental observations and high-fidelity simulations. Furthermore, uncertainty bounds of potential engineering utility can be estimated by performing five specific RANS simulations, reducing the computational expenditure on such an exercise.
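
    The eigenvalue part of the perturbation can be sketched compactly: decompose the modeled anisotropy tensor, move its eigenvalues a fraction of the way toward a limiting state, and rebuild the stress. The sample tensor, perturbation magnitude, and choice of the one-component limit below are illustrative assumptions.

    ```python
    import numpy as np

    # Sketch of an eigenvalue perturbation of a modeled Reynolds stress: shift the
    # anisotropy eigenvalues a fraction delta toward a limiting state (here the
    # one-component limit), then rebuild the stress tensor.
    def perturb_reynolds_stress(R, delta=0.5, target=np.array([2/3, -1/3, -1/3])):
        k = 0.5 * np.trace(R)                          # turbulent kinetic energy
        b = R / (2.0 * k) - np.eye(3) / 3.0            # anisotropy tensor
        lam, V = np.linalg.eigh(b)                     # eigenvalues (ascending), eigenvectors
        lam_star = (1.0 - delta) * lam + delta * np.sort(target)
        b_star = V @ np.diag(lam_star) @ V.T
        return 2.0 * k * (b_star + np.eye(3) / 3.0)

    R = np.array([[0.8, 0.1, 0.0],
                  [0.1, 0.5, 0.0],
                  [0.0, 0.0, 0.3]])
    print(perturb_reynolds_stress(R, delta=0.3))
    ```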

  4. Neural Networks for Computer Vision: A Framework for Specifications of a General Purpose Vision System

    NASA Astrophysics Data System (ADS)

    Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.

    1989-03-01

    The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long-term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real time.

  5. A Rationale and Framework for Establishing a Systems Engineering Community Within the Department of the Army

    DTIC Science & Technology

    2011-03-01

    This record contains fragments of the DTIC report documentation page. Recoverable details: the author is Alan Clayton; the recommendations include making contact not solely with students but with all those who influence their decision-making (parents, teachers); and the report observes that while the Army fields the most operationally effective military force in the world, fielding such a force has been challenging.

  6. Gasoline Engine Mechanics. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This vocational program guide is intended to assist in the organization, operation, and evaluation of a program in gasoline engine mechanics in school districts, area vocational centers, and community colleges. The following topics are covered: job duties of small-engine mechanics; program content (curriculum framework and student performance…

  7. PUKHA: A New Pedagogical Experience

    ERIC Educational Resources Information Center

    De Magalhaes, A. Barbedo; Estima, M.; Almada-Lobo, B.

    2007-01-01

    Society needs responsible leaders and entrepreneurs. CDIO (conceive, design, implement and operate) is a framework for engineering education based on outcomes, more than on contents, that has been adopted by a growing number of engineering educational institutions for producing the next generation of engineering leaders. In order to support…

  8. Cooperation with Central and Eastern Europe in Language Engineering.

    ERIC Educational Resources Information Center

    Andersen, Poul

    This paper outlines trends and activities in Central and Eastern European language research and language-related software development (language engineering) and briefly describes some specific projects. The language engineering segment of the European Union's Fourth Framework Programme, intended to facilitate use of telematics applications and…

  9. The Teaching of Crystallography to Materials Scientists and Engineers.

    ERIC Educational Resources Information Center

    Wuensch, Bernhardt J.

    1988-01-01

    Provides a framework of the disciplines of materials science and engineering as they have developed. Discusses the philosophy, content, and approach to teaching these courses. Indicates the range of crystallographic topics contained in the materials science and engineering curriculum at the Massachusetts Institute of Technology. (CW)

  10. Molecular simulations for energy, environmental and pharmaceutical applications of nanoporous materials: from zeolites, metal-organic frameworks to protein crystals.

    PubMed

    Jiang, Jianwen; Babarao, Ravichandar; Hu, Zhongqiao

    2011-07-01

    Nanoporous materials have widespread applications in chemical industry, but the pathway from laboratory synthesis and testing to practical utilization of nanoporous materials is substantially challenging and requires fundamental understanding from the bottom up. With ever-growing computational resources, molecular simulations have become an indispensable tool for material characterization, screening and design. This tutorial review summarizes the recent simulation studies in zeolites, metal-organic frameworks and protein crystals, and provides a molecular overview for energy, environmental and pharmaceutical applications of nanoporous materials with increasing degree of complexity in building blocks. It is demonstrated that molecular-level studies can bridge the gap between physical and engineering sciences, unravel microscopic insights that are otherwise experimentally inaccessible, and assist in the rational design of new materials. The review is concluded with major challenges in future simulation exploration of novel nanoporous materials for emerging applications.

  11. A reusability and efficiency oriented software design method for mobile land inspection

    NASA Astrophysics Data System (ADS)

    Cai, Wenwen; He, Jun; Wang, Qing

    2008-10-01

    Aiming at requirements from the real-time land inspection domain, a handheld land inspection system is presented in this paper. To increase the reusability of the system, a design-pattern-based framework was developed. Encapsulation of command-like actions with the COMMAND pattern was proposed to manage complex UI interactions. Integration of several GPS-log parsing engines into a general parsing framework was achieved by introducing the STRATEGY pattern. A network transmission module based on network middleware was constructed; to mitigate the high coupling of complex network communication code, the FACTORY pattern was applied to facilitate decoupling. Moreover, to manipulate huge GIS datasets efficiently, a multi-scale representation method based on the VISITOR pattern and a quad-tree was presented. Practical use showed that these design patterns reduced the coupling between the subsystems and improved extensibility.
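
    The abstract names the COMMAND, STRATEGY, FACTORY, and VISITOR patterns without showing how they fit together. Below is a minimal, hypothetical sketch of a STRATEGY-based general parsing framework for GPS logs in the spirit of the one described; the class names and log formats are illustrative assumptions, not taken from the paper.

```python
# Minimal STRATEGY-pattern sketch for pluggable GPS-log parsers.
# Class and format names are illustrative, not taken from the paper.
from abc import ABC, abstractmethod
from typing import List, Tuple

class GpsLogParser(ABC):
    """Strategy interface: each concrete parser handles one log format."""
    @abstractmethod
    def parse(self, line: str) -> Tuple[float, float]:
        """Return (latitude, longitude) extracted from one log line."""

class NmeaLikeParser(GpsLogParser):
    def parse(self, line: str) -> Tuple[float, float]:
        # Toy parser: assume "tag,lat,lon" comma-separated fields.
        fields = line.split(",")
        return float(fields[1]), float(fields[2])

class CsvParser(GpsLogParser):
    def parse(self, line: str) -> Tuple[float, float]:
        lat, lon = line.split(";")[:2]
        return float(lat), float(lon)

class GeneralParsingFramework:
    """Context: selects a parsing strategy at runtime, so new log formats
    can be integrated without touching client code."""
    def __init__(self, parser: GpsLogParser):
        self._parser = parser

    def parse_track(self, lines: List[str]) -> List[Tuple[float, float]]:
        return [self._parser.parse(line) for line in lines]

if __name__ == "__main__":
    framework = GeneralParsingFramework(CsvParser())
    print(framework.parse_track(["31.23;121.47", "31.24;121.48"]))
```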

  12. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
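
    As a rough illustration of the approach (not the authors' schema or code), the sketch below computes coarse wavelet coefficients of a parameter trend with PyWavelets, stores them in a single relational table, and flags an abrupt event with one join-free query. The table layout, threshold, and use of SQLite instead of MySQL are assumptions made for a self-contained demo.

```python
# Sketch of the idea only: store coarse wavelet coefficients of a parameter
# trend in a relational table and query events by thresholding them, avoiding
# expensive joins. Table and column names are hypothetical; the paper used
# MySQL, while SQLite keeps this demo self-contained.
import sqlite3
import numpy as np
import pywt  # PyWavelets

def coarse_coefficients(trend: np.ndarray, level: int = 3) -> np.ndarray:
    """Approximation coefficients give a compact descriptor of the trend."""
    approx, *_details = pywt.wavedec(trend, "db4", level=level)
    return approx

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE trend_coeffs (
                  record_id INTEGER, segment INTEGER, coeff REAL)""")

rng = np.random.default_rng(0)
for record_id in range(3):
    # Synthetic heart-rate-like trend; record 2 gets a step change ("event").
    trend = 80 + rng.normal(0, 1, 256)
    if record_id == 2:
        trend[128:] -= 25.0
    for segment, c in enumerate(coarse_coefficients(trend)):
        conn.execute("INSERT INTO trend_coeffs VALUES (?, ?, ?)",
                     (record_id, segment, float(c)))

# Single-table query: records whose coarse coefficients span a large range,
# a crude surrogate for an abrupt hemodynamic event.
rows = conn.execute("""SELECT record_id, MAX(coeff) - MIN(coeff) AS swing
                       FROM trend_coeffs GROUP BY record_id
                       HAVING swing > 50 ORDER BY swing DESC""").fetchall()
print(rows)  # record 2 should stand out
```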

  13. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
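
    Henson itself couples compiled, position-independent executables; purely as a language-level illustration of cooperative multitasking between a simulation and an in situ analysis (and not the Henson API), here is a generator-based Python sketch in which the simulation yields control, and its in-memory state, after every time step.

```python
# Illustration of the cooperative-multitasking idea only (not the Henson API):
# a simulation coroutine yields control after each time step, and an analysis
# coroutine consumes the in-memory state without any file I/O.
from typing import Generator, List

def simulation(steps: int) -> Generator[List[float], None, None]:
    state = [0.0] * 8
    for _t in range(steps):
        state = [x + 0.1 * (i + 1) for i, x in enumerate(state)]  # fake physics
        yield state  # hand control (and the live data) to the analysis

def analysis(sim: Generator[List[float], None, None]) -> None:
    for step, state in enumerate(sim):
        mean = sum(state) / len(state)
        print(f"step {step}: in situ mean = {mean:.3f}")

if __name__ == "__main__":
    analysis(simulation(steps=5))
```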

  14. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.

  15. Frameworks for amending reservoir water management

    USGS Publications Warehouse

    Mower, Ethan; Miranda, Leandro E.

    2013-01-01

    Managing water storage and withdrawals in many reservoirs requires establishing seasonal targets for water levels (i.e., rule curves) that are influenced by regional precipitation and diverse water demands. Rule curves are established as an attempt to balance various water needs such as flood control, irrigation, and environmental benefits such as fish and wildlife management. The processes and challenges associated with amending rule curves to balance multiuse needs are complicated and mostly unfamiliar to non-US Army Corps of Engineers (USACE) natural resource managers and to the public. To inform natural resource managers and the public, we describe the policies and process involved in amending rule curves in USACE reservoirs, including 3 frameworks: a general investigation, a continuing authority program, and the water control plan. Our review suggests that water management in reservoirs can be amended, but generally a multitude of constraints and competing demands must be addressed before such a change can be realized.

  16. Supporting capacity sharing in the cloud manufacturing environment based on game theory and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Argoneto, Pierluigi; Renna, Paolo

    2016-02-01

    This paper proposes a Framework for Capacity Sharing in Cloud Manufacturing (FCSCM) able to support capacity sharing among independent firms. The success of geographically distributed plants depends strongly on the use of appropriate tools to integrate their resources and demand forecasts in order to achieve a specific production objective. The proposed framework is based on two different tools: a cooperative game algorithm, based on the Gale-Shapley model, and a fuzzy engine. The capacity allocation policy takes into account the utility functions of the involved firms. It is shown that the proposed capacity allocation policy induces all firms to report their capacity requirements truthfully. A discrete event simulation environment has been developed to test the proposed FCSCM. The numerical results show the drastic reduction of unsatisfied capacity obtained by the model of cooperation implemented in this work.
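
    The abstract refers to a cooperative game algorithm based on the Gale-Shapley model. For background only, the following is a textbook Gale-Shapley stable-matching sketch between firms requesting capacity and plants offering it; the firm and plant names and preference lists are hypothetical, and this is not the FCSCM algorithm itself.

```python
# Generic Gale-Shapley stable matching between capacity requesters and
# capacity providers; a textbook sketch, not the FCSCM algorithm.
def gale_shapley(requester_prefs, provider_prefs):
    """Both arguments map a name to an ordered preference list of names."""
    free = list(requester_prefs)              # requesters not yet matched
    next_choice = {r: 0 for r in requester_prefs}
    match = {}                                # provider -> requester

    def prefers(provider, new, current):
        ranking = provider_prefs[provider]
        return ranking.index(new) < ranking.index(current)

    while free:
        r = free.pop(0)
        p = requester_prefs[r][next_choice[r]]  # best provider not yet tried
        next_choice[r] += 1
        if p not in match:
            match[p] = r
        elif prefers(p, r, match[p]):
            free.append(match[p])             # displaced requester tries again
            match[p] = r
        else:
            free.append(r)
    return {r: p for p, r in match.items()}

if __name__ == "__main__":
    firms = {"F1": ["P1", "P2"], "F2": ["P1", "P2"]}
    plants = {"P1": ["F2", "F1"], "P2": ["F1", "F2"]}
    print(gale_shapley(firms, plants))  # expected: {'F2': 'P1', 'F1': 'P2'}
```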

  17. Coupling between a multi-physics workflow engine and an optimization framework

    NASA Astrophysics Data System (ADS)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content of each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among the algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to their robustness, GAs can handle non-valid data which may appear during the optimization, so they work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and on the evaluation of large samples. A test has shown good scaling efficiency of this parallelized framework. The coupling method has been applied to SYCOMORE (SYstem COde for MOdeling tokamak REactor), a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization along various figures of merit and constraints.
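
    As a hedged sketch of the socket-based coupling style described here (not the SYCOMORE/URANIE interface), the example below runs a toy "workflow engine" server that evaluates candidate designs and an "optimizer" client that queries it over a local socket. The port, JSON payload, and objective function are all illustrative assumptions.

```python
# Minimal sketch of socket-based coupling between an optimizer process and a
# workflow-engine process; the port, JSON protocol and toy objective are
# illustrative assumptions, not the SYCOMORE/URANIE interface.
import json
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50555   # hypothetical local endpoint

def workflow_engine_server(n_evals: int = 3) -> None:
    """Evaluate one candidate design per connection and reply with a
    figure of merit (here a toy quadratic)."""
    with socket.create_server((HOST, PORT)) as srv:
        for _ in range(n_evals):
            conn, _addr = srv.accept()
            with conn:
                params = json.loads(conn.recv(4096).decode())
                merit = -(params["x"] - 2.0) ** 2   # toy figure of merit
                conn.sendall(json.dumps({"merit": merit}).encode())

def optimizer_client() -> None:
    """Stand-in for the optimization framework: send candidates, keep the best."""
    best = None
    for x in (0.0, 1.0, 2.0):                       # stand-in for GA candidates
        with socket.create_connection((HOST, PORT)) as conn:
            conn.sendall(json.dumps({"x": x}).encode())
            merit = json.loads(conn.recv(4096).decode())["merit"]
        if best is None or merit > best[0]:
            best = (merit, x)
    print("best candidate:", best)

if __name__ == "__main__":
    server = threading.Thread(target=workflow_engine_server, daemon=True)
    server.start()
    time.sleep(0.2)                                 # give the server time to listen
    optimizer_client()
    server.join()
```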

  18. Translating Extreme Precipitation Data from Climate Change Projections into Resilient Engineering Applications

    NASA Astrophysics Data System (ADS)

    Cook, L. M.; Samaras, C.; Anderson, C.

    2016-12-01

    Engineers generally use historical precipitation trends to inform assumptions and parameters for long-lived infrastructure designs. However, resilient design calls for the adjustment of current engineering practice to incorporate a range of future climate conditions that are likely to be different from the past. Despite the availability of future projections from downscaled climate models, there remains a considerable mismatch between climate model outputs and the inputs needed in the engineering community to incorporate climate resiliency. This mismatch stems from differences in temporal and spatial scales, model uncertainties, and a lack of criteria for selecting an ensemble of models. This research addresses the limitations of working with climate data by providing a framework for the use of publicly available downscaled climate projections to inform engineering resiliency. The framework consists of five steps: 1) selecting the data source based on the engineering application, 2) extracting the data at a specific location, 3) validating performance against observed data, 4) post-processing for bias or scale, and 5) selecting the ensemble and calculating statistics. The framework is illustrated with an example application to extreme precipitation-frequency statistics, the 25-year daily precipitation depth, using four publicly available climate data sources: NARCCAP, USGS, Reclamation, and MACA. The attached figure presents the results for step 5 of the framework, analyzing how the 24H25Y depth changes when the model ensemble is culled based on model performance against observed data, for both post-processing techniques: bias correction and change factor. Culling the model ensemble increases both the mean and median values for all data sources, and reduces the range for the NARCCAP and MACA ensembles due to the elimination of poorer performing models and, in some cases, those that predict a decrease in future 24H25Y precipitation volumes. This result is especially relevant to engineers who wish to reduce the range of the ensemble and remove contradicting models; however, it is not generalizable to all cases. Finally, this research highlights the need for an intermediate entity that is able to translate climate projections into relevant engineering information.
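
    Step 5 of the framework computes ensemble statistics such as the 25-year daily precipitation depth. The sketch below shows one common way to estimate that quantity, fitting a Gumbel distribution to annual daily maxima for each ensemble member and then summarising across members; the synthetic data, distribution choice, and member names are assumptions and do not reproduce the authors' post-processing.

```python
# Hedged sketch of step 5: estimate the 25-year daily precipitation depth from
# annual maxima for each ensemble member, then summarise the ensemble.
# Synthetic data stand in for downscaled model output (NARCCAP, MACA, ...).
import numpy as np
from scipy import stats

def return_level(annual_maxima: np.ndarray, return_period: float = 25.0) -> float:
    """Fit a Gumbel distribution to annual maxima and return the depth
    exceeded on average once per `return_period` years."""
    loc, scale = stats.gumbel_r.fit(annual_maxima)
    return float(stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc, scale))

rng = np.random.default_rng(42)
ensemble = {f"model_{k}": stats.gumbel_r.rvs(loc=60 + 3 * k, scale=12,
                                             size=30, random_state=rng)
            for k in range(5)}   # 5 members x 30 years of annual maxima (mm/day)

levels = {name: return_level(maxima) for name, maxima in ensemble.items()}
print("per-model 24H25Y depth (mm):",
      {k: round(v, 1) for k, v in levels.items()})
values = list(levels.values())
print(f"ensemble mean/median (mm): {np.mean(values):.1f} / {np.median(values):.1f}")
```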

  19. Identifying barriers to Science, Technology, Society and environment (STSE) educational goals and pedagogy in science education: A case study of UMASS Lowell undergraduate engineering

    NASA Astrophysics Data System (ADS)

    Phaneuf, Tiffany

    The implementation of sustainable development in higher education is a global trend. Engineers, as gatekeepers of technological innovation, confront increasingly complex world issues ranging from economic and social to political and environmental. Recently, a multitude of government reports have argued that solving such complex problems requires changes in the pedagogy of engineering education, such as that prescribed by the Science, Technology, and Society (STS) education movement that grew out of the environmental movement in the 1970s. In STS, students are engaged in the community by understanding that scientific progress is innately a sociopolitical process involving dimensions of power, wealth and responsibility. United States accreditation criteria now demand "the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context" (ABET Engineering Accreditation Commission 2005). With such emphasis on STS education as necessary to address complex world issues, it is vital to assess the barriers in the traditional engineering curriculum that may inhibit the success of such educational reform. This study identifies barriers to STS goals and pedagogy in postsecondary science education by using the Francis College of Engineering at UMASS Lowell as a single case study. The study draws on existing literature to develop a theoretical framework for assessing four hypothesized barriers to STS education in undergraduate engineering. Identification of barriers to STS education in engineering generates a critical reflection on postsecondary science education and its role in preparing engineers to be active citizens in shaping a rapidly globalizing world. The study offers policy recommendations for enabling postsecondary science education to incorporate STS education into its curriculum.

  20. The Nexus between Science Literacy & Technical Literacy: A State by State Analysis of Engineering Content in State Science Standards

    ERIC Educational Resources Information Center

    Koehler, Catherine M.; Faraclas, Elias; Giblin, David; Moss, David M.; Kazerounian, Kazem

    2013-01-01

    This study explores how engineering concepts are represented in secondary science standards across the nation by examining how engineering and technical concepts are infused into these frameworks. Secondary science standards from 49 states plus the District of Columbia were analyzed and ranked based on how many engineering concepts were found.…

  1. Assessing the impact of modeling limits on intelligent systems

    NASA Technical Reports Server (NTRS)

    Rouse, William B.; Hammer, John M.

    1990-01-01

    This report addresses the validation of the knowledge bases underlying intelligent systems. A general conceptual framework is provided for considering the roles in intelligent systems of models of physical, behavioral, and operational phenomena. A methodology is described for identifying limits in particular intelligent systems, and the use of the methodology is illustrated via an experimental evaluation of the pilot-vehicle interface within the Pilot's Associate. The requirements and functionality are outlined for a computer-based knowledge engineering environment which would embody the approach advocated and illustrated in earlier discussions. Issues considered include the specific benefits of this functionality, the potential breadth of applicability, and technical feasibility.

  2. Women Engineering Transfer Students: The Community College Experience

    ERIC Educational Resources Information Center

    Patterson, Susan J.

    2011-01-01

    An interpretative philosophical framework was applied to a case study to document the particular experiences and perspectives of ten women engineering transfer students who once attended a community college and are currently enrolled in one of two university professional engineering programs. This study is important because women still do not earn…

  3. Environments for Fostering Effective Critical Thinking in Geotechnical Engineering Education (Geo-EFFECTs)

    ERIC Educational Resources Information Center

    Pierce, Charles E.; Gassman, Sarah L.; Huffman, Jeffrey T.

    2013-01-01

    This paper describes the development, implementation, and assessment of instructional materials for geotechnical engineering concepts using the Environments for Fostering Effective Critical Thinking (EFFECTs) pedagogical framework. The central learning goals of engineering EFFECTs are to (i) improve the understanding and retention of a specific…

  4. Teaching Engineering Ethics with Sustainability as Context

    ERIC Educational Resources Information Center

    Byrne, Edmond P.

    2012-01-01

    Purpose: The purpose of this paper is to ascertain the engagement and response of students to the teaching of engineering ethics incorporating a macro ethical framework whereby sustainability is viewed as context to professional practice. This involves incorporating a broader conception of engineering than is typically applied in conventional…

  5. Sustaining Liminality: Experiences and Negotiations of International Females in U.S. Engineering Graduate Programs

    ERIC Educational Resources Information Center

    Dutta, Debalina

    2012-01-01

    This project examines the intersectionalities of international females in engineering graduate programs of the United States, using frameworks of sustainability and liminality theory. According to Dutta and Kisselburgh (2011) international females in graduate engineering constitute the "minorities of minorities," not only in terms of…

  6. Core Ideas of Engineering and Technology

    ERIC Educational Resources Information Center

    Sneider, Cary

    2012-01-01

    Last month, Rodger Bybee's article, "Scientific and Engineering Practices in K-12 Classrooms," provided an overview of Chapter 3 in "A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" (NRC 2011). Chapter 3 describes the practices of science and engineering that students are expected to develop during 13 years…

  7. Engineering Encounters: Can a Student Really Do What Engineers Do?

    ERIC Educational Resources Information Center

    Brown, Sherri; Newman, Channa; Dearing-Smith, Kelley; Smith, Stephanie

    2014-01-01

    "Framework for K-12 Science Education" states that "children are natural engineers … they spontaneously build sand castles, dollhouses, and hamster enclosures and use a variety of tools and materials for their own playful purposes" (NRC 2012, p. 70). The "Next Generation Science Standards" ("NGSS") also…

  8. Are the expected benefits of requirements reuse hampered by distance? An experiment.

    PubMed

    Carrillo de Gea, Juan M; Nicolás, Joaquín; Fernández-Alemán, José L; Toval, Ambrosio; Idri, Ali

    2016-01-01

    Software development processes are often performed by distributed teams which may be separated by great distances. Global software development (GSD) has undergone a significant growth in recent years. The challenges concerning GSD are especially relevant to requirements engineering (RE). Stakeholders need to share a common ground, but there are many difficulties as regards the potentially variable interpretation of the requirements in different contexts. We posit that the application of requirements reuse techniques could alleviate this problem through the diminution of the number of requirements open to misinterpretation. This paper presents a reuse-based approach with which to address RE in GSD, with special emphasis on specification techniques, namely parameterised requirements and traceability relationships. An experiment was carried out with the participation of 29 university students enrolled on a Computer Science and Engineering course. Two main scenarios that represented co-localisation and distribution in software development were portrayed by participants from Spain and Morocco. The global teams achieved a slightly better performance than the co-located teams as regards effectiveness, which could be a result of the worse productivity of the global teams in comparison to the co-located teams. Subjective perceptions were generally more positive in the case of the distributed teams (difficulty, speed and understanding), with the exception of quality. A theoretical model has been proposed as an evaluation framework with which to analyse, from the point of view of the factor of distance, the effect of requirements specification techniques on a set of performance and perception-based variables. The experiment utilised a new internationalisation requirements catalogue. None of the differences found between co-located and distributed teams were significant according to the outcome of our statistical tests. The well-known benefits of requirements reuse in traditional co-located projects could, therefore, also be expected in GSD projects.

  9. An Introduction to Transient Engine Applications Using the Numerical Propulsion System Simulation (NPSS) and MATLAB

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.; Haller, William J.; Seidel, Jonathan A.

    2016-01-01

    This document outlines methodologies designed to improve the interface between the Numerical Propulsion System Simulation (NPSS) framework and various control and dynamic analyses developed in the Matlab and Simulink environment. Although NPSS is most commonly used for steady-state modeling, this paper is intended to supplement the relatively sparse documentation on its transient analysis functionality. Matlab has become an extremely popular engineering environment, and better methodologies are necessary to develop tools that leverage the benefits of these disparate frameworks. Transient analysis is not a new feature of NPSS, but transient considerations are becoming more pertinent as multidisciplinary trade-offs begin to play a larger role in advanced engine designs. This paper also covers the budding convergence between NPSS and Matlab-based modeling toolsets. The following sections explore various design patterns for rapidly developing transient models. Each approach starts with a base model built with NPSS, and assumes the reader already has a basic understanding of how to construct a steady-state model. The second half of the paper focuses on further enhancements required to subsequently interface NPSS with Matlab codes. The first method is the simplest and most straightforward but performance constrained, and the last is the most abstract. These methods are not mutually exclusive, and the specific implementation details could vary greatly based on the designer's discretion. Basic recommendations are provided to organize model logic in a format most easily amenable to integration with existing Matlab control toolsets.

  10. National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion

    NASA Technical Reports Server (NTRS)

    Follen, G.; Naiman, C.; Evans, A.

    1999-01-01

    Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

  11. Synthetic Botany.

    PubMed

    Boehm, Christian R; Pollak, Bernardo; Purswani, Nuri; Patron, Nicola; Haseloff, Jim

    2017-07-05

    Plants are attractive platforms for synthetic biology and metabolic engineering. Plants' modular and plastic body plans, capacity for photosynthesis, extensive secondary metabolism, and agronomic systems for large-scale production make them ideal targets for genetic reprogramming. However, efforts in this area have been constrained by slow growth, long life cycles, the requirement for specialized facilities, a paucity of efficient tools for genetic manipulation, and the complexity of multicellularity. There is a need for better experimental and theoretical frameworks to understand the way genetic networks, cellular populations, and tissue-wide physical processes interact at different scales. We highlight new approaches to the DNA-based manipulation of plants and the use of advanced quantitative imaging techniques in simple plant models such as Marchantia polymorpha. These offer the prospects of improved understanding of plant dynamics and new approaches to rational engineering of plant traits. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  12. Ecological requirements for pallid sturgeon reproduction and recruitment in the Missouri River: annual report 2011

    USGS Publications Warehouse

    DeLonay, Aaron J.; Jacobson, Robert B.; Chojnacki, Kimberly A.; Annis, Mandy L.; Braaten, P. J.; Elliott, Caroline M.; Fuller, D. B.; Haas, Justin D.; Haddix, Tyler M.; Ladd, Hallie L.A.; McElroy, Brandon J.; Mestl, Gerald E.; Papoulias, Diana M.; Rhoten, Jason C.; Wildhaber, Mark L.

    2014-01-01

    The Comprehensive Sturgeon Research Project is a multiyear, multiagency collaborative research framework developed to provide information to support pallid sturgeon recovery and Missouri River management decisions. The project strategy integrates field and laboratory studies of sturgeon reproductive ecology, early life history, habitat requirements, and physiology. The project scope of work is developed annually with cooperating research partners and in collaboration with the U.S. Army Corps of Engineers, Missouri River Recovery—Integrated Science Program. The research consists of several interdependent and complementary tasks that engage multiple disciplines. The research tasks in the 2011 scope of work emphasized understanding of reproductive migrations and spawning of adult sturgeon, and hatch and drift of larvae. These tasks were addressed in three hydrologically and geomorphologically distinct parts of the Missouri River Basin: the Lower Missouri River downstream from Gavins Point Dam, the Upper Missouri River downstream from Fort Peck Dam and including downstream reaches of the Milk River, and the Lower Yellowstone River. The research is designed to inform management decisions related to channel re-engineering, flow modification, and pallid sturgeon population augmentation on the Missouri River, and throughout the range of the species. Research and progress made through this project are reported to the U.S. Army Corps of Engineers annually. This annual report details the research effort and progress made by the Comprehensive Sturgeon Research Project during 2011.

  13. Ecological requirements for pallid sturgeon reproduction and recruitment in the Missouri River—Annual report 2014

    USGS Publications Warehouse

    Delonay, Aaron J.; Chojnacki, Kimberly A.; Jacobson, Robert B.; Braaten, Patrick J.; Buhl, Kevin J.; Elliott, Caroline M.; Erwin, Susannah O.; Faulkner, Jacob D.A.; Candrl, James S.; Fuller, David B.; Backes, Kenneth M.; Haddix, Tyler M.; Rugg, Matthew L.; Wesolek, Christopher J.; Eder, Brandon L.; Mestl, Gerald E.

    2016-03-16

    The Comprehensive Sturgeon Research Project is a multiyear, multiagency collaborative research framework developed to provide information to support pallid sturgeon recovery and Missouri River management decisions. The project strategy integrates field and laboratory studies of sturgeon reproductive ecology, early life history, habitat requirements, and physiology. The project scope of work is developed annually with collaborating research partners and in cooperation with the U.S. Army Corps of Engineers, Missouri River Recovery Program–Integrated Science Program. The project research consists of several interdependent and complementary tasks that involve multiple disciplines. The project research tasks in the 2014 scope of work emphasized understanding of reproductive migrations and spawning of adult pallid sturgeon and hatch and drift of larvae. These tasks were addressed in three hydrologically and geomorphologically distinct parts of the Missouri River Basin: the Lower Missouri River downstream from Gavins Point Dam, the Upper Missouri River downstream from Fort Peck Dam and downstream reaches of the Milk River, and the Lower Yellowstone River. The project research is designed to inform management decisions related to channel re-engineering, flow modification, and pallid sturgeon population augmentation on the Missouri River and throughout the range of the species. Research and progress made through this project are reported to the U.S. Army Corps of Engineers annually. This annual report details the research effort and progress made by the Comprehensive Sturgeon Research Project during 2014.

  14. Growing a National Learning Environments and Resources Network for Science, Mathematics, Engineering, and Technology Education: Current Issues and Opportunities for the NSDL Program; Open Linking in the Scholarly Information Environment Using the OpenURL Framework; The HeadLine Personal Information Environment: Evaluation Phase One.

    ERIC Educational Resources Information Center

    Zia, Lee L.; Van de Sompel, Herbert; Beit-Arie, Oren; Gambles, Anne

    2001-01-01

    Includes three articles that discuss the National Science Foundation's National Science, Mathematics, Engineering, and Technology Education Digital Library (NSDL) program; the OpenURL framework for open reference linking in the Web-based scholarly information environment; and HeadLine (Hybrid Electronic Access and Delivery in the Library Networked…

  15. Accelerating Corporate Research in the Development, Application and Deployment of Human Language Technologies

    DTIC Science & Technology

    2003-01-01

    dubbed UIMA. At the heart of UIMA are powerful search capabilities and a data-driven framework for the development, composition and distributed... example, to Processing Resources in the GATE architecture (Cunningham et al., 2000). In UIMA, a TAE is a recursive structure which may be composed of sub... closer look at the analysis engine framework. UIMA specifies an interface for an analysis engine; roughly speaking it is "CAS in" and "CAS out

  16. Component Cost Reduction by Value Engineering: A Case Study

    NASA Astrophysics Data System (ADS)

    Kalluri, Vinayak; Kodali, Rambabu

    2017-04-01

    The concept of value engineering (VE) aims to increase the value of a product by improving existing functions without increasing their cost. In other words, VE is a function-oriented, systematic team approach to providing value in a product, system or service. The authors systematically explore VE through the six-step framework proposed by SAVE, and a case study is presented that addresses the reduction of the cost of a hydraulic steering cylinder without compromising its function through the aforementioned VE framework.

  17. uCollaborator: Framework for STEM Project Collaboration among Geographically-Dispersed Student/Faculty Teams

    ERIC Educational Resources Information Center

    Fiore, Stephen M.; Rodriguez, Walter E.; Carstens, Deborah S.

    2012-01-01

    This paper presents a framework for facilitating communication among STEM project teams that are geographically dispersed in synchronous or asynchronous online courses. The framework has been developed to: (a) improve how engineering and technology students and faculty work with collocated and geographically-dispersed teams; and (b) to connect the…

  18. A Framework for the Evaluation of CASE Tool Learnability in Educational Environments

    ERIC Educational Resources Information Center

    Senapathi, Mali

    2005-01-01

    The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…

  19. A Framework for Authenticity in the Mathematics and Statistics Classroom

    ERIC Educational Resources Information Center

    Garrett, Lauretta; Huang, Li; Charleton, Maria Calhoun

    2016-01-01

    Authenticity is a term commonly used in reference to pedagogical and curricular qualities of mathematics teaching and learning, but its use lacks a coherent framework. The work of researchers in engineering education provides such a framework. Authentic qualities of mathematics teaching and learning are fit within a model described by Strobel,…

  20. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.
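
    To make the core idea concrete (this is only a toy illustration, not the VIATRA/Viatra-CEP API), the sketch below feeds model-change events into a small complex-event pattern matcher and fires a reactive transformation rule when a "create followed by delete within a time window" pattern completes; the event fields, pattern, and rule are hypothetical.

```python
# Sketch of the core idea only (not the VIATRA-CEP API): model-change events
# feed a tiny complex-event pattern matcher, and a reactive rule fires a
# transformation when the pattern completes.
from collections import deque
from dataclasses import dataclass

@dataclass
class ChangeEvent:
    kind: str        # "create" or "delete"
    element: str
    timestamp: float

class CreateThenDeletePattern:
    """Complex event pattern: deletion of an element within `window`
    seconds of its creation."""
    def __init__(self, window: float = 5.0):
        self.window = window
        self.pending = {}                # element -> creation timestamp

    def feed(self, event: ChangeEvent):
        if event.kind == "create":
            self.pending[event.element] = event.timestamp
        elif event.kind == "delete":
            created = self.pending.pop(event.element, None)
            if created is not None and event.timestamp - created <= self.window:
                return event.element     # pattern matched
        return None

def reactive_rule(element: str) -> None:
    print(f"transformation fired: flag '{element}' as transient in target model")

if __name__ == "__main__":
    stream = deque([ChangeEvent("create", "Block1", 0.0),
                    ChangeEvent("create", "Block2", 1.0),
                    ChangeEvent("delete", "Block1", 2.5)])
    matcher = CreateThenDeletePattern()
    while stream:
        hit = matcher.feed(stream.popleft())
        if hit:
            reactive_rule(hit)
```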

  1. [European Marketing Authorisation: a long process. Experiences of small biotech companies with the ATMP regulation].

    PubMed

    Buljovčić, Z

    2011-07-01

    On 30 December 2008, Regulation (EC) 1394/2007 on advanced therapy medicinal products (ATMPs) entered into force. With it, the first EU-wide regulatory framework for ATMPs was established. It requires a central marketing authorisation application to the EMA (European Medicines Agency). This new framework especially changes the code of regulatory practice for tissue engineered products (TEPs), as no registration procedure had previously been required for autologous TEPs, which also meant that no clinical proof of efficacy from a pivotal clinical trial was necessary. Difficulties and their background, as well as the vast product development requirements that small companies have to address within a very short time frame, are presented. It became obvious that the regulatory experience required to identify and implement the resulting implications was not yet in place and still had to be established. The lack of regulatory experience also resulted in difficulties with the preparation of scientific advice, expectations toward regulatory agencies and consultants, and the translation of regulatory requirements into practice. Addressing the regulatory requirements within the transition period is even more difficult for entrepreneurs whose products are assigned to indications that pose complex challenges to trial design. Because of the enormous time pressure to generate data and the associated financial pressure, different adaptation strategies are evolving. In Germany the "hospital exemption" according to §4b AMG (German Medicinal Products Law) is of major importance. A reorientation toward acellular products and a slowdown in the development of new ATMP products is expected.

  2. Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness

    NASA Astrophysics Data System (ADS)

    Kaushik, Anshul; Ramani, Anand

    2014-04-01

    Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.

  3. About, for, in or through entrepreneurship in engineering education

    NASA Astrophysics Data System (ADS)

    Mäkimurto-Koivumaa, Soili; Belt, Pekka

    2016-09-01

    Engineering competences form a potential basis for entrepreneurship. There are pressures to find new approaches to entrepreneurship education (EE) in engineering education, as the traditional analytical logic of engineering does not match the modern view of entrepreneurship. Since the previous models do not give tangible enough tools on how to organise EE in practice, this article aims to develop a new framework for EE at the university level. We approach this aim by analysing existing scientific literature complemented by long-term practical observations, enabling a fruitful interplay between theory and practice. The developed framework recommends aspects in EE to be emphasised during each year of the study process. Action-based learning methods are highlighted in the beginning of studies to support students' personal growth. Explicit business knowledge is to be gradually increased only when professional, field-specific knowledge has been adequately accumulated.

  4. Learning framework of “Integrating Techniques” for Solving Problems and Its Empirical Application in Doctoral Course in Mechanical Engineering

    NASA Astrophysics Data System (ADS)

    Otsuka, Yuichi; Ohta, Kazuhide; Noguchi, Hiroshi

    The 21st Century Center of Excellence (COE) program in the Department of Mechanical Engineering Science at Kyushu University constructed a training framework for learning "Integrating Techniques" through research presentations by students in different majors and analyses of practical accident cases by Ph.D. course students. The training framework is composed of three processes: 1) peer review of the presentations among Ph.D. course students, 2) instruction by teachers to improve the quality of the presentations based on the results of the peer reviews, and 3) final evaluation of the improved presentations by teachers and students. This research elucidated the quantitative effectiveness of the framework through questionnaire-based evaluations of the presentations. Furthermore, a survey of the course students showed a positive correlation between the perceived significance of integrating techniques and enthusiasm for participating in the course, which indicates the efficacy of the proposed learning framework.

  5. [Computer aided design for fixed partial denture framework based on reverse engineering technology].

    PubMed

    Sun, Yu-chun; Lü, Pei-jun; Wang, Yong

    2006-03-01

    To explore a computer-aided design (CAD) route for the framework of a domestic fixed partial denture (FPD) and to identify a suitable method for 3-D CAD. The working area of a dentition model was scanned with a 3-D mechanical scanner. Using reverse engineering (RE) software, margin and border curves were extracted and several reference curves were created to establish the dimension and location of the pontic framework, which was taken from the standard database. The shoulder parts of the retainers were created after the axial surfaces were constructed. The connecting areas, axial line, and curved surface of the framework connector were created last. The framework of a three-unit FPD was designed with RE technology, showing smooth surfaces and continuous contours. The design route is practical. The results of this study are significant in theory and practice and will provide a reference for establishing a computer-aided design/computer-aided manufacturing (CAD/CAM) system for domestic FPDs.

  6. In search of standards to support circularity in product policies: A systematic approach.

    PubMed

    Tecchio, Paolo; McAlister, Catriona; Mathieux, Fabrice; Ardente, Fulvio

    2017-12-01

    The aspiration of a circular economy is to shift material flows toward a zero waste and pollution production system. The process of shifting to a circular economy has been initiated by the European Commission in their action plan for the circular economy. The EU Ecodesign Directive is a key policy in this transition. However, to date the focus of access to market requirements on products has primarily been upon energy efficiency. The absence of adequate metrics and standards has been a key barrier to the inclusion of resource efficiency requirements. This paper proposes a framework to boost sustainable engineering and resource use by systematically identifying standardization needs and features. Standards can then support the setting of appropriate material efficiency requirements in EU product policy. Three high-level policy goals concerning material efficiency of products were identified: embodied impact reduction, lifetime extension and residual waste reduction. Through a lifecycle perspective, a matrix of interactions among material efficiency topics (recycled content, re-used content, relevant material content, durability, upgradability, reparability, re-manufacturability, reusability, recyclability, recoverability, relevant material separability) and policy goals was created. The framework was tested on case studies for electronic displays and washing machines. For potential material efficiency requirements, specific standardization needs were identified, such as adequate metrics for performance measurements, reliable and repeatable tests, and calculation procedures. The proposed novel framework aims to provide a method by which to identify key material efficiency considerations within the policy context, and to map out the generic and product-specific standardisation needs to support ecodesign. Via such an approach, many different stakeholders (industry, academics, policy makers, non-governmental organizations etc.) can be involved in material efficiency standards and regulations. Requirements and standards concerning material efficiency would compel product manufacturers, but also help designers and interested parties in addressing the sustainable resource use issue.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanz Rodrigo, Javier; Chávez Arroyo, Roberto Aurelio; Moriarty, Patrick

    The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer structure with unique physics. This poses significant challenges to traditional wind engineering models that rely on surface-layer theories and engineering wind farm models to simulate the flow in and around wind farms. However, adopting an ABL approach offers the opportunity to better integrate wind farm design tools and meteorological models. The challenge is how to build the bridge between the atmospheric and wind engineering model communities and how to establish a comprehensive evaluation process that identifies relevant physical phenomena for wind energy applications together with modeling and experimental requirements. A framework for model verification, validation, and uncertainty quantification is established to guide this process by a systematic evaluation of the modeling system at increasing levels of complexity. In terms of atmospheric physics, 'building the bridge' means developing models for the so-called 'terra incognita,' a term used to designate the turbulent scales that transition from mesoscale to microscale. This range of scales within atmospheric research deals with the transition from parameterized to resolved turbulence and the improvement of surface boundary-layer parameterizations. The coupling of meteorological and wind engineering flow models and the definition of a formal model evaluation methodology is a strong area of research for the next generation of wind conditions assessment and wind farm and wind turbine design tools. Some fundamental challenges are identified in order to guide future research in this area.

  8. Prediction of Launch Vehicle Ignition Overpressure and Liftoff Acoustics

    NASA Technical Reports Server (NTRS)

    Casiano, Matthew

    2009-01-01

    The LAIOP (Launch Vehicle Ignition Overpressure and Liftoff Acoustic Environments) program predicts the external pressure environment generated during liftoff for a large variety of rocket types. These environments include ignition overpressure, produced by the rapid acceleration of exhaust gases during rocket-engine start transient, and launch acoustics, produced by turbulence in the rocket plume. The ignition overpressure predictions are time-based, and the launch acoustic predictions are frequency-based. Additionally, the software can predict ignition overpressure mitigation, using water-spray injection into the rocket exhaust stream, for a limited number of configurations. The framework developed for these predictions is extensive, though some options require additional relevant data and development time. Once these options are enabled, the already extensively capable code will be further enhanced. The rockets, or launch vehicles, can either be elliptically or cylindrically shaped, and up to eight strap-on structures (boosters or tanks) are allowed. Up to four engines are allowed for the core launch vehicle, which can be of two different types. Also, two different sizes of strap-on structures can be used, and two different types of booster engines are allowed. Both tabular and graphical presentations of the predicted environments at the selected locations can be reviewed by the user. The output includes summaries of rocket-engine operation, ignition overpressure time histories, and one-third octave sound pressure spectra of the predicted launch acoustics. Also, documentation is available to the user to help him or her understand the various aspects of the graphical user interface and the required input parameters.

  9. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks (presentation layer, domain layer, data access layer) for future feature teams... http://www.sei.cmu.edu/training/elearning (Software Engineering Institute, Carnegie Mellon)

  10. Introductory guide to integrated ecological framework.

    DOT National Transportation Integrated Search

    2014-10-01

    This guide introduces the Integrated Ecological Framework (IEF) to Texas Department of Transportation (TxDOT) engineers and planners. The IEF is a step-by-step approach to integrating ecological and transportation planning with the goal of avoiding imp...

  11. A Knowledge Discovery framework for Planetary Defense

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C. P.; Li, Y.; Yu, M.; Bambacus, M.; Seery, B.; Barbee, B.

    2016-12-01

    Planetary Defense, a project funded by NASA Goddard and the NSF, is a multi-faceted effort focused on the mitigation of Near Earth Object (NEO) threats to our planet. Currently, information concerning NEOs is dispersed among different organizations and scientists, and there is no coherent system of information to be used for efficient NEO mitigation. In this paper, a planetary defense knowledge discovery engine is proposed to better assist the development and integration of a NEO response system. Specifically, we have implemented an organized information framework by two means: 1) the development of a semantic knowledge base, which provides a structure for relevant information; it has been built using web crawling and natural language processing techniques, which allow us to collect and store the most relevant structured information on a regular basis; and 2) the development of a knowledge discovery engine, which allows for the efficient retrieval of information from our knowledge base. The knowledge discovery engine has been built on top of Elasticsearch, an open source full-text search engine, as well as cutting-edge machine learning ranking and recommendation algorithms. This proposed framework is expected to advance knowledge discovery and innovation in the planetary science domain.
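
    To illustrate the retrieval idea behind the knowledge discovery engine (the project itself builds on Elasticsearch, which this toy sketch does not reproduce), the example below indexes a few hypothetical NEO records into a simple in-memory inverted index and ranks them with TF-IDF; the documents and query are made up.

```python
# Toy full-text retrieval sketch to illustrate the knowledge-discovery idea;
# the actual system is built on Elasticsearch, not this in-memory index.
import math
import re
from collections import Counter, defaultdict

DOCS = {  # hypothetical crawled NEO records
    "neo-001": "Apophis close approach mitigation options kinetic impactor",
    "neo-002": "Bennu sample return mission OSIRIS-REx orbit characterization",
    "neo-003": "kinetic impactor deflection strategy for near earth object",
}

def tokenize(text: str):
    return re.findall(r"[a-z0-9\-]+", text.lower())

# Build the inverted index: term -> {doc_id: term frequency}
index = defaultdict(dict)
for doc_id, text in DOCS.items():
    for term, tf in Counter(tokenize(text)).items():
        index[term][doc_id] = tf

def search(query: str, k: int = 3):
    """Rank documents by a simple TF-IDF score."""
    scores = Counter()
    for term in tokenize(query):
        postings = index.get(term, {})
        if not postings:
            continue
        idf = math.log(len(DOCS) / len(postings))
        for doc_id, tf in postings.items():
            scores[doc_id] += tf * idf
    return scores.most_common(k)

print(search("kinetic impactor mitigation"))
```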

  12. AIBench: a rapid application development framework for translational research in biomedicine.

    PubMed

    Glez-Peña, D; Reboiro-Jato, M; Maia, P; Rocha, M; Díaz, F; Fdez-Riverola, F

    2010-05-01

    Applied research in both biomedical discovery and translational medicine today often requires the rapid development of fully featured applications containing both advanced and specific functionalities for real use in practice. New tools are therefore demanded that allow for efficient generation, deployment and reuse of such biomedical applications as well as their associated functionalities. This paper presents AIBench, an open-source Java desktop application framework for scientific software development, with the goal of providing support to both fundamental and applied research in the domain of translational biomedicine. AIBench incorporates a powerful plug-in engine and a flexible scripting platform, and takes advantage of Java annotations, reflection and various design principles in order to make it easy to use, lightweight and non-intrusive. By following a basic input-processing-output life cycle, it is possible to fully develop multiplatform applications using only three types of concepts: operations, data-types and views. The framework automatically provides functionalities that are present in a typical scientific application, including user parameter definition, logging facilities, multi-threaded execution, experiment repeatability and user interface workflow management, among others. The proposed framework architecture defines a reusable component model which also allows new applications to be assembled by reusing libraries from past projects or third-party software. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  13. Toward a comprehensive landscape vegetation monitoring framework

    NASA Astrophysics Data System (ADS)

    Kennedy, Robert; Hughes, Joseph; Neeti, Neeti; Larrue, Tara; Gregory, Matthew; Roberts, Heather; Ohmann, Janet; Kane, Van; Kane, Jonathan; Hooper, Sam; Nelson, Peder; Cohen, Warren; Yang, Zhiqiang

    2016-04-01

    Blossoming Earth observation resources provide great opportunity to better understand land vegetation dynamics, but also require new techniques and frameworks to exploit their potential. Here, I describe several parallel projects that leverage time-series Landsat imagery to describe vegetation dynamics at regional and continental scales. At the core of these projects are the LandTrendr algorithms, which distill time-series earth observation data into periods of consistent long or short-duration dynamics. In one approach, we built an integrated, empirical framework to blend these algorithmically-processed time-series data with field data and lidar data to ascribe yearly change in forest biomass across the US states of Washington, Oregon, and California. In a separate project, we expanded from forest-only monitoring to full landscape land cover monitoring over the same regional scale, including both categorical class labels and continuous-field estimates. In these and other projects, we apply machine-learning approaches to ascribe all changes in vegetation to driving processes such as harvest, fire, urbanization, etc., allowing full description of both disturbance and recovery processes and drivers. Finally, we are moving toward extension of these same techniques to continental and eventually global scales using Google Earth Engine. Taken together, these approaches provide one framework for describing and understanding processes of change in vegetation communities at broad scales.

  14. Bioprocess scale-up/down as integrative enabling technology: from fluid mechanics to systems biology and beyond.

    PubMed

    Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk

    2017-09-01

    Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology in the agrofood, environmental, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains (at least partially) unpredictable when shifting from laboratory scale to industrial conditions. Robust microbial systems are thus highly needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with the environmental fluctuations typically found in large-scale bioreactors. However, this framework still needs some refinements, such as a better integration of gas-liquid flows in CFD, and taking intrinsic biological noise into account in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  15. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition

    PubMed Central

    Ordóñez, Francisco Javier; Roggen, Daniel

    2016-01-01

    Human activity recognition (HAR) tasks have traditionally been solved using engineered features obtained by heuristic processes. Current research suggests that deep convolutional neural networks are suited to automate feature extraction from raw sensor inputs. However, human activities are made of complex sequences of motor movements, and capturing this temporal dynamics is fundamental for successful HAR. Based on the recent success of recurrent neural networks for time series domains, we propose a generic deep framework for activity recognition based on convolutional and LSTM recurrent units, which: (i) is suitable for multimodal wearable sensors; (ii) can perform sensor fusion naturally; (iii) does not require expert knowledge in designing features; and (iv) explicitly models the temporal dynamics of feature activations. We evaluate our framework on two datasets, one of which has been used in a public activity recognition challenge. Our results show that our framework outperforms competing deep non-recurrent networks on the challenge dataset by 4% on average; outperforming some of the previous reported results by up to 9%. Our results show that the framework can be applied to homogeneous sensor modalities, but can also fuse multimodal sensors to improve performance. We characterise key architectural hyperparameters’ influence on performance to provide insights about their optimisation. PMID:26797612
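
    In the spirit of the architecture described above, the sketch below assembles a small convolutional + LSTM classifier for windowed multimodal sensor data using tf.keras; the window length, channel count, layer sizes and class count are illustrative assumptions, not the authors' exact DeepConvLSTM configuration.

```python
# Hedged sketch of a ConvLSTM-style HAR model; shapes and hyperparameters are
# illustrative, not the published DeepConvLSTM configuration.
import tensorflow as tf
from tensorflow.keras import layers

WINDOW_LEN = 128      # samples per sliding window
N_CHANNELS = 9        # e.g. 3-axis accelerometer + gyroscope + magnetometer
N_CLASSES = 6         # activity labels

model = tf.keras.Sequential([
    layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
    # Convolutional layers learn local motion features within each window.
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    # Recurrent layers model the temporal dynamics of those features.
    layers.LSTM(128, return_sequences=True),
    layers.LSTM(128),
    layers.Dropout(0.5),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would use windows of raw sensor data and integer activity labels:
# model.fit(x_train, y_train, epochs=20, batch_size=64, validation_split=0.1)
```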

  16. Field-widened Michelson interferometer for spectral discrimination in high-spectral-resolution lidar: theoretical framework.

    PubMed

    Cheng, Zhongtao; Liu, Dong; Luo, Jing; Yang, Yongying; Zhou, Yudi; Zhang, Yupeng; Duan, Lulin; Su, Lin; Yang, Liming; Shen, Yibing; Wang, Kaiwei; Bai, Jian

    2015-05-04

    A field-widened Michelson interferometer (FWMI) is developed to act as the spectral discriminator in high-spectral-resolution lidar (HSRL). This realization is motivated by the wide-angle Michelson interferometer (WAMI), which has been used broadly in atmospheric wind and temperature detection. This paper describes, for the first time, an independent theoretical framework for the application of the FWMI in HSRL. In the framework, the operation principles and application requirements of the FWMI are discussed in comparison with those of the WAMI. Theoretical foundations for designing this type of interferometer are introduced based on these comparisons. Moreover, a general performance estimation model for the FWMI is established, which can provide common guidelines for the performance budget and evaluation of the FWMI in both the design and operation stages. Examples incorporating many practical imperfections or conditions that may degrade the performance of the FWMI are given to illustrate the implementation of the modeling. This theoretical framework presents a complete and powerful tool for solving most of the theoretical or engineering problems encountered in the FWMI application, including design, parameter calibration, prior performance budget, posterior performance estimation, and so on. It will be a valuable contribution to the lidar community in developing a new generation of HSRLs based on the FWMI spectroscopic filter.
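
    For orientation, the idealized (lossless, perfectly aligned) transmission of one output port of a Michelson interferometer with fixed optical path difference Δ, a textbook starting point for this kind of spectral-discrimination analysis rather than the paper's full model, is:

```latex
T(\nu) \;=\; \tfrac{1}{2}\left[\,1 + \cos\!\left(\frac{2\pi\,\nu\,\Delta}{c}\right)\right]
```

    where ν is the optical frequency and c the speed of light; Δ sets the free spectral range c/Δ, and the FWMI is designed so that the narrowband aerosol return sits near a transmission extremum while the spectrally broadened molecular return is averaged over the fringe.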

  17. Parallels in Computer-Aided Design Framework and Software Development Environment Efforts.

    DTIC Science & Technology

    1992-05-01

    design kits, and tool and design management frameworks. Also, books about software engineering environments [Long 91] and electronic design...tool integration [Zarrella 90], and agreement upon a universal design automation framework, such as the CAD Framework Initiative (CFI) [Malasky 91...ments: identification, control, status accounting, and audit and review. The paper by Dart extracts 15 CM concepts from existing SDEs and tools

  18. On extracting design principles from biology: I. Method-General answers to high-level design questions for bioinspired robots.

    PubMed

    Haberland, M; Kim, S

    2015-02-02

    When millions of years of evolution suggest a particular design solution, we may be tempted to abandon traditional design methods and copy the biological example. However, biological solutions do not often translate directly into the engineering domain, and even when they do, copying eliminates the opportunity to improve. A better approach is to extract design principles relevant to the task of interest, incorporate them in engineering designs, and vet these candidates against others. This paper presents the first general framework for determining whether biologically inspired relationships between design input variables and output objectives and constraints are applicable to a variety of engineering systems. Using optimization and statistics to generalize the results beyond a particular system, the framework overcomes shortcomings observed of ad hoc methods, particularly those used in the challenging study of legged locomotion. The utility of the framework is demonstrated in a case study of the relative running efficiency of rotary-kneed and telescoping-legged robots.

  19. Service Cart For Engines

    NASA Technical Reports Server (NTRS)

    Ng, Gim Shek

    1995-01-01

    Cart supports rear-mounted air-cooled engine from Volkswagen or Porsche automobile. One person removes, repairs, tests, and reinstalls engine of car, van, or home-built airplane. Consists of framework of wood, steel, and aluminum components supported by four wheels. Engine lifted from vehicle by hydraulic jack and gently lowered onto waiting cart. Jack removed from under engine. Rear of vehicle raised just enough that engine can be rolled out from under it. Cart easily supports 200-lb engine. Also used to hold transmission. With removable sheet-metal top, cart used as portable seat.

  20. LIFE CYCLE ENGINEERING GUIDELINES

    EPA Science Inventory

    This document provides guidelines for the implementation of LCE concepts, information, and techniques in engineering products, systems, processes, and facilities. To make this document as practical and useable as possible, a unifying LCE framework is presented. Subsequent topics ...

  1. Generating Alternative Engineering Designs by Integrating Desktop VR with Genetic Algorithms

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Bertoline, Gary; Connolly, Patrick

    2009-01-01

    This study proposes an innovative solution to the problem of multiobjective engineering design optimization by integrating desktop VR with genetic computing. Although this study considers the case of construction design as an example to illustrate the framework, this method can very much be extended to other engineering design problems as well.…

  2. Enhancing Critical Thinking across the Undergraduate Experience: An Exemplar from Engineering

    ERIC Educational Resources Information Center

    Ralston, Patricia A.; Bays, Cathy L.

    2013-01-01

    Faculty in a large, urban school of engineering designed a longitudinal study to assess the critical thinking skills of undergraduate students as they progressed through the engineering program. The Paul-Elder critical thinking framework was used to design course assignments and develop a holistic assessment rubric. The curriculum was re-designed…

  3. Integration, Authenticity, and Relevancy in College Science through Engineering Design

    ERIC Educational Resources Information Center

    Turner, Ken L., Jr.; Hoffman, Adam R.

    2018-01-01

    Engineering design is an ideal perspective for engaging students in college science classes. An engineering design problem-solving framework was used to create a general chemistry lab activity focused on an important environmental issue--dead zones. Dead zones impact over 400 locations around the world and are a result of nutrient pollution, one…

  4. Cooperative Engineering as a Joint Action

    ERIC Educational Resources Information Center

    Joffredo-Le Brun, Sophie; Morellato, Mireille; Sensevy, Gérard; Quilio, Serge

    2018-01-01

    This paper describes some elements of a specific kind of design-based research, cooperative engineering. In the first part of the paper, we argue that cooperative engineering can be analyzed through a joint action framework. We first present some conceptual tools that the Joint Action Theory in Didactics proposes in order to understand didactic…

  5. Understanding the Leaky Engineering Pipeline: Motivation and Job Adaptability of Female Engineers

    ERIC Educational Resources Information Center

    Saraswathiamma, Manjusha Thekkedathu

    2010-01-01

    This dissertation is a mixed-method study conducted using qualitative grounded theory and quantitative survey and correlation approaches. This study aims to explore the motivation and adaptability of females in the engineering profession and to develop a theoretical framework for both motivation and adaptability issues. As a result, this study…

  6. An Innovative Improvement of Engineering Learning System Using Computational Fluid Dynamics Concept

    ERIC Educational Resources Information Center

    Hung, T. C.; Wang, S. K.; Tai, S. W.; Hung, C. T.

    2007-01-01

    An innovative concept of an electronic learning system has been established in an attempt to achieve a technology that provides engineering students with an instructive and affordable framework for learning engineering-related courses. This system utilizes an existing Computational Fluid Dynamics (CFD) package, Active Server Pages programming,…

  7. Probabilistic consensus scoring improves tandem mass spectrometry peptide identification.

    PubMed

    Nahnsen, Sven; Bertsch, Andreas; Rahnenführer, Jörg; Nordheim, Alfred; Kohlbacher, Oliver

    2011-08-05

    Database search is a standard technique for identifying peptides from their tandem mass spectra. To increase the number of correctly identified peptides, we suggest a probabilistic framework that allows the combination of scores from different search engines into a joint consensus score. Central to the approach is a novel method to estimate scores for peptides not found by an individual search engine. This approach allows the estimation of p-values for each candidate peptide and their combination across all search engines. The consensus approach works better than any single search engine across all different instrument types considered in this study. Improvements vary strongly from platform to platform and from search engine to search engine. Compared to the industry standard MASCOT, our approach can identify up to 60% more peptides. The software for consensus predictions is implemented in C++ as part of OpenMS, a software framework for mass spectrometry. The source code is available in the current development version of OpenMS and can easily be used as a command line application or via a graphical pipeline designer TOPPAS.
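
    As a toy illustration of the consensus idea (not the authors' probabilistic model, which also estimates scores for peptides missed by an individual engine), per-engine p-values for the same candidate peptide can be combined with Fisher's method via SciPy:

```python
# Illustrative only: combine per-search-engine p-values for one candidate peptide.
from scipy.stats import combine_pvalues

def consensus_pvalue(engine_pvalues):
    """engine_pvalues: one p-value per search engine for the same candidate."""
    _, p_combined = combine_pvalues(engine_pvalues, method="fisher")
    return p_combined

print(consensus_pvalue([0.01, 0.20, 0.05]))
```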

  8. From EGEE Operations Portal towards EGI Operations Portal

    NASA Astrophysics Data System (ADS)

    Cordier, Hélène; L'Orphelin, Cyril; Reynaud, Sylvain; Lequeux, Olivier; Loikkanen, Sinikka; Veyre, Pierre

    Grid operators in EGEE have been using a dedicated dashboard as their central operational tool, which has remained stable and scalable for the last 5 years despite continuous upgrades driven by specifications from users, monitoring tools and data providers. In EGEE-III, the recent regionalisation of operations led the Operations Portal developers to conceive a standalone instance of this tool. We will see how the dashboard reorganization paved the way for the re-engineering of the portal itself. The outcome is an easily deployable package customized with relevant information sources and specific decentralized operational requirements. This package is composed of Lavoisier, a generic and scalable data access mechanism; Symfony, a renowned PHP framework chosen for configuration flexibility; and a MySQL database. VO life-cycle and operational information, EGEE broadcasts and downtime notifications are next in the major reorganization, until all other key features of the Operations Portal are migrated to the framework. Feature specifications will be sketched at the same time in order to adapt to EGI requirements and to upgrade. Future work on feature regionalisation, on new advanced features and on strategy planning will be tracked in EGI-InSPIRE through the Operations Tools Advisory Group (OTAG), where all users, customers and third parties of the Operations Portal are represented from January 2010.

  9. Alternative Classification Framework for Engineering Capability Enhancement

    ERIC Educational Resources Information Center

    Patamakajonpong, Mana; Chandarasupsang, Tirapot

    2015-01-01

    Purpose: This paper aims to present an alternative practical framework to classify the skill and knowledge of the individual trainees by comparing it with the expert in an organization. This framework gives the benefit to the organization in order to know the ability level of the personnel and to be able to provide the personnel development method…

  10. 46 CFR 11.502 - Additional requirements for engineer endorsements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Additional requirements for engineer endorsements. 11... SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.502 Additional requirements for engineer endorsements. (a) For all original and raise of grade of engineer...

  11. 46 CFR 11.502 - Additional requirements for engineer endorsements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Additional requirements for engineer endorsements. 11... SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.502 Additional requirements for engineer endorsements. (a) For all original and raise of grade of engineer...

  12. 46 CFR 11.502 - Additional requirements for engineer endorsements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Additional requirements for engineer endorsements. 11... SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.502 Additional requirements for engineer endorsements. (a) For all original and raise of grade of engineer...

  13. 46 CFR 11.502 - Additional requirements for engineer endorsements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Additional requirements for engineer endorsements. 11... SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.502 Additional requirements for engineer endorsements. (a) For all original and raise of grade of engineer...

  14. UMCP-BG and E collaboration in nuclear power engineering in the framework of DOE-Utility Nuclear Power Engineering Education Matching Grant Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Lothar PhD

    2000-03-01

    The DOE-Utility Nuclear Power Engineering Education Matching Grant Program has been established to support the education of students in Nuclear Engineering Programs to maintain a knowledgeable workforce in the United States in order to keep nuclear power as a viable component in a mix of energy sources for the country. The involvement of the utility industry ensures that this grant program satisfies the needs and requirements of local nuclear energy producers and at the same time establishes a strong linkage between education and day-to-day nuclear power generation. As of 1997, seventeen pairs of university-utility partners existed. UMCP was never a member of that group of universities, but applied for the first time with a proposal to Baltimore Gas and Electric Company in January 1999 [1]. This proposal was generously granted by BG&E [2,3] in the form of a gift in the amount of $25,000 from BG&E's Corporate Contribution Program. Upon the arrival of a newly appointed Director of Administration in the Department of Materials and Nuclear Engineering, the BG&E check was deposited into the University's Maryland Foundation Fund. The receipt of the letter and the check enabled UMCP to apply for DOE's matching funds in the same amount by a proposal.

  15. Rapid-X - An FPGA Development Toolset Using a Custom Simulink Library for MTCA.4 Modules

    NASA Astrophysics Data System (ADS)

    Prędki, Paweł; Heuer, Michael; Butkowski, Łukasz; Przygoda, Konrad; Schlarb, Holger; Napieralski, Andrzej

    2015-06-01

    The recent introduction of advanced hardware architectures such as the Micro Telecommunications Computing Architecture (MTCA) caused a change in the approach to implementation of control schemes in many fields. The development has been moving away from traditional programming languages (C/C++) to hardware description languages (VHDL, Verilog), which are used in FPGA development. With MATLAB/Simulink it is possible to describe complex systems with block diagrams and simulate their behavior. Those diagrams are then used by the HDL experts to implement exactly the required functionality in hardware. Both the porting of existing applications and the adaptation of new ones require a lot of development time from them. To solve this, Xilinx System Generator, a toolbox for MATLAB/Simulink, allows rapid prototyping of those block diagrams using hardware modelling. It is still up to the firmware developer to merge this structure with the hardware-dependent HDL project. This prevents the application engineer from quickly verifying the proposed schemes in real hardware. The framework described in this article overcomes these challenges, offering a hardware-independent library of components that can be used in Simulink/System Generator models. The components are subsequently translated into VHDL entities and integrated with a pre-prepared VHDL project template. Furthermore, the entire implementation process is run in the background, giving the user an almost one-click path from control scheme modelling and simulation to bit-file generation. This approach allows the application engineers to quickly develop new schemes and test them in a real hardware environment. The applications may range from simple data logging or signal generation to very advanced controllers. Taking advantage of the Simulink simulation capabilities and user-friendly hardware implementation routines, the framework significantly decreases the development time of FPGA-based applications.

  16. Nanofiber Scaffold-Based Tissue-Engineered Retinal Pigment Epithelium to Treat Degenerative Eye Diseases

    PubMed Central

    Khristov, Vladimir; Wan, Qin; Sharma, Ruchi; Jha, Balendu Shekhar; Lotfi, Mostafa; Maminishkis, Arvydas; Simon, Carl G.

    2016-01-01

    Clinical-grade manufacturing of a functional retinal pigment epithelium (RPE) monolayer requires reproducing, as closely as possible, the natural environment in which RPE grows. In vitro, this can be achieved by a tissue engineering approach, in which the RPE is grown on a nanofibrous biological or synthetic scaffold. Recent research has shown that nanofiber scaffolds perform better for cell growth and transplantability compared with their membrane counterparts and that the success of the scaffold in promoting cell growth/function is not heavily material dependent. With these strides, the field has advanced enough to begin to consider implementation of one, or a combination, of the tissue engineering strategies discussed herein. In this study, we review the current state of tissue engineering research for in vitro culture of RPE/scaffolds and the parameters for optimal scaffold design that have been uncovered during this research. Next, we discuss production methods and manufacturers that are capable of producing the nanofiber scaffolds in a way that is viable biologically, clinically, commercially, and from a regulatory standpoint. Then, we discuss how the scaffolds could be characterized, both morphologically and mechanically, to develop a testing process that is viable for regulatory screening. Finally, an example of a tissue-engineered RPE/scaffold construct is given to provide the reader a framework for understanding how these pieces could fit together to develop a tissue-engineered RPE/scaffold construct that could pass regulatory scrutiny and be commercially successful. PMID:27110730

  17. Developing Sustainable Urban Water-Energy Infrastructures: Applying a Multi-Sectoral Social-Ecological-Infrastructural Systems (SEIS) Framework

    NASA Astrophysics Data System (ADS)

    Ramaswami, A.

    2016-12-01

    Urban infrastructure - broadly defined to include the systems that provide water, energy, food, shelter, transportation-communication, sanitation and green/public spaces in cities - has a tremendous impact on the environment and on human well-being (Ramaswami et al., 2016; Ramaswami et al., 2012). Aggregated globally, these sectors contribute 90% of global greenhouse gas (GHG) emissions and 96% of global water withdrawals. Urban infrastructure contributions to such impacts are beginning to dominate. Cities are therefore becoming the action arena for infrastructure transformations that can achieve high levels of service delivery while reducing environmental impacts and enhancing human well-being. Achieving sustainable urban infrastructure transitions requires information about the engineered infrastructure and its interactions with the natural (ecological-environmental) and social sub-systems. In this paper, we apply a multi-sector, multi-scalar Social-Ecological-Infrastructural Systems (SEIS) framework that describes the interactions among biophysical engineered infrastructures, the natural environment and the social system in a systems approach to inform urban infrastructure transformations. We apply the SEIS framework to inform water and energy sector transformations in cities to achieve environmental and human health benefits realized at multiple scales - local, regional and global. Local scales address pollution, health, well-being and inequity within the city; regional scales address regional pollution, scarcity, as well as supply risks in the water-energy sectors; global impacts include greenhouse gas emissions and climate impacts. Different actors shape infrastructure transitions, including households, businesses, and policy actors. We describe the development of novel cross-sectoral strategies at the water-energy nexus in cities, focusing on the water, waste and energy sectors, in a case study of Delhi, India. Ramaswami, A.; Russell, A.G.; Culligan, P.J.; Sharma, K.R.; Kumar, E. (2016). Meta-Principles for Developing Smart, Sustainable, and Healthy Cities. Science, 352(6288), 940-943. Ramaswami, A., et al. (2012). A Social-Ecological Infrastructural Systems Framework for Inter-Disciplinary Study of Sustainable City-Systems. Journal of Industrial Ecology, 16(6), 801-813.

  18. Programming (Tips) for Physicists & Engineers

    ScienceCinema

    Ozcan, Erkcan

    2018-02-19

    Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building/framework integration, documentation and good visualization, working remotely, not reinventing the wheel.

  19. Programming (Tips) for Physicists & Engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozcan, Erkcan

    2010-07-13

    Programming for today's physicists and engineers. Work environment: today's astroparticle and accelerator experiments and the information industry rely on large collaborations. Needed more than ever: code sharing/reuse, code building/framework integration, documentation and good visualization, working remotely, not reinventing the wheel.

  20. A knowledge engineering framework towards clinical support for adverse drug event prevention: the PSIP approach.

    PubMed

    Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos

    2009-01-01

    Adverse Drug Events (ADEs) are currently considered as a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose to knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.

  1. Pore surface engineering in covalent organic frameworks.

    PubMed

    Nagai, Atsushi; Guo, Zhaoqi; Feng, Xiao; Jin, Shangbin; Chen, Xiong; Ding, Xuesong; Jiang, Donglin

    2011-11-15

    Covalent organic frameworks (COFs) are a class of important porous materials that allow atomically precise integration of building blocks to achieve pre-designable pore size and geometry; however, pore surface engineering in COFs remains challenging. Here we introduce pore surface engineering to COF chemistry, which allows the controlled functionalization of COF pore walls with organic groups. This functionalization is made possible by the use of azide-appended building blocks for the synthesis of COFs with walls to which a designable content of azide units is anchored. The azide units can then undergo a quantitative click reaction with alkynes to produce pore surfaces with desired groups and preferred densities. The diversity of click reactions performed shows that the protocol is compatible with the development of various specific surfaces in COFs. Therefore, this methodology constitutes a step in the pore surface engineering of COFs to realize pre-designed compositions, components and functions.

  2. The Semantic eScience Framework

    NASA Astrophysics Data System (ADS)

    McGuinness, Deborah; Fox, Peter; Hendler, James

    2010-05-01

    The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved in a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications all the way up to communities can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for the future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, and/or educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory. The VSTO utilizes leading edge knowledge representation, query and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing and atmospheric structure) using extensions to existing modular ontologies and the VSTO data framework, while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?" http://tw.rpi.edu/portal/SESF

  3. The Semantic eScience Framework

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; McGuinness, D. L.

    2009-12-01

    The goal of this effort is to design and implement a configurable and extensible semantic eScience framework (SESF). Configuration requires research into accommodating different levels of semantic expressivity and user requirements from use cases. Extensibility is being achieved in a modular approach to the semantic encodings (i.e. ontologies) performed in community settings, i.e. an ontology framework into which specific applications all the way up to communities can extend the semantics for their needs. We report on how we are accommodating the rapid advances in semantic technologies and tools and the sustainable software path for the future (certain) technical advances. In addition to a generalization of the current data science interface, we will present plans for an upper-level interface suitable for use by clearinghouses, and/or educational portals, digital libraries, and other disciplines. SESF builds upon previous work in the Virtual Solar-Terrestrial Observatory. The VSTO utilizes leading edge knowledge representation, query and reasoning techniques to support knowledge-enhanced search, data access, integration, and manipulation. It encodes term meanings and their inter-relationships in ontologies and uses these ontologies and associated inference engines to semantically enable the data services. The Semantically-Enabled Science Data Integration (SESDI) project implemented data integration capabilities among three sub-disciplines (solar radiation, volcanic outgassing and atmospheric structure) using extensions to existing modular ontologies and the VSTO data framework, while adding smart faceted search and semantic data registration tools. The Semantic Provenance Capture in Data Ingest Systems (SPCDIS) project has added explanation provenance capabilities to an observational data ingest pipeline for images of the Sun, providing a set of tools to answer diverse end-user questions such as "Why does this image look bad?"

  4. Interactions between Flight Dynamics and Propulsion Systems of Air-Breathing Hypersonic Vehicles

    NASA Astrophysics Data System (ADS)

    Dalle, Derek J.

    The development and application of a first-principles-derived reduced-order model called MASIV (Michigan/AFRL Scramjet In Vehicle) for an air-breathing hypersonic vehicle is discussed. Several significant and previously unreported aspects of hypersonic flight are investigated. A fortunate coupling between increasing Mach number and decreasing angle of attack is shown to extend the range of operating conditions for a class of supersonic inlets. Detailed maps of isolator unstart and ram-to-scram transition are shown on the flight corridor map for the first time. In scram mode the airflow remains supersonic throughout the engine, while in ram mode there is a region of subsonic flow. Accurately predicting the transition between these two modes requires models for complex shock interactions, finite-rate chemistry, fuel-air mixing, pre-combustion shock trains, and thermal choking, which are incorporated into a unified framework here. Isolator unstart occurs when the pre-combustion shock train is longer than the isolator, which blocks airflow from entering the engine. Finally, cooptimization of the vehicle design and trajectory is discussed. An optimal control technique is introduced that greatly reduces the number of computations required to optimize the simulated trajectory.

  5. Ecological requirements for pallid sturgeon reproduction and recruitment in the Lower Missouri River: Annual report 2009

    USGS Publications Warehouse

    DeLonay, Aaron J.; Jacobson, Robert B.; Papoulias, Diana M.; Wildhaber, Mark L.; Chojnacki, Kimberly A.; Pherigo, Emily K.; Bergthold, Casey L.; Mestl, Gerald E.

    2010-01-01

    The Comprehensive Sturgeon Research Project is a multiyear, multiagency collaborative research framework developed to provide information to support pallid sturgeon recovery and Missouri River management decisions. The general Comprehensive Sturgeon Research Project strategy is to integrate field and laboratory studies of sturgeon reproductive ecology, habitat requirements, and physiology to produce a predictive understanding of sturgeon population dynamics. The project scope of work is developed annually with cooperating research partners and in collaboration with the U.S. Army Corps of Engineers, Missouri River Recovery-Integrated Science Program. The research consists of several interdependent and complementary research tasks engaging multiple disciplines that primarily address spawning as a probable limiting factor in reproduction and survival of the pallid sturgeon. The research is multifaceted and is designed to provide information needed for management decisions impacting habitat restoration, flow modification, and pallid sturgeon population augmentation on the Missouri River, and throughout the range of the species. Research activities and progress towards understanding of the species are reported to the U.S. Army Corps of Engineers annually. This annual report details the research effort and progress made by Comprehensive Sturgeon Research Project during 2009.

  6. Improving INPE'S balloon ground facilities for operation of the protoMIRAX experiment

    NASA Astrophysics Data System (ADS)

    Mattiello-Francisco, F.; Rinke, E.; Fernandes, J. O.; Cardoso, L.; Cardoso, P.; Braga, J.

    2014-10-01

    The system requirements for reusing the scientific balloon ground facilities available at INPE were a challenge to the ground system engineers involved in the protoMIRAX X-ray astronomy experiment. A significant effort on software updating was required for the balloon ground station. Considering that protoMIRAX is a pathfinder for the MIRAX satellite mission, a ground infrastructure compatible with INPE's satellite operation approach would be useful and highly recommended to control and monitor the experiment during the balloon flights. This approach will make use of the SATellite Control System (SATCS), a software-based architecture developed at INPE for satellite commanding and monitoring. SATCS complies with particular operational requirements of different satellites by using several customized object-oriented software elements and frameworks. We present the ground solution designed for protoMIRAX operation, the Control and Reception System (CRS). A new server computer, properly configured with Ethernet, has extended the existing ground station facilities with switch, converters and new software (OPS/SERVER) in order to support the available uplink and downlink channels being mapped to TCP/IP gateways required by SATCS. Currently, the CRS development is customizing the SATCS for the kernel functions of protoMIRAX command and telemetry processing. Design-patterns, component-based libraries and metadata are widely used in the SATCS in order to extend the frameworks to address the Packet Utilization Standard (PUS) for ground-balloon communication, in compliance with the services provided by the data handling computer onboard the protoMIRAX balloon.

  7. Framework Requirements for MDO Application Development

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Townsend, J. C.

    1999-01-01

    Frameworks or problem solving environments that support application development form an active area of research. The Multidisciplinary Optimization Branch at NASA Langley Research Center is investigating frameworks for supporting multidisciplinary analysis and optimization research. The Branch has generated a list of framework requirements, based on the experience gained from the Framework for Interdisciplinary Design Optimization project and the information acquired during a framework evaluation process. In this study, four existing frameworks are examined against these requirements. The results of this examination suggest several topics for further framework research.

  8. A Recommended Framework for the Network-Centric Acquisition Process

    DTIC Science & Technology

    2009-09-01

    ISO/IEC 12207, Systems and Software Engineering - Software Life-Cycle Processes, and ANSI/EIA 632, Processes for Engineering a System. There are...engineering [46]. Some of the process models presented in the DAG are: ISO/IEC 15288, Systems and Software Engineering - System Life-Cycle Processes...e.g., ISO, IA, Security, etc.). Vetting developers helps ensure that they are using best industry practices and maximizing IA compliance

  9. Industrial biosystems engineering and biorefinery systems.

    PubMed

    Chen, Shulin

    2008-06-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed for meeting the needs for science, technology and professionals by the upcoming bioeconomy. With emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discussed the background, the suggested definition, the theoretical framework and methodologies of this new discipline as well as its challenges and future development.

  10. Generalizing the extensibility of a dynamic geometry software

    NASA Astrophysics Data System (ADS)

    Herceg, Đorđe; Radaković, Davorka; Herceg, Dejana

    2012-09-01

    Plug-and-play visual components in a Dynamic Geometry Software (DGS) enable the development of visually attractive, rich and highly interactive dynamic drawings. We are developing SLGeometry, a DGS that contains a custom programming language, a computer algebra system (CAS engine) and a graphics subsystem. The basic extensibility framework of SLGeometry supports dynamic addition of new functions from attribute-annotated classes that implement runtime metadata registration in code. We present a general plug-in framework for dynamic importing of arbitrary Silverlight user interface (UI) controls into SLGeometry at runtime. The CAS engine maintains a metadata storage that describes each imported visual component and enables two-way communication between the expressions stored in the engine and the UI controls on the screen.
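
    The registration mechanism described above (attribute-annotated classes whose metadata the CAS engine collects at runtime) can be paraphrased in Python with a decorator-based registry; this is an analogue for illustration only, since SLGeometry itself is a Silverlight/.NET system and the names below are not its actual API.

```python
# Python analogue of runtime metadata registration for engine-discoverable functions.
FUNCTION_REGISTRY = {}

def cas_function(name, arity):
    """Decorator registering a callable plus its metadata with the expression engine."""
    def register(func):
        FUNCTION_REGISTRY[name] = {"callable": func, "arity": arity}
        return func
    return register

@cas_function(name="Midpoint", arity=2)
def midpoint(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

print(FUNCTION_REGISTRY["Midpoint"]["callable"]((0, 0), (4, 2)))  # (2.0, 1.0)
```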

  11. OWLing Clinical Data Repositories With the Ontology Web Language

    PubMed Central

    Pastor, Xavier; Lozano, Esther

    2014-01-01

    Background: The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. Objective: The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. Methods: We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Results: Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. Conclusions: OntoCRF is a complete framework to build data repositories with a solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it does not require highly skilled technical staff, thus facilitating the engineering of clinical software systems. PMID:25599697

  12. OWLing Clinical Data Repositories With the Ontology Web Language.

    PubMed

    Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther

    2014-08-01

    The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All required information to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. OntoCRF is a complete framework to build data repositories with a solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems, and it does not require highly skilled technical staff, thus facilitating the engineering of clinical software systems.

  13. An Ethical (Descriptive) Framework for Judgment of Actions and Decisions in the Construction Industry and Engineering-Part I.

    PubMed

    Alkhatib, Omar J; Abdou, Alaa

    2018-04-01

    The construction industry is usually characterized as a fragmented system of multiple-organizational entities in which members from different technical backgrounds and moral values join together to develop a particular business or project. The greatest challenge in the construction process for the achievement of a successful practice is the development of an outstanding reputation, which is built on identifying and applying an ethical framework. This framework should reflect a common ethical ground for myriad people involved in this process to survive and compete ethically in today's turbulent construction market. This study establishes a framework for ethical judgment of behavior and actions conducted in the construction process. The framework was primarily developed based on the essential attributes of business management identified in the literature review and subsequently incorporates additional attributes identified to prevent breaches in the construction industry and common ethical values related to professional engineering. The proposed judgment framework is based primarily on the ethical dimension of professional responsibility. The Ethical Judgment Framework consists of descriptive approaches involving technical, professional, administrative, and miscellaneous terms. The framework provides the basis for judging actions as either ethical or unethical. Furthermore, the framework can be implemented as a form of preventive ethics, which would help avoid ethical dilemmas and moral allegations. The framework can be considered a decision-making model to guide actions and improve the ethical reasoning process that would help individuals think through possible implications and consequences of ethical dilemmas in the construction industry.

  14. Cross-species 3D virtual reality toolbox for visual and cognitive experiments.

    PubMed

    Doucet, Guillaume; Gulli, Roberto A; Martinez-Trujillo, Julio C

    2016-06-15

    Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over the stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem, by combining strict control of experimental variables and behavioral monitoring within complex and realistic environments. We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal 3) has been customized to interface with any control software via text commands, allowing seamless introduction into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two most common programming languages used in visual neuroscience: Matlab and Python. The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms. Unlike previously proposed VR solutions whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks, as it relies on already familiar programming environments. Moreover, as it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans. This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks or requiring any major hardware changes.

  15. Requirements as Goals and Commitments Too

    NASA Astrophysics Data System (ADS)

    Chopra, Amit K.; Mylopoulos, John; Dalpiaz, Fabiano; Giorgini, Paolo; Singh, Munindar P.

    In traditional software engineering research and practice, requirements are classified either as functional or non-functional. Functional requirements consist of all functions the system-to-be ought to support, and have been modeled in terms of box-and-arrow diagrams in the spirit of SADT. Non-functional requirements include desired software qualities for the system-to-be and have been described either in natural language or in terms of metrics. This orthodoxy was challenged in the mid-1990s by a host of proposals that had a common theme: all requirements are initially stakeholder goals and ought to be elicited, modeled and analyzed as such. Through systematic processes, these goals can be refined into specifications of functions that the system-to-be needs to deliver, along with actions that external actors need to execute. This view dominates Requirements Engineering (RE) research and is beginning to have an impact on RE practice. We propose a next step along this line of research by adopting the concept of conditional commitment as a companion concept to that of goal. Goals are intentional entities that capture the needs and wants of stakeholders. Commitments, on the other hand, are social concepts that define the willingness and capability of an actor A to fulfill a predicate ϕ for the benefit of actor B, provided B (in return) fulfills predicate ψ for the benefit of actor A. In our conceptualization, goals are mapped to collections of commitments rather than functions, qualities, or actor assignments. We motivate the importance of the concept of commitment for RE through examples and discussion. We also contrast our proposal with state-of-the-art requirements modeling and analysis frameworks, such as KAOS, MAP, i*, and Tropos.
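
    A conditional commitment C(A, B, ψ, ϕ) of this kind can be captured by a small data structure; the sketch below is illustrative only and is not taken from KAOS, i*, Tropos or the authors' formalization.

```python
# Minimal sketch of a conditional commitment: debtor A owes consequent phi to
# creditor B once B brings about antecedent psi.
from dataclasses import dataclass

@dataclass
class Commitment:
    debtor: str       # actor A
    creditor: str     # actor B
    antecedent: str   # predicate psi that B must bring about
    consequent: str   # predicate phi that A then owes to B

    def detach(self):
        """Once the antecedent holds, A's obligation becomes unconditional."""
        return Commitment(self.debtor, self.creditor, "true", self.consequent)

c = Commitment("seller", "buyer", "payment made", "goods delivered")
print(c.detach())
```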

  16. Pursuing High-Mobility n-Type Organic Semiconductors by Combination of "Molecule-Framework" and "Side-Chain" Engineering.

    PubMed

    Zhang, Cheng; Zang, Yaping; Zhang, Fengjiao; Diao, Ying; McNeill, Christopher R; Di, Chong-An; Zhu, Xiaozhang; Zhu, Daoben

    2016-10-01

    "Molecule-framework" and "side-chain" engineering is powerful for the design of high-performance organic semiconductors. Based on 2DQTTs, the relationship between molecular structure, film microstructure, and charge-transport property in organic thin-film transistors (OTFTs) is studied. 2DQTT-o-B exhibits outstanding electron mobilities of 5.2 cm 2 V -1 s -1 , which is a record for air-stable solution-processable n-channel small-molecule OTFTs to date. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Crystal Engineering of an nbo Topology Metal-Organic Framework for Chemical Fixation of CO₂ under Ambient Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Wen-Yang; Chen, Yao; Niu, Youhong

    Crystal engineering of the nbo metal–organic framework (MOF) platform MOF-505 with a custom-designed azamacrocycle ligand (1,4,7,10-tetraazacyclododecane-N,N',N'',N'''-tetra-p-methylbenzoic acid) leads to a high density of well-oriented Lewis acid sites within the cuboctahedral cage in MMCF-2, [Cu₂(Cu-tactmb)(H₂O)₃(NO₃)₂]. This MOF demonstrates high catalytic activity for the chemical fixation of CO₂ into cyclic carbonates at room temperature under 1 atm pressure.

  18. Design of reinforcement welding machine within steel framework for marine engineering

    NASA Astrophysics Data System (ADS)

    Wang, Gang; Wu, Jin

    2017-04-01

    In this project, a design scheme in which a reinforcement welding machine is added within the steel framework is proposed, based on the double-side welding technology for box-beam structures in marine engineering. The design and development of the circuit and transmission mechanism for the new welding equipment are then completed, and a sample machine is built. Finally, trial running is carried out. The main technical parameters of the equipment are: working stroke ≥1500 mm, welding speed 8-15 cm/min, and welding sheet thickness ≥20 mm.

  19. Human Systems Integration (HSI) Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Zumbado, Jennifer Rochlis

    2015-01-01

    The NASA/SP-2015-3709, Human Systems Integration (HSI) Practitioner's Guide, also known as the "HSIPG," provides a tool for implementing HSI activities within the NASA systems engineering framework. The HSIPG is written to aid the HSI practitioner engaged in a program or project (P/P), and serves as a knowledge base to allow the practitioner to step into an HSI lead or team member role for NASA missions. Additionally, this HSIPG is written to address the role of HSI in the P/P management and systems engineering communities and aid their understanding of the value added by incorporating good HSI practices into their programs and projects. Through helping to build a community of knowledgeable HSI practitioners, this document also hopes to build advocacy across the Agency for establishing strong, consistent HSI policies and practices. Human Systems Integration (HSI) has been successfully adopted (and adapted) by several federal agencies, most notably the U.S. Department of Defense (DoD) and the Nuclear Regulatory Commission (NRC), as a methodology for reducing system life cycle costs (LCCs). These cost savings manifest themselves due to reductions in required numbers of personnel, the practice of human-centered design, decreased reliance on specialized skills for operations, shortened training time, efficient logistics and maintenance, and fewer safety-related risks and mishaps due to unintended human/system interactions. The HSI process for NASA establishes how cost savings and mission success can be realized through systems engineering. Every program or project has unique attributes. This HSIPG is not intended to provide one-size-fits-all recommendations for HSI implementation. Rather, HSI processes should be tailored to the size, scope, and goals of individual situations. The instructions and processes identified here are best used as a starting point for implementing human-centered system concepts and designs across programs and projects of varying types, including manned and unmanned, human spaceflight, aviation, robotics, and environmental science missions. The practitioner using this guide should have expertise in Systems Engineering or other disciplines involved in producing systems with anticipated human interactions. (See section 1.6 of this guide for further discussion on HSI discipline domains.) The HSIPG provides an "HSI layer" to the NASA Systems Engineering Engine (SEE), detailed in NASA Procedural Requirement (NPR) 7123.1B, NASA Systems Engineering Processes and Requirements, and further explained in NASA/SP-2007-6105, Systems Engineering Handbook (see HSIPG Table 2.2-1, NASA Documents with HSI Content, for specific references and document versions).

  20. Identifying and Verifying Earthquake Engineering Concepts to Create a Knowledge Base in STEM Education: A Modified Delphi Study

    ERIC Educational Resources Information Center

    Cavlazoglu, Baki; Stuessy, Carol L.

    2017-01-01

    Stakeholders in STEM education have called for integrating engineering content knowledge into STEM-content classrooms. To answer the call, stakeholders in science education announced a new framework, Next Generation Science Standards, which focuses on the integration of science and engineering in K-12 science education. However, research indicates…

  1. 2014 Abridged Technology and Engineering Literacy Framework for the 2014 National Assessment of Educational Progress

    ERIC Educational Resources Information Center

    National Assessment Governing Board, 2014

    2014-01-01

    Due to the growing importance of technology and engineering in the educational landscape, and to support America's ability to contribute to and compete in a global economy, the National Assessment Governing Board (NAGB) initiated development of the first NAEP Technology and Engineering Literacy (TEL) Assessment. Relating to national efforts in…

  2. Teachers' Thoughts on Student Decision Making during Engineering Design Lessons

    ERIC Educational Resources Information Center

    Meyer, Helen

    2018-01-01

    In this paper, I share the results of a study of teachers' ideas about student decision-making at entry into a professional development program to integrate engineering into their instruction. The framework for the Engineering Design Process (EDP) was based on a Challenge-Based Learning (CBL) model. The EDP embedded within the CBL model suggests…

  3. A Framework for Quality K-12 Engineering Education: Research and Development

    ERIC Educational Resources Information Center

    Moore, Tamara J.; Glancy, Aran W.; Tank, Kristina M.; Kersten, Jennifer A.; Smith, Karl A.; Stohlmann, Micah S.

    2014-01-01

    Recent U.S. national documents have laid the foundation for highlighting the connection between science, technology, engineering and mathematics at the K-12 level. However, there is not a clear definition or a well-established tradition of what constitutes a quality engineering education at the K-12 level. The purpose of the current work has been…

  4. Comparing Freshman and Doctoral Engineering Students in Design: Mapping with a Descriptive Framework

    ERIC Educational Resources Information Center

    Carmona Marques, P.

    2017-01-01

    This paper reports the results of a study of engineering students' approaches to an open-ended design problem. To carry out this, sketches and interviews were collected from 9 freshmen (first year) and 10 doctoral engineering students, when they designed solutions for orange squeezers. Sketches and interviews were analysed and mapped with a…

  5. Integrated identification, modeling and control with applications

    NASA Astrophysics Data System (ADS)

    Shi, Guojun

    This thesis deals with the integration of system design, identification, modeling and control. In particular, six interdisciplinary engineering problems are addressed and investigated. Theoretical results are established and applied to structural vibration reduction and engine control problems. First, the data-based LQG control problem is formulated and solved. It is shown that a state space model is not necessary to solve this problem; rather, a finite sequence from the impulse response is the only model data required to synthesize an optimal controller. The new theory avoids unnecessary reliance on a model, required in the conventional design procedure. The infinite horizon model predictive control problem is addressed for multivariable systems. The basic properties of the receding horizon implementation strategy are investigated and the complete framework for solving the problem is established. The new theory allows the accommodation of hard input constraints and time delays. The developed control algorithms guarantee closed-loop stability. A closed loop identification and infinite horizon model predictive control design procedure is established for engine speed regulation. The developed algorithms are tested on the Cummins Engine Simulator and desired results are obtained. A finite signal-to-noise ratio model is considered for noise signals. An information quality index is introduced which measures the essential information precision required for stabilization. The problems of minimum variance control and covariance control are formulated and investigated. Convergent algorithms are developed for solving the problems of interest. The problem of integrated passive and active control design is addressed in order to improve the overall system performance. A design algorithm is developed, which simultaneously finds: (i) the optimal values of the stiffness and damping ratios for the structure, and (ii) an optimal output variance constrained stabilizing controller such that the active control energy is minimized. A weighted q-Markov COVER method is introduced for identification with measurement noise. The result is used to develop an iterative closed loop identification/control design algorithm. The effectiveness of the algorithm is illustrated by experimental results.
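
    The receding-horizon strategy mentioned above can be sketched numerically: at each step a finite-horizon linear-quadratic problem is solved by a backward Riccati recursion, only the first input is applied, and the problem is re-solved from the new state. The matrices and weights below are arbitrary illustrations, and the sketch omits the hard input constraints and time delays that the thesis accommodates.

```python
# Conceptual receding-horizon loop for an unconstrained linear-quadratic problem.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])       # simple double-integrator-like model
B = np.array([[0.0], [0.1]])
Q, R, N = np.eye(2), np.array([[0.01]]), 20  # state/input weights and horizon

def first_input(x):
    """Backward Riccati recursion over the horizon; return only u_0 = -K x."""
    P = Q.copy()
    for _ in range(N):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return -K @ x

x = np.array([1.0, 0.0])
for _ in range(50):                          # closed loop: apply u_0, then re-solve
    x = A @ x + B @ first_input(x)
print(np.round(x, 4))                        # state regulated toward the origin
```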

  6. Preparing culturally responsive teachers of science, technology, engineering, and math using the Geophysical Institute Framework for Professional Development in Alaska

    NASA Astrophysics Data System (ADS)

    Berry Bertram, Kathryn

    2011-12-01

    The Geophysical Institute (GI) Framework for Professional Development was designed to prepare culturally responsive teachers of science, technology, engineering, and math (STEM). Professional development programs based on the framework are created for rural Alaskan teachers who instruct diverse classrooms that include indigenous students. This dissertation was written in response to the question, "Under what circumstances is the GI Framework for Professional Development effective in preparing culturally responsive teachers of science, technology, engineering, and math?" Research was conducted on two professional development programs based on the GI Framework: the Arctic Climate Modeling Program (ACMP) and the Science Teacher Education Program (STEP). Both programs were created by backward design to student learning goals aligned with Alaska standards and rooted in principles of indigenous ideology. Both were created with input from Alaska Native cultural knowledge bearers, Arctic scientists, education researchers, school administrators, and master teachers with extensive instructional experience. Both provide integrated instruction reflective of authentic Arctic research practices, and training in diverse methods shown to increase indigenous student STEM engagement. While based on the same framework, these programs were chosen for research because they offer distinctly different training venues for K-12 teachers. STEP offered two-week summer institutes on the UAF campus for more than 175 teachers from 33 Alaska school districts. By contrast, ACMP served 165 teachers from one rural Alaska school district along the Bering Strait. Due to challenges in making professional development opportunities accessible to all teachers in this geographically isolated district, ACMP offered a year-round mix of in-person, long-distance, online, and local training. Discussion centers on a comparison of the strategies used by each program to address GI Framework cornerstones, on methodologies used to conduct program research, and on findings obtained. Research indicates that in both situations the GI Framework for Professional Development was effective in preparing culturally responsive STEM teachers. Implications of these findings and recommendations for future research are discussed in the conclusion.

  7. Reprint of "Safe places for pedestrians: using cognitive work analysis to consider the relationships between the engineering and urban design of footpaths".

    PubMed

    Stevens, Nicholas; Salmon, Paul

    2015-01-01

    Footpaths provide an integral component of our urban environments and have the potential to act as safe places for people and the focus for community life. Despite this, the approach to designing footpaths that are safe while providing this sense of place often occurs in silos. There is often very little consideration given to how designing for sense of place impacts safety and vice versa. The aim of this study was to use a systems analysis and design framework to develop a design template for an 'ideal' footpath system that embodies both safety and sense of place. This was achieved through using the first phase of the Cognitive Work Analysis framework, Work Domain Analysis, to specify a model of footpaths as safe places for pedestrians. This model was subsequently used to assess two existing footpath environments to determine the extent to which they meet the design requirements specified. The findings show instances where the existing footpaths both meet and fail to meet the design requirements specified. Through utilising a systems approach for footpaths, this paper has provided a novel design template that can inform new footpath design efforts or be used to evaluate the extent to which existing footpaths achieve their safety and sense of place requirements. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Safe places for pedestrians: using cognitive work analysis to consider the relationships between the engineering and urban design of footpaths.

    PubMed

    Stevens, Nicholas; Salmon, Paul

    2014-11-01

    Footpaths provide an integral component of our urban environments and have the potential to act as safe places for people and the focus for community life. Despite this, the approach to designing footpaths that are safe while providing this sense of place often occurs in silos. There is often very little consideration given to how designing for sense of place impacts safety and vice versa. The aim of this study was to use a systems analysis and design framework to develop a design template for an 'ideal' footpath system that embodies both safety and sense of place. This was achieved through using the first phase of the Cognitive Work Analysis framework, Work Domain Analysis, to specify a model of footpaths as safe places for pedestrians. This model was subsequently used to assess two existing footpath environments to determine the extent to which they meet the design requirements specified. The findings show instances where the existing footpaths both meet and fail to meet the design requirements specified. Through utilising a systems approach for footpaths, this paper has provided a novel design template that can inform new footpath design efforts or be used to evaluate the extent to which existing footpaths achieve their safety and sense of place requirements. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Refinement and dissemination of a digital platform for sharing transportation education materials.

    DOT National Transportation Integrated Search

    2015-07-01

    National agencies have called for more widespread adoption of best practices in engineering education. To facilitate this sharing of practices, a web-based system framework used by transportation engineering educators to share curricular materials a...

  10. Graphical CONOPS Prototype to Demonstrate Emerging Methods, Processes, and Tools at ARDEC

    DTIC Science & Technology

    2013-07-17

    Concept Engineering Framework (ICEF), an extensive literature review was conducted to discover metrics that exist for evaluating concept engineering... language to ICEF to SysML... Table 5: Artifact metrics... Table 6: Collaboration metrics...

  11. A Framework for Lab Work Management in Mass Courses. Application to Low Level Input/Output without Hardware

    ERIC Educational Resources Information Center

    Rodriguez, Santiago; Zamorano, Juan; Rosales, Francisco; Dopico, Antonio Garcia; Pedraza, Jose Luis

    2007-01-01

    This paper describes a complete lab work management framework designed and developed in the authors' department to help teachers to manage the small projects that students are expected to complete as lab assignments during their graduate-level computer engineering studies. The paper focuses on an application example of the framework to a specific…

  12. Decision Aids Using Heterogeneous Intelligence Analysis

    DTIC Science & Technology

    2010-08-20

    developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort... has evolved as the program has matured and now includes multiple data sources, as well as interfaces out to the ONR architectural framework. Tasks...

  13. Release of genetically engineered insects: a framework to identify potential ecological effects

    PubMed Central

    David, Aaron S; Kaser, Joe M; Morey, Amy C; Roth, Alexander M; Andow, David A

    2013-01-01

    Genetically engineered (GE) insects have the potential to radically change pest management worldwide. With recent approvals of GE insect releases, there is a need for a synthesized framework to evaluate their potential ecological and evolutionary effects. The effects may occur in two phases: a transitory phase when the focal population changes in density, and a steady state phase when it reaches a new, constant density. We review potential effects of a rapid change in insect density related to population outbreaks, biological control, invasive species, and other GE organisms to identify a comprehensive list of potential ecological and evolutionary effects of GE insect releases. We apply this framework to the Anopheles gambiae mosquito – a malaria vector being engineered to suppress the wild mosquito population – to identify effects that may occur during the transitory and steady state phases after release. Our methodology reveals many potential effects in each phase, perhaps most notably those dealing with immunity in the transitory phase, and with pathogen and vector evolution in the steady state phase. Importantly, this framework identifies knowledge gaps in mosquito ecology. Identifying effects in the transitory and steady state phases allows more rigorous identification of the potential ecological effects of GE insect release. PMID:24198955

  14. Challenges of Maintaining Good Clinical Laboratory Practices in Low-Resource Settings:  A Health Program Evaluation Framework Case Study From East Africa.

    PubMed

    Zhang, Helen L; Omondi, Michael W; Musyoka, Augustine M; Afwamba, Isaac A; Swai, Remigi P; Karia, Francis P; Muiruri, Charles; Reddy, Elizabeth A; Crump, John A; Rubach, Matthew P

    2016-08-01

    Using a clinical research laboratory as a case study, we sought to characterize barriers to maintaining Good Clinical Laboratory Practice (GCLP) services in a developing world setting. Using a US Centers for Disease Control and Prevention framework for program evaluation in public health, we performed an evaluation of the Kilimanjaro Christian Medical Centre-Duke University Health Collaboration clinical research laboratory sections of the Kilimanjaro Clinical Research Institute in Moshi, Tanzania. Laboratory records from November 2012 through October 2014 were reviewed for this analysis. During the 2-year period of study, seven instrument malfunctions suspended testing required for open clinical trials. A median (range) of 9 (1-55) days elapsed between instrument malfunction and biomedical engineer service. Sixteen (76.1%) of 21 suppliers of reagents, controls, and consumables were based outside Tanzania. Test throughput among laboratory sections used a median (range) of 0.6% (0.2%-2.7%) of instrument capacity. Five (55.6%) of nine laboratory technologists left their posts over 2 years. These findings demonstrate that GCLP laboratory service provision in this setting is hampered by delays in biomedical engineer support, delays and extra costs in commodity procurement, low testing throughput, and high personnel turnover. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Defining Gas Turbine Engine Performance Requirements for the Large Civil TiltRotor (LCTR2)

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.

    2013-01-01

    Defining specific engine requirements is a critical part of identifying technologies and operational models for potential future rotary wing vehicles. NASA's Fundamental Aeronautics Program, Subsonic Rotary Wing Project has identified the Large Civil TiltRotor (LCTR) as the configuration to best meet technology goals. This notional vehicle concept has evolved, with more clearly defined mission and operational requirements, into the LCTR-iteration 2 (LCTR2). This paper reports on efforts to further review and refine the LCTR2 analyses to ascertain specific engine requirements and propulsion sizing criteria. The baseline mission and other design or operational requirements are reviewed, and the analysis tools are described to clarify their interactions and underlying assumptions. Various design and operational conditions are presented and explained in terms of their contribution to defining operational and engine requirements. The identified engine requirements are discussed to suggest which are most critical to engine sizing and operation. These most-critical requirements are then compared to in-house NASA engine simulations to ascertain which operational requirements actually drive the engine design and which fall within the available engine operational capability. Finally, results are summarized with suggestions for future efforts to improve analysis capabilities and to better define and refine mission and operational requirements.

  16. Social and Personal Factors in Semantic Infusion Projects

    NASA Astrophysics Data System (ADS)

    West, P.; Fox, P. A.; McGuinness, D. L.

    2009-12-01

    As part of our semantic data framework activities across multiple, diverse disciplines, we required the involvement of domain scientists, computer scientists, software engineers, data managers, and often, social scientists. This involvement from a cross-section of disciplines turns out to be a social exercise as much as a technical and methodical activity. Each member of the team is used to different modes of working, expectations, vocabularies, levels of participation, and incentive and reward systems. We will examine the part that both roles and personal responsibilities play in the development of semantic infusion projects, and how an iterative development cycle can contribute to the successful completion of such a project.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several functionalities: (1) deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring and control in the phase space; (2) performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
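
    As a generic illustration of "Monte Carlo sampling of randomly distributed events" in a PRA setting, the sketch below samples uncertain event parameters and estimates a failure probability against a toy surrogate model. It is not RAVEN code; every distribution, threshold, and function name is a placeholder.

```python
# Illustrative-only Monte Carlo sampling of randomly distributed events for a
# probabilistic risk estimate; NOT RAVEN's API. The "plant" below is a toy
# surrogate standing in for a coupled thermal-hydraulic simulation.
import random

def plant_transient(valve_failure_time, peak_power):
    """Hypothetical surrogate for a simulation run: returns True if the
    transient exceeds an assumed damage threshold."""
    peak_temp = 600.0 + 40.0 * peak_power - 5.0 * valve_failure_time
    return peak_temp > 900.0

def monte_carlo_failure_probability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        # Sample the uncertain event parameters from assumed distributions.
        valve_failure_time = rng.expovariate(1.0 / 30.0)  # mean 30 s (assumed)
        peak_power = rng.gauss(8.0, 1.5)                   # relative units (assumed)
        if plant_transient(valve_failure_time, peak_power):
            failures += 1
    return failures / n_samples

if __name__ == "__main__":
    print("Estimated failure probability:", monte_carlo_failure_probability())
```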

  18. SimBasin: A serious gaming framework for integrated and cooperative decision-making in water management

    NASA Astrophysics Data System (ADS)

    Angarita, H.; Craven, J.; Caggiano, F.; Corzo, G.

    2016-12-01

    An integrated approach involving extensive stakeholder dialogue is widely advocated in sustainable water management. However, it requires a social learning process in which scientists and stakeholders become aware of the relationship between their own frames of reference and those of others, differences can be dealt with constructively, and shared ideas can be used to facilitate cooperation. Key obstacles in this process are heritage systems, attitudes and processes; factually wrong, incomplete or unshared mental models; and a lack of science-policy dialogue (Pahl-Wostl et al., 2005). To overcome these barriers, a space is required which is free of heritage systems, where mental models can be safely and easily compared and corrected, and where scientists and policy-makers can come together. A "serious game" can serve as such a space. Serious games are games or simulations used to achieve an organizational or educational goal, and such games have already been used to facilitate stakeholder cooperation in the water management sector (Rusca et al., 2005). As well as bringing stakeholders together, they can be an accessible interface between scientific models and non-experts. Here we present SimBasin, a multiplayer serious game framework and development engine. The engine makes it easy to create a simulated multiplayer basin management game using the WEAP water resources modelling software (SEI, 1992-2015), to facilitate the communication of the complex, long-term and wide-ranging relationships between hydrologic, climate, and human systems present in river basins, and to enable dialogue between policy-makers and scientists. Different games have been created using the SimBasin engine and used in various contexts. We discuss experiences with stakeholders at a national forum in Bogotá, with flood risk management agencies in the lower Magdalena River Basin in Colombia, and with water professionals in Bangkok. The experience shows that the game is a useful tool for enabling dialogue and provides interesting insights into the way computer models and stakeholders' mental models can interact with and enrich each other. SimBasin software and supporting materials are freely available online for download at http://simbasin.hilab.nl.

  19. Describing Nanomaterials: A Uniform Description System

    NASA Astrophysics Data System (ADS)

    Rumble, John; Freiman, Steve; Teague, Clayton

    2014-03-01

    Products involving nanomaterials are growing rapidly, and nanoparticles also occur naturally. Materials scientists, engineers, health officials, and regulators have realized they need a common description system. Led by CODATA and VAMAS, a Uniform Description System (UDS) for nanomaterials is being developed to meet the requirements of a broad range of scientific and technical disciplines and different user communities. The goal of the CODATA/VAMAS effort is the creation of a complete set of descriptors that can be used by all communities interested in nanomaterials, e.g., materials, physics, chemistry, agricultural, medical, etc. The description system must be relevant to researchers, manufacturers of nanomaterials, materials selectors, and regulators. The purpose of the UDS for materials on the nanoscale is twofold: Uniqueness and Equivalency. The first step in the development of the UDS has been the creation of a Framework that will be used by the different communities to guide the selection of descriptors relevant to their needs. This talk briefly describes the draft of such a Framework and how it will be translated into a robust description system with input from many scientific communities, including physics. A contribution from the CODATA/VAMAS Working Group on the Description of Nanomaterials.

  20. Planning Framework for Mesolevel Optimization of Urban Runoff Control Schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Qianqian; Blohm, Andrew; Liu, Bo

    A planning framework is developed to optimize runoff control schemes at scales relevant for regional planning at an early stage. The framework employs less sophisticated modeling approaches to allow practical application in developing regions with limited data sources and computing capability. The methodology contains three interrelated modules: (1) the geographic information system (GIS)-based hydrological module, which assesses local hydrological constraints and the potential for runoff control according to regional land-use descriptions; (2) the grading module, which is built upon the method of fuzzy comprehensive evaluation and is used to establish a priority ranking system to assist the allocation of runoff control targets at the subdivision level; and (3) the genetic algorithm-based optimization module, which derives Pareto-based optimal solutions for mesolevel allocation with multiple competing objectives. The optimization approach describes the trade-off between different allocation plans and simultaneously ensures that all allocation schemes satisfy the minimum requirement on runoff control. Our results highlight the importance of considering the mesolevel allocation strategy in addition to measures at macrolevels and microlevels in urban runoff management. (C) 2016 American Society of Civil Engineers.
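
    As an illustration of the fuzzy comprehensive evaluation named in module (2) above, the sketch below composes an assumed weight vector with a hypothetical membership matrix to grade one subdivision. The indicators, weights, and grade labels are placeholders, not values from the cited work.

```python
# Minimal sketch of fuzzy comprehensive evaluation for grading a subdivision.
# All indicators, weights, and membership values are hypothetical.
import numpy as np

# Indicator weights (assumed): imperviousness, soil infiltration capacity,
# and available open space.
weights = np.array([0.5, 0.3, 0.2])

# Membership matrix R: row i gives indicator i's degree of membership in
# each priority grade (low, medium, high), e.g. from expert scoring.
R = np.array([
    [0.1, 0.3, 0.6],   # high imperviousness suggests high priority
    [0.2, 0.5, 0.3],
    [0.6, 0.3, 0.1],
])

# Weighted composition B = w @ R gives the subdivision's grade vector.
B = weights @ R
grades = ["low", "medium", "high"]
print(dict(zip(grades, B.round(3))))
print("Assigned priority:", grades[int(np.argmax(B))])
```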

  1. 77 FR 43076 - Federal Acquisition Regulation; Information Collection; Value Engineering Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ...; Information Collection; Value Engineering Requirements AGENCIES: Department of Defense (DOD), General Services... collection requirement concerning Value Engineering Requirements. Public comments are particularly invited on... Information Collection 9000- 0027, Value Engineering Requirements, by any of the following methods...

  2. 14 CFR 125.265 - Flight engineer requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Flight engineer requirements. 125.265... Requirements § 125.265 Flight engineer requirements. (a) No person may operate an airplane for which a flight engineer is required by the type certification requirements without a flight crewmember holding a current...

  3. 14 CFR 125.265 - Flight engineer requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Flight engineer requirements. 125.265... Requirements § 125.265 Flight engineer requirements. (a) No person may operate an airplane for which a flight engineer is required by the type certification requirements without a flight crewmember holding a current...

  4. 14 CFR 125.265 - Flight engineer requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Flight engineer requirements. 125.265... Requirements § 125.265 Flight engineer requirements. (a) No person may operate an airplane for which a flight engineer is required by the type certification requirements without a flight crewmember holding a current...

  5. 14 CFR 125.265 - Flight engineer requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Flight engineer requirements. 125.265... Requirements § 125.265 Flight engineer requirements. (a) No person may operate an airplane for which a flight engineer is required by the type certification requirements without a flight crewmember holding a current...

  6. 14 CFR 125.265 - Flight engineer requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Flight engineer requirements. 125.265... Requirements § 125.265 Flight engineer requirements. (a) No person may operate an airplane for which a flight engineer is required by the type certification requirements without a flight crewmember holding a current...

  7. Statistical Teleodynamics: Toward a Theory of Emergence.

    PubMed

    Venkatasubramanian, Venkat

    2017-10-24

    The central scientific challenge of the 21st century is developing a mathematical theory of emergence that can explain and predict phenomena such as consciousness and self-awareness. The most successful research program of the 20th century, reductionism, which goes from the whole to parts, seems unable to address this challenge. This is because addressing this challenge inherently requires an opposite approach, going from parts to the whole. In addition, reductionism, by the very nature of its inquiry, typically does not concern itself with teleology or purposeful behavior. Modeling emergence, in contrast, requires the addressing of teleology. Together, these two requirements present a formidable challenge in developing a successful mathematical theory of emergence. In this article, I describe a new theory of emergence, called statistical teleodynamics, that addresses certain aspects of the general problem. Statistical teleodynamics is a mathematical framework that unifies three seemingly disparate domains (purpose-free entities in statistical mechanics, human-engineered teleological systems in systems engineering, and nature-evolved teleological systems in biology and sociology) within the same conceptual formalism. This theory rests on several key conceptual insights, the most important one being the recognition that entropy mathematically models the concept of fairness in economics and philosophy and, equivalently, the concept of robustness in systems engineering. These insights help prove that the fairest inequality of income is a log-normal distribution, which will emerge naturally at equilibrium in an ideal free market society. Similarly, the theory predicts the emergence of the three classes of network organization (exponential, scale-free, and Poisson) seen widely in a variety of domains. Statistical teleodynamics is the natural generalization of statistical thermodynamics, the most successful parts-to-whole systems theory to date, but this generalization is only a modest step toward a more comprehensive mathematical theory of emergence.
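
    For reference, the log-normal income distribution mentioned above has the standard density shown below; the parameters mu and sigma are generic and not taken from the cited article.

```latex
% Standard log-normal probability density; \mu and \sigma are generic
% parameters, not values from the cited work.
f(x) = \frac{1}{x\,\sigma\sqrt{2\pi}}
       \exp\!\left(-\frac{(\ln x - \mu)^{2}}{2\sigma^{2}}\right),
\qquad x > 0.
```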

  8. Contextual Shaping of Student Design Practices: The Role of Constraint in First-Year Engineering Design

    NASA Astrophysics Data System (ADS)

    Goncher, Andrea M.

    Research on engineering design is a core area of concern within engineering education, and a fundamental understanding of how engineering students approach and undertake design is necessary in order to develop effective design models and pedagogies. This dissertation contributes to scholarship on engineering design by addressing a critical, but as yet underexplored, problem: how does the context in which students design shape their design practices? Using a qualitative study comprising video data of design sessions, focus group interviews with students, and archives of their design work, this research explored how design decisions and actions are shaped by context, specifically the context of higher education. To develop a theoretical explanation for observed behavior, this study used the nested structuration framework proposed by Perlow, Gittell, & Katz (2004). This framework explicated how teamwork is shaped by mutually reinforcing relationships at the individual, organizational, and institutional levels. I appropriated this framework to look specifically at how engineering students working on a course-related design project identify constraints that guide their design and how these constraints emerge as students interact while working on the project. I first identified and characterized the parameters associated with the design project from the student perspective and then, through multi-case studies of four design teams, I looked at the role these parameters play in student design practices. This qualitative investigation of first-year engineering student design teams revealed mutual and interconnected relationships between students and the organizations and institutions that they are a part of. In addition to contributing to research on engineering design, this work provides guidelines and practices to help design educators develop more effective design projects by incorporating constraints that enable effective design and learning. Moreover, I found that when appropriated in the context of higher education, multiple sublevels existed within nested structuration's organizational context, including course-level and project-level factors. The implications of this research can be used to improve the design of engineering course projects as well as the design of research efforts related to design in engineering education.

  9. A Decision Support Framework for Evaluation of Engineered Nanomaterials

    EPA Science Inventory

    Engineered nanomaterials (ENM) are currently being developed and applied at rates that far exceed our ability to evaluate their potential for environmental or human health risks. The gap between material development and capacity for assessment grows wider every day. Transforma...

  10. Developing a scalable modeling architecture for studying survivability technologies

    NASA Astrophysics Data System (ADS)

    Mohammad, Syed; Bounker, Paul; Mason, James; Brister, Jason; Shady, Dan; Tucker, David

    2006-05-01

    To facilitate interoperability of models in a scalable environment, and to provide a relevant virtual environment in which Survivability technologies can be evaluated, the US Army Research Development and Engineering Command (RDECOM) Modeling Architecture for Technology Research and Experimentation (MATREX) Science and Technology Objective (STO) program has initiated the Survivability Thread, which seeks to address some of the many technical and programmatic challenges associated with the effort. In coordination with different Thread customers, such as the Survivability branches of various Army labs, a collaborative group has been formed to define the requirements for a simulation environment that would in turn provide a value-added tool for assessing models and gauging system-level performance relevant to Future Combat Systems (FCS) and the Survivability requirements of other burgeoning programs. An initial set of customer requirements has been generated in coordination with the RDECOM Survivability IPT lead, through the Survivability Technology Area at the RDECOM Tank-automotive Research Development and Engineering Center (TARDEC, Warren, MI). The results of this project are aimed at a culminating experiment and demonstration scheduled for September 2006, which will include a multitude of components from within RDECOM and provide the framework for future experiments to support Survivability research. This paper details the components with which the MATREX Survivability Thread was created and executed, and provides insight into the capabilities currently demanded by the Survivability faculty within RDECOM.

  11. A high-speed DAQ framework for future high-level trigger and event building clusters

    NASA Astrophysics Data System (ADS)

    Caselle, M.; Ardila Perez, L. E.; Balzer, M.; Dritschler, T.; Kopmann, A.; Mohr, H.; Rota, L.; Vogelgesang, M.; Weber, M.

    2017-03-01

    Modern data acquisition and trigger systems require a throughput of several GB/s and latencies on the order of microseconds. To satisfy such requirements, a heterogeneous readout system based on FPGA readout cards and GPU-based computing nodes coupled by InfiniBand has been developed. The incoming data from the back-end electronics are delivered directly into the internal memory of the GPUs through dedicated peer-to-peer PCIe communication. High-performance DMA engines have been developed for direct communication between FPGAs and GPUs using the "DirectGMA" (AMD) and "GPUDirect" (NVIDIA) technologies. The proposed infrastructure is a candidate for future generations of event building clusters, high-level trigger filter farms, and low-level trigger systems. In this paper the heterogeneous FPGA-GPU architecture is presented and its performance discussed.

  12. 46 CFR 11.502 - General requirements for national engineer endorsements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false General requirements for national engineer endorsements... AND SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for National Engineer Officer Endorsements § 11.502 General requirements for national engineer endorsements. (a) For all...

  13. The soil management assessment framework: A potential soil health assessment tool

    USDA-ARS?s Scientific Manuscript database

    The Soil Management Assessment Framework (SMAF) was developed in the 1990s utilizing Systems Engineering and Ecology experiences with scoring functions to normalize disparate soil physical, chemical, and biological indicator data representing critical properties and processes associated with soil qu...
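
    The scoring functions mentioned above can be pictured as curves that map a raw indicator value onto a common 0-1 scale. The sketch below uses a hypothetical "more is better" logistic curve; it is not SMAF's published parameterization, and the indicator and bounds are assumptions for illustration.

```python
# Hedged sketch of an indicator scoring function of the kind SMAF-style
# frameworks use to normalize disparate soil indicators onto a 0-1 scale.
# The indicator, curve shape, and parameters here are illustrative only.
import math

def more_is_better(value, baseline, optimum):
    """Map a raw indicator value to a 0-1 score with a logistic
    'more is better' curve (hypothetical parameterization)."""
    midpoint = (baseline + optimum) / 2.0
    steepness = 10.0 / (optimum - baseline)
    return 1.0 / (1.0 + math.exp(-steepness * (value - midpoint)))

# Example: soil organic carbon (%) scored against assumed bounds.
for soc in (0.5, 1.5, 2.5, 3.5):
    print(f"SOC {soc:.1f}% -> score {more_is_better(soc, 1.0, 3.0):.2f}")
```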

  14. A scalable architecture for incremental specification and maintenance of procedural and declarative clinical decision-support knowledge.

    PubMed

    Hatsek, Avner; Shahar, Yuval; Taieb-Maimon, Meirav; Shalom, Erez; Klimov, Denis; Lunenfeld, Eitan

    2010-01-01

    Clinical guidelines have been shown to improve the quality of medical care and to reduce its costs. However, most guidelines exist in a free-text representation and, without automation, are not sufficiently accessible to clinicians at the point of care. A prerequisite for automated guideline application is a machine-comprehensible representation of the guidelines. In this study, we designed and implemented a scalable architecture to support medical experts and knowledge engineers in specifying and maintaining the procedural and declarative aspects of clinical guideline knowledge, resulting in a machine-comprehensible representation. The new framework significantly extends our previous work on the Digital electronic Guidelines Library (DeGeL). The current study designed and implemented a graphical framework for the specification of declarative and procedural clinical knowledge, Gesher. We performed three different experiments to evaluate the functionality and usability of the major aspects of the new framework: specification of procedural clinical knowledge, specification of declarative clinical knowledge, and exploration of a given clinical guideline. The subjects included clinicians and knowledge engineers (overall, 27 participants). The evaluations indicated high levels of completeness and correctness of the guideline specification process by both the clinicians and the knowledge engineers, although the best results, in the case of declarative-knowledge specification, were achieved by teams including a clinician and a knowledge engineer. The usability scores were high as well, although the clinicians' assessment was significantly lower than that of the knowledge engineers.

  15. Software And Systems Engineering Risk Management

    DTIC Science & Technology

    2010-04-01

    RSKM; 2004 COSO Enterprise RSKM Framework; 2006 ISO/IEC 16085 Risk Management Process; 2008 ISO/IEC 12207 Software Lifecycle Processes; 2009 ISO/IEC... Software and Systems Engineering Risk Management. John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning... Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group; Systems and Software...

  16. NREL Kicks Off Next Phase of Advanced Computer-Aided Battery Engineering |

    Science.gov Websites

    ...lithium-ion (Li-ion) batteries, known as the multi-scale multi-domain (GH-MSMD) model framework, was... NREL Kicks Off Next Phase of Advanced Computer-Aided Battery Engineering, March 16, 2016.

  17. Investigating the Impact of Using a CAD Simulation Tool on Students' Learning of Design Thinking

    NASA Astrophysics Data System (ADS)

    Taleyarkhan, Manaz; Dasgupta, Chandan; Garcia, John Mendoza; Magana, Alejandra J.

    2018-02-01

    Engineering design thinking is hard to teach and still harder to learn by novices primarily due to the undetermined nature of engineering problems that often results in multiple solutions. In this paper, we investigate the effect of teaching engineering design thinking to freshmen students by using a computer-aided Design (CAD) simulation software. We present a framework for characterizing different levels of engineering design thinking displayed by students who interacted with the CAD simulation software in the context of a collaborative assignment. This framework describes the presence of four levels of engineering design thinking—beginning designer, adept beginning designer, informed designer, adept informed designer. We present the characteristics associated with each of these four levels as they pertain to four engineering design strategies that students pursued in this study—understanding the design challenge, building knowledge, weighing options and making tradeoffs, and reflecting on the process. Students demonstrated significant improvements in two strategies—understanding the design challenge and building knowledge. We discuss the affordances of the CAD simulation tool along with the learning environment that potentially helped students move towards Adept informed designers while pursuing these design strategies.

  18. 77 FR 66464 - Federal Acquisition Regulation; Submission for OMB Review; Value Engineering Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ...; Submission for OMB Review; Value Engineering Requirements AGENCIES: Department of Defense (DOD), General... collection requirement concerning Value Engineering Requirements. A notice was published in the Federal... comments identified by Information Collection 9000- 0027, Value Engineering Requirements, by any of the...

  19. A cell-free framework for rapid biosynthetic pathway prototyping and enzyme discovery.

    PubMed

    Karim, Ashty S; Jewett, Michael C

    2016-07-01

    Speeding up design-build-test (DBT) cycles is a fundamental challenge facing biochemical engineering. To address this challenge, we report a new cell-free protein synthesis driven metabolic engineering (CFPS-ME) framework for rapid biosynthetic pathway prototyping. In our framework, cell-free cocktails for synthesizing target small molecules are assembled in a mix-and-match fashion from crude cell lysates either containing selectively enriched pathway enzymes from heterologous overexpression or directly producing pathway enzymes in lysates by CFPS. As a model, we apply our approach to n-butanol biosynthesis showing that Escherichia coli lysates support a highly active 17-step CoA-dependent n-butanol pathway in vitro. The elevated degree of flexibility in the cell-free environment allows us to manipulate physiochemical conditions, access enzymatic nodes, discover new enzymes, and prototype enzyme sets with linear DNA templates to study pathway performance. We anticipate that CFPS-ME will facilitate efforts to define, manipulate, and understand metabolic pathways for accelerated DBT cycles without the need to reengineer organisms. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  20. Towards a Brokering Framework for Business Process Execution

    NASA Astrophysics Data System (ADS)

    Santoro, Mattia; Bigagli, Lorenzo; Roncella, Roberto; Mazzetti, Paolo; Nativi, Stefano

    2013-04-01

    Advancing our knowledge of environmental phenomena and their interconnections requires an intensive use of environmental models. Due to the complexity of the Earth system, the representation of complex environmental processes often requires the use of more than one model (often from different disciplines). The Group on Earth Observations (GEO) launched the Model Web initiative to increase the present accessibility and interoperability of environmental models, allowing their flexible composition into complex Business Processes (BPs). A few basic principles are at the base of the Model Web concept (Nativi et al., 2012): (i) open access, (ii) minimal entry barriers, (iii) a service-driven approach, and (iv) scalability. This work proposes an architectural solution, based on the brokering approach for multidisciplinary interoperability, aiming to contribute to the Model Web vision. The brokering approach is currently adopted in the new GEOSS Common Infrastructure (GCI), as presented at the last GEO Plenary meeting in Istanbul, November 2011. We designed and prototyped a component called the BP Broker. The high-level functionalities provided by the BP Broker are to: discover the needed model implementations in an open, distributed and heterogeneous environment; check the I/O consistency of BPs and provide suggestions for resolving mismatches; publish the compiled BP (EBP) as a standard model resource for re-use; and submit the EBP to a workflow engine for execution. The BP Broker supports multiple abstract BP specifications and encoding in multiple workflow-engine languages. According to the brokering principles, the designed system is flexible enough to support the use of multiple BP design (visual) tools, heterogeneous Web interfaces for model execution (e.g. OGC WPS, WSDL, etc.), and different workflow engines. The present implementation makes use of the BPMN 2.0 notation for BP design and the jBPM workflow engine for EBP execution; however, the strong decoupling that characterizes the design of the BP Broker easily allows other technologies to be supported. The main benefits of the proposed approach are: (i) no need for a composition infrastructure, (ii) alleviation from the technicalities of workflow definitions, (iii) support of incomplete BPs, and (iv) the reuse of existing BPs as atomic processes. The BP Broker was designed and prototyped in the EC-funded projects EuroGEOSS (http://www.eurogeoss.eu) and UncertWeb (http://www.uncertweb.org); the latter project also provided the use scenarios that were used to test the framework: the eHabitat scenario (calculation of habitat similarity likelihood) and the FERA scenario (impact of climate change on land use and crop yield). Three more scenarios are presently under development. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreements n. 248488 and n. 226487. Reference: Nativi, S., Mazzetti, P., & Geller, G. (2012), "Environmental model access and interoperability: The GEO Model Web initiative", Environmental Modelling & Software, 1-15.
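
    To illustrate the kind of I/O consistency check described above, the sketch below walks a hypothetical chain of model steps and reports inputs that no upstream step provides. The step names and data fields are invented for the example and do not reflect the BP Broker's actual interfaces.

```python
# Hedged sketch of an input/output consistency check over a chained business
# process; illustrative only, not the BP Broker's API.
from dataclasses import dataclass

@dataclass
class ModelStep:
    name: str
    inputs: set[str]
    outputs: set[str]

def check_io_consistency(steps, initial_inputs):
    """Walk the chain in order and report inputs that no upstream step provides."""
    available = set(initial_inputs)
    mismatches = []
    for step in steps:
        missing = step.inputs - available
        if missing:
            mismatches.append((step.name, sorted(missing)))
        available |= step.outputs
    return mismatches

# Hypothetical eHabitat-like chain: climate projection -> land-use model
# -> habitat similarity model.
chain = [
    ModelStep("climate_projection", {"scenario"}, {"temperature", "precipitation"}),
    ModelStep("land_use_model", {"temperature", "precipitation", "soil_map"}, {"land_cover"}),
    ModelStep("habitat_similarity", {"land_cover", "species_range"}, {"similarity_map"}),
]
for name, missing in check_io_consistency(chain, {"scenario", "species_range"}):
    print(f"step '{name}' is missing inputs: {missing}")
```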

Top