Working on the Boundaries: Philosophies and Practices of the Design Process
NASA Technical Reports Server (NTRS)
Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.
1996-01-01
While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system- and component-dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles of the design process, identifies its role in interacting systems and discipline analyses and integration, and illustrates the application of the process in aerostructural designs drawn from experience.
Formalisms for user interface specification and design
NASA Technical Reports Server (NTRS)
Auernheimer, Brent J.
1989-01-01
The application of formal methods to the specification and design of human-computer interfaces is described. The discussion covers a broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Case Study of 'Engineering Peer Meetings' in JPL's ST-6 Project
NASA Technical Reports Server (NTRS)
Chao, Lawrence P.; Tumer, Irem
2004-01-01
This design process error-proofing case study describes a design review practice implemented by a project manager at NASA Jet Propulsion Laboratory. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing project formal reviews such as the Preliminary Design Review (PDR) and Critical Design Review (CDR) are a required part of every project and mission development. However, the engineering peer reviews that support teams' technical work on such projects are often informal, ad hoc, and inconsistent across the organization. This case study discusses issues and innovations identified by a project manager at JPL and implemented in 'engineering peer meetings' for his group.
Case Study of "Engineering Peer Meetings" in JPL's ST-6 Project
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Chao, Lawrence P.
2003-01-01
This design process error-proofing case study describes a design review practice implemented by a project manager at NASA Jet Propulsion Laboratory. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing project formal reviews such as the Preliminary Design Review (PDR) and Critical Design Review (CDR) are a required part of every project and mission development. However, the engineering peer reviews that support teams' technical work on such projects are often informal, ad hoc, and inconsistent across the organization. This case study discusses issues and innovations identified by a project manager at JPL and implemented in "engineering peer meetings" for his group.
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems. Traditional methods and tools ...
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling ... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based ... Domain-specific languages (DSLs) drive both implementation and formal verification
Toward a mathematical formalism of performance, task difficulty, and activation
NASA Technical Reports Server (NTRS)
Samaras, George M.
1988-01-01
The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.
Multi-Attribute Tradespace Exploration in Space System Design
NASA Astrophysics Data System (ADS)
Ross, A. M.; Hastings, D. E.
2002-01-01
The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model- and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge of important drivers into higher fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, managers). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.
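As a rough illustration of the tradespace idea (not the MATE formulation itself), the sketch below ranks hypothetical design alternatives by a weighted multi-attribute utility; the attributes, weights, and single-attribute utility curves are invented for the example, not interview-derived utilities.

```python
# Hedged sketch: rank hypothetical space-system designs by multi-attribute utility.
# Attributes, weights, and utility curves are illustrative assumptions only.

def u_linear(x, worst, best):
    """Single-attribute utility scaled to [0, 1] between worst and best values."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

# Invented attribute set: data rate (Mbps, more is better), cost ($M, less is
# better), lifetime (years, more is better).
WEIGHTS = {"data_rate": 0.5, "cost": 0.3, "lifetime": 0.2}

def utility(design):
    return (WEIGHTS["data_rate"] * u_linear(design["data_rate"], 10, 200)
            + WEIGHTS["cost"]     * u_linear(design["cost"], 500, 100)  # reversed: low cost is best
            + WEIGHTS["lifetime"] * u_linear(design["lifetime"], 2, 15))

tradespace = [
    {"name": "A", "data_rate": 150, "cost": 400, "lifetime": 7},
    {"name": "B", "data_rate": 60,  "cost": 180, "lifetime": 12},
    {"name": "C", "data_rate": 120, "cost": 250, "lifetime": 5},
]

for d in sorted(tradespace, key=utility, reverse=True):
    print(f"{d['name']}: utility = {utility(d):.2f}, cost = {d['cost']}")
```

Plotting utility against cost for many such alternatives gives the kind of tradespace picture stakeholders can "shop" from.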
Warfighting Concepts to Future Weapon System Designs (WARCON)
2003-09-12
Software design documents ... A Material List ... Final Engineering Process Maps ... Cost information that may support, or may rise to, litigation ... the document may include the design of the system as derived from the SRD ... engineering design, software development ... It is important to establish a standard, formal document early in the development phase; software engineers produce the vision of the design effort. As ...
Formal methods in computer system design
NASA Astrophysics Data System (ADS)
Hoare, C. A. R.
1989-12-01
This note expounds a philosophy of engineering design which is stimulated, guided and checked by mathematical calculations and proofs. Its application to software engineering promises the same benefits as those derived from the use of mathematics in all other branches of modern science.
Recent Developments: PKI Square Dish for the Soleras Project
NASA Technical Reports Server (NTRS)
Rogers, W. E.
1984-01-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site, and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
Recent developments: PKI square dish for the Soleras Project
NASA Astrophysics Data System (ADS)
Rogers, W. E.
1984-03-01
The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site, and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates performance efficiency of approximately 72% at 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.
ERIC Educational Resources Information Center
Dahm, Kevin; Newell, James
2001-01-01
Reports on a course at Rowan University, based on the economic design of a baseball stadium, that offers an introduction to multidisciplinary engineering design linked with formal training in technical communication. Addresses four pedagogical goals: (1) developing public speaking skills in a realistic, business setting; (2) giving students…
Systems engineering principles for the design of biomedical signal processing systems.
Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo
2011-06-01
Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation.
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Youth's Engagement as Scientists and Engineers in an Afterschool Making and Tinkering Program
NASA Astrophysics Data System (ADS)
Simpson, Amber; Burris, Alexandra; Maltese, Adam
2017-11-01
Making and tinkering is currently gaining traction as an interdisciplinary approach to education. However, little is known about how these activities and explorations in formal and informal learning spaces address the content and skills common to professionals across science, technology, engineering, and mathematics. As such, the purpose of this qualitative study was to examine how youth were engaged in the eight science and engineering practices outlined within the US Next Generation Science Standards within an informal learning environment utilizing principles of tinkering within the daily activities. Findings highlight how youth and facilitators engaged in and enacted practices common to scientists and engineers. Yet, in this study, enactment of these practices "looked" different than might be expected in a formal learning environment such as a laboratory setting. For example, in this setting, students were observed carrying out trials on their design as opposed to carrying out a formal scientific investigation. Results also highlight instances of doing science and engineering not explicitly stated within parameters of formal education documents in the USA, such as experiences with failure.
NASA Technical Reports Server (NTRS)
Lacovara, R. C.
1990-01-01
The notions, benefits, and drawbacks of numerical simulation are introduced. Two formal simulation languages, Simscript and Modsim, are introduced. The capabilities of each are discussed briefly, and then the two programs are compared. The use of simulation in the process of design engineering for the Control and Monitoring System (CMS) for Space Station Freedom is discussed. The application of the formal simulation languages to the CMS design is presented, and recommendations are made as to their use.
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
Engineering in Communities: Learning by Doing
ERIC Educational Resources Information Center
Goggins, J.
2012-01-01
Purpose: The purpose of this paper is to focus on a number of initiatives in civil engineering undergraduate programmes at the National University of Ireland, Galway (NUIG) that allow students to complete engineering projects in the community, enabling them to learn by doing. Design/methodology/approach: A formal commitment to civic engagement was…
Integrating ethics in design through the value-sensitive design approach.
Cummings, Mary L
2006-10-01
The Accreditation Board for Engineering and Technology (ABET) has declared that to achieve accredited status, 'engineering programs must demonstrate that their graduates have an understanding of professional and ethical responsibility.' Many engineering professors struggle to integrate this required ethics instruction in technical classes and projects because of the lack of a formalized ethics-in-design approach. However, one methodology developed in human-computer interaction research, the Value-Sensitive Design approach, can serve as an engineering education tool which bridges the gap between design and ethics for many engineering disciplines. The three major components of Value-Sensitive Design (conceptual, technical, and empirical) are exemplified through a case study which focuses on the development of a command and control supervisory interface for a military cruise missile.
A Study of Technical Engineering Peer Reviews at NASA
NASA Technical Reports Server (NTRS)
Chao, Lawrence P.; Tumer, Irem Y.; Bell, David G.
2003-01-01
This report describes the state of practices of design reviews at NASA and research into what can be done to improve peer review practices. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing project formal reviews such as the Preliminary Design Review and Critical Design Review are a required part of every project and mission development. However, the technical, engineering peer reviews that support teams' work on such projects are informal, sometimes ad hoc, and inconsistent across the organization. The goal of this work is to identify best practices and lessons learned from NASA's experience, supported by academic research and methodologies, to ultimately improve the process. This research has determined that the organization, composition, scope, and approach of the reviews impact their success. Failure Modes and Effects Analysis (FMEA) can identify key areas of concern before or in the reviews. Product definition tools like the Project Priority Matrix, engineering-focused Customer Value Chain Analysis (CVCA), and project or system-based Quality Function Deployment (QFD) help prioritize resources in reviews. The use of information technology and structured design methodologies can strengthen the engineering peer review process to help NASA work towards error-proofing the design process.
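As a hedged sketch of how FMEA can flag areas of concern ahead of a peer review, the snippet below computes risk priority numbers (severity x occurrence x detection) for invented failure modes and sorts them; the scales are the usual 1-10 FMEA scales, and the entries themselves are illustrative only.

```python
# Hedged sketch: rank hypothetical failure modes by Risk Priority Number (RPN).
# Severity, occurrence, and detection use 1-10 scales; entries are invented.

failure_modes = [
    {"item": "reaction wheel",  "mode": "bearing seizure",     "sev": 9, "occ": 3, "det": 6},
    {"item": "flight software", "mode": "watchdog not reset",  "sev": 7, "occ": 4, "det": 3},
    {"item": "harness",         "mode": "connector mis-mate",  "sev": 5, "occ": 6, "det": 2},
]

for fm in failure_modes:
    fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

# Highest-RPN items are candidates for extra attention in the peer review.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f"{fm['item']:16s} {fm['mode']:22s} RPN={fm['rpn']}")
```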
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2000-01-01
Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely, and engines for new aircraft, are progressively required to operate under more demanding technological and environmental requirements. Designs that effectively meet those requirements are necessarily collections of multi-scale, multi-level, multidisciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to an open-ended set of existing and yet-to-be-developed discipline constructs required to formally predict and describe a structural response in engine operating environments. For example, these include, but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general-purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer, and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission, coupled structural/thermal analysis, various composite property simulators, and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of the proposed paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multidiscipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise, and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist that increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
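The probabilistic side of such a capability can be illustrated, very loosely, by a Monte Carlo scatter evaluation: sample uncertain design parameters, propagate them through a response model, and report the resulting scatter range. The response function and parameter distributions below are invented placeholders, not the EST/BEST or MP/ESTOP models referenced above.

```python
# Hedged sketch: Monte Carlo scatter evaluation of a made-up engine-weight response.
# The response function and the parameter distributions are placeholders only.
import random
import statistics

def engine_weight(fan_diameter_m, composite_fraction):
    # Invented response: weight drops as more of the structure is composite.
    return 2500.0 * fan_diameter_m**1.5 * (1.0 - 0.20 * composite_fraction)

random.seed(0)
samples = []
for _ in range(10_000):
    d = random.gauss(2.0, 0.05)    # fan diameter, m (assumed scatter)
    f = random.uniform(0.3, 0.5)   # composite fraction (assumed range)
    samples.append(engine_weight(d, f))

samples.sort()
lo, hi = samples[len(samples) // 100], samples[-len(samples) // 100]
print(f"mean weight ~ {statistics.mean(samples):.0f} kg, 1%-99% scatter ~ [{lo:.0f}, {hi:.0f}] kg")
```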
Review of Estelle and LOTOS with respect to critical computer applications
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the development process through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon formal requirements, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, are discussed.
NASA Technical Reports Server (NTRS)
Weber, Doug; Jamsek, Damir
1994-01-01
The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real spacecraft application, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.
Improving engineering system design by formal decomposition, sensitivity analysis, and optimization
NASA Technical Reports Server (NTRS)
Sobieski, J.; Barthelemy, J. F. M.
1985-01-01
A method for use in the design of a complex engineering system by decomposing the problem into a set of smaller subproblems is presented. Coupling of the subproblems is preserved by means of the sensitivity derivatives of the subproblem solution to the inputs received from the system. The method allows for the division of work among many people and computers.
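A minimal numerical illustration of the idea, with made-up subproblems: each subproblem is solved for a given system-level input, and finite-difference sensitivity derivatives of its solution with respect to that input approximate the coupling the system level needs.

```python
# Hedged sketch: coordinate a made-up subproblem through sensitivity derivatives.
# The subproblem model is invented; the pattern is solving the subproblem and
# exposing d(output)/d(input) to the system level.

def structural_subproblem(load):
    """Invented model: returns a sized mass for a given system-level load."""
    return 50.0 + 0.8 * load**0.9

def sensitivity(subproblem, x, h=1e-4):
    """Finite-difference derivative of the subproblem output w.r.t. its input."""
    return (subproblem(x + h) - subproblem(x - h)) / (2.0 * h)

load = 120.0
mass = structural_subproblem(load)
dmass_dload = sensitivity(structural_subproblem, load)

# The system level uses the linearized coupling to predict the effect of a
# design change without re-solving the subproblem.
delta_load = 5.0
print(f"mass ~ {mass:.1f}, predicted change for +{delta_load} load ~ {dmass_dload * delta_load:.2f}")
```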
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
NASA Technical Reports Server (NTRS)
Richardson, David
2018-01-01
Model-Based Systems Engineering (MBSE) is the formalized application of modeling to support system requirements, design, analysis, verification, and validation activities beginning in the conceptual design phase and continuing throughout development and later life-cycle phases. This presentation will discuss the value proposition that MBSE has for Systems Engineering, and the associated culture change needed to adopt it.
2011-08-01
design space is large. His research contributions are to the field of Decision-based Design, specifically in linking consumer preferences and ... Integrating Consumer Preferences into Engineering Design, to be published in 2012. He received his PhD from Northwestern University in Mechanical ...
Third NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
1995-01-01
This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World Wide Web.
Engineering Changes in Product Design - A Review
NASA Astrophysics Data System (ADS)
Karthik, K.; Janardhan Reddy, K., Dr
2016-09-01
Changes are fundamental to product development. Engineering changes are unavoidable and can arise at any phase of the product life cycle. Turning market requirements, customer/user feedback, manufacturing constraints, design innovations, etc., into viable products can be accomplished only when product change is managed properly. In the early design cycle, informal changes are accepted. However, changes become formal as their complexity and cost increase and as the product matures. To maximize market share, manufacturers have to manage engineering changes effectively and efficiently by means of Configuration Control. The paper gives a broad overview of ‘Engineering Change Management’ (ECM) through configuration management and its implications in product design. The aim is to give new researchers an idea and understanding of engineering changes in the product design scenario. This paper elaborates the significant aspects of managing engineering changes and the importance of ECM in a product life cycle.
Formalization of the engineering science discipline - knowledge engineering
NASA Astrophysics Data System (ADS)
Peng, Xiao
Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying-wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. The subsequent engineers were not aware of the previous developments, because these projects were documented in a way that prevented the next generation of engineers from benefiting from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from experienced engineers to the next generation. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion needs to be developed which is capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. Propose and develop an innovative Knowledge-Based System (KBS), the AVD KBS, forming a systematic approach to knowledge management. 4. Demonstrate the efficiency advantages of the AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, the AVD KBS. Formalizing knowledge proves to significantly impact the effectiveness of aerospace knowledge retention and utilization.
ERIC Educational Resources Information Center
Susomrith, Pattanee; Coetzer, Alan
2015-01-01
Purpose: This paper aims to investigate barriers to employee participation in voluntary formal training and development opportunities from the perspective of employees in small engineering businesses. Design/methodology/approach: An exploratory qualitative methodology involving data collection via site visits and in-depth semi-structured…
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("PowerPoint engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "PowerPoint engineering" on model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and the production of paper-based documents or their "office-productivity" file equivalents.
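The underlying pattern is a view transformation: walk the model and emit a text artifact, so the document is always a by-product of the model rather than a separately maintained file. The toy dictionary model below is an assumption for illustration; real implementations query a SysML/UML repository through the modeling tool's API.

```python
# Hedged sketch: generate a plain-text review artifact from a toy system model.
# The dictionary stands in for a SysML/UML model repository; actual tools would
# query the modeling environment instead.

model = {
    "system": "Sample Instrument",
    "requirements": [
        {"id": "REQ-001", "text": "The instrument shall operate from 22-36 V."},
        {"id": "REQ-002", "text": "The instrument mass shall not exceed 12 kg."},
    ],
    "components": ["detector", "electronics box", "radiator"],
}

def generate_review_section(m):
    lines = [f"System: {m['system']}", "", "Requirements:"]
    for req in m["requirements"]:
        lines.append(f"  {req['id']}: {req['text']}")
    lines.append("")
    lines.append("Components: " + ", ".join(m["components"]))
    return "\n".join(lines)

print(generate_review_section(model))
```

Because the artifact is regenerated on demand, a small design change in the model propagates to every generated document without manual editing.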
A systematic approach to embedded biomedical decision making.
Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver
2012-11-01
Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems, therefore they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish the functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we had established these system properties, we translated the formal model into an implementation. The resulting implementation was tested on XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model.
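A loose illustration of the near-linear speedup claim: partition the classification workload across worker processes and score each chunk independently. The linear decision function below is a hand-rolled stand-in for the paper's trained SVM, and the near-linear scaling only holds when each chunk is large enough to amortize process start-up.

```python
# Hedged sketch: data-parallel classification with multiprocessing.
# The linear decision function stands in for a trained SVM; speedup is roughly
# linear in worker count for sufficiently large chunks.
from multiprocessing import Pool

WEIGHTS = [0.4, -1.2, 0.7]   # stand-in model parameters (invented)
BIAS = 0.1

def classify_chunk(chunk):
    out = []
    for x in chunk:
        score = sum(w * xi for w, xi in zip(WEIGHTS, x)) + BIAS
        out.append(1 if score > 0 else -1)
    return out

def classify_parallel(samples, n_workers=4):
    size = (len(samples) + n_workers - 1) // n_workers
    chunks = [samples[i:i + size] for i in range(0, len(samples), size)]
    with Pool(n_workers) as pool:
        results = pool.map(classify_chunk, chunks)
    return [label for part in results for label in part]

if __name__ == "__main__":
    data = [[0.1 * i, 0.2, -0.3] for i in range(100_000)]
    labels = classify_parallel(data, n_workers=4)
    print(labels[:5], "...", len(labels), "samples classified")
```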
Engineering Aid 3 & 2, Vol. 1. Rate Training Manual and Nonresident Career Course.
ERIC Educational Resources Information Center
Naval Education and Training Command, Washington, DC.
Designed for individual study and not formal classroom instruction, this rate training manual provides subject matter that relates directly to the occupational qualifications of the Engineering Aid (EA) rating. This eight-chapter volume focuses on administrative matters, mathematics, and basic drafting. Chapter 1 discusses the scope of the EA…
ERIC Educational Resources Information Center
Huang, Shaobo; Mejia, Joel Alejandro; Becker, Kurt; Neilson, Drew
2015-01-01
Improving high school physics teaching and learning is important to the long-term success of science, technology, engineering, and mathematics (STEM) education. Efforts are currently in place to develop an understanding of science among high school students through formal and informal educational experiences in engineering design activities…
Engineering Aid 3 & 2, Vol. 2. Rate Training Manual.
ERIC Educational Resources Information Center
Bernal, Benito C., Jr.
Designed for individual study and not formal classroom instruction, this rate training manual provides subject matter that relates directly to the occupational qualifications of the Engineering Aid (EA) rating. This volume contains 10 chapters which deal with: (1) wood and light frame structures (examining the uses, kinds, sizes, and grades of…
Joint electrical engineering/physics course sequence for optics fundamentals and design
NASA Astrophysics Data System (ADS)
Magnusson, Robert; Maldonado, Theresa A.; Black, Truman D.
2000-06-01
Optics is a key technology in a broad range of engineering and science applications of high national priority. Engineers and scientists with a sound background in this field are needed to preserve technical leadership and to establish new directions of research and development. To meet this educational need, a joint Electrical Engineering/Physics optics course sequence was created as PHYS 3445 Fundamentals of Optics and EE 4444 Optical Systems Design, both with a laboratory component. The objectives are to educate EE and Physics undergraduate students in the fundamentals of optics; in interdisciplinary problem solving; in design and analysis; in handling optical components; and in skills such as communications and team cooperation. Written technical reports in professional format are required, formal presentations are given, and participation in paper design contests is encouraged.
NASA Technical Reports Server (NTRS)
Carr, Daniel; Ellenberger, Rich
2008-01-01
The Human Factors Implementation Team (HFIT) process has been used to verify human factors requirements for NASA International Space Station (ISS) payloads since 2003, resulting in $2.4 million in avoided costs. This cost benefit has been realized by greatly reducing the need to process time-consuming formal waivers (exceptions) for individual requirements violations. The HFIT team, which includes astronauts and their technical staff, acts as the single source for human factors requirements integration of payloads. HFIT has the authority to provide inputs during early design phases, thus eliminating many potential requirements violations in a cost-effective manner. In those instances where it is not economically or technically feasible to meet the precise metric of a given requirement, HFIT can work with the payload engineers to develop common sense solutions and formally document that the resulting payload design does not materially affect the astronaut's ability to operate and interact with the payload. The HFIT process is fully ISO 9000 compliant and works concurrently with NASA's formal systems engineering workflow. Due to its success with payloads, the HFIT process is being adapted and extended to ISS systems hardware. Key aspects of this process are also being considered for NASA's Space Shuttle replacement, the Crew Exploration Vehicle.
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina
Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).
User Participation and Participatory Design: Topics in Computing Education.
ERIC Educational Resources Information Center
Kautz, Karlheinz
1996-01-01
Discusses user participation and participatory design in the context of formal education for computing professionals. Topics include the current curriculum debate; mathematical- and engineering-based education; traditional system-development training; and an example of a course program that includes computers and society, and prototyping. (53…
Model-based engineering for medical-device software.
Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi
2010-01-01
This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.
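As a hedged illustration of checking a design model against a safety requirement, the sketch below models a tiny, invented bolus-lockout rule and exhaustively exercises it; the timing values and the rule itself are assumptions for the example, not the authors' PCA model or verification framework.

```python
# Hedged sketch: a toy infusion-pump model with an invented bolus-lockout rule,
# checked against the property "no two boluses closer than LOCKOUT_MIN minutes".
# Values and rule are illustrative only.

LOCKOUT_MIN = 10  # minutes (assumed)

class ToyPump:
    def __init__(self):
        self.last_bolus_time = None
        self.bolus_times = []

    def request_bolus(self, t):
        if self.last_bolus_time is None or t - self.last_bolus_time >= LOCKOUT_MIN:
            self.last_bolus_time = t
            self.bolus_times.append(t)
            return True
        return False  # request denied by the lockout rule

def safety_holds(bolus_times):
    return all(b - a >= LOCKOUT_MIN for a, b in zip(bolus_times, bolus_times[1:]))

# Exercise the model with a burst of requests and check the property.
pump = ToyPump()
for t in range(0, 60):
    pump.request_bolus(t)
assert safety_holds(pump.bolus_times), "lockout rule violated"
print("safety property holds for", len(pump.bolus_times), "delivered boluses")
```

A real verification effort would state the property formally and check it over the full state space of the design model rather than over a scripted scenario.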
Management Of Optical Projects
NASA Astrophysics Data System (ADS)
Young, Peter S.; Olson, David R.
1981-03-01
This paper discusses the management of optical projects from the concept stage, beginning with system specifications, through design, optical fabrication and test tasks. Special emphasis is placed on effective coupling of design engineering with fabrication development and utilization of available technology. Contrasts are drawn between accepted formalized management techniques, the realities of dealing with fragile components and the necessity of an effective project team which integrates the special characteristics of highly skilled optical specialists including lens designers, optical engineers, opticians, and metrologists. Examples are drawn from the HEAO-2 X-Ray Telescope and Space Telescope projects.
Analyses of Public Utility Building - Students Designs, Aimed at their Energy Efficiency Improvement
NASA Astrophysics Data System (ADS)
Wołoszyn, Marek Adam
2017-10-01
Public utility buildings are formally, structurally and functionally complex entities. Frequently, the process of their design involves the retroactive reconsideration of energy engineering issues once a building concept has already been completed. At that stage, minor formal corrections are made along with the design of the external layer of the building in order to satisfy applicable standards. Architecture students do the same when designing assigned public utility buildings. To demonstrate the energy-related defects of building designs developed by students, a set of analyses was proposed. The completed designs of public utility buildings were examined for the energy efficiency of the solutions they feature using the following programs: Ecotect and Vasari; for simpler analyses, ArchiCad program extensions were sufficient.
Implementation of Scene Shadows in the Target Acquisition TDA (TARGAC).
1994-11-01
Appendix C: Engineering Change Reports; Appendix D: Task ... Appendix C contains the details of each change made. Each change is accompanied by an Engineering Change Report (ECR) and in-line documentation of the source code. Appendix D is a formal design document of the changes needed to implement shadowing by small-scale features. The implementation presented in ...
What can formal methods offer to digital flight control systems design
NASA Technical Reports Server (NTRS)
Good, Donald I.
1990-01-01
Formal methods research is beginning to produce methods which will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.
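A very small example of the kind of discrete, exhaustive reasoning this enables: enumerate every reachable state of a toy mode-switch model and check an invariant on all of them, something continuous flight-dynamics models cannot do for software logic. The model and invariant are invented for illustration.

```python
# Hedged sketch: exhaustive reachability check of a toy mode-logic model.
# States are (mode, autopilot_engaged); the invariant says the autopilot is
# never engaged in MANUAL mode. The model itself is invented.
from collections import deque

def successors(state):
    mode, engaged = state
    nxt = set()
    # Mode switch: switching to MANUAL drops engagement.
    nxt.add(("AUTO", engaged) if mode == "MANUAL" else ("MANUAL", False))
    if mode == "AUTO":
        nxt.add((mode, True))   # engage allowed only in AUTO
    nxt.add((mode, False))      # disengage always allowed
    return nxt

def invariant(state):
    mode, engaged = state
    return not (mode == "MANUAL" and engaged)

initial = ("MANUAL", False)
seen, frontier = {initial}, deque([initial])
while frontier:
    s = frontier.popleft()
    assert invariant(s), f"invariant violated in state {s}"
    for t in successors(s) - seen:
        seen.add(t)
        frontier.append(t)
print(f"invariant holds over all {len(seen)} reachable states")
```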
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saw, C; Baikadi, M; Peters, C
2015-06-15
Purpose: To use systems engineering to design an HDR skin treatment operation for small lesions using shielded applicators, in order to enhance patient safety. Methods: Systems engineering is an interdisciplinary field that offers formal methodologies to study, design, implement, and manage complex engineering systems as a whole over their life cycles. The methodologies deal with human work processes, coordination of different teams, optimization, and risk management. The V-model of systems engineering emphasizes two streams, specification and testing: the specification stream consists of user requirements, functional requirements, and design specifications, while the testing stream covers installation, operational, and performance specifications. In applying systems engineering to this project, the user and functional requirements are that (a) HDR unit parameters be downloaded from the treatment planning system, (b) dwell times and positions be generated by the treatment planning system, (c) source decay be computer calculated, and (d) a double-check system of treatment parameters comply with the NRC regulation. These requirements are intended to reduce human intervention and improve patient safety. Results: A formal investigation indicated that the user requirements can be satisfied. The treatment operation consists of using the treatment planning system to generate a pseudo plan that is adjusted for different shielded applicators to compute the dwell times. The dwell positions, channel numbers, and dwell times are verified by the medical physicist and downloaded into the HDR unit. The decayed source strength is transferred to a spreadsheet that computes the dwell times based on the type of applicator and the prescribed dose. Prior to treatment, the source strength, dwell times, dwell positions, and channel numbers are double-checked by the radiation oncologist. No dosimetric parameters are manually calculated. Conclusion: Systems engineering provides methodologies to effectively design an HDR treatment operation that minimizes human intervention and improves patient safety.
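A rough sketch of the spreadsheet step described above: scale the planned dwell times by the ratio of planned to decayed source strength, assuming simple exponential decay. The half-life shown is the commonly quoted Ir-192 value, and the numerical inputs are invented for illustration, not the clinical workbook.

```python
# Hedged sketch: scale HDR dwell times for source decay (illustrative numbers only).
# Assumes exponential decay with the commonly quoted Ir-192 half-life.
import math

HALF_LIFE_DAYS = 73.8  # Ir-192, approximate

def decayed_strength(initial_strength, days_elapsed):
    return initial_strength * math.exp(-math.log(2) * days_elapsed / HALF_LIFE_DAYS)

def scaled_dwell_times(planned_times_s, planned_strength, current_strength):
    # Delivered dose is proportional to strength x time, so time scales
    # inversely with the current source strength.
    factor = planned_strength / current_strength
    return [t * factor for t in planned_times_s]

planned = [12.0, 8.5, 8.5, 12.0]   # seconds, from the pseudo plan (invented)
s0 = 40700.0                       # source strength at planning (invented units)
s_now = decayed_strength(s0, days_elapsed=30)
print([round(t, 1) for t in scaled_dwell_times(planned, s0, s_now)])
```

In the workflow described in the abstract, the output of this step is still independently double-checked by the physicist and the radiation oncologist before treatment.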
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications, which include some semantic information about the system, into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification of very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
Genetic Design Automation: engineering fantasy or scientific renewal?
Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean
2013-01-01
Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068
Preparing engineers for the challenges of community engagement
NASA Astrophysics Data System (ADS)
Harsh, Matthew; Bernstein, Michael J.; Wetmore, Jameson; Cozzens, Susan; Woodson, Thomas; Castillo, Rafael
2017-11-01
Despite calls to address global challenges through community engagement, engineers are not formally prepared to engage with communities. Little research has been done on means to address this 'engagement gap' in engineering education. We examine the efficacy of an intensive, two-day Community Engagement Workshop for engineers, designed to help engineers better look beyond technology, listen to and learn from people, and empower communities. We assessed the efficacy of the workshop in a non-experimental pre-post design using a questionnaire and a concept map. Questionnaire results indicate participants came away better able to ask questions more broadly inclusive of non-technological dimensions of engineering projects. Concept map results indicate participants have a greater understanding of ways social factors shape complex material systems after completing the programme. Based on the workshop's strengths and weaknesses, we discuss the potential of expanding and supplementing the programme to help engineers account for social aspects central to engineered systems.
Fusing Quantitative Requirements Analysis with Model-based Systems Engineering
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven
2006-01-01
A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.
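A minimal sketch of the "design by shopping" idea under invented constraints and preferences: filter the option space by hard constraints, then rank what survives by a preference score, so the full range of admissible choices and their ramifications stays visible to stakeholders.

```python
# Hedged sketch: "design by shopping" over an invented option space.
# Hard constraints filter options; a simple weighted preference score ranks the rest.

options = [
    {"name": "opt-1", "mass_kg": 95,  "cost_m": 4.2, "data_gb_day": 40},
    {"name": "opt-2", "mass_kg": 120, "cost_m": 3.1, "data_gb_day": 55},
    {"name": "opt-3", "mass_kg": 88,  "cost_m": 5.0, "data_gb_day": 70},
]

def feasible(o):
    # Invented hard constraints (requirements as constraints).
    return o["mass_kg"] <= 100 and o["cost_m"] <= 5.0

def preference(o):
    # Invented preference weights (non-functional preferences).
    return 0.6 * o["data_gb_day"] - 0.4 * o["cost_m"] * 10

shortlist = sorted(filter(feasible, options), key=preference, reverse=True)
for o in shortlist:
    print(o["name"], round(preference(o), 1))
```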
Automotive Stirling Engine Development Program
NASA Technical Reports Server (NTRS)
Nightingale, N.; Richey, A.; Farrell, R.; Riecke, G.; Ernst, W.; Howarth, R.; Cronin, M.; Simetkosky, M.; Smith, G.; Meacher, J.
1985-01-01
Development test activities on Mod I engines directed toward evaluating technologies for potential inclusion in the Mod II engine are summarized. Activities covered include: test of a 12-tube combustion gas recirculation combustor; manufacture and flow-distribution test of a two-manifold annular heater head; piston rod/piston base joint; single-solid piston rings; and a digital air/fuel concept. Also summarized are results of a formal assessment of candidate technologies for the Mod II engine, and preliminary design work for the Mod II. The overall program philosophy is outlined, and data and test results are presented.
Linear quadratic servo control of a reusable rocket engine
NASA Technical Reports Server (NTRS)
Musgrave, Jeffrey L.
1991-01-01
The paper deals with the development of a design method for a servo component in the frequency domain using singular values and its application to a reusable rocket engine. A general methodology used to design a class of linear multivariable controllers for intelligent control systems is presented. Focus is placed on performance and robustness characteristics, and an estimator design performed in the framework of the Kalman-filter formalism with emphasis on using a sensor set different from the commanded values is discussed. It is noted that loop transfer recovery modifies the nominal plant noise intensities in order to obtain the desired degree of robustness to uncertainty reflected at the plant input. Simulation results demonstrating the performance of the linear design on a nonlinear engine model over all power levels during mainstage operation are discussed.
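For readers unfamiliar with the machinery, the snippet below computes a state-feedback gain for a toy two-state plant by solving the continuous-time algebraic Riccati equation; the plant matrices and weights are invented stand-ins, not the rocket engine model, and the Kalman estimator and loop transfer recovery steps discussed in the paper are omitted.

```python
# Hedged sketch: LQ state-feedback gain for an invented two-state plant.
# Matrices and weights are placeholders, not the reusable rocket engine model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])       # invented plant dynamics
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])           # state weighting (assumed)
R = np.array([[0.1]])              # control weighting (assumed)

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P     # u = -K x minimizes the quadratic cost

closed_loop_poles = np.linalg.eigvals(A - B @ K)
print("gain K =", np.round(K, 3))
print("closed-loop poles =", np.round(closed_loop_poles, 3))
```

Heavier state weighting in Q pushes the closed-loop poles further left at the cost of larger control effort, which is the basic performance/robustness trade the paper shapes in the frequency domain with singular values.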
Pedagogical Basis of DAS Formalism in Engineering Education
ERIC Educational Resources Information Center
Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.
2011-01-01
The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…
Light UAV Support Ship (ASW) (LUSSA)
2011-08-01
TriSWACH Model Test Data ... Figure 8: TriSWACH Model ... Innovation in Ship Design (CISD) used the Northrop Grumman Bat UAV (formerly known as the Swift Engineering Killer Bee KB4) to model launch, recovery, and ...
A Biotic Game Design Project for Integrated Life Science and Engineering Education
Denisin, Aleksandra K.; Rensi, Stefano; Sanchez, Gabriel N.; Quake, Stephen R.; Riedel-Kruse, Ingmar H.
2015-01-01
Engaging, hands-on design experiences are key for formal and informal Science, Technology, Engineering, and Mathematics (STEM) education. Robotic and video game design challenges have been particularly effective in stimulating student interest, but equivalent experiences for the life sciences are not as developed. Here we present the concept of a "biotic game design project" to motivate student learning at the interface of life sciences and device engineering (as part of a cornerstone bioengineering devices course). We provide all course material and also present efforts in adapting the project's complexity to serve other time frames, age groups, learning focuses, and budgets. Students self-reported that they found the biotic game project fun and motivating, resulting in increased effort. Hence this type of design project could generate excitement and educational impact similar to robotics and video games. PMID:25807212
A biotic game design project for integrated life science and engineering education.
Cira, Nate J; Chung, Alice M; Denisin, Aleksandra K; Rensi, Stefano; Sanchez, Gabriel N; Quake, Stephen R; Riedel-Kruse, Ingmar H
2015-03-01
Engaging, hands-on design experiences are key for formal and informal Science, Technology, Engineering, and Mathematics (STEM) education. Robotic and video game design challenges have been particularly effective in stimulating student interest, but equivalent experiences for the life sciences are not as developed. Here we present the concept of a "biotic game design project" to motivate student learning at the interface of life sciences and device engineering (as part of a cornerstone bioengineering devices course). We provide all course material and also present efforts in adapting the project's complexity to serve other time frames, age groups, learning focuses, and budgets. Students self-reported that they found the biotic game project fun and motivating, resulting in increased effort. Hence this type of design project could generate excitement and educational impact similar to robotics and video games.
Air Force Space Command. Space and Missile Systems Center Standard. Configuration Management
2008-06-13
Aerospace Corporation report number TOR-2006(8583)-1. 3. Beneficial comments (recommendations, additions, deletions) and any pertinent data that ... Engineering Drawing Practices; IEEE STD 610.12, Glossary of Software Engineering Terminology, September 28, 1990; ISO/IEC 12207, Software Life ... item, regardless of media, formally designated and fixed at a specific time during the configuration item's life cycle. (Source: ISO/IEC 12207
ERIC Educational Resources Information Center
Oakes, G. L.; Felton, A. J.; Garner, K. B.
2006-01-01
The BSc in computer aided product design (CAPD) course at the University of Wolverhampton was conceived as a collaborative venture in 1989 between the School of Engineering and the School of Art and Design. The award was at the forefront of forging interdisciplinary collaboration at undergraduate level in the field of product design. It has…
Design review - A tool for all seasons.
NASA Technical Reports Server (NTRS)
Liberman, D. S.
1972-01-01
The origins of design review are considered together with questions of definitions. The main characteristics which distinguish the concept of design review discussed from the basic master-apprentice relationship include competence, objectivity, formality, and a systematic approach. Preliminary, major, and final reviews are the steps used in the management of the design and development process in each company. It is shown that the design review is generically a systems engineering milestone review with certain unique characteristics.
The Role of Formal Experiment Design in Hypersonic Flight System Technology Development
NASA Technical Reports Server (NTRS)
McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.
2002-01-01
Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with the propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, a statistical design approach is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program, for both systems analysis and experimental testing. Applications of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper discusses the design considerations for scramjet-powered vehicles and the specifics of MDOE as utilized for Hyper-X, and presents highlights from the use of these MDOE methods within the Hyper-X Program.
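To make the MDOE idea concrete, the following sketch runs a two-level full factorial over three coded design variables and fits a main-effects-plus-interactions response model by least squares. The response function is a placeholder, not a Hyper-X analysis tool.

```python
# Hedged sketch of the Design-of-Experiments idea: run a two-level full
# factorial over a few design variables and fit a linear-plus-interaction
# response-surface model. The response function below is a stand-in, not a
# Hyper-X analysis code.
import itertools
import numpy as np

def response(x1, x2, x3):
    # placeholder "analysis" output (e.g., a performance metric); illustrative only
    return 3.0 + 1.5*x1 - 0.8*x2 + 0.4*x3 + 0.6*x1*x2

# coded factor levels -1/+1 for three design variables
runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
y = np.array([response(*r) for r in runs])

# regression matrix: intercept, main effects, two-factor interactions
X = np.column_stack([
    np.ones(len(runs)),
    runs,                                   # x1, x2, x3
    runs[:, 0]*runs[:, 1],                  # x1*x2
    runs[:, 0]*runs[:, 2],                  # x1*x3
    runs[:, 1]*runs[:, 2],                  # x2*x3
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted effects:", np.round(coef, 3))
```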
Applying formal methods and object-oriented analysis to existing flight software
NASA Technical Reports Server (NTRS)
Cheng, Betty H. C.; Auernheimer, Brent
1993-01-01
Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.
Reconfigurable Hardware Adapts to Changing Mission Demands
NASA Technical Reports Server (NTRS)
2003-01-01
A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA's Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA's Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
NASA systems engineering handbook
NASA Astrophysics Data System (ADS)
Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; McDuffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou
1995-06-01
This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive.
Genetic design automation: engineering fantasy or scientific renewal?
Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean
2012-02-01
The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.
Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings.
Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P; Kravitz, Richard L; Owen, Richard R; Sullivan, J Greer; Wu, Albert W; Di Capua, Paul; Hoagwood, Kimberly Eaton
2015-09-01
Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to accommodate local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shuman, A.; Beavis, L.
Early in 1990, J. A. Wilder, Supervisor of Sandia National Laboratories (SNLA), Division 2565, requested that a meeting of the scientists and engineers responsible for developing and producing switch tubes be set up to discuss in a semi-formal way the science and technology of switch tubes. Programmatic and administrative issues were specifically exempted from the discussions. L. Beavis, Division 7471, SNL, and A. Shuman, EG&G, Salem, were made responsible for organizing a program including the materials and processes of switch tubes. The purpose of the Switch Tube Advanced Technology meeting was to allow personnel from Allied Signal Kansas City Division (AS/KCD); EG&G, Salem; and Sandia National Laboratories (SNL) to discuss a variety of issues involved in the development and production of switch tubes. It was intended that the formal and informal discussions would allow a better understanding of the production problems by material and process engineers and of the materials and processes by production engineers. This program consisted of formal presentations on May 23 and informal discussions on May 24. The topics chosen for formal presentation were suggested by the people of AS/KCD, EG&G, Salem, and SNL involved with the design, development and production of switch tubes. The topics selected were generic. They were not directed to any specific switch tube but rather to all switch tubes in production and development. This document includes summaries of the material presented at the formal presentation on May 23.
NASA Technical Reports Server (NTRS)
Hall, Philip; Whitfield, Susan
2011-01-01
As NASA undertakes increasingly complex projects, the need for expert systems engineers and leaders in systems engineering is becoming more pronounced. As a result of this issue, the Agency has undertaken an initiative to develop more systems engineering leaders through its Systems Engineering Leadership Development Program; however, the NASA Office of the Chief Engineer has also called on the field Centers to develop mechanisms to strengthen their expertise in systems engineering locally. In response to this call, Marshall Space Flight Center (MSFC) has developed a comprehensive development program for aspiring systems engineers and systems engineering leaders. This presentation will summarize the two-level program, which consists of a combination of training courses and on-the-job, developmental training assignments at the Center to help develop stronger expertise in systems engineering and technical leadership. In addition, it will focus on the success the program has had in its pilot year. The program hosted a formal kickoff event for Level I on October 13, 2009. The first class includes 42 participants from across MSFC and Michoud Assembly Facility (MAF). A formal call for Level II is forthcoming. With the new Agency focus on research and development of new technologies, having a strong pool of well-trained systems engineers is becoming increasingly more critical. Programs such as the Marshall Systems Engineering Leadership Development Program, as well as those developed at other Centers, help ensure that there is an upcoming generation of trained systems engineers and systems engineering leaders to meet future design challenges.
An Overview of Starfish: A Table-Centric Tool for Interactive Synthesis
NASA Technical Reports Server (NTRS)
Tsow, Alex
2008-01-01
Engineering is an interactive process that requires intelligent interaction at many levels. My thesis [1] advances an engineering discipline for high-level synthesis and architectural decomposition that integrates perspicuous representation, designer interaction, and mathematical rigor. Starfish, the software prototype for the design method, implements a table-centric transformation system for reorganizing control-dominated system expressions into high-level architectures. Based on the digital design derivation (DDD) system, a designer-guided synthesis technique that applies correctness-preserving transformations to synchronous data flow specifications expressed as co-recursive stream equations, Starfish enhances user interaction and extends the reachable design space by incorporating four innovations: behavior tables, serialization tables, data refinement, and operator retiming. Behavior tables express systems of co-recursive stream equations as a table of guarded signal updates. Developers and users of the DDD system used manually constructed behavior tables to help them decide which transformations to apply and how to specify them. These design exercises produced several formally constructed hardware implementations: the FM9001 microprocessor, an SECD machine for evaluating LISP, and the SchemEngine, a garbage-collected machine for interpreting a byte-code representation of compiled Scheme programs. Bose and Tuna, two of DDD's developers, have subsequently commercialized the design derivation methodology at Derivation Systems, Inc. (DSI). DSI has formally derived and validated PCI bus interfaces and a Java byte-code processor; they further executed a contract to prototype SPIDER, NASA's ultra-reliable communications bus. To date, most derivations from DDD and DRS have targeted hardware due to its synchronous design paradigm. However, Starfish expressions are independent of the synchronization mechanism; there is no commitment to hardware or globally broadcast clocks. Though software back-ends for design derivation are limited to the DDD stream-interpreter, targeting synchronous or real-time software is not substantively different from targeting hardware.
Toward a Model-Based Approach to Flight System Fault Protection
NASA Technical Reports Server (NTRS)
Day, John; Murray, Alex; Meakin, Peter
2012-01-01
Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP-related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Development of approaches that enable modeling of FP concerns in the same model as the system hardware and software design enables establishment of formal relationships that have great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain, and then continues with a presentation of a SysML/UML model of the FP domain and the particular analyses that it contains, by way of showing a potential model-based approach to flight system fault protection, and an exposition of the use of the FP models in FSW engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.
A system approach to aircraft optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1991-01-01
Mutual couplings among the mathematical models of physical phenomena and parts of a system such as an aircraft complicate the design process because each contemplated design change may have a far reaching consequence throughout the system. Techniques are outlined for computing these influences as system design derivatives useful for both judgemental and formal optimization purposes. The techniques facilitate decomposition of the design process into smaller, more manageable tasks and they form a methodology that can easily fit into existing engineering organizations and incorporate their design tools.
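A minimal numerical illustration of the "system design derivative" idea: for two coupled disciplines, the total derivatives of the coupled outputs with respect to a design variable follow from the discipline partials by solving a small linear system. The discipline functions and partial derivatives below are toy values, not taken from the paper.

```python
# Illustrative sketch of computing system design derivatives for two coupled
# disciplines y1 = f1(x, y2), y2 = f2(x, y1). With partial derivatives in hand,
# the total derivatives dy/dx follow from solving a small linear system
# (I - dF/dy) dy/dx = dF/dx. The functions here are toy stand-ins.
import numpy as np

def f1(x, y2): return 0.5*x + 0.3*y2        # discipline 1 (assumed)
def f2(x, y1): return -0.2*x + 0.4*y1       # discipline 2 (assumed)

# partial derivatives of the toy disciplines
df1_dx, df1_dy2 = 0.5, 0.3
df2_dx, df2_dy1 = -0.2, 0.4

# assemble (I - J) dy/dx = b, where J holds the cross-coupling partials
J = np.array([[0.0,     df1_dy2],
              [df2_dy1, 0.0    ]])
b = np.array([df1_dx, df2_dx])
dy_dx = np.linalg.solve(np.eye(2) - J, b)
print("total derivatives dy1/dx, dy2/dx =", dy_dx)
```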
Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse
NASA Technical Reports Server (NTRS)
Gannod, Gerald C.
2002-01-01
This document describes 3rd year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but that a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file based reverse engineering, service-based software reuse, and a source to XML generator.
7 CFR 1726.201 - Formal competitive bidding.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...
7 CFR 1726.201 - Formal competitive bidding.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...
7 CFR 1726.201 - Formal competitive bidding.
Code of Federal Regulations, 2014 CFR
2014-01-01
... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...
7 CFR 1726.201 - Formal competitive bidding.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...
7 CFR 1726.201 - Formal competitive bidding.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...
NASA Systems Engineering Handbook
NASA Technical Reports Server (NTRS)
Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; Mcduffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou
1995-01-01
This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive. Superseded by: NASA/SP-2007-6105 Rev 1 (20080008301).
The Direction of Web-based Training: A Practitioner's View.
ERIC Educational Resources Information Center
Kilby, Tim
2001-01-01
Web-based training has had achievements and disappointments as online learning has matured. Best practices include user-centered design, knowledge object structures, usability engineering, and formal evaluation. Knowledge management, peer-to-peer learning, and personal learning appliances will continue to alter the online learning landscape. (SK)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-16
... automobile engines. The company reports that workers leased from Caravan Knight Facilities Management, LLC..., Formally Known as Chrysler LLC, Kenosha Engine Plant, Including On-Site Leased Workers From Caravan Knight..., Kenosha Engine Plant, Kenosha, Wisconsin. The notice was published in ...
NASA Astrophysics Data System (ADS)
Millet, Charlyne; Oget, David; Cavallucci, Denis
2017-11-01
Innovation is a key component of the success and longevity of companies. Our research opens the 'black box' of creativity and innovation in R&D teams. We argue that understanding the nature of R&D projects in terms of creativity/innovation and efficiency/inefficiency is important for designing education policies and improving engineering curricula. Our research addresses the inventive design process, a lesser known aspect of the innovation process, in 197 R&D departments of industrial sector companies in France. One fundamental issue facing companies is how to evaluate the processes and results of innovation. Results show that the evaluation of innovation is constrained by a lack of methodology for inventive projects. We establish the foundations of a formal ontology for inventive design projects and, finally, offer some recommendations for engineering education.
Integrated analysis of engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1981-01-01
The need for light, durable, fuel efficient, cost effective aircraft requires the development of engine structures which are flexible, made from advanced materials (including composites), resist higher temperatures, maintain tighter clearances, and have lower maintenance costs. The formal quantification of any or several of these requires integrated computer programs (multilevel and/or interdisciplinary analysis programs interconnected) for engine structural analysis/design. Several integrated analysis computer programs are under development at Lewis Research Center. These programs include: (1) COBSTRAN - Composite Blade Structural Analysis, (2) CODSTRAN - Composite Durability Structural Analysis, (3) CISTRAN - Composite Impact Structural Analysis, (4) STAEBL - Structural Tailoring of Engine Blades, and (5) ESMOSS - Engine Structures Modeling Software System. Three other related programs, developed under Lewis sponsorship, are described.
1985-02-21
Title: Meta-Analysis of Human Factors Engineering Studies Comparing Individual Differences, Practice ... Outline: Background; Opportunity; Significance; History; III. Phase I Final Report: Literature Review; Formal Analysis; Results; Implications for Phase II; IV. ...
Perspectives on knowledge in engineering design
NASA Technical Reports Server (NTRS)
Rasdorf, W. J.
1985-01-01
Various perspectives are given of the knowledge currently used in engineering design, specifically dealing with knowledge-based expert systems (KBES). Constructing an expert system often reveals inconsistencies in domain knowledge while formalizing it. The types of domain knowledge (facts, procedures, judgments, and control) differ from the classes of that knowledge (creative, innovative, and routine). The feasible tasks for expert systems can be determined based on these types and classes of knowledge. Interpretive tasks require reasoning about a task in light of the knowledge available, whereas generative tasks create potential solutions to be tested against constraints. Only after classifying the domain by type and level can the engineer select a knowledge-engineering tool for the domain being considered. The critical features to be weighed after classification are knowledge representation techniques, control strategies, interface requirements, compatibility with traditional systems, and economic considerations.
Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.
ERIC Educational Resources Information Center
Rosenberg, R.C.; And Others
These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…
Improving Function Allocation for Integrated Systems Design
1996-06-01
in the movie Star Trek: The Next Generation, the android Data is both perceived and treated as a member of the crew. That type of perceptual change ... time-consuming, formal contract change. Laboratory view of function allocation: in 1951, Paul M. Fitts, the founder of the Human Engineering Division ...
Euler Teaches a Class in Structural Steel Design
ERIC Educational Resources Information Center
Boyajian, David M.
2009-01-01
Even before steel was a topic of formal study for structural engineers, the brilliant eighteenth century Swiss mathematician and physicist, Leonhard Euler (1707-1783), investigated the theory governing the elastic behaviour of columns, the results of which are incorporated into the American Institute of Steel Construction's (AISC's) Bible: the…
Describing the What and Why of Students' Difficulties in Boolean Logic
ERIC Educational Resources Information Center
Herman, Geoffrey L.; Loui, Michael C.; Kaczmarczyk, Lisa; Zilles, Craig
2012-01-01
The ability to reason with formal logic is a foundational skill for computer scientists and computer engineers that scaffolds the abilities to design, debug, and optimize. By interviewing students about their understanding of propositional logic and their ability to translate from English specifications to Boolean expressions, we characterized…
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice; Baggs, Rhoda
2007-01-01
Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
Engineering and Software Engineering
NASA Astrophysics Data System (ADS)
Jackson, Michael
The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.
Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings
Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P.; Kravitz, Richard L.; Owen, Richard R.; Sullivan, Greer; Wu, Albert W.; Di Capua, Paul; Hoagwood, Kimberly Eaton
2015-01-01
Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to accommodate local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context. PMID:25217100
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... automobile engines. The company reports that workers leased from Syncreon were employed on-site at the..., Formally Known as Chrysler LLC, Kenosha Engine Plant, Including On-Site Leased Workers From Caravan Knight... Chrysler, LLC, Kenosha Engine Plant, Kenosha, Wisconsin. The notice was published in the Federal Register...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopez, M.A.M.
1995-03-01
Even though the ALARA philosophy was formally implemented in the early 1980s, to some extent, ALARA considerations already had been incorporated into the design of most commercial equipment and facilities based on experience and engineering development. In Mexico, the design of medical and industrial facilities was based on international recommendations containing those considerations. With the construction of Laguna Verde Nuclear Power Station, formal ALARA groups were created to review some parts of its design, and to prepare the ALARA Program and related procedures necessary for its commercial operation. This paper begins with a brief historical description of ALARA development in Mexico, and then goes on to discuss our regulatory frame in Radiation Protection, some aspects of the ALARA Program, efforts in controlling and reducing sources of radiation, and finally, future perspectives in the ALARA field.
Matsuoka, Yukiko; Ghosh, Samik; Kitano, Hiroaki
2009-01-01
The discovery by design paradigm driving research in synthetic biology entails the engineering of de novo biological constructs with well-characterized input–output behaviours and interfaces. The construction of biological circuits requires iterative phases of design, simulation and assembly, leading to the fabrication of a biological device. In order to represent engineered models in a consistent visual format and further simulating them in silico, standardization of representation and model formalism is imperative. In this article, we review different efforts for standardization, particularly standards for graphical visualization and simulation/annotation schemata adopted in systems biology. We identify the importance of integrating the different standardization efforts and provide insights into potential avenues for developing a common framework for model visualization, simulation and sharing across various tools. We envision that such a synergistic approach would lead to the development of global, standardized schemata in biology, empowering deeper understanding of molecular mechanisms as well as engineering of novel biological systems. PMID:19493898
Design of Astrometric Mission (JASMINE) by Applying Model Driven System Engineering
NASA Astrophysics Data System (ADS)
Yamada, Y.; Miyashita, H.; Nakamura, H.; Suenaga, K.; Kamiyoshi, S.; Tsuiki, A.
2010-12-01
We are planning a space astrometric satellite mission named JASMINE. The target accuracy of parallaxes in JASMINE observations is 10 micro arc seconds, which corresponds to a 1 nm scale on the focal plane. It is very hard to measure 1 nm scale deformation of the focal plane. Eventually, we need to add the deformation to the observation equations when estimating stellar astrometric parameters, which requires considering many factors such as instrument models and observation data analysis. In this situation, because the observation equations become more complex, we may reduce the stability of the hardware; nevertheless, we require more samplings due to the lack of rigidity of each estimation. This mission imposes a number of trade-offs in the engineering choices, and we must then decide the optimal design from a number of candidates. In order to efficiently support such decisions, we apply Model Driven Systems Engineering (MDSE), which improves the efficiency of the engineering by revealing and formalizing requirements, specifications, and designs to find a good balance among various trade-offs.
Statistical analysis of Turbine Engine Diagnostic (TED) field test data
NASA Astrophysics Data System (ADS)
Taylor, Malcolm S.; Monyak, John T.
1994-11-01
During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.
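For readers unfamiliar with cross-over designs, the following hedged sketch shows one conventional way to estimate a treatment effect from a two-period, two-sequence (AB/BA) layout; the data are synthetic, the analysis is deliberately simplified (no carry-over test), and none of it reproduces the TED field-test analysis.

```python
# Hedged sketch of a basic two-period, two-sequence (AB/BA) cross-over
# analysis, in the spirit of the field-test design described above. The
# numbers are synthetic; a real analysis would also examine carry-over.
import numpy as np
from scipy import stats

# within-subject responses: columns = (period 1, period 2)
seq_AB = np.array([[5.1, 4.2], [6.0, 5.1], [5.5, 4.8], [5.9, 5.0]])  # A then B
seq_BA = np.array([[4.0, 5.2], [4.6, 5.5], [4.4, 5.6], [4.1, 5.0]])  # B then A

# period-1 minus period-2 differences within each sequence
d_AB = seq_AB[:, 0] - seq_AB[:, 1]
d_BA = seq_BA[:, 0] - seq_BA[:, 1]

# the treatment effect (A - B) is half the difference of the mean sequence
# differences; a two-sample t-test on the differences tests it formally
effect = 0.5 * (d_AB.mean() - d_BA.mean())
t, p = stats.ttest_ind(d_AB, d_BA)
print(f"estimated A-B effect: {effect:.3f}, t = {t:.2f}, p = {p:.3f}")
```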
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.
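The sketch below is only a toy analogue of the translation idea: a structural netlist is mapped to a logical relation (the conjunction of its gate relations) and checked against a behavioural specification, here by exhaustive enumeration in Python rather than by proof in the HOL theorem prover. The netlist format and helper names are invented for illustration.

```python
# Toy illustration of the translation idea: a structural netlist is turned
# into a logical relation (conjunction of gate relations) and checked against
# a behavioural specification by exhaustive enumeration. The real work targets
# the HOL theorem prover; this is only a Python analogue of the concept.
from itertools import product

# half-adder netlist: each gate relates its output wire to its input wires
NETLIST = [
    ("xor", "sum",   ("a", "b")),
    ("and", "carry", ("a", "b")),
]
GATES = {"xor": lambda x, y: x ^ y, "and": lambda x, y: x and y}

def structure_holds(env):
    """True when every gate relation in the netlist is satisfied by env."""
    return all(env[out] == GATES[g](*(env[i] for i in ins))
               for g, out, ins in NETLIST)

def spec_holds(env):
    """Behavioural specification: a + b = 2*carry + sum."""
    return env["a"] + env["b"] == 2*env["carry"] + env["sum"]

# verification by exhaustive enumeration over all wire valuations
wires = ["a", "b", "sum", "carry"]
ok = all(spec_holds(dict(zip(wires, vals)))
         for vals in product([0, 1], repeat=len(wires))
         if structure_holds(dict(zip(wires, vals))))
print("structure implies specification:", ok)
```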
Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.
1991-01-01
The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.
Sawlog weights for Appalachian hardwoods
Floyd G. Timson; Floyd G. Timson
1972-01-01
The tables are presented in this paper as reference material needed as a foundation for further work in the field of hardwood log weights. Such work may be undertaken by researchers, engineers, and equipment designers in the form of formal and informal studies, or by timbermen in the normal course of action to improve their operations.
Reengineering Framework for Systems in Education
ERIC Educational Resources Information Center
Choquet, Christophe; Corbiere, Alain
2006-01-01
Specifications recently proposed as standards in the domain of Technology Enhanced Learning (TEL), question the designers of TEL systems on how to put them into practice. Recent studies in Model Driven Engineering have highlighted the need for a framework which could formalize the use of these specifications as well as enhance the quality of the…
STEM Learning through Engineering Design: Impact on Middle Secondary Students' Interest towards STEM
ERIC Educational Resources Information Center
Shahali, Edy Hafizan Mohd; Halim, Lilia; Rasul, Mohamad Sattar; Osman, Kamisah; Zulkifeli, Mohd Afendi
2017-01-01
The purpose of this study was to identify students' changes in (i) interest toward STEM subjects and (ii) interest in pursuing a STEM career after participating in a non-formal integrated STEM education programme. The programme exposed students to integrated STEM education through project based learning involving the application of five phases…
A Formal Application of Safety and Risk Assessment in Software Systems
2004-09-01
characteristics of Software Engineering, Development, and Safety ... against a comparison of planned and actual schedules, costs, and characteristics. Software Safety is focused on the reduction of unsafe incidents ... they merely carry out the role for which they were anatomically designed. Software is characteristically like an anatomical cell as it merely
1989-10-13
and other non-technical aspects of the system). System-wide Perspective. The system that is being designed and engineered must include not just the ... specifications and is regarded as the lowest-level (implementation) of detail. This decomposition follows the typical "top down" design methodology ... formal verification process has contributed to the security and correctness of the TCB design and implementation. FORMAL METHODOLOGY DESCRIPTION The
Architecting the Human Space Flight Program with Systems Modeling Language (SysML)
NASA Technical Reports Server (NTRS)
Jackson, Maddalena M.; Fernandez, Michela Munoz; McVittie, Thomas I.; Sindiy, Oleg V.
2012-01-01
The next generation of missions in NASA's Human Space Flight program focuses on the development and deployment of highly complex systems (e.g., the Orion Multi-Purpose Crew Vehicle, the Space Launch System, the 21st Century Ground System) that will enable astronauts to venture beyond low Earth orbit and explore the Moon, near-Earth asteroids, and beyond. Architecting these highly complex systems-of-systems requires formal systems engineering techniques for managing the evolution of the technical features in the information exchange domain (e.g., data exchanges, communication networks, ground software) and also formal correlation of the technical architecture to stakeholders' programmatic concerns (e.g., budget, schedule, risk) and design development (e.g., assumptions, constraints, trades, tracking of unknowns). This paper will describe how the authors have applied the Systems Modeling Language (SysML) to implement model-based systems engineering for managing the description of the End-to-End Information System (EEIS) architecture and associated development activities, and how this ultimately enables stakeholders to understand, reason about, and answer questions about the EEIS under design for the proposed lunar Exploration Missions 1 and 2 (EM-1 and EM-2).
Efficient design of multituned transmission line NMR probes: the electrical engineering approach.
Frydel, J A; Krzystyniak, M; Pienkowski, D; Pietrzak, M; de Sousa Amadeu, N; Ratajczyk, T; Idzik, K; Gutmann, T; Tietze, D; Voigt, S; Fenn, A; Limbach, H H; Buntkowsky, G
2011-01-01
Transmission line-based multi-channel solid-state NMR probes have many advantages regarding the cost of construction, number of RF channels, and achievable RF power levels. Nevertheless, these probes are only rarely employed in solid-state NMR labs, mainly owing to the difficult experimental determination of the necessary RF parameters. Here, the efficient design of multi-channel solid-state MAS-NMR probes employing transmission line theory and modern techniques of electrical engineering is presented. As a technical realization, a five-channel ((1)H, (31)P, (13)C, (2)H and (15)N) probe for operation at 7 Tesla is described. The design goal is a very cost-efficient multi-port, single-coil transmission line probe based on the design developed by Schaefer and McKay. The electrical performance of the probe is determined by measuring scattering matrix parameters (S-parameters) at particular input/output ports. These parameters are compared to the calculated parameters of the design employing the S-matrix formalism. It is shown that the S-matrix formalism provides an excellent tool for the examination of transmission line probes and thus a tool for the rational design of these probes. Moreover, the resulting design provides excellent electrical performance. From the point of view of Nuclear Magnetic Resonance (NMR), calibration spectra of particular ports (channels) are of great importance. The estimation of the π/2 pulse lengths for all five NMR channels is presented. Copyright © 2011 Elsevier Inc. All rights reserved.
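The transmission-line reasoning behind such designs can be illustrated with a short calculation of the input reflection coefficient (S11) of a line section terminated by a tuned load; all component values below are assumptions for illustration, not the published probe parameters.

```python
# Rough sketch of the transmission-line reasoning behind such probe designs:
# compute the input reflection coefficient (S11) seen through a lossless line
# section terminated by a tuned load. Values are illustrative, not the probe's.
import numpy as np

Z0   = 50.0                 # line characteristic impedance, ohms
Zref = 50.0                 # reference impedance for S-parameters
f    = 300e6                # operating frequency, Hz (7 T proton is ~300 MHz)
vf   = 0.66                 # velocity factor of the cable (assumed)
l    = 0.25                 # physical line length, metres (assumed)

beta = 2*np.pi*f / (3e8*vf)               # phase constant
ZL   = 30.0 + 1j*20.0                     # load impedance of the tuned coil (assumed)

# input impedance of a lossless line of length l terminated in ZL
Zin = Z0*(ZL + 1j*Z0*np.tan(beta*l)) / (Z0 + 1j*ZL*np.tan(beta*l))
S11 = (Zin - Zref) / (Zin + Zref)
print(f"|S11| = {abs(S11):.3f}  ({20*np.log10(abs(S11)):.1f} dB)")
```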
Rewriting Logic Semantics of a Plan Execution Language
NASA Technical Reports Server (NTRS)
Dowek, Gilles; Munoz, Cesar A.; Rocha, Camilo
2009-01-01
The Plan Execution Interchange Language (PLEXIL) is a synchronous language developed by NASA to support autonomous spacecraft operations. In this paper, we propose a rewriting logic semantics of PLEXIL in Maude, a high-performance logical engine. The rewriting logic semantics is by itself a formal interpreter of the language and can be used as a semantic benchmark for the implementation of PLEXIL executives. The implementation in Maude has the additional benefit of making available to PLEXIL designers and developers all the formal analysis and verification tools provided by Maude. The formalization of the PLEXIL semantics in rewriting logic poses an interesting challenge due to the synchronous nature of the language and the prioritized rules defining its semantics. To overcome this difficulty, we propose a general procedure for simulating synchronous set relations in rewriting logic that is sound and, for deterministic relations, complete. We also report on the finding of two issues at the design level of the original PLEXIL semantics that were identified with the help of the executable specification in Maude.
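The "synchronous set relation" idea can be sketched informally as follows: at each macro step, every node picks its highest-priority enabled rule, and all resulting updates are applied simultaneously against the old state. The node states and rules below are invented and are not the actual PLEXIL semantics or the Maude encoding.

```python
# Minimal sketch of a synchronous, prioritized rule system: each node selects
# its highest-priority enabled rule and all selected updates are applied at
# once. States and rules are invented for illustration only.
def synchronous_step(states, rules):
    """states: dict node -> state; rules: list of (priority, guard, update),
    where guard(node, states) -> bool and update(node, states) -> new state.
    Lower priority number wins. All updates are computed against the old
    states, then applied simultaneously."""
    ordered = sorted(rules, key=lambda r: r[0])
    pending = {}
    for node in states:
        enabled = [r for r in ordered if r[1](node, states)]
        if enabled:
            pending[node] = enabled[0][2](node, states)
    new_states = dict(states)
    new_states.update(pending)
    return new_states

# toy rules: a node starts when the root is EXECUTING, and finishes once started
rules = [
    (1, lambda n, s: s[n] == "INACTIVE" and s.get("root") == "EXECUTING",
        lambda n, s: "EXECUTING"),
    (2, lambda n, s: s[n] == "EXECUTING",
        lambda n, s: "FINISHED"),
]
states = {"root": "EXECUTING", "child": "INACTIVE"}
for _ in range(3):
    states = synchronous_step(states, rules)
    print(states)
```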
Optical systems engineering - A tutorial
NASA Technical Reports Server (NTRS)
Wyman, C. L.
1979-01-01
The paper examines the use of the systems engineering approach in the design of optical systems, noting that such an approach, which involves an integrated, interdisciplinary treatment of system development, is particularly appropriate for optics. It is shown that the high-precision character of optics leads to complex and subtle effects on optical system performance, arising from structural, thermal, dynamical, control system, and manufacturing and assembly considerations. Attention is given to communication problems that often occur between users and optical engineers due to the unique factors of optical systems. It is concluded that it is essential that the optics community provide leadership to resolve these communication problems and fully formalize the field of optical systems engineering.
1991-10-01
SUBJECT TERMS: engineering management information systems; method formalization; information engineering; process modeling; information systems requirements definition methods; knowledge acquisition methods; systems engineering. ... Management, Inc., Santa Monica, California. CORYNEN, G. C., 1975, A Mathematical Theory of Modeling and Simulation. Ph.D. Dissertation, Department
Universal Quantum Computing with Arbitrary Continuous-Variable Encoding.
Lau, Hoi-Kwan; Plenio, Martin B
2016-09-02
Implementing a qubit quantum computer in continuous-variable systems conventionally requires the engineering of specific interactions according to the encoding basis states. In this work, we present a unified formalism to conduct universal quantum computation with a fixed set of operations but arbitrary encoding. By storing a qubit in the parity of two or four qumodes, all computing processes can be implemented by basis state preparations, continuous-variable exponential-swap operations, and swap tests. Our formalism inherits the advantages that the quantum information is decoupled from collective noise, and logical qubits with different encodings can be brought to interact without decoding. We also propose a possible implementation of the required operations by using interactions that are available in a variety of continuous-variable systems. Our work separates the "hardware" problem of engineering quantum-computing-universal interactions, from the "software" problem of designing encodings for specific purposes. The development of quantum computer architecture could hence be simplified.
Universal Quantum Computing with Arbitrary Continuous-Variable Encoding
NASA Astrophysics Data System (ADS)
Lau, Hoi-Kwan; Plenio, Martin B.
2016-09-01
Implementing a qubit quantum computer in continuous-variable systems conventionally requires the engineering of specific interactions according to the encoding basis states. In this work, we present a unified formalism to conduct universal quantum computation with a fixed set of operations but arbitrary encoding. By storing a qubit in the parity of two or four qumodes, all computing processes can be implemented by basis state preparations, continuous-variable exponential-swap operations, and swap tests. Our formalism inherits the advantages that the quantum information is decoupled from collective noise, and logical qubits with different encodings can be brought to interact without decoding. We also propose a possible implementation of the required operations by using interactions that are available in a variety of continuous-variable systems. Our work separates the "hardware" problem of engineering quantum-computing-universal interactions, from the "software" problem of designing encodings for specific purposes. The development of quantum computer architecture could hence be simplified.
Weaving a Formal Methods Education with Problem-Based Learning
NASA Astrophysics Data System (ADS)
Gibson, J. Paul
The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation: how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.
Designing an architectural style for Pervasive Healthcare systems.
Rafe, Vahid; Hajvali, Masoumeh
2013-04-01
Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration; therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe Architecture (pub/sub) is one of the convenient architectures for supporting such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, so using a proper formal language such as graph transformation systems for developing these systems seems necessary. But even if software engineers use such high-level methodologies, errors may occur in the system under design. Hence, it should be investigated automatically and formally whether the system model satisfies all of its requirements. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show the high reliability of the proposed style.
Helping System Engineers Bridge the Peaks
NASA Technical Reports Server (NTRS)
Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen
2014-01-01
In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.
Artificial Symmetry-Breaking for Morphogenetic Engineering Bacterial Colonies.
Nuñez, Isaac N; Matute, Tamara F; Del Valle, Ilenne D; Kan, Anton; Choksi, Atri; Endy, Drew; Haseloff, Jim; Rudge, Timothy J; Federici, Fernan
2017-02-17
Morphogenetic engineering is an emerging field that explores the design and implementation of self-organized patterns, morphologies, and architectures in systems composed of multiple agents such as cells and swarm robots. Synthetic biology, on the other hand, aims to develop tools and formalisms that increase reproducibility, tractability, and efficiency in the engineering of biological systems. We seek to apply synthetic biology approaches to the engineering of morphologies in multicellular systems. Here, we describe the engineering of two mechanisms, symmetry-breaking and domain-specific cell regulation, as elementary functions for the prototyping of morphogenetic instructions in bacterial colonies. The former represents an artificial patterning mechanism based on plasmid segregation while the latter plays the role of artificial cell differentiation by spatial colocalization of ubiquitous and segregated components. This separation of patterning from actuation facilitates the design-build-test-improve engineering cycle. We created computational modules for CellModeller representing these basic functions and used them to guide the design process and explore the design space in silico. We applied these tools to encode spatially structured functions such as metabolic complementation, RNAPT7 gene expression, and CRISPRi/Cas9 regulation. Finally, as a proof of concept, we used CRISPRi/Cas technology to regulate cell growth by controlling methionine synthesis. These mechanisms start from single cells, enabling the study of morphogenetic principles and the engineering of novel population-scale structures from the bottom up.
ERIC Educational Resources Information Center
Martin, Laura M. W.; Beach, King
Performances of 45 individuals with varying degrees of formal and informal training in machining and programming were compared on tasks designed to tap intellectual changes that may occur with the introduction of computer numerical control (CNC). Participants--30 machinists, 8 machine operators, and 7 engineers--were asked background questions and…
The approach to engineering tasks composition on knowledge portals
NASA Astrophysics Data System (ADS)
Novogrudska, Rina; Globa, Larysa; Schill, Alexsander; Romaniuk, Ryszard; Wójcik, Waldemar; Karnakova, Gaini; Kalizhanova, Aliya
2017-08-01
The paper presents an approach to engineering task composition on engineering knowledge portals. The specific features of engineering tasks are highlighted, and their analysis forms the basis for partial engineering task integration. A formal algebraic system for engineering task composition is proposed, allowing context-independent formal structures to be set for describing the elements of engineering tasks. A method of engineering task composition is developed that integrates partial calculation tasks into general calculation tasks on engineering portals, performed on user request. The real-world scenario «Calculation of the strength for the power components of magnetic systems» is presented, confirming the applicability and efficiency of the proposed approach.
Recent trends related to the use of formal methods in software engineering
NASA Technical Reports Server (NTRS)
Prehn, Soren
1986-01-01
An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focused on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.
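For readers unfamiliar with the specification style that the Vienna Development Method popularized, the following is a generic illustration (simplified mathematical notation, not VDM-SL syntax and not an example from the paper) of an implicit operation specification by pre- and post-conditions:

\[
\begin{aligned}
&\mathrm{SQRT}\,(x : \mathbb{R})\;\; r : \mathbb{R}\\
&\quad \textbf{pre}\;\; x \ge 0\\
&\quad \textbf{post}\;\; r \ge 0 \;\wedge\; r^2 = x
\end{aligned}
\]

The specification states what the operation must achieve without committing to an algorithm; an executable implementation is then obtained, and justified, through successive refinement steps.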
Requirements Analysis for Large Ada Programs: Lessons Learned on CCPDS-R
1989-12-01
The abstract is garbled in this record; the recoverable fragments are: the SRS role was to be the tester's contract when the design had matured, an approach that was not optimal from the formal testing standpoint; constraints on CPU processing load primarily affect algorithm allocations; the software development process must include sufficient testing; and timing requirements are by-products of the software design process when multiple CSCIs are executed within ...
Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James
2018-01-01
Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powersets by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification, the Linnaean taxonomy, with the powerset formalization serving as the benchmark for formal expressiveness. The broad conclusion of the survey is that (1) the ISE community is currently in the early stages of the process of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
ERIC Educational Resources Information Center
Vick, Matthew E.; Garvey, Michael P.
2016-01-01
The Boy Scouts of America's Environmental Science and Engineering merit badges are two of their over 120 merit badges offered as a part of a non-formal educational program to U.S. boys. The Scientific and Engineering Practices of the U.S. Next Generation Science Standards provide a vision of science education that includes integrating eight…
Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J
2012-01-01
Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
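The abstracts above do not name the underlying formalism, so purely as an illustration of the intended mapping, a single scenario written in restricted natural language might correspond to a process-algebra-style behavior rule (all event names here are hypothetical):

\[
\textit{``When the thruster-fault flag is raised, the controller shall safe the thruster and then notify ground''}
\;\longmapsto\;
\mathit{CTRL} = \mathit{faultFlag} \rightarrow \mathit{safeThruster} \rightarrow \mathit{notifyGround} \rightarrow \mathit{CTRL}.
\]

A collection of such rules, one per scenario, yields a formal model whose equivalence to the original scenario set can in principle be proven, which is the gap the authors aim to close.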
Towards a theoretical clarification of biomimetics using conceptual tools from engineering design.
Drack, M; Limpinsel, M; de Bruyn, G; Nebelsick, J H; Betz, O
2017-12-13
Many successful examples of biomimetic products are available, and most research efforts in this emerging field are directed towards the development of specific applications. The theoretical and conceptual underpinnings of the knowledge transfer between biologists, engineers and architects are, however, poorly investigated. The present article addresses this gap. We use a 'technomorphic' approach, i.e. the application of conceptual tools derived from engineering design, to better understand the processes operating during a typical biomimetic research project. This helps to elucidate the formal connections between functions, working principles and constructions (in a broad sense), because the 'form-function relationship' is a recurring issue in biology and engineering. The presented schema also serves as a conceptual framework that can be implemented for future biomimetic projects. The concepts of 'function' and 'working principle' are identified as the core elements in the biomimetic knowledge transfer towards applications. This schema not only facilitates the development of a common language in the emerging science of biomimetics, but also promotes the interdisciplinary dialogue among its subdisciplines.
An ORCID based synchronization framework for a national CRIS ecosystem.
Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno
2015-01-01
PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects of how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholders and essential to certify compliant services.
Developing Formal Correctness Properties from Natural Language Requirements
NASA Technical Reports Server (NTRS)
Nikora, Allen P.
2006-01-01
This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation, specifically to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulating correctness properties for system models can be a difficult problem. This is relevant to NASA because it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and next steps.
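As a hedged illustration of the kind of mapping described (the requirements and atomic propositions below are invented for illustration, not taken from the presentation):

\[
\textit{``Every detected fault shall eventually be followed by entry into SAFE mode''}
\;\longmapsto\; \mathbf{G}\,(\mathit{fault} \rightarrow \mathbf{F}\,\mathit{safeMode}),
\]
\[
\textit{``The thruster shall never fire while the spacecraft is in SAFE mode''}
\;\longmapsto\; \mathbf{G}\,\neg(\mathit{safeMode} \wedge \mathit{thrusterFiring}).
\]

Automating this translation is what lets model checkers consume requirements that engineers continue to author in natural language.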
Composing, Analyzing and Validating Software Models
NASA Astrophysics Data System (ADS)
Sheldon, Frederick T.
1998-10-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Composing, Analyzing and Validating Software Models
NASA Technical Reports Server (NTRS)
Sheldon, Frederick T.
1998-01-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
MatLab Programming for Engineers Having No Formal Programming Knowledge
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who have no formal programming training and no significant time to spare for learning programming to solve their real-world problems. Programs for visualization are specifically provided. Also stated are the current limitations of MatLab, which possibly can be addressed by Mathworks Inc. in a future version to make MatLab more versatile.
Clinical Immersion and Biomedical Engineering Design Education: "Engineering Grand Rounds".
Walker, Matthew; Churchwell, André L
2016-03-01
Grand Rounds is a ritual of medical education and inpatient care comprising the presentation of the medical problems and treatment of a patient to an audience of physicians, residents, and medical students. Traditionally, the patient would be in attendance for the presentation and would answer questions. Grand Rounds has evolved considerably over the years, with most sessions being didactic and rarely having a patient present (although, in some instances, an actor will portray the patient). Other members of the team, such as nurses, nurse practitioners, and biomedical engineers, are not traditionally involved in the formal teaching process. In this study we examine rapid ideation in a clinical setting to forge a system of cross talk between engineers and physicians as a steady state at the praxis of ideation and implementation.
Leading the Teacher Team--Balancing between Formal and Informal Power in Program Leadership
ERIC Educational Resources Information Center
Högfeldt, Anna-Karin; Malmi, Lauri; Kinnunen, Päivi; Jerbrant, Anna; Strömberg, Emma; Berglund, Anders; Villadsen, Jørgen
2018-01-01
This continuous research within Nordic engineering institutions targets the contexts and possibilities for leadership among engineering education program directors. The IFP-model, developed based on analysis of interviews with program leaders in these institutions, visualizes the program director's informal and formal power. The model is presented…
The Study on Collaborative Manufacturing Platform Based on Agent
NASA Astrophysics Data System (ADS)
Zhang, Xiao-yan; Qu, Zheng-geng
To address the trend toward knowledge-intensive collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capacity that agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval and reuse of manufacturing knowledge, the generalized knowledge repository based on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.
DARPA Initiative in Concurrent Engineering (DICE). Phase 2
1990-07-31
Fragments of this report describe its tools and results: the XS spreadsheet tool; the Q-Calc spreadsheet tool; a TAE+ outer wrapper for XS; a Framemaker-based formal EDN (Electronic Design Notebook); and data ... with a shared global object space and object persistence. Under technical results and module development, the XS integration environment provided a prototype of the wrapper concepts ... for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for ...
1983-10-05
... battle damage. Others are local electrical power and cooling disruptions. Again, a highly critical function is lost if its computer site is destroyed. A ... formalized design of the test bed to meet the requirements of the functional description and goals of the program. (Remaining fragments reference AMTEC tasks 610, 710, and 810.)
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Scotti, S. J.
1989-01-01
The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem. The total truss weight is the objective function, the tube diameter and truss height are design variables, with stress and Euler buckling considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final state of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
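The two-bar truss formulation in the abstract can be posed directly to a general-purpose optimizer. The sketch below uses SciPy's SLSQP in place of CONMIN, ADS, or NPSOL, and every load, dimension, and material value is an illustrative assumption rather than a number from the paper:

# Hypothetical two-bar truss sizing posed as a nonlinear program; SciPy's SLSQP
# stands in for CONMIN/ADS/NPSOL. All numerical values are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

P = 33e3             # applied load at the apex, N (assumed)
B = 0.75             # half-span, m (assumed)
t = 0.003            # tube wall thickness, m (assumed)
E = 70e9             # Young's modulus, Pa (assumed)
rho = 2700.0         # material density, kg/m^3 (assumed)
sigma_allow = 100e6  # allowable stress, Pa (assumed)

def geometry(x):
    d, H = x                      # design variables: tube diameter and truss height
    L = np.sqrt(B**2 + H**2)      # member length
    A = np.pi * d * t             # thin-walled tube cross-sectional area
    return L, A

def weight(x):                    # objective: total weight of the two members
    L, A = geometry(x)
    return 2.0 * rho * A * L

def stress(x):                    # axial stress in each member
    _, H = x
    L, A = geometry(x)
    return P * L / (2.0 * H * A)

def buckling_margin(x):           # Euler buckling stress minus actual stress (must stay >= 0)
    d, _ = x
    L, _ = geometry(x)
    sigma_cr = np.pi**2 * E * (d**2 + t**2) / (8.0 * L**2)
    return sigma_cr - stress(x)

constraints = [
    {"type": "ineq", "fun": lambda x: sigma_allow - stress(x)},  # stress limit
    {"type": "ineq", "fun": buckling_margin},                    # Euler buckling
]
bounds = [(0.01, 0.5), (0.1, 3.0)]  # diameter and height ranges, m

result = minimize(weight, x0=[0.05, 1.0], bounds=bounds,
                  constraints=constraints, method="SLSQP")
print("diameter, height:", result.x, "  weight:", result.fun)

The tedium that SOL targets is exactly the boilerplate visible here: recasting the objective, design variables, and constraint relations into the calling conventions of the optimizer.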
A Foundational Approach to Designing Geoscience Ontologies
NASA Astrophysics Data System (ADS)
Brodaric, B.
2009-05-01
E-science systems are increasingly deploying ontologies to aid online geoscience research. Geoscience ontologies are typically constructed independently by isolated individuals or groups who tend to follow few design principles. This limits the usability of the ontologies due to conceptualizations that are vague, conflicting, or narrow. Advances in foundational ontologies and formal engineering approaches offer promising solutions, but these advanced techniques have had limited application in the geosciences. This paper develops a design approach for geoscience ontologies by extending aspects of the DOLCE foundational ontology and the OntoClean method. Geoscience examples will be presented to demonstrate the feasibility of the approach.
NASA Astrophysics Data System (ADS)
Coutoumanos, Vincent
The following research is intended to develop more formal mechanisms for collection, analysis, retention and dissemination of information relating to brand influence on high-technology products. Specifically, these high-technology products are associated with the engineering applications that likely would involve the loss of human life in the event of catastrophic failure. The results of the study lead to an extension of theory involving marketing and product selection of "highly engineered" parts within the aerospace industry. The findings were separated into three distinct areas: 1) Information load will play a large role in the final design decision. If the designer is under a high level of information load during the time of a design decision, he or she likely will gravitate to the traditional design choice, regardless of the level of brand strength. 2) Even when strong brand names, like 3M, were offered as the non-traditional design choice, engineers gravitated to the traditional design choice that was presented in a mock Society for Manufacturing Engineers article. 3) Designer self-efficacy by itself will not often contribute to a decision maker's design choice. However, the data collected indicate that a combination of high designer self-efficacy moderated by high brand strength is likely to contribute significantly to a decision maker's decision. The post-hoc finding shows that many designers having high levels of self-efficacy could be developing a sense of comfort with strong brand names (like 3M) when making a design choice.
Colloquium: Modeling the dynamics of multicellular systems: Application to tissue engineering
NASA Astrophysics Data System (ADS)
Kosztin, Ioan; Vunjak-Novakovic, Gordana; Forgacs, Gabor
2012-10-01
Tissue engineering is a rapidly evolving discipline that aims at building functional tissues to improve or replace damaged ones. To be successful in such an endeavor, ideally, the engineering of tissues should be based on the principles of developmental biology. Recent progress in developmental biology suggests that the formation of tissues from the composing cells is often guided by physical laws. Here a comprehensive computational-theoretical formalism is presented that is based on experimental input and incorporates biomechanical principles of developmental biology. The formalism is described and it is shown that it correctly reproduces and predicts the quantitative characteristics of the fundamental early developmental process of tissue fusion. Based on this finding, the formalism is then used toward the optimization of the fabrication of tubular multicellular constructs, such as a vascular graft, by bioprinting, a novel tissue engineering technology.
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
Systems engineering: A formal approach. Part 1: System concepts
NASA Astrophysics Data System (ADS)
Vanhee, K. M.
1993-03-01
Engineering is the scientific discipline focused on the creation of new artifacts that are supposed to be of some use to our society. Different types of artifacts require different engineering approaches. However, in all these disciplines the development of a new artifact is divided into stages. Three stages can always be recognized: Analysis, Design, and Realization. The book considers only the first two stages of the development process. It focuses on a specific type of artifacts, called discrete dynamic systems. These systems consist of active components or actors that consume and produce passive components or tokens. Three subtypes are studied in more detail: business systems (like a factory or restaurant), information systems (whether automated or not), and automated systems (systems that are controlled by an automated information system). The first subtype is studied by industrial engineers, the last by software engineers and electrical engineers, whereas the second is a battlefield for all three disciplines. The union of these disciplines is called systems engineering.
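A minimal sketch (my own illustration, not the book's notation) of such a discrete dynamic system: actors consume and produce tokens under a Petri-net-like firing rule.

# Illustrative-only model of actors consuming and producing tokens.
from collections import Counter

class Actor:
    def __init__(self, name, consumes, produces):
        self.name = name
        self.consumes = Counter(consumes)   # tokens required before firing
        self.produces = Counter(produces)   # tokens created by firing

    def enabled(self, marking):
        return all(marking[p] >= n for p, n in self.consumes.items())

    def fire(self, marking):
        assert self.enabled(marking), f"{self.name} is not enabled"
        return marking - self.consumes + self.produces

# Toy restaurant example: the kitchen consumes an order and an ingredient, produces a meal.
kitchen = Actor("kitchen", consumes={"order": 1, "ingredient": 1}, produces={"meal": 1})
marking = Counter({"order": 2, "ingredient": 1})
marking = kitchen.fire(marking)
print(marking)   # Counter({'order': 1, 'meal': 1})

The business, information, and automated subtypes named in the abstract differ mainly in what the actors and tokens stand for, not in the underlying analysis.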
How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?
NASA Technical Reports Server (NTRS)
Scott, David W.
2010-01-01
The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA s Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use
The Engineering Design Process: Conceptions Along the Learning-to-Teach Continuum
NASA Astrophysics Data System (ADS)
Iveland, Ashley
In this study, I sought to identify differences in the views and understandings of engineering design among individuals along the learning-to-teach continuum. To do so, I conducted a comprehensive review of literature to determine the various aspects of engineering design described in the fields of professional engineering and engineering education. Additionally, I reviewed literature on the methods used in teaching engineering design at the secondary (grade 7-12) level - to describe the various models used in classrooms, even before the implementation of the Next Generation Science Standards (NGSS Lead States, 2013). Last, I defined four groups along the learning-to-teach continuum: prospective, preservice, and practicing teachers, as well as teacher educators. The context of this study centered around a California public university, including an internship program where undergraduates engaged with practicing mentor teachers in science and engineering teaching at local high schools, and a teacher education program where secondary science preservice teachers and the teacher educators who taught them participated. Interviews were conducted with all participants to gain insights into their views and understandings of engineering design. Prospective and preservice teachers were interviewed multiple times throughout the year and completed concept maps of the engineering design process multiple times as well; practicing teachers and teacher educators were interviewed once. Three levels of analyses were conducted. I identified 30 aspects of engineering discussed by participants. Through phenomenographic methods, I also constructed six conceptual categories for engineering design to organize those aspects most commonly discussed. These categories were combined to demonstrate a participant's view of engineering design (e.g., business focused, human centered, creative, etc.) as well as their complexity of understanding of engineering design overall (the more categories their ideas fit within, the more complex their understanding was thought to be). I found that the most commonly referenced aspects of engineering design were in line with the three main dimensions described in the Next Generation Science Standards (NGSS Lead States, 2013). I also found that the practicing teacher participants overall conveyed the most complex and integrated understandings of engineering design, with the undergraduate, prospective teachers not far behind. One of the most important factors related to a more integrated understanding of engineering design was having formal engineering experience, especially in the form of conducting engineering research or having been a professional engineer. Further, I found that female participants were more likely than their male counterparts to view engineering as having a human element--recognizing the need to collaborate with others throughout the process and the need to think about the potential user of the product the engineer is solving the problem for. These findings suggest that prior experience with engineering, and not experience in the classroom or with engineering education, tends to lead to a deeper, more authentic view of engineering. Finally, I close with a discussion of the overall findings, limitations of the study, potential implications, and future work.
Formal, Non-Formal and Informal Learning in the Sciences
ERIC Educational Resources Information Center
Ainsworth, Heather L.; Eaton, Sarah Elaine
2010-01-01
This research report investigates the links between formal, non-formal and informal learning and the differences between them. In particular, the report aims to link these notions of learning to the field of sciences and engineering in Canada and the United States, including professional development of adults working in these fields. It offers…
Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)
2016-03-01
up game Binary Fission, which was deployed during Phase Two of CHEKOFV. Xylem: The Code of Plants is a casual game for players using mobile ...there are the design and engineering challenges of building a game infrastructure that integrates verification technology with crowd participation...the backend processes that annotate the originating software. Allowing players to construct their own equations opened up the flexibility to receive
Formal Language Design in the Context of Domain Engineering
2000-03-28
Related work covers feature-oriented domain analysis (FODA), organizational domain modeling (ODM), and domain-specific software ... However, there are only a few that are well defined and used repeatedly in practice. These include feature-oriented domain analysis (FODA) and organizational ... Feature-oriented domain analysis (FODA) is a domain analysis method being researched and applied by the SEI
The application of SSADM to modelling the logical structure of proteins.
Saldanha, J; Eccles, J
1991-10-01
A logical design that describes the overall structure of proteins, together with a more detailed design describing secondary and some supersecondary structures, has been constructed using the computer-aided software engineering (CASE) tool, Auto-mate. Auto-mate embodies the philosophy of the Structured Systems Analysis and Design Method (SSADM) which enables the logical design of computer systems. Our design will facilitate the building of large information systems, such as databases and knowledgebases in the field of protein structure, by the derivation of system requirements from our logical model prior to producing the final physical system. In addition, the study has highlighted the ease of employing SSADM as a formalism in which to conduct the transferral of concepts from an expert into a design for a knowledge-based system that can be implemented on a computer (the knowledge-engineering exercise). It has been demonstrated how SSADM techniques may be extended for the purpose of modelling the constituent Prolog rules. This facilitates the integration of the logical system design model with the derived knowledge-based system.
NASA Technical Reports Server (NTRS)
Rowe, Sidney E.
2010-01-01
In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.
Closing the Gap between Formalism and Application--PBL and Mathematical Skills in Engineering
ERIC Educational Resources Information Center
Christensen, Ole Ravn
2008-01-01
A common problem in learning mathematics concerns the gap between, on the one hand, doing the formalisms and calculations of abstract mathematics and, on the other hand, applying these in a specific contextualized setting for example the engineering world. The skills acquired through problem-based learning (PBL), in the special model used at…
Re-Engineering the Mission Operations System (MOS) for the Prime and Extended Mission
NASA Technical Reports Server (NTRS)
Hunt, Joseph C., Jr.; Cheng, Leo Y.
2012-01-01
One of the most challenging tasks in a space science mission is designing the Mission Operations System (MOS). Whereas the focus of the project is getting the spacecraft built and tested for launch, the mission operations engineers must build a system to carry out the science objectives. The completed MOS design is then formally assessed in the many reviews. Once a mission has completed the reviews, the Mission Operation System (MOS) design has been validated to the Functional Requirements and is ready for operations. The design was built based on heritage processes, new technology, and lessons learned from past experience. Furthermore, our operational concepts must be properly mapped to the mission design and science objectives. However, during the course of implementing the science objective in the operations phase after launch, the MOS experiences an evolutional change to adapt for actual performance characteristics. This drives the re-engineering of the MOS, because the MOS includes the flight and ground segments. Using the Spitzer mission as an example we demonstrate how the MOS design evolved for both the prime and extended mission to enhance the overall efficiency for science return. In our re-engineering process, we ensured that no requirements were violated or mission objectives compromised. In most cases, optimized performance across the MOS, including gains in science return as well as savings in the budget profile was achieved. Finally, we suggest a need to better categorize the Operations Phase (Phase E) in the NASA Life-Cycle Phases of Formulation and Implementation
NASA Astrophysics Data System (ADS)
Hakkila, Jon; Runyon, Cassandra; Benfield, M. P. J.; Turner, Matthew W.; Farrington, Phillip A.
2015-08-01
We report on five years of an exciting and successful educational collaboration in which science undergraduates at the College of Charleston work with engineering seniors at the University of Alabama in Huntsville to design a planetary science mission in response to a mock announcement of opportunity. Alabama high schools are also heavily involved in the project, and other colleges and universities have also participated. During the two-semester course students learn about scientific goals, past missions, methods of observation, instrumentation, and component integration, proposal writing, and presentation. More importantly, students learn about real-world communication and teamwork, and go through a series of baseline reviews before presenting their results at a formal final review for a panel of NASA scientists and engineers. The project is competitive, with multiple mission designs competing with one another for the best review score. Past classes have involved missions to Venus, Europa, Titan, Mars, asteroids, comets, and even the Moon. Classroom successes and failures have both been on epic scales.
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Johnson, Stephen B.; Breckenridge, Jonathan T.
2013-01-01
The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of relationships between SE, SHM, and FM provides hints toward a modeling approach that provides formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve goals and functions (SHM/FM).
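Purely as an illustrative toy (not the authors' Goal-Function Tree formalism), the nominal/off-nominal pairing described above can be pictured as a tree in which every goal or function carries both a success criterion (the SE view) and a failure response (the FM view); all names and thresholds below are hypothetical.

# Toy pairing of SE goals/functions with FM detection and response; illustrative only.
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class GoalNode:
    name: str                                            # goal or function to achieve (SE view)
    achieved: Callable[[dict], bool]                     # success criterion against system state
    on_failure: Optional[Callable[[dict], None]] = None  # FM view: response when not achieved
    children: List["GoalNode"] = field(default_factory=list)

    def evaluate(self, state: dict) -> bool:
        ok = self.achieved(state) and all(c.evaluate(state) for c in self.children)
        if not ok and self.on_failure is not None:
            self.on_failure(state)                       # detect the failure, take action
        return ok

# Hypothetical example: a "maintain thrust" goal with one supporting function.
tree = GoalNode(
    name="maintain thrust",
    achieved=lambda s: s["thrust"] >= s["commanded_thrust"] * 0.98,
    on_failure=lambda s: s.setdefault("responses", []).append("switch to backup engine"),
    children=[GoalNode("maintain propellant flow", achieved=lambda s: s["flow_ok"])],
)
state = {"thrust": 90.0, "commanded_thrust": 100.0, "flow_ok": True}
tree.evaluate(state)
print(state.get("responses"))   # -> ['switch to backup engine']

Evaluating the tree against a system state exercises both sides at once: the SE question "is the goal achieved?" and the FM question "what do we do when it is not?"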
Aerospace engineering design by systematic decomposition and multilevel optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.
1984-01-01
A method for systematic analysis and optimization of large engineering systems, by decomposition of a large task into a set of smaller subtasks that are solved concurrently, is described. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.
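One common way to write such a bottom-up scheme (a generic sketch under my own notation, not the authors' exact formulation) is:

\[
\text{subsystem } i:\quad
f_i^{*}(y_i) \;=\; \min_{x_i}\; f_i(x_i, y_i)
\quad \text{s.t. } g_i(x_i, y_i) \le 0,
\qquad \text{also report } \frac{\partial f_i^{*}}{\partial y_i},
\]
\[
\text{system level:}\quad
\min_{y}\; F\big(y,\, f_1^{*}(y_1), \dots, f_N^{*}(y_N)\big)
\quad \text{s.t. } G(y) \le 0,
\]

where the $y_i$ are the coupling inputs each subtask receives from the rest of the system; the reported sensitivities $\partial f_i^{*}/\partial y_i$ let the upper level account for inter-subtask couplings without re-solving every subproblem, and the levels iterate until the design converges.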
Toxicity reduction in industrial effluents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-01-01
Wastewater treatment technology is undergoing a profound transformation as a result of the fundamental changes in regulations and permit requirements. Established design procedures and criteria which have served the industry well for decades are no longer useful. Toxicity reduction requirements have forced reconsideration of design standards and caused practicing environmental engineers to seek additional training in the biological sciences. Formal academic programs have not traditionally provided the cross-training between biologists and engineers which is necessary to address these issues. This book describes not only the process of identifying the toxicity problem, but also the treatment technologies which are applicable to reduction or elimination of toxicity. The information provided in this book is a compilation of the experience of ECKENFELDER INC. in serving the environmental needs of major industry, and the experience of the individual contributors in research and consultations.
Autonomous and Autonomic Swarms
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy
2005-01-01
A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design and testing and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.
Offshore safety case approach and formal safety assessment of ships.
Wang, J
2002-01-01
Tragic marine and offshore accidents have caused serious consequences including loss of lives, loss of property, and damage of the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail with particular reference to the design aspects. The current practices and the latest development in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. The recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.
Peer Learning in a MATLAB Programming Course
NASA Astrophysics Data System (ADS)
Reckinger, Shanon
2016-11-01
Three forms of research-based peer learning were implemented in the design of a MATLAB programming course for mechanical engineering undergraduate students. First, a peer learning program was initiated. These undergraduate peer learning leaders played two roles in the course: (I) they were in the classroom helping students with their work, and (II) they led optional two-hour help sessions outside of class time. The second form of peer learning was implemented through the inclusion of a peer discussion period following in-class clicker quizzes. The third form of peer learning had the students creating video project assignments and posting them on YouTube to explain course topics to their peers. Several other more informal techniques were used to encourage peer learning. Student feedback in the form of both instructor-designed survey responses and formal course evaluations (quantitative and narrative) will be presented. Finally, effectiveness will be measured by formal assessment, direct and indirect, of these peer learning methods. This will include both academic data/grades and pre/post test scores. Overall, the course design and its inclusion of these peer learning techniques demonstrate effectiveness.
Academic Achievement and Formal Thought in Engineering Students
ERIC Educational Resources Information Center
Vazquez, Stella Maris; de Anglat, Hilda Difabio
2009-01-01
Introduction: Research on university-level academic performance has significantly linked failure and dropping out to formal reasoning deficiency. We have not found any papers on formal thought in Argentine university students, in spite of the obvious shortcomings observed in the classrooms. Thus, the main objective of this paper was exploring the…
NASA Astrophysics Data System (ADS)
White, Susan M.
Women engineers remain underrepresented in employment in engineering fields in the United States. Feminist theory views this gender disparity beyond equity in numbers for women engineers and looks at structural issues of women's access, opportunities, and quality of experience in the workplace. Research on women's success and persistence in engineering education is diverse; however, there are few studies that focus on the early years of women's careers in engineering and less using a phenomenological research design. Experiences of women engineers who have completed one to five years of professional engineering employment are presented using a phenomenological research design. Research questions explored the individual and composite experiences for the co-researchers of the study as well as challenges and advantages of the phenomenon of having completed one to five years of professional engineering employment. Themes that emanated from the data were a feeling that engineering is a positive profession, liking math and science from an early age, having experiences of attending math and science camps or learning and practicing engineering interests with their fathers for some co-researchers. Other themes included a feeling of being different as a woman in the engineering workplace, taking advantage of opportunities for training, education, and advancement to further their careers, and the role of informal and formal mentoring in developing workplace networks and engineering expertise. Co-researchers negotiated issues of management quality and support, experiences of gender discrimination in the workplace, and having to make decisions balancing their careers and family responsibilities. Finally, the women engineers for this research study expressed intentions to persist in their careers while pursuing expertise and experience in their individual engineering fields.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions... two value engineering approaches: (1) The first is an incentive approach in which contractor...
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Requirements-to-Design-to-Code (R2D2C) is an approach to the engineering of computer-based systems that embodies the idea of requirements-based programming in system development. It goes further, however, in that the approach offers not only an underlying formalism, but full formal development from requirements capture through to the automatic generation of provably-correct code. As such, the approach has direct application to the development of systems requiring autonomic properties. We describe a prototype tool to support the method, and illustrate its applicability to the development of LOGOS, a NASA autonomous ground control system, which exhibits autonomic behavior. Finally, we briefly discuss other areas where the approach and prototype tool are being considered for application.
Research into the development of a knowledge acquisition taxonomy
NASA Technical Reports Server (NTRS)
Fink, Pamela K.; Herren, L. Tandy
1991-01-01
The focus of the research was on the development of a problem solving taxonomy that can support and direct the knowledge engineering process during the development of an intelligent tutoring system. The results of the research are necessarily general. Being only a small initial attempt at a fundamental problem in artificial intelligence and cognitive psychology, the process has had to be bootstrapped and the results can only provide pointers to further, more formal research designs.
NASA's Space Launch System: Systems Engineering Approach for Affordability and Mission Success
NASA Technical Reports Server (NTRS)
Hutt, John J.; Whitehead, Josh; Hanson, John
2017-01-01
NASA is working toward the first launch of the Space Launch System, a new, unmatched capability for deep space exploration with launch readiness planned for 2019. Since program start in 2011, SLS has passed several major formal design milestones, and every major element of the vehicle has produced test and flight hardware. The SLS approach to systems engineering has been key to the program's success. Key aspects of the SLS SE&I approach include: 1) minimizing the number of requirements, 2) elimination of explicit verification requirements, 3) use of certified models of subsystem capability in lieu of requirements when appropriate and 4) certification of capability beyond minimum required capability.
Code of Federal Regulations, 2013 CFR
2013-10-01
... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...
Code of Federal Regulations, 2014 CFR
2014-10-01
... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...
Code of Federal Regulations, 2012 CFR
2012-10-01
... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...
Developing Non-Formal Education Competences as a Complement of Formal Education for STEM Lecturers
ERIC Educational Resources Information Center
Terrazas-Marín, Roy Alonso
2018-01-01
This paper focuses on a current practice piece on professional development for university lecturers, transformative learning, dialogism and STEM (Science, Technology, Engineering and Mathematics) education. Its main goals are to identify the key characteristics that allow STEM educators to experiment with the usage of non-formal education…
Generating Alternative Proposals for the Louvre Using Procedural Modeling
NASA Astrophysics Data System (ADS)
Calogero, E.; Arnold, D.
2011-09-01
This paper presents the process of reconstructing two facade designs for the East wing of the Louvre using procedural modeling. The first proposal reconstructed is Louis Le Vau's 1662 scheme and the second is the 1668 design of the "petit conseil" that still stands today. The initial results presented show how such reconstructions may aid general and expert understanding of the two designs. It is claimed that by formalizing the facade description into a shape grammar in CityEngine, a systematized approach to a stylistic analysis is possible. It is also asserted that such an analysis is still best understood in the historical context of what is known about the contemporary design intentions of the building creators and commissioners.
Why Engineers Should Consider Formal Methods
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
1997-01-01
This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.
NASA Technical Reports Server (NTRS)
Korsmeyer, David; Schreiner, John
2002-01-01
This technology evaluation report documents the findings and recommendations of the Engineering for Complex Systems Program (formerly Design for Safety) PRACA Enhancement Pilot Study of the Space Shuttle Program's (SSP's) Problem Reporting and Corrective Action (PRACA) System. A team at NASA Ames Research Center (ARC) performed this Study. This Study was initiated as a follow-on to the NASA chartered Shuttle Independent Assessment Team (SIAT) review (performed in the Fall of 1999), which identified deficiencies in the current PRACA implementation. The Pilot Study was launched with an initial qualitative assessment and technical review performed during January 2000, with the quantitative formal Study (the subject of this report) started in March 2000. The goal of the PRACA Enhancement Pilot Study is to evaluate and quantify the technical aspects of the SSP PRACA systems and recommend enhancements to address deficiencies and to prepare for future system upgrades.
Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems
NASA Technical Reports Server (NTRS)
Bujorianu, Marius C.; Bujorianu, Manuela L.
2009-01-01
In this paper, we sketch a framework for interdisciplinary modeling of space systems, by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of art consists of collaborating multi-engineering efforts that prompt for an adequate formal foundation. To achieve this, we propose a leveraging of the traditional content of formal modeling by a co-engineering process.
NASA Astrophysics Data System (ADS)
Ryan, R.; Gross, L. A.
1995-05-01
The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering-multidiscipline team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly, both in money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper will use these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the turbomachinery technical discipline interactions and sensitivities and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.
NASA Technical Reports Server (NTRS)
Ryan, R.; Gross, L. A.
1995-01-01
The Space Shuttle main engine (SSME) alternate high-pressure liquid oxygen pump experienced synchronous vibration and ball bearing life problems that were program threatening. The success of the program hinged on the ability to solve these development problems. The design and solutions to these problems are grounded in the lessons learned and experiences from prior programs, technology programs, and the ability to properly conduct failure or anomaly investigations. The failure investigation determines the problem cause and is the basis for recommending design solutions. For a complex problem, a comprehensive solution requires that formal investigation procedures be used, including fault trees, resolution logic, and action items worked through a concurrent engineering-multidiscipline team. The normal tendency to use an intuitive, cut-and-try approach will usually prove to be costly, both in money and time, and will reach a less than optimum, poorly understood answer. The SSME alternate high-pressure oxidizer turbopump development has had two complex problems critical to program success: (1) high synchronous vibrations and (2) excessive ball bearing wear. This paper will use these two problems as examples of this formal failure investigation approach. The results of the team's investigation provide insight into the complexity of the turbomachinery technical discipline interactions and sensitivities and the fine balance of competing investigations required to solve problems and guarantee program success. It is very important to the solution process that maximum use be made of the resources that both the contractor and Government can bring to the problem in a supporting and noncompeting way. There is no place for the not-invented-here attitude. The resources include, but are not limited to: (1) specially skilled professionals; (2) supporting technologies; (3) computational codes and capabilities; and (4) test and manufacturing facilities.
Aerospace engineering design by systematic decomposition and multilevel optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.; Giles, G. L.; Barthelemy, J.-F. M.
1984-01-01
This paper describes a method for systematic analysis and optimization of large engineering systems, e.g., aircraft, by decomposition of a large task into a set of smaller, self-contained subtasks that can be solved concurrently. The subtasks may be arranged in many hierarchical levels with the assembled system at the top level. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization. It is pointed out that the method is intended to be compatible with the typical engineering organization and the modern technology of distributed computing.
Mass Transit: Implementation of FTA’s New Starts Evaluation Process and FY 2001 Funding Proposals
2000-04-01
formalize the process. FTA issued a proposed rule on April 7, 1999, and plans to issue final regulations by the summer of 2000. In selecting projects for...commit funds to any more New Starts projects during the last 2 years of TEA-21—through fiscal year 2003. Because there are plans for many more...regional review of alternatives, develop preliminary engineering plans, and meet FTA’s approval for the final design. TEA-21 requires that FTA evaluate
Ball, David A; Lux, Matthew W; Graef, Russell R; Peterson, Matthew W; Valenti, Jane D; Dileo, John; Peccoud, Jean
2010-01-01
The concept of co-design is common in engineering, where it is necessary, for example, to determine the optimal partitioning between hardware and software of the implementation of a system's features. Here we propose to adapt co-design methodologies for synthetic biology. As a test case, we have designed an environmental sensing device that detects the presence of three chemicals, and returns an output only if at least two of the three chemicals are present. We show that the logical operations can be implemented in three different design domains: (1) the transcriptional domain using synthetically designed hybrid promoters, (2) the protein domain using bi-molecular fluorescence complementation, and (3) the fluorescence domain using spectral unmixing and relying on electronic processing. We discuss how these heterogeneous design strategies could be formalized to develop co-design algorithms capable of identifying optimal designs meeting user specifications.
NASA Astrophysics Data System (ADS)
Novak, Elena; Wisdom, Sonya
2018-05-01
3D printing technology is a powerful educational tool that can promote integrative STEM education by connecting engineering, technology, and applications of science concepts. Yet, research on the integration of 3D printing technology in formal educational contexts is extremely limited. This study engaged preservice elementary teachers (N = 42) in a 3D Printing Science Project that modeled a science experiment in the elementary classroom on why things float or sink using 3D printed boats. The goal was to explore how collaborative 3D printing inquiry-based learning experiences affected preservice teachers' science teaching self-efficacy beliefs, anxiety toward teaching science, interest in science, perceived competence in K-3 technology and engineering science standards, and science content knowledge. The 3D printing project intervention significantly decreased participants' science teaching anxiety and improved their science teaching efficacy, science interest, and perceived competence in K-3 technological and engineering design science standards. Moreover, an analysis of students' project reflections and boat designs provided an insight into their collaborative 3D modeling design experiences. The study makes a contribution to the scarce body of knowledge on how teacher preparation programs can utilize 3D printing technology as a means of preparing prospective teachers to implement the recently adopted engineering and technology standards in K-12 science education.
Formal Assurance Arguments: A Solution In Search of a Problem?
NASA Technical Reports Server (NTRS)
Graydon, Patrick J.
2015-01-01
An assurance case comprises evidence and argument showing how that evidence supports assurance claims (e.g., about safety or security). It is unsurprising that some computer scientists have proposed formalizing assurance arguments: most associate formality with rigor. But while engineers can sometimes prove that source code refines a formal specification, it is not clear that formalization will improve assurance arguments or that this benefit is worth its cost. For example, formalization might reduce the benefits of argumentation by limiting the audience to people who can read formal logic. In this paper, we present (1) a systematic survey of the literature surrounding formal assurance arguments, (2) an analysis of errors that formalism can help to eliminate, (3) a discussion of existing evidence, and (4) suggestions for experimental work to definitively answer the question.
(abstract) Formal Inspection Technology Transfer Program
NASA Technical Reports Server (NTRS)
Welz, Linda A.; Kelly, John C.
1993-01-01
A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.
Engineering Research Division publication report, calendar year 1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, E.K.; Livingston, P.L.; Rae, D.C.
Each year the Engineering Research Division of the Electronics Engineering Department at Lawrence Livermore Laboratory has issued an internal report listing all formal publications produced by the Division during the calendar year. Abstracts of 1980 reports are presented.
Preparing Engineers for the Challenges of Community Engagement
ERIC Educational Resources Information Center
Harsh, Matthew; Bernstein, Michael J.; Wetmore, Jameson; Cozzens, Susan; Woodson, Thomas; Castillo, Rafael
2017-01-01
Despite calls to address global challenges through community engagement, engineers are not formally prepared to engage with communities. Little research has been done on means to address this "engagement gap" in engineering education. We examine the efficacy of an intensive, two-day Community Engagement Workshop for engineers, designed…
A Design Pattern for Decentralised Decision Making
Valentini, Gabriele; Fernández-Oto, Cristian; Dorigo, Marco
2015-01-01
The engineering of large-scale decentralised systems requires sound methodologies to guarantee the attainment of the desired macroscopic system-level behaviour given the microscopic individual-level implementation. While a general-purpose methodology is currently out of reach, specific solutions can be given to broad classes of problems by means of well-conceived design patterns. We propose a design pattern for collective decision making grounded on experimental/theoretical studies of the nest-site selection behaviour observed in honeybee swarms (Apis mellifera). The way in which honeybee swarms arrive at consensus is fairly well-understood at the macroscopic level. We provide formal guidelines for the microscopic implementation of collective decisions to quantitatively match the macroscopic predictions. We discuss implementation strategies based on both homogeneous and heterogeneous multiagent systems, and we provide means to deal with spatial and topological factors that have a bearing on the micro-macro link. Finally, we exploit the design pattern in two case studies that showcase the viability of the approach. Besides engineering, such a design pattern can prove useful for a deeper understanding of decision making in natural systems thanks to the inclusion of individual heterogeneities and spatial factors, which are often disregarded in theoretical modelling. PMID:26496359
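As a rough illustration of the kind of micro-macro link the pattern addresses, the sketch below simulates value-sensitive collective decision making with discovery, recruitment, and cross-inhibition among agents. All rates and parameter values are invented for illustration; this is not the authors' formal design pattern or calibrated honeybee model.

```python
import random

def simulate(n_agents=100, quality=(0.9, 0.5), cross_inhibition=0.3,
             steps=2000, seed=1):
    """Toy stochastic simulation of collective decision making between two options.

    Agents are uncommitted (None) or committed to option 0 or 1.  Committed agents
    recruit uncommitted ones and cross-inhibit agents committed to the other option;
    spontaneous discovery and abandonment also occur.  Rates are illustrative only.
    """
    random.seed(seed)
    state = [None] * n_agents
    for _ in range(steps):
        i = random.randrange(n_agents)
        j = random.randrange(n_agents)          # random interaction partner
        if state[i] is None:
            k = random.randrange(2)
            if random.random() < 0.05 * quality[k]:          # spontaneous discovery
                state[i] = k
            elif state[j] is not None and random.random() < 0.5 * quality[state[j]]:
                state[i] = state[j]                           # recruitment
        else:
            if random.random() < 0.02 * (1.0 - quality[state[i]]):
                state[i] = None                               # abandonment
            elif state[j] is not None and state[j] != state[i] \
                    and random.random() < cross_inhibition:
                state[i] = None                               # cross-inhibition

    return [state.count(k) / n_agents for k in (0, 1)]

if __name__ == "__main__":
    print(simulate())   # fraction of the swarm committed to each option
```

With these (assumed) rates the population typically converges on the higher-quality option, which is the macroscopic behaviour the design pattern aims to guarantee from microscopic rules.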
Solving ordinary differential equations by electrical analogy: a multidisciplinary teaching tool
NASA Astrophysics Data System (ADS)
Sanchez Perez, J. F.; Conesa, M.; Alhama, I.
2016-11-01
Ordinary differential equations are the mathematical formulation for a great variety of problems in science and engineering, and frequently, two different problems are equivalent from a mathematical point of view when they are formulated by the same equations. Students acquire the knowledge of how to solve these equations (at least some types of them) using protocols and strict algorithms of mathematical calculation, without thinking about the meaning of the equation. The aim of this work is for students to learn to design network models, or circuits: with only a basic knowledge of circuits, students can establish the formal association between electric circuits and differential equations and their equivalences, which allows them to connect the knowledge of two disciplines and promotes the use of this interdisciplinary approach to address complex problems. They therefore learn to use a multidisciplinary tool that allows them to solve these kinds of equations, even as first-year engineering students, whatever the order, degree or type of non-linearity. This methodology has been implemented in numerous final degree projects in engineering and science, e.g., chemical engineering, building engineering, industrial engineering, mechanical engineering, architecture, etc. Applications are presented to illustrate the subject of this manuscript.
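As a concrete instance of the electrical analogy described above (a standard textbook example, not one taken from the paper), a series RLC circuit and a damped mass-spring oscillator obey formally identical second-order ODEs:

```latex
% Series RLC circuit  <-->  damped mass-spring oscillator (illustrative analogy)
L\,\ddot{q} + R\,\dot{q} + \tfrac{1}{C}\,q = V(t)
\quad\Longleftrightarrow\quad
m\,\ddot{x} + c\,\dot{x} + k\,x = F(t),
\qquad
L \leftrightarrow m,\;\; R \leftrightarrow c,\;\; \tfrac{1}{C} \leftrightarrow k,\;\;
q \leftrightarrow x,\;\; V(t) \leftrightarrow F(t).
```

Solving either system, analytically or as a simulated circuit, therefore solves both, which is the equivalence the course exploits.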
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
Are your engineers talking to one another when they should?
Sosa, Manuel E; Eppinger, Steven D; Rowles, Craig M
2007-11-01
Communication may not be on managers' minds at companies that design complex, highly engineered products, but it should be. When mistakes take place, it's often because product-component teams fail to talk. The consequences can be huge: Ford and Bridgestone Firestone lost billions by not coordinating the design of the Explorer with the design of its tires. The major delays and cost overruns involved in the development of Airbus's A380 "superjumbo"--which most likely led to the CEO's exit--were a result of unforeseen design incompatibilities. To help managers mitigate such problems, the authors present a new application of the design structure matrix, a project management tool that maps the flow of information and its impact on product development. Drawing on research into how Pratt & Whitney handled the development of the PW4098 jet engine, they have developed an approach that uncovers (a) areas where communication should be occurring but is not (unattended interfaces, usually bad) and (b) areas where communication is occurring but has not been planned for (unidentified interfaces, usually good). After finding the unattended and unidentified interfaces, the next step is to figure out the causes of the critical ones. If a significant number of unattended interfaces cross organizational boundaries, executives may need to redraw organizational lines. Executives can then manage the remaining critical interfaces by extending the responsibilities of existing integration teams (those responsible for cross-system aspects, such as a jet engine's fuel economy) to include supervising the interaction, by dedicating teams to specific interfaces, or by formally charging teams already involved with the interfaces to oversee them. Finally, it's important to ensure that the teams are working with compatible design equipment; inconsistencies between CAD tools have cost Airbus dearly.
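The matrix comparison described above can be sketched as two boolean interface sets, one of required (design-dependency) interfaces and one of observed team communication: unattended interfaces are required but silent, unidentified interfaces are talked about but unplanned. The component names and links below are invented placeholders; the PW4098 data are not reproduced here.

```python
# Minimal design-structure-matrix comparison: required design interfaces vs.
# observed team communication.  Component names and links are illustrative.
required = {                      # design dependencies (who must coordinate)
    ("fan", "low_compressor"), ("low_compressor", "high_compressor"),
    ("combustor", "high_turbine"), ("high_turbine", "low_turbine"),
    ("fan", "nacelle"),
}
observed = {                      # documented technical communication
    ("fan", "low_compressor"), ("combustor", "high_turbine"),
    ("fan", "externals"),         # talking, but no planned interface
}

def normalize(pairs):
    """Treat interfaces as undirected: (a, b) == (b, a)."""
    return {tuple(sorted(p)) for p in pairs}

req, obs = normalize(required), normalize(observed)
unattended = req - obs      # should be talking but are not (usually bad)
unidentified = obs - req    # talking without a planned interface (often good)

print("Unattended interfaces:  ", sorted(unattended))
print("Unidentified interfaces:", sorted(unidentified))
```

Managers would then examine the critical entries in each set, especially the unattended interfaces that cross organizational boundaries.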
Designing flexible engineering systems utilizing embedded architecture options
NASA Astrophysics Data System (ADS)
Pierce, Jeff G.
This dissertation develops and applies an integrated framework for embedding flexibility in an engineered system architecture. Systems are constantly faced with unpredictability in the operational environment, threats from competing systems, obsolescence of technology, and general uncertainty in future system demands. Current systems engineering and risk management practices have focused almost exclusively on mitigating or preventing the negative consequences of uncertainty. This research recognizes that high uncertainty also presents an opportunity to design systems that can flexibly respond to changing requirements and capture additional value throughout the design life. There does not, however, exist a formalized approach to designing appropriately flexible systems. This research develops a three-stage integrated flexibility framework based on the concept of architecture options embedded in the system design. Stage One defines an eight-step systems engineering process to identify candidate architecture options. This process encapsulates the operational uncertainty through scenario development, traces new functional requirements to the affected design variables, and clusters the variables most sensitive to change. The resulting clusters can generate insight into the most promising regions in the architecture to embed flexibility in the form of architecture options. Stage Two develops a quantitative option valuation technique, grounded in real options theory, which is able to value embedded architecture options that exhibit variable expiration behavior. Stage Three proposes a portfolio optimization algorithm, for both discrete and continuous options, to select the optimal subset of architecture options, subject to budget and risk constraints. Finally, the feasibility, extensibility and limitations of the framework are assessed by its application to a reconnaissance satellite system development problem. Detailed technical data, performance models, and cost estimates were compiled for the Tactical Imaging Constellation Architecture Study and leveraged to complete a realistic proof-of-concept.
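For readers unfamiliar with real options, the plain binomial-lattice valuation below gives the flavor of Stage Two's option valuation. All numbers are invented, and the sketch does not implement the dissertation's variable-expiration treatment.

```python
def binomial_option_value(v0, up, down, p_up, rate, strike, periods):
    """Value an option to expand a system (pay `strike`, gain the then-current
    value) on a recombining binomial lattice.  Standard textbook real-options
    sketch with early exercise allowed at every node (American-style option).
    """
    # terminal payoffs at the last period
    values = [max(v0 * up**k * down**(periods - k) - strike, 0.0)
              for k in range(periods + 1)]
    # roll back through the lattice, comparing "wait" vs. "exercise now"
    for step in range(periods - 1, -1, -1):
        values = [
            max(
                (p_up * values[k + 1] + (1 - p_up) * values[k]) / (1 + rate),
                v0 * up**k * down**(step - k) - strike,
            )
            for k in range(step + 1)
        ]
    return values[0]

# Illustrative numbers only: current mission value 100, +/-20% per period,
# 55% chance of the favorable state, 5% discount rate, expansion cost 90.
print(binomial_option_value(100, 1.2, 0.8, 0.55, 0.05, 90, periods=4))
```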
Altintas, Ferdi; Müstecaplıoğlu, Özgür E
2015-08-01
We investigate a quantum heat engine with a working substance of two particles, one with a spin-1/2 and the other with an arbitrary spin (spin s), coupled by Heisenberg exchange interaction, and subject to an external magnetic field. The engine operates in a quantum Otto cycle. Work harvested in the cycle and its efficiency are calculated using quantum thermodynamical definitions. It is found that the engine has higher efficiencies at higher spins and can harvest work at higher exchange interaction strengths. The role of exchange coupling and spin s on the work output and the thermal efficiency is studied in detail. In addition, the engine operation is analyzed from the perspective of local work and efficiency. We develop a general formalism to explore local thermodynamics applicable to any coupled bipartite system. Our general framework allows for examination of local thermodynamics even when global parameters of the system are varied in thermodynamic cycles. The generalized definitions of local and cooperative work are introduced by using mean field Hamiltonians. The general conditions for which the global work is not equal to the sum of the local works are given in terms of the covariance of the subsystems. Our coupled spin quantum Otto engine is used as an example of the general formalism.
NASA Astrophysics Data System (ADS)
Altintas, Ferdi; Müstecaplıoğlu, Özgür E.
2015-08-01
We investigate a quantum heat engine with a working substance of two particles, one with a spin-1/2 and the other with an arbitrary spin (spin s), coupled by Heisenberg exchange interaction, and subject to an external magnetic field. The engine operates in a quantum Otto cycle. Work harvested in the cycle and its efficiency are calculated using quantum thermodynamical definitions. It is found that the engine has higher efficiencies at higher spins and can harvest work at higher exchange interaction strengths. The role of exchange coupling and spin s on the work output and the thermal efficiency is studied in detail. In addition, the engine operation is analyzed from the perspective of local work and efficiency. We develop a general formalism to explore local thermodynamics applicable to any coupled bipartite system. Our general framework allows for examination of local thermodynamics even when global parameters of the system are varied in thermodynamic cycles. The generalized definitions of local and cooperative work are introduced by using mean field Hamiltonians. The general conditions for which the global work is not equal to the sum of the local works are given in terms of the covariance of the subsystems. Our coupled spin quantum Otto engine is used as an example of the general formalism.
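For orientation, the standard quantum Otto bookkeeping referred to in the abstract can be written as follows (a generic textbook form, not a derivation specific to the coupled-spin model; symbol names are assumptions):

```latex
% Quantum Otto cycle: isochoric thermalization at field B_1 with the hot bath (T_h)
% and at B_2 with the cold bath (T_c); the adiabatic strokes preserve the populations P_n.
Q_\mathrm{in}  = \sum_n E_n(B_1)\,\bigl[P_n(T_\mathrm{h}) - P_n(T_\mathrm{c})\bigr],\qquad
Q_\mathrm{out} = \sum_n E_n(B_2)\,\bigl[P_n(T_\mathrm{c}) - P_n(T_\mathrm{h})\bigr],\\[4pt]
W = Q_\mathrm{in} + Q_\mathrm{out},\qquad
\eta = \frac{W}{Q_\mathrm{in}}\quad\text{(engine regime: } W>0,\ Q_\mathrm{in}>0\text{)}.
```

The spin dependence enters through the eigenvalues E_n of the coupled-spin Hamiltonian at the two field values.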
Improving Safety through Human Factors Engineering.
Siewert, Bettina; Hochman, Mary G
2015-10-01
Human factors engineering (HFE) focuses on the design and analysis of interactive systems that involve people, technical equipment, and work environment. HFE is informed by knowledge of human characteristics. It complements existing patient safety efforts by specifically taking into consideration that, as humans, frontline staff will inevitably make mistakes. Therefore, the systems with which they interact should be designed for the anticipation and mitigation of human errors. The goal of HFE is to optimize the interaction of humans with their work environment and technical equipment to maximize safety and efficiency. Special safeguards include usability testing, standardization of processes, and use of checklists and forcing functions. However, the effectiveness of the safety program and resiliency of the organization depend on timely reporting of all safety events independent of patient harm, including perceived potential risks, bad outcomes that occur even when proper protocols have been followed, and episodes of "improvisation" when formal guidelines are found not to exist. Therefore, an institution must adopt a robust culture of safety, where the focus is shifted from blaming individuals for errors to preventing future errors, and where barriers to speaking up, including barriers introduced by steep authority gradients, are minimized. This requires creation of formal guidelines to address safety concerns, establishment of unified teams with open communication and shared responsibility for patient safety, and education of managers and senior physicians to perceive the reporting of safety concerns as a benefit rather than a threat. © RSNA, 2015.
ERIC Educational Resources Information Center
Boots, Nikki Kim
2013-01-01
The emphasis on engaging young learners in science, technology, engineering, and math (STEM) professions is driving calls for educational reform. One movement that is gaining momentum is exposing K-12 learners to engineering. With the advent of the "Next Generation Science Standards" (2012b), engineering is being more formally integrated…
On decentralized design: Rationale, dynamics, and effects on decision-making
NASA Astrophysics Data System (ADS)
Chanron, Vincent
The focus of this dissertation is the design of complex systems, including engineering systems such as cars, airplanes, and satellites. Companies who design these systems are under constant pressure to design better products that meet customer expectations, and competition forces them to develop them faster. One of the responses of the industry to these conflicting challenges has been the decentralization of design responsibilities. The current lack of understanding of the dynamics of decentralized design processes is the main motivation for this research, and places value on the descriptive base. It identifies the main reasons and the true benefits for companies to decentralize the design of their products. It also demonstrates the limitations of this approach by listing the relevant issues and problems created by the decentralization of decisions. Based on these observations, a game-theoretic approach to decentralized design is proposed to model the decisions made during the design process. The dynamics are modeled using mathematical formulations inspired by control theory. Building upon this formalism, the issue of convergence in decentralized design is analyzed: the equilibrium points of the design space are identified and convergent and divergent patterns are recognized. This rigorous investigation of the design process provides motivation and support for proposing new approaches to decentralized design problems. Two methods are developed, which aim at improving the design process in two ways: decreasing the product development time, and increasing the optimality of the final design. The frameworks of these methods are inspired by eigenstructure decomposition and set-based design, respectively. The value of the research detailed within this dissertation is in the proposed methods, which are built upon the sound mathematical formalism developed. The contribution of this work is twofold: rigorous investigation of the design process, and practical support to decision-making in decentralized environments.
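The convergence question raised above can be illustrated with a toy example (mine, not the dissertation's formulation): two design teams repeatedly best-respond to each other's latest variable, the linearized joint update is an affine map, and its spectral radius decides whether the decentralized process converges to an equilibrium or diverges.

```python
import numpy as np

# Toy decentralized design: team 1 controls x1 and team 2 controls x2.  Each
# team's (linearized) best response to the other's latest value is an affine
# map; the coupling coefficients are illustrative, not from the dissertation.
A = np.array([[0.0,  0.4],     # x1 <- 0.4*x2 + 1.0
              [-0.6, 0.0]])    # x2 <- -0.6*x1 + 2.0
b = np.array([1.0, 2.0])

def iterate(x, rounds=30):
    """Simultaneous (Jacobi-style) best-response iteration x_{k+1} = A x_k + b."""
    history = [x]
    for _ in range(rounds):
        x = A @ x + b
        history.append(x)
    return history

rho = max(abs(np.linalg.eigvals(A)))          # spectral radius of the coupling
x_eq = np.linalg.solve(np.eye(2) - A, b)      # fixed point (equilibrium), valid if rho < 1
print(f"spectral radius = {rho:.3f} -> {'converges' if rho < 1 else 'diverges'}")
print("equilibrium:", x_eq, "  after 30 rounds:", iterate(np.zeros(2))[-1])
```

Stronger cross-coupling (larger off-diagonal terms) pushes the spectral radius above one and the same process oscillates and diverges, which is the divergent pattern the dissertation analyzes.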
Applications of Ontology Design Patterns in Biomedical Ontologies
Mortensen, Jonathan M.; Horridge, Matthew; Musen, Mark A.; Noy, Natalya F.
2012-01-01
Ontology design patterns (ODPs) are a proposed solution to facilitate ontology development, and to help users avoid some of the most frequent modeling mistakes. ODPs originate from similar approaches in software engineering, where software design patterns have become a critical aspect of software development. There is little empirical evidence for ODP prevalence or effectiveness thus far. In this work, we determine the use and applicability of ODPs in a case study of biomedical ontologies. We encoded ontology design patterns from two ODP catalogs. We then searched for these patterns in a set of eight ontologies. We found five of the 69 patterns. Two of the eight ontologies contained these patterns. While ontology design patterns provide a vehicle for capturing formally reoccurring models and best practices in ontology design, we show that today their use in a case study of widely used biomedical ontologies is limited. PMID:23304337
Security Hardened Cyber Components for Nuclear Power Plants: Phase I SBIR Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franusich, Michael D.
SpiralGen, Inc. built a proof-of-concept toolkit for enhancing the cyber security of nuclear power plants and other critical infrastructure with high-assurance instrumentation and control code. The toolkit is based on technology from the DARPA High-Assurance Cyber Military Systems (HACMS) program, which has focused on applying the science of formal methods to the formidable set of problems involved in securing cyber physical systems. The primary challenges beyond HACMS in developing this toolkit were to make the new technology usable by control system engineers and compatible with the regulatory and commercial constraints of the nuclear power industry. The toolkit, packaged as a Simulink add-on, allows a system designer to assemble a high-assurance component from formally specified and proven blocks and generate provably correct control and monitor code for that subsystem.
DNA stress and strain, in silico, in vitro and in vivo
NASA Astrophysics Data System (ADS)
Levens, David; Benham, Craig J.
2011-06-01
A vast literature has explored the genetic interactions among the cellular components regulating gene expression in many organisms. Early on, in the absence of any biochemical definition, regulatory modules were conceived using the strict formalism of genetics to designate the modifiers of phenotype as either cis- or trans-acting depending on whether the relevant genes were embedded in the same or separate DNA molecules. This formalism distilled gene regulation down to its essence in much the same way that consideration of an ideal gas reveals essential thermodynamic and kinetic principles. Yet just as the anomalous behavior of materials may thwart an engineer who ignores their non-ideal properties, schemes to control and manipulate the genetic and epigenetic programs of cells may falter without a fuller and more quantitative elucidation of the physical and chemical characteristics of DNA and chromatin in vivo.
Better Broader Impacts through National Science Foundation Centers
NASA Astrophysics Data System (ADS)
Campbell, K. M.
2010-12-01
National Science Foundation Science and Technology Centers (STCs) play a leading role in developing and evaluating “Better Broader Impacts”: best practices for recruiting a broad spectrum of American students into STEM fields and for educating these future professionals, as well as their families, teachers and the general public. With staff devoted full time to Broader Impacts activities, over the ten year life of a Center, STCs are able to address both a broad range of audiences and a broad range of topics. Along with other NSF funded centers, such as Centers for Ocean Sciences Education Excellence, Engineering Research Centers and Materials Research Science and Engineering Centers, STCs develop models and materials that individual researchers can adopt, as well as, in some cases, direct opportunities for individual researchers to offer their disciplinary research expertise to existing center Broader Impacts programs. The National Center for Earth-surface Dynamics (NCED) is an STC headquartered at the University of Minnesota. NCED's disciplinary research spans the physical, biological and engineering issues associated with developing an integrative, quantitative and predictive understanding of rivers and river basins. Funded in 2002, we have had the opportunity to partner with individuals and institutions ranging from formal to informal education and from science museums to Tribal and women's colleges. We have developed simple table-top physical models, complete museum exhibitions, 3D paper maps and interactive computer-based visualizations, all of which have helped us communicate with this wide variety of learners. Many of these materials, or plans to construct them, are available online; in many cases they have also been formally evaluated. We have also listened to the formal and informal educators with whom we partner, from whom we have learned a great deal about how to design Broader Impacts activities and programs. Using NCED as a case study, this session showcases NCED's materials, approaches and lessons learned. We will also introduce the work of our sister STCs, whose disciplines span the STEM fields.
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems
NASA Technical Reports Server (NTRS)
Manna, Zohar
1996-01-01
The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.
Eliciting design patterns for e-learning systems
NASA Astrophysics Data System (ADS)
Retalis, Symeon; Georgiakakis, Petros; Dimitriadis, Yannis
2006-06-01
Design pattern creation, especially in the e-learning domain, is a highly complex process that has not been sufficiently studied and formalized. In this paper, we propose a systematic pattern development cycle, whose most important aspects focus on reverse engineering of existing systems in order to elicit features that are cross-validated through the use of appropriate, authentic scenarios. However, an iterative pattern process is proposed that takes advantage of multiple data sources, thus emphasizing a holistic view of the teaching learning processes. The proposed schema of pattern mining has been extensively validated for Asynchronous Network Supported Collaborative Learning (ANSCL) systems, as well as for other types of tools in a variety of scenarios, with promising results.
Integrated analysis of large space systems
NASA Technical Reports Server (NTRS)
Young, J. P.
1980-01-01
Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include a fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.
META-GLARE: a shell for CIG systems.
Bottrighi, Alessio; Rubrichi, Stefania; Terenziani, Paolo
2015-01-01
In the last twenty years, many different approaches to deal with Computer-Interpretable clinical Guidelines (CIGs) have been developed, each one proposing its own representation formalism (mostly based on the Task-Network Model) and execution engine. We propose META-GLARE, a shell for easily defining new CIG systems. Using META-GLARE, CIG system designers can easily define their own systems (basically by defining their representation language), with a minimal programming effort. META-GLARE is thus a flexible and powerful vehicle for research about CIGs, since it supports easy and fast prototyping of new CIG systems.
MedSynDiKATe--design considerations for an ontology-based medical text understanding system.
Hahn, U.; Romacker, M.; Schulz, S.
2000-01-01
MedSynDiKATe is a natural language processor for automatically acquiring knowledge from medical finding reports. The content of these documents is transferred to formal representation structures which constitute a corresponding text knowledge base. The general system architecture we present integrates requirements from the analysis of single sentences, as well as those of referentially linked sentences forming cohesive texts. The strong demands MedSynDiKATe poses to the availability of expressive knowledge sources are accounted for by two alternative approaches to (semi)automatic ontology engineering. PMID:11079899
VRML metabolic network visualizer.
Rojdestvenski, Igor
2003-03-01
A successful data collection visualization should satisfy many requirements: unification of diverse data formats, support for serendipity research, support of hierarchical structures, algorithmizability, vast information density, Internet-readiness, and others. Recently, virtual reality has made significant progress in engineering, architectural design, entertainment and communication. We experiment with the possibility of using immersive abstract three-dimensional visualizations of metabolic networks. We present the trial Metabolic Network Visualizer software, which produces a graphical representation of a metabolic network as a VRML world from a formal description written in a simple SGML-type scripting language.
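As a sketch of what such a generator does (a minimal example of mine; the visualizer's own SGML-type input language is not reproduced here), a few lines of Python can lay metabolite nodes on a circle and emit a valid VRML97 world of spheres plus an edge set for the reactions.

```python
import math

def metabolic_network_to_vrml(metabolites, reactions):
    """Emit a minimal VRML97 world: one sphere per metabolite (laid out on a
    circle) and an IndexedLineSet edge per reaction.  Illustrative only."""
    n = len(metabolites)
    pos = {m: (5 * math.cos(2 * math.pi * i / n),
               5 * math.sin(2 * math.pi * i / n), 0.0)
           for i, m in enumerate(metabolites)}
    out = ["#VRML V2.0 utf8"]
    for x, y, z in pos.values():
        out.append(
            f"Transform {{ translation {x:.2f} {y:.2f} {z:.2f} "
            f"children [ Shape {{ appearance Appearance {{ material Material "
            f"{{ diffuseColor 0.2 0.6 1.0 }} }} geometry Sphere {{ radius 0.4 }} }} ] }}")
    points = " ".join(f"{x:.2f} {y:.2f} {z:.2f}," for x, y, z in pos.values())
    index = {m: i for i, m in enumerate(metabolites)}
    lines = " ".join(f"{index[a]}, {index[b]}, -1," for a, b in reactions)
    out.append(f"Shape {{ geometry IndexedLineSet {{ coord Coordinate "
               f"{{ point [ {points} ] }} coordIndex [ {lines} ] }} }}")
    return "\n".join(out)

print(metabolic_network_to_vrml(
    ["glucose", "G6P", "F6P", "pyruvate"],
    [("glucose", "G6P"), ("G6P", "F6P"), ("F6P", "pyruvate")]))
```

The resulting text file can be opened in any VRML97 viewer; richer layouts, labels and hierarchy are what the actual tool adds on top of this skeleton.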
Cyber-physical approach to the network-centric robotics control task
NASA Astrophysics Data System (ADS)
Muliukha, Vladimir; Ilyashenko, Alexander; Zaborovsky, Vladimir; Lukashin, Alexey
2016-10-01
Complex engineering tasks concerning the control of groups of mobile robots remain poorly developed. In our work we formalize them using a cyber-physical approach, which extends the range of engineering and physical methods for the design of complex technical objects by researching the informational aspects of communication and interaction between objects and with an external environment [1]. The paper analyzes network-centric methods for control of cyber-physical objects. Robots, or cyber-physical objects, interact with each other by transmitting information via computer networks using a preemptive queueing system and a randomized push-out mechanism [2],[3]. The main field of application for the results of our work is space robotics. The selection of cyber-physical systems as a special class of designed objects is due to the necessity of integrating various components responsible for computing, communications and control processes. Network-centric solutions allow universal means of organizing information exchange to be used to integrate different technologies for the control system.
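The transport mechanism mentioned above, a finite buffer with preemptive priority service and a randomized push-out rule, can be sketched in a few lines. The class labels, the acceptance rule and the parameter alpha below are illustrative assumptions, not the analytical model of [2],[3].

```python
import random

class PushOutBuffer:
    """Finite buffer with two priority classes.  When the buffer is full, an
    arriving high-priority packet pushes out a queued low-priority packet with
    probability `alpha` (the randomized push-out mechanism); otherwise the
    arrival is dropped.  Service is priority-ordered: class 1 before class 2.
    """
    def __init__(self, capacity, alpha):
        self.capacity, self.alpha = capacity, alpha
        self.queue = []                       # list of (priority, packet_id)

    def arrive(self, priority, packet_id):
        if len(self.queue) < self.capacity:
            self.queue.append((priority, packet_id))
            return "accepted"
        low = [i for i, (p, _) in enumerate(self.queue) if p == 2]
        if priority == 1 and low and random.random() < self.alpha:
            self.queue.pop(low[-1])           # push out the newest low-priority packet
            self.queue.append((priority, packet_id))
            return "accepted (pushed out a low-priority packet)"
        return "dropped"

    def serve(self):
        if not self.queue:
            return None
        self.queue.sort(key=lambda item: item[0])   # class 1 served before class 2
        return self.queue.pop(0)

buf = PushOutBuffer(capacity=3, alpha=0.8)
for pkt in [(2, "a"), (2, "b"), (2, "c"), (1, "d")]:
    print(pkt, "->", buf.arrive(*pkt))
```

Tuning alpha trades loss probability between the two traffic classes, which is the control knob the cited queueing analysis studies.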
A linear decomposition method for large optimization problems. Blueprint for development
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.
1982-01-01
A method is proposed for decomposing large optimization problems encountered in the design of engineering systems such as an aircraft into a number of smaller subproblems. The decomposition is achieved by organizing the problem and the subordinated subproblems in a tree hierarchy and optimizing each subsystem separately. Coupling of the subproblems is accounted for by subsequent optimization of the entire system based on sensitivities of the suboptimization problem solutions at each level of the tree to variables of the next higher level. A formalization of the procedure suitable for computer implementation is developed and the state of readiness of the implementation building blocks is reviewed showing that the ingredients for the development are on the shelf. The decomposition method is also shown to be compatible with the natural human organization of the design process of engineering systems. The method is also examined with respect to the trends in computer hardware and software progress to point out that its efficiency can be amplified by network computing using parallel processors.
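In compact form, the coordination idea described above can be stated as follows (generic notation of my choosing; the report's own symbols may differ): each subsystem solves its own problem for fixed higher-level variables z, and the upper level re-optimizes z using the optimum sensitivities of the subsystem solutions.

```latex
% Subsystem i, for fixed system-level variables z:
f_i^{*}(z) \;=\; \min_{x_i}\; f_i(x_i, z)
\quad \text{s.t.}\quad g_i(x_i, z) \le 0,
\\[6pt]
% System level, coordinated through the optimum-sensitivity derivatives:
\min_{z}\; F(z) \;=\; \sum_i f_i^{*}(z),
\qquad
\frac{d f_i^{*}}{d z} \;=\; \left.\frac{\partial \mathcal{L}_i}{\partial z}\right|_{x_i^{*},\,\lambda_i^{*}},
\quad \mathcal{L}_i \text{ the Lagrangian of subproblem } i.
```

The sensitivities let each level account for the couplings without re-solving the other levels at every step, which is also what makes the scheme amenable to parallel processing.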
Modular Knowledge Representation and Reasoning in the Semantic Web
NASA Astrophysics Data System (ADS)
Serafini, Luciano; Homola, Martin
Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desired to perform reasoning by combination of multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Johnson, Stephen B.
2013-01-01
The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of the relationships between SE, SHM, and FM provides hints to a modeling approach to provide formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve goals and functions (SHM/FM). This methodology and corresponding model, known as a Goal-Function Tree (GFT), provide a means to represent, decompose, and elaborate system goals and functions in a rigorous manner that connects directly to design through use of state variables that translate natural language requirements and goals into logical-physical state language. The state variable-based approach also provides the means to directly connect FM to the design, by specifying the range in which state variables must be controlled to achieve goals, and conversely, the failures that exist if system behavior goes out of range. This in turn allows the systems engineers and SHM/FM engineers to determine which state variables to monitor, and what action(s) to take should the system fail to achieve that goal. In sum, the GFT representation provides a unified approach to early-phase SE and FM development. This representation and methodology have been successfully developed and implemented using Systems Modeling Language (SysML) on the NASA Space Launch System (SLS) Program.
It enabled early design trade studies of failure detection coverage to ensure complete detection coverage of all crew-threatening failures. The representation maps directly both to FM algorithm designs, and to failure scenario definitions needed for design analysis and testing. The GFT representation provided the basis for mapping of abort triggers into scenarios, both needed for initial, and successful quantitative analyses of abort effectiveness (detection and response to crew-threatening events).
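A minimal data-structure sketch of the goal/state-variable pattern described above is given below. The goal names, state variables and ranges are invented for illustration; this is not the SLS SysML model.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Goal:
    """One node of a Goal-Function Tree: a goal stated as a constraint on a
    state variable, with child goals that elaborate how it is achieved."""
    name: str
    state_variable: str
    allowed_range: Tuple[float, float]
    children: List["Goal"] = field(default_factory=list)

    def failed(self, state: dict) -> bool:
        """FM view of the same node: the goal is failed when the controlling
        state variable is outside its allowed range."""
        lo, hi = self.allowed_range
        return not (lo <= state[self.state_variable] <= hi)

    def failures(self, state: dict) -> List[str]:
        """Walk the tree and report every goal currently not being achieved."""
        found = [self.name] if self.failed(state) else []
        for child in self.children:
            found += child.failures(state)
        return found

ascent = Goal("Deliver payload to orbit", "orbital_insertion_error_km", (0.0, 10.0),
              children=[Goal("Maintain chamber pressure", "chamber_pressure_MPa", (18.0, 21.0)),
                        Goal("Maintain attitude", "attitude_error_deg", (0.0, 2.0))])

telemetry = {"orbital_insertion_error_km": 3.0,
             "chamber_pressure_MPa": 16.5,
             "attitude_error_deg": 0.4}
print(ascent.failures(telemetry))   # -> ['Maintain chamber pressure']
```

The same tree serves both views: systems engineers elaborate the goals and ranges, while FM engineers derive from them which state variables to monitor and which out-of-range conditions constitute failure.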
Topology reconstruction for B-Rep modeling from 3D mesh in reverse engineering applications
NASA Astrophysics Data System (ADS)
Bénière, Roseline; Subsol, Gérard; Gesquière, Gilles; Le Breton, François; Puech, William
2012-03-01
Nowadays, most of the manufactured objects are designed using CAD (Computer-Aided Design) software. Nevertheless, for visualization, data exchange or manufacturing applications, the geometric model has to be discretized into a 3D mesh composed of a finite number of vertices and edges. But, in some cases, the initial model may be lost or unavailable. In other cases, the 3D discrete representation may be modified, for example after a numerical simulation, and no longer corresponds to the initial model. A reverse engineering method is then required to reconstruct a 3D continuous representation from the discrete one. In previous work, we have presented a new approach for 3D geometric primitive extraction. In this paper, to complete our automatic and comprehensive reverse engineering process, we propose a method to construct the topology of the retrieved object. To reconstruct a B-Rep model, a new formalism is now introduced to define the adjacency relations. Then a new process is used to construct the boundaries of the object. The whole process is tested on 3D industrial meshes and brings a solution for recovering B-Rep models.
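One simplified way to recover adjacency relations of the kind described above is sketched below, under the assumption that each mesh triangle has already been labeled with the primitive it was fitted to; the paper's actual formalism is richer than this.

```python
from collections import defaultdict

def primitive_adjacency(triangles, labels):
    """Given mesh triangles (vertex-index triples) and, for each triangle, the id
    of the geometric primitive it was fitted to, return the set of primitive
    pairs that share at least one mesh edge.  Those shared edge chains are the
    candidate boundary curves of the B-Rep topology.
    """
    edge_owners = defaultdict(set)
    for tri, prim in zip(triangles, labels):
        for a, b in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            edge_owners[tuple(sorted((a, b)))].add(prim)

    adjacency = set()
    for owners in edge_owners.values():
        owners = sorted(owners)
        for i in range(len(owners)):
            for j in range(i + 1, len(owners)):
                adjacency.add((owners[i], owners[j]))
    return adjacency

# Two planar patches meeting along the shared edge (1, 2):
triangles = [(0, 1, 2), (1, 3, 2)]
labels = ["plane_A", "plane_B"]
print(primitive_adjacency(triangles, labels))   # {('plane_A', 'plane_B')}
```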
Flood design recipes vs. reality: can predictions for ungauged basins be trusted?
NASA Astrophysics Data System (ADS)
Efstratiadis, A.; Koussis, A. D.; Koutsoyiannis, D.; Mamassis, N.
2014-06-01
Despite the great scientific and technological advances in flood hydrology, everyday engineering practices still follow simplistic approaches that are easy to formally implement in ungauged areas. In general, these "recipes" have been developed many decades ago, based on field data from typically few experimental catchments. However, many of them have been neither updated nor validated across all hydroclimatic and geomorphological conditions. This has an obvious impact on the quality and reliability of hydrological studies, and, consequently, on the safety and cost of the related flood protection works. Preliminary results, based on historical flood data from Cyprus and Greece, indicate that a substantial revision of many aspects of flood engineering procedures is required, including the regionalization formulas as well as the modelling concepts themselves. In order to provide a consistent design framework and to ensure realistic predictions of the flood risk (a key issue of the 2007/60/EU Directive) in ungauged basins, it is necessary to rethink the current engineering practices. In this vein, the collection of reliable hydrological data would be essential for re-evaluating the existing "recipes", taking into account local peculiarities, and for updating the modelling methodologies as needed.
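A typical example of the kind of "recipe" the abstract refers to (my example; the paper does not single out a specific formula) is the rational method for peak discharge in small ungauged catchments:

```latex
Q_p \;=\; \frac{C\, i\, A}{3.6},
\qquad
Q_p~[\mathrm{m^3/s}],\quad
C~\text{(dimensionless runoff coefficient)},\quad
i~[\mathrm{mm/h}]\ \text{(design rainfall intensity)},\quad
A~[\mathrm{km^2}],
```

with C and the intensity-duration-frequency relation for i usually taken from decades-old regional tables, which is precisely the sort of assumption the authors argue should be re-evaluated against local flood data.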
NASA Astrophysics Data System (ADS)
Mitchell, K. L.; Lowes, L. L.; Budney, C. J.; Sohus, A.
2014-12-01
NASA's Planetary Science Summer School (PSSS) is an intensive program for postdocs and advanced graduate students in science and engineering fields with a keen interest in planetary exploration. The goal is to train the next generation of planetary science mission leaders in a hands-on environment involving a wide range of engineers and scientists. It was established in 1989, and has undergone several incarnations. Initially a series of seminars, it became a more formal mission design experience in 1999. Admission is competitive, with participants given financial support. The competitively selected trainees develop an early mission concept study in teams of 15-17, responsive to a typical NASA Science Mission Directorate Announcement of Opportunity. They select the mission concept from options presented by the course sponsors, based on high-priority missions as defined by the Decadal Survey, prepare a presentation for a proposal authorization review, present it to a senior review board, and receive critical feedback. Each participant assumes multiple roles, on science, instrument and project teams. A series of reading assignments and webinars helps trainees develop an understanding of top-level science requirements and instrument priorities in advance. Then, during the five-day session at the Jet Propulsion Laboratory, they work closely with concurrent engineers including JPL's Advanced Projects Design Team ("Team X"), a cross-functional multidisciplinary team of engineers that utilizes concurrent engineering methodologies to complete rapid design, analysis and evaluation of mission concept designs. All are mentored and assisted directly by Team X members and course tutors in their assigned project roles. There is a strong emphasis on making difficult trades, simulating a real mission design process as accurately as possible. The process is intense and at times dramatic, with fast-paced design sessions and late evening study sessions. A survey of PSSS alumni administered in 2013 provides information on the program's impact on trainees' career choices and leadership roles as they pursue their employment in planetary science and related fields. Results will be presented during the session, along with highlights of topics and missions covered since the program's inception.
Test and On-Orbit Experiences of FalconSAT-3
NASA Astrophysics Data System (ADS)
Saylor, W. W.; France, M. E. B.
2008-08-01
The fundamental objectives of the capstone design project in the Department of Astronautics at the United States Air Force Academy (USAFA) are for cadets to learn important engineering lessons by executing a real space mission on a Department of Defense-funded satellite project. FalconSAT-3 is a 50 kg, gravity gradient-stabilized satellite designed and built by cadets and launched in March 2007 on the first ESPA (EELV Secondary Payload Adapter) mission. FalconSAT-3 was one of six satellites integrated onto the launch vehicle, and the nature of the mission meant that the satellite was subject to the full formality of testing requirements. Two successive gravity gradient booms failed either design requirements or environmental testing; design requirements grew dramatically during the design phase; ambiguous thermal vacuum test results led to uncertainty at launch; and after launch it was not possible to contact the satellite for several weeks.
2016-05-01
The formal and informal interactions among scientists, engineers, and business and technology specialists fostered by this environment will lead...pathways for highly trained graduates of science, technology, engineering, and mathematics (STEM) academic programs, and help academic institutions...engineering and mathematics (STEM) disciplines relevant to ARL science and technology programs. Under EPAs, visiting students and professors
Educating Engineers in Information Utilization.
ERIC Educational Resources Information Center
Borovansky, Vladimir T.
1987-01-01
Traditionally, engineers have not been heavy users of information resources. This can be traced to a lack of emphasis on information sources in engineering education. Failure to use available knowledge leads to reinventing the wheel and losing the race for technological superiority. Few U.S. universities offer formal courses in information resources in…
Formal ontology for natural language processing and the integration of biomedical databases.
Simon, Jonathan; Dos Santos, Mariana; Fielding, James; Smith, Barry
2006-01-01
The central hypothesis underlying this communication is that the methodology and conceptual rigor of a philosophically inspired formal ontology can bring significant benefits in the development and maintenance of application ontologies [A. Flett, M. Dos Santos, W. Ceusters, Some Ontology Engineering Procedures and their Supporting Technologies, EKAW2002, 2003]. This hypothesis has been tested in the collaboration between Language and Computing (L&C), a company specializing in software for supporting natural language processing especially in the medical field, and the Institute for Formal Ontology and Medical Information Science (IFOMIS), an academic research institution concerned with the theoretical foundations of ontology. In the course of this collaboration L&C's ontology, LinKBase, which is designed to integrate and support reasoning across a plurality of external databases, has been subjected to a thorough auditing on the basis of the principles underlying IFOMIS's Basic Formal Ontology (BFO) [B. Smith, Basic Formal Ontology, 2002. http://ontology.buffalo.edu/bfo]. The goal is to transform a large terminology-based ontology into one with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase are standardized in the framework of first-order logic. In this paper we describe how this principles-based standardization has led to a greater degree of internal coherence of the LinKBase structure, and how it has facilitated the construction of mappings between external databases using LinKBase as translation hub. We argue that the collaboration here described represents a new phase in the quest to solve the so-called "Tower of Babel" problem of ontology integration [F. Montayne, J. Flanagan, Formal Ontology: The Foundation for Natural Language Processing, 2003. http://www.landcglobal.com/].
Studies on behaviour of information to extract the meaning behind the behaviour
NASA Astrophysics Data System (ADS)
Nasution, M. K. M.; Syah, R.; Elveny, M.
2017-01-01
The Web, as a social medium, can be used as a reference for determining social behaviour. However, information extraction that involves a search engine does not easily give that picture. Several properties of the search engine need to be formally disclosed to provide assurance that the information is feasible. Although a good deal of research has examined the interest of the Web as a social medium, few studies have revealed the behaviour of information related to social behaviour. Formal steps are therefore needed to present the possible related properties. Twelve interconnected properties are identified as the behaviour of information, and several meanings are then revealed based on the simulation results of a search engine.
Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos
2012-06-01
The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). In this regard, the employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model was defined. The entire framework architecture was then specified and implemented by adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure, according to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support considering the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we describe the establishment of the proposed knowledge framework, the employed methodology, and the results obtained as regards implementation, performance and validation, which highlight its applicability and value for medication safety.
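To make the notion of a rule-based ADE signal concrete, here is a minimal Python sketch of how one such rule might be evaluated against patient context. The rule content, threshold, and field names are entirely hypothetical and are not taken from the framework described in the abstract.

```python
# Minimal rule-based ADE-signal check (illustrative only; rule content is invented).

def warfarin_inr_signal(patient):
    """Fire a hypothetical ADE signal: warfarin prescribed while INR is high."""
    on_warfarin = "warfarin" in patient.get("medications", [])
    inr = patient.get("labs", {}).get("INR")
    return on_warfarin and inr is not None and inr > 4.0

def evaluate_signals(patient, signals):
    """Return the names of all signals that fire for this patient."""
    return [name for name, rule in signals.items() if rule(patient)]

knowledge_base = {"warfarin_high_INR": warfarin_inr_signal}

patient = {"medications": ["warfarin", "amoxicillin"], "labs": {"INR": 4.8}}
print(evaluate_signals(patient, knowledge_base))   # -> ['warfarin_high_INR']
```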
Probabilistic simulation of concurrent engineering of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of the engineering process. An approach is outlined for computationally formalizing the concurrent engineering process from cradle to grave via discipline-dedicated workstations linked with a common database.
DoD Science and Engineering Apprenticeship Program for High School Students, 1996- Activities
1997-05-01
including lectures, laboratory demonstrations, scientific films, field trips, and a formal course and a weekly discussion session on the history of science using...
On the Need for Practical Formal Methods
1998-01-01
additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several examples ... either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented
Hufnagel, S; Harbison, K; Silva, J; Mettala, E
1994-01-01
This paper describes a new method for the evolutionary determination of user requirements and system specifications called scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal components' behavioral specifications. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is to have a threefold benefit. (i) Scenario view abstractions provide consistent interdisciplinary communications. (ii) Hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution. (iii) SEP and health care DSSA integration into computer aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems, from reuse libraries.
NASA Astrophysics Data System (ADS)
Jacobitz, Frank; Schubert, Thomas
2013-11-01
Short-term, study-abroad, elective engineering courses were developed in order to raise the international awareness and global competency of engineering students. These Compact International Experience (CIE) courses were taught in response to a strong student desire for engineering study abroad courses and an effort by the home institution to internationalize its curriculum. An assessment of repeat offerings of two three-semester-unit courses on Topics in Fluid Mechanics and Advanced Electronic Circuit Design in a three-week time frame in France and Australia was performed. The goals of the two CIE courses are an effective teaching of their respective technical content as well as a student understanding of the cultural environment and the impact of engineering solutions from a global and societal viewpoint. In the repeat offerings, increased interaction with local industry was an additional goal. The CIE courses were assessed through surveys completed at the beginning and end of the courses, weekly student reflection papers, course evaluations, and formalized instructor observations. Based on the assessment performed, the two CIE courses have been found to be a valuable approach in the delivery of engineering technical electives combined with an international experience.
Formal Abstraction in Engineering Education--Challenges and Technology Support
ERIC Educational Resources Information Center
Neuper, Walther A.
2017-01-01
This is a position paper in the field of Engineering Education, which is at its very beginning in Europe. It relates challenges in the new field to the emerging technology of (Computer) Theorem Proving (TP). Experience shows that "teaching" abstract models, for instance the wave equation in mechanical engineering and in electrical…
Knowledge Engineering as a Component of the Curriculum for Medical Cybernetists.
Karas, Sergey; Konev, Arthur
2017-01-01
According to a new state educational standard, students who have chosen medical cybernetics as their major must develop a knowledge engineering competency. Previously, in the course "Clinical Cybernetics", students practicing project-based learning designed automated workstations for medical personnel using client-server technology. The purpose of this article is to give insight into the project of a new educational module, "Knowledge Engineering". Students will acquire expert knowledge by holding interviews and conducting surveys, and then they will formalize it. After that, students will form declarative expert knowledge in a network model and analyze the knowledge graph. Expert decision-making methods will be applied in software on the basis of a production model of knowledge. Project implementation will result not only in the development of analytical competencies among students, but also in the creation of a practically useful expert system, based on student models, to support medical decisions. This module is currently being tested in the educational process.
NASA Technical Reports Server (NTRS)
Vaughan, William W.; Anderson, B. Jeffrey
2005-01-01
In modern government and aerospace industry institutions the necessity of controlling current year costs often leads to high mobility in the technical workforce, "one-deep" technical capabilities, and minimal mentoring for young engineers. Thus, formal recording, use, and teaching of lessons learned are especially important in the maintenance and improvement of current knowledge and development of new technologies, regardless of the discipline area. Within the NASA Technical Standards Program Website http://standards.nasa.gov there is a menu item entitled "Lessons Learned/Best Practices". It contains links to a large number of engineering and technical disciplines related data sets that contain a wealth of lessons learned information based on past experiences. This paper has provided a small sample of lessons learned relative to the atmospheric and space environment. There are many more whose subsequent applications have improved our knowledge of the atmosphere and space environment, and the application of this knowledge to the engineering and operations for a variety of aerospace programs.
Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.
Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas
2016-06-17
Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. It is not feasible, however, to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework, and uniquely provides a means to flexibly balance the design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
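A toy Python sketch of the underlying idea, selecting a small subset of a combinatorial pathway library rather than building every variant, is shown below. It uses simple random subsampling of a full factorial design as a stand-in for the statistically principled DOE constructions that Double Dutch automates; the part names are invented.

```python
import itertools
import random

# Hypothetical component libraries for a small pathway.
promoters = ["pLow", "pMed", "pHigh"]
rbss      = ["rbsA", "rbsB", "rbsC"]
genes     = ["geneX", "geneY"]

# Full factorial: every promoter/RBS/gene combination (3 * 3 * 2 = 18 variants).
full_factorial = list(itertools.product(promoters, rbss, genes))

# DOE-style reduction stand-in: build and test only a small fraction of the
# library, then fit an empirical model to the results instead of assembling all 18.
random.seed(0)
screening_set = random.sample(full_factorial, k=6)

for promoter, rbs, gene in screening_set:
    print(f"build variant: {promoter}-{rbs}-{gene}")
```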
A unified approach to the design of clinical reporting systems.
Gouveia-Oliveira, A; Salgado, N C; Azevedo, A P; Lopes, L; Raposo, V D; Almeida, I; de Melo, F G
1994-12-01
Computer-based Clinical Reporting Systems (CRS) for diagnostic departments that use structured data entry have a number of functional and structural affinities suggesting that a common software architecture for CRS may be defined. Such an architecture should allow easy expandability and reusability of a CRS. We report the development methodology and the architecture of SISCOPE, a CRS originally designed for gastrointestinal endoscopy that is expandable and reusable. Its main components are a patient database, a knowledge base, a reports base, and screen and reporting engines. The knowledge base contains the description of the controlled vocabulary and all the information necessary to control the menu system, and is easily accessed and modified with a conventional text editor. The structure of the controlled vocabulary is formally presented as an entity-relationship diagram. The screen engine drives a dynamic user interface and the reporting engine automatically creates a medical report; both engines operate by following a set of rules and the information contained in the knowledge base. Clinical experience has shown this architecture to be highly flexible and to allow frequent modifications of both the vocabulary and the menu system. This structure provided increased collaboration among development teams, insulating the domain expert from the details of the database, and enabling him to modify the system as necessary and to test the changes immediately. The system has also been reused in several different domains.
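The division of labour described above, a knowledge base holding the controlled vocabulary with screen and reporting engines driven by it, can be illustrated with a deliberately tiny Python sketch. The vocabulary entries and report wording below are invented and are not the SISCOPE design.

```python
# Tiny illustration of a knowledge-base-driven reporting engine (all content invented).
knowledge_base = {
    "findings": {
        "esophagitis": {"grades": ["A", "B", "C", "D"],
                        "sentence": "Esophagitis, grade {grade}."},
        "normal_mucosa": {"grades": [],
                          "sentence": "Normal-appearing mucosa throughout."},
    }
}

def report_engine(selected):
    """Turn menu selections (finding name + optional grade) into report text."""
    lines = []
    for finding, grade in selected:
        entry = knowledge_base["findings"][finding]
        lines.append(entry["sentence"].format(grade=grade) if grade
                     else entry["sentence"])
    return " ".join(lines)

print(report_engine([("esophagitis", "B"), ("normal_mucosa", None)]))
# -> Esophagitis, grade B. Normal-appearing mucosa throughout.
```

Because the vocabulary lives entirely in the knowledge-base dictionary, a domain expert could edit or extend it without touching the engine code, which is the flexibility the abstract emphasizes.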
Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M
2014-06-01
The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.
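Since SBOL is serialized as XML/RDF, a few lines of Python with rdflib convey the flavour of what such an exchange document looks like. The namespace URI and property names below are simplified placeholders, not the normative SBOL vocabulary.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef

# Placeholder namespace; the real SBOL vocabulary and data model differ.
EX = Namespace("http://example.org/sbol-like#")

g = Graph()
g.bind("ex", EX)

part = URIRef("http://example.org/parts/promoter_0001")
g.add((part, RDF.type, EX.DnaComponent))
g.add((part, EX.displayId, Literal("promoter_0001")))
g.add((part, EX.nucleotides, Literal("ttgacagctagctcagtcctagg")))

# Serialize the design fragment as RDF/XML for exchange between tools.
print(g.serialize(format="xml"))
```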
Confined Detonations and Pulse Detonation Engines
2003-01-01
chemically reacting flow was described by the 2D Euler equations ∂q/∂t + ∂F(q)/∂x + ∂G(q)/∂y = W (1), where q = (p... Numerical investigations of RR and MR in supersonic chemically reacting flows have... The formalism of heterogeneous medium mechanics supplemented with an overall chemical reaction was
ERIC Educational Resources Information Center
Benya, Frazier F., Ed.; Fletcher, Cameron H.,Ed.; Hollander, Rachelle D.,Ed.
2013-01-01
Over the last two decades, colleges and universities in the United States have significantly increased the formal ethics instruction they provide in science and engineering. Today, science and engineering programs socialize students into the values of scientists and engineers as well as their obligations in the conduct of scientific research and…
NASA Astrophysics Data System (ADS)
Meredith, Kate K.; Masters, Karen; Raddick, Jordan; Lundgren, Britt
2015-08-01
The Sloan Digital Sky Survey (SDSS) web interface “SkyServer” has long included online educational materials designed to help students and the public discover the fundamentals of modern astronomy using real observations from the SDSS database. The newly launched SDSS Voyages website updates and expands these activities to reflect new data from subsequent generations of the survey, advances in web technology, and evolving practices in science education. Voyages provides access to quality astronomy, astrophysics, and engineering materials to educators seeking an inquiry approach to fundamental concepts. During this session we will provide an overview of the design and development of Skyserver Voyages and discuss ways to apply this resource at K-12 and university levels.
NASA Astrophysics Data System (ADS)
Fiorani, D.; Acierno, M.
2017-05-01
The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, applications developed so far have adapted BIM to conservation design only poorly, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible knowledge representation, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is placed on decay analysis and the conservation of surfaces.
Thermodynamics of the mesoscopic thermoelectric heat engine beyond the linear-response regime.
Yamamoto, Kaoru; Hatano, Naomichi
2015-10-01
Mesoscopic thermoelectric heat engine is much anticipated as a device that allows us to utilize with high efficiency wasted heat inaccessible by conventional heat engines. However, the derivation of the heat current in this engine seems to be either not general or described too briefly, even inappropriately in some cases. In this paper, we give a clear-cut derivation of the heat current of the engine with suitable assumptions beyond the linear-response regime. It resolves the confusion in the definition of the heat current in the linear-response regime. After verifying that we can construct the same formalism as that of the cyclic engine, we find the following two interesting results within the Landauer-Büttiker formalism: the efficiency of the mesoscopic thermoelectric engine reaches the Carnot efficiency if and only if the transmission probability is finite at a specific energy and zero otherwise; the unitarity of the transmission probability guarantees the second law of thermodynamics, invalidating Benenti et al.'s argument in the linear-response regime that one could obtain a finite power with the Carnot efficiency under a broken time-reversal symmetry [Phys. Rev. Lett. 106, 230602 (2011)]. These results demonstrate how quantum mechanics constrains thermodynamics.
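For reference, the steady-state two-terminal Landauer-Büttiker expressions underlying this kind of analysis can be written as below; this is the standard textbook form, stated here as background rather than as the authors' precise derivation.

```latex
% Particle and left-reservoir heat currents (spin-degenerate, two-terminal):
J_N = \frac{2}{h}\int dE\;\mathcal{T}(E)\,\bigl[f_L(E)-f_R(E)\bigr],
\qquad
J_Q^{L} = \frac{2}{h}\int dE\;(E-\mu_L)\,\mathcal{T}(E)\,\bigl[f_L(E)-f_R(E)\bigr].
```

In this setting, the Carnot bound is approached when the transmission $\mathcal{T}(E)$ is nonzero only at a single energy $E_0$ chosen so that $\frac{E_0-\mu_L}{k_B T_L}=\frac{E_0-\mu_R}{k_B T_R}$, i.e. $f_L(E_0)=f_R(E_0)$, which makes transport reversible (and the power vanish), consistent with the condition stated in the abstract.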
Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)
NASA Technical Reports Server (NTRS)
Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)
2004-01-01
These proceedings contain 18 papers and 4 poster presentations, covering topics such as: multi-agent systems, agent-based control, formalism, norms, as well as physical and biological models of agent-based systems. Some applications presented in the proceedings include systems analysis, software engineering, computer networks and robot control.
NASA Astrophysics Data System (ADS)
Gardner, Grant E.; Jones, M. Gail; Albe, Virginie; Blonder, Ron; Laherto, Antti; Macher, Daniel; Paechter, Manuela
2017-10-01
Recent efforts in the science education community have highlighted the need to integrate research and theory from science communication research into more general science education scholarship. These synthesized research perspectives are relatively novel but serve an important need to better understand the impacts that the advent of rapidly emerging technologies will have on a new generation of scientists and engineers, including their formal communication with engaged citizenry. This cross-national study examined postsecondary science and engineering students' (n = 254 from five countries: Austria, Finland, France, Israel, and USA) perspectives on the role of science communication in their own formal science and engineering education. More broadly, we examined participants' understanding of their perceived responsibilities for communicating science and engineering to the general public when an issue contains complex social and ethical implications (SEI). The study is contextualized in the emergent technology of nanotechnology, for which SEI are of particular concern and for which the general public often perceives conflicting risks and benefits. Findings indicate that student participants hold similar views on the need for their own training in communication as future scientists and engineers. When asked about the role that ethics and risk perception play in research, development, and public communication of nanotechnology, participants demonstrate similar trajectories of perspectives that are, however, often anchored in very different levels of beginning concern. Results are discussed in the context of considerations for science communication training within formal science education curricula globally.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanz Rodrigo, Javier; Chávez Arroyo, Roberto Aurelio; Moriarty, Patrick
The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer structure with unique physics. This poses significant challenges to traditional wind engineering models that rely on surface-layer theories and engineering wind farm models to simulate the flow in and around wind farms. However, adopting an ABL approach offers the opportunity to better integrate wind farm design tools and meteorological models. The challenge is how to build the bridge between atmospheric and wind engineering model communities and how to establish a comprehensive evaluation process that identifies relevant physical phenomena for wind energy applications with modeling and experimental requirements. A framework for model verification, validation, and uncertainty quantification is established to guide this process by a systematic evaluation of the modeling system at increasing levels of complexity. In terms of atmospheric physics, 'building the bridge' means developing models for the so-called 'terra incognita,' a term used to designate the turbulent scales that transition from mesoscale to microscale. This range of scales within atmospheric research deals with the transition from parameterized to resolved turbulence and the improvement of surface boundary-layer parameterizations. The coupling of meteorological and wind engineering flow models, and the definition of a formal model evaluation methodology, is a strong area of research for the next generation of wind conditions assessment and wind farm and wind turbine design tools. Some fundamental challenges are identified in order to guide future research in this area.
NASA Astrophysics Data System (ADS)
Žáček, Martin
2017-07-01
Ontology or formal ontology? Which term is correct? The aim of this article is to introduce the correct terms and explain their basis. An ontology describes a particular area of interest (a domain) in a formal way: it defines the classes of objects in that area and the relationships that may exist between them. The meaning of ontology consists mainly in facilitating communication between people, improving the collaboration of software systems, and improving systems engineering. In all these areas, ontology offers the possibility of unifying views, maintaining consistency, and avoiding ambiguity.
Towards programming languages for genetic engineering of living cells
Pedersen, Michael; Phillips, Andrew
2009-01-01
Synthetic biology aims at producing novel biological systems to carry out some desired and well-defined functions. An ultimate dream is to design these systems at a high level of abstraction using engineering-based tools and programming languages, press a button, and have the design translated to DNA sequences that can be synthesized and put to work in living cells. We introduce such a programming language, which allows logical interactions between potentially undetermined proteins and genes to be expressed in a modular manner. Programs can be translated by a compiler into sequences of standard biological parts, a process that relies on logic programming and prototype databases that contain known biological parts and protein interactions. Programs can also be translated to reactions, allowing simulations to be carried out. While current limitations on available data prevent full use of the language in practical applications, the language can be used to develop formal models of synthetic systems, which are otherwise often presented by informal notations. The language can also serve as a concrete proposal on which future language designs can be discussed, and can help to guide the emerging standard of biological parts which so far has focused on biological, rather than logical, properties of parts. PMID:19369220
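A toy Python sketch of the compile-to-parts idea follows: a high-level "program" of logical roles is mapped to concrete parts by looking them up in a small parts database. The parts, identifiers, and mapping rule are invented and only gesture at what a real compiler backed by parts and interaction databases would do.

```python
# Invented parts database: logical roles mapped to candidate standard parts.
parts_db = {
    "inducible_promoter": ["BBa_hyp0001"],
    "repressor":          ["BBa_hyp0002"],
    "reporter":           ["BBa_hyp0003"],
}

# High-level "program": an ordered list of logical roles (a NOT-gate-like design).
program = ["inducible_promoter", "repressor", "reporter"]

def compile_to_parts(program, db):
    """Pick the first available part for each logical role in the program."""
    design = []
    for role in program:
        candidates = db.get(role)
        if not candidates:
            raise ValueError(f"no part available for role: {role}")
        design.append(candidates[0])
    return design

print(compile_to_parts(program, parts_db))
# -> ['BBa_hyp0001', 'BBa_hyp0002', 'BBa_hyp0003']
```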
Towards programming languages for genetic engineering of living cells.
Pedersen, Michael; Phillips, Andrew
2009-08-06
Synthetic biology aims at producing novel biological systems to carry out some desired and well-defined functions. An ultimate dream is to design these systems at a high level of abstraction using engineering-based tools and programming languages, press a button, and have the design translated to DNA sequences that can be synthesized and put to work in living cells. We introduce such a programming language, which allows logical interactions between potentially undetermined proteins and genes to be expressed in a modular manner. Programs can be translated by a compiler into sequences of standard biological parts, a process that relies on logic programming and prototype databases that contain known biological parts and protein interactions. Programs can also be translated to reactions, allowing simulations to be carried out. While current limitations on available data prevent full use of the language in practical applications, the language can be used to develop formal models of synthetic systems, which are otherwise often presented by informal notations. The language can also serve as a concrete proposal on which future language designs can be discussed, and can help to guide the emerging standard of biological parts which so far has focused on biological, rather than logical, properties of parts.
Introduction to computers: Reference guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ligon, F.V.
1995-04-01
The "Introduction to Computers" program establishes formal partnerships with local school districts and community-based organizations, introduces computer literacy to precollege students and their parents, and encourages students to pursue scientific, mathematical, engineering, and technical (SET) careers. Hands-on assignments are given in each class, reinforcing the lesson taught. In addition, the program is designed to broaden teachers' knowledge base in scientific and technical concepts, and Brookhaven National Laboratory continues to act as a liaison, offering educational outreach to diverse community organizations and groups. This manual contains the teacher's lesson plans and the student documentation for this introductory computer course.
Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL
NASA Technical Reports Server (NTRS)
Jenkins, J. Steven; Rouquette, Nicolas F.
2012-01-01
The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.
Assessing the Higher National Diploma Chemical Engineering Programme in Ghana: Students' Perspective
ERIC Educational Resources Information Center
Boateng, Cyril D.; Bensah, Edem Cudjoe; Ahiekpor, Julius C.
2012-01-01
Chemical engineers have played key roles in the growth of the chemical and allied industries in Ghana but indigenous industries that have traditionally been the domain of the informal sector need to be migrated to the formal sector through the entrepreneurship and innovation of chemical engineers. The Higher National Diploma Chemical Engineering…
Formal Assurance for Cognitive Architecture Based Autonomous Agent
NASA Technical Reports Server (NTRS)
Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco
2017-01-01
Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our effort in the use of cognitive architectures to design autonomous agents to collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar, by translating the agent to the formal verification environment Uppaal.
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
A theory of requirements definition in engineering design
NASA Astrophysics Data System (ADS)
Eodice, Michael Thomas
2000-10-01
Traditional requirements-definition activities begin with the engineer or design team performing a needs-analysis to identify user requirements. Needs-analysis is generally subjective, and varies according to the composition and experience of the design team. Systematic procedures for defining and ranking requirements are necessary to consolidate the foundation on which the design process is predicated, and to enhance its outcome by providing the designer with a consistent, reliable approach to product development. As a first step towards developing such procedures, research was conducted at Stanford University using empirical data from a NASA spaceflight experiment that flew aboard Space Shuttle mission STS-90 (April 1998). This research was accomplished using ex post facto data analysis. This researcher served in the central role of Experiment Manager for the spaceflight experiment, and acted as a participant-observer while conducting formal research. To better understand requirement structure and evolution, individual requirements were decomposed using AND/OR graphs. The AND/OR graph data structure illustrates requirements evolution, and reveals that the original requirement is often met by fulfilling a series of sub-requirements that are easier to implement. Early in the product life cycle, many hundreds of potential needs were identified; however, it was a smaller subset of these initial needs that were realized in the final product. Based on analysis of a large group of requirements, it was observed that two critical components for any individual requirement were: (1) a stated need, and (2) an advocate to implement the need. Identification of need, or needs-analysis, although a necessary condition, is insufficient to ensure that a stated need evolves into a formal requirement. Equally important to the concept of requirements-definition is the notion of advocacy. Early in the product development cycle of the NASA experiment, many potential needs were identified; however, it was only through need-advocate pairing that a stated need became a requirement. Empirical data revealed that when needs were accompanied by strong advocates, they became clear requirements. Conversely, needs without advocates did not become requirements. Hence, need-advocate pairing is useful for predicting needs that will become requirements, and more importantly, needs at risk of not becoming requirements.
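The AND/OR decomposition used in the study can be pictured with a short Python sketch: an AND node is satisfied only if all of its children are, while an OR node is satisfied if any child is. The requirement names below are invented for illustration.

```python
# Minimal AND/OR requirement graph (node names are invented).
class Node:
    def __init__(self, name, kind="LEAF", children=None):
        self.name, self.kind, self.children = name, kind, children or []

    def satisfied(self, fulfilled_leaves):
        """Evaluate the node: leaves check membership, AND/OR combine children."""
        if self.kind == "LEAF":
            return self.name in fulfilled_leaves
        results = [c.satisfied(fulfilled_leaves) for c in self.children]
        return all(results) if self.kind == "AND" else any(results)

requirement = Node("Downlink science data", "AND", [
    Node("Record data onboard"),
    Node("Provide downlink path", "OR", [
        Node("Use high-gain antenna"),
        Node("Use relay spacecraft"),
    ]),
])

print(requirement.satisfied({"Record data onboard", "Use relay spacecraft"}))  # True
print(requirement.satisfied({"Use high-gain antenna"}))                        # False
```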
NASA Astrophysics Data System (ADS)
Mishra, Arpit; Ghosh, Parthasarathi
2015-12-01
For low-cost, high-thrust space missions with high specific impulse and high reliability, inert weight needs to be minimized, thereby increasing the delivered payload. The turbopump feed system for a liquid propellant rocket engine (LPRE) has the highest power-to-weight ratio. Turbopumps are primarily equipped with an axial flow inducer to achieve high angular velocity and low suction pressure in combination with increased system reliability. The performance of the turbopump strongly depends on the performance of the inducer. Designing an LPRE turbopump therefore demands optimization of the inducer geometry based on performance across different off-design operating regimes. In this paper, a steady-state CFD analysis of the inducer of a liquid oxygen (LOX) axial pump, used as a booster pump for an oxygen-rich staged combustion cycle rocket engine, is presented using ANSYS® CFX. Attempts have been made to obtain the performance characteristic curves for the LOX pump inducer. The formalism has been used to predict the performance of the inducer for a throttling range varying from 80% to 113% of nominal thrust and for different rotational speeds from 4500 to 7500 rpm. The results have been analysed to determine the region of cavitation inception for different inlet pressures.
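The way such characteristic curves scale with shaft speed is often summarized by the classical pump affinity laws; the short Python sketch below applies them to an invented nominal operating point. This is generic turbomachinery scaling, not the CFD results of the paper.

```python
def affinity_scale(q_nom, h_nom, p_nom, n_nom, n_new):
    """Classical pump affinity laws for the same impeller: Q ~ N, H ~ N^2, P ~ N^3."""
    ratio = n_new / n_nom
    return q_nom * ratio, h_nom * ratio**2, p_nom * ratio**3

# Invented nominal point: flow 100 (arbitrary units), 250 m head, 900 kW at 6000 rpm.
for rpm in (4500, 6000, 7500):
    q, h, p = affinity_scale(100.0, 250.0, 900.0, 6000.0, rpm)
    print(f"{rpm} rpm: Q={q:6.1f}, H={h:7.1f} m, P={p:7.1f} kW")
```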
Flood design recipes vs. reality: can predictions for ungauged basins be trusted?
NASA Astrophysics Data System (ADS)
Efstratiadis, A.; Koussis, A. D.; Koutsoyiannis, D.; Mamassis, N.
2013-12-01
Despite the great scientific and technological advances in flood hydrology, everyday engineering practices still follow simplistic approaches, such as the rational formula and the SCS-CN method combined with the unit hydrograph theory that are easy to formally implement in ungauged areas. In general, these "recipes" have been developed many decades ago, based on field data from few experimental catchments. However, many of them have been neither updated nor validated across all hydroclimatic and geomorphological conditions. This has an obvious impact on the quality and reliability of hydrological studies, and, consequently, on the safety and cost of the related flood protection works. Preliminary results, based on historical flood data from Cyprus and Greece, indicate that a substantial revision of many aspects of flood engineering procedures is required, including the regionalization formulas as well as the modelling concepts themselves. In order to provide a consistent design framework and to ensure realistic predictions of the flood risk (a key issue of the 2007/60/EU Directive) in ungauged basins, it is necessary to rethink the current engineering practices. In this vein, the collection of reliable hydrological data would be essential for re-evaluating the existing "recipes", taking into account local peculiarities, and for updating the modelling methodologies as needed.
A decision science approach for integrating social science in climate and energy solutions
NASA Astrophysics Data System (ADS)
Wong-Parodi, Gabrielle; Krishnamurti, Tamar; Davis, Alex; Schwartz, Daniel; Fischhoff, Baruch
2016-06-01
The social and behavioural sciences are critical for informing climate- and energy-related policies. We describe a decision science approach to applying those sciences. It has three stages: formal analysis of decisions, characterizing how well-informed actors should view them; descriptive research, examining how people actually behave in such circumstances; and interventions, informed by formal analysis and descriptive research, designed to create attractive options and help decision-makers choose among them. Each stage requires collaboration with technical experts (for example, climate scientists, geologists, power systems engineers and regulatory analysts), as well as continuing engagement with decision-makers. We illustrate the approach with examples from our own research in three domains related to mitigating climate change or adapting to its effects: preparing for sea-level rise, adopting smart grid technologies in homes, and investing in energy efficiency for office buildings. The decision science approach can facilitate creating climate- and energy-related policies that are behaviourally informed, realistic and respectful of the people whom they seek to aid.
Canonical formalism for modelling and control of rigid body dynamics.
Gurfil, P
2005-12-01
This paper develops a new paradigm for stabilization of rigid-body dynamics. The state-space model is formulated using canonical elements, known as the Serret-Andoyer (SA) variables, thus far scarcely used for engineering applications. The main feature of the SA formalism is the reduction of the dynamics via the underlying symmetry stemming from conservation of angular momentum and rotational kinetic energy. The controllability of the system model is examined using the notion of accessibility, and is shown to be accessible from all points. Based on the accessibility proof, two nonlinear asymptotic feedback stabilizers are developed: a damping feedback is designed based on the Jurdjevic-Quinn method, and a Hamiltonian controller is derived by using the Hamiltonian as a natural Lyapunov function for the closed-loop dynamics. It is shown that the Hamiltonian control is both passive and inverse optimal with respect to a meaningful performance index. The performance of the new controllers is examined and compared using simulations of realistic scenarios from the satellite attitude dynamics field.
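As background for the damping-feedback result mentioned above, the generic Jurdjevic-Quinn construction for a control-affine system can be stated as follows; this is the textbook form, not the paper's specific Serret-Andoyer implementation.

```latex
% Control-affine system and Jurdjevic–Quinn damping feedback
\dot{x} = f(x) + \sum_{i=1}^{m} g_i(x)\,u_i, \qquad
u_i = -k\, L_{g_i} V(x) = -k\,\frac{\partial V}{\partial x}\, g_i(x), \quad k>0,
```

where $V$ is a Lyapunov function (for instance the Hamiltonian) conserved along the drift $f$, so that $\dot V = \sum_i u_i\,L_{g_i}V = -k\sum_i \bigl(L_{g_i}V\bigr)^2 \le 0$, i.e. the feedback only dissipates.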
A planning and scheduling lexicon
NASA Technical Reports Server (NTRS)
Cruz, Jennifer W.; Eggemeyer, William C.
1989-01-01
A lexicon related to mission planning and scheduling for spacecraft is presented. Planning and scheduling work is known as sequencing. Sequencing is a multistage process of merging requests from both the science and engineering arenas to accomplish the objectives defined in the requests. The multistage process begins with the creation of science and engineering goals, continues through their integration into the sequence, and eventually concludes with command execution onboard the spacecraft. The objective of this publication is to introduce some formalism into the field of spacecraft sequencing-system technology. This formalism will make it possible for researchers and potential customers to communicate about system requirements and capabilities in a common language.
Acclimating international graduate students to professional engineering ethics.
Newberry, Byron; Austin, Katherine; Lawson, William; Gorsuch, Greta; Darwin, Thomas
2011-03-01
This article describes the education portion of an ongoing grant-sponsored education and research project designed to help graduate students in all engineering disciplines learn about the basic ethical principles, rules, and obligations associated with engineering practice in the United States. While the curriculum developed for this project is used for both domestic and international students, the educational materials were designed to be sensitive to the specific needs of international graduate students. In recent years, engineering programs in the United States have sought to develop a larger role for professional ethics education in the curriculum. Accreditation requirements, as well as pressures from the private sector, have helped facilitate this shift in focus. Almost half of all engineering graduate students in the U.S. are international students. Further, research indicates that the majority of these students will remain in the U.S. to work post-graduation. It is therefore in the interest of the profession that these students, coming from diverse backgrounds, receive some formal exposure to the professional and ethical expectations and norms of the engineering profession in the United States to help ensure that they have the knowledge and skills--non-technical as well as technical--required in today's engineering profession. In becoming acculturated to professional norms in a host country, international students face challenges that domestic students do not encounter; such as cultural competency, language proficiency, and acculturation stress. Mitigating these challenges must be a consideration in the development of any effective education materials. The present article discusses the project rationale and describes the development of on-line instructional materials aimed at helping international engineering graduate students acclimate to professional engineering ethics standards in the United States. Finally, a brief data summary of students' perceptions of the usefulness of the content and instructional interface is provided to demonstrate the initial effectiveness of the materials and to present a case for project sustainability.
Enhancing Individual Employability: The Perspective of Engineering Graduates
ERIC Educational Resources Information Center
Nilsson, Staffan
2010-01-01
Purpose: Employability includes the ability to find employment and to remain employed. It encompasses both hard and soft skills, including formal and actual competence, interpersonal skills, and personal characteristics. This paper aims to illuminate the perceptions engineering graduates have regarding employability. More specifically,…
Design of Mechanisms for Deployable, Optical Instruments: Guidelines for Reducing Hysteresis
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Hachkowski, M. Roman
2000-01-01
This paper is intended to facilitate the development of deployable, optical instruments by providing a rational approach for the design, testing, and qualification of high-precision (i.e., low-hysteresis) deployment mechanisms for these instruments. Many of the guidelines included herein come directly from the field of optomechanical engineering, and are, therefore, neither newly developed guidelines, nor are they uniquely applicable to the design of high-precision deployment mechanisms. This paper is to be regarded as a guide to design and not a set of NASA requirements, except as may be defined in formal project specifications. Furthermore, due to the rapid pace of advancement in the field of precision deployment, this paper should be regarded as a preliminary set of guidelines. However, it is expected that this paper, with revisions as experience may indicate to be desirable, might eventually form the basis for a set of uniform design requirements for high-precision deployment mechanisms on future NASA space-based science instruments.
A Novel Latin Hypercube Algorithm via Translational Propagation
Pan, Guang; Ye, Pengcheng
2014-01-01
Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the experimental designs used. Optimal Latin hypercube designs are frequently used and have been shown to have good space-filling and projective properties. However, the high cost of constructing them limits their use. In this paper, a methodology for creating novel Latin hypercube designs via a translational propagation and successive local enumeration algorithm (TPSLE) is developed without using formal optimization. The TPSLE algorithm is based on the insight that a near-optimal Latin hypercube design can be constructed from a simple initial block with a few points, generated by the SLE algorithm, used as a building block. In effect, the TPSLE algorithm offers a balanced trade-off between efficiency and sampling performance. The proposed algorithm is compared to two existing algorithms and is found to be much more efficient in terms of computation time, with acceptable space-filling and projective properties. PMID:25276844
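For readers unfamiliar with the baseline construction, a plain (non-optimized) Latin hypercube sample can be generated in a few lines of numpy, as sketched below; TPSLE itself, with its translational propagation and local enumeration steps, is not reproduced here.

```python
import numpy as np

def latin_hypercube(n_points, n_dims, rng=None):
    """Plain Latin hypercube sample in [0, 1)^n_dims: one point per stratum per axis."""
    rng = np.random.default_rng(rng)
    # For each dimension, permute the n strata and jitter within each stratum.
    strata = np.array([rng.permutation(n_points) for _ in range(n_dims)]).T
    return (strata + rng.random((n_points, n_dims))) / n_points

sample = latin_hypercube(5, 2, rng=0)
print(np.round(sample, 3))
# Each column hits each of the 5 strata [0, 0.2), [0.2, 0.4), ... exactly once.
```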
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
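One way to fold a trained network into a conventional verification step is an explicit acceptance test against a held-out dataset, as in the hypothetical Python sketch below; the threshold, stand-in model, and data are invented, and this is only one narrow slice of the life-cycle evaluation the report describes.

```python
import numpy as np

def acceptance_test(predict, inputs, expected, min_accuracy=0.95):
    """Life-cycle-style gate: the trained model must meet a documented accuracy
    threshold on an independent test set before it is accepted for integration."""
    predictions = np.array([predict(x) for x in inputs])
    accuracy = float(np.mean(predictions == expected))
    return accuracy, accuracy >= min_accuracy

# Invented stand-in "network": a fixed threshold rule on a 1-D input.
predict = lambda x: int(x > 0.5)
inputs = np.array([0.1, 0.4, 0.6, 0.9, 0.7, 0.2])
expected = np.array([0, 0, 1, 1, 1, 0])

accuracy, passed = acceptance_test(predict, inputs, expected)
print(f"accuracy={accuracy:.2f}, passed={passed}")   # accuracy=1.00, passed=True
```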
Ontological analysis of SNOMED CT.
Héja, Gergely; Surján, György; Varga, Péter
2008-10-27
SNOMED CT is the most comprehensive medical terminology. However, its use for intelligent services based on formal reasoning is questionable. The analysis of the structure of SNOMED CT is based on the formal top-level ontology DOLCE. The analysis revealed several ontological and knowledge-engineering errors, the most important of which are errors in the hierarchy (mostly from an ontological point of view, but also regarding medical aspects) and the mixing of subsumption relations with other relation types (mostly 'part of'). The errors found impede formal reasoning. The paper presents a possible way to correct these problems.
Thermodynamics of the mesoscopic thermoelectric heat engine beyond the linear-response regime
NASA Astrophysics Data System (ADS)
Yamamoto, Kaoru; Hatano, Naomichi
2015-10-01
The mesoscopic thermoelectric heat engine is much anticipated as a device that allows us to utilize, with high efficiency, waste heat inaccessible to conventional heat engines. However, the derivation of the heat current in this engine seems to be either not general or described too briefly, and even inappropriately in some cases. In this paper, we give a clear-cut derivation of the heat current of the engine with suitable assumptions beyond the linear-response regime. It resolves the confusion in the definition of the heat current in the linear-response regime. After verifying that we can construct the same formalism as that of the cyclic engine, we find the following two interesting results within the Landauer-Büttiker formalism: the efficiency of the mesoscopic thermoelectric engine reaches the Carnot efficiency if and only if the transmission probability is finite at a specific energy and zero otherwise; the unitarity of the transmission probability guarantees the second law of thermodynamics, invalidating Benenti et al.'s argument in the linear-response regime that one could obtain a finite power with the Carnot efficiency under a broken time-reversal symmetry [Phys. Rev. Lett. 106, 230602 (2011), 10.1103/PhysRevLett.106.230602]. These results demonstrate how quantum mechanics constrains thermodynamics.
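As a reference point (a standard textbook form, not the paper's specific derivation, and with sign conventions that vary across the literature), the two-terminal Landauer-Büttiker charge and heat currents and the resulting efficiency bound can be written as:

```latex
% Standard two-terminal Landauer--Buttiker expressions (textbook form;
% the paper's assumptions and sign conventions may differ).
\begin{align}
  I   &= \frac{2e}{h} \int dE\, \mathcal{T}(E)\,\bigl[f_L(E) - f_R(E)\bigr],\\
  J_L &= \frac{2}{h} \int dE\,(E - \mu_L)\, \mathcal{T}(E)\,\bigl[f_L(E) - f_R(E)\bigr],
  \qquad f_\alpha(E) = \frac{1}{e^{(E-\mu_\alpha)/k_B T_\alpha} + 1}.
\end{align}
% With lead L hot (T_L > T_R), the extracted power P and efficiency obey
\begin{equation}
  \eta = \frac{P}{J_L} \;\le\; \eta_C = 1 - \frac{T_R}{T_L},
\end{equation}
% with equality approached only when the transmission function is nonzero
% at a single energy, consistent with the statement in the abstract.
```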
Probabilistic Methods for Structural Design and Reliability
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)
2002-01-01
This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
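To illustrate the general idea of propagating primitive-variable scatter to a component-level reliability estimate, here is a minimal Monte Carlo sketch with made-up distributions and a simple stress-versus-strength limit state; it is not the report's computational simulation method.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical primitive variables with assumed scatter (illustrative only).
yield_strength = rng.normal(900e6, 45e6, n)   # Pa
blade_stress   = rng.normal(700e6, 80e6, n)   # Pa, scatter from loads/temperature

# Limit state: failure when the propagated stress exceeds the strength.
margin = yield_strength - blade_stress
prob_failure = np.mean(margin <= 0.0)
print(f"estimated probability of failure: {prob_failure:.2e}")
print(f"reliability: {1.0 - prob_failure:.6f}")
```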
OMOGENIA: A Semantically Driven Collaborative Environment
NASA Astrophysics Data System (ADS)
Liapis, Aggelos
Ontology creation can be thought of as a social procedure. Indeed, the concepts involved generally need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and cannot be achieved by automated methods except in simple cases. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.
Petri net-based dependability modeling methodology for reconfigurable field programmable gate arrays
NASA Astrophysics Data System (ADS)
Graczyk, Rafał; Orleański, Piotr; Poźniak, Krzysztof
2015-09-01
Dependability modeling is an important issue for aerospace and space equipment designers. From a system-level perspective, one has to choose among a multitude of possible architectures, redundancy levels, and component combinations so as to meet the desired properties and dependability while fitting within required cost and time budgets. Modeling such systems becomes harder as their complexity grows, together with the demand for more functional and flexible, yet more available, systems that govern ever more crucial parts of our civilization's infrastructure (aerospace transport systems, telecommunications, exploration probes). In this article, a promising method of modeling complex systems using Petri nets is introduced in the context of qualitative and quantitative dependability analysis. Despite some limitations and drawbacks, the method offers a convenient visual, formal way of describing system behavior at different levels (functional, timing, random events), with a direct correspondence to an underlying mathematical engine well suited to simulation and engineering support.
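The sketch below is a minimal place/transition Petri net with a token-firing rule, shown as a toy reconfiguration scenario; the places, transitions, and marking are hypothetical and are not drawn from the article's models.

```python
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    marking: dict                                       # place -> token count
    transitions: dict = field(default_factory=dict)     # name -> (inputs, outputs)

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking.get(p, 0) >= w for p, w in ins.items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t!r} not enabled")
        ins, outs = self.transitions[t]
        for p, w in ins.items():
            self.marking[p] -= w
        for p, w in outs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Toy FPGA-style reconfiguration model: a detected fault consumes a spare region.
net = PetriNet(
    marking={"operational": 1, "spare_regions": 2, "fault_detected": 0},
    transitions={
        "fault":       ({"operational": 1}, {"fault_detected": 1}),
        "reconfigure": ({"fault_detected": 1, "spare_regions": 1}, {"operational": 1}),
    },
)
net.fire("fault")
net.fire("reconfigure")
print(net.marking)   # {'operational': 1, 'spare_regions': 1, 'fault_detected': 0}
```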
NASA Astrophysics Data System (ADS)
Ndu, Obibobi Kamtochukwu
To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human, cognitive ability and extends it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidates the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
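A rough sketch of the prediction step follows, assuming hypothetical 2-D pseudo-spatial coordinates (in the dissertation these would come from non-metric multidimensional scaling of expert similarity judgments) and a simple-Kriging-style Gaussian-process mean and variance; it is not the author's Bayesian inference model.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential correlation between two coordinate sets."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

# Hypothetical pseudo-spatial coordinates of precedent systems and their
# observed reliability estimates (both invented for illustration).
X = np.array([[0.0, 0.0], [1.0, 0.2], [0.3, 1.1], [1.2, 1.0]])
y = np.array([0.95, 0.90, 0.97, 0.88])

x_new = np.array([[0.6, 0.6]])            # the conceptual variant
K = rbf(X, X) + 1e-6 * np.eye(len(X))     # small nugget for numerical stability
k = rbf(x_new, X)

mean = y.mean() + k @ np.linalg.solve(K, y - y.mean())   # simple-Kriging-style mean
var = rbf(x_new, x_new) - k @ np.linalg.solve(K, k.T)    # prediction variance
print(f"predicted reliability: {mean[0]:.3f}  (variance {var[0, 0]:.3f})")
```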
NASA Astrophysics Data System (ADS)
Drake, J.; Schamel, D.; Fisher, P.; Terschak, J. A.; Stelling, P.; Almberg, L.; Phillips, E.; Forner, M.; Gregory, D.
2002-12-01
When a gummi-bear is introduced into hot potassium chlorate there is a powerful reaction. This is analogous to the response we have seen to the Alaska Summer Research Academy (ASRA). ASRA is a residential science research camp supported by the College of Science, Engineering and Mathematics at the University of Alaska Fairbanks. The hallmark of ASRA is the opportunity for small groups of 4 or fewer students, ages 10-17, to conduct scientific research and participate in engineering design projects with university faculty and researchers as mentors. Participating scientists, engineers, faculty, graduate students, and K-12 teachers from a variety of disciplines design individual research units and guide the students through designing and constructing a project, collecting data, and synthesizing results. The week-long camp culminates with the students from each project making a formal presentation to the camp and public. In its second year ASRA is already a huge success, quadrupling in size from 21 students in 2001 to 89 students in 2002. Due to a high percentage of returning students, we anticipate there will be a waiting list next year. This presentation contains perspectives from administrators, instructors, staff, and students. Based on our experience we feel there is a large potential demand for education and public outreach (EPO) in university settings. We believe the quality and depth of the ASRA experience directly contributes to the success of a worthwhile EPO program. ASRA will be portrayed as a useful model for EPO at other institutions.
Mathematical Building-Blocks in Engineering Mechanics
ERIC Educational Resources Information Center
Boyajian, David M.
2007-01-01
A gamut of mathematical subjects and concepts are taught within a handful of courses formally required of the typical engineering student who so often questions the relevancy of being bound to certain lower-division prerequisites. Basic classes at the undergraduate level, in this context, include: Integral and Differential Calculus, Differential…
Publications of the Jet Propulsion Laboratory 1976
NASA Technical Reports Server (NTRS)
1977-01-01
The formalized technical reporting, released January through December 1975, that resulted from scientific and engineering work performed, or managed, by the Jet Propulsion Laboratory is described and indexed. The following classes of publications are included: (1) technical reports; (2) technical memorandums; (3) articles from bi-monthly Deep Space Network (DSN) progress report; (4) special publications; and (5) articles published in the open literature. The publications are indexed by: (1) author, (2) subject, and (3) publication type and number. A descriptive entry appears under the name of each author of each publication; an abstract is included with the entry for the primary (first-listed) author. Unless designated otherwise, all publications listed are unclassified.
Opportunities for Space Science Education Using Current and Future Solar System Missions
NASA Astrophysics Data System (ADS)
Matiella Novak, M.; Beisser, K.; Butler, L.; Turney, D.
2010-12-01
The Education and Public Outreach (E/PO) office in The Johns Hopkins University Applied Physics Laboratory (APL) Space Department strives to excite and inspire the next generation of explorers by creating interactive education experiences. Since 1959, APL engineers and scientists have designed, built, and launched 61 spacecraft and over 150 instruments involved in space science. With the vast array of current and future Solar System exploration missions available, endless opportunities exist for education programs to incorporate the real-world science of these missions. APL currently has numerous education and outreach programs tailored for K-12 formal and informal education, higher education, and general outreach communities. Current programs focus on Solar System exploration missions such as the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM), Miniature Radio Frequency (Mini-RF) Moon explorer, the Radiation Belt Storm Probes (RBSP), New Horizons mission to Pluto, and the Thermosphere Ionosphere Mesosphere Energetics and Dynamics (TIMED) Satellite, to name a few. Education and outreach programs focusing on K-12 formal education include visits to classrooms, summer programs for middle school students, and teacher workshops. APL hosts a Girl Power event and a STEM (Science, Technology, Engineering, and Mathematics) Day each year. Education and outreach specialists hold teacher workshops throughout the year to train educators in using NASA spacecraft science in their lesson plans. High school students from around the U.S. are able to engage in NASA spacecraft science directly by participating in the Mars Exploration Student Data Teams (MESDT) and the Student Principal Investigator Programs. An effort is also made to generate excitement for future missions by focusing on what mysteries will be solved. Higher education programs are used to recruit and train the next generation of scientists and engineers. The NASA/APL Summer Internship Program offers a unique glimpse into the Space Department’s “end-to-end” approach to mission design and execution. College students - both undergraduate and graduate - are recruited from around the U.S. to work with APL scientists and engineers who act as mentors to the students. Many students are put on summer projects that allow them to work with existing spacecraft systems, while others participate in projects that investigate the operational and science objectives of future planned spacecraft systems. In many cases these interns have returned to APL as full-time staff after graduation.
AOPs and Biomarkers: Bridging High Throughput Screening ...
As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will benefit from additional data sources that connect the magnitude of perturbation from the in vitro system to a level of concern at the organism or population level. The adverse outcome pathway (AOP) concept provides an ideal framework for combining these complementary data. Recent international efforts under the auspices of the Organization for Economic Co-operation and Development (OECD) have resulted in an AOP wiki designed to house formal descriptions of AOPs suitable for use in regulatory decision making. Recent efforts have built upon this to include an ontology describing the AOP with linkages to biological pathways, physiological terminology, and taxonomic applicability domains. Incorporation of an AOP network tool developed by the U.S. Army Corps of Engineers also allows consideration of cumulative risk from chemical and non-chemical stressors. Biomarkers are an important complement to formal AOP descriptions, particularly when dealing with susceptible subpopulations or life stages in human health risk assessment. To address the issue of non-chemical stressors that may modify effects of criteria air pollutants, a novel method was used to integrate blood gene expression data with hema
Making Sense of the Arrow-Pushing Formalism among Chemistry Majors Enrolled in Organic Chemistry
ERIC Educational Resources Information Center
Ferguson, Robert; Bodner, George M.
2008-01-01
This paper reports results of a qualitative study of sixteen students enrolled in a second year organic chemistry course for chemistry and chemical engineering majors. The focus of the study was student use of the arrow-pushing formalism that plays a central role in both the teaching and practice of organic chemistry. The goal of the study was to…
Systems Engineering Leadership Development: Advancing Systems Engineering Excellence
NASA Technical Reports Server (NTRS)
Hall, Phil; Whitfield, Susan
2011-01-01
This slide presentation reviews the Systems Engineering Leadership Development Program, with particular emphasis on the work being done in the development of systems engineers at Marshall Space Flight Center. There exists a lack of individuals with systems engineering expertise, in particular those with strong leadership capabilities, to meet the needs of the Agency's exploration agenda. Therefore there is an emphasis on developing these programs to identify and train systems engineers. The presentation reviews the proposed MSFC program that includes course work and developmental assignments. The formal developmental programs at the other centers are briefly reviewed, including the Point of Contact (POC)
Derivation and application of the energy dissipation factor in the design of fishways
Towler, Brett; Mulligan, Kevin; Haro, Alexander J.
2015-01-01
Reducing turbulence and associated air entrainment is generally considered advantageous in the engineering design of fish passage facilities. The well-known energy dissipation factor, or EDF, correlates with observations of the phenomena. However, inconsistencies in EDF forms exist and the bases for volumetric energy dissipation rate criteria are often misunderstood. A comprehensive survey of EDF criteria is presented. Clarity in the application of the EDF and resolutions to these inconsistencies are provided through formal derivations; it is demonstrated that kinetic energy represents only 1/3 of the total energy input for the special case of a broad-crested weir. Specific errors in published design manuals are identified and resolved. New, fundamentally sound, design equations for culvert outlet pools and standard Denil Fishway resting pools are developed. The findings underscore the utility of EDF equations, demonstrate the transferability of volumetric energy dissipation rates, and provide a foundation for future refinement of component-, species-, and life-stage-specific EDF criteria.
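For context, the volumetric form of the energy dissipation factor commonly used in fishway and stilling-pool design is shown below; the paper's own derivations and corrections may differ in detail.

```latex
% Commonly used volumetric form of the energy dissipation factor
% (power dissipated per unit pool volume); symbols: rho = water density,
% g = gravitational acceleration, Q = discharge, Delta H = head drop
% across the pool, V = pool volume.
\begin{equation}
  \mathrm{EDF} \;=\; \frac{\rho\, g\, Q\, \Delta H}{V}
  \qquad \left[\mathrm{W\,m^{-3}}\right]
\end{equation}
```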
Formal Validation of Fault Management Design Solutions
NASA Technical Reports Server (NTRS)
Gibson, Corrina; Karban, Robert; Andolfato, Luigi; Day, John
2013-01-01
The work presented in this paper describes an approach used to develop SysML modeling patterns to express the behavior of fault protection, test the model's logic by performing fault injection simulations, and verify the fault protection system's logical design via model checking. A representative example, using a subset of the fault protection design for the Soil Moisture Active-Passive (SMAP) system, was modeled with SysML State Machines and JavaScript as Action Language. The SysML model captures interactions between relevant system components and system behavior abstractions (mode managers, error monitors, fault protection engine, and devices/switches). Development of a method to implement verifiable and lightweight executable fault protection models enables future missions to have access to larger fault test domains and verifiable design patterns. A tool-chain to transform the SysML model to jpf-Statechart compliant Java code and then verify the generated code via model checking was established. Conclusions and lessons learned from this work are also described, as well as potential avenues for further research and development.
An overlooked alliance: using human factors engineering to reduce patient harm.
Perry, Shawna J
2004-08-01
Although human factors engineering (HFE) is often considered only in relation to the design of medical devices or information systems technology, human factors issues arise in many aspects of work in health care organizations. In one scenario, the resuscitation stretcher would not pass through the ED door closest to radiology. Many clinical work spaces were never formally designed for the work currently being performed in them; instead, they were adapted from existing space originally designed for a different use. In a second scenario, infusion pump malfunction was not apparent. The patient experienced a near miss secondary to poor design; users thought that the infusion pump had been turned off when it was not. Health care can significantly benefit from the incorporation of HFE into the workplace. Introductory classes in medical and nursing schools on HFE will assist students in detecting HFE-related issues, making them less likely to suffer from them or overlook them once in clinical practice. More extensive training for patient safety and risk managers, that is, at a minimum, a certificate-level course from an HFE program, would enhance case and root cause analyses since these issues are rarely factored in. Collaboration with HFE experts and use of HFE principles may not make health care fool-proof, but it will make it less dependent on improvisation and ingenuity to protect patients from the system's vulnerabilities.
NASA's Space Launch System: Systems Engineering Approach for Affordability and Mission Success
NASA Technical Reports Server (NTRS)
Hutt, John J.; Whitehead, Josh; Hanson, John
2017-01-01
NASA is working toward the first launch of a new, unmatched capability for deep space exploration, with launch readiness planned for 2018. The initial Block 1 configuration of the Space Launch System will more than double the mass and volume to Low Earth Orbit (LEO) of any launch vehicle currently in operation - with a path to evolve to the greatest capability ever developed. The program formally began in 2011. The vehicle successfully passed Preliminary Design Review (PDR) in 2013, Key Decision Point C (KDPC) in 2014 and Critical Design Review (CDR) in October 2015 - nearly 40 years since the last CDR of a NASA human-rated rocket. Every major SLS element has completed test and flight hardware components. Flight software has completed several development cycles. RS-25 hotfire testing at NASA Stennis Space Center (SSC) has successfully demonstrated that the space shuttle-heritage engine can perform to SLS requirements and environments. The five-segment solid rocket booster design has successfully completed two full-size motor firing tests in Utah. Stage and component test facilities at Stennis and NASA Marshall Space Flight Center are nearing completion. Launch and test facilities, as well as transportation and other ground support equipment, are largely complete at NASA's Kennedy, Stennis and Marshall field centers. Work is also underway on the more powerful Block 1B variant, with successful completion of the Exploration Upper Stage (EUS) PDR in January 2017. NASA's approach is to develop this heavy lift launch vehicle with limited resources by building on existing subsystem designs and existing hardware where available. The systems engineering and integration (SE&I) of existing and new designs introduces unique challenges and opportunities. The SLS approach was designed with three objectives in mind: 1) Design the vehicle around the capability of existing systems; 2) Reduce work hours for non-hardware/software activities; 3) Increase the probability of mission success by focusing effort on more critical activities.
NASA Astrophysics Data System (ADS)
Klug Boonstra, S.
2017-12-01
With the advent and widespread adoption of virtual connectivity, it is possible for scientists, engineers, and other STEM professionals to reach every place the youth of America learn! Arizona State University's School of Earth and Space Exploration, in planned collaboration with national STEM organizations, agencies, and education partners, is proposing a bold, collaborative, national model that will better enable STEM professionals of all disciplines to meet the needs of their audiences more effectively and efficiently. STEM subject matter experts (SMEs) can bring timely and authentic, real-world examples that engage and motivate learners in the conceptual learning journey presented through formal and informal curricula while also providing a personal face and story of their STEM journey and experience. With over 6.2 million scientists and engineers, 55.6 million PreK-12 students, and 6.3 million community college students in the US, the possible reach, long-term impact, and benefits of the virtual, just-in-time interactions between SMEs, teachers, and students have the potential to provide the missing links of relevancy and real-world application that will engage learners and enhance STEM understanding at a higher, deeper level while having the capacity to do this at a national scale. Providing professional development training for the SMEs will be an essential element in helping them to understand where their STEM work is relevant and appropriate within educational learning progressions. The vision for STEM Connect will be to prepare the STEM SMEs to share their expertise in a way that will show the dynamic and iterative nature of STEM research and design, helping them to bring their STEM expertise to formal and informal learners in a strategic and meaningful way. Discussions with possible STEM Connect collaborators (e.g., national STEM member-based organizations, technology providers, federal agencies, and professional educational organizations) are underway to bring together a national design and implementation vision, start to build a collaborative team, and to look for funding mechanisms. We hope to empower this national pathway for STEM professionals to impact the way the next generation will understand and appreciate STEM's impact on our everyday lives.
Experimental validation of structural optimization methods
NASA Technical Reports Server (NTRS)
Adelman, Howard M.
1992-01-01
The topic of validating structural optimization methods by use of experimental results is addressed. The need to validate the methods as a way of effecting greater and accelerated acceptance of formal optimization methods by practicing engineering designers is described. The range of validation strategies is defined, which includes comparison of optimization results with more traditional design approaches, establishing the accuracy of the analyses used, and finally experimental validation of the optimization results. Examples of the use of experimental results to validate optimization techniques are described. The examples include experimental validation of the following: optimum design of a trussed beam; combined control-structure design of a cable-supported beam simulating an actively controlled space structure; minimum weight design of a beam with frequency constraints; minimization of the vibration response of a helicopter rotor blade; minimum weight design of a turbine blade disk; aeroelastic optimization of an aircraft vertical fin; airfoil shape optimization for drag minimization; optimization of the shape of a hole in a plate for stress minimization; optimization to minimize beam dynamic response; and structural optimization of a low-vibration helicopter rotor.
Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs
Bass, Ellen J.
2011-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
Formal Analysis of Privacy Requirements Specifications for Multi-Tier Applications
2013-07-30
Requirements Engineering Lab and co-founder of the Requirements Engineering and Law Workshop and has several publications in ACM- and IEEE-sponsored journals...Advertising that serves the online ad “Buying Razors Sucks” in this game. Zynga also produces a version of this game for the Android and iPhone mobile
NASA Technical Reports Server (NTRS)
1990-01-01
Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.
Attitudes towards Communication Skills among Engineering Students
ERIC Educational Resources Information Center
Kovac, Mirjana M.; Sirkovic, N.
2017-01-01
Good communication skills are of utmost importance in the education of engineering students. It is necessary to promote not only their education, but also to prepare them for the demanding and competitive job market. The purpose of this study was to compare the attitudes towards communication skills after formal instruction between the students of…
ERIC Educational Resources Information Center
Thompson, Julia D.; Jesiek, Brent K.
2017-01-01
This paper examines how the structural features of engineering engagement programs (EEPs) are related to the nature of their service-learning partnerships. "Structure" refers to formal and informal models, processes, and operations adopted or used to describe engagement programs, while "nature" signifies the quality of…
Expressive map design: OGC SLD/SE++ extension for expressive map styles
NASA Astrophysics Data System (ADS)
Christophe, Sidonie; Duménieu, Bertrand; Masse, Antoine; Hoarau, Charlotte; Ory, Jérémie; Brédif, Mathieu; Lecordix, François; Mellado, Nicolas; Turbet, Jérémie; Loi, Hugo; Hurtut, Thomas; Vanderhaeghe, David; Vergne, Romain; Thollot, Joëlle
2018-05-01
In the context of custom map design, handling more artistic and expressive tools has been identified as a cartographic need, in order to design stylized and expressive maps. Based on previous works on style formalization, an approach for specifying the map style has been proposed and experimented with for particular use cases. A first step deals with the analysis of inspiration sources, in order to extract 'what makes the style of the source', i.e. the salient visual characteristics to be automatically reproduced (textures, spatial arrangements, linear stylization, etc.). In a second step, in order to mimic and generate those visual characteristics, existing and innovative rendering techniques have been implemented in our GIS engine, thus extending the capabilities to generate expressive renderings. Therefore, an extension of the existing cartographic pipeline has been proposed based on the following aspects: 1- extension of the OGC SLD/SE symbolization specifications in order to provide a formalism to specify and reference expressive rendering methods; 2- separation of the specification of each rendering method from its parameterization, stored as metadata. The main contribution has been described in (Christophe et al. 2016). In this paper, we focus firstly on the extension of the cartographic pipeline (SLD++ and metadata) and secondly on map design capabilities which have been experimented with on various topographic styles: old cartographic styles (Cassini), artistic styles (watercolor, impressionism, Japanese print), hybrid topographic styles (ortho-imagery & vector data) and finally abstract and photo-realistic styles for the geovisualization of coastal areas. The genericity and interoperability of our approach are promising and have already been tested for 3D visualization.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.
1994-01-01
This paper reports the results of an exploratory study that investigated the influence of technical uncertainty on the use of information and information sources by U.S. industry-affiliated aerospace engineers and scientists in completing or solving a project, task, or problem. Data were collected through a self-administered questionnaire. Survey participants were U.S. aerospace engineers and scientists whose names appeared on the Society of Automotive Engineers (SAE) mailing list. The results support the findings of previous research and the following study assumptions. Information and information-source use differ for projects, problems, and tasks with high and low technical uncertainty. As technical uncertainty increases, information-source use changes from internal to external and from informal to formal sources. As technical uncertainty increases, so too does the use of federally funded aerospace research and development (R&D). The use of formal information sources to learn about federally funded aerospace R&D differs for projects, problems, and tasks with high and low technical uncertainty.
Transforming Aggregate Object-Oriented Formal Specifications to Code
1999-03-01
integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture... design transforms, and target software transforms. Software is critical in today’s Air Force, yet its specification, design, and development
1981-01-01
comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD...computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf... Methodologies] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes
Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.
Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter
2016-04-01
Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step consisted of a comparison of the APC with other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of the medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts for describing medical procedures using the qualitative typological analysis approach. The definition analysis and the typological analysis are well-known and effective methods in the social sciences, but they are not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definitions, which leads to implicitly available but not directly accessible information (hidden data), and a poor procedural-type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as input classes for the formalization of the APC. We were also able to identify relevant relations between the classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks and can therefore be re-used to provide a systematic representation of other procedure catalogs or classification systems, and hence contribute towards a universal alignment of such representations, if desired.
Structuring Formal Requirements Specifications for Reuse and Product Families
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.
2001-01-01
In this project we have investigated how formal specifications should be structured to allow for requirements reuse, product family engineering, and ease of requirements change. The contributions of this work include (1) a requirements specification methodology specifically targeted for critical avionics applications, (2) guidelines for how to structure state-based specifications to facilitate ease of change and reuse, and (3) examples from the avionics domain demonstrating the proposed approach.
Discrete mathematics, formal methods, the Z schema and the software life cycle
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.
Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.
2012-01-01
Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel’s zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the statespace and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
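As a toy illustration of the zero-order phenotypes named in the abstract (omission, repetition, jump, intrusion) applied to a normative action sequence: the task names and the sampling scheme below are invented for the example and do not reproduce the authors' formal task-model pattern.

```python
import random

def apply_phenotype(actions, kind, rng, intrusions=("wrong_button",)):
    """Return one erroneous variant of a normative action sequence
    using a zero-order phenotype of erroneous action."""
    acts = list(actions)
    i = rng.randrange(len(acts))
    if kind == "omission":            # skip a required action
        del acts[i]
    elif kind == "repetition":        # perform an action twice
        acts.insert(i, acts[i])
    elif kind == "jump":              # jump ahead, skipping intermediate steps
        j = rng.randrange(i, len(acts))
        acts = acts[:i] + acts[j:]
    elif kind == "intrusion":         # insert an action foreign to the task
        acts.insert(i, rng.choice(intrusions))
    return acts

normative = ["select_dose", "confirm_dose", "arm_beam", "fire_beam"]
rng = random.Random(1)
for kind in ("omission", "repetition", "jump", "intrusion"):
    print(f"{kind:10s} -> {apply_phenotype(normative, kind, rng)}")
```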
Kitto, Simon C; Grant, Rachel E; Peller, Jennifer; Moulton, Carol-Anne; Gallinger, Steven
2018-03-01
In 2007 the Cancer Care Ontario Hepatobiliary-Pancreatic (HPB) Community of Practice was formed in the wake of provincial regionalization of HPB services in Ontario, Canada. Despite being conceptualized within the literature as an educational intervention, communities of practice (CoP) are increasingly being adopted in healthcare as quality improvement initiatives. A qualitative case study approach using in-depth interviews and document analysis was employed to gain insight into the perceptions and attitudes of the HPB surgeons in the CoP. This study demonstrates how an engineered formal or idealized structure of a CoP was created in tension with the natural CoPs that HPB surgeons identified with during and after their training. This tension contributed to the inactive and/or marginal participation by some of the surgeons in the CoP. The findings of this study represent a cautionary tale for such future engineering attempts in two distinct ways: (1) a CoP in surgery cannot simply be created by regulatory agencies; rather, it needs to be supported in a way that lets it evolve naturally, and (2) when the concept of CoPs is co-opted by governing bodies, it does not necessarily capture the power and potential of situated learning. To ensure CoP sustainability and effectiveness, we suggest that both core and peripheral members need to be more directly involved at the inception of the CoP in terms of design, organization, implementation and ongoing management.
Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Axdahl, E. L.
2017-01-01
Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
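A minimal sketch of a nonintrusive sampling loop around a black-box solver follows, with a made-up surrogate standing in for the CFD code and an assumed split into one epistemic (interval-swept) and one aleatory (sampled) input; the workflow described in the abstract is considerably more elaborate.

```python
import numpy as np

def black_box_solver(mach_in, turb_constant):
    """Stand-in for the flow solver: returns a hypothetical isolator
    pressure-rise metric. In practice this call would launch the CFD code."""
    return 2.0 + 0.8 * (mach_in - 2.5) - 1.5 * (turb_constant - 0.09)

rng = np.random.default_rng(7)

# Epistemic (lack-of-knowledge) parameter: swept over an assumed interval.
# Aleatory (random) parameter: sampled from an assumed distribution.
results = []
for turb_constant in np.linspace(0.07, 0.11, 5):          # epistemic outer loop
    mach_samples = rng.normal(2.5, 0.05, 200)              # aleatory inner loop
    qoi = np.array([black_box_solver(m, turb_constant) for m in mach_samples])
    results.append((turb_constant, qoi.mean(), qoi.std()))

for turb_constant, mean, std in results:
    print(f"C_turb={turb_constant:.3f}  mean={mean:.3f}  std={std:.3f}")
```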
Concept similarity and related categories in information retrieval using formal concept analysis
NASA Astrophysics Data System (ADS)
Eklund, P.; Ducrou, J.; Dau, F.
2012-11-01
The application of formal concept analysis to the problem of information retrieval has been shown useful but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours representing more general and special concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
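A minimal formal-context example showing the derivation operators and the closure test that defines a formal concept; the toy documents and index terms are invented, and this is not the SearchSleuth implementation.

```python
# Toy formal context: objects (documents) x attributes (index terms).
context = {
    "doc1": {"lattice", "search"},
    "doc2": {"lattice", "order"},
    "doc3": {"search", "ranking"},
}

def intent(objects):
    """Attributes shared by all objects in the set (one derivation operator)."""
    sets = [context[o] for o in objects]
    return set.intersection(*sets) if sets else {a for s in context.values() for a in s}

def extent(attributes):
    """Objects possessing all the given attributes (the other derivation operator)."""
    return {o for o, attrs in context.items() if attributes <= attrs}

# A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B.
query = {"lattice"}
A = extent(query)                 # objects matching the query terms
B = intent(A)                     # closed set of terms they share
print("extent:", sorted(A))       # ['doc1', 'doc2']
print("intent:", sorted(B))       # ['lattice']
print("is a formal concept:", extent(B) == A and intent(A) == B)
```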
Quality of online information on type 2 diabetes: a cross-sectional study.
Weymann, Nina; Härter, Martin; Dirmaier, Jörg
2015-12-01
Evidence-based health information is a prerequisite for patients with type 2 diabetes to engage in self-management and to make informed medical decisions. The Internet is an important source of health information. In the present study, we systematically assessed formal quality, quality of decision support and usability of German and English language websites on type 2 diabetes. The search term 'type 2 diabetes' was entered in the two most popular search engines. Descriptive data on website quality are presented. Additionally, associations between website quality and affiliation (commercial vs. non-commercial), presence of the HON code quality seal and website traffic were explored. Forty-six websites were included. Most websites provided basic information necessary for decision-making, while only one website also provided decision support. Websites with a HON code had significantly better formal quality than websites without HON code. We found a highly significant correlation between usability and website traffic and a significant correlation between formal quality and website traffic. Most websites do not provide sufficient information to support patients in medical decision-making. Our finding that usability and website traffic are tightly associated is consistent with previous research indicating that design is the most important cue for users assessing website credibility.
DEVELOPMENT OF OPERATIONAL CONCEPTS FOR ADVANCED SMRs: THE ROLE OF COGNITIVE SYSTEMS ENGINEERING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; David Gertman
Advanced small modular reactors (AdvSMRs) will use advanced digital instrumentation and control systems, and make greater use of automation. These advances not only pose technical and operational challenges, but will inevitably have an effect on the operating and maintenance (O&M) cost of new plants. However, there is much uncertainty about the impact of AdvSMR designs on operational and human factors considerations, such as workload, situation awareness, human reliability, staffing levels, and the appropriate allocation of functions between the crew and various automated plant systems. Existing human factors and systems engineering design standards and methodologies are not current in terms of human interaction requirements for dynamic automated systems and are no longer suitable for the analysis of evolving operational concepts. New models and guidance for operational concepts for complex socio-technical systems need to adopt a state-of-the-art approach such as Cognitive Systems Engineering (CSE) that gives due consideration to the role of personnel. The approach we report on helps to identify and evaluate human challenges related to non-traditional concepts of operations. A framework defining operational strategies was developed based on the operational analysis of Argonne National Laboratory's Experimental Breeder Reactor-II (EBR-II), a small (20 MWe) sodium-cooled reactor that was successfully operated for thirty years. Insights from the systematic application of the methodology and its utility are reviewed, and arguments for the formal adoption of CSE as a value-added part of the Systems Engineering process are presented.
ERIC Educational Resources Information Center
Reynolds, Rebecca; Chiu, Ming Ming
2013-01-01
This paper explored informal (after-school) and formal (elective course in-school) learning contexts as contributors to middle-school student attitudinal changes in a guided discovery-based and blended e-learning program in which students designed web games and used social media and information resources for a full school year. Formality of the…
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
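A tiny illustration in the spirit of the category-partition method: categories and choices are enumerated into test frames and filtered by a constraint. The categories and the constraint below are hypothetical, and this is not the tool or test representation language described in the abstract.

```python
from itertools import product

# Hypothetical categories and choices for testing a file-copy command.
categories = {
    "source":      ["exists", "missing"],
    "destination": ["writable", "read_only"],
    "size":        ["empty", "small", "huge"],
}

def feasible(frame):
    # Constraint: a missing source makes all but one size choice irrelevant.
    return not (frame["source"] == "missing" and frame["size"] != "empty")

names = list(categories)
frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
test_frames = [f for f in frames if feasible(f)]

print(f"{len(frames)} raw combinations -> {len(test_frames)} feasible test frames")
for f in test_frames:
    print(f)
```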
History and future of genetically engineered food animal regulation: an open request.
Wells, Kevin D
2016-06-01
Modern biotechnology resulted from a series of incremental improvements in the understanding of DNA and the enzymes that nature evolved to manipulate it. As the potential impact of genetic engineering became apparent, scientists began the process of trying to identify the potential unintended consequences. Restrictions on recombinant DNA experimentation were at first self-imposed. Collaborative efforts between scientists and lawyers formalized an initial set of guidelines. These guidelines have been used to promulgate regulations around the world. However, the initial guidelines were only intended as a starting point and were motivated by a specific set of concerns. As new data became available, the guidelines and regulations should have been adapted to the new knowledge. Instead, other social drivers drove the development of regulations. For most species and most applications, the framework that was established has slowly allowed some products to reach the market. However, genetically engineered livestock that are intended for food have been left in a regulatory state of limbo. To date, no genetically engineered food animal is available in the marketplace. A short history and a U.S.-based genetic engineer's perspective are presented. In addition, a request to regulatory agencies is presented for consideration as regulation continues to evolve. Regulators appear to have shown preference for the slow, random progression of evolution over the efficiency of intentional design.
Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities
NASA Astrophysics Data System (ADS)
Perjanik, Nicholas Steven
As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one-time interviews of a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology-enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.
Towards the formal specification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
Work to formally specify the requirements and design of a Processor Interface Unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system, is described. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The approaches that were developed for modeling the PIU requirements and for composition of the PIU subcomponents at high levels of abstraction are described. These approaches were used to specify and verify a nontrivial subset of the PIU behavior. The PIU specification in Higher Order Logic (HOL) is documented in a companion NASA contractor report entitled 'Towards the Formal Specification of the Requirements and Design of a Processor Interface Unit - HOL Listings.' The subsequent verification approach and HOL listings are documented in the NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit' and the NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings.'
Finding Patterns of Emergence in Science and Technology
2012-09-24
formal evaluation scheduled – Case Studies, Eight Examples: Tissue Engineering, Cold Fusion, RF Metamaterials, DNA Microarrays, Genetic Algorithms, RNAi...emerging capabilities...Evidence Quality (i.e., the rubric) and deliver comprehensible evidential support for nomination • Demonstrate proof-of-concept nomination for Chinese
ERIC Educational Resources Information Center
Amara, Nabil; Landry, Rejean; Halilem, Norrin
2013-01-01
Academic consulting is a form of knowledge and technology transfer largely under-documented and under-studied that raises ethical and resources allocation issues. Based on a survey of 2,590 Canadian researchers in engineering and natural sciences, this paper explores three forms of academic consulting: (1) paid consulting; (2) unpaid consulting…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo
Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.
ERIC Educational Resources Information Center
Almeida, Joana; Fantini, Alvino E.; Simões, Ana Raquel; Costa, Nilza
2016-01-01
This paper examines how the addition of intercultural interventions carried out throughout European credit-bearing exchange programmes can enhance sojourners' development of intercultural competencies, and it explores how both formal and non-formal pedagogical interventions may be designed and implemented. Such interventions were conducted at a…
Open Source Patient-Controlled Analgesic Pump Requirements Documentation
Larson, Brian R.; Hatcliff, John; Chalin, Patrice
2014-01-01
The dynamic nature of the medical domain is driving a need for continuous innovation and improvement in techniques for developing and assuring medical devices. Unfortunately, research in academia and communication between academics, industrial engineers, and regulatory authorities is hampered by the lack of realistic non-proprietary development artifacts for medical devices. In this paper, we give an overview of a detailed requirements document for a Patient-Controlled Analgesic (PCA) pump developed under the US NSF's Food and Drug Administration (FDA) Scholar-in-Residence (SIR) program. This 60+ page document follows the methodology outlined in the US Federal Aviation Administration's (FAA) Requirements Engineering Management Handbook (REMH) and includes a domain overview, use cases, statements of safety & security requirements, and a formal top-level system architectural description. Based on previous experience with the release of a requirements document for a cardiac pacemaker that spawned a number of research and pedagogical activities, we believe that the described PCA requirements document can be an important research enabler within the formal methods and software engineering communities. PMID:24931440
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan D. Maughan
2006-11-01
Mentoring is an established strategy for learning that has its roots in antiquity. Most, if not all, successful scientists and engineers had an effective mentor at some point in their career. In the context of scientists and engineers, however, mentoring has remained undefined. Reports addressing critical concerns regarding the future of science and engineering in the U.S. mention the practice of mentoring a priori, leaving organizations without guidance in its application. Preliminary results from this study imply that formal mentoring can be effective when properly defined and operationalized. Recognizing the uniqueness of the individual in a symbiotic mentor-protégé relationship significantly influences a protégé's learning experience, which carries repercussions into their career intentions. The mentor-protégé relationship is a key factor in succession planning and in preserving and disseminating critical information and tacit knowledge essential to the development of leadership in the science and technology industry.
Increase in the Accuracy of Calculating Length of Horizontal Cable SCS in Civil Engineering
NASA Astrophysics Data System (ADS)
Semenov, A.
2017-11-01
A modification of the method for calculating horizontal cable consumption in structured cabling systems (SCS) installed at civil engineering facilities is proposed. The proposed procedure preserves the simplicity of the prototype method while improving its accuracy by 5 percent. The achieved accuracy is justified, and its consistency with the practice of real projects is demonstrated. The method is brought to the level of an engineering algorithm and formalized as a 12/70 rule.
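The paper's 12/70 rule itself is not spelled out in this abstract, but the kind of calculation such rules refine can be sketched. The snippet below is a hedged illustration of the classic average-run-length estimate for SCS horizontal cabling, which methods of this type take as their prototype; all parameter names and numbers are assumptions for illustration only.

```python
# Hedged sketch of the classic average-run-length estimate that methods of this
# kind refine; the paper's specific 12/70 rule is not reproduced here, and all
# parameter names and numbers below are illustrative assumptions.
import math

def horizontal_cable_estimate(l_min_m, l_max_m, n_outlets,
                              slack_factor=1.10, termination_m=1.0,
                              box_length_m=305.0):
    """Estimate the average run length and the number of cable boxes needed."""
    l_avg = (l_min_m + l_max_m) / 2.0 * slack_factor + termination_m  # average run
    runs_per_box = math.floor(box_length_m / l_avg)  # whole runs cut from one box
    n_boxes = math.ceil(n_outlets / runs_per_box)    # boxes to serve all outlets
    return l_avg, n_boxes

avg_run, boxes = horizontal_cable_estimate(l_min_m=8.0, l_max_m=64.0, n_outlets=240)
print(f"average run = {avg_run:.1f} m, cable boxes required: {boxes}")
```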
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1991-01-01
The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.
Ramoni, Marco F.
2010-01-01
The field of synthetic biology holds an inspiring vision for the future; it integrates computational analysis, biological data and the systems engineering paradigm in the design of new biological machines and systems. These biological machines are built from basic biomolecular components analogous to electrical devices, and the information flow among these components requires the augmentation of biological insight with the power of a formal approach to information management. Here we review the informatics challenges in synthetic biology along three dimensions: in silico, in vitro and in vivo. First, we describe state of the art of the in silico support of synthetic biology, from the specific data exchange formats, to the most popular software platforms and algorithms. Next, we cast in vitro synthetic biology in terms of information flow, and discuss genetic fidelity in DNA manipulation, development strategies of biological parts and the regulation of biomolecular networks. Finally, we explore how the engineering chassis can manipulate biological circuitries in vivo to give rise to future artificial organisms. PMID:19906839
A demonstration of motion base design alternatives for the National Advanced Driving Simulator
NASA Technical Reports Server (NTRS)
Mccauley, Michael E.; Sharkey, Thomas J.; Sinacori, John B.; Laforce, Soren; Miller, James C.; Cook, Anthony
1992-01-01
A demonstration of the capability of NASA's Vertical Motion Simulator to simulate two alternative motion base designs for the National Advanced Driving Simulator (NADS) is reported. The VMS is located at ARC. The motion base conditions used in this demonstration were as follows: (1) a large translational motion base; and (2) a motion base design with limited translational capability. The latter had translational capability representative of a typical synergistic motion platform. These alternatives were selected to test the prediction that large amplitude translational motion would result in a lower incidence or severity of simulator-induced sickness (SIS) than would a limited translational motion base. A total of 10 drivers performed two tasks, slaloms and quick-stops, using each of the motion bases. Physiological, objective, and subjective measures were collected. No reliable differences in SIS between the motion base conditions were found in this demonstration. However, in light of the cost considerations and engineering challenges associated with implementing a large translational motion base, performance of a formal study is recommended.
Formal Verification of the Runway Safety Monitor
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu; Ciardo, Gianfranco
2006-01-01
The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce runway accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems.
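As a rough illustration of the approach described above (a discretized runway zone, two aircraft, exhaustive exploration of the state space), the sketch below performs an explicit-state reachability check of a separation property. The grid, moves, and safety predicate are assumptions made for illustration; this is not the RSM protocol, its Petri net model, or the SMART tool.

```python
# Rough illustration of explicit-state exploration over a discretized runway
# zone with two aircraft; the grid, moves, and safety predicate are assumptions
# for illustration and not the RSM protocol, its Petri net model, or SMART.
from collections import deque

GRID = 5                 # runway zone discretized into a 1-D grid of cells
MOVES = (-1, 0, +1)      # per step, each aircraft may back up, hold, or advance

def successors(state):
    a, b = state
    for da in MOVES:
        for db in MOVES:
            na, nb = a + da, b + db
            if 0 <= na < GRID and 0 <= nb < GRID:
                yield (na, nb)

def check_separation(initial=(0, 4)):
    """Breadth-first search of the reachable states; collect safety violations."""
    seen, frontier, violations = {initial}, deque([initial]), []
    while frontier:
        state = frontier.popleft()
        if state[0] == state[1]:             # safety predicate: never share a cell
            violations.append(state)
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return len(seen), violations

n_states, bad = check_separation()
print(f"explored {n_states} states; example violating states: {bad[:3]}")
```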
Reconfigurable Very Long Instruction Word (VLIW) Processor
NASA Technical Reports Server (NTRS)
Velev, Miroslav N.
2015-01-01
Future NASA missions will depend on radiation-hardened, power-efficient processing systems-on-a-chip (SOCs) that consist of a range of processor cores custom tailored for space applications. Aries Design Automation, LLC, has developed a processing SOC that is optimized for software-defined radio (SDR) uses. The innovation implements the Institute of Electrical and Electronics Engineers (IEEE) RazorII voltage management technique, a microarchitectural mechanism that allows processor cores to self-monitor, self-analyze, and self-heal after timing errors, regardless of their cause (e.g., radiation; chip aging; variations in the voltage, frequency, temperature, or manufacturing process). This highly automated SOC can also execute legacy binary code for the PowerPC 750 instruction set architecture (ISA), which is used in the flight-control computers of many previous NASA space missions. In developing this innovation, Aries Design Automation has made significant contributions to the fields of formal verification of complex pipelined microprocessors and Boolean satisfiability (SAT) and has developed highly efficient electronic design automation tools that hold promise for future developments.
Summary of Results from the Risk Management Program for the Mars Microrover Flight Experiment
NASA Technical Reports Server (NTRS)
Shishko, Robert; Matijevic, Jacob R.
2000-01-01
On 4 July 1997, the Mars Pathfinder landed on the surface of Mars carrying the first planetary rover, known as the Sojourner. Formally known as the Microrover Flight Experiment (MFEX), the Sojourner was a low-cost, high-risk technology demonstration in which new risk management techniques were tried. This paper summarizes the activities and results of the effort to conduct a low-cost, yet meaningful risk management program for the MFEX. The specific activities focused on cost, performance, schedule, and operations risks. Just as the systems engineering process was iterative and produced successive refinements of requirements, designs, etc., so was the risk management process. Qualitative risk assessments were performed first to gain some insights for refining the microrover design and operations concept. These then evolved into more quantitative analyses. Risk management lessons from the manager's perspective are presented for other low-cost, high-risk space missions.
Human Factors Checklist: Think Human Factors - Focus on the People
NASA Technical Reports Server (NTRS)
Miller, Darcy; Stelges, Katrine; Barth, Timothy; Stambolian, Damon; Henderson, Gena; Dischinger, Charles; Kanki, Barbara; Kramer, Ian
2016-01-01
A quick-look Human Factors (HF) Checklist condenses industry and NASA Agency standards consisting of thousands of requirements into 14 main categories. With support from contractor HF and Safety Practitioners, NASA developed a means to share key HF messages with Design, Engineering, Safety, Project Management, and others. It is often difficult to complete timely assessments due to the large volume of HF information. The HF Checklist evolved over time into a simple way to consider the most important concepts. A wide audience can apply the checklist early in design or through planning phases, even before hardware or processes are finalized or implemented. The checklist is a good place to start to supplement formal HF evaluation. The HF Checklist was based on many Space Shuttle processing experiences and lessons learned. It is now being applied to ground processing of new space vehicles and adjusted for new facilities and systems.
1988-03-01
Keywords: mechanism; computer security. …denial of service. This paper assumes that the reader is a computer science or engineering professional working in the area of formal specification and… Recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy.
A preliminary design for the GMT-Consortium Large Earth Finder (G-CLEF)
NASA Astrophysics Data System (ADS)
Szentgyorgyi, Andrew; Barnes, Stuart; Bean, Jacob; Bigelow, Bruce; Bouchez, Antonin; Chun, Moo-Young; Crane, Jeffrey D.; Epps, Harland; Evans, Ian; Evans, Janet; Frebel, Anna; Furesz, Gabor; Glenday, Alex; Guzman, Dani; Hare, Tyson; Jang, Bi-Ho; Jang, Jeong-Gyun; Jeong, Ueejong; Jordan, Andres; Kim, Kang-Min; Kim, Jihun; Li, Chih-Hao; Lopez-Morales, Mercedes; McCracken, Kenneth; McLeod, Brian; Mueller, Mark; Nah, Jakyung; Norton, Timothy; Oh, Heeyoung; Oh, Jae Sok; Ordway, Mark; Park, Byeong-Gon; Park, Chan; Park, Sung-Joon; Phillips, David; Plummer, David; Podgorski, William; Rodler, Florian; Seifahrt, Andreas; Tak, Kyung-Mo; Uomoto, Alan; Van Dam, Marcos A.; Walsworth, Ronald; Yu, Young Sam; Yuk, In-Soo
2014-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) is an optical-band echelle spectrograph that has been selected as the first light instrument for the Giant Magellan Telescope (GMT). G-CLEF is a general-purpose, high dispersion spectrograph that is fiber fed and capable of extremely precise radial velocity measurements. The G-CLEF Concept Design (CoD) was selected in Spring 2013. Since then, G-CLEF has undergone science requirements and instrument requirements reviews and will be the subject of a preliminary design review (PDR) in March 2015. Since CoD review (CoDR), the overall G-CLEF design has evolved significantly as we have optimized the constituent designs of the major subsystems, i.e. the fiber system, the telescope interface, the calibration system and the spectrograph itself. These modifications have been made to enhance G-CLEF's capability to address frontier science problems, as well as to respond to the evolution of the GMT itself and developments in the technical landscape. G-CLEF has been designed by applying rigorous systems engineering methodology to flow Level 1 Scientific Objectives to Level 2 Observational Requirements and thence to Level 3 and Level 4. The rigorous systems approach applied to G-CLEF establishes a well defined science requirements framework for the engineering design. By adopting this formalism, we may flexibly update and analyze the capability of G-CLEF to respond to new scientific discoveries as we move toward first light. G-CLEF will exploit numerous technological advances and features of the GMT itself to deliver an efficient, high performance instrument, e.g. exploiting the adaptive optics secondary system to increase both throughput and radial velocity measurement precision.
Separating essentials from incidentals: an execution architecture for real-time control systems
NASA Technical Reports Server (NTRS)
Dvorak, Daniel; Reinholtz, Kirk
2004-01-01
This paper describes an execution architecture that makes real-time control systems far more analyzable and verifiable by aggressive separation of concerns. The architecture separates two key software concerns: transformations of global state, as defined in pure functions; and sequencing/timing of transformations, as performed by an engine that enforces four prime invariants. The important advantage of this architecture, besides facilitating verification, is that it encourages formal specification of systems in a vocabulary that brings systems engineering closer to software engineering.
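A minimal sketch of this separation of concerns, assuming invented state variables and invariants: transformations of global state are pure functions, while a small engine owns sequencing and enforces invariants before committing a step. It illustrates the idea only, not the JPL architecture or its four prime invariants.

```python
# Toy sketch of the separation described above: transformations of global state
# are pure functions; an engine owns sequencing/timing and enforces invariants.
# Function names and invariants are illustrative assumptions, not JPL's design.

def increase_heater_power(state):          # pure: new state from old, no I/O
    return {**state, "heater_w": state["heater_w"] + 5}

def radiate(state):                        # pure: temperature relaxes toward ambient
    return {**state, "temp_c": state["temp_c"] - 0.1 * (state["temp_c"] - 20.0)}

class Engine:
    """Sequencing engine: applies transformations in order, enforcing invariants."""
    def __init__(self, state, invariants):
        self.state, self.invariants = state, invariants

    def step(self, transforms):
        candidate = self.state
        for t in transforms:
            candidate = t(candidate)       # transformations never touch the engine
        for name, inv in self.invariants.items():
            if not inv(candidate):
                raise RuntimeError(f"invariant violated: {name}")
        self.state = candidate             # commit only if all invariants hold

engine = Engine({"heater_w": 0, "temp_c": 25.0},
                {"power bounded": lambda s: 0 <= s["heater_w"] <= 50})
engine.step([increase_heater_power, radiate])
print(engine.state)
```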
Konstantinidis, Evdokimos I; Billis, Antonis S; Mouzakidis, Christos A; Zilidou, Vasiliki I; Antoniou, Panagiotis E; Bamidis, Panagiotis D
2016-01-01
Many platforms have emerged in response to the call for technology supporting active and healthy aging. Key requirements for any such e-health system, and for any subsequent business exploitation, are tailor-made design and proper evaluation. This paper presents the design, implementation, wide deployment, and evaluation of the low-cost physical exercise and gaming (exergaming) FitForAll (FFA) platform; system usability, user adherence to exercise, and efficacy are explored. The design of FFA is tailored to elderly populations, distilling literature guidelines and recommendations. The FFA architecture introduces standard physical exercise protocols into exergaming software engineering, as well as standard physical assessment tests for augmented adaptability through adjustable exercise intensity. This opens the way to next-generation exergaming software, which may be more automatically and smartly adaptive. 116 elderly users piloted FFA five times per week during an eight-week controlled intervention. Usability evaluation was formally conducted (SUS, SUMI questionnaires). The control group consisted of a size-matched elderly group following cognitive training. Efficacy was assessed objectively through the senior fitness (Fullerton) test, and subjectively through pre-/post-intervention WHOQoL-BREF comparisons between groups. Adherence to schedule was measured by attendance logs. The global SUMI score was 68.33±5.85%, while SUS was 77.7. Good usability perception is reflected in a relatively high adherence of 82% for a daily two-month pilot schedule. Compared to the control group, the elderly using FFA significantly improved strength, flexibility, endurance, and balance, while presenting a significant trend toward quality-of-life improvements. This is the first elderly-focused exergaming platform intensively evaluated with more than 100 participants. The use of formal tools makes the findings comparable to other studies and forms an elderly exergaming corpus.
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML, and the approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
How External Institutions Penetrate Schools through Formal and Informal Leaders
ERIC Educational Resources Information Center
Sun, Min; Frank, Kenneth A.; Penuel, William R.; Kim, Chong Min
2013-01-01
Purposes: This study investigates the role of formal and informal leaders in the diffusion of external reforms into schools and to teachers' practices. Formal leaders are designated by their roles in the formal organization of the school (e.g., principals, department chairs, and instructional coaches) and informal leaders refer to those who do not…
Educating Tomorrow's Aerospace Engineers by Developing and Launching Liquid-Propelled Rockets
NASA Astrophysics Data System (ADS)
Besnard, Eric; Garvey, John; Holleman, Tom; Mueller, Tom
2002-01-01
This paper describes the CALVEIN program conducted at California State University, Long Beach (CSULB), in which engineering students develop and launch liquid-propelled rockets. The program is articulated around two main activities, each with specific objectives. The first component of CALVEIN is a systems integration laboratory where students develop/improve vehicle subsystems and integrate them into a vehicle (Prospector-2 - P-2 - for the 2001-02 academic year - AY). This component has three main objectives: (1) develop hands-on skills for incoming students and expose them to aerospace hardware; (2) allow upper-division students who have been involved in the program to mentor incoming students and manage small teams; and (3) provide students from various disciplines within the College of Engineering - and other universities - with the chance to develop/improve subsystems on the vehicle. Among recent student projects conducted as part of this component are: a new 1000 lbf thrust engine using pintle injector technology, which was successfully tested on Dec. 1, 2001 and flown on Prospector-2 in Feb. 2002 (developed by CSULB Mechanical and Aerospace Engineering students); a digital flight telemetry package (developed by CSULB Electrical Engineering students); a new recovery system where a mechanical system replaces pyrotechnics for parachute release (developed by CSULB Mechanical and Aerospace Engineering students); and a 1-ft payload bay to accommodate experimental payloads (e.g. "CANSATS" developed by Stanford University students). The second component of CALVEIN is a formal Aerospace System Design curriculum. In the first semester, from top-level system requirements, the students perform functional analysis, define the various subsystems and derive their requirements. These are presented at the Systems Functional and Requirement Reviews (SFR & SRR). The methods used for validation and verification are determined. Specifications and Interface Control Documents (ICD) are generated by the student team(s). Trade studies are identified and conducted, leading to a Preliminary Design Review (PDR) at the end of the first semester. A detailed design follows, culminating in a Critical Design Review (CDR), etc. A general process suitable for a two-semester course sequence will be outlined. The project is conducted in an Integrated Product Team (IPT) environment, which includes a project manager, a systems engineer, and the various disciplines needed for the project (propulsion, aerodynamics, structures and materials, mass, CAD, thermal, fluids, etc.). Each student works with a faculty member or industry advisor who is a specialist in his/her area. This design curriculum enhances the education of the graduates and provides future employers with engineers cognizant of and experienced in the application of Systems Engineering to a full-scale project over the entire product development cycle. For AY 2001-02, the curriculum is being applied to the development of a gimbaled aerospike engine and its integration into P-3, scheduled to fly in May 2002. The paper ends with a summary of "lessons learned" from this experience. Budget issues are also addressed to demonstrate the ability to replicate such projects at other institutions with minimal costs, provided that advantage is taken of possible synergies between existing programs, in-house resources, and cooperation with other institutions or organizations.
ERIC Educational Resources Information Center
Calvo, Gilbert
Various educators from Latin and Central America and the Caribbean met to design and produce materials for teaching family life, human sexuality, community life, and environmental studies. They concluded that the materials should meet community standards; help prepare for future change; develop working models for designing effective teaching…
General purpose optimization software for engineering design
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.
1990-01-01
The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined, and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.
How to get students to love (or not hate) MATLAB and programming
NASA Astrophysics Data System (ADS)
Reckinger, Shanon; Reckinger, Scott
2014-11-01
An effective programming course geared toward engineering students requires the utilization of modern teaching philosophies. A newly designed course that focuses on programming in MATLAB involves flipping the classroom and integrating various active teaching techniques. Vital aspects of the new course design include: lengthening in-class contact hours, Process-Oriented Guided Inquiry Learning (POGIL) method worksheets (self-guided instruction), student-created video content posted on YouTube, clicker questions (used in class to practice reading and debugging code), programming exams that don't require computers, integrating oral exams into the classroom, fostering an environment for formal and informal peer learning, and designing in a broader theme to tie together assignments. However, possibly the most important piece of this programming course puzzle is that the instructor needs to be able to find programming mistakes very quickly and then lead individuals and groups through the steps to find their mistakes themselves. The effectiveness of the new course design is demonstrated through pre- and post-concept exam results and student evaluation feedback. Students reported that the course was challenging and required a lot of effort, but left largely positive feedback.
ERIC Educational Resources Information Center
Johnson, Christopher W.
1996-01-01
The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…
DRS: Derivational Reasoning System
NASA Technical Reports Server (NTRS)
Bose, Bhaskar
1995-01-01
The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.
KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process
NASA Technical Reports Server (NTRS)
Gettig, Gary A.
1988-01-01
Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.
A Formalized Design Process for Bacterial Consortia That Perform Logic Computing
Sun, Rui; Xi, Jingyi; Wen, Dingqiao; Feng, Jingchen; Chen, Yiwei; Qin, Xiao; Ma, Yanrong; Luo, Wenhan; Deng, Linna; Lin, Hanchi; Yu, Ruofan; Ouyang, Qi
2013-01-01
The concept of microbial consortia is of great attractiveness in synthetic biology. Despite all its benefits, however, problems remain for large-scale multicellular gene circuits, for example, how to reliably design and distribute the circuits in microbial consortia with a limited number of well-behaved genetic modules and wiring quorum-sensing molecules. To address this problem, here we propose a formalized design process: (i) determine the basic logic units (AND, OR and NOT gates) based on mathematical and biological considerations; (ii) establish rules to search for and distribute the simplest logic design; (iii) assemble the assigned basic logic units in each logic operating cell; and (iv) fine-tune the circuiting interface between logic operators. We analyzed in silico gene circuits with inputs ranging from two to four, comparing our method with pre-existing ones. Results showed that this formalized design process is more feasible in terms of the number of cells required. Furthermore, as a proof of principle, an Escherichia coli consortium that performs the XOR function, a typical complex computing operation, was designed. The construction and characterization of logic operators is independent of “wiring” and provides predictive information for fine-tuning. This formalized design process provides guidance for the design of microbial consortia that perform distributed biological computation. PMID:23468999
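A minimal sketch of the decomposition idea, assuming a purely Boolean view: XOR is built only from AND, OR, and NOT units assigned to separate "cells" and wired by two signals standing in for quorum-sensing molecules. This illustrates steps (ii)-(iii) of the process only, not the actual E. coli implementation.

```python
# Hedged sketch of the decomposition step: XOR(A, B) built only from AND/OR/NOT
# units assigned to two "logic operating cells" wired by diffusible signals.
# This is a Boolean simulation of the distribution idea, not the E. coli design.

def cell_1(A, B):
    """Cell 1 computes A AND (NOT B) and secretes signal s1."""
    return A and (not B)

def cell_2(A, B):
    """Cell 2 computes (NOT A) AND B and secretes signal s2."""
    return (not A) and B

def reporter_cell(s1, s2):
    """Reporter cell ORs the two quorum-sensing signals into the output."""
    return s1 or s2

def consortium_xor(A, B):
    return reporter_cell(cell_1(A, B), cell_2(A, B))

for A in (False, True):
    for B in (False, True):
        print(f"A={A:d} B={B:d} -> XOR={consortium_xor(A, B):d}")
```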
Forum on Workforce Development
NASA Technical Reports Server (NTRS)
Hoffman, Edward
2010-01-01
APPEL Mission: To support NASA's mission by promoting individual, team, and organizational excellence in program/project management and engineering through the application of learning strategies, methods, models, and tools. Goals: a) Provide a common frame of reference for NASA's technical workforce. b) Provide and enhance critical job skills. c) Support engineering, program and project teams. d) Promote organizational learning across the agency. e) Supplement formal educational programs.
On introduction of artificial intelligence elements to heat power engineering
NASA Astrophysics Data System (ADS)
Dregalin, A. F.; Nazyrova, R. R.
1993-10-01
The basic problems of the 'thermodynamic intelligence' of personal computers are outlined, and the concept is applied to the heat processes occurring in the engines of flying vehicles. In particular, this thermodynamic intelligence is determined by the ability to derive formal relationships between thermodynamic functions, building on the concept of a characteristic function introduced in chemical thermodynamics.
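A standard example of the kind of formal relationship between thermodynamic functions referred to above: taking the Gibbs energy G(T, p) as the characteristic function, the remaining state functions follow by differentiation (textbook notation, not taken from the paper).

```latex
% Standard characteristic-function relationships (textbook notation): with the
% Gibbs energy G(T,p) as the characteristic function, the remaining state
% functions follow by differentiation.
\begin{aligned}
S &= -\left(\frac{\partial G}{\partial T}\right)_{p}, &
V &= \left(\frac{\partial G}{\partial p}\right)_{T}, \\
H &= G + TS = G - T\left(\frac{\partial G}{\partial T}\right)_{p}, &
U &= H - pV .
\end{aligned}
```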
2015-11-01
Various modeling frameworks, such as I/O Automata, Kahn Process Networks, Petri nets, and multi-dimensional SDF, are also used for design; the source compares them by notation (graphical or textual) and formality, noting for example that Petri nets (graphical, formal) are used for modeling distributed systems and that dataflow models are ideally suited to modeling DSP applications.
Reduced-Order Blade Mistuning Analysis Techniques Developed for the Robust Design of Engine Rotors
NASA Technical Reports Server (NTRS)
Min, James B.
2004-01-01
The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using eigenfrequency curve veerings to identify "danger zones" in the operating conditions--ranges of rotational speeds and engine orders in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued. Several methods will be investigated, including the use of intentional mistuning patterns to mitigate the harmful effects of random mistuning, and the modification of disk stiffness to avoid reaching critical values of interblade coupling in the desired operating range. Recent research progress is summarized in the following paragraphs. First, significant progress was made in the development of the component mode mistuning (CMM) and static mode compensation (SMC) methods for reduced-order modeling of mistuned bladed disks (see the following figure). The CMM method has been formalized and extended to allow a general treatment of mistuning. In addition, CMM allows individual mode mistuning, which accounts for the realistic effects of local variations in blade properties that lead to different mistuning values for different mode types (e.g., mistuning of the first torsion mode versus the second flexural mode). The accuracy and efficiency of the CMM method and the corresponding Turbo-Reduce code were validated for an example finite element model of a bladed disk.
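The amplification effect that motivates this work can be illustrated with a textbook-style lumped-parameter model: a cyclic chain of blade oscillators with small random stiffness mistuning under engine-order forcing. The sketch below (all parameter values assumed for illustration) compares the peak blade amplitude of a mistuned rotor with its tuned counterpart; it is not the CMM/SMC reduced-order model or the Turbo-Reduce code.

```python
# Minimal lumped-parameter sketch of mistuning amplification: N blades modeled
# as unit masses with nearest-neighbor coupling, small random stiffness
# perturbations, and engine-order traveling-wave forcing. This is a textbook
# toy model, not the CMM/SMC reduced-order method or the Turbo-Reduce code.
import numpy as np

N, k, kc, c, E = 24, 1.0, 0.05, 0.005, 3        # blades, stiffnesses, damping, engine order
rng = np.random.default_rng(0)
delta = 0.02 * rng.standard_normal(N)           # ~2% random blade-to-blade mistuning

def peak_response(mistuning, omegas):
    n = np.arange(N)
    f = np.exp(1j * 2 * np.pi * E * n / N)      # engine-order excitation pattern
    K = np.diag(k * (1 + mistuning) + 2 * kc)   # blade + coupling stiffness
    K -= kc * (np.eye(N, k=1) + np.eye(N, k=-1))
    K[0, -1] -= kc                              # close the cyclic chain
    K[-1, 0] -= kc
    amps = []
    for w in omegas:
        x = np.linalg.solve(K - w**2 * np.eye(N) + 1j * w * c * np.eye(N), f)
        amps.append(np.max(np.abs(x)))
    return max(amps)                            # worst blade amplitude over the sweep

omegas = np.linspace(0.8, 1.2, 400)
amp_factor = peak_response(delta, omegas) / peak_response(np.zeros(N), omegas)
print(f"mistuned/tuned peak blade amplitude ratio = {amp_factor:.2f}")
```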
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems
1994-07-29
The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.
An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)
NASA Astrophysics Data System (ADS)
van den Heever, Lize; Marais, Neilen; Slabber, Martin
2016-08-01
This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal Qualification Testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope, currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and it automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are extremely happy with the AQF results, but even more so with the approach and process it enforces.
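A toy sketch of the traceability idea behind such a framework, assuming an invented decorator, requirement IDs, and report layout (this is not the MeerKAT CAM AQF implementation): each test is annotated with the requirement and verification requirement it covers, and a report is assembled from those annotations plus pass/fail outcomes.

```python
# Toy sketch of the traceability idea behind the AQF: tests are annotated with
# the requirement IDs they cover, and a report is assembled from those
# annotations plus pass/fail results. The decorator, IDs, and report layout are
# invented for illustration; this is not the MeerKAT CAM AQF implementation.
REGISTRY = []

def covers(requirement, verification):
    def wrap(test_fn):
        REGISTRY.append({"test": test_fn, "req": requirement, "ver": verification})
        return test_fn
    return wrap

@covers("CAM-R-0101", "CAM-VR-0101")
def test_subarray_configuration():
    assert 2 + 2 == 4          # stand-in for a real integrated CAM test step

@covers("CAM-R-0205", "CAM-VR-0205")
def test_alarm_escalation():
    assert "critical".upper() == "CRITICAAL"   # deliberately failing step

def qualification_report():
    for entry in REGISTRY:
        try:
            entry["test"]()
            outcome = "PASS"
        except AssertionError:
            outcome = "FAIL"
        print(f"{entry['req']:>12} / {entry['ver']:<12} "
              f"{entry['test'].__name__:<30} {outcome}")

qualification_report()
```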
Wireless Sensors Grouping Proofs for Medical Care and Ambient Assisted-Living Deployment
Trček, Denis
2016-01-01
Internet of Things (IoT) devices are rapidly penetrating e-health and assisted living domains, and an increasing proportion of them are computationally weak devices, for which security and privacy provisioning alone are demanding tasks, not to mention grouping proofs. This paper, therefore, gives an extensive analysis of such proofs and states lessons learnt to avoid possible pitfalls in future designs. It sticks with prudent engineering techniques in this field and deploys in a novel way the so-called non-deterministic principle to provide not only grouping proofs, but (among other things) also privacy. The developed solution is analyzed by means of a tangible metric, shown to be lightweight, and formally analyzed for security. PMID:26729131
Calendar years 1989 and 1990 monitoring well installation program Y-12 plant, Oak Ridge, Tennessee
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-10-01
This report documents the well-construction activities at the Y-12 Plant in Oak Ridge, Tennessee during 1989 and 1990. The well-construction program consisted of installing seventy-five monitoring wells. Geologists from ERCE (formerly the Engineering, Design and Geosciences Group) and Martin Marietta Energy Systems (Energy Systems) supervised and documented well-construction activities and monitored for health and safety concerns. Sixty-seven monitoring wells were installed under the supervision of an ERCE geologist from March 1989 to September 1990. Beginning in September 1990, Energy Systems supervised drilling activities for eight monitoring wells, the last of which was completed in December 1990. 9 refs., 3 figs., 2 tabs.
NASA Technical Reports Server (NTRS)
Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.
1991-01-01
The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.
Formal design specification of a Processor Interface Unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1992-01-01
This report describes work to formally specify the requirements and design of a processor interface unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The need for high-quality design assurance in such applications is an undisputed fact, given the disastrous consequences that even a single design flaw can produce. Thus, the further development and application of formal methods to fault-tolerant systems is of critical importance as these systems see increasing use in modern society.
NASA Technical Reports Server (NTRS)
Richstein, Alan B.; Nolte, Jerome T.; Pfarr, Barbara B.
2004-01-01
There are numerous technical reviews that occur throughout the systems engineering process life cycle. Many are well known by project managers and stakeholders such as developers and end users; an example of which is the critical design review (CDR). This major milestone for a large, complex new project may last two or more days, include an extensive agenda of topics, and entail hundreds of hours of developer time to prepare presentation materials and associated documents. Additionally, the weeks of schedule spent on review preparation come at least partly at the expense of other work. This paper suggests an approach for tailoring technical reviews based on the project characteristics and the project manager's identification of the key stakeholders and understanding of their most important issues and considerations. With this insight the project manager can communicate to, manage the expectations of, and establish formal agreement with the stakeholders as to which reviews, and at what depth, are most appropriate to achieve project success. The authors, coming from diverse organizations and backgrounds, have drawn on their personal experiences and summarized the best practices of their own organizations to create a common framework that provides other systems engineers with guidance on adapting design reviews.
The Impact of Ada and Object-Oriented Design in NASA Goddard's Flight Dynamics Division
NASA Technical Reports Server (NTRS)
Waligora, Sharon; Bailey, John; Stark, Mike
1996-01-01
This paper presents the highlights and key findings of 10 years of use and study of Ada and object-oriented design in NASA Goddard's Flight Dynamics Division (FDD). In 1985, the Software Engineering Laboratory (SEL) began investigating how the Ada language might apply to FDD software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, the FDD would fully transition its entire development organization from FORTRAN to Ada within 10 years. However, 10 years later, the FDD still produced 80 percent of its software in FORTRAN and had begun using C and C++, despite positive results on Ada projects. This paper presents the final results of a SEL study to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. The detailed results of this study were published in a formal study report in March of 1995. This paper supersedes the preliminary results of this study that were presented at the Eighteenth Annual Software Engineering Workshop in 1993.
Re-typograph phase I: a proof-of-concept for typeface parameter extraction from historical documents
NASA Astrophysics Data System (ADS)
Lamiroy, Bart; Bouville, Thomas; Blégean, Julien; Cao, Hongliu; Ghamizi, Salah; Houpin, Romain; Lloyd, Matthias
2015-01-01
This paper reports on the first phase of an attempt to create a full retro-engineering pipeline that aims to construct a complete set of coherent typographic parameters defining the typefaces used in a printed homogeneous text. It should be stressed that this process cannot reasonably be expected to be fully automatic and that it is designed to include human interaction. Although font design is governed by a set of quite robust and formal geometric rulesets, it still heavily relies on subjective human interpretation. Furthermore, different parameters applied to the generic rulesets may actually result in quite similar and visually difficult to distinguish typefaces, making the retro-engineering an inverse problem that is ill-conditioned once shape distortions (related to the printing and/or scanning process) come into play. This work is the first phase of a long iterative process, in which we will progressively study and assess the techniques from the state of the art that are most suited to our problem and investigate new directions when they prove to be not quite adequate. As a first step, this is more of a feasibility proof-of-concept that will allow us to clearly pinpoint the items that will require more in-depth research over the next iterations.
Aircraft applications of fault detection and isolation techniques
NASA Astrophysics Data System (ADS)
Marcos Esteban, Andres
In this thesis the problems of fault detection & isolation and fault-tolerant systems are studied from the perspective of LTI frequency-domain, model-based techniques. Emphasis is placed on the applicability of these LTI techniques to nonlinear models, especially to aerospace systems. Two applications of H-infinity LTI fault diagnosis are given using an open-loop (no controller) design approach: one for the longitudinal motion of a Boeing 747-100/200 aircraft, the other for a turbofan jet engine. An algorithm formalizing a robust identification approach based on model validation ideas is also given and applied to the previous jet engine. A general linear fractional transformation formulation is given in terms of the Youla and Dual Youla parameterizations for the integrated (control and diagnosis filter) approach. This formulation provides better insight into the trade-off between the control and the diagnosis objectives. It also provides the basic groundwork towards the development of nested schemes for the integrated approach. These nested structures allow iterative improvements on the control/filter Youla parameters based on successive identification of the system uncertainty (as given by the Dual Youla parameter). The thesis concludes with an application of H-infinity LTI techniques to the integrated design for the longitudinal motion of the previous Boeing 747-100/200 model.
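A minimal observer-based residual generator, in generic notation, is one common starting point for model-based fault detection of the kind described above; the filters designed in the thesis shape such a residual against disturbances and model uncertainty rather than using this bare form.

```latex
% Minimal observer-based residual generator, in generic notation, often used as
% the starting point for model-based fault detection; not the thesis' actual
% H-infinity filter design.
\begin{aligned}
\hat{x}_{k+1} &= A\hat{x}_k + Bu_k + L\,(y_k - C\hat{x}_k), \\
r_k &= y_k - C\hat{x}_k ,
\end{aligned}
\qquad
\text{fault declared when } \lVert r_k \rVert > \epsilon .
```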
On the Safety of Machine Learning: Cyber-Physical Systems, Decision Sciences, and Data Products.
Varshney, Kush R; Alemzadeh, Homa
2017-09-01
Machine learning algorithms increasingly influence our decisions and interact with us in all parts of our daily lives. Therefore, just as we consider the safety of power plants, highways, and a variety of other engineered socio-technical systems, we must also take into account the safety of systems involving machine learning. Heretofore, the definition of safety has not been formalized in a machine learning context. In this article, we do so by defining machine learning safety in terms of risk, epistemic uncertainty, and the harm incurred by unwanted outcomes. We then use this definition to examine safety in all sorts of applications in cyber-physical systems, decision sciences, and data products. We find that the foundational principle of modern statistical machine learning, empirical risk minimization, is not always a sufficient objective. We discuss how four different categories of strategies for achieving safety in engineering, including inherently safe design, safety reserves, safe fail, and procedural safeguards can be mapped to a machine learning context. We then discuss example techniques that can be adopted in each category, such as considering interpretability and causality of predictive models, objective functions beyond expected prediction accuracy, human involvement for labeling difficult or rare examples, and user experience design of software and open data.
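A toy numerical contrast, with invented data and thresholds, between plain empirical risk minimization (mean loss) and a tail-weighted objective in the spirit of the article's "safety reserves" category; the CVaR-style tail average used here is one common choice, not the article's prescription.

```python
# Toy contrast, with invented data, between plain empirical risk minimization
# (mean loss) and a tail-weighted objective in the spirit of a "safety reserve"
# (mean of the worst alpha fraction of losses, a CVaR-like quantity).
import numpy as np

rng = np.random.default_rng(1)

def empirical_risk(losses):
    return float(np.mean(losses))

def tail_risk(losses, alpha=0.05):
    """Average of the worst alpha fraction of losses (CVaR-style)."""
    k = max(1, int(np.ceil(alpha * losses.size)))
    return float(np.mean(np.sort(losses)[-k:]))

# Two hypothetical models: B has slightly lower typical loss but rare large harms.
losses_a = rng.gamma(shape=2.0, scale=0.50, size=100_000)            # mean ~ 1.0
losses_b = np.where(rng.random(100_000) < 0.002,
                    rng.gamma(20.0, 2.0, 100_000),                   # rare harms ~ 40
                    rng.gamma(2.0, 0.45, 100_000))                   # mean ~ 0.9 otherwise

for name, losses in (("model A", losses_a), ("model B", losses_b)):
    print(f"{name}: mean risk = {empirical_risk(losses):.2f}, "
          f"5% tail risk = {tail_risk(losses):.2f}")
```

Under plain ERM model B looks preferable, while the tail-weighted objective exposes its rare large harms; this is the distinction the article's definition of safety is meant to capture.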
Optimal tuning of a confined Brownian information engine.
Park, Jong-Min; Lee, Jae Sung; Noh, Jae Dong
2016-03-01
A Brownian information engine is a device extracting mechanical work from a single heat bath by exploiting the information on the state of a Brownian particle immersed in the bath. As for engines, it is important to find the optimal operating condition that yields the maximum extracted work or power. The optimal condition for a Brownian information engine with a finite cycle time τ has been rarely studied because of the difficulty in finding the nonequilibrium steady state. In this study, we introduce a model for the Brownian information engine and develop an analytic formalism for its steady-state distribution for any τ. We find that the extracted work per engine cycle is maximum when τ approaches infinity, while the power is maximum when τ approaches zero.
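The trade-off stated in the abstract can be written compactly in generic notation: with W(τ) the mean work extracted per cycle of duration τ and P(τ) the corresponding power, the reported optima sit at opposite limits of the cycle time.

```latex
% Generic notation for the trade-off stated above: W(\tau) is the mean work
% extracted per cycle of duration \tau and P(\tau) the corresponding power.
P(\tau) = \frac{W(\tau)}{\tau}, \qquad
\max_{\tau} W(\tau) = \lim_{\tau \to \infty} W(\tau), \qquad
\max_{\tau} P(\tau) = \lim_{\tau \to 0^{+}} P(\tau).
```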
NASA Astrophysics Data System (ADS)
Kannape, Oliver Alan; Lenggenhager, Bigna
2016-03-01
From brain-computer interfaces to wearable robotics and bionic prostheses - intelligent assistive devices have already become indispensable in the therapy of people living with reduced sensorimotor functioning of their physical body, be it due to spinal cord injury, amputation or brain lesions [1]. Rapid technological advances will continue to fuel this field for years to come. As Pazzaglia and Molinari [2] rightly point out, progress in this domain should not solely be driven by engineering prowess, but utilize the increasing psychological and neuroscientific understanding of cortical body-representations and their plasticity [3]. We argue that a core concept for such an integrated embodiment framework was introduced with the formalization of the forward model for sensorimotor control [4]. The application of engineering concepts to human movement control paved the way for rigorous computational and neuroscientific analysis. The forward model has successfully been adapted to investigate principles underlying aspects of bodily awareness such as the sense of agency in the comparator framework [5]. At the example of recent advances in lower limb prostheses, we propose a cross-disciplinary, integrated embodiment framework to investigate the sense of agency and the related sense of body ownership for such devices. The main onus now is on the engineers and cognitive scientists to embed such an approach into the design of assistive technology and its evaluation battery.
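The comparator logic referred to above is commonly summarized as follows (generic notation, not specific to any cited study): a forward model predicts the sensory consequences of the efference copy of the motor command, and the size of the prediction error gauges whether the movement is attributed to the self.

```latex
% Generic comparator form of the forward model: f predicts the sensory
% consequences of the motor command (efference copy) u_t, and the prediction
% error e_{t+1} gauges self-attribution of the movement.
\hat{s}_{t+1} = f(s_t, u_t), \qquad
e_{t+1} = s_{t+1} - \hat{s}_{t+1}, \qquad
\lVert e_{t+1} \rVert \ \text{small} \;\Rightarrow\; \text{movement attributed to the self.}
```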
A Model-based Approach to Reactive Self-Configuring Systems
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Nayak, P. Pandurang
1996-01-01
This paper describes Livingstone, an implemented kernel for a self-reconfiguring autonomous system, that is reactive and uses component-based declarative models. The paper presents a formal characterization of the representation formalism used in Livingstone, and reports on our experience with the implementation in a variety of domains. Livingstone's representation formalism achieves broad coverage of hybrid software/hardware systems by coupling the concurrent transition system models underlying concurrent reactive languages with the discrete qualitative representations developed in model-based reasoning. We achieve a reactive system that performs significant deductions in the sense/response loop by drawing on our past experience at building fast propositional conflict-based algorithms for model-based diagnosis, and by framing a model-based configuration manager as a propositional, conflict-based feedback controller that generates focused, optimal responses. Livingstone automates all these tasks using a single model and a single core deductive engine, thus making significant progress towards achieving a central goal of model-based reasoning. Livingstone, together with the HSTS planning and scheduling engine and the RAPS executive, has been selected as the core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.
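A tiny propositional sketch of the model-based diagnosis flavor described above, with an invented three-component model: candidate diagnoses are mode assignments consistent with the observation, preferred by fewest faults. This is illustrative only and not Livingstone's conflict-based algorithm or its models.

```python
# Tiny propositional sketch of conflict-based, model-based diagnosis in the
# spirit described above: enumerate component mode assignments, keep those
# consistent with the observation, and prefer candidates with fewest faults.
# Illustrative only; not Livingstone's actual algorithm or models.
from itertools import product

COMPONENTS = ("valve", "pump", "sensor")

def consistent(modes, observed_flow):
    """Toy model: flow is predicted only when every component is nominal."""
    predicted_flow = all(modes[c] == "nominal" for c in COMPONENTS)
    return predicted_flow == observed_flow

def diagnoses(observed_flow):
    candidates = []
    for assignment in product(("nominal", "faulty"), repeat=len(COMPONENTS)):
        modes = dict(zip(COMPONENTS, assignment))
        if consistent(modes, observed_flow):
            candidates.append(modes)
    return sorted(candidates,
                  key=lambda m: sum(v == "faulty" for v in m.values()))

# Observation: no flow although the commanded configuration should produce it.
for d in diagnoses(observed_flow=False)[:3]:
    print(d)
```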
Formalizing an integrative, multidisciplinary cancer therapy discovery workflow
McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy
2014-01-01
Although many clinicians and researchers work to understand cancer, there has been limited success to effectively combine forces and collaborate over time, distance, data and budget constraints. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
ERIC Educational Resources Information Center
Saif, Perveen; Reba, Amjad; ud Din, Jalal
2017-01-01
This study was designed to compare the subject knowledge of B.Ed graduates of formal and non-formal teacher education systems. The population of the study included all teachers of Girls High and Higher Secondary Schools in both the private and public sectors in the district of Peshawar. Out of the total population, twenty schools were randomly…
Designing Curricular Experiences that Promote Young Adolescents' Cognitive Growth
ERIC Educational Resources Information Center
Brown, Dave F.; Canniff, Mary
2007-01-01
One of the most challenging daily experiences of teaching young adolescents is helping them transition from Piaget's concrete to the formal operational stage of cognitive development during the middle school years. Students who have reached formal operations can design and test hypotheses, engage in deductive reasoning, use flexible thinking,…
Best practices for team-based assistive technology design courses.
Goldberg, Mary R; Pearlman, Jonathan L
2013-09-01
Team-based design courses focused on products for people with disabilities have become relatively common, in part because of training grants such as the NSF Research to Aid Persons with Disabilities course grants. An output from these courses is an annual description of courses and projects, but this material has yet to be compiled into a "best practices guide" that could be helpful for instructors. To meet this need, we conducted a study to generate best practices for assistive technology product development courses and for how to use these courses to teach students the fundamentals of innovation. A full list of recommendations is provided in the manuscript and includes identifying a client through a reliable clinical partner; allowing for transparency between the instructors, the client, and the team(s); establishing multi-disciplinary teams; using a process-oriented vs. solution-oriented product development model; using project management software to facilitate and archive communication and outputs; facilitating client interaction through frequent communication; seeking to develop professional role confidence to inspire students' commitment to engineering and (where applicable) the rehabilitation field; publishing student designs on repositories; incorporating both formal and informal education opportunities related to design; and encouraging students to submit their designs to local or national entrepreneurship competitions.
NASA Technical Reports Server (NTRS)
1948-01-01
The conference on Turbojet-Engine Thrust-Augmentation Research was organized by the NACA to present in summarized form the results of the latest experimental and analytical investigations conducted at the Lewis Flight Propulsion Laboratory on methods of augmenting the thrust of turbojet engines. The technical discussions are reproduced herewith in the same form in which they were presented. The original presentations in this record are considered complementary to, rather than substitutes for, the committee's system of complete and formal reports.
ERIC Educational Resources Information Center
Wu, Qun; Wang, Yecheng
2015-01-01
The purpose of this study is to identify the occurrence of Sudden Moments of Inspiration (SMI) in the sketching process of industrial design through experiments, to explain the effect of subconsciousness on SMI. There are a pre-experiment and a formal experiment. In the formal experiment, nine undergraduates majoring in industrial design with the same…
NASA Astrophysics Data System (ADS)
Araki, Mituhiko; Nakamura, Yuichi; Fujii, Shigeo; Tsuno, Hiroshi
Three international simultaneous lectures at the postgraduate level in the field of environmental science and engineering are under preparation at Kyoto University. They are planned to be offered at three Asian universities (Tsinghua University in China, University of Malaya in Malaysia, and Kyoto University in Japan) as formal courses. The contents of the lectures, the purpose of the project, and technical problems are reported.
1987-06-01
described the state of maturity of software engineering as being equivalent to the state of maturity of Civil Engineering before Pythagoras invented the...formal verification languages, theorem provers or secure configuration management tools would have to be maintained and used in the PDSS Center to
Functional groups of ecosystem engineers: a proposed classification with comments on current issues.
Berke, Sarah K
2010-08-01
Ecologists have long known that certain organisms fundamentally modify, create, or define habitats by altering the habitat's physical properties. In the past 15 years, these processes have been formally defined as "ecosystem engineering", reflecting a growing consensus that environmental structuring by organisms represents a fundamental class of ecological interactions occurring in most, if not all, ecosystems. Yet, the precise definition and scope of ecosystem engineering remains debated, as one should expect given the complexity, enormity, and variability of ecological systems. Here I briefly comment on a few specific current points of contention in the ecosystem engineering concept. I then suggest that ecosystem engineering can be profitably subdivided into four narrower functional categories reflecting four broad mechanisms by which ecosystem engineering occurs: structural engineers, bioturbators, chemical engineers, and light engineers. Finally, I suggest some conceptual model frameworks that could apply broadly within these functional groups.
NASA Technical Reports Server (NTRS)
Hale, Mark A.
1996-01-01
Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure, with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust Engineering Analysis Models and Specifications) has been developed which supports the activities of both meta-design and actual design execution. This is accomplished through a systematic process comprising the stages of Formulation, Translation, and Evaluation. During this process, elements from a Design Specification are integrated into Design Processes. In addition, a software infrastructure was developed and is called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment). This represents a virtual apparatus in the Design Experiment conducted in this research. IMAGE is an innovative architecture because it explicitly supports design-related activities. This is accomplished through a GUI-driven and agent-based implementation of DREAMS. An HSCT design has been adopted from the Framework for Interdisciplinary Design Optimization (FIDO) and is implemented in IMAGE. This problem shows how Design Processes and Specification interact in a design system. In addition, the problem utilizes two different solution models concurrently: optimal and satisfying. The satisfying model allows for more design flexibility and allows a designer to maintain design freedom. As a result of following this experimental procedure, this infrastructure is an open system that is robust to changes in both corporate needs and computer technologies.
The development of this infrastructure leads to a number of significant intellectual contributions: 1) A new approach to implementing IPPD with the aid of a computer; 2) A formal Design Experiment; 3) A combined Process and Specification architecture that is language-based; 4) An infrastructure for exploring design; 5) An integration strategy for implementing computer resources; and 6) A seamless modeling language. The need for these contributions is emphasized by the demand by industry and government agencies for the development of these technologies.
Single-Vector Calibration of Wind-Tunnel Force Balances
NASA Technical Reports Server (NTRS)
Parker, P. A.; DeLoach, R.
2003-01-01
An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load, and it would have no response to other components of load. This is not entirely possible even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the necessary data to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in an even more complex system that degrades load application quality. The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, in which each independent variable is incremented individually throughout its full-scale range while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research in a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analyses of the data. In order to overcome the weaknesses in the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
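To illustrate the kind of regression that closes a designed calibration experiment, here is a minimal sketch that fits sensitivities and one interaction term from randomized combined loadings. It assumes, purely for illustration, a two-component balance and a simple polynomial response model; real calibrations use six components and richer models.

    # Sketch: fit a balance calibration model by least squares from a
    # randomized (designed) set of combined load points, not an OFAT sweep.
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical "true" balance behaviour: response = C @ load + interaction + noise
    C_true = np.array([[1.00, 0.02],
                       [0.03, 0.95]])

    def balance_response(loads):
        interaction = 0.01 * loads[:, [0]] * loads[:, [1]]
        return loads @ C_true.T + interaction + rng.normal(0, 1e-3, (len(loads), 2))

    # Designed experiment: randomized combined loadings spanning the load space.
    loads = rng.uniform(-1.0, 1.0, size=(30, 2))
    resp = balance_response(loads)

    # Regress each channel on [N, A, N*A] to recover sensitivities and interactions.
    X = np.column_stack([loads, loads[:, 0] * loads[:, 1]])
    coef, *_ = np.linalg.lstsq(X, resp, rcond=None)
    print("estimated sensitivity matrix:\n", coef[:2].T)
    print("estimated interaction terms:", coef[2])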
NASA Technical Reports Server (NTRS)
1995-01-01
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has
5 CFR 2638.311 - Copies of published formal advisory opinions.
Code of Federal Regulations, 2010 CFR
2010-01-01
5 CFR § 2638.311 (Administrative Personnel, 2010): Copies of published formal advisory opinions. Each designated agency ethics official shall receive a copy of each published opinion. Copies will also be available to the public from the...
Conceptual Questions and Lack of Formal Reasoning: Are They Mutually Exclusive?
ERIC Educational Resources Information Center
Igaz, Csaba; Proksa, Miroslav
2012-01-01
Using specially designed conceptual question pairs, 9th grade students were tested on tasks (presented as experimental situations in pictorial form) that involved the controlling-of-variables scheme of formal reasoning. The question topics focused on these three chemical contexts: chemistry in everyday life, chemistry without formal concepts, and…
Two-Step Formal Advertisement: An Examination.
1976-10-01
The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on...Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of...formal advertising. It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition
(Finite) statistical size effects on compressive strength.
Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien
2014-04-29
The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
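One generic way to write a finite-size scaling law consistent with the behavior described above (a mean strength that saturates at a nonzero value as the size grows, with a shrinking variability) is the following; the symbols are placeholders, since the abstract does not give the actual exponents or amplitudes:

\[
\langle \sigma_f \rangle(L) \;\simeq\; \sigma_\infty + \frac{A}{L^{1/\nu}},
\qquad
\delta\sigma_f(L) \;\sim\; L^{-1/\nu},
\]

where \(\sigma_\infty\) is the nonzero large-scale strength, \(L\) the system size, \(A\) an amplitude, and \(\nu\) a correlation-length-type exponent of the underlying depinning transition.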
NASA's Space Launch Transitions: From Design to Production
NASA Technical Reports Server (NTRS)
Askins, Bruce; Robinson, Kimberly
2016-01-01
NASA's Space Launch System (SLS) successfully completed its Critical Design Review (CDR) in 2015, a major milestone on the journey to an unprecedented era of exploration for humanity. CDR formally marked the program's transition from design to production phase just four years after the program's inception, and it was the first such milestone for a human launch vehicle in 40 years. While challenges typical of a complex development program lie ahead, CDR evaluators concluded that the design is technically and programmatically sound and ready to press forward to Design Certification Review (DCR) and readiness for launch of Exploration Mission 1 (EM-1) in the 2018 timeframe. SLS is prudently based on existing propulsion systems, infrastructure and knowledge, with a clear, evolutionary path as required by mission needs. In its initial configuration, designated Block I, SLS will lift a minimum of 70 metric tons (t) of payload to low Earth orbit (LEO). It can evolve to a 130 t payload capacity by upgrading its engines, boosters, and upper stage, dramatically increasing the mass and volume of human and robotic exploration while decreasing mission risk, increasing safety, and simplifying ground and mission operations. CDR was the central programmatic accomplishment among many technical accomplishments that will be described in this paper. The government/industry SLS team successfully test-fired a flight-like five-segment solid rocket motor and conducted seven hotfire development tests of the RS-25 core stage engine. The majority of the major test article and flight barrels, rings, and domes for the core stage liquid oxygen, liquid hydrogen, engine section, intertank, and forward skirt were manufactured at NASA's Michoud Assembly Facility. Renovations to the B-2 test stand for stage green run testing were completed at NASA Stennis Space Center. Core stage test stands are rising at NASA Marshall Space Flight Center. The modified Pegasus barge for core stage transportation from manufacturing to testing and launch sites was delivered. The Interim Cryogenic Propulsion System test article was also completed. This paper will discuss these and other technical and programmatic successes and challenges over the past year and provide a preview of work ahead before the first flight of this new capability.
Designing with Protocells: Applications of a Novel Technical Platform
Armstrong, Rachel
2014-01-01
The paper offers a design perspective on protocell applications and presents original research that characterizes the life-like qualities of the Bütschli dynamic droplet system, as a particular “species” of protocell. Specific focus is given to the possibility of protocell species becoming a technical platform for designing and engineering life-like solutions to address design challenges. An alternative framing of the protocell, based on process philosophy, sheds light on its capabilities as a technology that can deal with probability and whose ontology is consistent with complexity, nonlinear dynamics and the flow of energy and matter. However, the proposed technical systems do not yet formally exist as products or mature technologies. Their potential applications are therefore experimentally examined within a design context as architectural “projects”—an established way of considering proposals that have not yet been realized, like an extended hypothesis. Exemplary design-led projects are introduced, such as The Hylozoic Ground and Future Venice, which aim to “discover”, rather than “solve”, challenges to examine a set of possibilities that have not yet been resolved. The value of such exploration in design practice is in opening up a set of potential directions for further assessment before complex challenges are procedurally implemented. PMID:25370381
Bilitchenko, Lesia; Liu, Adam; Cheung, Sherine; Weeding, Emma; Xia, Bing; Leguia, Mariana; Anderson, J Christopher; Densmore, Douglas
2011-04-29
Synthetic biological systems are currently created by an ad-hoc, iterative process of specification, design, and assembly. These systems would greatly benefit from a more formalized and rigorous specification of the desired system components as well as constraints on their composition. Therefore, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) that allows for the specification of synthetic biological designs based on biological parts, as well as provides a very expressive constraint system to drive the automatic creation of composite Parts (Devices) from a collection of individual Parts. We illustrate Eugene's capabilities in three different areas: Device specification, design space exploration, and assembly and simulation integration. These results highlight Eugene's ability to create combinatorial design spaces and prune these spaces for simulation or physical assembly. Eugene creates functional designs quickly and cost-effectively. Eugene is intended for forward engineering of DNA-based devices, and through its data types and execution semantics, reflects the desired abstraction hierarchy in synthetic biology. Eugene provides a powerful constraint system which can be used to drive the creation of new devices at runtime. It accomplishes all of this while being part of a larger tool chain which includes support for design, simulation, and physical device assembly.
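To give a flavor of constraint-driven exploration and pruning of a combinatorial design space, here is a toy sketch in the spirit described above. It is not Eugene syntax; the part names and composition rules are hypothetical examples.

    # Toy design-space exploration: enumerate candidate devices from parts and
    # prune them with composition rules before simulation or assembly.
    from itertools import permutations

    PARTS = {
        "pTet":  "promoter",
        "pBad":  "promoter",
        "rbs1":  "rbs",
        "gfp":   "cds",
        "term1": "terminator",
    }

    def satisfies(device):
        kinds = [PARTS[p] for p in device]
        return (kinds[0] == "promoter"          # rule: device starts with a promoter
                and kinds[-1] == "terminator"   # rule: device ends with a terminator
                and kinds.count("cds") == 1)    # rule: exactly one coding sequence

    designs = [d for d in permutations(PARTS, 4) if satisfies(d)]
    for d in designs:
        print(" - ".join(d))
    print(len(designs), "devices remain after pruning")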
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.
1992-01-01
The design and formal verification of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, is presented. The RCP uses N-Modular Redundancy (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).
Using ontologies for structuring organizational knowledge in Home Care assistance.
Valls, Aida; Gibert, Karina; Sánchez, David; Batet, Montserrat
2010-05-01
Information Technologies and Knowledge-based Systems can significantly improve the management of complex distributed health systems, where supporting multidisciplinarity is crucial and communication and synchronization between the different professionals and tasks becomes essential. This work proposes the use of the ontological paradigm to describe the organizational knowledge of such complex healthcare institutions as a basis to support their management. The ontology engineering process is detailed, as well as the way to keep the ontology updated in the face of changes. The paper also analyzes how such an ontology can be exploited in a real healthcare application and the role of the ontology in the customization of the system. The particular case of senior Home Care assistance is addressed, as this is a highly distributed field as well as a strategic goal in an ageing Europe. The proposed ontology design is based on a Home Care medical model defined by a European consortium of Home Care professionals, framed in the scope of the K4Care European project (FP6). Due to the complexity of the model and the knowledge gap between the textual medical model and the strict formalization of an ontology, an ontology engineering methodology (On-To-Knowledge) has been followed. After applying the On-To-Knowledge steps, the following results were obtained: the feasibility study concluded that the ontological paradigm and the expressiveness of modern ontology languages were enough to describe the required medical knowledge; after the kick-off and refinement stages, a complete and non-ambiguous definition of the Home Care model, including its main components and interrelations, was obtained; the formalization stage expressed HC medical entities in the form of ontological classes, which are interrelated by means of hierarchies, properties and semantically rich class restrictions; the evaluation, carried out by exploiting the ontology in a knowledge-driven e-health application running in a real scenario, showed that the ontology design and its exploitation brought several benefits with regard to flexibility, adaptability and work efficiency from the end-user point of view; for the maintenance stage, two software tools are presented, aimed at addressing the incorporation and modification of healthcare units and the personalization of ontological profiles. The paper shows that the ontological paradigm and the expressiveness of modern ontology languages can be exploited not only to represent terminology in a non-ambiguous way, but also to formalize the interrelations and organizational structures involved in a real and distributed healthcare environment. This kind of ontology facilitates adaptation in the face of changes in the healthcare organization or Care Units, supports the creation of profile-based interaction models in a transparent and seamless way, and increases the reusability and generality of the developed software components. As a conclusion of the exploitation of the developed ontology in a real medical scenario, we can say that an ontology formalizing organizational interrelations is a key component for building effective distributed knowledge-driven e-health systems. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Executable Architecture Research at Old Dominion University
NASA Technical Reports Server (NTRS)
Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.
2011-01-01
Executable architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.
MatLab Script and Functional Programming
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
MatLab Script and Functional Programming: MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who do not have formal programming training and have little time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) Recognize MatLab commands, scripts, and functions. b) Create and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops, and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define, and use scalar variables, vectors, and matrices.
A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis
NASA Astrophysics Data System (ADS)
Aoyama, Mikio
Information systems are ubiquitous in our daily life. Thus, information systems need to work appropriately anywhere, at any time, for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of system functionality. However, the diversity of usage contexts requires a fundamental change in our current thinking on information systems: from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of information systems. This chapter presents a method for capturing, structuring and reconciling the diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals by a goal lattice based on formal concept analysis, a semantic extension of lattice theory. We illustrate the effectiveness of the presented method through application to self-checkout systems for large-scale supermarkets.
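A minimal sketch of the formal concept analysis step underlying such a goal lattice: from a stakeholder-goal context, each concept is an (extent, intent) pair closed under the two derivation operators. The stakeholders and goals below are hypothetical examples, not taken from the chapter.

    # Toy formal concept analysis over a stakeholder-goal context.
    from itertools import combinations

    CONTEXT = {                       # stakeholder -> goals it cares about
        "cashier":  {"fast checkout", "low training effort"},
        "customer": {"fast checkout", "privacy"},
        "manager":  {"low training effort", "audit trail"},
    }
    GOALS = set().union(*CONTEXT.values())

    def extent(goals):                # stakeholders sharing all the given goals
        return {s for s, g in CONTEXT.items() if goals <= g}

    def intent(stakeholders):         # goals shared by all the given stakeholders
        shared = GOALS.copy()
        for s in stakeholders:
            shared &= CONTEXT[s]
        return shared

    concepts = set()
    for r in range(len(GOALS) + 1):
        for goals in combinations(sorted(GOALS), r):
            ext = extent(set(goals))
            concepts.add((frozenset(ext), frozenset(intent(ext))))

    # Print the concepts ordered by extent size; these form the goal lattice.
    for ext, itt in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(ext), "<->", sorted(itt))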
Fast Formal Analysis of Requirements via "Topoi Diagrams"
NASA Technical Reports Server (NTRS)
Menzies, Tim; Powell, John; Houle, Michael E.; Kelly, John C. (Technical Monitor)
2001-01-01
Early testing of requirements can decrease the cost of removing errors in software projects. However, unless done carefully, that testing process can significantly add to the cost of requirements analysis. We show here that requirements expressed as topoi diagrams can be built and tested cheaply: using our SP2 algorithm, the formal temporal properties of a large class of topoi can be proven very quickly, in time nearly linear in the number of nodes and edges in the diagram. There are two limitations to our approach. Firstly, topoi diagrams cannot express certain complex concepts such as iteration and sub-routine calls. Hence, our approach is more useful for requirements engineering than for traditional model checking domains. Secondly, our approach is better for exploring the temporal occurrence of properties than the temporal ordering of properties. Within these restrictions, we can express a useful range of concepts currently seen in requirements engineering, and a wide range of interesting temporal properties.
MOS 2.0: Modeling the Next Revolutionary Mission Operations System
NASA Technical Reports Server (NTRS)
Delp, Christopher L.; Bindschadler, Duane; Wollaeger, Ryan; Carrion, Carlos; McCullar, Michelle; Jackson, Maddalena; Sarrel, Marc; Anderson, Louise; Lam, Doris
2011-01-01
Designed and implemented in the 1980s, the Advanced Multi-Mission Operations System (AMMOS) was a breakthrough for deep-space NASA missions, enabling significant reductions in the cost and risk of implementing ground systems. By designing a framework for use across multiple missions and adaptability to specific mission needs, AMMOS developers created a set of applications that have operated dozens of deep-space robotic missions over the past 30 years. We seek to leverage advances in the technology and practice of architecting and systems engineering, using model-based approaches to update the AMMOS. We therefore revisit fundamental aspects of the AMMOS, resulting in a major update to the Mission Operations System (MOS): MOS 2.0. This update will ensure that the MOS can support an increasing range of mission types (such as orbiters, landers, rovers, penetrators and balloons), and that the operations systems for deep-space robotic missions can reap the benefits of an iterative multi-mission framework. This paper reports on the first phase of this major update. Here we describe the methods and formal semantics used to address the MOS 2.0 architecture and some early results. Early benefits of this approach include improved stakeholder input and buy-in, the ability to articulate and focus effort on key, system-wide principles, and efficiency gains obtained by use of well-architected design patterns and the use of models to improve the quality of documentation and decrease the effort required to produce and maintain it. We find that such methods facilitate reasoning, simulation, and analysis on the system design in terms of design impacts, generation of products (e.g., project-review and software-delivery products), and use of formal process descriptions to enable goal-based operations. This initial phase yields a forward-looking and principled MOS 2.0 architectural vision, which considers both the mission-specific context and long-term system sustainability.
On verifying a high-level design. [cost and error analysis
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
Dynamic Forms. Part 1: Functions
NASA Technical Reports Server (NTRS)
Meyer, George; Smith, G. Allan
1993-01-01
The formalism of dynamic forms is developed as a means of organizing and systematizing the design of control systems. The formalism allows the designer to easily compute derivatives to various orders of large composite functions that occur in flight-control design. Such functions involve many function-of-a-function calls that may be nested to many levels. The component functions may be multiaxis and nonlinear, and they may include rotation transformations. A dynamic form is defined as a variable together with its time derivatives up to some fixed but arbitrary order. The variable may be a scalar, a vector, a matrix, a direction cosine matrix, Euler angles, or Euler parameters. Algorithms for standard elementary functions and operations of scalar dynamic forms are developed first. Then vector and matrix operations and transformations between parameterizations of rotations are developed at the next level in the hierarchy. Commonly occurring algorithms in control-system design, including inversion of pure feedback systems, are developed at the third level. A large-angle, three-axis attitude servo and other examples are included to illustrate the effectiveness of the developed formalism. All algorithms were implemented in FORTRAN code. Practical experience shows that the proposed formalism may significantly improve the productivity of the design and coding process.
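A toy, scalar, second-order analogue of the idea: carry a value together with its first two time derivatives and let arithmetic propagate them automatically via the Leibniz rule. The class name and the truncation at second order are illustrative choices, not the cited FORTRAN implementation.

    # Sketch of a scalar "dynamic form": value plus first two time derivatives.
    from dataclasses import dataclass

    @dataclass
    class DynamicForm:
        x: float      # value
        dx: float     # first time derivative
        ddx: float    # second time derivative

        def __add__(self, other):
            return DynamicForm(self.x + other.x, self.dx + other.dx, self.ddx + other.ddx)

        def __mul__(self, other):
            # Leibniz rule: (fg)'' = f''g + 2 f'g' + f g''
            return DynamicForm(
                self.x * other.x,
                self.dx * other.x + self.x * other.dx,
                self.ddx * other.x + 2.0 * self.dx * other.dx + self.x * other.ddx,
            )

    # Example: q(t) = t^2 and r(t) = 3t at t = 2, so q*r = 3t^3 with first
    # derivative 9t^2 = 36 and second derivative 18t = 36 at that instant.
    q = DynamicForm(4.0, 4.0, 2.0)   # t^2, 2t, 2 evaluated at t = 2
    r = DynamicForm(6.0, 3.0, 0.0)   # 3t, 3, 0 evaluated at t = 2
    print(q * r)                     # DynamicForm(x=24.0, dx=36.0, ddx=36.0)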
Moving formal methods into practice. Verifying the FTPP Scoreboard: Results, phase 1
NASA Technical Reports Server (NTRS)
Srivas, Mandayam; Bickford, Mark
1992-01-01
This report documents the Phase 1 results of an effort aimed at formally verifying a key hardware component, called Scoreboard, of a Fault-Tolerant Parallel Processor (FTPP) being built at Charles Stark Draper Laboratory (CSDL). The Scoreboard is part of the FTPP virtual bus that guarantees reliable communication between processors in the presence of Byzantine faults in the system. The Scoreboard implements a piece of control logic that approves and validates a message before it can be transmitted. The goal of Phase 1 was to lay the foundation of the Scoreboard verification. A formal specification of the functional requirements and a high-level hardware design for the Scoreboard were developed. The hardware design was based on a preliminary Scoreboard design developed at CSDL. A main correctness theorem, from which the functional requirements can be established as corollaries, was proved for the Scoreboard design. The goal of Phase 2 is to verify the final detailed design of Scoreboard. This task is being conducted as part of a NASA-sponsored effort to explore integration of formal methods in the development cycle of current fault-tolerant architectures being built in the aerospace industry.
Saturn Radiation (SATRAD) Model
NASA Technical Reports Server (NTRS)
Garrett, H. B.; Ratliff, J. M.; Evans, R. W.
2005-01-01
The Saturnian radiation belts have not received as much attention as the Jovian radiation belts because they are not nearly as intense; the famous Saturnian particle rings tend to deplete the belts near where their peak would occur. As a result, there has not been a systematic development of engineering models of the Saturnian radiation environment for mission design. A primary exception is that of Divine (1990). That study used published data from several charged particle experiments aboard the Pioneer 11, Voyager 1, and Voyager 2 spacecraft during their flybys at Saturn to generate numerical models for the electron and proton radiation belts between 2.3 and 13 Saturn radii. The Divine Saturn radiation model described the electron distributions at energies between 0.04 and 10 MeV and the proton distributions at energies between 0.14 and 80 MeV. The model was intended to predict particle intensity, flux, and fluence for the Cassini orbiter. Divine carried out hand calculations using the model but never formally developed a computer program that could be used for general mission analyses. This report seeks to fill that void by formally developing a FORTRAN version of the model that can be used as a computer design tool for missions to Saturn that require estimates of the radiation environment around the planet. The results of that effort and the program listings are presented here along with comparisons with the original estimates carried out by Divine. In addition, Pioneer and Voyager data were scanned in from the original references and compared with the FORTRAN model's predictions. The results were statistically analyzed in a manner consistent with Divine's approach to provide estimates of the ability of the model to reproduce the original data. Results of a formal review of the model by a panel of experts are also presented. Their recommendations for further tests, analyses, and extensions to the model are discussed.
Fourth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)
1997-01-01
This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.
The development of a collaborative virtual environment for finite element simulation
NASA Astrophysics Data System (ADS)
Abdul-Jalil, Mohamad Kasim
Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have provided a substantial potential to bridge the present communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This faster design process is achieved through the development of capabilities that better enable multidisciplinary modeling of the trade-off decisions that are so critical before launching into a formal detailed design. The features of the environment developed as a result of this research include the ability to view design models, use voice interaction, and link engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is obtaining any pertinent simulation results in real time. This is critical so that the designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, the finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress, displacement), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited for the virtual meeting environment where fast response time is required. The DSA-based approach is tested on several example test problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction of the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for using the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction.
It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
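A first-order, finite-difference form of the sensitivity approximation described for CVRoom might be written as follows; the symbols are generic placeholders rather than the dissertation's exact notation:

\[
\frac{\partial \sigma}{\partial x} \;\approx\; \frac{\sigma(x + h) - \sigma(x)}{h},
\qquad
\sigma(x + \Delta x) \;\approx\; \sigma(x) + \frac{\partial \sigma}{\partial x}\,\Delta x ,
\]

where \(\sigma\) is a structural response (stress or displacement), \(x\) a design variable, \(h\) a small finite-difference step evaluated offline with the finite element model, and \(\Delta x\) the perturbation requested during the collaborative session; the first-order expansion is what allows near real-time response in the virtual environment.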
Experimental and analytical investigation of a modified ring cusp NSTAR engine
NASA Technical Reports Server (NTRS)
Sengupta, Anita
2005-01-01
A series of experimental measurements on a modified laboratory NSTAR engine was used to validate a zero-dimensional analytical discharge performance model of a ring cusp ion thruster. The model predicts the discharge performance of a ring cusp NSTAR thruster as a function of the magnetic field configuration, thruster geometry, and throttle level. Analytical formalisms for electron and ion confinement are used to predict the ionization efficiency for a given thruster design. Explicit determination of discharge loss and volume-averaged plasma parameters is also obtained. The model was used to predict the performance of the nominal and modified three- and four-ring cusp 30-cm ion thruster configurations operating at the full-power (2.3 kW) NSTAR throttle level. Experimental measurements of the modified engine configuration's discharge loss compare well with the predicted values for propellant utilizations from 80 to 95%. The theory, as validated by experiment, indicates that increasing the magnetic field strength of the minimum closed contour reduces Maxwellian electron diffusion and electrostatically confines the ion population, reducing its subsequent loss to the anode wall. The theory also indicates that increasing the cusp strength and minimizing the cusp area improves primary electron confinement, increasing the probability of an ionization collision prior to loss at the cusp.
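For readers outside the field, the two figures of merit named above are conventionally defined as follows (standard textbook definitions, not equations quoted from this paper):

\[
\varepsilon_B \;=\; \frac{P_{\mathrm{disch}}}{I_B} \;=\; \frac{V_D\,I_D}{I_B},
\qquad
\eta_u \;=\; \frac{\dot m_{\mathrm{ion}}}{\dot m_{\mathrm{prop}}},
\]

where \(\varepsilon_B\) is the discharge loss in eV per beam ion, \(V_D\) and \(I_D\) are the discharge voltage and current, \(I_B\) is the extracted beam current, and \(\eta_u\) is the propellant utilization, the fraction of the injected propellant mass flow that leaves as beam ions.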
Backstop: Shuttle Will Fly with Outstanding Waivers; New Oversight Eases Conflicts on Safety
NASA Technical Reports Server (NTRS)
Morring, Frank, Jr.
2005-01-01
The space shuttle Discovery is carrying some 300 waivers to technical specifications as it enters the home stretch of its planned return to flight next month. There were about 6,000 waivers in place when Columbia crashed. Shuttle managers say they are working to reduce the number of waivers remaining by fixing the problems they highlight, a change prompted by the Columbia Accident Investigation Board (CAIB). In the wake of the accident, NASA has heeded the CAIB's recommendation that waivers be the responsibility of an "independent technical authority" (ITA), rather than the shuttle program itself. To carry out the recommendation of the CAIB, which found an inherent conflict of interest in having the same managers make decisions about cost, schedule and safety, then-Administrator Sean O'Keefe designated the agency's chief engineer as the formal ITA. He is responsible for setting, maintaining and granting waivers across the agency. In mid-January, Fred Gregory, then O'Keefe's deputy and now his acting replacement, launched the ITA within NASA under Chief Engineer Rex Geveden, the former program manager on the Gravity Probe B experiment.
Advanced Software V&V for Civil Aviation and Autonomy
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.
2017-01-01
With the advances in high-performance computing platforms (e.g., advanced graphical processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.
Das, Tanmoy; Balatsky, A. V.
2013-01-01
Topological insulators represent a new class of quantum phase defined by invariant symmetries and spin-orbit coupling that guarantees metallic Dirac excitations at its surface. The discoveries of these states have sparked the hope of realizing non-trivial excitations and novel effects such as a magnetoelectric effect and topological Majorana excitations. Here we develop a theoretical formalism to show that a three-dimensional topological insulator can be designed artificially via stacking bilayers of two-dimensional Fermi gases with opposite Rashba-type spin-orbit coupling on adjacent layers, and with interlayer quantum tunneling. We demonstrate that in the stack of bilayers grown along a (001)-direction, a non-trivial topological phase transition occurs above a critical number of Rashba bilayers. In the topological phase, we find the formation of a single spin-polarized Dirac cone at the Γ-point. This approach offers an accessible way to design artificial topological insulators in a set-up that takes full advantage of the atomic layer deposition approach. This design principle is tunable and also allows us to bypass limitations imposed by bulk crystal geometry. PMID:23739724
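A schematic single-bilayer Hamiltonian consistent with the construction described above, written only to make the ingredients explicit; the signs, parameter names, and quadratic dispersion are illustrative, and the actual model is given in the cited paper:

\[
H(\mathbf{k}) =
\begin{pmatrix}
\epsilon(\mathbf{k}) + \alpha\,(\sigma_x k_y - \sigma_y k_x) & t \\
t & \epsilon(\mathbf{k}) - \alpha\,(\sigma_x k_y - \sigma_y k_x)
\end{pmatrix},
\qquad
\epsilon(\mathbf{k}) = \frac{\hbar^2 k^2}{2m},
\]

with \(\alpha\) the Rashba coupling taking opposite sign on the two layers of the bilayer and \(t\) the interlayer tunneling; stacking such bilayers along (001) and tuning \(t\) against \(\alpha\) is what drives the topological transition mentioned above.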
Space Station Freedom extravehicular activity systems evolution study
NASA Technical Reports Server (NTRS)
Rouen, Michael
1990-01-01
Evaluation of Space Station Freedom (SSF) support of manned exploration is in progress to identify SSF extravehicular activity (EVA) system evolution requirements and capabilities. The output from these studies will provide data to support the preliminary design process to ensure that Space Station EVA system requirements for future missions (including the transportation node) are adequately considered and reflected in the baseline design. The study considers SSF support of future missions and the EVA system baseline to determine adequacy of EVA requirements and capabilities and to identify additional requirements, capabilities, and necessary technology upgrades. The EVA demands levied by formal requirements and indicated by evolutionary mission scenarios are high for the out-years of Space Station Freedom. An EVA system designed to meet the baseline requirements can easily evolve to meet evolution demands with few exceptions. Results to date indicate that upgrades or modifications to the EVA system may be necessary to meet the full range of EVA thermal environments associated with the transportation node. Work continues to quantify the EVA capability in this regard. Evolution mission scenarios with EVA and ground unshielded nuclear propulsion engines are inconsistent with anthropomorphic EVA capabilities.
Formal Techniques for Synchronized Fault-Tolerant Systems
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Butler, Ricky W.
1992-01-01
We present the formal verification of the synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.
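A minimal sketch of NMR-style majority voting, showing how a transient fault on one redundant channel is out-voted; the three-channel setup and the values are hypothetical, and the real RCP voting operates over replicated frame state rather than single integers.

    # Toy majority voter across redundant channels.
    from collections import Counter

    def majority_vote(values):
        """Return the strict-majority value, or None if no strict majority exists."""
        value, count = Counter(values).most_common(1)[0]
        return value if count > len(values) // 2 else None

    healthy = [42, 42, 42]          # all replicas agree
    transient = [42, 42, 17]        # one replica corrupted by a transient fault
    disagree = [1, 2, 3]            # no majority: vote fails, flag the frame

    print(majority_vote(healthy))    # 42
    print(majority_vote(transient))  # 42 (fault masked)
    print(majority_vote(disagree))   # None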
Integrated topology and shape optimization in structural design
NASA Technical Reports Server (NTRS)
Bremicker, M.; Chirehdast, M.; Kikuchi, N.; Papalambros, P. Y.
1990-01-01
Structural optimization procedures usually start from a given design topology and vary its proportions or boundary shapes to achieve optimality under various constraints. Two different categories of structural optimization are distinguished in the literature, namely sizing and shape optimization. A major restriction in both cases is that the design topology is considered fixed and given. Questions concerning the general layout of a design (such as whether a truss or a solid structure should be used) as well as more detailed topology features (e.g., the number and connectivities of bars in a truss or the number of holes in a solid) have to be resolved by design experience before formulating the structural optimization model. Design quality of an optimized structure still depends strongly on engineering intuition. This article presents a novel approach for initiating formal structural optimization at an earlier stage, where the design topology is rigorously generated in addition to selecting shape and size dimensions. A three-phase design process is discussed: an optimal initial topology is created by a homogenization method as a gray level image, which is then transformed to a realizable design using computer vision techniques; this design is then parameterized and treated in detail by sizing and shape optimization. A fully automated process is described for trusses. Optimization of two dimensional solid structures is also discussed. Several application-oriented examples illustrate the usefulness of the proposed methodology.
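A common compliance-minimization statement of the density-based topology problem behind the first phase might look like the following; this is a generic textbook form under assumed notation, not the paper's exact formulation:

\[
\min_{\rho}\; c(\rho) = \mathbf{U}^{\mathsf T}\mathbf{K}(\rho)\,\mathbf{U}
\quad\text{subject to}\quad
\mathbf{K}(\rho)\,\mathbf{U} = \mathbf{F},
\qquad
\int_{\Omega}\rho\,d\Omega \le V^{*},
\qquad
0 < \rho_{\min} \le \rho \le 1,
\]

where \(\rho\) is the element material density whose converged field is the gray-level image handed to the image-processing phase, \(\mathbf{K}\) the stiffness matrix, \(\mathbf{F}\) the applied loads, and \(V^{*}\) the allowed material volume.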
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Toward Synthesis, Analysis, and Certification of Security Protocols
NASA Technical Reports Server (NTRS)
Schumann, Johann
2004-01-01
Implemented security protocols are basically pieces of software which are used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (using insecure communication media), and (c) transfer data between the communication partners in such a way that these data are only available to the desired receiver, but not to anyone else. Such an implementation usually consists of the following components: the protocol engine, which controls in which sequence the messages of the protocol are sent over the network, and which controls the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines to actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For the correct working of such a security protocol, all of these components must work flawlessly. Many formal-methods-based techniques for the analysis of security protocols have been developed. They range from using specific logics (e.g., BAN logic [4] or higher-order logics [12]) to model-checking approaches [2]. In each approach, the analysis tries to prove that no one (or at least no modeled intruder) can get access to secret data; otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities. For example, on a website with a log-in screen, multiple tries with invalid passwords caused the expected error message (too many retries), but nevertheless let the user pass. Finally, security can be compromised by silly implementation bugs or design decisions. In a commercial VPN software, all calls to the encryption routines were incidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses which can be exploited easily (see, e.g., DVD encoding). Summarizing, there is a large number of possibilities to make errors which can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of absence of an attack, but they ought to be used to provide an end-to-end tool-supported framework for security software. With such an approach, all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single, high-level specification.
By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.
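As a hedged sketch of the protocol-engine component mentioned above, the fragment below enforces message ordering with a small finite-state machine and rejects unexpected messages; the handshake message names and states are hypothetical and do not correspond to any real protocol.

```python
# Toy protocol engine: a finite-state machine that only accepts handshake
# messages in the expected order and rejects anything else. A sketch of the
# "protocol-engine" component discussed above, not a real security protocol.
EXPECTED = {                      # state -> {accepted message: next state}
    "START":        {"CLIENT_HELLO": "HELLO_SEEN"},
    "HELLO_SEEN":   {"SERVER_HELLO": "KEY_EXCHANGE"},
    "KEY_EXCHANGE": {"KEY_SHARE": "ESTABLISHED"},
    "ESTABLISHED":  {"DATA": "ESTABLISHED", "CLOSE": "CLOSED"},
}

def run_protocol(messages):
    state = "START"
    for msg in messages:
        allowed = EXPECTED.get(state, {})
        if msg not in allowed:
            return f"REJECTED: unexpected {msg!r} in state {state}"
        state = allowed[msg]
    return f"final state: {state}"

# A well-formed run and an out-of-order run (a common implementation-level flaw).
print(run_protocol(["CLIENT_HELLO", "SERVER_HELLO", "KEY_SHARE", "DATA", "CLOSE"]))
print(run_protocol(["CLIENT_HELLO", "DATA"]))
```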
ENGINEERING BULLETIN: PYROLYSIS TREATMENT
Pyrolysis is formally defined as chemical decomposition induced in organic materials by heat in the absence of oxygen. In practice, it is not possible to achieve a completely oxygen-free atmosphere; actual pyrolytic systems are operated with less than stoichiometric quantities of...
Bridging the Gulf between Formal Calculus and Physical Reasoning.
ERIC Educational Resources Information Center
Van Der Meer, A.
1980-01-01
Some ways to link calculus instruction with the mathematical models used in physics courses are presented. The activity of modelling is presented as a major tool in synchronizing physics and mathematics instruction in undergraduate engineering programs. (MP)
48 CFR 1509.170-3 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... PLANNING CONTRACTOR QUALIFICATIONS Contractor Performance Evaluations 1509.170-3 Applicability. (a) This....604 provides detailed instructions for architect-engineer contractor performance evaluations. (b) The... simplified acquisition procedures do not require the creation or existence of a formal database for past...
NASA Technical Reports Server (NTRS)
Bickford, Mark; Srivas, Mandayam
1991-01-01
Presented here is a formal specification and verification of a property of a quadruplicately redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved is given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold; the verification was carried out using a computer-aided design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
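For intuition about the referenced algorithm, the following toy simulation runs one round of the Pease-Shostak-Lamport oral-messages scheme, OM(1), among four processors with at most one faulty unit; it is a plain-Python sketch, not the verified hardware design or its Spectool/Clio proof.

```python
# Toy simulation of interactive consistency among four processors with at most
# one faulty unit, following the oral-messages algorithm OM(1) of Pease,
# Shostak, and Lamport. A sketch for intuition, not the verified design.
from collections import Counter

def om1(commander_value, faulty=None, n=4):
    # Processor 0 is the transmitter; 1..n-1 are receivers.
    received = {}
    for i in range(1, n):
        if faulty == 0:                       # a faulty transmitter may lie inconsistently
            received[i] = commander_value if i % 2 == 0 else 1 - commander_value
        else:
            received[i] = commander_value
    # Each receiver relays the value it got to every other receiver.
    relayed = {j: {} for j in range(1, n)}
    for i in range(1, n):
        for j in range(1, n):
            if i == j:
                continue
            relayed[j][i] = 1 - received[i] if faulty == i else received[i]
    # Each non-faulty receiver decides by majority over its own copy and the relays.
    decisions = {}
    for j in range(1, n):
        if j == faulty:
            continue
        votes = [received[j]] + list(relayed[j].values())
        decisions[j] = Counter(votes).most_common(1)[0][0]
    return decisions

print(om1(1, faulty=2))   # loyal transmitter: all loyal receivers adopt 1
print(om1(1, faulty=0))   # faulty transmitter: loyal receivers still agree with each other
```

With a loyal transmitter every non-faulty receiver adopts the transmitted value, and even with a faulty transmitter the non-faulty receivers still agree with one another, which is the interactive consistency property the mechanical proof establishes.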
Spin filter for arbitrary spins by substrate engineering
NASA Astrophysics Data System (ADS)
Pal, Biplab; Römer, Rudolf A.; Chakrabarti, Arunava
2016-08-01
We design spin filters for particles with potentially arbitrary spin S (= 1/2, 1, 3/2, ...) using a one-dimensional periodic chain of magnetic atoms as a quantum device. Describing the system within a tight-binding formalism we present an analytical method to unravel the analogy between a one-dimensional magnetic chain and a multi-strand ladder network. This analogy is crucial, and is subsequently exploited to engineer gaps in the energy spectrum by an appropriate choice of the magnetic substrate. We obtain an exact correlation between the magnitude of the spin of the incoming beam of particles and the magnetic moment of the substrate atoms in the chain desired for opening up of a spectral gap. Results of spin polarized transport, calculated within a transfer matrix formalism, are presented for particles having half-integer as well as higher spin states. We find that the chain can be made to act as a quantum device which opens a transmission window only for selected spin components over certain ranges of the Fermi energy, blocking them in the remaining part of the spectrum. The results appear to be robust even when the choice of the substrate atoms deviates substantially from the ideal situation, as verified by extending the ideas to the case of a ‘spin spiral’. Interestingly, the spin spiral geometry, apart from exhibiting the filtering effect, is also seen to act as a device flipping spins, an effect that can be monitored by an interplay of the system size and the period of the spiral. Our scheme is applicable to ultracold quantum gases, and might inspire future experiments in this direction.
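As a much-reduced illustration of the transfer-matrix machinery, the sketch below computes the transmission of a spinless particle through a short 1-D tight-binding chain between perfect leads; the on-site energies are invented, and the multi-strand spin structure of the actual model is not represented.

```python
# Minimal scalar transfer-matrix calculation for a 1-D tight-binding chain
# between perfect leads: a spinless toy of the formalism cited above, with
# illustrative on-site energies (not the magnetic multi-strand model).
import numpy as np

def transmission(E, onsite, t=1.0):
    """Transmission through sites with energies `onsite`, hopping t, leads with on-site 0."""
    k = np.arccos(E / (2.0 * t))              # lead dispersion E = 2 t cos(k), |E| < 2t
    M = np.eye(2, dtype=complex)
    for eps in onsite:                        # build the total transfer matrix
        P = np.array([[(E - eps) / t, -1.0], [1.0, 0.0]], dtype=complex)
        M = P @ M
    N = len(onsite)
    # Match plane waves: psi_n = e^{ikn} + r e^{-ikn} (left), tau e^{ikn} (right),
    # with (psi_{N+1}, psi_N)^T = M (psi_1, psi_0)^T; solve for (r, tau).
    A = np.array([
        [M[0, 0] * np.exp(-1j * k) + M[0, 1], -np.exp(1j * k * (N + 1))],
        [M[1, 0] * np.exp(-1j * k) + M[1, 1], -np.exp(1j * k * N)],
    ])
    b = -np.array([M[0, 0] * np.exp(1j * k) + M[0, 1],
                   M[1, 0] * np.exp(1j * k) + M[1, 1]])
    r, tau = np.linalg.solve(A, b)
    return abs(tau) ** 2                      # identical leads: T = |tau|^2

onsite = [0.5, -0.3, 0.5, -0.3, 0.5]          # hypothetical substrate potential
for E in (-1.0, 0.0, 1.0):
    print(f"E = {E:+.1f}  T = {transmission(E, onsite):.3f}")
```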
From Goal-Oriented Requirements to Event-B Specifications
NASA Technical Reports Server (NTRS)
Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe
2009-01-01
In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/ hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specification of services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to requirements.
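A minimal Python analogue of such an operationalisation is sketched below: a hypothetical requirement is turned into guarded events whose actions preserve an invariant. Event-B proper discharges these invariants as proof obligations with a prover; this sketch only checks them at runtime.

```python
# Minimal Python analogue of an operationalised goal: a hypothetical requirement
# "never dispense more than the authorised amount" becomes guarded events, and
# the corresponding invariant is checked after every event. Event-B discharges
# such invariants as proof obligations; here they are only checked dynamically.
class DispenserMachine:
    def __init__(self, authorised):
        self.authorised = authorised   # constant (Event-B context)
        self.dispensed = 0             # variable (Event-B machine state)

    def invariant(self):
        return 0 <= self.dispensed <= self.authorised

    def dispense(self, amount):        # event: guard, then action
        if amount > 0 and self.dispensed + amount <= self.authorised:  # guard
            self.dispensed += amount                                   # action
            assert self.invariant(), "invariant violated"
            return True
        return False                   # guard false: event not enabled

m = DispenserMachine(authorised=100)
print(m.dispense(60))   # True  -> dispensed = 60
print(m.dispense(50))   # False -> guard blocks overshoot, invariant preserved
```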
Chemical vapor deposition modeling for high temperature materials
NASA Technical Reports Server (NTRS)
Gokoglu, Suleyman A.
1992-01-01
The formalism for the accurate modeling of chemical vapor deposition (CVD) processes has matured based on the well established principles of transport phenomena and chemical kinetics in the gas phase and on surfaces. The utility and limitations of such models are discussed in practical applications for high temperature structural materials. Attention is drawn to the complexities and uncertainties in chemical kinetics. Traditional approaches based on only equilibrium thermochemistry and/or transport phenomena are defended as useful tools, within their validity, for engineering purposes. The role of modeling is discussed within the context of establishing the link between CVD process parameters and material microstructures/properties. It is argued that CVD modeling is an essential part of designing CVD equipment and controlling/optimizing CVD processes for the production and/or coating of high performance structural materials.
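As a generic, hedged example of the kind of engineering estimate such models support, the sketch below combines an Arrhenius surface rate constant with a gas-phase mass-transfer coefficient in series; all parameter values are illustrative placeholders, not data from the paper.

```python
# Generic CVD deposition-rate estimate: surface kinetics (Arrhenius) in series
# with gas-phase mass transfer. Parameter values below are illustrative only.
import math

R_GAS = 8.314                 # J/(mol K)

def deposition_rate(T, C_bulk, A=2.0e5, Ea=1.5e5, h_g=0.02):
    """Growth flux (mol/m^2/s) for bulk precursor concentration C_bulk (mol/m^3)."""
    k_s = A * math.exp(-Ea / (R_GAS * T))     # surface rate constant, m/s
    k_eff = 1.0 / (1.0 / k_s + 1.0 / h_g)     # series resistances: kinetics + transport
    return k_eff * C_bulk

for T in (900.0, 1100.0, 1300.0):             # deposition temperature, K
    rate = deposition_rate(T, C_bulk=1.0)
    regime = "surface-limited" if rate < 0.5 * 0.02 else "transport-limited"  # 0.02 = h_g
    print(f"T = {T:6.0f} K  rate = {rate:.3e} mol/m^2/s  ({regime})")
```

Sweeping the temperature shows the familiar crossover from surface-kinetics-limited to transport-limited deposition, the kind of link between process parameters and growth behavior the paper argues modeling should provide.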
Effective Tools and Resources from the MAVEN Education and Public Outreach Program
NASA Astrophysics Data System (ADS)
Mason, T.
2015-12-01
Since 2010, NASA's Mars Atmosphere and Volatile Evolution (MAVEN) Education and Public Outreach (E/PO) team has developed and implemented a robust and varied suite of projects, serving audiences of all ages and diverse backgrounds from across the country. With a program designed to reach formal K-12 educators and students, afterschool and summertime communities, museum docents, journalists, and online audiences, we have incorporated an equally varied approach to developing tools, resources, and evaluation methods to specifically reach each target population and to determine the effectiveness of our efforts. This poster will highlight some of the tools and resources we have developed to share the complex science and engineering of the MAVEN mission, as well as initial evaluation results and lessons-learned from each of our E/PO projects.
Crisis Management for Secondary Education: A Survey of Secondary Education Directors in Greece
ERIC Educational Resources Information Center
Savelides, Socrates; Mihiotis, Athanassios; Koutsoukis, Nikitas-Spiros
2015-01-01
Purpose: The Greek secondary education system lacks a formal crisis management system. The purpose of this paper is to address this problem as follows: elicit current crisis management practices, outline features for designing a formal crisis management system in Greece. Design/methodology/approach: The research is based on a survey conducted with…
STEM promotion through museum exhibits on cardiac monitoring & cardiac rhythm management.
Countryman, Jordan D; Dow, Douglas E
2014-01-01
Formal education in science, technology, engineering and math (STEM) does not successfully engage all of the students who have potential to become skilled in STEM activities and careers. Museum exhibits may be able to reach and engage a broader range of the public. STEM exhibits that are both understandable and able to capture the imagination of viewers may contribute toward increased interest in STEM activities. One topic for such an exhibit could be the cardiac pacemakers and cardioverter defibrillators that sustain life. Although museums have existed for centuries, the available types of exhibit design have dramatically increased in recent decades due to innovations in technology. Science and technology museums have especially taken advantage of this progression of exhibit design to develop new ways to communicate with their viewers. These novel presentation tools allow museums to more effectively inform and engage viewers. This paper examines the techniques employed by museums in exhibits and considers the practices of several museums with exhibits related to cardiac monitoring (CM) and cardiac rhythm management (CRM).
A practical review of energy saving technology for ageing populations.
Walker, Guy; Taylor, Andrea; Whittet, Craig; Lynn, Craig; Docherty, Catherine; Stephen, Bruce; Owens, Edward; Galloway, Stuart
2017-07-01
Fuel poverty is a critical issue for a globally ageing population. Longer heating/cooling requirements combine with declining incomes to create a problem in need of urgent attention. One solution is to deploy technology to help elderly users feel informed about their energy use, and empowered to take steps to make it more cost effective and efficient. This study subjects a broad cross section of energy monitoring and home automation products to a formal ergonomic analysis. A high level task analysis was used to guide a product walk through, and a toolkit approach was used thereafter to drive out further insights. The findings reveal a number of serious usability issues which prevent these products from successfully accessing an important target demographic and associated energy saving and fuel poverty outcomes. Design principles and examples are distilled from the research to enable practitioners to translate the underlying research into high quality design-engineering solutions. Copyright © 2017 Elsevier Ltd. All rights reserved.
The application of exergy to human-designed systems
NASA Astrophysics Data System (ADS)
Hamilton, P.
2012-12-01
Exergy is the portion of the total energy of a system that is available for conversion to useful work. Exergy takes into account both the quantity and quality of energy. Heat is the inevitable product of using any form of high-quality energy such as electricity. Modern commercial buildings and industrial facilities use large amounts of electricity and so produce huge amounts of heat. This heat energy typically is treated as a waste product and discharged to the environment and then high-quality energy sources are consumed to satisfy low-quality energy heating and cooling needs. Tens of thousands of buildings and even whole communities could meet much of their heating and cooling needs through the capture and reuse of heat energy. Yet the application of exergy principles often faces resistance because it challenges conventions about how we design, construct and operate human-engineered systems. This session will review several exergy case studies and conclude with an audience discussion of how exergy principles may be both applied and highlighted in formal and informal education settings.
A Formal Methods Approach to the Analysis of Mode Confusion
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness , mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).
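As a hedged, toy-scale illustration of the idea, the sketch below explores all reachable states of a simplified autopilot mode logic alongside a naive pilot mental model and reports every reachable state in which the two diverge; the modes, events, and transition tables are invented, not the Rockwell Collins FGS model.

```python
# Toy mode-confusion check: explore all reachable states of a simplified
# autopilot mode logic together with a naive pilot mental model, and report
# states where the two disagree. Hypothetical modes/events, not the actual FGS.
from collections import deque

EVENTS = ["SELECT_ALT", "CAPTURE_ALT", "SELECT_VS", "AUTO_REVERT"]

def automation(mode, event):
    table = {
        ("VS", "CAPTURE_ALT"): "ALT_HOLD",     # silent capture: automation changes mode
        ("VS", "SELECT_ALT"): "ALT_ACQ",
        ("ALT_ACQ", "CAPTURE_ALT"): "ALT_HOLD",
        ("ALT_HOLD", "SELECT_VS"): "VS",
        ("ALT_HOLD", "AUTO_REVERT"): "VS",     # envelope protection reverts the mode
    }
    return table.get((mode, event), mode)

def pilot_model(mode, event):
    # The pilot only tracks modes changed by explicit selections.
    table = {("VS", "SELECT_ALT"): "ALT_ACQ", ("ALT_HOLD", "SELECT_VS"): "VS"}
    return table.get((mode, event), mode)

def find_confusion(initial="VS", depth=4):
    seen, queue, confusing = set(), deque([(initial, initial, ())]), []
    while queue:
        auto, pilot, trace = queue.popleft()
        if (auto, pilot) in seen:
            continue
        seen.add((auto, pilot))
        if auto != pilot:
            confusing.append((trace, auto, pilot))
        if len(trace) < depth:
            for e in EVENTS:
                queue.append((automation(auto, e), pilot_model(pilot, e), trace + (e,)))
    return confusing

for trace, auto, pilot in find_confusion():
    print(f"after {trace}: automation={auto}, pilot believes={pilot}")
```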
23 CFR 172.5 - Methods of procurement.
Code of Federal Regulations, 2011 CFR
2011-04-01
... and ranked by the contracting agency using one of the following procedures: (1) Competitive negotiation. Contracting agencies shall use competitive negotiation for the procurement of engineering and... and selection phase. Alternatively, a formal procedure adopted by State Statute enacted into law prior...
Enhancing Formal E-Learning with Edutainment on Social Networks
ERIC Educational Resources Information Center
Labus, A.; Despotovic-Zrakic, M.; Radenkovic, B.; Bogdanovic, Z.; Radenkovic, M.
2015-01-01
This paper reports on the investigation of the possibilities of enhancing the formal e-learning process by harnessing the potential of informal game-based learning on social networks. The goal of the research is to improve the outcomes of the formal learning process through the design and implementation of an educational game on a social network…
42 CFR 411.380 - When CMS issues a formal advisory opinion.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 2 2010-10-01 2010-10-01 false When CMS issues a formal advisory opinion. 411.380... Relationships Between Physicians and Entities Furnishing Designated Health Services § 411.380 When CMS issues a formal advisory opinion. (a) CMS considers an advisory opinion to be issued once it has received payment...
Bottrighi, Alessio; Terenziani, Paolo
2016-09-01
Several different computer-assisted management systems for computer-interpretable guidelines (CIGs) have been developed by the Artificial Intelligence in Medicine community. Each CIG system is characterized by a specific formalism to represent CIGs, and usually provides a manager to acquire, consult and execute them. Though there are several commonalities between most formalisms in the literature, each formalism has its own peculiarities. The goal of our work is to provide flexible support for the extension or definition of CIG formalisms, and of their acquisition and execution engines. Instead of defining "yet another CIG formalism and its manager", we propose META-GLARE (META Guideline Acquisition, Representation, and Execution), a "meta"-system to define new CIG systems. In this paper, META-GLARE, a meta-system to define new CIG systems, is presented. We try to capture the commonalities among current CIG approaches by providing (i) a general manager for the acquisition, consultation and execution of hierarchical graphs (representing the control flow of actions in CIGs), parameterized over the types of nodes and of arcs constituting it, and (ii) a library of different elementary components of guideline nodes (actions) and arcs, in which each type definition involves the specification of how objects of this type can be acquired, consulted and executed. We provide generality and flexibility by allowing free aggregations of such elementary components to define new primitive node and arc types. We have carried out several experiments in which we used META-GLARE to build a CIG system (Experiment 1 in Section 8) or to extend one (Experiments 2 and 3). These experiments show that META-GLARE provides useful and easy-to-use support for such tasks. For instance, re-building the Guideline Acquisition, Representation, and Execution (GLARE) system using META-GLARE required less than one day (Experiment 1). META-GLARE is a meta-system for CIGs supporting fast prototyping. Since META-GLARE provides acquisition and execution engines that are parametric over the specific CIG formalism, it supports easy update and construction of CIG systems. Copyright © 2016 Elsevier B.V. All rights reserved.
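A minimal sketch of the underlying idea, parameterizing a generic engine over a registry of node types, is given below; the node types and guideline content are hypothetical and the sketch is far simpler than META-GLARE itself.

```python
# Minimal sketch of a node-type registry and a generic execution engine that is
# parametric over it, echoing the "meta" idea above. Node types are hypothetical.
NODE_TYPES = {}

def node_type(name):
    def register(cls):
        NODE_TYPES[name] = cls
        return cls
    return register

@node_type("query")
class QueryNode:
    def __init__(self, spec): self.question = spec["question"]
    def execute(self, ctx): ctx[self.question] = True; return f"asked: {self.question}"

@node_type("action")
class ActionNode:
    def __init__(self, spec): self.drug = spec["drug"]
    def execute(self, ctx): return f"administer {self.drug}"

def run_guideline(nodes):
    """Generic engine: only the registry knows how each node type behaves."""
    ctx = {}
    return [NODE_TYPES[n["type"]](n).execute(ctx) for n in nodes]

print(run_guideline([{"type": "query", "question": "fever?"},
                     {"type": "action", "drug": "paracetamol"}]))
```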
2012-07-15
Expedition 32 Flight Engineer Sunita Williams, right, Soyuz Commander Yuri Malenchenko and JAXA Flight Engineer Akihiko Hoshide, left, receive a formal go for launch from Vitaly Alexandrovich Lopota, President of Energia, left, and Vladimir Popovkin, Director of Roscosmos prior to their launch onboard the Soyuz TMA-05M on Sunday, July 15, 2012 at the Baikonur Cosmodrome in Kazakhstan. The Soyuz spacecraft with Malenchenko, Williams and Hoshide onboard launched at 8:40 a.m. later that morning Kazakhstan time. Photo Credit: (NASA/Victor Zelentsov)
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems the software engineering process has to be improved in at least two respects: (1) software design and (2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time and resource consuming activity in the software development process. Generating a short, but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to an on-board software system. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
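The sketch below gives a hedged, minimal version of that idea: abstract test cases are derived from a toy state-machine model by covering every transition, each test pairing an input sequence with the expected final state; the mode-manager model is invented for illustration.

```python
# Sketch of model-based test generation: derive one abstract test case per
# transition of a toy state-machine model (hypothetical on-board mode manager),
# each test being an input sequence plus the expected final state.
from collections import deque

TRANSITIONS = {                      # (state, input) -> next state
    ("SAFE", "arm"): "ARMED",
    ("ARMED", "disarm"): "SAFE",
    ("ARMED", "start_obs"): "OBSERVING",
    ("OBSERVING", "stop_obs"): "ARMED",
    ("OBSERVING", "fault"): "SAFE",
}

def shortest_path_to(state, initial="SAFE"):
    """Breadth-first search for an input sequence reaching `state`."""
    queue, seen = deque([(initial, [])]), {initial}
    while queue:
        s, path = queue.popleft()
        if s == state:
            return path
        for (src, inp), dst in TRANSITIONS.items():
            if src == s and dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [inp]))
    return None

def generate_tests():
    tests = []
    for (src, inp), dst in TRANSITIONS.items():
        prefix = shortest_path_to(src)          # reach the transition's source state
        tests.append({"inputs": prefix + [inp], "expected_state": dst})
    return tests

for t in generate_tests():
    print(t)
```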
2002 Computing and Interdisciplinary Systems Office Review and Planning Meeting
NASA Technical Reports Server (NTRS)
Lytle, John; Follen, Gregory; Lopez, Isaac; Veres, Joseph; Lavelle, Thomas; Sehra, Arun; Freeh, Josh; Hah, Chunill
2003-01-01
The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with NASA Glenn's Propulsion program, NASA Ames, industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This year's review meeting describes the current status of the NPSS and the Object Oriented Development Kit with specific emphasis on the progress made over the past year on air breathing propulsion applications for aeronautics and space transportation applications. Major accomplishments include the first 3-D simulation of the primary flow path of a large turbofan engine in less than 15 hours, and the formal release of the NPSS Version 1.5 that includes elements of rocket engine systems and a visual based syntax layer. NPSS and the Development Kit are managed by the Computing and Interdisciplinary Systems Office (CISO) at the NASA Glenn Research Center and financially supported in fiscal year 2002 by the Computing, Networking and Information Systems (CNIS) project managed at NASA Ames, the Glenn Aerospace Propulsion and Power Program and the Advanced Space Transportation Program.
ERIC Educational Resources Information Center
Monk, John S.; And Others
A multiple-group, single-intervention intensive time-series design was used to examine the achievement of an abstract concept, plate tectonics, of students grouped on the basis of cognitive tendency. Two questions were addressed: (1) How do daily achievement patterns differ between formal and concrete cognitive tendency groups when learning an…
ERIC Educational Resources Information Center
Manurung, Sondang R.; Mihardi, Satria
2016-01-01
The purpose of this study was to determine the effectiveness of hypertext-media-based kinematics learning and formal thinking ability in improving the conceptual understanding of prospective physics students. The research design used is the one-group pretest-posttest experimental design, carried out by taking 36 students from…
The path to next generation biofuels: successes and challenges in the era of synthetic biology
2010-01-01
Volatility of oil prices along with major concerns about climate change, oil supply security and depleting reserves have sparked renewed interest in the production of fuels from renewable resources. Recent advances in synthetic biology provide new tools for metabolic engineers to direct their strategies and construct optimal biocatalysts for the sustainable production of biofuels. Metabolic engineering and synthetic biology efforts entailing the engineering of native and de novo pathways for conversion of biomass constituents to short-chain alcohols and advanced biofuels are herewith reviewed. In the foreseeable future, formal integration of functional genomics and systems biology with synthetic biology and metabolic engineering will undoubtedly support the discovery, characterization, and engineering of new metabolic routes and more efficient microbial systems for the production of biofuels. PMID:20089184
Kamath, Janine R. A.; Osborn, John B.; Roger, Véronique L.; Rohleder, Thomas R.
2011-01-01
In August 2010, the Third Annual Mayo Clinic Conference on Systems Engineering and Operations Research in Health Care was held. The continuing mission of the conference is to gather a multidisciplinary group of systems engineers, clinicians, administrators, and academic professors to discuss the translation of systems engineering methods to more effective health care delivery. Education, research, and practice were enhanced via a mix of formal presentations, tutorials, and informal gatherings of participants with diverse backgrounds. Although the conference promotes a diversity of perspectives and methods, participants are united in their desire to find ways in which systems engineering can transform health care, especially in the context of health care reform and other significant changes affecting the delivery of health care. PMID:21803959
Formal Foundations for Hierarchical Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh; Whiteside, Iain
2015-01-01
Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.
Aqua Education and Public Outreach
NASA Astrophysics Data System (ADS)
Graham, S. M.; Parkinson, C. L.; Chambers, L. H.; Ray, S. E.
2011-12-01
NASA's Aqua satellite was launched on May 4, 2002, with six instruments designed to collect data about the Earth's atmosphere, biosphere, hydrosphere, and cryosphere. Since the late 1990s, the Aqua mission has involved considerable education and public outreach (EPO) activities, including printed products, formal education, an engineering competition, webcasts, and high-profile multimedia efforts. The printed products include Aqua and instrument brochures, an Aqua lithograph, Aqua trading cards, NASA Fact Sheets on Aqua, the water cycle, and weather forecasting, and an Aqua science writers' guide. On-going formal education efforts include the Students' Cloud Observations On-Line (S'COOL) Project, the MY NASA DATA Project, the Earth System Science Education Alliance, and, in partnership with university professors, undergraduate student research modules. Each of these projects incorporates Aqua data into its inquiry-based framework. Additionally, high school and undergraduate students have participated in summer internship programs. An earlier formal education activity was the Aqua Engineering Competition, which was a high school program sponsored by the NASA Goddard Space Flight Center, Morgan State University, and the Baltimore Museum of Industry. The competition began with the posting of a Round 1 Aqua-related engineering problem in December 2002 and concluded in April 2003 with a final round of competition among the five finalist teams. The Aqua EPO efforts have also included a wide range of multimedia products. Prior to launch, the Aqua team worked closely with the Special Projects Initiative (SPI) Office to produce a series of live webcasts on Aqua science and the Cool Science website aqua.nasa.gov/coolscience, which displays short video clips of Aqua scientists and engineers explaining the many aspects of the Aqua mission. These video clips, the Aqua website, and numerous presentations have benefited from dynamic visualizations showing the Aqua launch, instrument deployments, instrument sensing, and the Aqua orbit. More recently, in 2008 the Aqua team worked with the ViewSpace production team from the Space Telescope Science Institute to create an 18-minute ViewSpace feature showcasing the science and applications of the Aqua mission. Then in 2010 and 2011, Aqua and other NASA Earth-observing missions partnered with National CineMedia on the "Know Your Earth" (KYE) project. During January and July 2010 and 2011, KYE ran 2-minute segments highlighting questions that promoted global climate literacy on lobby LCD screens in movie theaters throughout the U.S. Among the ongoing Aqua EPO efforts is the incorporation of Aqua data sets onto the Dynamic Planet, a large digital video globe that projects a wide variety of spherical data sets. Aqua also has a highly successful collaboration with EarthSky communications on the production of an Aqua/EarthSky radio show and podcast series. To date, eleven productions have been completed and distributed via the EarthSky network. In addition, a series of eight video podcasts (i.e., vodcasts) are under production by NASA Goddard TV in conjunction with Aqua personnel, highlighting various aspects of the Aqua mission.
ERIC Educational Resources Information Center
Wilson, Kristy J.; Brickman, Peggy; Brame, Cynthia J.
2018-01-01
Science, technology, engineering, and mathematics faculty are increasingly incorporating both formal and informal group work in their courses. Implementing group work can be improved by an understanding of the extensive body of educational research studies on this topic. This essay describes an online, evidence-based teaching guide published by…
2011-01-01
Background Academic literature and international standards bodies suggest that user involvement, via the incorporation of human factors engineering methods within the medical device design and development (MDDD) process, offers many benefits that enable the development of safer and more usable medical devices that are better suited to users' needs. However, little research has been carried out to explore medical device manufacturers' beliefs and attitudes towards user involvement within this process, or indeed what value they believe can be added by doing so. Methods In-depth interviews with representatives from 11 medical device manufacturers are carried out. We ask them to specify who they believe the intended users of the device to be, who they consult to inform the MDDD process, what role they believe the user plays within this process, and what value (if any) they believe users add. Thematic analysis is used to analyse the fully transcribed interview data, to gain insight into medical device manufacturers' beliefs and attitudes towards user involvement within the MDDD process. Results A number of high-level themes emerged, relating to who the user is perceived to be, the methods used, the perceived value and barriers to user involvement, and the nature of user contributions. The findings reveal that despite standards agencies and academic literature offering strong support for the employment of formal methods, manufacturers are still hesitant due to a range of factors including: perceived barriers to obtaining ethical approval; the speed at which such activity may be carried out; the belief that there is no need given the 'all-knowing' nature of senior health care staff and clinical champions; and a belief that effective results are achievable by consulting a minimal number of champions. Furthermore, less senior health care practitioners and patients were rarely seen as being able to provide valuable input into the process. Conclusions Medical device manufacturers often do not see the benefit of employing formal human factors engineering methods within the MDDD process. Research is required to better understand the day-to-day requirements of manufacturers within this sector. The development of new or adapted methods may be required if user involvement is to be fully realised. PMID:21356097
ERIC Educational Resources Information Center
Ward, Ted W.; Herzog, William A., Jr.
This document is part of a series dealing with nonformal education. Introductory information is included in document SO 008 058. The focus of this report is on the learning effectiveness of nonformal education. Chapter 1 compares effective learning in a formal and nonformal environment. Chapter 2 develops a systems model for designers of learning…
ERIC Educational Resources Information Center
Penning, Margaret J.
2002-01-01
Purpose: In response to concerns among policymakers and others that increases in the availability of publicly funded formal services will lead to reductions in self- and informal care, this study examines the relationship between the extent of formal in-home care received and levels of self- and informal care. Design and Methods: Two-stage least…
The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid
ERIC Educational Resources Information Center
McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom
2004-01-01
Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…
ClusterCAD: a computational platform for type I modular polyketide synthase design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eng, Clara H.; Backman, Tyler W H; Bailey, Constance B.
Here, we present ClusterCAD, a web-based toolkit designed to leverage the collinear structure and deterministic logic of type I modular polyketide synthases (PKSs) for synthetic biology applications. The unique organization of these megasynthases, combined with the diversity of their catalytic domain building blocks, has fueled an interest in harnessing the biosynthetic potential of PKSs for the microbial production of both novel natural product analogs and industrially relevant small molecules. However, a limited theoretical understanding of the determinants of PKS fold and function poses a substantial barrier to the design of active variants, and identifying strategies to reliably construct functional PKS chimeras remains an active area of research. In this work, we formalize a paradigm for the design of PKS chimeras and introduce ClusterCAD as a computational platform to streamline and simplify the process of designing experiments to test strategies for engineering PKS variants. ClusterCAD provides chemical structures with stereochemistry for the intermediates generated by each PKS module, as well as sequence- and structure-based search tools that allow users to identify modules based either on amino acid sequence or on the chemical structure of the cognate polyketide intermediate. ClusterCAD can be accessed at https://clustercad.jbei.org and at http://clustercad.igb.uci.edu.
Gendrault, Yves; Madec, Morgan; Lallement, Christophe; Haiech, Jacques
2014-04-01
Nowadays, synthetic biology is a hot research topic. Each day, progress is made toward increasing the complexity of artificial biological functions, tending toward complex biodevices and biosystems. Up to now, these systems have been handmade by bioengineers, which requires strong technical skills and leads to non-reusable development. Meanwhile, scientific fields that share the same design approach, such as microelectronics, have already overcome several of these issues, and designers succeed in building extremely complex systems with many evolved functions. In systems engineering, and more specifically in microelectronics, the development of the domain has been promoted by both the improvement of technological processes and electronic design automation tools. The work presented in this paper paves the way for the adaptation of microelectronics design tools to synthetic biology. Considering the similarities and differences between synthetic biology and microelectronics, the milestones of this adaptation are described. The first one concerns the modeling of biological mechanisms. To do so, a new formalism is proposed, based on an extension of the generalized Kirchhoff laws to biology. This way, a description of all biological mechanisms can be made with languages widely used in microelectronics. The approach is then successfully validated on specific examples drawn from the literature.
ClusterCAD: a computational platform for type I modular polyketide synthase design
Eng, Clara H.; Backman, Tyler W H; Bailey, Constance B.; ...
2017-10-11
Here, we present ClusterCAD, a web-based toolkit designed to leverage the collinear structure and deterministic logic of type I modular polyketide synthases (PKSs) for synthetic biology applications. The unique organization of these megasynthases, combined with the diversity of their catalytic domain building blocks, has fueled an interest in harnessing the biosynthetic potential of PKSs for the microbial production of both novel natural product analogs and industrially relevant small molecules. However, a limited theoretical understanding of the determinants of PKS fold and function poses a substantial barrier to the design of active variants, and identifying strategies to reliably construct functional PKS chimeras remains an active area of research. In this work, we formalize a paradigm for the design of PKS chimeras and introduce ClusterCAD as a computational platform to streamline and simplify the process of designing experiments to test strategies for engineering PKS variants. ClusterCAD provides chemical structures with stereochemistry for the intermediates generated by each PKS module, as well as sequence- and structure-based search tools that allow users to identify modules based either on amino acid sequence or on the chemical structure of the cognate polyketide intermediate. ClusterCAD can be accessed at https://clustercad.jbei.org and at http://clustercad.igb.uci.edu.
Formalization of the Access Control on ARM-Android Platform with the B Method
NASA Astrophysics Data System (ADS)
Ren, Lu; Wang, Wei; Zhu, Xiaodong; Man, Yujia; Yin, Qing
2018-01-01
ARM-Android is a widespread mobile platform with multi-layer access control mechanisms that are security-critical in the system. Many access control vulnerabilities still exist due to the coarse-grained policy and numerous engineering defects, which have been widely studied. However, little research has focused on formalization of these mechanisms, including the Android permission framework, kernel process management and hardware isolation. This paper first develops a comprehensive formal access control model on the ARM-Android platform using the B method, from the Android middleware to the hardware layer. All the model specifications are type checked and proved to be well-defined, with 75% of proof obligations demonstrated automatically. The results show that the proposed B model is feasible to specify and verify access control schemes in the ARM-Android system, and capable of implementing a practical control module.
User Interface Technology for Formal Specification Development
NASA Technical Reports Server (NTRS)
Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.
Theory of Collective Intelligence
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2003-01-01
In this chapter an analysis of the behavior of an arbitrary (perhaps massive) collective of computational processes in terms of an associated "world" utility function is presented. We concentrate on the situation where each process in the collective can be viewed as though it were striving to maximize its own private utility function. For such situations the central design issue is how to initialize/update the collective's structure, and in particular the private utility functions, so as to induce the overall collective to behave in a way that has large values of the world utility. Traditional "team game" approaches to this problem simply set each private utility function equal to the world utility function. The "Collective Intelligence" (COIN) framework is a semi-formal set of heuristics that recently have been used to construct private utility functions that in many experiments have resulted in world utility values up to orders of magnitude superior to those ensuing from use of the team game utility. In this paper we introduce a formal mathematics for analyzing and designing collectives. We also use this mathematics to suggest new private utilities that should outperform the COIN heuristics in certain kinds of domains. In accompanying work we use that mathematics to explain previous experimental results concerning the superiority of COIN heuristics. In that accompanying work we also use the mathematics to make numerical predictions, some of which we then test. In this way these two papers establish the study of collectives as a proper science, involving theory, explanation of old experiments, prediction concerning new experiments, and engineering insights.
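As a hedged toy example of the distinction, the fragment below evaluates a team-game private utility (equal to the world utility) and a clamped difference utility for each agent in a small congestion scenario; the world utility function and actions are invented, not the chapter's formalism.

```python
# Toy comparison of private utilities for agents in a congestion scenario:
# the team-game utility (equal to the world utility) versus a clamped
# "difference" utility. Illustrative numbers only, not the chapter's mathematics.
from collections import Counter

def world_utility(joint_action):
    """Reward spreading agents across resources: diminishing returns per resource."""
    counts = Counter(joint_action)
    return sum(n - 0.25 * n * n for n in counts.values())

def difference_utility(joint_action, agent, clamp="A"):
    """World utility minus the world utility with `agent` clamped to a fixed action."""
    clamped = list(joint_action)
    clamped[agent] = clamp
    return world_utility(joint_action) - world_utility(clamped)

joint = ["A", "A", "B", "A"]          # four agents choosing between resources A and B
for agent in range(len(joint)):
    g = world_utility(joint)                      # team-game private utility
    d = difference_utility(joint, agent)          # difference private utility
    print(f"agent {agent} ({joint[agent]}): team-game = {g:.2f}, difference = {d:.2f}")
```

The difference utility isolates each agent's own contribution to the world utility, which is the kind of learnable signal the COIN-style constructions are intended to provide.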
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.
ERIC Educational Resources Information Center
Laosa, Luis M.; And Others
As the second volume in a 4-volume evaluation report on the University of Massachusetts Non-Formal Education Project (UMass NFEP) in rural Ecuador, this volume details the evaluation design. Cited as basic to the evaluation design are questions which ask: (1) What kinds of effects (changes) can be observed? and (2) What are characteristics of the…
Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.
1996-04-01
This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time systems specifications was conducted to determine their extensibility in specifying real-time systems on parallel architectures.
Bilitchenko, Lesia; Liu, Adam; Cheung, Sherine; Weeding, Emma; Xia, Bing; Leguia, Mariana; Anderson, J. Christopher; Densmore, Douglas
2011-01-01
Background Synthetic biological systems are currently created by an ad-hoc, iterative process of specification, design, and assembly. These systems would greatly benefit from a more formalized and rigorous specification of the desired system components as well as constraints on their composition. Therefore, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) that allows for the specification of synthetic biological designs based on biological parts, as well as provides a very expressive constraint system to drive the automatic creation of composite Parts (Devices) from a collection of individual Parts. Results We illustrate Eugene's capabilities in three different areas: Device specification, design space exploration, and assembly and simulation integration. These results highlight Eugene's ability to create combinatorial design spaces and prune these spaces for simulation or physical assembly. Eugene creates functional designs quickly and cost-effectively. Conclusions Eugene is intended for forward engineering of DNA-based devices, and through its data types and execution semantics, reflects the desired abstraction hierarchy in synthetic biology. Eugene provides a powerful constraint system which can be used to drive the creation of new devices at runtime. It accomplishes all of this while being part of a larger tool chain which includes support for design, simulation, and physical device assembly. PMID:21559524
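A hedged, minimal analogue of this specify-and-constrain workflow in plain Python is sketched below: candidate devices are enumerated from part collections and pruned by constraint predicates; the part names and rules are hypothetical and the sketch is not Eugene syntax.

```python
# Tiny sketch of constraint-driven design-space pruning in plain Python:
# enumerate promoter/RBS/CDS/terminator combinations and keep only devices
# satisfying user constraints. Part names are hypothetical; not Eugene syntax.
from itertools import product

PARTS = {
    "promoter":   ["pLac", "pTet", "pBad"],
    "rbs":        ["rbs_weak", "rbs_strong"],
    "cds":        ["gfp", "rfp"],
    "terminator": ["T1", "T7te"],
}

CONSTRAINTS = [
    lambda d: not (d["promoter"] == "pBad" and d["rbs"] == "rbs_strong"),  # avoid overload
    lambda d: d["cds"] != "rfp" or d["terminator"] == "T1",                # pairing rule
]

def design_space():
    keys = list(PARTS)
    for combo in product(*(PARTS[k] for k in keys)):
        device = dict(zip(keys, combo))
        if all(c(device) for c in CONSTRAINTS):
            yield device

total = 1
for options in PARTS.values():
    total *= len(options)
devices = list(design_space())
print(f"{len(devices)} of {total} candidate devices satisfy all constraints")
print(devices[0])
```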
Propulsion and Energetics Panel Working Group 15 on the Uniform Engine Test Programme
1990-02-01
earlier test of uniform aerodynamic models in wind tunnels under the auspices of the Fluid Dynamics Panel. A formal proposal was presented to the...this major new effort and members of the engine test community throughout AGARD were selected to serve on Working Group 15 along with PEP...
Expedition 31 Crew Prepares For Launch
2012-05-15
Expedition 31 Flight Engineer Joe Acaba, left, Soyuz Commander Gennady Padalka, and, Flight Engineer Sergei Revin, right, receive a formal go for launch from Vitaly Alexandrovich Lopota, President of Energia, left, and Vladimir Popovkin, Director of Roscosmos prior to their launch onboard the Soyuz TMA-04M on Tuesday, May 15, 2012 at the Baikonur Cosmodrome in Kazakhstan. The Soyuz spacecraft with Padalka, Revin, and Acaba onboard, launched at 9:01 a.m. Kazakhstan time on Tuesday, May 15. Photo Credit: (NASA/GCTC/Andrey Shelepin)
Epistemology, software engineering and formal methods
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
1994-01-01
One of the most basic questions anyone can ask is, 'How do I know that what I think I know is true?' The study of this question is called epistemology. Traditionally, epistemology has been considered to be of legitimate interest only to philosophers, theologians, and three year old children who respond to every statement by asking, 'Why?' Software engineers need to be interested in the subject, however, because a lack of sufficient understanding of epistemology contributes to many of the current problems in the field.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
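A hedged Monte Carlo sketch of this kind of evaluation is shown below: uncertain primitive variables are sampled, a simple stress response is computed, and its cumulative distribution and failure probability are estimated; the distributions and limits are illustrative, not SSME data.

```python
# Monte Carlo sketch of a probabilistic structural evaluation: sample uncertain
# primitive variables, compute a simple stress response, and estimate its CDF
# and failure probability. Distributions and limits are illustrative, not SSME data.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

load = rng.normal(50e3, 5e3, n)            # axial load, N (mean 50 kN, 10% scatter)
area = rng.normal(6.0e-4, 0.3e-4, n)       # cross-section area, m^2
strength = rng.normal(120e6, 10e6, n)      # material strength, Pa

stress = load / area                        # structural response variable
p_fail = np.mean(stress > strength)         # probability that demand exceeds capacity

# Empirical CDF of the response at a few probe levels.
for level in (70e6, 90e6, 110e6):
    print(f"P(stress <= {level / 1e6:5.0f} MPa) = {np.mean(stress <= level):.3f}")
print(f"estimated failure probability = {p_fail:.2e}")
```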
NASA Technical Reports Server (NTRS)
Johnson, James E.; Conley, Cassie; Siegel, Bette
2015-01-01
As systems, technologies, and plans for the human exploration of Mars and other destinations beyond low Earth orbit begin to coalesce, it is imperative that frequent and early consideration is given to how planetary protection practices and policy will be upheld. While the development of formal planetary protection requirements for future human space systems and operations may still be a few years from fruition, guidance to appropriately influence mission and system design will be needed soon to avoid costly design and operational changes. The path to constructing such requirements is a journey that espouses key systems engineering practices of understanding shared goals, objectives and concerns, identifying key stakeholders, and iterating a draft requirement set to gain community consensus. This paper traces through each of these practices, beginning with a literature review of nearly three decades of publications addressing planetary protection concerns with respect to human exploration. Key goals, objectives and concerns, particularly with respect to notional requirements, required studies and research, and technology development needs have been compiled and categorized to provide a current 'state of knowledge'. This information, combined with the identification of key stakeholders in upholding planetary protection concerns for human missions, has yielded a draft requirement set that might feed future iteration among space system designers, exploration scientists, and the mission operations community. Combining the information collected with a proposed forward path will hopefully yield a mutually agreeable set of timely, verifiable, and practical requirements for human space exploration that will uphold international commitment to planetary protection.
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in so-called cyber-physical systems. When such systems are safety- or mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
Carré, Clément; Mas, André; Krouk, Gabriel
2017-01-01
Inferring transcriptional gene regulatory networks from transcriptomic datasets is a key challenge of systems biology, with potential impacts ranging from medicine to agronomy. Several techniques are presently used to experimentally assay transcription factor-to-target relationships, providing important information about the connections of real gene regulatory networks. These techniques include classical ChIP-seq, yeast one-hybrid, or, more recently, DAP-seq or target technologies. They are usually used to validate algorithm predictions. Here, we developed a reverse engineering approach based on mathematical and computer simulation to evaluate the impact that this prior knowledge of gene regulatory networks may have on training machine learning algorithms. First, we developed a gene regulatory network-simulating engine called FRANK (Fast Randomizing Algorithm for Network Knowledge) that is able to simulate large gene regulatory networks (containing 10^4 genes) with characteristics of gene regulatory networks observed in vivo. FRANK also generates stable or oscillatory gene expression directly produced by the simulated gene regulatory networks. The development of FRANK leads to important general conclusions concerning the design of large and stable gene regulatory networks harboring scale-free properties (built ex nihilo). In combination with a supervised support vector machine algorithm (which accepts prior knowledge), we (i) address biologically oriented questions concerning our capacity to accurately reconstruct gene regulatory networks, and in particular demonstrate that prior-knowledge structure is crucial for accurate learning, and (ii) draw conclusions to inform the experimental designs needed for learning to solve gene regulatory networks in the future. By demonstrating that our predictions concerning the influence of the prior-knowledge structure on support vector machine learning capacity hold true on real data (Escherichia coli K14 network reconstruction using network and transcriptomic data), we show that the formalism used to build FRANK can to some extent be a reasonable model for gene regulatory networks in real cells.
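The supervised set-up described above can be pictured with a minimal sketch, assuming a hypothetical expression matrix and a handful of prior-knowledge edges (this is not FRANK or the authors' pipeline): known regulator-target pairs supply positive labels, sampled non-edges supply negatives, and a support vector machine scores unseen candidate regulations.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical expression matrix: rows = genes, columns = conditions.
n_genes, n_conditions = 50, 30
expr = rng.normal(size=(n_genes, n_conditions))

def pair_features(tf, target):
    """Feature vector for a candidate (TF, target) edge: the two expression
    profiles concatenated with their Pearson correlation."""
    corr = np.corrcoef(expr[tf], expr[target])[0, 1]
    return np.concatenate([expr[tf], expr[target], [corr]])

# Prior knowledge (e.g., from ChIP-seq or yeast one-hybrid assays): known
# edges are positives; sampled non-edges stand in for negatives.
known_edges = [(0, 5), (0, 7), (1, 9), (2, 11), (3, 20)]
non_edges = [(4, 6), (5, 8), (6, 10), (7, 21), (8, 30)]

X = np.array([pair_features(a, b) for a, b in known_edges + non_edges])
y = np.array([1] * len(known_edges) + [0] * len(non_edges))

clf = SVC(kernel="rbf", probability=True).fit(X, y)

# Score an unseen candidate regulation.
candidate = pair_features(0, 13)
print(clf.predict_proba([candidate])[0, 1])
```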
Activities of the Center for Space Construction
NASA Technical Reports Server (NTRS)
1993-01-01
The Center for Space Construction (CSC) at the University of Colorado at Boulder is one of eight University Space Engineering Research Centers established by NASA in 1988. The mission of the center is to conduct research into space technology and to directly contribute to space engineering education. The center reports to the Department of Aerospace Engineering Sciences and resides in the College of Engineering and Applied Science. The college has a long and successful track record of cultivating multi-disciplinary research and education programs. The Center for Space Construction is prominent evidence of this record. At the inception of CSC, the center was primarily founded on the need for research on in-space construction of large space systems like space stations and interplanetary space vehicles. The scope of CSC's research has now evolved to include the design and construction of all spacecraft, large and small. Within this broadened scope, our research projects seek to impact the underlying technological basis for such spacecraft as remote sensing satellites, communication satellites, and other special purpose spacecraft, as well as the technological basis for large space platforms. The center's research focuses on three areas: spacecraft structures, spacecraft operations and control, and regolith and surface systems. In the area of spacecraft structures, our current emphasis is on concepts and modeling of deployable structures, analysis of inflatable structures, structural damage detection algorithms, and composite materials for lightweight structures. In the area of spacecraft operations and control, we are continuing our previous efforts in process control of in-orbit structural assembly. In addition, we have begun two new efforts: a formal approach to spacecraft flight software systems design, and adaptive attitude control systems. In the area of regolith and surface systems, we are continuing the work of characterizing the physical properties of lunar regolith, and we are at work on a project on path planning for planetary surface rovers.
Proportional Reasoning and the Visually Impaired
ERIC Educational Resources Information Center
Hilton, Geoff; Hilton, Annette; Dole, Shelley L.; Goos, Merrilyn; O'Brien, Mia
2012-01-01
Proportional reasoning is an important aspect of formal thinking that is acquired during the developmental years that approximate the middle years of schooling. Students who fail to acquire sound proportional reasoning often experience difficulties in subjects that require quantitative thinking, such as science, technology, engineering, and…
Black Boxes in Analytical Chemistry: University Students' Misconceptions of Instrumental Analysis
ERIC Educational Resources Information Center
Carbo, Antonio Domenech; Adelantado, Jose Vicente Gimeno; Reig, Francisco Bosch
2010-01-01
Misconceptions of chemistry and chemical engineering university students concerning instrumental analysis have been established from coordinated tests, tutorial interviews and laboratory lessons. Misconceptions can be divided into: (1) formal, involving specific concepts and formulations within the general frame of chemistry; (2)…
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering (MBSSE) process derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim to develop methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
A knowledge based software engineering environment testbed
NASA Technical Reports Server (NTRS)
Gill, C.; Reedy, A.; Baker, L.
1985-01-01
The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil
Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.
NASA Technical Reports Server (NTRS)
Kershaw, John
1990-01-01
The VIPER project has so far produced a formal specification of a 32 bit RISC microprocessor, an implementation of that chip in radiation-hard SOS technology, a partial proof of correctness of the implementation which is still being extended, and a large body of supporting software. The time has now come to consider what has been achieved and what directions should be pursued in the future. The most obvious lesson from the VIPER project was the time and effort needed to use formal methods properly. Most of the problems arose in the interfaces between different formalisms, e.g., between the (informal) English description and the HOL spec, between the block-level spec in HOL and the equivalent in ELLA needed by the low-level CAD tools. These interfaces need to be made rigorous or (better) eliminated. VIPER 1A (the latest chip) is designed to operate in pairs, to give protection against breakdowns in service as well as design faults. We have come to regard redundancy and formal design methods as complementary, the one to guard against normal component failures and the other to provide insurance against the risk of the common-cause failures which bedevil reliability predictions. Any future VIPER chips will certainly need improved performance to keep up with increasingly demanding applications. We have a prototype design (not yet specified formally) which includes 32 and 64 bit multiply, instruction pre-fetch, more efficient interface timing, and a new instruction to allow a quick response to peripheral requests. Work is under way to specify this device in MIRANDA, and then to refine the spec into a block-level design by top-down transformations. When the refinement is complete, a relatively simple proof checker should be able to demonstrate its correctness. This paper is presented in viewgraph form.
Verifying Hybrid Systems Modeled as Timed Automata: A Case Study
1997-03-01
Introduction Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools
Design of high reliability organizations in health care.
Carroll, J S; Rudolph, J W
2006-12-01
To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.
NASA Langley Research and Technology-Transfer Program in Formal Methods
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.
1995-01-01
This paper presents an overview of the NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life-critical systems, and to orchestrate the transfer of this technology to U.S. industry through the use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.
NASA Astrophysics Data System (ADS)
Fiorini, Rodolfo A.; Dacquino, Gianfranco
2005-03-01
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application are presented here. Progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in technological areas such as advanced biomedical engineering, biometrics, intelligent computing, target recognition, content image retrieval, and data mining. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features for reliable automated object learning and discrimination can deeply benefit from GEOGINE's progressive automated model generation computational kernel performance. Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, and 3) Arbitrary Model Precision for robust object description and identification.
Ceccarelli, Michele; Cerulo, Luigi; Santone, Antonella
2014-10-01
Reverse engineering of gene regulatory relationships from genomics data is a crucial task for dissecting the complex regulatory mechanisms occurring in a cell. From a computational point of view the reconstruction of gene regulatory networks is an underdetermined problem, as the number of possible solutions is typically large in contrast to the number of available independent data points. Many possible solutions can fit the available data, explaining the data equally well, but only one of them can be the biologically true solution. Several strategies have been proposed in the literature to reduce the search space and/or extend the amount of independent information. In this paper we propose a novel algorithm based on formal methods, the mathematically rigorous techniques widely adopted in engineering to specify and verify complex software and hardware systems. Starting with a formal specification of gene regulatory hypotheses, we are able to mathematically prove whether a time course experiment belongs or not to the formal specification, determining in fact whether a gene regulation exists or not. The method is able to detect both direction and sign (inhibition/activation) of regulations, whereas most literature methods are limited to undirected and/or unsigned relationships. We empirically evaluated the approach on experimental and synthetic datasets in terms of precision and recall. In most cases we observed high levels of accuracy, outperforming the current state of the art, although the computational cost increases exponentially with the size of the network. We made the tool implementing the algorithm available at the following URL: http://www.bioinformatics.unisannio.it. Copyright © 2014 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
Artificial General Intelligence: Concept, State of the Art, and Future Prospects
NASA Astrophysics Data System (ADS)
Goertzel, Ben
2014-12-01
In recent years a broad community of researchers has emerged, focusing on the original ambitious goals of the AI field: the creation and study of software or hardware systems with general intelligence comparable to, and ultimately perhaps greater than, that of human beings. This paper surveys this diverse community and its progress. Approaches to defining the concept of Artificial General Intelligence (AGI) are reviewed, including mathematical formalisms, engineering, and biology-inspired perspectives. The spectrum of designs for AGI systems includes systems with symbolic, emergentist, hybrid and universalist characteristics. Metrics for general intelligence are evaluated, with the conclusion that, although metrics for assessing the achievement of human-level AGI may be relatively straightforward (e.g. the Turing Test, or a robot that can graduate from elementary school or university), metrics for assessing partial progress remain more controversial and problematic.
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
NASA Technical Reports Server (NTRS)
Jellicorse, John J.; Rahman, Shamin A.
2016-01-01
NASA is currently developing the next-generation crewed spacecraft and launch vehicle for exploration beyond Earth orbit, including returning to the Moon and making the transit to Mars. Managing the design integration of major hardware elements of a space transportation system is critical for overcoming both the technical and programmatic challenges in taking a complex system from concept to space operations. An established method of accomplishing this is formal interface management. In this paper we set forth an argument that the interface management process implemented by NASA between the Orion Multi-Purpose Crew Vehicle (MPCV) and the Space Launch System (SLS) achieves the Level 3 tier of the EIA 731.1 System Engineering Capability Model (SECM) for Generic Practices. We describe the relevant NASA systems and associated organizations, and define the EIA SECM Level 3 Generic Practices. We then provide evidence for our compliance with those practices. This evidence includes discussions of: the NASA Systems Engineering (SE) interface management standard process and best practices; the tailoring of that process for implementation on the Orion-to-SLS interface; changes made over time to improve the tailored process; and the opportunities to take the resulting lessons learned and propose improvements to our institutional processes and best practices. We compare this evidence against the practices to form the rationale for the declared SECM maturity level.
Alves, Rui; Vilaprinyo, Ester; Hernádez-Bermejo, Benito; Sorribas, Albert
2008-01-01
There is a renewed interest in obtaining a systemic understanding of metabolism, gene expression and signal transduction processes, driven by the recent research focus on Systems Biology. From a biotechnological point of view, such a systemic understanding of how a biological system is designed to work can facilitate the rational manipulation of specific pathways in different cell types to achieve specific goals. Due to the intrinsic complexity of biological systems, mathematical models are a central tool for understanding and predicting the integrative behavior of those systems. Particularly, models are essential for a rational development of biotechnological applications and in understanding a system's design from an evolutionary point of view. Mathematical models can be obtained using many different strategies. In each case, their utility will depend upon the properties of the mathematical representation and on the possibility of obtaining meaningful parameters from available data. In practice, there are several issues at stake when one has to decide which mathematical model is more appropriate for the study of a given problem. First, one needs a model that can represent the aspects of the system one wishes to study. Second, one must choose a mathematical representation that allows an accurate analysis of the system with respect to different aspects of interest (for example, robustness of the system, dynamical behavior, optimization of the system with respect to some production goal, parameter value determination, etc.). Third, before choosing between alternative and equally appropriate mathematical representations for the system, one should compare representations with respect to ease of automation for model set-up, simulation, and analysis of results. Fourth, one should also consider how to facilitate model transference and re-usability by other researchers and for distinct purposes. Finally, one factor that is important for all four aspects is the regularity in the mathematical structure of the equations, because it facilitates computational manipulation. This regularity is a mark of kinetic representations based on approximation theory. The use of approximation theory to derive mathematical representations with regular structure for modeling purposes has a long tradition in science. In most applied fields, such as engineering and physics, those approximations are often required to obtain practical solutions to complex problems. In this paper we review some of the more popular mathematical representations that have been derived using approximation theory and are used for modeling in molecular systems biology. We will focus on formalisms that are theoretically supported by the Taylor Theorem. These include the Power-law formalism, the recently proposed (log)linear and Lin-log formalisms, as well as some closely related alternatives. We will analyze the similarities and differences between these formalisms, discuss the advantages and limitations of each representation, and provide a tentative "road map" for their potential utilization for different problems.
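For reference, the two best-known representations this review covers take the following canonical shapes; these are the standard textbook forms of the power-law (S-system) and lin-log formalisms, not equations quoted from the paper.

```latex
% Power-law (S-system) form: each pool X_i has one aggregate production and
% one aggregate consumption term, both power laws obtained from a Taylor
% expansion in logarithmic space.
\frac{dX_i}{dt} \;=\; \alpha_i \prod_{j=1}^{n} X_j^{\,g_{ij}}
                 \;-\; \beta_i \prod_{j=1}^{n} X_j^{\,h_{ij}}

% Lin-log form: rates are linear in the logarithms of normalized
% concentrations, with elasticities \varepsilon_{ij} as coefficients.
\frac{v_i}{v_i^{0}} \;=\; 1 \;+\; \sum_{j=1}^{n}
      \varepsilon_{ij}\,\ln\!\left(\frac{X_j}{X_j^{0}}\right)
```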
A Second-Year Undergraduate Course in Applied Differential Equations.
ERIC Educational Resources Information Center
Fahidy, Thomas Z.
1991-01-01
Presents the framework for a chemical engineering course using ordinary differential equations to solve problems with the underlying strategy of concisely discussing the theory behind each solution technique without extensions to formal proofs. Includes typical class illustrations, student responses to this strategy, and reaction of the…
Bridging Formal and Informal Learning Environments
ERIC Educational Resources Information Center
Barker, Bradley S.; Larson, Kim; Krehbiel, Michelle
2014-01-01
Out-of-school time programs that provide science, technology, engineering, and mathematics (STEM) educational content are promising approaches to develop skills and abilities in students. These programs may potentially inspire students with engaging hands-on, minds-on activities that encourages their natural curiosity around STEM content areas.…
Informal Science: Family Education, Experiences, and Initial Interest in Science
ERIC Educational Resources Information Center
Dabney, Katherine P.; Tai, Robert H.; Scott, Michael R.
2016-01-01
Recent research and public policy have indicated the need for increasing the physical science workforce through development of interest and engagement with informal and formal science, technology, engineering, and mathematics experiences. This study examines the association of family education and physical scientists' informal experiences in…
ERIC Educational Resources Information Center
Kahler, Jim; Valentine, Nancy
2011-01-01
America has a gap when it comes to youth pursuing science and technology careers. In an effort to improve the knowledge and application of science, technology, engineering, and math (STEM), after-school programs can work in conjunction with formal in-school curriculum to improve science education. One organization that actively addresses this…
ERIC Educational Resources Information Center
Cannon, Kama
2018-01-01
Although formal papers are typical, sometimes posters or other visual presentations are more useful tools for sharing visual-spatial information. By incorporating creativity and technology into the study of geographical science, STEM (the study of Science, Technology Engineering, and Mathematics) is changed to STEAM (the A stands for ART)! The…
RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks
2016-10-09
Robotic tasks are becoming increasingly complex, and with them the robotic systems themselves. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept
CADDIS is an online application that helps scientists and engineers in the Regions, States, and Tribes find, access, organize, use, and share information to conduct causal evaluations in aquatic systems. It is based on the USEPA stressor identification process, a formal method fo...
NASA's Space Launch System Transitions From Design To Production
NASA Technical Reports Server (NTRS)
Askins, Bruce R.; Robinson, Kimberly F.
2016-01-01
NASA's Space Launch System (SLS) successfully completed its Critical Design Review (CDR) in 2015, a major milestone on the journey to an unprecedented era of exploration for humanity. CDR formally marked the program's transition from design to production phase just four years after the program's inception and was the first such milestone for a human launch vehicle in 40 years. While challenges typical of a complex development program lie ahead, CDR evaluators concluded that the design is technically and programmatically sound and ready to press forward to Design Certification Review (DCR) and readiness for launch of Exploration Mission 1 (EM-1) in the 2018 timeframe. SLS is prudently based on existing propulsion systems, infrastructure and knowledge with a clear, evolutionary path as required by mission needs. In its initial configuration, designated Block 1, SLS will lift a minimum of 70 metric tons (t) (154,324 pounds) of payload to low Earth orbit (LEO). It will evolve to a 130 t (286,601 pound) payload capacity by upgrading its engines, boosters, and upper stage, dramatically increasing the mass and volume of human and robotic exploration while decreasing mission risk, increasing safety, and simplifying ground and mission operations. CDR was the central programmatic accomplishment among many technical accomplishments that will be described in this paper. The government/industry SLS team successfully test-fired a flight-like five-segment solid rocket motor and conducted seven hotfire development tests of the RS-25 core stage engine. The majority of the major test article and flight barrels, rings, and domes for the core stage liquid oxygen, liquid hydrogen, engine section, intertank, and forward skirt were manufactured at NASA's Michoud Assembly Facility in New Orleans, Louisiana. Renovations to the B-2 test stand for stage green run testing were completed at NASA's Stennis Space Center (SSC), near Bay St. Louis, Mississippi. Core stage test stands are reaching completion at NASA's Marshall Space Flight Center in Huntsville, Alabama. The modified Pegasus barge for core stage transportation from manufacturing to testing and launch sites was delivered to SSC. The Interim Cryogenic Propulsion Stage test article was also completed. This paper will discuss these and other technical and programmatic successes and challenges over the past year and provide a preview of work ahead before the first flight of this new capability.
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
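The flavor of such an algorithmic translation can be sketched with a toy mapping (an illustration only, not the paper's BPMN-to-Event-B rules): each task in a BPMN-like sequence flow becomes an Event-B-style event whose guard requires its predecessors to have completed, with completion modelled as a set variable. The task names and the emitted syntax are simplified assumptions.

```python
def bpmn_to_eventb(tasks, flows):
    """Emit Event-B-style event stubs from a BPMN-like sequence flow.

    tasks: list of task names; flows: list of (source, target) arcs.
    Each task becomes an event guarded on the completion of its
    predecessors, with completion tracked in a set variable 'done'.
    """
    preds = {t: [s for s, u in flows if u == t] for t in tasks}
    events = []
    for t in tasks:
        guard = " & ".join(f"{p} : done" for p in preds[t]) or "TRUE = TRUE"
        events.append(
            f"EVENT {t}\n"
            f"  WHERE grd1: {t} /: done\n"
            f"        grd2: {guard}\n"
            f"  THEN  act1: done := done \\/ {{{t}}}\n"
            f"END"
        )
    return "\n\n".join(events)

print(bpmn_to_eventb(
    tasks=["ReceiveOrder", "CheckStock", "ShipOrder"],
    flows=[("ReceiveOrder", "CheckStock"), ("CheckStock", "ShipOrder")],
))
```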
Formal methods and digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
The VATES-Diamond as a Verifier's Best Friend
NASA Astrophysics Data System (ADS)
Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz
Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.
Formal Methods for Automated Diagnosis of Autosub 6000
NASA Technical Reports Server (NTRS)
Ernits, Juhan; Dearden, Richard; Pebody, Miles
2009-01-01
This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
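A minimal sketch of the scenario-to-model idea (not the authors' method, which starts from restricted natural language): treat each scenario as a sequence of events and merge the sequences into a prefix-tree automaton that downstream tools can analyze or use for code generation. The scenario content below is invented.

```python
def scenarios_to_automaton(scenarios):
    """Merge event-sequence scenarios into a prefix-tree automaton.

    Returns (transitions, accepting), where transitions maps
    (state, event) -> state and accepting holds scenario end states.
    """
    transitions, accepting = {}, set()
    next_state = 1                      # state 0 is the initial state
    for scenario in scenarios:
        state = 0
        for event in scenario:
            key = (state, event)
            if key not in transitions:
                transitions[key] = next_state
                next_state += 1
            state = transitions[key]
        accepting.add(state)
    return transitions, accepting

# Hypothetical sensor-network scenarios expressed as event sequences.
scenarios = [
    ["power_on", "self_test_ok", "begin_sampling"],
    ["power_on", "self_test_fail", "enter_safe_mode"],
]
transitions, accepting = scenarios_to_automaton(scenarios)
for (state, event), target in sorted(transitions.items()):
    print(f"s{state} --{event}--> s{target}")
```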
Modeling and Verification of Dependable Electronic Power System Architecture
NASA Astrophysics Data System (ADS)
Yuan, Ling; Fan, Ping; Zhang, Xiao-fang
The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems to generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of the electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such systems more complicated. We propose a dependable electronic power system architecture, which can provide a generic framework to guide the development of electronic power systems and ease development complexity. In order to provide common idioms and patterns to the system designers, we formally model the electronic power system architecture by using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the system architecture by using the PVS theorem prover, which can guarantee that the system architecture satisfies high reliability requirements.
A rigorous approach to self-checking programming
NASA Technical Reports Server (NTRS)
Hua, Kien A.; Abraham, Jacob A.
1986-01-01
Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
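As a small illustration of the self-checking style (not the SPDL notation itself), the routine below validates its own result with executable assertions; a fault that corrupts the computation trips the postcondition check instead of silently propagating bad data.

```python
from collections import Counter

def self_checking_sort(xs):
    """Insertion sort whose result is validated by executable assertions.

    The assertions play the role of concurrent error detection: if a fault
    corrupts the computation, the postcondition check raises an error
    instead of returning bad data.
    """
    result = list(xs)
    for i in range(1, len(result)):           # plain insertion sort
        key, j = result[i], i - 1
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key

    # Assertion 1: the output is ordered.
    assert all(result[k] <= result[k + 1] for k in range(len(result) - 1)), \
        "self-check failed: output not ordered"
    # Assertion 2: the output is a permutation of the input.
    assert Counter(result) == Counter(xs), \
        "self-check failed: output not a permutation of the input"
    return result

print(self_checking_sort([3, 1, 2, 2, 0]))
```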
NASA Technical Reports Server (NTRS)
Ciardo, Gianfranco
2004-01-01
The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce aviation accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data; thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems. Attempts to verify RSM with NuSMV and SPIN have failed due to excessive memory consumption.
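The modeling strategy, discretizing the monitored zone into cells and restricting attention to two aircraft, can be illustrated with a toy explicit-state safety check; the grid, the moves, and the "two aircraft in one cell" hazard below are invented stand-ins, not the RSM model, and the search is plain breadth-first reachability rather than the symbolic algorithms in SMART.

```python
from collections import deque
from itertools import product

GRID = 4  # 4x4 discretized zone (toy stand-in for the RSM runway grid)

def moves(cell):
    """Possible next cells for one aircraft: stay put or move to a neighbour."""
    x, y = cell
    steps = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
    return [(x + dx, y + dy) for dx, dy in steps
            if 0 <= x + dx < GRID and 0 <= y + dy < GRID]

def violates_safety(state):
    """Hazard: both aircraft occupy the same cell."""
    a, b = state
    return a == b

def check_safety(initial):
    """Breadth-first exploration of the joint two-aircraft state space."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        a, b = frontier.popleft()
        for na, nb in product(moves(a), moves(b)):
            nxt = (na, nb)
            if violates_safety(nxt):
                return False, nxt          # counterexample state found
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

print(check_safety(((0, 0), (3, 3))))      # expected: hazard is reachable
```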
NASA Astrophysics Data System (ADS)
Chan, Christine S.; Ostertag, Michael H.; Akyürek, Alper Sinan; Šimunić Rosing, Tajana
2017-05-01
The Internet of Things envisions a web-connected infrastructure of billions of sensors and actuation devices. However, the current state-of-the-art presents another reality: monolithic end-to-end applications tightly coupled to a limited set of sensors and actuators. Growing such applications with new devices or behaviors, or extending the existing infrastructure with new applications, involves redesign and redeployment. We instead propose a modular approach to these applications, breaking them into an equivalent set of functional units (context engines) whose input/output transformations are driven by general-purpose machine learning, demonstrating an improvement in compute redundancy and computational complexity with minimal impact on accuracy. In conjunction with formal data specifications, or ontologies, we can replace application-specific implementations with a composition of context engines that use common statistical learning to generate output, thus improving context reuse. We implement interconnected context-aware applications using our approach, extracting user context from sensors in both healthcare and grid applications. We compare our infrastructure to single-stage monolithic implementations with single-point communications between sensor nodes and the cloud servers, demonstrating a reduction in combined system energy by 22-45%, and multiplying the battery lifetime of power-constrained devices by at least 22x, with easy deployment across different architectures and devices.
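A compact sketch of the modular idea (an invented example, not the authors' implementation): each context engine is a small functional unit with a uniform interface whose transformation could be supplied by a trained statistical model, and applications are assembled by composing engines rather than written as monolithic pipelines.

```python
from statistics import mean

class ContextEngine:
    """A functional unit: named inputs in, one derived context value out.

    The transformation is supplied as a callable (in practice a trained
    statistical model); the engine itself only handles the wiring.
    """
    def __init__(self, name, inputs, transform):
        self.name, self.inputs, self.transform = name, inputs, transform

    def run(self, context):
        values = [context[k] for k in self.inputs]
        context[self.name] = self.transform(values)
        return context

# Hypothetical health-monitoring composition: raw sensors -> activity level
# -> alert decision.  Swapping or adding an engine does not touch the others.
engines = [
    ContextEngine("activity", ["accel_x", "accel_y", "accel_z"],
                  lambda v: mean(abs(x) for x in v)),
    ContextEngine("alert", ["activity", "heart_rate"],
                  lambda v: v[0] > 0.8 and v[1] > 120),
]

context = {"accel_x": 0.9, "accel_y": 1.1, "accel_z": 0.7, "heart_rate": 130}
for engine in engines:
    context = engine.run(context)
print(context["alert"])
```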
The founding of ISOTT: the Shamattawa of engineering science and medical science.
Bruley, Duane F
2014-01-01
The founding of ISOTT was based upon the blending of the Medical and Engineering sciences. This occurrence is portrayed by the Shamattawa, the joining of the Chippewa and Flambeau rivers. Beginning with Carl Scheele's discovery of oxygen, the medical sciences advanced the knowledge of its importance to physiological phenomena. Meanwhile, engineering science was evolving as a mathematical discipline used to define systems quantitatively from basic principles. In particular, Adolf Fick's employment of a gradient led to the formalization of transport phenomena. These two rivers of knowledge were blended to found ISOTT at Clemson/Charleston, South Carolina, USA, in 1973. The establishment of our society with a mission to support the collaborative work of medical scientists, clinicians and all disciplines of engineering was a supporting step in the evolution of bioengineering. Traditional engineers typically worked in areas not requiring knowledge of biology or the life sciences. By encouraging collaboration between medical science and traditional engineering, our society became one of the forerunners in establishing bioengineering as the fifth traditional discipline of engineering.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
AMPHION: Specification-based programming for scientific subroutine libraries
NASA Technical Reports Server (NTRS)
Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Waldinger, Richard; Stickel, Mark
1994-01-01
AMPHION is a knowledge-based software engineering (KBSE) system that guides a user in developing a diagram representing a formal problem specification. It then automatically implements a solution to this specification as a program consisting of calls to subroutines from a library. The diagram provides an intuitive, domain-oriented notation for creating a specification that also facilitates reuse and modification. AMPHION's architecture is domain-independent. AMPHION is specialized to an application domain by developing a declarative domain theory. Creating a domain theory is an iterative process that currently requires the joint expertise of domain experts and experts in automated formal methods for software development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2005-12-27
Graph theory is a branch of discrete combinatorial mathematics that studies the properties of graphs. The theory was pioneered by the Swiss mathematician Leonhard Euler in the 18th century, commenced its formal development during the second half of the 19th century, and has witnessed substantial growth during the last seventy years, with applications in areas as diverse as engineering, computer science, physics, sociology, chemistry and biology. Graph theory has also had a strong impact in computational linguistics by providing the foundations for the theory of feature structures, which has emerged as one of the most widely used frameworks for the representation of grammar formalisms.
Program Developments: Formal Explanations of Implementations.
1982-08-01
January 1982. [Cheatham 79] Cheatham, T. E., G. H. Holloway, and J. A. Townley, "Symbolic evaluation and the analysis of programs," IEEE Transactions on Software Engineering 5 (4), July 1979, 402-417. [Cheatham 81] Cheatham, T. E., G. H. Holloway, and J. A. Townley, "Program refinement by
A Semi-Automatic Approach to Construct Vietnamese Ontology from Online Text
ERIC Educational Resources Information Center
Nguyen, Bao-An; Yang, Don-Lin
2012-01-01
An ontology is an effective formal representation of knowledge used commonly in artificial intelligence, semantic web, software engineering, and information retrieval. In open and distance learning, ontologies are used as knowledge bases for e-learning supplements, educational recommenders, and question answering systems that support students with…
Appreciating Formal Similarities in the Kinetics of Homogeneous, Heterogeneous, and Enzyme Catalysis
ERIC Educational Resources Information Center
Ashby, Michael T.
2007-01-01
Because interest in catalysts is widespread, the kinetics of catalytic reactions have been investigated by widely diverse groups of individuals, including chemists, engineers, and biologists. This has led to redundancy in theories, particularly with regard to the topics of homogeneous, heterogeneous, and enzyme catalysis. From a pedagogical…
CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 1, Jan/Feb 2012
2012-01-01
Considerations in Airborne Systems and Equipment Certification – RTCA/DO-178B," Washington, D.C., 1992. 5. Ishikawa, Kaoru (Translator: J. H...significant, repeated issue, a formal root cause analysis process is performed. This method uses fishbone or Ishikawa diagrams [5], where possible
Abstract Numeric Relations and the Visual Structure of Algebra
ERIC Educational Resources Information Center
Landy, David; Brookes, David; Smout, Ryan
2014-01-01
Formal algebras are among the most powerful and general mechanisms for expressing quantitative relational statements; yet, even university engineering students, who are relatively proficient with algebraic manipulation, struggle with and often fail to correctly deploy basic aspects of algebraic notation (Clement, 1982). In the cognitive tradition,…
Community Partnerships for Fostering Student Interest and Engagement in STEM
ERIC Educational Resources Information Center
Watters, James J.; Diezmann, Carmel M.
2013-01-01
The foundations of Science, Technology, Engineering and Mathematics (STEM) education begins in the early years of schooling when students encounter formal learning experiences primarily in mathematics and science. Politicians, economists and industrialists recognise the importance of STEM in society, and therefore a number of strategies have been…
FY 1997 Scientific and Technical Reports, Articles, Papers, and Presentations
NASA Technical Reports Server (NTRS)
Waits, J. E. Turner (Compiler)
1998-01-01
This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY97. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.
The Atlanta University Center: A Consortium-Based Dual Degree Engineering Program
ERIC Educational Resources Information Center
Jackson, Marilyn T.
2007-01-01
The Atlanta University Center (AUC) comprises five historically black colleges and a centralized library. All are separate institutions, each having its own board of directors, president, infrastructure, students, faculty, staff, and traditions. To encourage coordination of effort and resources, the AUC was formed and the first formal cooperative…
Balancing Stakeholders' Interests in Evolving Teacher Education Accreditation Contexts
ERIC Educational Resources Information Center
Elliott, Alison
2008-01-01
While Australian teacher education programs have long had rigorous accreditation pathways at the University level they have not been subject to the same formal public or professional scrutiny typical of professions such as medicine, nursing or engineering. Professional accreditation for teacher preparation programs is relatively new and is linked…
ERIC Educational Resources Information Center
Traube, Dorian E.; Begun, Stephanie; Petering, Robin; Flynn, Marilyn L.
2017-01-01
The field of social work does not currently have a widely adopted method for expediting innovations into micro- or macropractice. Although it is common in fields such as engineering and business to have formal processes for accelerating scientific advances into consumer markets, few comparable mechanisms exist in the social sciences or social…
IB Offering Certificate for Careers
ERIC Educational Resources Information Center
Robelen, Erik W.
2012-01-01
The International Baccalaureate (IB) organization, best known in the United States for its prestigious two-year diploma program for juniors and seniors, will enter new terrain this fall as it formally rolls out an initiative centered on a variety of career pathways that includes engineering, culinary arts, and automotive technology. The move comes…
The ADVANCE project : formal evaluation of the targeted deployment. Volume 2
DOT National Transportation Integrated Search
1997-01-01
This document reports on the formal evaluation of the targeted (limited but highly focused) deployment of the Advanced Driver and Vehicle Advisory Navigation ConcEpt (ADVANCE), an in-vehicle advanced traveler information system designed to provide sh...
NASA software specification and evaluation system design, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The research to develop methods for reducing the effort expended in software and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.
32 CFR 644.377 - Formal revocation of public land withdrawals and reservations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... domain, the BLM Land Office will transmit to the DE a draft of public land order (PLO) designed to formally revoke the order or reservation which withdrew or reserved the land. The DE will review the draft...
32 CFR 644.377 - Formal revocation of public land withdrawals and reservations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... domain, the BLM Land Office will transmit to the DE a draft of public land order (PLO) designed to formally revoke the order or reservation which withdrew or reserved the land. The DE will review the draft...
On Abstractions and Simplifications in the Design of Human-Automation Interfaces
NASA Technical Reports Server (NTRS)
Heymann, Michael; Degani, Asaf; Shafto, Michael; Meyer, George; Clancy, Daniel (Technical Monitor)
2001-01-01
This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, i.e., that with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be as succinct as possible. The report discusses the underlying concepts and the formal methods for this approach. Several examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.
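The correctness criterion can be pictured with a small check: given the machine's full behavior model and a proposed interface that lumps machine states into displayed modes, the interface is inadequate if the same displayed mode and user action can lead to different displayed modes. The toy machine below is invented, and the sketch illustrates the criterion rather than the report's interface-generation procedure.

```python
def interface_is_adequate(transitions, display):
    """Check a proposed interface abstraction against a machine model.

    transitions: dict (machine_state, action) -> machine_state
    display:     dict machine_state -> mode shown to the user
    The interface is inadequate if one (displayed mode, action) pair can
    produce two different displayed outcomes: the user then cannot predict
    the machine's next mode from what the interface shows.
    """
    observed = {}
    for (state, action), nxt in transitions.items():
        key = (display[state], action)
        outcome = display[nxt]
        if observed.setdefault(key, outcome) != outcome:
            return False, key            # ambiguous from the user's viewpoint
    return True, None

# Toy autopilot: two internal capture sub-modes are displayed as one mode.
transitions = {
    ("capture_armed", "dial"): "hold_altitude",
    ("capture_active", "dial"): "climb",
}
display = {"capture_armed": "CAPTURE", "capture_active": "CAPTURE",
           "hold_altitude": "HOLD", "climb": "CLIMB"}
print(interface_is_adequate(transitions, display))   # (False, ('CAPTURE', 'dial'))
```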
On Abstractions and Simplifications in the Design of Human-Automation Interfaces
NASA Technical Reports Server (NTRS)
Heymann, Michael; Degani, Asaf; Clancy, Daniel (Technical Monitor)
2002-01-01
This report addresses the design of human-automation interaction from a formal perspective that focuses on the information content of the interface, rather than the design of the graphical user interface. It also addresses the issue of the information provided to the user (e.g., user-manuals, training material, and all other resources). In this report, we propose a formal procedure for generating interfaces and user-manuals. The procedure is guided by two criteria: First, the interface must be correct, that is, with the given interface the user will be able to perform the specified tasks correctly. Second, the interface should be succinct. The report discusses the underlying concepts and the formal methods for this approach. Two examples are used to illustrate the procedure. The algorithm for constructing interfaces can be automated, and a preliminary software system for its implementation has been developed.
Design of high reliability organizations in health care
Carroll, J S; Rudolph, J W
2006-01-01
To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607
Machine Learning-based Intelligent Formal Reasoning and Proving System
NASA Astrophysics Data System (ADS)
Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia
2018-03-01
Reasoning systems can be used in many fields, and improving reasoning efficiency is at the core of their design. By combining a formal description of proofs with a rule-matching algorithm and introducing a machine learning algorithm, the intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, thereby obtaining implicit knowledge from the knowledge base and providing a basic reasoning model for the construction of intelligent systems.
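A minimal sketch of the rule-matching core such a system needs (illustrative only; the machine learning component that prioritizes which rules to try is not reproduced here): repeatedly match Horn-style implication rules against the set of proved facts until the goal is derived or nothing new can be added, reusing derived facts along the way.

```python
def forward_chain(facts, rules, goal):
    """Prove `goal` from `facts` using Horn rules (premises -> conclusion).

    rules: list of (frozenset_of_premises, conclusion).  Returns True if the
    goal becomes derivable; derived facts are kept and reused, mirroring the
    abstract's point about reusing earlier reasoning results.
    """
    proved = set(facts)
    changed = True
    while changed and goal not in proved:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in proved and premises <= proved:
                proved.add(conclusion)      # rule matched: record the new fact
                changed = True
    return goal in proved

rules = [
    (frozenset({"p"}), "q"),
    (frozenset({"q", "r"}), "s"),
]
print(forward_chain(facts={"p", "r"}, rules=rules, goal="s"))   # True
```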