Sample records for formal systems engineering

  1. IDEF3 Formalization Report

    DTIC Science & Technology

    1991-10-01

    SUBJECT TERMS: engineering management, information systems, method formalization, information engineering, process modeling, information systems requirements definition methods, knowledge acquisition methods, systems engineering. NUMBER OF PAGES: 60. ... Management, Inc., Santa Monica, California. CORYNEN, G. C., 1975, A Mathematical Theory of Modeling and Simulation. Ph.D. Dissertation, Department ...

  2. Helping System Engineers Bridge the Peaks

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen

    2014-01-01

    In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.

  3. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  4. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  5. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  6. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  7. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  8. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  9. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the application of the process in experienced aerostructural designs.

  10. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  11. The approach to engineering tasks composition on knowledge portals

    NASA Astrophysics Data System (ADS)

    Novogrudska, Rina; Globa, Larysa; Schill, Alexsander; Romaniuk, Ryszard; Wójcik, Waldemar; Karnakova, Gaini; Kalizhanova, Aliya

    2017-08-01

    The paper presents an approach to engineering task composition on engineering knowledge portals. The specific features of engineering tasks are highlighted, and their analysis forms the basis for partial engineering task integration. A formal algebraic system for engineering task composition is proposed, allowing context-independent formal structures to be set for describing the elements of engineering tasks. A method of engineering task composition is developed that allows partial calculation tasks to be integrated into general calculation tasks on engineering portals, performed on user request. The real-world scenario «Calculation of the strength for the power components of magnetic systems» is presented, confirming the applicability and efficiency of the proposed approach.

  12. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of the art consists of collaborative multi-engineering efforts that call for an adequate formal foundation. To achieve this, we propose leveraging the traditional content of formal modeling through a co-engineering process.

  13. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems. Traditional methods and tools ...

  14. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  15. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  16. Advancing Systems Engineering Excellence: The Marshall Systems Engineering Leadership Development Program

    NASA Technical Reports Server (NTRS)

    Hall, Philip; Whitfield, Susan

    2011-01-01

    As NASA undertakes increasingly complex projects, the need for expert systems engineers and leaders in systems engineering is becoming more pronounced. As a result of this issue, the Agency has undertaken an initiative to develop more systems engineering leaders through its Systems Engineering Leadership Development Program; however, the NASA Office of the Chief Engineer has also called on the field Centers to develop mechanisms to strengthen their expertise in systems engineering locally. In response to this call, Marshall Space Flight Center (MSFC) has developed a comprehensive development program for aspiring systems engineers and systems engineering leaders. This presentation will summarize the two-level program, which consists of a combination of training courses and on-the-job, developmental training assignments at the Center to help develop stronger expertise in systems engineering and technical leadership. In addition, it will focus on the success the program has had in its pilot year. The program hosted a formal kickoff event for Level I on October 13, 2009. The first class includes 42 participants from across MSFC and Michoud Assembly Facility (MAF). A formal call for Level II is forthcoming. With the new Agency focus on research and development of new technologies, having a strong pool of well-trained systems engineers is becoming increasingly more critical. Programs such as the Marshall Systems Engineering Leadership Development Program, as well as those developed at other Centers, help ensure that there is an upcoming generation of trained systems engineers and systems engineering leaders to meet future design challenges.

  17. Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)

    2004-01-01

    These proceedings contain 18 papers and 4 poster presentations, covering topics such as: multi-agent systems, agent-based control, formalism, norms, as well as physical and biological models of agent-based systems. Some applications presented in the proceedings include systems analysis, software engineering, computer networks and robot control.

  18. Probabilistic simulation of concurrent engineering of propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of engineering processes. An approach is outlined for computationally formalizing the concurrent engineering process from cradle to grave via discipline-dedicated workstations linked with a common database.

  19. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  20. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  1. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  2. Colloquium: Modeling the dynamics of multicellular systems: Application to tissue engineering

    NASA Astrophysics Data System (ADS)

    Kosztin, Ioan; Vunjak-Novakovic, Gordana; Forgacs, Gabor

    2012-10-01

    Tissue engineering is a rapidly evolving discipline that aims at building functional tissues to improve or replace damaged ones. To be successful in such an endeavor, ideally, the engineering of tissues should be based on the principles of developmental biology. Recent progress in developmental biology suggests that the formation of tissues from the composing cells is often guided by physical laws. Here a comprehensive computational-theoretical formalism is presented that is based on experimental input and incorporates biomechanical principles of developmental biology. The formalism is described and it is shown that it correctly reproduces and predicts the quantitative characteristics of the fundamental early developmental process of tissue fusion. Based on this finding, the formalism is then used toward the optimization of the fabrication of tubular multicellular constructs, such as a vascular graft, by bioprinting, a novel tissue engineering technology.

  3. Systems Engineering Leadership Development: Advancing Systems Engineering Excellence

    NASA Technical Reports Server (NTRS)

    Hall, Phil; Whitfield, Susan

    2011-01-01

    This slide presentation reviews the Systems Engineering Leadership Development Program, with particular emphasis on the work being done in the development of systems engineers at Marshall Space Flight Center. There exists a lack of individuals with systems engineering expertise, in particular those with strong leadership capabilities, to meet the needs of the Agency's exploration agenda. Therefore there is an emphasis on developing these programs to identify and train systems engineers. The presentation reviews the proposed MSFC program, which includes course work and developmental assignments. The formal developmental programs at the other centers are briefly reviewed, including the Point of Contact (POC)

  4. NASA systems engineering handbook

    NASA Astrophysics Data System (ADS)

    Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; McDuffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou

    1995-06-01

    This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive.

  5. NASA Systems Engineering Handbook

    NASA Technical Reports Server (NTRS)

    Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; Mcduffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou

    1995-01-01

    This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive. Superseded by: NASA/SP-2007-6105 Rev 1 (20080008301).

  6. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  7. Toward a mathematical formalism of performance, task difficulty, and activation

    NASA Technical Reports Server (NTRS)

    Samaras, George M.

    1988-01-01

    The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.

  8. Dynamic Gate Product and Artifact Generation from System Models

    NASA Technical Reports Server (NTRS)

    Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris

    2011-01-01

    Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" in model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "Power-Point Engineering" and production of paper-based documents or their "office-productivity" file equivalents.

  9. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA-tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections support and are in agreement with the 'total quality' approach being adopted by many NASA centers.

  10. Ontology or formal ontology

    NASA Astrophysics Data System (ADS)

    Žáček, Martin

    2017-07-01

    Ontology or formal ontology? Which term is correct? The aim of this article is to introduce the correct terms and explain their basis. An ontology describes a particular area of interest (domain) in a formal way: it defines the classes of objects that are in that area and the relationships that may exist between them. The value of an ontology lies mainly in facilitating communication between people, improving collaboration among software systems, and improving systems engineering. In all these areas, ontologies offer the possibility of a unified view, maintaining consistency and unambiguity.

  11. A systematic approach to embedded biomedical decision making.

    PubMed

    Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver

    2012-11-01

    Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems; therefore, they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well structured way. We introduce the structured design approach by discussing requirements capturing, specifications refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we established these system properties, we translated the formal model into an implementation. The resulting implementation was tested using XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
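
    The record above attributes near-linear speedup to adding SVM processes. A minimal data-parallel sketch of that idea follows, using Python with scikit-learn and multiprocessing as assumed stand-ins; the paper's actual implementation is a formally modeled system running on XMOS hardware, not this code.

      # Data-parallel SVM classification sketch (illustrative only; not the
      # paper's XMOS/CSP implementation).
      from multiprocessing import Pool

      import numpy as np
      from sklearn.svm import SVC

      def make_data(n=2000, d=8, seed=0):
          # Synthetic two-class data standing in for biomedical features.
          rng = np.random.default_rng(seed)
          X = rng.normal(size=(n, d))
          y = (X[:, 0] + X[:, 1] > 0).astype(int)
          return X, y

      # Train one SVM; classifying disjoint test chunks is embarrassingly
      # parallel, which is why adding worker processes scales almost linearly.
      X, y = make_data()
      clf = SVC(kernel="rbf").fit(X[:1000], y[:1000])

      def classify_chunk(chunk):
          return clf.predict(chunk)

      if __name__ == "__main__":
          chunks = np.array_split(X[1000:], 4)   # one chunk per worker process
          with Pool(processes=4) as pool:
              parts = pool.map(classify_chunk, chunks)
          predictions = np.concatenate(parts)
          print("accuracy:", (predictions == y[1000:]).mean())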

  12. Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL

    NASA Technical Reports Server (NTRS)

    Jenkins, J. Steven; Rouquette, Nicolas F.

    2012-01-01

    The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.

  13. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    PubMed

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification: the Linnaean taxonomy, formalized using powersets, as a benchmark for formal expressiveness. The broad conclusion of the survey is that (1) the ISE community is currently in the early stages of the process of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
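
    A toy illustration of the powerset view the survey uses as its expressiveness benchmark (my sketch, not the survey's notation): a class is a set of its instances, so a class of classes, such as a Linnaean genus viewed as a set of species, lives one powerset level higher than the species themselves.

      # Toy powerset view of classification (Linnaean example; illustrative only).
      from itertools import chain, combinations

      individuals = {"leo1", "leo2", "tigris1"}     # individual organisms

      def powerset(s):
          s = list(s)
          return [frozenset(c) for c in chain.from_iterable(
              combinations(s, r) for r in range(len(s) + 1))]

      # A species is a set of individuals: an element of P(individuals).
      panthera_leo = frozenset({"leo1", "leo2"})
      panthera_tigris = frozenset({"tigris1"})

      # A genus is a set of species: an element of P(P(individuals)).
      panthera = frozenset({panthera_leo, panthera_tigris})

      assert panthera_leo in powerset(individuals)
      assert panthera in powerset(powerset(individuals))
      print("species and genera sit at different powerset levels")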

  14. General formalism of local thermodynamics with an example: Quantum Otto engine with a spin-1/2 coupled to an arbitrary spin.

    PubMed

    Altintas, Ferdi; Müstecaplıoğlu, Özgür E

    2015-08-01

    We investigate a quantum heat engine with a working substance of two particles, one with a spin-1/2 and the other with an arbitrary spin (spin s), coupled by Heisenberg exchange interaction, and subject to an external magnetic field. The engine operates in a quantum Otto cycle. Work harvested in the cycle and its efficiency are calculated using quantum thermodynamical definitions. It is found that the engine has higher efficiencies at higher spins and can harvest work at higher exchange interaction strengths. The role of exchange coupling and spin s on the work output and the thermal efficiency is studied in detail. In addition, the engine operation is analyzed from the perspective of local work and efficiency. We develop a general formalism to explore local thermodynamics applicable to any coupled bipartite system. Our general framework allows for examination of local thermodynamics even when global parameters of the system are varied in thermodynamic cycles. The generalized definitions of local and cooperative work are introduced by using mean field Hamiltonians. The general conditions for which the global work is not equal to the sum of the local works are given in terms of the covariance of the subsystems. Our coupled spin quantum Otto engine is used as an example of the general formalism.

  15. General formalism of local thermodynamics with an example: Quantum Otto engine with a spin-1/2 coupled to an arbitrary spin

    NASA Astrophysics Data System (ADS)

    Altintas, Ferdi; Müstecaplıoğlu, Özgür E.

    2015-08-01

    We investigate a quantum heat engine with a working substance of two particles, one with a spin-1/2 and the other with an arbitrary spin (spin s), coupled by Heisenberg exchange interaction, and subject to an external magnetic field. The engine operates in a quantum Otto cycle. Work harvested in the cycle and its efficiency are calculated using quantum thermodynamical definitions. It is found that the engine has higher efficiencies at higher spins and can harvest work at higher exchange interaction strengths. The role of exchange coupling and spin s on the work output and the thermal efficiency is studied in detail. In addition, the engine operation is analyzed from the perspective of local work and efficiency. We develop a general formalism to explore local thermodynamics applicable to any coupled bipartite system. Our general framework allows for examination of local thermodynamics even when global parameters of the system are varied in thermodynamic cycles. The generalized definitions of local and cooperative work are introduced by using mean field Hamiltonians. The general conditions for which the global work is not equal to the sum of the local works are given in terms of the covariance of the subsystems. Our coupled spin quantum Otto engine is used as an example of the general formalism.
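
    For reference, the standard quantum Otto cycle bookkeeping behind "work harvested in the cycle and its efficiency" is, in textbook form (an assumed notation; the paper's coupled-spin Hamiltonian and its local/cooperative work decomposition add detail beyond this):

      Q_H = \sum_n E_n^{(1)} \bigl(P_n' - P_n\bigr), \qquad
      Q_C = \sum_n E_n^{(2)} \bigl(P_n - P_n'\bigr),

      W = Q_H + Q_C, \qquad \eta = \frac{W}{Q_H},

    where E_n^{(1)} and E_n^{(2)} are the energy eigenvalues at the high- and low-field ends of the two adiabatic strokes, P_n and P_n' are the occupation probabilities after the cold and hot isochores (occupations are preserved along the adiabats), and positive W means net work is extracted.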

  16. Multi-Attribute Tradespace Exploration in Space System Design

    NASA Astrophysics Data System (ADS)

    Ross, A. M.; Hastings, D. E.

    2002-01-01

    The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, managers). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.
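
    One common way to write down the "formal utility" aggregation referred to above, under the simplifying assumption of an additive multi-attribute utility (the MATE literature also uses the multiplicative Keeney-Raiffa form; the attributes and weights here are hypothetical):

      U(x_1,\dots,x_n) = \sum_{i=1}^{n} k_i \, u_i(x_i),
      \qquad \sum_{i=1}^{n} k_i = 1, \quad 0 \le u_i(x_i) \le 1,

    where each single-attribute utility u_i is elicited from stakeholders in formal utility interviews and the weights k_i encode the relative importance of attributes such as data rate, coverage, or cost.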

  17. A planning and scheduling lexicon

    NASA Technical Reports Server (NTRS)

    Cruz, Jennifer W.; Eggemeyer, William C.

    1989-01-01

    A lexicon related to mission planning and scheduling for spacecraft is presented. Planning and scheduling work is known as sequencing. Sequencing is a multistage process of merging requests from both the science and engineering arenas to accomplish the objectives defined in the requests. The multistage process begins with the creation of science and engineering goals, continues through their integration into the sequence, and eventually concludes with command execution onboard the spacecraft. The objective of this publication is to introduce some formalism into the field of spacecraft sequencing-system technology. This formalism will make it possible for researchers and potential customers to communicate about system requirements and capabilities in a common language.

  18. Separating essentials from incidentals: an execution architecture for real-time control systems

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel; Reinholtz, Kirk

    2004-01-01

    This paper describes an execution architecture that makes real-time control systems far more analyzable and verifiable by aggressive separation of concerns. The architecture separates two key software concerns: transformations of global state, as defined in pure functions; and sequencing/timing of transformations, as performed by an engine that enforces four prime invariants. An important advantage of this architecture, besides facilitating verification, is that it encourages formal specification of systems in a vocabulary that brings systems engineering closer to software engineering.
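
    A minimal sketch of the separation described above, assuming a made-up valve/pressure example; this illustrates the pattern only, not the JPL architecture or its four prime invariants.

      # Sketch of "pure state transformations + sequencing engine" (illustrative only).
      from copy import deepcopy

      # Pure transformations: take a state dict, return a new one, no side effects.
      def open_valve(state):
          return dict(state, valve="open")

      def raise_pressure(state, amount):
          return dict(state, pressure=state["pressure"] + amount)

      # Invariants the engine checks before committing any transformation.
      INVARIANTS = [
          lambda s: s["pressure"] <= 100,                              # rated pressure
          lambda s: not (s["valve"] == "closed" and s["pressure"] > 50),
      ]

      def engine(state, steps):
          """Apply steps in order; commit a step only if all invariants hold."""
          for step in steps:
              candidate = step(deepcopy(state))
              if all(inv(candidate) for inv in INVARIANTS):
                  state = candidate
              else:
                  print("step rejected:", step)
          return state

      initial = {"valve": "closed", "pressure": 40}
      final = engine(initial, [open_valve, lambda s: raise_pressure(s, 30)])
      print(final)   # {'valve': 'open', 'pressure': 70}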

  19. Formalization of the engineering science discipline - knowledge engineering

    NASA Astrophysics Data System (ADS)

    Peng, Xiao

    Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. The subsequent engineers were not aware of the previous developments, because these projects were documented in a way that prevented the next generation of engineers from benefiting from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from the experienced to the next generation of engineers. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion needs to be developed which is capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. Propose and develop an innovative Knowledge-Based System (KBS), AVD KBS, forming a systematic approach facilitating knowledge management. 4. Demonstrate the efficiency advantages of AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, AVD KBS. Formalizing knowledge proves to significantly impact the effectiveness of aerospace knowledge retention and utilization.

  20. A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Aoyama, Mikio

    Information systems are ubiquitous in our daily life. Thus, information systems need to work appropriately anywhere, at any time, for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of system functionality. However, the diversity of usage contexts requires a fundamental change in our current thinking about information systems: from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of information systems. This chapter presents a method for capturing, structuring and reconciling the diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals by a goal lattice based on formal concept analysis, a semantic extension of lattice theory. We illustrate the effectiveness of the presented method through application to self-checkout systems for large-scale supermarkets.
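
    A small sketch of the formal concept analysis step itself (the general technique; the stakeholders and goals below are hypothetical, and the chapter's reconciliation method goes further): every formal concept is a pair (extent, intent) closed under the two derivation operators, and these pairs ordered by extent inclusion form the goal lattice.

      # Minimal formal concept analysis over a stakeholder/goal context
      # (illustrative only; names are hypothetical, not from the chapter).
      from itertools import combinations

      context = {
          "customer": {"fast_checkout", "easy_ui"},
          "store":    {"fast_checkout", "low_cost"},
          "auditor":  {"low_cost", "traceability"},
      }
      goals = set().union(*context.values())

      def common_goals(stakeholders):        # derivation operator A -> A'
          if not stakeholders:
              return set(goals)
          return set.intersection(*(context[s] for s in stakeholders))

      def sharing_stakeholders(goal_set):    # derivation operator B -> B'
          return {s for s, g in context.items() if goal_set <= g}

      # A formal concept is a pair (A, B) with A' = B and B' = A; closing every
      # subset of stakeholders enumerates all of them.
      concepts = set()
      for r in range(len(context) + 1):
          for subset in combinations(context, r):
              intent = common_goals(set(subset))
              extent = sharing_stakeholders(intent)
              concepts.add((frozenset(extent), frozenset(intent)))

      for extent, intent in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
          print(sorted(extent) or "{}", "share", sorted(intent) or "{}")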

  1. Using Maxwell's Demon to Tame the "Devil in the Details" that are Encountered During System Development

    NASA Technical Reports Server (NTRS)

    Richardson, David

    2018-01-01

    Model-Based Systems Engineering (MBSE) is the formalized application of modeling to support system requirements, design, analysis, verification and validation activities beginning in the conceptual design phase and continuing throughout development and later life cycle phases. This presentation will discuss the value proposition that MBSE has for Systems Engineering, and the associated culture change needed to adopt it.

  2. Improving engineering system design by formal decomposition, sensitivity analysis, and optimization

    NASA Technical Reports Server (NTRS)

    Sobieski, J.; Barthelemy, J. F. M.

    1985-01-01

    A method for use in the design of a complex engineering system by decomposing the problem into a set of smaller subproblems is presented. Coupling of the subproblems is preserved by means of the sensitivity derivatives of the subproblem solution to the inputs received from the system. The method allows for the division of work among many people and computers.
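
    The coupling-by-sensitivity idea can be written as a chain rule over subproblem outputs; a two-subproblem sketch of the resulting linear system follows (the standard global sensitivity equations, given here as an assumed illustration of the formalism rather than the paper's exact notation). For outputs Y_1(X, Y_2) and Y_2(X, Y_1) sharing a design variable X,

      \frac{dY_1}{dX} = \frac{\partial Y_1}{\partial X} + \frac{\partial Y_1}{\partial Y_2}\frac{dY_2}{dX},
      \qquad
      \frac{dY_2}{dX} = \frac{\partial Y_2}{\partial X} + \frac{\partial Y_2}{\partial Y_1}\frac{dY_1}{dX},

    so the system-level sensitivities dY_i/dX are obtained by solving

      \begin{pmatrix} 1 & -\partial Y_1/\partial Y_2 \\ -\partial Y_2/\partial Y_1 & 1 \end{pmatrix}
      \begin{pmatrix} dY_1/dX \\ dY_2/dX \end{pmatrix}
      =
      \begin{pmatrix} \partial Y_1/\partial X \\ \partial Y_2/\partial X \end{pmatrix},

    with each partial derivative computed locally inside its own subproblem, which is what lets the work be divided among many people and computers.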

  3. Warfighting Concepts to Future Weapon System Designs (WARCON)

    DTIC Science & Technology

    2003-09-12

    ... Software design documents; A Material List; Final Engineering Process Maps; Cost information that may support, or may give rise to, litigation ... the engineering design, software development ... The document may include the design of the system as derived from the SRD. MTS Technologies, Inc. ... It is important to establish a standard, formal document early in the development phase; software engineers produce the vision of the design effort. As ...

  4. Formal methods in computer system design

    NASA Astrophysics Data System (ADS)

    Hoare, C. A. R.

    1989-12-01

    This note expounds a philosophy of engineering design which is stimulated, guided and checked by mathematical calculations and proofs. Its application to software engineering promises the same benefits as those derived from the use of mathematics in all other branches of modern science.

  5. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
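
    A toy sketch of the scenarios-to-model direction described above (my illustration only; the scenario wording, states, and events are hypothetical, and the paper's method targets provable equivalence rather than this ad hoc parsing): restricted natural-language scenarios become a transition relation that downstream tools can analyze or generate code from.

      # Toy "restricted natural language scenarios -> formal transition model"
      # sketch (illustrative only; scenario wording and states are hypothetical).
      import re

      scenarios = [
          "when start in IDLE go to DESCENT",
          "when engine_fault in DESCENT go to ABORT",
          "when touchdown in DESCENT go to LANDED",
      ]

      PATTERN = re.compile(r"when (\w+) in (\w+) go to (\w+)")

      # Transition relation: (state, event) -> next state.
      transitions = {}
      for s in scenarios:
          event, src, dst = PATTERN.fullmatch(s).groups()
          transitions[(src, event)] = dst

      def run(state, events):
          """Execute the derived model on an event trace."""
          for e in events:
              state = transitions.get((state, e), state)   # ignore unhandled events
          return state

      assert run("IDLE", ["start", "touchdown"]) == "LANDED"
      print(sorted(transitions.items()))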

  6. Discrete mathematics, formal methods, the Z schema and the software life cycle

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.

  7. Optical systems engineering - A tutorial

    NASA Technical Reports Server (NTRS)

    Wyman, C. L.

    1979-01-01

    The paper examines the use of the systems engineering approach in the design of optical systems, noting that the use of such an approach which involves an integrated interdisciplinary approach to the development of systems is most appropriate for optics. It is shown that the high precision character of optics leads to complex and subtle effects on optical system performance, resulting from structural, thermal dynamical, control system, and manufacturing and assembly considerations. Attention is given to communication problems that often occur among users and optical engineers due to the unique factors of optical systems. It is concluded that it is essential that the optics community provide leadership to resolve communication problems and fully formalize the field of optical systems engineering.

  8. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open Systems Interconnection (OSI) protocols.

  9. Pedagogical Basis of DAS Formalism in Engineering Education

    ERIC Educational Resources Information Center

    Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.

    2011-01-01

    The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…

  10. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs, and should be reviewed by both the supplier and the customer. All involved need to agree upon formal requirements, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods that are used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, are discussed.

  11. Third NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler)

    1995-01-01

    This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.

  12. Reconfigurable Hardware Adapts to Changing Mission Demands

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA's Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA's Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.

  13. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
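
    The following sketch (not the PVS development described above) illustrates, in miniature, the kind of correspondence such a proof establishes: an instruction-level specification of an operation is checked against a lower-level implementation built from smaller steps. The operation, word size, and sampling scheme are invented for illustration.

    ```python
    # Toy illustration of instruction-level vs. register-transfer-level agreement.
    WORD = 0xFFFF  # hypothetical 16-bit word size

    def isa_add(a, b):
        """Instruction-set-level specification: ADD with wrap-around."""
        return (a + b) & WORD

    def rtl_add(a, b):
        """Register-transfer-level 'microcode': a ripple of 1-bit full adders."""
        result, carry = 0, 0
        for i in range(16):
            x, y = (a >> i) & 1, (b >> i) & 1
            s = x ^ y ^ carry
            carry = (x & y) | (carry & (x ^ y))
            result |= s << i
        return result

    # Check agreement over a representative subset of operands.
    for a in range(0, 0x10000, 257):
        for b in range(0, 0x10000, 263):
            assert isa_add(a, b) == rtl_add(a, b)
    print("RTL adder agrees with the ISA specification on the sampled operands.")
    ```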

  14. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
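
    To make the idea concrete, a minimal sketch follows that turns scenario-style requirements, written in a restricted "when X in state S, go to state T" form, into a simple state-transition model. The phrasing and scenario content are invented for illustration and are not the authors' actual notation or procedure.

    ```python
    # Minimal sketch: scenario sentences in a restricted form -> a transition table.
    import re

    PATTERN = re.compile(r"when (?P<event>\w+) in (?P<src>\w+), go to (?P<dst>\w+)")

    def scenarios_to_model(sentences):
        """Build {(state, event): next_state} from restricted-English scenarios."""
        model = {}
        for s in sentences:
            m = PATTERN.fullmatch(s.strip().lower())
            if not m:
                raise ValueError(f"not in the restricted language: {s!r}")
            model[(m["src"], m["event"])] = m["dst"]
        return model

    requirements = [
        "when start in idle, go to running",
        "when fault in running, go to safe",
        "when reset in safe, go to idle",
    ]
    model = scenarios_to_model(requirements)
    assert model[("idle", "start")] == "running"
    print(model)
    ```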

  15. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  16. A Model-based Approach to Reactive Self-Configuring Systems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Nayak, P. Pandurang

    1996-01-01

    This paper describes Livingstone, an implemented kernel for a self-reconfiguring autonomous system, that is reactive and uses component-based declarative models. The paper presents a formal characterization of the representation formalism used in Livingstone, and reports on our experience with the implementation in a variety of domains. Livingstone's representation formalism achieves broad coverage of hybrid software/hardware systems by coupling the concurrent transition system models underlying concurrent reactive languages with the discrete qualitative representations developed in model-based reasoning. We achieve a reactive system that performs significant deductions in the sense/response loop by drawing on our past experience at building fast propositional conflict-based algorithms for model-based diagnosis, and by framing a model-based configuration manager as a propositional, conflict-based feedback controller that generates focused, optimal responses. Livingstone automates all these tasks using a single model and a single core deductive engine, thus making significant progress towards achieving a central goal of model-based reasoning. Livingstone, together with the HSTS planning and scheduling engine and the RAPS executive, has been selected as the core autonomy architecture for Deep Space One, the first spacecraft for NASA's New Millennium program.
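
    The following toy sketch, with an invented two-component circuit, gives the flavor of propositional, conflict-based mode estimation: candidate mode assignments are searched for one consistent with the observations, preferring nominal (low fault-cost) modes. It only illustrates the style of reasoning and is not Livingstone's algorithm.

    ```python
    # Toy mode estimation: find a mode assignment consistent with observations.
    from itertools import product

    MODES = {"valve": ["open", "stuck_closed"], "sensor": ["ok", "failed"]}
    FAULT_COST = {"open": 0, "stuck_closed": 1, "ok": 0, "failed": 1}

    def consistent(modes, obs):
        """Component models: flow is read only if the valve is open and the sensor works."""
        predicted_flow = modes["valve"] == "open" and modes["sensor"] == "ok"
        if modes["sensor"] == "failed":
            return True  # a failed sensor makes the reading uninformative
        return predicted_flow == obs["flow_reading"]

    def diagnose(obs):
        candidates = [dict(zip(MODES, m)) for m in product(*MODES.values())]
        candidates.sort(key=lambda m: sum(FAULT_COST[v] for v in m.values()))
        return next(m for m in candidates if consistent(m, obs))

    print(diagnose({"flow_reading": False}))  # a cheapest explanation of missing flow
    ```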

  17. Applications of formal simulation languages in the control and monitoring subsystems of Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Lacovara, R. C.

    1990-01-01

    The notions, benefits, and drawbacks of numeric simulation are introduced. Two formal simulation languages, Simscript and Modsim, are introduced. The capabilities of each are discussed briefly, and then the two programs are compared. The use of simulation in the process of design engineering for the Control and Monitoring System (CMS) for Space Station Freedom is discussed. The application of the formal simulation languages to the CMS design is presented, and recommendations are made as to their use.

  18. META-GLARE: A meta-system for defining your own computer interpretable guideline system-Architecture and acquisition.

    PubMed

    Bottrighi, Alessio; Terenziani, Paolo

    2016-09-01

    Several different computer-assisted management systems for computer interpretable guidelines (CIGs) have been developed by the Artificial Intelligence in Medicine community. Each CIG system is characterized by a specific formalism to represent CIGs and usually provides a manager to acquire, consult, and execute them. Though there are several commonalities between most formalisms in the literature, each formalism has its own peculiarities. The goal of our work is to provide flexible support for the extension or definition of CIG formalisms and of their acquisition and execution engines. Instead of defining "yet another CIG formalism and its manager", we propose META-GLARE (META Guideline Acquisition, Representation, and Execution), a "meta"-system to define new CIG systems. In this paper, the architecture and acquisition support of META-GLARE are presented. We try to capture the commonalities among current CIG approaches by providing (i) a general manager for the acquisition, consultation, and execution of hierarchical graphs (representing the control flow of actions in CIGs), parameterized over the types of nodes and arcs constituting it, and (ii) a library of different elementary components of guideline nodes (actions) and arcs, in which each type definition involves the specification of how objects of this type can be acquired, consulted, and executed. We provide generality and flexibility by allowing free aggregations of such elementary components to define new primitive node and arc types. We have carried out several experiments in which we used META-GLARE to build a CIG system (Experiment 1 in Section 8) or to extend one (Experiments 2 and 3). These experiments show that META-GLARE provides useful and easy-to-use support for such tasks. For instance, re-building the Guideline Acquisition, Representation, and Execution (GLARE) system using META-GLARE required less than one day (Experiment 1). META-GLARE is a meta-system for CIGs supporting fast prototyping. Since META-GLARE provides acquisition and execution engines that are parametric over the specific CIG formalism, it supports easy update and construction of CIG systems. Copyright © 2016 Elsevier B.V. All rights reserved.
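
    A minimal sketch of the kind of parameterization described above follows: node types are registered together with their acquisition and execution handlers, and a generic engine walks a sequence of typed nodes. All type names and handlers are hypothetical; this is not META-GLARE's actual API.

    ```python
    # Sketch: a CIG engine parameterized over node types (hypothetical handlers).
    NODE_TYPES = {}

    def register_node_type(name, acquire, execute):
        """Each node type bundles how its instances are acquired and executed."""
        NODE_TYPES[name] = {"acquire": acquire, "execute": execute}

    register_node_type(
        "query",
        acquire=lambda spec: {"question": spec},
        execute=lambda node, ctx: print(f"ask: {node['question']}"),
    )
    register_node_type(
        "drug_administration",
        acquire=lambda spec: {"drug": spec},
        execute=lambda node, ctx: print(f"administer: {node['drug']}"),
    )

    def run(flow, ctx=None):
        """Generic execution engine: dispatch each node to its type's handler."""
        for type_name, spec in flow:
            node = NODE_TYPES[type_name]["acquire"](spec)
            NODE_TYPES[type_name]["execute"](node, ctx)

    run([("query", "fever > 38C?"), ("drug_administration", "paracetamol 500 mg")])
    ```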

  19. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  20. Executable Architecture Research at Old Dominion University

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.

    2011-01-01

    Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.

  1. What can formal methods offer to digital flight control systems design

    NASA Technical Reports Server (NTRS)

    Good, Donald I.

    1990-01-01

    Formal methods research is beginning to produce methods that will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling, which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.

  2. Systems engineering: A formal approach. Part 1: System concepts

    NASA Astrophysics Data System (ADS)

    Vanhee, K. M.

    1993-03-01

    Engineering is the scientific discipline focused on the creation of new artifacts that are supposed to be of some use to our society. Different types of artifacts require different engineering approaches. However, in all these disciplines the development of a new artifact is divided into stages. Three stages can always be recognized: Analysis, Design, and Realization. The book considers only the first two stages of the development process. It focuses on a specific type of artifact, called discrete dynamic systems. These systems consist of active components, or actors, that consume and produce passive components, or tokens. Three subtypes are studied in more detail: business systems (like a factory or restaurant), information systems (whether automated or not), and automated systems (systems that are controlled by an automated information system). The first subtype is studied by industrial engineers, the last by software engineers and electrical engineers, whereas the second is a battlefield for all three disciplines. The union of these disciplines is called systems engineering.
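
    A minimal sketch of such a discrete dynamic system follows: actors fire when their input tokens are available, consuming and producing tokens much as in a Petri-net-style model. The example system (a tiny "restaurant") is invented and is not taken from the book.

    ```python
    # Sketch of a discrete dynamic system: actors consume and produce tokens.
    from collections import Counter

    def fire(actor, store):
        """Fire an actor if its input tokens are available; return True on success."""
        consume, produce = actor["consumes"], actor["produces"]
        if all(store[token] >= n for token, n in consume.items()):
            store.subtract(consume)
            store.update(produce)
            return True
        return False

    cook = {"consumes": {"order": 1, "ingredients": 1}, "produces": {"meal": 1}}
    serve = {"consumes": {"meal": 1}, "produces": {"served_customer": 1}}

    store = Counter(order=2, ingredients=1)
    for actor in (cook, serve, cook):      # the second 'cook' fails: no ingredients left
        fire(actor, store)
    print(dict(store))  # {'order': 1, 'ingredients': 0, 'meal': 0, 'served_customer': 1}
    ```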

  3. Highlights From the Third Annual Mayo Clinic Conference on Systems Engineering and Operations Research in Health Care

    PubMed Central

    Kamath, Janine R. A.; Osborn, John B.; Roger, Véronique L.; Rohleder, Thomas R.

    2011-01-01

    In August 2010, the Third Annual Mayo Clinic Conference on Systems Engineering and Operations Research in Health Care was held. The continuing mission of the conference is to gather a multidisciplinary group of systems engineers, clinicians, administrators, and academic professors to discuss the translation of systems engineering methods to more effective health care delivery. Education, research, and practice were enhanced via a mix of formal presentations, tutorials, and informal gatherings of participants with diverse backgrounds. Although the conference promotes a diversity of perspectives and methods, participants are united in their desire to find ways in which systems engineering can transform health care, especially in the context of health care reform and other significant changes affecting the delivery of health care. PMID:21803959

  4. Toward a Model-Based Approach to Flight System Fault Protection

    NASA Technical Reports Server (NTRS)

    Day, John; Murray, Alex; Meakin, Peter

    2012-01-01

    Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP-related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Development of approaches that enable modeling of FP concerns in the same model as the system hardware and software design enables the establishment of formal relationships, which has great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain and then continues with a presentation of a SysML/UML model of the FP domain and the particular analyses that it contains, by way of showing a potential model-based approach to flight system fault protection, and an exposition of the use of the FP models in FSW engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.

  5. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has so far been addressed through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering (MBSSE) process derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical, and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties at different stages, from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
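
    The following sketch, with invented predicates, shows the shape of a single contract-refinement check of the kind such a methodology automates: a component contract refines a system contract if it assumes no more and guarantees no less, here checked over sampled values rather than proved. It is an illustration of the concept, not the FoReVer tool chain.

    ```python
    # Sketch: assume/guarantee contracts and a sampled refinement check.
    system_contract = {
        "assume":    lambda x: 0 <= x <= 100,        # environment keeps x in range
        "guarantee": lambda x, y: y >= x,            # output never below input
    }
    component_contract = {
        "assume":    lambda x: 0 <= x <= 200,        # weaker assumption: acceptable
        "guarantee": lambda x, y: y == x + 1,        # stronger guarantee: acceptable
    }

    def refines(component, system, samples):
        """Component refines system: weaker (or equal) assumption, stronger guarantee."""
        for x in samples:
            if system["assume"](x) and not component["assume"](x):
                return False
            for y in range(x, x + 3):
                if system["assume"](x) and component["guarantee"](x, y) \
                        and not system["guarantee"](x, y):
                    return False
        return True

    print(refines(component_contract, system_contract, samples=range(0, 101, 5)))  # True
    ```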

  6. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  7. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings Appendices

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.

  8. Towards an Automated Development Methodology for Dependable Systems with Application to Sensor Networks

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  9. SU-E-T-785: Using Systems Engineering to Design HDR Skin Treatment Operation for Small Lesions to Enhance Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saw, C; Baikadi, M; Peters, C

    2015-06-15

    Purpose: Using systems engineering to design an HDR skin treatment operation for small lesions using shielded applicators, in order to enhance patient safety. Methods: Systems engineering is an interdisciplinary field that offers formal methodologies to study, design, implement, and manage complex engineering systems as a whole over their life cycles. The methodologies deal with human work processes, coordination of different teams, optimization, and risk management. The V-model of systems engineering emphasizes two streams, the specification and the testing streams. The specification stream consists of user requirements, functional requirements, and design specifications, while the testing stream covers installation, operational, and performance specifications. In applying systems engineering to this project, the user and functional requirements are that (a) HDR unit parameters be downloaded from the treatment planning system, (b) dwell times and positions be generated by the treatment planning system, (c) source decay be computer calculated, and (d) a double-check system of treatment parameters comply with the NRC regulation. These requirements are intended to reduce human intervention and improve patient safety. Results: A formal investigation indicated that the user requirements can be satisfied. The treatment operation consists of using the treatment planning system to generate a pseudo plan that is adjusted for different shielded applicators to compute the dwell times. The dwell positions, channel numbers, and dwell times are verified by the medical physicist and downloaded into the HDR unit. The decayed source strength is transferred to a spreadsheet that computes the dwell times based on the type of applicator and prescribed dose used. Prior to treatment, the source strength, dwell times, dwell positions, and channel numbers are double-checked by the radiation oncologist. No dosimetric parameters are manually calculated. Conclusion: Systems engineering provides methodologies to effectively design the HDR treatment operation in a way that minimizes human intervention and improves patient safety.
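
    As a hedged illustration of the kind of calculation the spreadsheet step above automates (with generic numbers, not the clinic's actual data or workflow), decayed source strength follows ordinary exponential decay, and dwell times scale inversely with the current strength so that the delivered dose stays constant.

    ```python
    # Illustrative sketch: decay-corrected dwell times for an HDR source.
    import math

    def decayed_strength(initial_strength, days_elapsed, half_life_days):
        """Exponential decay of source strength (e.g., air-kerma strength)."""
        return initial_strength * math.exp(-math.log(2) * days_elapsed / half_life_days)

    def corrected_dwell_time(planned_time_s, reference_strength, current_strength):
        """Dose ~ strength x time, so time scales inversely with current strength."""
        return planned_time_s * reference_strength / current_strength

    ref = 40.0                                        # hypothetical strength at planning
    now = decayed_strength(ref, days_elapsed=30, half_life_days=73.8)  # Ir-192 ~73.8 d
    print(round(corrected_dwell_time(10.0, ref, now), 2))  # planned 10 s -> ~13.3 s
    ```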

  10. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  11. 75 FR 34170 - Chrysler Group LLC, Formally Known as Chrysler LLC, Kenosha Engine Plant, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-16

    ... automobile engines. The company reports that workers leased from Caravan Knight Facilities Management, LLC..., Formally Known as Chrysler LLC, Kenosha Engine Plant, Including On-Site Leased Workers From Caravan Knight..., Kenosha Engine Plant, Kenosha, Wisconsin. The notice was published in the Federal Register.

  12. Using Life-Cycle Human Factors Engineering to Avoid $2.4 Million in Costs: Lessons Learned from NASA's Requirements Verification Process for Space Payloads

    NASA Technical Reports Server (NTRS)

    Carr, Daniel; Ellenberger, Rich

    2008-01-01

    The Human Factors Implementation Team (HFIT) process has been used to verify human factors requirements for NASA International Space Station (ISS) payloads since 2003, resulting in $2.4 million in avoided costs. This cost benefit has been realized by greatly reducing the need to process time-consuming formal waivers (exceptions) for individual requirements violations. The HFIT team, which includes astronauts and their technical staff, acts as the single source for human factors requirements integration of payloads. HFIT has the authority to provide inputs during early design phases, thus eliminating many potential requirements violations in a cost-effective manner. In those instances where it is not economically or technically feasible to meet the precise metric of a given requirement, HFIT can work with the payload engineers to develop common-sense solutions and formally document that the resulting payload design does not materially affect the astronaut's ability to operate and interact with the payload. The HFIT process is fully ISO 9000 compliant and works concurrently with NASA's formal systems engineering work flow. Due to its success with payloads, the HFIT process is being adapted and extended to ISS systems hardware. Key aspects of this process are also being considered for NASA's Space Shuttle replacement, the Crew Exploration Vehicle.

  13. Architecting the Human Space Flight Program with Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Jackson, Maddalena M.; Fernandez, Michela Munoz; McVittie, Thomas I.; Sindiy, Oleg V.

    2012-01-01

    The next generation of missions in NASA's Human Space Flight program focuses on the development and deployment of highly complex systems (e.g., Orion Multi-Purpose Crew Vehicle, Space Launch System, 21st Century Ground System) that will enable astronauts to venture beyond low Earth orbit and explore the moon, near-Earth asteroids, and beyond. Architecting these highly complex systems-of-systems requires formal systems engineering techniques for managing the evolution of the technical features in the information exchange domain (e.g., data exchanges, communication networks, ground software) and also formal correlation of the technical architecture to stakeholders' programmatic concerns (e.g., budget, schedule, risk) and design development (e.g., assumptions, constraints, trades, tracking of unknowns). This paper describes how the authors have applied the Systems Modeling Language (SysML) to implement model-based systems engineering for managing the description of the End-to-End Information System (EEIS) architecture and associated development activities, ultimately enabling stakeholders to understand, reason about, and answer questions about the EEIS under design for the proposed lunar Exploration Missions 1 and 2 (EM-1 and EM-2).

  14. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  16. 75 FR 52982 - Chrysler Group LLC, Formally Known as Chrysler LLC, Kenosha Engine Plant, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ... automobile engines. The company reports that workers leased from Syncreon were employed on-site at the..., Formally Known as Chrysler LLC, Kenosha Engine Plant, Including On-Site Leased Workers From Caravan Knight... Chrysler, LLC, Kenosha Engine Plant, Kenosha, Wisconsin. The notice was published in the Federal Register...

  17. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  18. Recent Developments: PKI Square Dish for the Soleras Project

    NASA Technical Reports Server (NTRS)

    Rogers, W. E.

    1984-01-01

    The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates a performance efficiency of approximately 72% at a 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.

  19. Recent developments: PKI square dish for the Soleras Project

    NASA Astrophysics Data System (ADS)

    Rogers, W. E.

    1984-03-01

    The Square Dish solar collectors are subjected to rigorous design attention regarding corrosion at the site and certification of the collector structure. The microprocessor controls and tracking mechanisms are improved in the areas of fail-safe operation, durability, and low parasitic power requirements. Prototype testing demonstrates a performance efficiency of approximately 72% at a 730 F outlet temperature. Studies are conducted that include developing formal engineering design studies, developing formal engineering design drawings and fabrication details, establishing subcontracts for fabrication of major components, and developing a rigorous quality control system. The improved design is more cost-effective to produce, and the extensive manuals developed for assembly and operation/maintenance result in faster field assembly and ease of operation.

  20. The VATES-Diamond as a Verifier's Best Friend

    NASA Astrophysics Data System (ADS)

    Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz

    Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.

  1. Candidate Causes. Sediments. In: Causal Analysis, Diagnosis Decision Information System, USEPA Website

    EPA Science Inventory

    CADDIS is an online application that helps scientists and engineers in the Regions, States, and Tribes find, access, organize, use, and share information to conduct causal evaluations in aquatic systems. It is based on the USEPA stressor identification process, a formal method fo...

  2. Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings.

    PubMed

    Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P; Kravitz, Richard L; Owen, Richard R; Sullivan, J Greer; Wu, Albert W; Di Capua, Paul; Hoagwood, Kimberly Eaton

    2015-09-01

    Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to accommodate local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context.

  3. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  4. Universal Quantum Computing with Arbitrary Continuous-Variable Encoding.

    PubMed

    Lau, Hoi-Kwan; Plenio, Martin B

    2016-09-02

    Implementing a qubit quantum computer in continuous-variable systems conventionally requires the engineering of specific interactions according to the encoding basis states. In this work, we present a unified formalism to conduct universal quantum computation with a fixed set of operations but arbitrary encoding. By storing a qubit in the parity of two or four qumodes, all computing processes can be implemented by basis state preparations, continuous-variable exponential-swap operations, and swap tests. Our formalism inherits the advantages that the quantum information is decoupled from collective noise, and logical qubits with different encodings can be brought to interact without decoding. We also propose a possible implementation of the required operations by using interactions that are available in a variety of continuous-variable systems. Our work separates the "hardware" problem of engineering quantum-computing-universal interactions, from the "software" problem of designing encodings for specific purposes. The development of quantum computer architecture could hence be simplified.
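
    As a small mathematical aside (not drawn from the paper itself), the continuous-variable exponential-swap operation mentioned above has a simple closed form, because the SWAP operator between two systems squares to the identity:

    \[
      e^{i\theta\,\mathrm{SWAP}} \;=\; \cos\theta\,\mathbb{I} \;+\; i\,\sin\theta\,\mathrm{SWAP},
      \qquad \text{since } \mathrm{SWAP}^2 = \mathbb{I}.
    \]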

  5. Universal Quantum Computing with Arbitrary Continuous-Variable Encoding

    NASA Astrophysics Data System (ADS)

    Lau, Hoi-Kwan; Plenio, Martin B.

    2016-09-01

    Implementing a qubit quantum computer in continuous-variable systems conventionally requires the engineering of specific interactions according to the encoding basis states. In this work, we present a unified formalism to conduct universal quantum computation with a fixed set of operations but arbitrary encoding. By storing a qubit in the parity of two or four qumodes, all computing processes can be implemented by basis state preparations, continuous-variable exponential-swap operations, and swap tests. Our formalism inherits the advantages that the quantum information is decoupled from collective noise, and logical qubits with different encodings can be brought to interact without decoding. We also propose a possible implementation of the required operations by using interactions that are available in a variety of continuous-variable systems. Our work separates the "hardware" problem of engineering quantum-computing-universal interactions, from the "software" problem of designing encodings for specific purposes. The development of quantum computer architecture could hence be simplified.

  6. Weaving a Formal Methods Education with Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Gibson, J. Paul

    The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.

  7. Air Force Space Command. Space and Missile Systems Center Standard. Configuration Management

    DTIC Science & Technology

    2008-06-13

    Aerospace Corporation report number TOR-2006(8583)-1. 3. Beneficial comments (recommendations, additions, deletions) and any pertinent data that...Engineering Drawing Practices; IEEE STD 610.12, Glossary of Software Engineering Terminology, September 28, 1990; ISO/IEC 12207, Software Life...item, regardless of media, formally designated and fixed at a specific time during the configuration item's life cycle. (Source: ISO/IEC 12207

  8. Preparing engineers for the challenges of community engagement

    NASA Astrophysics Data System (ADS)

    Harsh, Matthew; Bernstein, Michael J.; Wetmore, Jameson; Cozzens, Susan; Woodson, Thomas; Castillo, Rafael

    2017-11-01

    Despite calls to address global challenges through community engagement, engineers are not formally prepared to engage with communities. Little research has been done on means to address this 'engagement gap' in engineering education. We examine the efficacy of an intensive, two-day Community Engagement Workshop for engineers, designed to help engineers better look beyond technology, listen to and learn from people, and empower communities. We assessed the efficacy of the workshop in a non-experimental pre-post design using a questionnaire and a concept map. Questionnaire results indicate participants came away better able to ask questions more broadly inclusive of non-technological dimensions of engineering projects. Concept map results indicate participants have a greater understanding of ways social factors shape complex material systems after completing the programme. Based on the workshop's strengths and weaknesses, we discuss the potential of expanding and supplementing the programme to help engineers account for social aspects central to engineered systems.

  9. Perspectives on knowledge in engineering design

    NASA Technical Reports Server (NTRS)

    Rasdorf, W. J.

    1985-01-01

    Various perspectives are given of the knowledge currently used in engineering design, specifically dealing with knowledge-based expert systems (KBES). Constructing an expert system often reveals inconsistencies in domain knowledge while formalizing it. The types of domain knowledge (facts, procedures, judgments, and control) differ from the classes of that knowledge (creative, innovative, and routine). The feasible tasks for expert systems can be determined based on these types and classes of knowledge. Interpretive tasks require reasoning about a task in light of the knowledge available, whereas generative tasks create potential solutions to be tested against constraints. Only after classifying the domain by type and level can the engineer select a knowledge-engineering tool for the domain being considered. The critical features to be weighed after classification are knowledge representation techniques, control strategies, interface requirements, compatibility with traditional systems, and economic considerations.

  10. ENGINEERING BULLETIN: PYROLYSIS TREATMENT

    EPA Science Inventory

    Pyrolysis is formally defined as chemical decomposition induced in organic materials by heat in the absence of oxygen. In practice, it is not possible to achieve a completely oxygen-free atmosphere; actual pyrolytic systems are operated with less than stoichiometric quantities of...

  11. Recent trends related to the use of formal methods in software engineering

    NASA Technical Reports Server (NTRS)

    Prehn, Soren

    1986-01-01

    An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.

  12. The path to next generation biofuels: successes and challenges in the era of synthetic biology

    PubMed Central

    2010-01-01

    Volatility of oil prices along with major concerns about climate change, oil supply security and depleting reserves have sparked renewed interest in the production of fuels from renewable resources. Recent advances in synthetic biology provide new tools for metabolic engineers to direct their strategies and construct optimal biocatalysts for the sustainable production of biofuels. Metabolic engineering and synthetic biology efforts entailing the engineering of native and de novo pathways for conversion of biomass constituents to short-chain alcohols and advanced biofuels are herewith reviewed. In the foreseeable future, formal integration of functional genomics and systems biology with synthetic biology and metabolic engineering will undoubtedly support the discovery, characterization, and engineering of new metabolic routes and more efficient microbial systems for the production of biofuels. PMID:20089184

  13. Reengineering Framework for Systems in Education

    ERIC Educational Resources Information Center

    Choquet, Christophe; Corbiere, Alain

    2006-01-01

    Specifications recently proposed as standards in the domain of Technology Enhanced Learning (TEL), question the designers of TEL systems on how to put them into practice. Recent studies in Model Driven Engineering have highlighted the need for a framework which could formalize the use of these specifications as well as enhance the quality of the…

  14. Environmental Science and Engineering Merit Badges: An Exploratory Case Study of a Non-Formal Science Education Program and the U.S. Scientific and Engineering Practices

    ERIC Educational Resources Information Center

    Vick, Matthew E.; Garvey, Michael P.

    2016-01-01

    The Boy Scouts of America's Environmental Science and Engineering merit badges are two of their over 120 merit badges offered as a part of a non-formal educational program to U.S. boys. The Scientific and Engineering Practices of the U.S. Next Generation Science Standards provide a vision of science education that includes integrating eight…

  15. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.
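
    A minimal sketch (with invented requirements and design options) of the "design by shopping" idea described above: candidate designs are filtered by hard constraints and ranked by preferences, so the range of acceptable choices and their ramifications stays visible. It is an illustration of the concept, not the prototype described in the paper.

    ```python
    # Sketch: requirements as constraints and preferences over design options.
    options = [
        {"name": "design_A", "mass_kg": 40, "power_w": 120, "cost_k": 300},
        {"name": "design_B", "mass_kg": 55, "power_w": 90,  "cost_k": 260},
        {"name": "design_C", "mass_kg": 35, "power_w": 160, "cost_k": 410},
    ]

    constraints = [                       # hard requirements: must all hold
        lambda d: d["mass_kg"] <= 50,
        lambda d: d["power_w"] <= 150,
    ]
    preference = lambda d: d["cost_k"]    # soft requirement: lower cost preferred

    feasible = [d for d in options if all(c(d) for c in constraints)]
    for d in sorted(feasible, key=preference):
        print(d["name"], d["cost_k"])     # design_A is the only feasible option here
    ```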

  16. Youth's Engagement as Scientists and Engineers in an Afterschool Making and Tinkering Program

    NASA Astrophysics Data System (ADS)

    Simpson, Amber; Burris, Alexandra; Maltese, Adam

    2017-11-01

    Making and tinkering is currently gaining traction as an interdisciplinary approach to education. However, little is known about how these activities and explorations in formal and informal learning spaces address the content and skills common to professionals across science, technology, engineering, and mathematics. As such, the purpose of this qualitative study was to examine how youth were engaged in the eight science and engineering practices outlined within the US Next Generation Science Standards within an informal learning environment utilizing principles of tinkering within the daily activities. Findings highlight how youth and facilitators engaged in and enacted practices common to scientists and engineers. Yet, in this study, enactment of these practices "looked" different from what might be expected in a formal learning environment such as a laboratory setting. For example, in this setting, students were observed carrying out trials on their design as opposed to carrying out a formal scientific investigation. Results also highlight instances of doing science and engineering not explicitly stated within the parameters of formal education documents in the USA, such as experiences with failure.

  17. Systems Engineering and Integration for Technology Programs

    NASA Technical Reports Server (NTRS)

    Kennedy, Kruss J.

    2006-01-01

    The Architecture, Habitability & Integration group (AH&I) is a system engineering and integration test team within the NASA Crew and Thermal Systems Division (CTSD) at Johnson Space Center. AH&I identifies and resolves system-level integration issues within the research and technology development community. The timely resolution of these integration issues is fundamental to the development of human system requirements and exploration capability. The integration of the many individual components necessary to construct an artificial environment is difficult. The necessary interactions between individual components and systems must be approached in a piece-wise fashion to achieve repeatable results. A formal systems engineering (SE) approach to define, develop, and integrate quality systems within the life support community has been developed. This approach will allow a Research & Technology Program to systematically approach the development, management, and quality of technology deliverables to the various exploration missions. A tiered system engineering structure has been proposed to implement best systems engineering practices across all development levels from basic research to working assemblies. These practices will be implemented through a management plan across all applicable programs, projects, elements and teams. While many of the engineering practices are common to other industries, the implementation is specific to technology development. An accounting of the systems engineering management philosophy will be discussed and the associated programmatic processes will be presented.

  18. Haptic Paddle Enhancements and a Formal Assessment of Student Learning in System Dynamics

    ERIC Educational Resources Information Center

    Gorlewicz, Jenna L.; Kratchman, Louis B.; Webster, Robert J., III

    2014-01-01

    The haptic paddle is a force-feedback joystick used at several universities in teaching System Dynamics, a core mechanical engineering undergraduate course where students learn to model dynamic systems in several domains. A second goal of the haptic paddle is to increase the accessibility of robotics and haptics by providing a low-cost device for…

  19. Formalization of the Access Control on ARM-Android Platform with the B Method

    NASA Astrophysics Data System (ADS)

    Ren, Lu; Wang, Wei; Zhu, Xiaodong; Man, Yujia; Yin, Qing

    2018-01-01

    ARM-Android is a widespread mobile platform with multi-layer access control mechanisms that are security-critical to the system. Many access control vulnerabilities still exist due to the coarse-grained policy and numerous engineering defects, which have been widely studied. However, little research has focused on formalizing these mechanisms, including the Android permission framework, kernel process management, and hardware isolation. This paper first develops a comprehensive formal access control model on the ARM-Android platform using the B method, from the Android middleware to the hardware layer. All the model specifications are type checked and proved to be well-defined, with 75% of proof obligations demonstrated automatically. The results show that the proposed B model is feasible to specify and verify access control schemes in the ARM-Android system, and capable of implementing a practical control module.
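
    As a loose illustration of what such a formal model pins down (not the B machines from the paper), the sketch below states a simple invariant over a permission-granting operation, that an app may be granted only permissions it has declared and the user has approved, and checks it after each state change; all names and data are hypothetical.

    ```python
    # Toy sketch: an access-control state with an invariant checked after each operation.
    state = {
        "declared": {("app1", "CAMERA"), ("app1", "LOCATION")},
        "approved": {("app1", "CAMERA")},
        "granted":  set(),
    }

    def invariant(s):
        """Granted permissions must be both declared and approved."""
        return s["granted"] <= (s["declared"] & s["approved"])

    def grant(s, app, permission):
        if (app, permission) in s["declared"] and (app, permission) in s["approved"]:
            s["granted"].add((app, permission))
        assert invariant(s), "access-control invariant violated"

    grant(state, "app1", "CAMERA")      # allowed: declared and approved
    grant(state, "app1", "LOCATION")    # ignored: declared but not approved
    print(state["granted"])             # {('app1', 'CAMERA')}
    ```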

  20. The significance of requirements engineering for the medical domain.

    PubMed

    Kossmann, Mario

    2014-07-01

    This paper aims to raise awareness of the importance of Requirements Engineering (RE) for the successful and efficient development of high-quality systems and products for the medical domain. It does so by providing an introduction to RE from the viewpoints of project and programme management and systems engineering in general and by illustrating the usefulness of a sound RE approach to the development of a local healthcare system in a deprived region in central Africa. The paper concludes that RE is just as crucial for the development of systems and products in the medical domain as it is for the development of systems in the aerospace industry or software systems in the consumer electronics industry, although the degree of detail and formality of how RE is used has to be tailored to fit the context in question.

  1. Experiences with a Requirements-Based Programming Approach to the Development of a NASA Autonomous Ground Control System

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Requirements-to-Design-to-Code (R2D2C) is an approach to the engineering of computer-based systems that embodies the idea of requirements-based programming in system development. It goes further, however, in that the approach offers not only an underlying formalism, but full formal development from requirements capture through to the automatic generation of provably-correct code. As such, the approach has direct application to the development of systems requiring autonomic properties. We describe a prototype tool to support the method, and illustrate its applicability to the development of LOGOS, a NASA autonomous ground control system, which exhibits autonomic behavior. Finally, we briefly discuss other areas where the approach and prototype tool are being considered for application.

  2. NACA Conference on Turbojet-Engine Thrust Augmentation Research: A Compilation of the Papers Presented by NACA Staff Members

    NASA Technical Reports Server (NTRS)

    1948-01-01

    The conference on Turbojet-Engine Thrust-Augmentation Research was organized by the NACA to present in summarized form the results of the latest experimental and analytical investigations conducted at the Lewis Flight Propulsion Laboratory on methods of augmenting the thrust of turbojet engines. The technical discussions are reproduced herewith in the same form in which they were presented. The original presentations in this record are considered as complementary to, rather than substitutes for, the committee's system of complete and formal reports.

  3. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  4. Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings

    PubMed Central

    Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P.; Kravitz, Richard L.; Owen, Richard R.; Sullivan, Greer; Wu, Albert W.; Di Capua, Paul; Hoagwood, Kimberly Eaton

    2015-01-01

    Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to accommodate local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context. PMID:25217100

  5. Designing an architectural style for Pervasive Healthcare systems.

    PubMed

    Rafe, Vahid; Hajvali, Masoumeh

    2013-04-01

    Pervasive Healthcare (PH) systems are now considered an important research area. These systems have a dynamic structure and configuration, so an appropriate method for designing them is necessary. The publish/subscribe (pub/sub) architecture is one of the convenient architectures for supporting such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, and a suitable formal language such as graph transformation systems is needed for developing these systems. Even when software engineers use such high-level methodologies, errors may still occur in the system under design. Hence, whether the system model satisfies all of its requirements should be checked automatically and formally. In this paper, a dynamic architectural style for developing PH systems is presented. The behavior of these systems is then modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.
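
    As a minimal illustration of the publish/subscribe style mentioned above (not the paper's GROOVE model; topic names and the vital-signs scenario are invented), components communicate only through a broker, which is what gives PH systems their dynamic structure:

      from collections import defaultdict

      class Broker:
          """Toy pub/sub broker: topics map to lists of subscriber callbacks."""

          def __init__(self):
              self.subscribers = defaultdict(list)

          def subscribe(self, topic, callback):
              self.subscribers[topic].append(callback)

          def publish(self, topic, message):
              for callback in self.subscribers[topic]:
                  callback(message)

      broker = Broker()

      # A monitoring component reacts to sensor events without the sensor knowing it exists.
      def alarm_service(event):
          if event["spo2"] < 90:
              print(f"ALERT: low SpO2 ({event['spo2']}%) for patient {event['patient']}")

      broker.subscribe("vitals/spo2", alarm_service)
      broker.publish("vitals/spo2", {"patient": "P-017", "spo2": 87})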

  6. The 18 mm[superscript 2] Laboratory: Teaching MEMS Development with the SUMMiT Foundry Process

    ERIC Educational Resources Information Center

    Dallas, T.; Berg, J. M.; Gale, R. O.

    2012-01-01

    This paper describes the goals, pedagogical system, and educational outcomes of a three-semester curriculum in microelectromechanical systems (MEMS). The sequence takes engineering students with no formal MEMS training and gives them the skills to participate in cutting-edge MEMS research and development. The evolution of the curriculum from…

  7. Integrated model development for liquid fueled rocket propulsion systems

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    As detailed in the original statement of work, the objective of phase two of this research effort was to develop a general framework for rocket engine performance prediction that integrates physical principles, a rigorous mathematical formalism, component level test data, system level test data, and theory-observation reconciliation. Specific phase two development tasks are defined.

  8. MatLab Programming for Engineers Having No Formal Programming Knowledge

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who do not have formal programming training and have no significant time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. Also stated are the current limitations of MatLab, which Mathworks Inc. could possibly address in a future version to make MatLab more versatile.

  9. Open Source Patient-Controlled Analgesic Pump Requirements Documentation

    PubMed Central

    Larson, Brian R.; Hatcliff, John; Chalin, Patrice

    2014-01-01

    The dynamic nature of the medical domain is driving a need for continuous innovation and improvement in techniques for developing and assuring medical devices. Unfortunately, research in academia and communication between academics, industrial engineers, and regulatory authorities is hampered by the lack of realistic non-proprietary development artifacts for medical devices. In this paper, we give an overview of a detailed requirements document for a Patient-Controlled Analgesic (PCA) pump developed under the US NSF's Food and Drug Administration (FDA) Scholar-in-Residence (SIR) program. This 60+ page document follows the methodology outlined in the US Federal Aviation Administration's (FAA) Requirements Engineering Management Handbook (REMH) and includes a domain overview, use cases, statements of safety & security requirements, and a formal top-level system architectural description. Based on previous experience with release of a requirements document for a cardiac pacemaker that spawned a number of research and pedagogical activities, we believe that the described PCA requirements document can be an important research enabler within the formal methods and software engineering communities. PMID:24931440

  10. Cirrus: Inducing Subject Models from Protocol Data

    DTIC Science & Technology

    1988-08-16

    behavior scientists, and more recently, by knowledge engineers who wish to embed the knowledge of human experts in an expert system. However, protocol...analysis is notoriously difficult and time consuming. Several systems have been developed to aid in protocol analysis. Waterman and Newell (1971, 1973...developed a system that could read the natural language of the protocol and produce a formal trace of it (a problem behavior graph). The system, however

  11. Integrated analysis of engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1981-01-01

    The need for light, durable, fuel efficient, cost effective aircraft requires the development of engine structures which are flexible, made from advanced materials (including composites), resist higher temperatures, maintain tighter clearances and have lower maintenance costs. The formal quantification of any or several of these requires integrated computer programs (multilevel and/or interdisciplinary analysis programs interconnected) for engine structural analysis/design. Several integrated analysis computer programs are under development at Lewis Research Center. These programs include: (1) COBSTRAN-Composite Blade Structural Analysis, (2) CODSTRAN-Composite Durability Structural Analysis, (3) CISTRAN-Composite Impact Structural Analysis, (4) STAEBL-Structural Tailoring of Engine Blades, and (5) ESMOSS-Engine Structures Modeling Software System. Three other related programs, developed under Lewis sponsorship, are described.

  12. Leading the Teacher Team--Balancing between Formal and Informal Power in Program Leadership

    ERIC Educational Resources Information Center

    Högfeldt, Anna-Karin; Malmi, Lauri; Kinnunen, Päivi; Jerbrant, Anna; Strömberg, Emma; Berglund, Anders; Villadsen, Jørgen

    2018-01-01

    This continuous research within Nordic engineering institutions targets the contexts and possibilities for leadership among engineering education program directors. The IFP-model, developed based on analysis of interviews with program leaders in these institutions, visualizes the program director's informal and formal power. The model is presented…

  13. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
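
    The abstract does not reproduce the model itself; the following is a hedged, toy illustration of consistency-based diagnosis in the spirit of a Livingstone-style diagnoser, with an invented two-component AUV model and a single observation:

      from itertools import combinations

      components = ["thruster", "depth_sensor"]
      command = True                                  # vehicle commanded to dive
      observation = {"reported_depth_rate": 0.0}      # ...but no depth change is reported

      def model(health, cmd):
          """Predict the observables for a given health assignment (toy model)."""
          thruster_on = cmd and (health["thruster"] == "ok")
          depth_rate = -1.0 if thruster_on else 0.0
          # A faulty sensor reports a stuck value regardless of the true rate.
          reported = depth_rate if health["depth_sensor"] == "ok" else 0.0
          return {"reported_depth_rate": reported}

      def consistent(health):
          return model(health, command) == observation

      # Find the smallest sets of faulty components that explain the observation.
      diagnoses = []
      for k in range(len(components) + 1):
          for faulty in combinations(components, k):
              health = {c: ("faulty" if c in faulty else "ok") for c in components}
              if consistent(health):
                  diagnoses.append(faulty)
          if diagnoses:
              break   # keep only minimal-cardinality diagnoses

      print("minimal diagnoses:", diagnoses)   # either single fault explains the data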

  14. Viewpoints, Formalisms, Languages, and Tools for Cyber-Physical Systems

    DTIC Science & Technology

    2014-05-16

    Organization]: Special-Purpose and Application-Based Systems —real-time and embedded systems; F.1.2 [Computation by Abstract Devices]: Models of...domain CPS is not new. For example, early automotive embedded systems in the 1970s already combined closed-loop control of the brake and engine subsystems...Consider for example the development of an embedded control system such as an advanced driver assistance system (ADAS) (e.g., adaptive cruise control

  15. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  16. Utilisation d'analyse de concepts formels pour la gestion de variabilité d'un logiciel configuré dynamiquement [Use of formal concept analysis for variability management of dynamically configured software]

    NASA Astrophysics Data System (ADS)

    Menguy, Theotime

    Because of its critical nature, the avionics industry is bound by numerous constraints such as security standards and certifications while still having to satisfy clients' desires for personalization. In this context, variability management is a very important issue in re-engineering projects for avionics software. In this thesis, we propose a new approach, based on formal concept analysis and the semantic web, to support variability management. The first goal of this research is to identify characteristic behaviors and interactions of configuration variables in a dynamically configured system. To identify such elements, we applied formal concept analysis at different levels of abstraction in the system and defined new metrics. We then built a classification of the configuration variables and their relations to enable quick identification of a variable's behavior in the system. This classification could help define a systematic approach for processing variables during a re-engineering operation, depending on their category. To gain a better understanding of the system, we also studied the code controls shared between configuration variables. A second objective of this research is to build a knowledge platform that gathers the results of all the analyses performed and stores any additional elements relevant to variability management, for instance new results that help define a re-engineering process for each category. To address this goal, we built a solution based on semantic web technologies, defining a new, extensive ontology that supports inferences about the evolution processes. The approach presented here is, to the best of our knowledge, the first classification of the configuration variables of dynamically configured software and an original use of documentation and variability management techniques based on the semantic web in the aeronautics field. The analyses performed and the final results show that formal concept analysis is a way to identify specific properties and behaviors and that the semantic web is a good solution for storing and exploring the results. However, using formal concept analysis with additional boolean relations, such as the link between configuration variables and files, and defining new inferences may lead to better conclusions. Applying the same methodology to other systems would make it possible to validate the approach in other contexts.
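
    To make the formal concept analysis step concrete, the sketch below (an invented miniature context, not the thesis data) derives every formal concept of a small object-attribute table relating configuration variables to the source files they control:

      from itertools import combinations

      # Binary context: configuration variables (objects) vs. controlled files (attributes).
      context = {
          "CFG_AUTOPILOT":   {"nav.c", "display.c"},
          "CFG_DUAL_SCREEN": {"display.c"},
          "CFG_LOGGING":     {"nav.c", "log.c"},
      }
      objects = set(context)
      attributes = set().union(*context.values())

      def common_attributes(objs):
          """Attributes shared by every object in objs (the derivation operator)."""
          return set(attributes) if not objs else set.intersection(*(context[o] for o in objs))

      def objects_having(attrs):
          """Objects that possess every attribute in attrs."""
          return {o for o in objects if attrs <= context[o]}

      # Every pair (X'', X') obtained by closing an object subset X is a formal concept.
      concepts = set()
      for r in range(len(objects) + 1):
          for objs in combinations(sorted(objects), r):
              intent = common_attributes(set(objs))
              extent = objects_having(intent)
              concepts.add((frozenset(extent), frozenset(intent)))

      for extent, intent in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
          print(sorted(extent), "->", sorted(intent))

    Each printed concept groups variables that always occur together with the files they jointly control, which is the raw material for the kind of classification described above.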

  17. The systems engineering overview and process (from the Systems Engineering Management Guide, 1990)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The past several decades have seen the rise of large, highly interactive systems that are on the forward edge of technology. As a result of this growth and the increased usage of digital systems (computers and software), the concept of systems engineering has gained increasing attention. Some of this attention is no doubt due to large program failures which possibly could have been avoided, or at least mitigated, through the use of systems engineering principles. The complexity of modern day weapon systems requires conscious application of systems engineering concepts to ensure producible, operable and supportable systems that satisfy mission requirements. Although many authors have traced the roots of systems engineering to earlier dates, the initial formalization of the systems engineering process for military development began to surface in the mid-1950s on the ballistic missile programs. These early ballistic missile development programs marked the emergence of engineering discipline 'specialists' which has since continued to grow. Each of these specialties not only has a need to take data from the overall development process, but also to supply data, in the form of requirements and analysis results, to the process. A number of technical instructions, military standards and specifications, and manuals were developed as a result of these development programs. In particular, MIL-STD-499 was issued in 1969 to assist both government and contractor personnel in defining the systems engineering effort in support of defense acquisition programs. This standard was updated to MIL-STD-499A in 1974, and formed the foundation for current application of systems engineering principles to military development programs.

  18. The systems engineering overview and process (from the Systems Engineering Management Guide, 1990)

    NASA Astrophysics Data System (ADS)

    The past several decades have seen the rise of large, highly interactive systems that are on the forward edge of technology. As a result of this growth and the increased usage of digital systems (computers and software), the concept of systems engineering has gained increasing attention. Some of this attention is no doubt due to large program failures which possibly could have been avoided, or at least mitigated, through the use of systems engineering principles. The complexity of modern day weapon systems requires conscious application of systems engineering concepts to ensure producible, operable and supportable systems that satisfy mission requirements. Although many authors have traced the roots of systems engineering to earlier dates, the initial formalization of the systems engineering process for military development began to surface in the mid-1950s on the ballistic missile programs. These early ballistic missile development programs marked the emergence of engineering discipline 'specialists' which has since continued to grow. Each of these specialties not only has a need to take data from the overall development process, but also to supply data, in the form of requirements and analysis results, to the process. A number of technical instructions, military standards and specifications, and manuals were developed as a result of these development programs. In particular, MIL-STD-499 was issued in 1969 to assist both government and contractor personnel in defining the systems engineering effort in support of defense acquisition programs. This standard was updated to MIL-STD-499A in 1974, and formed the foundation for current application of systems engineering principles to military development programs.

  19. User Participation and Participatory Design: Topics in Computing Education.

    ERIC Educational Resources Information Center

    Kautz, Karlheinz

    1996-01-01

    Discusses user participation and participatory design in the context of formal education for computing professionals. Topics include the current curriculum debate; mathematical- and engineering-based education; traditional system-development training; and an example of a course program that includes computers and society, and prototyping. (53…

  20. RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks

    DTIC Science & Technology

    2016-10-09

    Robotic tasks are becoming increasingly complex, and with this also the robotic systems. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept
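
    The record is truncated, but its central idea, hierarchical state machines with concurrency, can be sketched compactly; the toy example below (an invented task, not RAFCON's actual API) nests a simplified parallel state inside a sequential one:

      class State:
          """A state runs its own action, then its children in sequence (hierarchy)."""

          def __init__(self, name, action=None, children=None):
              self.name, self.action, self.children = name, action, children or []

          def run(self, blackboard):
              if self.action:
                  self.action(blackboard)
              for child in self.children:
                  child.run(blackboard)

      class Concurrent(State):
          """Simplified parallel state: every child must complete against a shared blackboard."""

          def run(self, blackboard):
              for child in self.children:
                  child.run(blackboard)

      scan  = State("scan_scene",   lambda bb: bb.update(scanned=True))
      pick  = State("pick_object",  lambda bb: bb.update(holding=True))
      place = State("place_object", lambda bb: bb.update(holding=False))

      task = State("fetch_and_place", children=[
          Concurrent("perceive_and_grasp", children=[scan, pick]),
          place,
      ])

      bb = {}
      task.run(bb)
      print(bb)   # {'scanned': True, 'holding': False}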

  1. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. 15. SUBJECT TERMS Formal Methods, Software Verification , Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification

  2. Goal-Function Tree Modeling for Systems Engineering and Fault Management

    NASA Technical Reports Server (NTRS)

    Johnson, Stephen B.; Breckenridge, Jonathan T.

    2013-01-01

    The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of relationships between SE, SHM, and FM provides hints to a modeling approach that gives formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve those goals and functions (SHM/FM).
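
    As a small, hedged illustration of the pairing described above (the node structure and example content are invented, not the paper's actual tree), a goal-function tree node can carry both the nominal function and the fault-management check associated with its goal:

      from dataclasses import dataclass, field

      @dataclass
      class GoalNode:
          goal: str                  # what the system must achieve (SE)
          function: str              # the function intended to achieve it (SE)
          failure_detection: str     # how FM detects failure to achieve the goal
          children: list = field(default_factory=list)

      gft = GoalNode(
          goal="Maintain safe cabin environment",
          function="Environmental control and life support",
          failure_detection="Monitor cabin pressure and O2 partial-pressure limits",
          children=[
              GoalNode("Control cabin pressure", "Pressure regulation assembly",
                       "Pressure out-of-bounds alarm"),
              GoalNode("Control O2 level", "O2 generation assembly",
                       "O2 partial-pressure trend check"),
          ],
      )

      def walk(node, depth=0):
          print("  " * depth + f"{node.goal} -> {node.function} [FM: {node.failure_detection}]")
          for child in node.children:
              walk(child, depth + 1)

      walk(gft)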

  3. NASA's Space Launch System: Systems Engineering Approach for Affordability and Mission Success

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    NASA is working toward the first launch of the Space Launch System, a new, unmatched capability for deep space exploration with launch readiness planned for 2019. Since program start in 2011, SLS has passed several major formal design milestones, and every major element of the vehicle has produced test and flight hardware. The SLS approach to systems engineering has been key to the program's success. Key aspects of the SLS SE&I approach include: 1) minimizing the number of requirements, 2) elimination of explicit verification requirements, 3) use of certified models of subsystem capability in lieu of requirements when appropriate and 4) certification of capability beyond minimum required capability.

  4. Cirrus: Inducing Subject Models from Protocol Data

    DTIC Science & Technology

    1988-08-16

    Protocol analysis is used routinely by psychologists and other behavior scientists, and more recently, by knowledge engineers who wish to embed the...knowledge of human experts in an expert system. However, protocol analysis is notoriously difficult and time consuming. Several systems have been developed to...formal trace of it (a problem behavior graph). The system, however, did not produce an abstract model of the subject. Bhaskar and Simon (1977) avoided the

  5. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  6. Improving Function Allocation for Integrated Systems Design

    DTIC Science & Technology

    1996-06-01

    in the movie Star Trek—The Next Generation, the android DATA is both perceived and treated as a member of the crew. That type of perceptual change...time-consuming, formal contract change. LABORATORY VIEW OF FUNCTION ALLOCATION In 1951, Paul M. Fitts, the founder of the Human Engineering Divi

  7. A Semi-Automatic Approach to Construct Vietnamese Ontology from Online Text

    ERIC Educational Resources Information Center

    Nguyen, Bao-An; Yang, Don-Lin

    2012-01-01

    An ontology is an effective formal representation of knowledge used commonly in artificial intelligence, semantic web, software engineering, and information retrieval. In open and distance learning, ontologies are used as knowledge bases for e-learning supplements, educational recommenders, and question answering systems that support students with…

  8. CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 1, Jan/Feb 2012

    DTIC Science & Technology

    2012-01-01

    Considerations in Airborne Systems and Equipment Certification – RTCA/DO-178B,” Washington, D.C., 1992. 5. Ishikawa , Kaoru (Translator: J. H...significant, repeated issue, a formal root cause analysis process is performed. This method uses fishbone or Ishikawa diagrams [5], where possible

  9. META II: Formal Co-Verification of Correctness of Large-Scale Cyber-Physical Systems during Design. Volume 1

    DTIC Science & Technology

    2011-08-01

    design space is large. His research contributions are to the field of Decision-based Design, specifically in linking consumer preferences and...Integrating Consumer Preferences into Engineering Design, to be published in 2012. He received his PhD from Northwestern University in Mechanical

  10. Formal, Non-Formal and Informal Learning in the Sciences

    ERIC Educational Resources Information Center

    Ainsworth, Heather L.; Eaton, Sarah Elaine

    2010-01-01

    This research report investigates the links between formal, non-formal and informal learning and the differences between them. In particular, the report aims to link these notions of learning to the field of sciences and engineering in Canada and the United States, including professional development of adults working in these fields. It offers…

  11. A Study of Technical Engineering Peer Reviews at NASA

    NASA Technical Reports Server (NTRS)

    Chao, Lawrence P.; Tumer, Irem Y.; Bell, David G.

    2003-01-01

    This report describes the state of practices of design reviews at NASA and research into what can be done to improve peer review practices. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing project formal reviews such as the Preliminary Design Review and Critical Design Review are a required part of every project and mission development. However, the technical, engineering peer reviews that support teams' work on such projects are informal, sometimes ad hoc, and inconsistent across the organization. The goal of this work is to identify best practices and lessons learned from NASA's experience, supported by academic research and methodologies to ultimately improve the process. This research has determined that the organization, composition, scope, and approach of the reviews impact their success. Failure Modes and Effects Analysis (FMEA) can identify key areas of concern before or in the reviews. Product definition tools like the Project Priority Matrix, engineering-focused Customer Value Chain Analysis (CVCA), and project or system-based Quality Function Deployment (QFD) help prioritize resources in reviews. The use of information technology and structured design methodologies can strengthen the engineering peer review process to help NASA work towards error-proofing the design process.

  12. User Interface Technology for Formal Specification Development

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  13. Closing the Gap between Formalism and Application--PBL and Mathematical Skills in Engineering

    ERIC Educational Resources Information Center

    Christensen, Ole Ravn

    2008-01-01

    A common problem in learning mathematics concerns the gap between, on the one hand, doing the formalisms and calculations of abstract mathematics and, on the other hand, applying these in a specific contextualized setting for example the engineering world. The skills acquired through problem-based learning (PBL), in the special model used at…

  14. Functional groups of ecosystem engineers: a proposed classification with comments on current issues.

    PubMed

    Berke, Sarah K

    2010-08-01

    Ecologists have long known that certain organisms fundamentally modify, create, or define habitats by altering the habitat's physical properties. In the past 15 years, these processes have been formally defined as "ecosystem engineering", reflecting a growing consensus that environmental structuring by organisms represents a fundamental class of ecological interactions occurring in most, if not all, ecosystems. Yet, the precise definition and scope of ecosystem engineering remains debated, as one should expect given the complexity, enormity, and variability of ecological systems. Here I briefly comment on a few specific current points of contention in the ecosystem engineering concept. I then suggest that ecosystem engineering can be profitably subdivided into four narrower functional categories reflecting four broad mechanisms by which ecosystem engineering occurs: structural engineers, bioturbators, chemical engineers, and light engineers. Finally, I suggest some conceptual model frameworks that could apply broadly within these functional groups.

  15. Active Reliability Engineering - Technical Concept and Program Plan. A Solid-State Systems Approach to Increased Reliability and Availability in Military Systems.

    DTIC Science & Technology

    1983-10-05

    battle damage. Others are local electrical power and cooling disruptions. Again, a highly critical function is lost if its computer site is destroyed. A...formalized design of the test bed to meet the requirements of the functional description and goals of the program. AMTEC --Z3IT TASKS: 610, 710, 810

  16. B-2 Systems Engineering Case Study

    DTIC Science & Technology

    2007-01-01

    formal configuration freeze , an immediate refocus of the Task Teams was required. Within several days, the air vehicle task teams were conducting...39 3.3.3 Configuration Freeze ...1983 PDR 1 Oct 1982 Reconfiguration Feb 1983-Aug 1983 (LP3, LP4) Configuration Freeze July 1983 PDR 2 Mar-April 1984 CDR Dec 1986

  17. Lesson Learning at JPL

    NASA Technical Reports Server (NTRS)

    Oberhettinger, David

    2011-01-01

    A lessons learned system is a hallmark of a mature engineering organization. A formal lessons learned process can help assure that valuable lessons get written and published, that they are well written, and that the essential information is "infused" into institutional practice. This requires high-level institutional commitment and everyone's participation in gathering, disseminating, and using the lessons.

  18. A Formal Application of Safety and Risk Assessment in Software Systems

    DTIC Science & Technology

    2004-09-01

    characteristics of Software Engineering, Development, and Safety...against a comparison of planned and actual schedules, costs, and characteristics. Software Safety is focused on the reduction of unsafe incidents...they merely carry out the role for which they were anatomically designed. Software is characteristically like an anatomical cell as it merely

  19. Health care professional workstation: software system construction using DSSA scenario-based engineering process.

    PubMed

    Hufnagel, S; Harbison, K; Silva, J; Mettala, E

    1994-01-01

    This paper describes a new method for the evolutionary determination of user requirements and system specifications called scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal behavioral specifications of components. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communication; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long-term evolution; and (iii) SEP and the health care DSSA can be integrated into computer-aided software engineering (CASE) environments, which should support rapid construction and certification of individualized systems from reuse libraries.

  20. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  1. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
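
    As a minimal, hedged sketch of the stochastic Petri net idea discussed above (places, transitions, and rates invented for illustration, not this project's models), the simulation below races exponentially distributed firing delays among the enabled transitions:

      import random

      marking = {"idle": 1, "busy": 0, "failed": 0}
      transitions = {
          "start":  {"in": {"idle": 1},   "out": {"busy": 1},   "rate": 2.0},
          "finish": {"in": {"busy": 1},   "out": {"idle": 1},   "rate": 1.0},
          "fail":   {"in": {"busy": 1},   "out": {"failed": 1}, "rate": 0.05},
          "repair": {"in": {"failed": 1}, "out": {"idle": 1},   "rate": 0.2},
      }

      def enabled(name):
          return all(marking[p] >= n for p, n in transitions[name]["in"].items())

      def fire(name):
          for p, n in transitions[name]["in"].items():
              marking[p] -= n
          for p, n in transitions[name]["out"].items():
              marking[p] += n

      random.seed(0)
      clock, horizon = 0.0, 100.0
      while clock < horizon:
          # Sample a delay for each enabled transition; the earliest one fires.
          delays = {t: random.expovariate(tr["rate"])
                    for t, tr in transitions.items() if enabled(t)}
          if not delays:
              break
          winner = min(delays, key=delays.get)
          clock += delays[winner]
          fire(winner)

      print("final marking:", marking, "at t =", round(clock, 2))

    Time spent in each marking, accumulated over such runs, is what yields the reliability and performance estimates that complement the functional behavior captured by a formal specification.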

  2. A knowledge engineering framework towards clinical support for adverse drug event prevention: the PSIP approach.

    PubMed

    Koutkias, Vassilis; Stalidis, George; Chouvarda, Ioanna; Lazou, Katerina; Kilintzis, Vassilis; Maglaveras, Nicos

    2009-01-01

    Adverse Drug Events (ADEs) are currently considered a major public health issue, endangering patients' safety and causing significant healthcare costs. Several research efforts are currently concentrating on the reduction of preventable ADEs by employing Information Technology (IT) solutions, which aim to provide healthcare professionals and patients with relevant knowledge and decision support tools. In this context, we present a knowledge engineering approach towards the construction of a Knowledge-based System (KBS) regarded as the core part of a CDSS (Clinical Decision Support System) for ADE prevention, all developed in the context of the EU-funded research project PSIP (Patient Safety through Intelligent Procedures in Medication). In the current paper, we present the knowledge sources considered in PSIP and the implications they pose for knowledge engineering, the methodological approach followed, as well as the components defining the knowledge engineering framework based on relevant state-of-the-art technologies and representation formalisms.
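
    As a purely illustrative sketch of the kind of executable decision rule such a knowledge-based system encodes (drug names and thresholds are invented placeholders, not PSIP's actual rules and not clinical guidance):

      def check_ade_rules(patient):
          """Return alert strings for rule patterns matched by this patient record."""
          alerts = []
          drugs = set(patient["active_drugs"])
          labs = patient["labs"]
          # Rule: ACE inhibitor plus potassium-sparing diuretic with elevated potassium.
          if {"spironolactone", "lisinopril"} <= drugs and labs.get("K", 0.0) > 5.0:
              alerts.append("Hyperkalemia risk: review spironolactone + lisinopril")
          # Rule: renally cleared drug prescribed despite reduced kidney function.
          if "metformin" in drugs and labs.get("eGFR", 100.0) < 30.0:
              alerts.append("Metformin with eGFR < 30: consider dose review")
          return alerts

      print(check_ade_rules({
          "active_drugs": ["spironolactone", "lisinopril"],
          "labs": {"K": 5.4, "eGFR": 72},
      }))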

  3. Automated Test Environment for a Real-Time Control System

    NASA Technical Reports Server (NTRS)

    Hall, Ronald O.

    1994-01-01

    An automated environment with hardware-in-the-loop has been developed by Rocketdyne Huntsville for test of a real-time control system. The target system of application is the man-rated real-time system which controls the Space Shuttle Main Engines (SSME). The primary use of the environment is software verification and validation, but it is also useful for evaluation and analysis of SSME avionics hardware and mathematical engine models. It provides a test bed for the integration of software and hardware. The principles and skills upon which it operates may be applied to other target systems, such as those requiring hardware-in-the-loop simulation and control system development. Potential applications are in problem domains demanding highly reliable software systems requiring testing to formal requirements and verifying successful transition to/from off-nominal system states.

  4. Consistent design schematics for biological systems: standardization of representation in biological engineering

    PubMed Central

    Matsuoka, Yukiko; Ghosh, Samik; Kitano, Hiroaki

    2009-01-01

    The discovery-by-design paradigm driving research in synthetic biology entails the engineering of de novo biological constructs with well-characterized input–output behaviours and interfaces. The construction of biological circuits requires iterative phases of design, simulation and assembly, leading to the fabrication of a biological device. In order to represent engineered models in a consistent visual format and to simulate them in silico, standardization of representation and model formalism is imperative. In this article, we review different efforts for standardization, particularly standards for graphical visualization and simulation/annotation schemata adopted in systems biology. We identify the importance of integrating the different standardization efforts and provide insights into potential avenues for developing a common framework for model visualization, simulation and sharing across various tools. We envision that such a synergistic approach would lead to the development of global, standardized schemata in biology, empowering deeper understanding of molecular mechanisms as well as engineering of novel biological systems. PMID:19493898

  5. A system approach to aircraft optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1991-01-01

    Mutual couplings among the mathematical models of physical phenomena and parts of a system such as an aircraft complicate the design process because each contemplated design change may have a far reaching consequence throughout the system. Techniques are outlined for computing these influences as system design derivatives useful for both judgemental and formal optimization purposes. The techniques facilitate decomposition of the design process into smaller, more manageable tasks and they form a methodology that can easily fit into existing engineering organizations and incorporate their design tools.
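
    A hedged sketch of the underlying mathematics (a standard total-derivative formulation with symbols assumed here, not quoted from the paper): for a system output Y that depends on a design variable x both directly and through coupled behavior variables y_i = f_i(x, y_j), j != i, the system design derivatives follow from

      \frac{dY}{dx} = \frac{\partial Y}{\partial x} + \sum_i \frac{\partial Y}{\partial y_i}\,\frac{dy_i}{dx},
      \qquad
      \left(I - \frac{\partial f}{\partial y}\right)\frac{dy}{dx} = \frac{\partial f}{\partial x},

    where the diagonal blocks of \partial f/\partial y vanish; solving the linear system once per design variable gives the coupled sensitivities used for judgemental or formal optimization.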

  6. Security Hardened Cyber Components for Nuclear Power Plants: Phase I SBIR Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franusich, Michael D.

    SpiralGen, Inc. built a proof-of-concept toolkit for enhancing the cyber security of nuclear power plants and other critical infrastructure with high-assurance instrumentation and control code. The toolkit is based on technology from the DARPA High-Assurance Cyber Military Systems (HACMS) program, which has focused on applying the science of formal methods to the formidable set of problems involved in securing cyber physical systems. The primary challenges beyond HACMS in developing this toolkit were to make the new technology usable by control system engineers and compatible with the regulatory and commercial constraints of the nuclear power industry. The toolkit, packaged as a Simulink add-on, allows a system designer to assemble a high-assurance component from formally specified and proven blocks and generate provably correct control and monitor code for that subsystem.

  7. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design and testing and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  8. An Overview of Starfish: A Table-Centric Tool for Interactive Synthesis

    NASA Technical Reports Server (NTRS)

    Tsow, Alex

    2008-01-01

    Engineering is an interactive process that requires intelligent interaction at many levels. My thesis [1] advances an engineering discipline for high-level synthesis and architectural decomposition that integrates perspicuous representation, designer interaction, and mathematical rigor. Starfish, the software prototype for the design method, implements a table-centric transformation system for reorganizing control-dominated system expressions into high-level architectures. Based on the digital design derivation (DDD) system, a designer-guided synthesis technique that applies correctness-preserving transformations to synchronous data flow specifications expressed as co-recursive stream equations, Starfish enhances user interaction and extends the reachable design space by incorporating four innovations: behavior tables, serialization tables, data refinement, and operator retiming. Behavior tables express systems of co-recursive stream equations as a table of guarded signal updates. Developers and users of the DDD system used manually constructed behavior tables to help them decide which transformations to apply and how to specify them. These design exercises produced several formally constructed hardware implementations: the FM9001 microprocessor, an SECD machine for evaluating LISP, and the SchemEngine, a garbage-collected machine for interpreting a byte-code representation of compiled Scheme programs. Bose and Tuna, two of DDD's developers, have subsequently commercialized the design derivation methodology at Derivation Systems, Inc. (DSI). DSI has formally derived and validated PCI bus interfaces and a Java byte-code processor; they further executed a contract to prototype SPIDER, NASA's ultra-reliable communications bus. To date, most derivations from DDD and DRS have targeted hardware due to its synchronous design paradigm. However, Starfish expressions are independent of the synchronization mechanism; there is no commitment to hardware or globally broadcast clocks. Though software back-ends for design derivation are limited to the DDD stream-interpreter, targeting synchronous or real-time software is not substantively different from targeting hardware.

  9. Formalisms for user interface specification and design

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent J.

    1989-01-01

    The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.

  10. Academic Achievement and Formal Thought in Engineering Students

    ERIC Educational Resources Information Center

    Vazquez, Stella Maris; de Anglat, Hilda Difabio

    2009-01-01

    Introduction: Research on university-level academic performance has significantly linked failure and dropping out to formal reasoning deficiency. We have not found any papers on formal thought in Argentine university students, in spite of the obvious shortcomings observed in the classrooms. Thus, the main objective of this paper was exploring the…

  11. Operations planning and analysis handbook for NASA/MSFC phase B development projects

    NASA Technical Reports Server (NTRS)

    Batson, Robert C.

    1986-01-01

    Current operations planning and analysis practices on NASA/MSFC Phase B projects were investigated with the objectives of (1) formalizing these practices into a handbook and (2) suggesting improvements. The study focused on how Science and Engineering (S&E) Operational Personnel support Program Development (PD) Task Teams. The intimate relationship between systems engineering and operations analysis was examined. Methods identified for use by operations analysts during Phase B include functional analysis, interface analysis, and methods to calculate/allocate such criteria as reliability, maintainability, and operations and support cost.

  12. Management Of Optical Projects

    NASA Astrophysics Data System (ADS)

    Young, Peter S.; Olson, David R.

    1981-03-01

    This paper discusses the management of optical projects from the concept stage, beginning with system specifications, through design, optical fabrication and test tasks. Special emphasis is placed on effective coupling of design engineering with fabrication development and utilization of available technology. Contrasts are drawn between accepted formalized management techniques, the realities of dealing with fragile components and the necessity of an effective project team which integrates the special characteristics of highly skilled optical specialists including lens designers, optical engineers, opticians, and metrologists. Examples are drawn from the HEAO-2 X-Ray Telescope and Space Telescope projects.

  13. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions... two value engineering approaches: (1) The first is an incentive approach in which contractor...

  14. The Role of Formal Experiment Design in Hypersonic Flight System Technology Development

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.

    2002-01-01

    Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, the use of a statistical design approach in design is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program, for both systems analysis and experimental testing. Application of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are also used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper will discuss the design considerations for scramjet-powered vehicles, specifics of MDOE utilized for Hyper-X, and present highlights from the use of these MDOE methods within the Hyper-X Program.
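
    As a hedged, miniature illustration of an MDOE workflow of the kind described above (factor names, the surrogate response, and all coefficients are invented, not Hyper-X data), the sketch builds a two-level full-factorial design and fits a linear-plus-interactions response surface by least squares:

      import itertools
      import numpy as np

      factors = ["angle_of_attack", "fuel_equivalence_ratio", "cowl_position"]
      design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

      rng = np.random.default_rng(42)

      def run_case(x):
          # Placeholder "experiment": a thrust-minus-drag surrogate plus a little noise.
          return (5.0 + 1.2 * x[0] + 0.8 * x[1] - 0.5 * x[2]
                  + 0.3 * x[0] * x[1] + rng.normal(0.0, 0.05))

      y = np.array([run_case(x) for x in design])

      # Model matrix: intercept, main effects, and two-factor interactions.
      cols = ([np.ones(len(design))]
              + [design[:, i] for i in range(len(factors))]
              + [design[:, i] * design[:, j]
                 for i, j in itertools.combinations(range(len(factors)), 2)])
      X = np.column_stack(cols)
      coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

      names = ["1"] + factors + [f"{a}*{b}" for a, b in itertools.combinations(factors, 2)]
      for name, c in zip(names, coeffs):
          print(f"{name:45s} {c:+.3f}")

    The fitted coefficients form the kind of analytical model the text refers to: they document which factors and interactions matter and can stand in for expensive runs in later trade studies.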

  15. The fusion of biology, computer science, and engineering: towards efficient and successful synthetic biology.

    PubMed

    Linshiz, Gregory; Goldberg, Alex; Konry, Tania; Hillson, Nathan J

    2012-01-01

    Synthetic biology is a nascent field that emerged in earnest only around the turn of the millennium. It aims to engineer new biological systems and impart new biological functionality, often through genetic modifications. The design and construction of new biological systems is a complex, multistep process, requiring multidisciplinary collaborative efforts from "fusion" scientists who have formal training in computer science or engineering, as well as hands-on biological expertise. The public has high expectations for synthetic biology and eagerly anticipates the development of solutions to the major challenges facing humanity. This article discusses laboratory practices and the conduct of research in synthetic biology. It argues that the fusion science approach, which integrates biology with computer science and engineering best practices, including standardization, process optimization, computer-aided design and laboratory automation, miniaturization, and systematic management, will increase the predictability and reproducibility of experiments and lead to breakthroughs in the construction of new biological systems. The article also discusses several successful fusion projects, including the development of software tools for DNA construction design automation, recursive DNA construction, and the development of integrated microfluidics systems.

  16. AMPHION: Specification-based programming for scientific subroutine libraries

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Waldinger, Richard; Stickel, Mark

    1994-01-01

    AMPHION is a knowledge-based software engineering (KBSE) system that guides a user in developing a diagram representing a formal problem specification. It then automatically implements a solution to this specification as a program consisting of calls to subroutines from a library. The diagram provides an intuitive, domain-oriented notation for creating a specification that also facilitates reuse and modification. AMPHION's architecture is domain independent; AMPHION is specialized to an application domain by developing a declarative domain theory. Creating a domain theory is an iterative process that currently requires the joint expertise of domain experts and experts in automated formal methods for software development.

  17. Semi-classical approach to transitionless quantum driving: Explicitness and Locality

    NASA Astrophysics Data System (ADS)

    Loewe, Benjamin; Hipolito, Rafael; Goldbart, Paul M.

    Berry has shown that, via a reverse-engineering strategy, non-adiabatic transitions in time-dependent quantum systems can be stifled through the introduction of a specific auxiliary Hamiltonian. This Hamiltonian comes, however, expressed as a formal sum of outer products of the original instantaneous eigenstates and their time-derivatives. Generically, how to create such an operator in the laboratory is thus not evident. Furthermore, the operator may be non-local. By following a semi-classical approach, we obtain a recipe that yields the auxiliary Hamiltonian explicitly in terms of the fundamental operators of the system (e.g., position and momentum). By using this formalism, we are able to ascertain criteria for the locality of the auxiliary Hamiltonian, and also to determine its exact form in certain special cases.
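
    For reference, the auxiliary (counterdiabatic) Hamiltonian referred to above is conventionally written in the literature as a formal sum over the instantaneous eigenstates |n(t)>; this is the generic textbook form, not the explicit semi-classical expression derived in this work:

        H_1(t) = i\hbar \sum_n \Big( |\partial_t n(t)\rangle\langle n(t)| - \langle n(t)|\partial_t n(t)\rangle\, |n(t)\rangle\langle n(t)| \Big)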

  18. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  19. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  20. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  1. Integrated analysis of large space systems

    NASA Technical Reports Server (NTRS)

    Young, J. P.

    1980-01-01

    Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.

  2. META-GLARE: a shell for CIG systems.

    PubMed

    Bottrighi, Alessio; Rubrichi, Stefania; Terenziani, Paolo

    2015-01-01

    In the last twenty years, many different approaches to deal with Computer-Interpretable clinical Guidelines (CIGs) have been developed, each one proposing its own representation formalism (mostly based on the Task-Network Model) and execution engine. We propose META-GLARE, a shell for easily defining new CIG systems. Using META-GLARE, CIG system designers can easily define their own systems (basically by defining their representation language) with minimal programming effort. META-GLARE is thus a flexible and powerful vehicle for research on CIGs, since it supports easy and fast prototyping of new CIG systems.

  3. Developing Non-Formal Education Competences as a Complement of Formal Education for STEM Lecturers

    ERIC Educational Resources Information Center

    Terrazas-Marín, Roy Alonso

    2018-01-01

    This paper focuses on a current practice piece on professional development for university lecturers, transformative learning, dialogism and STEM (Science, Technology, Engineering and Mathematics) education. Its main goals are to identify the key characteristics that allow STEM educators to experiment with the usage of non-formal education…

  4. National Computer Security Conference Proceedings (12th): Information Systems Security: Solutions for Today - Concepts for Tomorrow Held in Baltimore, Maryland on 10-13 October 1989

    DTIC Science & Technology

    1989-10-13

    and other non-technical aspects of the system). System-wide Perspective. The system that is being designed and engineered must include not just the...specifications and is regarded as the lowest level (implementation) of detail. This decomposition follows the typical "top down" design methodology ...formal verification process has contributed to the security and correctness of the TCB design and implementation. FORMAL METHODOLOGY DESCRIPTION The

  5. Technology Infusion of CodeSonar into the Space Network Ground Segment

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2009-01-01

    This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off the shelf system that analyzes programs written in C, C++ or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study is focused on large scale software developed using formal processes. The systems studied are mission critical in nature but some use commodity computer systems.

  6. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
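
    As a minimal illustration of the goal/question/metric paradigm mentioned above (a hypothetical instance, not one drawn from the TAME project itself), a measurement plan can be laid out as a simple structure:

        # Hypothetical GQM instance; the goal, questions, and metrics are illustrative only.
        gqm_plan = {
            "goal": "Improve the reliability of module X from the developer's viewpoint",
            "questions": {
                "Q1": "What is the current defect density of module X?",
                "Q2": "How does defect density change after each inspection cycle?",
            },
            "metrics": {
                "Q1": ["defects per KLOC"],
                "Q2": ["defects found per inspection", "inspection effort (person-hours)"],
            },
        }

    Each metric is tied back to a question, and each question to the goal, so that collected data can be interpreted in the context in which measurement was planned.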

  7. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.

  8. Why Engineers Should Consider Formal Methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1997-01-01

    This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.

  9. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  10. Joint electrical engineering/physics course sequence for optics fundamentals and design

    NASA Astrophysics Data System (ADS)

    Magnusson, Robert; Maldonado, Theresa A.; Black, Truman D.

    2000-06-01

    Optics is a key technology in a broad range of engineering and science applications of high national priority. Engineers and scientists with a sound background in this field are needed to preserve technical leadership and to establish new directions of research and development. To meet this educational need, a joint Electrical Engineering/Physics optics course sequence was created as PHYS 3445 Fundamentals of Optics and EE 4444 Optical Systems Design, both with a laboratory component. The objectives are to educate EE and Physics undergraduate students in the fundamentals of optics; in interdisciplinary problem solving; in design and analysis; in handling optical components; and in skills such as communications and team cooperation. Written technical reports in professional format are required, formal presentations are given, and participation in paper design contests is encouraged.

  11. Clinical Immersion and Biomedical Engineering Design Education: "Engineering Grand Rounds".

    PubMed

    Walker, Matthew; Churchwell, André L

    2016-03-01

    Grand Rounds is a ritual of medical education and inpatient care comprised of presenting the medical problems and treatment of a patient to an audience of physicians, residents, and medical students. Traditionally, the patient would be in attendance for the presentation and would answer questions. Grand Rounds has evolved considerably over the years with most sessions being didactic-rarely having a patient present (although, in some instances, an actor will portray the patient). Other members of the team, such as nurses, nurse practitioners, and biomedical engineers, are not traditionally involved in the formal teaching process. In this study we examine the rapid ideation in a clinical setting to forge a system of cross talk between engineers and physicians as a steady state at the praxis of ideation and implementation.

  12. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-05-01

    Information Technology and Business Process Redesign | MIT Sloan Management Review. Retrieved from http://sloanreview.mit.edu ... links systems management to process execution ... Three Phases/Multi-Year Effort (This Phase): literature review; model development, formal and ...

  13. Information visualisation based on graph models

    NASA Astrophysics Data System (ADS)

    Kasyanov, V. N.; Kasyanova, E. V.

    2013-05-01

    Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider a practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems aimed at supporting information visualisation on the basis of hierarchical graph models.
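
    As a rough sketch of the idea behind such a formalism (illustrative only; this is not the Higres or Visual Graph data model), a hierarchical graph can be represented as an ordinary graph whose nodes may themselves contain nested graphs:

        # Illustrative sketch: a node may expand into a nested subgraph.
        from dataclasses import dataclass, field
        from typing import Dict, List, Optional, Tuple

        @dataclass
        class Node:
            name: str
            subgraph: Optional["Graph"] = None  # None for an ordinary (flat) node

        @dataclass
        class Graph:
            nodes: Dict[str, Node] = field(default_factory=dict)
            edges: List[Tuple[str, str]] = field(default_factory=list)

            def add_node(self, name: str, subgraph: Optional["Graph"] = None) -> Node:
                node = Node(name, subgraph)
                self.nodes[name] = node
                return node

            def add_edge(self, src: str, dst: str) -> None:
                self.edges.append((src, dst))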

  14. Integrated Modeling and Simulation Verification, Validation, and Accreditation Strategy for Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    2006-01-01

    Models and simulations (M&S) are critical resources in the exploration of space. They support program management, systems engineering, integration, analysis, test, and operations and provide critical information and data supporting key analyses and decisions (technical, cost, and schedule). Consequently, there is a clear need to establish a solid understanding of M&S strengths and weaknesses, and the bounds within which they can credibly support decision-making. Their usage requires the implementation of a rigorous approach to verification, validation and accreditation (VV&A) and establishment of formal processes and practices associated with their application. To ensure decision-making is suitably supported by information (data, models, test beds) from activities (studies, exercises) and M&S applications that are understood and characterized, ESMD is establishing formal, tailored VV&A processes and practices. In addition, to ensure the successful application of M&S within ESMD, a formal process for the certification of analysts that use M&S is being implemented. This presentation will highlight NASA's Exploration Systems Mission Directorate (ESMD) management approach for M&S VV&A to ensure decision-makers receive timely information on the model's fidelity, credibility, and quality.

  15. Petri net-based dependability modeling methodology for reconfigurable field programmable gate arrays

    NASA Astrophysics Data System (ADS)

    Graczyk, Rafał; Orleański, Piotr; Poźniak, Krzysztof

    2015-09-01

    Dependability modeling is an important issue for aerospace and space equipment designers. From a system-level perspective, one has to choose from a multitude of possible architectures, redundancy levels, and component combinations in a way that meets the desired properties and dependability and finally fits within the required cost and time budgets. Modeling of such systems is getting harder as their levels of complexity grow, together with the demand for more functional and flexible, yet more available, systems that govern more and more crucial parts of our civilization's infrastructure (aerospace transport systems, telecommunications, exploration probes). In this article, a promising method of modeling complex systems using Petri nets is introduced in the context of qualitative and quantitative dependability analysis. This method, although it has some limitations and drawbacks, still offers a convenient visual formal method of describing system behavior on different levels (functional, timing, random events) and offers a straightforward correspondence to the underlying mathematical engine, perfect for simulations and engineering support.

  16. The Second NASA Formal Methods Workshop 1992

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)

    1992-01-01

    The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.

  17. An ORCID based synchronization framework for a national CRIS ecosystem.

    PubMed

    Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno

    2015-01-01

    PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects of how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholders and essential to certify compliant services.

  18. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation; specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, the learning curve for specification languages and associated tools is high, and schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technological transfer potential, and the next steps.
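
    As a generic illustration of the kind of translation involved (not an example taken from the presentation itself), a natural-language requirement such as "whenever a command is received, an acknowledgment shall eventually be sent" corresponds, in standard LTL notation, to the response property

        G (command_received -> F ack_sent)

    where G is the "globally/always" operator and F is the "eventually" operator.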

  19. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

    We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  20. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930

  1. Provably trustworthy systems.

    PubMed

    Klein, Gerwin; Andronick, June; Keller, Gabriele; Matichuk, Daniel; Murray, Toby; O'Connor, Liam

    2017-10-13

    We present recent work on building and scaling trustworthy systems with formal, machine-checkable proof from the ground up, including the operating system kernel, at the level of binary machine code. We first give a brief overview of the seL4 microkernel verification and how it can be used to build verified systems. We then show two complementary techniques for scaling these methods to larger systems: proof engineering, to estimate verification effort; and code/proof co-generation, for scalable development of provably trustworthy applications. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).

  2. Design of Astrometric Mission (JASMINE) by Applying Model Driven System Engineering

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Miyashita, H.; Nakamura, H.; Suenaga, K.; Kamiyoshi, S.; Tsuiki, A.

    2010-12-01

    We are planning a space astrometric satellite mission named JASMINE. The target accuracy of parallaxes in JASMINE observations is 10 micro arc seconds, which corresponds to a 1 nm scale on the focal plane. It is very hard to measure 1 nm scale deformation of the focal plane. Eventually, we need to add the deformation to the observation equations when estimating stellar astrometric parameters, which requires considering many factors such as instrument models and observation data analysis. In this situation, because the observation equations become more complex, we may reduce the stability requirements on the hardware; nevertheless, we then require more samplings due to the lack of rigidity of each estimation. This mission imposes a number of trade-offs in the engineering choices, and the optimal design must then be decided from a number of candidates. In order to efficiently support such decisions, we apply Model Driven Systems Engineering (MDSE), which improves the efficiency of the engineering by revealing and formalizing requirements, specifications, and designs to find a good balance among the various trade-offs.

  3. Formal Assurance Arguments: A Solution In Search of a Problem?

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.

    2015-01-01

    An assurance case comprises evidence and argument showing how that evidence supports assurance claims (e.g., about safety or security). It is unsurprising that some computer scientists have proposed formalizing assurance arguments: most associate formality with rigor. But while engineers can sometimes prove that source code refines a formal specification, it is not clear that formalization will improve assurance arguments or that this benefit is worth its cost. For example, formalization might reduce the benefits of argumentation by limiting the audience to people who can read formal logic. In this paper, we present (1) a systematic survey of the literature surrounding formal assurance arguments, (2) an analysis of errors that formalism can help to eliminate, (3) a discussion of existing evidence, and (4) suggestions for experimental work to definitively answer the question.

  4. Prediction of Petermann I and II Spot Sizes for Single-mode Dispersion-shifted and Dispersion-flattened Fibers by a Simple Technique

    NASA Astrophysics Data System (ADS)

    Kamila, Kiranmay; Panda, Anup Kumar; Gangopadhyay, Sankar

    2013-09-01

    Employing the series expression for the fundamental modal field of dispersion-shifted trapezoidal and dispersion-flattened graded and step W fibers, we present simple but accurate analytical expressions for the Petermann I and II spot sizes of such fibers. Choosing some typical dispersion-shifted trapezoidal and dispersion-flattened graded and step W fibers as examples, we show that our estimates match the exact numerical results excellently. The evaluation of the concerned propagation parameters by our formalism requires very little computation. This accurate but simple formalism will benefit system engineers working in the field of all-optical technology.
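
    For context, the two spot sizes in question are conventionally defined from the fundamental modal field \psi(r) as follows (these are the standard definitions from the fiber-optics literature; the series-based approximations of the paper are not reproduced here):

        w_{P1}^{2} = \frac{2\int_{0}^{\infty} \psi^{2}(r)\, r^{3}\, dr}{\int_{0}^{\infty} \psi^{2}(r)\, r\, dr},
        \qquad
        w_{P2}^{2} = \frac{2\int_{0}^{\infty} \psi^{2}(r)\, r\, dr}{\int_{0}^{\infty} \left( d\psi/dr \right)^{2} r\, dr}

    where w_{P1} is the Petermann I (near-field) spot size and w_{P2} is the Petermann II (far-field) spot size.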

  5. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods, different approaches for higher-level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  6. Genetic Design Automation: engineering fantasy or scientific renewal?

    PubMed Central

    Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean

    2013-01-01

    Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068

  7. The Study on Collaborative Manufacturing Platform Based on Agent

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-yan; Qu, Zheng-geng

    To address the trend toward knowledge-intensive collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities that agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval, and reuse of manufacturing knowledge, the generalized knowledge repository based on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.

  8. Engineering and Software Engineering

    NASA Astrophysics Data System (ADS)

    Jackson, Michael

    The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  9. Engineering Research Division publication report, calendar year 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, E.K.; Livingston, P.L.; Rae, D.C.

    Each year the Engineering Research Division of the Electronics Engineering Department at Lawrence Livermore Laboratory has issued an internal report listing all formal publications produced by the Division during the calendar year. Abstracts of 1980 reports are presented.

  10. The founding of ISOTT: the Shamattawa of engineering science and medical science.

    PubMed

    Bruley, Duane F

    2014-01-01

    The founding of ISOTT was based upon the blending of Medical and Engineering sciences. This occurrence is portrayed by the Shamattawa, the joining of the Chippewa and Flambeau rivers. Beginning with Carl Scheele's discovery of oxygen, the medical sciences advanced the knowledge of its importance to physiological phenomena. Meanwhile, engineering science was evolving as a mathematical discipline used to define systems quantitatively from basic principles. In particular, Adolf Fick's employment of a gradient led to the formalization of transport phenomena. These two rivers of knowledge were blended to found ISOTT at Clemson/Charleston, South Carolina, USA, in 1973. The establishment of our society with a mission to support the collaborative work of medical scientists, clinicians and all disciplines of engineering was a supporting step in the evolution of bioengineering. Traditional engineers typically worked in areas not requiring knowledge of biology or the life sciences. By encouraging collaboration between medical science and traditional engineering, our society became one of the forerunners in establishing bioengineering as the fifth traditional discipline of engineering.

  11. How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA s Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use

  12. Preparing Engineers for the Challenges of Community Engagement

    ERIC Educational Resources Information Center

    Harsh, Matthew; Bernstein, Michael J.; Wetmore, Jameson; Cozzens, Susan; Woodson, Thomas; Castillo, Rafael

    2017-01-01

    Despite calls to address global challenges through community engagement, engineers are not formally prepared to engage with communities. Little research has been done on means to address this "engagement gap" in engineering education. We examine the efficacy of an intensive, two-day Community Engagement Workshop for engineers, designed…

  13. PRACA Enhancement Pilot Study Report: Engineering for Complex Systems Program (formerly Design for Safety), DFS-IC-0006

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David; Schreiner, John

    2002-01-01

    This technology evaluation report documents the findings and recommendations of the Engineering for Complex Systems Program (formerly Design for Safety) PRACA Enhancement Pilot Study of the Space Shuttle Program's (SSP's) Problem Reporting and Corrective Action (PRACA) System. A team at NASA Ames Research Center (ARC) performed this Study. This Study was initiated as a follow-on to the NASA chartered Shuttle Independent Assessment Team (SIAT) review (performed in the Fall of 1999) which identified deficiencies in the current PRACA implementation. The Pilot Study was launched with an initial qualitative assessment and technical review performed during January 2000 with the quantitative formal Study (the subject of this report) started in March 2000. The goal of the PRACA Enhancement Pilot Study is to evaluate and quantify the technical aspects of the SSP PRACA systems and recommend enhancements to address deficiencies and in preparation for future system upgrades.

  14. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as input of classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.
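
    As a minimal sketch of how the eleven characteristics listed above could be carried over into an implementation (purely illustrative; the class and field names are hypothetical and do not reproduce the actual APC ontology), a health intervention might be skeletonized as:

        # Hypothetical skeleton; field names mirror the characteristics identified in
        # the typological analysis, not the actual APC ontology schema.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class HealthIntervention:
            procedural_type: str
            anatomical_site: Optional[str] = None
            medical_device: Optional[str] = None
            pathology: Optional[str] = None
            access: Optional[str] = None
            body_system: Optional[str] = None
            population: Optional[str] = None
            aim: Optional[str] = None
            discipline: Optional[str] = None
            technique: Optional[str] = None
            body_function: Optional[str] = None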

  15. From quantum heat engines to laser cooling: Floquet theory beyond the Born–Markov approximation

    NASA Astrophysics Data System (ADS)

    Restrepo, Sebastian; Cerrillo, Javier; Strasberg, Philipp; Schaller, Gernot

    2018-05-01

    We combine the formalisms of Floquet theory and full counting statistics with a Markovian embedding strategy to access the dynamics and thermodynamics of a periodically driven thermal machine beyond the conventional Born–Markov approximation. The working medium is a two-level system and we drive the tunneling as well as the coupling to one bath with the same period. We identify four different operating regimes of our machine which include a heat engine and a refrigerator. As the coupling strength with one bath is increased, the refrigerator regime disappears, the heat engine regime narrows and their efficiency and coefficient of performance decrease. Furthermore, our model can reproduce the setup of laser cooling of trapped ions in a specific parameter limit.

  16. A Software Technology Transition Entropy Based Engineering Model

    DTIC Science & Technology

    2002-03-01

    Systems Basics, p. 273). (Prigogine 1997, p. 81). It is not the place of this research to provide a mathematical formalism with theorems and lemmas. Rather...science). The ancient philosophers, Pythagoras, Protagoras, Socrates, and Plato, start the first discourse (the message) that has continued...unpacking of the technology "message" from Pythagoras. This process is characterized by accumulation learning, modeled by learning curves in

  17. Generalized Operations Simulation Environment for Aircraft Maintenance Training

    DTIC Science & Technology

    2004-04-01

    Operations Simulation Environment (GOSE) project is a collaborative effort between AETC and AFRL to develop common, cost-effective, generalized VR training...maintenance training domain since it provided an opportunity to build on the VEST architecture. Development of GOSE involves re-engineering VEST as a scalable...modular, immersive VR training system comprised of PC-based hardware and software. GOSE initiatives include: (a) formalize training needs across

  18. Artificial Symmetry-Breaking for Morphogenetic Engineering Bacterial Colonies.

    PubMed

    Nuñez, Isaac N; Matute, Tamara F; Del Valle, Ilenne D; Kan, Anton; Choksi, Atri; Endy, Drew; Haseloff, Jim; Rudge, Timothy J; Federici, Fernan

    2017-02-17

    Morphogenetic engineering is an emerging field that explores the design and implementation of self-organized patterns, morphologies, and architectures in systems composed of multiple agents such as cells and swarm robots. Synthetic biology, on the other hand, aims to develop tools and formalisms that increase reproducibility, tractability, and efficiency in the engineering of biological systems. We seek to apply synthetic biology approaches to the engineering of morphologies in multicellular systems. Here, we describe the engineering of two mechanisms, symmetry-breaking and domain-specific cell regulation, as elementary functions for the prototyping of morphogenetic instructions in bacterial colonies. The former represents an artificial patterning mechanism based on plasmid segregation while the latter plays the role of artificial cell differentiation by spatial colocalization of ubiquitous and segregated components. This separation of patterning from actuation facilitates the design-build-test-improve engineering cycle. We created computational modules for CellModeller representing these basic functions and used it to guide the design process and explore the design space in silico. We applied these tools to encode spatially structured functions such as metabolic complementation, RNAPT7 gene expression, and CRISPRi/Cas9 regulation. Finally, as a proof of concept, we used CRISPRi/Cas technology to regulate cell growth by controlling methionine synthesis. These mechanisms start from single cells enabling the study of morphogenetic principles and the engineering of novel population scale structures from the bottom up.

  19. Examining Teachers' Perspectives on an Implementation of Elementary Engineering Teacher Professional Development

    ERIC Educational Resources Information Center

    Boots, Nikki Kim

    2013-01-01

    The emphasis on engaging young learners in science, technology, engineering, and math (STEM) professions is driving calls for educational reform. One movement that is gaining momentum is exposing K-12 learners to engineering. With the advent of the "Next Generation Science Standards" (2012b), engineering is being more formally integrated…

  20. The historical evolution of engineering degrees: competing stakeholders, contestation over ideas, and coherence across national borders

    NASA Astrophysics Data System (ADS)

    Case, Jennifer M.

    2017-11-01

    Recent times have seen significant realignment of engineering degrees globally, most notably in the Washington Accord, a system of mutual recognition of accreditation across much of the Anglophone world and beyond, and the Bologna Process, impacting significantly on the form of engineering degrees in Europe. This article, tracing the historical evolution of engineering degrees, argues that recent events can be seen to be part of an ongoing process of reworking the arrangements for formal engineering education, based on a long-standing contradiction between the different stakeholders that have an interest in curriculum: the state, engineering employers, and academics. This is reflected in a contestation over what was historically termed the 'shop culture' of the employers versus the 'school culture' of the academy. Furthermore, contemporary developments of mutual accreditation beyond national borders can be seen to have an earlier echo in the relative measure of global coherence that was achieved in the 1870s.

  1. Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.

    1991-01-01

    The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.

  2. A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reese, Ronald

    2015-05-20

    FCA US LLC (formerly known as Chrysler Group LLC, and hereinafter “Chrysler”) was awarded an American Recovery and Reinvestment Act (ARRA) funded project by the Department of Energy (DOE) titled “A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency” (hereinafter “project”). This award was issued after Chrysler submitted a proposal for Funding Opportunity Announcement DE-FOA-0000079, “Systems Level Technology Development, Integration, and Demonstration for Efficient Class 8 Trucks (SuperTruck) and Advanced Technology Powertrains for Light-Duty Vehicles (ATP-LD).” Chrysler started work on this project on June 01, 2010 and completed testing activities on August 30, 2014. Overall objectives of this project were to: demonstrate a 25% improvement in combined Federal Test Procedure (FTP) City and Highway fuel economy over a 2009 Chrysler minivan; accelerate the development of highly efficient engine and powertrain systems for light-duty vehicles, while meeting future emissions standards; and create and retain jobs in accordance with the American Recovery and Reinvestment Act of 2009.

  3. Case Study of 'Engineering Peer Meetings' in JPL's ST-6 Project

    NASA Technical Reports Server (NTRS)

    Chao, Lawrence P.; Tumer, Irem

    2004-01-01

    This design process error-proofing case study describes a design review practice implemented by a project manager at NASA Jet Propulsion Laboratory. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing formal project reviews such as the Preliminary Design Review (PDR) and Critical Design Review (CDR) are a required part of every project and mission development. However, the engineering peer reviews that support teams' technical work on such projects are often informal, ad hoc, and inconsistent across the organization. This case study discusses issues and innovations identified by a project manager at JPL and implemented in 'engineering peer meetings' for his group.

  4. Case Study of "Engineering Peer Meetings" in JPL's ST-6 Project

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Chao, Lawrence P.

    2003-01-01

    This design process error-proofing case study describes a design review practice implemented by a project manager at NASA Jet Propulsion Laboratory. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing formal project reviews such as the Preliminary Design Review (PDR) and Critical Design Review (CDR) are a required part of every project and mission development. However, the engineering peer reviews that support teams' technical work on such projects are often informal, ad hoc, and inconsistent across the organization. This case study discusses issues and innovations identified by a project manager at JPL and implemented in "engineering peer meetings" for his group.

  5. Adaptable dialog architecture and runtime engine (AdaRTE): a framework for rapid prototyping of health dialog systems.

    PubMed

    Rojas-Barahona, L M; Giorgino, T

    2009-04-01

    Spoken dialog systems have been increasingly employed to provide ubiquitous access via telephone to information and services for the non-Internet-connected public. They have been successfully applied in the health care context; however, speech technology requires a considerable development investment. The advent of VoiceXML reduced the proliferation of incompatible dialog formalisms, at the expense of adding even more complexity. This paper introduces a novel architecture for dialogue representation and interpretation, AdaRTE, which allows developers to lay out dialog interactions through a high-level formalism, offering both declarative and procedural features. AdaRTE's aim is to provide a ground for deploying complex and adaptable dialogs whilst allowing experimentation and incremental adoption of innovative speech technologies. It enhances augmented transition networks with dynamic behavior, and drives multiple back-end realizers, including VoiceXML. It has been especially targeted to the health care context, because of the great scale and the need for reducing the barrier to a widespread adoption of dialog systems.

  6. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Modular Knowledge Representation and Reasoning in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Serafini, Luciano; Homola, Martin

    Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desirable to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics, and Integrated Distributed Description Logics. We concentrate on the expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.
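
    As one concrete example of the kind of construct these formalisms add on top of plain Description Logics (shown here in the notation commonly used in the Distributed Description Logics literature, not necessarily the exact notation of this chapter), bridge rules relate a concept C in module i to a concept D in module j:

        i : C --(into)--> j : D    (the image of C in module j is subsumed by D)
        i : C --(onto)--> j : D    (the image of C in module j subsumes D)

    Such rules let a reasoner propagate subsumption information across modules without merging them into a single ontology.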

  8. Linear quadratic servo control of a reusable rocket engine

    NASA Technical Reports Server (NTRS)

    Musgrave, Jeffrey L.

    1991-01-01

    The paper deals with the development of a design method for a servo component in the frequency domain using singular values and its application to a reusable rocket engine. A general methodology used to design a class of linear multivariable controllers for intelligent control systems is presented. Focus is placed on performance and robustness characteristics, and an estimator design performed in the framework of the Kalman-filter formalism with emphasis on using a sensor set different from the commanded values is discussed. It is noted that loop transfer recovery modifies the nominal plant noise intensities in order to obtain the desired degree of robustness to uncertainty reflected at the plant input. Simulation results demonstrating the performance of the linear design on a nonlinear engine model over all power levels during mainstage operation are discussed.
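
    For orientation only (generic LQG/LTR relations, not the paper's specific design or notation), a linear quadratic servo design of this kind minimizes a quadratic cost, estimates the state with a Kalman filter, and recovers robustness at the plant input by inflating the assumed process-noise intensity:

      J = \int_0^{\infty} \left( e^{\top} Q\, e + u^{\top} R\, u \right) dt, \qquad
      \dot{\hat{x}} = A\hat{x} + Bu + L\,(y - C\hat{x}), \qquad
      V_q = V_0 + q\, B B^{\top},

    where e is the servo tracking error, and letting q grow large (for minimum-phase plants) recovers the full-state-feedback loop transfer function at the plant input; Q, R, V_0 and q are illustrative placeholders rather than the paper's symbols.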

  9. Formalizing procedures for operations automation, operator training and spacecraft autonomy

    NASA Technical Reports Server (NTRS)

    Lecouat, Francois; Desaintvincent, Arnaud

    1994-01-01

    The generation and validation of operations procedures is a key task of mission preparation that is quite complex and costly. This has motivated the development of software applications providing support for procedures preparation. Several applications have been developed at MATRA MARCONI SPACE (MMS) over the last five years. They are presented in the first section of this paper. The main idea is that if procedures are represented in a formal language, they can be managed more easily with a computer tool and some automatic verifications can be performed. One difficulty is to define a formal language that is easy to use for operators and operations engineers. From the experience of the various procedures management tools developed in the last five years (including the POM, EOA, and CSS projects), MMS has derived OPSMAKER, a generic tool for procedure elaboration and validation. It has been applied to quite different types of missions, including crew procedures (the PREVISE system), ground control center management procedures (the PROCSU system), and - most relevant to the present paper - satellite operation procedures (PROCSAT, developed for CNES to support the preparation and verification of SPOT 4 operation procedures, and OPSAT for MMS telecom satellite operation procedures).

  10. De novo reconstruction of gene regulatory networks from time series data, an approach based on formal methods.

    PubMed

    Ceccarelli, Michele; Cerulo, Luigi; Santone, Antonella

    2014-10-01

    Reverse engineering of gene regulatory relationships from genomics data is a crucial task to dissect the complex underlying regulatory mechanism occurring in a cell. From a computational point of view, the reconstruction of gene regulatory networks is an underdetermined problem, as the number of possible solutions is typically large compared with the number of available independent data points. Many possible solutions can fit the available data, explaining the data equally well, but only one of them can be the biologically true solution. Several strategies have been proposed in the literature to reduce the search space and/or extend the amount of independent information. In this paper we propose a novel algorithm based on formal methods, mathematically rigorous techniques widely adopted in engineering to specify and verify complex software and hardware systems. Starting with a formal specification of gene regulatory hypotheses, we are able to mathematically prove whether or not a time course experiment belongs to the formal specification, determining in fact whether a gene regulation exists or not. The method is able to detect both the direction and the sign (inhibition/activation) of regulations, whereas most literature methods are limited to undirected and/or unsigned relationships. We empirically evaluated the approach on experimental and synthetic datasets in terms of precision and recall. In most cases we observed high levels of accuracy, outperforming the current state of the art, although the computational cost increases exponentially with the size of the network. The tool implementing the algorithm is available at the following URL: http://www.bioinformatics.unisannio.it. Copyright © 2014 Elsevier Inc. All rights reserved.
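
    As a purely illustrative stand-in for the verification step (not the authors' algorithm, which checks a formal specification of the regulatory hypothesis against the time course), the toy check below tests whether a discretized time series is consistent with the hypothesis that gene A activates gene B:

      # Toy consistency check for the hypothesis "A activates B" (illustration only).
      def consistent_activation(series_a, series_b, high=0.5):
          """Return False as soon as A is high at time t but B decreases at t+1."""
          for t in range(len(series_a) - 1):
              if series_a[t] >= high and series_b[t + 1] < series_b[t]:
                  return False  # counterexample found: hypothesis rejected
          return True           # no counterexample in this experiment

      a = [0.1, 0.8, 0.9, 0.7, 0.2]
      b = [0.2, 0.3, 0.6, 0.8, 0.8]
      print(consistent_activation(a, b))  # True: the data do not refute A -> B activation

    The formal-methods approach plays this role rigorously: the hypothesis is stated in a formal language and a model checker proves whether or not the experiment satisfies it.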

  11. Open Campus: Strategic Plan

    DTIC Science & Technology

    2016-05-01

    The formal and informal interactions among scientists, engineers, and business and technology specialists fostered by this environment will lead...pathways for highly trained graduates of science, technology, engineering, and mathematics (STEM) academic programs, and help academic institutions...engineering and mathematics (STEM) disciplines relevant to ARL science and technology programs. Under EPAs, visiting students and professors

  12. Educating Engineers in Information Utilization.

    ERIC Educational Resources Information Center

    Borovansky, Vladimir T.

    1987-01-01

    Traditionally, engineers are not the heaviest users of information resources. This can be traced to a lack of emphasis on information sources in engineering education. Failure to use available knowledge leads to reinventing the wheel and losing the race for technological superiority. Few U.S. universities offer formal courses in information resources in…

  13. Engineering in Communities: Learning by Doing

    ERIC Educational Resources Information Center

    Goggins, J.

    2012-01-01

    Purpose: The purpose of this paper is to focus on a number of initiatives in civil engineering undergraduate programmes at the National University of Ireland, Galway (NUIG) that allow students to complete engineering projects in the community, enabling them to learn by doing. Design/methodology/approach: A formal commitment to civic engagement was…

  14. A unified approach to the design of clinical reporting systems.

    PubMed

    Gouveia-Oliveira, A; Salgado, N C; Azevedo, A P; Lopes, L; Raposo, V D; Almeida, I; de Melo, F G

    1994-12-01

    Computer-based Clinical Reporting Systems (CRS) for diagnostic departments that use structured data entry have a number of functional and structural affinities suggesting that a common software architecture for CRS may be defined. Such an architecture should allow easy expandability and reusability of a CRS. We report the development methodology and the architecture of SISCOPE, a CRS originally designed for gastrointestinal endoscopy that is expandable and reusable. Its main components are a patient database, a knowledge base, a reports base, and screen and reporting engines. The knowledge base contains the description of the controlled vocabulary and all the information necessary to control the menu system, and is easily accessed and modified with a conventional text editor. The structure of the controlled vocabulary is formally presented as an entity-relationship diagram. The screen engine drives a dynamic user interface and the reporting engine automatically creates a medical report; both engines operate by following a set of rules and the information contained in the knowledge base. Clinical experience has shown this architecture to be highly flexible and to allow frequent modifications of both the vocabulary and the menu system. This structure provided increased collaboration among development teams, insulating the domain expert from the details of the database, and enabling him to modify the system as necessary and to test the changes immediately. The system has also been reused in several different domains.
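
    A minimal sketch of the knowledge-base-driven reporting idea (hypothetical terms and templates, not SISCOPE's actual vocabulary): the controlled vocabulary and sentence templates live in an editable knowledge base, and the generic reporting engine only interprets them:

      # Hypothetical knowledge base: controlled-vocabulary terms mapped to report templates.
      knowledge_base = {
          "oesophagus.normal": "The oesophagus appears normal.",
          "stomach.erosion":   "Erosions are seen in the {site} of the stomach.",
      }

      def generate_report(findings):
          """Assemble report sentences from structured findings via the knowledge base."""
          sentences = []
          for finding in findings:
              template = knowledge_base[finding["term"]]
              sentences.append(template.format(**finding.get("attributes", {})))
          return " ".join(sentences)

      print(generate_report([
          {"term": "oesophagus.normal"},
          {"term": "stomach.erosion", "attributes": {"site": "antrum"}},
      ]))

    Because the engine is generic, changing the vocabulary or templates requires editing only the knowledge base, which mirrors the expandability and reusability the architecture aims for.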

  15. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  16. Studies on behaviour of information to extract the meaning behind the behaviour

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Syah, R.; Elveny, M.

    2017-01-01

    The Web, as social media, can be used as a reference for determining social behaviour. However, information extraction that involves a search engine does not easily give that picture. Several properties of the search engine must be formally disclosed to provide assurance that the information is feasible. Although a good deal of research has revealed the interest of the Web as social media, few studies have revealed the behaviour of information related to social behaviour. Formal steps are therefore needed to present the possibilities of the related properties. There are 12 interconnected properties that constitute the behaviour of information, and several meanings are then revealed based on the simulation results of any search engine.

  17. MedSynDiKATe--design considerations for an ontology-based medical text understanding system.

    PubMed Central

    Hahn, U.; Romacker, M.; Schulz, S.

    2000-01-01

    MedSynDiKATe is a natural language processor for automatically acquiring knowledge from medical finding reports. The content of these documents is transferred to formal representation structures which constitute a corresponding text knowledge base. The general system architecture we present integrates requirements from the analysis of single sentences, as well as those of referentially linked sentences forming cohesive texts. The strong demands MedSynDiKATe poses to the availability of expressive knowledge sources are accounted for by two alternative approaches to (semi)automatic ontology engineering. PMID:11079899

  18. DoD Science and Engineering Apprenticeship Program for High School Students, 1996-󈨥 Activities

    DTIC Science & Technology

    1997-05-01

    including lectures, laboratory demonstrations, scientific films, field trips, and a formal course and a weekly discussion session on the history of science using...

  19. On the Need for Practical Formal Methods

    DTIC Science & Technology

    1998-01-01

    additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several examples... either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented

  20. Proceedings of the first switch tube advanced technology meeting held at EG&G, Salem, Massachusetts, May 23, 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuman, A.; Beavis, L.

    Early in 1990, J. A. Wilder, Supervisor of Sandia National Laboratories (SNLA), Division 2565 requested that a meeting of the scientists and engineers responsible for developing and producing switch tubes be set up to discuss in a semi-formal way the science and technology of switch tubes. Programmatic and administrative issues were specifically exempted from the discussions. L. Beavis, Division 7471, SNL and A. Shuman, EG&G, Salem were made responsible for organizing a program including the materials and processes of switch tubes. The purpose of the Switch Tube Advanced Technology meeting was to allow personnel from Allied Signal Kansas City Division (AS/KCD); EG&G, Salem and Sandia National Laboratories (SNL) to discuss a variety of issues involved in the development and production of switch tubes. It was intended that the formal and informal discussions would allow a better understanding of the production problems by material and process engineers and of the materials and processes by production engineers. This program consisted of formal presentations on May 23 and informal discussions on May 24. The topics chosen for formal presentation were suggested by the people of AS/KCD, EG&G, Salem, and SNL involved with the design, development and production of switch tubes. The topics selected were generic. They were not directed to any specific switch tube but rather to all switch tubes in production and development. This document includes summaries of the material presented at the formal presentation on May 23.

  1. An Introduction to Electrical Breakdown in Dielectrics

    DTIC Science & Technology

    1985-04-01

    ...find themselves working in the area without the benefit of formal coursework. Although the title of the course was High Voltage Engineering, I titled this work "An Introduction to Electrical Breakdown Phenomena," because breakdown may occur at low voltages when spacecraft systems are considered

  2. Design review - A tool for all seasons.

    NASA Technical Reports Server (NTRS)

    Liberman, D. S.

    1972-01-01

    The origins of design review are considered together with questions of definitions. The main characteristics which distinguish the concept of design review discussed from the basic master-apprentice relationship include competence, objectivity, formality, and a systematic approach. Preliminary, major, and final reviews are the steps used in the management of the design and development process in each company. It is shown that the design review is generically a systems engineering milestone review with certain unique characteristics.

  3. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Giles, G. L.; Barthelemy, J.-F. M.

    1984-01-01

    This paper describes a method for systematic analysis and optimization of large engineering systems, e.g., aircraft, by decomposition of a large task into a set of smaller, self-contained subtasks that can be solved concurrently. The subtasks may be arranged in many hierarchical levels with the assembled system at the top level. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization. It is pointed out that the method is intended to be compatible with the typical engineering organization and the modern technology of distributed computing.

  4. Small Engine Technology (SET) - Task 14 Axisymmetric Engine Simulation Environment

    NASA Technical Reports Server (NTRS)

    Miller, Max J.

    1999-01-01

    As part of the NPSS (Numerical Propulsion Simulation System) project, NASA Lewis has a goal of developing a U.S. industry standard for an axisymmetric engine simulation environment. In this program, AlliedSignal Engines (AE) contributed to this goal by evaluating the ENG20 software and developing support tools. ENG20 is a NASA-developed axisymmetric engine simulation tool. The project was divided into six subtasks which are summarized below: Evaluate the capabilities of the ENG20 code using an existing test case to see how this procedure can capture the component interactions for a full engine. Link AE's compressor and turbine axisymmetric streamline curvature codes (UD0300M and TAPS) with ENG20, which will provide the necessary boundary conditions for an ENG20 engine simulation. Evaluate GE's Global Data System (GDS), attempt to use GDS to do the linking of codes described in Subtask 2 above. Use a turbofan engine test case to evaluate various aspects of the system, including the linkage of UD0300M and TAPS with ENG20 and the GE data storage system. Also, compare the solution results with cycle deck results, axisymmetric solutions (UD0300M and TAPS), and test data to determine the accuracy of the solution. Evaluate the order of accuracy and the convergence time for the solution. Provide a monthly status report and a final formal report documenting AE's evaluation of ENG20. Provide the developed interfaces that link UD0300M and TAPS with ENG20, to NASA. The interface that links UD0300M with ENG20 will be compatible with the industry version of UD0300M.

  5. Formal Abstraction in Engineering Education--Challenges and Technology Support

    ERIC Educational Resources Information Center

    Neuper, Walther A.

    2017-01-01

    This is a position paper in the field of Engineering Education, a field that is still in its early stages in Europe. It relates challenges in the new field to the emerging technology of (Computer) Theorem Proving (TP). Experience shows that "teaching" abstract models, for instance the wave equation in mechanical engineering and in electrical…

  6. Planning Non-Formal Education Curricula: The Case of Israel.

    ERIC Educational Resources Information Center

    Keller, Diana; Dror, Ilana

    This paper compares the formal and non-formal education systems currently operating in Israel, describing the special features of curriculum planning in non-formal education. The central argument is that the non-formal education system fulfills functions that constitute a critique of the formal education system. The non-formal system offers the…

  7. Confined Detonations and Pulse Detonation Engines

    DTIC Science & Technology

    2003-01-01

    The chemically reacting flow was described by the 2D Euler equations ∂q/∂t + ∂F(q)/∂x + ∂G(q)/∂y = W, where q = (ρ, ...). ... Numerical investigations of RR and MR in supersonic chemically reacting flows have ... The formalism of heterogeneous medium mechanics supplemented with an overall chemical reaction was ...

  8. Practical Guidance on Science and Engineering Ethics Education for Instructors and Administrators: Papers and Summary from a Workshop, December 12, 2012

    ERIC Educational Resources Information Center

    Benya, Frazier F., Ed.; Fletcher, Cameron H.,Ed.; Hollander, Rachelle D.,Ed.

    2013-01-01

    Over the last two decades, colleges and universities in the United States have significantly increased the formal ethics instruction they provide in science and engineering. Today, science and engineering programs socialize students into the values of scientists and engineers as well as their obligations in the conduct of scientific research and…

  9. Chemical disorder as an engineering tool for spin polarization in Mn3Ga -based Heusler systems

    NASA Astrophysics Data System (ADS)

    Chadov, S.; D'Souza, S. W.; Wollmann, L.; Kiss, J.; Fecher, G. H.; Felser, C.

    2015-03-01

    Our study highlights spin-polarization mechanisms in metals by focusing on the mobilities of conducting electrons with different spins instead of their quantities. Here, we engineer electron mobility by applying chemical disorder induced by nonstoichiometric variations. As a practical example, we discuss the scheme that establishes such variations in tetragonal Mn3Ga Heusler material. We justify this approach using first-principles calculations of the spin-projected conductivity components based on the Kubo-Greenwood formalism. It follows that, in the majority of cases, even a small substitution of some other transition element instead of Mn may lead to a substantial increase in spin polarization along the tetragonal axis.
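
    Schematically (a generic zero-temperature Kubo-Greenwood expression, not the paper's implementation details), the spin-projected conductivity along the tetragonal axis and the resulting transport spin polarization are:

      \sigma^{s}_{zz} \;\propto\; \sum_{m,n} \bigl| \langle m,s \,|\, \hat{v}_z \,|\, n,s \rangle \bigr|^{2}\,
      \delta(E_m - E_F)\,\delta(E_n - E_F),
      \qquad
      P = \frac{\sigma^{\uparrow}_{zz} - \sigma^{\downarrow}_{zz}}{\sigma^{\uparrow}_{zz} + \sigma^{\downarrow}_{zz}},

    so that disorder which selectively suppresses the velocity matrix elements (mobility) of one spin channel raises P even when the two channels contribute comparable densities of states at E_F.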

  10. Quantum dynamical framework for Brownian heat engines

    NASA Astrophysics Data System (ADS)

    Agarwal, G. S.; Chaturvedi, S.

    2013-07-01

    We present a self-contained formalism modeled after the Brownian motion of a quantum harmonic oscillator for describing the performance of microscopic Brownian heat engines such as Carnot, Stirling, and Otto engines. Our theory, besides reproducing the standard thermodynamics results in the steady state, enables us to study the role dissipation plays in determining the efficiency of Brownian heat engines under actual laboratory conditions. In particular, we analyze in detail the dynamics associated with decoupling a system in equilibrium with one bath and recoupling it to another bath and obtain exact analytical results, which are shown to have significant ramifications on the efficiencies of engines involving such a step. We also develop a simple yet powerful technique for computing corrections to the steady state results arising from finite operation time and use it to arrive at the thermodynamic complementarity relations for various operating conditions and also to compute the efficiencies of the three engines cited above at maximum power. Some of the methods and exactly solvable models presented here are interesting in their own right and could find useful applications in other contexts as well.
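
    For orientation (standard finite-time-thermodynamics reference values, not results derived in the paper), efficiencies at maximum power obtained from such analyses are commonly compared against the Carnot and Curzon-Ahlborn values:

      \eta_C = 1 - \frac{T_c}{T_h},
      \qquad
      \eta_{CA} = 1 - \sqrt{\frac{T_c}{T_h}},

    with the dissipation mechanisms analyzed in the paper determining where a given engine falls relative to these reference values.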

  11. Aerospace engineering design by systematic decomposition and multilevel optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Barthelemy, J. F. M.; Giles, G. L.

    1984-01-01

    A method is described for systematic analysis and optimization of large engineering systems by decomposition of a large task into a set of smaller subtasks that are solved concurrently. The subtasks may be arranged in hierarchical levels. Analyses are carried out in each subtask using inputs received from other subtasks, and are followed by optimizations carried out from the bottom up. Each optimization at the lower levels is augmented by analysis of its sensitivity to the inputs received from other subtasks to account for the couplings among the subtasks in a formal manner. The analysis and optimization operations alternate iteratively until they converge to a system design whose performance is maximized with all constraints satisfied. The method, which is still under development, is tentatively validated by test cases in structural applications and an aircraft configuration optimization.

  12. Thermodynamics of the mesoscopic thermoelectric heat engine beyond the linear-response regime.

    PubMed

    Yamamoto, Kaoru; Hatano, Naomichi

    2015-10-01

    Mesoscopic thermoelectric heat engine is much anticipated as a device that allows us to utilize with high efficiency wasted heat inaccessible by conventional heat engines. However, the derivation of the heat current in this engine seems to be either not general or described too briefly, even inappropriately in some cases. In this paper, we give a clear-cut derivation of the heat current of the engine with suitable assumptions beyond the linear-response regime. It resolves the confusion in the definition of the heat current in the linear-response regime. After verifying that we can construct the same formalism as that of the cyclic engine, we find the following two interesting results within the Landauer-Büttiker formalism: the efficiency of the mesoscopic thermoelectric engine reaches the Carnot efficiency if and only if the transmission probability is finite at a specific energy and zero otherwise; the unitarity of the transmission probability guarantees the second law of thermodynamics, invalidating Benenti et al.'s argument in the linear-response regime that one could obtain a finite power with the Carnot efficiency under a broken time-reversal symmetry [Phys. Rev. Lett. 106, 230602 (2011)]. These results demonstrate how quantum mechanics constrains thermodynamics.
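
    In the Landauer-Büttiker picture used here, the charge and heat currents drawn from the left reservoir can be written (standard textbook expressions, including the spin degeneracy factor, rather than formulas quoted from the paper) as:

      I = \frac{2e}{h}\int dE\;\mathcal{T}(E)\,\bigl[f_L(E) - f_R(E)\bigr],
      \qquad
      J_L = \frac{2}{h}\int dE\;(E-\mu_L)\,\mathcal{T}(E)\,\bigl[f_L(E) - f_R(E)\bigr],

    so a transmission probability that is finite only at a single energy lets the engine operate reversibly and reach the Carnot efficiency, at the price of vanishing power.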

  13. Revisiting Feynman's ratchet with thermoelectric transport theory.

    PubMed

    Apertet, Y; Ouerdane, H; Goupil, C; Lecoeur, Ph

    2014-07-01

    We show how the formalism used for thermoelectric transport may be adapted to Smoluchowski's seminal thought experiment, also known as Feynman's ratchet and pawl system. Our analysis rests on the notion of useful flux, which for a thermoelectric system is the electrical current and for Feynman's ratchet is the effective jump frequency. Our approach yields original insight into the derivation and analysis of the system's properties. In particular we define an entropy per tooth in analogy with the entropy per carrier or Seebeck coefficient, and we derive the analog to Kelvin's second relation for Feynman's ratchet. Owing to the formal similarity between the heat fluxes balance equations for a thermoelectric generator (TEG) and those for Feynman's ratchet, we introduce a distribution parameter γ that quantifies the amount of heat that flows through the cold and hot sides of both heat engines. While it is well established that γ = 1/2 for a TEG, it is equal to 1 for Feynman's ratchet. This implies that no heat may be rejected in the cold reservoir for the latter case. Further, the analysis of the efficiency at maximum power shows that the so-called Feynman efficiency corresponds to that of an exoreversible engine, with γ = 1. Then, turning to the nonlinear regime, we generalize the approach based on the convection picture and introduce two different types of resistance to distinguish the dynamical behavior of the considered system from its ability to dissipate energy. We finally put forth the strong similarity between the original Feynman ratchet and a mesoscopic thermoelectric generator with a single conducting channel.
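
    Schematically (in notation commonly used for this thermoelectric analogy; the symbols are not quoted from the paper), the heat fluxes at the hot and cold sides of a generator with internal resistance R can be written with a distribution parameter γ:

      \dot{Q}_h = \alpha T_h I + K\,\Delta T - \gamma\, R I^{2},
      \qquad
      \dot{Q}_c = \alpha T_c I + K\,\Delta T + (1-\gamma)\, R I^{2},

    so that γ = 1/2 returns half of the Joule heating to each reservoir (the TEG case), while γ = 1 means that none of it is rejected to the cold side, as stated above for Feynman's ratchet.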

  14. The application of SSADM to modelling the logical structure of proteins.

    PubMed

    Saldanha, J; Eccles, J

    1991-10-01

    A logical design that describes the overall structure of proteins, together with a more detailed design describing secondary and some supersecondary structures, has been constructed using the computer-aided software engineering (CASE) tool, Auto-mate. Auto-mate embodies the philosophy of the Structured Systems Analysis and Design Method (SSADM) which enables the logical design of computer systems. Our design will facilitate the building of large information systems, such as databases and knowledgebases in the field of protein structure, by the derivation of system requirements from our logical model prior to producing the final physical system. In addition, the study has highlighted the ease of employing SSADM as a formalism in which to conduct the transferral of concepts from an expert into a design for a knowledge-based system that can be implemented on a computer (the knowledge-engineering exercise). It has been demonstrated how SSADM techniques may be extended for the purpose of modelling the constituent Prolog rules. This facilitates the integration of the logical system design model with the derived knowledge-based system.

  15. Factors Influencing Postsecondary STEM Students' Views of the Public Communication of an Emergent Technology: a Cross-National Study from Five Universities

    NASA Astrophysics Data System (ADS)

    Gardner, Grant E.; Jones, M. Gail; Albe, Virginie; Blonder, Ron; Laherto, Antti; Macher, Daniel; Paechter, Manuela

    2017-10-01

    Recent efforts in the science education community have highlighted the need to integrate research and theory from science communication research into more general science education scholarship. These synthesized research perspectives are relatively novel but serve an important need to better understand the impacts that the advent of rapidly emerging technologies will have on a new generation of scientists and engineers, including their formal communication with engaged citizenry. This cross-national study examined postsecondary science and engineering students' (n = 254, from five countries: Austria, Finland, France, Israel, and the USA) perspectives on the role of science communication in their own formal science and engineering education. More broadly, we examined participants' understanding of their perceived responsibilities of communicating science and engineering to the general public when an issue contains complex social and ethical implications (SEI). The study is contextualized in the emergent technology of nanotechnology, for which SEI are of particular concern and for which the general public often perceives conflicting risks and benefits. Findings indicate that student participants hold similar views on the need for their own training in communication as future scientists and engineers. When asked about the role that ethics and risk perception plays in research, development, and public communication of nanotechnology, participants demonstrate similar trajectories of perspectives that are, however, often anchored in very different levels of beginning concern. Results are discussed in the context of considerations for science communication training within formal science education curricula globally.

  16. cFE/CFS (Core Flight Executive/Core Flight System)

    NASA Technical Reports Server (NTRS)

    Wildermann, Charles P.

    2008-01-01

    This viewgraph presentation describes in detail the requirements and goals of the Core Flight Executive (cFE) and the Core Flight System (CFS). The Core Flight Software System is a mission-independent, platform-independent Flight Software (FSW) environment integrating a reusable core flight executive (cFE). The CFS goals include: 1) Reduce time to deploy high quality flight software; 2) Reduce project schedule and cost uncertainty; 3) Directly facilitate formalized software reuse; 4) Enable collaboration across organizations; 5) Simplify sustaining engineering (i.e., FSW maintenance); 6) Scale from small instruments to System of Systems; 7) Platform for advanced concepts and prototyping; and 8) Common standards and tools across the branch and NASA wide.

  17. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1996-01-01

    The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.

  18. Dynamic properties of interfaces in soft matter: Experiments and theory

    NASA Astrophysics Data System (ADS)

    Sagis, Leonard M. C.

    2011-10-01

    The dynamic properties of interfaces often play a crucial role in the macroscopic dynamics of multiphase soft condensed matter systems. These properties affect the dynamics of emulsions, of dispersions of vesicles, of biological fluids, of coatings, of free surface flows, of immiscible polymer blends, and of many other complex systems. The study of interfacial dynamic properties, surface rheology, is therefore a relevant discipline for many branches of physics, chemistry, engineering, and life sciences. In the past three to four decades a vast amount of literature has been produced dealing with the rheological properties of interfaces stabilized by low molecular weight surfactants, proteins, (bio)polymers, lipids, colloidal particles, and various mixtures of these surface active components. In this paper recent experiments are reviewed in the field of surface rheology, with particular emphasis on the models used to analyze surface rheological data. Most of the models currently used are straightforward generalizations of models developed for the analysis of rheological data of bulk phases. In general the limits on the validity of these generalizations are not discussed. Not much use is being made of recent advances in nonequilibrium thermodynamic formalisms for multiphase systems to construct admissible models for the stress-deformation behavior of interfaces. These formalisms are ideally suited to construct thermodynamically admissible constitutive equations for rheological behavior that include the often relevant couplings to other fluxes in the interface (heat and mass), and couplings to the transfer of mass from the bulk phase to the interface. In this review recent advances in the application of classical irreversible thermodynamics, extended irreversible thermodynamics, rational thermodynamics, extended rational thermodynamics, and the general equation for the nonequilibrium reversible-irreversible coupling formalism to multiphase systems are also discussed, and it is shown how these formalisms can be used to generate a wide range of thermodynamically admissible constitutive models for the surface stress tensor. Some of the generalizations currently in use are shown to have only limited validity. The aim of this review is to stimulate new developments in the fields of experimental surface rheology and constitutive modeling of multiphase systems using nonequilibrium thermodynamic formalisms and to promote a closer integration of these disciplines.
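
    As a concrete example of the kind of constitutive model at issue (the standard linear Boussinesq-Scriven surface fluid, quoted here for orientation rather than as a result of the review), the surface stress is written in terms of surface dilatational and shear viscosities:

      \boldsymbol{\sigma}^{s} = \bigl[\gamma + (\kappa_s - \mu_s)\,(\nabla_s \cdot \mathbf{v}_s)\bigr]\,\mathbf{I}_s
      + 2\,\mu_s\,\mathbf{D}_s,

    where γ is the surface tension, κ_s and μ_s the surface dilatational and shear viscosities, I_s the surface projection tensor and D_s the surface rate-of-deformation tensor; the nonequilibrium-thermodynamic formalisms discussed above generalize this form with couplings to surface heat and mass fluxes and to mass exchange with the bulk phases.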

  19. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.
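
    For reference (a generic entropy balance, not the paper's specific Lagrangian development), the entropy generation of an open system evaluated over the lifetime τ of the process reads:

      S_{gen} \;=\; \Delta S \;-\; \int_0^{\tau}\!\left(\sum_i \frac{\dot{Q}_i}{T_i}
      \;+\; \sum_{in} \dot{m}\,s \;-\; \sum_{out} \dot{m}\,s \right) dt \;\ge\; 0,

    and the stationary states discussed above are those that extremize this quantity subject to the system's constraints.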

  20. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  1. Research into the development of a knowledge acquisition taxonomy

    NASA Technical Reports Server (NTRS)

    Fink, Pamela K.; Herren, L. Tandy

    1991-01-01

    The focus of the research was on the development of a problem solving taxonomy that can support and direct the knowledge engineering process during the development of an intelligent tutoring system. The results of the research are necessarily general. Being only a small initial attempt at a fundamental problem in artificial intelligence and cognitive psychology, the process has had to be bootstrapped and the results can only provide pointers to further, more formal research designs.

  2. Assessing the Higher National Diploma Chemical Engineering Programme in Ghana: Students' Perspective

    ERIC Educational Resources Information Center

    Boateng, Cyril D.; Bensah, Edem Cudjoe; Ahiekpor, Julius C.

    2012-01-01

    Chemical engineers have played key roles in the growth of the chemical and allied industries in Ghana but indigenous industries that have traditionally been the domain of the informal sector need to be migrated to the formal sector through the entrepreneurship and innovation of chemical engineers. The Higher National Diploma Chemical Engineering…

  3. Anticipatory precrash restraint sensor feasibility study: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kercel, S.W.; Dress, W.B.

    1995-08-01

    This report explores the feasibility of an anticipatory precrash restraint sensor. The foundation principle is the anticipation mechanism found at a primitive level of biological intelligence and originally formalized by the mathematical biologist Robert Rosen. A system based on formal anticipatory principles should significantly outperform conventional technologies. It offers the prospect of high payoff in prevention of death and injury. Sensors and processes are available to provide a good, fast, and inexpensive description of the present dynamical state of the vehicle to the embedded system model in the anticipation engine. The experimental part of this study found that inexpensive radar in a real-world setting does return useful data on target dynamics. The data produced by a radar system can be converted to target dynamical information by good, fast and inexpensive signal-processing techniques. Not only is the anticipatory sensor feasible, but further development under the sponsorship of the National Highway Traffic Safety Administration is necessary and desirable. There are a number of possible lines of follow-on investigation. The level of effort and expected benefits of various alternatives are discussed.

  4. Opportunities for Space Science Education Using Current and Future Solar System Missions

    NASA Astrophysics Data System (ADS)

    Matiella Novak, M.; Beisser, K.; Butler, L.; Turney, D.

    2010-12-01

    The Education and Public Outreach (E/PO) office in The Johns Hopkins University Applied Physics Laboratory (APL) Space Department strives to excite and inspire the next generation of explorers by creating interactive education experiences. Since 1959, APL engineers and scientists have designed, built, and launched 61 spacecraft and over 150 instruments involved in space science. With the vast array of current and future Solar System exploration missions available, endless opportunities exist for education programs to incorporate the real-world science of these missions. APL currently has numerous education and outreach programs tailored for K-12 formal and informal education, higher education, and general outreach communities. Current programs focus on Solar System exploration missions such as the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM), Miniature Radio Frequency (Mini-RF) Moon explorer, the Radiation Belt Storm Probes (RBSP), New Horizons mission to Pluto, and the Thermosphere Ionosphere Mesosphere Energetics and Dynamics (TIMED) Satellite, to name a few. Education and outreach programs focusing on K-12 formal education include visits to classrooms, summer programs for middle school students, and teacher workshops. APL hosts a Girl Power event and a STEM (Science, Technology, Engineering, and Mathematics) Day each year. Education and outreach specialists hold teacher workshops throughout the year to train educators in using NASA spacecraft science in their lesson plans. High school students from around the U.S. are able to engage in NASA spacecraft science directly by participating in the Mars Exploration Student Data Teams (MESDT) and the Student Principal Investigator Programs. An effort is also made to generate excitement for future missions by focusing on what mysteries will be solved. Higher education programs are used to recruit and train the next generation of scientists and engineers. The NASA/APL Summer Internship Program offers a unique glimpse into the Space Department’s “end-to-end” approach to mission design and execution. College students - both undergraduate and graduate - are recruited from around the U.S. to work with APL scientists and engineers who act as mentors to the students. Many students are put on summer projects that allow them to work with existing spacecraft systems, while others participate in projects that investigate the operational and science objectives of future planned spacecraft systems. In many cases these interns have returned to APL as full-time staff after graduation.

  5. The evolution of optics education at the U.S. National Optical Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Pompea, Stephen M.; Walker, Constance E.; Sparks, Robert T.

    2014-07-01

    The last decade of optics education at the U.S. National Optical Astronomy Observatory will be described in terms of program planning, assessment of community needs, identification of networks and strategic partners, the establishment of specific program goals and objectives, and program metrics and evaluation. A number of NOAO's optics education programs for formal and informal audiences will be described, including our Hands-On Optics program, illumination engineering/dark skies energy education programs, afterschool programs, adaptive optics education program, student outreach, and Galileoscope program. Particular emphasis will be placed on techniques for funding and sustaining high-quality programs. The use of educational gap analysis to identify the key needs of the formal and informal educational systems will be emphasized as a technique that has helped us to maximize our educational program effectiveness locally, regionally, nationally, and in Chile.

  6. Goal-Function Tree Modeling for Systems Engineering and Fault Management

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Johnson, Stephen B.

    2013-01-01

    The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of the relationships between SE, SHM, and FM provides hints at a modeling approach that provides formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites the goals and functions of SE with the failure to achieve those goals and functions (SHM/FM). This methodology and corresponding model, known as a Goal-Function Tree (GFT), provides a means to represent, decompose, and elaborate system goals and functions in a rigorous manner that connects directly to design through use of state variables that translate natural language requirements and goals into logical-physical state language. The state variable-based approach also provides the means to directly connect FM to the design, by specifying the range in which state variables must be controlled to achieve goals, and, conversely, the failures that exist if system behavior goes out of range. This in turn allows the systems engineers and SHM/FM engineers to determine which state variables to monitor, and what action(s) to take should the system fail to achieve that goal. In sum, the GFT representation provides a unified approach to early-phase SE and FM development. This representation and methodology have been successfully developed and implemented using Systems Modeling Language (SysML) on the NASA Space Launch System (SLS) Program.
It enabled early design trade studies of failure detection coverage to ensure complete detection coverage of all crew-threatening failures. The representation maps directly both to FM algorithm designs, and to failure scenario definitions needed for design analysis and testing. The GFT representation provided the basis for mapping of abort triggers into scenarios, both needed for initial, and successful quantitative analyses of abort effectiveness (detection and response to crew-threatening events).
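
    The sketch below (hypothetical structure and numbers, not the SLS model) illustrates the core GFT idea: each goal is tied to a state variable and the range in which it must be controlled, so failure detection reduces to an out-of-range check that propagates up the tree:

      # Illustrative Goal-Function Tree node (hypothetical, not the SysML SLS model).
      class GFTNode:
          def __init__(self, goal, state_var=None, allowed=None, children=()):
              self.goal = goal            # natural-language goal, e.g. "maintain chamber pressure"
              self.state_var = state_var  # physical state variable that realizes the goal
              self.allowed = allowed      # (low, high) range in which the goal is achieved
              self.children = list(children)

          def failed(self, state):
              """A goal fails if its own variable is out of range or any child goal fails."""
              if self.state_var is not None:
                  low, high = self.allowed
                  if not (low <= state[self.state_var] <= high):
                      return True
              return any(child.failed(state) for child in self.children)

      engine_thrust = GFTNode("Provide ascent thrust",
                              children=[GFTNode("Maintain chamber pressure",
                                                "p_chamber", (180.0, 220.0))])
      print(engine_thrust.failed({"p_chamber": 150.0}))  # True -> candidate abort trigger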

  7. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  8. Preface to RIGiM 2009

    NASA Astrophysics Data System (ADS)

    Rolland, Colette; Yu, Eric; Salinesi, Camille; Castro, Jaelson

    The use of intentional concepts, the notion of "goal" in particular, has been prominent in recent approaches to requirement engineering (RE). Goal-oriented frameworks and methods for requirements engineering (GORE) have been keynote topics in requirements engineering, conceptual modelling, and more generally in software engineering. What are the conceptual modelling foundations in these approaches? RIGiM (Requirements Intentions and Goals in Conceptual Modelling) aims to provide a forum for discussing the interplay between requirements engineering and conceptual modelling, and in particular, to investigate how goal- and intention-driven approaches help in conceptualising purposeful systems. What are the fundamental objectives and premises of requirements engineering and conceptual modelling respectively, and how can they complement each other? What are the demands on conceptual modelling from the standpoint of requirements engineering? What conceptual modelling techniques can be further taken advantage of in requirements engineering? What are the upcoming modelling challenges and issues in GORE? What are the unresolved open questions? What lessons are there to be learnt from industrial experiences? What empirical data are there to support the cost-benefit analysis when adopting GORE methods? Are there application domains or types of project settings for which goals and intentional approaches are particularly suitable or not suitable? What degree of formalization and automation, or interactivity is feasible and appropriate for what types of participants during requirements engineering?

  9. Generating Phenotypical Erroneous Human Behavior to Evaluate Human-automation Interaction Using Model Checking

    PubMed Central

    Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.

    2012-01-01

    Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel’s zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the statespace and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
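
    As a toy illustration of the zero-order phenotypes (not the authors' task-model transformation, which operates on hierarchical task analytic models), the functions below enumerate omission, repetition, and intrusion variants of a normative action sequence; each variant would then be composed with the system model and checked against the safety properties in a model checker:

      # Toy generators of Hollnagel-style zero-order erroneous-behavior variants.
      def omissions(seq):
          return [seq[:i] + seq[i + 1:] for i in range(len(seq))]

      def repetitions(seq):
          return [seq[:i + 1] + seq[i:] for i in range(len(seq))]

      def intrusions(seq, foreign):
          return [seq[:i] + [a] + seq[i:] for i in range(len(seq) + 1) for a in foreign]

      normative = ["select_dose", "confirm_dose", "fire_beam"]
      print(omissions(normative)[1])    # ['select_dose', 'fire_beam']  (confirmation omitted)
      print(repetitions(normative)[2])  # ['select_dose', 'confirm_dose', 'fire_beam', 'fire_beam']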

  10. Enhancing Individual Employability: The Perspective of Engineering Graduates

    ERIC Educational Resources Information Center

    Nilsson, Staffan

    2010-01-01

    Purpose: Employability includes the ability to find employment and remain employed. Employability includes both hard and soft skills, including formal and actual competence, interpersonal skills, and personal characteristics. This paper aims to focus on illuminating perceptions engineering graduates have regarding employability. More specifically,…

  11. DEVELOPMENT OF OPERATIONAL CONCEPTS FOR ADVANCED SMRs: THE ROLE OF COGNITIVE SYSTEMS ENGINEERING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David Gertman

    Advanced small modular reactors (AdvSMRs) will use advanced digital instrumentation and control systems, and make greater use of automation. These advances not only pose technical and operational challenges, but will inevitably have an effect on the operating and maintenance (O&M) cost of new plants. However, there is much uncertainty about the impact of AdvSMR designs on operational and human factors considerations, such as workload, situation awareness, human reliability, staffing levels, and the appropriate allocation of functions between the crew and various automated plant systems. Existing human factors and systems engineering design standards and methodologies are not current in terms of human interaction requirements for dynamic automated systems and are no longer suitable for the analysis of evolving operational concepts. New models and guidance for operational concepts for complex socio-technical systems need to adopt a state-of-the-art approach such as Cognitive Systems Engineering (CSE) that gives due consideration to the role of personnel. The approach we report on helps to identify and evaluate human challenges related to non-traditional concepts of operations. A framework defining operational strategies was developed based on the operational analysis of Argonne National Laboratory's Experimental Breeder Reactor-II (EBR-II), a small (20 MWe) sodium-cooled reactor that was successfully operated for thirty years. Insights from the systematic application of the methodology and its utility are reviewed, and arguments for the formal adoption of CSE as a value-added part of the Systems Engineering process are presented.

  12. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    PubMed

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

    Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated is interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is whether it is possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts. Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering, and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model, which gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models built using the results of reverse engineering opens up the possibility of harnessing the wealth of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C++ code templates are available in the Supplementary material. liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
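
    A minimal sketch of the rate-to-rule step described above (hypothetical rate value and function name; the authors' pipeline is implemented in C++ on their agent framework): a first-order rate constant taken from the population model is converted into a per-agent event probability per simulation step:

      import math

      # Convert a first-order rate constant from the ODE model into a per-agent rule.
      def event_probability(k_per_s, dt_s):
          """Probability that one agent undergoes a first-order event within one step."""
          return 1.0 - math.exp(-k_per_s * dt_s)

      k_internalisation = 0.002  # hypothetical receptor internalisation rate, 1/s
      dt = 1.0                   # agent-based simulation time step, s
      p = event_probability(k_internalisation, dt)
      # Each receptor agent is then internalised during this step with probability p.
      print(round(p, 5))         # ~0.002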

  13. T/BEST: Technology Benefit Estimator for Composites and Applications to Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos

    1997-01-01

    Progress in the field of aerospace propulsion has heightened the need to combine advanced technologies. Quantifying the benefits of such technologies will provide guidelines for identifying and prioritizing high-payoff research areas, will help manage research with limited resources, and will show the link between advanced and basic concepts. An effort was undertaken at the NASA Lewis Research Center to develop a formal computational method, T/BEST (Technology Benefit Estimator), to assess advanced aerospace technologies, such as fibrous composites, and credibly communicate the benefits of research. Fibrous composites are ideal for structural applications such as high-performance aircraft engine blades where high strength-to-weight and stiffness-to-weight ratios are required. These factors - along with the flexibility to select the composite system and layup, and to favorably orient fiber directions - reduce the displacements and stresses caused by large rotational speeds in aircraft engines.

  14. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 6, November/December 2013

    DTIC Science & Technology

    2013-12-01

    requirements during sprint planning. Automated scanning, which includes automated code-review tools, allows the expert to monitor the system... sprint. This enables the validator to leverage the test results for formal validation and verification, and perform a shortened "hybrid" style of IV&V...

  15. Ontological analysis of SNOMED CT.

    PubMed

    Héja, Gergely; Surján, György; Varga, Péter

    2008-10-27

    SNOMED CT is the most comprehensive medical terminology. However, its use for intelligent services based on formal reasoning is questionable. The analysis of the structure of SNOMED CT is based on the formal top-level ontology DOLCE. The analysis revealed several ontological and knowledge-engineering errors, the most important of which are errors in the hierarchy (mostly from an ontological point of view, but also regarding medical aspects) and the mixing of subsumption relations with other types (mostly 'part of'). The errors found impede formal reasoning. The paper presents a possible way to correct these problems.

  16. Thermodynamics of the mesoscopic thermoelectric heat engine beyond the linear-response regime

    NASA Astrophysics Data System (ADS)

    Yamamoto, Kaoru; Hatano, Naomichi

    2015-10-01

    The mesoscopic thermoelectric heat engine is much anticipated as a device that allows us to utilize, with high efficiency, waste heat that is inaccessible to conventional heat engines. However, the derivation of the heat current in this engine seems to be either not general or described too briefly, even inappropriately in some cases. In this paper, we give a clear-cut derivation of the heat current of the engine with suitable assumptions beyond the linear-response regime. It resolves the confusion in the definition of the heat current in the linear-response regime. After verifying that we can construct the same formalism as that of the cyclic engine, we find the following two interesting results within the Landauer-Büttiker formalism: the efficiency of the mesoscopic thermoelectric engine reaches the Carnot efficiency if and only if the transmission probability is finite at a specific energy and zero otherwise; and the unitarity of the transmission probability guarantees the second law of thermodynamics, invalidating the argument of Benenti et al. in the linear-response regime that one could obtain a finite power with the Carnot efficiency under a broken time-reversal symmetry [Phys. Rev. Lett. 106, 230602 (2011), 10.1103/PhysRevLett.106.230602]. These results demonstrate how quantum mechanics constrains thermodynamics.
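
    For orientation, the standard Landauer-Büttiker expressions that this kind of analysis starts from are sketched below in textbook form (spin degeneracy factor 2; signs, prefactors, and the precise definition of the heat current may differ from the paper's conventions).

    ```latex
    % Charge current and heat current out of the left reservoir (sketch):
    \begin{align}
      I   &= \frac{2e}{h}\int \! dE\; \mathcal{T}(E)\,\bigl[f_L(E)-f_R(E)\bigr], \\
      J_L &= \frac{2}{h}\int \! dE\; (E-\mu_L)\,\mathcal{T}(E)\,\bigl[f_L(E)-f_R(E)\bigr],
    \end{align}
    % with Fermi functions f_\alpha(E) = \{\exp[(E-\mu_\alpha)/k_B T_\alpha]+1\}^{-1}.
    % When \mathcal{T}(E) is nonzero only at a single energy E_0 and zero
    % otherwise, the ratio of output power to J_L can approach the Carnot value
    % \eta_C = 1 - T_R/T_L, consistent with the result stated above.
    ```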

  17. Assessment of the Orion-SLS Interface Management Process in Achieving the EIA 731.1 Systems Engineering Capability Model Generic Practices Level 3 Criteria

    NASA Technical Reports Server (NTRS)

    Jellicorse, John J.; Rahman, Shamin A.

    2016-01-01

    NASA is currently developing the next-generation crewed spacecraft and launch vehicle for exploration beyond Earth orbit, including returning to the Moon and making the transit to Mars. Managing the design integration of major hardware elements of a space transportation system is critical for overcoming both the technical and programmatic challenges in taking a complex system from concept to space operations. An established method of accomplishing this is formal interface management. In this paper we set forth an argument that the interface management process implemented by NASA between the Orion Multi-Purpose Crew Vehicle (MPCV) and the Space Launch System (SLS) achieves the Level 3 tier of the EIA 731.1 System Engineering Capability Model (SECM) for Generic Practices. We describe the relevant NASA systems and associated organizations, and define the EIA SECM Level 3 Generic Practices. We then provide evidence for our compliance with those practices. This evidence includes discussions of: the NASA Systems Engineering (SE) Interface Management standard process and best practices; the tailoring of that process for implementation on the Orion-to-SLS interface; changes made over time to improve the tailored process; and the opportunities to take the resulting lessons learned and propose improvements to our institutional processes and best practices. We compare this evidence against the practices to form the rationale for the declared SECM maturity level.

  18. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  19. Designing flexible engineering systems utilizing embedded architecture options

    NASA Astrophysics Data System (ADS)

    Pierce, Jeff G.

    This dissertation develops and applies an integrated framework for embedding flexibility in an engineered system architecture. Systems are constantly faced with unpredictability in the operational environment, threats from competing systems, obsolescence of technology, and general uncertainty in future system demands. Current systems engineering and risk management practices have focused almost exclusively on mitigating or preventing the negative consequences of uncertainty. This research recognizes that high uncertainty also presents an opportunity to design systems that can flexibly respond to changing requirements and capture additional value throughout the design life. However, there does not exist a formalized approach to designing appropriately flexible systems. This research develops a three-stage integrated flexibility framework based on the concept of architecture options embedded in the system design. Stage One defines an eight-step systems engineering process to identify candidate architecture options. This process encapsulates the operational uncertainty through scenario development, traces new functional requirements to the affected design variables, and clusters the variables most sensitive to change. The resulting clusters can generate insight into the most promising regions in the architecture to embed flexibility in the form of architecture options. Stage Two develops a quantitative option valuation technique, grounded in real options theory, which is able to value embedded architecture options that exhibit variable expiration behavior. Stage Three proposes a portfolio optimization algorithm, for both discrete and continuous options, to select the optimal subset of architecture options, subject to budget and risk constraints. Finally, the feasibility, extensibility and limitations of the framework are assessed by its application to a reconnaissance satellite system development problem. Detailed technical data, performance models, and cost estimates were compiled for the Tactical Imaging Constellation Architecture Study and leveraged to complete a realistic proof-of-concept.
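
    As a point of reference for Stage Two, the sketch below values a simple embedded architecture option with a standard Cox-Ross-Rubinstein binomial lattice. It is not the dissertation's valuation technique (which handles variable expiration behavior); the asset value, exercise cost, volatility, and horizon are illustrative placeholders.

    ```python
    """Toy real-options valuation of an embedded architecture option via a
    binomial lattice (standard CRR approach; placeholder numbers)."""
    import math

    def binomial_option_value(v0, strike, sigma, r, t, steps):
        """Value of the right (not obligation) to exercise the option at maturity."""
        dt = t / steps
        u = math.exp(sigma * math.sqrt(dt))    # up factor
        d = 1.0 / u                            # down factor
        p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral probability
        # Terminal payoffs: exercise only if the upgraded value exceeds the exercise cost.
        values = [max(v0 * u**j * d**(steps - j) - strike, 0.0) for j in range(steps + 1)]
        # Roll back through the lattice, discounting expected values.
        for i in range(steps, 0, -1):
            values = [math.exp(-r * dt) * (p * values[j + 1] + (1 - p) * values[j])
                      for j in range(i)]
        return values[0]

    # e.g. the option to add a hypothetical payload slot later in the design life
    print(binomial_option_value(v0=100.0, strike=110.0, sigma=0.4, r=0.03, t=3.0, steps=200))
    ```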

  20. Probabilistic Multi-Scale, Multi-Level, Multi-Disciplinary Analysis and Optimization of Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2000-01-01

    Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely and engines for new aircraft are progressively required to operate under more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to the open-ended inclusion of various existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators, and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of this paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and yields an order of magnitude improvement in reliability. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.

  1. Cyber-physical approach to the network-centric robotics control task

    NASA Astrophysics Data System (ADS)

    Muliukha, Vladimir; Ilyashenko, Alexander; Zaborovsky, Vladimir; Lukashin, Alexey

    2016-10-01

    Complex engineering tasks concerning the control of groups of mobile robots remain poorly developed. In our work, we formalize such tasks using a cyber-physical approach, which extends the range of engineering and physical methods for the design of complex technical objects by researching the informational aspects of communication and interaction between objects and with an external environment [1]. The paper analyzes network-centric methods for the control of cyber-physical objects. Robots, or cyber-physical objects, interact with each other by transmitting information via computer networks using a preemptive queueing system and a randomized push-out mechanism [2],[3]. The main field of application for the results of our work is space robotics. The selection of cyber-physical systems as a special class of designed objects is due to the necessity of integrating various components responsible for computing, communications and control processes. Network-centric solutions allow universal means of organizing information exchange to be used to integrate different technologies within the control system.
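
    The preemptive queueing system with a randomized push-out mechanism cited above can be illustrated with the simplified slotted simulation below. It is only a rough reading of the cited references: the buffer size, push-out probability, and traffic parameters are assumptions, and the referenced model is analytical rather than simulated.

    ```python
    """Toy finite-buffer queue with two priority classes and randomized push-out:
    a high-priority arrival to a full buffer displaces a low-priority packet
    with probability ALPHA (all parameters are placeholders)."""
    import random
    from collections import deque

    BUFFER_SIZE = 10
    ALPHA = 0.7  # push-out probability (assumed)

    def simulate(n_slots=100_000, p_high=0.3, service_p=0.5):
        buf = deque()                        # queued packets, each 'H' or 'L'
        dropped = {'H': 0, 'L': 0}
        for _ in range(n_slots):
            if buf and random.random() < service_p:
                buf.popleft()                # one service opportunity per slot
            cls = 'H' if random.random() < p_high else 'L'
            if len(buf) < BUFFER_SIZE:
                buf.append(cls)
            elif cls == 'H' and 'L' in buf and random.random() < ALPHA:
                buf.remove('L')              # randomized push-out of a low-priority packet
                buf.append(cls)
                dropped['L'] += 1
            else:
                dropped[cls] += 1            # arrival lost: buffer full, no push-out
        return dropped

    print(simulate())
    ```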

  2. Using a formal requirements management tool for system engineering: first results at ESO

    NASA Astrophysics Data System (ADS)

    Zamparelli, Michele

    2006-06-01

    The attention to proper requirement analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering efforts and the usage of all available technology to keep project development under control. One such technology is a tool which helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility to do impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it a promising solution even for small-scale system development.

  3. A linear decomposition method for large optimization problems. Blueprint for development

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1982-01-01

    A method is proposed for decomposing large optimization problems encountered in the design of engineering systems such as an aircraft into a number of smaller subproblems. The decomposition is achieved by organizing the problem and the subordinated subproblems in a tree hierarchy and optimizing each subsystem separately. Coupling of the subproblems is accounted for by subsequent optimization of the entire system based on sensitivities of the suboptimization problem solutions at each level of the tree to variables of the next higher level. A formalization of the procedure suitable for computer implementation is developed and the state of readiness of the implementation building blocks is reviewed showing that the ingredients for the development are on the shelf. The decomposition method is also shown to be compatible with the natural human organization of the design process of engineering systems. The method is also examined with respect to the trends in computer hardware and software progress to point out that its efficiency can be amplified by network computing using parallel processors.
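
    A toy two-level version of the decomposition idea is sketched below: each subproblem is optimized separately for fixed higher-level variables, and finite-difference sensitivities of the sub-optima coordinate the system-level update. The subsystem objectives are invented for illustration, and the sketch omits constraints and the multi-level tree of the actual method.

    ```python
    """Toy two-level decomposition: subsystem optima and their sensitivities
    drive a system-level update of the shared variable z (illustrative only)."""

    def sub_opt_1(z):
        # Subsystem 1: min over x of (x - z)**2 + 0.1 * z**2; optimum x* = z, value 0.1 * z**2.
        return 0.1 * z ** 2

    def sub_opt_2(z):
        # Subsystem 2: min over y of (y + 2 * z)**2 + (z - 1)**2; optimum y* = -2z, value (z - 1)**2.
        return (z - 1) ** 2

    def sensitivity(f, z, h=1e-5):
        # Central finite-difference sensitivity of a sub-optimum to the system variable z.
        return (f(z + h) - f(z - h)) / (2 * h)

    def system_level(z=0.0, step=0.4, iters=50):
        # The system level coordinates the subproblems through their sensitivities.
        for _ in range(iters):
            grad = sensitivity(sub_opt_1, z) + sensitivity(sub_opt_2, z)
            z -= step * grad
        return z, sub_opt_1(z) + sub_opt_2(z)

    print(system_level())   # converges near z = 10/11, the coupled-system optimum
    ```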

  4. Mathematical Building-Blocks in Engineering Mechanics

    ERIC Educational Resources Information Center

    Boyajian, David M.

    2007-01-01

    A gamut of mathematical subjects and concepts are taught within a handful of courses formally required of the typical engineering student who so often questions the relevancy of being bound to certain lower-division prerequisites. Basic classes at the undergraduate level, in this context, include: Integral and Differential Calculus, Differential…

  5. Employees' Perceptions of Barriers to Participation in Training and Development in Small Engineering Businesses

    ERIC Educational Resources Information Center

    Susomrith, Pattanee; Coetzer, Alan

    2015-01-01

    Purpose: This paper aims to investigate barriers to employee participation in voluntary formal training and development opportunities from the perspective of employees in small engineering businesses. Design/methodology/approach: An exploratory qualitative methodology involving data collection via site visits and in-depth semi-structured…

  6. Eliciting design patterns for e-learning systems

    NASA Astrophysics Data System (ADS)

    Retalis, Symeon; Georgiakakis, Petros; Dimitriadis, Yannis

    2006-06-01

    Design pattern creation, especially in the e-learning domain, is a highly complex process that has not been sufficiently studied and formalized. In this paper, we propose a systematic pattern development cycle, whose most important aspects focus on reverse engineering of existing systems in order to elicit features that are cross-validated through the use of appropriate, authentic scenarios. Moreover, an iterative pattern process is proposed that takes advantage of multiple data sources, thus emphasizing a holistic view of the teaching-learning processes. The proposed schema of pattern mining has been extensively validated for Asynchronous Network Supported Collaborative Learning (ANSCL) systems, as well as for other types of tools in a variety of scenarios, with promising results.

  7. Improving Safety through Human Factors Engineering.

    PubMed

    Siewert, Bettina; Hochman, Mary G

    2015-10-01

    Human factors engineering (HFE) focuses on the design and analysis of interactive systems that involve people, technical equipment, and work environment. HFE is informed by knowledge of human characteristics. It complements existing patient safety efforts by specifically taking into consideration that, as humans, frontline staff will inevitably make mistakes. Therefore, the systems with which they interact should be designed for the anticipation and mitigation of human errors. The goal of HFE is to optimize the interaction of humans with their work environment and technical equipment to maximize safety and efficiency. Special safeguards include usability testing, standardization of processes, and use of checklists and forcing functions. However, the effectiveness of the safety program and resiliency of the organization depend on timely reporting of all safety events independent of patient harm, including perceived potential risks, bad outcomes that occur even when proper protocols have been followed, and episodes of "improvisation" when formal guidelines are found not to exist. Therefore, an institution must adopt a robust culture of safety, where the focus is shifted from blaming individuals for errors to preventing future errors, and where barriers to speaking up (including barriers introduced by steep authority gradients) are minimized. This requires creation of formal guidelines to address safety concerns, establishment of unified teams with open communication and shared responsibility for patient safety, and education of managers and senior physicians to perceive the reporting of safety concerns as a benefit rather than a threat. © RSNA, 2015.

  8. A Generic Software Safety Document Generator

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Venkatesan, Ram Prasad

    2004-01-01

    Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.

  9. Engineering research, development and technology FY99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R T

    The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures. These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.

  10. Making Sense of the Arrow-Pushing Formalism among Chemistry Majors Enrolled in Organic Chemistry

    ERIC Educational Resources Information Center

    Ferguson, Robert; Bodner, George M.

    2008-01-01

    This paper reports results of a qualitative study of sixteen students enrolled in a second year organic chemistry course for chemistry and chemical engineering majors. The focus of the study was student use of the arrow-pushing formalism that plays a central role in both the teaching and practice of organic chemistry. The goal of the study was to…

  11. IDEF3 formalization report

    NASA Technical Reports Server (NTRS)

    Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.

    1991-01-01

    The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.

  12. The Application of Architecture Frameworks to Modelling Exploration Operations Costs

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

    Developments in architectural frameworks and system-of-systems thinking have provided useful constructs for systems engineering. DoDAF concepts, language, and formalisms, in particular, provide a natural way of conceptualizing an operations cost model applicable to NASA's space exploration vision. Not all DoDAF products have meaning or apply to a DoDAF inspired operations cost model, but this paper describes how such DoDAF concepts as nodes, systems, and operational activities relate to the development of a model to estimate exploration operations costs. The paper discusses the specific implementation to the Mission Operations Directorate (MOD) operational functions/activities currently being developed and presents an overview of how this powerful representation can apply to robotic space missions as well.

  13. Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems

    NASA Technical Reports Server (NTRS)

    Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)

    2000-01-01

    We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the space shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from the wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.

  14. Re-Engineering the Mission Operations System (MOS) for the Prime and Extended Mission

    NASA Technical Reports Server (NTRS)

    Hunt, Joseph C., Jr.; Cheng, Leo Y.

    2012-01-01

    One of the most challenging tasks in a space science mission is designing the Mission Operations System (MOS). Whereas the focus of the project is getting the spacecraft built and tested for launch, the mission operations engineers must build a system to carry out the science objectives. The completed MOS design is then formally assessed in the many reviews. Once a mission has completed the reviews, the MOS design has been validated against the Functional Requirements and is ready for operations. The design was built based on heritage processes, new technology, and lessons learned from past experience. Furthermore, our operational concepts must be properly mapped to the mission design and science objectives. However, during the course of implementing the science objectives in the operations phase after launch, the MOS undergoes an evolutionary change to adapt to actual performance characteristics. This drives the re-engineering of the MOS, because the MOS includes the flight and ground segments. Using the Spitzer mission as an example, we demonstrate how the MOS design evolved for both the prime and extended mission to enhance the overall efficiency of science return. In our re-engineering process, we ensured that no requirements were violated or mission objectives compromised. In most cases, optimized performance across the MOS was achieved, including gains in science return as well as savings in the budget profile. Finally, we suggest a need to better categorize the Operations Phase (Phase E) in the NASA Life-Cycle Phases of Formulation and Implementation.

  15. Model Based Definition

    NASA Technical Reports Server (NTRS)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage, and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS), based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings as auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high-value examples are reviewed.

  16. Mathematical formalisms based on approximated kinetic representations for modeling genetic and metabolic pathways.

    PubMed

    Alves, Rui; Vilaprinyo, Ester; Hernádez-Bermejo, Benito; Sorribas, Albert

    2008-01-01

    There is a renewed interest in obtaining a systemic understanding of metabolism, gene expression and signal transduction processes, driven by the recent research focus on Systems Biology. From a biotechnological point of view, such a systemic understanding of how a biological system is designed to work can facilitate the rational manipulation of specific pathways in different cell types to achieve specific goals. Due to the intrinsic complexity of biological systems, mathematical models are a central tool for understanding and predicting the integrative behavior of those systems. Particularly, models are essential for a rational development of biotechnological applications and in understanding system's design from an evolutionary point of view. Mathematical models can be obtained using many different strategies. In each case, their utility will depend upon the properties of the mathematical representation and on the possibility of obtaining meaningful parameters from available data. In practice, there are several issues at stake when one has to decide which mathematical model is more appropriate for the study of a given problem. First, one needs a model that can represent the aspects of the system one wishes to study. Second, one must choose a mathematical representation that allows an accurate analysis of the system with respect to different aspects of interest (for example, robustness of the system, dynamical behavior, optimization of the system with respect to some production goal, parameter value determination, etc). Third, before choosing between alternative and equally appropriate mathematical representations for the system, one should compare representations with respect to easiness of automation for model set-up, simulation, and analysis of results. Fourth, one should also consider how to facilitate model transference and re-usability by other researchers and for distinct purposes. Finally, one factor that is important for all four aspects is the regularity in the mathematical structure of the equations because it facilitates computational manipulation. This regularity is a mark of kinetic representations based on approximation theory. The use of approximation theory to derive mathematical representations with regular structure for modeling purposes has a long tradition in science. In most applied fields, such as engineering and physics, those approximations are often required to obtain practical solutions to complex problems. In this paper we review some of the more popular mathematical representations that have been derived using approximation theory and are used for modeling in molecular systems biology. We will focus on formalisms that are theoretically supported by the Taylor Theorem. These include the Power-law formalism, the recently proposed (log)linear and Lin-log formalisms as well as some closely related alternatives. We will analyze the similarities and differences between these formalisms, discuss the advantages and limitations of each representation, and provide a tentative "road map" for their potential utilization for different problems.
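
    For concreteness, the generic forms of two of the formalisms surveyed are sketched below in standard notation (which may differ in detail from the review): the power-law (GMA/S-system) rate law and the lin-log ((log)linear) rate law, both obtainable as first-order Taylor expansions in logarithmic coordinates around a reference state.

    ```latex
    % Power-law (GMA / S-system) rate law for process i (sketch):
    \begin{equation}
      v_i \;=\; \alpha_i \prod_{j} X_j^{\,g_{ij}}
    \end{equation}
    % Lin-log ((log)linear) rate law, expanded around a reference state (v_i^0, X_j^0):
    \begin{equation}
      \frac{v_i}{v_i^0} \;=\; 1 + \sum_{j} \varepsilon_{ij}\,
          \ln\!\left(\frac{X_j}{X_j^0}\right),
      \qquad
      \varepsilon_{ij} = \left.\frac{\partial \ln v_i}{\partial \ln X_j}\right|_{0}
    \end{equation}
    % Both share the regular mathematical structure that eases automated model
    % set-up, simulation, and analysis, as discussed above.
    ```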

  17. NASA's New Science Education and Public Outreach Forums: Bringing Communities and Resources Together to Increase Effectiveness and Sustainability

    NASA Astrophysics Data System (ADS)

    Smith, Denise A.; Mendez, B.; Shipp, S.; Schwerin, T.; Stockman, S.; Cooper, L. P.; Sharma, M.

    2010-01-01

    Scientists, engineers, educators, and public outreach professionals have a rich history of creatively using NASA's pioneering scientific discoveries and technology to engage and educate youth and adults nationwide in core science, technology, engineering, and mathematics topics. We introduce four new Science Education and Public Outreach Forums that will work in partnership with the community and NASA's Science Mission Directorate (SMD) to ensure that current and future SMD-funded education and public outreach (E/PO) activities form a seamless whole, with easy entry points for general public, students, K-12 formal and informal science educators, faculty, scientists, engineers, and E/PO professionals alike. The new Science Education and Public Outreach Forums support the astrophysics, heliophysics, planetary and Earth science divisions of NASA SMD in three core areas: 1) E/PO community engagement and development activities will provide clear paths of involvement for scientists and engineers interested - or potentially interested - in participating in SMD-funded E/PO activities. Collaborations with scientists and engineers are vital for infusing current, accurate SMD mission and research findings into educational products and activities. Forum activities will also yield readily accessible information on effective E/PO strategies, resources, and expertise; context for individual E/PO activities; and opportunities for collaboration. 2) A rigorous analysis of SMD-funded K-12 formal, informal, and higher education products and activities will help the community and SMD to understand how the existing collection supports education standards and audience needs, and to strategically identify areas of opportunity for new materials and activities. 3) Finally, a newly convened Coordinating Committee will work across the four SMD science divisions to address systemic issues and integrate related activities. By supporting the NASA E/PO community and facilitating coordination of E/PO activities, the NASA-SEPOF partnerships will lead to more effective, sustainable, and efficient utilization of NASA science discoveries and learning experiences.

  18. Knowledge engineering for adverse drug event prevention: on the design and development of a uniform, contextualized and sustainable knowledge-based framework.

    PubMed

    Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos

    2012-06-01

    The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). In this regard, the employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model has been defined. The entire framework architecture has been then specified and implemented by adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure, according to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support considering the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we present thoroughly the establishment of the proposed knowledge framework by presenting the employed methodology and the results obtained as regards implementation, performance and validation aspects that highlight its applicability and virtue in medication safety. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Extended abstract: Managing disjunction for practical temporal reasoning

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; Schrag, Bob; Carciofini, Jim

    1992-01-01

    One of the problems that must be dealt with in either a formal or implemented temporal reasoning system is the ambiguity arising from uncertain information. Lack of precise information about when events happen leads to uncertainty regarding the effects of those events. Incomplete information and nonmonotonic inference lead to situations where there is more than one set of possible inferences, even when there is no temporal uncertainty at all. In an implemented system, this ambiguity is a computational problem as well as a semantic one. In this paper, we discuss some of the sources of this ambiguity, which we will treat as explicit disjunction, in the sense that ambiguous information can be interpreted as defining a set of possible inferences. We describe the application of three techniques for managing disjunction in an implementation of Dean's Time Map Manager. Briefly, the disjunction is either: removed by limiting the expressive power of the system, or approximated by a weaker form of representation that subsumes the disjunction. We use a combination of these methods to implement an expressive and efficient temporal reasoning engine that performs sound inference in accordance with a well-defined formal semantics.

  20. Formal Analysis of Privacy Requirements Specifications for Multi-Tier Applications

    DTIC Science & Technology

    2013-07-30

    Requirements Engineering Lab and co-founder of the Requirements Engineering and Law Workshop and has several publications in ACM- and IEEE-sponsored journals... Advertising that serves the online ad "Buying Razors Sucks" in this game. Zynga also produces a version of this game for the Android and iPhone mobile

  1. Engineering Aid 3 & 2, Vol. 1. Rate Training Manual and Nonresident Career Course.

    ERIC Educational Resources Information Center

    Naval Education and Training Command, Washington, DC.

    Designed for individual study and not formal classroom instruction, this rate training manual provides subject matter that relates directly to the occupational qualifications of the Engineering Aid (EA) rating. This eight-chapter volume focuses on administrative matters, mathematics, and basic drafting. Chapter 1 discusses the scope of the EA…

  2. Baseball Stadium Design: Teaching Engineering Economics and Technical Communication in a Multi-Disciplinary Setting.

    ERIC Educational Resources Information Center

    Dahm, Kevin; Newell, James

    2001-01-01

    Reports on a course at Rowan University, based on the economic design of a baseball stadium, that offers an introduction to multidisciplinary engineering design linked with formal training in technical communication. Addresses four pedagogical goals: (1) developing public speaking skills in a realistic, business setting; (2) giving students…

  3. Attitudes towards Communication Skills among Engineering Students

    ERIC Educational Resources Information Center

    Kovac, Mirjana M.; Sirkovic, N.

    2017-01-01

    Good communication skills are of utmost importance in the education of engineering students. It is necessary to promote not only their education, but also to prepare them for the demanding and competitive job market. The purpose of this study was to compare the attitudes towards communication skills after formal instruction between the students of…

  4. High School Physics: An Interactive Instructional Approach That Meets the Next Generation Science Standards

    ERIC Educational Resources Information Center

    Huang, Shaobo; Mejia, Joel Alejandro; Becker, Kurt; Neilson, Drew

    2015-01-01

    Improving high school physics teaching and learning is important to the long-term success of science, technology, engineering, and mathematics (STEM) education. Efforts are currently in place to develop an understanding of science among high school students through formal and informal educational experiences in engineering design activities…

  5. Engineering Aid 3 & 2, Vol. 2. Rate Training Manual.

    ERIC Educational Resources Information Center

    Bernal, Benito C., Jr.

    Designed for individual study and not formal classroom instruction, this rate training manual provides subject matter that relates directly to the occupational qualifications of the Engineering Aid (EA) rating. This volume contains 10 chapters which deal with: (1) wood and light frame structures (examining the uses, kinds, sizes, and grades of…

  6. Transactional, Cooperative, and Communal: Relating the Structure of Engineering Engagement Programs with the Nature of Partnerships

    ERIC Educational Resources Information Center

    Thompson, Julia D.; Jesiek, Brent K.

    2017-01-01

    This paper examines how the structural features of engineering engagement programs (EEPs) are related to the nature of their service-learning partnerships. "Structure" refers to formal and informal models, processes, and operations adopted or used to describe engagement programs, while "nature" signifies the quality of…

  7. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 36: Technical uncertainty as a correlate of information use by US industry-affiliated aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.

    1994-01-01

    This paper reports the results of an exploratory study that investigated the influence of technical uncertainty on the use of information and information sources by U.S. industry-affiliated aerospace engineers and scientists in completing or solving a project, task, or problem. Data were collected through a self-administered questionnaire. Survey participants were U.S. aerospace engineers and scientists whose names appeared on the Society of Automotive Engineers (SAE) mailing list. The results support the findings of previous research and the following study assumptions. Information and information-source use differ for projects, problems, and tasks with high and low technical uncertainty. As technical uncertainty increases, information-source use changes from internal to external and from informal to formal sources. As technical uncertainty increases, so too does the use of federally funded aerospace research and development (R&D). The use of formal information sources to learn about federally funded aerospace R&D differs for projects, problems, and tasks with high and low technical uncertainty.

  8. Structuring Formal Requirements Specifications for Reuse and Product Families

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.

    2001-01-01

    In this project we have investigated how formal specifications should be structured to allow for requirements reuse, product family engineering, and ease of requirements change. The contributions of this work include (1) a requirements specification methodology specifically targeted for critical avionics applications, (2) guidelines for how to structure state-based specifications to facilitate ease of change and reuse, and (3) examples from the avionics domain demonstrating the proposed approach.

  9. Concept similarity and related categories in information retrieval using formal concept analysis

    NASA Astrophysics Data System (ADS)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown useful but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours representing more general and special concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
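
    The derivation (prime) operators at the core of any FCA-based search engine of this kind can be illustrated with the tiny formal context below. The context, documents, and index terms are invented; the sketch only shows how the formal concept centred on a query is obtained, not SearchSleuth's neighbourhood or sibling computation.

    ```python
    """Toy formal context and the two derivation operators; a formal concept is
    a pair (extent, intent) with extent' = intent and intent' = extent."""

    # Objects (documents) mapped to their attributes (index terms); all invented.
    CONTEXT = {
        "doc1": {"lattice", "search", "fca"},
        "doc2": {"lattice", "fca"},
        "doc3": {"search", "ranking"},
        "doc4": {"fca", "ranking", "search"},
    }

    def extent(attrs):
        """Objects sharing all given attributes (prime operator on attribute sets)."""
        return {o for o, a in CONTEXT.items() if attrs <= a}

    def intent(objs):
        """Attributes common to all given objects (prime operator on object sets)."""
        if not objs:
            return set().union(*CONTEXT.values())
        return set.intersection(*(CONTEXT[o] for o in objs))

    def concept_of_query(query_terms):
        """Closure of the query: the formal concept the neighbourhood is built around."""
        objs = extent(set(query_terms))
        return objs, intent(objs)

    print(concept_of_query({"fca", "search"}))
    # -> ({'doc1', 'doc4'}, {'fca', 'search'})
    ```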

  10. Evaluating a common semi-mechanistic mathematical model of gene-regulatory networks

    PubMed Central

    2015-01-01

    Modeling and simulation of gene-regulatory networks (GRNs) has become an important aspect of modern systems biology investigations into mechanisms underlying gene regulation. A key challenge in this area is the automated inference (reverse-engineering) of dynamic, mechanistic GRN models from gene expression time-course data. Common mathematical formalisms for representing such models capture two aspects simultaneously within a single parameter: (1) whether or not a gene is regulated, and if so, the type of regulator (activator or repressor), and (2) the strength of influence of the regulator (if any) on the target or effector gene. To accommodate both roles, "generous" boundaries or limits for possible values of this parameter are commonly allowed in the reverse-engineering process. This approach has several important drawbacks. First, in the absence of good guidelines, there is no consensus on what limits are reasonable. Second, because the limits may vary greatly among different reverse-engineering experiments, the concrete values obtained for the models may differ considerably, and thus it is difficult to compare models. Third, if high values are chosen as limits, the search space of the model inference process becomes very large, adding unnecessary computational load to the already complex reverse-engineering process. In this study, we demonstrate that restricting the limits to the [−1, +1] interval is sufficient to represent the essential features of GRN systems and offers a reduction of the search space without loss of quality in the resulting models. To show this, we have carried out reverse-engineering studies on data generated from artificial GRN systems as well as on data experimentally determined from real GRN systems. PMID:26356485
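
    One simple instance of the kind of parameterization discussed, sketched under the [-1, +1] restriction, is a weight-matrix model in which a single weight encodes whether a gene is regulated, the regulator type (sign), and the strength (magnitude). The network, weights, and dynamics below are illustrative assumptions, not the study's benchmark systems.

    ```python
    """Toy weight-matrix GRN: each W[i][j] in [-1, +1] encodes type and strength
    of regulation of gene i by gene j (all values are placeholders)."""
    import math

    W = [[0.0, -0.8, 0.0],   # gene 0: repressed by gene 1
         [0.6,  0.0, 0.0],   # gene 1: activated by gene 0
         [0.3,  0.4, 0.0]]   # gene 2: activated by genes 0 and 1
    DECAY = [0.5, 0.5, 0.5]

    def step(x, dt=0.1):
        """One Euler step of dx_i/dt = sigmoid(sum_j W[i][j] * x_j) - DECAY[i] * x_i."""
        new_x = []
        for xi, row, d in zip(x, W, DECAY):
            drive = sum(w * xj for w, xj in zip(row, x))
            new_x.append(xi + dt * (1.0 / (1.0 + math.exp(-drive)) - d * xi))
        return new_x

    x = [0.1, 0.1, 0.1]
    for _ in range(200):
        x = step(x)
    print([round(v, 3) for v in x])   # approximate steady-state expression levels
    ```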

  11. Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities

    NASA Astrophysics Data System (ADS)

    Perjanik, Nicholas Steven

    As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one time interviews of a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.

  12. Calendar years 1989 and 1990 monitoring well installation program Y-12 plant, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-10-01

    This report documents the well-construction activities at the Y-12 Plant in Oak Ridge, Tennessee during 1989 and 1990. The well-construction program consisted of installing seventy-five monitoring wells. Geologists from ERCE (formerly the Engineering, Design and Geosciences Group) and Martin Marietta Energy Systems (Energy Systems) supervised and documented well-construction activities and monitored for health and safety concerns. Sixty-seven monitoring wells were installed under the supervision of an ERCE geologist from March 1989 to September 1990. Beginning in September 1990, Energy Systems supervised drilling activities for eight monitoring wells, the last of which was completed in December 1990. 9 refs., 3 figs., 2 tabs.

  13. Control mechanisms for stochastic biochemical systems via computation of reachable sets.

    PubMed

    Lakatos, Eszter; Stumpf, Michael P H

    2017-08-01

    Controlling the behaviour of cells by rationally guiding molecular processes is an overarching aim of much of synthetic biology. Molecular processes, however, are notoriously noisy and frequently nonlinear. We present an approach to studying the impact of control measures on motifs of molecular interactions that addresses the problems faced in many biological systems: stochasticity, parameter uncertainty and nonlinearity. We show that our reachability analysis formalism can describe the potential behaviour of biological (naturally evolved as well as engineered) systems, and provides a set of bounds on their dynamics at the level of population statistics: for example, we can obtain the possible ranges of means and variances of mRNA and protein expression levels, even in the presence of uncertainty about model parameters.
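
    A much-simplified illustration of the idea of bounding population statistics under parameter uncertainty is sketched below for a birth-death model of mRNA copy number, whose stationary distribution is Poisson with mean (and variance) k/g. The parameter box and the crude gridding approach are assumptions for illustration; the paper's reachability formalism handles richer dynamics and explicit controls.

    ```python
    """Toy sketch: bounds on the stationary mean/variance of mRNA copy number for
    a birth-death model (production rate k, degradation rate g) over an uncertain
    parameter box, obtained by gridding the box. Placeholder numbers throughout."""

    K_RANGE = (0.5, 2.0)    # assumed uncertainty interval for production rate
    G_RANGE = (0.05, 0.2)   # assumed uncertainty interval for degradation rate

    def stationary_moments(k, g):
        # Birth-death process: stationary distribution is Poisson(k / g),
        # so the mean and the variance are both k / g.
        mean = k / g
        return mean, mean

    def reachable_bounds(n_grid=50):
        """Crude reachable range of the stationary mean over the parameter box."""
        means = []
        for i in range(n_grid + 1):
            for j in range(n_grid + 1):
                k = K_RANGE[0] + (K_RANGE[1] - K_RANGE[0]) * i / n_grid
                g = G_RANGE[0] + (G_RANGE[1] - G_RANGE[0]) * j / n_grid
                means.append(stationary_moments(k, g)[0])
        return min(means), max(means)

    print(reachable_bounds())   # -> (2.5, 40.0): possible range of mean = variance
    ```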

  14. Control mechanisms for stochastic biochemical systems via computation of reachable sets

    PubMed Central

    Lakatos, Eszter

    2017-01-01

    Controlling the behaviour of cells by rationally guiding molecular processes is an overarching aim of much of synthetic biology. Molecular processes, however, are notoriously noisy and frequently nonlinear. We present an approach to studying the impact of control measures on motifs of molecular interactions that addresses the problems faced in many biological systems: stochasticity, parameter uncertainty and nonlinearity. We show that our reachability analysis formalism can describe the potential behaviour of biological (naturally evolved as well as engineered) systems, and provides a set of bounds on their dynamics at the level of population statistics: for example, we can obtain the possible ranges of means and variances of mRNA and protein expression levels, even in the presence of uncertainty about model parameters. PMID:28878957

  15. Finding Patterns of Emergence in Science and Technology

    DTIC Science & Technology

    2012-09-24

    formal evaluation scheduled – Case Studies, Eight Examples: Tissue Engineering, Cold Fusion, RF Metamaterials, DNA Microarrays, Genetic Algorithms, RNAi... emerging capabilities... Evidence Quality (i.e., the rubric) and deliver comprehensible evidential support for nomination • Demonstrate proof-of-concept nomination for Chinese

  16. Faculty Consulting in Natural Sciences and Engineering: Between Formal and Informal Knowledge Transfer

    ERIC Educational Resources Information Center

    Amara, Nabil; Landry, Rejean; Halilem, Norrin

    2013-01-01

    Academic consulting is a form of knowledge and technology transfer largely under-documented and under-studied that raises ethical and resources allocation issues. Based on a survey of 2,590 Canadian researchers in engineering and natural sciences, this paper explores three forms of academic consulting: (1) paid consulting; (2) unpaid consulting…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo

    Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.

  18. Knowledge Engineering as a Component of the Curriculum for Medical Cybernetists.

    PubMed

    Karas, Sergey; Konev, Arthur

    2017-01-01

    According to a new state educational standard, students who have chosen medical cybernetics as their major must develop a knowledge engineering competency. Previously, in the course "Clinical cybernetics", students practicing project-based learning designed automated workstations for medical personnel using client-server technology. The purpose of the article is to give insight into the design of a new educational module, "Knowledge engineering". Students will acquire expert knowledge by holding interviews and conducting surveys, and then they will formalize it. After that, students will form declarative expert knowledge in a network model and analyze the knowledge graph. Expert decision making methods will be applied in software on the basis of a production model of knowledge. Project implementation will result not only in the development of analytical competencies among students, but also in the creation of a practically useful expert system based on student models to support medical decisions. Nowadays, this module is being tested in the educational process.

  19. Automotive Stirling Engine Development Program

    NASA Technical Reports Server (NTRS)

    Nightingale, N.; Richey, A.; Farrell, R.; Riecke, G.; Ernst, W.; Howarth, R.; Cronin, M.; Simetkosky, M.; Smith, G.; Meacher, J.

    1985-01-01

    Development test activities on Mod I engines directed toward evaluating technologies for potential inclusion in the Mod II engine are summarized. Activities covered include: tests of a 12-tube combustion gas recirculation combustor; manufacture and flow-distribution testing of a two-manifold annular heater head; the piston rod/piston base joint; single-solid piston rings; and a digital air/fuel concept. Also summarized are the results of a formal assessment of candidate technologies for the Mod II engine, and preliminary design work for the Mod II. The overall program philosophy is outlined, and data and test results are presented.

  20. Current fluctuations in periodically driven systems

    NASA Astrophysics Data System (ADS)

    Barato, Andre C.; Chetrite, Raphael

    2018-05-01

    Small nonequilibrium systems driven by an external periodic protocol can be described by Markov processes with time-periodic transition rates. In general, current fluctuations in such small systems are large and may play a crucial role. We develop a theoretical formalism to evaluate the rate of such large deviations in periodically driven systems. We show that the scaled cumulant generating function that characterizes current fluctuations is given by a maximal Floquet exponent. Comparing deterministic protocols with stochastic protocols, we show that, with respect to large deviations, systems driven by a stochastic protocol with an infinitely large number of jumps are equivalent to systems driven by deterministic protocols. Our results are illustrated with three case studies: a two-state model for a heat engine, a three-state model for a molecular pump, and a biased random walk with a time-periodic affinity.
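
    A minimal numerical sketch of the stated result, that the scaled cumulant generating function (SCGF) of a current is a maximal Floquet exponent, might tilt the generator of a two-state process with a counting field, multiply the matrix exponentials of a piecewise-constant protocol over one period, and take (1/T) ln of the largest eigenvalue. The rates and protocol below are illustrative assumptions, not the paper's heat-engine parameters.

```python
# Numerical sketch: SCGF of a counted current as a maximal Floquet exponent
# for a two-state Markov process with a piecewise-constant periodic protocol.
import numpy as np
from scipy.linalg import expm

def tilted_generator(w01, w10, s):
    # Two-state generator with the 0 -> 1 transition counted (weight e^s).
    return np.array([[-w01, w10],
                     [w01 * np.exp(s), -w10]])

def scgf(s, protocol, T):
    """protocol: list of (duration_fraction, w01, w10) segments over one period T."""
    U = np.eye(2)
    for frac, w01, w10 in protocol:
        U = expm(tilted_generator(w01, w10, s) * frac * T) @ U
    lam = np.max(np.abs(np.linalg.eigvals(U)))   # Perron root of the propagator
    return np.log(lam) / T

protocol = [(0.5, 2.0, 1.0), (0.5, 0.5, 3.0)]    # two half-period segments (made up)
T = 1.0
# The mean current is the derivative of the SCGF at s = 0 (finite difference).
eps = 1e-4
mean_current = (scgf(eps, protocol, T) - scgf(-eps, protocol, T)) / (2 * eps)
print("SCGF(0.1) =", scgf(0.1, protocol, T), " mean current ≈", mean_current)
```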

  1. The challenges of informatics in synthetic biology: from biomolecular networks to artificial organisms

    PubMed Central

    Ramoni, Marco F.

    2010-01-01

    The field of synthetic biology holds an inspiring vision for the future; it integrates computational analysis, biological data and the systems engineering paradigm in the design of new biological machines and systems. These biological machines are built from basic biomolecular components analogous to electrical devices, and the information flow among these components requires the augmentation of biological insight with the power of a formal approach to information management. Here we review the informatics challenges in synthetic biology along three dimensions: in silico, in vitro and in vivo. First, we describe the state of the art of in silico support for synthetic biology, from specific data exchange formats to the most popular software platforms and algorithms. Next, we cast in vitro synthetic biology in terms of information flow, and discuss genetic fidelity in DNA manipulation, development strategies of biological parts and the regulation of biomolecular networks. Finally, we explore how the engineering chassis can manipulate biological circuitries in vivo to give rise to future artificial organisms. PMID:19906839

  2. Mentoring Among Scientists: Implications of Interpersonal Relationships within a Formal Mentoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan D. Maughan

    2006-11-01

    Mentoring is an established strategy for learning that has its root in antiquity. Most, if not all, successful scientists and engineers had an effective mentor at some point in their career. In the context of scientists and engineers, mentoring has been undefined. Reports addressing critical concerns regarding the future of science and engineering in the U.S. mention the practice of mentoring a priori, leaving organizations without guidance in its application. Preliminary results from this study imply that formal mentoring can be effective when properly defined and operationalized. Recognizing the uniqueness of the individual in a symbiotic mentor-protégé relationship significantly influences a protégé's learning experience, which carries repercussions into their career intentions. The mentor-protégé relationship is a key factor in succession planning and preserving and disseminating critical information and tacit knowledge essential to the development of leadership in the science and technological industry.

  3. Increase in the Accuracy of Calculating Length of Horizontal Cable SCS in Civil Engineering

    NASA Astrophysics Data System (ADS)

    Semenov, A.

    2017-11-01

    A modification of the method for calculating the horizontal cable consumption of SCS installed at civil engineering facilities is proposed. The proposed procedure preserves the simplicity of its prototype and provides a 5 percent increase in accuracy. The achieved accuracy values are justified and their agreement with the practice of real projects is demonstrated. The method is brought to the level of an engineering algorithm and formalized as the 12/70 rule.

  4. Systems, methods and apparatus for modeling, specifying and deploying policies in autonomous and autonomic systems using agent-oriented software engineering

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which in some embodiments, an agent-oriented specification modeled with MaCMAS, is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, method and apparatus that produce fully (mathematically) tractable development of agent-oriented specification(s) modeled with methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems from requirements through to code generation. The systems, method and apparatus described herein are illustrated through an example showing how user formulated policies can be translated into a formal mode which can then be converted to code. The requirements-based programming systems, method and apparatus described herein may provide faster, higher quality development and maintenance of autonomic systems based on user formulation of policies.

  5. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions remains a physician's concern. As a result, verification of system behaviour in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of such systems has received attention. This verification can be achieved with formal languages that have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study presents a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modelling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems. The system is responsible for monitoring a diabetic's blood sugar.
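
    The following is a minimal sketch of the kind of patient-safety constraint that might be stated in Z and checked against the monitoring software's behaviour. The bounds, state variables and dosing rule are hypothetical illustrations, not taken from the CIIP specification.

```python
# Toy safety-invariant check for an insulin-pump controller; all thresholds
# and the dosing rule are invented for illustration.
from dataclasses import dataclass

@dataclass
class PumpState:
    blood_sugar: float      # mg/dL, as reported by the sensor
    infusion_rate: float    # units/hour commanded by the controller

def invariant(state: PumpState, low: float = 70.0, max_rate: float = 5.0) -> bool:
    """Never infuse insulin while hypoglycaemic; never exceed the maximum rate."""
    if state.blood_sugar < low and state.infusion_rate > 0:
        return False
    return 0.0 <= state.infusion_rate <= max_rate

def controller(blood_sugar: float) -> float:
    """Hypothetical dosing rule under verification."""
    if blood_sugar < 70:
        return 0.0
    return min(5.0, 0.02 * (blood_sugar - 70))

# Exhaustively check the invariant over a small discretized sensor range,
# loosely mimicking what a model checker would do on the formal model.
violations = [bs for bs in range(40, 400, 5)
              if not invariant(PumpState(bs, controller(bs)))]
print("violating sensor readings:", violations)   # empty list: invariant holds
```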

  6. Implementation of Scene Shadows in the Target Acquisition TDA (TARGAC).

    DTIC Science & Technology

    1994-11-01

    Appendix C contains the details of each change made. Each change is accompanied by an Engineering Change Report (ECR) and in-line documentation of the source code. Appendix D is a formal design document of the changes needed to implement shadowing by small-scale features. The implementation presented in ...

  7. A Formal Specification and Verification Method for the Prevention of Denial of Service in Ada Services

    DTIC Science & Technology

    1988-03-01

    ... denial of service. This paper assumes that the reader is a computer science or engineering professional working in the area of formal specification and ... recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy

  8. Conceptualizing the Structure of Coupled Estuary, Coast and Inner Shelf Sediment Systems

    NASA Astrophysics Data System (ADS)

    French, J.; Burningham, H.

    2013-12-01

    The concept of the coastal cell has endured for 50 years as a geomorphological framework for coastal engineering and management. Cells are readily defined for coasts dominated by alongshore transport of beach-grade material, but the concept struggles to accommodate long range cohesive sediment fluxes. Moreover, the challenges of predicting, understanding and mitigating climate change impacts at the coast demand a richer conceptualization that embraces the connectedness of open coasts with estuaries and the inner shelf at broader scales and that also acknowledges the extent of anthropogenic control. Accordingly, this paper presents a new approach that re-engages with formal systems analysis and restores a geomorphological focus to coastal management problems that have latterly been tackled primarily by engineers. At the heart of this approach is an ontology of landforms and interventions (both structural and non-structural) that is partly inspired by the coastal tract concept and its temporal hierarchy of sediment sharing systems, but which also emphasizes a spatial hierarchy in scale, from coastal regions, through landform complexes, to landforms and human interventions. The complex web of interactions is represented through an influence network in which a sub-set of mass transfer pathways define the sediment system. Guided by a machine-readable ontology and produced within a geospatial framework, such system 'maps' can be utilized in several ways. First, their generation constitutes a form of knowledge formalization in which disparate sources of information (published research, data, etc.) are generalized into usable knowledge. Second, system maps also provide a repository for more quantitative analyses and system-level modelling at the scales that really matter. Third, they can also be analyzed using methods derived from graph theory to yield potentially valuable insights into the scale linkages that govern the mutual adjustment of estuary, coast and inner shelf morphology and their implications for the development of quantitative models able to capture such behaviour. Illustrative results, produced as a contribution to the NERC Integrated Coastal Sediment Systems (iCOASST) project, are presented for demonstration regions in Liverpool Bay and Suffolk, UK.
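
    One way such a system map can be represented and queried is as a directed graph of sediment-transfer influences. The sketch below is a toy illustration using invented node names and links; the iCOASST ontology and regional maps are far richer.

```python
# Toy estuary-coast-shelf "system map" as a directed graph of sediment
# transfer pathways, queried with simple graph-theoretic measures.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("inner shelf", "ebb-tidal delta"),
    ("ebb-tidal delta", "beach"),
    ("cliff", "beach"),
    ("beach", "dune"),
    ("beach", "estuary mouth"),
    ("estuary mouth", "mudflat"),
    ("mudflat", "saltmarsh"),
])

# Which landform components does the beach ultimately supply with sediment?
print("downstream of beach:", nx.descendants(G, "beach"))
# Centrality highlights components whose alteration (e.g. by an engineering
# intervention) would most disrupt connectivity of the sediment-sharing system.
print(nx.betweenness_centrality(G))
```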

  9. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will complete 450 entries, which will populate the E3 collection to a level that fully spans earthquake science and engineering. Scientists, engineers, and educators who have suggestions for content to be included in the Encyclopedia can visit www.earthquake.info now to complete the "Suggest a Web Page" form.

  10. A Design Pattern for Decentralised Decision Making

    PubMed Central

    Valentini, Gabriele; Fernández-Oto, Cristian; Dorigo, Marco

    2015-01-01

    The engineering of large-scale decentralised systems requires sound methodologies to guarantee the attainment of the desired macroscopic system-level behaviour given the microscopic individual-level implementation. While a general-purpose methodology is currently out of reach, specific solutions can be given to broad classes of problems by means of well-conceived design patterns. We propose a design pattern for collective decision making grounded on experimental/theoretical studies of the nest-site selection behaviour observed in honeybee swarms (Apis mellifera). The way in which honeybee swarms arrive at consensus is fairly well-understood at the macroscopic level. We provide formal guidelines for the microscopic implementation of collective decisions to quantitatively match the macroscopic predictions. We discuss implementation strategies based on both homogeneous and heterogeneous multiagent systems, and we provide means to deal with spatial and topological factors that have a bearing on the micro-macro link. Finally, we exploit the design pattern in two case studies that showcase the viability of the approach. Besides engineering, such a design pattern can prove useful for a deeper understanding of decision making in natural systems thanks to the inclusion of individual heterogeneities and spatial factors, which are often disregarded in theoretical modelling. PMID:26496359
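
    A toy agent-based sketch of the honeybee-inspired mechanism (discovery, recruitment and cross-inhibition between two options) is given below. The transition probabilities are illustrative inventions; the design pattern in the paper provides formal rules for choosing them so that microscopic behaviour matches the macroscopic model.

```python
# Toy collective decision between two options "A" and "B"; agents are either
# uncommitted (None) or committed, and update via discovery, recruitment,
# abandonment and cross-inhibition. All rate constants are made up.
import random

QUALITY = {"A": 0.7, "B": 0.5}   # illustrative option qualities

def update(agent, frac):
    """One update of a single agent's commitment state."""
    if agent is None:
        for opt in ("A", "B"):
            # spontaneous discovery (scaled by quality) plus recruitment by peers
            if random.random() < 0.02 * QUALITY[opt] + 0.3 * frac[opt]:
                return opt
        return None
    other = "B" if agent == "A" else "A"
    if random.random() < 0.01 * (1 - QUALITY[agent]):   # abandonment
        return None
    if random.random() < 0.3 * frac[other]:             # cross-inhibition
        return None
    return agent

swarm = [None] * 200
for _ in range(200):
    frac = {opt: swarm.count(opt) / len(swarm) for opt in ("A", "B")}
    swarm = [update(a, frac) for a in swarm]

print({opt: swarm.count(opt) for opt in ("A", "B", None)})  # consensus on "A" typically
```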

  11. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    Computer Science A Quantitative Approach to the Formal Verification of Real - Time Systems Sergio Vale Aguiar Campos September 1996 CMU-CS-96-199...ptisiic raieaiSI v Diambimos Lboiamtad _^ A Quantitative Approach to the Formal Verification of Real - Time Systems Sergio Vale Aguiar Campos...implied, of NSF, the Semiconduc- tor Research Corporation, ARPA or the U.S. government. Keywords: real - time systems , formal verification, symbolic

  12. Defaults, context, and knowledge: alternatives for OWL-indexed knowledge bases.

    PubMed

    Rector, A

    2004-01-01

    The new Web Ontology Language (OWL) and its Description Logic compatible sublanguage (OWL-DL) explicitly exclude defaults and exceptions, as do all logic based formalisms for ontologies. However, many biomedical applications appear to require default reasoning, at least if they are to be engineered in a maintainable way. Default reasoning has always been one of the great strengths of Frame systems such as Protégé. Resolving this conflict requires analysis of the different uses for defaults and exceptions. In some cases, alternatives can be provided within the OWL framework; in others, it appears that hybrid reasoning about a knowledge base of contingent facts built around the core ontology is necessary. Trade-offs include both human factors and the scaling of computational performance. The analysis presented here is based on the OpenGALEN experience with large scale ontologies using a formalism, GRAIL, which explicitly incorporates constructs for hybrid reasoning, numerous experiments with OWL, and initial work on combining OWL and Protégé.
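
    The frame-style default reasoning discussed here can be caricatured with a few lines of code: a property value is inherited down an is-a chain unless a more specific frame overrides it, which is exactly the non-monotonic behaviour OWL-DL excludes. The classes and property below are invented illustrations, not OpenGALEN or GRAIL content.

```python
# Toy frame system with default inheritance and exception overriding.
FRAMES = {
    "Cell":              {"is_a": None,           "has_nucleus": True},   # default
    "RedBloodCell":      {"is_a": "Cell",         "has_nucleus": False},  # exception
    "HumanRedBloodCell": {"is_a": "RedBloodCell"},
}

def lookup(frame, prop):
    """Walk up the is_a chain; the most specific frame's value wins."""
    while frame is not None:
        slots = FRAMES[frame]
        if prop in slots:
            return slots[prop]
        frame = slots.get("is_a")
    raise KeyError(prop)

print(lookup("HumanRedBloodCell", "has_nucleus"))  # False: the default is overridden
print(lookup("Cell", "has_nucleus"))               # True: the default itself
```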

  13. A decision science approach for integrating social science in climate and energy solutions

    NASA Astrophysics Data System (ADS)

    Wong-Parodi, Gabrielle; Krishnamurti, Tamar; Davis, Alex; Schwartz, Daniel; Fischhoff, Baruch

    2016-06-01

    The social and behavioural sciences are critical for informing climate- and energy-related policies. We describe a decision science approach to applying those sciences. It has three stages: formal analysis of decisions, characterizing how well-informed actors should view them; descriptive research, examining how people actually behave in such circumstances; and interventions, informed by formal analysis and descriptive research, designed to create attractive options and help decision-makers choose among them. Each stage requires collaboration with technical experts (for example, climate scientists, geologists, power systems engineers and regulatory analysts), as well as continuing engagement with decision-makers. We illustrate the approach with examples from our own research in three domains related to mitigating climate change or adapting to its effects: preparing for sea-level rise, adopting smart grid technologies in homes, and investing in energy efficiency for office buildings. The decision science approach can facilitate creating climate- and energy-related policies that are behaviourally informed, realistic and respectful of the people whom they seek to aid.

  14. Canonical formalism for modelling and control of rigid body dynamics.

    PubMed

    Gurfil, P

    2005-12-01

    This paper develops a new paradigm for stabilization of rigid-body dynamics. The state-space model is formulated using canonical elements, known as the Serret-Andoyer (SA) variables, thus far scarcely used for engineering applications. The main feature of the SA formalism is the reduction of the dynamics via the underlying symmetry stemming from conservation of angular momentum and rotational kinetic energy. The controllability of the system model is examined using the notion of accessibility, and is shown to be accessible from all points. Based on the accessibility proof, two nonlinear asymptotic feedback stabilizers are developed: a damping feedback is designed based on the Jurdjevic-Quinn method, and a Hamiltonian controller is derived by using the Hamiltonian as a natural Lyapunov function for the closed-loop dynamics. It is shown that the Hamiltonian control is both passive and inverse optimal with respect to a meaningful performance index. The performance of the new controllers is examined and compared using simulations of realistic scenarios from the satellite attitude dynamics field.
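
    A simplified numerical illustration of damping feedback for rigid-body stabilization is sketched below. It uses the familiar Euler equations in body axes rather than the paper's Serret-Andoyer variables, and the inertia values and gain are arbitrary; the point is that the torque u = -k·ω makes the rotational kinetic energy a Lyapunov function, since its time derivative is -k·|ω|² ≤ 0.

```python
# Damping feedback u = -k*omega applied to the Euler equations I*d(omega)/dt =
# (I*omega) x omega + u; the rotational kinetic energy decays monotonically.
import numpy as np

I = np.diag([3.0, 2.0, 1.0])          # principal moments of inertia (illustrative)
I_inv = np.linalg.inv(I)
k = 0.5                               # damping gain (illustrative)
omega = np.array([0.3, -0.2, 0.4])    # initial body angular velocity (rad/s)
dt = 0.01

for _ in range(2000):
    u = -k * omega                                   # damping feedback torque
    omega_dot = I_inv @ (np.cross(I @ omega, omega) + u)
    omega = omega + dt * omega_dot                   # explicit Euler integration

energy = 0.5 * omega @ I @ omega
print("final |omega| =", np.linalg.norm(omega), " kinetic energy =", energy)
```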

  15. A Foundational Approach to Designing Geoscience Ontologies

    NASA Astrophysics Data System (ADS)

    Brodaric, B.

    2009-05-01

    E-science systems are increasingly deploying ontologies to aid online geoscience research. Geoscience ontologies are typically constructed independently by isolated individuals or groups who tend to follow few design principles. This limits the usability of the ontologies due to conceptualizations that are vague, conflicting, or narrow. Advances in foundational ontologies and formal engineering approaches offer promising solutions, but these advanced techniques have had limited application in the geosciences. This paper develops a design approach for geoscience ontologies by extending aspects of the DOLCE foundational ontology and the OntoClean method. Geoscience examples will be presented to demonstrate the feasibility of the approach.

  16. Toward Synthesis, Analysis, and Certification of Security Protocols

    NASA Technical Reports Server (NTRS)

    Schumann, Johann

    2004-01-01

    Implemented security protocols are basically pieces of software which are used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (using insecure communication media), and (c) transfer data between the communication partners in such a way that these data are available only to the desired receiver, and not to anyone else. Such an implementation usually consists of the following components: the protocol engine, which controls the sequence in which the messages of the protocol are sent over the network and which controls the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines that actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For a security protocol to work correctly, all of these components must work flawlessly. Many formal-methods-based techniques for the analysis of security protocols have been developed. They range from specific logics (e.g., BAN logic [4] or higher-order logics [12]) to model-checking approaches [2]. In each approach, the analysis tries to prove that no one (or at least no modeled intruder) can get access to secret data. Otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities. For example, on a website with a log-in screen, multiple tries with invalid passwords caused the expected error message (too many retries) but nevertheless let the user pass. Finally, security can be compromised by silly implementation bugs or design decisions. In one commercial VPN product, all calls to the encryption routines were accidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses which can be exploited easily (see, e.g., DVD encoding). Summarizing, there is a large number of ways to make errors which can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of absence of an attack; they ought to be used to provide an end-to-end, tool-supported framework for security software. With such an approach, all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single, high-level specification.
    By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.
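
    One of the properties that formal protocol analysis checks, message ordering enforced by the protocol engine, can be illustrated with a toy state machine. The handshake message names below are invented, not a real protocol.

```python
# Toy protocol engine: enforce a fixed three-step message sequence and reject
# out-of-order or replayed messages.
class ProtocolEngine:
    SEQUENCE = ["CLIENT_HELLO", "SERVER_HELLO", "KEY_CONFIRM"]

    def __init__(self):
        self.step = 0

    def receive(self, msg_type: str) -> str:
        expected = self.SEQUENCE[self.step] if self.step < len(self.SEQUENCE) else None
        if msg_type != expected:
            raise ValueError(f"out-of-order message {msg_type!r}; expected {expected!r}")
        self.step += 1
        return "HANDSHAKE_COMPLETE" if self.step == len(self.SEQUENCE) else "OK"

engine = ProtocolEngine()
print(engine.receive("CLIENT_HELLO"))
print(engine.receive("SERVER_HELLO"))
print(engine.receive("KEY_CONFIRM"))
# engine.receive("CLIENT_HELLO") would now raise: replayed / out-of-order message.
```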

  17. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.
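
    The fault-masking idea behind NMR-style redundancy is simply majority voting over replicated channels. The generic vote function below is a small illustration, not the RCP's verified voting logic.

```python
# Majority voting over replicated channel outputs; a single transiently
# faulty replica is masked as long as a strict majority agrees.
from collections import Counter

def majority_vote(values):
    """Return the value reported by a strict majority of replicas, if any."""
    (value, count), = Counter(values).most_common(1)
    return value if count > len(values) // 2 else None

replicas = [42, 42, 17, 42]      # one transiently faulty channel
print(majority_vote(replicas))   # 42: the fault is masked
```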

  18. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards but not all can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  19. Forum on Workforce Development

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward

    2010-01-01

    APPEL Mission: To support NASA's mission by promoting individual, team, and organizational excellence in program/project management and engineering through the application of learning strategies, methods, models, and tools. Goals: a) Provide a common frame of reference for NASA's technical workforce. b) Provide and enhance critical job skills. c) Support engineering, program and project teams. d) Promote organizational learning across the agency. e) Supplement formal educational programs.

  20. On introduction of artificial intelligence elements to heat power engineering

    NASA Astrophysics Data System (ADS)

    Dregalin, A. F.; Nazyrova, R. R.

    1993-10-01

    The basic problems of the 'thermodynamic intelligence' of personal computers are outlined. This concept is introduced for heat processes occurring in the engines of flying vehicles. In particular, the thermodynamic intelligence of computers is determined by the possibility of deriving formal relationships between thermodynamic functions. In chemical thermodynamics, the concept of a characteristic function has been introduced.

  1. Social engineering of societal knowledge in livestock science: Can we be more empathetic?

    PubMed Central

    Ravikumar, R. K.; Thakur, Devesh; Choudhary, Hardev; Kumar, Vivek; Kinhekar, Amol S.; Garg, Tushar; Ponnusamy, K.; Bhojne, G. R.; Shetty, Vasanth M.; Kumar, Vipin

    2017-01-01

    Questions are raised about the effective utilization of farmers' wisdom by communities in their farming. Planners' support to livelihoods emphasizes mostly inputs from outside rather than setting up sustainable goals. Formal institutions and programme planners face constraints and remain sceptical about wider dissemination of the indigenous knowledge research system (IKRS). This is in spite of evidence that a considerable number of farmers in the livestock sector depend on IKRS. In this context, it is pertinent to showcase the dissemination potential of these knowledge systems over larger geographical areas. The review illustrates different challenges encountered in controlling livestock ailments such as ectoparasite infestation through IKRS. Several times it was suggested to provide or share IKRS to thwart ailments in a specific region; this is interesting, as it illustrates how the formal system fails to recognize farmers' problems and the challenges in integrating these sustainable practices. It should be noted that dissemination activities seldom take into account the experimental potential of farmers. This review articulates the evidence generated in enhancing diffusion, and thereby dissemination, of IKRS. The support extended by IKRS to the entrepreneurial activity of smallholder farming units has not received adequate recognition. A minimum standard protocol is needed for deriving benefit from such low-cost alternative technologies. This will enrich incremental innovation activities according to location-specific needs and provide scope for wider dissemination. PMID:28246452

  2. Social engineering of societal knowledge in livestock science: Can we be more empathetic?

    PubMed

    Ravikumar, R K; Thakur, Devesh; Choudhary, Hardev; Kumar, Vivek; Kinhekar, Amol S; Garg, Tushar; Ponnusamy, K; Bhojne, G R; Shetty, Vasanth M; Kumar, Vipin

    2017-01-01

    Questions are raised about the effective utilization of farmers' wisdom by communities in their farming. Planners' support to livelihoods emphasizes mostly inputs from outside rather than setting up sustainable goals. Formal institutions and programme planners face constraints and remain sceptical about wider dissemination of the indigenous knowledge research system (IKRS). This is in spite of evidence that a considerable number of farmers in the livestock sector depend on IKRS. In this context, it is pertinent to showcase the dissemination potential of these knowledge systems over larger geographical areas. The review illustrates different challenges encountered in controlling livestock ailments such as ectoparasite infestation through IKRS. Several times it was suggested to provide or share IKRS to thwart ailments in a specific region; this is interesting, as it illustrates how the formal system fails to recognize farmers' problems and the challenges in integrating these sustainable practices. It should be noted that dissemination activities seldom take into account the experimental potential of farmers. This review articulates the evidence generated in enhancing diffusion, and thereby dissemination, of IKRS. The support extended by IKRS to the entrepreneurial activity of smallholder farming units has not received adequate recognition. A minimum standard protocol is needed for deriving benefit from such low-cost alternative technologies. This will enrich incremental innovation activities according to location-specific needs and provide scope for wider dissemination.

  3. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  4. Optimal tuning of a confined Brownian information engine.

    PubMed

    Park, Jong-Min; Lee, Jae Sung; Noh, Jae Dong

    2016-03-01

    A Brownian information engine is a device extracting mechanical work from a single heat bath by exploiting the information on the state of a Brownian particle immersed in the bath. As with other engines, it is important to find the optimal operating condition that yields the maximum extracted work or power. The optimal condition for a Brownian information engine with a finite cycle time τ has rarely been studied because of the difficulty in finding the nonequilibrium steady state. In this study, we introduce a model for the Brownian information engine and develop an analytic formalism for its steady-state distribution for any τ. We find that the extracted work per engine cycle is maximum when τ approaches infinity, while the power is maximum when τ approaches zero.

  5. OMOGENIA: A Semantically Driven Collaborative Environment

    NASA Astrophysics Data System (ADS)

    Liapis, Aggelos

    Ontology creation can be thought of as a social procedure. Indeed the concepts involved in general need to be elicited from communities of domain experts and end-users by teams of knowledge engineers. Many problems in ontology creation appear to resemble certain problems in software design, particularly with respect to the setup of collaborative systems. For instance, the resolution of conceptual conflicts between formalized ontologies is a major engineering problem as ontologies move into widespread use on the semantic web. Such conflict resolution often requires human collaboration and cannot be achieved by automated methods with the exception of simple cases. In this chapter we discuss research in the field of computer-supported cooperative work (CSCW) that focuses on classification and which throws light on ontology building. Furthermore, we present a semantically driven collaborative environment called OMOGENIA as a natural way to display and examine the structure of an evolving ontology in a collaborative setting.

  6. Quantum-enhanced absorption refrigerators

    PubMed Central

    Correa, Luis A.; Palao, José P.; Alonso, Daniel; Adesso, Gerardo

    2014-01-01

    Thermodynamics is a branch of science blessed by an unparalleled combination of generality of scope and formal simplicity. Based on few natural assumptions together with the four laws, it sets the boundaries between possible and impossible in macroscopic aggregates of matter. This triggered groundbreaking achievements in physics, chemistry and engineering over the last two centuries. Close analogues of those fundamental laws are now being established at the level of individual quantum systems, thus placing limits on the operation of quantum-mechanical devices. Here we study quantum absorption refrigerators, which are driven by heat rather than external work. We establish thermodynamic performance bounds for these machines and investigate their quantum origin. We also show how those bounds may be pushed beyond what is classically achievable, by suitably tailoring the environmental fluctuations via quantum reservoir engineering techniques. Such superefficient quantum-enhanced cooling realises a promising step towards the technological exploitation of autonomous quantum refrigerators. PMID:24492860

  7. Using ontologies for structuring organizational knowledge in Home Care assistance.

    PubMed

    Valls, Aida; Gibert, Karina; Sánchez, David; Batet, Montserrat

    2010-05-01

    Information Technologies and Knowledge-based Systems can significantly improve the management of complex distributed health systems, where supporting multidisciplinarity is crucial and communication and synchronization between the different professionals and tasks becomes essential. This work proposes the use of the ontological paradigm to describe the organizational knowledge of such complex healthcare institutions as a basis to support their management. The ontology engineering process is detailed, as well as the way to keep the ontology updated in the face of changes. The paper also analyzes how such an ontology can be exploited in a real healthcare application and the role of the ontology in the customization of the system. The particular case of senior Home Care assistance is addressed, as this is a highly distributed field as well as a strategic goal in an ageing Europe. The proposed ontology design is based on a Home Care medical model defined by a European consortium of Home Care professionals, framed in the scope of the K4Care European project (FP6). Due to the complexity of the model and the knowledge gap existing between the textual medical model and the strict formalization of an ontology, an ontology engineering methodology (On-To-Knowledge) has been followed. After applying the On-To-Knowledge steps, the following results were obtained: the feasibility study concluded that the ontological paradigm and the expressiveness of modern ontology languages were enough to describe the required medical knowledge; after the kick-off and refinement stages, a complete and non-ambiguous definition of the Home Care model, including its main components and interrelations, was obtained; the formalization stage expressed HC medical entities in the form of ontological classes, which are interrelated by means of hierarchies, properties and semantically rich class restrictions; the evaluation, carried out by exploiting the ontology in a knowledge-driven e-health application running in a real scenario, showed that the ontology design and its exploitation brought several benefits with regard to flexibility, adaptability and work efficiency from the end-user point of view; for the maintenance stage, two software tools are presented, aimed at addressing the incorporation and modification of healthcare units and the personalization of ontological profiles. The paper shows that the ontological paradigm and the expressiveness of modern ontology languages can be exploited not only to represent terminology in a non-ambiguous way, but also to formalize the interrelations and organizational structures involved in a real and distributed healthcare environment. This kind of ontology facilitates adaptation in the face of changes in the healthcare organization or Care Units, supports the creation of profile-based interaction models in a transparent and seamless way, and increases the reusability and generality of the developed software components. As a conclusion of the exploitation of the developed ontology in a real medical scenario, we can say that an ontology formalizing organizational interrelations is a key component for building effective distributed knowledge-driven e-health systems. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Spin filter for arbitrary spins by substrate engineering

    NASA Astrophysics Data System (ADS)

    Pal, Biplab; Römer, Rudolf A.; Chakrabarti, Arunava

    2016-08-01

    We design spin filters for particles with potentially arbitrary spin S (= 1/2, 1, 3/2, …) using a one-dimensional periodic chain of magnetic atoms as a quantum device. Describing the system within a tight-binding formalism, we present an analytical method to unravel the analogy between a one-dimensional magnetic chain and a multi-strand ladder network. This analogy is crucial, and is subsequently exploited to engineer gaps in the energy spectrum by an appropriate choice of the magnetic substrate. We obtain an exact correlation between the magnitude of the spin of the incoming beam of particles and the magnetic moment of the substrate atoms in the chain required for opening up a spectral gap. Results of spin-polarized transport, calculated within a transfer matrix formalism, are presented for particles having half-integer as well as higher spin states. We find that the chain can be made to act as a quantum device which opens a transmission window only for selected spin components over certain ranges of the Fermi energy, blocking them in the remaining part of the spectrum. The results appear to be robust even when the choice of the substrate atoms deviates substantially from the ideal situation, as verified by extending the ideas to the case of a 'spin spiral'. Interestingly, the spin spiral geometry, apart from exhibiting the filtering effect, is also seen to act as a device for flipping spins, an effect that can be monitored by an interplay of the system size and the period of the spiral. Our scheme is applicable to ultracold quantum gases, and might inspire future experiments in this direction.
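
    A scalar caricature of the filtering idea can be written in a few lines: in a periodic tight-binding chain each spin projection m_s sees a shifted on-site potential (here eps = -J·m_s, an illustrative coupling), and an energy propagates only if the trace of the one-period transfer matrix satisfies |Tr M(E)| ≤ 2. Energies allowed for one projection but forbidden for another are the spin-selective transmission windows. All numbers are made up, and the multi-strand structure of the real model is omitted.

```python
# Band/gap test for a periodic 1D tight-binding chain via the one-period
# transfer matrix; spin enters only through a spin-dependent on-site shift.
import numpy as np

def allowed(E, onsite, t=1.0):
    """Bloch band condition |Tr M(E)| <= 2 for one unit cell of a periodic chain."""
    M = np.eye(2)
    for eps in onsite:
        M = np.array([[(E - eps) / t, -1.0], [1.0, 0.0]]) @ M
    return abs(np.trace(M)) <= 2.0

J = 0.8                              # illustrative exchange coupling strength
spin_projections = [+0.5, -0.5]      # spin-1/2 beam
probe = 0.2                          # illustrative Fermi energy
for ms in spin_projections:
    onsite = [-J * ms, 0.0]          # two-site cell: magnetic atom + spacer
    print(f"m_s = {ms:+.1f}: propagates at E = {probe}? {allowed(probe, onsite)}")
# With these (made-up) numbers the +1/2 component lies in an allowed band while
# the -1/2 component falls in a gap, i.e. the chain acts as a spin filter.
```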

  9. Predicting performance of axial pump inducer of LOX booster turbo-pump of staged combustion cycle based rocket engine using CFD

    NASA Astrophysics Data System (ADS)

    Mishra, Arpit; Ghosh, Parthasarathi

    2015-12-01

    For low-cost, high-thrust space missions with high specific impulse and high reliability, inert weight needs to be minimized, thereby increasing the delivered payload. The turbopump feed system for a liquid propellant rocket engine (LPRE) has the highest power-to-weight ratio. Turbopumps are primarily equipped with an axial flow inducer to achieve high angular velocity and low suction pressure in combination with increased system reliability. The performance of the turbopump strongly depends on the performance of the inducer. Thus, designing an LPRE turbopump demands optimization of the inducer geometry based on performance over different off-design operating regimes. In this paper, a steady-state CFD analysis of the inducer of a liquid oxygen (LOX) axial pump used as a booster pump for an oxygen-rich staged combustion cycle rocket engine is presented using ANSYS® CFX. Attempts have been made to obtain the performance characteristic curves for the LOX pump inducer. The formalism has been used to predict the performance of the inducer over a throttling range from 80% to 113% of nominal thrust and for rotational velocities from 4500 to 7500 rpm. The results have been analysed to determine the region of cavitation inception for different inlet pressures.

  10. Thermodynamical analysis of a quantum heat engine based on harmonic oscillators.

    PubMed

    Insinga, Andrea; Andresen, Bjarne; Salamon, Peter

    2016-07-01

    Many models of heat engines have been studied with the tools of finite-time thermodynamics and an ensemble of independent quantum systems as the working fluid. Because of their convenient analytical properties, harmonic oscillators are the most frequently used example of a quantum system. We analyze different thermodynamical aspects with the final aim of the optimization of the performance of the engine in terms of the mechanical power provided during a finite-time Otto cycle. The heat exchange mechanism between the working fluid and the thermal reservoirs is provided by the Lindblad formalism. We describe an analytical method to find the limit cycle and give conditions for a stable limit cycle to exist. We explore the power production landscape as the durations of the four branches of the cycle are varied for short times, intermediate times, and special frictionless times. For short times we find a periodic structure with atolls of purely dissipative operation surrounding islands of divergent behavior where, rather than tending to a limit cycle, the working fluid accumulates more and more energy. For frictionless times the periodic structure is gone and we come very close to the global optimal operation. The global optimum is found and interestingly comes with a particular value of the cycle time.

  11. Machine Learning-based Intelligent Formal Reasoning and Proving System

    NASA Astrophysics Data System (ADS)

    Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia

    2018-03-01

    Reasoning systems can be used in many fields, and improving reasoning efficiency is at the core of their design. By combining a formal description of proofs with a rule-matching algorithm and introducing a machine learning algorithm, the proposed intelligent formal reasoning and verification system achieves high efficiency. Experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.
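
    Verifying the correctness of a propositional inference can, for small formulas, be done by exhaustive truth-table checking: the conclusion is entailed exactly when no assignment makes all premises true and the conclusion false. The sketch below uses invented formulas and is only a baseline illustration, not the machine-learning-assisted system described above.

```python
# Brute-force entailment check for propositional logic.
from itertools import product

def entails(premises, conclusion, variables):
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False            # found a countermodel
    return True

# Premises: p -> q and q -> r.  Conclusion: p -> r (hypothetical syllogism).
premises = [lambda e: (not e["p"]) or e["q"],
            lambda e: (not e["q"]) or e["r"]]
conclusion = lambda e: (not e["p"]) or e["r"]
print(entails(premises, conclusion, ["p", "q", "r"]))   # True
```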

  12. A theoretical framework for improving education in geriatric medicine.

    PubMed

    Boreham, N C

    1983-01-01

    Alternative concepts of learning include a formal system, in which part of the medical curriculum is designated as that for geriatric medicine; a non-formal system, including conferences, lectures and broadcasts available to both medical students and physicians; and an informal system, in which doctors learn medicine through their experience of practising the profession. While most emphasis in medical schools would seem to be on the formal system, it is essential that medical educators, if they wish their students in later life to maintain high levels of self-initiated learning, use all three strategies. The structure of a system of formal teaching for geriatric medicine is examined. An important objective is attitude change, and it is in achieving this that geriatricians must be particularly involved in non-formal and informal systems.

  13. A Bayesian Framework for Analysis of Pseudo-Spatial Models of Comparable Engineered Systems with Application to Spacecraft Anomaly Prediction Based on Precedent Data

    NASA Astrophysics Data System (ADS)

    Ndu, Obibobi Kamtochukwu

    To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
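
    The pipeline described, embedding rank-ordered expert dissimilarities into a pseudo-space and then kriging a reliability measure over it, can be sketched with standard tools: non-metric multidimensional scaling followed by Gaussian-process regression. The dissimilarity matrix and reliability values below are fabricated toy data, not spacecraft anomaly records, and the library choices are an assumption about one reasonable realization.

```python
# Toy pseudo-spatial embedding + Kriging-style prediction for a new concept.
import numpy as np
from sklearn.manifold import MDS
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Symmetric expert-judged dissimilarities among 5 precedent systems + 1 concept.
D = np.array([[0, 1, 2, 4, 5, 2],
              [1, 0, 1, 3, 4, 2],
              [2, 1, 0, 2, 3, 3],
              [4, 3, 2, 0, 1, 4],
              [5, 4, 3, 1, 0, 5],
              [2, 2, 3, 4, 5, 0]], dtype=float)

# Non-metric MDS recovers a 2-D "pseudo-spatial" configuration from the ranks.
coords = MDS(n_components=2, dissimilarity="precomputed",
             metric=False, random_state=0).fit_transform(D)

# Observed reliability for the 5 precedent systems (toy values); the 6th row
# is the new concept whose reliability we want to infer.
reliability = np.array([0.95, 0.93, 0.90, 0.80, 0.75])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(coords[:5], reliability)
mean, std = gp.predict(coords[5:6], return_std=True)
print(f"predicted reliability of concept: {mean[0]:.3f} ± {std[0]:.3f}")
```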

  14. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the cost of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. Ensuring that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool to support it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  15. Asian Universities' Collaboration for Advanced Environmental Engineering via Simultaneous Distant Learning Classes Using Video Playback

    NASA Astrophysics Data System (ADS)

    Araki, Mituhiko; Nakamura, Yuichi; Fujii, Shigeo; Tsuno, Hiroshi

    Three simultaneous international lectures at the postgraduate level in the field of environmental science and engineering are under preparation at Kyoto University. They are planned to be opened at three Asian universities (Tsinghua University in China, University of Malaya in Malaysia, and Kyoto University in Japan) as formal courses. The contents of the lectures, the purpose of the project and the technical problems are reported.

  16. Joint Logistics Commanders’ Biennial Software Workshop (4th) Orlando II: Solving the PDSS (Post Deployment Software Support) Challenge Held in Orlando, Florida on 27-29 January 87. Volume 2. Proceedings

    DTIC Science & Technology

    1987-06-01

    described the state of maturity of software engineering as being equivalent to the state of maturity of Civil Engineering before Pythagoras invented the ... formal verification languages, theorem provers or secure configuration management tools would have to be maintained and used in the PDSS Center to

  17. Formal System Verification - Extension 2

    DTIC Science & Technology

    2012-08-08

    vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel ... together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C ... source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in

  18. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  19. Bottom-up approaches to strengthening child protection systems: Placing children, families, and communities at the center.

    PubMed

    Wessells, Michael G

    2015-05-01

    Efforts to strengthen national child protection systems have frequently taken a top-down approach of imposing formal, government-managed services. Such expert-driven approaches are often characterized by low use of formal services and the misalignment of the nonformal and formal aspects of the child protection system. This article examines an alternative approach of community-driven, bottom-up work that enables nonformal-formal collaboration and alignment, greater use of formal services, internally driven social change, and high levels of community ownership. The dominant approach of reliance on expert-driven Child Welfare Committees produces low levels of community ownership. Using an approach developed and tested in rural Sierra Leone, community-driven action, including collaboration and linkages with the formal system, promoted the use of formal services and achieved increased ownership, effectiveness, and sustainability of the system. The field needs less reliance on expert-driven approaches and much wider use of slower, community-driven, bottom-up approaches to child protection. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  20. Understanding terminological systems. II: Experience with conceptual and formal representation of structure.

    PubMed

    de Keizer, N F; Abu-Hanna, A

    2000-03-01

    This article describes the application of two popular conceptual and formal representation formalisms, as part of a framework for understanding terminological systems. A precise understanding of the structure of a terminological system is essential to assess existing terminological systems, to recognize patterns in various systems and to build new terminological systems. Our experience with the application of this framework to five well-known terminological systems is described.

  1. Towards programming languages for genetic engineering of living cells

    PubMed Central

    Pedersen, Michael; Phillips, Andrew

    2009-01-01

    Synthetic biology aims at producing novel biological systems to carry out some desired and well-defined functions. An ultimate dream is to design these systems at a high level of abstraction using engineering-based tools and programming languages, press a button, and have the design translated to DNA sequences that can be synthesized and put to work in living cells. We introduce such a programming language, which allows logical interactions between potentially undetermined proteins and genes to be expressed in a modular manner. Programs can be translated by a compiler into sequences of standard biological parts, a process that relies on logic programming and prototype databases that contain known biological parts and protein interactions. Programs can also be translated to reactions, allowing simulations to be carried out. While current limitations on available data prevent full use of the language in practical applications, the language can be used to develop formal models of synthetic systems, which are otherwise often presented by informal notations. The language can also serve as a concrete proposal on which future language designs can be discussed, and can help to guide the emerging standard of biological parts which so far has focused on biological, rather than logical, properties of parts. PMID:19369220
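
    The compilation step described, from a logical description of interactions to a sequence of standard parts, can be caricatured in a few lines. The parts database, gate syntax and part names below are invented stand-ins, not the actual language, compiler or Registry parts.

```python
# Toy "compiler" from a logical gene-interaction program to a parts list.
PARTS_DB = {
    "promoter_repressible": "repressible promoter (hypothetical part)",
    "rbs":                  "generic RBS (hypothetical part)",
    "cds":                  "coding sequence (hypothetical part)",
    "terminator":           "generic terminator (hypothetical part)",
}

def compile_gate(repressor: str, output: str):
    """A NOT gate: `repressor` represses transcription of `output`."""
    return [
        (PARTS_DB["promoter_repressible"], f"repressed by {repressor}"),
        (PARTS_DB["rbs"], ""),
        (PARTS_DB["cds"], f"encodes {output}"),
        (PARTS_DB["terminator"], ""),
    ]

program = [("NOT", "LacI", "GFP"), ("NOT", "TetR", "LacI")]   # a two-gate cascade
for gate, repressor, output in program:
    print(f"{gate}({repressor} -| {output}):")
    for part, note in compile_gate(repressor, output):
        print(f"  {part:45s} {note}")
```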

  2. Towards programming languages for genetic engineering of living cells.

    PubMed

    Pedersen, Michael; Phillips, Andrew

    2009-08-06

    Synthetic biology aims at producing novel biological systems to carry out some desired and well-defined functions. An ultimate dream is to design these systems at a high level of abstraction using engineering-based tools and programming languages, press a button, and have the design translated to DNA sequences that can be synthesized and put to work in living cells. We introduce such a programming language, which allows logical interactions between potentially undetermined proteins and genes to be expressed in a modular manner. Programs can be translated by a compiler into sequences of standard biological parts, a process that relies on logic programming and prototype databases that contain known biological parts and protein interactions. Programs can also be translated to reactions, allowing simulations to be carried out. While current limitations on available data prevent full use of the language in practical applications, the language can be used to develop formal models of synthetic systems, which are otherwise often presented by informal notations. The language can also serve as a concrete proposal on which future language designs can be discussed, and can help to guide the emerging standard of biological parts which so far has focused on biological, rather than logical, properties of parts.
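    As a rough illustration of the compilation idea (not the authors' language or compiler), the sketch below maps toy "activator expresses product" rules onto an ordered list of generic part types; the part names and the rule format are assumptions made only for this example.

    ```python
    # Toy sketch (not the authors' compiler): translate simple "input protein
    # activates output protein" rules into an ordered list of generic part types,
    # in the spirit of compiling logical gene-protein interactions to standard parts.
    # Part names and the rule format here are illustrative assumptions.
    from typing import List, Tuple

    Rule = Tuple[str, str]  # (activator, product): when `activator` is present, express `product`

    def compile_rule(rule: Rule) -> List[str]:
        """Map one logical rule to a minimal transcription-unit template."""
        activator, product = rule
        return [
            f"promoter(activated_by={activator})",  # promoter responsive to the activator
            "rbs(strength=medium)",                  # ribosome binding site placeholder
            f"cds({product})",                       # coding sequence for the product
            "terminator()",                          # transcription terminator
        ]

    def compile_program(rules: List[Rule]) -> List[str]:
        """Concatenate the part lists of all rules into one device description."""
        parts: List[str] = []
        for r in rules:
            parts.extend(compile_rule(r))
        return parts

    if __name__ == "__main__":
        program = [("TetR", "GFP"), ("LacI", "TetR")]
        for part in compile_program(program):
            print(part)
    ```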

  3. Skinner-Rusk unified formalism for higher-order systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-07-01

    The Lagrangian-Hamiltonian unified formalism of R. Skinner and R. Rusk was originally stated for autonomous dynamical systems in classical mechanics. It has been generalized for non-autonomous first-order mechanical systems, first-order and higher-order field theories, and higher-order autonomous systems. In this work we present a generalization of this formalism for higher-order non-autonomous mechanical systems.

  4. Reverse Engineering a Signaling Network Using Alternative Inputs

    PubMed Central

    Tanaka, Hiromasa; Yi, Tau-Mu

    2009-01-01

    One of the goals of systems biology is to reverse engineer in a comprehensive fashion the arrow diagrams of signal transduction systems. An important tool for ordering pathway components is genetic epistasis analysis, and here we present a strategy termed Alternative Inputs (AIs) to perform systematic epistasis analysis. An alternative input is defined as any genetic manipulation that can activate the signaling pathway instead of the natural input. We introduced the concept of an “AIs-Deletions matrix” that summarizes the outputs of all combinations of alternative inputs and deletions. We developed the theory and algorithms to construct a pairwise relationship graph from the AIs-Deletions matrix capturing both functional ordering (upstream, downstream) and logical relationships (AND, OR), and then interpreting these relationships into a standard arrow diagram. As a proof-of-principle, we applied this methodology to a subset of genes involved in yeast mating signaling. This experimental pilot study highlights the robustness of the approach and important technical challenges. In summary, this research formalizes and extends classical epistasis analysis from linear pathways to more complex networks, facilitating computational analysis and reconstruction of signaling arrow diagrams. PMID:19898612
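    The following toy sketch shows, under invented conventions, how an AIs-Deletions matrix of binary outputs can be read off into upstream/downstream relations; it is not the authors' algorithm, and the gene names and matrix values are placeholders.

    ```python
    # Toy illustration (an assumption-laden sketch, not the authors' algorithm):
    # from an AIs-Deletions matrix of 0/1 outputs, infer whether a deleted gene
    # acts upstream or downstream of each alternative input's entry point.
    # Convention assumed here: output == 1 means signaling still reaches the readout.

    # matrix[AI][deletion] -> observed output with that alternative input applied
    # in that single-gene deletion background.
    matrix = {
        "AI_geneA": {"geneA": 1, "geneB": 0, "geneC": 0},
        "AI_geneB": {"geneA": 1, "geneB": 1, "geneC": 0},
    }

    def ordering(matrix):
        """For each (deletion, AI) pair, classify the deleted gene's position."""
        relations = {}
        for ai, row in matrix.items():
            for gene, out in row.items():
                # If the AI bypasses the deletion, the gene lies upstream of the AI's entry point.
                relations[(gene, ai)] = "upstream_of" if out == 1 else "downstream_of_or_at"
        return relations

    for (gene, ai), rel in ordering(matrix).items():
        print(f"{gene} is {rel} the entry point of {ai}")
    ```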

  5. MatLab Script and Functional Programming

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects and not just the MatLab commands for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspect of MatLab language. Specific expectations are: a) Recognize MatLab commands, script and function. b) Create, and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.

  6. Fast Formal Analysis of Requirements via "Topoi Diagrams"

    NASA Technical Reports Server (NTRS)

    Menzies, Tim; Powell, John; Houle, Michael E.; Kelly, John C. (Technical Monitor)

    2001-01-01

    Early testing of requirements can decrease the cost of removing errors in software projects. However, unless done carefully, that testing process can significantly add to the cost of requirements analysis. We show here that requirements expressed as topoi diagrams can be built and tested cheaply using our SP2 algorithm: the formal temporal properties of a large class of topoi can be proven very quickly, in time nearly linear in the number of nodes and edges in the diagram. There are two limitations to our approach. Firstly, topoi diagrams cannot express certain complex concepts such as iteration and sub-routine calls. Hence, our approach is more useful for requirements engineering than for traditional model checking domains. Secondly, our approach is better for exploring the temporal occurrence of properties than the temporal ordering of properties. Within these restrictions, we can express a useful range of concepts currently seen in requirements engineering, and a wide range of interesting temporal properties.
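    As a rough illustration of what "time nearly linear in the number of nodes and edges" means in practice, the sketch below checks a simple reachability-style occurrence property over a diagram's graph using breadth-first search; it is not the SP2 algorithm, and the node names are invented.

    ```python
    # A minimal sketch (not the SP2 algorithm): check whether a property node is
    # reachable from a requirement node in a diagram, in O(nodes + edges) time,
    # to convey the flavor of near-linear temporal-occurrence checks.
    from collections import deque

    def reachable(edges, source, target):
        """Breadth-first search over a directed graph given as an adjacency dict."""
        seen = {source}
        queue = deque([source])
        while queue:
            node = queue.popleft()
            if node == target:
                return True
            for nxt in edges.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    diagram = {"req_sensor_ok": ["alarm_armed"], "alarm_armed": ["alarm_raised"]}
    print(reachable(diagram, "req_sensor_ok", "alarm_raised"))  # True
    ```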

  7. Lagrangian-Hamiltonian unified formalism for autonomous higher order dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2011-09-01

    The Lagrangian-Hamiltonian unified formalism of Skinner and Rusk was originally stated for autonomous dynamical systems in classical mechanics. It has been generalized for non-autonomous first-order mechanical systems, as well as for first-order and higher order field theories. However, a complete generalization to higher order mechanical systems is yet to be described. In this work, after reviewing the natural geometrical setting and the Lagrangian and Hamiltonian formalisms for higher order autonomous mechanical systems, we develop a complete generalization of the Lagrangian-Hamiltonian unified formalism for these kinds of systems, and we use it to analyze some physical models from this new point of view.
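    For orientation, the following is a compact reminder of the first-order autonomous Skinner-Rusk construction that this work generalizes; the notation is standard rather than taken from the paper, and the higher-order non-autonomous case developed by the authors is not shown.

    ```latex
    % First-order autonomous Skinner-Rusk setup (a sketch for orientation only;
    % the paper generalizes this to higher-order mechanical systems).
    \begin{aligned}
      & \mathcal{W} = TQ \times_Q T^{*}Q, \qquad \text{coordinates } (q^{i}, v^{i}, p_{i}),\\
      & H(q,v,p) = p_{i}v^{i} - L(q,v), \qquad \Omega = \mathrm{d}q^{i}\wedge \mathrm{d}p_{i},\\
      & \iota_{X}\Omega = \mathrm{d}H
      \;\Longrightarrow\;
      p_{i} = \frac{\partial L}{\partial v^{i}}, \qquad
      \frac{\mathrm{d}}{\mathrm{d}t}\frac{\partial L}{\partial v^{i}} - \frac{\partial L}{\partial q^{i}} = 0 .
    \end{aligned}
    ```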

  8. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has

  9. Artificial General Intelligence: Concept, State of the Art, and Future Prospects

    NASA Astrophysics Data System (ADS)

    Goertzel, Ben

    2014-12-01

    In recent years a broad community of researchers has emerged, focusing on the original ambitious goals of the AI field - the creation and study of software or hardware systems with general intelligence comparable to, and ultimately perhaps greater than, that of human beings. This paper surveys this diverse community and its progress. Approaches to defining the concept of Artificial General Intelligence (AGI) are reviewed including mathematical formalisms, engineering, and biology inspired perspectives. The spectrum of designs for AGI systems includes systems with symbolic, emergentist, hybrid and universalist characteristics. Metrics for general intelligence are evaluated, with a conclusion that, although metrics for assessing the achievement of human-level AGI may be relatively straightforward (e.g. the Turing Test, or a robot that can graduate from elementary school or university), metrics for assessing partial progress remain more controversial and problematic.

  10. CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems

    DTIC Science & Technology

    2018-04-19

    Final report for AOARD grant FA2386-16-1-4099, "CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems," Sandeep Shukla, Indian Institute of Technology Kanpur, India (report no. AFRL-AFOSR-JP-TR-2018-0035).

  11. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  12. Deductive Evaluation: Formal Code Analysis With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
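    As a small illustration of the kind of Floyd-Hoare loop invariant such an analysis synthesizes (not the PVS-based framework itself), the following summation loop carries its invariant and postcondition as runtime assertions:

    ```python
    # Illustrative only (not the PVS-based framework described above): a summation
    # loop annotated with the kind of Floyd-Hoare loop invariant such an analysis
    # would synthesize, checked here at runtime with assertions.

    def sum_upto(n: int) -> int:
        """Return 0 + 1 + ... + n, maintaining the invariant total == i*(i+1)//2."""
        assert n >= 0
        total, i = 0, 0
        while i < n:
            # Loop invariant: total == i*(i+1)//2 and 0 <= i <= n
            assert total == i * (i + 1) // 2 and 0 <= i <= n
            i += 1
            total += i
        # Postcondition follows from the invariant and the negated guard (i == n).
        assert total == n * (n + 1) // 2
        return total

    print(sum_upto(10))  # 55
    ```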

  13. Integrating ethics in design through the value-sensitive design approach.

    PubMed

    Cummings, Mary L

    2006-10-01

    The Accreditation Board for Engineering and Technology (ABET) has declared that to achieve accredited status, 'engineering programs must demonstrate that their graduates have an understanding of professional and ethical responsibility.' Many engineering professors struggle to integrate this required ethics instruction in technical classes and projects because of the lack of a formalized ethics-in-design approach. However, one methodology developed in human-computer interaction research, the Value-Sensitive Design approach, can serve as an engineering education tool which bridges the gap between design and ethics for many engineering disciplines. The three major components of Value-Sensitive Design (conceptual, technical, and empirical) are exemplified through a case study that focuses on the development of a command and control supervisory interface for a military cruise missile.

  14. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  15. On the Safety of Machine Learning: Cyber-Physical Systems, Decision Sciences, and Data Products.

    PubMed

    Varshney, Kush R; Alemzadeh, Homa

    2017-09-01

    Machine learning algorithms increasingly influence our decisions and interact with us in all parts of our daily lives. Therefore, just as we consider the safety of power plants, highways, and a variety of other engineered socio-technical systems, we must also take into account the safety of systems involving machine learning. Heretofore, the definition of safety has not been formalized in a machine learning context. In this article, we do so by defining machine learning safety in terms of risk, epistemic uncertainty, and the harm incurred by unwanted outcomes. We then use this definition to examine safety in a range of applications in cyber-physical systems, decision sciences, and data products. We find that the foundational principle of modern statistical machine learning, empirical risk minimization, is not always a sufficient objective. We discuss how four different categories of strategies for achieving safety in engineering, including inherently safe design, safety reserves, safe fail, and procedural safeguards, can be mapped to a machine learning context. We then discuss example techniques that can be adopted in each category, such as considering interpretability and causality of predictive models, objective functions beyond expected prediction accuracy, human involvement for labeling difficult or rare examples, and user experience design of software and open data.
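    For reference, the empirical risk minimization objective that the abstract identifies as an insufficient objective for safety can be stated in the standard way (the notation here is ours, not the article's):

    ```latex
    % Standard empirical risk minimization objective referred to in the abstract
    % (generic notation): choose the hypothesis minimizing the average training loss.
    \hat{f} \;=\; \arg\min_{f \in \mathcal{F}} \; \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(f(x_i),\, y_i\bigr)
    ```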

  16. Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse

    NASA Technical Reports Server (NTRS)

    Gannod, Gerald C.

    2002-01-01

    This document describes 3rd year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but that a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file based reverse engineering, service-based software reuse, and a source to XML generator.

  17. Bridging the Gulf between Formal Calculus and Physical Reasoning.

    ERIC Educational Resources Information Center

    Van Der Meer, A.

    1980-01-01

    Some ways to link calculus instruction with the mathematical models used in physics courses are presented. The activity of modelling is presented as a major tool in synchronizing physics and mathematics instruction in undergraduate engineering programs. (MP)

  18. Light UAV Support Ship (ASW) (LUSSA)

    DTIC Science & Technology

    2011-08-01

    TriSWACH model test data and model figures (table-of-contents listing omitted). ...The Center for Innovation in Ship Design (CISD) used the Northrop Grumman Bat UAV (formerly known as the Swift Engineering Killer Bee KB4) to model launch, recovery, and

  19. 48 CFR 1509.170-3 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... PLANNING CONTRACTOR QUALIFICATIONS Contractor Performance Evaluations 1509.170-3 Applicability. (a) This....604 provides detailed instructions for architect-engineer contractor performance evaluations. (b) The... simplified acquisition procedures do not require the creation or existence of a formal database for past...

  20. Field-antifield and BFV formalisms for quadratic systems with open gauge algebras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nirov, K.S.; Razumov, A.V.

    1992-09-20

    In this paper the Lagrangian field-antifield (BV) and Hamiltonian (BFV) BRST formalisms for the general quadratic systems with open gauge algebra are considered. The equivalence between the Lagrangian and Hamiltonian formalisms is proven.

  1. Working the College System: Six Strategies for Building a Personal Powerbase

    ERIC Educational Resources Information Center

    Simplicio, Joseph S. C.

    2008-01-01

    Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…

  2. 2002 Computing and Interdisciplinary Systems Office Review and Planning Meeting

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Lopez, Isaac; Veres, Joseph; Lavelle, Thomas; Sehra, Arun; Freeh, Josh; Hah, Chunill

    2003-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with NASA Glenn's Propulsion program, NASA Ames, industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This year's review meeting describes the current status of the NPSS and the Object Oriented Development Kit with specific emphasis on the progress made over the past year on air breathing propulsion applications for aeronautics and space transportation applications. Major accomplishments include the first 3-D simulation of the primary flow path of a large turbofan engine in less than 15 hours, and the formal release of the NPSS Version 1.5 that includes elements of rocket engine systems and a visual based syntax layer. NPSS and the Development Kit are managed by the Computing and Interdisciplinary Systems Office (CISO) at the NASA Glenn Research Center and financially supported in fiscal year 2002 by the Computing, Networking and Information Systems (CNIS) project managed at NASA Ames, the Glenn Aerospace Propulsion and Power Program and the Advanced Space Transportation Program.

  3. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  4. A Comparative Study of Subject Knowledge of B.Ed Graduates of Formal and Non-Formal Teacher Education Systems

    ERIC Educational Resources Information Center

    Saif, Perveen; Reba, Amjad; ud Din, Jalal

    2017-01-01

    This study was designed to compare the subject knowledge of B.Ed graduates of formal and non-formal teacher education systems. The population of the study included all teachers from Girls High and Higher Secondary Schools both from private and public sectors from the district of Peshawar. Out of the total population, twenty schools were randomly…

  5. From Goal-Oriented Requirements to Event-B Specifications

    NASA Technical Reports Server (NTRS)

    Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe

    2009-01-01

    In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/ hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specification of services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to requirements.

  6. Auxiliary Propulsion Activities in Support of NASA's Exploration Initiative

    NASA Technical Reports Server (NTRS)

    Best, Philip J.; Unger, Ronald J.; Waits, David A.

    2005-01-01

    The Space Launch Initiative (SLI) procurement mechanism NRA8-30 initiated the Auxiliary Propulsion System/Main Propulsion System (APS/MPS) Project in 2001 to address technology gaps and development risks for non-toxic and cryogenic propellants for auxiliary propulsion applications. These applications include reaction control and orbital maneuvering engines, and storage, pressure control, and transfer technologies associated with on-orbit maintenance of cryogens. The project has successfully evolved over several years in response to changing requirements for re-usable launch vehicle technologies, general launch technology improvements, and, most recently, exploration technologies. Lessons learned based on actual hardware performance have also played a part in the project evolution to focus now on those technologies deemed specifically relevant to the Exploration Initiative. Formal relevance reviews held in the spring of 2004 resulted in authority for continuation of the Auxiliary Propulsion Project through Fiscal Year 2005 (FY05), and provided for a direct reporting path to the Exploration Systems Mission Directorate. The tasks determined to be relevant under the project were: continuation of the development, fabrication, and delivery of three 870 lbf thrust prototype LOX/ethanol reaction control engines; the fabrication, assembly, engine integration and testing of the Auxiliary Propulsion Test Bed at White Sands Test Facility; and the completion of FY04 cryogenic fluid management component and subsystem development tasks (mass gauging, pressure control, and liquid acquisition elements). This paper presents an overview of those tasks, their scope, expectations, and results to-date as carried forward into the Exploration Initiative.

  7. Subsumption principles underlying medical concept systems and their formal reconstruction.

    PubMed Central

    Bernauer, J.

    1994-01-01

    Conventional medical concept systems represent generic concept relations by hierarchical coding principles. Often, these coding principles constrain the concept system and reduce the potential for automatic derivation of subsumption. Formal reconstruction of medical concept systems is an approach based on the conceptual representation of meanings that allows for the application of formal criteria for subsumption. Those criteria must reflect intuitive principles of subordination that underlie conventional medical concept systems. In particular, these are: the subordinate concept results (1) from adding a specializing criterion to the superordinate concept, (2) from refining the primary category, or a criterion of the superordinate concept, by a concept that is less general, (3) from adding a partitive criterion to a criterion of the superordinate, (4) from refining a criterion by a concept that is less comprehensive, and finally (5) from coordinating the superordinate concept, or one of its criteria. This paper introduces a formalism called BERNWARD that aims at the formal reconstruction of medical concept systems according to these intuitive principles. The automatic derivation of hierarchical relations is supported primarily by explicit generic and explicit partitive hierarchies of concepts and, secondly, by two formal criteria based on the structure of concept descriptions and explicit hierarchical relations between their elements, namely formal subsumption and part-sensitive subsumption. Formal subsumption takes only generic relations into account; part-sensitive subsumption additionally regards partitive relations between criteria. This approach seems to be flexible enough to cope with unforeseeable effects of partitive criteria on subsumption. PMID:7949907
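    A minimal sketch, under our own simplified encoding rather than the BERNWARD formalism, of how a formal-subsumption check can be derived from an explicit generic hierarchy and criteria sets; the concept names are invented for the example.

    ```python
    # Toy sketch of a formal-subsumption check (our simplification, not BERNWARD):
    # a concept subsumes another if its category is equal or more general and each
    # of its criteria is matched by an equal or more specific criterion of the other.

    # Explicit generic hierarchy: child -> parent
    is_a = {"bacterial_pneumonia": "pneumonia", "pneumonia": "lung_disease",
            "left_lung": "lung"}

    def more_specific_or_equal(a: str, b: str) -> bool:
        """True if a == b or a is (transitively) a kind of b."""
        while a != b:
            if a not in is_a:
                return False
            a = is_a[a]
        return True

    def subsumes(general, specific) -> bool:
        g_cat, g_criteria = general
        s_cat, s_criteria = specific
        return (more_specific_or_equal(s_cat, g_cat) and
                all(any(more_specific_or_equal(sc, gc) for sc in s_criteria)
                    for gc in g_criteria))

    lung_disease = ("lung_disease", frozenset())
    left_bact_pneumonia = ("bacterial_pneumonia", frozenset({"left_lung"}))
    print(subsumes(lung_disease, left_bact_pneumonia))  # True
    ```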

  8. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  9. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly large number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  10. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS of the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBS's as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.

  11. Control Architecture for Robotic Agent Command and Sensing

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand; Estlin, Tara; Gaines, Daniel

    2008-01-01

    Control Architecture for Robotic Agent Command and Sensing (CARACaS) is a recent product of a continuing effort to develop architectures for controlling either a single autonomous robotic vehicle or multiple cooperating but otherwise autonomous robotic vehicles. CARACaS is potentially applicable to diverse robotic systems that could include aircraft, spacecraft, ground vehicles, surface water vessels, and/or underwater vessels. CARACaS includes an integral combination of three coupled agents: a dynamic planning engine, a behavior engine, and a perception engine. The perception and dynamic planning engines are also coupled with a memory in the form of a world model. CARACaS is intended to satisfy the need for two major capabilities essential for proper functioning of an autonomous robotic system: a capability for deterministic reaction to unanticipated occurrences and a capability for re-planning in the face of changing goals, conditions, or resources. The behavior engine incorporates the multi-agent control architecture, called CAMPOUT, described in An Architecture for Controlling Multiple Robots (NPO-30345), NASA Tech Briefs, Vol. 28, No. 11 (November 2004), page 65. CAMPOUT is used to develop behavior-composition and -coordination mechanisms. Real-time process algebra operators are used to compose a behavior network for any given mission scenario. These operators afford a capability for producing a formally correct kernel of behaviors that guarantee predictable performance. By use of a method based on multi-objective decision theory (MODT), recommendations from multiple behaviors are combined to form a set of control actions that represents their consensus. In this approach, all behaviors contribute simultaneously to the control of the robotic system in a cooperative rather than a competitive manner. This approach guarantees a solution that is good enough with respect to resolution of complex, possibly conflicting goals within the constraints of the mission to be accomplished by the vehicle(s).

  12. 23 CFR 172.5 - Methods of procurement.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and ranked by the contracting agency using one of the following procedures: (1) Competitive negotiation. Contracting agencies shall use competitive negotiation for the procurement of engineering and... and selection phase. Alternatively, a formal procedure adopted by State Statute enacted into law prior...

  13. Photon scattering from a system of multilevel quantum emitters. I. Formalism

    NASA Astrophysics Data System (ADS)

    Das, Sumanta; Elfving, Vincent E.; Reiter, Florentin; Sørensen, Anders S.

    2018-04-01

    We introduce a formalism to solve the problem of photon scattering from a system of multilevel quantum emitters. Our approach provides a direct solution of the scattering dynamics. As such the formalism gives the scattered fields' amplitudes in the limit of a weak incident intensity. Our formalism is equipped to treat both multiemitter and multilevel emitter systems, and is applicable to a plethora of photon-scattering problems, including conditional state preparation by photodetection. In this paper, we develop the general formalism for an arbitrary geometry. In the following paper (part II) S. Das et al. [Phys. Rev. A 97, 043838 (2018), 10.1103/PhysRevA.97.043838], we reduce the general photon-scattering formalism to a form that is applicable to one-dimensional waveguides and show its applicability by considering explicit examples with various emitter configurations.

  14. Activities of the Center for Space Construction

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Center for Space Construction (CSC) at the University of Colorado at Boulder is one of eight University Space Engineering Research Centers established by NASA in 1988. The mission of the center is to conduct research into space technology and to directly contribute to space engineering education. The center reports to the Department of Aerospace Engineering Sciences and resides in the College of Engineering and Applied Science. The college has a long and successful track record of cultivating multi-disciplinary research and education programs. The Center for Space Construction is prominent evidence of this record. At the inception of CSC, the center was primarily founded on the need for research on in-space construction of large space systems like space stations and interplanetary space vehicles. The scope of CSC's research has now evolved to include the design and construction of all spacecraft, large and small. Within this broadened scope, our research projects seek to impact the underlying technological basis for such spacecraft as remote sensing satellites, communication satellites, and other special purpose spacecraft, as well as the technological basis for large space platforms. The center's research focuses on three areas: spacecraft structures, spacecraft operations and control, and regolith and surface systems. In the area of spacecraft structures, our current emphasis is on concepts and modeling of deployable structures, analysis of inflatable structures, structural damage detection algorithms, and composite materials for lightweight structures. In the area of spacecraft operations and control, we are continuing our previous efforts in process control of in-orbit structural assembly. In addition, we have begun two new efforts in formal approach to spacecraft flight software systems design and adaptive attitude control systems. In the area of regolith and surface systems, we are continuing the work of characterizing the physical properties of lunar regolith, and we are at work on a project on path planning for planetary surface rovers.

  15. Expedition 32 Preflight

    NASA Image and Video Library

    2012-07-15

    Expedition 32 Flight Engineer Sunita Williams, right, Soyuz Commander Yuri Malenchenko and JAXA Flight Engineer Akihiko Hoshide, left, receive a formal go for launch from Vitaly Alexandrovich Lopota, President of Energia, left, and Vladimir Popovkin, Director of Roscosmos prior to their launch onboard the Soyuz TMA-05M on Sunday, July 15, 2012 at the Baikonur Cosmodrome in Kazakhstan. The Soyuz spacecraft with Malenchenko, Williams and Hoshide onboard launched at 8:40 a.m. later that morning Kazakhstan time. Photo Credit: (NASA/Victor Zelentsov)

  16. Meta-Analysis of Human Factors Engineering Studies Comparing Individual Differences, Practice Effects and Equipment Design Variations.

    DTIC Science & Technology

    1985-02-21

    Report documentation page (approved for public release; distribution unlimited). Title: Meta-Analysis of Human Factors Engineering Studies Comparing Individual Differences, Practice Effects and Equipment Design Variations. Outline includes: Background; Opportunity; Significance; History; Phase I Final Report (Literature Review, Formal Analysis, Results, Implications for Phase II).

  17. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    NASA Technical Reports Server (NTRS)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  18. Fuzzy-logic-based network for complex systems risk assessment: application to ship performance analysis.

    PubMed

    Abou, Seraphin C

    2012-03-01

    In this paper, a new interpretation of intuitionistic fuzzy sets in the advanced framework of the Dempster-Shafer theory of evidence is extended to monitor safety-critical systems' performance. Not only is the proposed approach more effective, but it also takes into account the fuzzy rules that deal with imperfect knowledge/information and, therefore, is different from the classical Takagi-Sugeno fuzzy system, which assumes that the rule (the knowledge) is perfect. We provide an analytical solution to the practical and important problem of the conceptual probabilistic approach for formal ship safety assessment using the fuzzy set theory that involves uncertainties associated with the reliability input data. Thus, the overall safety of the ship engine is investigated as an object of risk analysis using the fuzzy mapping structure, which considers uncertainty and partial truth in the input-output mapping. The proposed method integrates direct evidence of the frame of discernment and is demonstrated through references to examples where fuzzy set models are informative. These simple applications illustrate how to assess the conflict of sensor information fusion for a sufficient cooling power system of vessels under extreme operation conditions. It was found that propulsion engine safety systems are not only a function of many environmental and operation profiles but are also dynamic and complex. Copyright © 2011 Elsevier Ltd. All rights reserved.
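    For readers unfamiliar with the Dempster-Shafer machinery the abstract builds on, the sketch below shows the generic textbook rule of combination for two mass functions; it is our own illustration, not the paper's ship-safety model, and the hypotheses and weights are invented.

    ```python
    # Minimal illustration of Dempster's rule of combination (generic textbook form,
    # not the paper's ship-safety model): combine two basic probability assignments
    # over subsets of a frame of discernment, renormalizing away the conflict mass.
    from itertools import product

    def combine(m1, m2):
        """Dempster's rule: masses are dicts mapping frozenset hypotheses to weights."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        if conflict >= 1.0:
            raise ValueError("total conflict: sources cannot be combined")
        return {h: w / (1.0 - conflict) for h, w in combined.items()}

    SAFE, FAIL = frozenset({"safe"}), frozenset({"fail"})
    EITHER = SAFE | FAIL
    sensor1 = {SAFE: 0.6, EITHER: 0.4}   # partial belief that cooling is adequate
    sensor2 = {FAIL: 0.5, EITHER: 0.5}   # conflicting evidence from a second source
    print(combine(sensor1, sensor2))
    ```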

  19. Understanding Rape Survivors' Decisions Not to Seek Help from Formal Social Systems

    ERIC Educational Resources Information Center

    Patterson, Debra; Greeson, Megan; Campbell, Rebecca

    2009-01-01

    Few rape survivors seek help from formal social systems after their assault. The purpose of this study was to examine factors that prevent survivors from seeking help from the legal, medical, and mental health systems and rape crisis centers. In this study, 29 female rape survivors who did not seek any postassault formal help were interviewed…

  20. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The system Bachelor enables a constructivist approach to teaching and may therefore enhance the learning process in essential disciplines of theoretical informatics. It not only provides a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for manipulating logic formulae.

  1. A Formalization of HIPAA for a Medical Messaging System

    NASA Astrophysics Data System (ADS)

    Lam, Peifung E.; Mitchell, John C.; Sundaram, Sharada

    The complexity of regulations in healthcare, financial services, and other industries makes it difficult for enterprises to design and deploy effective compliance systems. We believe that in some applications, it may be practical to support compliance by using formalized portions of applicable laws to regulate business processes that use information systems. In order to explore this possibility, we use a stratified fragment of Prolog with limited use of negation to formalize a portion of the US Health Insurance Portability and Accountability Act (HIPAA). As part of our study, we also explore the deployment of our formalization in a prototype hospital Web portal messaging system.
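    The following toy sketch (written in Python rather than the stratified Prolog fragment used by the authors, with invented predicates and facts) conveys the flavor of such a formalized rule, including negation as failure for a patient opt-out:

    ```python
    # Toy rendering (not the authors' Prolog encoding) of the flavor of a formalized
    # disclosure rule: a message from a covered entity is permitted if it is for
    # treatment purposes and the patient has not opted out (negation as failure).

    facts = {
        ("covered_entity", "dr_smith"),
        ("purpose", ("msg42", "treatment")),
        # ("opt_out", "patient_jones"),   # uncomment to forbid the disclosure
    }

    def holds(pred, arg):
        return (pred, arg) in facts

    def permitted(message, sender, patient):
        return (holds("covered_entity", sender)
                and holds("purpose", (message, "treatment"))
                and not holds("opt_out", patient))        # negation as failure

    print(permitted("msg42", "dr_smith", "patient_jones"))  # True
    ```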

  2. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  3. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Spacestation Freedom

    NASA Technical Reports Server (NTRS)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Spacestation Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.

  4. Using the EC decision on case definitions for communicable diseases as a terminology source--lessons learned.

    PubMed

    Balkanyi, Laszlo; Heja, Gergely; Nagy, Attila

    2014-01-01

    Extracting scientifically accurate terminology from an EU public health regulation is part of the knowledge engineering work at the European Centre for Disease Prevention and Control (ECDC). ECDC operates information systems at the crossroads of many areas - posing a challenge for transparency and consistency. Semantic interoperability is based on the Terminology Server (TS). TS value sets (structured vocabularies) describe shared domains such as "diseases", "organisms", "public health terms", "geo-entities", "organizations", "administrative terms" and others. We extracted information from the relevant EC Implementing Decision on case definitions for reporting communicable diseases, listing 53 notifiable infectious diseases, containing clinical, diagnostic, laboratory and epidemiological criteria. We performed a consistency check and a simplification/abstraction; we represented laboratory criteria as triplets ('y' procedural result / of 'x' organism or substance / on 'z' specimen) and identified negations. The resulting new case definition value set represents the various formalized criteria; meanwhile, the existing disease value set has been extended and new signs and symptoms were added. New organisms enriched the organism value set. Other new categories have been added to the public health value set, such as transmission modes, substances, specimens and procedures. We identified problem areas, such as (a) some classification error(s); (b) inconsistent granularity of conditions; (c) seemingly nonsense criteria, medical trivialities; (d) possible logical errors, (e) seemingly factual errors that might be phrasing errors. We think our hypothesis regarding room for possible improvements is valid: there are some open issues and a further improved legal text might lead to more precise epidemiologic data collection. It has to be noted that formal representation for automatic classification of cases was out of scope; such a task would require other formalisms, such as those used by rule-based decision support systems.
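    A minimal data-structure sketch of the triplet representation described above; the field names and example values are our own assumptions for illustration, not ECDC content.

    ```python
    # Minimal data-structure sketch of the laboratory-criterion triplet described
    # above; field names and the example values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LabCriterion:
        result: str            # 'y' procedural result, e.g. "isolation", "nucleic acid detection"
        agent: str             # 'x' organism or substance
        specimen: str          # 'z' specimen the procedure is performed on
        negated: bool = False  # captures explicitly negated criteria

    criterion = LabCriterion(result="isolation", agent="Neisseria meningitidis",
                             specimen="cerebrospinal fluid")
    print(criterion)
    ```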

  5. The role of thermodynamics in biochemical engineering

    NASA Astrophysics Data System (ADS)

    von Stockar, Urs

    2013-09-01

    This article is an adapted version of the introductory chapter of a book whose publication is imminent. It bears the title "Biothermodynamics - The role of thermodynamics in biochemical engineering." The aim of the paper is to give a very short overview of the state of biothermodynamics in an engineering context as reflected in this book. Seen from this perspective, biothermodynamics may be subdivided according to the scale used to formalize the description of the biological system into three large areas: (i) biomolecular thermodynamics (most fundamental scale), (ii) thermodynamics of metabolism (intermediary scale), and (iii) whole-cell thermodynamics ("black-box" description of living entities). In each of these subareas, the main available theoretical approaches and the current and the potential applications are discussed. Biomolecular thermodynamics (i) is especially well developed and is obviously highly pertinent for the development of downstream processing. Its use ought to be encouraged as much as possible. The subarea of thermodynamics of live cells (iii), although scarcely applied in practice, is also expected to enhance bioprocess research and development, particularly in predicting culture performances, for understanding the driving forces for cellular growth, and in developing, monitoring, and controlling cellular cultures. Finally, there is no question that thermodynamic analysis of cellular metabolism (ii) is a promising tool for systems biology and for many other applications, but quite a large research effort is still needed before it may be put to practical use.

  6. Formal Methods Tool Qualification

    NASA Technical Reports Server (NTRS)

    Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain

    2017-01-01

    Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.

  7. Educating Tomorrow's Aerospace Engineers by Developing and Launching Liquid-Propelled Rockets

    NASA Astrophysics Data System (ADS)

    Besnard, Eric; Garvey, John; Holleman, Tom; Mueller, Tom

    2002-01-01

    This paper describes the CALVEIN program conducted at California State University, Long Beach (CSULB), in which engineering students develop and launch liquid-propelled rockets. The program is articulated around two main activities, each with specific objectives. The first component of CALVEIN is a systems integration laboratory where students develop/improve vehicle subsystems and integrate them into a vehicle (Prospector-2 - P-2 - for the 2001-02 academic year - AY). This component has three main objectives: (1) Develop hands-on skills for incoming students and expose them to aerospace hardware; (2) allow for upper division students who have been involved in the program to mentor incoming students and manage small teams; and (3) provide students from various disciplines within the College of Engineering - and other universities - with the chance to develop/improve subsystems on the vehicle. Among recent student projects conducted as part of this component are: a new 1000 lbf thrust engine using pintle injector technology, which was successfully tested on Dec. 1, 2001 and flown on Prospector-2 in Feb. 2002 (developed by CSULB Mechanical and Aerospace Engineering students); a digital flight telemetry package (developed by CSULB Electrical Engineering students); a new recovery system where a mechanical system replaces pyrotechnics for parachute release (developed by CSULB Mechanical and Aerospace Engineering students); and a 1-ft payload bay to accommodate experimental payloads (e.g. "CANSATS" developed by Stanford University students). The second component of CALVEIN is a formal Aerospace System Design curriculum. In the first semester, from top-level system requirements, the students perform functional analysis, define the various subsystems and derive their requirements. These are presented at the Systems Functional and Requirement Reviews (SFR & SRR). The methods used for validation and verification are determined. Specifications and Interface Control Documents (ICD) are generated by the student team(s). Trade studies are identified and conducted, leading to a Preliminary Design Review (PDR) at the end of the first semester. A detailed design follows, culminating in a Critical Design Review (CDR), etc. A general process suitable for a two-semester course sequence will be outlined. The project is conducted in an Integrated Product Team (IPT) environment, which includes a project manager, a systems engineer, and the various disciplines needed for the project (propulsion, aerodynamics, structures and materials, mass, CAD, thermal, fluids, etc.). Each student works with a faculty member or industry advisor who is a specialist in his/her area. This design curriculum enhances the education of the graduates and provides future employers with engineers cognizant of and experienced in the application of Systems Engineering to a full-scale project over the entire product development cycle. For AY 2001-02, the curriculum is being applied to the development of a gimbaled aerospike engine and its integration into P-3, scheduled to fly in May 2002. The paper ends with a summary of "lessons learned" from this experience. Budget issues are also addressed to demonstrate the ability to replicate such projects at other institutions with minimal costs, provided that advantage can be taken of possible synergies between existing programs, in-house resources, and cooperation with other institutions or organizations.

  8. Statistical Teleodynamics: Toward a Theory of Emergence.

    PubMed

    Venkatasubramanian, Venkat

    2017-10-24

    The central scientific challenge of the 21st century is developing a mathematical theory of emergence that can explain and predict phenomena such as consciousness and self-awareness. The most successful research program of the 20th century, reductionism, which goes from the whole to parts, seems unable to address this challenge. This is because addressing this challenge inherently requires an opposite approach, going from parts to the whole. In addition, reductionism, by the very nature of its inquiry, typically does not concern itself with teleology or purposeful behavior. Modeling emergence, in contrast, requires the addressing of teleology. Together, these two requirements present a formidable challenge in developing a successful mathematical theory of emergence. In this article, I describe a new theory of emergence, called statistical teleodynamics, that addresses certain aspects of the general problem. Statistical teleodynamics is a mathematical framework that unifies three seemingly disparate domains (purpose-free entities in statistical mechanics, human-engineered teleological systems in systems engineering, and nature-evolved teleological systems in biology and sociology) within the same conceptual formalism. This theory rests on several key conceptual insights, the most important one being the recognition that entropy mathematically models the concept of fairness in economics and philosophy and, equivalently, the concept of robustness in systems engineering. These insights help prove that the fairest inequality of income is a log-normal distribution, which will emerge naturally at equilibrium in an ideal free market society. Similarly, the theory predicts the emergence of the three classes of network organization (exponential, scale-free, and Poisson) seen widely in a variety of domains. Statistical teleodynamics is the natural generalization of statistical thermodynamics, the most successful parts-to-whole systems theory to date, but this generalization is only a modest step toward a more comprehensive mathematical theory of emergence.

  9. Cabin noise and weight reduction program for the Gulfstream G200

    NASA Astrophysics Data System (ADS)

    Barton, C. Kearney

    2002-11-01

    This paper describes the approach and logic involved in a cabin noise and weight reduction program for an existing aircraft that was already in service with a pre-existing insulation package. The aircraft, a Gulfstream G200, was formerly an IAI Galaxy, and the program was purchased from IAI in 2001. The approach was to investigate every aspect of the aircraft that could be a factor for cabin noise. This included such items as engine mounting and balancing criteria, the hydraulic system, the pressurization and air-conditioning system, the outflow valve, the interior shell and mounting system, antennae and other hull protuberances, as well as the insulation package. Each of these items was evaluated as potential candidates for noise and weight control modifications. Although the program is still ongoing, the results to date include a 175-lb weight savings and a 5-dB reduction in the cabin average Speech Interference Level (SIL).

  10. [Construction of automatic elucidation platform for mechanism of traditional Chinese medicine].

    PubMed

    Zhang, Bai-xia; Luo, Si-jun; Yan, Jing; Gu, Hao; Luo, Ji; Zhang, Yan-ling; Tao, Ou; Wang, Yun

    2015-10-01

    To address two problems in the field of traditional Chinese medicine (TCM) mechanism elucidation, namely the lack of detailed information on biological processes and the low efficiency of constructing network models, we constructed an auxiliary elucidation system for TCM mechanisms that realizes the automatic establishment of biological network models. This study used Entity Grammar Systems (EGS) as the theoretical framework, integrated data on formulae, herbs, chemical components, targets of components, biological reactions, signaling pathways, and disease-related proteins, established the formal models, wrote the reasoning engine, and thereby constructed the auxiliary elucidation system for TCM mechanism elucidation. The platform provides an automatic modeling method for biological network models of TCM mechanisms. It should benefit in-depth research on the TCM theory of natures and combination and provides scientific references for the R&D of TCM.
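
    The abstract gives no implementation detail, so the following is only a schematic sketch (all names and relations hypothetical) of the kind of layer-wise forward chaining an automatic network builder might perform: starting from a formula, herb, component, target, and pathway relations are followed to assemble the edges of a biological network model.

      # Hypothetical sketch of automatic network-model construction by layer-wise forward chaining.
      relations = {
          ("FormulaA", "contains_herb"): ["HerbX"],
          ("HerbX", "contains_component"): ["CompoundY"],
          ("CompoundY", "binds_target"): ["ProteinZ"],
          ("ProteinZ", "acts_in_pathway"): ["PathwayP"],
      }
      layer_order = ["contains_herb", "contains_component", "binds_target", "acts_in_pathway"]

      def build_network(start):
          edges, frontier = [], [start]
          for relation in layer_order:                  # expand one relation layer at a time
              next_frontier = []
              for node in frontier:
                  for child in relations.get((node, relation), []):
                      edges.append((node, relation, child))
                      next_frontier.append(child)
              frontier = next_frontier
          return edges

      print(build_network("FormulaA"))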

  11. Group Work

    ERIC Educational Resources Information Center

    Wilson, Kristy J.; Brickman, Peggy; Brame, Cynthia J.

    2018-01-01

    Science, technology, engineering, and mathematics faculty are increasingly incorporating both formal and informal group work in their courses. Implementing group work can be improved by an understanding of the extensive body of educational research studies on this topic. This essay describes an online, evidence-based teaching guide published by…

  12. Some practical approaches to a course on paraconsistent logic for engineers

    NASA Astrophysics Data System (ADS)

    Lambert-Torres, Germano; de Moraes, Carlos Henrique Valerio; Coutinho, Maurilio Pereira; Martins, Helga Gonzaga; Borges da Silva, Luiz Eduardo

    2017-11-01

    This paper describes a non-classical logic course intended primarily for graduate students in electrical engineering and energy engineering. The content of this course is based on the view that it is not enough for a student to indefinitely accumulate knowledge; it is necessary to explore every occasion to update, deepen, and enrich that knowledge, adapting it to a complex world. Therefore, this course is not tied to theoretical formalities and tries at each moment to provide a practical view of non-classical logic. In the real world, inconsistencies are important and cannot be ignored, because contradictory information brings relevant facts, sometimes modifying the entire result of the analysis. As a consequence, non-classical logics, such as annotated paraconsistent logic (APL), are well suited to the treatment of complex real-world situations. In APL, the concepts of unknown, partial, ambiguous, and inconsistent knowledge are handled without trivialising the system under analysis. This course presents theoretical and applied aspects of APL, which are successfully used in decision-making structures. The course is divided into modules: Basic, 2vAPL, 3vAPL, 4vAPL, and Final Project.
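
    As a concrete taste of the 2vAPL module, one widely used formulation (assumed here; the course material may differ in detail) annotates each proposition with a favourable-evidence degree mu and an unfavourable-evidence degree lam, both in [0, 1], and derives a certainty degree and a contradiction degree from them:

      # Illustrative annotated-paraconsistent-logic analysis (assumed 2vAPL formulation).
      def analyse(mu, lam):
          certainty = mu - lam             # +1 = true, -1 = false
          contradiction = mu + lam - 1.0   # +1 = inconsistent evidence, -1 = no evidence
          return certainty, contradiction

      print(analyse(1.0, 0.0))   # (1.0, 0.0)   -> well-supported truth
      print(analyse(1.0, 1.0))   # (0.0, 1.0)   -> contradictory, but not trivialising, evidence
      print(analyse(0.0, 0.0))   # (0.0, -1.0)  -> unknown / indeterminate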

  13. Simulation of UV atomic radiation for application in exhaust plume spectrometry

    NASA Astrophysics Data System (ADS)

    Wallace, T. L.; Powers, W. T.; Cooper, A. E.

    1993-06-01

    Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
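
    For orientation, a textbook form of the spectral radiance of a homogeneous, self-absorbing line (shown as representative; the OPAD model's exact formulation may differ) is

      I(\nu) = B_\nu(T)\,\bigl(1 - e^{-\tau(\nu)}\bigr),
      \qquad
      \tau(\nu) = N\,\sigma_0\,\phi(\nu)\,L,

    where B_nu(T) is the Planck function, N the free-atom number density, L the optical path length, sigma_0 the integrated line cross-section, and phi(nu) the normalized, broadening-dependent line-shape profile. It is through phi(nu) that the broadening parameters noted above enter, and inverting measured emission/absorption spectra for N is what allows free-atom densities to be inferred.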

  14. Aircraft applications of fault detection and isolation techniques

    NASA Astrophysics Data System (ADS)

    Marcos Esteban, Andres

    In this thesis the problems of fault detection & isolation and fault tolerant systems are studied from the perspective of LTI frequency-domain, model-based techniques. Emphasis is placed on the applicability of these LTI techniques to nonlinear models, especially to aerospace systems. Two applications of Hinfinity LTI fault diagnosis are given using an open-loop (no controller) design approach: one for the longitudinal motion of a Boeing 747-100/200 aircraft, the other for a turbofan jet engine. An algorithm formalizing a robust identification approach based on model validation ideas is also given and applied to the previous jet engine. A general linear fractional transformation formulation is given in terms of the Youla and Dual Youla parameterizations for the integrated (control and diagnosis filter) approach. This formulation provides better insight into the trade-off between the control and the diagnosis objectives. It also provides the basic groundwork towards the development of nested schemes for the integrated approach. These nested structures allow iterative improvements on the control/filter Youla parameters based on successive identification of the system uncertainty (as given by the Dual Youla parameter). The thesis concludes with an application of Hinfinity LTI techniques to the integrated design for the longitudinal motion of the previous Boeing 747-100/200 model.

  15. Propulsion and Energetics Panel Working Group 15 on the Uniform Engine Test Programme

    DTIC Science & Technology

    1990-02-01

    The Uniform Engine Test Programme followed an earlier test of uniform aerodynamic models in wind tunnels conducted under the auspices of the Fluid Dynamics Panel. A formal proposal was presented for this major new effort, and members of the engine test community throughout AGARD were selected to serve on Working Group 15 along with Propulsion and Energetics Panel (PEP) members.

  16. Expedition 31 Crew Prepares For Launch

    NASA Image and Video Library

    2012-05-15

    Expedition 31 Flight Engineer Joe Acaba, left, Soyuz Commander Gennady Padalka, and Flight Engineer Sergei Revin, right, receive a formal go for launch from Vitaly Alexandrovich Lopota, President of Energia, left, and Vladimir Popovkin, Director of Roscosmos, prior to their launch onboard the Soyuz TMA-04M on Tuesday, May 15, 2012, at the Baikonur Cosmodrome in Kazakhstan. The Soyuz spacecraft with Padalka, Revin, and Acaba onboard launched at 9:01 a.m. Kazakhstan time on Tuesday, May 15. Photo Credit: (NASA/GCTC/Andrey Shelepin)

  17. Statistical analysis of Turbine Engine Diagnostic (TED) field test data

    NASA Astrophysics Data System (ADS)

    Taylor, Malcolm S.; Monyak, John T.

    1994-11-01

    During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.

  18. Epistemology, software engineering and formal methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1994-01-01

    One of the most basic questions anyone can ask is, 'How do I know that what I think I know is true?' The study of this question is called epistemology. Traditionally, epistemology has been considered to be of legitimate interest only to philosophers, theologians, and three year old children who respond to every statement by asking, 'Why?' Software engineers need to be interested in the subject, however, because a lack of sufficient understanding of epistemology contributes to many of the current problems in the field.

  19. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
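
    As a toy illustration of the last two items, evaluating a response cumulative distribution function and a failure probability from assumed uncertainties in primitive variables can be sketched with plain Monte Carlo sampling. The actual methodology uses far more efficient probabilistic structural analysis methods, and every number and model below is hypothetical.

      import random

      def blade_stress(load, area):
          return load / area                                   # toy structural response model

      random.seed(0)
      allowable = 480.0                                        # hypothetical allowable stress
      samples = [
          blade_stress(random.gauss(1000.0, 100.0),            # uncertain load
                       random.gauss(2.5, 0.1))                 # uncertain section area
          for _ in range(100_000)
      ]
      cdf_at_allowable = sum(s <= allowable for s in samples) / len(samples)
      print("P(stress <= allowable) =", round(cdf_at_allowable, 3))
      print("failure probability    =", round(1.0 - cdf_at_allowable, 3))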

  20. Update on specified European R and D efforts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-10-01

    Information was collected for DOE on various European research programs of interest: Shell-Koppers coal gasification demonstration plant, fluidized-bed combustion pilot plant, a boiler superheat system, energy conservation on ships, waste heat utilization from large diesel engines and nuclear power plants and uranium enrichment plants, coal-water slurries with additive (CARBOGEL), electrostatic precipitators, radial inflow turbines, carbonization, heat pumps, heat exchangers, gas turbines, and research on heat-resisting alloys and corrosion protection of these alloys. A number of organizations expressed a desire for creation of a formal interchange with DOE on specific subjects of mutual interest (one organization is unhappy about furnishing information to DOE). (LTN)

  1. Proportional Reasoning and the Visually Impaired

    ERIC Educational Resources Information Center

    Hilton, Geoff; Hilton, Annette; Dole, Shelley L.; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is an important aspect of formal thinking that is acquired during the developmental years that approximate the middle years of schooling. Students who fail to acquire sound proportional reasoning often experience difficulties in subjects that require quantitative thinking, such as science, technology, engineering, and…

  2. Black Boxes in Analytical Chemistry: University Students' Misconceptions of Instrumental Analysis

    ERIC Educational Resources Information Center

    Carbo, Antonio Domenech; Adelantado, Jose Vicente Gimeno; Reig, Francisco Bosch

    2010-01-01

    Misconceptions of chemistry and chemical engineering university students concerning instrumental analysis have been established from coordinated tests, tutorial interviews and laboratory lessons. Misconceptions can be divided into: (1) formal, involving specific concepts and formulations within the general frame of chemistry; (2)…

  3. NASA's Science Education and Public Outreach Forums: Bringing Communities and Resources Together to Increase Effectiveness and Sustainability of E/PO

    NASA Astrophysics Data System (ADS)

    Sharma, Mangala; Smith, D.; Mendez, B.; Shipp, S.; Schwerin, T.; Stockman, S.; Cooper, L.

    2010-03-01

    The AAS-HEAD community has a rich history of involvement in education and public outreach (E/PO). HEAD members have been using NASA science and educational resources to engage and educate youth and adults nationwide in science, technology, engineering, and mathematics topics. Four new Science Education and Public Outreach Forums ("Forums") funded by NASA Science Mission Directorate (SMD) are working in partnership with the research and education community to ensure that current and future SMD-funded E/PO activities form a seamless whole, with easy entry points for scientists, engineers, faculty, students, K-12 formal and informal science educators, general public, and E/PO professionals alike. These Forums support the astrophysics, heliophysics, planetary and Earth science divisions of NASA SMD in three core areas: 1) E/PO community engagement and development to facilitate clear paths of involvement for scientists, engineers and others interested - or potentially interested - in participating in SMD-funded E/PO activities. Collaborations with science professionals are vital for infusing current, accurate SMD mission and research findings into educational products and activities. Forum activities will yield readily accessible information on effective E/PO strategies, resources, and expertise; context for individual E/PO activities; and opportunities for collaboration. 2) A rigorous analysis of SMD-funded E/PO products and activities to help understand how the existing collection supports education standards and audience needs and to identify areas of opportunity for new materials and activities. K-12 formal, informal, and higher education products and activities are included in this analysis. 3) Finally, to address E/PO-related systemic issues and coordinate related activities across the four SMD science divisions. By supporting the NASA E/PO community and facilitating coordination of E/PO activities within and across disciplines, the SMD-Forum partnerships will lead to more effective, sustainable, and efficient utilization of NASA science discoveries and learning experiences.

  4. Provable Transient Recovery for Frame-Based, Fault-Tolerant Computing Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present a formal verification of the transient fault recovery aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system architecture for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization accommodates a wide variety of voting schemes for purging the effects of transients.
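
    A minimal sketch of the masking idea described above: each frame, redundant channels compute the same output and a majority vote hides a faulty or transiently corrupted channel. This is only illustrative; the RCP's actual voting and transient-recovery design is what the EHDM specification and proofs cover.

      from collections import Counter

      def majority_vote(channel_outputs):
          """Return the majority value across redundant channels (NMR-style fault masking)."""
          value, count = Counter(channel_outputs).most_common(1)[0]
          if count <= len(channel_outputs) // 2:
              raise RuntimeError("no majority: too many simultaneous channel faults")
          return value

      print(majority_vote([42, 42, 42, 42]))   # fault-free frame
      print(majority_vote([42, 7, 42, 42]))    # one transient fault is masked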

  5. Engineering Changes in Product Design - A Review

    NASA Astrophysics Data System (ADS)

    Karthik, K.; Janardhan Reddy, K., Dr

    2016-09-01

    Changes are fundamental to product development. Engineering changes are unavoidable and can arise at any phase of the product life cycle. Turning market requirements, customer/user feedback, manufacturing constraints, design innovations, etc., into viable products can be accomplished only when product change is managed properly. In the early design cycle, informal changes are accepted. However, changes become formal as their complexity and cost increase and as the product matures. To maximize market share, manufacturers have to effectively and efficiently manage engineering changes by means of Configuration Control. The paper gives a broad overview of ‘Engineering Change Management’ (ECM) through configuration management and its implications for product design. The aim is to give new researchers an idea and understanding of engineering changes in the product design scenario. This paper elaborates on the significant aspects of managing engineering changes and the importance of ECM in a product life cycle.

  6. Indigenous Knowledge and Education from the Quechua Community to School: Beyond the Formal/Non-Formal Dichotomy

    ERIC Educational Resources Information Center

    Sumida Huaman, Elizabeth; Valdiviezo, Laura Alicia

    2014-01-01

    In this article, we propose to approach Indigenous education beyond the formal/non-formal dichotomy. We argue that there is a critical need to conscientiously include Indigenous knowledge in education processes from the school to the community; particularly, when formal systems exclude Indigenous cultures and languages. Based on ethnographic…

  7. Preface to MOST-ONISW 2009

    NASA Astrophysics Data System (ADS)

    Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil

    Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.

  8. Context-aware system design

    NASA Astrophysics Data System (ADS)

    Chan, Christine S.; Ostertag, Michael H.; Akyürek, Alper Sinan; Šimunić Rosing, Tajana

    2017-05-01

    The Internet of Things envisions a web-connected infrastructure of billions of sensors and actuation devices. However, the current state-of-the-art presents another reality: monolithic end-to-end applications tightly coupled to a limited set of sensors and actuators. Growing such applications with new devices or behaviors, or extending the existing infrastructure with new applications, involves redesign and redeployment. We instead propose a modular approach to these applications, breaking them into an equivalent set of functional units (context engines) whose input/output transformations are driven by general-purpose machine learning, demonstrating an improvement in compute redundancy and computational complexity with minimal impact on accuracy. In conjunction with formal data specifications, or ontologies, we can replace application-specific implementations with a composition of context engines that use common statistical learning to generate output, thus improving context reuse. We implement interconnected context-aware applications using our approach, extracting user context from sensors in both healthcare and grid applications. We compare our infrastructure to single-stage monolithic implementations with single-point communications between sensor nodes and the cloud servers, demonstrating a reduction in combined system energy by 22-45%, and multiplying the battery lifetime of power-constrained devices by at least 22x, with easy deployment across different architectures and devices.
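
    A schematic sketch of the modular decomposition (hypothetical names and deliberately trivial transformations; the paper's context engines are driven by general-purpose machine learning): small input-to-context units are composed into an application instead of one monolithic pipeline, so individual engines can be reused across applications or replaced independently.

      # Hypothetical composition of small "context engines" instead of a monolithic application.
      def heart_rate_engine(samples):
          return sum(samples) / len(samples)                   # sensor stream -> average heart rate

      def activity_engine(average_hr):
          return "active" if average_hr > 100 else "resting"   # heart-rate context -> activity context

      def compose(*engines):
          def application(data):
              for engine in engines:                           # each stage is independently reusable
                  data = engine(data)
              return data
          return application

      health_app = compose(heart_rate_engine, activity_engine)
      print(health_app([88, 95, 120, 130]))                    # -> "active"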

  9. FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER PHYSICAL SYSTEMS

    DTIC Science & Technology

    2018-02-23

    Final technical report, University of Texas at Arlington, February 2018; period covered April 2015 - April 2017. This project studied emergent behavior in distributed cyber-physical systems (DCPS). Emergent…

  10. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  11. Unified formalism for higher order non-autonomous dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-03-01

    This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.
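
    For context, the coordinate form of the equations that such a unified formalism recovers for a kth-order Lagrangian L(t, q, \dot q, ..., q^{(k)}) is the higher-order Euler-Lagrange (Euler-Ostrogradsky) equation together with the Jacobi-Ostrogradsky momenta defining the Legendre-Ostrogradsky map, shown here in one common convention (the paper itself works intrinsically):

      \sum_{i=0}^{k} (-1)^i \frac{d^i}{dt^i}\!\left(\frac{\partial L}{\partial q^{(i)}}\right) = 0,
      \qquad
      p_{(i)} = \sum_{j=0}^{k-i-1} (-1)^j \frac{d^j}{dt^j}\!\left(\frac{\partial L}{\partial q^{(i+j+1)}}\right),
      \quad i = 0, \dots, k-1.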

  12. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    NASA Astrophysics Data System (ADS)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented in previous conferences elsewhere recently. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application is presented. Progressive model automatic generation is discussed. GEOGINE can be used as an efficient computational kernel for fast reliable application development and delivery in advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, data mining technological areas mainly. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features, for reliable automated object learning and discrimination can deeply benefit from GEOGINE progressive automated model generation computational kernel performance. Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.

  13. Discrete approach to stochastic parametrization and dimension reduction in nonlinear dynamics.

    PubMed

    Chorin, Alexandre J; Lu, Fei

    2015-08-11

    Many physical systems are described by nonlinear differential equations that are too complicated to solve in full. A natural way to proceed is to divide the variables into those that are of direct interest and those that are not, formulate solvable approximate equations for the variables of greater interest, and use data and statistical methods to account for the impact of the other variables. In the present paper we consider time-dependent problems and introduce a fully discrete solution method, which simplifies both the analysis of the data and the numerical algorithms. The resulting time series are identified by a NARMAX (nonlinear autoregression moving average with exogenous input) representation familiar from engineering practice. The connections with the Mori-Zwanzig formalism of statistical physics are discussed, as well as an application to the Lorenz 96 system.
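
    A deliberately simplified illustration of the discrete identification step (the paper's NARMAX construction also includes moving-average noise terms and is applied to the Lorenz 96 system; the surrogate data and regressors below are hypothetical): fit a low-order nonlinear autoregression to an observed time series by least squares.

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.zeros(2000)                                   # surrogate "resolved variable" series
      for n in range(1, 1999):
          x[n + 1] = 0.9 * x[n] - 0.2 * x[n - 1] + 0.1 * x[n] ** 2 + 0.05 * rng.standard_normal()

      # Regressors: two linear lags plus one quadratic term (a NARX-style model structure).
      X = np.column_stack([x[1:-1], x[:-2], x[1:-1] ** 2])
      y = x[2:]
      coefficients, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("fitted one-step model coefficients:", coefficients)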

  14. FDA Regulation of Clinical Applications of CRISPR-CAS Gene-Editing Technology.

    PubMed

    Grant, Evita V

    Scientists have repurposed an adaptive immune system of single cell organisms to create a new type of gene-editing tool: CRISPR (clustered regularly interspaced short palindromic repeats)-Cas technology. Scientists in China have reported its use in the genome modification of non-viable human embryos. This has ignited a spirited debate about the moral, ethical, scientific, and social implications of human germline genome engineering. There have also been calls for regulations; however, FDA has yet to formally announce its oversight of clinical applications of CRISPR-Cas systems. This paper reviews FDA regulation of previously controversial biotechnology breakthroughs, recombinant DNA and human cloning. It then shows that FDA is well positioned to regulate CRISPR-Cas clinical applications, due to its legislative mandates, its existing regulatory frameworks for gene therapies and assisted reproductive technologies, and other considerations.

  15. Potential adoption and management of insect-resistant potato in Peru, and implications for genetically engineered potato.

    PubMed

    Buijs, Jasper; Martinet, Marianne; de Mendiburu, Felipe; Ghislain, Marc

    2005-01-01

    This paper analyzes some important issues surrounding possible deployment of genetically engineered (GE) insect-resistant potato in Peru, based on a large farmer survey held in Peru in 2003. We found that the formal seed system plays a limited role compared with the informal seed system, especially for smallholder farmers. Although 97% of smallholder farmers would buy seed of an insect-resistant variety, a majority would buy it only once every 2 to 4 years. Survey data show that farmers would be willing to pay a premium of 50% on seed cost for insect-resistant varieties. Paying price premiums of 25% to 50%, farmers would still increase their net income, assuming insect resistance is high and pesticide use is strongly reduced. Of all farmers, 55% indicated a preference for insect-resistant potato in varieties other than their current varieties. The survey indicates that smallholder farmers are interested in experimenting with new varieties and have a positive perception of improved varieties. Based on these findings, and considering the difficulties of implementing existing biosafety regulatory systems such as those in place in the U.S. and E.U., we propose to develop a variety-based segregation system to separate GE from conventionally bred potatoes. In such a system, which would embrace the spread of GE potatoes through informal seed systems, only a limited number of sterile varieties would be introduced that are easily distinguishable from conventional varieties.

  16. Utilizing inheritance in requirements engineering

    NASA Technical Reports Server (NTRS)

    Kaindl, Hermann

    1994-01-01

    The scope of this paper is the utilization of inheritance for requirements specification, i.e., the tasks of analyzing and modeling the domain, as well as forming and defining requirements. Our approach and the tool supporting it are named RETH (Requirements Engineering Through Hypertext). Actually, RETH uses a combination of various technologies, including object-oriented approaches and artificial intelligence (in particular frames). We do not attempt to exclude or replace formal representations, but try to complement and provide means for gradually developing them. Among others, RETH has been applied in the CERN (Conseil Europeen pour la Recherche Nucleaire) Cortex project. While it would be impossible to explain this project in detail here, it should be sufficient to know that it deals with a generic distributed control system. Since this project is not finished yet, it is difficult to state its size precisely. To give an idea, its final goal is to replace the many existing similar control systems at CERN with this generic approach. Currently, RETH is also tested using real-world requirements for the Pastel Mission Planning System at ESOC in Darmstadt. First, we outline how hypertext is integrated into a frame system in our approach. Moreover, the usefulness of inheritance is demonstrated as performed by the tool RETH. We then summarize our experiences of utilizing inheritance in the Cortex project. Lastly, RETH will be related to existing work.

  17. NASA Formal Methods Workshop, 1990

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Compiler)

    1990-01-01

    The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.

  18. Formal mechanization of device interactions with a process algebra

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, Karl; Cohen, Gerald C.

    1992-01-01

    The principal emphasis is to develop a methodology to formally verify correct synchronization and communication of devices in a composed hardware system. Previous system integration efforts have focused on vertical integration of one layer on top of another. This task examines 'horizontal' integration of peer devices. To formally reason about communication, we mechanize a process algebra in the Higher Order Logic (HOL) theorem proving system. Using this formalization, we show how four types of device interactions can be represented and verified to behave as specified. The report also describes the specification of a system consisting of an AVM-1 microprocessor and a memory management unit, which were verified in previous work. A proof of correct communication is presented, and the extensions to the system specification needed to add a direct memory access device are discussed.
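
    For readers unfamiliar with process algebras, a representative synchronization rule (shown in CCS-style notation purely as an illustration; the report's mechanized algebra and its HOL encoding use their own syntax) captures the kind of device hand-shake being verified: if one process can perform an action a and its peer the complementary action \bar a, their parallel composition can take an internal communication step:

      \frac{P \xrightarrow{\,a\,} P' \qquad Q \xrightarrow{\,\bar a\,} Q'}
           {P \mid Q \xrightarrow{\,\tau\,} P' \mid Q'}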

  19. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu

    PubMed Central

    2011-01-01

    Background The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Results Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. Conclusions TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences. PMID:22112326

  20. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu.

    PubMed

    McCarter, Joe; Gavin, Michael C

    2011-11-23

    The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences.

  1. A Second-Year Undergraduate Course in Applied Differential Equations.

    ERIC Educational Resources Information Center

    Fahidy, Thomas Z.

    1991-01-01

    Presents the framework for a chemical engineering course using ordinary differential equations to solve problems with the underlying strategy of concisely discussing the theory behind each solution technique without extensions to formal proofs. Includes typical class illustrations, student responses to this strategy, and reaction of the…

  2. The Direction of Web-based Training: A Practitioner's View.

    ERIC Educational Resources Information Center

    Kilby, Tim

    2001-01-01

    Web-based training has had achievements and disappointments as online learning has matured. Best practices include user-centered design, knowledge object structures, usability engineering, and formal evaluation. Knowledge management, peer-to-peer learning, and personal learning appliances will continue to alter the online learning landscape. (SK)

  3. Bridging Formal and Informal Learning Environments

    ERIC Educational Resources Information Center

    Barker, Bradley S.; Larson, Kim; Krehbiel, Michelle

    2014-01-01

    Out-of-school time programs that provide science, technology, engineering, and mathematics (STEM) educational content are promising approaches to developing skills and abilities in students. These programs may potentially inspire students with engaging hands-on, minds-on activities that encourage their natural curiosity around STEM content areas.…

  4. Informal Science: Family Education, Experiences, and Initial Interest in Science

    ERIC Educational Resources Information Center

    Dabney, Katherine P.; Tai, Robert H.; Scott, Michael R.

    2016-01-01

    Recent research and public policy have indicated the need for increasing the physical science workforce through development of interest and engagement with informal and formal science, technology, engineering, and mathematics experiences. This study examines the association of family education and physical scientists' informal experiences in…

  5. STEMMING the Gap

    ERIC Educational Resources Information Center

    Kahler, Jim; Valentine, Nancy

    2011-01-01

    America has a gap when it comes to youth pursuing science and technology careers. In an effort to improve the knowledge and application of science, technology, engineering, and math (STEM), after-school programs can work in conjunction with formal in-school curriculum to improve science education. One organization that actively addresses this…

  6. Window to the World

    ERIC Educational Resources Information Center

    Cannon, Kama

    2018-01-01

    Although formal papers are typical, sometimes posters or other visual presentations are more useful tools for sharing visual-spatial information. By incorporating creativity and technology into the study of geographical science, STEM (the study of Science, Technology Engineering, and Mathematics) is changed to STEAM (the A stands for ART)! The…

  7. Formal Methods Case Studies for DO-333

    NASA Technical Reports Server (NTRS)

    Cofer, Darren; Miller, Steven P.

    2014-01-01

    RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.

  8. Training Informal Educators Provides Leverage for Space Science Education and Public Outreach

    NASA Technical Reports Server (NTRS)

    Allen, J. S.; Tobola, K. W.; Betrue, R.

    2004-01-01

    How do we reach the public with the exciting story of Solar System Exploration? How do we encourage girls to think about careers in science, math, engineering and technology? Why should NASA scientists make an effort to reach the public and informal education settings to tell the Solar System Exploration story? These are questions that the Solar System Exploration Forum, a part of the NASA Office of Space Science Education (SSE) and Public Outreach network, has tackled over the past few years. The SSE Forum is a group of education teams and scientists who work to share the excitement of solar system exploration with colleagues, formal educators, and informal educators like museums and youth groups. One major area of the SSE Forum outreach supports the training of Girl Scouts of the USA (GS) leaders and trainers in a suite of activities that reflect NASA missions and science research. Youth groups like Girl Scouts structure their activities as informal education.

  9. Formal Validation of Fault Management Design Solutions

    NASA Technical Reports Server (NTRS)

    Gibson, Corrina; Karban, Robert; Andolfato, Luigi; Day, John

    2013-01-01

    The work presented in this paper describes an approach used to develop SysML modeling patterns to express the behavior of fault protection, test the model's logic by performing fault injection simulations, and verify the fault protection system's logical design via model checking. A representative example, using a subset of the fault protection design for the Soil Moisture Active-Passive (SMAP) system, was modeled with SysML State Machines and JavaScript as Action Language. The SysML model captures interactions between relevant system components and system behavior abstractions (mode managers, error monitors, fault protection engine, and devices/switches). Development of a method to implement verifiable and lightweight executable fault protection models enables future missions to have access to larger fault test domains and verifiable design patterns. A tool-chain to transform the SysML model to jpf-Statechart compliant Java code and then verify the generated code via model checking was established. Conclusions and lessons learned from this work are also described, as well as potential avenues for further research and development.

  10. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example. software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost effective means to assure safe software within a safe system.

  11. Offshore safety case approach and formal safety assessment of ships.

    PubMed

    Wang, J

    2002-01-01

    Tragic marine and offshore accidents have caused serious consequences including loss of lives, loss of property, and damage of the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail with particular reference to the design aspects. The current practices and the latest development in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. The recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.
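
    A minimal sketch of the quantified comparison that underlies risk-based decision making in both regimes (all figures and option names are hypothetical): score risk as frequency times consequence and screen each risk control option by its cost per unit of risk averted, in the spirit of the cost-effectiveness step of Formal Safety Assessment.

      # Hypothetical screening of risk control options, FSA-style.
      def risk(frequency, consequence):
          return frequency * consequence                        # expected loss per ship-year

      baseline = {"frequency": 1e-3, "consequence": 10.0}       # events/ship-year, equivalent fatalities
      options = {
          "improved lifeboats": {"frequency": 1e-3, "consequence": 6.0, "cost": 2.0e5},
          "extra fire pumps":   {"frequency": 5e-4, "consequence": 10.0, "cost": 4.0e5},
      }

      base_risk = risk(**baseline)
      for name, option in options.items():
          averted = base_risk - risk(option["frequency"], option["consequence"])
          print(f"{name}: risk averted {averted:.4f}/yr, "
                f"cost per unit averted {option['cost'] / averted:,.0f}")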

  12. Graph Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.

    2005-12-27

    Graph theory is a branch of discrete combinatorial mathematics that studies the properties of graphs. The theory was pioneered by the Swiss mathematician Leonhard Euler in the 18th century, commenced its formal development during the second half of the 19th century, and has witnessed substantial growth during the last seventy years, with applications in areas as diverse as engineering, computer science, physics, sociology, chemistry and biology. Graph theory has also had a strong impact in computational linguistics by providing the foundations for the theory of feature structures that has emerged as one of the most widely used frameworks for the representation of grammar formalisms.

  13. Program Developments: Formal Explanations of Implementations.

    DTIC Science & Technology

    1982-08-01

    January 1982. References cited in the report include: [Cheatham 79] Cheatham, T. E., G. H. Holloway, and J. A. Townley, "Symbolic evaluation and the analysis of programs," IEEE Transactions on Software Engineering 5(4), July 1979, 402-417; [Cheatham 81] Cheatham, T. E., G. H. Holloway, and J. A. Townley, "Program refinement by…

  14. Appreciating Formal Similarities in the Kinetics of Homogeneous, Heterogeneous, and Enzyme Catalysis

    ERIC Educational Resources Information Center

    Ashby, Michael T.

    2007-01-01

    Because interest in catalysts is widespread, the kinetics of catalytic reactions have been investigated by widely diverse groups of individuals, including chemists, engineers, and biologists. This has lead to redundancy in theories, particularly with regard to the topics of homogeneous, heterogeneous, and enzyme catalysis. From a pedagogical…

  15. Abstract Numeric Relations and the Visual Structure of Algebra

    ERIC Educational Resources Information Center

    Landy, David; Brookes, David; Smout, Ryan

    2014-01-01

    Formal algebras are among the most powerful and general mechanisms for expressing quantitative relational statements; yet, even university engineering students, who are relatively proficient with algebraic manipulation, struggle with and often fail to correctly deploy basic aspects of algebraic notation (Clement, 1982). In the cognitive tradition,…

  16. Community Partnerships for Fostering Student Interest and Engagement in STEM

    ERIC Educational Resources Information Center

    Watters, James J.; Diezmann, Carmel M.

    2013-01-01

    The foundations of Science, Technology, Engineering and Mathematics (STEM) education begins in the early years of schooling when students encounter formal learning experiences primarily in mathematics and science. Politicians, economists and industrialists recognise the importance of STEM in society, and therefore a number of strategies have been…

  17. FY 1997 Scientific and Technical Reports, Articles, Papers, and Presentations

    NASA Technical Reports Server (NTRS)

    Waits, J. E. Turner (Compiler)

    1998-01-01

    This document presents formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY97. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  18. The Atlanta University Center: A Consortium-Based Dual Degree Engineering Program

    ERIC Educational Resources Information Center

    Jackson, Marilyn T.

    2007-01-01

    The Atlanta University Center (AUC) comprises five historically black colleges and a centralized library. All are separate institutions, each having its own board of directors, president, infrastructure, students, faculty, staff, and traditions. To encourage coordination of effort and resources, the AUC was formed and the first formal cooperative…

  19. Balancing Stakeholders' Interests in Evolving Teacher Education Accreditation Contexts

    ERIC Educational Resources Information Center

    Elliott, Alison

    2008-01-01

    While Australian teacher education programs have long had rigorous accreditation pathways at the University level they have not been subject to the same formal public or professional scrutiny typical of professions such as medicine, nursing or engineering. Professional accreditation for teacher preparation programs is relatively new and is linked…

  20. Beta Testing in Social Work

    ERIC Educational Resources Information Center

    Traube, Dorian E.; Begun, Stephanie; Petering, Robin; Flynn, Marilyn L.

    2017-01-01

    The field of social work does not currently have a widely adopted method for expediting innovations into micro- or macropractice. Although it is common in fields such as engineering and business to have formal processes for accelerating scientific advances into consumer markets, few comparable mechanisms exist in the social sciences or social…

  1. Euler Teaches a Class in Structural Steel Design

    ERIC Educational Resources Information Center

    Boyajian, David M.

    2009-01-01

    Even before steel was a topic of formal study for structural engineers, the brilliant eighteenth century Swiss mathematician and physicist, Leonhard Euler (1707-1783), investigated the theory governing the elastic behaviour of columns, the results of which are incorporated into the American Institute of Steel Construction's (AISC's) Bible: the…
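
    The result in question is Euler's critical load for an ideal elastic column, which in the effective-length form used in steel design practice reads

      P_{cr} = \frac{\pi^2 E I}{(K L)^2},
      \qquad
      F_{cr} = \frac{\pi^2 E}{(K L / r)^2},

    where E is the elastic modulus, I the second moment of area, r the radius of gyration, L the unbraced length, and K the effective-length factor (K = 1 for a pin-ended column); design specifications such as the AISC manual build their column-strength provisions on this elastic buckling limit.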

  2. IB Offering Certificate for Careers

    ERIC Educational Resources Information Center

    Robelen, Erik W.

    2012-01-01

    The International Baccalaureate (IB) organization, best known in the United States for its prestigious two-year diploma program for juniors and seniors, will enter new terrain this fall as it formally rolls out an initiative centered on a variety of career pathways that includes engineering, culinary arts, and automotive technology. The move comes…

  3. Describing the What and Why of Students' Difficulties in Boolean Logic

    ERIC Educational Resources Information Center

    Herman, Geoffrey L.; Loui, Michael C.; Kaczmarczyk, Lisa; Zilles, Craig

    2012-01-01

    The ability to reason with formal logic is a foundational skill for computer scientists and computer engineers that scaffolds the abilities to design, debug, and optimize. By interviewing students about their understanding of propositional logic and their ability to translate from English specifications to Boolean expressions, we characterized…

  4. NASA Langley Research and Technology-Transfer Program in Formal Methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.

    1995-01-01

    This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.

  5. The Lifelong Learning Iceberg of Information Systems Academics--A Study of On-Going Formal and Informal Learning by Academics

    ERIC Educational Resources Information Center

    Davey, Bill; Tatnall, Arthur

    2007-01-01

    This article describes a study that examined the lifelong learning of information systems academics in relation to their normal work. It begins by considering the concept of lifelong learning, its relationship to real-life learning and that lifelong learning should encompass the whole spectrum of formal, non-formal and informal learning. Most…

  6. Formal design and verification of a reliable computing platform for real-time control. Phase 2: Results

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.

    1992-01-01

    The design and formal verification of the Reliable Computing Platform (RCP), a fault tolerant computing system for digital flight control applications is presented. The RCP uses N-Multiply Redundant (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).

  7. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  8. Formalizing an integrative, multidisciplinary cancer therapy discovery workflow

    PubMed Central

    McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy

    2014-01-01

    Although many clinicians and researchers work to understand cancer, there has been limited success to effectively combine forces and collaborate over time, distance, data and budget constraints. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390

  9. Crafting usable knowledge for sustainable development.

    PubMed

    Clark, William C; van Kerkhoff, Lorrae; Lebel, Louis; Gallopin, Gilberto C

    2016-04-26

    This paper distills core lessons about how researchers (scientists, engineers, planners, etc.) interested in promoting sustainable development can increase the likelihood of producing usable knowledge. We draw the lessons from both practical experience in diverse contexts around the world and from scholarly advances in understanding the relationships between science and society. Many of these lessons will be familiar to those with experience in crafting knowledge to support action for sustainable development. However, few are included in the formal training of researchers. As a result, when scientists and engineers first venture out of the laboratory or library with the goal of linking their knowledge with action, the outcome has often been ineffectiveness and disillusionment. We therefore articulate here a core set of lessons that we believe should become part of the basic training for researchers interested in crafting usable knowledge for sustainable development. These lessons entail at least four things researchers should know, and four things they should do. The knowing lessons involve understanding the coproduction relationships through which knowledge making and decision making shape one another in social-environmental systems. We highlight the lessons that emerge from examining those coproduction relationships through the ICAP lens, viewing them from the perspectives of Innovation systems, Complex systems, Adaptive systems, and Political systems. The doing lessons involve improving the capacity of the research community to put its understanding of coproduction into practice. We highlight steps through which researchers can help build capacities for stakeholder collaboration, social learning, knowledge governance, and researcher training.

  10. A computerized tutor prototype for prostate cryotherapy: key building blocks and system evaluation

    NASA Astrophysics Data System (ADS)

    Rabin, Yoed; Shimada, Kenji; Joshi, Purva; Sehrawat, Anjali; Keelan, Robert; Wilfong, Dona M.; McCormick, James T.

    2017-02-01

    This paper focuses on the evaluation of a prototype for a computer-based tutoring system for prostate cryosurgery, while reviewing its key building blocks and their benchmark performance. The tutoring system lists geometrical constraints of cryoprobe placement, displays a rendered shape of the prostate, simulates cryoprobe insertion, enables distance measurements, simulates the corresponding thermal history, and evaluates the mismatch between the target region shape and a pre-selected planning isotherm. The quality of trainee planning is measured in comparison with a computer-generated plan, created for each case study by a previously developed planning algorithm, known as bubble-packing. While the tutoring level in this study aims only at geometrical constraints on cryoprobe placement and the resulting thermal history, it creates a unique opportunity to gain insight into the process outside of the operating room. System validation of the tutor has been performed by collecting training data from surgical residents having no prior experience or advanced knowledge of cryotherapy. Furthermore, the system has been evaluated by graduate engineering students having no formal education in medicine. In terms of match between a planning isotherm and the target region shape, results demonstrate that medical residents' performance improved from 4.4% in a pretest to 37.8% in a posttest over a course of 50 minutes of training (within 10% margins from a computer-optimized plan). Comparing those results with the performance of the engineering students shows similar outcomes, suggesting that planning of the cryoprobe layout essentially revolves around geometric considerations.
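
    One simple way to quantify the match between the region enclosed by a planning isotherm and the target shape is a voxel-overlap (Dice) score, sketched below. The voxel representation and the choice of Dice are assumptions made here for illustration; the tutor's actual scoring metric may differ.

    ```python
    # Illustrative voxel-overlap (Dice) score between a frozen region and a target
    # region. The voxel masks and the Dice choice are assumptions for illustration,
    # not necessarily the metric used by the cryotherapy tutor described above.
    import numpy as np

    def dice_score(isotherm_mask: np.ndarray, target_mask: np.ndarray) -> float:
        """Dice coefficient between two boolean voxel masks (1.0 = perfect match)."""
        intersection = np.logical_and(isotherm_mask, target_mask).sum()
        total = isotherm_mask.sum() + target_mask.sum()
        return 2.0 * intersection / total if total else 1.0

    # Toy 3-D masks standing in for the target anatomy and the frozen region.
    target = np.zeros((20, 20, 20), dtype=bool)
    target[5:15, 5:15, 5:15] = True
    frozen = np.zeros_like(target)
    frozen[6:16, 5:15, 5:15] = True          # slightly misplaced freeze zone
    print(f"Dice match: {dice_score(frozen, target):.2f}")
    ```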

  11. Crafting usable knowledge for sustainable development

    PubMed Central

    2016-01-01

    This paper distills core lessons about how researchers (scientists, engineers, planners, etc.) interested in promoting sustainable development can increase the likelihood of producing usable knowledge. We draw the lessons from both practical experience in diverse contexts around the world and from scholarly advances in understanding the relationships between science and society. Many of these lessons will be familiar to those with experience in crafting knowledge to support action for sustainable development. However, few are included in the formal training of researchers. As a result, when scientists and engineers first venture out of the laboratory or library with the goal of linking their knowledge with action, the outcome has often been ineffectiveness and disillusionment. We therefore articulate here a core set of lessons that we believe should become part of the basic training for researchers interested in crafting usable knowledge for sustainable development. These lessons entail at least four things researchers should know, and four things they should do. The knowing lessons involve understanding the coproduction relationships through which knowledge making and decision making shape one another in social–environmental systems. We highlight the lessons that emerge from examining those coproduction relationships through the ICAP lens, viewing them from the perspectives of Innovation systems, Complex systems, Adaptive systems, and Political systems. The doing lessons involve improving the capacity of the research community to put its understanding of coproduction into practice. We highlight steps through which researchers can help build capacities for stakeholder collaboration, social learning, knowledge governance, and researcher training. PMID:27091979

  12. From Informal to Formal: Status and Challenges of Informal Water Infrastructures in Indonesia

    NASA Astrophysics Data System (ADS)

    Maryati, S.; Humaira, A. N. S.; Kipuw, D. M.

    2018-05-01

    Informal water infrastructures in Indonesia have emerged due to the government's inability or incapacity to guarantee the service of water provision to all communities. Communities have their own mechanisms to meet their water needs, arranged as self-supplied or self-governed forms of water infrastructure provision. In general, infrastructure provision in Indonesia takes the form of public systems (centralized systems) that cover most urban communities; communal systems that serve groups of households limited to a particular small-scale area; and individual systems. The communal and individual systems are provided by the communities themselves, sometimes with some intervention by the government. This kind of system is usually built to lower standards than the systems built by the government. Informal systems in this study are not defined in terms of their legal aspect, but in technical terms. The aim of this study was to examine the existing status of informal water infrastructures and the challenges in transforming them into formal infrastructures. Formalizing informal infrastructures is now becoming an issue because of the limitations the government faces in building new formal infrastructures. On the other hand, global and national targets call for 100% access to water supplies for the whole population in the near future, and formalizing informal infrastructures seems more realistic than building new ones. The scope of this study was limited to the technical aspects of these systems, and the methodology used was descriptive and comparative analysis. Generally, most of the informal systems do not apply progressive tariffs, do not have storage/reservoirs or water treatment plants, and rarely conduct treatment in accordance with the standards and procedures that formal systems follow, which leads to dubious access to safe water, especially with respect to water quality.

  13. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
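
    As a minimal example of the first formalism listed above, the sketch below performs synchronous updates of a toy three-gene Boolean network; the regulatory rules are invented for illustration.

    ```python
    # Minimal synchronous Boolean network, one of the formalisms reviewed above.
    # The three-gene regulatory rules are a toy example invented for illustration.
    def step(state):
        """One synchronous update of a toy three-gene Boolean network."""
        a, b, c = state["A"], state["B"], state["C"]
        return {
            "A": not c,          # C represses A
            "B": a,              # A activates B
            "C": a and b,        # A and B jointly activate C
        }

    state = {"A": True, "B": False, "C": False}
    for t in range(6):
        print(t, state)
        state = step(state)
    ```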

  14. Basic exploration geophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, E.S.

    1988-01-01

    An introduction to geophysical methods used to explore for natural resources and to survey the earth's geology is presented in this volume. It is suitable for second- and third-year undergraduate students majoring in geology or engineering, and for professional engineers and earth scientists without formal instruction in geophysics. The author assumes the reader is familiar with geometry, algebra, and trigonometry. Geophysical exploration includes seismic refraction and reflection surveying, electrical resistivity and electromagnetic field surveying, and geophysical well logging. Surveying operations are described in step-by-step procedures and are illustrated by practical examples. Computer-based methods of processing and interpreting data as well as geographical methods are introduced.

  15. A swarm intelligence framework for reconstructing gene networks: searching for biologically plausible architectures.

    PubMed

    Kentzoglanakis, Kyriakos; Poole, Matthew

    2012-01-01

    In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
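
    A minimal sketch of the continuous half of this approach is given below: plain global-best PSO fitting the weight matrix of a small discrete-time RNN model of gene expression to synthetic time-series data. The ACO structure search is omitted, and the network size, dynamics, and hyperparameters are illustrative assumptions rather than the authors' implementation.

    ```python
    # Sketch: PSO fitting RNN weights to gene-expression time series. All sizes and
    # hyperparameters are invented for illustration; not the authors' implementation.
    import numpy as np

    rng = np.random.default_rng(0)
    N_GENES, T = 3, 20

    def simulate(weights, x0, steps):
        """Discrete-time RNN dynamics: x(t+1) = sigmoid(W @ x(t))."""
        xs, x = [x0], x0
        for _ in range(steps - 1):
            x = 1.0 / (1.0 + np.exp(-(weights @ x)))
            xs.append(x)
        return np.array(xs)

    # Synthetic "observed" expression data from a hidden ground-truth network.
    true_w = rng.normal(size=(N_GENES, N_GENES))
    data = simulate(true_w, rng.random(N_GENES), T)

    def error(flat_w):
        return np.mean((simulate(flat_w.reshape(N_GENES, N_GENES), data[0], T) - data) ** 2)

    # Plain global-best PSO over the flattened weight matrix.
    n_particles, dim = 30, N_GENES * N_GENES
    pos = rng.normal(size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_err = pos.copy(), np.array([error(p) for p in pos])
    gbest = pbest[pbest_err.argmin()].copy()

    for _ in range(200):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        errs = np.array([error(p) for p in pos])
        improved = errs < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], errs[improved]
        gbest = pbest[pbest_err.argmin()].copy()

    print("best fit error:", pbest_err.min())
    ```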

  16. Non-formal Education in the Philippines: A Fundamental Step towards Lifelong Learning.

    ERIC Educational Resources Information Center

    Gonzales, Ma. Celeste T.; Pijano, Ma. Concepcion V.

    In order to significantly contribute to human resource development, the Philippines must develop an integrated educational system of lifelong learning, with a special emphasis on non-formal education. Despite the value that is placed on formal, or sequential academic schooling, it is non-formal schooling that makes accessible the acquisition of…

  17. From Regulation to Virtue: A Critique of Ethical Formalism in Research Organizations

    ERIC Educational Resources Information Center

    Atkinson, Timothy N.; Butler, Jesse W.

    2012-01-01

    The following article argues that the research compliance system has some flaws that should be addressed, particularly with regard to excessive emphasis of and reliance upon formal regulations in research administration. Ethical formalism, understood here as the use of formal rules for the determination of behavior, is not an optimal perspective…

  18. Assessing the Higher National Diploma Chemical Engineering programme in Ghana: students' perspective

    NASA Astrophysics Data System (ADS)

    Boateng, Cyril D.; Cudjoe Bensah, Edem; Ahiekpor, Julius C.

    2012-05-01

    Chemical engineers have played key roles in the growth of the chemical and allied industries in Ghana but indigenous industries that have traditionally been the domain of the informal sector need to be migrated to the formal sector through the entrepreneurship and innovation of chemical engineers. The Higher National Diploma Chemical Engineering programme is being migrated from a subject-based to a competency-based curriculum. This paper evaluates the programme from the point of view of students. Data were drawn from a survey conducted in the department and were analysed using SPSS. The survey involved administering questionnaires to students at all levels in the department. Analysis of the responses indicated that the majority of the students had decided to pursue chemical engineering due to the career opportunities available. Their knowledge of the programme learning outcomes was, however, poor. The study revealed that none of the students was interested in developing indigenous industries.

  19. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.
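
    In the same spirit, the sketch below pairs a toy executable model of a PCA-pump bolus-lockout rule with a brute-force bounded check of one safety requirement. The state variables, dose limits, and the requirement itself are invented for illustration and are not taken from the authors' PCA pump models.

    ```python
    # Toy executable model of one PCA-pump safety rule and a bounded check of it.
    # All limits and the requirement are invented; not the authors' PCA models.
    from itertools import product

    LOCKOUT = 3          # minimum ticks between boluses
    MAX_DOSE = 2         # maximum boluses per 10-tick window

    def run(requests):
        """Simulate the pump controller over a sequence of button-press requests."""
        delivered, since_last = [], LOCKOUT
        for pressed in requests:
            since_last += 1
            recent = sum(delivered[-9:])
            give = pressed and since_last >= LOCKOUT and recent < MAX_DOSE
            delivered.append(1 if give else 0)
            if give:
                since_last = 0
        return delivered

    def safe(delivered):
        """Safety requirement: never more than MAX_DOSE boluses in any 10-tick window."""
        return all(sum(delivered[i:i + 10]) <= MAX_DOSE for i in range(len(delivered)))

    # Bounded verification: exhaustively check every request pattern of length 12.
    assert all(safe(run(reqs)) for reqs in product([0, 1], repeat=12))
    print("safety requirement holds for all bounded request patterns")
    ```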

  20. Prefreshman and Cooperative Education Program. [PREFACE training

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Of the 93 students enrolled in the PREFACE program over its four-year history, 70 are still in engineering school. Tables show profiles of student placement and participation from 1973 to 1977 (first semester completed). During the 1977 summer, 10 students were placed at NASA Goddard, 8 at DOE-Brookhaven, and 2 at American Can. Eleven students with less high school math preparation remained on campus for formal precalculus classes. Majors of the students in the program include civil, chemical, electrical, and mechanical engineering. Student satisfaction with their training experiences is summarized.

  1. NASA Propulsion Engineering Research Center, volume 1

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Over the past year, the Propulsion Engineering Research Center at The Pennsylvania State University continued its progress toward meeting the goals of NASA's University Space Engineering Research Centers (USERC) program. The USERC program was initiated in 1988 by the Office of Aeronautics and Space Technology to provide an invigorating force to drive technology advancements in the U.S. space industry. The Propulsion Center's role in this effort is to provide a fundamental basis from which the technology advances in propulsion can be derived. To fulfill this role, an integrated program was developed that focuses research efforts on key technical areas, provides students with a broad education in traditional propulsion-related science and engineering disciplines, and provides minority and other under-represented students with opportunities to take their first step toward professional careers in propulsion engineering. The program is made efficient by incorporating government propulsion laboratories and the U.S. propulsion industry into the program through extensive interactions and research involvement. The Center is comprised of faculty, professional staff, and graduate and undergraduate students working on a broad spectrum of research issues related to propulsion. The Center's research focus encompasses both current and advanced propulsion concepts for space transportation, with a research emphasis on liquid propellant rocket engines. The liquid rocket engine research includes programs in combustion and turbomachinery. Other space transportation modes that are being addressed include anti-matter, electric, nuclear, and solid propellant propulsion. Outside funding supports a significant fraction of Center research, with the major portion of the basic USERC grant being used for graduate student support and recruitment. The remainder of the USERC funds are used to support programs to increase minority student enrollment in engineering, to maintain Center infrastructure, and to develop research capability in key new areas. Significant research programs in propulsion systems for air and land transportation complement the space propulsion focus. The primary mission of the Center is student education. The student program emphasizes formal class work and research in classical engineering and science disciplines with applications to propulsion.

  2. Mayo clinic NLP system for patient smoking status identification.

    PubMed

    Savova, Guergana K; Ogren, Philip V; Duffy, Patrick H; Buntrock, James D; Chute, Christopher G

    2008-01-01

    This article describes our system entry for the 2006 I2B2 contest "Challenges in Natural Language Processing for Clinical Data" for the task of identifying the smoking status of patients. Our system makes the simplifying assumption that patient-level smoking status determination can be achieved by accurately classifying individual sentences from a patient's record. We created our system with reusable text analysis components built on the Unstructured Information Management Architecture and Weka. This reuse of code minimized the development effort related specifically to our smoking status classifier. We report precision, recall, F-score, and 95% exact confidence intervals for each metric. Recasting the classification task for the sentence level and reusing code from other text analysis projects allowed us to quickly build a classification system that performs with a system F-score of 92.64 based on held-out data tests and of 85.57 on the formal evaluation data. Our general medical natural language engine is easily adaptable to a real-world medical informatics application. Some of the limitations as applied to the use-case are negation detection and temporal resolution.
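
    The sentence-level classification and record-level aggregation idea can be illustrated with the small stand-in below. The authors' system was built on UIMA and Weka; this sketch instead uses scikit-learn, and the tiny training set, label set, and keyword filter are invented for illustration.

    ```python
    # Stand-in for sentence-level smoking-status classification with record-level
    # aggregation. Training data, labels, and the keyword filter are invented.
    from collections import Counter
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_sentences = [
        ("Patient smokes one pack per day.", "CURRENT_SMOKER"),
        ("He quit smoking ten years ago.", "PAST_SMOKER"),
        ("Denies any history of tobacco use.", "NON_SMOKER"),
        ("She continues to smoke despite counseling.", "CURRENT_SMOKER"),
        ("Former smoker, stopped in 2001.", "PAST_SMOKER"),
        ("No tobacco, alcohol, or drug use.", "NON_SMOKER"),
    ]
    texts, labels = zip(*train_sentences)
    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(texts, labels)

    SMOKING_TERMS = ("smok", "tobacco", "cigarette")

    def patient_status(record_sentences):
        """Classify smoking-related sentences, then take the most common label."""
        relevant = [s for s in record_sentences
                    if any(t in s.lower() for t in SMOKING_TERMS)]
        if not relevant:
            return "UNKNOWN"
        return Counter(clf.predict(relevant)).most_common(1)[0][0]

    record = ["Vitals stable.", "Patient smokes half a pack daily.", "Lungs clear."]
    print(patient_status(record))
    ```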

  3. Space Station Freedom extravehicular activity systems evolution study

    NASA Technical Reports Server (NTRS)

    Rouen, Michael

    1990-01-01

    Evaluation of Space Station Freedom (SSF) support of manned exploration is in progress to identify SSF extravehicular activity (EVA) system evolution requirements and capabilities. The output from these studies will provide data to support the preliminary design process to ensure that Space Station EVA system requirements for future missions (including the transportation node) are adequately considered and reflected in the baseline design. The study considers SSF support of future missions and the EVA system baseline to determine adequacy of EVA requirements and capabilities and to identify additional requirements, capabilities, and necessary technology upgrades. The EVA demands levied by formal requirements and indicated by evolutionary mission scenarios are high for the out-years of Space Station Freedom. An EVA system designed to meet the baseline requirements can easily evolve to meet evolution demands with few exceptions. Results to date indicate that upgrades or modifications to the EVA system may be necessary to meet the full range of EVA thermal environments associated with the transportation node. Work continues to quantify the EVA capability in this regard. Evolution mission scenarios with EVA and ground unshielded nuclear propulsion engines are inconsistent with anthropomorphic EVA capabilities.

  4. Challenges and Demands on Automated Software Revision

    NASA Technical Reports Server (NTRS)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, given the numerous unverified and uncertified legacy software systems found in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become commonplace in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in the so-called cyber-physical systems. When such systems are safety- or mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  5. An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)

    NASA Astrophysics Data System (ADS)

    van den Heever, Lize; Marais, Neilen; Slabber, Martin

    2016-08-01

    This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal Qualification Testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT system engineers are extremely happy with the AQF results, and even more so with the approach and process it enforces.
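
    The idea behind the AQF can be illustrated as below: each integrated test carries an annotation naming the requirement it verifies, and a qualification report is generated from those annotations plus the test outcomes. The requirement IDs, decorator, and report format are invented for illustration; this is not the MeerKAT AQF code.

    ```python
    # Illustration of requirement-annotated tests driving an auto-generated report.
    # Requirement IDs, tests, and format are invented; not the MeerKAT AQF code.
    REGISTRY = []

    def verifies(requirement_id, description):
        """Decorator attaching a requirement reference to an integrated test."""
        def wrap(test_fn):
            REGISTRY.append((requirement_id, description, test_fn))
            return test_fn
        return wrap

    @verifies("CAM-R-001", "Operator can configure a subarray of receptors")
    def test_configure_subarray():
        subarray = {"receptors": ["m000", "m001"]}
        assert len(subarray["receptors"]) == 2

    @verifies("CAM-R-002", "Alarms are raised when a sensor exceeds its limit")
    def test_sensor_alarm():
        assert 105.0 > 100.0  # stand-in for a real monitoring check

    def qualification_report():
        lines = ["Qualification Test Report", "=" * 25]
        for req_id, description, test_fn in REGISTRY:
            try:
                test_fn()
                outcome = "PASS"
            except AssertionError:
                outcome = "FAIL"
            lines.append(f"{req_id}: {description} ... {outcome}")
        return "\n".join(lines)

    print(qualification_report())
    ```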

  6. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
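
    The comparison step can be sketched generically as below: evaluate the implementation and a reference model on the same randomly generated inputs and check agreement within a tolerance. The toy distance function stands in for both the PVSio-animated model and its floating-point implementation; everything here is invented for illustration.

    ```python
    # Generic sketch of comparing an implementation against a reference model on
    # random inputs, up to a tolerance. The toy functions are invented stand-ins.
    import math
    import random

    def model_distance(x1, y1, x2, y2):
        """Reference model: straight-line distance (stands in for a PVSio evaluation)."""
        return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

    def impl_distance(x1, y1, x2, y2):
        """'Implementation': the same quantity computed a different way (hypot)."""
        return math.hypot(x2 - x1, y2 - y1)

    random.seed(1)
    for _ in range(10_000):
        args = [random.uniform(-1e5, 1e5) for _ in range(4)]
        if not math.isclose(model_distance(*args), impl_distance(*args),
                            rel_tol=1e-9, abs_tol=1e-9):
            raise SystemExit(f"mismatch beyond tolerance for inputs {args}")
    print("implementation agrees with the model on all sampled inputs")
    ```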

  7. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  8. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix and, as such, its evaluation in open quantum systems has not been fully understood. Recently, a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement can be simplified to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than expected. This increases the upper limit capacity for information transfer and its conversion to energy for next-generation devices in mesoscopic physics.
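
    The nonlinearity referred to above is visible in the textbook von Neumann expression for the entropy of a state with density matrix \rho, shown here only as a reminder and not as a formula from the cited work:

    ```latex
    S(\rho) \;=\; -\,\mathrm{Tr}\!\left(\rho \ln \rho\right)
           \;=\; -\sum_{i} \lambda_i \ln \lambda_i ,
    \qquad \lambda_i \ \text{the eigenvalues of } \rho .
    ```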

  9. JANNAF "Test and Evaluation Guidelines for Liquid Rocket Engines": Status and Application

    NASA Technical Reports Server (NTRS)

    Parkinson, Douglas; VanLerberghe, Wayne M.; Rahman, Shamim A.

    2017-01-01

    For many decades, the U.S. rocket propulsion industrial base has performed remarkably in developing complex liquid rocket engines that can propel critical payloads into service for the nation, as well as transport people and hardware for missions that open the frontiers of space exploration for humanity. This has been possible only at considerable expense given the lack of detailed guidance that captures the essence of successful practices and knowledge accumulated over five decades of liquid rocket engine development. In an effort to provide benchmarks and guidance for the next generation of rocket engineers, the Joint Army Navy NASA Air Force (JANNAF) Interagency Propulsion Committee published a liquid rocket engine (LRE) test and evaluation (T&E) guideline document in 2012 focusing on the development challenges and test verification considerations for liquid rocket engine systems. This document has been well received and applied by many current LRE developers as a benchmark and guidance tool, both for government-driven applications as well as for fully commercial ventures. The USAF Space and Missile Systems Center (SMC) has taken an additional near-term step and is directing activity to adapt and augment the content from the JANNAF LRE T&E guideline into a standard for potential application to future USAF requests for proposals for LRE development initiatives and launch vehicles for national security missions. A draft of this standard was already sent out for review and comment, and is intended to be formally approved and released towards the end of 2017. The acceptance and use of the LRE T&E guideline is possible through broad government and industry participation in the JANNAF liquid propulsion committee and associated panels. The sponsoring JANNAF community is expanding upon this initial baseline version and delving into further critical development aspects of liquid rocket propulsion testing at the integrated stage level as well as engine component level, in order to advance the state of the practice. The full participation of the entire U.S. rocket propulsion industrial base is invited and expected at this opportune moment in the continuing advancement of spaceflight technology.

  10. Archive of digital chirp subbottom profile data collected during USGS Cruise 13CCT04 offshore of Petit Bois Island, Mississippi, August 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Kindinger, Jack G.; Bernier, Julie C.; Kelso, Kyle W.; Wiese, Dana S.

    2015-01-01

    From August 13-23, 2013, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE) conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport offshore of Petit Bois Island, Mississippi. This investigation is part of a broader USGS study on Coastal Change and Transport (CCT). These surveys were funded through the Mississippi Coastal Improvements Program (MsCIP) with partial funding provided by the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided.

  11. Testing Linear Temporal Logic Formulae on Finite Execution Traces

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Norvig, Peter (Technical Monitor)

    2001-01-01

    We present an algorithm for efficiently testing Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications. Such tests correspond to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property. We then suggest an optimized algorithm based on transforming LTL formulae. The work is done using the Maude rewriting system, which turns out to provide a perfect notation and an efficient rewriting engine for performing these experiments.
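
    The finite-trace semantics discussed above can be made concrete with the direct recursive evaluator sketched below (not the optimized Maude-based rewriting algorithm of the paper). Formulas are nested tuples, each trace element is the set of atomic propositions true at that step, and the convention that "next" fails at the end of the trace is one common finite-trace choice assumed here.

    ```python
    # Direct recursive evaluator for LTL over finite traces (illustrative only; not
    # the optimized Maude rewriting algorithm described above).
    def holds(formula, trace, i=0):
        """Evaluate an LTL formula (nested tuples) at position i of a finite trace."""
        op = formula[0]
        if op == "true":
            return True
        if op == "atom":
            return i < len(trace) and formula[1] in trace[i]
        if op == "not":
            return not holds(formula[1], trace, i)
        if op == "and":
            return holds(formula[1], trace, i) and holds(formula[2], trace, i)
        if op == "next":          # finite-trace convention: false past the end
            return i + 1 < len(trace) and holds(formula[1], trace, i + 1)
        if op == "until":         # formula[1] U formula[2]
            return any(
                holds(formula[2], trace, k)
                and all(holds(formula[1], trace, j) for j in range(i, k))
                for k in range(i, len(trace))
            )
        raise ValueError(f"unknown operator: {op}")

    def eventually(f):
        """Derived operator F f, defined as true U f."""
        return ("until", ("true",), f)

    # Each trace element is the set of atomic propositions true at that step.
    trace = [{"req"}, set(), {"ack"}]
    print(holds(("until", ("atom", "req"), ("atom", "ack")), trace))  # False: gap at step 1
    print(holds(eventually(("atom", "ack")), trace))                  # True
    ```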

  12. Co-design in synthetic biology: a system-level analysis of the development of an environmental sensing device.

    PubMed

    Ball, David A; Lux, Matthew W; Graef, Russell R; Peterson, Matthew W; Valenti, Jane D; Dileo, John; Peccoud, Jean

    2010-01-01

    The concept of co-design is common in engineering, where it is necessary, for example, to determine the optimal partitioning between hardware and software of the implementation of a system's features. Here we propose to adapt co-design methodologies for synthetic biology. As a test case, we have designed an environmental sensing device that detects the presence of three chemicals, and returns an output only if at least two of the three chemicals are present. We show that the logical operations can be implemented in three different design domains: (1) the transcriptional domain using synthetically designed hybrid promoters, (2) the protein domain using bi-molecular fluorescence complementation, and (3) the fluorescence domain using spectral unmixing and relying on electronic processing. We discuss how these heterogeneous design strategies could be formalized to develop co-design algorithms capable of identifying optimal designs meeting user specifications.

  13. Unchartered innovation? Local reforms of national formal water management in the Mkoji sub-catchment, Tanzania

    NASA Astrophysics Data System (ADS)

    Mehari, Abraham; Koppen, Barbara Van; McCartney, Matthew; Lankford, Bruce

    Tanzania is currently attempting to improve water resources management through formal water rights and water fees systems, and formal institutions. The water rights system is expected to facilitate water allocation. The water fees system aims at cost recovery for water resources management services. To enhance community involvement in water management, Water User Associations (WUAs) are being established and, in areas with growing upstream-downstream conflicts, apex bodies of all users along the stressed river stretch. The Mkoji sub-catchment (MSC) in the Rufiji basin is one of the first where these formal water management systems are being attempted. This paper analyzes the effectiveness of these systems in the light of their expected merits and the consequences of the juxtaposition of contemporary laws with traditional approaches. The study employed mainly qualitative, but also quantitative, approaches on social and technical variables. Major findings were: (1) a good mix of formal (water fees and WUAs) and traditional (rotation-based water sharing, the Zamu) systems improved village-level water management services and reduced intra-scheme conflicts; (2) the water rights system has not brought abstractions into line with allocations; and (3) so far, the MSC Apex body has failed to mitigate inter-scheme conflicts. A more sophisticated design of allocation infrastructure and institutions is recommended.

  14. Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and to assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence to which to apply the formal methods. This paper gives partial specifications of the ANTS mission using four selected methods, an evaluation of those methods, and the properties needed in a formal method for effective specification and prediction of emergent behavior in swarm-based systems.

  15. Formal Specification of Information Systems Requirements.

    ERIC Educational Resources Information Center

    Kampfner, Roberto R.

    1985-01-01

    Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)

  16. MOS 2.0: Modeling the Next Revolutionary Mission Operations System

    NASA Technical Reports Server (NTRS)

    Delp, Christopher L.; Bindschadler, Duane; Wollaeger, Ryan; Carrion, Carlos; McCullar, Michelle; Jackson, Maddalena; Sarrel, Marc; Anderson, Louise; Lam, Doris

    2011-01-01

    Designed and implemented in the 1980's, the Advanced Multi-Mission Operations System (AMMOS) was a breakthrough for deep-space NASA missions, enabling significant reductions in the cost and risk of implementing ground systems. By designing a framework for use across multiple missions and adaptability to specific mission needs, AMMOS developers created a set of applications that have operated dozens of deep-space robotic missions over the past 30 years. We seek to leverage advances in technology and practice of architecting and systems engineering, using model-based approaches to update the AMMOS. We therefore revisit fundamental aspects of the AMMOS, resulting in a major update to the Mission Operations System (MOS): MOS 2.0. This update will ensure that the MOS can support an increasing range of mission types (such as orbiters, landers, rovers, penetrators and balloons), and that the operations systems for deep-space robotic missions can reap the benefits of an iterative multi-mission framework. This paper reports on the first phase of this major update. Here we describe the methods and formal semantics used to address MOS 2.0 architecture and some early results. Early benefits of this approach include improved stakeholder input and buy-in, the ability to articulate and focus effort on key, system-wide principles, and efficiency gains obtained by use of well-architected design patterns and the use of models to improve the quality of documentation and decrease the effort required to produce and maintain it. We find that such methods facilitate reasoning, simulation, analysis on the system design in terms of design impacts, generation of products (e.g., project-review and software-delivery products), and use of formal process descriptions to enable goal-based operations. This initial phase yields a forward-looking and principled MOS 2.0 architectural vision, which considers both the mission-specific context and long-term system sustainability.

  17. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that were proposed and used are sketched. The formal verifications of clock synchronization algorithms are concluded to show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  18. LWS/SET End-to-End Data System

    NASA Technical Reports Server (NTRS)

    Giffin, Geoff; Sherman, Barry; Colon, Gilberto (Technical Monitor)

    2002-01-01

    This paper describes the concept for the End-to-End Data System that will support NASA's Living With a Star Space Environment Testbed missions. NASA has initiated the Living With a Star (LWS) Program to develop a better scientific understanding to address the aspects of the connected Sun-Earth system that affect life and society. A principal goal of the program is to bridge the gap between science, engineering, and user application communities. The Space Environment Testbed (SET) Project is one element of LWS. The Project will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The end-to-end data system allows investigators to access the SET control center, command their experiments, and receive data from their experiments back at their home facility, using the Internet. The logical functioning of major components of the end-to-end data system is described, including the GSFC Payload Operations Control Center (POCC), SET Payloads, the GSFC SET Simulation Lab, SET Experiment PI Facilities, and Host Systems. Host Spacecraft Operations Control Centers (SOCC) and the Host Spacecraft are essential links in the end-to-end data system, but are not directly under the control of the SET Project. Formal interfaces will be established between these entities and elements of the SET Project. The paper describes data flow through the system, from PI facilities connecting to the SET operations center via the Internet, communications to SET carriers and experiments via host systems, to telemetry returns to investigators from their flight experiments. It also outlines the techniques that will be used to meet mission requirements, while holding development and operational costs to a minimum. Additional information is included in the original extended abstract.

  19. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  20. Formal reasoning about systems biology using theorem proving

    PubMed Central

    Hasan, Osman; Siddique, Umair; Tahar, Sofiène

    2017-01-01

    Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems-biology-based models of various diseases has been carried out using paper-and-pencil proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction-based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 Phosphorylation, the pathway leading to the death of cancer stem cells and the tumor growth based on cancer stem cells, which is used for the prognosis and future drug designs to treat cancer patients. PMID:28671950
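
    The notion of reaction kinetics formalized above follows the standard law of mass action; as a textbook reminder (not the HOL Light formalization itself), for an elementary reaction A + B -> C with rate constant k:

    ```latex
    v = k\,[A][B], \qquad
    \frac{d[A]}{dt} = \frac{d[B]}{dt} = -k\,[A][B], \qquad
    \frac{d[C]}{dt} = +k\,[A][B].
    ```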

  1. Aqua Education and Public Outreach

    NASA Astrophysics Data System (ADS)

    Graham, S. M.; Parkinson, C. L.; Chambers, L. H.; Ray, S. E.

    2011-12-01

    NASA's Aqua satellite was launched on May 4, 2002, with six instruments designed to collect data about the Earth's atmosphere, biosphere, hydrosphere, and cryosphere. Since the late 1990s, the Aqua mission has involved considerable education and public outreach (EPO) activities, including printed products, formal education, an engineering competition, webcasts, and high-profile multimedia efforts. The printed products include Aqua and instrument brochures, an Aqua lithograph, Aqua trading cards, NASA Fact Sheets on Aqua, the water cycle, and weather forecasting, and an Aqua science writers' guide. On-going formal education efforts include the Students' Cloud Observations On-Line (S'COOL) Project, the MY NASA DATA Project, the Earth System Science Education Alliance, and, in partnership with university professors, undergraduate student research modules. Each of these projects incorporates Aqua data into its inquiry-based framework. Additionally, high school and undergraduate students have participated in summer internship programs. An earlier formal education activity was the Aqua Engineering Competition, which was a high school program sponsored by the NASA Goddard Space Flight Center, Morgan State University, and the Baltimore Museum of Industry. The competition began with the posting of a Round 1 Aqua-related engineering problem in December 2002 and concluded in April 2003 with a final round of competition among the five finalist teams. The Aqua EPO efforts have also included a wide range of multimedia products. Prior to launch, the Aqua team worked closely with the Special Projects Initiative (SPI) Office to produce a series of live webcasts on Aqua science and the Cool Science website aqua.nasa.gov/coolscience, which displays short video clips of Aqua scientists and engineers explaining the many aspects of the Aqua mission. These video clips, the Aqua website, and numerous presentations have benefited from dynamic visualizations showing the Aqua launch, instrument deployments, instrument sensing, and the Aqua orbit. More recently, in 2008 the Aqua team worked with the ViewSpace production team from the Space Telescope Science Institute to create an 18-minute ViewSpace feature showcasing the science and applications of the Aqua mission. Then in 2010 and 2011, Aqua and other NASA Earth-observing missions partnered with National CineMedia on the "Know Your Earth" (KYE) project. During January and July 2010 and 2011, KYE ran 2-minute segments highlighting questions that promoted global climate literacy on lobby LCD screens in movie theaters throughout the U.S. Among the ongoing Aqua EPO efforts is the incorporation of Aqua data sets onto the Dynamic Planet, a large digital video globe that projects a wide variety of spherical data sets. Aqua also has a highly successful collaboration with EarthSky communications on the production of an Aqua/EarthSky radio show and podcast series. To date, eleven productions have been completed and distributed via the EarthSky network. In addition, a series of eight video podcasts (i.e., vodcasts) are under production by NASA Goddard TV in conjunction with Aqua personnel, highlighting various aspects of the Aqua mission.

  2. NASA propagation information center

    NASA Technical Reports Server (NTRS)

    Smith, Ernest K.; Flock, Warren L.

    1990-01-01

    The NASA Propagation Information Center became formally operational in July 1988. It is located in the Department of Electrical and Computer Engineering of the University of Colorado at Boulder. The center serves several roles: a communications medium between the propagation program and the outside world, a mechanism for internal communication within the program, and an aid to management.

  3. Towards Model-Driven End-User Development in CALL

    ERIC Educational Resources Information Center

    Farmer, Rod; Gruba, Paul

    2006-01-01

    The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…

  4. FY 1991 scientific and technical reports, articles, papers, and presentations

    NASA Technical Reports Server (NTRS)

    Turner, Joyce E. (Compiler)

    1991-01-01

    Formal NASA technical reports, papers published in technical journals, and presentations by MSFC personnel in FY 1991 are presented. Papers of MSFC contractors are also included. The information in this report may be of value to the scientific and engineering community in determining what information has been published and what is available.

  5. NASA Propagation Information Center

    NASA Technical Reports Server (NTRS)

    Smith, Ernest K.; Flock, Warren L.

    1989-01-01

    The NASA Propagation Information Center became formally operational in July 1988. It is located in the Department of Electrical and Computer Engineering of the University of Colorado at Boulder. The Center serves several roles: a communications medium between the propagation program and the outside world, a mechanism for internal communication within the program, and an aid to management.

  6. Sawlog weights for Appalachian hardwoods

    Treesearch

    Floyd G. Timson

    1972-01-01

    The tables are presented in this paper as reference material needed as a foundation for further work in the field of hardwood log weights. Such work may be undertaken by researchers, engineers, and equipment designers in the form of formal and informal studies, or by timbermen in the normal course of action to improve their operations.

  7. Lacking a Formal Concept of Limit: Advanced Non-Mathematics Students' Personal Concept Definitions

    ERIC Educational Resources Information Center

    Beynon, Kenneth A.; Zollman, Alan

    2015-01-01

    This mixed-methods study examines the conceptual understanding of limit among 22 undergraduate engineering students from two different sections of the same introductory differential equations course. The participants' concepts of limit (concept images and personal concept definitions) were examined using written tasks followed by one-on-one…

  8. STEM Learning through Engineering Design: Impact on Middle Secondary Students' Interest towards STEM

    ERIC Educational Resources Information Center

    Shahali, Edy Hafizan Mohd; Halim, Lilia; Rasul, Mohamad Sattar; Osman, Kamisah; Zulkifeli, Mohd Afendi

    2017-01-01

    The purpose of this study was to identify students' changes in (i) interest toward STEM subjects and (ii) interest in pursuing a STEM career after participating in a non-formal integrated STEM education programme. The programme exposed students to integrated STEM education through project-based learning involving the application of five phases…

  9. Evaluating the Learning Process of Mechanical CAD Students

    ERIC Educational Resources Information Center

    Hamade, R. F.; Artail, H. A.; Jaber, M. Y.

    2007-01-01

    There is little theoretical or experimental research on how beginner-level trainees learn CAD skills in formal training sessions. This work presents findings on how trainees develop their skills in utilizing a solid mechanical CAD tool (Pro/Engineer version 2000i[squared] and later version Wildfire). Exercises at the beginner and intermediate…

  10. BRST Formalism for Systems with Higher Order Derivatives of Gauge Parameters

    NASA Astrophysics Data System (ADS)

    Nirov, Kh. S.

    For a wide class of mechanical systems, invariant under gauge transformations with arbitrary higher order time derivatives of gauge parameters, the equivalence of Lagrangian and Hamiltonian BRST formalisms is proved. It is shown that the Ostrogradsky formalism establishes the natural rules to relate the BFV ghost canonical pairs with the ghosts and antighosts introduced by the Lagrangian approach. Explicit relation between corresponding gauge-fixing terms is obtained.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems, a class of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open-source formal tools and the ways in which they can be leveraged in digital design workflows.

  12. Formal Assurance for Cognitive Architecture Based Autonomous Agent

    NASA Technical Reports Server (NTRS)

    Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco

    2017-01-01

    Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our effort on the use of cognitive architectures to design autonomous agents that collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar by translating the agents into the formal verification environment Uppaal.

  13. Framework for integration of informal waste management sector with the formal sector in Pakistan.

    PubMed

    Masood, Maryam; Barlow, Claire Y

    2013-10-01

    Historically, waste pickers around the globe have utilised urban solid waste as a principal source of livelihood. Formal waste management sectors usually perceive the informal waste collection/recycling networks as backward, unhygienic and generally incompatible with modern waste management systems. It is proposed here that through careful planning and administration, these seemingly troublesome informal networks can be integrated into formal waste management systems in developing countries, providing mutual benefits. A theoretical framework for integration based on a case study in Lahore, Pakistan, is presented. The proposed solution suggests that the municipal authority should draw up and agree on a formal work contract with the group of waste pickers already operating in the area. The proposed system is assessed using the integration radar framework to classify and analyse possible intervention points between the sectors. The integration of the informal waste workers with the formal waste management sector is not a one-dimensional or single-step process. An ideal solution might aim for a balanced focus on all four categories of intervention, although this may be influenced by local conditions. Not all the positive benefits will be immediately apparent, but it is expected that as the acceptance of such projects increases over time, the informal recycling economy will financially supplement the formal system in many ways.

  14. The application of exergy to human-designed systems

    NASA Astrophysics Data System (ADS)

    Hamilton, P.

    2012-12-01

    Exergy is the portion of the total energy of a system that is available for conversion to useful work. Exergy takes into account both the quantity and the quality of energy. Heat is the inevitable product of using any form of high-quality energy such as electricity. Modern commercial buildings and industrial facilities use large amounts of electricity and so produce huge amounts of heat. This heat energy is typically treated as a waste product and discharged to the environment, while high-quality energy sources are consumed to satisfy low-quality heating and cooling needs. Tens of thousands of buildings and even whole communities could meet much of their heating and cooling needs through the capture and reuse of heat energy. Yet the application of exergy principles often faces resistance because it challenges conventions about how we design, construct and operate human-engineered systems. This session will review several exergy case studies and conclude with an audience discussion of how exergy principles may be both applied and highlighted in formal and informal education settings.
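
    For reference (not part of the abstract): the standard measure of the work potential of heat at temperature T, relative to an environment at temperature T_0, is the Carnot factor, so a quantity of waste heat Q carries exergy

      Ex_Q = Q \left( 1 - \frac{T_0}{T} \right) .

    For example, heat rejected at T = 350 K into a T_0 = 293 K environment retains only about 16% of its energy as work potential, which is why matching such low-quality heat to low-quality heating and cooling demands, rather than discarding it, is attractive.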

  15. Process Algebra Approach for Action Recognition in the Maritime Domain

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry

    2011-01-01

    The maritime environment poses a number of challenges for autonomous operation of surface boats. Among these challenges are the highly dynamic nature of the environment, the onboard sensing and reasoning requirements for obeying the navigational rules of the road, and the need for robust day/night hazard detection and avoidance. Development of full mission level autonomy entails addressing these challenges, coupled with inference of the tactical and strategic intent of possibly adversarial vehicles in the surrounding environment. This paper introduces PACIFIC (Process Algebra Capture of Intent From Information Content), an onboard system based on formal process algebras that is capable of extracting actions/activities from sensory inputs and reasoning within a mission context to ensure proper responses. PACIFIC is part of the Behavior Engine in CARACaS (Cognitive Architecture for Robotic Agent Command and Sensing), a system that is currently running on a number of U.S. Navy unmanned surface and underwater vehicles. Results from a series of experimental studies that demonstrate the effectiveness of the system are also presented.
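
    To illustrate the flavor of a process-algebra action model (a minimal sketch; it is not PACIFIC's formalism, and the process term and actions are invented), a CCS-style term with prefixing and choice can be checked against an observed action sequence:

      # Tiny process algebra: does an observed action sequence match a behavior term?
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Nil:                    # terminated process
          pass

      @dataclass(frozen=True)
      class Prefix:                 # a.P : perform action a, then behave as P
          action: str
          cont: object

      @dataclass(frozen=True)
      class Choice:                 # P + Q : behave as either branch
          left: object
          right: object

      def steps(proc):
          # Enumerate the (action, successor) pairs the process can perform.
          if isinstance(proc, Prefix):
              yield proc.action, proc.cont
          elif isinstance(proc, Choice):
              yield from steps(proc.left)
              yield from steps(proc.right)

      def accepts(proc, trace):
          # True if the observed action sequence is a trace of the process term.
          if not trace:
              return True
          return any(accepts(nxt, trace[1:])
                     for act, nxt in steps(proc) if act == trace[0])

      # Invented "loiter" behavior: approach, then either circle or shadow, then depart.
      LOITER = Prefix("approach",
                      Choice(Prefix("circle", Prefix("depart", Nil())),
                             Prefix("shadow", Prefix("depart", Nil()))))

      print(accepts(LOITER, ["approach", "shadow", "depart"]))   # True
      print(accepts(LOITER, ["approach", "depart"]))             # False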

  16. Statistical mechanics of few-particle systems: exact results for two useful models

    NASA Astrophysics Data System (ADS)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ≈ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
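
    One way to see the quoted microcanonical limit (a sketch in the classical, high-temperature regime; not taken from the paper): for n classical harmonic oscillators the number of accessible states grows as \Omega(E) \propto E^{n-1}, so

      S = k_B (n - 1) \ln E + \text{const}, \qquad
      \frac{1}{T} = \frac{\partial S}{\partial E} = \frac{(n - 1) k_B}{E}
      \;\Rightarrow\; E = (n - 1) k_B T
      \;\Rightarrow\; \frac{C}{n} = k_B \left( 1 - \frac{1}{n} \right),

    whereas the canonical ensemble gives E = n k_B T and hence C/n = k_B; the discrepancy is of order 1/n and is therefore only visible for clusters of a few tens of oscillators.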

  17. A Path to Planetary Protection Requirements for Human Exploration: A Literature Review and Systems Engineering Approach

    NASA Technical Reports Server (NTRS)

    Johnson, James E.; Conley, Cassie; Siegel, Bette

    2015-01-01

    As systems, technologies, and plans for the human exploration of Mars and other destinations beyond low Earth orbit begin to coalesce, it is imperative that frequent and early consideration is given to how planetary protection practices and policy will be upheld. While the development of formal planetary protection requirements for future human space systems and operations may still be a few years from fruition, guidance to appropriately influence mission and system design will be needed soon to avoid costly design and operational changes. The path to constructing such requirements is a journey that espouses key systems engineering practices of understanding shared goals, objectives and concerns, identifying key stakeholders, and iterating a draft requirement set to gain community consensus. This paper traces through each of these practices, beginning with a literature review of nearly three decades of publications addressing planetary protection concerns with respect to human exploration. Key goals, objectives and concerns, particularly with respect to notional requirements, required studies and research, and technology development needs have been compiled and categorized to provide a current 'state of knowledge'. This information, combined with the identification of key stakeholders in upholding planetary protection concerns for human missions, has yielded a draft requirement set that might feed future iteration among space system designers, exploration scientists, and the mission operations community. Combining the information collected with a proposed forward path will hopefully yield a mutually agreeable set of timely, verifiable, and practical requirements for human space exploration that will uphold international commitment to planetary protection.

  18. Single-Vector Calibration of Wind-Tunnel Force Balances

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; DeLoach, R.

    2003-01-01

    An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind-tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load and have no response to the other components. This is not entirely possible, even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the data needed to generate a mathematical model and determine the force measurement accuracy. A high-precision mechanical system is required to set the independent variables of applied load for the calibration experiment. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in even more complex systems that degrade load-application quality. The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, in which each independent variable is incremented individually throughout its full-scale range while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research into a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process, covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analysis of the data. To overcome the weaknesses of the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
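
    To make "generate a mathematical model" concrete (a minimal sketch under simplifying assumptions, not the SVS or LaRC analysis code; a real calibration includes interaction and higher-order terms and a designed load schedule), a first-order calibration matrix can be estimated from applied-load/measured-output pairs by ordinary least squares:

      # Fit a 6x6 sensitivity matrix from synthetic calibration data.
      import numpy as np

      rng = np.random.default_rng(0)
      n_points, n_components = 40, 6

      # Synthetic "true" sensitivities: near-diagonal with small interaction effects.
      true_C = np.eye(n_components) + 0.02 * rng.standard_normal((n_components, n_components))

      loads = rng.uniform(-1.0, 1.0, size=(n_points, n_components))       # applied loads (normalized)
      outputs = loads @ true_C.T + 1e-4 * rng.standard_normal((n_points, n_components))  # bridge readings

      # Least-squares estimate of the sensitivity matrix (outputs ~ loads @ C_hat.T).
      X, *_ = np.linalg.lstsq(loads, outputs, rcond=None)
      C_hat = X.T

      # Using the calibration: recover loads from new readings by inverting the model.
      est_loads = outputs @ np.linalg.inv(C_hat).T
      print("max load-recovery error:", float(np.abs(est_loads - loads).max()))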

  19. Providing solid angle formalism for skyshine calculations.

    PubMed

    Gossman, Michael S; Pahikkala, A Jussi; Rising, Mary B; McGinley, Patton H

    2010-08-17

    We detail, derive and correct the technical use of the solid angle variable identified in formal guidance that relates skyshine calculations to dose-equivalent rate. We further recommend it for use with all National Council on Radiation Protection and Measurements (NCRP), Institute of Physics and Engineering in Medicine (IPEM) and similar reports. In general, for beams of identical width but different resulting areas, the analytical pyramidal solution is 1.27 times greater than a misapplied analytical conical solution (to within a maximum deviation of ±1.0%) for all field sizes up to 40 × 40 cm². We therefore recommend determining the exact result with the analytical pyramidal solution for square beams and with the analytical conical solution for circular beams.
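
    One way to see where a factor of about 1.27 arises (an interpretation under a small-angle assumption, not taken from the paper): at distance d the solid angle of a field is approximately its area divided by d², so a square field of side w subtends roughly w²/d² while a circle of diameter w subtends roughly \pi w² / (4 d²), giving

      \frac{\Omega_{\text{pyramid}}}{\Omega_{\text{cone}}} \approx \frac{w^2}{\pi w^2 / 4} = \frac{4}{\pi} \approx 1.27 ,

    which is why applying the conical formula to a square beam of the same width understates the solid angle used in the skyshine calculation.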

  20. Industry Strength Tool and Technology for Automated Synthesis of Safety-Critical Applications from Formal Specifications

    DTIC Science & Technology

    2015-11-01

    Excerpted fragments (front matter and a comparison table of modeling frameworks): "2.3.4 Input/Output Automata ... various other modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, Multi-dimensional SDF, etc. are also used for designing..." The partially recovered comparison table classifies Petri Nets (graphical notation; formal; used for modeling distributed systems) and I/O Automata (notation listed as "Both"; formal; description truncated), with a preceding row described as formal and ideally suited to model DSP applications.
