Sample records for formal specification techniques

  1. Formalisms for user interface specification and design

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent J.

    1989-01-01

    The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.

  2. UML activity diagrams in requirements specification of logic controllers

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał

    2015-12-01

    Logic controller specifications can be prepared using various techniques. One of them is the widely understood and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system meets the project requirements. In the approach we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In the paper we propose to use UML activity diagrams in requirements definition and then to formalize them as temporal logic formulas. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.

  3. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods treatment by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Space Station Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.

  4. An elementary tutorial on formal specification and verification using PVS

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1993-01-01

    A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of specific example - an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided and thus it can be effectively used while one is sitting at a computer terminal.
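
    As a loose illustration of the reasoning style the tutorial teaches, the Python sketch below models a reservation system as a state machine with two operations and checks a state invariant exhaustively over short operation sequences. The seat map, operation names, and invariant are hypothetical stand-ins for Butler's actual PVS theory, and the exhaustive check only approximates what PVS establishes by proof.

      # Sketch only: a toy reservation state machine, not the tutorial's PVS theory.
      from itertools import product

      SEATS = {"1A", "1B", "2A", "2B"}

      def make_assn(state, seat, passenger):
          # Assign a passenger to a seat when both are free; otherwise a no-op.
          if seat not in SEATS or seat in state or passenger in state.values():
              return state
          new = dict(state)
          new[seat] = passenger
          return new

      def cancel_assn(state, passenger):
          # Remove a passenger's assignment, if any.
          return {s: p for s, p in state.items() if p != passenger}

      def invariant(state):
          # Only real seats are assigned, and no passenger holds two seats.
          return set(state) <= SEATS and len(set(state.values())) == len(state)

      # Exhaustively apply short operation sequences and confirm the invariant
      # is preserved -- a testing approximation of the PVS inductive proof.
      ops = [lambda s: make_assn(s, "1A", "jones"),
             lambda s: make_assn(s, "1A", "smith"),
             lambda s: cancel_assn(s, "jones")]
      for seq in product(ops, repeat=3):
          state = {}
          for op in seq:
              state = op(state)
              assert invariant(state)
      print("invariant preserved on all sampled sequences")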

  5. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation, specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technological transfer potential, and next steps.
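
    The presentation's core idea lends itself to a template-based sketch: match a restricted natural language sentence against known temporal patterns and emit the corresponding LTL formula. The grammar and pattern names below are invented for illustration and are not the tool described in the presentation.

      # Hypothetical template-based translation of restricted English to LTL.
      import re

      PATTERNS = [
          # "x shall always hold"      ->  G x
          (re.compile(r"(\w+) shall always hold"), lambda m: f"G {m.group(1)}"),
          # "x shall eventually hold"  ->  F x
          (re.compile(r"(\w+) shall eventually hold"), lambda m: f"F {m.group(1)}"),
          # "after x, y shall hold"    ->  G (x -> F y)
          (re.compile(r"after (\w+), (\w+) shall hold"),
           lambda m: f"G ({m.group(1)} -> F {m.group(2)})"),
      ]

      def to_ltl(requirement):
          for pattern, build in PATTERNS:
              m = pattern.fullmatch(requirement.strip().lower())
              if m:
                  return build(m)
          raise ValueError(f"no template matches: {requirement!r}")

      print(to_ltl("after arming, firing shall hold"))  # G (arming -> F firing)
      print(to_ltl("safe_mode shall always hold"))      # G safe_mode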

  6. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  7. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  8. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  9. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  10. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  11. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1997-01-01

    Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.

  12. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  13. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
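
    A simple way to picture test-suite generation from a formal model, sketched below under invented names: enumerate bounded event paths of a protocol state machine and emit each path as a conformance test case. The three-state machine here is hypothetical and vastly simpler than RMP itself.

      # Hypothetical protocol model; each bounded path becomes one test case.
      from collections import deque

      TRANSITIONS = {
          ("closed", "join"):  "member",
          ("member", "send"):  "member",
          ("member", "leave"): "closed",
      }

      def test_suite(initial="closed", max_depth=3):
          suite, frontier = [], deque([(initial, [])])
          while frontier:
              state, path = frontier.popleft()
              if path:
                  suite.append(path)  # each bounded event path is one test case
              if len(path) == max_depth:
                  continue
              for (src, event), dst in TRANSITIONS.items():
                  if src == state:
                      frontier.append((dst, path + [event]))
          return suite

      for case in test_suite():
          print(" -> ".join(case))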

  14. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  15. Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and to assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence to which to apply the formal methods. The paper presents partial specifications of the ANTS mission using four selected methods, then evaluates those methods and identifies the properties a formal method needs for effective specification and prediction of emergent behavior in swarm-based systems.

  16. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  17. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
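
    The flavor of the binary search example can be suggested in Python, with the Larch-style precondition, loop invariant, and postcondition rendered as runtime assertions rather than machine-checked proof obligations; the code below is our illustration, not the verified Ada unit from the paper.

      def binary_search(a, key):
          # Precondition: a is sorted in nondecreasing order.
          assert all(a[i] <= a[i + 1] for i in range(len(a) - 1))
          lo, hi = 0, len(a) - 1
          while lo <= hi:
              # Loop invariant: if key occurs in a, its index lies in [lo, hi].
              assert all(a[i] != key for i in range(lo))
              assert all(a[i] != key for i in range(hi + 1, len(a)))
              mid = (lo + hi) // 2
              if a[mid] == key:
                  return mid
              if a[mid] < key:
                  lo = mid + 1
              else:
                  hi = mid - 1
          return -1

      # Postcondition spot-checks on a sample input.
      xs = [2, 3, 5, 7, 11]
      assert xs[binary_search(xs, 7)] == 7
      assert binary_search(xs, 4) == -1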

  18. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  19. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.
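
    At its core, this kind of safety check is exhaustive exploration of a state graph. The Python sketch below explores a toy ATM transition system and verifies that cash can only be dispensed after PIN validation; the states, transitions, and property are illustrative inventions, not the paper's PROMELA model.

      from collections import deque

      # Toy ATM transition system (invented for illustration).
      TRANSITIONS = {
          "idle":          ["card_inserted"],
          "card_inserted": ["pin_ok", "pin_bad"],
          "pin_bad":       ["card_inserted", "idle"],
          "pin_ok":        ["dispensing", "idle"],
          "dispensing":    ["idle"],
      }

      def check_dispense_guarded(initial="idle"):
          # Exhaustive state-space exploration, the essence of model checking:
          # visit every reachable state and confirm the only edges into
          # "dispensing" leave from "pin_ok".
          seen, frontier = set(), deque([initial])
          while frontier:
              s = frontier.popleft()
              if s in seen:
                  continue
              seen.add(s)
              for t in TRANSITIONS.get(s, []):
                  if t == "dispensing" and s != "pin_ok":
                      return False  # counterexample edge found
                  if t not in seen:
                      frontier.append(t)
          return True

      print(check_dispense_guarded())  # True: the toy model satisfies the property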

  20. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  1. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on programming of logic controllers. It is important that the program code of a logic controller executes flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
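
    The rule-based generation step can be pictured as a small translator: each rule of the logical model, a guard paired with an action, becomes a guarded statement inside the controller's scan function. The Python sketch below emits C text from a hypothetical rule list; the rules and names are ours, far simpler than a control interpreted Petri net.

      # Hypothetical guard/action rules; emit_c turns them into a C scan function.
      RULES = [
          ("start && !running",     "running = 1;"),
          ("stop",                  "running = 0;"),
          ("running && level_high", "valve_open = 0;"),
      ]

      def emit_c(rules):
          lines = ["void controller_step(void) {"]
          for guard, action in rules:
              lines.append(f"    if ({guard}) {{ {action} }}")
          lines.append("}")
          return "\n".join(lines)

      print(emit_c(RULES))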

  2. Defining the IEEE-854 floating-point standard in PVS

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine checked verification of floating-point systems. This formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.
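
    The style of definition involved can be hinted at in Python: an IEEE-854-style finite number is a sign, an exponent, and a digit string in radix b, with a validity predicate and an exact valuation function. The parameter names and bounds below are our paraphrase for illustration, not the PVS theories themselves.

      # Sketch of a radix-independent floating-point representation.
      from fractions import Fraction

      def valid(b, p, emin, emax, sign, e, digits):
          # Well-formedness: radix at least 2, exactly p digits in [0, b),
          # a binary sign, and an exponent within bounds.
          return (b >= 2 and len(digits) == p
                  and all(0 <= d < b for d in digits)
                  and sign in (0, 1) and emin <= e <= emax)

      def value(b, sign, e, digits):
          # (-1)^sign * b^e * (d0 . d1 d2 ... d_{p-1}) in radix b, computed exactly.
          mantissa = sum(Fraction(d, b**i) for i, d in enumerate(digits))
          return (-1) ** sign * Fraction(b) ** e * mantissa

      assert valid(b=10, p=3, emin=-5, emax=5, sign=0, e=1, digits=[1, 2, 5])
      print(value(10, 0, 1, [1, 2, 5]))  # 25/2, i.e. 12.5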

  3. On verifying a high-level design. [cost and error analysis]

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  4. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  5. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  6. Formal semantic specifications as implementation blueprints for real-time programming languages

    NASA Technical Reports Server (NTRS)

    Feyock, S.

    1981-01-01

    Formal definitions of language and system semantics provide highly desirable checks on the correctness of implementations of programming languages and their runtime support systems. If these definitions can give concrete guidance to the implementor, major increases in implementation accuracy and decreases in implementation effort can be achieved. It is shown that, of the wide variety of available methods, the Hgraph (hypergraph) definitional technique (Pratt, 1975) is best suited to serve as such an implementation blueprint. A discussion and example of the Hgraph technique is presented, as well as an overview of the growing body of implementation experience of real-time languages based on Hgraph semantic definitions.

  7. Toward a Formal Evaluation of Refactorings

    NASA Technical Reports Server (NTRS)

    Paul, John; Kuzmina, Nadya; Gamboa, Ruben; Caldwell, James

    2008-01-01

    Refactoring is a software development strategy that characteristically alters the syntactic structure of a program without changing its external behavior [2]. In this talk we present a methodology for extracting formal models from programs in order to evaluate how incremental refactorings affect the verifiability of their structural specifications. We envision that this same technique may be applicable to other types of properties such as those that concern the design and maintenance of safety-critical systems.

  8. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics), and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.
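
    The first analysis in that chain, consistency checking, can be shown in miniature: when requirement fragments are reduced to propositional constraints, they are consistent exactly when some assignment satisfies all of them. The Python sketch below does this by brute force over invented requirements; real tools use temporal logics and SAT/SMT or model-checking engines.

      # Brute-force propositional consistency check over invented requirements.
      from itertools import product

      ATOMS = ["brake", "accelerate", "alarm"]

      REQUIREMENTS = [
          lambda v: not (v["brake"] and v["accelerate"]),  # mutually exclusive
          lambda v: (not v["brake"]) or v["alarm"],        # braking raises alarm
          lambda v: v["brake"],                            # scenario: braking now
      ]

      def consistent(reqs):
          # Consistent iff some truth assignment satisfies every requirement.
          for bits in product([False, True], repeat=len(ATOMS)):
              v = dict(zip(ATOMS, bits))
              if all(r(v) for r in reqs):
                  return v  # a witness assignment
          return None       # inconsistent: no assignment works

      print(consistent(REQUIREMENTS))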

  9. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  10. Gulf Coast Clean Energy Application Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillingham, Gavin

    The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region with a realization of more efficient energy generation, reduced emissions and a more resilient infrastructure. Specific to research, we did not formally investigate any techniques with any formal research design or methodology.

  11. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for the analysis of industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in specifically skilled human resources with sufficient theoretical knowledge of those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process, and promotes a discussion of the main difficulties that can be found and a possibility for handling those difficulties. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which nowadays has become a common delivery model for many applications because it is typically accessed by users via the internet.

  12. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  13. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  14. Research on Synthesis of Concurrent Computing Systems.

    DTIC Science & Technology

    1982-09-01

    1.5.1 An Informal Description of the Techniques ... 1.5.2 Formal Definitions of Aggregation and Virtualisation ... sparsely interconnected networks. We have also developed techniques to create Kung's systolic array parallel structure from a specification of matrix ... results of the computation of that element. For example, if Aij is computed using a single enumeration, then virtualisation would produce a three ...

  15. Submental liposuction versus formal cervicoplasty: which one to choose?

    PubMed

    Fattahi, Tirbod

    2012-12-01

    Esthetic rejuvenation of the submental area is a fairly common concern of patients seeking cosmetic surgery. There are several techniques used to obtain esthetic results. A common dilemma is the proper determination as to which procedure, liposuction versus formal cervicoplasty, is more appropriate. This manuscript describes the factors involved in the aging process of the submental area, as well as the inherent advantages of formal cervicoplasty over liposuction. A comprehensive review of the intrinsic and extrinsic aging process is described, and advantages and disadvantages of liposuction as well as cervicoplasty are detailed. On the basis of the specific factors leading to the fullness of the anterior neck/submental area, proper rejuvenation technique must include platysmaplasty, in addition to liposuction. Isolated liposuction is only beneficial in an isolated group of cosmetic patients. Formal cervicoplasty, including open liposuction and platysmaplasty, is a superior operation compared with isolated liposuction of the submental area. Whereas liposuction does have a role in cosmetic surgery of the submental area, it is not a comprehensive procedure and does not address all of the anatomic components leading to submental fullness. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1996-01-01

    Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for the operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.

  17. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  18. Quantified Event Automata: Towards Expressive and Efficient Runtime Monitors

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Falcone, Ylies; Havelund, Klaus; Reger, Giles; Rydeheard, David

    2012-01-01

    Runtime verification is the process of checking a property on a trace of events produced by the execution of a computational system. Runtime verification techniques have recently focused on parametric specifications where events take data values as parameters. These techniques exist on a spectrum inhabited by both efficient and expressive techniques. These characteristics are usually shown to be conflicting - in state-of-the-art solutions, efficiency is obtained at the cost of loss of expressiveness and vice versa. To seek a solution to this conflict we explore a new point on the spectrum by defining an alternative runtime verification approach. We introduce a new formalism for concisely capturing expressive specifications with parameters. Our technique is more expressive than the currently most efficient techniques while at the same time allowing for optimizations.
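
    Parametric monitoring can be illustrated by trace slicing: events carry data values, the trace is split into one slice per value, and each slice is checked against a small property automaton. The Python sketch below monitors an invented "every opened file is eventually closed" property; it conveys the general idea only, not the QEA formalism itself.

      # Toy parametric monitor: slice the trace by file id and track each slice.
      def monitor(trace):
          state = {}  # file id -> "open"
          for event, f in trace:
              if event == "open":
                  if state.get(f) == "open":
                      return f"violation: {f} opened twice"
                  state[f] = "open"
              elif event == "close":
                  if state.get(f) != "open":
                      return f"violation: {f} closed while not open"
                  del state[f]
          return f"violation: {sorted(state)} never closed" if state else "ok"

      print(monitor([("open", "a"), ("open", "b"), ("close", "a"), ("close", "b")]))
      print(monitor([("open", "a"), ("close", "b")]))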

  19. A Survey of Formal Methods for Intelligent Swarms

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Christopher A.

    2004-01-01

    Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design, and to assure that the proper behaviors will emerge. These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing or in verifying and validating these types of missions. A significant challenge when verifying and validating swarms of intelligent interacting agents is how to determine that the possible exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm types of missions NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions are going to consist of autonomous spacecraft that may be out of contact with the earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. Providing V&V (verification and validation) for this type of mission is new to NASA; it represents the cutting edge in system correctness and requires higher levels of assurance than other (traditional) missions that use a single spacecraft or a small number of spacecraft that are deterministic in nature and have near-continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., the underlying system will go from one state to another or not into a specific state) and check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems.
From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.

  20. The Appropriation of Fine Art into Contemporary Narrative Picturebooks

    ERIC Educational Resources Information Center

    Serafini, Frank

    2015-01-01

    Many picturebook artists have been formally trained in specific artistic styles, movements, and techniques. These artists appropriate and transform works of fine art to varying degrees to fit the themes and designs of the stories they illustrate and publish, and to increase the significance and impact of their illustrations. The…

  1. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  2. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused, and two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  3. The evolution of optics education at the U.S. National Optical Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Pompea, Stephen M.; Walker, Constance E.; Sparks, Robert T.

    2014-07-01

    The last decade of optics education at the U.S. National Optical Astronomy Observatory will be described in terms of program planning, assessment of community needs, identification of networks and strategic partners, the establishment of specific program goals and objectives, and program metrics and evaluation. A number of NOAO's optics education programs for formal and informal audiences will be described, including our Hands-On Optics program, illumination engineering/dark skies energy education programs, afterschool programs, adaptive optics education program, student outreach, and Galileoscope program. Particular emphasis will be placed on techniques for funding and sustaining high-quality programs. The use of educational gap analysis to identify the key needs of the formal and informal educational systems will be emphasized as a technique that has helped us to maximize our educational program effectiveness locally, regionally, nationally, and in Chile.

  4. Imagery mnemonics and memory remediation.

    PubMed

    Richardson, J T

    1992-02-01

    This paper evaluates the claim that imagery mnemonic techniques are useful in remediation of memory disorders in brain-damaged patients. Clinical research has confirmed that such techniques can lead to improved performance on formal testing in a number of neurologic disease populations and following lesions of either the left or right hemisphere. However, those patients with more severe forms of amnesia and those with medial or bilateral damage do not improve unless the learning task is highly structured. Even among patients who show improvement on formal testing, there is little evidence that they maintain the use of these techniques in similar learning tasks or generalize the use to new learning situations. Imagery mnemonics also appear to be of little practical value in the daily activities that are of most concern to brain-damaged patients themselves. The effectiveness of imagery mnemonics appears to depend upon the patients' motivation and insight rather than upon their intelligence or educational level. Instead of training patients in specific mnemonic techniques, clinicians should promote the development of "meta-cognitive" skills and the acquisition of knowledge about domains of practical significance.

  5. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal methods analysis technique used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  6. Orbiter Avionics Radiation Handbook

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon D.

    1999-01-01

    This handbook was assembled to document the radiation environment for design of Orbiter avionics. It also maps the environment through vehicle shielding and mission usage into discrete requirements such as total dose. Some details of analytical techniques for calculating radiation effects are provided. It is anticipated that appropriate portions of this document will be added to formal program specifications.

  7. Effects of Textual Enhancement and Input Enrichment on L2 Development

    ERIC Educational Resources Information Center

    Rassaei, Ehsan

    2015-01-01

    Research on second language (L2) acquisition has recently sought to include formal instruction into second and foreign language classrooms in a more unobtrusive and implicit manner. Textual enhancement and input enrichment are two techniques which are aimed at drawing learners' attention to specific linguistic features in input and at the same…

  8. Report on Ada (Trademark) Program Libraries Workshop Held at Monterey, California on November 1-3, 1983,

    DTIC Science & Technology

    1983-11-03

    capability. An intelligent library management system will be supported by knowledge-based techniques. In fact, until a formal specification of library ... from artificial intelligence and information science might also be useful, for example automatic indexing and cataloging schemes, methods for fast ... Artificial Intelligence 5:1045-1058, 1977. [Burstall & Goguen 80] Burstall, R. M., and Goguen, J. A. The Semantics of Clear, a Specification Language. In ...

  9. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  10. Using Mobile TLA as a Logic for Dynamic I/O Automata

    NASA Astrophysics Data System (ADS)

    Kapus, Tatjana

    Input/Output (I/O) automata and the Temporal Logic of Actions (TLA) are two well-known techniques for the specification and verification of concurrent systems. Over the past few years, they have been extended to the so-called dynamic I/O automata and, respectively, Mobile TLA (MTLA) in order to be more appropriate for mobile agent systems. Dynamic I/O automata is just a mathematical model, whereas MTLA is a logic with a formally defined language. In this paper, therefore, we investigate how MTLA could be used as a formal language for the specification of dynamic I/O automata. We do this by writing an MTLA specification of a travel agent system which has been specified semi-formally in the literature on that model. In this specification, we deal with always existing agents as well as with an initially unknown number of dynamically created agents, with mobile and non-mobile agents, with I/O-automata-style communication, and with the changing communication capabilities of mobile agents. We have previously written a TLA specification of this system. This paper shows that an MTLA specification of such a system can be more elegant and faithful to the dynamic I/O automata definition because the agent existence and location can be expressed directly by using agent and location names instead of special variables as in TLA. It also shows how the reuse of names for dynamically created and destroyed agents within the dynamic I/O automata framework can be specified in MTLA.

  11. An ORCID based synchronization framework for a national CRIS ecosystem.

    PubMed

    Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno

    2015-01-01

    PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles from the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects of how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholders and essential to certify compliant services.

  12. Systematic errors in transport calculations of shear viscosity using the Green-Kubo formalism

    NASA Astrophysics Data System (ADS)

    Rose, J. B.; Torres-Rincon, J. M.; Oliinychenko, D.; Schäfer, A.; Petersen, H.

    2018-05-01

    The purpose of this study is to provide a reproducible framework for the use of the Green-Kubo formalism to extract transport coefficients. More specifically, in the case of shear viscosity, we investigate the limitations and technical details of fitting the auto-correlation function to a decaying exponential. This fitting procedure is found to be applicable for systems interacting through both constant and energy-dependent cross-sections, although in the latter case only for sufficiently dilute systems. We find that the optimal fit technique consists in simultaneously fixing the intercept of the correlation function and using a fitting interval constrained by the relative error on the correlation function. The formalism is then applied to the full hadron gas, for which we obtain the shear viscosity to entropy ratio.
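
    As a hedged illustration of the fitting recipe this abstract describes (our sketch, not the authors' code; function names and the choice of scipy are assumptions), the snippet below estimates the shear viscosity from a stress time series via the Green-Kubo relation eta = V/(kB*T) * integral of <P_xy(0)P_xy(t)> dt, pinning the intercept of the auto-correlation at C(0) and fitting only the decay time.

    ```python
    # Hypothetical sketch of the fit described above (not the authors' code):
    # the auto-correlation is fitted by C(0)*exp(-t/tau) with the intercept
    # held fixed, and the Green-Kubo integral of the fit is C(0)*tau.
    import numpy as np
    from scipy.optimize import curve_fit

    def autocorrelation(x):
        """Biased estimator of <x(0) x(t)> for a stationary series."""
        n = len(x)
        return np.correlate(x, x, mode="full")[n - 1:] / n

    def shear_viscosity(pxy, dt, volume, kB_T, fit_window):
        """pxy: off-diagonal stress series; fit_window: number of lags kept,
        e.g. chosen from the relative error on the correlation function."""
        c = autocorrelation(pxy)
        t = np.arange(fit_window) * dt
        c0 = c[0]                                   # fixed intercept C(0)
        (tau,), _ = curve_fit(lambda t, tau: c0 * np.exp(-t / tau),
                              t, c[:fit_window], p0=[10 * dt])
        # Analytic integral of the fitted exponential: C(0) * tau.
        return volume / kB_T * c0 * tau
    ```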

  13. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  14. Formal methods demonstration project for space applications

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.

    1995-01-01

    The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered for illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data is shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.

  15. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  16. Automatic Review of Abstract State Machines by Meta Property Verification

    NASA Technical Reports Server (NTRS)

    Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia

    2010-01-01

    A model review is a validation technique aimed at determining whether a model is of sufficient quality; it allows defects to be identified early in system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first identify a family of typical vulnerabilities and defects that a developer can introduce during ASM-based modeling, and we express such faults as violations of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the results of applying this ASM review process to several specifications.
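
    For flavor, here is a minimal stand-in for the idea (ours, not the authors' tool, which maps meta-properties to temporal logic and model checks them): one typical review meta-property is that no rule is permanently disabled, checked below by brute-force exploration of a toy ASM's reachable states.

    ```python
    # Illustrative sketch (not the authors' tool): the meta-property
    # "no rule is never enabled" is checked by exhaustive exploration of a
    # toy ASM's reachable states; a dead rule signals a specification defect.
    rules = [
        ("start", lambda s: s == "idle",  lambda s: "busy"),
        ("stop",  lambda s: s == "busy",  lambda s: "idle"),
        ("panic", lambda s: s == "error", lambda s: "idle"),  # dead: "error" unreachable
    ]

    def reachable(initial):
        seen, frontier = {initial}, [initial]
        while frontier:
            s = frontier.pop()
            for _, guard, update in rules:
                if guard(s) and update(s) not in seen:
                    seen.add(update(s))
                    frontier.append(update(s))
        return seen

    states = reachable("idle")
    for name, guard, _ in rules:
        if not any(guard(s) for s in states):
            print(f"review warning: rule '{name}' is never enabled")
    ```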

  17. Initiating Formal Requirements Specifications with Object-Oriented Models

    NASA Technical Reports Server (NTRS)

    Ampo, Yoko; Lutz, Robyn R.

    1994-01-01

    This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.

  18. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
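
    The kind of shallow-but-useful check this lightweight approach enables can be sketched as follows (a hypothetical fragment in the spirit of the paper, not the actual FDIR model): a partial mode-transition table is tested for determinism, and any two transitions leaving the same mode on the same event are flagged as an inconsistency.

    ```python
    # Hypothetical fragment (not the FDIR model): a shallow consistency
    # check on a partial mode-transition table -- two transitions out of
    # the same mode on the same event constitute nondeterminism.
    from collections import defaultdict

    transitions = [
        ("nominal", "sensor_fault",  "isolate"),
        ("nominal", "sensor_fault",  "recover"),   # nondeterminism, flagged below
        ("isolate", "fault_cleared", "nominal"),
    ]

    targets = defaultdict(list)
    for src, event, dst in transitions:
        targets[(src, event)].append(dst)

    for (src, event), dsts in targets.items():
        if len(dsts) > 1:
            print(f"inconsistency: mode '{src}' on event '{event}' -> {dsts}")
    ```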

  19. Toward ab initio molecular dynamics modeling for sum-frequency generation spectra; an efficient algorithm based on surface-specific velocity-velocity correlation function.

    PubMed

    Ohto, Tatsuhiko; Usui, Kota; Hasegawa, Taisuke; Bonn, Mischa; Nagata, Yuki

    2015-09-28

    Interfacial water structures have been studied intensively by probing the O-H stretch mode of water molecules using sum-frequency generation (SFG) spectroscopy. This surface-specific technique is finding increasingly widespread use, and accordingly, computational approaches to calculate SFG spectra from molecular dynamics (MD) trajectories of interfacial water molecules have been developed and employed to correlate specific spectral signatures with distinct interfacial water structures. Such simulations typically require relatively long (several nanoseconds) MD trajectories to allow reliable calculation of the SFG response functions through the dipole moment-polarizability time correlation function. These long trajectories limit the use of computationally expensive MD techniques such as ab initio MD and centroid MD simulations. Here, we present an efficient algorithm determining the SFG response from the surface-specific velocity-velocity correlation function (ssVVCF). This ssVVCF formalism allows us to calculate SFG spectra using an MD trajectory of only ∼100 ps, reducing the computational cost by almost an order of magnitude. We demonstrate that the O-H stretch SFG spectra at the water-air interface calculated by using the ssVVCF formalism reproduce well those calculated by using the dipole moment-polarizability time correlation function. Furthermore, we applied this ssVVCF technique to computing the SFG spectra from ab initio MD trajectories with various density functionals. We report that the SFG responses computed from both ab initio MD simulations and MD simulations with an ab initio based force field model do not show a positive feature in their imaginary component at 3100 cm^(-1).
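
    A schematic sketch of the computational core (ours; the actual ssVVCF response in the paper carries dipole and polarizability derivative prefactors omitted here): a surface-weighted velocity auto-correlation is accumulated over lags and Fourier transformed into a spectrum.

    ```python
    # Schematic only -- illustrates the correlation-then-transform core,
    # not the paper's full ssVVCF response function.
    import numpy as np

    def ss_vvcf_spectrum(v_oh, weights, dt, n_lags):
        """v_oh: (n_frames, n_bonds) O-H stretch velocities; weights: per-bond
        surface-specificity factors, e.g. the sign of the O-H orientation
        relative to the surface normal."""
        n_frames = v_oh.shape[0]
        w = v_oh * weights                       # broadcast per-bond weights
        c = np.array([np.mean(np.sum(w[:n_frames - lag] * v_oh[lag:], axis=1))
                      for lag in range(n_lags)])
        return np.fft.rfftfreq(n_lags, d=dt), np.fft.rfft(c)
    ```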

  20. Portable Wireless LAN Device and Two-Way Radio Threat Assessment for Aircraft VHF Communication Radio Band

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  1. Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.

    PubMed

    Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang

    2017-01-01

    Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove correctness when genetic algorithms are applied. This paper focuses on formal modeling of crossover operations, which are among the most important operations in genetic algorithms. Specifically, we formalize crossover operations for the first time with higher-order logic, based on HOL4, which is easy to deploy thanks to its user-friendly programming environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
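
    An executable analogue (ours, in Python rather than HOL4) of the kind of property one would prove about single-point crossover: offspring preserve the chromosome length, and at each position the pair of offspring genes is exactly the pair of parent genes.

    ```python
    # Runtime check of a crossover soundness property; a formal treatment
    # like the paper's proves this as a theorem instead of asserting it.
    import random

    def single_point_crossover(p1, p2, point=None):
        assert len(p1) == len(p2) >= 2
        point = random.randrange(1, len(p1)) if point is None else point
        return p1[:point] + p2[point:], p2[:point] + p1[point:]

    def crossover_is_sound(p1, p2, c1, c2):
        return (len(c1) == len(c2) == len(p1)
                and all({a, b} == {x, y}
                        for a, b, x, y in zip(c1, c2, p1, p2)))

    p1, p2 = [0, 1, 2, 3], [4, 5, 6, 7]
    c1, c2 = single_point_crossover(p1, p2)
    assert crossover_is_sound(p1, p2, c1, c2)
    ```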

  2. But Are They Learning? Getting Started in Classroom Evaluation

    PubMed Central

    Dancy, Melissa H; Beichner, Robert J

    2002-01-01

    There are increasing numbers of traditional biologists, untrained in educational research methods, who want to develop and assess new classroom innovations. In this article we argue the necessity of formal research over normal classroom feedback. We also argue that traditionally trained biologists can make significant contributions to biology pedagogy. We then offer some guidance to the biologist with no formal educational research training who wants to get started. Specifically, we suggest ways to find out what others have done, we discuss the difference between qualitative and quantitative research, and we elaborate on the process of gaining insights from student interviews. We end with an example of a project that has used many different research techniques. PMID:12459792

  3. Representation in incremental learning

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Work focused on two areas in machine learning: representation for inductive learning and how to apply concept learning techniques to learning state preferences, which can represent search control knowledge for problem solving. Specifically, in the first area the issues of the effect of representation on learning, on how learning formalisms are biased, and how concept learning can benefit from the use of a hybrid formalism are addressed. In the second area, the issues of developing an agent to learn search control knowledge from the relative values of states, of the source of that qualitative information, and of the ability to use both quantitative and qualitative information in order to develop an effective problem-solving policy are examined.

  4. Automatically Grading Customer Confidence in a Formal Specification.

    ERIC Educational Resources Information Center

    Shukur, Zarina; Burke, Edmund; Foxley, Eric

    1999-01-01

    Describes an automatic grading system for a formal methods computer science course that is able to evaluate a formal specification written in the Z language. Quality is measured by considering first, specification correctness (syntax, semantics, and satisfaction of customer requirements), and second, specification maintainability (comparison of…

  5. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
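
    The algorithm itself is compact enough to sketch (a toy rendition, not the verified specification): OM(m) recursively relays received values and takes majorities, so that with n >= 3m + 1 participants all loyal lieutenants agree despite up to m traitors. Faulty senders here corrupt values by a fixed deterministic rule, whereas a real verification quantifies over all adversarial behaviors.

    ```python
    # Toy Oral Messages OM(m); processes are ints, commander excluded from
    # the lieutenant list. Not the EHDM development described above.
    def majority(values):
        return sorted(values)[len(values) // 2]

    def om(m, commander, lieutenants, value, faulty):
        """Return {lieutenant: decided value}."""
        recv = {p: (commander + p) % 2 if commander in faulty else value
                for p in lieutenants}
        if m == 0:
            return recv
        decided = {}
        for p in lieutenants:
            # Values relayed to p via the recursive rounds led by each q != p.
            relayed = [om(m - 1, q, [r for r in lieutenants if r != q],
                          recv[q], faulty)[p]
                       for q in lieutenants if q != p]
            decided[p] = majority([recv[p]] + relayed)
        return decided

    # n = 4, m = 1 tolerates one traitor: loyal lieutenants 1 and 3 agree.
    print(om(1, 0, [1, 2, 3], value=1, faulty={2}))
    ```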

  6. Behavioral and Temporal Pattern Detection Within Financial Data With Hidden Information

    DTIC Science & Technology

    2012-02-01

    probabilistic pattern detector to monitor the pattern. SUBJECT TERMS: Runtime verification, Hidden data, Hidden Markov models, Formal specifications...sequences in many other fields besides financial systems [L, TV, LC, LZ]. Rather, the technique suggested in this paper is positioned as a hybrid...operation of the pattern detector. Section 7 describes the operation of the probabilistic pattern-matching monitor, and section 8 describes three

  7. Formal Semanol Specification of Ada.

    DTIC Science & Technology

    1980-09-01

    concurrent task modeling involved very little change to the SEMANOL metalanguage. A primitive capable of initiating concurrent SEMANOL task processors...(i.e., #CO-COMPUTE) and two primitives corresponding to integer semaphores (i.e., #P and #V) were all that were required. In addition, these changes... synchronization techniques and choice of correct unblocking alternatives. We should note that it had been our original intention to use the Ada Translator program

  8. The Strategic Thinking Process: Efficient Mobilization of Human Resources for System Definition

    PubMed Central

    Covvey, H. D.

    1987-01-01

    This paper describes the application of several group management techniques to the creation of needs specifications and information systems strategic plans in health care institutions. The overall process is called the “Strategic Thinking Process”. It is a formal methodology that can reduce the time and cost of creating key documents essential for the successful implementation of health care information systems.

  9. User Interface Technology for Formal Specification Development

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  10. Proceedings of the Workshop on the Assessment of Crew Workload Measurement Methods, Techniques and Procedures. Volume 2. Library References.

    DTIC Science & Technology

    1987-06-01

    OHIO 45433-6553. NOTICE: When Government drawings, specifications, or other data are used for any purpose other than in connection with a definitely...have formulated or in any way supplied the said drawings, specifications, or other data, is not to be regarded by implication, or otherwise in any...Formal Review 2= Informal Review 3= No Reviewer C. Quality of Data (If not 1-3 QUIT) 1= Experiment(s) 2= Case Study(s) 3= Theory/Review (Skip to F.) (Skip

  11. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has

  12. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  13. Comments on the use of network structures to analyse commercial companies’ evolution and their impact on economic behaviour

    NASA Astrophysics Data System (ADS)

    Costea, Carmen

    2006-10-01

    Network analysis studies the development of the social structure of relationships around a group or an institutional body, and how it affects beliefs and behaviours. Causal constraints demand deeper attention to this social structure. The purpose of this paper is to give a new approach to the idea that this reality should be primarily conceived and investigated from the perspective of the properties of relations between and within units, rather than the properties of these units themselves. The relationships may involve the exchange of products, labour, information and money. By mapping these relationships, network analysis can help to uncover the emergent and informal communication patterns of commercial companies, which may then be compared to the formal communication structures. These emergent patterns can be used to explain institutional and individual behaviours. Network analysis techniques focus on the communication structure of an organization, which can be subdivided and handled with different approaches. Structural features that can be analysed through the use of network analysis techniques are, for example, the (formal and informal) communication patterns in an organization or the identification of specific groups within an organization. Special attention may be given to specific aspects of communication patterns.

  14. Fast alternative Monte Carlo formalism for a class of problems in biophotonics

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.

    1997-12-01

    A practical and effective, alternative Monte Carlo formalism is presented that rapidly finds flux solutions to the radiative transport equation for a class of problems in biophotonics; namely, wide-beam irradiance of finite, optically anisotropic homogeneous or heterogeneous biomedia, which both strongly scatter and absorb light. Such biomedia include liver, tumors, blood, or highly blood-perfused tissues. As Fermat rays comprising a wide coherent (laser) beam enter the tissue, they evolve into a bundle of random optical paths or trajectories due to scattering. Overall, this can be physically interpreted as a bundle of Markov trajectories traced out by a 'gas' of Brownian-like point photons being successively scattered and absorbed. By considering the cumulative flow of a statistical bundle of trajectories through interior data planes, the effective equivalent information of the (generally unknown) analytical flux solutions of the transfer equation rapidly emerges. Unlike the standard Monte Carlo techniques, which evaluate scalar fluence, this technique is faster, more efficient, and simpler to apply for this specific class of optical situations. Other analytical or numerical techniques can either become unwieldy or lack viability or are simply more difficult to apply. Illustrative flux calculations are presented for liver, blood, and tissue-tumor-tissue systems.
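
    For contrast with the paper's alternative formalism, here is a bare-bones standard Monte Carlo kernel of the kind it improves upon (ours; one spatial dimension, isotropic scattering, illustrative parameters): photons random-walk with exponentially distributed free paths, are absorbed with probability mu_a/mu_t per interaction, and crossings of an interior data plane are tallied.

    ```python
    # Minimal standard-style photon transport sketch (not the paper's method).
    import math, random

    def simulate(n_photons, mu_a, mu_s, z_plane, z_max):
        mu_t = mu_a + mu_s
        crossings = 0
        for _ in range(n_photons):
            z, uz = 0.0, 1.0                 # start at surface heading inward
            while 0.0 <= z <= z_max:
                # Free path ~ Exp(mu_t); 1 - U avoids log(0).
                step = -math.log(1.0 - random.random()) / mu_t
                z_new = z + uz * step
                if (z - z_plane) * (z_new - z_plane) < 0:
                    crossings += 1           # trajectory crossed the data plane
                z = z_new
                if random.random() < mu_a / mu_t:
                    break                    # absorbed
                uz = 2 * random.random() - 1 # isotropic new direction cosine
        return crossings / n_photons

    print(simulate(10000, mu_a=1.0, mu_s=10.0, z_plane=0.05, z_max=0.2))
    ```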

  15. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the SCR method to testing for consistency properties of a partial model of requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  16. Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.

    PubMed

    Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos

    2011-04-01

    Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translating definitions expressed in the openEHR Archetype Definition Language (ADL) into a formal representation expressed using the Web Ontology Language (OWL). The formal representations are then integrated with rules expressed as Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.
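
    A toy rendering of the ADL-to-OWL idea (all names hypothetical; the paper's actual mapping covers the full openEHR reference model): an archetype that specializes a reference-model class with a constrained attribute becomes an OWL class carrying a subclass axiom and a property restriction, emitted here as Turtle text.

    ```python
    # Hypothetical sketch, not the authors' mapping tables.
    def archetype_to_owl(concept, rm_class, attr, allowed_class):
        """Emit a Turtle fragment: concept subclasses the reference-model
        class and restricts attr to allowed_class."""
        return f"""
    :{concept} a owl:Class ;
        rdfs:subClassOf :{rm_class} ;
        rdfs:subClassOf [ a owl:Restriction ;
            owl:onProperty :{attr} ;
            owl:allValuesFrom :{allowed_class} ] .
    """.strip()

    print(archetype_to_owl("BloodPressureObservation", "ENTRY",
                           "hasValue", "QuantityDV"))
    ```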

  17. Organizational Decision Making

    DTIC Science & Technology

    1975-08-01

    the lack of formal techniques typically used by large organizations, digress on the advantages of formal over informal... optimization; for example, one might do a number of optimization calculations, each time using a different measure of effectiveness as the optimized ...final decision. The next level of computer application involves the use of computerized optimization techniques. Optimization

  18. Creative Process: Its Use and Extent of Formalization by Corporations.

    ERIC Educational Resources Information Center

    Fernald, Lloyd W., Jr.; Nickolenko, Pam

    1993-01-01

    This study reports creativity policies and practices used by Central Florida corporations. Survey responses (n=105) indicated that businesses are using a variety of creativity techniques with usage greater among the newer companies but that these techniques are not yet a formal part of business operations. (DB)

  19. Effect of formal specifications on program complexity and reliability: An experimental study

    NASA Technical Reports Server (NTRS)

    Goel, Amrit L.; Sahoo, Swarupa N.

    1990-01-01

    The results are presented of an experimental study undertaken to assess the improvement in program quality obtained by using formal specifications. Specifications in the Z notation were developed for a simple but realistic antimissile system. These specifications were then used by two programmers to develop two versions in C. Another set of three versions in Ada was independently developed from informal specifications in English. A comparison of the reliability and complexity of the resulting programs suggests the advantages of using formal specifications in terms of the number of errors detected and fault avoidance.

  20. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    NASA Technical Reports Server (NTRS)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  1. Mapping of Primary Instructional Methods and Teaching Techniques for Regularly Scheduled, Formal Teaching Sessions in an Anesthesia Residency Program.

    PubMed

    Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro

    2016-06-01

    In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.

  2. New Technologies and Learning Environments: A Perspective from Formal and Non-Formal Education in Baja California, Mexico

    ERIC Educational Resources Information Center

    Zamora, Julieta Lopez; Reynaga, Francisco Javier Arriaga

    2010-01-01

    This paper presents results of two research works, the first approaches non-formal education and the second addresses formal education. In both studies in-depth interview techniques were used. There were some points of convergence between them on aspects such as the implementation of learning environments and the integration of ICT. The interview…

  3. Formal specification of human-computer interfaces

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent

    1990-01-01

    A high-level formal specification of a human computer interface is described. Previous work is reviewed and the ASLAN specification language is described. Top-level specifications written in ASLAN for a library and a multiwindow interface are discussed.

  4. Patient perspectives: Kundalini yoga meditation techniques for psycho-oncology and as potential therapies for cancer.

    PubMed

    Shannahoff-Khalsa, David S

    2005-03-01

    The ancient system of Kundalini Yoga (KY) includes a vast array of meditation techniques. Some were discovered to be specific for treating psychiatric disorders and others are supposedly beneficial for treating cancers. To date, 2 clinical trials have been conducted for treating obsessive-compulsive disorder (OCD). The first was an open uncontrolled trial and the second a single-blinded randomized controlled trial (RCT) comparing a KY protocol against the Relaxation Response and Mindfulness Meditation (RRMM) techniques combined. Both trials showed efficacy on all psychological scales using the KY protocol; however, the RCT showed no efficacy on any scale with the RRMM control group. The KY protocol employed an OCD-specific meditation technique combined with other techniques that are individually specific for anxiety, low energy, fear, anger, meeting mental challenges, and turning negative thoughts into positive thoughts. In addition to OCD symptoms, other symptoms, including anxiety and depression, were also significantly reduced. Elements of the KY protocol other than the OCD-specific technique also may have applications for psycho-oncology patients and are described here. Two depression-specific KY techniques are described that also help combat mental fatigue and low energy. A 7-part protocol is described that would be used in KY practice to affect the full spectrum of emotions and distress that complicate a cancer diagnosis. In addition, there are KY techniques that practitioners have used in treating cancer. These techniques have not yet been subjected to formal clinical trials but are described here as potential adjunctive therapies. A case history demonstrating rapid onset of acute relief of intense fear in a terminal breast cancer patient using a KY technique specific for fear is presented. A second case history is reported for a surviving male diagnosed in 1988 with terminal prostate cancer who has used KY therapy long term as part of a self-directed integrative care approach.

  5. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that uses long, highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  6. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  7. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected for all possible circumstances, the fact that FLCs cannot be tested to achieve such requirements limits the applications of the technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of a FLC is proposed. Main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive with respect to Lyapunov stability.
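
    A minimal sketch in the spirit of the SMT approach described (using z3's Python bindings; the piecewise-linear control law below is our stand-in for the paper's piecewise-polynomial FLC representation): we ask the solver for a counterexample to the negative-feedback property, so an unsatisfiable query certifies the property on the bounded domain.

    ```python
    # Counterexample search for "feedback opposes the error" on a toy
    # piecewise control law; unsat means the property holds on the domain.
    from z3 import Real, Solver, If, And, Or, sat

    e = Real("e")                                        # error input
    u = If(e > 1, -2*e + 1, If(e < -1, -2*e - 1, -e))    # piecewise control law

    s = Solver()
    s.add(And(e >= -10, e <= 10))                        # bounded operating domain
    s.add(Or(And(e > 0, u >= 0), And(e < 0, u <= 0)))    # negation of the property
    print("counterexample:",
          s.model() if s.check() == sat else "none (property holds)")
    ```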

  8. The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Goode, Plesent W.

    2002-01-01

    The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation and on ground-based and satellite-based communications via Multi-Protocol Networks (e.g. combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper discusses the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and differing operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.

  9. The Formal Semantics of PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan

    1999-01-01

    A specification language is a medium for expressing what is computed rather than how it is computed. Specification languages share some features with programming languages but are also different in several important ways. For our purpose, a specification language is a logic within which the behavior of computational systems can be formalized. Although a specification can be used to simulate the behavior of such systems, we mainly use specifications to state and prove system properties with mechanical assistance. We present the formal semantics of the specification language of SRI's Prototype Verification System (PVS). This specification language is based on the simply typed lambda calculus. The novelty in PVS is that it contains very expressive language features whose static analysis (e.g., typechecking) requires the assistance of a theorem prover. The formal semantics illuminates several of the design considerations underlying PVS, the interaction between theorem proving and typechecking.
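
    To ground the lambda-calculus core mentioned here, a tiny typechecker for the simply typed fragment (ours; PVS's predicate subtypes and dependent types are precisely what push typechecking into theorem proving):

    ```python
    # Simply typed lambda calculus typechecker over tuple-encoded terms.
    def typecheck(term, env):
        kind = term[0]
        if kind == "var":                       # ("var", name)
            return env[term[1]]
        if kind == "lam":                       # ("lam", name, arg_type, body)
            _, x, t, body = term
            return ("->", t, typecheck(body, {**env, x: t}))
        if kind == "app":                       # ("app", fn, arg)
            f = typecheck(term[1], env)
            a = typecheck(term[2], env)
            assert f[0] == "->" and f[1] == a, "type error"
            return f[2]

    # (\x: nat. x) applied to a nat variable y has type nat.
    ident = ("lam", "x", "nat", ("var", "x"))
    print(typecheck(("app", ident, ("var", "y")), {"y": "nat"}))  # -> nat
    ```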

  10. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  11. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  12. ARIES: Acquisition of Requirements and Incremental Evolution of Specifications

    NASA Technical Reports Server (NTRS)

    Roberts, Nancy A.

    1993-01-01

    This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.

  13. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
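
    The flavor of the paper's phone book exercise can be suggested with runtime contracts (our sketch; a formal-methods treatment would state these pre/post conditions as theorems and prove them rather than assert them):

    ```python
    # Phone book with explicit pre/post conditions as executable contracts.
    class PhoneBook:
        def __init__(self):
            self.entries = {}

        def add(self, name, number):
            assert name not in self.entries, "pre: name must be fresh"
            self.entries[name] = number
            assert self.entries[name] == number, "post: lookup finds number"

        def lookup(self, name):
            assert name in self.entries, "pre: name must be present"
            return self.entries[name]

    pb = PhoneBook()
    pb.add("alice", "555-0100")
    print(pb.lookup("alice"))
    ```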

  14. Geometry and Formal Linguistics.

    ERIC Educational Resources Information Center

    Huff, George A.

    This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…

  15. Patch models and their applications to multivehicle command and control.

    PubMed

    Rao, Venkatesh G; D'Andrea, Raffaello

    2007-06-01

    We introduce patch models, a computational modeling formalism for multivehicle combat domains, based on spatiotemporal abstraction methods developed in the computer science community. The framework yields models that are expressive enough to accommodate nontrivial controlled vehicle dynamics while remaining within the representational capabilities of common artificial intelligence techniques used in the construction of autonomous systems. The framework allows several key design requirements of next-generation network-centric command and control systems, such as maintenance of shared situation awareness, to be achieved. Major features include support for multiple situation models at each decision node and rapid mission plan adaptation. We describe the formal specification of patch models and our prototype implementation, Patchworks. The capabilities of patch models are validated through a combat mission simulation in Patchworks, which involves two defending teams protecting a camp from an enemy attacking team.

  16. Training of trainers for community primary health care workers.

    PubMed

    Cernada, G P

    1983-01-01

    Training community-based health care workers in "developing" countries is essential to improving the quality of life in both rural and urban areas. Two major obstacles to such training are the tremendous social distance gap between these community workers and their more highly-educated and upper-class trainers (often medical officers) and the didactic, formal educational system. Bridging this gap demands a participant-centered, field-oriented approach which actively involves the trainee in the design, implementation and evaluation of the training program. A description of a philosophic learning approach based on self-initiated change, educational objectives related to planning, organizing, conducting and evaluating training, and specific learning methodologies utilizing participatory learning, non-formal educational techniques, field experience, continuing feedback and learner participation are reviewed. Included are: role playing, story telling, case studies, self-learning and simulation exercises, visuals, and Portapak videotape.

  17. EFL Teachers' Formal Assessment Practices Based on Exam Papers

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2016-01-01

    This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…

  18. Formal Attributes of Television Commercials: Subtle Ways of Transmitting Sex Stereotypes.

    ERIC Educational Resources Information Center

    Welch, Renate L.; And Others

    Differences in formal aspects of television commercials aimed at boys and those aimed at girls were investigated. Formal attributes were defined as production techniques such as action, pace, visual effects, dialogue and narration, background music and sound effects. Two aspects of content were also examined: aggressive behavior and the gender of…

  19. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  20. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors' clocks.
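
    The voting idea at the heart of the RCP can be sketched in a few lines (ours; the hard part the paper verifies, clock synchronization and interactive consistency among replicas, is elided):

    ```python
    # Majority voting across redundant channels masks one faulty replica.
    from collections import Counter

    def vote(values):
        value, count = Counter(values).most_common(1)[0]
        return value if count > len(values) // 2 else None  # None: no majority

    replicas = [42, 42, 7, 42]     # one replica hit by a transient fault
    print(vote(replicas))          # 42 -- the faulty value is masked
    ```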

  1. Vacuum instability in Kaluza–Klein manifolds

    NASA Astrophysics Data System (ADS)

    Fucci, Guglielmo

    2018-05-01

    The purpose of this work is to analyze particle creation in spaces with extra dimensions. We consider, in particular, a massive scalar field propagating in a Kaluza–Klein manifold subject to a constant electric field. We compute the rate of particle creation from the vacuum by using techniques rooted in the spectral zeta function formalism. The results we obtain show explicitly how the presence of the extra dimensions, and their specific geometric characteristics, influences the rate at which pairs of particles and anti-particles are generated.
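
    For orientation, the standard flat four-dimensional scalar-QED pair-creation rate (the Schwinger result, quoted here as the usual benchmark such zeta-function computations reduce to when the extra dimensions are switched off; natural units, hbar = c = 1):

    ```latex
    % Flat 4D scalar-QED benchmark: pair-creation rate per unit volume
    % in a constant electric field E.
    \Gamma \;=\; \frac{(eE)^{2}}{16\pi^{3}}
      \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n^{2}}
      \exp\!\left(-\frac{n\pi m^{2}}{eE}\right)
    ```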

  2. 'No man is an island'. Testing the specific role of social isolation in formal thought disorder.

    PubMed

    de Sousa, Paulo; Spray, Amy; Sellwood, William; Bentall, Richard P

    2015-12-15

    Recent work has focused on the role of the environment in psychosis with emerging evidence that specific psychotic experiences are associated with specific types of adversity. One risk factor that has been often associated with psychosis is social isolation, with studies identifying isolation as an important feature of prodromal psychosis and others reporting that social networks of psychotic patients are smaller and less dense than those of healthy individuals. In the present study, we tested a prediction that social isolation would be specifically associated with formal thought disorder. 80 patients diagnosed with psychosis-spectrum disorder and 30 healthy participants were assessed for formal thought disorder with speech samples acquired during an interview that promoted personal disclosure and an interview targeting everyday topics. Social isolation was significantly associated with formal thought disorder in the neutral interview and in the salient interview, even when controlling for comorbid hallucinations, delusions and suspiciousness. Hallucinations, delusions and suspiciousness were not associated with social isolation when formal thought disorder was controlled for. Formal thought disorder is robustly and specifically associated with social isolation. Social cognitive mechanisms and processes are discussed which may explain this relationship as well as implications for clinical practice and future research. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. De novo reconstruction of gene regulatory networks from time series data, an approach based on formal methods.

    PubMed

    Ceccarelli, Michele; Cerulo, Luigi; Santone, Antonella

    2014-10-01

    Reverse engineering of gene regulatory relationships from genomics data is a crucial task in dissecting the complex regulatory mechanisms of the cell. From a computational point of view, the reconstruction of gene regulatory networks is an underdetermined problem: the number of possible solutions is typically large relative to the number of available independent data points. Many solutions can fit the available data equally well, but only one of them can be the biologically true solution. Several strategies have been proposed in the literature to reduce the search space and/or extend the amount of independent information. In this paper we propose a novel algorithm based on formal methods, the mathematically rigorous techniques widely adopted in engineering to specify and verify complex software and hardware systems. Starting from a formal specification of gene regulatory hypotheses, we are able to mathematically prove whether or not a time course experiment satisfies the formal specification, thereby determining whether a gene regulation exists. The method is able to detect both the direction and the sign (inhibition/activation) of regulations, whereas most methods in the literature are limited to undirected and/or unsigned relationships. We empirically evaluated the approach on experimental and synthetic datasets in terms of precision and recall. In most cases we observed high levels of accuracy, outperforming the current state of the art, although the computational cost increases exponentially with the size of the network. The tool implementing the algorithm is available at: http://www.bioinformatics.unisannio.it.
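
    To convey the flavor of the approach, the following Python sketch checks a discretized time course against a simple formal regulation hypothesis ("the regulator inhibits the target"). The discretization threshold and the inhibition rule are illustrative assumptions of this sketch, not the authors' algorithm:

        def discretize(series, eps=0.1):
            """Map a numeric time course to qualitative steps: +1 (up), -1 (down), 0 (flat)."""
            return [(1 if b - a > eps else -1 if a - b > eps else 0)
                    for a, b in zip(series, series[1:])]

        def satisfies_inhibition(regulator, target, eps=0.1):
            """Check the hypothesis 'regulator inhibits target': whenever the
            regulator rises, the target must not rise at the next step."""
            r, t = discretize(regulator, eps), discretize(target, eps)
            return all(t[i + 1] <= 0 for i in range(len(r) - 1) if r[i] == 1)

        # Toy time courses: gene A rises while gene B subsequently falls.
        a = [0.1, 0.5, 0.9, 0.9]
        b = [0.8, 0.7, 0.3, 0.1]
        print(satisfies_inhibition(a, b))  # True: data consistent with 'A inhibits B'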

  4. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  5. JIGSAW: Preference-directed, co-operative scheduling

    NASA Technical Reports Server (NTRS)

    Linden, Theodore A.; Gaw, David

    1992-01-01

    Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods--allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.

  6. Generalizing Backtrack-Free Search: A Framework for Search-Free Constraint Satisfaction

    NASA Technical Reports Server (NTRS)

    Jonsson, Ari K.; Frank, Jeremy

    2000-01-01

    Tractable classes of constraint satisfaction problems are of great importance in artificial intelligence. Identifying and taking advantage of such classes can significantly speed up constraint problem solving. In addition, tractable classes are utilized in applications where strict worst-case performance guarantees are required, such as constraint-based plan execution. In this work, we present a formal framework for search-free (backtrack-free) constraint satisfaction. The framework is based on general procedures, rather than specific propagation techniques, and thus generalizes existing techniques in this area. We also relate search-free problem solving to the notion of decision sets and use the result to provide a constructive criterion that is sufficient to guarantee search-free problem solving.
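
    A classic instance of such a tractable class is the tree-structured constraint satisfaction problem, which becomes backtrack-free once directional arc consistency has been enforced. The Python sketch below illustrates this standard result; it is not the generalized framework of the paper:

        def revise(domains, constraints, x, y):
            """Prune values of x that lack a support in y (arc consistency for arc x->y)."""
            ok = constraints[(x, y)]
            domains[x] = [a for a in domains[x] if any(ok(a, b) for b in domains[y])]

        def solve_tree_csp(order, parent, domains, constraints):
            """Backtrack-free solving of a tree-structured CSP: enforce directional
            arc consistency from leaves to root, then assign greedily root-to-leaves."""
            for x in reversed(order[1:]):          # leaves towards root
                revise(domains, constraints, parent[x], x)
                if not domains[parent[x]]:
                    return None                    # provably unsatisfiable
            sol = {order[0]: domains[order[0]][0]}
            for x in order[1:]:                    # root towards leaves, no backtracking
                sol[x] = next(v for v in domains[x]
                              if constraints[(parent[x], x)](sol[parent[x]], v))
            return sol

        # Chain A-B-C with 'less than' constraints; solved without any search.
        doms = {'A': [1, 2, 3], 'B': [1, 2, 3], 'C': [1, 2, 3]}
        cons = {('A', 'B'): lambda a, b: a < b, ('B', 'C'): lambda b, c: b < c}
        print(solve_tree_csp(['A', 'B', 'C'], {'B': 'A', 'C': 'B'}, doms, cons))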

  7. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a class of control systems used to determine and control the attitude of a spacecraft in orbit, based on information obtained from various sensors. In this paper, we propose an approach to evaluating a typical (yet somewhat simplified) AOCS architecture using formal development based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of the original source code specification. In this way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  8. Transforming Aggregate Object-Oriented Formal Specifications to Code

    DTIC Science & Technology

    1999-03-01

    integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture... design transforms, and target software transforms. Software is critical in today's Air Force, yet its specification, design, and development

  9. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
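
    The following Python fragment, using the Z3 SMT solver, gives a minimal illustration of the general style of such encodings (the variable names and the property are hypothetical, and this is far simpler than the MCAPI encoding in the paper): delivery positions of two racing messages become symbolic integers, ordering constraints model the runtime, and a satisfiable query yields a witness schedule that violates the property.

        from z3 import Int, Solver, Distinct, sat

        # Two sends race to the same endpoint; the runtime may deliver either first.
        recv_a, recv_b = Int('recv_a'), Int('recv_b')  # symbolic delivery positions

        s = Solver()
        s.add(Distinct(recv_a, recv_b))
        s.add(0 <= recv_a, recv_a <= 1, 0 <= recv_b, recv_b <= 1)
        # Query: can message B (say, a shutdown command) be observed before A?
        s.add(recv_b < recv_a)

        if s.check() == sat:
            print('property violated under schedule:', s.model())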

  10. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset; this in turn can improve confidence that the system reflects the requirements, reduce system development time, and reduce the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  11. A Formal Approach to Domain-Oriented Software Design Environments

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.

  12. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
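
    The two standard sanity checks on a decision table -- that its rules cover every combination of conditions (completeness) and never overlap (consistency) -- are easy to state by enumeration. The Python sketch below is a toy check in the spirit of such tooling, not a rendering of Tablewise itself:

        from itertools import product

        # A decision table: each rule maps a tuple of condition outcomes to an
        # action; '-' is a don't-care entry.
        rules = [                     # (cond1, cond2) -> action
            (('T', 'T'), 'engage'),
            (('T', 'F'), 'warn'),
            (('F', '-'), 'disengage'),
        ]

        def matches(entry, case):
            return all(e in ('-', c) for e, c in zip(entry, case))

        # Completeness: every combination is covered; consistency: by one rule at most.
        for case in product('TF', repeat=2):
            hits = [action for entry, action in rules if matches(entry, case)]
            assert len(hits) >= 1, f'uncovered combination: {case}'
            assert len(hits) <= 1, f'overlapping rules on: {case}'
        print('table is complete and consistent')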

  13. Recent advances in applying decision science to managing national forests

    USGS Publications Warehouse

    Marcot, Bruce G.; Thompson, Matthew P.; Runge, Michael C.; Thompson, Frank R.; McNulty, Steven; Cleaves, David; Tomosy, Monica; Fisher, Larry A.; Andrew, Bliss

    2012-01-01

    Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem analysis (defining alternatives, evaluating likely consequences, identifying key uncertainties, and analyzing tradeoffs), decision point (identifying the preferred alternative), and implementation and monitoring the preferred alternative with adaptive management feedbacks. We list a wide array of models, techniques, and tools available for each stage, and provide three case studies of their selected use in National Forest land management and project plans. Successful use of SDM involves participation by decision-makers, analysts, scientists, and stakeholders. We suggest specific areas for training and instituting SDM to foster transparency, rigor, clarity, and inclusiveness in formal decision processes regarding management of national forests.

  14. Self-motion facilitates echo-acoustic orientation in humans

    PubMed Central

    Wallmeier, Ludwig; Wiegrebe, Lutz

    2014-01-01

    The ability of blind humans to navigate complex environments through echolocation has received rapidly increasing scientific interest. However, technical limitations have precluded a formal quantification of the interplay between echolocation and self-motion. Here, we use a novel virtual echo-acoustic space technique to formally quantify the influence of self-motion on echo-acoustic orientation. We show that both the vestibular and proprioceptive components of self-motion contribute significantly to successful echo-acoustic orientation in humans: specifically, our results show that vestibular input induced by whole-body self-motion resolves orientation-dependent biases in echo-acoustic cues. Fast head motions, relative to the body, provide additional proprioceptive cues which allow subjects to effectively assess echo-acoustic space referenced against the body orientation. These psychophysical findings clearly demonstrate that human echolocation is well suited to drive precise locomotor adjustments. Our data shed new light on the sensory–motor interactions, and on possible optimization strategies underlying echolocation in humans. PMID:26064556

  15. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  16. Protecting genomic sequence anonymity with generalization lattices.

    PubMed

    Malin, B A

    2005-01-01

    Current genomic privacy technologies assume the identity of genomic sequence data is protected if personal information, such as demographics, is obscured, removed, or encrypted. While demographic features can directly compromise an individual's identity, recent research demonstrates such protections are insufficient because sequence data itself is susceptible to re-identification. To counteract this problem, we introduce an algorithm for anonymizing a collection of person-specific DNA sequences. The technique is termed DNA lattice anonymization (DNALA), and is based upon the formal privacy protection schema of k-anonymity. Under this model, it is impossible to observe or learn features that distinguish one genetic sequence from k-1 other entries in a collection. To maximize the information retained in protected sequences, we incorporate a concept generalization lattice to learn the distance between two residues in a single nucleotide region. The lattice provides the most similar generalized concept for two residues (e.g. adenine and guanine are both purines). The method is tested and evaluated with several publicly available human population datasets ranging in size from 30 to 400 sequences. Our findings imply the anonymization schema is feasible for the protection of sequence privacy. The DNALA method is the first computational disclosure control technique for general DNA sequences. Given the computational nature of the method, guarantees of anonymity can be formally proven. There is room for improvement and validation, though this research provides the groundwork from which future researchers can construct genomics anonymization schemas tailored to specific data-sharing scenarios.
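
    The generalization lattice idea can be shown in a few lines. In the Python sketch below (an illustrative fragment, not the DNALA implementation), each base generalizes to purine or pyrimidine and then to the fully suppressed symbol N, and an anonymizer would publish the most specific concept covering a pair of residues:

        # Concept generalization lattice over nucleotides: specific bases
        # generalize to purine/pyrimidine, which generalize to 'N'.
        PARENT = {'A': 'purine', 'G': 'purine',
                  'C': 'pyrimidine', 'T': 'pyrimidine',
                  'purine': 'N', 'pyrimidine': 'N', 'N': None}

        def ancestors(x):
            out = [x]
            while PARENT[out[-1]] is not None:
                out.append(PARENT[out[-1]])
            return out

        def lowest_common_generalization(a, b):
            """Most specific concept covering both residues (e.g. A,G -> purine)."""
            return next(x for x in ancestors(a) if x in set(ancestors(b)))

        print(lowest_common_generalization('A', 'G'))  # purine
        print(lowest_common_generalization('A', 'T'))  # N (fully suppressed)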

  17. Dynamic Parameter Identification of Subject-Specific Body Segment Parameters Using Robotics Formalism: Case Study Head Complex.

    PubMed

    Díaz-Rodríguez, Miguel; Valera, Angel; Page, Alvaro; Besa, Antonio; Mata, Vicente

    2016-05-01

    Accurate knowledge of body segment inertia parameters (BSIP) improves the assessment of dynamic analyses based on biomechanical models, which is of paramount importance in fields such as sports activities or impact crash tests. Early approaches to BSIP identification relied on experiments conducted on cadavers or on imaging techniques applied to living subjects. Recent approaches rely on inverse dynamic modeling. However, most approaches focus on the entire body, and verification of BSIP for the dynamic analysis of a distal segment or chain of segments, which has proven to be of significant importance in impact test studies, is rarely established. Previous studies have suggested that BSIP should be obtained using subject-specific identification techniques. To this end, our paper develops a novel approach for estimating subject-specific BSIP based on static and dynamic identification models (SIM, DIM). We test the validity of SIM and DIM by comparing the results with parameters obtained from the regression model proposed by De Leva (1996, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters," J. Biomech., 29(9), pp. 1223-1230). Both SIM and DIM are developed using robotics formalism. First, the static model allows the mass and center of gravity (COG) to be estimated. Second, the results from the static model are included in the dynamics equation, allowing us to estimate the moment of inertia (MOI). As a case study, we applied the approach to evaluate the dynamic modeling of the head complex. The findings provide insight into the validity not only of the proposed method but also of the De Leva regression model for the dynamic modeling of body segments.
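
    As a toy illustration of the static identification step (a simplified planar example of ours, not the authors' robotics-formalism model), the mass and horizontal centre-of-gravity offsets of a segment resting on a force/torque sensor follow directly from the measured vertical force and moments:

        G = 9.81  # gravitational acceleration, m/s^2

        def static_mass_cog(f_z, m_x, m_y):
            """Static identification: with the segment at rest on a force/torque
            sensor, the vertical reaction force gives the mass, and the moments
            give the horizontal centre-of-gravity offsets (toy planar version)."""
            mass = f_z / G
            x_cog = m_y / f_z       # moment about y arises from weight at offset x
            y_cog = -m_x / f_z
            return mass, (x_cog, y_cog)

        print(static_mass_cog(f_z=49.05, m_x=-2.4525, m_y=4.905))
        # -> (5.0 kg, (0.1 m, 0.05 m))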

  18. Partial Thickness Rotator Cuff Tears: Current Concepts

    PubMed Central

    Matthewson, Graeme; Beach, Cara J.; Nelson, Atiba A.; Woodmass, Jarret M.; Ono, Yohei; Boorman, Richard S.; Lo, Ian K. Y.; Thornton, Gail M.

    2015-01-01

    Partial thickness rotator cuff tears are a common cause of pain in the adult shoulder. Despite their high prevalence, the diagnosis and treatment of partial thickness rotator cuff tears remains controversial. While recent studies have helped to elucidate the anatomy and natural history of disease progression, the optimal treatment, both nonoperative and operative, is unclear. Although the advent of arthroscopy has improved the accuracy of the diagnosis of partial thickness rotator cuff tears, the number of surgical techniques used to repair these tears has also increased. While multiple repair techniques have been described, there is currently no significant clinical evidence supporting more complex surgical techniques over standard rotator cuff repair. Further research is required to determine the clinical indications for surgical and nonsurgical management, when formal rotator cuff repair is specifically indicated and when biologic adjunctive therapy may be utilized. PMID:26171251

  19. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based... Domain specific languages (DSLs) drive both implementation and formal verification

  20. The new formal competency-based curriculum and informal curriculum at Indiana University School of Medicine: overview and five-year analysis.

    PubMed

    Litzelman, Debra K; Cottingham, Ann H

    2007-04-01

    There is growing recognition in the medical community that being a good doctor requires more than strong scientific knowledge and excellent clinical skills. Many key qualities are essential to providing comprehensive care, including the abilities to communicate effectively with patients and colleagues, act in a professional manner, cultivate an awareness of one's own values and prejudices, and provide care with an understanding of the cultural and spiritual dimensions of patients' lives. To ensure that Indiana University School of Medicine (IUSM) graduates demonstrate this range of abilities, IUSM has undertaken a substantial transformation of both its formal curriculum and learning environment (informal curriculum). The authors provide an overview of IUSM's two-part initiative to develop and implement a competency-based formal curriculum that requires students to demonstrate proficiency in nine core competencies and to create simultaneously an informal curriculum that models and supports the moral, professional, and humane values expressed in the formal curriculum. The authors describe the institutional and curricular transformations that have enabled and furthered the new IUSM curricular goals: changes in education administration; education implementation, assessment, and curricular design; admissions procedures; performance tracking; and the development of an electronic infrastructure to facilitate the expanded curriculum. The authors address the cost of reform and the results of two progress reviews. Specific case examples illustrate the interweaving of the formal competency curriculum through the students' four years of training, as well as techniques that are being used to positively influence the IUSM informal curriculum.

  1. Computational logic: its origins and applications.

    PubMed

    Paulson, Lawrence C

    2018-02-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.

  2. Why are Formal Methods Not Used More Widely?

    NASA Technical Reports Server (NTRS)

    Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.

    1997-01-01

    Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and to assess the results in a comprehensive evaluation framework. The results of the experiment suggest that there remain many impediments to the routine use of formal methods.

  3. Structuring Formal Requirements Specifications for Reuse and Product Families

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.

    2001-01-01

    In this project we have investigated how formal specifications should be structured to allow for requirements reuse, product family engineering, and ease of requirements change. The contributions of this work include (1) a requirements specification methodology specifically targeted at critical avionics applications, (2) guidelines for how to structure state-based specifications to facilitate ease of change and reuse, and (3) examples from the avionics domain demonstrating the proposed approach.

  4. Management Of Optical Projects

    NASA Astrophysics Data System (ADS)

    Young, Peter S.; Olson, David R.

    1981-03-01

    This paper discusses the management of optical projects from the concept stage, beginning with system specifications, through design, optical fabrication and test tasks. Special emphasis is placed on effective coupling of design engineering with fabrication development and utilization of available technology. Contrasts are drawn between accepted formalized management techniques, the realities of dealing with fragile components and the necessity of an effective project team which integrates the special characteristics of highly skilled optical specialists including lens designers, optical engineers, opticians, and metrologists. Examples are drawn from the HEAO-2 X-Ray Telescope and Space Telescope projects.

  5. A Formal Investigation of Human Spatial Control Skills: Mathematical Formalization, Skill Development, and Skill Assessment

    NASA Astrophysics Data System (ADS)

    Li, Bin

    Spatial control behaviors account for a large proportion of everyday human activities, from normal daily tasks, such as reaching for objects, to specialized tasks, such as driving, surgery, or operating equipment. These behaviors involve intensive interactions among internal processes (i.e. cognitive, perceptual, and motor control) and with the physical world. This dissertation builds on the concept of the interaction pattern and on a hierarchical functional model. An interaction pattern represents a type of behavioral synergy by which humans coordinate cognitive, perceptual, and motor control processes. It contributes to the construction of the hierarchical functional model, which delineates human spatial control behaviors as the coordination of three functional subsystems: planning, guidance, and tracking/pursuit. This dissertation formalizes and validates these two theories and extends them to the investigation of human spatial control skills, encompassing both development and assessment. Specifically, the dissertation first presents an overview of studies of human spatial control skills, covering definition, characteristics, development, and assessment, to provide theoretical evidence for the concept of the interaction pattern and the hierarchical functional model. Next, the human experiments for collecting motion and gaze data, and the techniques used to register and classify gaze data, are described. The dissertation then elaborates and mathematically formalizes the hierarchical functional model and the concept of the interaction pattern. These theories enable the construction of a succinct simulation model that can reproduce a variety of human performance with a minimal set of hypotheses, validating the hierarchical functional model as a normative framework for interpreting human spatial control behaviors. The dissertation then investigates human skill development and captures the emergence of interaction patterns. The final part applies the hierarchical functional model to skill assessment and introduces techniques to capture interaction patterns both from the top down, using their geometric features, and from the bottom up, using their dynamical characteristics. The validity and generality of the skill assessment are illustrated using two experiments: remote-control flight and laparoscopic surgical training.

  6. Nursing students' reading and English aptitudes and their relationship to discipline-specific formal writing ability: a descriptive correlational study.

    PubMed

    Newton, Sarah; Moore, Gary

    2010-01-01

    Formal writing assignments are commonly used in nursing education to develop students' critical thinking skills, as well as to enhance their communication abilities. However, writing apprehension is a common phenomenon among nursing students. It has been suggested that reading and English aptitudes are related to formal writing ability, yet neither the reading nor the English aptitudes of undergraduate nursing students have been described in the literature, and the relationships that reading and English aptitude have with formal writing ability have not been explored. The purpose of this descriptive correlational study was to describe writing apprehension and to assess the relationships among reading and English aptitude and discipline-specific formal writing ability among undergraduate nursing students. The study sample consisted of 146 sophomores from one baccalaureate nursing program. The results indicated that both reading and English aptitude were related to students' formal writing ability.

  7. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  8. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Research to develop methods for reducing the effort expended in software specification and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  9. Report on the formal specification and partial verification of the VIPER microprocessor

    NASA Technical Reports Server (NTRS)

    Brock, Bishop; Hunt, Warren A., Jr.

    1991-01-01

    The formal specification and partial verification of the VIPER microprocessor is reviewed. The VIPER microprocessor was designed by RSRE, Malvern, England, for safety critical computing applications (e.g., aircraft, reactor control, medical instruments, armaments). The VIPER was carefully specified and partially verified in an attempt to provide a microprocessor with completely predictable operating characteristics. The specification of VIPER is divided into several levels of abstraction, from a gate-level description up to an instruction execution model. Although the consistency between certain levels was demonstrated with mechanically-assisted mathematical proof, the formal verification of VIPER was never completed.

  10. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  11. Towards Formal Implementation of PUS Standard

    NASA Astrophysics Data System (ADS)

    Ilić, D.

    2009-05-01

    In an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for a concrete application. Our formal models allow us to formally express and verify specific service properties, including the validation of various telecommand and telemetry packet structures.

  12. Towards Formal Verification of a Separation Microkernel

    NASA Astrophysics Data System (ADS)

    Butterfield, Andrew; Sanan, David; Hinchey, Mike

    2013-08-01

    The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.

  14. Formal System Verification - Extension 2

    DTIC Science & Technology

    2012-08-08

    vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel... together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C... source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in

  15. Formal Specification of Information Systems Requirements.

    ERIC Educational Resources Information Center

    Kampfner, Roberto R.

    1985-01-01

    Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)

  16. Stereo Image Ranging For An Autonomous Robot Vision System

    NASA Astrophysics Data System (ADS)

    Holten, James R.; Rogers, Steven K.; Kabrisky, Matthew; Cross, Steven

    1985-12-01

    The principles of stereo vision for three-dimensional data acquisition are well known and can be applied to the problem of an autonomous robot vehicle. Corresponding points in the two images are located, and the location of each point in three-dimensional space is then calculated from the offset of the points and knowledge of the camera positions and geometry. This research investigates the application of artificial intelligence knowledge representation techniques as a means of applying heuristics to relieve the computational intensity of the low-level image processing tasks. Specifically, a new technique for image feature extraction is presented. This technique, the Queen Victoria Algorithm, uses formal language productions to process the image and characterize its features. The characterized features are then used for stereo image feature registration to obtain the required ranging information. The results can be used by an autonomous robot vision system for environmental modeling and path finding.
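
    For parallel stereo cameras, the ranging computation referred to above reduces to the classical depth-from-disparity relation Z = fB/d. The short Python sketch below assumes a hypothetical rig (the focal length in pixels and the baseline in metres are made-up values):

        def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
            """Range to a matched point from parallel stereo cameras:
            Z = f * B / d, with disparity d = x_left - x_right (pixels)."""
            d = x_left - x_right
            if d <= 0:
                raise ValueError('point at infinity or bad correspondence')
            return focal_px * baseline_m / d

        # Assumed rig: 700 px focal length, 0.12 m baseline; 14 px disparity -> 6 m.
        print(depth_from_disparity(402, 388, focal_px=700, baseline_m=0.12))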

  17. Transactions in domain-specific information systems

    NASA Astrophysics Data System (ADS)

    Zacek, Jaroslav

    2017-07-01

    A substantial number of current information system (IS) implementations are based on a transaction approach. In addition, most implementations are domain-specific (e.g. accounting IS, resource planning IS). We therefore need a generic transaction model with which to build and verify domain-specific IS. This paper proposes a new transaction model for domain-specific ontologies. The model is based on a value-oriented business process modelling technique and is formalized using Petri net theory. The first part of the paper presents common business processes and analyses related to business process modeling. The second part defines the transactional model, delimited by the REA enterprise ontology paradigm, and introduces the states of the generic transaction model. The generic model proposal is defined and visualized with a Petri net modelling tool. The third part shows an application of the generic transaction model. The last part concludes and discusses the practical usability of the generic transaction model.

  18. Industry Strength Tool and Technology for Automated Synthesis of Safety-Critical Applications from Formal Specifications

    DTIC Science & Technology

    2015-11-01

    2.3.4 Input/Output Automata... various other modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, multi-dimensional SDF, etc. are also used for designing... [table excerpt comparing modeling frameworks: one formalism noted as ideally suited to modeling DSP applications; Petri nets (graphical, formal) used for modeling distributed systems; I/O Automata (both, formal)]

  19. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
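
    The underlying idea -- each floating-point operation contributes a relative error bounded by the unit roundoff, and these contributions can be accumulated alongside evaluation -- can be sketched in a few lines of Python. This crude first-order accumulation is only a caricature of PRECiSA's certified semantics:

        import sys

        EPS = sys.float_info.epsilon / 2  # unit roundoff for binary64

        def add(a, b):
            """Evaluate x+y while accumulating a first-order round-off bound.
            Values are (value, error_bound) pairs."""
            (x, ex), (y, ey) = a, b
            v = x + y
            return v, ex + ey + abs(v) * EPS   # incoming errors + new rounding error

        def mul(a, b):
            (x, ex), (y, ey) = a, b
            v = x * y
            return v, abs(y) * ex + abs(x) * ey + abs(v) * EPS

        x, y = (0.1, 0.0), (0.3, 0.0)          # exact inputs
        val, bound = add(mul(x, x), y)         # x*x + y
        print(f'value={val!r}, |round-off| <= {bound:.2e}')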

  20. A Multi-Encoding Approach for LTL Symbolic Satisfiability Checking

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2011-01-01

    Formal behavioral specifications written early in the system-design process and communicated across all design phases have been shown to increase the efficiency, consistency, and quality of the system under development. To prevent introducing design or verification errors, it is crucial to test specifications for satisfiability. Our focus here is on specifications expressed in linear temporal logic (LTL). We introduce a novel encoding of symbolic transition-based Buchi automata and a novel, "sloppy," transition encoding, both of which result in improved scalability. We also define novel BDD variable orders based on tree decomposition of formula parse trees. We describe and extensively test a new multi-encoding approach utilizing these novel encoding techniques to create 30 encoding variations. We show that our novel encodings translate to significant, sometimes exponential, improvement over the current standard encoding for symbolic LTL satisfiability checking.

  1. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.

  2. Gsflow-py: An integrated hydrologic model development tool

    NASA Astrophysics Data System (ADS)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model-resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular-grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
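
    As one example of the kind of parameterization step such a toolkit automates, meteorological observations must be distributed from stations to every model cell. A common approach is inverse-distance weighting, sketched below in Python (an illustrative fragment, not GSFLOW-py code):

        def idw(stations, cell, power=2.0):
            """Inverse-distance-weighted interpolation of station values to one
            model cell; stations is a list of ((x, y), value) pairs."""
            weights, total = 0.0, 0.0
            for (sx, sy), value in stations:
                d2 = (sx - cell[0]) ** 2 + (sy - cell[1]) ** 2
                if d2 == 0:
                    return value                  # cell sits exactly on a station
                w = d2 ** (-power / 2)
                weights, total = weights + w, total + w * value
            return total / weights

        gauges = [((0.0, 0.0), 12.0), ((10.0, 0.0), 20.0)]  # precipitation, mm
        print(idw(gauges, cell=(2.5, 0.0)))                  # nearer gauge dominates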

  3. Patterns of Hierarchy in Formal and Principled Moral Reasoning.

    ERIC Educational Resources Information Center

    Zeidler, Dana Lewis

    Measurements of formal reasoning and principled moral reasoning ability were obtained from a sample of 99 tenth grade students. Specific modes of formal reasoning (proportional reasoning, controlling variables, probabilistic, correlational and combinatorial reasoning) were first examined. Findings support the notion of hierarchical relationships…

  4. Beyond formalism

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.

  5. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.
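
    The egocentric averaging step at the heart of interactive convergence is easy to state: each processor averages the observed clock skews, substituting zero for any reading that exceeds a threshold and is therefore presumed faulty. The Python sketch below is a toy rendering of that step, not the verified EHDM specification:

        def ica_correction(skews, delta):
            """One round of the interactive convergence algorithm, seen from one
            processor: skews[i] is the observed difference of clock i from our
            own (0.0 for ourselves). Readings beyond delta are presumed faulty
            and replaced by 0 before averaging."""
            trimmed = [s if abs(s) <= delta else 0.0 for s in skews]
            return sum(trimmed) / len(trimmed)

        # Four clocks; the wild 90.0 reading is treated as faulty and suppressed.
        print(ica_correction([0.0, 2.0, -1.0, 90.0], delta=10.0))  # 0.25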

  6. Survivable algorithms and redundancy management in NASA's distributed computing systems

    NASA Technical Reports Server (NTRS)

    Malek, Miroslaw

    1992-01-01

    The design of survivable algorithms requires a solid foundation for executing them. While hardware techniques for fault-tolerant computing are relatively well understood, fault-tolerant operating systems, as well as fault-tolerant applications (survivable algorithms), are, by contrast, little understood, and much more work in this field is required. We outline some of our work that contributes to the foundation of ultrareliable operating systems and fault-tolerant algorithm design. We introduce our consensus-based framework for fault-tolerant system design. This is followed by a description of a hierarchical partitioning method for efficient consensus. A scheduler for redundancy management is introduced, and application-specific fault tolerance is described. We give an overview of our hybrid algorithm technique, which is an alternative to the formal approach given.

  7. Gaussian-based techniques for quantum propagation from the time-dependent variational principle: Formulation in terms of trajectories of coupled classical and quantum variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shalashilin, Dmitrii V.; Burghardt, Irene

    2008-08-28

    In this article, two coherent-state based methods of quantum propagation, namely, coupled coherent states (CCS) and Gaussian-based multiconfiguration time-dependent Hartree (G-MCTDH), are put on the same formal footing, using a derivation from a variational principle in Lagrangian form. By this approach, oscillations of the classical-like Gaussian parameters and oscillations of the quantum amplitudes are formally treated in an identical fashion. We also suggest a new approach denoted here as coupled coherent states trajectories (CCST), which completes the family of Gaussian-based methods. Using the same formalism for all related techniques allows their systematization and a straightforward comparison of their mathematical structure and cost.
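
    For reference, the Lagrangian form of the time-dependent variational principle underlying such derivations is commonly written as follows (a standard textbook form, not quoted from the article):

        \mathcal{L} = \left\langle \Psi(t) \,\middle|\, i\hbar\,\partial_t - \hat{H} \,\middle|\, \Psi(t) \right\rangle,
        \qquad
        \delta \int \mathcal{L}\,\mathrm{d}t = 0,

    where stationarity of the action with respect to all wavefunction parameters yields coupled equations of motion for the Gaussian parameters and the quantum amplitudes alike.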

  8. 48 CFR 14.202-4 - Bid samples.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... would be appropriate for products that must be suitable from the standpoint of balance, facility of use... required by the formal specifications (Federal, Military, or other) applicable to the acquisition. (d....201-6(o)(2). (2) Where samples required by a Federal, Military, or other formal specification are not...

  9. How to become a better clinical teacher: a collaborative peer observation process.

    PubMed

    Finn, Kathleen; Chiappa, Victor; Puig, Alberto; Hunt, Daniel P

    2011-01-01

    Peer observation of teaching (PoT) is most commonly done as a way of evaluating educators in lecture or small group teaching. Teaching in the clinical environment is a complex and hectic endeavor that requires nimble and innovative teaching on a daily basis. Most junior faculty start their careers with little formal training in education and with limited opportunity to be observed or to observe more experienced faculty. Formal PoT would potentially ameliorate these challenges. This article describes a collaborative peer observation process that a group of 11 clinician educators is using as a longitudinal faculty development program. The process described in this article provides detailed and specific teaching feedback for the observed teaching attending while prompting the observing faculty to reflect on their own teaching style and to borrow effective teaching techniques from the observation. This article provides detailed examples from written feedback obtained during collaborative peer observation to emphasize the richness of this combined experience.

  10. Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.

    DTIC Science & Technology

    1996-04-01

    This research investigated the important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, the software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time systems specifications was conducted to determine their extensibility in specifying real-time systems on parallel architectures. In

  11. Formal Assurance Arguments: A Solution In Search of a Problem?

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.

    2015-01-01

    An assurance case comprises evidence and argument showing how that evidence supports assurance claims (e.g., about safety or security). It is unsurprising that some computer scientists have proposed formalizing assurance arguments: most associate formality with rigor. But while engineers can sometimes prove that source code refines a formal specification, it is not clear that formalization will improve assurance arguments or that this benefit is worth its cost. For example, formalization might reduce the benefits of argumentation by limiting the audience to people who can read formal logic. In this paper, we present (1) a systematic survey of the literature surrounding formal assurance arguments, (2) an analysis of errors that formalism can help to eliminate, (3) a discussion of existing evidence, and (4) suggestions for experimental work to definitively answer the question.

  12. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure which couples formal multiobjective-based techniques with complex analysis procedures (such as computational fluid dynamics (CFD) codes) has been developed. The procedure has been demonstrated on a specific high-speed flow application involving aerodynamics and acoustics (sonic boom minimization). To account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure transforms a constrained multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced for each objective function during the transformation process, giving the designer the capability to emphasize specific design objectives during optimization. The demonstration of the procedure utilizes a CFD code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique is used within the optimizer to improve the overall computational efficiency of the procedure and make it suitable for design applications in an industrial setting.
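
    The K-S envelope at the core of this formulation is a smooth, conservative approximation to the maximum of the weighted objective/constraint functions. The Python sketch below (toy objectives, illustrative weights and rho, our example rather than the authors' code) shows the aggregation and its solution with a BFGS optimizer from SciPy:

        import numpy as np
        from scipy.optimize import minimize

        def ks(values, rho=50.0):
            """Kreisselmeier-Steinhauser envelope: a smooth, conservative maximum
            of its components, KS = (1/rho) * log(sum(exp(rho * g_i)))."""
            values = np.asarray(values)
            m = values.max()                       # shift for numerical stability
            return m + np.log(np.sum(np.exp(rho * (values - m)))) / rho

        def composite(x, weights=(1.0, 1.0)):
            f1 = (x[0] - 1.0) ** 2                 # objective 1 (e.g. drag)
            f2 = (x[0] + 1.0) ** 2 + x[1] ** 2     # objective 2 (e.g. boom loudness)
            return ks([w * f for w, f in zip(weights, (f1, f2))])

        result = minimize(composite, x0=[0.0, 0.5], method='BFGS')
        print(result.x)   # compromise design balancing both objectives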

  13. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and a theorem proving assistant.
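
    To make "exhaustive state exploration" concrete, the sketch below enumerates every reachable state of a toy discrete transition system and checks a safety property; the two-aircraft model and the predicate are invented for illustration and are far simpler than the SATS ConOps models analyzed in PVS:

      from collections import deque

      def check_safety(initial, successors, safe):
          """Exhaustively explore all reachable states; return a counterexample
          trace if any reachable state violates the safety predicate."""
          frontier = deque([(initial, [initial])])
          visited = {initial}
          while frontier:
              state, trace = frontier.popleft()
              if not safe(state):
                  return trace                        # counterexample found
              for nxt in successors(state):
                  if nxt not in visited:
                      visited.add(nxt)
                      frontier.append((nxt, trace + [nxt]))
          return None                                 # property holds everywhere

      # Toy model: two aircraft, each 'holding' or 'final'; only one may be on final.
      def successors(state):
          a, b = state
          moves = []
          if a == 'holding' and b != 'final':
              moves.append(('final', b))
          if b == 'holding' and a != 'final':
              moves.append((a, 'final'))
          if a == 'final':
              moves.append(('holding', b))
          if b == 'final':
              moves.append((a, 'holding'))
          return moves

      print(check_safety(('holding', 'holding'), successors,
                         lambda s: s != ('final', 'final')))   # None: the property holds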

  14. An evaluation of NCRP report 151--radiation shielding design for radiotherapy facilities, and a feasibility study for 6 MV open-door treatments in an existing high-energy radiation therapy bunker

    NASA Astrophysics Data System (ADS)

    Kildea, John

    This thesis describes a study of shielding design techniques used for radiation therapy facilities that employ megavoltage linear accelerators. Specifically, an evaluation of the shielding design formalism described in NCRP report 151 was undertaken and a feasibility study for open-door 6 MV radiation therapy treatments in existing 6 MV, 18 MV treatment rooms at the Montreal General Hospital (MGH) was conducted. To evaluate the shielding design formalism of NCRP 151, barrier-attenuated equivalent doses were measured for several of the treatment rooms at the MGH and compared with expectations from NCRP 151 calculations. It was found that, while the insight and recommendations of NCRP 151 are very valuable, its dose predictions are not always correct. As such, the NCRP 151 methodology is best used in conjunction with physical measurements. The feasibility study for 6 MV open-door treatments made use of the NCRP 151 formalism, together with physical measurements for realistic 6 MV workloads. The results suggest that, dosimetrically, 6 MV open door treatments are feasible. A conservative estimate for the increased dose at the door arising from such treatments is 0.1 mSv, with a 1/8 occupancy factor, as recommended in NCRP 151, included.

  15. On the Need for Practical Formal Methods

    DTIC Science & Technology

    1998-01-01

    additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several examples...either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented

  16. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1987-01-01

    A preliminary analysis of the Ada implementation of the Advanced Transport Operating System (ATOPS), an experimental computer control system developed at NASA Langley for a modified Boeing 737 aircraft, is presented. The criteria determined for the evaluation of this approach are described. A preliminary version of the requirements for the ATOPS is included. This requirements specification is not a formal document, but rather a description of certain aspects of the ATOPS system at a level of detail that best suits the needs of the research. A survey of backward error recovery techniques is also presented.

  17. Pedagogical Basis of DAS Formalism in Engineering Education

    ERIC Educational Resources Information Center

    Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.

    2011-01-01

    The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…

  18. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  19. Two-Step Formal Advertisement: An Examination.

    DTIC Science & Technology

    1976-10-01

    The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on...Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of...formal advertising. It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition

  20. A rigorous approach to self-checking programming

    NASA Technical Reports Server (NTRS)

    Hua, Kien A.; Abraham, Jacob A.

    1986-01-01

    Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of the errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
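
    As a concrete, language-shifted illustration of the idea (ours, not drawn from the SPDL rules), the Python routine below embeds executable assertions that check its own result against the specification of sorting at run time:

      from collections import Counter

      def untrusted_sort(xs):
          return sorted(xs)          # stand-in for the implementation under check

      def self_checking_sort(xs):
          ys = untrusted_sort(xs)
          # Executable assertions derived from the specification of sorting:
          assert all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1)), "order violated"
          assert Counter(ys) == Counter(xs), "result is not a permutation of the input"
          return ys

      print(self_checking_sort([3, 1, 2]))   # [1, 2, 3]; a faulty sort trips an assertion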

  1. A High-Level Language for Modeling Algorithms and Their Properties

    NASA Astrophysics Data System (ADS)

    Akhtar, Sabina; Merz, Stephan; Quinson, Martin

    Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically-oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code, but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker TLC.

  2. Formal Specification and Validation of a Hybrid Connectivity Restoration Algorithm for Wireless Sensor and Actor Networks †

    PubMed Central

    Imran, Muhammad; Zafar, Nazir Ahmad

    2012-01-01

    Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates each critical node with an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove its correctness, we construct a formal specification of PCR using Z notation. We model the WSAN topology as a dynamic graph and transform PCR into a corresponding formal specification using Z notation. The formal specification is analyzed and validated using the Z Eves tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
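
    In graph terms, the critical actors that PCR must back up are the cut vertices of the inter-actor connectivity graph. The Python sketch below finds them with a standard articulation-point search; note that PCR itself uses only localized information, whereas this global computation is purely illustrative:

      def articulation_points(adj):
          """Cut vertices of an undirected graph {node: set_of_neighbors}.
          Failure of any returned node splits the network into disjoint segments."""
          disc, low, aps, timer = {}, {}, set(), [0]

          def dfs(u, parent):
              disc[u] = low[u] = timer[0]
              timer[0] += 1
              children = 0
              for v in adj[u]:
                  if v == parent:
                      continue
                  if v in disc:
                      low[u] = min(low[u], disc[v])
                  else:
                      children += 1
                      dfs(v, u)
                      low[u] = min(low[u], low[v])
                      if parent is not None and low[v] >= disc[u]:
                          aps.add(u)
              if parent is None and children > 1:
                  aps.add(u)

          for u in adj:
              if u not in disc:
                  dfs(u, None)
          return aps

      # Example inter-actor topology: actor 2 is critical, its loss cuts off actor 3.
      print(articulation_points({0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}))   # {2}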

  3. Real-Time IRI driven by GIRO data

    NASA Astrophysics Data System (ADS)

    Galkin, Ivan; Huang, Xueqin; Reinisch, Bodo; Bilitza, Dieter; Vesnin, Artem

    Real-time extensions of the empirical International Reference Ionosphere (IRI) model are based on assimilative techniques that preserve the IRI formalism, which is optimized for the description of climatological ionospheric features. The Global Ionosphere Radio Observatory (GIRO) team has developed critical parts of an IRI Real Time Assimilative Model (IRTAM) for the global ionospheric plasma distribution using measured data available in real time from ~50 ionosondes of the GIRO network. The current assimilation results present global assimilative maps of foF2 and hmF2 that reproduce available data at the sensor sites, smoothly return to the climatological specifications when and where the data are missing, and are free from artificial sharp gradients and short-lived artifacts when viewed in time progression. Animated real-time maps of foF2 and hmF2 are published with a few minutes latency at http://giro.uml.edu/IRTAM/. Our real-time IRI modeling uses morphing, a technique that transforms the climatological ionospheric specifications to match the observations by iteratively computing corrections to the original coefficients of the diurnal/spatial expansions used in IRI to map the key ionospheric characteristics, while keeping the IRI expansion basis formalism intact. Computation of the updated coefficient set for a given point in time includes analysis of the latest 24-hour history of observations, which allows the morphing technique to sense evolving ionospheric dynamics even with a sparse sensor network. A Non-linear Error Compensation Technique for Associative Restoration (NECTAR), one of the features in our morphing approach, has been in operation at the Lowell GIRO Data Center since 2013. The cornerstone of NECTAR is a recurrent neural network optimizer that is responsible for smoothing the transitions between the grid cells where observations are available. NECTAR has proved suitable for real-time operations that require the assimilation code to be considerate of data uncertainties (noise) and immune to data errors. Future IRTAM work is directed toward accepting a greater diversity of near-real-time sensor data, and the paper discusses potential new data sources and challenges associated with their assimilation.

  4. Experience Using Formal Methods for Specifying a Multi-Agent System

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) are presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS, the development team decided to use formal methods to check for race conditions, deadlocks, and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes by describing an architecture of tools that would better support the future specification of agents and other concurrent systems.

  5. Information governance in NHS's NPfIT: a case for policy specification.

    PubMed

    Becker, Moritz Y

    2007-01-01

    The National Health Service's (NHS's) National Programme for Information Technology (NPfIT) in the UK with its proposed nation-wide online health record service poses serious technical challenges, especially with regard to access control and patient confidentiality. The complexity of the confidentiality requirements and their constantly evolving nature (due to changes in law, guidelines and ethical consensus) make traditional technologies such as role-based access control (RBAC) unsuitable. Furthermore, a more formal approach is also needed for debating and communicating about information governance, as natural-language descriptions of security policies are inherently ambiguous and incomplete. Our main goal is to convince the reader of the strong benefits of employing formal policy specification in nation-wide electronic health record (EHR) projects. Many difficulties could be alleviated by specifying the requirements in a formal authorisation policy language such as Cassandra. The language is unambiguous, declarative and machine-enforceable, and is based on distributed constrained Datalog. Cassandra is interpreted within a distributed Trust Management environment, where digital credentials are used for establishing mutual trust between strangers. To demonstrate how policy specification can be applied to NPfIT, we translate a fragment of the natural-language NHS specification into formal Cassandra rules. In particular, we present policy rules pertaining to the management of Clinician Sealed Envelopes, the mechanism by which clinical patient data can be concealed in the nation-wide EHR service. Our case study exposes ambiguities and omissions in the informal NHS documents. We strongly recommend the use of trust management and policy specification technology for the implementation of nation-wide EHR infrastructures. Formal policies can be used for automatically enforcing confidentiality requirements, but also for specification and communication purposes. Formalising the requirements also reveals ambiguities and missing details in the currently used informal specification documents.
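
    To give the flavor of a machine-enforceable rule, here is a schematic Datalog-style check in Python. The predicate names and the rule are hypothetical illustrations in the spirit of the paper; real Cassandra syntax and the actual NHS rules differ:

      # Facts: ground tuples the policy engine has derived from credentials.
      facts = {
          ('has_role', 'alice', 'clinician'),
          ('treating', 'alice', 'patient42'),
          # ('sealed', 'patient42'),           # uncomment to seal the record
      }

      def can_read_record(subject, patient):
          """canRead(S, P) <- hasRole(S, clinician), treating(S, P), not sealed(P)."""
          return (('has_role', subject, 'clinician') in facts
                  and ('treating', subject, patient) in facts
                  and ('sealed', patient) not in facts)

      print(can_read_record('alice', 'patient42'))   # True until the record is sealed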

  6. Informal Evaluation.

    ERIC Educational Resources Information Center

    Engel, Brenda S.

    Intended for non-experts in evaluative techniques, this monograph presents suggestions and examples for assessing: (1) the child; (2) the classroom; and (3) the program or the school. Illustrative techniques of recordkeeping are presented. Methods of collecting data include documentation and formal records. Techniques to be used during evaluation…

  7. Property-Based Monitoring of Analog and Mixed-Signal Systems

    NASA Astrophysics Data System (ADS)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential for insertion of errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated in the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad-hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from the digital to the analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter approach to formal verification, where the system is seen as a "black box" that generates sets of traces whose correctness is checked against a property, that is, its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
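
    A property-based monitor in this style can be very small. The Python sketch below checks a bounded-response property over a sampled trace: whenever the signal exceeds a threshold it must settle back within a deadline. The trace, threshold, and deadline are invented for illustration:

      def monitor_bounded_response(trace, dt, threshold, deadline):
          """Check: whenever x(t) > threshold, x must return to <= threshold
          within `deadline` seconds. `trace` holds samples spaced dt seconds apart.
          Returns None if the property holds, else the time of the violation."""
          max_samples = int(deadline / dt)
          over_since = None
          for i, x in enumerate(trace):
              if x > threshold:
                  if over_since is None:
                      over_since = i
                  elif i - over_since > max_samples:
                      return over_since * dt          # requirement violated here
              else:
                  over_since = None
          return None

      trace = [0.0, 1.2, 1.5, 1.1, 0.4, 0.2]          # an overshoot that settles
      print(monitor_bounded_response(trace, dt=0.1, threshold=1.0, deadline=0.3))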

  8. Challenges and Demands on Automated Software Revision

    NASA Technical Reports Server (NTRS)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  9. Students' Interpretations of Mechanistic Language in Organic Chemistry before Learning Reactions

    ERIC Educational Resources Information Center

    Galloway, Kelli R.; Stoyanovich, Carlee; Flynn, Alison B.

    2017-01-01

    Research on mechanistic thinking in organic chemistry has shown that students attribute little meaning to the electron-pushing (i.e., curved arrow) formalism. At the University of Ottawa, a new curriculum has been developed in which students are taught the electron-pushing formalism prior to instruction on specific reactions--this formalism is…

  10. The Factor Structure of Concrete and Formal Operations: A Confirmation of Piaget.

    ERIC Educational Resources Information Center

    Gray, William M.

    Piaget has hypothesized that concrete and formal operations can be described by specific logical models. The present study focused on assessing various aspects of four concrete operational groupings and two variations of two formal operational characteristics. Six hundred twenty-two 9-14 year old students participating in the Human Sciences…

  11. Towards the formal specification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    Work to formally specify the requirements and design of a Processor Interface Unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system, is described. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The approaches that were developed for modeling the PIU requirements and for composition of the PIU subcomponents at high levels of abstraction are described. These approaches were used to specify and verify a nontrivial subset of the PIU behavior. The PIU specification in Higher Order Logic (HOL) is documented in a companion NASA contractor report entitled 'Towards the Formal Specification of the Requirements and Design of a Processor Interface Unit - HOL Listings.' The subsequent verification approach and HOL listings are documented in the NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit' and the NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings.'

  12. An Ill-Structured PBL-Based Microprocessor Course without Formal Laboratory

    ERIC Educational Resources Information Center

    Kim, Jungkuk

    2012-01-01

    This paper introduces a problem-based learning (PBL) microprocessor application course designed according to the following strategies: 1) hands-on training without having a formal laboratory, and 2) intense student-centered cooperative learning through an ill-structured problem. PBL was adopted as the core educational technique of the course to…

  13. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  14. A Formal Methods Approach to the Analysis of Mode Confusion

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.

    2004-01-01

    The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).

  15. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930

  16. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered on a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, i.e., determining that the model accurately captures the customer's high-level requirements, has received little attention and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
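
    The MC/DC criterion mentioned above requires, for each condition in a decision, a pair of tests showing that the condition independently affects the outcome. A minimal Python rendering of that independence-pair check (our illustration, not the metric definitions of [8] or [9]) looks like this:

      def is_mcdc_pair(decision, t1, t2, cond):
          """True if test vectors t1, t2 (dicts of condition -> bool) demonstrate
          that `cond` independently affects `decision`: only `cond` differs
          between the two tests, and the decision outcome flips."""
          others_fixed = all(t1[c] == t2[c] for c in t1 if c != cond)
          return (others_fixed and t1[cond] != t2[cond]
                  and decision(t1) != decision(t2))

      dec = lambda t: t['a'] and (t['b'] or t['c'])      # decision under test
      t1 = {'a': True, 'b': False, 'c': False}
      t2 = {'a': True, 'b': True,  'c': False}
      print(is_mcdc_pair(dec, t1, t2, 'b'))              # True: 'b' flips the outcome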

  17. A comparison between state-specific and linear-response formalisms for the calculation of vertical electronic transition energy in solution with the CCSD-PCM method.

    PubMed

    Caricato, Marco

    2013-07-28

    The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.

  18. Husbandry of wild-caught song sparrows (Melospiza melodia).

    PubMed

    Smith, Lori; Hallager, Sara; Kendrick, Erin; Hope, Katharine; Danner, Raymond M

    2018-05-08

    Conservation and research efforts occasionally rely upon bringing wild animals into human care to establish breeding programs and to understand their biology. Wild-caught birds may have husbandry requirements that differ from captive-reared animals due, in part, to their social development in the wild and potential exposure to novel pathogens. We developed husbandry techniques to minimize stress and monitor health in a population of wild-caught song sparrows (Melospiza melodia). We describe enclosure conditions, diet and enrichment, and best practices for stress reduction. In addition, we describe several health monitoring strategies, including assessing feces quality, body condition scores, and specific signs of infection. These techniques led to successful housing of song sparrows during formal behavioral and developmental studies. This information will be useful for guiding the husbandry of wild-caught passerine birds in the future. © 2018 Wiley Periodicals, Inc.

  19. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    PubMed

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  20. Formal mechanization of device interactions with a process algebra

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, Karl; Cohen, Gerald C.

    1992-01-01

    The principal emphasis is to develop a methodology to formally verify correct synchronization and communication of devices in a composed hardware system. Previous system integration efforts have focused on vertical integration of one layer on top of another. This task examines 'horizontal' integration of peer devices. To formally reason about communication, we mechanize a process algebra in the Higher Order Logic (HOL) theorem proving system. Using this formalization we show how four types of device interactions can be represented and verified to behave as specified. The report also describes the specification of a system consisting of an AVM-1 microprocessor and a memory management unit which were verified in previous work. A proof of correct communication is presented, and the extensions to the system specification to add a direct memory device are discussed.

  1. Formal development of a clock synchronization circuit

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    This talk presents the latest stage in the formal development of a fault-tolerant clock synchronization circuit. The development spans from a high-level specification of the required properties to a circuit realizing the core function of the system. An abstract description of an algorithm has been verified to satisfy the high-level properties using the mechanical verification system EHDM. This abstract description is recast as a behavioral specification input to the Digital Design Derivation system (DDD) developed at Indiana University. DDD provides a formal design algebra for developing correct digital hardware. Using DDD as the principal design environment, a core circuit implementing the clock synchronization algorithm was developed. The design process consisted of standard DDD transformations augmented with an ad hoc refinement justified using the Prototype Verification System (PVS) from SRI International. Subsequent to the above development, Wilfredo Torres-Pomales discovered an area-efficient realization of the same function. Establishing correctness of this optimization requires reasoning in arithmetic, so a general verification is outside the domain of both DDD transformations and model-checking techniques. DDD represents digital hardware by systems of mutually recursive stream equations. A collection of PVS theories was developed to aid in reasoning about DDD-style streams. These theories include a combinator for defining streams that satisfy stream equations, and a means for proving stream equivalence by exhibiting a stream bisimulation. DDD was used to isolate the sub-system involved in Torres-Pomales' optimization. The equivalence between the original design and the optimized version was verified in PVS by exhibiting a suitable bisimulation. The verification depended upon type constraints on the input streams and made extensive use of the PVS type system. The dependent types in PVS provided a useful mechanism for defining an appropriate bisimulation.
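
    The stream view of hardware used by DDD is easy to mimic. The sketch below models two register-level implementations as Python generator streams and checks their equivalence on a bounded input prefix; the parity-tracker example and its state-space reduction are invented stand-ins for the clock circuit, and a real proof would exhibit the bisimulation rather than test a prefix:

      def stream(step, state, inputs):
          """A DDD-style stream: unfold a next-state/output function over inputs."""
          for x in inputs:
              state, out = step(state, x)
              yield out

      # Two implementations of a parity tracker; the second halves the state space.
      def original(s, x):
          ns = (s + x) % 4                     # 4-valued register
          return ns, ns % 2

      def optimized(s, x):
          ns = (s + x) % 2                     # 2-valued register
          return ns, ns

      inputs = [3, 1, 4, 1, 5, 9, 2, 6]
      assert list(stream(original, 0, inputs)) == list(stream(optimized, 0, inputs))
      # A full proof would exhibit the bisimulation s_original mod 2 == s_optimized.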

  2. Spin formalism and applications to new physics searches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haber, H.E.

    1994-12-01

    An introduction to spin techniques in particle physics is given. Among the topics covered are: helicity formalism and its applications to the decay and scattering of spin-1/2 and spin-1 particles, techniques for evaluating helicity amplitudes (including projection operator methods and the spinor helicity method), and density matrix techniques. The utility of polarization and spin correlations for untangling new physics beyond the Standard Model at future colliders such as the LHC and a high energy e⁺e⁻ linear collider is then considered. A number of detailed examples are explored, including the search for low-energy supersymmetry, a non-minimal Higgs boson sector, and new gauge bosons beyond the W± and Z.

  3. Modeling of tool path for the CNC sheet cutting machines

    NASA Astrophysics Data System (ADS)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as a discrete optimization problem (a generalized traveling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. For the solution of the GTSP we propose to use the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
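
    The GTSP structure of the problem can be made explicit in a few lines: choose one piercing point from each contour's 'megalopolis' and an order in which to visit the groups. The brute-force Python sketch below uses toy coordinates and ignores technological precedence constraints; it is illustrative only, not Chentsov's dynamic-programming model:

      from itertools import permutations, product
      from math import dist

      # Each 'megalopolis' is the set of candidate piercing points for one contour.
      megalopolises = [[(0, 0), (0, 2)], [(3, 1), (4, 0)], [(1, 4), (2, 5)]]
      home = (0, -1)                           # tool start/end position

      def tour_length(order, picks):
          pts = [home] + [picks[i] for i in order] + [home]
          return sum(dist(a, b) for a, b in zip(pts, pts[1:]))

      best = min((tour_length(order, picks), order, picks)
                 for order in permutations(range(len(megalopolises)))
                 for picks in product(*megalopolises))
      print(best)                              # shortest idle-travel tour found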

  4. Investigation on the use of optimization techniques for helicopter airframe vibrations design studies

    NASA Technical Reports Server (NTRS)

    Sreekanta Murthy, T.

    1992-01-01

    Results of the investigation of formal nonlinear programming-based numerical optimization techniques of helicopter airframe vibration reduction are summarized. The objective and constraint function and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.

  5. A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)

    NASA Astrophysics Data System (ADS)

    High, Wayne

    1993-03-01

    This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction of and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by applying conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machines model, which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machines model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard, which results in the derivation of test sequences for a SAFENET profile. The results of applying this new method to SAFENET's OSI and Lightweight profiles are presented.
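
    The core of deriving a test sequence from such a machine model is to exercise every specified transition. The Python sketch below computes a naive transition tour over a toy protocol FSM; it assumes every transition is reachable from every state, and it is not the thesis's actual derivation algorithm:

      def transition_tour(fsm, start):
          """Greedy tour exercising every transition of a deterministic FSM.
          fsm: {state: {input: next_state}}. Assumes the FSM is strongly
          connected. Returns the input test sequence."""
          untested = {(s, a) for s, outs in fsm.items() for a in outs}
          state, seq = start, []
          while untested:
              frontier, seen = [(state, [])], {state}
              while frontier:                  # BFS to the nearest untested transition
                  s, path = frontier.pop(0)
                  a = next((x for x in fsm[s] if (s, x) in untested), None)
                  if a is not None:
                      seq += path + [a]
                      untested.discard((s, a))
                      state = fsm[s][a]
                      break
                  for b, t in fsm[s].items():
                      if t not in seen:
                          seen.add(t)
                          frontier.append((t, path + [b]))
          return seq

      # Toy connection protocol; states, inputs, and transitions are invented.
      fsm = {'closed': {'open': 'open'}, 'open': {'send': 'open', 'close': 'closed'}}
      print(transition_tour(fsm, 'closed'))    # ['open', 'send', 'close']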

  6. From non-trivial geometries to power spectra and vice versa

    NASA Astrophysics Data System (ADS)

    Brooker, D. J.; Tsamis, N. C.; Woodard, R. P.

    2018-04-01

    We review a recent formalism which derives the functional forms of the primordial (tensor and scalar) power spectra of scalar potential inflationary models. The formalism incorporates the case of geometries with non-constant first slow-roll parameter. Analytic expressions for the power spectra are given that explicitly display the dependence on the geometric properties of the background. Moreover, we present the full algorithm for using our formalism to reconstruct the model from the observed power spectra. Our techniques are applied to models possessing "features" in their potential, with excellent agreement.

  7. Keldysh formalism for multiple parallel worlds

    NASA Astrophysics Data System (ADS)

    Ansari, M.; Nazarov, Y. V.

    2016-03-01

    We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

  8. Keldysh formalism for multiple parallel worlds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansari, M.; Nazarov, Y. V., E-mail: y.v.nazarov@tudelft.nl

    We present a compact and self-contained review of the recently developed Keldysh formalism for multiple parallel worlds. The formalism has been applied to consistent quantum evaluation of the flows of informational quantities, in particular, to the evaluation of Renyi and Shannon entropy flows. We start with the formulation of the standard and extended Keldysh techniques in a single world in a form convenient for our presentation. We explain the use of Keldysh contours encompassing multiple parallel worlds. In the end, we briefly summarize the concrete results obtained with the method.

  9. DRS: Derivational Reasoning System

    NASA Technical Reports Server (NTRS)

    Bose, Bhaskar

    1995-01-01

    The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.

  10. 29 CFR 102.54 - Initiation of formal compliance proceedings; issuance of compliance specification and notice of...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the purposes and policies of the Act or to avoid unnecessary costs or delay, the Regional Director may... policies of the Act or to avoid unnecessary costs or delay, the Regional Director may consolidate with a... formal proceeding, the Regional Director may issue and serve on all parties a compliance specification in...

  11. 29 CFR 102.54 - Initiation of formal compliance proceedings; issuance of compliance specification and notice of...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the purposes and policies of the Act or to avoid unnecessary costs or delay, the Regional Director may... policies of the Act or to avoid unnecessary costs or delay, the Regional Director may consolidate with a... formal proceeding, the Regional Director may issue and serve on all parties a compliance specification in...

  12. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  13. The effect of tinnitus specific intracochlear stimulation on speech perception in patients with unilateral or asymmetric hearing loss accompanied with tinnitus and the effect of formal auditory training.

    PubMed

    Arts, Remo A G J; George, Erwin L J; Janssen, Miranda A M L; Griessner, Andreas; Zierhofer, Clemens; Stokroos, Robert J

    2018-06-01

    Previous studies show that intracochlear electrical stimulation independent of environmental sounds appears to suppress tinnitus, even long-term. In order to assess the viability of this potential treatment option it is essential to study the effects of this tinnitus specific electrical stimulation on speech perception. A randomised, prospective crossover design. Ten patients with unilateral or asymmetric hearing loss and severe tinnitus complaints. The audiological effects of standard clinical CI, formal auditory training and tinnitus specific electrical stimulation were investigated. Results show that standard clinical CI in unilateral or asymmetric hearing loss is shown to be beneficial for speech perception in quiet, speech perception in noise and subjective hearing ability. Formal auditory training does not appear to improve speech perception performance. However, CI-related discomfort reduces significantly more rapidly during CI rehabilitation in subjects receiving formal auditory training. Furthermore, tinnitus specific electrical stimulation has neither positive nor negative effects on speech perception. In combination with the findings from previous studies on tinnitus suppression using intracochlear electrical stimulation independent of environmental sounds, the results of this study contribute to the viability of cochlear implantation based on tinnitus complaints.

  14. Combinatorial structures to modeling simple games and applications

    NASA Astrophysics Data System (ADS)

    Molinero, Xavier

    2017-09-01

    We connect three different topics: combinatorial structures, game theory and chemistry. In particular, we establish the bases to represent some simple games, defined as influence games, and molecules, defined from atoms, by using combinatorial structures. First, we characterize simple games as influence games using influence graphs. This lets us model simple games as combinatorial structures (from the viewpoint of structures or graphs). Second, we formally define molecules as combinations of atoms. This lets us model molecules as combinatorial structures (from the viewpoint of combinations). It remains open to generate such combinatorial structures using specific techniques such as genetic algorithms, (meta-)heuristic algorithms and parallel programming, among others.
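
    A minimal executable rendering of a simple game makes the combinatorial structure explicit. The weighted majority game below, with invented weights and quota, classifies every coalition as winning or losing and counts how often each player is critical:

      from itertools import combinations

      players = {'A': 4, 'B': 3, 'C': 2, 'D': 1}   # illustrative voting weights
      quota = 6                                     # a coalition wins at weight >= quota

      def winning(coalition):
          return sum(players[p] for p in coalition) >= quota

      # Enumerate the combinatorial structure: all winning coalitions.
      wins = [set(c) for r in range(1, len(players) + 1)
              for c in combinations(players, r) if winning(c)]
      print(wins)
      # A player is critical in a coalition if removing it turns a win into a loss.
      critical = {p: sum(1 for w in wins if p in w and not winning(w - {p}))
                  for p in players}
      print(critical)   # raw Banzhaf counts, e.g. {'A': 5, 'B': 3, 'C': 3, 'D': 1}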

  15. Software development for safety-critical medical applications

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.

  16. Preface to MOST-ONISW 2009

    NASA Astrophysics Data System (ADS)

    Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil

    Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.

  17. Linking Simulation with Formal Verification and Modeling of Wireless Sensor Network in TLA+

    NASA Astrophysics Data System (ADS)

    Martyna, Jerzy

    In this paper, we present the results of the simulation of a wireless sensor network based on the flooding technique and SPIN protocols. The wireless sensor network was specified and verified by means of the TLA+ specification language [1]. For a model of a wireless sensor network built this way, simulation was carried out with the help of specially constructed software tools. The obtained results allow us to predict the behaviour of the wireless sensor network in various topologies and spatial densities. Visualization of the output data enables precise examination of some phenomena in wireless sensor networks, such as the hidden terminal problem.
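
    For intuition about the flooding technique being verified, a few lines of Python suffice to simulate naive flooding on an invented 5-node topology and expose the redundant transmissions that make such protocols worth model checking:

      from collections import deque

      def flood(adj, source):
          """Simulate naive flooding: every node forwards the message once to all
          of its neighbors. Returns (nodes reached, total transmissions)."""
          forwarded, transmissions = {source}, 0
          queue = deque([source])
          while queue:
              node = queue.popleft()
              for nbr in adj[node]:
                  transmissions += 1              # each forward costs one radio send
                  if nbr not in forwarded:
                      forwarded.add(nbr)
                      queue.append(nbr)
          return forwarded, transmissions

      # 5-node sensor field; redundant links produce duplicate deliveries.
      adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}
      reached, sends = flood(adj, 0)
      print(reached, sends)   # all 5 nodes reached with 12 sends, far above the 4 needed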

  18. A Formal Semantics for the WS-BPEL Recovery Framework

    NASA Astrophysics Data System (ADS)

    Dragoni, Nicola; Mazzara, Manuel

    While current studies on Web services composition are mostly focused - from the technical viewpoint - on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL) - an OASIS standard widely adopted both in academic and industrial environments - is considered as a touchstone for concrete composition languages and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the π-calculus. This has to be intended as a well known case study providing methodological arguments for the adoption of formal methods in software specification. The aspect of verification is not the main topic of the paper but some hints are given.

  19. Formality of the Chinese collective leadership.

    PubMed

    Li, Haiying; Graesser, Arthur C

    2016-09-01

    We investigated the linguistic patterns in the discourse of four generations of the collective leadership of the Communist Party of China (CPC) from 1921 to 2012. The texts of Mao Zedong, Deng Xiaoping, Jiang Zemin, and Hu Jintao were analyzed using computational linguistic techniques (a Chinese formality score) to explore the persuasive linguistic features of the leaders in the contexts of power phase, the nation's education level, power duration, and age. The study was guided by the elaboration likelihood model of persuasion, which includes a central route (represented by formal discourse) versus a peripheral route (represented by informal discourse) to persuasion. The results revealed that these leaders adopted the formal, central route more when they were in power than before they came into power. The nation's education level was a significant factor in the leaders' adoption of the persuasion strategy. The leaders' formality also decreased with increasing age and time in power. However, the predictability of these factors for formality had subtle differences among the different types of leaders. These results enhance our understanding of the Chinese collective leadership and the role of formality in politically persuasive messages.
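
    Formality scores of this kind are typically linear combinations of part-of-speech frequencies. The sketch below computes the classic Heylighen-Dewaele F-score from POS percentages; the paper's Chinese formality score is analogous, but its exact weighting may differ, and the sample percentages are invented:

      def f_score(pos_pct):
          """Heylighen-Dewaele formality measure from part-of-speech percentages.
          pos_pct maps a POS tag to the percentage of words with that tag."""
          formal = sum(pos_pct.get(t, 0.0) for t in ('noun', 'adjective',
                                                     'preposition', 'article'))
          deictic = sum(pos_pct.get(t, 0.0) for t in ('pronoun', 'verb',
                                                      'adverb', 'interjection'))
          return (formal - deictic + 100.0) / 2.0

      speech = {'noun': 28.0, 'adjective': 7.0, 'preposition': 12.0, 'article': 8.0,
                'pronoun': 10.0, 'verb': 20.0, 'adverb': 5.0, 'interjection': 0.5}
      print(f_score(speech))   # 59.75; higher scores indicate more formal discourse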

  20. The Moderating Effects of Group Work on the Relationship Between Motivation and Cognitive Load

    ERIC Educational Resources Information Center

    Costley, Jamie; Lange, Christopher

    2018-01-01

    Semi-formal learning is used to describe learning that is directed towards the goals of a formal learning institution but outside of the learning structure of a specific class. Students studying online may form semi-formal groups to increase their knowledge of the content by interacting with other learners taking the same class. This study of…

  1. Microprocessor Simulation: A Training Technique.

    ERIC Educational Resources Information Center

    Oscarson, David J.

    1982-01-01

    Describes the design and application of a microprocessor simulation using BASIC for formal training of technicians and managers and as a management tool. Illustrates the utility of the modular approach for the instruction and practice of decision-making techniques. (SK)

  2. Approaches to formalization of the informal waste sector into municipal solid waste management systems in low- and middle-income countries: Review of barriers and success factors.

    PubMed

    Aparcana, Sandra

    2017-03-01

    The Municipal Solid Waste Management (MSWM) sector represents a major challenge for low-and middle-income countries due to significant environmental and socioeconomic issues involving rapid urbanization, their MSWM systems, and the existence of the informal waste sector. Recognizing its role, several countries have implemented various formalization measures, aiming to address the social problems linked to this sector. However, regardless of these initiatives, not all attempts at formalization have proved successful due to the existence of barriers preventing their implementation in the long term. Along with this, there is a frequent lack of knowledge or understanding regarding these barriers and the kind of measures that may enable formalization, thereby attaining a win-win situation for all the stakeholders involved. In this context, policy- and decision-makers in the public and private sectors are frequently confronted with the dilemma of finding workable approaches to formalization, adjusted to their particular MSWM contexts. Building on the review of frequently implemented approaches to formalization, including an analysis of the barriers to and enabling measures for formalization, this paper aims to address this gap by explaining to policy- and decision-makers, and to waste managers in the private sector, certain dynamics that can be observed and that should be taken into account when designing formalization strategies that are adapted to their particular socioeconomic and political-institutional context. This includes possible links between formalization approaches and barriers, the kinds of barriers that need to be removed, and enabling measures leading to successful formalization in the long term. This paper involved a literature review of common approaches to formalization, which were classified into three categories: (1) informal waste workers organized in associations or cooperatives; (2) organized in CBOs or MSEs; and (3) contracted as individual workers by the formal waste sector. This was followed by the identification and subsequent classification of measures for removing common barriers to formalization into five categories: policy/legal, institutional/organizational, technical, social, and economic/financial. The approaches to formalization, as well as the barrier categories, were validated through the assessment of twenty case studies of formalization. Building on the assessment, the paper discussed possible links between formalization approaches and barriers, the 'persistent' challenges that represent barriers to formalization, as well as key enabling factors improving the likelihood of successful formalization. Regardless of the type of approach adopted to formalization, the review identifies measures to remove barriers in all five categories, with a stronger link between the approaches 1 and 2 and the existence of measures in the policy, institutional, and financial categories. Regarding persistent barriers, the review identified ones arising from the absence of measures to address a particular issue before formalization or due to specific country- or sector-related conditions, and their interaction with the MSWM context. 75% of the case studies had persistent barriers in respect of policy/legal issues, 50% of institutional/organizational, 45% of financial/economic, and 40%, and 35% of social and technical issues respectively. 
This paper concludes that independently of the formalization approach, the lack of interventions or measures in any of the five categories of barriers may lead formalization initiatives to fail, as unaddressed barriers become 'persistent' after formalization is implemented. Furthermore, 'persistent barriers' may also appear due to unfavorable country-specific conditions. The success of a formalization initiative does not depend on a specific approach, but most likely on the inclusion of country-appropriate measures at the policy, economic and institutional levels. The empowerment of informal waste-workers is again confirmed as a further key success factor for their formalization. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. An introduction to requirements capture using PVS: Specification of a simple autopilot

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1996-01-01

    This paper presents an introduction to capturing software requirements in the PVS formal language. The object of study is a simplified digital autopilot that was motivated in part by the mode control panel of NASA Langley's Boeing 737 research aircraft. The paper first presents the requirements for this autopilot in English and then steps the reader through a translation of these requirements into formal mathematics. Along the way deficiencies in the English specification are noted and repaired. Once completed, the formal PVS requirement is analyzed using the PVS theorem prover and shown to maintain an invariant over its state space.
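
    A flavor of this state-machine style of requirements capture can be sketched in ordinary code before committing to PVS. The mode logic and invariant below are hypothetical stand-ins, not the paper's autopilot:

      # Minimal sketch of a mode-control state machine with an invariant;
      # the modes and the guard are illustrative assumptions.
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class State:
          mode: str = "OFF"            # one of OFF, ARMED, ENGAGED
          altitude_hold: bool = False

      def invariant(s: State) -> bool:
          # Altitude hold may only be active while the autopilot is engaged.
          return not s.altitude_hold or s.mode == "ENGAGED"

      def engage(s: State) -> State:
          return State("ENGAGED", s.altitude_hold)

      def toggle_altitude_hold(s: State) -> State:
          # The guard mirrors the invariant: requests are ignored unless engaged.
          if s.mode == "ENGAGED":
              return State(s.mode, not s.altitude_hold)
          return s

      s = State()
      for op in (engage, toggle_altitude_hold):
          s = op(s)
          assert invariant(s)   # checked on one trace here; proved for all states in PVS

    In PVS, the same invariant would be proved over all reachable states with the theorem prover rather than asserted on a single execution trace.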

  4. Position paper: the science of deep specification.

    PubMed

    Appel, Andrew W; Beringer, Lennart; Chlipala, Adam; Pierce, Benjamin C; Shao, Zhong; Weirich, Stephanie; Zdancewic, Steve

    2017-10-13

    We introduce our efforts within the project 'The science of deep specification' to work out the key formal underpinnings of industrial-scale formal specifications of software and hardware components, anticipating a world where large verified systems are routinely built out of smaller verified components that are also used by many other projects. We identify an important class of specification that has already been used in a few experiments that connect strong component-correctness theorems across the work of different teams. To help popularize the unique advantages of that style, we dub it deep specification, and we say that it encompasses specifications that are rich, two-sided, formal and live (terms that we define in the article). Our core team is developing a proof-of-concept system (based on the Coq proof assistant) whose specification and verification work is divided across largely decoupled subteams at our four institutions, encompassing hardware microarchitecture, compilers, operating systems and applications, along with cross-cutting principles and tools for effective specification. We also aim to catalyse interest in the approach, not just by basic researchers but also by users in industry. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).

  5. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A service workflow is an aggregation of distributed services to fulfill specific functionalities. With ever-increasing available services, methodologies for selecting services against given requirements have become a main research subject in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec has been demonstrated by examples addressing soundness, completeness, and consistency.
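
    The SWSpec language itself is not reproduced in the abstract, but the flavor of logic-based compliance checking over a composition can be sketched. In this hypothetical model, each service advertises propositions it requires and produces, and a sequential composition is sound when every requirement is discharged by what precedes it:

      # Hypothetical proposition-set model of a sequential workflow; this is
      # an illustration of compliance checking, not SWSpec itself.
      def composition_sound(services, initially=frozenset()):
          """services: list of (name, requires, produces) triples."""
          available = set(initially)
          for name, requires, produces in services:
              missing = set(requires) - available
              if missing:
                  return False, f"{name} lacks {sorted(missing)}"
              available |= set(produces)
          return True, "all requirements discharged"

      workflow = [
          ("authenticate", {"credentials"}, {"session"}),
          ("fetch_order",  {"session"},     {"order"}),
          ("ship",         {"order", "session"}, {"tracking_id"}),
      ]
      print(composition_sound(workflow, initially={"credentials"}))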

  6. Recent trends related to the use of formal methods in software engineering

    NASA Technical Reports Server (NTRS)

    Prehn, Soren

    1986-01-01

    An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Attention is focused on ongoing activities in Europe, since there seems to be a notable difference in attitude towards industrial usage of formal methods between Europe and the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ni, Xiaotong; Van den Nest, Maarten; Buerschaper, Oliver

    We propose a non-commutative extension of the Pauli stabilizer formalism. The aim is to describe a class of many-body quantum states which is richer than the standard Pauli stabilizer states. In our framework, stabilizer operators are tensor products of single-qubit operators drawn from the group 〈αI, X, S〉, where α = e^{iπ/4} and S = diag(1, i). We provide techniques to efficiently compute various properties related to bipartite entanglement, expectation values of local observables, preparation by means of quantum circuits, parent Hamiltonians, etc. We also highlight significant differences compared to the Pauli stabilizer formalism. In particular, we give examples of states in our formalism which cannot arise in the Pauli stabilizer formalism, such as topological models that support non-Abelian anyons.
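
    For reference, the single-qubit generators named in the abstract can be written out explicitly (standard matrix definitions; the conjugation relation at the end is easily checked and hints at why the resulting group is richer than the Pauli group):

      X = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad
      S = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix}, \quad
      \alpha = e^{i\pi/4}, \qquad
      S^2 = Z, \quad S X S^{\dagger} = Y.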

  8. Genetic programming assisted stochastic optimization strategies for optimization of glucose to gluconic acid fermentation.

    PubMed

    Cheema, Jitender Jit Singh; Sankpal, Narendra V; Tambe, Sanjeev S; Kulkarni, Bhaskar D

    2002-01-01

    This article presents two hybrid strategies for the modeling and optimization of the glucose to gluconic acid batch bioprocess. In the hybrid approaches, first a novel artificial intelligence formalism, namely, genetic programming (GP), is used to develop a process model solely from the historic process input-output data. In the next step, the input space of the GP-based model, representing process operating conditions, is optimized using two stochastic optimization (SO) formalisms, viz., genetic algorithms (GAs) and simultaneous perturbation stochastic approximation (SPSA). These SO formalisms possess certain unique advantages over the commonly used gradient-based optimization techniques. The principal advantage of the GP-GA and GP-SPSA hybrid techniques is that process modeling and optimization can be performed exclusively from the process input-output data without invoking the detailed knowledge of the process phenomenology. The GP-GA and GP-SPSA techniques have been employed for modeling and optimization of the glucose to gluconic acid bioprocess, and the optimized process operating conditions obtained thereby have been compared with those obtained using two other hybrid modeling-optimization paradigms integrating artificial neural networks (ANNs) and GA/SPSA formalisms. Finally, the overall optimized operating conditions given by the GP-GA method, when verified experimentally resulted in a significant improvement in the gluconic acid yield. The hybrid strategies presented here are generic in nature and can be employed for modeling and optimization of a wide variety of batch and continuous bioprocesses.
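
    The division of labor in the hybrid strategy can be sketched as follows. The quadratic yield surface stands in for the GP-evolved process model, and the genetic algorithm is reduced to its essentials; both are illustrative assumptions, not the authors' implementation:

      # Sketch of the GP-GA hybrid: a surrogate model maps operating
      # conditions to yield, and a minimal GA searches its input space.
      import random

      random.seed(0)

      def surrogate_yield(glucose, ph):
          # Hypothetical gluconic acid yield over (glucose g/L, pH);
          # stands in for the model evolved by genetic programming.
          return 90.0 - (glucose - 150.0) ** 2 / 500.0 - (ph - 5.5) ** 2

      BOUNDS = [(50.0, 250.0), (3.0, 8.0)]   # assumed operating ranges

      def mutate(x, rate=0.1):
          return [min(hi, max(lo, xi + random.gauss(0, rate * (hi - lo))))
                  for xi, (lo, hi) in zip(x, BOUNDS)]

      # Rank selection plus Gaussian mutation, standing in for the GA step.
      pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(30)]
      for _ in range(60):
          pop.sort(key=lambda x: surrogate_yield(*x), reverse=True)
          pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

      best = max(pop, key=lambda x: surrogate_yield(*x))
      print(best, surrogate_yield(*best))    # converges near (150, 5.5)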

  9. Physically motivated correlation formalism in hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Roy, Ankita; Rafert, J. Bruce

    2004-05-01

    Most remote sensing data sets contain a limiting number of independent spatial and spectral measurements, beyond which no effective increase in information is achieved. This paper presents a Physically Motivated Correlation Formalism (PMCF), which places both spatial and spectral data on an equivalent mathematical footing in the context of a specific kernel, such that optimal combinations of independent data can be selected from the entire hypercube via the method of "Correlation Moments". We present an experimental and computational analysis of hyperspectral data sets using the Michigan Tech VFTHSI [Visible Fourier Transform Hyperspectral Imager], based on a Sagnac interferometer, adjusted to obtain high SNR levels. The captured signal interferograms of different targets - aerial snaps of Houghton and lab-based data (white light, He-Ne laser, discharge tube sources), with the provision of customized scans of targets with the same exposures - are processed using inverse imaging transformations and filtering techniques to obtain the spectral profiles and generate hypercubes to compute spectral/spatial/cross moments. PMCF answers the question of how optimally the entire hypercube should be sampled and finds how many spatial-spectral pixels are required for a particular target recognition task.
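
    The underlying idea, that a hypercube contains far fewer independent measurements than raw pixels, can be illustrated with a crude band-correlation count on synthetic data; the statistic below is a stand-in for intuition only, not the paper's correlation-moment kernel:

      # Count effectively independent spectral bands in a synthetic
      # hypercube (32x32 pixels, 64 bands mixed from 8 components).
      import numpy as np

      rng = np.random.default_rng(1)
      base = rng.normal(size=(32, 32, 8))
      mixing = rng.normal(size=(8, 64))
      cube = np.tensordot(base, mixing, axes=([2], [0]))

      bands = cube.reshape(-1, cube.shape[2])            # pixels x bands
      eigvals = np.linalg.eigvalsh(np.corrcoef(bands, rowvar=False))[::-1]
      effective = int(np.sum(np.cumsum(eigvals) / eigvals.sum() < 0.99)) + 1
      print(f"{effective} of {cube.shape[2]} bands carry ~99% of the variance")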

  10. Semantic Data Integration and Knowledge Management to Represent Biological Network Associations.

    PubMed

    Losko, Sascha; Heumann, Klaus

    2017-01-01

    The vast quantities of information generated by academic and industrial research groups are reflected in a rapidly growing body of scientific literature and exponentially expanding resources of formalized data, including experimental data, originating from a multitude of "-omics" platforms, phenotype information, and clinical data. For bioinformatics, the challenge remains to structure this information so that scientists can identify relevant information, to integrate this information as specific "knowledge bases," and to formalize this knowledge across multiple scientific domains to facilitate hypothesis generation and validation. Here we report on progress made in building a generic knowledge management environment capable of representing and mining both explicit and implicit knowledge and, thus, generating new knowledge. Risk management in drug discovery and clinical research is used as a typical example to illustrate this approach. In this chapter we introduce techniques and concepts (such as ontologies, semantic objects, typed relationships, contexts, graphs, and information layers) that are used to represent complex biomedical networks. The BioXM™ Knowledge Management Environment is used as an example to demonstrate how a domain such as oncology is represented and how this representation is utilized for research.

  11. Coherence specific signal detection via chiral pump-probe spectroscopy.

    PubMed

    Holdaway, David I H; Collini, Elisabetta; Olaya-Castro, Alexandra

    2016-05-21

    We examine transient circular dichroism (TRCD) spectroscopy as a technique to investigate signatures of exciton coherence dynamics under the influence of structured vibrational environments. We consider a pump-probe configuration with a linearly polarized pump and a circularly polarized probe, with a variable angle θ between the two directions of propagation. In our theoretical formalism the signal is decomposed in chiral and achiral doorway and window functions. Using this formalism, we show that the chiral doorway component, which beats during the population time, can be isolated by comparing signals with different values of θ. As in the majority of time-resolved pump-probe spectroscopy, the overall TRCD response shows signatures of both excited and ground state dynamics. However, we demonstrate that the chiral doorway function has only a weak ground state contribution, which can generally be neglected if an impulsive pump pulse is used. These findings suggest that the pump-probe configuration of optical TRCD in the impulsive limit has the potential to unambiguously probe quantum coherence beating in the excited state. We present numerical results for theoretical signals in an example dimer system.

  12. Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.

    2006-01-01

    NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano-Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time more difficult to design and ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.

  13. Industrialization of the mirror plate coatings for the ATHENA mission

    NASA Astrophysics Data System (ADS)

    Massahi, S.; Christensen, F. E.; Ferreira, D. D. M.; Shortt, B.; Collon, M.; Sforzini, J.; Landgraf, B.; Hinze, F.; Aulhorn, S.; Biedermann, R.

    2017-08-01

    In the frame of the development of the Advanced Telescope for High-ENergy Astrophysics (Athena) mission, currently in phase A, ESA is continuing to mature the optics technology and the associated mass production techniques. These efforts are driven by the programmatic and technical requirement of reaching TRL 6 prior to proposing the mission for formal adoption (planned for 2020). A critical part of the current phase A preparation activities is addressing the industrialization of the Silicon Pore Optics mirror plate coatings. This includes the transfer of the well-established coating processes and techniques, performed at DTU Space, to an industrial-scale facility suitable for coating the more than 100,000 mirror plates required for Athena. In this paper, we explain the considerations for the planned coating facility, including requirement specification, equipment and supplier selection, preparation of the facility for the deposition equipment, and design and fabrication.

  14. Nanomechanical effects of light unveil photons momentum in medium

    PubMed Central

    Verma, Gopal; Chaudhary, Komal; Singh, Kamal P.

    2017-01-01

    Precision measurement of momentum transfer between light and a fluid interface has many implications, including resolving the intriguing nature of photon momentum in a medium. For example, the existence of the Abraham pressure of light under specific experimental configurations and the predictions of the Chau-Amperian formalism of optical momentum for TE and TM polarizations remain untested. Here, we quantitatively and cleanly measure the nanomechanical dynamics of a water surface excited by the radiation pressure of a laser beam. We systematically scanned a wide range of experimental parameters, including long exposure times, angle of incidence, spot size, and laser polarization, and used two independent pump-probe techniques to validate a nano-bump on the water surface under all the tested conditions, in quantitative agreement with the Minkowski momentum of light. With careful experiments, we demonstrate the advantages and limitations of nanometer-resolved optical probing techniques and narrow down the actual manifestation of optical momentum in a medium. PMID:28198468

  15. Quantifying fibrosis in head and neck cancer treatment: An overview.

    PubMed

    Moloney, Emma C; Brunner, Markus; Alexander, Ashlin J; Clark, Jonathan

    2015-08-01

    Fibrosis is a common late complication of radiotherapy and/or surgical treatment for head and neck cancers. Fibrosis is difficult to quantify, and formal methods of measurement are not well recognized. The purpose of this review was to summarize the methods available to quantify neck fibrosis. A PubMed search of articles was carried out using the key words "neck" and "fibrosis." Many methods have been used to assess fibrosis; however, there is no preferred methodology. Specific to neck fibrosis, most studies have relied upon hand-palpation rating scales. Indentation and suction techniques have been used to mechanically quantify neck fibrosis. There is scope to develop applications of ultrasound, dielectric, bioimpedance, and MRI techniques for use in the neck region. Quantitative assessment of neck fibrosis is sought after in order to compare treatment regimens and improve quality-of-life outcomes in patients with head and neck cancer. © 2014 Wiley Periodicals, Inc.

  16. Forensic detection of noise addition in digital images

    NASA Astrophysics Data System (ADS)

    Cao, Gang; Zhao, Yao; Ni, Rongrong; Ou, Bo; Wang, Yongbin

    2014-03-01

    We propose a technique to detect the global addition of noise to a digital image. As an anti-forensics tool, noise addition is typically used to disguise the visual traces of image tampering or to remove the statistical artifacts left behind by other operations. As such, the blind detection of noise addition has become imperative as well as beneficial for authenticating image content and recovering the image processing history, which is the goal of general forensics techniques. Specifically, special image blocks, including constant and strip ones, are used to construct the features for identifying noise-addition manipulation. The influence of noising on the blockwise pixel value distribution is formulated and analyzed formally. A methodology of detectability recognition followed by binary decision is proposed to ensure the applicability and reliability of noising detection. Extensive experimental results demonstrate the efficacy of our proposed noising detector.
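
    The intuition that near-constant blocks betray globally added noise can be sketched as follows; the block statistic, test image, and comparison are illustrative assumptions, not the authors' features:

      # The variance of the flattest blocks jumps when global noise is
      # added, since constant blocks should otherwise be near-zero.
      import numpy as np

      rng = np.random.default_rng(0)

      def flat_block_variance(img, block=8, quantile=0.1):
          """Mean variance over the flattest blocks of the image."""
          h, w = (d - d % block for d in img.shape)
          tiles = (img[:h, :w]
                   .reshape(h // block, block, w // block, block)
                   .transpose(0, 2, 1, 3)
                   .reshape(-1, block * block))
          variances = np.sort(tiles.var(axis=1))
          k = max(1, int(quantile * variances.size))
          return float(variances[:k].mean())

      clean = np.kron(rng.integers(0, 4, (8, 8)), np.ones((8, 8))) * 60.0
      noisy = clean + rng.normal(0, 3, clean.shape)   # sigma = 3 noise
      print(flat_block_variance(clean), flat_block_variance(noisy))
      # ~0.0 for the piecewise-constant image versus ~9 (sigma^2) after noising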

  17. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  18. Impact of metals in surface matrices from formal and informal electronic-waste recycling around Metro Manila, the Philippines, and intra-Asian comparison.

    PubMed

    Fujimori, Takashi; Takigami, Hidetaka; Agusa, Tetsuro; Eguchi, Akifumi; Bekki, Kanae; Yoshida, Aya; Terazono, Atsushi; Ballesteros, Florencio C

    2012-06-30

    We report concentrations, enrichment factors, and hazard indicators of 11 metals (Ag, As, Cd, Co, Cu, Fe, In, Mn, Ni, Pb, and Zn) in soil and dust surface matrices from formal and informal electronic waste (e-waste) recycling sites around Metro Manila, the Philippines, referring to soil guidelines and previous data from various e-waste recycling sites in Asia. Surface dust from e-waste recycling sites had higher levels of metal contamination than surface soil. Comparison of formal and informal e-waste recycling sites (hereafter, "formal" and "informal") revealed differences in specific contaminants. Formal dust contained a mixture of serious pollutant metals (Ni, Cu, Pb, and Zn) and Cd (polluted modestly), quite high enrichment metals (Ag and In), and crust-derived metals (As, Co, Fe, and Mn). For informal soil, concentration levels of specific metals (Cd, Co, Cu, Mn, Ni, Pb, and Zn) were similar among Asian recycling sites. Formal dust had significantly higher hazardous risk than the other matrices (p<0.005), excluding informal dust (p=0.059, almost significant difference). Thus, workers exposed to formal dust should protect themselves from hazardous toxic metals (Pb and Cu). There is also a high health risk for children ingesting surface matrices from informal e-waste recycling sites. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
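
    The mechanical core of the category-partition method is easy to sketch: enumerate the choices in each category, form the cross-product, and prune combinations that violate constraints. The categories and constraint below are invented for illustration:

      # Category-partition sketch for a hypothetical file-copy utility.
      from itertools import product

      categories = {
          "source":      ["exists", "missing", "unreadable"],
          "destination": ["empty", "occupied"],
          "size":        ["zero", "small", "huge"],
      }

      def feasible(frame):
          # Constraint: size only matters when the source actually exists.
          return frame["source"] == "exists" or frame["size"] == "zero"

      names = list(categories)
      frames = [dict(zip(names, combo))
                for combo in product(*categories.values())]
      test_frames = [f for f in frames if feasible(f)]
      print(len(frames), "raw combinations ->", len(test_frames), "test frames")
      # 18 raw combinations -> 10 test frames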

  20. Structured representation for requirements and specifications

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Fisher, Gene; Frincke, Deborah; Wolber, Dave

    1991-01-01

    This document was generated in support of NASA contract NAS1-18586, Design and Validation of Digital Flight Control Systems suitable for Fly-By-Wire Applications, Task Assignment 2. Task 2 is associated with a formal representation of requirements and specifications. In particular, this document contains results associated with the development of a Wide-Spectrum Requirements Specification Language (WSRSL) that can be used to express system requirements and specifications in both stylized and formal forms. Included with this development are prototype tools to support the specification language. In addition a preliminary requirements specification methodology based on the WSRSL has been developed. Lastly, the methodology has been applied to an Advanced Subsonic Civil Transport Flight Control System.

  1. The Digital electronic Guideline Library (DeGeL): a hybrid framework for representation and use of clinical guidelines.

    PubMed

    Shahar, Yuval; Young, Ohad; Shalom, Erez; Mayaffit, Alon; Moskovitch, Robert; Hessing, Alon; Galperin, Maya

    2004-01-01

    We propose to present a poster (and potentially also a demonstration of the implemented system) summarizing the current state of our work on a hybrid, multiple-format representation of clinical guidelines that facilitates conversion of guidelines from free text to a formal representation. We describe a distributed Web-based architecture (DeGeL) and a set of tools using the hybrid representation. The tools enable performing tasks such as guideline specification, semantic markup, search, retrieval, visualization, eligibility determination, runtime application and retrospective quality assessment. The representation includes four parallel formats: Free text (one or more original sources); semistructured text (labeled by the target guideline-ontology semantic labels); semiformal text (which includes some control specification); and a formal, machine-executable representation. The specification, indexing, search, retrieval, and browsing tools are essentially independent of the ontology chosen for guideline representation, but editing the semi-formal and formal formats requires ontology-specific tools, which we have developed in the case of the Asbru guideline-specification language. The four formats support increasingly sophisticated computational tasks. The hybrid guidelines are stored in a Web-based library. All tools, such as for runtime guideline application or retrospective quality assessment, are designed to operate on all representations. We demonstrate the hybrid framework by providing examples from the semantic markup and search tools.

  2. Formal optimization of hovering performance using free wake lifting surface theory

    NASA Technical Reports Server (NTRS)

    Chung, S. Y.

    1986-01-01

    Free wake techniques for performance prediction and optimization of a hovering rotor are discussed. The influence functions due to vortex rings, vortex cylinders, and source or vortex sheets are presented. The vortex core sizes of rotor wake vortices are calculated and their importance is discussed. Lifting body theory for a finite-thickness body is developed for pressure calculation, and hence performance prediction, of hovering rotors. A numerical optimization technique based on free wake lifting line theory is presented and discussed. It is demonstrated that formal optimization can be used with an implicit and nonlinear objective or cost function, such as the performance of hovering rotors as used in this report.

  3. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  4. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 2: Formal specification and correctness theorems

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1991-01-01

    Presented here is a formal specification and verification of a property of a quadruplicately redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved is given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. The verification was carried out using a computer-aided design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
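
    The interactive-consistency task can be illustrated with a toy simulation of the one-round oral-messages algorithm OM(1) for four processors, in the spirit of Pease, Shostak, and Lamport; this is a textbook rendering, not the FtCayuga instruction itself:

      # OM(1): a commander and three lieutenants, at most one traitorous
      # lieutenant; loyal lieutenants decide by majority vote.
      from collections import Counter

      def om1(commander_value, lying_lieutenant=None):
          received = {i: commander_value for i in range(3)}   # round 1
          decisions = []
          for i in range(3):                                  # round 2: relays
              votes = [received[i]]
              for j in range(3):
                  if j == i:
                      continue
                  relayed = received[j]
                  if j == lying_lieutenant:
                      relayed = 1 - relayed                   # traitor flips value
                  votes.append(relayed)
              decisions.append(Counter(votes).most_common(1)[0][0])
          return decisions

      # Loyal lieutenants still agree on the commander's value.
      print(om1(commander_value=1, lying_lieutenant=2))       # -> [1, 1, 1]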

  5. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  6. Stepwise construction of a metabolic network in Event-B: The heat shock response.

    PubMed

    Sanwal, Usman; Petre, Luigia; Petre, Ion

    2017-12-01

    There is a high interest in constructing large, detailed computational models for biological processes. This is often done by putting together existing submodels and adding to them extra details/knowledge. The result of such approaches is usually a model that can only answer questions on a very specific level of detail, and thus, ultimately, is of limited use. We focus instead on an approach to systematically add details to a model, with formal verification of its consistency at each step. In this way, one obtains a set of reusable models, at different levels of abstraction, to be used for different purposes depending on the question to address. We demonstrate this approach using Event-B, a computational framework introduced to develop formal specifications of distributed software systems. We first describe how to model generic metabolic networks in Event-B. Then, we apply this method for modeling the biological heat shock response in eukaryotic cells, using Event-B refinement techniques. The advantage of using Event-B consists in having refinement as an intrinsic feature; this provides as a final result not only a correct model, but a chain of models automatically linked by refinement, each of which is provably correct and reusable. This is a proof-of-concept that refinement in Event-B is suitable for biomodeling, serving for mastering biological complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

    Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time-consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms - one commercial mobile platform and one built in-house. We use our language RSML(-e) to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  8. Steady-state global optimization of metabolic non-linear dynamic models through recasting into power-law canonical models

    PubMed Central

    2011-01-01

    Background: Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results: Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity. Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcoming some of the numerical difficulties that arise during the global optimization task. PMID:21867520
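
    The recasting step can be made concrete with a standard textbook example (a Michaelis-Menten rate; the notation follows common power-law-formalism usage, not necessarily the paper's). Introducing the auxiliary variable Z = K + X turns the rational rate law into a product of power laws, at the cost of one extra differential equation:

      v = \frac{V_{\max} X}{K + X}
      \quad\longrightarrow\quad
      v = V_{\max}\, X\, Z^{-1}, \qquad \dot{Z} = \dot{X},

    after which every flux has the GMA form \dot{X}_i = \sum_{r} \gamma_{ir} \prod_{j} X_j^{f_{irj}}.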

  9. The Role of Non-Formal Education in Combating the HIV Epidemic in the Philippines and Taiwan

    ERIC Educational Resources Information Center

    Morisky, Donald E.; Lyu, Shu-Yu; Urada, Lianne A.

    2009-01-01

    The Philippines is experiencing a low but slowly growing prevalence of HIV, with a UN estimate of 6,000-11,000 cases out of a population of 91 million, and a 150% increase in new cases in 2008 compared to previous years. Earlier education programmes employed non-formal educational training techniques in the southern Philippines to target high-risk…

  10. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  11. Negotiating the Boundaries between the Formal and the Informal: An Experienced Teacher's Reflective Adaptations of Informal Learning in a Keyboard Class for At-Risk Students

    ERIC Educational Resources Information Center

    Costes-Onishi, Pamela

    2016-01-01

    The objective of this study is to address the important questions raised in literature on the intersections between formal and informal learning. Specifically, this will be discussed within the concept of "productive dissonance" and the pedagogical tensions that arise in the effort of experienced teachers to transition from the formal to…

  12. Formal Assurance for Cognitive Architecture Based Autonomous Agent

    NASA Technical Reports Server (NTRS)

    Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco

    2017-01-01

    Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our effort in the use of cognitive architectures to design autonomous agents to collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar, by translating the agent to the formal verification environment Uppaal.

  13. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  14. Fast tracking the design of theory-based KT interventions through a consensus process.

    PubMed

    Bussières, André E; Al Zoubi, Fadi; Quon, Jeffrey A; Ahmed, Sara; Thomas, Aliki; Stuber, Kent; Sajko, Sandy; French, Simon

    2015-02-11

    Despite available evidence for optimal management of spinal pain, poor adherence to guidelines and wide variations in healthcare services persist. One of the objectives of the Canadian Chiropractic Guideline Initiative is to develop and evaluate targeted theory- and evidence-informed interventions to improve the management of non-specific neck pain by chiropractors. In order to systematically develop a knowledge translation (KT) intervention underpinned by the Theoretical Domains Framework (TDF), we explored the factors perceived to influence the use of multimodal care to manage non-specific neck pain, and mapped behaviour change techniques to key theoretical domains. Individual telephone interviews exploring beliefs about managing neck pain were conducted with a purposive sample of 13 chiropractors. The interview guide was based upon the TDF. Interviews were digitally recorded, transcribed verbatim and analysed by two independent assessors using thematic content analysis. A 15-member expert panel formally met to design a KT intervention. Nine TDF domains were identified as likely relevant. Key beliefs (and relevant domains of the TDF) included the following: influence of formal training, colleagues and patients on clinicians (Social Influences); availability of educational material (Environmental Context and Resources); and better clinical outcomes reinforcing the use of multimodal care (Reinforcement). Facilitating factors considered important included better communication (Skills); audits of patients' treatment-related outcomes (Behavioural Regulation); awareness and agreement with guidelines (Knowledge); and tailoring of multimodal care (Memory, Attention and Decision Processes). Clinicians conveyed conflicting beliefs about perceived threats to professional autonomy (Social/Professional Role and Identity) and speed of recovery from either applying or ignoring the practice recommendations (Beliefs about Consequences). The expert panel mapped behaviour change techniques to key theoretical domains and identified relevant KT strategies and modes of delivery to increase the use of multimodal care among chiropractors. A multifaceted KT educational intervention targeting chiropractors' management of neck pain was developed. The KT intervention consisted of an online education webinar series, clinical vignettes and a video underpinned by the Brief Action Planning model. The intervention was designed to reflect key theoretical domains, behaviour change techniques and intervention components. The effectiveness of the proposed intervention remains to be tested.

  15. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  16. The VATES-Diamond as a Verifier's Best Friend

    NASA Astrophysics Data System (ADS)

    Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz

    Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.

  17. From dissipative dynamics to studies of heat transfer at the nanoscale: analysis of the spin-boson model.

    PubMed

    Boudjada, Nazim; Segal, Dvira

    2014-11-26

    We study in a unified manner the dissipative dynamics and the transfer of heat in the two-bath spin-boson model. We use the Bloch-Redfield (BR) formalism, valid in the very weak system-bath coupling limit, the noninteracting-blip approximation (NIBA), applicable in the nonadiabatic limit, and iterative, numerically exact path integral tools. These methodologies were originally developed for the description of the dissipative dynamics of a quantum system, and here they are applied to explore the problem of quantum energy transport in a nonequilibrium setting. Specifically, we study the weak-to-intermediate system-bath coupling regime at high temperatures kBT/ħ > ε, with ε as the characteristic frequency of the two-state system. The BR formalism and NIBA can lead to close results for the dynamics of the reduced density matrix (RDM) in a certain range of parameters. However, relatively small deviations in the RDM dynamics propagate into significant qualitative discrepancies in the transport behavior. Similarly, beyond the strict nonadiabatic limit NIBA's prediction for the heat current is qualitatively incorrect: it fails to capture the turnover behavior of the current with tunneling energy and temperature. Thus, techniques that proved meaningful for describing the RDM dynamics, to some extent even beyond their rigorous range of validity, should be used with great caution in heat transfer calculations, because serious qualitative failures develop once parameters are mildly stretched beyond the techniques' working assumptions.
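
    For orientation, a common form of the two-bath spin-boson Hamiltonian studied in such nonequilibrium settings is the following, with Δ the tunneling energy and L, R the two reservoirs; the notation is generic and may differ from the paper's conventions:

      H = \frac{\varepsilon}{2}\,\sigma_z + \frac{\Delta}{2}\,\sigma_x
          + \sum_{\nu \in \{L,R\}} \sum_{j} \omega_{\nu j}\, b_{\nu j}^{\dagger} b_{\nu j}
          + \sigma_z \sum_{\nu \in \{L,R\}} \sum_{j} \lambda_{\nu j} \left( b_{\nu j}^{\dagger} + b_{\nu j} \right).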

  18. AMPHION: Specification-based programming for scientific subroutine libraries

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Waldinger, Richard; Stickel, Mark

    1994-01-01

    AMPHION is a knowledge-based software engineering (KBSE) system that guides a user in developing a diagram representing a formal problem specification. It then automatically implements a solution to this specification as a program consisting of calls to subroutines from a library. The diagram provides an intuitive, domain-oriented notation for creating a specification that also facilitates reuse and modification. AMPHION's architecture is domain independent. AMPHION is specialized to an application domain by developing a declarative domain theory. Creating a domain theory is an iterative process that currently requires the joint expertise of domain experts and experts in automated formal methods for software development.

  19. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  20. Exploring practical knowledge: a case study of an experienced senior tennis performer.

    PubMed

    Langley, D J; Knight, S M

    1996-12-01

    The purpose of the study was to explore sport-related practical knowledge through the perceptions and experiences of a senior adult competitive tennis performer. Practical knowledge was defined as goal oriented, experiential knowledge developed within particular physical activity settings. Data were collected through formal interviews and participant observation and analyzed through narrative inquiry and conventional coding techniques. The data suggest that the tennis environment was perceived in terms of the opportunities afforded by that environment. Specifically, the participant's practical knowledge centered on performance capabilities and strategic planning that revealed opponent limitations. This knowledge appeared to be developed and expressed within the relationships among individual capabilities, the task, and the situated context of game play.

  1. Machine-Checkable Timed CSP

    NASA Technical Reports Server (NTRS)

    Goethel, Thomas; Glesner, Sabine

    2009-01-01

    The correctness of safety-critical embedded software is crucial, and non-functional properties like deadlock-freedom and real-time constraints are particularly important. The real-time calculus Timed Communicating Sequential Processes (Timed CSP) is capable of expressing such properties and can therefore be used to verify embedded software. In this paper, we present our formalization of Timed CSP in the Isabelle/HOL theorem prover, which we have formulated as an operational coalgebraic semantics together with bisimulation equivalences and coalgebraic invariants. Furthermore, we apply these techniques in an abstract specification with real-time constraints, which is the basis for current work in which we verify the components of a simple real-time operating system deployed on a satellite.

  2. Localization in abelian Chern-Simons theory

    NASA Astrophysics Data System (ADS)

    McLellan, B. D. K.

    2013-02-01

    Chern-Simons theory on a closed contact three-manifold is studied when the Lie group for gauge transformations is compact, connected, and abelian. The abelian Chern-Simons partition function is derived using the Faddeev-Popov gauge fixing method. The partition function is then formally computed using the technique of non-abelian localization. This study leads to a natural identification of the abelian Reidemeister-Ray-Singer torsion as a specific multiple of the natural unit symplectic volume form on the moduli space of flat abelian connections for the class of Sasakian three-manifolds. The torsion part of the abelian Chern-Simons partition function is computed explicitly in terms of Seifert data for a given Sasakian three-manifold.
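
    For reference, the abelian Chern-Simons action on a closed three-manifold M, whose partition function is the object computed above, has the standard form (normalization conventions vary across the literature):

      S_{\mathrm{CS}}[A] = \frac{k}{4\pi} \int_{M} A \wedge dA ,
      \qquad
      Z(k) = \int \mathcal{D}A \; e^{\, i S_{\mathrm{CS}}[A]} .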

  3. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    DTIC Science & Technology

    2017-05-08

    Final performance report (AFRL-AFOSR-VA-TR-2017-0099), Rensselaer Polytechnic Institute, covering the period 15 Oct 2011 to 31 Dec 2016. The computational harnessing of traditional mathematical statistics (as covered, e.g., in Hogg, Craig & McKean 2005) is used to power statistical learning techniques.

  4. Student Incivility in Radiography Education.

    PubMed

    Clark, Kevin R

    2017-07-01

    To examine student incivility in radiography classrooms by exploring the prevalence of uncivil behaviors along with the classroom management strategies educators use to manage and prevent classroom disruptions. A survey was designed to collect data on the severity and frequency of uncivil student behaviors, classroom management strategies used to address minor and major behavioral issues, and techniques to prevent student incivility. The participants were educators in radiography programs accredited by the Joint Review Committee on Education in Radiologic Technology. Findings indicated that severe uncivil student behaviors in radiography classrooms do not occur as often as behaviors classified as less severe. Radiography educators in this study used a variety of strategies and techniques to manage and prevent student incivility; however, radiography educators who received formal training in classroom management reported fewer incidents of student incivility than those who had not received formal training. The participants in this study took a proactive approach to addressing severe behavioral issues in the classroom. Many radiography educators transition from the clinical environment to the classroom setting with little to no formal training in classroom management. Radiography educators are encouraged to attend formal training sessions to learn how to manage the higher education classroom effectively. Student incivility is present in radiography classrooms. This study provides a foundation for future research on incivility. ©2017 American Society of Radiologic Technologists.

  5. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.

  6. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  7. Gluon Bremsstrahlung in Weakly-Coupled Plasmas

    NASA Astrophysics Data System (ADS)

    Arnold, Peter

    2009-11-01

    I report on some theoretical progress concerning the calculation of gluon bremsstrahlung for very high energy particles crossing a weakly-coupled quark-gluon plasma. (i) I advertise that two of the several formalisms used to study this problem, the BDMPS-Zakharov formalism and the AMY formalism (the latter used only for infinite, uniform media), can be made equivalent when appropriately formulated. (ii) A standard technique to simplify calculations is to expand in inverse powers of logarithms ln(E/T). I give an example where such expansions are found to work well for ω/T≳10 where ω is the bremsstrahlung gluon energy. (iii) Finally, I report on perturbative calculations of q̂.

  8. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's idea to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted human from consideration in system hazard analysis or have treated them rather superficially, for example, that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Automated Bilateral Negotiation and Bargaining Impasse

    NASA Astrophysics Data System (ADS)

    Lopes, Fernando; Novais, A. Q.; Coelho, Helder

    The design and implementation of autonomous negotiating agents involve the consideration of insights from multiple relevant research areas to integrate different perspectives on negotiation. As a starting point for an interdisciplinary research effort, this paper employs game-theoretic techniques to define equilibrium strategies for the bargaining game of alternating offers and formalizes a set of negotiation strategies studied in the social sciences. This paper also shifts the emphasis to negotiations that are "difficult" to resolve and can hit an impasse. Specifically, it analyses a situation where two agents bargain over the division of the surplus of several distinct issues to demonstrate how a procedure to avoid impasses can be utilized in a specific negotiation setting. The procedure is based on the addition of new issues to the agenda during the course of negotiation and the exploration of the differences in the valuation of these issues to capitalize on Pareto optimal agreements.
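
    The equilibrium of the alternating-offers game referred to above is standard (Rubinstein 1982): with per-round discount factors \delta_1, \delta_2 and player 1 proposing first, the unique subgame-perfect division of a unit surplus is

      x_1^{*} = \frac{1 - \delta_2}{1 - \delta_1 \delta_2} ,
      \qquad
      x_2^{*} = 1 - x_1^{*} = \frac{\delta_2 (1 - \delta_1)}{1 - \delta_1 \delta_2} ,

    with agreement reached immediately; impasse-avoidance procedures such as agenda expansion matter precisely when real negotiations deviate from this idealization.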

  10. An Ontology Based Approach to Information Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The semantic structuring of knowledge based on ontology approaches has been increasingly adopted by experts from diverse domains. Recently, ontologies have moved from the philosophical and metaphysical disciplines into the construction of models that describe a specific theory of a domain. The development and use of ontologies promote the creation of a unique standard to represent concepts within a specific knowledge domain. In the scope of information security systems, the use of an ontology to formalize and represent the concepts of security information challenges the mechanisms and techniques currently used. This paper presents a conceptual implementation model of an ontology defined in the security domain. The model presented contains the semantic concepts based on the information security standard ISO/IEC_JTC1, and their relationships to other concepts, defined in a subset of the information security domain.

  11. Advanced training systems

    NASA Technical Reports Server (NTRS)

    Savely, Robert T.; Loftin, R. Bowen

    1990-01-01

    Training is a major endeavor in all modern societies. Common training methods include training manuals, formal classes, procedural computer programs, simulations, and on-the-job training. NASA's training approach has focused primarily on on-the-job training in a simulation environment for both crew and ground-based personnel. NASA must explore new approaches to training for the 1990's and beyond. Specific autonomous training systems are described which are based on artificial intelligence technology for use by NASA astronauts, flight controllers, and ground-based support personnel, and which offer an alternative to current training systems. In addition to these specific systems, the evolution of a general architecture for autonomous intelligent training systems that integrates many of the features of traditional training programs with artificial intelligence techniques is presented. These Intelligent Computer Aided Training (ICAT) systems would provide much of the same experience that could be gained from the best on-the-job training.

  12. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified regarding these specifications, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
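
    The GBDSSGenerator tool chain is not reproduced here; as a minimal stand-in for the underlying idea, the sketch below explores a toy guideline statechart exhaustively and checks a safety property over the reachable states (the states, transitions, and property are invented for illustration).

        # Toy explicit-state check of a guideline model; illustrative only.
        from collections import deque

        TRANSITIONS = {  # guideline statechart: state -> successor states
            "assess":      ["prescribe_A", "prescribe_B"],
            "prescribe_A": ["monitor"],
            "prescribe_B": ["monitor", "prescribe_A"],  # questionable path
            "monitor":     ["discharge", "assess"],
            "discharge":   [],
        }

        def reachable(init):
            """Breadth-first enumeration of all reachable guideline states."""
            seen, queue = {init}, deque([init])
            while queue:
                for t in TRANSITIONS[queue.popleft()]:
                    if t not in seen:
                        seen.add(t)
                        queue.append(t)
            return seen

        # Safety property: drug B must never be followed directly by drug A.
        bad = [s for s in reachable("assess")
               if s == "prescribe_B" and "prescribe_A" in TRANSITIONS[s]]
        print("property violated in states:", bad)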

  13. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  14. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  15. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
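
    The processes themselves are not reproduced in the abstract; schematically, a cumulative-sum-of-residuals statistic over a covariate z and its Gaussian-multiplier realizations take roughly the following form (notation assumed here, with ê_i the censored-data residuals and G_i independent standard normals):

        W(z) = n^{-1/2} \sum_{i=1}^{n} I(Z_i \le z)\, \hat{e}_i,
        \qquad
        \hat{W}^{*}(z) = n^{-1/2} \sum_{i=1}^{n} I(Z_i \le z)\, \hat{e}_i\, G_i,
        \quad G_i \sim N(0,1).

    Comparing the observed W(z) with simulated realizations of the multiplier process is what allows an apparent trend in a residual plot to be judged against natural variation.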

  16. Formal specification and verification of a fault-masking and transient-recovery model for digital flight-control systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1991-01-01

    The formal specification and mechanically checked verification for a model of fault-masking and transient-recovery among the replicated computers of digital flight-control systems are presented. The verification establishes, subject to certain carefully stated assumptions, that faults among the component computers are masked so that commands sent to the actuators are the same as those that would be sent by a single computer that suffers no failures.

  17. Formal Specifications for an Electrical Power Grid System Stability and Reliability

    DTIC Science & Technology

    2015-09-01

    This thesis analyzes power grid system requirements and expresses the critical runtime behavior using first-order logic, beginning by identifying observable system properties. Formal approaches surveyed include theorem proving (e.g., the Prototype Verification System) and type systems, to name a few [5]; theorem proving's specification dimension is dependent on the expressive power of the formal logic employed.

  18. A Formal Specification and Verification Method for the Prevention of Denial of Service in Ada Services

    DTIC Science & Technology

    1988-03-01

    This paper presents a formal specification and verification method for the prevention of denial of service in Ada services. It assumes that the reader is a computer science or engineering professional working in the area of formal specification and verification. Recovery from such events as deadlocks and crashes can be accounted for in the computation of the waiting time for each service in the service hierarchy.

  19. A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams

    NASA Technical Reports Server (NTRS)

    Tejada, Arturo

    2009-01-01

    An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
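
    As a reminder of the physics the report reviews, the sketch below computes the first natural frequencies of a uniform cantilever under the Euler-Bernoulli model; the material and geometry values are illustrative and not taken from the report.

        # Natural frequencies of an Euler-Bernoulli cantilever beam, the
        # textbook model behind mode-shape-based fault detection.
        import math

        LAMBDA = [1.875104, 4.694091, 7.854757]  # roots of cos(x)cosh(x) = -1

        def cantilever_frequencies(E, I, rho, A, L, n_modes=3):
            """First n natural frequencies [Hz] of a uniform cantilever."""
            return [(lam / L) ** 2 * math.sqrt(E * I / (rho * A)) / (2 * math.pi)
                    for lam in LAMBDA[:n_modes]]

        # Aluminum strip, 0.5 m long, 30 mm x 3 mm cross-section (illustrative).
        E, rho = 70e9, 2700.0
        b, h, L = 0.03, 0.003, 0.5
        A, I = b * h, b * h ** 3 / 12
        # A crack lowers local stiffness, so a measurable drop in these
        # frequencies (tracked via strain sensors) flags a potential fault.
        print([round(f, 2) for f in cantilever_frequencies(E, I, rho, A, L)])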

  20. Formal Verification of Large Software Systems

    NASA Technical Reports Server (NTRS)

    Yin, Xiang; Knight, John

    2010-01-01

    We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.

  1. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that makes it possible to capture the semantics of a domain and to derive meaningful information by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.
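
    As a minimal illustration of the kind of inference such an ontology enables, the sketch below derives implicit class membership from explicit subclass axioms by transitive closure; a stand-in for full OWL/Description Logic reasoning, with invented class names.

        # Deriving implicit facts from explicit ontology axioms (illustrative).
        SUBCLASS = {                      # explicit axioms: child -> parent
            "BookingAgent": "ServiceAgent",
            "ServiceAgent": "Agent",
        }
        TYPES = {"agent42": "BookingAgent"}   # explicit instance assertion

        def superclasses(cls):
            """All transitively inferred superclasses of a class."""
            out = []
            while cls in SUBCLASS:
                cls = SUBCLASS[cls]
                out.append(cls)
            return out

        def instance_of(ind, cls):
            """Class membership: explicit type or any inferred superclass."""
            direct = TYPES[ind]
            return cls == direct or cls in superclasses(direct)

        print(instance_of("agent42", "Agent"))  # True, though never stated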

  2. The trephine colostomy: a permanent left iliac fossa end colostomy without recourse to laparotomy.

    PubMed Central

    Senapati, A.; Phillips, R. K.

    1991-01-01

    An operative technique for performing a permanent end sigmoid colostomy without recourse to laparotomy is presented. The results from 16 patients have shown a very low morbidity. The technique was unsuccessful in three patients, each needing a formal laparotomy. PMID:1929133

  3. Simulation Techniques in Training College Administrators.

    ERIC Educational Resources Information Center

    Fincher, Cameron

    Traditional methods of recruitment and selection in academic administration have not placed an emphasis on formal training or preparation but have relied heavily on informal notions of experiential learning. Simulation as a device for representing complex processes in a manageable form, gaming as an organizing technique for training and…

  4. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case, high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis alone. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys conducted at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.

  5. Rewriting Modulo SMT and Open System Analysis

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.
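
    The rewriting-logic machinery of the paper is not shown here; as a rough flavor of constraint-based reachability for an open system, the sketch below unrolls a toy transition rule for k steps, letting the solver choose the environment's symbolic inputs (the system is invented; requires the z3-solver package).

        # Bounded reachability for an open system: each step reads a fresh
        # symbolic environment input; the SMT solver decides reachability.
        from z3 import Int, Solver, And, sat

        K = 6
        s = Solver()
        x = [Int(f"x_{i}") for i in range(K + 1)]  # system state per step
        u = [Int(f"u_{i}") for i in range(K)]      # symbolic environment inputs

        s.add(x[0] == 0)                           # initial state
        for i in range(K):
            s.add(And(u[i] >= -1, u[i] <= 2))      # environment assumption
            s.add(x[i + 1] == x[i] + u[i])         # transition rule

        s.add(x[K] >= 10)                          # target ("bad") state
        if s.check() == sat:
            m = s.model()
            print("reachable with inputs:", [m[v] for v in u])
        else:
            print("unreachable within", K, "steps")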

  6. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  7. Towards a formal semantics for Ada 9X

    NASA Technical Reports Server (NTRS)

    Guaspari, David; Mchugh, John; Polak, Wolfgang; Saaltink, Mark

    1995-01-01

    The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.

  8. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.

  9. What Is Linguistics? ERIC Digest. [Revised].

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Languages and Linguistics, Washington, DC.

    Linguistics is the study of language, as contrasted with knowledge of a specific language. Formal linguistics is the study of the structures and processes of language, or how it works and is organized. Different approaches to formal linguistics include traditional or prescriptive, structural, and generative or transformational perspectives. Formal…

  10. Neuroimaging Week: A Novel, Engaging, and Effective Curriculum for Teaching Neuroimaging to Junior Psychiatric Residents

    ERIC Educational Resources Information Center

    Downar, Jonathan; Krizova, Adriana; Ghaffar, Omar; Zaretsky, Ari

    2010-01-01

    Objective: Neuroimaging techniques are increasingly important in psychiatric research and clinical practice, but few postgraduate psychiatry programs offer formal training in neuroimaging. To address this need, the authors developed a course to prepare psychiatric residents to use neuroimaging techniques effectively in independent practice.…

  11. INTERLABORATORY STUDY OF THE COLD VAPOR TECHNIQUE FOR TOTAL MERCURY IN WATER

    EPA Science Inventory

    The American Society for Testing and Materials (ASTM) and the U.S. Environmental Protection Agency (EPA) conducted a joint study of the cold vapor technique for total mercury in water, before formal acceptance of the method by each organization. The method employs an acid-permang...

  12. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  13. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  14. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  15. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  16. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  17. Toward fidelity between specification and implementation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing

    1994-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.
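
    The RMP state model itself is not reproduced in the abstract; the sketch below shows the general tactic of deriving test cases as event sequences along state transition paths, over an invented miniature model (not the actual RMP state machine).

        # Deriving test cases as transition paths through a state model.
        MODEL = {
            "idle":   [("join", "member")],
            "member": [("send", "member"), ("nack", "repair"), ("leave", "idle")],
            "repair": [("retransmit", "member")],
        }

        def test_paths(state, depth, path=()):
            """Enumerate event sequences (test cases) up to a given length."""
            if depth == 0 or not MODEL.get(state):
                yield path
                return
            for event, nxt in MODEL[state]:
                yield from test_paths(nxt, depth - 1, path + (event,))

        for case in test_paths("idle", 3):
            print(" -> ".join(case))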

  18. Dose calculation for photon-emitting brachytherapy sources with average energy higher than 50 keV: report of the AAPM and ESTRO.

    PubMed

    Perez-Calatayud, Jose; Ballester, Facundo; Das, Rupak K; Dewerd, Larry A; Ibbott, Geoffrey S; Meigooni, Ali S; Ouhib, Zoubir; Rivard, Mark J; Sloboda, Ron S; Williamson, Jeffrey F

    2012-05-01

    Recommendations of the American Association of Physicists in Medicine (AAPM) and the European Society for Radiotherapy and Oncology (ESTRO) on dose calculations for high-energy (average energy higher than 50 keV) photon-emitting brachytherapy sources are presented, including the physical characteristics of specific (192)Ir, (137)Cs, and (60)Co source models. This report has been prepared by the High Energy Brachytherapy Source Dosimetry (HEBD) Working Group. This report includes considerations in the application of the TG-43U1 formalism to high-energy photon-emitting sources with particular attention to phantom size effects, interpolation accuracy dependence on dose calculation grid size, and dosimetry parameter dependence on source active length. Consensus datasets for commercially available high-energy photon sources are provided, along with recommended methods for evaluating these datasets. Recommendations on dosimetry characterization methods, mainly using experimental procedures and Monte Carlo, are established and discussed. Also included are methodological recommendations on detector choice, detector energy response characterization and phantom materials, and measurement specification methodology. Uncertainty analyses are discussed and recommendations for high-energy sources without consensus datasets are given. Recommended consensus datasets for high-energy sources have been derived for sources that were commercially available as of January 2010. Data are presented according to the AAPM TG-43U1 formalism, with modified interpolation and extrapolation techniques of the AAPM TG-43U1S1 report for the 2D anisotropy function and radial dose function.
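
    For orientation, the consensus datasets plug into the standard TG-43 two-dimensional dose-rate equation,

        \dot{D}(r,\theta) = S_K \, \Lambda \,
        \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)} \, g_L(r) \, F(r,\theta),

    where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L(r) the radial dose function, F(r,θ) the 2D anisotropy function, and (r_0, θ_0) = (1 cm, 90°) the reference point.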

  19. Combining the GW formalism with the polarizable continuum model: A state-specific non-equilibrium approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duchemin, Ivan, E-mail: ivan.duchemin@cea.fr; Jacquemin, Denis; Institut Universitaire de France, 1 rue Descartes, 75005 Paris Cedex 5

    We have implemented the polarizable continuum model within the framework of the many-body Green's function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster single and double levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.

  20. Directly executable formal models of middleware for MANET and Cloud Networking and Computing

    NASA Astrophysics Data System (ADS)

    Pashchenko, D. V.; Sadeq Jaafar, Mustafa; Zinkin, S. A.; Trokoz, D. A.; Pashchenko, T. U.; Sinev, M. P.

    2016-04-01

    The article considers some “directly executable” formal models that are suitable for the specification of computing and networking in the cloud environment and other networks which are similar to wireless networks MANET. These models can be easily programmed and implemented on computer networks.

  1. Formal and Integrated Strategies for Competence Development in SMEs

    ERIC Educational Resources Information Center

    Kock, Henrik; Ellstrom, Per-Erik

    2011-01-01

    Purpose: The purpose of this paper is to increase understanding of the relationships among the workplace as a learning environment, strategies for competence development used by SMEs and learning outcomes. Specifically, there is a focus on a distinction between formal and integrated strategies for competence development, the conditions under which…

  2. Navigating the Seas of Policy.

    ERIC Educational Resources Information Center

    Cunningham, Stephanie; Kennedy, Steve; McAlonan, Susan; Hotchkiss, Heather

    As the sun, moon, and stars helped sea captains to navigate, policy (defined as a formalized idea to encourage change) indicates general direction and speed but does not establish a specific approach to achieve implementation. Formal and informal policies have advantages and disadvantages. These are steps in navigating policy formation: identify…

  3. Investigating Actuation Force Fight with Asynchronous and Synchronous Redundancy Management Techniques

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno

    2013-01-01

    Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
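
    The SAL models and proofs are not reproduced in the report's abstract; as a one-step flavor of the inductive argument (k = 1), the sketch below asks Z3 for a counterexample to the claim that a disagreement bound between two redundant command lanes is preserved by a simple update rule (the model is invented for illustration; requires the z3-solver package).

        # Inductive check that bounded disagreement between two redundant
        # actuation commands is preserved by the update rule (toy model).
        from z3 import Reals, Solver, And, Implies, Not, unsat

        a, b, a2, b2, u, e1, e2 = Reals("a b a2 b2 u e1 e2")
        BOUND, EPS, GAIN = 1.0, 0.1, 0.5

        def inv(x, y):
            return And(x - y <= BOUND, y - x <= BOUND)

        step = And(a2 == GAIN * a + u + e1,   # lane 1 update (shared input u)
                   b2 == GAIN * b + u + e2,   # lane 2 update
                   e1 <= EPS, e1 >= -EPS,     # bounded per-lane sensor error
                   e2 <= EPS, e2 >= -EPS)

        s = Solver()
        s.add(Not(Implies(And(inv(a, b), step), inv(a2, b2))))
        print("invariant preserved:", s.check() == unsat)  # expect True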

  4. Teaching Astronomy in non-formal education: stars workshop

    NASA Astrophysics Data System (ADS)

    Hernán-Obispo, M.; Crespo-Chacón, I.; Gálvez, M. C.; López-Santiago, J.

    One of the fields in which the teaching of astronomy is most in demand is non-formal education. The Stars Workshop we present in this contribution consisted of an introduction to Astronomy and observation methods. The main objectives were: to know the main components of the Universe, their characteristics and the scales of size and time existing between them; to understand the movement of the different celestial objects; to know the different observational techniques; to value the different historical explanations about the Earth and the position of Humanity in the Universe. This Stars Workshop was a collaboration with the Escuela de Tiempo Libre Jumavi, which is a school dedicated to training and non-formal education in the leisure field.

  5. Espacial.com : a cooperative learning model in internet

    NASA Astrophysics Data System (ADS)

    Perez-Poch, A.; Solans, R.

    Espacial.com is the leading and oldest internet site in the Spanish language reporting 24 hours a day on space exploration. Moreover, it is the only specialized site that has broadcast live the main space events of the past years with expert commentary in Spanish. From its first day, education has been the main purpose of the site, always with an international and multidisciplinary approach. Fernando Caldeiro, Class 16 NASA Astronaut, is the leading person in the project, with his non-stop presence in the forums making valuable comments and answering questions from its young audience. We analyse the ongoing dynamics in the forum, and how a virtual community of space enthusiasts is created. We show that, because of the presence of some key factors (leadership, commitment to excel, motivation, communicative skills, ...), it is possible to establish a high degree of commitment to learning, although in a non-formal way. Cooperative learning is a well-known pedagogical technique which has proven its efficacy in different formal and non-formal areas. Using internet capabilities, this technique proves to be an excellent approach to educational outreach on space-related subjects.

  6. V and V of Lexical, Syntactic and Semantic Properties for Interactive Systems Through Model Checking of Formal Description of Dialog

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Martinie, Celia; Palanque, Philippe

    2013-01-01

    During early phases of the development of an interactive system, future system properties are identified (through interaction with end users in the brainstorming and prototyping phase of the application, or by other stakeholders), imposing requirements on the final system. They can be specific to the application under development or generic to all applications, such as usability principles. Instances of specific properties include visibility of the aircraft altitude, speed… in the cockpit and the continuous possibility of disengaging the autopilot in whatever state the aircraft is. Instances of generic properties include availability of undo (for undoable functions) and availability of a progression bar for functions lasting more than four seconds. While behavioral models of interactive systems using formal description techniques provide complete and unambiguous descriptions of states and state changes, they do not provide explicit representation of the absence or presence of properties. Assessing that the system that has been built is the right system remains a challenge usually met through extensive use and acceptance tests. With an explicit representation of properties and tools to support checking them, it becomes possible to provide developers with means for systematic exploration of the behavioral models and assessment of the presence or absence of these properties. This paper proposes the synergistic use of two tools for checking both generic and specific properties of interactive applications: Petshop and Java PathFinder. Petshop is dedicated to the description of interactive system behavior. Java PathFinder is dedicated to the runtime verification of Java applications and, through an extension, to user interfaces. The approach is exemplified on a safety-critical application in the area of interactive cockpits for large civil aircraft.

  7. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  8. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  9. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  10. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained in an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML-based system-level functional requirements techniques.

  11. An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach

    DTIC Science & Technology

    2012-08-01

    We provide a detailed discussion of uncertain data types, their origins, and three popular uncertainty-processing formalisms... suitable membership functions corresponding to the fuzzy sets... The Dempster-Shafer (DS) belief theory, originally proposed by Dempster, can be thought of as... how the data originated and the various imperfections of the source. Uncertainty-handling formalisms provide techniques for modeling and working with these uncertain data types.

  12. Specification and verification of gate-level VHDL models of synchronous and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    Russinoff, David M.

    1995-01-01

    We present a mathematical definition of a hardware description language (HDL) that admits a semantics-preserving translation to a subset of VHDL. Our HDL includes the basic VHDL propagation delay mechanisms and gate-level circuit descriptions. We also develop formal procedures for deriving and verifying concise behavioral specifications of combinational and sequential devices. The HDL and the specification procedures have been formally encoded in the computational logic of Boyer and Moore, which provides a LISP implementation as well as a facility for mechanical proof-checking. As an application, we design, specify, and verify a circuit that achieves asynchronous communication by means of the biphase mark protocol.
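
    The verified circuit's protocol can be rendered in a few lines of software: in biphase mark coding, every bit cell begins with a signal transition, and a mid-cell transition distinguishes a one from a zero. A sketch of encoder and decoder (illustrative, not derived from the paper's HDL):

        # Biphase mark coding over half-cells: transition at each cell start;
        # an extra mid-cell transition encodes '1', its absence encodes '0'.
        def bmc_encode(bits, level=0):
            out = []
            for bit in bits:
                level ^= 1                 # cell-start transition
                out.append(level)
                if bit == "1":
                    level ^= 1             # mid-cell transition for a one
                out.append(level)
            return out

        def bmc_decode(halves):
            return "".join("1" if halves[i] != halves[i + 1] else "0"
                           for i in range(0, len(halves), 2))

        msg = "1011001"
        assert bmc_decode(bmc_encode(msg)) == msg
        print(bmc_encode(msg))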

  13. Tactical Synthesis Of Efficient Global Search Algorithms

    NASA Technical Reports Server (NTRS)

    Nedunuri, Srinivas; Smith, Douglas R.; Cook, William R.

    2009-01-01

    Algorithm synthesis transforms a formal specification into an efficient algorithm to solve a problem. Algorithm synthesis in Specware combines the formal specification of a problem with a high-level algorithm strategy. To derive an efficient algorithm, a developer must define operators that refine the algorithm by combining the generic operators in the algorithm with the details of the problem specification. This derivation requires skill and a deep understanding of the problem and the algorithmic strategy. In this paper we introduce two tactics to ease this process. The tactics serve a similar purpose to tactics used for determining indefinite integrals in calculus; that is, they suggest possible ways to attack the problem.

  14. Report on the formal specification and partial verification of the VIPER microprocessor

    NASA Technical Reports Server (NTRS)

    Brock, Bishop; Hunt, Warren A., Jr.

    1991-01-01

    The VIPER microprocessor chip is partitioned into four levels of abstraction. At the highest level, VIPER is described with decreasingly abstract sets of functions in LCF-LSM. At the lowest level are the gate-level models in proprietary CAD languages. The block-level and gate-level specifications are also given in the ELLA simulation language. Among VIPER's deficiencies are the facts that there is no notion of external events in the top-level specification and that it is impossible to use the top-level specifications to prove abstract properties of programs running on VIPER computers. There is no complete proof that the gate-level specifications implement the top-level specifications. Cohn's proof that the major-state machine correctly implements the top-level specifications has no formal connection with any of the other proof attempts. None of the latter address resetting the machine, memory timeout, forced error, or single-step modes.

  15. Distinguishing cause from correlation in tokamak experiments to trigger edge-localised plasma instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB

    2014-11-15

    The generic question is considered: How can we determine the probability of an otherwise quasi-random event having been triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF fusion devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.

  16. A study and evaluation of image analysis techniques applied to remotely sensed data

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.; Dasarathy, B. V.; Lybanon, M.; Ramapriyan, H. K.

    1976-01-01

    An analysis of phenomena causing nonlinearities in the transformation from Landsat multispectral scanner coordinates to ground coordinates is presented. Experimental results comparing rms errors at ground control points indicated a slight improvement when a nonlinear (8-parameter) transformation was used instead of an affine (6-parameter) transformation. Using a preliminary ground truth map of a test site in Alabama covering the Mobile Bay area and six Landsat images of the same scene, several classification methods were assessed. A methodology was developed for automatic change detection using classification/cluster maps. A coding scheme was employed for generation of change depiction maps indicating specific types of changes. Inter- and intraseasonal data of the Mobile Bay test area were compared to illustrate the method. A beginning was made in the study of data compression by applying a Karhunen-Loeve transform technique to a small section of the test data set. The second part of the report provides a formal documentation of the several programs developed for the analysis and assessments presented.

  17. Modelling operations and security of cloud systems using Z-notation and Chinese Wall security policy

    NASA Astrophysics Data System (ADS)

    Basu, Srijita; Sengupta, Anirban; Mazumdar, Chandan

    2016-11-01

    Enterprises are increasingly using cloud computing for hosting their applications. Availability of fast Internet and cheap bandwidth are causing greater number of people to use cloud-based services. This has the advantage of lower cost and minimum maintenance. However, ensuring security of user data and proper management of cloud infrastructure remain major areas of concern. Existing techniques are either too complex, or fail to properly represent the actual cloud scenario. This article presents a formal cloud model using the constructs of Z-notation. Principles of the Chinese Wall security policy have been applied to design secure cloud-specific operations. The proposed methodology will enable users to safely host their services, as well as process sensitive data, on cloud.
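
    The Z-notation schemas are not reproduced in the abstract; the simple-security rule of the Chinese Wall policy that guards such operations can be sketched as follows (company datasets and conflict classes are invented for illustration).

        # Chinese Wall simple security rule: a user may access a dataset
        # unless they already accessed a competitor in the same conflict class.
        COI_CLASS = {   # company dataset -> conflict-of-interest class
            "BankA": "banking", "BankB": "banking", "OilCo": "energy",
        }
        history = {}    # user -> set of datasets accessed so far

        def may_access(user, dataset):
            return all(COI_CLASS[d] != COI_CLASS[dataset] or d == dataset
                       for d in history.get(user, set()))

        def access(user, dataset):
            if may_access(user, dataset):
                history.setdefault(user, set()).add(dataset)
                return True
            return False

        print(access("alice", "BankA"))  # True
        print(access("alice", "OilCo"))  # True  (different conflict class)
        print(access("alice", "BankB"))  # False (competes with BankA)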

  18. Optical soliton solutions of the cubic-quintic non-linear Schrödinger's equation including an anti-cubic term

    NASA Astrophysics Data System (ADS)

    Kaplan, Melike; Hosseini, Kamyar; Samadani, Farzan; Raza, Nauman

    2018-07-01

    A wide range of problems in different fields of the applied sciences, especially non-linear optics, is described by non-linear Schrödinger's equations (NLSEs). In the present paper, a specific type of NLSE, known as the cubic-quintic non-linear Schrödinger's equation including an anti-cubic term, has been studied. The generalized Kudryashov method, along with a symbolic computation package, has been employed to carry out this objective. As a consequence, a series of optical soliton solutions has formally been retrieved. It is corroborated that the generalized form of the Kudryashov method is a direct, effectual, and reliable technique for dealing with various types of non-linear Schrödinger's equations.
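
    In the form commonly studied in this strand of the literature, the governing equation reads roughly as follows, with the b_1 term supplying the anti-cubic nonlinearity (notation assumed here; the paper's coefficients may differ):

        i\,q_t + a\,q_{xx} + \left( \frac{b_1}{|q|^{4}} + b_2\,|q|^{2} + b_3\,|q|^{4} \right) q = 0.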

  19. Clinical immunology review series: an approach to desensitization

    PubMed Central

    Krishna, M T; Huissoon, A P

    2011-01-01

    Allergen immunotherapy describes the treatment of allergic disease through administration of gradually increasing doses of allergen. This form of immune tolerance induction is now safer, more reliably efficacious and better understood than when it was first formally described in 1911. In this paper the authors aim to summarize the current state of the art in immunotherapy in the treatment of inhalant, venom and drug allergies, with specific reference to its practice in the United Kingdom. A practical approach has been taken, with reference to current evidence and guidelines, including illustrative protocols and vaccine schedules. A number of novel approaches and techniques are likely to change considerably the way in which we select and treat allergy patients in the coming decade, and these advances are previewed. PMID:21175592

  20. Modeling techniques for quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Jirauschek, Christian; Kubis, Tillmann

    2014-03-01

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.
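
    As a minimal illustration of the first modeling step the review describes, the sketch below solves the one-dimensional Schrödinger equation for a single quantum well by finite differences (dimensionless units with ħ = m = 1; parameters are illustrative).

        # Finite-difference bound states of a 1D square quantum well.
        import numpy as np
        from scipy.linalg import eigh_tridiagonal

        N, L = 2000, 40.0                  # grid points, domain width
        x = np.linspace(-L / 2, L / 2, N)
        dx = x[1] - x[0]
        V = np.where(np.abs(x) < 2.0, -1.0, 0.0)   # well: depth 1, half-width 2

        # H = -(1/2) d^2/dx^2 + V(x) as a symmetric tridiagonal matrix.
        diag = 1.0 / dx**2 + V
        offdiag = np.full(N - 1, -0.5 / dx**2)
        energies, _ = eigh_tridiagonal(diag, offdiag, select="i",
                                       select_range=(0, 2))
        print("lowest eigenenergies:", np.round(energies, 4))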

  1. Modeling techniques for quantum cascade lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jirauschek, Christian; Kubis, Tillmann

    2014-03-15

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.

  2. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    NASA Astrophysics Data System (ADS)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
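
    The action-language encoding is not reproduced in the abstract; the flavor of the analysis (finding task-execution plans that comply with the authorization policy) can be sketched as a small search over user-task assignments with separation-of-duty constraints (workflow, users, and policy invented for illustration).

        # Enumerate user-task assignments for a toy banking workflow that
        # satisfy the authorization policy and separation-of-duty constraints.
        from itertools import product

        TASKS = ["request_loan", "approve_loan", "disburse"]
        AUTHORIZED = {                    # task -> users permitted by policy
            "request_loan": {"ann", "bob"},
            "approve_loan": {"bob", "carol"},
            "disburse":     {"ann", "carol"},
        }
        SOD = [("request_loan", "approve_loan"),  # no self-approval
               ("approve_loan", "disburse")]

        def plans():
            """Yield complete assignments complying with the policy."""
            for combo in product(*(AUTHORIZED[t] for t in TASKS)):
                assign = dict(zip(TASKS, combo))
                if all(assign[t1] != assign[t2] for t1, t2 in SOD):
                    yield assign

        for p in plans():
            print(p)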

  3. Formal specification and mechanical verification of SIFT - A fault-tolerant flight control system

    NASA Technical Reports Server (NTRS)

    Melliar-Smith, P. M.; Schwartz, R. L.

    1982-01-01

    The paper describes the methodology being employed to demonstrate rigorously that the SIFT (software-implemented fault-tolerant) computer meets its requirements. The methodology uses a hierarchy of design specifications, expressed in the mathematical domain of multisorted first-order predicate calculus. The most abstract of these, from which almost all details of mechanization have been removed, represents the requirements on the system for reliability and intended functionality. Successive specifications in the hierarchy add design and implementation detail until the PASCAL programs implementing the SIFT executive are reached. A formal proof that a SIFT system in a 'safe' state operates correctly despite the presence of arbitrary faults has been completed all the way from the most abstract specifications to the PASCAL program.

  4. Educational principles and techniques for interpreters.

    Treesearch

    F. David Boulanger; John P. Smith

    1973-01-01

    Interpretation is in large part education, since it attempts to convey information, concepts, and principles while creating attitude changes and such emotional states as wonder, delight, and appreciation. Although interpreters might profit greatly by formal training in the principles and techniques of teaching, many have not had such training. Some means of making the...

  5. Formalizing New Navigation Requirements for NASA's Space Shuttle

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  6. Putting it all together: Exhumation histories from a formal combination of heat flow and a suite of thermochronometers

    USGS Publications Warehouse

    d'Alessio, M. A.; Williams, C.F.

    2007-01-01

    A suite of new techniques in thermochronometry allow analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.
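
    To make the search strategy concrete, the following sketch shows a minimal evolutionary search of the kind a genetic algorithm performs, fitting piecewise-constant exhumation rates to target ages. The forward model, data, and parameters are invented placeholders for illustration, not the authors' implementation.

      # Hypothetical sketch: an evolutionary search over piecewise-constant
      # exhumation rates, scored by misfit to synthetic thermochronometer
      # "ages". Forward model and data are placeholders only.
      import numpy as np

      rng = np.random.default_rng(0)
      N_SEG, POP, GENS = 4, 60, 200        # segments, population, generations
      observed = np.array([2.0, 1.5, 1.0, 0.5])   # fake target ages

      def forward(rates):
          # Placeholder forward model: cumulative exhumation per segment.
          return np.cumsum(rates)[::-1]

      def misfit(rates):
          return np.sum((forward(rates) - observed) ** 2)

      pop = rng.uniform(0.0, 1.0, size=(POP, N_SEG))
      for _ in range(GENS):
          scores = np.array([misfit(ind) for ind in pop])
          parents = pop[np.argsort(scores)[:POP // 2]]        # selection
          children = parents[rng.integers(0, POP // 2, POP)]  # resampling
          children += rng.normal(0, 0.02, children.shape)     # mutation
          pop = np.clip(children, 0.0, None)

      best = pop[np.argmin([misfit(ind) for ind in pop])]
      print("best-fit segment exhumation rates:", best)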

  7. Toward Synthesis, Analysis, and Certification of Security Protocols

    NASA Technical Reports Server (NTRS)

    Schumann, Johann

    2004-01-01

    Implemented security protocols are basically pieces of software which are used to (a) authenticate the other communication partners, (b) establish a secure communication channel between them (using insecure communication media), and (c) transfer data between the communication partners in such a way that these data are available only to the desired receiver, but not to anyone else. Such an implementation usually consists of the following components: the protocol engine, which controls the sequence in which the messages of the protocol are sent over the network and which controls the assembly/disassembly and processing (e.g., decryption) of the data; the cryptographic routines to actually encrypt or decrypt the data (using given keys); and the interface to the operating system and to the application. For the correct working of such a security protocol, all of these components must work flawlessly. Many formal-methods-based techniques for the analysis of security protocols have been developed. They range from specific logics (e.g., BAN logic [4] or higher-order logics [12]) to model checking [2] approaches. In each approach, the analysis tries to prove that nobody (or at least no modeled intruder) can get access to secret data. Otherwise, a scenario illustrating the attack may be produced. Despite the seeming simplicity of security protocols ("only" a few messages are sent between the protocol partners in order to ensure a secure communication), many flaws have been detected. Unfortunately, even a perfect protocol engine does not guarantee flawless working of a security protocol, as incidents show. Many break-ins and security vulnerabilities are caused by exploiting errors in the implementation of the protocol engine or the underlying operating system. Attacks using buffer overflows are a very common class of such attacks. Errors in the implementation of exception or error handling can open up additional vulnerabilities. For example, on a website with a log-in screen, multiple tries with invalid passwords caused the expected error message (too many retries) but nevertheless let the user pass. Finally, security can be compromised by silly implementation bugs or design decisions. In a commercial VPN software, all calls to the encryption routines were accidentally replaced by stubs, probably during factory testing. The product worked nicely, and the error (an open VPN) would have gone undetected if a team member had not inspected the low-level traffic out of curiosity. Also, the use of secret proprietary encryption routines can backfire, because such algorithms often exhibit weaknesses which can be exploited easily (see, e.g., DVD encoding). Summarizing, there is a large number of possibilities for errors which can compromise the security of a protocol. In today's world, with short time-to-market and the use of security protocols in open and hostile networks for safety-critical applications (e.g., power or air-traffic control), such slips could lead to catastrophic situations. Thus, formal methods and automatic reasoning techniques should not be used just for the formal proof of the absence of an attack; they ought to be used to provide an end-to-end tool-supported framework for security software. With such an approach, all required artifacts (code, documentation, test cases), formal analyses, and reliable certification will be generated automatically, given a single, high-level specification. By a combination of program synthesis, formal protocol analysis, certification, and proof-carrying code, this goal is within practical reach, since all the important technologies for such an approach actually exist and only need to be assembled in the right way.

  8. 78 FR 20706 - Self-Regulatory Organizations; BOX Options Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ... formal disciplinary action against a firm. The Exchange proposes to add to Rule 12140(d)(9) specific... Formal Disciplinary Action. These changes are based on the rules of the Chicago Board Option Exchange...-through violations. Exchange Rule 12140 provides that in lieu of commencing a disciplinary proceeding, the...

  9. Education in Emergencies: The Gender Implications. Advocacy Brief

    ERIC Educational Resources Information Center

    Kirk, Jackie

    2006-01-01

    "Education in emergencies" refers to a broad range of education activities, both formal and non-formal, which are life saving and sustaining. They are critical for children, youth and their families in times of crisis. Programmes for emergency education are often designed according to a specific environmental context and sometimes as temporary…

  10. Formalization and Analysis of Reasoning by Assumption

    ERIC Educational Resources Information Center

    Bosse, Tibor; Jonker, Catholijn M.; Treur, Jan

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically…

  11. Non-Formal Education in Free Time: Leisure- or Work-Orientated Activity?

    ERIC Educational Resources Information Center

    Thoidis, Ioannis; Pnevmatikos, Dimitrios

    2014-01-01

    This article deals with the relationship between adults' free time and further education. More specifically, the paper addresses the question of whether there are similarities and analogies between the leisure time that adults dedicate to non-formal educational activities and free time per se. A structured questionnaire was used to examine the…

  12. Teaching about Hazardous and Toxic Materials. Teaching Activities in Environmental Education Series.

    ERIC Educational Resources Information Center

    Disinger, John F.; Lisowski, Marylin

    Designed to assist practitioners of both formal and non-formal settings, this 18th volume of the ERIC Clearinghouse for Science, Mathematics, and Environmental Education's Teaching Activities in Environmental Education series specifically focuses on the theme of hazardous and toxic materials. Initially, basic environmental concepts that deal with…

  13. Development and Evaluation of an Ontology for Guiding Appropriate Antibiotic Prescribing

    PubMed Central

    Furuya, E. Yoko; Kuperman, Gilad J.; Cimino, James J.; Bakken, Suzanne

    2011-01-01

    Objectives To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. Methods We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. Results The ontology includes 199 classes, 10 properties, and 1,636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: 1) antibiotic-microorganism mismatch alert; 2) medication-allergy alert; and 3) non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. Conclusions This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component—a need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. PMID:22019377

  14. Development and evaluation of an ontology for guiding appropriate antibiotic prescribing.

    PubMed

    Bright, Tiffani J; Yoko Furuya, E; Kuperman, Gilad J; Cimino, James J; Bakken, Suzanne

    2012-02-01

    To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. The ontology includes 199 classes, 10 properties, and 1636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: (1) antibiotic-microorganism mismatch alert; (2) medication-allergy alert; and (3) non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component-a need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. Copyright © 2011 Elsevier Inc. All rights reserved.
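
    To illustrate the kind of rule the ontology drives, the sketch below expresses the antibiotic-microorganism mismatch alert in plain Python. The study itself encodes such rules in SWRL over an OWL ontology; the coverage table here is a fabricated placeholder for illustration only.

      # Illustrative sketch only: the antibiotic-microorganism mismatch alert
      # as a plain-Python rule. The study encodes such rules in SWRL over an
      # OWL ontology; the coverage data below are invented placeholders.
      COVERS = {                  # hypothetical antibiotic -> organisms covered
          "vancomycin": {"MRSA", "MSSA"},
          "ceftriaxone": {"E. coli", "K. pneumoniae", "MSSA"},
      }

      def mismatch_alert(prescribed, cultured_organism):
          """Return an alert string when the prescription misses the organism."""
          if cultured_organism not in COVERS.get(prescribed, set()):
              return f"ALERT: {prescribed} does not cover {cultured_organism}"
          return None

      print(mismatch_alert("ceftriaxone", "MRSA"))  # fires an alert
      print(mismatch_alert("vancomycin", "MRSA"))   # None: no alert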

  15. Robot-assisted laparoscopic skills development: formal versus informal training.

    PubMed

    Benson, Aaron D; Kramer, Brandan A; Boehler, Margaret; Schwind, Cathy J; Schwartz, Bradley F

    2010-08-01

    The learning curve for robotic surgery is not completely defined, and ideal training components have not yet been identified. We attempted to determine whether skill development would be accelerated with formal, organized instruction in robotic surgical techniques versus informal practice alone. Forty-three medical students naive to robotic surgery were randomized into two groups and tested on three tasks using the robotic platform. Between the testing sessions, the students were given equally timed practice sessions. The formal training group participated in an organized, formal training session with instruction from an attending robotic surgeon, whereas the informal training group participated in an equally timed unstructured practice session with the robot. The results were compared based on technical score and time to completion of each task. There was no difference between groups in prepractice testing for any task. In postpractice testing, there was no difference between groups for the ring transfer tasks. However, for the suture placement and knot-tying task, the technical score of the formal training group was significantly better than that of the informal training group (p < 0.001), yet time to completion was not different. Although formal training may not be necessary for basic skills, formal instruction for more advanced skills, such as suture placement and knot tying, is important in developing skills needed for effective robotic surgery. These findings may be important in formulating potential skills labs or training courses for robotic surgery.

  16. Use of healthcare services by injured people in Khartoum State, Sudan.

    PubMed

    El Tayeb, Sally; Abdalla, Safa; Van den Bergh, Graziella; Heuch, Ivar

    2015-05-01

    Trauma care is an important factor in preventing death and reducing disability. Injured persons in low- and middle-income countries are expected to use the formal healthcare system in increasing numbers. The objective of this paper is to examine use of healthcare services after injury in Khartoum State, Sudan. A community-based survey using a stratified two-stage cluster sampling technique in Khartoum State was performed. Information on healthcare utilisation was taken from injured people. A logistic regression analysis was used to explore factors affecting the probability of using formal healthcare services. During the 12 months preceding the survey a total of 441 cases of non-fatal injuries occurred, with 260 patients accessing formal healthcare. About a quarter of the injured persons were admitted to hospital. Injured people with primary education were less likely to use formal healthcare compared to those with no education. Formal health services were most used by males and in cases of road traffic injuries. The lowest socio-economic strata were least likely to use formal healthcare. Public health measures and social security should be strengthened by identifying other real barriers that prevent low socio-economic groups from making use of formal healthcare facilities. Integration and collaboration with traditional orthopaedic practitioners are important aspects that need further attention. © The Author 2014. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  17. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.

  18. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.
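
    The flavor of the formalism can be conveyed with a small Monte Carlo experiment (not the paper's derivation): true ionization events produce correlated electron and ion detections, while an added Poisson background creates accidental coincidences. All rates and efficiencies below are assumed values.

      # Monte Carlo illustration: electrons and ions from the same ionization
      # event are detected with finite efficiency, and a Poisson background
      # adds false (accidental) coincidences.
      import numpy as np

      rng = np.random.default_rng(1)
      shots, mu = 200_000, 0.1           # laser shots, mean events per shot
      eff_e, eff_i, noise = 0.6, 0.4, 0.02

      events = rng.poisson(mu, shots)                    # true ionizations
      electrons = rng.binomial(events, eff_e) + rng.poisson(noise, shots)
      ions = rng.binomial(events, eff_i) + rng.poisson(noise, shots)

      coinc = np.mean((electrons > 0) & (ions > 0))      # raw coincidence rate
      accidental = np.mean(electrons > 0) * np.mean(ions > 0)
      print(f"raw coincidence: {coinc:.5f}, accidental estimate: {accidental:.5f}")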

  19. Informal and formal trail monitoring protocols and baseline conditions: Acadia National Park

    USGS Publications Warehouse

    Marion, Jeffrey L.; Wimpey, Jeremy F.; Park, L.

    2011-01-01

    At Acadia National Park, changing visitor use levels and patterns have contributed to an increasing degree of visitor use impacts to natural and cultural resources. To better understand the extent and severity of these resource impacts and identify effective management techniques, the park sponsored this research to develop monitoring protocols, collect baseline data, and identify suggestions for management strategies. Formal and informal trails were surveyed and their resource conditions were assessed and characterized to support park planning and management decision-making.

  20. Police training in interviewing and interrogation methods: A comparison of techniques used with adult and juvenile suspects.

    PubMed

    Cleary, Hayley M D; Warner, Todd C

    2016-06-01

    Despite empirical progress in documenting and classifying various interrogation techniques, very little is known about how police are trained in interrogation methods, how frequently they use various techniques, and whether they employ techniques differentially with adult versus juvenile suspects. This study reports the nature and extent of formal (e.g., Reid Technique, PEACE, HUMINT) and informal interrogation training as well as self-reported technique usage in a diverse national sample (N = 340) of experienced American police officers. Officers were trained in a variety of different techniques ranging from comparatively benign pre-interrogation strategies (e.g., building rapport, observing body language or speech patterns) to more psychologically coercive techniques (e.g., blaming the victim, discouraging denials). Over half the sample reported being trained to use psychologically coercive techniques with both adults and juveniles. The majority (91%) receive informal, "on the job" interrogation training. Technique usage patterns indicate a spectrum of psychological intensity where information-gathering approaches were used most frequently and high-pressure tactics less frequently. Reid-trained officers (56%) were significantly more likely than officers without Reid training to use pre-interrogation and manipulation techniques. Across all analyses and techniques, usage patterns were identical for adult and juvenile suspects, suggesting that police interrogate youth in the same manner as adults. Overall, results suggest that training in specific interrogation methods is strongly associated with usage. Findings underscore the need for more law enforcement interrogation training in general, especially with juvenile suspects, and highlight the value of training as an avenue for reducing interrogation-induced miscarriages of justice. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. Impedance spectroscopy and electric modulus behavior of Molybdenum doped Cobalt-Zinc ferrite

    NASA Astrophysics Data System (ADS)

    Pradhan, A. K.; Nath, T. K.; Saha, S.

    2017-07-01

    The complex impedance spectroscopy and the electric modulus of Mo-doped Cobalt-Zinc inverse spinel ferrite (CZMO) have been investigated in detail. The conventional ceramic technique has been used to prepare the CZMO. High-resolution X-ray diffraction (HRXRD) has been used for the structural analysis, which confirms the inverse spinel structure of the material and also suggests that the material has the Fd3m space group. The complex impedance spectroscopic data and the electric modulus formalism have been used to understand the dielectric relaxation and conduction process. The contributions of grain and grain boundary to the electrical conduction process of CZMO have been confirmed from the Cole-Cole plot. The activation energy is calculated from both the IS (Impedance Spectroscopy) and electric modulus formalisms and found to be nearly the same for the material.
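
    A typical computation in this kind of study is an Arrhenius fit of the relaxation times extracted from the impedance or modulus peaks, tau = tau0 * exp(Ea / (kB * T)). The sketch below recovers the activation energy from synthetic data; the numbers are illustrative, not the paper's.

      # Sketch: activation energy from temperature-dependent relaxation times,
      # assuming Arrhenius behavior tau = tau0 * exp(Ea / (kB * T)).
      # The data points are synthetic, for illustration only.
      import numpy as np

      kB = 8.617e-5                           # Boltzmann constant, eV/K
      T = np.array([300., 325., 350., 375., 400.])
      tau = 1e-12 * np.exp(0.45 / (kB * T))   # synthetic data, Ea = 0.45 eV

      slope, intercept = np.polyfit(1.0 / (kB * T), np.log(tau), 1)
      print(f"activation energy: {slope:.3f} eV (tau0 = {np.exp(intercept):.2e} s)")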

  2. Analysis and improvements of Adaptive Particle Refinement (APR) through CPU time, accuracy and robustness considerations

    NASA Astrophysics Data System (ADS)

    Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.

    2018-02-01

    While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.

  3. Schedule-Aware Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  4. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proven to satisfy certain properties. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic, as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle, resulting in a more streamlined and accurate development process.

  5. On the Formal Verification of Conflict Detection Algorithms

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles

    2001-01-01

    Safety assessment of new air traffic management systems is a main issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the model of trajectories, we extract, and formally prove, high-level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.
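
    The trajectory concept admits a very simple special case: two aircraft on straight-line, constant-velocity paths in the x-y plane. The sketch below computes their closest approach within a lookahead horizon and flags a conflict against an assumed separation minimum; it only illustrates the concept and is not the AILS algorithm.

      # Minimal sketch of conflict detection for two aircraft on straight-line,
      # constant-velocity trajectories in the x-y plane. Illustrative only;
      # not the AILS algorithm itself.
      import numpy as np

      def min_separation(p1, v1, p2, v2, horizon):
          """Smallest distance between the aircraft within the lookahead time."""
          dp, dv = np.subtract(p1, p2), np.subtract(v1, v2)
          denom = np.dot(dv, dv)
          t_star = 0.0 if denom == 0 else -np.dot(dp, dv) / denom
          t_star = min(max(t_star, 0.0), horizon)   # clamp to [0, horizon]
          return float(np.linalg.norm(dp + t_star * dv)), t_star

      dist, t = min_separation((0, 0), (8, 0), (60, -30), (0, 4), horizon=20.0)
      print(f"closest approach {dist:.1f} at t = {t:.1f}; conflict: {dist < 5.0}")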

  6. Incompleteness of Bluetooth protocol conformance test cases

    NASA Astrophysics Data System (ADS)

    Wu, Peng; Gao, Qiang

    2001-10-01

    This paper describes a formal method to verify the completeness of conformance testing, in which not only the Implementation Under Test (IUT) is formalized in SDL, but the conformance tester is also described in SDL, so that conformance testing can be performed in a simulator provided with a CASE tool. The protocol set considered is Bluetooth, an open wireless communication technology. Our research results show that the Bluetooth conformance test specification is not complete, in that it has only limited coverage and many important capabilities defined in the Bluetooth core specification are not tested. We also give a detailed report on the missing test cases against the Bluetooth core specification, and provide guidance for further test case generation.

  7. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
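
    The safety fragment of LTL can be checked by explicit-state reachability, which gives some intuition for what the symbolic algorithms compute. The toy sketch below verifies G(not both-critical) on an invented mutual-exclusion-like model and would return a counterexample path on failure; the paper's symbolic, automata-based approach differs in representation, not in the property checked.

      # Simplified illustration: checking the safety property G(not bad) by
      # explicit-state reachability with counterexample recovery, on a toy
      # transition system. The paper's subject is symbolic LTL compilation;
      # this sketch only conveys the underlying model-checking idea.
      from collections import deque

      TRANSITIONS = {                    # toy mutual-exclusion-like model
          ("idle", "idle"): [("try", "idle"), ("idle", "try")],
          ("try", "idle"): [("crit", "idle"), ("try", "try")],
          ("idle", "try"): [("idle", "crit"), ("try", "try")],
          ("try", "try"): [("crit", "try"), ("try", "crit")],
          ("crit", "idle"): [("idle", "idle")],
          ("idle", "crit"): [("idle", "idle")],
          ("crit", "try"): [("idle", "try")],
          ("try", "crit"): [("try", "idle")],
      }

      def check_safety(init, bad):
          """BFS over reachable states; return a counterexample path or None."""
          parent, queue = {init: None}, deque([init])
          while queue:
              s = queue.popleft()
              if bad(s):                 # violation of G(not bad)
                  path = []
                  while s is not None:
                      path.append(s)
                      s = parent[s]
                  return list(reversed(path))
              for t in TRANSITIONS.get(s, []):
                  if t not in parent:
                      parent[t] = s
                      queue.append(t)
          return None

      cex = check_safety(("idle", "idle"), lambda s: s == ("crit", "crit"))
      print("property holds" if cex is None else f"counterexample: {cex}")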

  8. Creating More Effective Mentors: Mentoring the Mentor.

    PubMed

    Gandhi, Monica; Johnson, Mallory

    2016-09-01

    Given the diversity of those affected by HIV, increasing diversity in the HIV biomedical research workforce is imperative. A growing body of empirical and experimental evidence supports the importance of strong mentorship in the development and success of trainees and early career investigators in academic research settings, especially for mentees of diversity. Often missing from this discussion is the need for robust mentoring training programs to ensure that mentors are trained in best practices on the tools and techniques of mentoring. Recent experimental evidence shows improvement in mentor and mentee perceptions of mentor competency after structured and formalized training on best practices in mentoring. We developed a 2-day "Mentoring the Mentors" workshop at UCSF to train mid-level and senior HIV researchers from around the country [recruited mainly from Centers for AIDS Research (CFARs)] on best practices, tools and techniques of effective mentoring. The workshop content was designed using principles of Social Cognitive Career Theory (SCCT) and included training specifically geared towards working with early career investigators from underrepresented groups, including sessions on unconscious bias, microaggressions, and diversity supplements. The workshop has been held three times (September 2012, October 2013 and May 2015) with plans for annual training. Mentoring competency was measured using a validated tool before and after each workshop. Mentoring competency skills in six domains of mentoring-specifically effective communication, aligning expectations, assessing understanding, fostering independence, addressing diversity and promoting development-all improved as assessed by a validated measurement tool for participants pre- and -post the "Mentoring the Mentors" training workshops. Qualitative assessments indicated a greater awareness of the micro-insults and unconscious bias experienced by mentees of diversity and a commitment to improve awareness and mitigate these effects via the mentor-mentee relationship. Our "Mentoring the Mentors" workshop for HIV researchers/mentors offers a formal and structured curriculum on best practices, tools and techniques of effective mentoring, and methods to mitigate unconscious bias in the mentoring relationship. We found quantitative and qualitative improvements in mentoring skills as assessed by self-report by participants after each workshop and plan additional programs with longitudinal longer-term assessments focused on objective mentee outcomes (grants, papers, academic retention). Mentoring training can improve mentoring skills and is likely to improve outcomes for optimally-mentored mentees.

  9. Creating more effective mentors: Mentoring the mentor

    PubMed Central

    Gandhi, Monica; Johnson, Mallory

    2016-01-01

    Introduction Given the diversity of those affected by HIV, increasing diversity in the HIV biomedical research workforce is imperative. A growing body of empirical and experimental evidence supports the importance of strong mentorship in the development and success of trainees and early career investigators in academic research settings, especially for mentees of diversity. Often missing from this discussion is the need for robust mentoring training programs to ensure that mentors are trained in best practices on the tools and techniques of mentoring. Recent experimental evidence shows improvement in mentor and mentee perceptions of mentor’s competency after structured and formalized training on best practices in mentoring. Methods We developed a 2-day “Mentoring the Mentors” workshop at UCSF to train mid-level and senior HIV researchers from around the country (recruited mainly from Centers for AIDS Research (CFARs)) on best practices, tools and techniques of effective mentoring. The workshop content was designed using principles of Social Cognitive Career Theory (SCCT) and included training specific to working with early career investigators from underrepresented groups, including training on unconscious bias, microaggressions, and diversity supplements. The workshop has been held 3 times (September 2012, October 2013 and May 2015) with plans for annual training. Mentoring competency was measured using a validated tool before and after each workshop. Results Mentoring competency skills in six domains of mentoring -specifically effective communication, aligning expectations, assessing understanding, fostering independence, addressing diversity and promoting development - all improved as assessed by a validated measurement tool for participants pre- and-post the “Mentoring the Mentors” training workshops. Qualitative assessments indicated a greater awareness of the micro-insults and unconscious bias experienced by mentees of diversity and a commitment to improve awareness and mitigate these effects via the mentor-mentee relationship. Discussion Our “Mentoring the Mentors” workshop for HIV researchers/mentors offers a formal and structured curriculum on best practices, tools and techniques of effective mentoring, and methods to mitigate unconscious bias in the mentoring relationship and at the institutional level with mentees of diversity. We found quantitative and qualitative improvements in mentoring skills as assessed by self-report by participants after each workshop and plan additional programs with longitudinal longer-term assessments focused on objective mentee outcomes (grants, papers, academic retention). Mentoring training can improve mentoring skills and are likely to improve outcomes for optimally-mentored mentees. PMID:27039092

  10. Impact of Managerial Skills Learnt through MA Educational Planning Management Programme of AIOU on the Performance of Institutional Heads

    ERIC Educational Resources Information Center

    Chuadhry, Muhammad Asif; Shah, Syed Manzoor Hussain

    2012-01-01

    Management provides formal coordination in an organization for achieving pre-determined goals. The educational manager particularly performs his duties by using different planning and management techniques. These techniques are equally important for managers in other sectors. The present study was focused on the impact of managerial skills…

  11. Lee Silverman Voice Treatment for People with Parkinson's: Audit of Outcomes in a Routine Clinic

    ERIC Educational Resources Information Center

    Wight, Sheila; Miller, Nick

    2015-01-01

    Background: Speaking louder/more intensely represents a longstanding technique employed to manage voice and intelligibility changes in people with Parkinson's. This technique has been formalized into a treatment approach and marketed as the Lee Silverman Voice Treatment (LSVT®) programme. Evidence for its efficacy has been published. Studies…

  12. Elicitation Techniques: Getting People to Talk about Ideas They Don't Usually Talk About

    ERIC Educational Resources Information Center

    Barton, Keith C.

    2015-01-01

    Elicitation techniques are a category of research tasks that use visual, verbal, or written stimuli to encourage participants to talk about their ideas. These tasks are particularly useful for exploring topics that may be difficult to discuss in formal interviews, such as those that involve sensitive issues or rely on tacit knowledge. Elicitation…

  13. A Delphi Study on Staff Bereavement Training in the Intellectual and Developmental Disabilities Field

    ERIC Educational Resources Information Center

    Gray, Jennifer A.; Truesdale, Jesslyn

    2015-01-01

    The Delphi technique was used to obtain expert panel consensus to prioritize content areas and delivery methods for developing staff grief and bereavement curriculum training in the intellectual and developmental disabilities (IDD) field. The Delphi technique was conducted with a panel of 18 experts from formal and informal disability caregiving,…

  14. Orthogonality-breaking sensing model based on the instantaneous Stokes vector and the Mueller calculus

    NASA Astrophysics Data System (ADS)

    Ortega-Quijano, Noé; Fade, Julien; Roche, Muriel; Parnet, François; Alouini, Mehdi

    2016-04-01

    Polarimetric sensing by orthogonality breaking has been recently proposed as an alternative technique for performing direct and fast polarimetric measurements using a specific dual-frequency dual-polarization (DFDP) source. Based on the instantaneous Stokes-Mueller formalism to describe the high-frequency evolution of the DFDP beam intensity, we thoroughly analyze the interaction of such a beam with birefringent, dichroic and depolarizing samples. This allows us to confirm that orthogonality breaking is produced by the sample diattenuation, whereas this technique is immune to both birefringence and diagonal depolarization. We further analyze the robustness of this technique when polarimetric sensing is performed through a birefringent waveguide, and the optimal DFDP source configuration for fiber-based endoscopic measurements is subsequently identified. Finally, we consider a stochastic depolarization model based on an ensemble of random linear diattenuators, which makes it possible to understand the progressive vanishing of the detected orthogonality breaking signal as the spatial heterogeneity of the sample increases, thus confirming the insensitivity of this method to diagonal depolarization. The fact that the orthogonality breaking signal is exclusively due to the sample dichroism is an advantageous feature for the precise decoupled characterization of such an anisotropic parameter in samples showing several simultaneous effects.
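
    The core of the Mueller-calculus argument can be reproduced in a few lines: a diattenuator transmits the two orthogonal polarization components with different intensities, which is what unbalances them, while a pure retarder or diagonal depolarizer does not. The matrix below is the standard Mueller matrix of a horizontal linear diattenuator; the transmittance values are arbitrary assumptions.

      # Sketch of the Mueller calculus used in the paper: propagate Stokes
      # vectors through a horizontal linear diattenuator.
      import numpy as np

      def linear_diattenuator(q, r):
          """Mueller matrix; intensity transmittances q (x-axis), r (y-axis)."""
          s = np.sqrt(q * r)
          return 0.5 * np.array([[q + r, q - r, 0, 0],
                                 [q - r, q + r, 0, 0],
                                 [0, 0, 2 * s, 0],
                                 [0, 0, 0, 2 * s]])

      S_x = np.array([1.0, 1.0, 0.0, 0.0])    # horizontally polarized beam
      S_y = np.array([1.0, -1.0, 0.0, 0.0])   # orthogonal partner beam
      M = linear_diattenuator(0.9, 0.4)
      # Unequal transmitted intensities (0.9 vs 0.4): the dichroism breaks
      # the balance between the two orthogonal components.
      print("transmitted intensities:", (M @ S_x)[0], (M @ S_y)[0])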

  15. Telemedicine Platform Enhanced visiophony solution to operate a Robot-Companion

    NASA Astrophysics Data System (ADS)

    Simonnet, Th.; Couet, A.; Ezvan, P.; Givernaud, O.; Hillereau, P.

    Nowadays, one of the ways to reduce medical care costs is to reduce the length of patients' hospitalization and reinforce home sanitary support by formal (professionals) and non-formal (family) caregivers. The aim is to design and operate a scalable and secured collaborative platform to handle specific tools for patients, their families and doctors.

  16. Emerging Interaction of Political Processes: The Effect on a Study Abroad Program in Cuba

    ERIC Educational Resources Information Center

    Clarke, Ruth

    2007-01-01

    The emerging interaction of political processes sets the stage for the level of macro uncertainty and specific risk events that may occur in an international relationship. Strongly defined social control in Cuba, formal and informal, dominates the dynamics of the relationship, while simultaneously government, formal, action in the U.S. dominates…

  17. Using Exemption Examinations to Assess Finnish Business Students' Non-Formal and Informal Learning of ESP: A Pilot Study

    ERIC Educational Resources Information Center

    Tuomainen, Satu

    2014-01-01

    In recent years Finnish university language centres have increasingly developed procedures for assessing and recognising the skills in English for Specific Purposes (ESP) that students acquire in various non-formal and informal learning environments. This article describes the procedures developed by the University of Eastern Finland Language…

  18. Closing the Gap between Formalism and Application--PBL and Mathematical Skills in Engineering

    ERIC Educational Resources Information Center

    Christensen, Ole Ravn

    2008-01-01

    A common problem in learning mathematics concerns the gap between, on the one hand, doing the formalisms and calculations of abstract mathematics and, on the other hand, applying these in a specific contextualized setting, for example the engineering world. The skills acquired through problem-based learning (PBL), in the special model used at…

  19. Literate Specification: Using Design Rationale To Support Formal Methods in the Development of Human-Machine Interfaces.

    ERIC Educational Resources Information Center

    Johnson, Christopher W.

    1996-01-01

    The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…

  20. Internal medicine point-of-care ultrasound assessment of left ventricular function correlates with formal echocardiography.

    PubMed

    Johnson, Benjamin K; Tierney, David M; Rosborough, Terry K; Harris, Kevin M; Newell, Marc C

    2016-02-01

    Although focused cardiac ultrasonographic (FoCUS) examination has been evaluated in emergency departments and intensive care units with good correlation to formal echocardiography, accuracy for the assessment of left ventricular systolic function (LVSF) when performed by internal medicine physicians still needs independent evaluation. This prospective observational study in a 640-bed, academic, quaternary care center, included 178 inpatients examined by 10 internal medicine physicians who had completed our internal medicine bedside ultrasound training program. The ability to estimate LVSF with FoCUS as "normal," "mild to moderately decreased," or "severely decreased" was compared with left ventricular ejection fraction (>50%, 31-49%, and <31%, respectively) from formal echocardiography interpreted by a cardiologist. Sensitivity and specificity of FoCUS for any degree of LVSF impairment were 0.91 (95% confidence interval [CI] 0.80, 0.97) and 0.88 (95% CI 0.81, 0.93), respectively. The interrater agreement between internal medicine physician-performed FoCUS and formal echocardiography for any LVSF impairment was "good/substantial" with κ = 0.77 (p < 0.001), 95% CI (0.67, 0.87). Formal echocardiography was classified as "technically limited due to patient factors" in 20% of patients; however, echogenicity was sufficient in 100% of FoCUS exams to classify LVSF. Internal medicine physicians using FoCUS identify normal versus decreased LVSF with high sensitivity, specificity, and "good/substantial" interrater agreement when compared with formal echocardiography. These results support the role of cardiac FoCUS by properly trained internal medicine physicians for discriminating normal from reduced LVSF. © 2015 Wiley Periodicals, Inc.
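
    For readers who want to reproduce this style of analysis, the sketch below computes sensitivity, specificity, and Wilson 95% confidence intervals from a 2x2 table comparing a bedside test with a reference standard. The counts used are hypothetical, not the study's data.

      # Sketch: sensitivity, specificity, and Wilson 95% confidence intervals
      # from a 2x2 table. Counts below are hypothetical placeholders.
      from math import sqrt

      def wilson_ci(successes, n, z=1.96):
          """Wilson score interval for a binomial proportion."""
          p = successes / n
          center = (p + z * z / (2 * n)) / (1 + z * z / n)
          half = (z / (1 + z * z / n)) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
          return center - half, center + half

      tp, fn, tn, fp = 48, 5, 110, 15        # hypothetical counts
      sens, spec = tp / (tp + fn), tn / (tn + fp)
      lo, hi = wilson_ci(tp, tp + fn)
      print(f"sensitivity {sens:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
      lo, hi = wilson_ci(tn, tn + fp)
      print(f"specificity {spec:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")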

  1. LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.

    PubMed

    Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat

    2009-08-01

    To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is the clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes and conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on the former formalization with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts, is developed. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.

  2. A Formal Valuation Framework for Emotions and Their Control.

    PubMed

    Huys, Quentin J M; Renz, Daniel

    2017-09-15

    Computational psychiatry aims to apply mathematical and computational techniques to help improve psychiatric care. To achieve this, the phenomena under scrutiny should be within the scope of formal methods. As emotions play an important role across many psychiatric disorders, such computational methods must encompass emotions. Here, we consider formal valuation accounts of emotions. We focus on the fact that the flexibility of emotional responses and the nature of appraisals suggest the need for a model-based valuation framework for emotions. However, resource limitations make plain model-based valuation impossible and require metareasoning strategies to apportion cognitive resources adaptively. We argue that emotions may implement such metareasoning approximations by restricting the range of behaviors and states considered. We consider the processes that guide the deployment of the approximations, discerning between innate, model-free, heuristic, and model-based controllers. A formal valuation and metareasoning framework may thus provide a principled approach to examining emotions. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  3. Expressive map design: OGC SLD/SE++ extension for expressive map styles

    NASA Astrophysics Data System (ADS)

    Christophe, Sidonie; Duménieu, Bertrand; Masse, Antoine; Hoarau, Charlotte; Ory, Jérémie; Brédif, Mathieu; Lecordix, François; Mellado, Nicolas; Turbet, Jérémie; Loi, Hugo; Hurtut, Thomas; Vanderhaeghe, David; Vergne, Romain; Thollot, Joëlle

    2018-05-01

    In the context of custom map design, handling more artistic and expressive tools has been identified as a cartographic need, in order to design stylized and expressive maps. Based on previous works on style formalization, an approach for specifying the map style has been proposed and experimented for particular use cases. A first step deals with the analysis of inspiration sources, in order to extract 'what makes the style of the source', i.e. the salient visual characteristics to be automatically reproduced (textures, spatial arrangements, linear stylization, etc.). In a second step, in order to mimic and generate those visual characteristics, existing and innovative rendering techniques have been implemented in our GIS engine, thus extending the capabilities to generate expressive renderings. Therefore, an extension of the existing cartographic pipeline has been proposed based on the following aspects: 1) extension of the OGC SLD/SE symbolization specifications in order to provide a formalism to specify and reference expressive rendering methods; 2) separation of the specification of each rendering method from its parameterization, handled as metadata. The main contribution has been described in (Christophe et al. 2016). In this paper, we focus firstly on the extension of the cartographic pipeline (SLD++ and metadata) and secondly on map design capabilities which have been experimented on various topographic styles: old cartographic styles (Cassini), artistic styles (watercolor, impressionism, Japanese print), hybrid topographic styles (ortho-imagery & vector data) and finally abstract and photo-realist styles for the geovisualization of coastal areas. The genericity and interoperability of our approach are promising and have already been tested for 3D visualization.

  4. Diagnostic accuracy and limitations of post-mortem MRI for neurological abnormalities in fetuses and children.

    PubMed

    Arthurs, O J; Thayyil, S; Pauliah, S S; Jacques, T S; Chong, W K; Gunny, R; Saunders, D; Addison, S; Lally, P; Cady, E; Jones, R; Norman, W; Scott, R; Robertson, N J; Wade, A; Chitty, L; Taylor, A M; Sebire, N J

    2015-08-01

    To compare the diagnostic accuracy of non-invasive cerebral post-mortem magnetic resonance imaging (PMMRI) specifically for cerebral and neurological abnormalities in a series of fetuses and children, relative to conventional autopsy. Institutional ethics approval and parental consent were obtained. Pre-autopsy cerebral PMMRI was performed in a sequential prospective cohort (n = 400) of fetuses (n = 277; 185 ≤ 24 weeks and 92 > 24 weeks gestation) and children <16 years of age (n = 123). PMMRI and conventional autopsy findings were reported blinded and independently of each other. Cerebral PMMRI had sensitivities and specificities (95% confidence interval) of 88.4% (75.5 to 94.9), and 95.2% (92.1 to 97.1), respectively, for cerebral malformations; 100% (83.9 to 100), and 99.1% (97.2 to 99.7) for major intracranial bleeds; and 87.5% (80.1 to 92.4) and 74.1% (68 to 79.4) for overall brain pathology. Formal neuropathological examination was non-diagnostic due to maceration/autolysis in 43/277 (16%) fetuses; of these, cerebral PMMRI imaging provided clinically important information in 23 (53%). The sensitivity of PMMRI for detecting significant ante-mortem ischaemic injury was only 68% (48.4 to 82.8) overall. PMMRI is an accurate investigational technique for identifying significant neuropathology in fetuses and children, and may provide important information even in cases where autolysis prevents formal neuropathological examination; however, PMMRI is less sensitive at detecting hypoxic-ischaemic brain injury, and may not detect rarer disorders not encountered in this study. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  5. Using the Statecharts paradigm for simulation of patient flow in surgical care.

    PubMed

    Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian

    2008-03-01

    Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the peri-operative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates in patient records generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that Statecharts capture successfully the behavioral aspects of surgical care delivery by specifying permissible chronology of events, conditions, and actions.
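
    Two of the Statecharts notions the authors rely on, parallelism and event broadcasting, can be shown in a toy form: two parallel machines whose transitions broadcast events that trigger transitions in the other. The states and events below are invented simplifications of the perioperative process, not the authors' model.

      # Toy illustration of parallel state machines with event broadcasting,
      # in the spirit of Statecharts. States and events are invented.
      class Machine:
          def __init__(self, name, transitions, state):
              self.name, self.transitions, self.state = name, transitions, state

          def handle(self, event, bus):
              key = (self.state, event)
              if key in self.transitions:
                  self.state, emitted = self.transitions[key]
                  print(f"{self.name}: -> {self.state} (on {event})")
                  for e in emitted:
                      bus.append(e)        # broadcast to all machines

      clinical = Machine("clinical", {
          ("ward", "or_ready"): ("surgery", ["patient_in_or"]),
          ("surgery", "closed"): ("recovery", []),
      }, "ward")
      managerial = Machine("managerial", {
          ("scheduling", "book_or"): ("or_booked", ["or_ready"]),
          ("or_booked", "patient_in_or"): ("in_use", []),
      }, "scheduling")

      def run(machines, bus):
          while bus:                        # run-to-completion event loop
              event = bus.pop(0)
              for m in machines:
                  m.handle(event, bus)

      run([clinical, managerial], ["book_or"])   # booking cascades into surgery
      run([clinical, managerial], ["closed"])    # surgery ends, patient recovers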

  6. Verifying the interactive convergence clock synchronization algorithm using the Boyer-Moore theorem prover

    NASA Technical Reports Server (NTRS)

    Young, William D.

    1992-01-01

    The application of formal methods to the analysis of computing systems promises to provide higher and higher levels of assurance as the sophistication of our tools and techniques increases. Improvements in tools and techniques come about as we pit the current state of the art against new and challenging problems. A promising area for the application of formal methods is in real-time and distributed computing. Some of the algorithms in this area are both subtle and important. In response to this challenge and as part of an ongoing attempt to verify an implementation of the Interactive Convergence Clock Synchronization Algorithm (ICCSA), we decided to undertake a proof of the correctness of the algorithm using the Boyer-Moore theorem prover. This paper describes our approach to proving the ICCSA using the Boyer-Moore prover.

  7. External beam techniques to boost cervical cancer when brachytherapy is not an option—theories and applications

    PubMed Central

    Kilic, Sarah; Khan, Atif J.; Beriwal, Sushil; Small, William

    2017-01-01

    The management of locally advanced cervical cancer relies on brachytherapy (BT) as an integral part of the radiotherapy delivery armamentarium. Occasionally, intracavitary BT is neither possible nor available. In these circumstances, post-external beam radiotherapy (EBRT) interstitial brachytherapy and/or hysterectomy may represent viable options that must be adequately executed in a timely manner. However, if these options are not applicable due to patient related or facility related reasons, a formal contingency plan should be in place. Innovative EBRT techniques such as intensity modulated and stereotactic radiotherapy may be considered for patients unable to undergo brachytherapy. Relying on provocative arguments and recent data, this review explores the rationale for and limitations of non-brachytherapy substitutes in that setting aiming to establish a formal process for the optimal execution of this alternative plan. PMID:28603722

  8. Breaking the theoretical scaling limit for predicting quasiparticle energies: the stochastic GW approach.

    PubMed

    Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi

    2014-08-15

    We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with Ne > 3000 electrons.
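
    The statistical trick underlying the stochastic orbitals can be illustrated independently of GW: Hutchinson's estimator replaces a deterministic trace (a stand-in for a sum over orbitals) with an average over a few random vectors. The sketch below is that generic estimator, not the sGW algorithm.

      # Generic illustration of the stochastic-averaging idea behind sGW:
      # Hutchinson's estimator approximates trace(A) as E[chi^T A chi] over
      # random +/-1 vectors. Not the sGW algorithm itself.
      import numpy as np

      rng = np.random.default_rng(7)
      N, samples = 2000, 40
      A = rng.standard_normal((N, N))
      A = A @ A.T / N                      # symmetric positive semidefinite

      est = 0.0
      for _ in range(samples):
          chi = rng.choice([-1.0, 1.0], size=N)   # random "stochastic orbital"
          est += chi @ (A @ chi) / samples        # E[chi^T A chi] = trace(A)

      print(f"stochastic trace: {est:.1f}  exact: {np.trace(A):.1f}")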

  9. [The workplace-based learning: a main paradigm of an effective continuing medical education].

    PubMed

    Lelli, Maria Barbara

    2010-01-01

    On the strength of the literature analysis and the Emilia-Romagna Region experience, we suggest a reflection on workplace-based learning that goes beyond the analysis of the effectiveness of specific didactic methodologies and aspects related to Continuing Medical Education. The issue of health education and training is viewed from a wider perspective, one that integrates the three learning dimensions (formal, non-formal and informal). In such a perspective, workplace-based learning becomes an essential paradigm for reshaping the explicit knowledge conveyed in formal contexts and for emphasizing informal contexts where innovation is generated.

  10. Proceedings of the IDA (Institute for Defense Analyses) Workshop on Formal Specification and Verification of Ada (Trade Name) (2nd) Held in Alexandria, Virginia on July 23-25, 1985.

    DTIC Science & Technology

    1985-11-01

    Contents noted in the proceedings include: [5] Role of the Formal Definition of Ada, Bernard Lang, INRIA, no date, 10 pages; [6] The Users of a Formal Definition for Ada, Bernd Krieg-Brückner.

  11. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
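    As a flavor of the run-time monitoring discussed above, here is a minimal sketch: an invariant checked over a telemetry stream with a recovery hook. The field names and the recovery action are hypothetical, not taken from any surveyed mission.

    ```python
    # Minimal run-time monitor: check an invariant on each telemetry sample
    # and trigger a recovery hook on violation. Names are illustrative only.

    def monitor(telemetry, invariant, on_violation):
        for sample in telemetry:
            if not invariant(sample):
                on_violation(sample)

    battery_ok = lambda s: 3.0 <= s["battery_v"] <= 4.2   # hypothetical bound

    stream = [{"t": 0, "battery_v": 3.9}, {"t": 1, "battery_v": 2.7}]
    monitor(stream, battery_ok,
            lambda s: print("recovery triggered at t =", s["t"]))
    ```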

  12. Dynamic Gate Product and Artifact Generation from System Models

    NASA Technical Reports Server (NTRS)

    Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris

    2011-01-01

    Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" from model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one-another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and production of paper-based documents or their "office-productivity" file equivalents.

  13. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.
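    A textbook one-dimensional example (ours, not the paper's) shows what relaxing the Lipschitz condition buys: the equilibrium is reached in finite time, a so-called terminal attractor.

    ```latex
    % The Lipschitz condition fails at x = 0, so the equilibrium is reached
    % in finite time rather than only asymptotically.
    \[
      \dot{x} = -x^{1/3}, \qquad
      \frac{d}{dx}\left(-x^{1/3}\right) = -\tfrac{1}{3}x^{-2/3} \to -\infty
      \ \text{as}\ x \to 0 ,
    \]
    \[
      x(t) = \left(x_0^{2/3} - \tfrac{2}{3}\,t\right)^{3/2}
      \quad\Longrightarrow\quad
      x(t^{*}) = 0 \ \text{at}\ t^{*} = \tfrac{3}{2}\,x_0^{2/3} ,
    \]
    % after which the solution can remain at rest: the uniqueness guarantee
    % that would forbid this is exactly what the Lipschitz condition provided.
    ```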

  14. Protege and Mentor Self-Disclosure: Levels and Outcomes within Formal Mentoring Dyads in a Corporate Context

    ERIC Educational Resources Information Center

    Wanberg, Connie R.; Welsh, Elizabeth T.; Kammeyer-Mueller, John

    2007-01-01

    This study examined the role of self-disclosure within protege/mentor dyads in formal mentoring partnerships within a corporate context as a means of learning more about specific relationship processes that may enhance the positive outcomes of mentoring. While both proteges and mentors self-disclosed in their relationships, proteges disclosed at a…

  15. Multi-User Virtual Environments Fostering Collaboration in Formal Education

    ERIC Educational Resources Information Center

    Di Blas, Nicoletta; Paolini, Paolo

    2014-01-01

    This paper is about how serious games based on MUVEs in formal education can foster collaboration. More specifically, it is about a large case-study with four different programs which took place from 2002 to 2009 and involved more than 9,000 students, aged between 12 and 18, from various nations (18 European countries, Israel and the USA). These…

  16. Developing Metrics for Effective Teaching in Extension Education: A Multi-State Factor-Analytic and Psychometric Analysis of Effective Teaching

    ERIC Educational Resources Information Center

    McKim, Billy R.; Lawver, Rebecca G.; Enns, Kellie; Smith, Amy R.; Aschenbrener, Mollie S.

    2013-01-01

    To successfully educate the public about agriculture, food, and natural resources, we must have effective educators in both formal and nonformal settings. Specifically, this study, which is a valuable part of a larger sequential mixed-method study addressing effective teaching in formal and nonformal agricultural education, provides direction for…

  17. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
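    The comparison step can be pictured with a short sketch; `model` here stands in for a PVSio evaluation of the formal specification, and both function bodies are placeholders.

    ```python
    # Sketch of the validation step: run the implementation and the animated
    # formal model on the same inputs and check agreement up to a tolerance.
    import math

    def validate(impl, model, test_inputs, tol=1e-9):
        failures = []
        for x in test_inputs:
            if not math.isclose(impl(x), model(x), abs_tol=tol):
                failures.append(x)
        return failures

    # Toy example: a floating-point implementation versus a "model" value.
    impl  = lambda t: 100.0 + 5.0 * t   # kinematic position, say
    model = lambda t: 100.0 + 5.0 * t
    print(validate(impl, model, [0.0, 1.5, 7.2]))   # -> [] (all agree)
    ```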

  18. Interacting hadron resonance gas model in the K-matrix formalism

    NASA Astrophysics Data System (ADS)

    Dash, Ashutosh; Samanta, Subhasis; Mohanty, Bedangadas

    2018-05-01

    An extension of the hadron resonance gas (HRG) model is constructed to include interactions using the relativistic virial expansion of the partition function. The noninteracting part of the expansion contains all the stable baryons and mesons, and the interacting part contains all the higher-mass resonances which decay into two stable hadrons. The virial coefficients are related to the phase shifts, which are calculated using the K-matrix formalism in the present work. We have calculated various thermodynamic quantities like pressure, energy density, and entropy density of the system. A comparison of thermodynamic quantities with the noninteracting HRG model, calculated using the same number of hadrons, shows that the results of the above formalism are larger. A good agreement between the equation of state calculated in the K-matrix formalism and lattice QCD simulations is observed. Specifically, the interaction measure calculated on the lattice is well described in our formalism. We have also calculated second-order fluctuations and correlations of conserved charges in the K-matrix formalism. We observe a good agreement of second-order fluctuations and the baryon-strangeness correlation with lattice data below the crossover temperature.
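    Schematically, and in our own notation (Boltzmann approximation; the paper should be consulted for the exact expressions), the two ingredients are the K-matrix phase shift of a single resonance and the phase-shift-weighted density of states entering the virial expansion.

    ```latex
    % Single-channel, single-resonance K-matrix parameterization of the phase
    % shift, and the interacting (second-virial) contribution it feeds.
    \[
      \tan\delta(\sqrt{s}) = \frac{m_R\,\Gamma(\sqrt{s})}{m_R^2 - s} ,
    \]
    \[
      \ln Z_{\mathrm{int}} \;\sim\; \frac{V}{2\pi^2}
      \int_{M_{\mathrm{th}}}^{\infty} dM\;
      \frac{1}{\pi}\frac{d\delta}{dM}\; M^2\, T\, K_2\!\left(M/T\right) ,
    \]
    % i.e. the resonance enters through an effective spectral density
    % (1/pi) d(delta)/dM rather than as a zero-width state, which is what
    % distinguishes the interacting from the noninteracting HRG.
    ```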

  19. "PowerPoint[R] Engagement" Techniques to Foster Deep Learning

    ERIC Educational Resources Information Center

    Berk, Ronald A.

    2011-01-01

    The purpose of this article is to describe a bunch of strategies with which teachers may already be familiar and, perhaps, use regularly, but not always in the context of a formal PowerPoint[R] presentation. Here are the author's top 10 engagement techniques that fit neatly within any version of PowerPoint[R]. Some of these may also be used with…

  20. Dyads versus Groups: Using Different Social Structures in Peer Review to Enhance Online Collaborative Learning Processes

    ERIC Educational Resources Information Center

    Pozzi, Francesca; Ceregini, Andrea; Ferlino, Lucia; Persico, Donatella

    2016-01-01

    The Peer Review (PR) is a very popular technique to support socio-constructivist and connectivist learning processes, online or face-to-face, at all educational levels, in both formal and informal contexts. The idea behind this technique is that sharing views and opinions with others by discussing with peers and receiving and providing formative…

  1. Standardized reporting of resection technique during nephron-sparing surgery: the surface-intermediate-base margin score.

    PubMed

    Minervini, Andrea; Carini, Marco; Uzzo, Robert G; Campi, Riccardo; Smaldone, Marc C; Kutikov, Alexander

    2014-11-01

    A standardized reporting system of nephron-sparing surgery resection techniques is lacking. The surface-intermediate-base scoring system represents a formal reporting instrument to assist in interpretation of reported data and to facilitate comparisons in the urologic literature. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  2. Minimally invasive surgical technique for tethered surgical drains

    PubMed Central

    Hess, Shane R; Satpathy, Jibanananda; Waligora, Andrew C; Ugwu-Oju, Obinna

    2017-01-01

    A feared complication of temporary surgical drain placement is from the technical error of accidentally suturing the surgical drain into the wound. Postoperative discovery of a tethered drain can frequently necessitate return to the operating room if it cannot be successfully removed with nonoperative techniques. Formal wound exploration increases anesthesia and infection risk as well as cost and is best avoided if possible. We present a minimally invasive surgical technique that can avoid the morbidity associated with a full surgical wound exploration to remove a tethered drain when other nonoperative techniques fail. PMID:28400669

  3. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  4. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  5. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
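    As a taste of the simplest formalism in this list, here is a minimal synchronous Boolean network in Python; the three-gene wiring is invented for illustration.

    ```python
    # Minimal synchronous Boolean network: genes are on/off, each updated
    # by a logical rule of the current state. Wiring is invented.

    rules = {
        "A": lambda s: not s["C"],          # C represses A
        "B": lambda s: s["A"],              # A activates B
        "C": lambda s: s["A"] and s["B"],   # A and B jointly activate C
    }

    def step(state):
        return {gene: rule(state) for gene, rule in rules.items()}

    state = {"A": True, "B": False, "C": False}
    for _ in range(5):                      # trace a short trajectory
        print(state)
        state = step(state)
    ```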

  6. Reciprocal relations between cognitive neuroscience and formal cognitive models: opposites attract?

    PubMed

    Forstmann, Birte U; Wagenmakers, Eric-Jan; Eichele, Tom; Brown, Scott; Serences, John T

    2011-06-01

    Cognitive neuroscientists study how the brain implements particular cognitive processes such as perception, learning, and decision-making. Traditional approaches in which experiments are designed to target a specific cognitive process have been supplemented by two recent innovations. First, formal cognitive models can decompose observed behavioral data into multiple latent cognitive processes, allowing brain measurements to be associated with a particular cognitive process more precisely and more confidently. Second, cognitive neuroscience can provide additional data to inform the development of formal cognitive models, providing greater constraint than behavioral data alone. We argue that these fields are mutually dependent; not only can models guide neuroscientific endeavors, but understanding neural mechanisms can provide key insights into formal models of cognition. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Contour-time approach to the Bose-Hubbard model in the strong coupling regime: Studying two-point spatio-temporal correlations at the Hartree-Fock-Bogoliubov level

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.

    2018-05-01

    We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard Model. Specifically, we obtain a two particle irreducible effective action within the contour-time formalism that allows for both equilibrium and out of equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong coupling methods as well as exact methods where possible. We discuss applications of this formalism to out of equilibrium situations.
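    For orientation, the model's Hamiltonian in standard notation (hopping amplitude J, on-site repulsion U, chemical potential μ; not reproduced in the abstract) is:

    ```latex
    % Bose-Hubbard Hamiltonian; the strong-coupling regime studied above
    % corresponds to U >> J.
    \[
      H = -J \sum_{\langle i,j \rangle} \left( b_i^{\dagger} b_j + \mathrm{h.c.} \right)
          + \frac{U}{2} \sum_i n_i \left( n_i - 1 \right)
          - \mu \sum_i n_i ,
      \qquad n_i = b_i^{\dagger} b_i .
    \]
    ```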

  8. 14 CFR 211.10 - Filing specifications.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) ECONOMIC REGULATIONS APPLICATIONS FOR PERMITS TO FOREIGN AIR CARRIERS General Requirements § 211.10 Filing... in § 302.3 of this chapter as to execution, number of copies, and formal specifications of papers. (b...

  9. Medication dispensing errors in Palestinian community pharmacy practice: a formal consensus using the Delphi technique.

    PubMed

    Shawahna, Ramzi; Haddad, Aseel; Khawaja, Baraa; Raie, Rand; Zaneen, Sireen; Edais, Tasneem

    2016-10-01

    Background Medication dispensing errors (MDEs) are frequent in community pharmacy practice. A definition of MDEs and scenarios representing MDE situations in Palestinian community pharmacy practice had not previously been approached using formal consensus techniques. Objective This study was conducted to achieve consensus, by a panel of community pharmacists, on a definition of MDEs and on a wide range of scenarios that should or should not be considered MDEs in Palestinian community pharmacy practice. Setting Community pharmacy practice in Palestine. Method This was a descriptive study using the Delphi technique. A panel of fifty community pharmacists was recruited from different geographical locations of the West Bank of Palestine. A three-round Delphi technique was followed to achieve consensus on a proposed definition of MDEs and 83 different scenarios representing potential MDEs, using a nine-point scale. Main outcome measure Agreement or disagreement of a panel of community pharmacists with a proposed definition of MDEs and a series of scenarios representing potential MDEs. Results In the first Delphi round, the views of key contact community pharmacists on MDEs were explored and situations representing potential MDEs were collected. In the second Delphi round, consensus was achieved to accept the proposed definition and to include 49 (59 %) of the 83 proposed scenarios as MDEs. In the third Delphi round, consensus was achieved to include a further 13 (15.7 %) scenarios as MDEs and to exclude 9 (10.8 %); the remaining 12 (14.5 %) scenarios were considered equivocal based on the opinions of the panelists. Conclusion Consensus on a definition of MDEs and on scenarios representing MDE situations in Palestinian community pharmacy practice was achieved using a formal consensus technique. The use of consensual definitions and scenarios representing MDE situations in community pharmacy practice might minimize methodological variations and their significant effects on the number and rate of MDEs reported in different studies.

  10. The effectiveness of the bone bridge transtibial amputation technique: A systematic review of high-quality evidence.

    PubMed

    Kahle, Jason T; Highsmith, M Jason; Kenney, John; Ruth, Tim; Lunseth, Paul A; Ertl, Janos

    2017-06-01

    This literature review was undertaken to determine whether commonly held views about the benefits of the bone bridge technique are supported by the literature. Four databases were searched for articles pertaining to surgical strategies specific to a bone bridge technique for the transtibial amputee. A total of 35 articles were identified as potentially relevant; articles were then excluded if they were determined to be low-quality evidence or not pertinent. Nine articles were identified as pertinent to at least one of the topics: Perioperative Care, Acute Care, Subjective Analysis, and Function; two articles were sorted into multiple topics. Two articles were sorted into the Perioperative Care topic, 4 into the Acute Care topic, 2 into the Subjective Analysis topic, and 5 into the Function topic. There are no high-quality (level one or two) clinical trials comparing the bone bridge technique to traditional methods. There is limited evidence supporting the clinical outcomes of the bone bridge technique, and no agreement supporting or discouraging the perioperative and acute care aspects of the technique. There is no evidence defining an interventional comparison of the bone bridge technique. Current level III evidence supports a bone bridge technique as an equivalent option to the non-bone bridge transtibial amputation technique. Formal level I and II clinical trials will need to be conducted in the future to guide clinical practice. Clinical relevance Clinical practice guidelines are evidence based. This systematic literature review identifies the highest-quality evidence to date, which reports a consensus of outcomes agreeing that bone bridge is as safe and effective as alternatives. The clinical relevance is understanding that bone bridge could additionally provide a mechanistic advantage for the transtibial amputee.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chuang, Yu-Chun; Sheu, Chou-Fu; Lee, Gene-Hsiang

    High-resolution X-ray diffraction experiments and atom-specific X-ray absorption experiments are applied to investigate a series of square planar complexes with the non-innocent ligand maleonitriledithiolate (mnt), [S_2C_2(CN)_2]^{z-}, containing M—S bonds. Four complexes of (PyH)_z[M(mnt)_2], where M = Ni or Cu, z = 2 or 1 and PyH^+ = C_5NH_6^+, were studied in order to clarify whether such one-electron oxidation-reduction, [M(mnt)_2]^{2-}/[M(mnt)_2]^{1-}, is taking place at the metal or the ligand site. Combining the techniques of metal K-, L-edge and S K-edge X-ray absorption spectroscopy with high-resolution X-ray charge density studies, it is unambiguously demonstrated that the electron redox reaction is ligand based and metal based for the Ni and Cu pairs, respectively. The bonding characters in terms of topological properties associated with the bond critical points are compared between the oxidized form [ML]^{1-} and the reduced form [ML]^{2-}. In the case of the Ni complexes, the formal oxidation state of Ni remains Ni^{2+} and each mnt ligand carries a 2- charge in [Ni(mnt)_2]^{2-}, but only one of the ligands is formally oxidized in [Ni(mnt)_2]^{1-}. In contrast, in the case of the Cu complexes, the mnt remains 2- in both complexes, but the formal oxidation states of the metal are Cu^{2+} and Cu^{3+}. Bond characterizations and d-orbital populations will be presented, and the complementary results of XAS, XRD and DFT calculations will be discussed. The conclusion on the redox reactions in these complexes can be firmly established.

  12. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During the runtime, we can check the behavior of the WSN accordingly to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  13. Training the Sales Neophyte

    ERIC Educational Resources Information Center

    Harris, Clyde E., Jr.

    1975-01-01

    The article reappraises initial sales training and presents a program emphasizing objectives, responsibility for training, program content, and teaching techniques. Formal Initial Responsive Sales Training System (FIRSTS) is the name of the program explored and evaluated. (Author/MW)

  14. Weaving a Formal Methods Education with Problem-Based Learning

    NASA Astrophysics Data System (ADS)

    Gibson, J. Paul

    The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation: how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.

  15. Security Modeling and Correctness Proof Using Specware and Isabelle

    DTIC Science & Technology

    2008-12-01

    Subject terms: formal method, theorem proving. ...although the actual proving requires substantial knowledge and experience in logical calculus. "...is a formal language and provides tools for proving those formulas in a logical calculus" [5]. We are demonstrating in this thesis that a specification in

  16. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED), a special type of 'man-made' system designed to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  17. Success Stories on Non-Formal Adult Education and Training for Self-Employment in Micro-Enterprises in South Africa

    ERIC Educational Resources Information Center

    Mayombe, Celestin

    2017-01-01

    Purpose: The purpose of this paper is to investigate the way the adult non-formal education and training (NFET) centres motivated and empowered graduates to start their own micro-enterprises as individuals or as a group. The specific objectives are as follows: to find out the transforming factors fostering the utilisation of acquired skills into…

  18. Comparing the PPAT Drawings of Boys with AD/HD and Age-Matched Controls Using the Formal Elements Art Therapy Scale.

    ERIC Educational Resources Information Center

    Munley, Maripat

    2002-01-01

    Explores whether children with AD/HD respond differently to a specific art directive. Using the Formal Elements Art Therapy Scale to evaluate the drawings, results indicate three elements that would most accurately classify the artists into the AD/HD group: color prominence, details of objects and environments, and line quality. (Contains 29…

  19. META-GLARE: A meta-system for defining your own computer interpretable guideline system-Architecture and acquisition.

    PubMed

    Bottrighi, Alessio; Terenziani, Paolo

    2016-09-01

    Several different computer-assisted management systems for computer interpretable guidelines (CIGs) have been developed by the Artificial Intelligence in Medicine community. Each CIG system is characterized by a specific formalism to represent CIGs, and usually provides a manager to acquire, consult and execute them. Though there are several commonalities between most formalisms in the literature, each formalism has its own peculiarities. The goal of our work is to provide flexible support for the extension or definition of CIG formalisms and of their acquisition and execution engines. Instead of defining "yet another CIG formalism and its manager", we propose META-GLARE (META Guideline Acquisition, Representation, and Execution), a "meta"-system for defining new CIG systems. We try to capture the commonalities among current CIG approaches by providing (i) a general manager for the acquisition, consultation and execution of hierarchical graphs (representing the control flow of actions in CIGs), parameterized over the types of nodes and arcs constituting them, and (ii) a library of different elementary components of guideline nodes (actions) and arcs, in which each type definition specifies how objects of that type can be acquired, consulted and executed. We provide generality and flexibility by allowing free aggregation of such elementary components to define new primitive node and arc types. We have carried out several experiments in which we used META-GLARE to build a CIG system (Experiment 1 in Section 8) or to extend one (Experiments 2 and 3). These experiments show that META-GLARE provides useful and easy-to-use support for such tasks; for instance, re-building the Guideline Acquisition, Representation, and Execution (GLARE) system using META-GLARE required less than one day (Experiment 1). META-GLARE is a meta-system for CIGs supporting fast prototyping: since META-GLARE provides acquisition and execution engines that are parametric over the specific CIG formalism, it supports easy update and construction of CIG systems. Copyright © 2016 Elsevier B.V. All rights reserved.
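    The parameterization idea can be sketched as a type registry whose entries supply their own acquisition and execution behavior, leaving the engine itself generic; all names below are hypothetical, not META-GLARE's actual API.

    ```python
    # Sketch of engines parametric over node types: each registered type
    # contributes acquire/execute handlers; the engine stays type-agnostic.
    # Hypothetical names, illustration only.

    class NodeType:
        registry = {}

        def __init__(self, name, acquire, execute):
            self.name, self.acquire, self.execute = name, acquire, execute
            NodeType.registry[name] = self

    NodeType("query",
             acquire=lambda spec: {"ask": spec},
             execute=lambda node, ctx: print("asked", node["ask"], "->",
                                             ctx[node["ask"]]))
    NodeType("action",
             acquire=lambda spec: {"do": spec},
             execute=lambda node, ctx: print("performing:", node["do"]))

    def acquire(type_name, spec):
        return (type_name, NodeType.registry[type_name].acquire(spec))

    def execute(plan, ctx):
        for type_name, node in plan:        # generic engine: no type-specific code
            NodeType.registry[type_name].execute(node, ctx)

    plan = [acquire("query", "temperature"), acquire("action", "give antibiotic")]
    execute(plan, {"temperature": 38.5})
    ```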

  20. Design verification of SIFT

    NASA Technical Reports Server (NTRS)

    Moser, Louise; Melliar-Smith, Michael; Schwartz, Richard

    1987-01-01

    A SIFT reliable aircraft control computer system, designed to meet the ultrahigh reliability required for safety critical flight control applications by use of processor replications and voting, was constructed for SRI, and delivered to NASA Langley for evaluation in the AIRLAB. To increase confidence in the reliability projections for SIFT, produced by a Markov reliability model, SRI constructed a formal specification, defining the meaning of reliability in the context of flight control. A further series of specifications defined, in increasing detail, the design of SIFT down to pre- and post-conditions on Pascal code procedures. Mechanically checked mathematical proofs were constructed to demonstrate that the more detailed design specifications for SIFT do indeed imply the formal reliability requirement. An additional specification defined some of the assumptions made about SIFT by the Markov model, and further proofs were constructed to show that these assumptions, as expressed by that specification, did indeed follow from the more detailed design specifications for SIFT. This report provides an outline of the methodology used for this hierarchical specification and proof, and describes the various specifications and proofs performed.

  1. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    PubMed

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines in order to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs), and a semantic web approach. The FCM technique is capable of dealing with situations that include uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for the modeling and knowledge integration of clinical practice guidelines. Semantic web tools were used to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined for the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings: 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and gives a front-end decision on antibiotic suggestion for cystitis. Concluding, modeling medical knowledge and therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
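    A minimal sketch of FCM inference, assuming a sigmoid squashing function and an invented three-concept map (not the paper's 47-concept UTI model):

    ```python
    # Fuzzy Cognitive Map inference sketch: concept activations in [0, 1],
    # each step feeds the weighted sum of causes through a squashing function
    # until the map stabilizes. Weights and concepts are invented.
    import numpy as np

    def fcm_run(W, a, steps=50, tol=1e-6):
        f = lambda x: 1.0 / (1.0 + np.exp(-x))      # sigmoid threshold
        for _ in range(steps):
            a_next = f(W.T @ a)                     # W[i, j]: influence of i on j
            if np.max(np.abs(a_next - a)) < tol:
                break
            a = a_next
        return a

    # Three concepts: [infection, fever, antibiotic-recommended]
    W = np.array([[0.0,  0.8, 0.9],
                  [0.0,  0.0, 0.4],
                  [0.0, -0.6, 0.0]])   # treatment suppresses fever
    a0 = np.array([1.0, 0.0, 0.0])
    print(fcm_run(W, a0))
    ```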

  2. The Development of Program for Enhancing Learning Management Competency of Teachers in Non-Formal and Informal Education Centers

    ERIC Educational Resources Information Center

    Jutasong, Chanokpon; Sirisuthi, Chaiyut; Phusri-on, Songsak

    2016-01-01

    The objectives of this research are: 1) to study factors and indicators, 2) to study current situations, desirable situations and techniques, 3) to develop the Program, and 4) to study the effect of Program. It comprised 4 phases: (1) studying the factors and indicators; (2) studying the current situations, desirable situations and techniques; (3)…

  3. Partnering

    DTIC Science & Technology

    1991-12-01

    pamphlet is one in a series of pamphlets describing applications of Alternative Dispute Resolution (ADR). The pamphlet is part of a Corps program to...stages, or settle them prior to formal litigation. ADR is a new field, and additional techniques are being developed all the time. These pamphlets are a means of providing Corps managers with examples of how other managers have employed ADR techniques. The information in this pamphlet is designed to

  4. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, along with the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
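    The underlying Green-Kubo relation, in one common convention (definitions and normalizations of the heat flux J vary across the literature), is:

    ```latex
    % Green-Kubo relation for the lattice thermal conductivity; J is the total
    % heat current of the simulation cell, and the factor 1/3 averages over
    % directions in an isotropic solid.
    \[
      \kappa = \frac{1}{3\,V\,k_B\,T^{2}}
      \int_{0}^{\infty} \left\langle \mathbf{J}(t)\cdot\mathbf{J}(0) \right\rangle dt .
    \]
    % The slow decay of this autocorrelation is what makes brute-force
    % convergence in ab initio MD so costly.
    ```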

  5. Coaching Family Caregivers to Become Better Problem Solvers When Caring for Persons with Advanced Cancer.

    PubMed

    Dionne-Odom, J Nicholas; Lyons, Kathleen D; Akyar, Imatullah; Bakitas, Marie A

    2016-01-01

    Family caregivers of persons with advanced cancer often take on responsibilities that present daunting and complex problems. Serious problems that go unresolved may be burdensome and result in negative outcomes for caregivers' psychological and physical health and affect the quality of care delivered to the care recipients with cancer, especially at the end of life. Formal problem-solving training approaches have been developed over the past several decades to assist individuals with managing problems faced in daily life. Several of these problem-solving principles and techniques were incorporated into ENABLE (Educate, Nurture, Advise, Before Life End), an "early" palliative care telehealth intervention for individuals diagnosed with advanced cancer and their family caregivers. A hypothetical case resembling the situations of actual caregiver participants in ENABLE that exemplifies the complex problems that caregivers face is presented, followed by presentation of an overview of ENABLE's problem-solving key principles, techniques, and steps in problem-solving support. Though more research is needed to formally test the use of problem-solving support in social work practice, social workers can easily incorporate these techniques into everyday practice.

  6. On describing human white matter anatomy: the white matter query language.

    PubMed

    Wassermann, Demian; Makris, Nikos; Rathi, Yogesh; Shenton, Martha; Kikinis, Ron; Kubicki, Marek; Westin, Carl-Fredrik

    2013-01-01

    The main contribution of this work is the careful syntactical definition of major white matter tracts in the human brain based on a neuroanatomist's expert knowledge. We present a technique to formally describe white matter tracts and to automatically extract them from diffusion MRI data. The framework is based on a novel query language with a near-to-English textual syntax. This query language allows us to construct a dictionary of anatomical definitions describing white matter tracts. The definitions include adjacent gray and white matter regions, and rules for spatial relations. This enables automated coherent labeling of white matter anatomy across subjects. We use our method to encode anatomical knowledge in human white matter describing 10 association and 8 projection tracts per hemisphere and 7 commissural tracts. The technique is shown to be comparable in accuracy to manual labeling. We present results applying this framework to create a white matter atlas from 77 healthy subjects, and we use this atlas in a proof-of-concept study to detect tract changes specific to schizophrenia.

  7. Noise Suppression Methods for Robust Speech Processing

    DTIC Science & Technology

    1981-04-01

    1]. Techniques available for voice processor modification to account for noise contamination are being developed [4]. Preprocessor noise reduction...analysis window function. Principles governing discrete implementation of the transform pair are discussed, and relationships are formalized which specify

  8. An intermediate level of abstraction for computational systems chemistry.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2017-12-28

    Computational techniques are required for narrowing down the vast space of possibilities to plausible prebiotic scenarios, because precise information on the molecular composition, the dominant reaction chemistry and the conditions for that era are scarce. The exploration of large chemical reaction networks is a central aspect in this endeavour. While quantum chemical methods can accurately predict the structures and reactivities of small molecules, they are not efficient enough to cope with large-scale reaction systems. The formalization of chemical reactions as graph grammars provides a generative system, well grounded in category theory, at the right level of abstraction for the analysis of large and complex reaction networks. An extension of the basic formalism into the realm of integer hyperflows allows for the identification of complex reaction patterns, such as autocatalysis, in large reaction networks using optimization techniques.This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  9. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
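    What a model checker does with such a rule-based model can be pictured as exhaustive exploration of the reachable states against an invariant; the toy controller below is invented, and the actual verification in the approach uses nuXmv.

    ```python
    # Explicit-state invariant checking over a tiny transition system.
    # Toy stand-in for what nuXmv does symbolically; illustration only.

    def check_invariant(initial, transitions, invariant):
        seen, frontier = set(), [initial]
        while frontier:
            state = frontier.pop()
            if state in seen:
                continue
            seen.add(state)
            if not invariant(state):
                return False, state          # counterexample state
            frontier.extend(transitions(state))
        return True, None

    # States: (valve_open, pump_on). Rule: the pump may only run with valve open.
    def transitions(s):
        valve, pump = s
        succs = [(not valve, pump and not valve)]  # toggling the valve shut stops the pump
        if valve:
            succs.append((valve, not pump))        # pump toggles only while valve is open
        return succs

    ok, bad = check_invariant((False, False), transitions,
                              invariant=lambda s: s[0] or not s[1])  # pump => valve
    print("invariant holds" if ok else f"violated in {bad}")
    ```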

  10. Standardized terminology for clinical trial protocols based on top-level ontological categories.

    PubMed

    Heller, B; Herre, H; Lippoldt, K; Loeffler, M

    2004-01-01

    This paper describes a new method for the ontologically based standardization of concepts with regard to the quality assurance of clinical trial protocols. We developed a data dictionary for medical and trial-specific terms in which concepts and relations are defined context-dependently. The data dictionary is provided to different medical research networks by means of the software tool Onto-Builder via the internet. The data dictionary is based on domain-specific ontologies and the top-level ontology of GOL. The concepts and relations described in the data dictionary are represented in natural language, semi-formally or formally according to their use.

  11. A formal language for the specification and verification of synchronous and asynchronous circuits

    NASA Technical Reports Server (NTRS)

    Russinoff, David M.

    1993-01-01

    A formal hardware description language for the intended application of verifiable asynchronous communication is described. The language is developed within the logical framework of the Nqthm system of Boyer and Moore and is based on the event-driven behavioral model of VHDL, including the basic VHDL signal propagation mechanisms, the notion of simulation deltas, and the VHDL simulation cycle. A core subset of the language corresponds closely with a subset of VHDL and is adequate for the realistic gate-level modeling of both combinational and sequential circuits. Various extensions to this subset provide means for convenient expression of behavioral circuit specifications.
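    The event-driven simulation cycle being formalized can be sketched in a few lines: processes propose new signal values, all updates commit together between delta cycles, and the cycle repeats until quiescence. This is a simplified model under our own assumptions; real VHDL adds scheduled times, drivers, and resolution.

    ```python
    # Minimal delta-cycle simulator in the VHDL style: signals update between
    # deltas, and processes re-run until no signal changes. Illustration only.

    def simulate(signals, processes, max_deltas=100):
        for delta in range(max_deltas):
            # Each process proposes new values based on the *current* signals.
            proposed = {}
            for proc in processes:
                proposed.update(proc(signals))
            changed = {k: v for k, v in proposed.items() if signals.get(k) != v}
            if not changed:
                return signals, delta          # quiescent: the cycle ends
            signals.update(changed)            # all updates commit together
        raise RuntimeError("no quiescence (zero-delay oscillation?)")

    # Two gates: c = not a; d = a and c. Combinational settling via deltas.
    not_gate = lambda s: {"c": not s["a"]}
    and_gate = lambda s: {"d": s["a"] and s["c"]}
    print(simulate({"a": True, "c": True, "d": False}, [not_gate, and_gate]))
    ```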

  12. MatLab Programming for Engineers Having No Formal Programming Knowledge

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who have no formal programming training and little time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. Also stated are the current limitations of MatLab, which possibly can be addressed by Mathworks Inc. in a future version to make MatLab more versatile.

  13. Formal Semantics and Implementation of BPMN 2.0 Inclusive Gateways

    NASA Astrophysics Data System (ADS)

    Christiansen, David Raymond; Carbone, Marco; Hildebrandt, Thomas

    We present the first direct formalization of the semantics of inclusive gateways as described in the Business Process Modeling Notation (BPMN) 2.0 Beta 1 specification. The formal semantics is given for a minimal subset of BPMN 2.0 containing just the inclusive and exclusive gateways and the start and stop events. By focusing on this subset we achieve a simple graph model that highlights the particular non-local features of the inclusive gateway semantics. We sketch two ways of implementing the semantics using algorithms based on incrementally updated data structures and also discuss distributed communication-based implementations of the two algorithms.
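    The non-local feature can be made concrete: an inclusive join may fire only when some incoming edge holds a token and no outstanding token elsewhere can still reach an empty incoming edge. Below is a simplified sketch under our own assumptions (loop-free graph, plain reachability), not the paper's full semantics.

    ```python
    # Simplified enabling rule for a BPMN inclusive join. Graph model invented.

    def reachable(graph, start):
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(graph.get(node, []))
        return seen

    def inclusive_join_enabled(graph, incoming, tokens):
        marked = [e for e in incoming if e in tokens]
        empty  = [e for e in incoming if e not in tokens]
        if not marked:
            return False
        other_tokens = [t for t in tokens if t not in incoming]
        # Non-local check: could any outstanding token still arrive on an
        # empty incoming edge?
        return not any(e in reachable(graph, t)
                       for t in other_tokens for e in empty)

    # Diamond: start -> e1, e2 -> join. A token on e1, another still at "start".
    graph = {"start": ["e1", "e2"]}
    print(inclusive_join_enabled(graph, ["e1", "e2"], tokens={"e1", "start"}))  # False
    print(inclusive_join_enabled(graph, ["e1", "e2"], tokens={"e1"}))           # True
    ```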

  14. Guideline validation in multiple trauma care through business process modeling.

    PubMed

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available paper-based as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based version into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected across seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools that check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or providers is necessary. Clinical guidelines could then also be used for eLearning, process optimization and workflow management.

  15. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  16. Electron transport in ultra-thin films and ballistic electron emission microscopy

    NASA Astrophysics Data System (ADS)

    Claveau, Y.; Di Matteo, S.; de Andres, P. L.; Flores, F.

    2017-03-01

    We have developed a calculation scheme for the elastic electron current in ultra-thin epitaxial heterostructures. Our model uses Keldysh's non-equilibrium Green's function formalism and a layer-by-layer construction of the epitaxial film. Such an approach is appropriate for describing the current in a ballistic electron emission microscope (BEEM), where the metal base layer is ultra-thin, and generalizes a previous one based on a decimation technique appropriate for thick slabs. This formalism allows a full quantum mechanical description of the transmission across the epitaxial heterostructure interface, including multiple scattering via the Dyson equation, which is deemed a crucial ingredient for properly describing interfaces of ultra-thin layers in the future. We introduce the theoretical formulation needed for ultra-thin layers and compare with results obtained for thick Au(1 1 1) metal layers. An interesting effect takes place for a width of about ten layers: a BEEM current can propagate via the center of the reciprocal space (\overline{Γ}) along the Au(1 1 1) direction. We associate this current with a coherent interference finite-width effect that cannot be found using a decimation technique. Finally, we have tested the validity of the handy semiclassical formalism to describe the BEEM current.
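    For context, the generic NEGF expressions for elastic transport (Caroli/Landauer form, in our notation; the paper's layer-resolved formalism specializes the Green's function and the couplings) are:

    ```latex
    % Generic NEGF elastic-transport expressions; G^{r,a} are the retarded and
    % advanced Green's functions of the film, Gamma_{L,R} the tip and
    % collector couplings, f_{L,R} the contact Fermi functions.
    \[
      T(E) = \mathrm{Tr}\!\left[ \Gamma_L(E)\, G^{r}(E)\, \Gamma_R(E)\, G^{a}(E) \right],
    \]
    \[
      I = \frac{2e}{h} \int dE\; T(E)\,\left[ f_L(E) - f_R(E) \right].
    \]
    ```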

  17. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^{-4} or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability, and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.

  18. Causal tapestries for psychology and physics.

    PubMed

    Sulis, William H

    2012-04-01

    Archetypal dynamics is a formal approach to the modeling of information flow in complex systems used to study emergence. It is grounded in the Fundamental Triad of realisation (system), interpretation (archetype) and representation (formal model). Tapestries play a fundamental role in the framework of archetypal dynamics as a formal representational system. They represent information flow by means of multi layered, recursive, interlinked graphical structures that express both geometry (form or sign) and logic (semantics). This paper presents a detailed mathematical description of a specific tapestry model, the causal tapestry, selected for use in describing behaving systems such as appear in psychology and physics from the standpoint of Process Theory. Causal tapestries express an explicit Lorentz invariant transient now generated by means of a reality game. Observables are represented by tapestry informons while subjective or hidden components (for example intellectual and emotional processes) are incorporated into the reality game that determines the tapestry dynamics. As a specific example, we formulate a random graphical dynamical system using causal tapestries.

  19. An ontology of scientific experiments

    PubMed Central

    Soldatova, Larisa N; King, Ross D

    2006-01-01

    The formal description of experiments for efficient analysis, annotation and sharing of results is a fundamental part of the practice of science. Ontologies are required to achieve this objective. A few subject-specific ontologies of experiments currently exist. However, despite the unity of scientific experimentation, no general ontology of experiments exists. We propose the ontology EXPO to meet this need. EXPO links SUMO (the Suggested Upper Merged Ontology) with subject-specific ontologies of experiments by formalizing the generic concepts of experimental design, methodology and results representation. EXPO is expressed in the W3C standard ontology language OWL-DL. We demonstrate the utility of EXPO and its ability to describe different experimental domains by applying it to two experiments: one in high-energy physics and the other in phylogenetics. The use of EXPO made the goals and structure of these experiments more explicit, revealed ambiguities, and highlighted an unexpected similarity. We conclude that EXPO is of general value in describing experiments and a step towards the formalization of science. PMID:17015305

  20. Waste minimization/pollution prevention study of high-priority waste streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogle, R.B.

    1994-03-01

    Although waste minimization has been practiced by the Metals and Ceramics (M&C) Division in the past, the effort has not been uniform or formalized. To establish the groundwork for continuous improvement, the Division Director initiated a more formalized waste minimization and pollution prevention program. Formalization of the division's pollution prevention efforts in fiscal year (FY) 1993 was initiated by a more concerted effort to determine the status of waste generation from division activities. The goal for this effort was to reduce or minimize the wastes identified as having the greatest impact on human health, the environment, and costs. Two broad categories of division wastes were identified: solid/liquid wastes and those relating to energy use (primarily electricity and steam). This report presents information on the nonradioactive solid and liquid wastes generated by division activities. More specifically, the information presented was generated by teams of M&C staff members empowered by the Division Director to study specific waste streams.

  1. Don't abandon hope all ye who enter here: The protective role of formal mentoring and learning processes on burnout in correctional officers.

    PubMed

    Farnese, M L; Barbieri, B; Bellò, B; Bartone, P T

    2017-01-01

    Within a Job Demands-Resources Model framework, formal mentoring can be conceived as a job resource expressing the organization's support for new members, which may prevent their being at risk for burnout. This research aims at understanding the protective role of formal mentoring on burnout, through the effect of increasing learning personal resources. Specifically, we hypothesized that formal mentoring enhances newcomers' learning about job and social domains related to the new work context, thus leading to lower burnout. In order to test the hypotheses, a multiple regression analysis using the bootstrapping method was used. Based on a questionnaire administered to 117 correctional officer newcomers who had a formal mentor assigned, our results confirm that formal mentoring exerts a positive influence on newcomers' adjustment, and that this in turn exerts a protective influence against burnout onset by reducing cynicism and interpersonal stress and also enhancing the sense of personal accomplishment. Confirming previous literature's suggestions, supportive mentoring and effective socialization seem to represent job and personal resources that are protective against burnout. This study provides empirical support for this relation in the prison context.

  2. The cost of teaching an intern in New South Wales.

    PubMed

    Oates, R Kim; Goulston, Kerry J; Bingham, Craig M; Dent, Owen F

    2014-02-03

    To determine the cost of formal and informal teaching specifically provided for interns and to determine how much of an intern's time is spent in these activities. Costs of formal teaching for 2012 were obtained from the New South Wales Health Education and Training Institute (HETI) and costs of informal teaching by a survey of all interns in a random sample of prevocational networks. The cost of formal intern education provided by HETI; the number of hours of formal teaching provided to interns in hospital; intern estimates of the amount of non-timetabled teaching received in a typical week. The cost of formal teaching was $11 892 per intern per year and the cost of informal teaching was $2965 per intern per year (survey response rate, 63%) - a total of $14 857. Interns spent 2 hours per week in formal teaching and 28 minutes per week in informal teaching, representing 6.2% of a 40-hour week. The time of professionals paid by NSW Health represents most of the expenditure on teaching interns. An increase in time spent on intern teaching beyond the current 6.2% of an intern's 40-hour week would be an investment in better health care.
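
    The reported figures can be reproduced directly; the snippet below simply restates the arithmetic from the abstract.

    ```python
    # Quick check of the cost and time figures reported above.
    formal_cost, informal_cost = 11_892, 2_965
    total = formal_cost + informal_cost            # $14,857 per intern per year
    formal_h, informal_h = 2.0, 28 / 60            # hours per week
    share = (formal_h + informal_h) / 40 * 100     # share of a 40-hour week
    print(total, round(share, 1))                  # 14857, 6.2
    ```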

  3. Educational Specifications: Linking Design of School Facilities to Educational Program.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Div. of School Facilities Planning.

    The California Department of Education, directed to formalize regulations governing standards for new school design and construction, has prepared a guide to help school districts develop specifications based on the architectural principle that form follows function. This guide discusses the meaning of educational specifications and their…

  4. What Sensing Tells Us: Towards a Formal Theory of Testing for Dynamical Systems

    NASA Technical Reports Server (NTRS)

    McIlraith, Sheila; Scherl, Richard

    2005-01-01

    Just as actions can have indirect effects on the state of the world, so too can sensing actions have indirect effects on an agent's state of knowledge. In this paper, we investigate "what sensing actions tell us", i.e., what an agent comes to know indirectly from the outcome of a sensing action, given knowledge of its actions and state constraints that hold in the world. To this end, we propose a formalization of the notion of testing within a dialect of the situation calculus that includes knowledge and sensing actions. Realizing this formalization requires addressing the ramification problem for sensing actions. We formalize simple tests as sensing actions. Complex tests are expressed in the logic programming language Golog. We examine what it means to perform a test, and how the outcome of a test affects an agent's state of knowledge. Finally, we propose automated reasoning techniques for test generation and complex-test verification, under certain restrictions. The work presented in this paper is relevant to a number of application domains including diagnostic problem solving, natural language understanding, plan recognition, and active vision.
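
    The idea that sensing yields indirect knowledge via state constraints can be sketched with a toy possible-worlds model. This is an illustration only, not the paper's situation-calculus formalization; the fluent names are invented.

    ```python
    # Toy sketch: a sensing outcome plus a state constraint yields
    # indirect knowledge of a fluent that was never directly sensed.
    from itertools import product

    # Worlds assign truth values to two fluents (names invented).
    worlds = [dict(zip(("light_on", "power_ok"), v))
              for v in product([True, False], repeat=2)]

    # State constraint holding in the world: light_on -> power_ok.
    constraint = lambda w: (not w["light_on"]) or w["power_ok"]
    worlds = [w for w in worlds if constraint(w)]

    # The agent senses that the light is on; only consistent worlds remain.
    epistemic = [w for w in worlds if w["light_on"]]

    # Indirect effect of sensing: power_ok is now known in all worlds.
    print(all(w["power_ok"] for w in epistemic))   # True
    ```

    After the sensing action, every remaining epistemically possible world satisfies power_ok, so the agent knows it indirectly, which is exactly the ramification phenomenon the paper formalizes.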

  5. IU Progress Report January 2013

    DTIC Science & Technology

    2013-01-01

    ...reconstruct the context of a given meme/conversation, 3. formalizing in an operational sense the definition of a campaign and its associated features, 4. collecting relevant... specific memes (keywords and hashtags) via the Twitter “search and tracking API”, to observe the...

  6. Discrete mathematics, formal methods, the Z schema and the software life cycle

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.
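
    For readers unfamiliar with the notation, a classic Z schema (Spivey's birthday-book example, not taken from this report) shows how a state space and its invariant are written. The LaTeX below assumes the zed-csp package.

    ```latex
    % A standard Z schema example (Spivey's birthday book): a state space
    % with an invariant tying the two components together.
    \documentclass{article}
    \usepackage{zed-csp}
    \begin{document}
    \begin{zed}
      [NAME, DATE]
    \end{zed}
    \begin{schema}{BirthdayBook}
      known : \power NAME \\
      birthday : NAME \pfun DATE
    \where
      known = \dom birthday
    \end{schema}
    \end{document}
    ```

    The same schema style is what would capture the "precise definition of system and component interfaces" the abstract refers to.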

  7. Migration and Validation of Non-Formal and Informal Learning in Europe: Inclusion, Exclusion or Polarisation in the Recognition of Skills?

    ERIC Educational Resources Information Center

    Souto-Otero, Manuel; Villalba-Garcia, Ernesto

    2015-01-01

    This article explores (1) the degree to which immigrants can be considered dominant groups in the area of validation of non-formal and informal learning and are subject to specific validation measures in 33 European countries; (2) whether country clusters can be identified within Europe with regard to the dominance of immigrants in the area of…

  8. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  9. Modeling and system design for the LOFAR station digital processing

    NASA Astrophysics Data System (ADS)

    Alliot, Sylvain; van Veelen, Martijn

    2004-09-01

    In the context of the LOFAR preliminary design phase, and in particular for the specification of the Station Digital Processing (SDP), a performance/cost model of the system was used. We present here the framework and the trajectory followed in this phase when going from requirements to specification. In the phased array antenna concepts for the next generation of radio telescopes (LOFAR, ATA, SKA), signal processing (multi-beaming and RFI mitigation) replaces the large antenna dishes. The embedded systems for these telescopes are major infrastructure cost items. Moreover, the flexibility and overall performance of the instrument depend greatly on them, so alternative solutions need to be investigated. In particular, the technology and the various data transport selections play a fundamental role in the optimization of the architecture. We proposed a formal method [1] of exploring these alternatives that was followed during the SDP developments. Different scenarios were compared for the specification of the application (selection of the algorithms as well as detailed signal processing techniques) and for the specification of the system architecture (selection of high-level topologies, platforms and components). This gave us insight into the possible trade-offs in the application and architecture domains, and was successful in providing a firm basis for the design choices that are demanded by technical review committees.

  10. High-order cyclo-difference techniques: An alternative to finite differences

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Otto, John C.

    1993-01-01

    The summation-by-parts energy norm is used to establish a new class of high-order finite-difference techniques referred to here as 'cyclo-difference' techniques. These techniques are constructed cyclically from stable subelements, and require no numerical boundary conditions; when coupled with the simultaneous approximation term (SAT) boundary treatment, they are time asymptotically stable for an arbitrary hyperbolic system. These techniques are similar to spectral element techniques and are ideally suited for parallel implementation, but do not require special collocation points or orthogonal basis functions. The principal focus is on methods of sixth-order formal accuracy or less; however, these methods could be extended in principle to any arbitrary order of accuracy.
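
    The summation-by-parts property itself is easy to check numerically. Below is a sketch of the classical second-order diagonal-norm SBP first-derivative operator D = H⁻¹Q with Q + Qᵀ = B (a textbook construction used to illustrate the energy-norm idea, not the cyclo-difference operators introduced in the paper).

    ```python
    # Sketch: second-order SBP first-derivative operator and a numerical
    # verification of the summation-by-parts property Q + Q^T = B.
    import numpy as np

    n, h = 8, 1.0
    Q = np.zeros((n, n))
    for i in range(n - 1):
        Q[i, i + 1], Q[i + 1, i] = 0.5, -0.5      # skew-symmetric interior
    Q[0, 0], Q[-1, -1] = -0.5, 0.5                # boundary closure

    H = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])   # diagonal norm matrix
    D = np.linalg.solve(H, Q)                           # D = H^{-1} Q

    B = np.zeros((n, n))
    B[0, 0], B[-1, -1] = -1.0, 1.0
    print(np.allclose(Q + Q.T, B))   # True: SBP property holds
    print(D[3])                      # interior row is the central difference
    ```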

  11. A Microworld Approach to the Formalization of Musical Knowledge.

    ERIC Educational Resources Information Center

    Honing, Henkjan

    1993-01-01

    Discusses the importance of applying computational modeling and artificial intelligence techniques to music cognition and computer music research. Recommends three uses of microworlds to trim computational theories to their bare minimum, allowing for better and easier comparison. (CFR)

  12. International NMR-based Environmental Metabolomics Intercomparison Exercise

    EPA Science Inventory

    Several fundamental requirements must be met so that NMR-based metabolomics and the related technique of metabonomics can be formally adopted into environmental monitoring and chemical risk assessment. Here we report an intercomparison exercise which has evaluated the effectivene...

  13. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions... two value engineering approaches: (1) The first is an incentive approach in which contractor...

  14. On acquisition of programming knowledge

    NASA Technical Reports Server (NTRS)

    Amin, Ashok T.

    1987-01-01

    For the evolving discipline of programming, acquisition of programming knowledge is a difficult issue. Common knowledge results from the acceptance of proven techniques based on results of formal inquiries into the nature of the programming process. This is a rather slow process. In addition, the vast body of common knowledge needs to be explicated to a low enough level of detail for it to be represented in machine-processable form. It is felt that this is an impediment to the progress of automatic programming. The importance of formal approaches cannot be overstated, since their contributions lead to quantum leaps in the state of the art.

  15. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the application of the process in experienced aerostructural designs.

  16. Perturbation theory of nuclear matter with a microscopic effective interaction

    DOE PAGES

    Benhar, Omar; Lovato, Alessandro

    2017-11-01

    Here, an updated and improved version of the effective interaction based on the Argonne-Urbana nuclear Hamiltonian, derived using the formalism of correlated basis functions and the cluster expansion technique, is employed to obtain a number of properties of cold nuclear matter at arbitrary neutron excess within the formalism of many-body perturbation theory. The numerical results, including the ground-state energy per nucleon, the symmetry energy, the pressure, the compressibility, and the single-particle spectrum, are discussed in the context of the available empirical information, obtained from measured nuclear properties and heavy-ion collisions.
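
    For orientation, the quantities listed are tied together by the standard quadratic expansion of the energy per nucleon in the neutron excess δ = (N − Z)/A. This is general nuclear-physics background, not a formula quoted from the paper.

    ```latex
    % Standard parametrization of asymmetric nuclear matter:
    \[
      \frac{E}{A}(\rho, \delta) \simeq \frac{E}{A}(\rho, 0)
        + E_{\mathrm{sym}}(\rho)\,\delta^{2},
      \qquad \delta = \frac{N - Z}{A},
    \]
    % where E_sym is the symmetry energy; pressure and compressibility
    % then follow as density derivatives of E/A.
    ```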

  17. Perspectives on knowledge in engineering design

    NASA Technical Reports Server (NTRS)

    Rasdorf, W. J.

    1985-01-01

    Various perspectives are given of the knowledge currently used in engineering design, specifically dealing with knowledge-based expert systems (KBES). Constructing an expert system often reveals inconsistencies in domain knowledge while formalizing it. The types of domain knowledge (facts, procedures, judgments, and control) differ from the classes of that knowledge (creative, innovative, and routine). The feasible tasks for expert systems can be determined based on these types and classes of knowledge. Interpretive tasks require reasoning about a task in light of the knowledge available, whereas generative tasks create potential solutions to be tested against constraints. Only after classifying the domain by type and level can the engineer select a knowledge-engineering tool for the domain being considered. The critical features to be weighed after classification are knowledge representation techniques, control strategies, interface requirements, compatibility with traditional systems, and economic considerations.

  18. Formal specification and design techniques for wireless sensor and actuator networks.

    PubMed

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to communicate the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported on a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.
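
    To give a flavor of the Petri-net machinery involved, the sketch below plays the token game for a two-transition send/acknowledge loop. It is an uncolored toy with invented place and transition names, not the paper's CPN component models.

    ```python
    # Minimal Petri-net token-game sketch (hypothetical send/ack exchange).
    incidence_in = {                 # tokens each transition consumes
        "send": {"ready": 1},
        "ack":  {"sent": 1},
    }
    incidence_out = {                # tokens each transition produces
        "send": {"sent": 1},
        "ack":  {"ready": 1},
    }
    marking = {"ready": 1, "sent": 0}

    def enabled(t):
        return all(marking[p] >= k for p, k in incidence_in[t].items())

    def fire(t):
        assert enabled(t), f"{t} not enabled"
        for p, k in incidence_in[t].items():
            marking[p] -= k
        for p, k in incidence_out[t].items():
            marking[p] += k

    for t in ("send", "ack", "send"):
        fire(t)
        print(t, marking)
    ```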

  19. Towards the Verification of Human-Robot Teams

    NASA Technical Reports Server (NTRS)

    Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.

    2005-01-01

    Human-Agent collaboration is increasingly important. Not only do high-profile activities such as NASA missions to Mars intend to employ such teams, but our everyday activities involving interaction with computational devices fall into this category. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.

  1. Formalizing the definition of meta-analysis in Molecular Ecology.

    PubMed

    ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E

    2015-08-01

    Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
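
    To make the statistical core concrete, here is a minimal sketch of fixed-effect inverse-variance pooling, one of the fundamental statistics the review checks for. The effect sizes and variances are fabricated for illustration.

    ```python
    # Sketch: inverse-variance (fixed-effect) meta-analytic pooling
    # over hypothetical study-level effect sizes.
    import numpy as np

    effects = np.array([0.12, 0.25, 0.08, 0.31])     # invented effect sizes
    variances = np.array([0.02, 0.05, 0.01, 0.04])   # invented variances

    w = 1.0 / variances                  # weight each study by precision
    pooled = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))        # standard error of the pooled effect
    print(f"pooled = {pooled:.3f}, 95% CI half-width = {1.96 * se:.3f}")
    ```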

  2. Use of malaria RDTs in various health contexts across sub-Saharan Africa: a systematic review.

    PubMed

    Boyce, Matthew R; O'Meara, Wendy P

    2017-05-18

    The World Health Organization recommends parasitological confirmation of malaria prior to treatment. Malaria rapid diagnostic tests (RDTs) represent one diagnostic method that is used in a variety of contexts to overcome limitations of other diagnostic techniques. Malaria RDTs increase the availability and feasibility of accurate diagnosis and may result in improved quality of care. Though RDTs are used in a variety of contexts, no studies have compared how well or effectively RDTs are used across these contexts. This review assesses the diagnostic use of RDTs in four different contexts: health facilities, the community, drug shops and schools. A comprehensive search of the Pubmed database was conducted to evaluate RDT execution, test accuracy, or adherence to test results in sub-Saharan Africa. Original RDT- and Plasmodium falciparum-focused studies conducted in formal health care facilities, drug shops, schools, or by community health workers (CHWs) between 2000 and December 2016 were included. Studies were excluded if they were conducted exclusively in a research laboratory setting, where staff from the study team conducted RDTs, or in settings outside of sub-Saharan Africa. The literature search identified 757 reports. A total of 52 studies were included in the analysis. Overall, RDTs were performed safely and effectively by community health workers provided they received proper training. Analogous information was largely absent for formal health care workers. Tests were generally accurate across contexts, except in drug shops, where lower specificities were observed. Adherence to RDT results was higher among drug shop vendors and community health workers, while adherence was more variable among formal health care workers, most notably with negative test results. Malaria RDTs are generally used well, though compliance with test results is variable - especially in the formal health care sector. If low adherence rates are extrapolated, thousands of patients may be incorrectly diagnosed and receive inappropriate treatment, resulting in a low quality of care and unnecessary drug use. Multidisciplinary research should continue to explore determinants of good RDT use, and seek to better understand how to support and sustain the correct use of this diagnostic tool.

  3. Stochastic Game Analysis and Latency Awareness for Self-Adaptation

    DTIC Science & Technology

    2014-01-01

    In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the... Additional Key Words and Phrases: proactive adaptation, stochastic multiplayer games, latency... The contribution of this paper is twofold: (1) a novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to...

  4. Pinch technique and the Batalin-Vilkovisky formalism

    NASA Astrophysics Data System (ADS)

    Binosi, Daniele; Papavassiliou, Joannis

    2002-07-01

    In this paper we take the first step towards a nondiagrammatic formulation of the pinch technique. In particular we proceed to a systematic identification of the parts of the one-loop and two-loop Feynman diagrams that are exchanged during the pinching process in terms of unphysical ghost Green's functions; the latter appear in the standard Slavnov-Taylor identity satisfied by the tree-level and one-loop three-gluon vertex. This identification allows for the consistent generalization of the intrinsic pinch technique to two loops, through the collective treatment of entire sets of diagrams, instead of the laborious algebraic manipulation of individual graphs, and sets the stage for the generalization of the method to all orders. We show that the task of comparing the effective Green's functions obtained by the pinch technique with those computed in the background field method Feynman gauge is significantly facilitated when employing the powerful quantization framework of Batalin and Vilkovisky. This formalism allows for the derivation of a set of useful nonlinear identities, which express the background field method Green's functions in terms of the conventional (quantum) ones and auxiliary Green's functions involving the background source and the gluonic antifield; these latter Green's functions are subsequently related by means of a Schwinger-Dyson type of equation to the ghost Green's functions appearing in the aforementioned Slavnov-Taylor identity.

  5. Quantifying specific capacity and salinity variability in Amman Zarqa Basin, Central Jordan, using empirical statistical and geostatistical techniques.

    PubMed

    Shaqour, F; Taany, R; Rimawi, O; Saffarini, G

    2016-01-01

    Modeling groundwater properties is an important tool by means of which water resources management can judge whether these properties are within safe limits or not. This is usually done regularly and in the aftermath of crises that are expected to reflect negatively on groundwater properties, as occurred in Jordan due to crises in neighboring countries. In this study, the specific capacity and salinity of groundwater in the B2/A7 aquifer in the Amman Zarqa Basin were evaluated to assess the effect of the population increase that resulted from the flux of refugees from neighboring countries into this heavily populated basin after the Gulf crises of 1990 and 2003. Both properties were found to exhibit a three-parameter lognormal distribution. The empirically calculated β parameter of this distribution amounted to 0.39 m³/h/min for specific capacity and 238 ppm for salinity. This parameter is suggested to account for the global changes that took place all over the basin during the entire period of observation, and not for local changes at every well or at certain localities in the basin. It can be considered an exploratory result of data analysis. Formal and implicit evaluation followed this step using structural analysis and construction of experimental semivariograms that represent the spatial variability of both properties. The adopted semivariograms were then used to construct maps that illustrate the spatial variability of the properties under consideration using kriging interpolation techniques. The semivariograms show that specific capacity and salinity values are spatially dependent within 14,529 and 16,309 m, respectively. The specific capacity semivariogram exhibits a nugget effect at small scale (324 m), which can be attributed to heterogeneity or inadequacies in measurement. The specific capacity and salinity maps show that the major changes exhibit a northwest-southeast trend near the As-Samra Wastewater Treatment Plant. The results of this study suggest proper management practices.
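
    As a sketch of the core computation behind such an analysis, the snippet below builds an empirical semivariogram, γ(h) = half the mean squared difference over point pairs whose separation falls in each lag bin. The coordinates and salinity values are synthetic, not the study's data.

    ```python
    # Sketch: empirical semivariogram over synthetic well data.
    import numpy as np

    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 10_000, size=(60, 2))    # synthetic coordinates (m)
    z = rng.normal(1000, 200, size=60)           # synthetic salinity (ppm)

    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    sq = (z[:, None] - z[None, :]) ** 2
    upper = np.triu(np.ones_like(d, bool), 1)    # each pair counted once

    bins = np.arange(0, 8000, 1000)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (d > lo) & (d <= hi) & upper
        if mask.any():
            print(f"lag {lo:>4.0f}-{hi:<4.0f} m: gamma = {sq[mask].mean() / 2:9.1f}")
    ```

    A model semivariogram (spherical, exponential, etc.) fitted to these bin estimates is what kriging then uses for interpolation.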

  6. Estimating the cost of cervical cancer screening in five developing countries

    PubMed Central

    Goldhaber-Fiebert, Jeremy D; Goldie, Sue J

    2006-01-01

    Background Cost-effectiveness analyses (CEAs) can provide useful information to policymakers concerned with the broad allocation of resources as well as to local decision makers choosing between different options for reducing the burden from a single disease. For the latter, it is important to use country-specific data when possible and to represent cost differences between countries that might make one strategy more or less attractive than another strategy locally. As part of a CEA of cervical cancer screening in five developing countries, we supplemented limited primary cost data by developing other estimation techniques for direct medical and non-medical costs associated with alternative screening approaches using one of three initial screening tests: simple visual screening, HPV DNA testing, and cervical cytology. Here, we report estimation methods and results for three cost areas in which data were lacking. Methods To supplement direct medical costs, including staff, supplies, and equipment depreciation using country-specific data, we used alternative techniques to quantify cervical cytology and HPV DNA laboratory sample processing costs. We used a detailed quantity and price approach whose face validity was compared to an adaptation of a US laboratory estimation methodology. This methodology was also used to project annual sample processing capacities for each laboratory type. The cost of sample transport from the clinic to the laboratory was estimated using spatial models. A plausible range of the cost of patient time spent seeking and receiving screening was estimated using only formal sector employment and wages as well as using both formal and informal sector participation and country-specific minimum wages. Data sources included primary data from country-specific studies, international databases, international prices, and expert opinion. Costs were standardized to year 2000 international dollars using inflation adjustment and purchasing power parity. Results Cervical cytology laboratory processing costs were I$1.57–3.37 using the quantity and price method compared to I$1.58–3.02 from the face validation method. HPV DNA processing costs were I$6.07–6.59. Rural laboratory transport costs for cytology were I$0.12–0.64 and I$0.14–0.74 for HPV DNA laboratories. Under assumptions of lower resource efficiency, these estimates increased to I$0.42–0.83 and I$0.54–1.06. Estimates of the value of an hour of patient time using only formal sector participation were I$0.07–4.16, increasing to I$0.30–4.80 when informal and unpaid labor was also included. The value of patient time for traveling, waiting, and attending a screening visit was I$0.68–17.74. With the total cost of screening for cytology and HPV DNA testing ranging from I$4.85–40.54 and I$11.30–48.77 respectively, the cost of the laboratory transport, processing, and patient time accounted for 26–66% and 33–65% of the total costs. From a payer perspective, laboratory transport and processing accounted for 18–48% and 25–60% of total direct medical costs of I$4.11–19.96 and I$10.57–28.18 respectively. Conclusion Cost estimates of laboratory processing, sample transport, and patient time account for a significant proportion of total cervical cancer screening costs in five developing countries and provide important inputs for CEAs of alternative screening modalities. PMID:16887041

  7. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  8. How do physicians learn to provide palliative care?

    PubMed

    Schulman-Green, Dena

    2003-01-01

    Medical interns, residents, and fellows are heavily involved in caring for dying patients and interacting with their families. Due to a lack of formal medical education in the area, these house staff often have a limited knowledge of palliative care. The purpose of this study was to determine how, given inadequate formal education, house staff learn to provide palliative care. Specifically, this study sought to explore the extent to which physicians learn to provide palliative care through formal medical education, from physicians and other hospital staff, and by on-the-job learning. Twenty physicians were interviewed about their medical education and other learning experiences in palliative care. ATLAS/ti software was used for data coding and analysis. Analysis of transcripts indicated that house staff learn little to nothing through formal education, to varying degrees from attending physicians and hospital staff, and mostly on the job and by making mistakes.

  9. Relating mentor type and mentoring behaviors to academic medicine faculty satisfaction and productivity at one medical school.

    PubMed

    Shollen, S Lynn; Bland, Carole J; Center, Bruce A; Finstad, Deborah A; Taylor, Anne L

    2014-09-01

    To examine relationships among having formal and informal mentors, mentoring behaviors, and satisfaction and productivity for academic medicine faculty. In 2005, the authors surveyed full-time faculty at the University of Minnesota Medical School to assess their perceptions of variables associated with job satisfaction and productivity. This analysis focused on perceptions of mentoring as related to satisfaction with current position and productivity (articles published in peer-reviewed journals [article production] and role as a primary investigator [PI] or a co-PI on a grant/contract). Of 615 faculty, 354 (58%) responded. Satisfied faculty were not necessarily productive, and vice versa. Outcomes differed somewhat for mentor types: Informal mentoring was more important for satisfaction, and formal mentoring was more important for productivity. Regardless of mentor type, the 14 mentoring behaviors examined related more to satisfaction than productivity. Only one behavior-serves as a role model-was significantly, positively related to article production. Although participants reported that formal and informal mentors performed the same mentoring behaviors, mentees were more satisfied or productive when some behaviors were performed by formal mentors. The results emphasize the importance of having both formal and informal mentors who perform mentoring behaviors associated with satisfaction and productivity. The results provide a preliminary indication that mentor types and specific mentoring behaviors may have different effects on satisfaction and productivity. Despite the differences found for some behaviors, it seems that it is more essential that mentoring behaviors be performed by any mentor than by a specific type of mentor.

  10. Proceedings of the first switch tube advanced technology meeting held at EG&G, Salem, Massachusetts, May 23, 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuman, A.; Beavis, L.

    Early in 1990, J. A. Wilder, Supervisor of Sandia National Laboratories (SNLA), Division 2565, requested that a meeting of the scientists and engineers responsible for developing and producing switch tubes be set up to discuss in a semi-formal way the science and technology of switch tubes. Programmatic and administrative issues were specifically exempted from the discussions. L. Beavis, Division 7471, SNL, and A. Shuman, EG&G, Salem, were made responsible for organizing a program including the materials and processes of switch tubes. The purpose of the Switch Tube Advanced Technology meeting was to allow personnel from Allied Signal Kansas City Division (AS/KCD); EG&G, Salem; and Sandia National Laboratories (SNL) to discuss a variety of issues involved in the development and production of switch tubes. It was intended that the formal and informal discussions would allow a better understanding of the production problems by material and process engineers and of the materials and processes by production engineers. This program consisted of formal presentations on May 23 and informal discussions on May 24. The topics chosen for formal presentation were suggested by the people of AS/KCD, EG&G, Salem, and SNL involved with the design, development and production of switch tubes. The topics selected were generic. They were not directed to any specific switch tube but rather to all switch tubes in production and development. This document includes summaries of the material presented at the formal presentation on May 23.

  11. Improving the interface between informal carers and formal health and social services: a qualitative study.

    PubMed

    McPherson, K M; Kayes, N K; Moloczij, N; Cummins, C

    2014-03-01

    Reports about the impact of caring vary widely, but a consistent finding is that the role is influenced (for better or worse) by how formal services respond to, and work with, informal carers and of course the cared-for person. We aimed to explore the connection between informal and formal carers and identify how a positive connection or interface might be developed and maintained. We undertook a qualitative descriptive study with focus groups and individual interviews with informal carers, formal care service providers and representatives from carer advocacy groups. Content analysis was used to identify key factors impacting on the interface between informal and formal carers and propose specific recommendations for service development. Community setting including urban and rural areas of New Zealand. Seventy participants (the majority informal carers) took part in 13 focus groups and 22 individual interviews. Four key themes were derived: Quality of care for the care recipient; Knowledge exchange (valuing carer perspectives); One size does not fit all (creating flexible services); and A constant struggle (reducing the burden services add). An optimum interface to address these key areas was proposed. In addition to ensuring quality care for the care recipient, specific structures and processes to support a more positive interface appear warranted if informal carers and services are to work well together. An approach recognising the caring context and carer expertise may decrease the additional burden services contribute, and reduce conflicting information and resultant confusion and/or frustration many carers experience. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.

    PubMed

    Dubljević, Veljko; Racine, Eric

    2014-10-01

    The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Because the process described as the "inherence heuristic" is formal and general in nature, we contextualize it within a web of domain-specific heuristics (e.g., agent specific; action specific; consequences specific).

  13. The quality management journey: the progress of health facilities in Australia.

    PubMed

    Carr, B J

    1994-12-01

    Many facilities in Australia have taken the Total Quality Management (TQM) step. The objective of this study was to examine the progress of adopted formal quality systems in health. Sixty per cent of organizations surveyed have adopted formal systems. Of these, Deming adherents are the most common, followed by eclectic choices. Only 35% considered the quality transition reasonably easy. There was no relationship identified between accreditation and formal quality systems. The most common improvement techniques were flow charts, histograms, and cause and effect diagrams. Quality practitioners are happy to use several tools exceptionally well rather than have many tools at their disposal. The greatest impediment to the adoption of quality was the lack of top management support. This study did not support the view that clinicians are reluctant to actively support quality initiatives. Total Quality Management is not a mature concept; however, Chief Executive Officers are assured that rewards will be realized over time.

  14. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
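
    As a flavor of Livingstone-style, model-based diagnosis, the sketch below enumerates component mode assignments and keeps those consistent with an observation, preferring minimal-fault diagnoses. The components, modes and toy model are invented; the real system works over far richer declarative models.

    ```python
    # Toy consistency-based diagnosis sketch (invented AUV components).
    from itertools import product

    components = ["thruster", "depth_sensor"]
    modes = ["ok", "faulty"]

    def predicts(assignment, command):
        # Toy model: an ok thruster commanded forward changes the depth
        # reading, but only an ok depth sensor predicts anything at all.
        moving = assignment["thruster"] == "ok" and command == "forward"
        if assignment["depth_sensor"] == "ok":
            return {"depth_changes": moving}
        return {}   # a faulty sensor makes no prediction

    observation = {"depth_changes": False}
    consistent = [
        a for a in (dict(zip(components, m)) for m in product(modes, repeat=2))
        if all(observation[k] == v for k, v in predicts(a, "forward").items())
    ]
    # Prefer the diagnosis with the fewest faults (minimal diagnosis).
    best = min(consistent, key=lambda a: sum(v == "faulty" for v in a.values()))
    print(best)   # a single-fault candidate explaining the observation
    ```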

  15. Petri Nets - A Mathematical Formalism to Analyze Chemical Reaction Networks.

    PubMed

    Koch, Ina

    2010-12-17

    In this review we introduce and discuss Petri nets - a mathematical formalism to describe and analyze chemical reaction networks. Petri nets were developed to describe concurrency in general systems. Most applications are found in technical and financial systems, but for about twenty years Petri nets have also been used in systems biology to model biochemical systems. This review aims to give a short informal introduction to the basic formalism, illustrated by a chemical example, and to discuss possible applications to the analysis of chemical reaction networks, including cheminformatics. We give a short overview of qualitative as well as quantitative Petri net modeling techniques useful in systems biology, summarizing the state of the art in that field and providing the main literature references. Finally, we discuss advantages and limitations of Petri nets and give an outlook to further development. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
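
    One standard qualitative analysis the review alludes to is structural: P-invariants are non-negative solutions y of Cᵀy = 0, where C is the incidence matrix, and they correspond to conservation relations among species. A minimal sketch for a reversible reaction A ⇌ B (the network is invented for illustration):

    ```python
    # Sketch: P-invariants of a tiny reaction network via the nullspace
    # of the transposed incidence matrix.
    from sympy import Matrix

    # Rows = places/species (A, B); columns = transitions (A->B, B->A).
    C = Matrix([[-1,  1],
                [ 1, -1]])
    invariants = C.T.nullspace()
    print(invariants)   # [Matrix([[1], [1]])] -> A + B is conserved
    ```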

  16. Mathematics and Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, Gerald

    1979-01-01

    Examines the main mathematical approaches to information retrieval, including both algebraic and probabilistic models, and describes difficulties which impede formalization of information retrieval processes. A number of developments are covered where new theoretical understandings have directly led to improved retrieval techniques and operations.…
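
    As a tiny concrete instance of the algebraic (vector-space) approach the entry mentions: documents and the query become term-weight vectors, ranked by cosine similarity. The corpus below is invented for illustration.

    ```python
    # Sketch: vector-space retrieval with raw term frequencies and cosine
    # similarity (no tf-idf weighting, for brevity).
    import math
    from collections import Counter

    docs = ["formal methods for retrieval",
            "probabilistic retrieval models",
            "cooking with formal dinner etiquette"]
    query = "formal retrieval models"

    def vec(text):
        return Counter(text.lower().split())

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb)

    q = vec(query)
    for d in sorted(docs, key=lambda d: cosine(vec(d), q), reverse=True):
        print(f"{cosine(vec(d), q):.3f}  {d}")
    ```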

  17. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  18. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  19. 48 CFR 48.101 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... engineering effort is applied to areas of the contract that offer opportunities for considerable savings... ENGINEERING Policies and Procedures 48.101 General. (a) Value engineering is the formal technique by which... performing more economically. Value engineering attempts to eliminate, without impairing essential functions...

  20. Two formalisms, one renormalized stress-energy tensor

    NASA Astrophysics Data System (ADS)

    Barceló, C.; Carballo, R.; Garay, L. J.

    2012-04-01

    We explicitly compare the structure of the renormalized stress-energy tensor of a massless scalar field in a (1+1) curved spacetime as obtained by two different strategies: normal-mode construction of the field operator and one-loop effective action. We pay special attention to where and how the information related to the choice of vacuum state in both formalisms is encoded. By establishing a clear translation map between both procedures, we show that these two potentially different renormalized stress-energy tensors are actually equal, when using vacuum-state choices related by this map. One specific aim of the analysis is to facilitate the comparison of results regarding semiclassical effects in gravitational collapse as obtained within these different formalisms.
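
    For orientation (a standard textbook result, not a formula quoted from the paper): in (1+1) dimensions the renormalized stress-energy tensor of a massless scalar field carries the trace anomaly, which is the same in any consistent renormalization scheme.

    ```latex
    % Trace anomaly of a massless scalar in (1+1)-dimensional curved spacetime:
    \[
      \langle T^{\mu}{}_{\mu} \rangle_{\mathrm{ren}} = \frac{R}{24\pi},
    \]
    % where R is the Ricci scalar; this part is independent of the choice
    % of vacuum state, which instead enters the state-dependent terms.
    ```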

  1. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed, in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  2. Nuclear fragmentation studies for microelectronic application

    NASA Technical Reports Server (NTRS)

    Ngo, Duc M.; Wilson, John W.; Buck, Warren W.; Fogarty, Thomas N.

    1989-01-01

    A formalism for target fragment transport is presented with application to energy loss spectra in thin silicon devices. Predicted results are compared to experiments with the surface barrier detectors of McNulty et al. The intranuclear cascade nuclear reaction model does not predict the McNulty experimental data for the highest energy events. A semiempirical nuclear cross section gives an adequate explanation of McNulty's experiments. Application of the formalism to specific electronic devices is discussed.

  3. Specification of Security and Dependability Properties

    NASA Astrophysics Data System (ADS)

    Gürgens, Sigrid; Pujol, Gimena

    SERENITY S&D Classes as well as S&D Patterns specify the security properties they provide. In order for a system designer to select the correct class and pattern, the security property specification must be both unambiguous and intuitive. Furthermore, in case no class or pattern can be found that provides the exact property desired by the system designer, classes and patterns providing stronger properties will also serve his/her needs. Hence it must be possible to find and prove relations between properties. In this chapter we introduce the SERENITY approach for the specification of S&D properties that are both intuitively understandable and based on a formal semantics that makes it possible to prove relations between properties. In fact, we use two different languages: the Operational S&D Properties Language and the Formal S&D Properties Language.

  4. Negotiation techniques for health care professionals.

    PubMed

    Berlin, Jonathan W; Lexa, Frank J

    2007-07-01

    Negotiation is an essential part of health care practice and is not formally taught during medical training. This article aims to improve the negotiation skills of readers by explaining the essential components of preparation before a negotiation and reviewing common techniques for optimizing negotiated agreements. The terms reservation point, target value, and best alternative to a negotiated agreement are defined, and their importance in negotiation preparation is explained. The concept of anchoring, or making the first offer, in a negotiation is reviewed, and important techniques for team negotiation are provided.

  5. Perturbative universal state-selective correction for state-specific multi-reference coupled cluster methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brabec, Jiri; Banik, Subrata; Kowalski, Karol

    2016-10-28

    The implementation details of the universal state-selective (USS) multi-reference coupled cluster (MRCC) formalism with singles and doubles (USS(2)) are discussed using several benchmark systems as examples. We demonstrate that the USS(2) formalism is capable of improving the accuracy of state-specific MRCC methods based on the Brillouin-Wigner and Mukherjee sufficiency conditions. Additionally, it is shown that the USS(2) approach significantly alleviates problems associated with the lack of invariance of MRCC theories upon the rotation of active orbitals. We also discuss perturbative USS(2) formulations that significantly reduce the numerical overhead of the full USS(2) method.

  6. Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism

    NASA Astrophysics Data System (ADS)

    Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor

    2018-02-01

    Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.
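
    The Jones-matrix description mentioned here composes simple anisotropy elements by matrix multiplication. Below is a minimal sketch of a circular rotator (optical activity) followed by a linear retarder (birefringence); the angles are arbitrary illustration values, not parameters from the study.

    ```python
    # Sketch: composing Jones matrices for optical activity and birefringence.
    import numpy as np

    def rotator(theta):
        # Circular birefringence: rotates the polarization plane by theta.
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    def retarder(delta):
        # Linear retarder with fast axis along x and phase shift delta.
        return np.array([[1, 0], [0, np.exp(1j * delta)]])

    theta, delta = np.deg2rad(10), np.deg2rad(30)   # illustrative values
    J = retarder(delta) @ rotator(theta)            # light meets rotator first

    E_in = np.array([1.0, 0.0])                     # horizontal input polarization
    E_out = J @ E_in
    print(np.round(E_out, 3))
    ```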

  7. Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven

    1997-01-01

    Systematic software construction offers the potential of elevating software engineering from an art-form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.

  8. Counterfeit deterrence and digital imaging technology

    NASA Astrophysics Data System (ADS)

    Church, Sara E.; Fuller, Reese H.; Jaffe, Annette B.; Pagano, Lorelei W.

    2000-04-01

    The US government recognizes the growing problem of counterfeiting currency using digital imaging technology, as desktop systems become more sophisticated, less expensive and more prevalent. As the rate of counterfeiting with this type of equipment has grown, the need for specific prevention methods has become apparent to the banknote authorities. As a result, the Treasury Department and Federal Reserve have begun to address issues related specifically to this type of counterfeiting. The technical representatives of these agencies are taking a comprehensive approach to minimize counterfeiting using digital technology. This approach includes identification of current technology solutions for banknote recognition, data stream intervention and output marking, outreach to the hardware and software industries, and enhancement of public education efforts. Other aspects include strong support of and cooperation with existing international efforts to prevent counterfeiting, review and amendment of existing anti-counterfeiting legislation, and investigation of currency design techniques to make faithful reproduction more difficult. Implementation of these and other steps is intended to lead to the establishment of a formal, permanent policy to address and prevent the use of emerging technologies to counterfeit currency.

  9. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step comprised a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in the social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as the input classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Medication administration errors from a nursing viewpoint: a formal consensus of definition and scenarios using a Delphi technique.

    PubMed

    Shawahna, Ramzi; Masri, Dina; Al-Gharabeh, Rawan; Deek, Rawan; Al-Thayba, Lama; Halaweh, Masa

    2016-02-01

    To develop and achieve formal consensus on a definition of medication administration errors and scenarios that should or should not be considered as medication administration errors in hospitalised patient settings. Medication administration errors occur frequently in hospitalised patient settings. Currently, there is no formal consensus on a definition of medication administration errors or scenarios that should or should not be considered as medication administration errors. This was a descriptive study using the Delphi technique. A panel of experts (n = 50) recruited from major hospitals, nursing schools and universities in Palestine took part in the study. Three Delphi rounds were followed to achieve consensus on a proposed definition of medication administration errors and a series of 61 scenarios representing potential medication administration error situations formulated into a questionnaire. In the first Delphi round, key contact nurses' views on medication administration errors were explored. In the second Delphi round, consensus was achieved to accept the proposed definition of medication administration errors, to include 36 (59%) scenarios as medication administration errors and to exclude 1 (1·6%). In the third Delphi round, consensus was achieved to include a further 14 (23%) and exclude 2 (3·3%) as medication administration errors, while the remaining eight (13·1%) were considered equivocal. Of the 61 scenarios included in the Delphi process, experts decided to include 50 scenarios as medication administration errors, exclude three scenarios and include or exclude eight scenarios depending on the individual clinical situation. Consensus on a definition and on scenarios representing medication administration errors can be achieved using formal consensus techniques. Researchers should be aware that using different definitions of medication administration errors, or including or excluding different medication administration error situations, could significantly affect the rate of medication administration errors reported in their studies. Consensual definitions and medication administration error situations can be used in future epidemiology studies investigating medication administration errors in hospitalised patient settings, which may permit and promote direct comparisons of different studies. © 2015 John Wiley & Sons Ltd.
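
    A minimal sketch of the round-by-round tallying such a Delphi process implies. The 80% agreement threshold and the vote counts below are assumed for illustration only, not taken from the study.

      def tally_round(votes, threshold=0.8):
          """votes: scenario -> list of True (include) / False (exclude)."""
          decisions = {}
          for scenario, ballot in votes.items():
              share = sum(ballot) / len(ballot)
              if share >= threshold:
                  decisions[scenario] = "include"
              elif share <= 1 - threshold:
                  decisions[scenario] = "exclude"
              else:
                  decisions[scenario] = "equivocal"   # revisit next round
          return decisions

      print(tally_round({"wrong-time dose":    [True] * 45 + [False] * 5,
                         "documented refusal": [False] * 42 + [True] * 8}))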

  11. Formal Techniques for Organization Analysis: Task and Resource Management

    DTIC Science & Technology

    1984-06-01

    typical approach has been to base new entities on stereotypical structures and make changes as problems are recognized. Clearly, this is not an...human resources; and provide the means to change and track all these parameters as they interact with each other and respond to...functioning under internal and external change. 3. Data gathering techniques to allow one to efficiently collect reliable modeling parameters from

  12. Visualizing Matrix Multiplication

    ERIC Educational Resources Information Center

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…

  13. Fundamental concepts, current regulatory design and interpretation

    EPA Science Inventory

    Developmental toxicology became a formalized field about 50 years ago. Over this time, it has evolved from a largely observational science to one that is highly mechanistic in nature. Our increasing knowledge of mechanism of action, coupled with techniques that facilitate the gen...

  14. LIBERAL JOURNALISM AND AMERICAN EDUCATION, 1914-1941.

    ERIC Educational Resources Information Center

    WALLACE, JAMES M.

    The relationship between two liberal journals and the institutions and personnel of formal education was studied. "The Nation" and "New Republic" were selected as being influentially representative of intellectual American liberalism during the 20th century. Standard techniques of historical research were employed. Relevant…

  15. A formal concept analysis approach to consensus clustering of multi-experiment expression data

    PubMed Central

    2014-01-01

    Background Presently, with the increasing number and complexity of available gene expression datasets, the combination of data from multiple microarray studies addressing a similar biological question is gaining importance. The analysis and integration of multiple datasets are expected to yield more reliable and robust results since they are based on a larger number of samples and the effects of the individual study-specific biases are diminished. This is supported by recent studies suggesting that important biological signals are often preserved or enhanced by multiple experiments. An approach to combining data from different experiments is the aggregation of their clusterings into a consensus or representative clustering solution which increases the confidence in the common features of all the datasets and reveals the important differences among them. Results We propose a novel generic consensus clustering technique that applies a Formal Concept Analysis (FCA) approach for the consolidation and analysis of clustering solutions derived from several microarray datasets. These datasets are initially divided into groups of related experiments with respect to a predefined criterion. Subsequently, a consensus clustering algorithm is applied to each group, resulting in a clustering solution per group. These solutions are pooled together and further analysed by employing FCA, which allows extracting valuable insights from the data and generating a gene partition over all the experiments. In order to validate the FCA-enhanced approach, two consensus clustering algorithms are adapted to incorporate the FCA analysis. Their performance is evaluated on gene expression data from a multi-experiment study examining the global cell-cycle control of fission yeast. The FCA results derived from both methods demonstrate that, although both algorithms optimize different clustering characteristics, FCA is able to overcome and diminish these differences and preserve some relevant biological signals. Conclusions The proposed FCA-enhanced consensus clustering technique is a general approach to the combination of clustering algorithms with FCA for deriving clustering solutions from multiple gene expression matrices. The experimental results presented herein demonstrate that it is a robust data integration technique able to produce a good-quality clustering solution that is representative of the whole set of expression matrices. PMID:24885407
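
    To make the FCA step concrete, here is a minimal brute-force sketch with toy data, not the paper's algorithm: genes are the objects, cluster memberships across experiments are the attributes, and a formal concept is a maximal set of genes sharing a maximal set of labels.

      from itertools import combinations

      context = {             # gene -> cluster labels across experiments
          "g1": {"c1", "c2"},
          "g2": {"c1", "c2", "c3"},
          "g3": {"c2", "c3"},
      }
      ATTRS = set().union(*context.values())

      def intent(genes):      # attributes shared by every gene in the set
          return set.intersection(*(context[g] for g in genes)) if genes else set(ATTRS)

      def extent(attrs):      # genes carrying all the given attributes
          return {g for g, a in context.items() if attrs <= a}

      concepts = set()
      for r in range(len(context) + 1):
          for genes in combinations(context, r):
              b = frozenset(intent(set(genes)))
              concepts.add((frozenset(extent(b)), b))   # closure of the gene set

      for a, b in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(a), "<->", sorted(b))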

  16. Coaching Family Caregivers to become Better Problem Solvers when Caring for Persons with Advanced Cancer

    PubMed Central

    Dionne-Odom, J. Nicholas; Lyons, Kathleen D.; Akyar, Imatullah; Bakitas, Marie

    2016-01-01

    Family caregivers of persons with advanced cancer often take on responsibilities that present daunting and complex problems. Serious problems that go unresolved may be burdensome and result in negative outcomes for caregivers’ psychological and physical health and affect the quality of care delivered to the care recipients with cancer, especially at the end of life. Formal problem-solving training approaches have been developed over the past several decades to assist individuals with managing problems faced in daily life. Several of these problem-solving principles and techniques were incorporated into ENABLE (Educate, Nurture, Advise, Before Life End), an ‘early’ palliative care telehealth intervention for individuals diagnosed with advanced cancer and their family caregivers. A hypothetical case, resembling the situations of actual caregiver participants in ENABLE and exemplifying the complex problems that caregivers face, is presented, followed by an overview of ENABLE’s key problem-solving principles, techniques and steps in problem-solving support. Though more research is needed to formally test the use of problem-solving support in social work practice, social workers can easily incorporate these techniques into everyday practice. PMID:27143574

  17. Improving the efficiency of single and multiple teleportation protocols based on the direct use of partially entangled states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fortes, Raphael; Rigolin, Gustavo, E-mail: rigolin@ifi.unicamp.br

    We push the limits of the direct use of partially entangled pure states to perform quantum teleportation by presenting several protocols in many different scenarios that achieve the optimal efficiency possible. We review and put in a single formalism the three major strategies known to date that allow one to use partially entangled states for direct quantum teleportation (no distillation strategies permitted) and compare their efficiencies in real world implementations. We show how one can improve the efficiency of many direct teleportation protocols by combining these techniques. We then develop new teleportation protocols employing multipartite partially entangled states. The three techniques are also used here in order to achieve the highest efficiency possible. Finally, we prove the upper bound for the optimal success rate for protocols based on partially entangled Bell states and show that some of the protocols developed here achieve such a bound. -- Highlights: •Optimal protocols for the direct use of partially entangled states in teleportation. •We put in a single formalism all strategies of direct teleportation. •We extend these techniques to multipartite partially entangled states. •We give upper bounds for the optimal efficiency of these protocols.

  18. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses is utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  19. Elicitation of quantitative data from a heterogeneous expert panel: formal process and application in animal health.

    PubMed

    Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S

    2002-02-01

    This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. Appropriately, the Classical model was used to weight the experts' assessments in order to construct a single distribution per variable. Applying this model, the experts' quality typically was based on their performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol in combination with the proposed elicitation and analysis techniques resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
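
    A minimal sketch of the performance-weighted pooling idea, a simplification of Cooke's Classical model with an assumed scoring rule and toy numbers: experts are scored on seed variables with known values, weights are the normalized scores, and the pooled density for the variable of interest is a weighted mixture of the experts' individual assessments (here taken to be normal).

      import numpy as np
      from scipy.stats import norm

      def seed_score(estimates, truth):
          # Assumed scoring rule: inverse mean squared relative error on seeds.
          err = (np.asarray(estimates) - truth) / truth
          return 1.0 / np.mean(err ** 2)

      truth = np.array([10.0, 4.0, 250.0])        # seed variables, known values
      experts = {                                 # seed estimates and (mu, sd)
          "A": ([11.0, 4.5, 240.0], (30.0, 5.0)), #   for the variable of interest
          "B": ([15.0, 2.0, 400.0], (45.0, 8.0)),
      }
      scores = {e: seed_score(est, truth) for e, (est, _) in experts.items()}
      weights = {e: s / sum(scores.values()) for e, s in scores.items()}

      def pooled_pdf(x):      # weighted mixture of the experts' densities
          return sum(w * norm.pdf(x, *experts[e][1]) for e, w in weights.items())

      print(weights, pooled_pdf(35.0))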

  20. Tempo: A Toolkit for the Timed Input/Output Automata Formalism

    DTIC Science & Technology

    2008-01-30

    generation of distributed code from specifications. F.4.3 [Formal Languages]: Tempo; D.3 [Programming Languages]. Many distributed systems involve a combination of...and require the simulator to check the assertions after every single step...The check(i) transition is enabled when process i's program counter is set to...output foo (n: Int) states x: Int := 10; transitions...The Tempo simulator addresses this issue by putting the modeler in charge of resolving the non

  1. A general mass term for bigravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cusin, Giulia; Durrer, Ruth; Guarato, Pietro

    2016-04-01

    We introduce a new formalism to study perturbations of Hassan-Rosen bigravity theory, around general backgrounds for the two dynamical metrics. In particular, we derive the general expression for the mass term of the perturbations and we explicitly compute it for cosmological settings. We study tensor perturbations in a specific branch of bigravity using this formalism. We show that the tensor sector is affected by a late-time instability, which sets in when the mass matrix is no longer positive definite.

  2. Advanced orbiting systems test-bedding and protocol verification

    NASA Technical Reports Server (NTRS)

    Noles, James; De Gree, Melvin

    1989-01-01

    The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.

  3. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Biological applications of confocal fluorescence polarization microscopy

    NASA Astrophysics Data System (ADS)

    Bigelow, Chad E.

    Fluorescence polarization microscopy is a powerful modality capable of sensing changes in the physical properties and local environment of fluorophores. In this thesis we present new applications for the technique in cancer diagnosis and treatment and explore the limits of the modality in scattering media. We describe modifications to our custom-built confocal fluorescence microscope that enable dual-color imaging, optical fiber-based confocal spectroscopy and fluorescence polarization imaging. Experiments are presented that indicate the performance of the instrument for all three modalities. The limits of confocal fluorescence polarization imaging in scattering media are explored and the microscope parameters necessary for accurate polarization images in this regime are determined. A Monte Carlo routine is developed to model the effect of scattering on images. Included in it are routines to track the polarization state of light using the Mueller-Stokes formalism and a model for fluorescence generation that includes sampling the excitation light polarization ellipse, Brownian motion of excited-state fluorophores in solution, and dipole fluorophore emission. Results from this model are compared to experiments performed on a fluorophore-embedded polymer rod in a turbid medium consisting of polystyrene microspheres in aqueous suspension. We demonstrate the utility of the fluorescence polarization imaging technique for removal of contaminating autofluorescence and for imaging photodynamic therapy drugs in cell monolayers. Images of cells expressing green fluorescent protein are extracted from contaminating fluorescein emission. The distribution of meta-tetrahydroxyphenylchlorin in an EMT6 cell monolayer is also presented. A new technique for imaging enzyme activity is presented that is based on observing changes in the anisotropy of fluorescently-labeled substrates. Proof-of-principle studies are performed in a model system consisting of fluorescently labeled bovine serum albumin attached to sepharose beads. The action of trypsin and proteinase K on the albumin is monitored to demonstrate the validity of the technique. Images of the processing of the albumin in J774 murine macrophages are also presented, indicating large intercellular differences in enzyme activity. Future directions for the technique are also presented, including the design of enzyme probes specific for prostate specific antigen based on fluorescently-labeled dendrimers. A technique for enzyme imaging based on extracellular autofluorescence is also proposed.

  5. Projection-operator calculations of the lowest e(-)-He resonance

    NASA Technical Reports Server (NTRS)

    Berk, A.; Bhatia, A. K.; Junker, B. R.; Temkin, A.

    1986-01-01

    The 1s(2s)2 2S Schulz resonance of He(-) is investigated theoretically, applying the full projection-operator formalism developed by Temkin and Bhatia (1985) in a Rayleigh-Ritz variational calculation. The technique is described in detail, and results for five different approximations of the He target state are presented in a table. Good convergence is obtained, but it is found that even the best calculated value of the resonance is about 130 meV higher than the experimentally measured value of 19.367 + or - 0.007 eV (Brunt et al., 1977), a discrepancy attributed to the contribution of the shift in the Feshbach formalism.

  6. A Formalisation of Adaptable Pervasive Flows

    NASA Astrophysics Data System (ADS)

    Bucchiarone, Antonio; Lafuente, Alberto Lluch; Marconi, Annapaola; Pistore, Marco

    Adaptable Pervasive Flows is a novel workflow-based paradigm for the design and execution of pervasive applications, where dynamic workflows situated in the real world are able to modify their execution in order to adapt to changes in their environment. In this paper, we study a formalisation of such flows by means of a formal flow language. More precisely, we define APFoL (Adaptable Pervasive Flow Language) and formalise its textual notation by encoding it in Blite, a formalisation of WS-BPEL. The encoding in Blite equips the language with a formal semantics and enables the use of automated verification techniques. We illustrate the approach with an example of a Warehouse Case Study.

  7. Assurance Cases for Proofs as Evidence

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Gurfinkel, Arie; Wallnau, Kurt; Weinstock, Charles

    2009-01-01

    Proof-carrying code (PCC) provides a 'gold standard' for establishing formal and objective confidence in program behavior. However, in order to extend the benefits of PCC - and other formal certification techniques - to realistic systems, we must establish the correspondence of a mathematical proof of a program's semantics and its actual behavior. In this paper, we argue that assurance cases are an effective means of establishing such a correspondence. To this end, we present an assurance case pattern for arguing that a proof is free from various proof hazards. We also instantiate this pattern for a proof-based mechanism that provides evidence about generic medical device software.

  8. A bibliography on formal methods for system specification, design and validation

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Movaghar, A.

    1982-01-01

    Literature on the specification, design, verification, testing, and evaluation of avionics systems was surveyed, providing 655 citations. Journal papers, conference papers, and technical reports are included. Manual and computer-based methods were employed. Keywords used in the online search are listed.

  9. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms.

    PubMed

    Terfve, Camille; Cokelaer, Thomas; Henriques, David; MacNamara, Aidan; Goncalves, Emanuel; Morris, Melody K; van Iersel, Martijn; Lauffenburger, Douglas A; Saez-Rodriguez, Julio

    2012-10-18

    Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context.

  10. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms

    PubMed Central

    2012-01-01

    Background Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Results Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Conclusions Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context. PMID:23079107

  11. Detection and visualization of storm hydrograph changes under urbanization: an impulse response approach.

    PubMed

    Farahmand, Touraj; Fleming, Sean W; Quilty, Edward J

    2007-10-01

    Urbanization often alters catchment storm responses, with a broad range of potentially significant environmental and engineering consequences. At a practical, site-specific management level, efficient and effective assessment and control of such downstream impacts requires a technical capability to rapidly identify development-induced storm hydrograph changes. The method should also speak specifically to alteration of internal watershed dynamics, require few resources to implement, and provide results that are intuitively accessible to all watershed stakeholders. In this short paper, we propose a potential method which might satisfy these criteria. Our emphasis lies upon the integration of existing concepts to provide tools for pragmatic, relatively low-cost environmental monitoring and management. The procedure involves calibration of rainfall-runoff time-series models in each of several successive time windows, which sample varying degrees of watershed urbanization. As implemented here, only precipitation and stream discharge or stage data are required. The readily generated unit impulse response functions of these time-series models might then provide a mathematically formal, yet visually based and intuitive, representation of changes in watershed storm response. Nominally, the empirical response functions capture such changes as soon as they occur, and the assessments of storm hydrograph alteration are independent of variability in meteorological forcing. We provide a preliminary example of how the technique may be applied using a low-order linear ARX model. The technique may offer a fresh perspective on such watershed management issues, and potentially also several advantages over existing approaches. Substantial further testing is required before attempting to apply the concept as a practical environmental management technique; some possible directions for additional work are suggested.
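
    A minimal sketch of the core computation under an assumed first-order ARX form (the paper does not fix a model order or supply data here): fit the rainfall-runoff model by least squares within a time window, then read off the unit impulse response of the fitted filter. Comparing the responses fitted in successive windows is what would reveal urbanization-induced change in storm response.

      import numpy as np
      from scipy.signal import lfilter

      def fit_arx(u, y):
          # Regressors: lagged flow, current and lagged rainfall.
          X = np.column_stack([y[:-1], u[1:], u[:-1]])
          a1, b0, b1 = np.linalg.lstsq(X, y[1:], rcond=None)[0]
          return a1, b0, b1

      def impulse_response(a1, b0, b1, n=24):
          # Unit pulse through the fitted filter (b0 + b1 q^-1) / (1 - a1 q^-1).
          pulse = np.zeros(n)
          pulse[0] = 1.0
          return lfilter([b0, b1], [1.0, -a1], pulse)

      rng = np.random.default_rng(0)
      u = rng.random(200)                            # rainfall series
      y = lfilter([0.5, 0.2], [1.0, -0.7], u)        # synthetic runoff
      print(np.round(impulse_response(*fit_arx(u, y))[:5], 3))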

  12. An Approach to Goal-Statement Evaluation

    ERIC Educational Resources Information Center

    Reiner, John R.; Robinson, Donald W.

    1969-01-01

    "The results of this study support the proposition that the application of environmental assessment techniques based on CUES items provides information which can help evaluate the formal goals of an institution in terms of the degree to which the institutional environment is facilitative of those goals. (Author)

  13. Modeling Narrative Discourse

    ERIC Educational Resources Information Center

    Elson, David K.

    2012-01-01

    This thesis describes new approaches to the formal modeling of narrative discourse. Although narratives of all kinds are ubiquitous in daily life, contemporary text processing techniques typically do not leverage the aspects that separate narrative from expository discourse. We describe two approaches to the problem. The first approach considers…

  14. Why formal learning theory matters for cognitive science.

    PubMed

    Fulop, Sean; Chater, Nick

    2013-01-01

    This article reviews a number of different areas in the foundations of formal learning theory. After outlining the general framework for formal models of learning, the Bayesian approach to learning is summarized. This leads to a discussion of Solomonoff's Universal Prior Distribution for Bayesian learning. Gold's model of identification in the limit is also outlined. We next discuss a number of aspects of learning theory raised in contributed papers, related to both computational and representational complexity. The article concludes with a description of how semi-supervised learning can be applied to the study of cognitive learning models. Throughout this overview, the specific points raised by our contributing authors are connected to the models and methods under review. Copyright © 2013 Cognitive Science Society, Inc.

  15. Help Seeking Among Victims of Crime: A Review of the Empirical Literature

    PubMed Central

    McCart, Michael R.; Smith, Daniel W.; Sawyer, Genelle K.

    2013-01-01

    This paper reviews the literature on help-seeking behavior among adult victims of crime. Specifically, the paper summarizes prevalence rates for formal and informal help seeking and reviews predictors of and barriers to service use following victimization. Research suggests that only a small fraction of crime victims seek help from formal support networks; however, many seek support from informal sources. Several variables are associated with increased likelihood of formal help seeking, although the manner in which these variables affect reporting behavior is not clear. From this review, it is concluded that much remains to be learned regarding patterns of help seeking among victims of crime. Gaps in the literature and directions for future research are discussed. PMID:20336674

  16. Formal design specification of a Processor Interface Unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1992-01-01

    This report describes work to formally specify the requirements and design of a processor interface unit (PIU), a single-chip subsystem providing memory-interface, bus-interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The need for high-quality design assurance in such applications is an undisputed fact, given the disastrous consequences that even a single design flaw can produce. Thus, the further development and application of formal methods to fault-tolerant systems is of critical importance as these systems see increasing use in modern society.

  17. A Formal Basis for Safety Case Patterns

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh

    2013-01-01

    By capturing common structures of successful arguments, safety case patterns provide an approach for reusing strategies for reasoning about safety. In the current state of the practice, patterns exist as descriptive specifications with informal semantics, which not only offer little opportunity for more sophisticated usage such as automated instantiation, composition and manipulation, but also impede standardization efforts and tool interoperability. To address these concerns, this paper gives (i) a formal definition for safety case patterns, clarifying both restrictions on the usage of multiplicity and well-founded recursion in structural abstraction, (ii) formal semantics to patterns, and (iii) a generic data model and algorithm for pattern instantiation. We illustrate our contributions by application to a new pattern, the requirements breakdown pattern, which builds upon our previous work.

  18. Formal design and verification of a reliable computing platform for real-time control (phase 3 results)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael

    1994-01-01

    In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.
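
    A minimal toy sketch (not the verified Ehdm design) of the two mechanisms named above: majority voting across redundant channels masks a faulty value, and writing the voted value back into every channel flushes the effect of a transient fault instead of letting it accumulate.

      from collections import Counter

      def majority_vote(values):
          value, count = Counter(values).most_common(1)[0]
          if count * 2 <= len(values):
              raise RuntimeError("no majority: too many simultaneous faults")
          return value

      channels = [42, 42, 7]               # replica 3 hit by a transient fault
      voted = majority_vote(channels)
      channels = [voted] * len(channels)   # the internal vote flushes the fault
      print(channels)                      # [42, 42, 42]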

  19. The interventional radiology business plan.

    PubMed

    Beheshti, Michael V; Meek, Mary E; Kaufman, John A

    2012-09-01

    Strategic planning and business planning are processes commonly employed by organizations that exist in competitive environments. Although it is difficult to prove a causal relationship between formal strategic/business planning and positive organizational performance, there is broad agreement that formal strategic and business plans are components of successful organizations. The various elements of strategic plans and business plans are not common in the vernacular of practicing physicians. As health care becomes more competitive, familiarity with these tools may grow in importance. Herein we provide an overview of formal strategic and business planning, and offer a roadmap for an interventional radiology-specific plan that may be useful for organizations confronting competitive and financial threats. Copyright © 2012 SIR. Published by Elsevier Inc. All rights reserved.

  20. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
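
    A minimal sketch of the consistency check described above, with an assumed constraint encoding: each branch of the symbolic simulation accumulates linear constraints A x <= b over the unknown coefficients, and a zero-objective linear program decides feasibility, so inconsistent branches can be pruned.

      import numpy as np
      from scipy.optimize import linprog

      def consistent(A, b):
          # Feasibility of A x <= b with free variables and a zero objective.
          res = linprog(c=np.zeros(A.shape[1]), A_ub=A, b_ub=b,
                        bounds=[(None, None)] * A.shape[1], method="highs")
          return res.success

      # t1 - t2 <= -1 and t2 - t1 <= -1 contradict each other.
      A = np.array([[1.0, -1.0], [-1.0, 1.0]])
      print(consistent(A, np.array([-1.0, -1.0])))   # False: prune this branch
      print(consistent(A, np.array([5.0, 5.0])))     # True: keep exploring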

  1. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
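
    A minimal sketch (toy state graph, not the GNA/NUSMV/CADP machinery) of the kind of query such formal verification answers: an EF-style reachability check over a qualitative state transition graph, asking whether the network can reach a state where a given property holds.

      from collections import deque

      def ef(graph, labels, start, prop):
          # Breadth-first search: is a state labelled with `prop` reachable?
          seen, queue = {start}, deque([start])
          while queue:
              s = queue.popleft()
              if prop in labels.get(s, set()):
                  return True
              for t in graph.get(s, []):
                  if t not in seen:
                      seen.add(t)
                      queue.append(t)
          return False

      graph = {"s0": ["s1"], "s1": ["s2", "s0"], "s2": []}
      labels = {"s2": {"starvation_response_on"}}
      print(ef(graph, labels, "s0", "starvation_response_on"))   # True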

  2. Summary of Results from the Risk Management Program for the Mars Microrover Flight Experiment

    NASA Technical Reports Server (NTRS)

    Shishko, Robert; Matijevic, Jacob R.

    2000-01-01

    On 4 July 1997, the Mars Pathfinder landed on the surface of Mars carrying the first planetary rover, known as the Sojourner. Formally known as the Microrover Flight Experiment (MFEX), the Sojourner was a low cost, high-risk technology demonstration, in which new risk management techniques were tried. This paper summarizes the activities and results of the effort to conduct a low-cost, yet meaningful risk management program for the MFEX. The specific activities focused on cost, performance, schedule, and operations risks. Just as the systems engineering process was iterative and produced successive refinements of requirements, designs, etc., so was the risk management process. Qualitative risk assessments were performed first to gain some insights for refining the microrover design and operations concept. These then evolved into more quantitative analyses. Risk management lessons from the manager's perspective are presented for other low-cost, high-risk space missions.

  3. Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks

    PubMed Central

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to communicate the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios, which is caused by using imprecise models to analyze, validate and design these systems. Moreover, there are some simulation platforms that do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203

  4. On-line Model Structure Selection for Estimation of Plasma Boundary in a Tokamak

    NASA Astrophysics Data System (ADS)

    Škvára, Vít; Šmídl, Václav; Urban, Jakub

    2015-11-01

    Control of the plasma field in the tokamak requires reliable estimation of the plasma boundary. The plasma boundary is given by a complex mathematical model and the only available measurements are responses of induction coils around the plasma. For the purpose of boundary estimation the model can be reduced to simple linear regression with potentially infinitely many elements. The number of elements must be selected manually and this choice significantly influences the resulting shape. In this paper, we investigate the use of formal model structure estimation techniques for the problem. Specifically, we formulate a sparse least squares estimator using the automatic relevance principle. The resulting algorithm is a repetitive evaluation of the least squares problem which could be computed in real time. Performance of the resulting algorithm is illustrated on simulated data and evaluated with respect to a more detailed and computationally costly model FREEBIE.
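
    A minimal sketch of a sparse least-squares estimator in the spirit of the automatic relevance principle (a simplified iteration, not the paper's exact algorithm): each regression element carries its own precision, the precisions of irrelevant elements grow without bound, and the corresponding coefficients are pruned, so the model structure is selected automatically.

      import numpy as np

      def ard_lstsq(X, y, iters=50, prune=1e6):
          alpha = np.ones(X.shape[1])                 # per-element precision
          for _ in range(iters):
              w = np.linalg.solve(X.T @ X + np.diag(alpha), X.T @ y)
              alpha = 1.0 / (w ** 2 + 1e-12)          # relevance update
          w[alpha > prune] = 0.0                      # drop irrelevant elements
          return w

      rng = np.random.default_rng(1)
      X = rng.standard_normal((100, 8))               # basis responses
      true_w = np.array([3.0, 0, 0, -2.0, 0, 0, 0, 0])
      y = X @ true_w + 0.01 * rng.standard_normal(100)
      print(np.round(ard_lstsq(X, y), 3))             # sparse estimate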

  5. Adaptive image contrast enhancement using generalizations of histogram equalization.

    PubMed

    Stark, J A

    2000-01-01

    This paper proposes a scheme for adaptive image-contrast enhancement based on a generalization of histogram equalization (HE). HE is a useful technique for improving image contrast, but its effect is too severe for many purposes. However, dramatically different results can be obtained with relatively minor modifications. A concise description of adaptive HE is set out, and this framework is used in a discussion of past suggestions for variations on HE. A key feature of this formalism is a "cumulation function," which is used to generate a grey level mapping from the local histogram. By choosing alternative forms of cumulation function one can achieve a wide variety of effects. A specific form is proposed. Through the variation of one or two parameters, the resulting process can produce a range of degrees of contrast enhancement, at one extreme leaving the image unchanged, at another yielding full adaptive equalization.
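
    A minimal sketch of the framework with one assumed cumulation function, a power law with exponent alpha: the grey-level mapping is the normalized cumulative sum of the transformed histogram, so alpha = 1 recovers plain histogram equalization and alpha = 0 leaves the image essentially unchanged, giving a tunable degree of enhancement.

      import numpy as np

      def generalized_he(img, alpha=0.5, levels=256):
          hist = np.bincount(img.ravel(), minlength=levels).astype(float)
          cum = np.cumsum(hist ** alpha)              # the "cumulation function"
          mapping = np.round((levels - 1) * cum / cum[-1]).astype(img.dtype)
          return mapping[img]

      rng = np.random.default_rng(2)
      img = rng.integers(90, 120, size=(64, 64), dtype=np.uint8)   # low contrast
      out = generalized_he(img, alpha=0.5)
      print(img.min(), img.max(), "->", out.min(), out.max())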

  6. A high-level language for rule-based modelling.

    PubMed

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  7. A Process for the Representation of openEHR ADL Archetypes in OWL Ontologies.

    PubMed

    Porn, Alex Mateus; Peres, Leticia Mara; Didonet Del Fabro, Marcos

    2015-01-01

    ADL is a formal language to express archetypes, independent of standards or domain. However, its specification is not precise enough with respect to the specialization and semantics of archetypes, which leads to implementation difficulties and few available tools. Archetypes may be implemented using other languages such as XML or OWL, increasing integration with Semantic Web tools. Exchanging and transforming data can be better implemented with semantics-oriented models, for example using OWL, which is a language to define and instantiate Web ontologies defined by the W3C. OWL permits the user to define significant, detailed, precise and consistent distinctions among classes, properties and relations, ensuring greater consistency of knowledge than ADL techniques. This paper presents a process for representing openEHR ADL archetypes in OWL ontologies. The process consists of converting ADL archetypes into OWL ontologies and validating the resulting OWL ontologies using mutation testing.

  8. A High-Level Language for Rule-Based Modelling

    PubMed Central

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D.

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages. PMID:26043208

  9. Standard cost elements for technology programs

    NASA Technical Reports Server (NTRS)

    Christensen, Carisa B.; Wagenfuehrer, Carl

    1992-01-01

    The suitable structure for an effective and accurate cost estimate for general purposes is discussed in the context of a NASA technology program. Cost elements are defined for research, management, and facility-construction portions of technology programs. Attention is given to the mechanisms for ensuring the viability of spending programs, and the need for program managers to effect timely fund disbursement is established. Formal, structured, and intuitive techniques for cost-estimate development are discussed, and it is noted that cost-estimate defensibility can be improved with increased documentation. NASA policies for cash management are examined to demonstrate the importance of the ability to obligate funds and the ability to cost contracted funds. The NASA approach to consistent cost justification is set forth with a list of standard cost-element definitions. The cost elements reflect the three primary concerns of cost estimates: the identification of major assumptions, the specification of secondary analytic assumptions, and the status of program factors.

  10. Teaching Technical Competencies for Fluid Mechanics Research

    NASA Astrophysics Data System (ADS)

    Tagg, Randall

    2014-11-01

    We are developing an ``on demand'' framework for students to learn techniques used in fluid mechanics research. The site for this work is a university-grade laboratory situated next to Gateway High School in Aurora, Colorado. Undergraduate university students work with K-12 students on research and technical innovation projects. Both groups need customized training as their projects proceed. A modular approach allows particular competencies such as pump selection, construction of flow piping and channels, flow visualization, and specific flow measurement methods to be acquired through focused lessons. These lessons can be learned in either a stand-alone fashion or assembled into units for formal courses. A research example was a student project on diffusion of infectious material in micro-gravity in the event of an intestinal puncture wound. A curriculum example is a 9-week quarter of high-school instruction on instrumentation that uses small-scale water treatment systems as a case study.

  11. Spectral Automorphisms in Quantum Logics

    NASA Astrophysics Data System (ADS)

    Ivanov, Alexandru; Caragheorgheopol, Dan

    2010-12-01

    In quantum mechanics, the Hilbert space formalism might be physically justified in terms of some axioms based on the orthomodular lattice (OML) mathematical structure (Piron in Foundations of Quantum Physics, Benjamin, Reading, 1976). We intend to investigate the extent to which some fundamental physical facts can be described in the more general framework of OMLs, without the support of Hilbert space-specific tools. We consider the study of lattice automorphism properties as a “substitute” for Hilbert space techniques in investigating the spectral properties of observables. This is why we introduce the notion of spectral automorphism of an OML. Properties of spectral automorphisms and of their spectra are studied. We prove that the presence of nontrivial spectral automorphisms allows us to distinguish between classical and nonclassical theories. We also prove, for finite dimensional OMLs, that for every spectral automorphism there is a basis of invariant atoms. This is an analogue of the spectral theorem for unitary operators having purely point spectrum.

  12. Applying Various Methods of Communicating Science for Community Decision-Making and Public Awareness: A NASA DEVELOP National Program Case Study

    NASA Astrophysics Data System (ADS)

    Miller, T. N.; Brumbaugh, E. J.; Barker, M.; Ly, V.; Schick, R.; Rogers, L.

    2015-12-01

    The NASA DEVELOP National Program conducts over eighty Earth science projects every year. Each project applies NASA Earth observations to impact decision-making related to a local or regional community concern. Small, interdisciplinary teams create a methodology to address the specific issue, and then pass on the results to partner organizations, as well as providing them with instruction to continue using remote sensing for future decisions. Many different methods are used by individual teams, and the program as a whole, to communicate results and research accomplishments to decision-makers, stakeholders, alumni, and the general public. These methods vary in scope from formal publications to more informal venues, such as social media. This presentation will highlight the communication techniques used by the DEVELOP program. Audiences, strategies, and outlets will be discussed, including a newsletter, microjournal, video contest, and several others.

  13. Evolutionary fuzzy modeling human diagnostic decisions.

    PubMed

    Peña-Reyes, Carlos Andrés

    2004-05-01

    Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process, while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems both of high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool, called COBRA, for aiding radiologists in mammography interpretation.

  14. A model-guided symbolic execution approach for network protocol implementations and vulnerability detection.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
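
    A minimal sketch of the model-guidance idea on a toy protocol (the FSM, messages and states below are invented for illustration): the abstracted FSM enumerates message sequences that reach deep protocol states, and each sequence can then seed the symbolic executor as a path prefix.

      from itertools import product

      FSM = {  # state -> {message: next_state}
          "INIT":    {"HELLO": "GREETED"},
          "GREETED": {"AUTH": "AUTHED", "QUIT": "CLOSED"},
          "AUTHED":  {"DATA": "AUTHED", "QUIT": "CLOSED"},
      }

      def sequences_reaching(target, max_len=4):
          msgs = sorted({m for trans in FSM.values() for m in trans})
          for n in range(1, max_len + 1):
              for seq in product(msgs, repeat=n):
                  state = "INIT"
                  for m in seq:
                      state = FSM.get(state, {}).get(m)
                      if state is None:
                          break
                  if state == target:
                      yield seq

      for seed in sequences_reaching("AUTHED", max_len=3):
          print(seed)   # feed each seed to the symbolic executor as a path prefix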

  15. Designing Specification Languages for Process Control Systems: Lessons Learned and Steps to the Future

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Heimdahl, Mats P. E.; Reese, Jon Damon

    1999-01-01

    Previously, we defined a blackbox formal system modeling language called RSML (Requirements State Machine Language). The language was developed over several years while specifying the system requirements for a collision avoidance system for commercial passenger aircraft. During the language development, we received continual feedback and evaluation by FAA employees and industry representatives, which helped us to produce a specification language that is easily learned and used by application experts. Since the completion of the RSML project, we have continued our research on specification languages. This research is part of a larger effort to investigate the more general problem of providing tools to assist in developing embedded systems. Our latest experimental toolset is called SpecTRM (Specification Tools and Requirements Methodology), and the formal specification language is SpecTRM-RL (SpecTRM Requirements Language). This paper describes what we have learned from our use of RSML and how those lessons were applied to the design of SpecTRM-RL. We discuss our goals for SpecTRM-RL and the design features that support each of these goals.

  16. Developing quality indicators and auditing protocols from formal guideline models: knowledge representation and transformations.

    PubMed

    Advani, Aneel; Goldstein, Mary; Shahar, Yuval; Musen, Mark A

    2003-01-01

    Automated quality assessment of clinician actions and patient outcomes is a central problem in guideline- or standards-based medical care. In this paper we describe a model representation and algorithm for deriving structured quality indicators and auditing protocols from formalized specifications of guidelines used in decision support systems. We apply the model and algorithm to the assessment of physician concordance with a guideline knowledge model for hypertension used in a decision-support system. The properties of our solution include the ability to derive automatically context-specific and case-mix-adjusted quality indicators that can model global or local levels of detail about the guideline parameterized by defining the reliability of each indicator or element of the guideline.

  17. Music acquisition: effects of enculturation and formal training on development.

    PubMed

    Hannon, Erin E; Trainor, Laurel J

    2007-11-01

    Musical structure is complex, consisting of a small set of elements that combine to form hierarchical levels of pitch and temporal structure according to grammatical rules. As with language, different systems use different elements and rules for combination. Drawing on recent findings, we propose that music acquisition begins with basic features, such as peripheral frequency-coding mechanisms and multisensory timing connections, and proceeds through enculturation, whereby everyday exposure to a particular music system creates, in a systematic order of acquisition, culture-specific brain structures and representations. Finally, we propose that formal musical training invokes domain-specific processes that affect salience of musical input and the amount of cortical tissue devoted to its processing, as well as domain-general processes of attention and executive functioning.

  18. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety-critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques argue that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem-proving mechanisms cannot accurately capture the wealth of inductive, deductive, and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.
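
    The necessary/sufficient-cause claim can be made concrete with a small enumeration. In the sketch below, a hypothetical causal model over two events is checked for whether event A was a necessary cause (mishap implies A) and a sufficient cause (A implies mishap); the model is invented for illustration and is not drawn from any real investigation.

        from itertools import product

        def mishap(a, b):
            # Assumed causal model: the mishap occurs only when A and B both hold.
            return a and b

        scenarios = list(product([False, True], repeat=2))
        necessary = all(a for a, b in scenarios if mishap(a, b))    # M -> A
        sufficient = all(mishap(a, b) for a, b in scenarios if a)   # A -> M
        print(f"A necessary: {necessary}, A sufficient: {sufficient}")
        # -> A necessary: True, A sufficient: False (B must also hold)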

  19. Understanding Motivators and Barriers to Physical Activity

    ERIC Educational Resources Information Center

    Patay, Mary E.; Patton, Kevin; Parker, Melissa; Fahey, Kathleen; Sinclair, Christina

    2015-01-01

    The purpose of this study was to understand the factors that influence physical activity among year-round residents in an isolated summer resort community. Specifically, we explored the personal, environmental, social, and culture-specific perceived motivators and barriers to physical activity. Participants were formally interviewed about their…

  20. Lexical Specificity Training Effects in Second Language Learners

    ERIC Educational Resources Information Center

    Janssen, Caressa; Segers, Eliane; McQueen, James M.; Verhoeven, Ludo

    2015-01-01

    Children who start formal education in a second language may experience slower vocabulary growth in that language and subsequently experience disadvantages in literacy acquisition. The current study asked whether lexical specificity training can stimulate bilingual children's phonological awareness, which is considered to be a precursor to…
