Formal Methods Tool Qualification
NASA Technical Reports Server (NTRS)
Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain
2017-01-01
Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems, including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without the need to justify them as alternative methods of compliance. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates, which capture a proof of the formal methods tool's claim and can be checked by an independent proof-certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.
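The proof-certificate idea above can be made concrete with a small sketch: a checker for propositional resolution proofs. This is a hypothetical miniature written for illustration, not the project's actual checker; it assumes clauses are sets of signed integers and a certificate is a list of resolution steps ending in the empty clause.

```python
# Minimal sketch of an independent proof-certificate checker (illustrative
# only). A model checker claims a formula is unsatisfiable and emits a
# resolution proof; this checker re-validates the claim without trusting
# the model checker that produced it.

def resolve(c1, c2, pivot):
    """Resolve two clauses on a pivot literal; clauses are frozensets of ints."""
    if pivot not in c1 or -pivot not in c2:
        raise ValueError("invalid pivot for this resolution step")
    return (c1 - {pivot}) | (c2 - {-pivot})

def check_certificate(axioms, steps):
    """Each step (i, j, pivot) resolves previously derived clauses i and j;
    the certificate is valid if the final derived clause is empty."""
    derived = [frozenset(c) for c in axioms]
    for i, j, pivot in steps:
        derived.append(frozenset(resolve(derived[i], derived[j], pivot)))
    return len(derived[-1]) == 0  # empty clause = unsatisfiability established

# (x) and (not x) resolve to the empty clause in one step.
print(check_certificate(axioms=[{1}, {-1}], steps=[(0, 1, 1)]))  # True
```

The attraction of qualifying a checker like this is that it is small and its correctness argument is local, while the model checker that emitted the certificate can remain unqualified.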
Formal Assurance Certifiable Tooling Strategy Final Report
NASA Technical Reports Server (NTRS)
Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael
2017-01-01
This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
A Tool for Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the cost of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool to support it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort ... to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has
Formal methods technology transfer: Some lessons learned
NASA Technical Reports Server (NTRS)
Hamilton, David
1992-01-01
IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
Recent trends related to the use of formal methods in software engineering
NASA Technical Reports Server (NTRS)
Prehn, Soren
1986-01-01
An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Verifying Hybrid Systems Modeled as Timed Automata: A Case Study
1997-03-01
Introduction Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods ... specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
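As a hedged illustration of the category-partition mechanics, the categories, choices, and constraint below are invented for this example and are not from the paper or its tool: test frames are generated as a cross product of choices and then pruned by constraints to a minimum logical subset.

```python
from itertools import product

# Invented example: categories and choices for testing a file-copy command.
categories = {
    "source":      ["exists", "missing"],
    "destination": ["writable", "read_only"],
    "size":        ["empty", "small", "huge"],
}

def infeasible(frame):
    # Constraint: a missing source file has no meaningful size category.
    return frame["source"] == "missing" and frame["size"] != "empty"

names = list(categories)
frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
frames = [f for f in frames if not infeasible(f)]
print(len(frames), "test frames")  # 12 raw combinations, 8 after pruning
for frame in frames[:3]:
    print(frame)
```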
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. ... Traditional methods and tools
Why are Formal Methods Not Used More Widely?
NASA Technical Reports Server (NTRS)
Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.
1997-01-01
Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and to assess the results in a comprehensive evaluation framework. The results of the experiment suggest that there remain many impediments to the routine use of formal methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
Formal Analysis of the Remote Agent Before and After Flight
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.
2000-01-01
This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal methods tools and the development cycle used by software developers. The Java PathFinder tool, which directly translates from Java to PROMELA, was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one that was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
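The class of error SPIN catches here can be sketched with a toy exhaustive interleaving search; this is written for illustration and is not the Remote Agent model, SPIN, or PROMELA. Two processes take two locks in opposite order, and the search reaches the schedules in which each process holds one lock and waits forever for the other.

```python
# Toy exhaustive interleaving search, illustrating (not reproducing) what a
# model checker does: visit every schedule, so the deadlocking one cannot be
# missed the way it can be in testing. A process at program counter 2 holds
# both locks, so its release step safely clears both.

def explore(state=(0, 0, None, None), trace=()):
    pcA, pcB, l1, l2 = state  # program counters and current lock holders
    moves = []
    if pcA == 0 and l1 is None: moves.append(("A takes L1", (1, pcB, "A", l2)))
    if pcA == 1 and l2 is None: moves.append(("A takes L2", (2, pcB, l1, "A")))
    if pcA == 2:                moves.append(("A releases", (3, pcB, None, None)))
    if pcB == 0 and l2 is None: moves.append(("B takes L2", (pcA, 1, l1, "B")))
    if pcB == 1 and l1 is None: moves.append(("B takes L1", (pcA, 2, "B", l2)))
    if pcB == 2:                moves.append(("B releases", (pcA, 3, None, None)))
    if not moves and not (pcA == 3 and pcB == 3):
        print("deadlock via:", " -> ".join(trace))  # circular wait reached
        return
    for label, nxt in moves:
        explore(nxt, trace + (label,))

explore()  # reports the two schedules that reach the circular-wait state
```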
An Educational Development Tool Based on Principles of Formal Ontology
ERIC Educational Resources Information Center
Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter
2005-01-01
Computer science provides virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, allowing one to develop interactive educational…
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
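A hedged sketch of the last step, rendering a formalized indicator as an executable SPARQL query: the prefix, predicate names, and the indicator itself are invented for illustration and are not the CLIF or ArchMS vocabulary.

```python
# Hypothetical sketch of the "formalized indicator -> executable query" step.
# All names below are invented; only the overall shape is meant to inform.

def indicator_to_sparql(condition_uri, treatment_uri):
    """Indicator numerator: patients with <condition> who received <treatment>."""
    return f"""\
PREFIX ehr: <http://example.org/ehr#>
SELECT (COUNT(DISTINCT ?patient) AS ?numerator)
WHERE {{
  ?patient ehr:hasDiagnosis <{condition_uri}> .
  ?patient ehr:receivedTreatment <{treatment_uri}> .
}}"""

print(indicator_to_sparql("http://example.org/sct#Diabetes",
                          "http://example.org/atc#Metformin"))
```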
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
In order to use mature object-oriented tools and languages in software process modelling, and to make software process models conform better to industrial standards, it is necessary to study object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object-oriented description models.
ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis
NASA Technical Reports Server (NTRS)
Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.
2006-01-01
Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
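The synchronous dataflow style shared by these languages can be imitated in a few lines: a node is a step function invoked once per clock tick, with explicit state playing the role of Lustre's pre operator. The sketch below is a hand-written analogue for illustration, not code generated by SCADE or related to the ADGS-2100 models.

```python
# Hand-rolled analogue of a synchronous dataflow node (illustrative only):
# one call per clock tick; the output depends only on the current inputs and
# the previous state, mirroring Lustre's `pre` operator.

def counter_node(prev_count, reset, enable):
    """Resettable, gated counter: a typical small synchronous node."""
    count = 0 if reset else (prev_count + 1 if enable else prev_count)
    return count, count  # (new state, output)

state = 0
inputs = [(False, True), (False, True), (True, False), (False, True)]
for tick, (reset, enable) in enumerate(inputs):
    state, out = counter_node(state, reset, enable)
    print(f"tick {tick}: count = {out}")  # prints 1, 2, 0, 1
```

Because each node's step function is deterministic and its state is finite and explicit, models in this style are directly amenable to the exhaustive analysis the report describes.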
NASA Technical Reports Server (NTRS)
1995-01-01
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.
NASA software specification and evaluation system design, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The research to develop methods for reducing the effort expended in software specification and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.
Formal Requirements-Based Programming for Complex Systems
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis
2005-01-01
Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.
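One way to picture the scenarios-to-model step, purely as an illustration and not as the paper's actual transformation (which guarantees provable equivalence between requirements and model), is to fold scenario traces into a prefix-tree state machine. The scenario events below are invented.

```python
# Illustrative only: fold scenario traces (event sequences) into a prefix-tree
# state machine. The real method derives a provably equivalent formal model;
# this sketch only shows the basic trace -> transition-system direction.

def build_model(scenarios):
    transitions = {}  # (state, event) -> next state
    next_state = 1    # state 0 is the initial state
    for trace in scenarios:
        state = 0
        for event in trace:
            if (state, event) not in transitions:
                transitions[(state, event)] = next_state
                next_state += 1
            state = transitions[(state, event)]
    return transitions

scenarios = [
    ("receive_command", "validate", "execute", "report"),
    ("receive_command", "validate", "reject", "report"),
]
for (state, event), target in sorted(build_model(scenarios).items()):
    print(f"s{state} --{event}--> s{target}")
```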
Experience Using Formal Methods for Specifying a Multi-Agent System
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)
2000-01-01
The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) are presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS, the development team decided to use formal methods to check for race conditions, deadlocks, and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes by describing an architecture of tools that would better support the future specification of agents and other concurrent systems.
Applications of Formal Methods to Specification and Safety of Avionics Software
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Guaspari, David; Humenn, Polar
1996-01-01
This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation
NASA Technical Reports Server (NTRS)
Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.
2005-01-01
This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.
DRS: Derivational Reasoning System
NASA Technical Reports Server (NTRS)
Bose, Bhaskar
1995-01-01
The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.
Screening for Alcohol Problems among 4-Year Colleges and Universities
ERIC Educational Resources Information Center
Winters, Ken C.; Toomey, Traci; Nelson, Toben F.; Erickson, Darin; Lenk, Kathleen; Miazga, Mark
2011-01-01
Objective: To assess the use of alcohol screening tools across US colleges. Participants: Directors of health services at 333 four-year colleges. Methods: An online survey was conducted regarding the use of alcohol screening tools. Schools reporting use of formal tools were further described in terms of 4 tools (AUDIT, CUGE, CAPS, and RAPS) that…
The Personnel Effectiveness Grid (PEG): A New Tool for Estimating Personnel Department Effectiveness
ERIC Educational Resources Information Center
Petersen, Donald J.; Malone, Robert L.
1975-01-01
Examines the difficulties inherent in attempting a formal personnel evaluation system, the major formal methods currently used for evaluating personnel department accountabilities, some parameters that should be part of a valid evaluation program, and a model for conducting the evaluation. (Available from Office of Publications, Graduate School of…
1992-06-01
system capabilities \\Jch as memory management and network communications are provided by a virtual machine-type operating environment. Various human ...thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical...the form of identifying: the data entity itself; its aliases (including how the data is presented th programs or human users in the form of copy
Using Petri Net Tools to Study Properties and Dynamics of Biological Systems
Peleg, Mor; Rubin, Daniel; Altman, Russ B.
2005-01-01
Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
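A minimal token-game interpreter makes concrete the execution semantics that the surveyed tools implement in much richer form. The net below is an invented enzyme-reaction toy, not one of the paper's three models.

```python
# Minimal Petri net token game (generic sketch, not one of the surveyed
# tools). A transition is enabled when every input place holds enough tokens;
# firing consumes input tokens and produces output tokens.

marking = {"enzyme": 1, "substrate": 2, "complex": 0, "product": 0}
transitions = {
    "bind":    ({"enzyme": 1, "substrate": 1}, {"complex": 1}),
    "convert": ({"complex": 1},                {"enzyme": 1, "product": 1}),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(marking[place] >= n for place, n in inputs.items())

def fire(name):
    inputs, outputs = transitions[name]
    for place, n in inputs.items():
        marking[place] -= n
    for place, n in outputs.items():
        marking[place] += n

for step in ["bind", "convert", "bind", "convert"]:
    assert enabled(step), f"{step} not enabled"
    fire(step)
print(marking)  # {'enzyme': 1, 'substrate': 0, 'complex': 0, 'product': 2}
```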
Automatic Methods and Tools for the Verification of Real Time Systems
1997-11-30
We developed formal methods and tools for the verification of real - time systems . This was accomplished by extending techniques, based on automata...embedded real - time systems , we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous... real - time systems , and we identified the exact boundary between decidability and undecidability of real-time reasoning.
Formal implementation of a performance evaluation model for the face recognition system.
Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young
2008-01-01
Due to usability features, practical applications, and its lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be admitted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for the biometric recognition system, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
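Two of the standard quantities such an evaluation tool reports, false accept rate (FAR) and false reject rate (FRR), can be computed directly from labeled comparison scores. The scores and threshold below are invented for illustration.

```python
# Sketch of the numerical core of a biometric performance evaluation:
# FAR and FRR at a given decision threshold (invented example data).

def far_frr(genuine_scores, impostor_scores, threshold):
    false_rejects = sum(s < threshold for s in genuine_scores)
    false_accepts = sum(s >= threshold for s in impostor_scores)
    return (false_accepts / len(impostor_scores),
            false_rejects / len(genuine_scores))

genuine  = [0.91, 0.84, 0.77, 0.95, 0.62]  # same-person comparison scores
impostor = [0.12, 0.35, 0.71, 0.08, 0.22]  # different-person comparison scores
far, frr = far_frr(genuine, impostor, threshold=0.7)
print(f"FAR = {far:.2f}, FRR = {frr:.2f}")  # FAR = 0.20, FRR = 0.20
```

Sweeping the threshold over the score range yields the full FAR/FRR trade-off curve from which an operating point is chosen.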
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems
1994-07-29
time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation....The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real
Security Modeling and Correctness Proof Using Specware and Isabelle
2008-12-01
proving requires substantial knowledge and experience in logical calculus . 15. NUMBER OF PAGES 146 14. SUBJECT TERMS Formal Method, Theorem...although the actual proving requires substantial knowledge and experience in logical calculus . vi THIS PAGE INTENTIONALLY LEFT BLANK vii TABLE OF...formal language and provides tools for proving those formulas in a logical calculus ” [5]. We are demonstrating in this thesis that a specification in
Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
Expert2OWL: A Methodology for Pattern-Based Ontology Development.
Tahar, Kais; Xu, Jie; Herre, Heinrich
2017-01-01
The formalization of expert knowledge enables a broad spectrum of applications employing ontologies as underlying technology. These include eLearning, Semantic Web and expert systems. However, the manual construction of such ontologies is time-consuming and thus expensive. Moreover, experts are often unfamiliar with the syntax and semantics of formal ontology languages such as OWL and usually have no experience in developing formal ontologies. To overcome these barriers, we developed a new method and tool, called Expert2OWL, that provides efficient features to support the construction of OWL ontologies using GFO (General Formal Ontology) as a top-level ontology. This method allows a close and effective collaboration between ontologists and domain experts. Essentially, this tool integrates Excel spreadsheets as part of a pattern-based ontology development and refinement process. Expert2OWL enables us to expedite the development process and modularize the resulting ontologies. We applied this method in the field of Chinese Herbal Medicine (CHM) and used Expert2OWL to automatically generate an accurate Chinese Herbology ontology (CHO). The expressivity of CHO was tested and evaluated using the ontology query languages SPARQL and DL. CHO shows promising results and can generate answers to important scientific questions such as which Chinese herbal formulas contain which substances, which substances treat which diseases, and which ones are the most frequently used in CHM.
Practical Formal Verification of MPI and Thread Programs
NASA Astrophysics Data System (ADS)
Gopalakrishnan, Ganesh; Kirby, Robert M.
Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).
Visualizing Matrix Multiplication
ERIC Educational Resources Information Center
Daugulis, Peteris; Sondore, Anita
2018-01-01
Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…
Tree-oriented interactive processing with an application to theorem-proving, appendix E
NASA Technical Reports Server (NTRS)
Hammerslag, David; Kamin, Samuel N.; Campbell, Roy H.
1985-01-01
The concept of unstructured structure editing and ted, an editor for unstructured trees, are described. Ted is used to manipulate hierarchies of information in an unrestricted manner. The tool was implemented and applied to the problem of organizing formal proofs. As a proof management tool, it maintains the validity of a proof and its constituent lemmas independently from the methods used to validate the proof. It includes an adaptable interface which may be used to invoke theorem provers and other aids to proof construction. Using ted, a user may construct, maintain, and verify formal proofs using a variety of theorem provers, proof checkers, and formatters.
NASA Technical Reports Server (NTRS)
Young, William D.
1992-01-01
The application of formal methods to the analysis of computing systems promises to provide higher and higher levels of assurance as the sophistication of our tools and techniques increases. Improvements in tools and techniques come about as we pit the current state of the art against new and challenging problems. A promising area for the application of formal methods is in real-time and distributed computing. Some of the algorithms in this area are both subtle and important. In response to this challenge and as part of an ongoing attempt to verify an implementation of the Interactive Convergence Clock Synchronization Algorithm (ICCSA), we decided to undertake a proof of the correctness of the algorithm using the Boyer-Moore theorem prover. This paper describes our approach to proving the ICCSA using the Boyer-Moore prover.
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
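For orientation, one common way of writing Zubarev's nonequilibrium statistical operator is reproduced below. This is quoted from memory of the standard literature rather than from this paper, so treat the exact form as an assumption.

```latex
% One common form of Zubarev's NSO; \bar{\varrho} is the auxiliary
% "relevant" (quasi-equilibrium) distribution, and the limit
% \varepsilon \to +0 is taken after the calculation of averages.
\varrho_{\varepsilon}(t) = \exp\left\{ \ln \bar{\varrho}(t,0)
  - \int_{-\infty}^{0} dt'\, e^{\varepsilon t'}
    \frac{d}{dt'} \ln \bar{\varrho}(t+t', t') \right\}
```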
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches for higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.
Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.
Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk
2012-02-01
Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive process called Fuzzy Cognitive Maps (FCMs) and a semantic web approach. The FCM technique is capable of dealing with situations that include uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for modeling and knowledge integration of clinical practice guidelines. Semantic web tools were used to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined for the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings: 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and provides a front-end decision on antibiotic suggestions for cystitis. Concluding, modeling medical knowledge and therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful.
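The inference at the core of the FCM technique is compact: each concept's activation is updated from a weighted sum of the concepts that influence it, squashed by a threshold function. The concepts and weights below are an invented toy, not the paper's 47-concept UTI model.

```python
import math

# Core of Fuzzy Cognitive Map inference (toy example, not the paper's model):
# A_i(k+1) = f( A_i(k) + sum_j w_ji * A_j(k) ), with sigmoid f.

concepts = ["dysuria", "fever", "prescribe_antibiotic"]
weights = {  # (cause, effect) -> influence strength in [-1, 1]
    ("dysuria", "prescribe_antibiotic"): 0.6,
    ("fever", "prescribe_antibiotic"): 0.4,
}
f = lambda x: 1.0 / (1.0 + math.exp(-x))  # sigmoid threshold function

state = {"dysuria": 1.0, "fever": 0.8, "prescribe_antibiotic": 0.0}
for _ in range(10):  # iterate until the map settles into a steady state
    state = {c: f(state[c] + sum(w * state[src]
                                 for (src, dst), w in weights.items()
                                 if dst == c))
             for c in concepts}
print({c: round(v, 2) for c, v in state.items()})
```

In such a scheme, fuzzy rules extracted from a guideline fix the weights, a patient's findings set the initial activations, and the settled activation of each therapy concept is read off as the recommendation strength.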
Different Strokes for Different Folks: Visual Presentation Design between Disciplines
Gomez, Steven R.; Jianu, Radu; Ziemkiewicz, Caroline; Guo, Hua; Laidlaw, David H.
2015-01-01
We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard “chalk talks”. We found design differences in slideshows using two methods – coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant’s own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information. PMID:26357149
Influence Diagrams as Decision-Making Tools for Pesticide Risk Management
The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk; however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...
New directions for Artificial Intelligence (AI) methods in optimum design
NASA Technical Reports Server (NTRS)
Hajela, Prabhat
1989-01-01
Developments and applications of artificial intelligence (AI) methods in the design of structural systems are reviewed. Principal shortcomings in the current approach are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations in expert systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo
Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.
2009-01-01
Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
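The simplest property such a framework checks, reachability (EF in CTL terms), can be illustrated with a breadth-first search over a qualitative state graph. The graph below is an invented toy, not the GNA carbon-starvation model.

```python
from collections import deque

# Toy stand-in for checking a qualitative model: is a state satisfying some
# property reachable from the initial state? (Invented graph for illustration.)

graph = {  # qualitative state -> successor states
    "exponential_growth": ["nutrient_depletion"],
    "nutrient_depletion": ["stress_response", "exponential_growth"],
    "stress_response": ["stationary_phase"],
    "stationary_phase": [],
}

def reachable(start, prop):
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if prop(state):
            return True
        for nxt in graph[state]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("exponential_growth", lambda s: s == "stationary_phase"))  # True
```

Model checkers such as NUSMV generalize this idea to full temporal logics, with nested operators, fairness constraints, and counterexample traces.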
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John
2005-01-01
Requirements-to-Design-to-Code (R2D2C) is an approach to the engineering of computer-based systems that embodies the idea of requirements-based programming in system development. It goes further, however, in that the approach offers not only an underlying formalism, but full formal development from requirements capture through to the automatic generation of provably-correct code. As such, the approach has direct application to the development of systems requiring autonomic properties. We describe a prototype tool to support the method, and illustrate its applicability to the development of LOGOS, a NASA autonomous ground control system, which exhibits autonomic behavior. Finally, we briefly discuss other areas where the approach and prototype tool are being considered for application.
Petri Nets as Modeling Tool for Emergent Agents
NASA Technical Reports Server (NTRS)
Bergman, Marto
2004-01-01
Emergent agents, those agents whose local interactions can cause unexpected global results, require a method of modeling that is both dynamic and structured. Petri Nets, a modeling tool developed for dynamic discrete event systems of mainly functional agents, provide this, and have the benefit of being an established tool. We present the details of the modeling method here and discuss how to implement its use for modeling agent-based systems. Petri Nets have been used extensively in the modeling of functional agents, those agents who have defined purposes and whose actions should result in a known outcome. However, emergent agents, those agents who have a defined structure but whose interaction causes outcomes that are unpredictable, have not yet found a modeling style that suits them. A problem with formally modeling emergent agents is that any formal modeling style is usually expected to show the results of a problem, and the results of problems studied using emergent agents are not apparent from the initial construction. However, the study of emergent agents still requires a method to analyze the agents themselves and to have sensible conversations about the differences and similarities between types of emergent agents. We attempt to correct this problem by applying Petri Nets to the characterization of emergent agents. In doing so, the emergent properties of these agents can be highlighted, and conversation about the nature and compatibility of the differing methods of agent creation can begin.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.
Non-perturbative background field calculations
NASA Astrophysics Data System (ADS)
Stephens, C. R.
1988-01-01
New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.
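The Jacobi-field technique alluded to here is captured in one dimension by the standard Gelfand-Yaglom result, quoted below as general background rather than as a formula from the paper: the ratio of one-loop determinants follows from the solution of a homogeneous ODE,

\[ \frac{\det\left(-\partial_t^2 + V(t)\right)}{\det\left(-\partial_t^2 + V_0\right)} = \frac{\psi(T)}{\psi_0(T)}, \]

where \(\psi\) and \(\psi_0\) solve \((-\partial_t^2 + V)\psi = 0\) and \((-\partial_t^2 + V_0)\psi_0 = 0\) with \(\psi(0) = \psi_0(0) = 0\) and \(\dot\psi(0) = \dot\psi_0(0) = 1\). No eigenvalue is ever computed individually, which is precisely the economy the abstract describes.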
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina
Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).
NASA Technical Reports Server (NTRS)
Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey
1993-01-01
Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.
Multicriteria decision analysis: Overview and implications for environmental decision making
Hermans, Caroline M.; Erickson, Jon D.; Messner, Frank; Ring, Irene
2007-01-01
Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
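For readers unfamiliar with outranking, the sketch below computes PROMETHEE-style net preference flows under the simplifying assumption of a linear preference function on each criterion; the alternatives, criteria, weights, and thresholds are all hypothetical, and PROMETHEE proper offers several other preference function shapes.

    import itertools

    def promethee_net_flows(scores, weights, thresholds):
        """Net flows phi(a) = positive flow - negative flow, with a linear
        preference ramp on [0, p] per criterion (higher score = better)."""
        alts = list(scores)
        def pref(a, b):
            return sum(w * min(max((scores[a][c] - scores[b][c]) / thresholds[c], 0.0), 1.0)
                       for c, w in weights.items())
        phi = {a: 0.0 for a in alts}
        for a, b in itertools.permutations(alts, 2):
            phi[a] += pref(a, b) / (len(alts) - 1)
            phi[b] -= pref(a, b) / (len(alts) - 1)
        return phi

    # Hypothetical watershed options scored on two criteria.
    scores = {'optionA': {'affordability': 3, 'habitat': 8},
              'optionB': {'affordability': 7, 'habitat': 5}}
    print(promethee_net_flows(scores, {'affordability': 0.4, 'habitat': 0.6},
                              {'affordability': 5.0, 'habitat': 5.0}))

A positive net flow marks an alternative that, on balance, outranks the others, which is the quantity stakeholders compare.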
Java PathExplorer: A Runtime Verification Tool
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2001-01-01
We describe recent work on designing an environment called Java PathExplorer for monitoring the execution of Java programs. This environment facilitates the testing of execution traces against high level specifications, including temporal logic formulae. In addition, it contains algorithms for detecting classical error patterns in concurrent programs, such as deadlocks and data races. An initial prototype of the tool has been applied to the executive module of the planetary Rover K9, developed at NASA Ames. In this paper we describe the background and motivation for the development of this tool, including comments on how it relates to formal methods tools as well as to traditional testing, and we then present the tool itself.
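The trace-checking idea is easy to demonstrate in miniature: the sketch below monitors a finite event log against the response property "every request is eventually acknowledged". The event names and log are hypothetical, and Java PathExplorer itself instruments Java bytecode and handles temporal logic and concurrency analyses well beyond this toy check.

    def unanswered_requests(trace, request, response):
        """Return indices of `request` events never followed by a matching
        `response`; an empty list means the property holds on the trace."""
        pending = []
        for i, event in enumerate(trace):
            if event == request:
                pending.append(i)
            elif event == response and pending:
                pending.pop(0)
        return pending

    # Hypothetical executive log: the second command is never acknowledged.
    trace = ['cmd', 'ack', 'cmd', 'telemetry']
    print(unanswered_requests(trace, 'cmd', 'ack'))  # [2]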
2004-02-01
ISU – "Identifying SID with user2sid"; IAS – … "null sessions"; FUE – "Finger Users Enumeration"; UTFTP – "Use of Trivial File Transfer Protocol for Unix enumerating by stealing /etc/passwd and/or /etc/hosts.equiv and/or ~/.rhosts"; … "Ping of Death"; UF – "UDP flooding"; IFS – "Storm of inquiries to FTP-server"; APF – "Access to Password File .passwd"; WDPF – "Writing of Data with …
The Creative Power of Formal Analogies in Physics: The Case of Albert Einstein
NASA Astrophysics Data System (ADS)
Gingras, Yves
2015-07-01
In order to show how formal analogies between different physical systems do important conceptual work in physics, this paper analyzes the evolution of Einstein's thoughts on the structure of radiation from the point of view of the formal analogies he used as "lenses" to "see" through the "black box" of Planck's blackbody radiation law. A comparison is also made with his 1925 paper on the quantum gas, where he used the same formal methods. Changes of formal points of view are most often taken for granted or passed over in silence in studies on the mathematization of physics, as if they had no special significance. Revisiting Einstein's classic papers on the nature of light and matter from the angle of the various theoretical tools he used, namely entropy and energy fluctuation calculations, helps explain why he was in a unique position to make visible the particle structure of radiation and the dual (particle and wave) nature of light and matter. Finally, this case study calls attention to the more general question of the surprising creative power of formal analogies and their frequent use in theoretical physics. This aspect of intellectual creation can be useful in the teaching of physics.
Formalization of the engineering science discipline - knowledge engineering
NASA Astrophysics Data System (ADS)
Peng, Xiao
Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. The subsequent engineers were not aware of the previous developments, because these projects were documented in a manner that prevented the next generation of engineers from benefiting from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from the experienced to the next generation of engineers. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities has yet to be developed. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. Propose and develop an innovative Knowledge-Based System (KBS), the AVD KBS, forming a systematic approach facilitating knowledge management. 4. Demonstrate the efficiency advantages of the AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, the AVD KBS. Formalizing knowledge is shown to significantly improve the effectiveness of aerospace knowledge retention and utilization.
Formal Methods for Biological Systems: Languages, Algorithms, and Applications
2016-09-01
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.
2017-01-01
Launch vehicle programs require vertically complete atmospheric profiles. Many systems exist at the Eastern Range (ER) to make the necessary measurements, but all have different EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool in Python to create a vertically complete profile from multiple inputs. Forward work: finish formal testing, acceptance testing, and end-to-end testing; formal release.
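A minimal sketch of the splicing step, under our own assumption (not stated in the report) that input sources are ranked by priority and the output takes the highest-priority value available at each altitude level; the instruments and numbers are invented.

    def splice_profiles(profiles, levels):
        """Build a vertically complete profile: at each altitude level, take
        the value from the first (highest-priority) source covering it."""
        merged = {}
        for level in levels:
            for profile in profiles:  # earlier entries = higher priority
                if level in profile:
                    merged[level] = profile[level]
                    break
        return merged

    # Hypothetical inputs: a balloon sounding (preferred) and a model analysis.
    balloon = {0: 287.1, 500: 284.0, 1000: 281.2}
    model = {0: 288.0, 500: 284.5, 1000: 281.0, 1500: 278.3, 2000: 275.0}
    print(splice_profiles([balloon, model], [0, 500, 1000, 1500, 2000]))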
Observational Research: Formalized Curiosity
ERIC Educational Resources Information Center
Skaggs, Paul
2004-01-01
Design research is a valuable tool to help the designer understand the problem that he/she needs to solve. The purpose of design research is to help state or understand the problems better, which will lead to better solutions. Observational research is a design research method for helping the designer understand and define the problem.…
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
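The flavor of such a translation can be shown in a few lines: each gate becomes an input/output relation, and the circuit's implementation predicate is the conjunction of those relations, in the relational style used for hardware verification in HOL. The netlist format and translator below are a toy of ours, not the paper's tool.

    def netlist_to_hol(name, gates):
        """Translate a gate-level netlist into a HOL-style implementation
        predicate. Each gate is (output_wire, operator, [input_wires])."""
        def gate_term(out, op, ins):
            if op == 'NOT':
                return '({} = ~{})'.format(out, ins[0])
            conn = {'AND': ' /\\ ', 'OR': ' \\/ '}[op]
            return '({} = ({}))'.format(out, conn.join(ins))
        return '{}_IMP = {}'.format(name, ' /\\ '.join(gate_term(*g) for g in gates))

    # Hypothetical NAND built from AND and NOT primitives.
    gates = [('n1', 'AND', ['a', 'b']), ('out', 'NOT', ['n1'])]
    print(netlist_to_hol('NAND2', gates))
    # NAND2_IMP = (n1 = (a /\ b)) /\ (out = ~n1)

Verification then amounts to proving, inside the logic, that this implementation predicate implies the behavioral specification of the circuit.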
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
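As background, the Bernoulli connection can be sketched from the standard weak-inversion law \(I_D = I_0 e^{V_{GS}/(n V_T)}\): differentiating gives \(\dot I_D = I_D \dot V_{GS}/(n V_T)\), and combining this with the current balance at a capacitor node yields a first-order nonlinear ODE of Bernoulli type,

\[ \frac{dI_D}{dt} + a(t)\, I_D = b(t)\, I_D^{2}, \]

which the substitution \(w = 1/I_D\) turns into a linear ODE; this linearization is what makes log-domain circuits externally linear. The coefficient functions \(a\) and \(b\) here are schematic placeholders, not the paper's expressions.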
Assessment of parametric uncertainty for groundwater reactive transport modeling
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of the formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
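The contrast between the two likelihoods can be sketched as follows. This is our simplified reading: it keeps the heteroscedastic and AR(1) ingredients of the formal generalized likelihood of Schoups and Vrugt (2010) but treats the whitened innovations as Gaussian, whereas the full formulation also accommodates skewed and heavy-tailed innovations; all parameter names are illustrative.

    import numpy as np

    def gaussian_loglik(residuals, sigma):
        """Independent, homoscedastic Gaussian log-likelihood."""
        r = np.asarray(residuals, dtype=float)
        return -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + (r / sigma)**2)

    def generalized_loglik(residuals, sims, sigma0, sigma1, phi):
        """Sketch of a generalized likelihood: AR(1) autocorrelation with
        coefficient phi, error spread growing with the simulated value."""
        r = np.asarray(residuals, dtype=float)
        innov = r[1:] - phi * r[:-1]  # whiten the AR(1) correlation
        sigma = sigma0 + sigma1 * np.asarray(sims, dtype=float)[1:]
        return -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + (innov / sigma)**2)

    r = np.array([0.5, 0.8, 0.6, 0.9])         # hypothetical residuals
    sims = np.array([10.0, 12.0, 11.0, 13.0])  # hypothetical simulated values
    print(gaussian_loglik(r, sigma=0.7))
    print(generalized_loglik(r, sims, sigma0=0.1, sigma1=0.05, phi=0.6))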
Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.
ERIC Educational Resources Information Center
Jackson, Robert B.; And Others
1995-01-01
Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…
Kon, Alexander A.; Klug, Michael
2010-01-01
Ethicists recommend that investigators assess subjects' comprehension prior to accepting their consent as valid. Because children represent an at-risk population, ensuring adequate comprehension in pediatric research is vital. We surveyed all corresponding authors of research articles published over a six-month period in five leading adult and pediatric journals. Our goal was to assess how often subjects' comprehension or decisional capacity was assessed in the consent process, whether there was any difference between adult and pediatric research projects, and the rate at which investigators use formal or validated tools to assess capacity. Responses from 102 authors were analyzed (response rate 56%). Approximately two-thirds of respondents stated that they assessed comprehension or decisional capacity prior to accepting consent, and we found no difference between adult and pediatric researchers. Nine investigators used a formal questionnaire, and three used a validated tool. These findings suggest that fewer investigators than expected assess comprehension and decisional capacity, and that the use of standardized and validated tools is the exception rather than the rule. PMID:19385838
Dynamic principle for ensemble control tools.
Samoletov, A; Vasiev, B
2017-11-28
Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
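A familiar concrete instance of such a tool, quoted as standard background rather than from the paper, is the Nosé-Hoover thermostat for a single degree of freedom:

\[ \dot q = \frac{p}{m}, \qquad \dot p = F(q) - \xi p, \qquad \dot\xi = \frac{1}{Q}\left(\frac{p^2}{m} - k_B T\right), \]

where the auxiliary variable \(\xi\) rescales momenta so that the time average of \(p^2/m\) is driven toward \(k_B T\), and \(Q\) sets the thermostat's responsiveness; the Nosé-Hoover-Langevin scheme mentioned above additionally subjects \(\xi\) to a stochastic Ornstein-Uhlenbeck forcing.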
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
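The two atomic operators can be sketched over a plain dictionary-based graph; the toy graph and its 'dept' attribute are hypothetical, and the published algebra contains more operators than these two.

    def select(graph, node_pred):
        """Selection: keep nodes satisfying the predicate, and edges among them."""
        nodes = {n: a for n, a in graph['nodes'].items() if node_pred(a)}
        edges = [(u, v) for u, v in graph['edges'] if u in nodes and v in nodes]
        return {'nodes': nodes, 'edges': edges}

    def aggregate(graph, key):
        """Aggregation: merge nodes sharing an attribute value into super-nodes;
        parallel edges collapse into one weighted edge."""
        group = {n: a[key] for n, a in graph['nodes'].items()}
        weights = {}
        for u, v in graph['edges']:
            e = (group[u], group[v])
            weights[e] = weights.get(e, 0) + 1
        return {'nodes': {g: {key: g} for g in set(group.values())},
                'edges': [(u, v, w) for (u, v), w in weights.items()]}

    g = {'nodes': {'a': {'dept': 'x'}, 'b': {'dept': 'x'}, 'c': {'dept': 'y'}},
         'edges': [('a', 'b'), ('b', 'c'), ('a', 'c')]}
    print(select(g, lambda a: a['dept'] == 'x'))
    print(aggregate(g, 'dept'))

Chaining such operators is what gives the analyst a reusable, documentable record of an exploration session.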
Development of a Peer Teaching-Assessment Program and a Peer Observation and Evaluation Tool
Trujillo, Jennifer M.; Barr, Judith; Gonyeau, Michael; Van Amburgh, Jenny A.; Matthews, S. James; Qualters, Donna
2008-01-01
Objectives To develop a formalized, comprehensive, peer-driven teaching assessment program and a valid and reliable assessment tool. Methods A volunteer taskforce was formed and a peer-assessment program was developed using a multistep, sequential approach and the Peer Observation and Evaluation Tool (POET). A pilot study was conducted to evaluate the efficiency and practicality of the process and to establish interrater reliability of the tool. Intra-class correlation coefficients (ICC) were calculated. Results ICCs for 8 separate lectures evaluated by 2-3 observers ranged from 0.66 to 0.97, indicating good interrater reliability of the tool. Conclusion Our peer assessment program for large classroom teaching, which includes a valid and reliable evaluation tool, is comprehensive, feasible, and can be adopted by other schools of pharmacy. PMID:19325963
Standardized terminology for clinical trial protocols based on top-level ontological categories.
Heller, B; Herre, H; Lippoldt, K; Loeffler, M
2004-01-01
This paper describes a new method for the ontologically based standardization of concepts with regard to the quality assurance of clinical trial protocols. We developed a data dictionary for medical and trial-specific terms in which concepts and relations are defined context-dependently. The data dictionary is provided to different medical research networks by means of the software tool Onto-Builder via the internet. The data dictionary is based on domain-specific ontologies and the top-level ontology of GOL. The concepts and relations described in the data dictionary are represented in natural language, semi-formally or formally according to their use.
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
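The underlying model of floating-point rounding, standard background rather than material from the paper, is

\[ \mathrm{fl}(x \circ y) = (x \circ y)(1 + \delta), \qquad |\delta| \le u, \qquad \circ \in \{+, -, \times, /\}, \]

where \(u\) is the unit roundoff (\(2^{-53}\) for IEEE 754 binary64). Composing such factors over an expression tree is what yields symbolic error estimations; for instance, summing \(n\) terms accumulates a relative error bounded by \((n-1)u + O(u^2)\).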
ERIC Educational Resources Information Center
Polavaram, Sridevi
2016-01-01
Neuroscience can greatly benefit from using novel methods in computer science and informatics, which enable knowledge discovery in unexpected ways. Currently one of the biggest challenges in Neuroscience is to map the functional circuitry of the brain. The applications of this goal range from understanding structural reorganization of neurons to…
ERIC Educational Resources Information Center
Weinberg, Richard A., Ed.; Wood, Frank H., Ed.
Presented are 12 papers which focus on four systematized methods of classroom observation. Stressed is the importance of formal, systematic observation as a tool for viewing and recording pupil behaviors and ensuring that the individual child's needs are met in both the mainstream and special education settings. R. Brandt offers an historical…
An Analysis of Pre-Service Elementary Teachers' Understanding of Inquiry-Based Science Teaching
ERIC Educational Resources Information Center
Lee, Carole K.; Shea, Marilyn
2016-01-01
This study examines how pre-service elementary teachers (PSETs) view inquiry-based science learning and teaching, and how the science methods course builds their confidence to teach inquiry science. Most PSETs think that inquiry is asking students questions rather than a formal set of pedagogical tools. In the present study, three groups of PSETs…
Elder, Hinemoa; Kersten, Paula
2015-01-01
The importance of tools for the measurement of outcomes and needs in traumatic brain injury is well recognised. The development of tools for these injuries in indigenous communities has been limited despite the well-documented disparity of brain injury. The wairua theory of traumatic brain injury (TBI) in Māori proposes that a culturally defined injury occurs in tandem with the physical injury. A cultural response is therefore indicated. This research investigates a Māori method used in the development of a cultural needs assessment tool designed to further examine needs associated with the culturally determined injury, in preparation for formal validation. Whakawhiti kōrero is a method used to develop better statements in the development of the assessment tool. Four wānanga (traditional fora) were held, including one with whānau (extended family) with experience of traumatic brain injury. The approach was well received. A final version, Te Waka Kuaka, is now ready for validation. Whakawhiti kōrero is an indigenous method used in the development of a cultural needs assessment tool in Māori traumatic brain injury. This method is likely to have wider applicability, such as in Mental Health and Addictions Services, to ensure a robust process of outcome measure and needs assessment development.
Shahar, Yuval; Young, Ohad; Shalom, Erez; Mayaffit, Alon; Moskovitch, Robert; Hessing, Alon; Galperin, Maya
2004-01-01
We propose to present a poster (and potentially also a demonstration of the implemented system) summarizing the current state of our work on a hybrid, multiple-format representation of clinical guidelines that facilitates conversion of guidelines from free text to a formal representation. We describe a distributed Web-based architecture (DeGeL) and a set of tools using the hybrid representation. The tools enable performing tasks such as guideline specification, semantic markup, search, retrieval, visualization, eligibility determination, runtime application and retrospective quality assessment. The representation includes four parallel formats: Free text (one or more original sources); semistructured text (labeled by the target guideline-ontology semantic labels); semiformal text (which includes some control specification); and a formal, machine-executable representation. The specification, indexing, search, retrieval, and browsing tools are essentially independent of the ontology chosen for guideline representation, but editing the semi-formal and formal formats requires ontology-specific tools, which we have developed in the case of the Asbru guideline-specification language. The four formats support increasingly sophisticated computational tasks. The hybrid guidelines are stored in a Web-based library. All tools, such as for runtime guideline application or retrospective quality assessment, are designed to operate on all representations. We demonstrate the hybrid framework by providing examples from the semantic markup and search tools.
2015-11-01
2.3.4 Input/Output Automata … various other modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, and multi-dimensional SDF are also used for designing … Fragments of a framework comparison: … formal, ideally suited to model DSP applications; 3. Petri Nets – graphical, formal, used for modeling distributed systems; 4. I/O Automata – both, formal.
Using SCR methods to analyze requirements documentation
NASA Technical Reports Server (NTRS)
Callahan, John; Morrison, Jeffery
1995-01-01
Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the Operations Concept (OC) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.
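Two of the consistency checks such an analysis performs can be mimicked mechanically: no mode may have two transitions enabled by the same event (determinism), and no event may be left unhandled in a mode (completeness). The two-mode chart below is hypothetical, and real SCR tables carry conditions and events far richer than bare event names, so this is an illustration only.

    def check_mode_table(table, events):
        """table maps mode -> list of (event, next_mode); `events` is the
        full event alphabet. Returns detected defects."""
        problems = []
        for mode, rows in table.items():
            triggers = [e for e, _ in rows]
            for e in set(triggers):
                if triggers.count(e) > 1:
                    problems.append(('nondeterministic', mode, e))
            for e in events - set(triggers):
                problems.append(('unspecified', mode, e))
        return problems

    table = {'NORMAL': [('fault', 'DEGRADED'), ('reset', 'NORMAL')],
             'DEGRADED': [('reset', 'NORMAL')]}
    print(check_mode_table(table, {'fault', 'reset'}))
    # [('unspecified', 'DEGRADED', 'fault')]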
A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams
NASA Technical Reports Server (NTRS)
Tejada, Arturo
2009-01-01
An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and related to the fundamental theoretical concepts.
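The cantilever physics the report reviews rests on standard Euler-Bernoulli results, quoted here as background rather than from the report: free vibrations of a uniform cantilever satisfy \(EI\, w'''' = \rho A\, \omega^2 w\), the clamped-free boundary conditions force \(\cos(\beta L)\cosh(\beta L) = -1\) with \(\beta^4 = \rho A \omega^2/(EI)\), and the natural frequencies follow as

\[ \omega_n = (\beta_n L)^2 \sqrt{\frac{EI}{\rho A L^4}}, \qquad \beta_1 L \approx 1.875, \quad \beta_2 L \approx 4.694. \]

A local crack reduces the stiffness \(EI\) near its location, shifting these frequencies and perturbing the mode-shape curvature, which is what mode-shape-based detection exploits.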
Quasipolynomial generalization of Lotka-Volterra mappings
NASA Astrophysics Data System (ADS)
Hernández-Bermejo, Benito; Brenig, Léon
2002-07-01
In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in very diverse fields such as physics, population dynamics, chemistry and economy. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable as far as the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It will be demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings as well as a new range of possibilities that become open not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.
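For orientation, the continuous-time quasipolynomial (QP) format referred to here is, in its usual statement,

\[ \dot x_i = x_i \left( \lambda_i + \sum_{j=1}^{m} A_{ij} \prod_{k=1}^{n} x_k^{B_{jk}} \right), \qquad i = 1, \dots, n, \]

which reduces to the Lotka-Volterra form \(\dot x_i = x_i \left(\lambda_i + \sum_j A_{ij} x_j\right)\) when \(B\) is the identity; the paper's contribution is a discrete-time counterpart in which Lotka-Volterra mappings play the analogous canonical role. The notation above follows the common QP literature and is given as context, not copied from this paper.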
2013-09-01
… to an XML file, code that Bonine [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a …
McNamee, R L; Eddy, W F
2001-12-01
Analysis of variance (ANOVA) is widely used for the study of experimental data. Here, the reach of this tool is extended to cover the preprocessing of functional magnetic resonance imaging (fMRI) data. This technique, termed visual ANOVA (VANOVA), provides both numerical and pictorial information to aid the user in understanding the effects of various parts of the data analysis. Unlike a formal ANOVA, this method does not depend on the mathematics of orthogonal projections or strictly additive decompositions. An illustrative example is presented and the application of the method to a large number of fMRI experiments is discussed. Copyright 2001 Wiley-Liss, Inc.
Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Mcmanus, John William
1992-01-01
Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
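The control cycle of a blackboard system is compact enough to sketch: knowledge sources watch a shared store, and a control loop fires an applicable source until quiescence. The pipeline below is a deliberately trivial, hypothetical example, with none of the concurrency or specialized hosting discussed in the thesis.

    class Blackboard:
        """Minimal blackboard: shared data plus a loop that repeatedly fires
        the first applicable knowledge source."""
        def __init__(self, data, sources):
            self.data = data        # shared problem state
            self.sources = sources  # list of (condition, action) pairs

        def run(self, max_cycles=100):
            for _ in range(max_cycles):
                for condition, action in self.sources:
                    if condition(self.data):
                        action(self.data)
                        break
                else:
                    break  # no knowledge source applicable: quiescence
            return self.data

    # Two hypothetical knowledge sources cooperating on a trivial problem.
    ks = [
        (lambda d: 'raw' in d and 'parsed' not in d,
         lambda d: d.update(parsed=d['raw'].split())),
        (lambda d: 'parsed' in d and 'count' not in d,
         lambda d: d.update(count=len(d['parsed']))),
    ]
    print(Blackboard({'raw': 'a b c'}, ks).run())
    # {'raw': 'a b c', 'parsed': ['a', 'b', 'c'], 'count': 3}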
The production of audiovisual teaching tools in minimally invasive surgery.
Tolerton, Sarah K; Hugh, Thomas J; Cosman, Peter H
2012-01-01
Audiovisual learning resources have become valuable adjuncts to formal teaching in surgical training. This report discusses the process and challenges of preparing an audiovisual teaching tool for laparoscopic cholecystectomy. The relative value in surgical education and training, for both the creator and the viewer, is addressed. This audiovisual teaching resource was prepared as part of the Master of Surgery program at the University of Sydney, Australia. The different methods of video production used to create operative teaching tools are discussed. Collating and editing material for an audiovisual teaching resource can be a time-consuming and technically challenging process. However, quality learning resources can now be produced even with limited prior video editing experience. With minimal cost and suitable guidance to ensure clinically relevant content, most surgeons should be able to produce short, high-quality education videos of both open and minimally invasive surgery. Despite the challenges faced during production of audiovisual teaching tools, these resources are now relatively easy to produce using readily available software. These resources are particularly attractive to surgical trainees when real-time operative footage is used. They serve as valuable adjuncts to formal teaching, particularly in the setting of minimally invasive surgery. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Integration of tools for binding archetypes to SNOMED CT
Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Åhlfeldt, Hans; Rector, Alan
2008-01-01
Background The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Methods Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. Results An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Conclusion Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail. PMID:19007444
NASA Technical Reports Server (NTRS)
Windley, P. J.
1991-01-01
In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level Boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
An Introduction to the BFS Method and Its Use to Model Binary NiAl Alloys
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, J.; Amador, C.
1998-01-01
We introduce the Bozzolo-Ferrante-Smith (BFS) method for alloys as a computationally efficient tool for aiding in the process of alloy design. An intuitive description of the BFS method is provided, followed by a formal discussion of its implementation. The method is applied to the study of the defect structure of NiAl binary alloys. The groundwork is laid for a detailed progression to higher order NiAl-based alloys linking theoretical calculations and computer simulations based on the BFS method and experimental work validating each step of the alloy design process.
Incompleteness of Bluetooth protocol conformance test cases
NASA Astrophysics Data System (ADS)
Wu, Peng; Gao, Qiang
2001-10-01
This paper describes a formal method to verify the completeness of conformance testing, in which not only the Implementation Under Test (IUT) is formalized in SDL, but the conformance tester is also described in SDL, so that conformance testing can be performed in a simulator provided with a CASE tool. The protocol set considered is Bluetooth, an open wireless communication technology. Our research results show that the Bluetooth conformance test specification is not complete in that it has only limited coverage, and many important capabilities defined in the Bluetooth core specification are not tested. We also give a detailed report on the missing test cases against the Bluetooth core specification, and provide a guide for further test case generation in the future.
Mending the Gap, An Effort to Aid the Transfer of Formal Methods Technology
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly
2009-01-01
Formal methods can be applied to many of the development and verification activities required for civil avionics software. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification, gives a brief description of using formal methods as an alternate method of compliance with the objectives of that standard. Despite this, the avionics industry at large has been hesitant to adopt formal methods, and few developers have actually used formal methods for certification credit. Why is this so, given the volume of evidence of the benefits of formal methods? This presentation will explore some of the challenges to using formal methods in a certification context and describe the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to develop guidance to make the use of formal methods a recognized approach.
A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.
Grando, M Adela; Glasspool, David; Fox, John
2012-01-01
To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN-based pattern P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results from a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
Non-Formal Education: Interest in Human Capital
ERIC Educational Resources Information Center
Ivanova, I. V.
2016-01-01
We define non-formal education as a part of general education, which gives students the required tools for cognition and creativity. It allows them to fully realize their self-potential and to set their own professional and personal goals. In this article, we outline the fundamental differences between general and non-formal education from the…
Safety Verification of the Small Aircraft Transportation System Concept of Operations
NASA Technical Reports Server (NTRS)
Carreno, Victor; Munoz, Cesar
2005-01-01
A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification is performed in the Prototype Verification System (PVS), which is a computer based specification language and a theorem proving assistant.
Model-Driven Test Generation of Distributed Systems
NASA Technical Reports Server (NTRS)
Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin
2012-01-01
This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.
Forum on Workforce Development
NASA Technical Reports Server (NTRS)
Hoffman, Edward
2010-01-01
APPEL Mission: To support NASA's mission by promoting individual, team, and organizational excellence in program/project management and engineering through the application of learning strategies, methods, models, and tools. Goals: a) Provide a common frame of reference for NASA s technical workforce. b) Provide and enhance critical job skills. c) Support engineering, program and project teams. d) Promote organizational learning across the agency. e) Supplement formal educational programs.
ERIC Educational Resources Information Center
Jon Schneller, Andrew; Schofield, Casey A.; Frank, Jenna; Hollister, Eliza; Mamuszka, Lauren
2015-01-01
This article reports on a mixed methods evaluation of an indoor garden-based learning curriculum for 5th and 6th graders which incorporated aquaponics and hydroponics technologies. This study provides a better understanding of the extent to which indoor gardening technologies can be used within the formal curriculum as an effective teaching tool.…
Bruns, David E; Burtis, Carl A; Gronowski, Ann M; McQueen, Matthew J; Newman, Anthony; Jonsson, Jon J
2015-03-10
Ethical considerations are increasingly important in medicine. We aimed to determine the mode and extent of teaching of ethics in training programs in clinical chemistry and laboratory medicine. We developed an on-line survey of teaching in areas of ethics relevant to laboratory medicine. Responses were invited from directors of training programs who were recruited via email to leaders of national organizations. The survey was completed by 80 directors from 24 countries who directed 113 programs. The largest numbers of respondents directed postdoctoral training of scientists (42%) or physicians (33%), post-master's degree programs (33%), and PhD programs (29%). Most programs (82%) were 2 years or longer in duration. Formal training was offered in research ethics by 39%, medical ethics by 31%, professional ethics by 24% and business ethics by 9%. The number of reported hours of formal training varied widely, e.g., from 0 to >15 h/year for research ethics and from 0 to >15 h for medical ethics. Ethics training was required and/or tested in 75% of programs that offered training. A majority (54%) of respondents reported plans to add or enhance training in ethics; many indicated a desire for online resources related to ethics, especially resources with self-assessment tools. Formal teaching of ethics is absent from many training programs in clinical chemistry and laboratory medicine, with heterogeneity in the extent and methods of ethics training among the programs that provide the training. A perceived need exists for online training tools, especially tools with self-assessment components. Copyright © 2014 Elsevier B.V. All rights reserved.
Using formal methods for content validation of medical procedure documents.
Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia
2017-08-01
We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
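One family of checks such a graph model enables can be shown mechanically: reachability of every step from the entry point, and detection of non-terminal dead ends. The SOP fragment below is invented for illustration; the paper's methodology covers a broader catalogue of ambiguities and omissions than these two structural defects.

    def check_procedure(steps, edges, start):
        """steps maps step -> is_terminal flag; edges are directed
        (from_step, to_step) pairs. Returns (unreachable, dead_ends)."""
        succ = {s: [] for s in steps}
        for a, b in edges:
            succ[a].append(b)
        reached, stack = set(), [start]
        while stack:
            s = stack.pop()
            if s not in reached:
                reached.add(s)
                stack.extend(succ[s])
        unreachable = set(steps) - reached
        dead_ends = {s for s in reached if not succ[s] and not steps[s]}
        return unreachable, dead_ends

    steps = {'assess': False, 'dose': False, 'observe': False, 'discharge': True}
    edges = [('assess', 'dose'), ('dose', 'observe')]  # 'discharge' never linked
    print(check_procedure(steps, edges, 'assess'))
    # ({'discharge'}, {'observe'})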
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
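The masking behavior that makes TMR confirmation hard can be seen in a few lines: a majority voter hides a single corrupted replica, so observing correct voted outputs does not confirm correct insertion. The 1-bit example below is hypothetical and is not the proposed tool; it merely illustrates why per-replica formal equivalence checking is needed.

    def vote(a, b, c):
        """Bitwise majority voter, the core of a TMR-protected signal."""
        return (a & b) | (b & c) | (a & c)

    def voted_output_matches(module, replicas):
        """Exhaustively compare the voted replica output with the original
        module over a 1-bit input domain."""
        return all(vote(*(r(x) for r in replicas)) == module(x) for x in (0, 1))

    inverter = lambda x: x ^ 1
    print(voted_output_matches(inverter, [inverter, inverter, inverter]))  # True
    # One corrupted replica is masked by the voter, yet the vote still passes:
    print(voted_output_matches(inverter, [inverter, inverter, lambda x: x]))  # True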
An Ontology for State Analysis: Formalizing the Mapping to SysML
NASA Technical Reports Server (NTRS)
Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel
2012-01-01
State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.
Tools, information sources, and methods used in deciding on drug availability in HMOs.
Barner, J C; Thomas, J
1998-01-01
The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witzel, Wayne; Rudinger, Kenneth Michael; Sarovar, Mohan
Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users the ability to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach that allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.
Ceccarelli, Michele; Cerulo, Luigi; Santone, Antonella
2014-10-01
Reverse engineering of gene regulatory relationships from genomics data is a crucial task to dissect the complex underlying regulatory mechanism occurring in a cell. From a computational point of view the reconstruction of gene regulatory networks is an underdetermined problem, as the number of possible solutions is typically large relative to the number of available independent data points. Many possible solutions can fit the available data, explaining the data equally well, but only one of them can be the biologically true solution. Several strategies have been proposed in the literature to reduce the search space and/or extend the amount of independent information. In this paper we propose a novel algorithm based on formal methods, mathematically rigorous techniques widely adopted in engineering to specify and verify complex software and hardware systems. Starting from a formal specification of gene regulatory hypotheses, we are able to mathematically prove whether or not a time course experiment satisfies the formal specification, determining in fact whether a gene regulation exists or not. The method is able to detect both direction and sign (inhibition/activation) of regulations, whereas most literature methods are limited to undirected and/or unsigned relationships. We empirically evaluated the approach on experimental and synthetic datasets in terms of precision and recall. In most cases we observed high levels of accuracy, outperforming the current state of the art, although the computational cost increases exponentially with the size of the network. We made the tool implementing the algorithm available at the following url: http://www.bioinformatics.unisannio.it. Copyright © 2014 Elsevier Inc. All rights reserved.
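As a toy illustration of the idea (not the paper's model-checking machinery): encode an "A activates B" hypothesis as a temporal property and test a discretized time course against it. The thresholds, window, and data below are invented assumptions.

```python
# Hedged sketch: a regulatory hypothesis as a temporal property over a
# discretized time course. Window size and discretization are illustrative.

def discretize(series, eps=0.1):
    """Map expression changes to +1 (up), -1 (down), 0 (steady)."""
    return [0 if abs(b - a) < eps else (1 if b > a else -1)
            for a, b in zip(series, series[1:])]

def supports_activation(src, tgt, window=3, eps=0.1):
    """True iff every up-step of src is followed by an up-step of tgt."""
    ds, dt = discretize(src, eps), discretize(tgt, eps)
    return all(1 in dt[t + 1:t + 1 + window]
               for t, step in enumerate(ds) if step == 1)

a = [0.1, 0.5, 0.9, 1.0, 1.0]   # candidate regulator (toy data)
b = [0.2, 0.2, 0.6, 0.9, 1.0]   # candidate target (toy data)
print(supports_activation(a, b))  # True: each rise in a precedes a rise in b
```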
Dong, Zhichao; Cheng, Haobo; Tam, Hon-Yuen
2014-04-10
Numerical simulation of subaperture tool influence functions (TIF) is widely known as a critical procedure in computer-controlled optical surfacing. However, it may lack practicability in engineering because the emulation TIF (e-TIF) has some discrepancy with the practical TIF (p-TIF), and the removal rate cannot be predicted by simulations. Prior to the polishing of a formal workpiece, opticians have to conduct TIF spot experiments on another sample to confirm the p-TIF with a quantitative removal rate, which is difficult and time-consuming for sequential polishing runs with different tools. This work is dedicated to applying these e-TIFs in practical engineering by making improvements in two respects: (1) it modifies the pressure distribution model of a flat-pitch polisher by finite element analysis and least-squares fitting to bring the removal shape of e-TIFs closer to p-TIFs (less than 5% relative deviation, validated by experiments); (2) it predicts the removal rate of e-TIFs by reverse-calculating the material removal volume of a pre-polishing run on the formal workpiece (relative deviations of peak and volume removal rate validated to be less than 5%). This makes it possible to omit TIF spot experiments for the particular flat-pitch tool employed and promotes the direct use of e-TIFs in the optimization of a dwell time map, which can greatly reduce cost and increase fabrication efficiency.
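The abstract leaves the removal model implicit; TIF modeling in this literature commonly rests on Preston's law (removal rate proportional to pressure times relative speed), and the reverse calculation then amounts to fitting the Preston constant from a measured pre-polish run. A Python sketch with invented numbers:

```python
# Hedged sketch assuming a Preston-law removal model; all values are
# illustrative assumptions, not the paper's measurements.
import numpy as np

def preston_removal(pressure, speed, K, dwell):
    """Material removed per point: dz = K * p * v * t (Preston's law)."""
    return K * pressure * speed * dwell

# Calibrate K by reverse-calculating from a measured removal volume:
measured_volume = 2.4e-3      # mm^3, assumed pre-polish measurement
predicted_at_K1 = 8.0e-2      # mm^3 predicted with K = 1 (assumed)
K = measured_volume / predicted_at_K1

p = np.full((64, 64), 0.02)   # MPa, uniform pressure patch (assumption)
v = np.full((64, 64), 150.0)  # mm/s relative speed (assumption)
tif = preston_removal(p, v, K, dwell=1.0)  # removal map for 1 s dwell
print(tif.max())
```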
Exploring Formalized Elite Coach Mentoring Programmes in the UK: 'We've Had to Play the Game'
ERIC Educational Resources Information Center
Sawiuk, Rebecca; Taylor, William G.; Groom, Ryan
2018-01-01
Formalized mentoring programmes have been implemented increasingly by UK sporting institutions as a central coach development tool, yet claims supporting formal mentoring as an effective learning strategy are often speculative, scarce, ill-defined and accepted without verification. The aim of this study, therefore, was to explore some of the…
Toward a mathematical formalism of performance, task difficulty, and activation
NASA Technical Reports Server (NTRS)
Samaras, George M.
1988-01-01
The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.
How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations
NASA Astrophysics Data System (ADS)
Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev
With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.
Helping System Engineers Bridge the Peaks
NASA Technical Reports Server (NTRS)
Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen
2014-01-01
In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.
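The abstract stops short of showing the notation itself; as a purely hypothetical illustration (the predicates engaged and altHold are invented here), a requirement of the kind described might be captured in a temporal logic such as LTL, which both V&V tools and test generators can consume:

```latex
% Hypothetical example; "engaged" and "altHold" are invented predicates.
% An English requirement such as "whenever the autopilot is engaged, an
% altitude-hold command shall be issued within two steps" becomes:
\mathbf{G}\left(\mathit{engaged} \;\rightarrow\;
  \left(\mathit{altHold} \;\lor\; \mathbf{X}\,\mathit{altHold}
        \;\lor\; \mathbf{X}\mathbf{X}\,\mathit{altHold}\right)\right)
% G = "globally" (always), X = "at the next step".
```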
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
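To make the BBN idea concrete, here is a two-node Bayesian update in Python. The prior and likelihoods are made-up illustrative numbers, not values from the EXPLORIS project or the tool described above.

```python
# Toy Bayesian update of belief in volcanic unrest given a tremor observation.
p_unrest = 0.10                      # prior belief the volcano is in unrest
p_tremor_given_unrest = 0.70         # likelihoods (assumed)
p_tremor_given_quiet = 0.05

def update(prior, like_h, like_not_h):
    """Posterior P(H | evidence) by Bayes' rule."""
    num = like_h * prior
    return num / (num + like_not_h * (1.0 - prior))

posterior = update(p_unrest, p_tremor_given_unrest, p_tremor_given_quiet)
print(round(posterior, 3))  # ~0.609: the tremor observation raises unrest belief
```

A full BBN chains many such updates across monitoring channels, which is what allows the tool to combine synoptically different strands of observational data.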
Joe, Jonathan; Chaudhuri, Shomir; Le, Thai; Thompson, Hilaire; Demiris, George
2015-08-01
While health information technologies have become increasingly popular, many have not been formally tested to ascertain their usability. Traditional rigorous methods take significant amounts of time and manpower to evaluate the usability of a system. In this paper, we evaluate the use of instant data analysis (IDA) as developed by Kjeldskov et al. to perform usability testing on a tool designed for older adults and caregivers. The IDA method is attractive because it takes significantly less time and manpower than the traditional usability testing methods. In this paper we demonstrate how IDA was used to evaluate usability of a multifunctional wellness tool, discuss study results and lessons learned while using this method. We also present findings from an extension of the method which allows the grouping of similar usability problems in an efficient manner. We found that the IDA method is a quick, relatively easy approach to identifying and ranking usability issues among health information technologies. Copyright © 2015 Elsevier Inc. All rights reserved.
Issues in the Assessment of Social Phobia: A Review
Letamendi, Andrea M.; Chavira, Denise A.; Stein, Murray B.
2010-01-01
Since the emergence of social phobia in DSM nomenclature, the mental health community has witnessed an expansion in standardized methods for the screening, diagnosis, and measurement of the disorder. This article reviews formal assessment methods for social phobia, including diagnostic interview, clinician-administered instruments, and self report questionnaires. Frequently used tools for assessing constructs related to social phobia, such as disability and quality of life, are also briefly presented. This review evaluates each method by highlighting the assessment features recommended in social phobia literature, including method of administration, item content, coverage, length of scale, type of scores generated, and time frame. PMID:19728569
NASA Technical Reports Server (NTRS)
Dunham, J. R. (Editor); Knight, J. C. (Editor)
1982-01-01
The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.
Guidance for Using Formal Methods in a Certification Context
NASA Technical Reports Server (NTRS)
Brown, Duncan; Delseny, Herve; Hayhurst, Kelly; Wiels, Virginie
2010-01-01
This paper discusses some of the challenges to using formal methods in a certification context and describes the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to propose guidance to make the use of formal methods a recognized approach. This guidance, expected to take the form of a Formal Methods Technical Supplement to DO-178C/ED-12C, is described, including the activities that are needed when using formal methods, new or modified objectives with respect to the core DO-178C/ED-12C document, and evidence needed for meeting those objectives.
Two-Point Turbulence Closure Applied to Variable Resolution Modeling
NASA Technical Reports Server (NTRS)
Girimaji, Sharath S.; Rubinstein, Robert
2011-01-01
Variable resolution methods have become frontline CFD tools, but in order to take full advantage of this promising new technology, more formal theoretical development is desirable. Two general classes of variable resolution methods can be identified: hybrid or zonal methods in which RANS and LES models are solved in different flow regions, and bridging or seamless models which interpolate smoothly between RANS and LES. This paper considers the formulation of bridging methods using methods of two-point closure theory. The fundamental problem is to derive a subgrid two-equation model. We compare and reconcile two different approaches to this goal: the Partially Integrated Transport Model, and the Partially Averaged Navier-Stokes method.
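For orientation on what "interpolating smoothly between RANS and LES" means in the Partially Averaged Navier-Stokes method named above: resolution is controlled by the unresolved-to-total ratios, and (as a sketch of the standard PANS formulation, stated here from general knowledge rather than this paper) the dissipation-equation coefficient is modified accordingly:

```latex
f_{k} \;=\; \frac{k_{u}}{k}, \qquad
f_{\varepsilon} \;=\; \frac{\varepsilon_{u}}{\varepsilon}, \qquad
C_{\varepsilon 2}^{*} \;=\; C_{\varepsilon 1}
  \;+\; \frac{f_{k}}{f_{\varepsilon}}
        \left(C_{\varepsilon 2} - C_{\varepsilon 1}\right)
```

Setting f_k = 1 recovers the parent RANS model, while decreasing f_k shifts the closure smoothly toward LES-like resolution.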
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During the runtime, we can check the behavior of the WSN according to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
NASA Technical Reports Server (NTRS)
Kershaw, John
1990-01-01
The VIPER project has so far produced a formal specification of a 32 bit RISC microprocessor, an implementation of that chip in radiation-hard SOS technology, a partial proof of correctness of the implementation which is still being extended, and a large body of supporting software. The time has now come to consider what has been achieved and what directions should be pursued in the future. The most obvious lesson from the VIPER project was the time and effort needed to use formal methods properly. Most of the problems arose in the interfaces between different formalisms, e.g., between the (informal) English description and the HOL spec, between the block-level spec in HOL and the equivalent in ELLA needed by the low-level CAD tools. These interfaces need to be made rigorous or (better) eliminated. VIPER 1A (the latest chip) is designed to operate in pairs, to give protection against breakdowns in service as well as design faults. We have come to regard redundancy and formal design methods as complementary, the one to guard against normal component failures and the other to provide insurance against the risk of the common-cause failures which bedevil reliability predictions. Any future VIPER chips will certainly need improved performance to keep up with increasingly demanding applications. We have a prototype design (not yet specified formally) which includes 32 and 64 bit multiply, instruction pre-fetch, more efficient interface timing, and a new instruction to allow a quick response to peripheral requests. Work is under way to specify this device in MIRANDA, and then to refine the spec into a block-level design by top-down transformations. When the refinement is complete, a relatively simple proof checker should be able to demonstrate its correctness. This paper is presented in viewgraph form.
Inferring subunit stoichiometry from single molecule photobleaching
2013-01-01
Single molecule photobleaching is a powerful tool for determining the stoichiometry of protein complexes. By attaching fluorophores to proteins of interest, the number of associated subunits in a complex can be deduced by imaging single molecules and counting fluorophore photobleaching steps. Because some bleaching steps might be unobserved, the ensemble of steps will be binomially distributed. In this work, it is shown that inferring the true composition of a complex from such data is nontrivial because binomially distributed observations present an ill-posed inference problem. That is, a unique and optimal estimate of the relevant parameters cannot be extracted from the observations. Because of this, a method has not been firmly established to quantify confidence when using this technique. This paper presents a general inference model for interpreting such data and provides methods for accurately estimating parameter confidence. The formalization and methods presented here provide a rigorous analytical basis for this pervasive experimental tool. PMID:23712552
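A compact way to see the ill-posedness described here: model each molecule's observed step count as k ~ Binomial(n, p), with the subunit number n and the detection probability p both unknown, and note the near-flat likelihood ridge in n. A Python sketch with invented data:

```python
# Sketch of the non-identifiability the paper formalizes; data are invented.
from math import comb, log

data = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3]   # bleaching steps per molecule (toy)

def loglik(n, p):
    """Log-likelihood of the data under k ~ Binomial(n, p)."""
    return sum(log(comb(n, k) * p**k * (1 - p)**(n - k)) for k in data)

for n in range(4, 9):
    best = max(loglik(n, p / 100) for p in range(1, 100))
    print(n, round(best, 2))   # near-flat in n: (n, p) pairs trade off,
                               # so a unique optimal estimate does not exist
```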
2017-01-01
Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
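As a sketch of what a network-based deduplication looks like (patterned on the "directional" heuristic published with UMI-tools, but simplified here to toy counts and a single absorption level rather than full connected-component grouping):

```python
# UMIs within Hamming distance 1 are linked when the higher count plausibly
# explains the lower one as a PCR/sequencing error; linked groups collapse
# to one molecule. Counts below are toy values.
from itertools import combinations

counts = {"ATCG": 100, "ATCA": 4, "TTCG": 2, "GGGG": 50}

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

edges = {u: set() for u in counts}
for a, b in combinations(counts, 2):
    if hamming(a, b) == 1:
        hi, lo = (a, b) if counts[a] >= counts[b] else (b, a)
        if counts[hi] >= 2 * counts[lo] - 1:   # directional criterion
            edges[hi].add(lo)

# Each UMI not absorbed by a higher-count neighbour seeds one molecule.
absorbed = {lo for los in edges.values() for lo in los}
print(sorted(u for u in counts if u not in absorbed))  # ['ATCG', 'GGGG']
```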
In search of tools to aid logical thinking and communicating about medical decision making.
Hunink, M G
2001-01-01
To have real-time impact on medical decision making, decision analysts need a wide variety of tools to aid logical thinking and communication. Decision models provide a formal framework to integrate evidence and values, but they are commonly perceived as complex and difficult to understand by those unfamiliar with the methods, especially in the context of clinical decision making. The theory of constraints, introduced by Eliyahu Goldratt in the business world, provides a set of tools for logical thinking and communication that could potentially be useful in medical decision making. The author used the concept of a conflict resolution diagram to analyze the decision to perform carotid endarterectomy prior to coronary artery bypass grafting in a patient with both symptomatic coronary and asymptomatic carotid artery disease. The method enabled clinicians to visualize and analyze the issues, identify and discuss the underlying assumptions, search for the best available evidence, and use the evidence to make a well-founded decision. The method also facilitated communication among those involved in the care of the patient. Techniques from fields other than decision analysis can potentially expand the repertoire of tools available to support medical decision making and to facilitate communication in decision consults.
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
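The abstract does not name the flawed procedure here, but a common culprit in the medical literature is univariate screening before multivariable modelling; the following simulation (illustrative, not from the paper) shows a suppressor variable that such screening would wrongly discard:

```python
# A variable can be marginally uncorrelated with the outcome yet
# indispensable in the joint model. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(size=n)
x1 = z + rng.normal(size=n)        # x1 shares variance with x2
x2 = z
y = x1 - x2                        # true model uses BOTH predictors

print(round(np.corrcoef(x2, y)[0, 1], 3))   # ~0: screening would drop x2
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.round(beta, 2))                    # ~[0, 1, -1]: x2 matters jointly
```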
Computing generalized Langevin equations and generalized Fokker-Planck equations.
Darve, Eric; Solomon, Jose; Kia, Amirali
2009-07-07
The Mori-Zwanzig formalism is an effective tool to derive differential equations describing the evolution of a small number of resolved variables. In this paper we present its application to the derivation of generalized Langevin equations and generalized non-Markovian Fokker-Planck equations. We show how long time scales, rates, and metastable basins can be extracted from these equations. Numerical algorithms are proposed to discretize these equations. An important aspect is the numerical solution of the orthogonal dynamics equation, which is a partial differential equation in a high-dimensional space. We propose efficient numerical methods to solve this orthogonal dynamics equation. In addition, we present a projection formalism of the Mori-Zwanzig type that is applicable to discrete maps. Numerical applications are presented from the field of Hamiltonian systems.
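For reference, the generalized Langevin equation produced by the Mori-Zwanzig projection for a resolved variable v(t) takes the standard memory-kernel form:

```latex
\dot{v}(t) \;=\; \Omega\, v(t)
  \;-\; \int_{0}^{t} K(t-s)\, v(s)\,\mathrm{d}s
  \;+\; F(t)
```

Here Ω is the Markovian (instantaneous) part, K is the memory kernel, and F(t) is the fluctuating force evolving under the orthogonal dynamics, with its statistics tied to K by a fluctuation-dissipation relation; extracting rates and metastable basins amounts to analyzing K and the statistics of F.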
Exploring Assessment Tools for Research and Evaluation in Astronomy Education and Outreach
NASA Astrophysics Data System (ADS)
Buxner, S. R.; Wenger, M. C.; Dokter, E. F. C.
2011-09-01
The ability to effectively measure knowledge, attitudes, and skills in formal and informal educational settings is an important aspect of astronomy education research and evaluation. Assessments may take the form of interviews, observations, surveys, exams, or other probes to help unpack people's understandings or beliefs. In this workshop, we discussed characteristics of a variety of tools that exist to assess understandings of different concepts in astronomy as well as attitudes towards science and science teaching; these include concept inventories, surveys, interview protocols, observation protocols, card sorting, reflection videos, and other methods currently being used in astronomy education research and EPO program evaluations. In addition, we discussed common questions in the selection of assessment tools including issues of reliability and validity, time to administer, format of implementation, analysis, and human subject concerns.
Effective Tools and Resources from the MAVEN Education and Public Outreach Program
NASA Astrophysics Data System (ADS)
Mason, T.
2015-12-01
Since 2010, NASA's Mars Atmosphere and Volatile Evolution (MAVEN) Education and Public Outreach (E/PO) team has developed and implemented a robust and varied suite of projects, serving audiences of all ages and diverse backgrounds from across the country. With a program designed to reach formal K-12 educators and students, afterschool and summertime communities, museum docents, journalists, and online audiences, we have incorporated an equally varied approach to developing tools, resources, and evaluation methods to specifically reach each target population and to determine the effectiveness of our efforts. This poster will highlight some of the tools and resources we have developed to share the complex science and engineering of the MAVEN mission, as well as initial evaluation results and lessons-learned from each of our E/PO projects.
On the verification of intransitive noninterference in multilevel security.
Ben Hadj-Alouane, Nejib; Lafrance, Stéphane; Lin, Feng; Mullins, John; Yeddes, Mohamed Moez
2005-10-01
We propose an algorithmic approach to the problem of verification of the property of intransitive noninterference (INI), using tools and concepts of discrete event systems (DES). INI can be used to characterize and solve several important security problems in multilevel security systems. In a previous work, we have established the notion of iP-observability, which precisely captures the property of INI. We have also developed an algorithm for checking iP-observability by indirectly checking P-observability for systems with at most three security levels. In this paper, we generalize the results for systems with any finite number of security levels by developing a direct method for checking iP-observability, based on an insightful observation that the iP function is a left congruence in terms of relations on formal languages. To demonstrate the applicability of our approach, we propose a formal method to detect denial of service vulnerabilities in security protocols based on INI. This method is illustrated using the TCP/IP protocol. The work extends the theory of supervisory control of DES to a new application domain.
Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach
NASA Technical Reports Server (NTRS)
Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip
2017-01-01
While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected for all possible circumstances, the fact that FLCs cannot be tested to achieve such requirements poses limitations on the applications for such technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of a FLC is proposed. Main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.
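A minimal sketch of the verification pattern described (piecewise representation checked by a formal tool): encode a stand-in piecewise-linear control law in the Z3 SMT solver and ask for any input that violates the negative-feedback property. The control law and bounds below are invented for illustration; the paper's FLC is piecewise polynomial.

```python
# Requires z3 (pip install z3-solver). The controller is an assumption,
# not the paper's FLC.
from z3 import Real, Solver, If, And, sat

e = Real("e")                                        # tracking error
u = If(e > 1, -2 * e, If(e < -1, -2 * e, -0.5 * e))  # piecewise control law

s = Solver()
s.add(And(e >= -10, e <= 10, e != 0))   # bounded operating range
s.add(u * e >= 0)                       # a violation: output not opposing e
print("counterexample" if s.check() == sat else "negative feedback proven")
```

If the solver reports unsat for the negated property, the feedback-sign claim holds for every input in the range, which is the kind of exhaustive guarantee testing alone cannot provide.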
Development of dynamic Bayesian models for web application test management
NASA Astrophysics Data System (ADS)
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
Developing nursing leadership in social media.
Moorley, Calvin; Chinn, Teresa
2016-03-01
A discussion on how nurse leaders are using social media and developing digital leadership in online communities. Social media is relatively new, and how it is used by nurse leaders and nurses in a digital space is under-explored. Discussion paper. Searches used CINAHL, the Royal College of Nursing webpages, Wordpress (for blogs) and Twitter from 2000-2015. Search terms used were Nursing leadership + Nursing social media. Understanding the development and value of nursing leadership in social media is important for nurses in formal and informal (online) leadership positions. Nurses in formal leadership roles in organizations such as the National Health Service are beginning to leverage social media. Social media has the potential to become a tool for modern nurse leadership, as it is a space where you can listen, on a micro level, to each individual. In addition to listening, leadership can be achieved on a much larger scale through the use of social media monitoring tools and exploration of data and crowd sourcing. Through the use of data and social media listening tools, nursing leaders can seek understanding and insight into a variety of issues. Social media also places nurse leaders in a visible and accessible position as role models. Social media and formal nursing leadership do not have to work against each other; they can work in harmony, as both formal and online leadership possess skills that are transferable. If used wisely, social media has the potential to become a tool for modern nurse leadership. © 2016 John Wiley & Sons Ltd.
Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M
2001-01-01
To present the method used to elaborate and formalize current scientific knowledge in order to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized preventive recommendations or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. Incorporating the most current scientific evidence in a cyclical process involves several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.
NASA Technical Reports Server (NTRS)
Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha
2012-01-01
Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.
Efficacy of Environmental Health E-Training for Journalists
Parin, Megan L.; Yancey, Elissa; Beidler, Caroline; Haynes, Erin N.
2015-01-01
Communities report a low level of trust in environmental health media coverage. In order to support risk communication objectives, the goals of the research study were to identify whether or not there is a gap in environmental reporting training for journalists, to outline journalists’ methods for gathering environmental health news, to observe journalists’ attitudes toward environmental health training and communication, and to determine if electronic training (online/e-training) can effectively train journalists in environmental health topics. The results indicated that environmental journalists have very little to no formal environmental journalism training. In addition, a significant percentage of journalists do not have any formal journalism education. Respondents most preferred to receive continuing environmental journalism training online. Online instruction was also perceived as effective in increasing knowledge and providing necessary reporting tools, even among participants adverse to online instructional methods. Our findings highlight the changing media climate’s need for an increase in electronic journalism education opportunities to support environmental health journalism competencies among working professional journalists. PMID:26998499
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice; Baggs, Rhoda
2007-01-01
Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for ascertaining formally, a software safety risk assessment, that provides measurements for software safety for legacy systems which may or may not have a suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
Stakeholder Perceptions of ICT Usage across Management Institutes
ERIC Educational Resources Information Center
Goyal, Ela; Purohit, Seema; Bhagat, Manju
2013-01-01
Information and communication technology (ICT) which includes radio, television and newer digital technology such as computers and the internet, are potentially powerful tools for extending educational opportunities, formal and non-formal, to one and all. It provides opportunities to deploy innovative teaching methodologies and interesting…
Effects of Mobile Learning in Medical Education: A Counterfactual Evaluation.
Briz-Ponce, Laura; Juanes-Méndez, Juan Antonio; García-Peñalvo, Francisco José; Pereira, Anabela
2016-06-01
The aim of this research is to contribute to the general education system by providing new insights and resources. This study performs a quasi-experimental study at the University of Salamanca with 30 students to compare results between using an anatomic app for learning and the formal traditional method conducted by a teacher. The findings of the investigation suggest that the performance of learners using mobile apps is statistically better than that of students using the traditional method. However, mobile devices should be considered an additional tool to complement the teachers' explanation, and it is necessary to overcome different barriers and challenges to adopt these pedagogical methods at the university.
Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.
2005-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
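For reference, the upper LFT that such tools construct has the standard form below, where M is the known interconnection matrix (partitioned conformally with the uncertainty block) and the normalized real parameters satisfy |δᵢ| ≤ 1; the matrix-polynomial dependencies mentioned above determine the repetition counts rᵢ:

```latex
\mathcal{F}_{u}(M,\Delta)
  \;=\; M_{22} \;+\; M_{21}\,\Delta\,\left(I - M_{11}\,\Delta\right)^{-1} M_{12},
\qquad
\Delta \;=\; \operatorname{diag}\!\left(\delta_{1} I_{r_{1}}, \ldots, \delta_{n} I_{r_{n}}\right)
```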
Security Risks: Management and Mitigation in the Software Life Cycle
NASA Technical Reports Server (NTRS)
Gilliam, David P.
2004-01-01
A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and integrating it with a Defect Detection and Prevention (DDP) risk management tool.
Health impact assessment – A survey on quantifying tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org
Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.
Education and Outreach with the Virtual Astronomical Observatory
NASA Astrophysics Data System (ADS)
Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.
2012-01-01
The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.
Software Assurance Measurement -- State of the Practice
2013-11-01
Only a fragment of this record's tool catalog survives extraction. Recoverable entries include: a code-quality tool supporting 30+ languages (C/C++, Java, .NET, Oracle, PeopleSoft, SAP, Siebel, Spring, Struts, Hibernate) and all major databases; the ChecKing tool; a per-language index covering .NET, ActionScript, Ada, C/C++, Java, JavaScript, Objective-C, Opa, Perl, PHP, and Python; a section on formal methods; and a tool suite for Ada, C, C++, C#, and Java code comprising analyses such as architecture checking, interface analyses, and clone detection.
Russo, Paola; Piazza, Miriam; Leonardi, Giorgio; Roncoroni, Layla; Russo, Carlo; Spadaro, Salvatore; Quaglini, Silvana
2012-01-01
Blood transfusion is a complex activity subject to a high risk of eventually fatal errors. The development and application of computer-based systems could help reduce the error rate, playing a fundamental role in the improvement of the quality of care. This poster presents an eLearning tool, under development, that formalizes the guidelines of the transfusion process. This system, implemented in YAWL (Yet Another Workflow Language), will be used to train personnel in order to improve the efficiency of care and to reduce errors.
NASA Astrophysics Data System (ADS)
Palombi, Filippo; Toti, Simona
2015-05-01
Approximate weak solutions of the Fokker-Planck equation represent a useful tool to analyze the equilibrium fluctuations of birth-death systems, as they provide a quantitative knowledge lying in between numerical simulations and exact analytic arguments. In this paper, we adapt the general mathematical formalism known as the Ritz-Galerkin method for partial differential equations to the Fokker-Planck equation with time-independent polynomial drift and diffusion coefficients on the simplex. Then, we show how the method works in two examples, namely the binary and multi-state voter models with zealots.
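In outline, a textbook Ritz-Galerkin reduction (written here in one dimension for brevity, assuming boundary terms vanish under integration by parts; the paper works on the simplex with polynomial drift a and diffusion b) turns the Fokker-Planck PDE into a linear ODE system:

```latex
p(x,t) \;\approx\; \sum_{i=1}^{N} c_{i}(t)\,\varphi_{i}(x),
\qquad
M\,\dot{c}(t) \;=\; A\,c(t),
```
```latex
M_{ji} \;=\; \int \varphi_{j}\,\varphi_{i}\,\mathrm{d}x,
\qquad
A_{ji} \;=\; \int \left(\varphi_{j}'\,a\,\varphi_{i}
  \;+\; \tfrac{1}{2}\,\varphi_{j}''\,b\,\varphi_{i}\right)\mathrm{d}x
```

With polynomial coefficients and polynomial basis functions these integrals are themselves polynomial and can be evaluated exactly, which is what makes the method attractive for equilibrium-fluctuation analysis.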
Modeling Criminal Activity in Urban Landscapes
NASA Astrophysics Data System (ADS)
Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona
Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.
"Transformative Looks": Practicing Citizenship through Photography
ERIC Educational Resources Information Center
Pereira, Sónia; Maiztegui-Oñate, Concha; Mata-Codesal, Diana
2016-01-01
Purpose: The article discusses the meanings of citizenship and citizenship education when formal citizenship is restricted by exploring the potential of photography education and practice as a tool that promotes the exercise of citizenship in the context of non-formal critical adult education. By doing it, this text aims to enhance our…
Improving Project Management Using Formal Models and Architectures
NASA Technical Reports Server (NTRS)
Kahn, Theodore; Sturken, Ian
2011-01-01
This talk discusses the advantages formal modeling and architecture brings to project management. These emerging technologies have both great potential and challenges for improving information available for decision-making. The presentation covers standards, tools and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.
Social Argumentation in Online Synchronous Communication
ERIC Educational Resources Information Center
Alagoz, Esra
2013-01-01
The ability to argue well is a valuable skill for students in both formal and informal learning environments. While many studies have explored the argumentative practices in formal environments and some researchers have developed tools to enhance the argumentative skills, the social argumentation that is occurring in informal spaces has yet to be…
A Survey of Formal Methods for Intelligent Swarms
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Chrustopher A.
2004-01-01
Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design, and to assure that the proper behaviors will emerge. These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing or in verifying and validating these types of missions. A significant challenge when verifying and validating swarms of intelligent interacting agents is how to determine that the possible exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm types of missions NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions are going to consist of autonomous spacecraft that may be out of contact with the earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. Providing V&V (verification and validation) for this type of mission is new to NASA, and represents the cutting edge in system correctness, and requires higher levels of assurance than other (traditional) missions that use a single or small number of spacecraft that are deterministic in nature and have near continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., the underlying system will go from one state to another or not into a specific state) and check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. 
From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.
Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence for which to apply the formal methods. This paper will give the evaluation of these formal methods and give partial specifications of the ANTS mission using four selected methods. We then give an evaluation of the methods and the needed properties of a formal method for effective specification and prediction of emergent behavior in swarm-based systems.
Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs
Bass, Ellen J.
2011-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient-controlled analgesia pump in a two-phase process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation, and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
Third NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
1995-01-01
This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.
Formalizing Space Shuttle Software Requirements
NASA Technical Reports Server (NTRS)
Crow, Judith; DiVito, Ben L.
1996-01-01
This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.
NASA Technical Reports Server (NTRS)
Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.
2000-01-01
To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PVS, and some preliminary experience of their use.
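As a hedged illustration of the "separation of concerns" theme, the sketch below verifies an invariant on a small abstraction of a system rather than on the system itself; because the abstract transition relation over-approximates the concrete one, a property proved on the abstraction also holds concretely. The counter system, abstraction map, and property are all invented examples, not SAL constructs.

```python
# Toy abstraction-based verification: prove a property of a 4-state counter
# by checking it on a 2-state parity abstraction.

def concrete_step(n):
    # Concrete system: a counter incremented modulo 4.
    return (n + 1) % 4

def alpha(n):
    # Abstraction map: keep only the parity of the counter.
    return n % 2

# Existential abstraction: abstract state a steps to a' iff some concrete
# state mapping to a steps to a concrete state mapping to a'.
abstract_next = {}
for n in range(4):
    abstract_next.setdefault(alpha(n), set()).add(alpha(concrete_step(n)))

# Property: parity flips on every step. It holds on the abstraction, hence
# on the concrete system as well.
print(all(a not in succ for a, succ in abstract_next.items()))  # True
```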
Noyes, Jane; Booth, Andrew; Flemming, Kate; Garside, Ruth; Harden, Angela; Lewin, Simon; Pantoja, Tomas; Hannes, Karin; Cargo, Margaret; Thomas, James
2018-05-01
The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method implementation evidence. Choice of appropriate methodologies, methods, and tools is essential when developing a rigorous protocol and conducting the synthesis. Cochrane authors who conduct qualitative evidence syntheses have thus far used a small number of relatively simple methods to address similarly written questions. Cochrane has invested in methodological work to develop new tools and to encourage the production of exemplar reviews to show the value of more innovative methods that address a wider range of questions. In this paper in the series, we report updated guidance on the selection of tools to assess methodological limitations in qualitative studies and methods to extract and synthesize qualitative evidence. We recommend application of Grades of Recommendation, Assessment, Development, and Evaluation-Confidence in the Evidence from Qualitative Reviews to assess confidence in qualitative synthesized findings. This guidance aims to support review authors to undertake a qualitative evidence synthesis that is intended to be integrated subsequently with the findings of one or more Cochrane reviews of the effects of similar interventions. The review of intervention effects may be undertaken concurrently with, or separately from, the qualitative evidence synthesis. We encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.
Irena : tool suite for modeling and analysis of small-angle scattering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilavsky, J.; Jemian, P.
2009-04-01
Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
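For reference, the Guinier and Porod fits named above are the standard small-angle scattering limits (textbook relations, stated here for orientation, not anything specific to Irena's implementation):

```latex
% Guinier approximation at low q, with radius of gyration R_g:
I(q) \approx I_0 \, \exp\!\left(-\frac{q^{2} R_g^{2}}{3}\right)
% Porod's law at high q for scatterers with sharp interfaces:
I(q) \propto q^{-4}
```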
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Investigation of some formal aspects of the boson expansion technique in nuclear theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pedrocchi, V.G.
1982-01-01
The use of the boson expansion theory (BET) in nuclear physics now has about twenty years of history, and a large number of papers are available in the literature. Some of them emphasize BET's practical use, successfully showing that it is in fact a very powerful, practical tool for understanding the collective properties of nuclei. Others, on the other hand, have concentrated more on formal aspects of the BET, and it is these formal aspects we deal with in this dissertation. The BET is not unique, and thus a variety of methods has been proposed. It was felt desirable to see whether they were in fact different theories, or whether the difference was only apparent. On the surface, many theories look different from each other, having various merits and demerits. However, it is possible that they are more closely related than they appear; and, if so, it may be attempted to unify them in such a way that a new BET can be constructed which embodies all the merits of the various theories known so far. With this goal in mind, a detailed comparison of two methods was explored: the commutator method (CM) and the Marumori-Yamamura-Tokunaga (MYT) method, which have been the most widely discussed in the past, though often with controversy. It was found that they are, in fact, equivalent theories if looked at from a particular point of view, although they are not necessarily exactly the same in every aspect. A similar comparison was also made between the generalized Holstein-Primakoff (GHP) and Dyson methods.
Inrig, Stephen J; Higashi, Robin T; Tiro, Jasmin A; Argenbright, Keith E; Lee, Simon J Craddock
2017-04-01
Despite federal funding for breast cancer screening, fragmented infrastructure and limited organizational capacity hinder access to the full continuum of breast cancer screening and clinical follow-up procedures among rural-residing women. We proposed a regional hub-and-spoke model, partnering with local providers to expand access across North Texas. We describe development and application of an iterative, mixed-method tool to assess county capacity to conduct community outreach and/or patient navigation in a partnership model. Our tool combined publicly-available quantitative data with qualitative assessments during site visits and semi-structured interviews. Application of our tool resulted in shifts in capacity designation in 10 of 17 county partners: 8 implemented local outreach with hub navigation; 9 relied on the hub for both outreach and navigation. Key factors influencing capacity: (1) formal linkages between partner organizations; (2) inter-organizational relationships; (3) existing clinical service protocols; (4) underserved populations. Qualitative data elucidate how our tool captured these capacity changes. Our capacity assessment tool enabled the hub to establish partnerships with county organizations by tailoring support to local capacity and needs. Absent a vertically integrated provider network for preventive services in these rural counties, our tool facilitated a virtually integrated regional network to extend access to breast cancer screening to underserved women. Copyright © 2016 Elsevier Ltd. All rights reserved.
Addition of CF3 across unsaturated moieties: a powerful functionalization tool
2014-01-01
In the last few years, the efficient introduction of trifluoromethyl groups in organic molecules has become a major research focus. This review highlights the recent developments enabling the incorporation of CF3 groups across unsaturated moieties, preferentially alkenes, and the mechanistic scenarios governing these transformations. We have specially focused on methods involving the simultaneous formation of C–CF3 and C–C or C–heteroatom bonds by formal addition reactions across π-systems, as such difunctionalization processes hold valuable synthetic potential. PMID:24789472
Formal methods and digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: to define and characterize the verification problem for ultra-reliable life-critical flight control systems and the current state of practice in industry today; to determine the proper role of formal methods in addressing these problems; and to assess the state of the art and recent progress toward applying formal methods to this area.
Formal and Informal Registration as Marketing Tools: Do They Produce "Trapped" Executives?
ERIC Educational Resources Information Center
Apple, L. Eugene
1993-01-01
A marketing concept was applied to college registration procedures in an experiment, focusing on degree of "escalation" of effort of students who had failed twice to register in desired courses, type of registration used (formal or informal) on each of three tries, and student characteristics (time until graduation, major, gender). (MSE)
Peer Review of a Formal Verification/Design Proof Methodology
NASA Technical Reports Server (NTRS)
1983-01-01
The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.
Ten Commandments of Formal Methods...Ten Years Later
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2006-01-01
More than a decade ago, in "Ten Commandments of Formal Methods," we offered practical guidelines for projects that sought to use formal methods. Over the years, the article, which was based on our knowledge of successful industrial projects, has been widely cited and has generated much positive feedback. However, despite this apparent enthusiasm, formal methods use has not greatly increased, and some of the same attitudes about the infeasibility of adopting them persist. Formal methodists believe that introducing greater rigor will improve the software development process and yield software with better structure, greater maintainability, and fewer errors.
Huang, Shih-Wei; Chi, Wen-Chou; Yen, Chia-Feng; Chang, Kwang-Hwa; Liao, Hua-Fang; Escorpizo, Reuben; Chang, Feng-Hang; Liou, Tsan-Hon
2017-01-01
Background: The WHO Disability Assessment Schedule 2.0 (WHODAS 2.0) is a feasible tool for assessing functional disability and analysing the risk of institutionalisation among elderly patients with dementia. However, data on the effect of education on disability status in patients with dementia are lacking. The aim of this large-scale, population-based study was to analyse the effect of education on the disability status of elderly Taiwanese patients with dementia by using WHODAS 2.0. Methods: From the Taiwan Data Bank of Persons with Disability, we enrolled 7698 disabled elderly (older than 65 years) patients diagnosed with dementia between July 2012 and January 2014. According to their education status, we categorised these patients as with or without formal education (3849 patients each). We controlled for the demographic variables through propensity score matching. The standardised scores of these patients in the six domains of WHODAS 2.0 were evaluated by certified interviewers. Student's t-test was used for comparing the WHODAS 2.0 scores of patients with dementia in the two aforementioned groups. Poisson regression was applied for analysing the association among all the investigated variables. Results: Patients with formal education had lower disability status in the domains of getting along and social participation than did patients without formal education. Poisson regression revealed that standardised scores in all domains of WHODAS 2.0—except self-care—were associated with education status. Conclusions: This study revealed lower disability status in the WHODAS 2.0 domains of getting along and social participation for patients with dementia with formal education compared with those without formal education. For patients with disability and dementia without formal education, community intervention for social participation should be implemented to maintain better social interaction ability. PMID:28473510
NASA Technical Reports Server (NTRS)
Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.
1991-01-01
The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.
Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M; Nuckley, David J; Keefe, Daniel F
2012-10-01
In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community then these may become one of the most effective future strategies for both formative and summative evaluations.
Liu, Hao; Zhu, Lili; Bai, Shuming; Shi, Qiang
2014-04-07
We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.
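For orientation, one common parametrization of such a decomposition can be written as follows; this is an assumed generic form (the paper's exact fitting functions may differ):

```latex
% Bath spectral density fit by a sum of damped-oscillator (Lorentzian) modes:
J(\omega) \approx \sum_{k=1}^{N}
  \frac{\eta_k \, \gamma_k \, \omega}
       {\left(\omega^{2}-\omega_k^{2}\right)^{2}+\gamma_k^{2}\,\omega^{2}}
% chosen so that the bath correlation function becomes a sum of exponentials,
% the closed form the hierarchical equations of motion require:
C(t) = \sum_{j} c_j \, e^{-\nu_j t}
```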
Incorporating current research into formal higher education settings using Astrobites
NASA Astrophysics Data System (ADS)
Sanders, Nathan E.; Kohler, Susanna; Faesi, Chris; Villar, Ashley; Zevin, Michael
2017-10-01
A primary goal of many undergraduate- and graduate-level courses in the physical sciences is to prepare students to engage in scientific research or to prepare students for careers that leverage skillsets similar to those used by research scientists. Even for students who may not intend to pursue a career with these characteristics, exposure to the context of applications in modern research can be a valuable tool for teaching and learning. However, a persistent barrier to student participation in research is familiarity with the technical language, format, and context that academic researchers use to communicate research methods and findings with each other: the literature of the field. Astrobites, an online web resource authored by graduate students, has published brief and accessible summaries of more than 1300 articles from the astrophysical literature since its founding in 2010. This article presents three methods for introducing students at all levels within the formal higher education setting to approaches and results from modern research. For each method, we provide a sample lesson plan that integrates content and principles from Astrobites, including step-by-step instructions for instructors, suggestions for adapting the lesson to different class levels across the undergraduate and graduate spectrum, sample student handouts, and a grading rubric.
THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...
CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process, which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions and biological inference.
TRL - A FORMAL TEST REPRESENTATION LANGUAGE AND TOOL FOR FUNCTIONAL TEST DESIGNS
NASA Technical Reports Server (NTRS)
Hops, J. M.
1994-01-01
A Formal Test Representation Language and Tool for Functional Test Designs (TRL) is an automatic tool and a formal language that is used to implement the Category-Partition Method and produce the specification of test cases in the testing phase of software development. The Category-Partition Method is particularly useful in defining the inputs, outputs and purpose of the test design phase and combines the benefits of choosing normal cases with error exposing properties. Traceability can be maintained quite easily by creating a test design for each objective in the test plan. The effort to transform the test cases into procedures is simplified by using an automatic tool to create the cases based on the test design. The method allows the rapid elimination of undesired test cases from consideration, and easy review of test designs by peer groups. The first step in the category-partition method is functional decomposition, in which the specification and/or requirements are decomposed into functional units that can be tested independently. A secondary purpose of this step is to identify the parameters that affect the behavior of the system for each functional unit. The second step, category analysis, carries the work done in the previous step further by determining the properties or sub-properties of the parameters that would make the system behave in different ways. The designer should analyze the requirements to determine the features or categories of each parameter and how the system may behave if the category were to vary its value. If the parameter undergoing refinement is a data-item, then categories of this data-item may be any of its attributes, such as type, size, value, units, frequency of change, or source. After all the categories for the parameters of the functional unit have been determined, the next step is to partition each category's range space into mutually exclusive values that the category can assume. In choosing partition values, all possible kinds of values should be included, especially the ones that will maximize error detection. The purpose of the final step, partition constraint analysis, is to refine the test design specification so that only the technically effective and economically feasible test cases are implied. TRL is written in C-language to be machine independent. It has been successfully implemented on an IBM PC compatible running MS DOS, a Sun4 series computer running SunOS, an HP 9000/700 series workstation running HP-UX, a DECstation running DEC RISC ULTRIX, and a DEC VAX series computer running VMS. TRL requires 1Mb of disk space and a minimum of 84K of RAM. The documentation is available in electronic form in Word Perfect format. The standard distribution media for TRL is a 5.25 inch 360K MS-DOS format diskette. Alternate distribution media and formats are available upon request. TRL was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
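A minimal sketch of the category-partition idea described above (the categories, partitions, and constraint are invented examples; this is not the TRL tool itself):

```python
# Generate test frames as the cross product of category partitions, then
# apply partition-constraint analysis to drop infeasible combinations.
from itertools import product

categories = {
    "file_size":  ["empty", "small", "huge"],
    "permission": ["readable", "unreadable"],
    "format":     ["valid", "corrupt"],
}

def feasible(frame):
    # Example constraint: an empty file cannot be corrupt.
    return not (frame["file_size"] == "empty" and frame["format"] == "corrupt")

names = list(categories)
frames = [dict(zip(names, values)) for values in product(*categories.values())]
test_cases = [f for f in frames if feasible(f)]
print(len(frames), "raw frames ->", len(test_cases), "feasible test cases")
# 12 raw frames -> 10 feasible test cases
```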
The quality management journey: the progress of health facilities in Australia.
Carr, B J
1994-12-01
Many facilities in Australia have taken the Total Quality Management (TQM) step. The objective of this study was to examine the progress of adopted formal quality systems in health. Sixty per cent of organizations surveyed have adopted formal systems. Of these, Deming adherents are the most common, followed by eclectic choices. Only 35% considered the quality transition reasonably easy. No relationship between accreditation and formal quality systems was identified. The most common improvement techniques were: flow charts, histograms, and cause and effect diagrams. Quality practitioners are happy to use several tools exceptionally well rather than have many tools at their disposal. The greatest impediment to the adoption of quality was the lack of top management support. This study did not support the view that clinicians are reluctant to actively support quality initiatives. Total Quality Management is not a mature concept; however, Chief Executive Officers are assured that rewards will be realized over time.
LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.
Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat
2009-08-01
To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is the clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes and conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on the former formalization with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts, is developed. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.
From Informal Safety-Critical Requirements to Property-Driven Formal Validation
NASA Technical Reports Server (NTRS)
Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano
2008-01-01
Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that consists of a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics), and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, by combining several, complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling the fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.
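One of the analysis steps named above, consistency checking, can be pictured with a brute-force propositional sketch (the three requirement fragments are invented stand-ins for formalized requirements, not the methodology's actual encoding):

```python
# Requirements are consistent iff some assignment satisfies all of them.
from itertools import product

# Atoms: d = "door open", m = "train moving".
requirements = [
    lambda d, m: (not m) or (not d),  # R1: while moving, the door stays closed
    lambda d, m: d or m,              # R2: the system is never fully idle
    lambda d, m: not m,               # R3: the train never moves (restrictive)
]

models = [(d, m) for d, m in product([False, True], repeat=2)
          if all(r(d, m) for r in requirements)]
print("consistent" if models else "inconsistent", models)
# consistent [(True, False)]
```

Real formalizations are temporal rather than propositional, but the principle is the same: an empty model set would expose an inconsistent core to diagnose.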
A Generic Software Safety Document Generator
NASA Technical Reports Server (NTRS)
Denney, Ewen; Venkatesan, Ram Prasad
2004-01-01
Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.
Towards a Formal Basis for Modular Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh
2015-01-01
Safety assurance using argument-based safety cases is an accepted best practice in many safety-critical sectors. Goal Structuring Notation (GSN), which is widely used for presenting safety arguments graphically, provides a notion of modular arguments to support the goal of incremental certification. Despite the efforts at standardization, GSN remains an informal notation, and the GSN standard contains appreciable ambiguity, especially concerning modular extensions. This, in turn, presents challenges when developing tools and methods to intelligently manipulate modular GSN arguments. This paper develops the elements of a theory of modular safety cases, leveraging our previous work on formalizing GSN arguments. Using example argument structures, we highlight some ambiguities arising through the existing guidance, present the intuition underlying the theory, clarify syntax, and address modular arguments, contracts, well-formedness, and well-scopedness of modules. Based on this theory, we have a preliminary implementation of modular arguments in our toolset, AdvoCATE.
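As a toy illustration of one notion discussed above, the sketch below checks a simple "well-scopedness" rule for modular arguments: a link crossing a module boundary must target a goal the owning module explicitly exports. The module contents, exports, and rule itself are invented, not the paper's formal definitions.

```python
# Goals are partitioned into modules; each module exports a public interface.
modules = {
    "autopilot": {"goals": {"G1", "G2"}, "exports": {"G2"}},
    "sensors":   {"goals": {"G3", "G4"}, "exports": {"G3"}},
}
links = [("G1", "G2"), ("G2", "G3"), ("G1", "G4")]  # support links between goals

def module_of(goal):
    return next(m for m, d in modules.items() if goal in d["goals"])

for src, dst in links:
    crosses = module_of(src) != module_of(dst)
    ok = (not crosses) or dst in modules[module_of(dst)]["exports"]
    print(f"{src} -> {dst}:", "well-scoped" if ok else "ILL-SCOPED")
# G1 -> G2: well-scoped
# G2 -> G3: well-scoped
# G1 -> G4: ILL-SCOPED
```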
Designing an architectural style for Pervasive Healthcare systems.
Rafe, Vahid; Hajvali, Masoumeh
2013-04-01
Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration; therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe Architecture (pub/sub) is one of the convenient architectures for supporting such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, so using a proper formal language like graph transformation systems for developing these systems seems necessary. But even if software engineers use such high-level methodologies, errors may occur in the system under design. Hence, it should be investigated automatically and formally whether the designed model of the system satisfies all of its requirements or not. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.
Modeling the fusion of cylindrical bioink particles in post bioprinting structure formation
NASA Astrophysics Data System (ADS)
McCune, Matt; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan
2015-03-01
Cellular Particle Dynamics (CPD) is an effective computational method to describe the shape evolution and biomechanical relaxation processes in multicellular systems. Thus, CPD is a useful tool to predict the outcome of post-printing structure formation in bioprinting. The predictive power of CPD has been demonstrated for multicellular systems composed of spherical bioink units. Experiments and computer simulations were related through an independently developed theoretical formalism based on continuum mechanics. Here we generalize the CPD formalism to (i) include cylindrical bioink particles often used in specific bioprinting applications, (ii) describe the more realistic experimental situation in which both the length and the volume of the cylindrical bioink units decrease during post-printing structure formation, and (iii) directly connect CPD simulations to the corresponding experiments without the need of the intermediate continuum theory inherently based on simplifying assumptions. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
NASA Astrophysics Data System (ADS)
Sosa, M.; Grundel, L.; Simini, F.
2016-04-01
Logical reasoning has been part of medical practice since its origins. Modern Medicine has included information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in Medical School prior to Clinical Internship, to foster medical practice. Two simple examples (Acute Myocardial Infarction and Diabetes Mellitus) are given in terms of formal logic expressions and truth tables. Flowcharts of both diagnostic processes help understand the procedures and validate them logically. A particularity of medical information is that it is often accompanied by “missing data”, which suggests adapting formal logic to a “three state” logic in the future. Medical Education must include formal logic to understand complex protocols and best practices, prone to mutual interactions.
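The "three state" logic suggested at the end can be sketched with Kleene-style truth tables, where None stands for missing data (the conjunction below is standard three-valued logic; the clinical findings are invented examples, not diagnostic criteria):

```python
# Kleene three-valued AND over True / False / None ("unknown").
def and3(a, b):
    if a is False or b is False:
        return False          # a definite negative decides the conjunction
    if a is None or b is None:
        return None           # otherwise missing data propagates
    return True

# Toy rule: the diagnosis requires both findings to be present.
for finding_a in (True, False, None):
    for finding_b in (True, False, None):
        print(finding_a, finding_b, "->", and3(finding_a, finding_b))
```

Note that a single False settles the rule even when the other datum is missing, which is exactly the behavior a clinical protocol needs from incomplete records.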
ERIC Educational Resources Information Center
Seltman, Muriel; Seltman, P. E. J.
1978-01-01
The authors stress the importance of bringing together the causal logic of history and the formal logic of mathematics in order to humanize mathematics and make it more accessible. An example of such treatment is given in a discussion of the centrality of Euclid and the Euclidean system to mathematics development. (MN)
Telemedicine Platform: Enhanced visiophony solution to operate a Robot-Companion
NASA Astrophysics Data System (ADS)
Simonnet, Th.; Couet, A.; Ezvan, P.; Givernaud, O.; Hillereau, P.
Nowadays, one of the ways to reduce medical care costs is to reduce the length of patients' hospitalization and reinforce home sanitary support by formal (professional) and non-formal (family) caregivers. The aim is to design and operate a scalable and secured collaborative platform to handle specific tools for patients, their families and doctors.
ERIC Educational Resources Information Center
Smith Risser, H.; Bottoms, SueAnn
2014-01-01
The advent of social networking tools allows teachers to create online networks and share information. While some virtual networks have a formal structure and defined boundaries, many do not. These unstructured virtual networks are difficult to study because they lack defined boundaries and a formal structure governing leadership roles and the…
KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.
Mathew, Joseph L
2011-04-01
Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.
Hamilton-Jacobi theory in multisymplectic classical field theories
NASA Astrophysics Data System (ADS)
de León, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso; Vilariño, Silvia
2017-09-01
The geometric framework for the Hamilton-Jacobi theory developed in the studies of Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 3(7), 1417-1458 (2006)], Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 13(2), 1650017 (2015)], and de León et al. [Variations, Geometry and Physics (Nova Science Publishers, New York, 2009)] is extended for multisymplectic first-order classical field theories. The Hamilton-Jacobi problem is stated for the Lagrangian and the Hamiltonian formalisms of these theories as a particular case of a more general problem, and the classical Hamilton-Jacobi equation for field theories is recovered from this geometrical setting. Particular and complete solutions to these problems are defined and characterized in several equivalent ways in both formalisms, and the equivalence between them is proved. The use of distributions in jet bundles that represent the solutions to the field equations is the fundamental tool in this formulation. Some examples are analyzed and, in particular, the Hamilton-Jacobi equation for non-autonomous mechanical systems is obtained as a special case of our results.
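In the mechanical special case mentioned at the end, the recovered equation takes the familiar non-autonomous Hamilton-Jacobi form (standard notation, stated here for orientation):

```latex
\frac{\partial S}{\partial t}
  + H\!\left(t,\, q^{i},\, \frac{\partial S}{\partial q^{i}}\right) = 0
```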
Formal Safety Certification of Aerospace Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2005-01-01
In principle, formal methods offer many advantages for aerospace software development: they can help to achieve ultra-high reliability, and they can be used to provide evidence of the reliability claims which can then be subjected to external scrutiny. However, despite years of research and many advances in the underlying formalisms of specification, semantics, and logic, formal methods are not much used in practice. In our opinion this is related to three major shortcomings. First, the application of formal methods is still expensive because they are labor- and knowledge-intensive. Second, they are difficult to scale up to complex systems because they are based on deep mathematical insights about the behavior of the systems (i.e., they rely on the "heroic proof"). Third, the proofs can be difficult to interpret, and typically stand in isolation from the original code. In this paper, we describe a tool for formally demonstrating safety-relevant aspects of aerospace software, which largely circumvents these problems. We focus on safety properties because it has been observed that safety violations such as out-of-bounds memory accesses or use of uninitialized variables constitute the majority of the errors found in the aerospace domain. In our approach, safety means that the program will not violate a set of rules that can range from simple memory-access rules to high-level flight rules. These different safety properties are formalized as different safety policies in Hoare logic, which are then used by a verification condition generator along with the code and logical annotations in order to derive formal safety conditions; these are then proven using an automated theorem prover. Our certification system is currently integrated into a model-based code generation toolset that generates the annotations together with the code. However, this automated formal certification technology is not exclusively constrained to our code generator and could, in principle, also be integrated with other code generators such as RealTime Workshop or even applied to legacy code. Our approach circumvents the historical problems with formal methods by increasing the degree of automation on all levels. The restriction to safety policies (as opposed to arbitrary functional behavior) results in simpler proof problems that can generally be solved by fully automatic theorem provers. An automated linking mechanism between the safety conditions and the code provides some of the traceability mandated by process standards such as DO-178B. An automated explanation mechanism uses semantic markup added by the verification condition generator to produce natural-language explanations of the safety conditions and thus supports their interpretation in relation to the code. An automatically generated certification browser lets users inspect the (generated) code along with the safety conditions (including textual explanations), and uses hyperlinks to automate tracing between the two levels. Here, the explanations reflect the logical structure of the safety obligation but the mechanism can in principle be customized using different sets of domain concepts. The interface also provides some limited control over the certification process itself.
Our long-term goal is a seamless integration of certification, code generation, and manual coding that results in a "certified pipeline" in which specifications are automatically transformed into executable code, together with the supporting artifacts necessary for achieving and demonstrating the high level of assurance needed in the aerospace domain.
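A minimal sketch of the safety-condition idea described above, for an out-of-bounds access policy over a toy straight-line program (the program representation is invented, and the conditions are constant-folded rather than emitted symbolically as a real verification condition generator would):

```python
# Walk a straight-line program and emit one safety condition per array access.
program = [
    ("assign", "i", 0),
    ("read",   "a", "i"),   # access a[i]
    ("assign", "i", 10),
    ("read",   "a", "i"),   # access a[i] again
]
ARRAY_LEN = {"a": 10}

env, conditions = {}, []
for op, *args in program:
    if op == "assign":
        var, value = args
        env[var] = value
    elif op == "read":
        arr, idx = args
        i = env[idx]
        conditions.append((f"0 <= {idx}({i}) < len({arr})",
                           0 <= i < ARRAY_LEN[arr]))

for text, ok in conditions:
    print(text, "->", "discharged" if ok else "VIOLATED")
# 0 <= i(0) < len(a) -> discharged
# 0 <= i(10) < len(a) -> VIOLATED
```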
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
FORMED integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling Language (UML), verifying properties such as input-output contract satisfaction and absence of null pointer dereferences. Domain specific languages (DSLs) drive both implementation and formal verification.
Symbolic LTL Compilation for Model Checking: Extended Abstract
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2007-01-01
In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
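The reduction mentioned above can be phrased simply: a formula f is satisfiable iff the universal model, which can generate every trace over the atoms, does not satisfy "not f", so a counterexample trace is a witness for f. Below is a brute-force sketch for one fixed formula over lasso-shaped traces (the formula G F p, "infinitely often p", is an invented example):

```python
# Satisfiability of G F p by enumerating loop segments of lasso traces
# produced by the universal model over a single atom p.
from itertools import product

def holds_GF(loop):
    # On a lasso, G F p holds iff p is true somewhere in the repeating loop.
    return any(loop)

def satisfiable_GF(max_loop=3):
    return any(holds_GF(loop)
               for n in range(1, max_loop + 1)
               for loop in product([False, True], repeat=n))

print(satisfiable_GF())  # True: some trace satisfies G F p
```

A symbolic implementation would instead compile the formula into a symbolic automaton and let the model checker search for an accepting lasso.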
Implementation of Health Insurance Support Tools in Community Health Centers.
Huguet, Nathalie; Hatch, Brigit; Sumic, Aleksandra; Tillotson, Carrie; Hicks, Elizabeth; Nelson, Joan; DeVoe, Jennifer E
2018-01-01
Health information technology (HIT) provides new opportunities for primary care clinics to support patients with health insurance enrollment and maintenance. We present strategies, early findings, and clinic reflections on the development and implementation of HIT tools designed to streamline and improve health insurance tracking at community health centers. We are conducting a hybrid implementation-effectiveness trial to assess novel health insurance enrollment and support tools in primary care clinics. Twenty-three clinics in 7 health centers from the OCHIN practice-based research network are participating in the implementation component of the trial. Participating health centers were randomized to 1 of 2 levels of implementation support: arm 1 (n = 4 health centers, 11 clinic sites) received HIT tools and educational materials, and arm 2 (n = 3 health centers, 12 clinic sites) received HIT tools, educational materials, and individualized implementation support with a practice coach. We used mixed methods (qualitative and quantitative) to assess tool use rates and facilitators and barriers to implementation in the first 6 months. Clinics reported favorable attitudes toward the HIT tools, which replace less efficient and more cumbersome processes, and reflected on the importance of clinic engagement in tool development and refinement. Five of 7 health centers are now regularly using the tools and are actively working to increase tool use. Six months after formal implementation, arm 2 clinics demonstrated higher rates of tool use, compared with arm 1. These results highlight the value of early clinic input in tool development, the potential benefit of practice coaching during HIT tool development and implementation, and a novel method for coupling a hybrid implementation-effectiveness design with principles of improvement science in primary care research. © Copyright 2018 by the American Board of Family Medicine.
Questioned document workflow for handwriting with automated tools
NASA Astrophysics Data System (ADS)
Das, Krishnanand; Srihari, Sargur N.; Srinivasan, Harish
2012-01-01
During the last few years, many document recognition methods have been developed to determine whether a handwriting specimen can be attributed to a known writer. However, in practice, the workflow of the document examiner continues to be manually intensive. Before a systematic or computational approach can be developed, an articulation of the steps involved in handwriting comparison is needed. We describe the workflow of handwritten questioned document examination, as described in a standards manual, and the steps where existing automation tools can be used. A well-known ransom note case is considered as an example, where one encounters testing for multiple writers of the same document, determining whether the writing is disguised, cases where the known writing is formal while the questioned writing is informal, etc. The findings for the particular ransom note case using the tools are given. Observations are also made toward developing a more fully automated approach to handwriting examination.
Mixed Methods in CAM Research: A Systematic Review of Studies Published in 2012
Bishop, Felicity L.; Holmes, Michelle M.
2013-01-01
Background. Mixed methods research uses qualitative and quantitative methods together in a single study or a series of related studies. Objectives. To review the prevalence and quality of mixed methods studies in complementary medicine. Methods. All studies published in the top 10 integrative and complementary medicine journals in 2012 were screened. The quality of mixed methods studies was appraised using a published tool designed for mixed methods studies. Results. 4% of papers (95 out of 2349) reported mixed methods studies, 80 of which met criteria for applying the quality appraisal tool. The most popular formal mixed methods design was triangulation (used by 74% of studies), followed by embedded (14%), sequential explanatory (8%), and finally sequential exploratory (5%). Quantitative components were generally of higher quality than qualitative components; when quantitative components involved RCTs they were of particularly high quality. Common methodological limitations were identified. Most strikingly, none of the 80 mixed methods studies addressed the philosophical tensions inherent in mixing qualitative and quantitative methods. Conclusions and Implications. The quality of mixed methods research in CAM can be enhanced by addressing philosophical tensions and improving reporting of (a) analytic methods and reflexivity (in qualitative components) and (b) sampling and recruitment-related procedures (in all components). PMID:24454489
Development of an interactive social media tool for parents with concerns about vaccines.
Shoup, Jo Ann; Wagner, Nicole M; Kraus, Courtney R; Narwaney, Komal J; Goddard, Kristin S; Glanz, Jason M
2015-06-01
Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Survey results suggested that social media may represent an effective intervention tool to help parents make informed decisions about vaccination for their children. Focus groups and interviews revealed four main themes for development of the tool: Parents wanted information describing both benefits and risks of vaccination, transparency of sources of information, moderation of the tool by an expert, and ethnic and racial diversity in the visual display of people. Usability testing showed that parents were satisfied with the usability of the tool but had difficulty with performing some of the informational searches. Based on focus groups, interviews, and usability evaluations, we made additional revisions to the tool's content, design, functionality, and overall look and feel. Engaging parents at all stages of development is critical when designing a tool to address concerns about childhood vaccines. Although this can be both resource- and time-intensive, the redesigned tool is more likely to be accepted and used by parents. Next steps involve a formal evaluation through a randomized trial. © 2014 Society for Public Health Education.
NASA Technical Reports Server (NTRS)
Shin, Jong-Yeob; Belcastro, Christine
2008-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transform (LFT) model of a transport aircraft's longitudinal dynamics is developed over the flight envelope using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.
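The computational core of such an analysis can be sketched with the standard D-scaling upper bound on the structured singular value, minimizing the maximum singular value of D M D^-1 over positive scalings D; the interconnection matrix below is hypothetical and stands in for one frozen point of the LFT model, not output of the NASA tool.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical 3x3 closed-loop interconnection matrix M at one
    # flight condition; in practice M comes from the LFT model.
    M = np.array([[0.5, 1.2, 0.1],
                  [0.3, 0.4, 0.9],
                  [0.2, 0.1, 0.6]])

    def mu_upper_bound(M):
        # D-scaling upper bound for a purely diagonal uncertainty
        # structure: minimize sigma_max(D M D^-1) over positive D.
        n = M.shape[0]

        def cost(log_d):
            d = np.exp(log_d)                  # positive scalings
            DMDinv = np.diag(d) @ M @ np.diag(1.0 / d)
            return np.linalg.svd(DMDinv, compute_uv=False)[0]

        return minimize(cost, np.zeros(n), method="Nelder-Mead").fun

    # A flight condition would count as "reliable" (in this toy sense)
    # when the bound stays below the reciprocal of the uncertainty size.
    print("mu upper bound:", mu_upper_bound(M))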
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.
2012-01-01
Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel's zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the state space and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
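The generation step can be sketched as a plain enumeration over a normative action sequence; the actions below are hypothetical, and the paper's contribution is a task-model pattern verified inside a model checker, not this enumeration.

    def zero_order_phenotypes(task):
        # Enumerate Hollnagel-style zero-order erroneous variants of a
        # normative action sequence: omissions, repetitions, jumps
        # (an action performed at the wrong position), and intrusions.
        variants = set()
        n = len(task)
        for i in range(n):
            variants.add(tuple(task[:i] + task[i + 1:]))      # omission
            variants.add(tuple(task[:i + 1] + task[i:]))      # repetition
            for j in range(n):
                if i != j:                                    # jump
                    seq = task[:i] + task[i + 1:]
                    seq.insert(j, task[i])
                    variants.add(tuple(seq))
        for i in range(n + 1):                                # intrusion
            variants.add(tuple(task[:i] + ["UNRELATED_ACT"] + task[i:]))
        variants.discard(tuple(task))
        return variants

    # Hypothetical normative sequence for a radiation therapy session.
    normative = ["select_dose", "confirm_dose", "fire_beam"]
    for v in sorted(zero_order_phenotypes(normative)):
        print(v)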
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Shrestha, R; Shakya, R M; Khan A, A
2016-01-01
Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years with a provisional diagnosis of renal colic in whom both the bedside ultrasound and the formal ultrasound were performed were included. The presence of hydronephrosis in both ultrasounds, and the size and location of the ureteric stone if present in the formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was performed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, against formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5% and 85.7% respectively. Bedside ultrasound and formal ultrasound both detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p < 0.001). Conclusion Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department when evaluating suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
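The reported accuracy figures follow from a standard 2x2 screening table. The counts below are back-calculated so that the derived percentages match the published ones for a 111-patient sample; they are illustrative only, since the paper reports the metrics rather than the raw table.

    def diagnostic_metrics(tp, fp, fn, tn):
        # Standard screening-test metrics from a 2x2 table.
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # Back-calculated counts (tp + fp + fn + tn = 111) chosen to
    # reproduce the reported 90.8% / 78.3% / 85.5% / 85.7%.
    print(diagnostic_metrics(tp=59, fp=10, fn=6, tn=36))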
Planetary data in education: tool development for access to the Planetary Data System
NASA Technical Reports Server (NTRS)
Atkinson, C. H.; Andres, P. M.; Liggett, P. K.; Lowes, L. L.; Sword, B. J.
2003-01-01
In this session we will describe and demonstrate the interface to the PDS access tools and functions developed for the scientific community, and discuss the potential for its utilization in K-14 formal and informal settings.
Formal Methods Case Studies for DO-333
NASA Technical Reports Server (NTRS)
Cofer, Darren; Miller, Steven P.
2014-01-01
RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.
Formalizing procedures for operations automation, operator training and spacecraft autonomy
NASA Technical Reports Server (NTRS)
Lecouat, Francois; Desaintvincent, Arnaud
1994-01-01
The generation and validation of operations procedures is a key task of mission preparation that is quite complex and costly. This has motivated the development of software applications providing support for procedure preparation. Several applications have been developed at MATRA MARCONI SPACE (MMS) over the last five years; they are presented in the first section of this paper. The main idea is that if procedures are represented in a formal language, they can be managed more easily with a computer tool and some automatic verifications can be performed. One difficulty is to define a formal language that is easy to use for operators and operations engineers. From the experience of the various procedure management tools developed in the last five years (including the POM, EOA, and CSS projects), MMS has derived OPSMAKER, a generic tool for procedure elaboration and validation. It has been applied to quite different types of missions, ranging from crew procedures (the PREVISE system) and ground control center management procedures (the PROCSU system) to - most relevant to the present paper - satellite operation procedures (PROCSAT, developed for CNES to support the preparation and verification of SPOT 4 operation procedures, and OPSAT for MMS telecom satellite operation procedures).
Modeling biochemical transformation processes and information processing with Narrator.
Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner
2007-03-27
Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with, and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation, and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is specifically intended for users aiming to construct and simulate dynamic models of biology without recourse to extensive mathematical detail. Its design facilitates mappings to different formal languages and frameworks. The combined set of features makes Narrator unique among tools of its kind. Narrator is implemented as a Java software program and is available as open source from http://www.narrator-tool.org.
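One of the target formalisms mentioned, Gillespie's direct method, can be sketched in a few lines; the birth-death model below is a toy, and the code is not part of Narrator.

    import numpy as np

    def gillespie_direct(x0, stoich, rates, t_end, seed=0):
        # Minimal Gillespie direct-method SSA: draw the time to the
        # next reaction from the total propensity, then pick which
        # reaction fires in proportion to its propensity.
        rng = np.random.default_rng(seed)
        t, x = 0.0, np.array(x0, dtype=float)
        history = [(t, x.copy())]
        while t < t_end:
            a = np.array([r(x) for r in rates])
            a0 = a.sum()
            if a0 == 0.0:
                break
            t += rng.exponential(1.0 / a0)
            x += stoich[rng.choice(len(rates), p=a / a0)]
            history.append((t, x.copy()))
        return history

    # Toy birth-death process: 0 -> X at rate 2.0, X -> 0 at rate 0.1*X,
    # so the mean copy number settles near 2.0 / 0.1 = 20.
    stoich = np.array([[+1], [-1]])
    rates = [lambda x: 2.0, lambda x: 0.1 * x[0]]
    print("final copy number:", gillespie_direct([0], stoich, rates, 50.0)[-1][1])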
A brief overview of NASA Langley's research program in formal methods
NASA Technical Reports Server (NTRS)
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
Toward designing for trust in database automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duez, P. P.; Jamieson, G. A.
Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation; one of these identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We show an example of the application of the AH to automation in the domain of relational databases, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)
Formal Lifelong E-Learning for Employability and Job Stability during Turbulent Times in Spain
ERIC Educational Resources Information Center
Martínez-Cerdá, Juan-Francisco; Torrent-Sellens, Joan
2017-01-01
In recent decades, international organizations have developed initiatives that incorporate lifelong learning as a tool to increase the employability of citizens. In this context, the goal of this research is to test the influence of formal e-learning on estimating employment status. The research made use of a sample of 595 citizens in 2007 and…
ERIC Educational Resources Information Center
Jones, W. Monty; Smith, Shaunna; Cohen, Jonathan
2017-01-01
This qualitative study examined preservice teachers' beliefs about using maker activities in formal educational settings. Eighty-two preservice and early-career teachers at three different universities in the United States took part in one-time workshops designed to introduce them to various maker tools and activities applicable to K-12…
NASA Langley Research and Technology-Transfer Program in Formal Methods
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.
1995-01-01
This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.
von Kodolitsch, Yskert; Bernhardt, Alexander M.; Robinson, Peter N.; Kölbel, Tilo; Reichenspurner, Hermann; Debus, Sebastian; Detter, Christian
2015-01-01
Background It is the physicians' task to translate evidence and guidelines into medical strategies for individual patients. To date, however, there has been no formal tool for performing this translation. Methods We introduce the analysis of strengths (S) and weaknesses (W) related to therapy with opportunities (O) and threats (T) related to individual patients as a tool to establish an individualized (I) medical strategy (I-SWOT). The I-SWOT matrix identifies four fundamental types of strategy. These comprise "SO" maximizing strengths and opportunities, "WT" minimizing weaknesses and threats, "WO" minimizing weaknesses and maximizing opportunities, and "ST" maximizing strengths and minimizing threats. Each distinct type of strategy may be considered for individualized medical strategies. Results We describe four steps of I-SWOT to establish an individualized medical strategy to treat aortic disease. In the first step, we define the goal of therapy and identify all evidence-based therapeutic options. In a second step, we assess the strengths and weaknesses of each therapeutic option in SW matrix form. In a third step, we assess opportunities and threats related to the individual patient, and in a final step, we use the I-SWOT matrix to establish an individualized medical strategy by matching "SW" with "OT". As an example, we present two 30-year-old patients with Marfan syndrome with identical medical history and aortic pathology. As a result of I-SWOT analysis of their individual opportunities and threats, we identified two distinct medical strategies for these patients. Conclusion I-SWOT is a formal but easy-to-use tool to translate medical evidence into individualized medical strategies. PMID:27069939
Garth, Belinda; Kirby, Catherine; Silberberg, Peter; Brown, James
2016-08-19
Learning plans are a compulsory component of the training and assessment requirements of general practice (GP) registrars in Australia. There is a small but growing number of studies reporting that learning plans are not well accepted or utilised in general practice training. There is a lack of research examining this apparent contradiction. The aim of this study was to examine the use and perceived utility of formal learning plans in GP vocational training. This mixed-method Australian national research project utilised online learning plan usage data from 208 GP registrars and semi-structured focus groups and telephone interviews with 35 GP registrars, 12 recently fellowed GPs, 16 supervisors and 17 medical educators across three Regional Training Providers (RTPs). Qualitative data were analysed thematically using template analysis. Learning plans were used mostly as a log of activities rather than as a planning tool. Most learning needs were entered and ticked off as complete on the same day. Learning plans were perceived as having little value for registrars in their journey to becoming competent GPs, and as a bureaucratic hurdle serving as a distraction rather than an aid to learning. The process of learning planning was valued more than the documentation of learning planning. This study provides credible evidence that mandated learning plans are broadly considered by users to be a bureaucratic impediment with little value as a learning tool. It is more important to support registrars in planning their learning than to enforce documentation of this process in a learning plan. If learning planning is to be an assessed competence, methods of assessment other than the submission of a formal learning plan should be explored.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
NASA Astrophysics Data System (ADS)
Bykovsky, A. Yu; Sherbakov, A. A.
2016-08-01
The C-valued Allen-Givone algebra is an attractive tool for modeling a robotic agent, but it requires the consensus method of minimization for the simplification of logic expressions. This procedure assigns the maximal truth value to some undefined states of the function, thus extending the initially given truth table. This in turn creates the problem of different formal representations for the same initially given function. Multi-criteria optimization is proposed for the deliberate choice of undefined states and model formation.
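The ambiguity the abstract describes can be made concrete: a partial multi-valued truth table admits many completions, and the consensus-style rule of assigning the maximal truth value picks only one of them. The one-variable, 4-valued table below is hypothetical.

    K = 4          # truth values 0..3 in a 4-valued logic
    UNDEF = None

    # Hypothetical partial truth table: argument -> truth value.
    partial = {0: 2, 1: UNDEF, 2: 0, 3: UNDEF}

    def complete(table, choice):
        # Extend a partial function by assigning each undefined state.
        return {a: (choice(a) if v is UNDEF else v) for a, v in table.items()}

    f_max = complete(partial, lambda a: K - 1)   # consensus-style: maximal value
    f_alt = complete(partial, lambda a: 0)       # another admissible completion

    # Both agree wherever the original function was defined ...
    assert all(f[a] == v for a, v in partial.items() if v is not UNDEF
               for f in (f_max, f_alt))
    # ... yet they are different total functions, i.e., different formal
    # representations of the same specification -- the ambiguity that the
    # proposed multi-criteria optimization resolves deliberately.
    print(f_max, f_alt, f_max != f_alt)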
Multifractal analysis of macro- and microcerebral circulation in rats
NASA Astrophysics Data System (ADS)
Pavlov, Alexey N.; Sindeeva, Olga S.; Sindeev, Sergey S.; Pavlova, Olga N.; Abdurashitov, Arkady S.; Rybalova, Elena V.; Semyachkina-Glushkovskaya, Oxana V.
2016-04-01
Application of noninvasive optical coherent-domain methods and advanced data processing tools such as the wavelet-based multifractal formalism makes it possible to reveal effective markers of early stages of functional disturbances in the dynamics of cerebral vessels. Based on experiments performed in rats, we discuss the possibility of diagnosing a hidden stage of the development of intracranial hemorrhage (ICH). We also consider responses of the cerebrovascular dynamics to a pharmacologically induced increase in peripheral blood pressure. We report distinctions occurring at the levels of macro- and microcerebral circulation.
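The flavor of such scaling analysis can be sketched with an increment-based structure-function estimator; the paper itself uses the wavelet-based multifractal formalism, so this is a simplified stand-in for the idea rather than the authors' method.

    import numpy as np

    def structure_exponents(x, qs, taus):
        # Estimate zeta(q) from S_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^zeta(q);
        # curvature of zeta(q) in q is the multifractality marker.
        zeta = []
        for q in qs:
            logS = [np.log(np.mean(np.abs(x[tau:] - x[:-tau]) ** q))
                    for tau in taus]
            zeta.append(np.polyfit(np.log(taus), logS, 1)[0])
        return np.array(zeta)

    rng = np.random.default_rng(1)
    x = np.cumsum(rng.standard_normal(4096))       # monofractal test signal
    qs = [1, 2, 3, 4]
    zeta = structure_exponents(x, qs, taus=np.arange(1, 64))
    # For this monofractal signal zeta(q) is close to linear (slope ~ 0.5);
    # deviations from linearity would signal multifractal dynamics.
    print(dict(zip(qs, np.round(zeta, 2))))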
Learning in non-formal education: Is it "youthful" for youth in action?
NASA Astrophysics Data System (ADS)
Norqvist, Lars; Leffler, Eva
2017-04-01
This article offers insights into the practices of a non-formal education programme for youth provided by the European Union (EU). It takes a qualitative approach and is based on a case study of the European Voluntary Service (EVS). Data were collected during individual and focus group interviews with learners (the EVS volunteers), decision takers and trainers, with the aim of deriving an understanding of learning in non-formal education. The research questions concerned learning, the recognition of learning and perspectives of usefulness. The study also examined the Youthpass documentation tool as a key to understanding the recognition of learning and to determine whether the learning was useful for learners (the volunteers). The findings and analysis offer several interpretations of learning, and the recognition of learning, which take place in non-formal education. The findings also revealed that it is complicated to divide learning into formal and non-formal categories; instead, non-formal education is useful for individual learners when both formal and non-formal educational contexts are integrated. As a consequence, the division of formal and non-formal (and possibly even informal) learning creates a gap which works against the development of flexible and interconnected education with ubiquitous learning and mobility within and across formal and non-formal education. This development is not in the best interests of learners, especially when seeking useful learning and education for youth (what the authors term "youthful" for youth in action).
New tools for emergency managers: an assessment of obstacles to use and implementation.
McCormick, Sabrina
2016-04-01
This paper focuses on the role of the formal response community's use of social media and crowdsourcing for emergency managers (EMs) in disaster planning, response and recovery in the United States. In-depth qualitative interviews with EMs on the Eastern seaboard at the local, state and federal level demonstrate that emergency management tools are in a state of transition--from formal, internally regulated tools for crisis response to an incorporation of new social media and crowdsourcing tools. The first set of findings provides insight into why many EMs are not using social media, and describes their concerns that result in fear, uncertainty and doubt. Second, this research demonstrates how internal functioning and staffing issues within these agencies present challenges. This research seeks to examine the dynamics of this transition and offer lessons for how to improve its outcomes--critical to millions of people across the United States. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Teamwork Assessment Tools in Modern Surgical Practice: A Systematic Review
Whittaker, George; Abboudi, Hamid; Khan, Muhammed Shamim; Dasgupta, Prokar; Ahmed, Kamran
2015-01-01
Introduction. Deficiencies in teamwork skills have been shown to contribute to the occurrence of adverse events during surgery. Consequently, several teamwork assessment tools have been developed to evaluate trainee nontechnical performance. This paper aims to provide an overview of these instruments and review the validity of each tool. Furthermore, the present paper aims to review the deficiencies surrounding training and propose several recommendations to address these issues. Methods. A systematic literature search was conducted to identify teamwork assessment tools using MEDLINE (1946 to August 2015), EMBASE (1974 to August 2015), and PsycINFO (1806 to August 2015) databases. Results. Eight assessment tools which encompass aspects of teamwork were identified. The Nontechnical Skills for Surgeons (NOTSS) assessment was found to possess the highest level of validity from a variety of sources; reliability and acceptability have also been established for this tool. Conclusions. Deficits in current surgical training pathways have prompted several recommendations to meet the evolving requirements of surgeons. Recommendations from the current paper include integration of teamwork training and assessment into medical school curricula, standardised formal training of assessors to ensure accurate evaluation of nontechnical skill acquisition, and integration of concurrent technical and nontechnical skills training throughout training. PMID:26425732
Formal methods for modeling and analysis of hybrid systems
NASA Technical Reports Server (NTRS)
Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)
2009-01-01
A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
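For a one-dimensional toy system, the abstraction step can be sketched in sympy: checking the Lie derivative at the roots of the abstraction polynomial plays the role that the full quantifier-elimination decision procedure plays in the patented technique.

    import sympy as sp

    x = sp.symbols('x', real=True)
    f = 1 - x**2          # toy polynomial dynamics: dx/dt = 1 - x^2
    p = x                 # abstraction polynomial; abstract state = sign(p)
    dp = sp.expand(sp.diff(p, x) * f)   # Lie derivative of p along the flow

    # The sign of p changes only through p = 0, and the direction of the
    # change at a root is the sign of dp there; unknown signs are kept
    # conservatively, so the abstraction over-approximates the dynamics.
    transitions = {(-1, -1), (0, 0), (1, 1)}            # self-loops
    for r in sp.real_roots(p):
        d = dp.subs(x, r)
        if d.is_positive:
            transitions |= {(-1, 0), (0, 1)}            # flow crosses upward
        elif d.is_negative:
            transitions |= {(1, 0), (0, -1)}            # flow crosses downward
        else:
            transitions |= {(-1, 0), (0, 1), (1, 0), (0, -1)}
    print(sorted(transitions))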
Harden, Angela; Thomas, James; Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Flemming, Kate; Booth, Andrew; Garside, Ruth; Hannes, Karin; Noyes, Jane
2018-05-01
The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method evidence from process evaluations. Despite a proliferation of methods for the synthesis of qualitative research, less attention has focused on how to integrate these syntheses within intervention effectiveness reviews. In this article, we report updated guidance from the group on approaches, methods, and tools, which can be used to integrate the findings from quantitative studies evaluating intervention effectiveness with those from qualitative studies and process evaluations. We draw on conceptual analyses of mixed methods systematic review designs and the range of methods and tools that have been used in published reviews that have successfully integrated different types of evidence. We outline five key methods and tools as devices for integration which vary in terms of the levels at which integration takes place; the specialist skills and expertise required within the review team; and their appropriateness in the context of limited evidence. In situations where the requirement is the integration of qualitative and process evidence within intervention effectiveness reviews, we recommend the use of a sequential approach. Here, evidence from each tradition is synthesized separately using methods consistent with each tradition before integration takes place using a common framework. Reviews which integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness in a systematic way are rare. This guidance aims to support review teams to achieve integration and we encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.
Weaving a Formal Methods Education with Problem-Based Learning
NASA Astrophysics Data System (ADS)
Gibson, J. Paul
The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation — how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.
ERIC Educational Resources Information Center
Chassapis, Dimitris
1999-01-01
Focuses on the process by which children develop a formal mathematical concept of the circle by using various instruments to draw circles within the context of a goal-directed drawing task. Concludes that the use of the compass in circle drawing structures the circle-drawing operation in a radically different fashion than circle tracers and…
NASA Astrophysics Data System (ADS)
Hoverman, Suzanne; Ayre, Margaret
2012-12-01
Indigenous landowners of the Tiwi Islands, Northern Territory, Australia, have begun the first formal freshwater allocation planning process in Australia entirely within Indigenous lands and waterways. The process is managed by the Northern Territory government agency responsible for water planning, the Department of Natural Resources, Environment, The Arts and Sport, in partnership with the Tiwi Land Council, the principal representative body for Tiwi Islanders on matters of land and water management and governance. Participatory planning methods ('tools') were developed to facilitate community participation in Tiwi water planning. The tools, selected for their potential to generate involvement in the planning process, needed both to incorporate Indigenous knowledge of water use and management and to raise awareness in the Indigenous community of Western science and water resources management. In consultation with the water planner and Tiwi Land Council officers, the researchers selected four main tools to develop, trial and evaluate. Results demonstrate that the tools provided mechanisms which acknowledge traditional management systems, improve community engagement, and build confidence in the water planning process. The researchers found that participatory planning approaches supported Tiwi natural resource management institutions both in determining appropriate institutional arrangements and in clarifying roles and responsibilities in the Islands' Water Management Strategy.
RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks
2016-10-09
Robotic tasks are becoming increasingly complex, and with them the robotic systems. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept…
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.
Ten Commandments Revisited: A Ten-Year Perspective on the Industrial Application of Formal Methods
NASA Technical Reports Server (NTRS)
Bowen, Jonathan P.; Hinchey, Michael G.
2005-01-01
Ten years ago, our 1995 paper Ten Commandments of Formal Methods suggested some guidelines to help ensure the success of a formal methods project. It proposed ten important requirements (or "commandments") for formal developers to consider and follow, based on our knowledge of several industrial application success stories, most of which have been reported in more detail in two books. The paper was surprisingly popular, is still widely referenced, and used as required reading in a number of formal methods courses. However, not all have agreed with some of our commandments, feeling that they may not be valid in the long-term. We re-examine the original commandments ten years on, and consider their validity in the light of a further decade of industrial best practice and experiences.
Sustaining Teacher Control in a Blog-Based Personal Learning Environment
ERIC Educational Resources Information Center
Tomberg, Vladimir; Laanpere, Mart; Ley, Tobias; Normak, Peeter
2013-01-01
Various tools and services based on Web 2.0 (mainly blogs, wikis, social networking tools) are increasingly used in formal education to create personal learning environments, providing self-directed learners with more freedom, choice, and control over their learning. In such distributed and personalized learning environments, the traditional role…
ERIC Educational Resources Information Center
Bulut, Mesut; Bars, Mehmet Emin
2013-01-01
Folk literature is an important educational tool for both the individual and society; it plays an important role in the transmission of culture between generations and is an important element of social culture. Folk tales, as products of folk literature, are one of the major…
Vasconcelos, Hemerson Bruno da Silva; Woods, David John
2017-01-01
This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions, except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292
Applying formal methods and object-oriented analysis to existing flight software
NASA Technical Reports Server (NTRS)
Cheng, Betty H. C.; Auernheimer, Brent
1993-01-01
Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.
Two-Step Formal Advertisement: An Examination.
1976-10-01
The purpose of this report is to examine the potential application of the Two-Step Formal Advertisement method of procurement. Emphasis is placed on...Step formal advertising is a method of procurement designed to take advantage of negotiation flexibility and at the same time obtain the benefits of...formal advertising. It is used where the specifications are not sufficiently definite or may be too restrictive to permit full and free competition.
Traditional Knowledge Strengthens NOAA's Environmental Education
NASA Astrophysics Data System (ADS)
Stovall, W. K.; McBride, M. A.; Lewinski, S.; Bennett, S.
2010-12-01
Environmental education efforts are increasingly recognizing the value of traditional knowledge, or indigenous science, as a basis to teach the importance of stewardship. The National Oceanic and Atmospheric Administration (NOAA) Pacific Services Center incorporates Polynesian indigenous science into formal and informal education components of its environmental literacy program. When indigenous science is presented side by side with NOAA science, it becomes clear that the scientific results are the same, although the methods may differ. The platforms for these tools span a vast spectrum, utilizing media from 3-D visualizations to storytelling and lecture. Navigating the Pacific Islands is a Second Life project in which users navigate a virtual Polynesian voyaging canoe between two islands, one featuring native Hawaiian practices and the other where users learn about NOAA research and ships. In partnership with the University of Hawai‘i Waikiki Aquarium, the Nana I Ke Kai (Look to the Sea) series focuses on connecting culture and science during cross-discipline, publicly held discussions between cultural practitioners and research scientists. The Indigenous Science Video Series is a multi-use, animated collection of short films that showcase the efforts of NOAA fisheries management and ship navigation in combination with accompanying Polynesian perspectives. Formal education resources and lesson plans for grades 3-5 focusing on marine science have also been developed and incorporate indigenous science practices as examples of conservation success. By merging traditional knowledge and stewardship practices with NOAA science, NOAA's Pacific Services Center is helping to increase environmental literacy through educational tools and resources that support place-based understanding and approaches.
Formal Methods for Verification and Validation of Partial Specifications: A Case Study
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
Modeling formalisms in Systems Biology
2011-01-01
Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
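As a minimal illustration of one reviewed formalism, a synchronous Boolean network can be simulated in a few lines; the three-gene circuit below is hypothetical, not taken from the paper.

    # Hypothetical three-gene circuit: C represses A, A activates B,
    # and A and B jointly activate C.
    rules = {
        "A": lambda s: not s["C"],
        "B": lambda s: s["A"],
        "C": lambda s: s["A"] and s["B"],
    }

    def step(state):
        # Synchronous update: every gene reads the previous state.
        return {g: f(state) for g, f in rules.items()}

    state = {"A": True, "B": False, "C": False}
    seen = []
    while state not in seen:          # iterate until a state repeats
        seen.append(state)
        state = step(state)
    print(f"revisited state after {len(seen)} steps:", state)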
Reduced discretization error in HZETRN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slaba, Tony C., E-mail: Tony.C.Slaba@nasa.gov; Blattnig, Steve R., E-mail: Steve.R.Blattnig@nasa.gov; Tweed, John, E-mail: jtweed@odu.edu
2013-02-01
The deterministic particle transport code HZETRN is an efficient analysis tool for studying the effects of space radiation on humans, electronics, and shielding materials. In a previous work, numerical methods in the code were reviewed, and new methods were developed that further improved efficiency and reduced overall discretization error. It was also shown that the remaining discretization error could be attributed to low energy light ions (A < 4) with residual ranges smaller than the physical step-size taken by the code. Accurately resolving the spectrum of low energy light particles is important in assessing risk associated with astronaut radiation exposure. In this work, modifications to the light particle transport formalism are presented that accurately resolve the spectrum of low energy light ion target fragments. The modified formalism is shown to significantly reduce overall discretization error and allows a physical approximation to be removed. For typical step-sizes and energy grids used in HZETRN, discretization errors for the revised light particle transport algorithms are shown to be less than 4% for aluminum and water shielding thicknesses as large as 100 g/cm^2 exposed to both solar particle event and galactic cosmic ray environments.
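The effect of step size on a marching scheme can be illustrated on a toy attenuation equation; this sketch only shows why halving the step roughly halves the error for a first-order method, and is unrelated to the actual HZETRN algorithms.

    import numpy as np

    def march(step, sigma=0.25, x_end=20.0):
        # First-order explicit march of d(phi)/dx = -sigma * phi,
        # a toy stand-in for a transport-code marching procedure.
        phi, x = 1.0, 0.0
        while x < x_end:
            phi += step * (-sigma * phi)
            x += step
        return phi

    exact = np.exp(-0.25 * 20.0)
    for h in (1.0, 0.5, 0.25, 0.125):
        err = abs(march(h) - exact) / exact
        print(f"step {h:5.3f}: relative discretization error {err:.1%}")
    # The error falls roughly linearly with the step, the signature of a
    # first-order scheme; the paper instead removes the error source for
    # ions whose residual range is below the physical step.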
Formal hardware verification of digital circuits
NASA Technical Reports Server (NTRS)
Joyce, J.; Seger, C.-J.
1991-01-01
The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
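The symbolic flavor of such verification can be sketched by checking a gate-level full adder against its arithmetic specification without enumerating input vectors one by one; this sympy toy stands in for the surveyed techniques, not for any particular tool.

    from sympy import symbols, Xor, And, Or, Equivalent
    from sympy.logic.inference import satisfiable

    a, b, cin = symbols('a b cin')

    # Gate-level "implementation" of a full adder ...
    s1 = Xor(a, b)
    sum_ = Xor(s1, cin)
    cout = Or(And(a, b), And(s1, cin))

    # ... and its specification (sum parity and majority carry).
    spec_sum = Xor(a, b, cin)
    spec_cout = Or(And(a, b), And(a, cin), And(b, cin))

    # Equivalence holds iff the inequivalence is unsatisfiable.
    for impl, spec in ((sum_, spec_sum), (cout, spec_cout)):
        assert satisfiable(~Equivalent(impl, spec)) is False
    print("full adder verified against its specification")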
Structured decision making: Chapter 5
Runge, Michael C.; Grand, James B.; Mitchell, Michael S.; Krausman, Paul R.; Cain, James W. III
2013-01-01
Wildlife management is a decision-focused discipline. It needs to integrate traditional wildlife science and social science to identify actions that are most likely to achieve the array of desires society has surrounding wildlife populations. Decision science, a vast field with roots in economics, operations research, and psychology, offers a rich set of tools to help wildlife managers frame, decompose, analyze, and synthesize their decisions. The nature of wildlife management as a decision science has been recognized since the inception of the field, but formal methods of decision analysis have been underused. There is tremendous potential for wildlife management to grow further through the use of formal decision analysis. First, the wildlife science and human dimensions of wildlife disciplines can be readily integrated. Second, decisions can become more efficient. Third, decision makers can communicate more clearly with stakeholders and the public. Fourth, good, intuitive wildlife managers, by explicitly examining how they make decisions, can translate their art into a science that is readily used by the next generation.
Resampling approach for anomalous change detection
NASA Astrophysics Data System (ADS)
Theiler, James; Perkins, Simon
2007-04-01
We investigate the problem of identifying pixels in pairs of co-registered images that correspond to real changes on the ground. Changes that are due to environmental differences (illumination, atmospheric distortion, etc.) or sensor differences (focus, contrast, etc.) will be widespread throughout the image, and the aim is to avoid these changes in favor of changes that occur in only one or a few pixels. Formal outlier detection schemes (such as the one-class support vector machine) can identify rare occurrences, but will be confounded by pixels that are "equally rare" in both images: they may be anomalous, but they are not changes. We describe a resampling scheme we have developed that formally addresses both of these issues, and reduces the problem to a binary classification, a problem for which a large variety of machine learning tools have been developed. In principle, the effects of misregistration will manifest themselves as pervasive changes, and our method will be robust against them - but in practice, misregistration remains a serious issue.
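A minimal sketch of the reduction to binary classification, assuming synthetic one-band "images" and a logistic-regression classifier; the authors' actual implementation details differ.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Two co-registered "images" as flat pixel lists: y is x under a
    # global gain change plus noise, with ten genuine changes implanted.
    n = 5000
    x = rng.standard_normal(n)
    y = 1.3 * x + 0.1 * rng.standard_normal(n)
    y[:10] += 3.0                       # the real on-the-ground changes

    def feats(x, y):
        # Quadratic features let a linear classifier capture the
        # dependence between the two images.
        return np.column_stack([x, y, x * y, x**2, y**2])

    # Resampling: pairing x with a permuted y samples the product of the
    # marginals, so "original vs permuted pair" is a binary classification.
    X = np.vstack([feats(x, y), feats(x, rng.permutation(y))])
    labels = np.r_[np.ones(n), np.zeros(n)]
    clf = LogisticRegression(max_iter=1000).fit(X, labels)

    # Real pairs scored as "permuted-like" are anomalous-change candidates.
    scores = clf.predict_proba(feats(x, y))[:, 0]
    print("top change candidates:", np.argsort(scores)[-10:])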
Primordial Black Holes from First Principles (Overview)
NASA Astrophysics Data System (ADS)
Lam, Casey; Bloomfield, Jolyon; Moss, Zander; Russell, Megan; Face, Stephen; Guth, Alan
2017-01-01
Given a power spectrum from inflation, our goal is to calculate, from first principles, the number density and mass spectrum of primordial black holes that form in the early universe. Previously, these have been calculated using the Press-Schechter formalism and some demonstrably dubious rules of thumb regarding predictions of black hole collapse. Instead, we use Monte Carlo integration methods to sample field configurations from a power spectrum combined with numerical relativity simulations to obtain a more accurate picture of primordial black hole formation. We demonstrate how this can be applied for both Gaussian perturbations and the more interesting (for primordial black holes) theory of hybrid inflation. One of the tools that we employ is a variant of the BBKS formalism for computing the statistics of density peaks in the early universe. We discuss the issue of overcounting due to subpeaks that can arise from this approach (the "cloud-in-cloud" problem). MIT UROP Office - Paul E. Gray (1954) Endowed Fund.
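One ingredient of such a calculation, sampling field configurations from a given power spectrum, can be sketched with the standard FFT method below. This is a minimal illustration only; the function names, the 2-D restriction, and the power-law spectrum are choices made here, not details from the abstract.

```python
import numpy as np

def gaussian_random_field(n, box_size, power_spectrum, seed=0):
    """Sample a 2-D Gaussian random field with a given isotropic power spectrum.

    Built in Fourier space: complex white noise is scaled by sqrt(P(k)) and
    inverse-transformed; taking the real part yields a real Gaussian field
    with the desired correlation structure (overall normalization is
    illustrative only).
    """
    rng = np.random.default_rng(seed)
    k = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)   # mode frequencies
    kx, ky = np.meshgrid(k, k, indexing="ij")
    kmag = np.hypot(kx, ky)
    amp = np.zeros_like(kmag)
    amp[kmag > 0] = np.sqrt(power_spectrum(kmag[kmag > 0]))  # sqrt(P(k)) scaling
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return np.fft.ifft2(amp * noise).real

# Purely illustrative power-law spectrum P(k) ~ k^-2:
delta = gaussian_random_field(256, box_size=100.0,
                              power_spectrum=lambda k: k**-2.0)
print(delta.shape, float(delta.std()))
```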
Control by quality: proposition of a typology.
Pujo, P; Pillet, M
The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. All this counterbalances the intrinsically fluctuating human behavior of the control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for control by quality.
An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem
NASA Technical Reports Server (NTRS)
Hosheleva, Olga
1997-01-01
How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to Gödel's famous theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representations more algorithmic, a special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely, logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first order logical theory.
Formalization of the Integral Calculus in the PVS Theorem Prover
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
2004-01-01
The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. The PVS prover, though fully equipped to support deduction in a very general logic framework, namely higher-order logic, must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science that is used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
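In the standard textbook form followed here, the culminating theorem reads: if $f$ is continuous on $[a,b]$ and

$$F(x) \;=\; \int_a^x f(t)\,dt,$$

then $F$ is differentiable on $[a,b]$ with $F'(x) = f(x)$; consequently $\int_a^b f(t)\,dt = G(b) - G(a)$ for every antiderivative $G$ of $f$.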
Almedom, Astier M.; Tesfamichael, Berhe; Yacob, Abdu; Debretsion, Zaïd; Teklehaimanot, Kidane; Beyene, Teshome; Kuhn, Kira; Alemu, Zemui
2003-01-01
OBJECTIVE: To establish the context in which maternal psychosocial well-being is understood in war-affected settings in Eritrea. METHOD: Pretested and validated participatory methods and tools of investigation and analysis were employed to allow participants to engage in processes of qualitative data collection, on-site analysis, and interpretation. FINDINGS: Maternal psychosocial well-being in Eritrea is maintained primarily by traditional systems of social support that are mostly outside the domain of statutory primary care. Traditional birth attendants provide a vital link between the two. Formal training and regular supplies of sterile delivery kits appear to be worthwhile options for health policy and practice in the face of the post-conflict challenges of ruined infrastructure and an overstretched and/or ill-mannered workforce in the maternity health service. CONCLUSION: Methodological advances in health research and the dearth of data on maternal psychosocial well-being in complex emergency settings call for scholars and practitioners to collaborate in creative searches for sound evidence on which to base maternity, mental health and social care policy and practice. Participatory methods facilitate the meaningful engagement of key stakeholders and enhance data quality, reliability and usability. PMID:12856054
Creating more effective mentors: Mentoring the mentor
Gandhi, Monica; Johnson, Mallory
2016-01-01
Introduction Given the diversity of those affected by HIV, increasing diversity in the HIV biomedical research workforce is imperative. A growing body of empirical and experimental evidence supports the importance of strong mentorship in the development and success of trainees and early career investigators in academic research settings, especially for mentees of diversity. Often missing from this discussion is the need for robust mentoring training programs to ensure that mentors are trained in best practices on the tools and techniques of mentoring. Recent experimental evidence shows improvement in mentor and mentee perceptions of mentor's competency after structured and formalized training on best practices in mentoring. Methods We developed a 2-day "Mentoring the Mentors" workshop at UCSF to train mid-level and senior HIV researchers from around the country (recruited mainly from Centers for AIDS Research (CFARs)) on best practices, tools and techniques of effective mentoring. The workshop content was designed using principles of Social Cognitive Career Theory (SCCT) and included training specific to working with early career investigators from underrepresented groups, including training on unconscious bias, microaggressions, and diversity supplements. The workshop has been held 3 times (September 2012, October 2013 and May 2015) with plans for annual training. Mentoring competency was measured using a validated tool before and after each workshop. Results Mentoring competency skills in six domains of mentoring - specifically effective communication, aligning expectations, assessing understanding, fostering independence, addressing diversity and promoting development - all improved as assessed by a validated measurement tool administered to participants before and after the "Mentoring the Mentors" training workshops. Qualitative assessments indicated a greater awareness of the micro-insults and unconscious bias experienced by mentees of diversity and a commitment to improve awareness and mitigate these effects via the mentor-mentee relationship. Discussion Our "Mentoring the Mentors" workshop for HIV researchers/mentors offers a formal and structured curriculum on best practices, tools and techniques of effective mentoring, and methods to mitigate unconscious bias in the mentoring relationship and at the institutional level with mentees of diversity. We found quantitative and qualitative improvements in mentoring skills as assessed by self-report by participants after each workshop and plan additional programs with longitudinal longer-term assessments focused on objective mentee outcomes (grants, papers, academic retention). Mentoring training can improve mentoring skills and is likely to improve outcomes for optimally-mentored mentees. PMID:27039092
Brass, E P; Lofstedt, R; Renn, O
2011-12-01
Nonprescription drugs pose unique challenges to regulators. The fact that the barriers to access are lower for nonprescription drugs as compared with prescription drugs may permit additional consumers to obtain effective drugs. However, the use of these drugs by consumers in the absence of supervision by a health-care professional may result in unacceptable rates of misuse and suboptimal clinical outcomes. A value-tree method is proposed that defines important benefit and risk domains relevant to nonprescription drugs. This value tree can be used to comprehensively identify product-specific attributes in each domain and can also support formal benefit-risk assessment using a variety of tools. This is illustrated here, using a modification of the International Risk Governance Council (IRGC) framework, a flexible tool previously applied in a number of fields, which systematizes an approach to issue review, early alignment of stakeholders, evaluation, and risk mitigation/management. The proposed approach has the potential to provide structured, transparent tools for regulatory decision making for nonprescription drugs.
Overton, Edgar Turner; Kauwe, John S.K.; Paul, Rob; Tashima, Karen; Tate, David F.; Patel, Pragna; Carpenter, Chuck; Patty, David; Brooks, John T.; Clifford, David B
2013-01-01
HIV-associated neurocognitive disorders (HAND) remain prevalent but challenging to diagnose, particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. In a multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements from the brief computerized tool (p<0.01). These data confirm previous correlation data with the computerized battery, yet illustrate remaining challenges for neurocognitive screening. PMID:21877204
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
Modeling of tool path for the CNC sheet cutting machines
NASA Astrophysics Data System (ADS)
Petunin, Aleksandr A.
2015-11-01
In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as discrete optimization problems (generalized traveling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP, we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
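Chentsov's dynamic-programming model itself is not reproduced in the abstract; the sketch below only illustrates the GTSP structure of the task, where each contour is a "megalopolis" of candidate pierce points and a tour picks one point per contour. All coordinates and costs are hypothetical.

```python
import itertools, math

# Each "megalopolis" is one closed contour with several candidate pierce
# points; a tour must visit every contour exactly once through one chosen
# point.  Brute force suffices for this toy instance.
contours = [
    [(0.0, 0.0), (0.0, 5.0)],     # candidate pierce points, contour 0
    [(10.0, 0.0), (10.0, 5.0)],   # contour 1
    [(5.0, 10.0), (6.0, 9.0)],    # contour 2
]
home = (0.0, -5.0)                # tool start position

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

best_cost, best_tour = float("inf"), None
for order in itertools.permutations(range(len(contours))):
    for picks in itertools.product(*(contours[i] for i in order)):
        # Air-move cost: home -> first pierce -> ... -> last pierce.
        cost = dist(home, picks[0]) + sum(
            dist(picks[i], picks[i + 1]) for i in range(len(picks) - 1))
        if cost < best_cost:
            best_cost, best_tour = cost, list(zip(order, picks))
print("minimal air-move length:", round(best_cost, 3))
print("(contour, pierce point) sequence:", best_tour)
```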
Demography and Public Health Emergency Preparedness: Making the Connection
Katz, Rebecca
2009-01-01
The tools and techniques of population sciences are extremely relevant to the discipline of public health emergency preparedness: protecting and securing the population’s health requires information about that population. While related fields such as security studies have successfully integrated demographic tools into their research and literature, the theoretical and practical connection between the methods of demography and the practice of public health emergency preparedness is weak. This article suggests the need to further the interdisciplinary use of demography by examining the need for a systematic use of population science techniques in public health emergency preparedness. Ultimately, we demonstrate how public health emergency preparedness can incorporate demography to develop more effective preparedness plans. Important policy implications emerge: demographers and preparedness experts need to collaborate more formally in order to facilitate community resilience and mitigate the consequences of public health emergencies. PMID:20694030
A New Look at NASA: Strategic Research In Information Technology
NASA Technical Reports Server (NTRS)
Alfano, David; Tu, Eugene (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.
The Zeldovich & Adhesion approximations and applications to the local universe
NASA Astrophysics Data System (ADS)
Hidding, Johan; van de Weygaert, Rien; Shandarin, Sergei
2016-10-01
The Zeldovich approximation (ZA) predicts the formation of a web of singularities. While these singularities may only exist in the most formal interpretation of the ZA, they provide a powerful tool for the analysis of initial conditions. We present a novel method to find the skeleton of the resulting cosmic web based on singularities in the primordial deformation tensor and its higher order derivatives. We show that the A₃ lines predict the formation of filaments in a two-dimensional model. We continue with applications of the adhesion model to visualise structures in the local (z < 0.03) universe.
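For reference, the ZA in its standard form maps Lagrangian coordinates q to Eulerian positions x via

$$\mathbf{x}(\mathbf{q},t) \;=\; \mathbf{q} - D(t)\,\nabla_{\mathbf{q}}\Psi(\mathbf{q}),$$

so the density follows from the deformation tensor $\psi_{ij} = \partial^2\Psi/\partial q_i\,\partial q_j$ as

$$\rho(\mathbf{x},t) \;=\; \frac{\bar\rho}{\prod_{i=1}^{3}\big[1 - D(t)\,\lambda_i(\mathbf{q})\big]},$$

with $\lambda_i$ the eigenvalues of $\psi_{ij}$. Singularities (caustics) form wherever $D(t)\,\lambda_i(\mathbf{q}) = 1$, and the classification into A₃ and higher singularity types refines this condition.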
The parser generator as a general purpose tool
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
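To give a flavor of the kind of small grammar involved, here is a minimal sketch for a toy query language. The grammar, names, and hand-written recursive-descent rendering are illustrative only; the paper's examples are table-driven parsers generated from grammars.

```python
import re

# Toy grammar:
#   query -> "select" IDENT ("," IDENT)* "from" IDENT
TOKEN = re.compile(r"\s*([A-Za-z_]\w*|,)")

def tokenize(src):
    pos, out = 0, []
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at {src[pos:]!r}")
        out.append(m.group(1))
        pos = m.end()
    return out

def parse_query(tokens):
    def expect(t):
        if not tokens or tokens[0] != t:
            raise SyntaxError(f"expected {t!r}, got {tokens[:1]}")
        return tokens.pop(0)
    def ident():
        if not tokens or tokens[0] in ("select", "from", ","):
            raise SyntaxError("expected identifier")
        return tokens.pop(0)
    expect("select")
    fields = [ident()]
    while tokens and tokens[0] == ",":
        tokens.pop(0)
        fields.append(ident())
    expect("from")
    return {"fields": fields, "table": ident()}

print(parse_query(tokenize("select name, age from users")))
# {'fields': ['name', 'age'], 'table': 'users'}
```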
Periodically-Scheduled Controller Analysis using Hybrid Systems Reachability and Continuization
2015-12-01
... formally as a hybrid automaton. However, reachability tools to verify specifications for hybrid automata do not perform well on such periodically scheduled models. This is due to a combination of the large ... an additive nondeterministic input. Reachability tools for hybrid automata can better handle such systems. We further improve the analysis by ...
Software Validation via Model Animation
NASA Technical Reports Server (NTRS)
Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.
2015-01-01
This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
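The comparison step itself is simple to sketch. The functions below are stand-ins: the real oracle is the PVS model evaluated by PVSio and the real target is the flight code, neither of which is reproduced here.

```python
import math
import random

def model_trajectory(x0, v, t):   # stand-in oracle: exact kinematics
    return x0 + v * t

def impl_trajectory(x0, v, t):    # stand-in implementation (floating point)
    return math.fsum([x0] + [v * t / 16.0] * 16)

def animate_and_compare(n_cases=1000, tol=1e-6, seed=1):
    """Run both versions on random inputs and check agreement within tol."""
    rng = random.Random(seed)
    worst = 0.0
    for _ in range(n_cases):
        x0, v, t = (rng.uniform(-1e4, 1e4) for _ in range(3))
        err = abs(model_trajectory(x0, v, t) - impl_trajectory(x0, v, t))
        worst = max(worst, err)
        assert err <= tol, f"disagreement {err} on inputs {(x0, v, t)}"
    return worst

print("worst observed deviation:", animate_and_compare())
```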
Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Seah, Chin
2009-01-01
During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide a historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.
Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.
2006-01-01
NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous NanoTechnology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time make it more difficult to design such systems and ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.
Scriptwriting as a Tool for Learning Stylistic Variation
ERIC Educational Resources Information Center
Saugera, Valerie
2011-01-01
A film script is a useful tool for allowing students to experiment with language variation. Scripts of love stories comprise a range of language contexts, each triggering a different style on a formal-neutral-informal linguistic continuum: (1) technical cinematographic language in camera directions; (2) narrative language in exposition of scenes,…
Tools and Traits for Highly Effective Science Teaching, K-8
ERIC Educational Resources Information Center
Vasquez, Jo Anne
2007-01-01
Even if the reader has little formal training or background knowledge in science, "Tools & Traits for Highly Effective Science Teaching, K-8" pulls together cognitive and educational research to present an indispensable framework for science in the elementary and middle grades. Readers will discover teaching that increases students' engagement and…
ReACT!: An Interactive Educational Tool for AI Planning for Robotics
ERIC Educational Resources Information Center
Dogmus, Zeynep; Erdem, Esra; Patogulu, Volkan
2015-01-01
This paper presents ReAct!, an interactive educational tool for artificial intelligence (AI) planning for robotics. ReAct! enables students to describe robots' actions and change in dynamic domains without first having to know about the syntactic and semantic details of the underlying formalism, and to solve planning problems using…
Integrating Technology into Peer Leader Responsibilities
ERIC Educational Resources Information Center
Johnson, Melissa L.
2012-01-01
Technology has become an integral part of landscape of higher education. Students are coming to college with an arsenal of technological tools at their disposal. These tools are being used for informal, everyday communication as well as for formal learning in the classroom. At the same time, higher education is experiencing an increase in peer…
Davis, Thomas D
2017-01-01
Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.
Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K
1999-01-01
A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application posits new essential problems for software. Particularly protection tools, which are sufficient separately, become deficient during the integration due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools, if the object is stored as a record in DB tables. The solution of the problem should be found only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DMBS are suggested. Remote users connected via global networks are considered too.
NASA Technical Reports Server (NTRS)
Jamsek, Damir A.
1993-01-01
A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods approach by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Space Station Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.
Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Roberts, Larry W.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Properties of a Formal Method to Model Emergence in Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.
Formal Methods of V&V of Partial Specifications: An Experience Report
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the SCR method to testing for consistency properties of a partial model of requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
NASA Astrophysics Data System (ADS)
Nagai, Tetsuro
2017-01-01
Replica-exchange molecular dynamics (REMD) has demonstrated its efficiency by combining trajectories of a wide range of temperatures. As an extension of the method, the author formalizes the mass-manipulating replica-exchange molecular dynamics (MMREMD) method that allows for arbitrary mass scaling with respect to temperature and individual particles. The formalism enables the versatile application of mass-scaling approaches to the REMD method. The key change introduced in the novel formalism is the generalized rules for the velocity and momentum scaling after accepted replica-exchange attempts. As an application of this general formalism, the refinement of the viscosity-REMD (V-REMD) method [P. H. Nguyen,
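In standard REMD, an accepted exchange between replicas at temperatures $T$ and $T'$ rescales velocities by $\sqrt{T'/T}$. If particle masses are also scaled, from $m_i$ to $m_i'$, one natural rule that preserves the Maxwell-Boltzmann distribution (a reconstruction for illustration, not necessarily the paper's exact rule) is

$$v_i' \;=\; \sqrt{\frac{T'\,m_i}{T\,m_i'}}\;v_i \qquad\Longleftrightarrow\qquad p_i' \;=\; \sqrt{\frac{T'\,m_i'}{T\,m_i}}\;p_i,$$

which keeps the reduced kinetic energy $m_i v_i^2 / k_B T$ invariant across the exchange.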
Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop
NASA Technical Reports Server (NTRS)
Rozier, Kristin Yvonne (Editor)
2008-01-01
Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Solving the three-body Coulomb breakup problem using exterior complex scaling
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.
2004-05-17
Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum have made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.
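The exterior complex scaling transformation itself takes the standard form

$$r \;\mapsto\; R(r) \;=\; \begin{cases} r, & r \le R_0, \\ R_0 + (r - R_0)\,e^{i\theta}, & r > R_0, \end{cases}$$

which leaves the Schrödinger equation untouched for $r \le R_0$ while turning outgoing waves $e^{ikr}$ into exponentially decaying functions for $r > R_0$, so that pure outgoing boundary conditions become a simple square-integrability condition.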
Spin coefficients and gauge fixing in the Newman-Penrose formalism
NASA Astrophysics Data System (ADS)
Nerozzi, Andrea
2017-03-01
Since its introduction in 1962, the Newman-Penrose formalism has been widely used in analytical and numerical studies of Einstein's equations, like for example for the Teukolsky master equation, or as a powerful wave extraction tool in numerical relativity. Despite the many applications, Einstein's equations in the Newman-Penrose formalism appear complicated and not easily applicable to general studies of spacetimes, mainly because physical and gauge degrees of freedom are mixed in a nontrivial way. In this paper we approach the whole formalism with the goal of expressing the spin coefficients as functions of tetrad invariants once a particular tetrad is chosen. We show that it is possible to do so, and give for the first time a general recipe for the task, as well as an indication of the quantities and identities that are required.
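For orientation, the spin coefficients are tetrad projections of the connection; in one common convention, for a null tetrad $(l^a, n^a, m^a, \bar m^a)$,

$$\kappa = -m^a l^b \nabla_b l_a, \qquad \sigma = -m^a m^b \nabla_b l_a, \qquad \rho = -m^a \bar m^b \nabla_b l_a,$$

with the remaining coefficients defined analogously from $n^a$ and $m^a$. Sign and ordering conventions vary between authors, which is part of what makes a tetrad-invariant formulation attractive.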
Understanding visualization: a formal approach using category theory and semiotics.
Vickers, Paul; Faith, Joe; Rossiter, Nick
2013-06-01
This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.
Can mathematics explain the evolution of human language?
Witzany, Guenther
2011-09-01
Investigation into the sequence structure of the genetic code by means of an informatic approach is a real success story. The features of human language are also the object of investigation within the realm of formal language theories. They focus on the common rules of a universal grammar that lies behind all languages and determine generation of syntactic structures. This universal grammar is a depiction of material reality, i.e., the hidden logical order of things and its relations determined by natural laws. Therefore mathematics is viewed not only as an appropriate tool to investigate human language and genetic code structures through computer science-based formal language theory but is itself a depiction of material reality. This confusion between language as a scientific tool to describe observations/experiences within cognitive constructed models and formal language as a direct depiction of material reality occurs not only in current approaches but was the central focus of the philosophy of science debate in the twentieth century, with rather unexpected results. This article recalls these results and their implications for more recent mathematical approaches that also attempt to explain the evolution of human language.
An ORCID based synchronization framework for a national CRIS ecosystem.
Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno
2015-01-01
PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects on how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholder and essential to certificate compliant services.
2009-05-18
serves as a didactic tool to understand the information required for the approach to coordinate-free tracking and navigation problems. Paths in the physical layout and in the CN-Complex can be compared using algebraic topological tools; chapter 2 covers the mathematical tools necessary to make the discussion formal, and chapter 3 presents the construction of a simplicial representation called the CN-Complex.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process is the collection and analysis of inspection data to effect continual improvement in the inspection process and in the quality of the software subjected to the process.
Random-Phase Approximation Methods
NASA Astrophysics Data System (ADS)
Chen, Guo P.; Voora, Vamsee K.; Agee, Matthew M.; Balasubramani, Sree Ganesh; Furche, Filipp
2017-05-01
Random-phase approximation (RPA) methods are rapidly emerging as cost-effective validation tools for semilocal density functional computations. We present the theoretical background of RPA in an intuitive rather than formal fashion, focusing on the physical picture of screening and simple diagrammatic analysis. A new decomposition of the RPA correlation energy into plasmonic modes leads to an appealing visualization of electron correlation in terms of charge density fluctuations. Recent developments in the areas of beyond-RPA methods, RPA correlation potentials, and efficient algorithms for RPA energy and property calculations are reviewed. The ability of RPA to approximately capture static correlation in molecules is quantified by an analysis of RPA natural occupation numbers. We illustrate the use of RPA methods in applications to small-gap systems such as open-shell d- and f-element compounds, radicals, and weakly bound complexes, where semilocal density functional results exhibit strong functional dependence.
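In the adiabatic-connection fluctuation-dissipation formulation usually taken as the starting point (a standard expression, not one specific to this review), the RPA correlation energy is

$$E_c^{\mathrm{RPA}} \;=\; \frac{1}{2\pi}\int_0^{\infty} d\omega \;\mathrm{Tr}\Big[\ln\big(1 - \chi_0(i\omega)\,v\big) + \chi_0(i\omega)\,v\Big],$$

where $\chi_0(i\omega)$ is the noninteracting (Kohn-Sham) density response function and $v$ the Coulomb interaction; the plasmonic decomposition mentioned above rewrites this trace as a sum over collective modes.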
NASA Astrophysics Data System (ADS)
Silberman, Donn M.
2014-09-01
For the July 2013 issue of SPIE Professional Magazine, I was invited to write, and published, an article related to this topic. This paper chronicles the progress made since that time and describes our direction towards bringing optics education from the informal programs we have provided for more than 10 years to incorporating optics and photonics instruction into formal class curricula. A major educational tool we are using was introduced at this conference two years ago and came to us from EYEST vzw. The Photonics Explorer Kit has been used as a foundation during some OptoBotics courses, and it has been provided, along with a teacher training session, to 10 local high school science teachers in Orange County, CA. The goal of this first phase is to obtain feedback from the teachers as they use the materials in their formal classroom settings and after-school activities, such as science classes and robotics club activities. Results of the teachers' initial feedback will be reviewed and future directions outlined. One clear direction is to understand the changes that will be required to the kits to formally gain acceptance as part of the California state high school science curriculum. Another is to use the Photonics Explorer kits (and other similar tools) to teach students in robotics clubs how to give their robots "eyes."
NASA Astrophysics Data System (ADS)
Miyajima, Hiroyuki; Yuhara, Naohiro
Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are composed of humans, plants, and material circulation systems. The plants supply food to the humans or regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate substances disposed of by humans and plants. RLSS has attracted attention as manned space activities have shifted from short trips to long-term stays at such bases as a space station, a lunar base, or a Mars base. The present typical space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where RLSS recycles only water and air. In order to accommodate prolonged and extended manned activity in future space bases, RLSS that implements food production and regeneration of resources at once using plants is expected to be developed. The configuration of RLSS should be designed to suit its own duty, for which design requirements for RLSS with an unprecedented configuration may arise. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) of Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) of the U.S., and BIOS3 of Russia. Thus, for the above reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this design process.
Staton, Lisa J; Kraemer, Suzanne M; Patel, Sangnya; Talente, Gregg M; Estrada, Carlos A
2007-01-01
Background The Accreditation Council on Graduate Medical Education (ACGME) supports chart audit as a method to track competency in Practice-Based Learning and Improvement. We examined whether peer chart audits performed by internal medicine residents were associated with improved documentation of foot care in patients with diabetes mellitus. Methods A retrospective electronic chart review was performed on 347 patients with diabetes mellitus cared for by internal medicine residents in a university-based continuity clinic from May 2003 to September 2004. Residents abstracted information pertaining to documentation of foot examinations (neurological, vascular, and skin) from the charts of patients followed by their physician peers. No formal feedback or education was provided. Results Significant improvement in the documentation of foot exams was observed over the course of the study. The percentage of patients receiving neurological, vascular, and skin exams increased by 20% (from 13% to 33%) (p = 0.001), 26% (from 45% to 71%) (p < 0.001), and 18% (from 51% to 72%) (p = 0.005), respectively. Similarly, the proportion of patients receiving a well-documented exam which includes all three components - neurological, vascular and skin foot exam - increased over time (6% to 24%, p < 0.001). Conclusion Peer chart audits performed by residents in the absence of formal feedback were associated with improved documentation of the foot exam in patients with diabetes mellitus. Although this study suggests that peer chart audits may be an effective tool to improve practice-based learning and documentation of foot care in diabetic patients, evaluating the actual performance of clinical care was beyond the scope of this study and would be better addressed by a randomized controlled trial. PMID:17662124
EU Strategies of Integrating ICT into Initial Teacher Training
ERIC Educational Resources Information Center
Garapko, Vitaliya
2013-01-01
Education and learning are strongly linked with society and its evolution and knowledge. In the field of formal education, ICTs are increasingly deployed as tools to extend the learner's capacity to perceive, understand and communicate, as seen in the increase in online learning programs and the use of the computer as a learning support tool in…
ERIC Educational Resources Information Center
Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi
2013-01-01
Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…
NASA Technical Reports Server (NTRS)
Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno
2013-01-01
Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
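The sine-wave bound mentioned above is easy to illustrate: if two fault-free channels sample the same command $A\sin(\omega t)$ with a worst-case skew of $\delta$ seconds, the mean-value theorem bounds their disagreement by $A\omega\delta$. The sketch below checks that numerically; all numbers are hypothetical, and this is no substitute for the report's SAL models and k-induction proofs.

```python
import math

# Two redundant channels sample A*sin(w*t) with worst-case skew delta.
# Mean-value theorem: |u1 - u2| <= A * w * delta.
A, w, delta = 1.0, 2 * math.pi * 5.0, 0.4e-3   # amplitude, rad/s, skew (s)

analytic_bound = A * w * delta
worst = max(
    abs(A * math.sin(w * t) - A * math.sin(w * (t + delta)))
    for t in (k * 1e-5 for k in range(20000))   # scan a 0.2 s window
)
print(f"observed worst-case force-fight: {worst:.6f}")
print(f"analytic bound A*w*delta:        {analytic_bound:.6f}")
assert worst <= analytic_bound + 1e-12
```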
Application of Canonical Effective Methods to Background-Independent Theories
NASA Astrophysics Data System (ADS)
Buyukcam, Umut
Effective formalisms play an important role in analyzing phenomena above some given length scale when complete theories are not accessible. In diverse exotic but physically important cases, the usual path-integral techniques used in a standard Quantum Field Theory approach seldom serve as adequate tools. This thesis presents a new effective method for quantum systems, called the Canonical Effective Method, which has particularly wide applicability in background-independent theories, as in the case of gravitational phenomena. The central purpose of this work is to employ these techniques to obtain semi-classical dynamics from canonical quantum gravity theories. An application to non-associative quantum mechanics is developed and testable results are obtained. Types of non-associative algebras relevant for magnetic-monopole systems are discussed. Possible modifications of the hypersurface deformation algebra and the emergence of effective space-times are presented.
Teif, Vladimir B.
2007-01-01
The transfer matrix methodology is proposed as a systematic tool for the statistical–mechanical description of DNA–protein–drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the OR operator of phage λ. The transfer matrix formalism allowed the description of the λ-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI–Cro–RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the OR and OL operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters PR and PRM becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed. PMID:17526526
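To make the bookkeeping behind such calculations concrete, the sketch below computes a transfer-matrix partition function for the simplest case: a single ligand species covering m consecutive sites on a circular DNA lattice. The state encoding and parameters are illustrative only and fall far short of the paper's competitive, cooperative, multilayer and looping models.

```python
# Minimal transfer-matrix sketch for one ligand species covering m sites
# (illustrative; omits competition, cooperativity, multilayer assembly, looping).
import numpy as np

def partition_function(n_sites, m, Kc):
    # Site states: 0 = free; j in 1..m = j-th site under a bound ligand.
    T = np.zeros((m + 1, m + 1))
    T[0, 0] = 1.0          # free site -> free site
    T[0, 1] = Kc           # free site -> a new ligand starts (weight K*c)
    for j in range(1, m):
        T[j, j + 1] = 1.0  # the ligand footprint continues
    T[m, 0] = 1.0          # ligand ends -> free site
    T[m, 1] = Kc           # ligand ends -> next ligand starts immediately
    return np.trace(np.linalg.matrix_power(T, n_sites))  # circular lattice

# Mean number of bound ligands from d ln Z / d ln(Kc), by central difference:
Kc, h = 0.5, 1e-5
n_bound = (np.log(partition_function(120, 3, Kc + h))
           - np.log(partition_function(120, 3, Kc - h))) / (2 * h / Kc)
print(n_bound)
```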
Formal methods and their role in digital systems validation for airborne systems
NASA Technical Reports Server (NTRS)
Rushby, John
1995-01-01
This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.
Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.
Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E
2015-01-01
Infants recovering from anesthesia are at risk of life-threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorers' ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.
Advanced Tools and Techniques for Formal Techniques in Aerospace Systems
NASA Technical Reports Server (NTRS)
Knight, John C.
2005-01-01
This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.
Luna, Augustin; Karac, Evrim I; Sunshine, Margot; Chang, Lucas; Nussinov, Ruth; Aladjem, Mirit I; Kohn, Kurt W
2011-05-17
The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim.
Distributed learning or medical tourism? A Canadian residency program's experience in global health.
Kelly, Kate; McCarthy, Anne; McLean, Laurie
2015-01-01
Global health experiences (GHEs) are becoming increasingly prevalent in surgical residency education. Although it may seem intuitive that participation in GHEs develops CanMEDS competencies, this has not been studied in depth in surgery. The purpose of this study is (1) to explore if and how otolaryngology-head and neck surgery (OHNS) resident participation in GHEs facilitates the development of CanMEDS competencies and (2) to develop an OHNS GHE tool to facilitate the integration of CanMEDS into GHE participation and evaluation. An online survey explored the GHEs of current and past OHNS residents in Canada. Based on the data collected and a literature review, a foundational tool was then created to (1) enable OHNS residents to structure their GHEs into CanMEDS-related learning objectives and (2) enable OHNS program directors to more effectively evaluate residents' GHEs with respect to CanMEDS competencies. Participants' GHEs varied widely. These experiences often contributed informally to the development of several CanMEDS competencies. However, few residents had concrete objectives, rarely were CanMEDS roles clearly incorporated, and most residents were not formally evaluated during their experience. Residents felt they achieved greater learning when predeparture objectives and postexperience reflections were integrated into their GHEs. Although GHEs vary widely, they can serve as valuable forums for developing CanMEDS competencies among participating residents. Without clear objectives that adhere to the CanMEDS framework or formal assessment methods however, residents in GHEs risk becoming medical tourists. The use of an objective and evaluation tool may facilitate the creation of predeparture learning objectives, encourage self-reflection on their GHE, and better enable program directors to evaluate residents participating in GHEs. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Genetic Design Automation: engineering fantasy or scientific renewal?
Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean
2013-01-01
Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068
Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.
Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie
2016-06-01
Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload the processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
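As a rough sketch of the first method's core idea, the hypothetical snippet below constrains a statistical phase classifier with the transitions a formal model permits; the phase labels, transition table, and data shapes are invented, and scikit-learn's single random forest stands in for the authors' composition of random forests.

```python
# Hypothetical sketch: restrict per-frame random-forest phase predictions to
# the transitions an ontology allows (labels and transition table invented).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

ALLOWED = {0: {0, 1}, 1: {1, 2}, 2: {2}}       # phase -> admissible next phases

def recognize(clf, X):
    # assumes clf was fitted on integer labels 0..2, so predict_proba columns
    # line up with the phase indices used in ALLOWED
    probs = clf.predict_proba(X)
    phases = [int(np.argmax(probs[0]))]
    for p in probs[1:]:
        admissible = ALLOWED[phases[-1]]
        phases.append(max(admissible, key=lambda k: p[k]))  # best allowed phase
    return phases

# Usage: clf = RandomForestClassifier().fit(X_train, y_train)
#        phases = recognize(clf, X_test)
```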
A Natural Language Interface Concordant with a Knowledge Base.
Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young
2016-01-01
The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time, and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high, the proposed method rejects the question and does not answer it. Multi-predicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
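The matching-with-rejection step can be pictured in a few lines; the expression-to-query table below and the off-the-shelf string similarity are invented stand-ins for the paper's collected expressions and matching model.

```python
# Illustrative sketch: translate a question by matching it against collected
# natural language expressions; reject if no match is confident enough.
from difflib import SequenceMatcher

EXPRESSIONS = {  # hypothetical expression -> formal (subgraph) query pairs
    "who directed film": "SELECT ?d WHERE { :Film :directedBy ?d }",
    "when was film released": "SELECT ?y WHERE { :Film :releaseYear ?y }",
}

def translate(question, threshold=0.75):
    best_query, best_score = None, 0.0
    for expr, query in EXPRESSIONS.items():
        score = SequenceMatcher(None, question.lower(), expr).ratio()
        if score > best_score:
            best_query, best_score = query, score
    return best_query if best_score >= threshold else None  # None = reject

print(translate("Who directed Film?"))    # confidently matched -> a query
print(translate("What is the capital?"))  # rejected rather than misanswered
```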
A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes
NASA Astrophysics Data System (ADS)
Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria
In this paper we discuss the importance of ensuring that business processes are robust and agile at the same time. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through some self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in these processes. To this end we propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post condition-post Event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
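To make the formalism concrete, here is a minimal, hypothetical rendering of an ECAPE rule as an executable record; the field names follow the abstract's expansion of the acronym, while the example rule and state layout are invented.

```python
# Hypothetical ECAPE rule record (Event-Condition-Action-Post condition-post Event).
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class EcapeRule:
    event: str                               # triggering event
    condition: Callable[[Dict], bool]        # guard over the process state
    action: Callable[[Dict], None]           # effect on the process state
    post_condition: Callable[[Dict], bool]   # expected outcome (self-check)
    post_event: str                          # event emitted on success

def fire(rule: EcapeRule, state: Dict) -> str:
    if not rule.condition(state):
        return "skipped"
    rule.action(state)
    if not rule.post_condition(state):       # a violated post-condition is the
        return "compensation_required"       # hook for self-healing behavior
    return rule.post_event                   # chains to the next rule's event

approve = EcapeRule("order_received",
                    lambda s: s["amount"] <= s["credit"],
                    lambda s: s.update(status="approved"),
                    lambda s: s["status"] == "approved",
                    "order_approved")
print(fire(approve, {"amount": 50, "credit": 100, "status": "new"}))
```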
Li, Xing-Ming; Rasooly, Alon; Peng, Bo; Wang, Jian; Xiong, Shu-Yu
2017-11-10
Our study aimed to design a tool for evaluating intersectoral collaboration on Non-communicable Chronic Disease (NCD) prevention and control, and further to understand the current status of intersectoral collaboration in community health service institutions of China. We surveyed 444 main officials of community health service institutions in the Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 using a questionnaire. A model of collaboration measurement, including the four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of the evaluation scale in NCD management procedures across community healthcare institutions and other ones. The reliability and validity of the evaluation tool for inter-organizational collaboration on NCD prevention and control were verified. The tool showed good reliability and validity (Cronbach's alpha = 0.89, split-half reliability = 0.84, variance contribution rate of the extracted principal component = 49.70%). The results for inter-organizational collaboration across different departments and management segments showed statistically significant differences in the formalization dimension for physical examination (p = 0.01). There were statistically significant differences in the governance dimension, the formalization dimension and the total score of the collaboration scale for the health record sector (p = 0.01, 0.00, 0.00). Statistical differences were found in the formalization dimension for the exercise and nutrition health education segment (p = 0.01). There were no statistically significant differences in the formalization dimension for medication guidance, psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). A multi-department collaboration mechanism for NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals are more active in participating in community NCD management, with better collaboration scores, whereas the CDC shows relatively poor collaboration in China. Xing-Ming Li and Alon Rasooly contributed equally and are listed as joint first authors.
Verification of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tool infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
Planform: an application and database of graph-encoded planarian regenerative experiments.
Lobo, Daniel; Malone, Taylor J; Levin, Michael
2013-04-15
Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic models exist yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there is no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.
NASA Astrophysics Data System (ADS)
Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin
2018-01-01
We present a numerical method to solve for phase dispersion curves in general anisotropic plates. This approach involves an exact solution to the problem in the form of Legendre polynomial expansions of multiple integrals, which we substitute into the state-vector formalism. To improve the efficiency of the proposed method, we take particular care to demonstrate the analytical methodology. Furthermore, we analyze the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method is the expansion of field quantities in Legendre polynomials. The Legendre polynomial method avoids solving the transcendental dispersion equation, which can otherwise only be solved numerically. The state-vector formalism combined with Legendre polynomial expansion distinguishes adjacent dispersion modes clearly, even when the modes are very close. We then illustrate the theoretical solutions of the dispersion curves by this method for isotropic and anisotropic plates. Finally, we compare the proposed method with the global matrix method (GMM), which shows excellent agreement.
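In generic form, the key step reads as follows; the notation is illustrative, and the paper's actual matrices differ.

```latex
% Hedged sketch of the expansion step: each field component is expanded in
% Legendre polynomials over the plate thickness h,
u_i(x_3) = \sum_{m=0}^{M} p_i^{(m)} \, P_m\!\left(\frac{2 x_3}{h}\right),
\qquad i = 1, 2, 3 .
```

Projecting the state-vector equations onto each P_n and truncating at order M turns the boundary-value problem into an algebraic eigenvalue problem of the form A(k) q = omega^2 B q, whose eigenpairs trace out the dispersion curves directly, with no transcendental root search.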
Dose rate calculations around 192Ir brachytherapy sources using a Sievert integration model
NASA Astrophysics Data System (ADS)
Karaiskos, P.; Angelopoulos, A.; Baras, P.; Rozaki-Mavrouli, H.; Sandilos, P.; Vlachos, L.; Sakelliou, L.
2000-02-01
The classical Sievert integral method is a valuable tool for dose rate calculations around brachytherapy sources, combining simplicity with reasonable computational times. However, its accuracy in predicting dose rate anisotropy around 192Ir brachytherapy sources has been repeatedly put into question. In this work, we used a primary and scatter separation technique to improve an existing modification of the Sievert integral (Williamson's isotropic scatter model) that determines dose rate anisotropy around commercially available 192Ir brachytherapy sources. The proposed Sievert formalism provides increased accuracy while maintaining the simplicity and computational time efficiency of the Sievert integral method. To describe transmission within the materials encountered, the formalism makes use of narrow beam attenuation coefficients which can be directly and easily calculated from the initially emitted 192Ir spectrum. The other numerical parameters required for its implementation, once calculated with the aid of our home-made Monte Carlo simulation code, can be used for any 192Ir source design. Calculations of dose rate and anisotropy functions with the proposed Sievert expression, around commonly used 192Ir high dose rate sources and other 192Ir elongated source designs, are in good agreement with corresponding accurate Monte Carlo results which have been reported by our group and other authors.
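For orientation, the classical integral at the core of such formalisms reduces to a one-dimensional quadrature; the sketch below evaluates it numerically with illustrative geometry and attenuation values (the paper's modification adds primary/scatter separation on top of this).

```python
# Classical Sievert line-source integral, evaluated by a simple trapezoid rule
# (parameters are illustrative, not the paper's modified formalism).
import numpy as np

def sievert_integral(theta1, theta2, mu, t, n=2001):
    # integrate exp(-mu * t / cos(theta)) over the angle subtended by the
    # active source length, for filtration thickness t and attenuation mu
    theta = np.linspace(theta1, theta2, n)
    vals = np.exp(-mu * t / np.cos(theta))
    dtheta = (theta2 - theta1) / (n - 1)
    return dtheta * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# e.g. a calculation point on the transverse axis, 2 cm from a 0.35 cm active
# length behind 0.05 cm of steel-like filtration (illustrative numbers):
half_angle = np.arctan(0.175 / 2.0)
print(sievert_integral(-half_angle, half_angle, mu=0.55, t=0.05))
```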
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("PowerPoint engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "PowerPoint engineering" on model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and the production of paper-based documents or their "office-productivity" file equivalents.
The uses of isospin in early nuclear and particle physics
NASA Astrophysics Data System (ADS)
Borrelli, Arianna
2017-11-01
This paper reconstructs the early history of isospin up to and including its employment in 1951-52 to conceptualize high-energy pion-proton scattering. Studying the history of isospin serves as an entry point for investigating the interplay of theoretical and experimental practices in early nuclear and particle physics, showing the complexity of processes of knowledge construction which have often been presented as straightforward both in physicists' recollections and in the historiography of science. The story of isospin has often been told in terms of the discovery of the first "intrinsic property" of elementary particles, but I will argue that the isospin formalism emerged and was further developed because it proved to be a useful tool to match theory and experiment within the steadily broadening field of high-energy (nuclear) physics. Isospin was variously appropriated and adapted in the course of two decades, before eventually the physical-mathematical implications of its uses started being spelled out. The case study also highlights some interesting features of high-energy physics around 1950: the contribution to post-war research of theoretical methods developed before and during the war, the role of young theoretical post-docs in mediating between theorists and experimenters, and the importance of traditional formalisms such as those of spin and angular momentum as a template both for formalizing and conceptualizing experimental results.
Systems, methods and apparatus for verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Rash, James L. (Inventor); Gracanin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.
ERIC Educational Resources Information Center
Jacob, Bridgette L.
2013-01-01
The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…
NASA Astrophysics Data System (ADS)
Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil
Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.
Integrity and security in an Ada runtime environment
NASA Technical Reports Server (NTRS)
Bown, Rodney L.
1991-01-01
A review is provided of the Formal Methods group discussions. It was stated that integrity is not a pure mathematical dual of security. The input data is part of the integrity domain. The group provided a roadmap for research. One item of the roadmap and the final position statement are closely related to the space shuttle and space station. The group's position is to use a safe subset of Ada. Examples of safe subsets include the Army Secure Operating System and the Penelope Ada verification tool. A conservative attitude is recommended when writing Ada code for life- and property-critical systems.
Facilitating critical thinking.
Hansten, R I; Washburn, M J
2000-01-01
Supporting staff to think effectively is essential to improve clinical systems, decrease errors and sentinel events, and engage staff involvement to refine patient care systems in readiness for new care-delivery models that truly reflect the valued role of the RN. The authors explore practical methods, based on current research and national consulting experience, to facilitate the development of mature critical thinking skills. Assessment tools, a sample agenda for formal presentations, and teaching strategies using behavioral examples that make the important and necessary link of theory to reality are discussed in the form of a critical thinking test as well as a conceptual model for application in problem solving.
Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.
Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor
2011-09-01
Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology to integrate available technical information and stakeholder values to support decisions in many fields, and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there has been significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of a few papers in which several methods were applied in parallel to the same problem indicates that the recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.
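As a toy illustration of the simplest MCDA aggregation, a weighted sum over normalized criterion scores (all numbers invented):

```python
# Toy weighted-sum MCDA aggregation (invented scores and weights).
import numpy as np

scores = np.array([[0.8, 0.3, 0.5],     # alternative A: cost, ecology, social
                   [0.6, 0.8, 0.4],     # alternative B
                   [0.4, 0.9, 0.7]])    # alternative C
weights = np.array([0.5, 0.3, 0.2])     # stakeholder-elicited, summing to 1

utilities = scores @ weights            # aggregate utility per alternative
print(utilities, np.argsort(-utilities))  # ranking, best alternative first
```

Full MCDA methods (AHP, MAUT, outranking) elicit and combine preferences in more sophisticated ways; the sketch shows only the simplest additive aggregation.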
Field Theoretical Methods in Cosmology
NASA Astrophysics Data System (ADS)
Singh, Anupam
1995-01-01
To make optimal use of the exciting cosmological data now coming in, we must also sharpen the theoretical tools available to cosmologists. One such indispensable tool for understanding hot big bang cosmology is finite temperature field theory. We review and summarize the efforts made by us to use finite temperature field theory to address issues of current interest to cosmologists. An introduction to both the real time and the imaginary time formalisms is provided. The imaginary time formalism is illustrated by applying it to understand the interesting possibility of Late Time Phase Transitions. Recent observations of the space distribution of quasars indicate a very notable peak in space density at a redshift of 2 to 3. It is pointed out that this may be the result of a phase transition which has a critical temperature of roughly a few meV (in cosmological units, h = c = k = 1), which is natural in the context of massive neutrinos. In fact, the neutrino masses required for quasar production and those required to solve the solar neutrino problem by the MSW mechanism are consistent with each other. As a bonus, the cosmological constant implied by this model may also help resolve the discrepancy between the recently measured value of the Hubble Constant and the age of the universe. We illustrate the real time formalism by studying one of the most important time-dependent and non-equilibrium phenomena associated with phase transitions. The non-equilibrium dynamics of the first stage of the reheating process, that is, dissipation via particle production, is studied in scalar field theories. We show that a complete understanding of the mechanism of dissipation via particle production requires a non-perturbative resummation. We then study a Hartree approximation and clearly exhibit dissipative effects related to particle production. The effect of dissipation by Goldstone bosons is studied non-perturbatively in the large N limit in an O(N) theory. We also place our work in perspective and point out some of the related issues which clearly need further exploration.
Contact Tools in Japanese Acupuncture: An Ethnography of Acupuncture Practitioners in Japan.
Chant, Benjamin Cw; Madison, Jeanne; Coop, Paul; Dieberg, Gudrun
2017-10-01
This study aimed to identify procedural elements of Japanese acupuncture, describe these elements in detail, and explain them in terms of the key thematic category of treatment principles. Between August 2012 and December 2016, ethnographic fieldwork was conducted in Japan. In total, 38 participants were recruited by chain referral and emergent sampling. Data was collected through participant observation, interviews, and by analyzing documents. A total of 22 participants agreed to clinical observation; 221 treatments were observed with 172 patients. Seventeen consented to formal interviews and 28 to informal interviews. Thematic analysis was used to critically evaluate data. One especially interesting theme was interpreted from the data: a variety of contact tools were applied in treatment and these were manipulated by adjusting elements of form, speed, repetition, and pressure. Tapping, holding, pressing/pushing, and stroking were the most important ways contact tools were used on patients. Contact tools are noninvasive, painless, can be applied in almost any environment, and may be easily accepted by patients worldwide. Contact tool theory and practice may be successfully integrated into acupuncture curricula outside of Japan, used to inform clinical trials, and contribute to an expanded repertoire of methods for practitioners to benefit individual patients in international contexts. Copyright © 2017. Published by Elsevier B.V.
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich
2003-01-01
We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is, to the best of our knowledge, the first controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, where the emphasis was on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It did confirm our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.
Why Engineers Should Consider Formal Methods
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
1997-01-01
This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.
Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes
NASA Technical Reports Server (NTRS)
Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.
2000-01-01
Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time-consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.
A Formal Basis for Safety Case Patterns
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh
2013-01-01
By capturing common structures of successful arguments, safety case patterns provide an approach for reusing strategies for reasoning about safety. In the current state of the practice, patterns exist as descriptive specifications with informal semantics, which not only offer little opportunity for more sophisticated usage such as automated instantiation, composition and manipulation, but also impede standardization efforts and tool interoperability. To address these concerns, this paper gives (i) a formal definition for safety case patterns, clarifying both restrictions on the usage of multiplicity and well-founded recursion in structural abstraction, (ii) formal semantics to patterns, and (iii) a generic data model and algorithm for pattern instantiation. We illustrate our contributions by application to a new pattern, the requirements breakdown pattern, which builds upon our previous work.
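A drastically simplified sketch of the instantiation idea: placeholders in a pattern are filled from tabular data, one instance per row. The structure below is hypothetical and ignores the paper's treatment of multiplicity and well-founded recursion.

```python
# Hypothetical sketch: instantiate a safety-case pattern from tabular data
# (the paper's data model also covers multiplicity and recursion).
def instantiate(node, row):
    if isinstance(node, str):
        return node.format(**row)                 # fill {placeholders}
    return {k: instantiate(v, row) for k, v in node.items()}

pattern = {"goal": "Requirement {req} is satisfied",
           "strategy": "Argument over verification evidence",
           "solution": "Evidence item {evidence}"}
rows = [{"req": "R1", "evidence": "test T1"},
        {"req": "R2", "evidence": "proof P2"}]
print([instantiate(pattern, r) for r in rows])    # one argument leg per row
```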
The interventional radiology business plan.
Beheshti, Michael V; Meek, Mary E; Kaufman, John A
2012-09-01
Strategic planning and business planning are processes commonly employed by organizations that exist in competitive environments. Although it is difficult to prove a causal relationship between formal strategic/business planning and positive organizational performance, there is broad agreement that formal strategic and business plans are components of successful organizations. The various elements of strategic plans and business plans are not common in the vernacular of practicing physicians. As health care becomes more competitive, familiarity with these tools may grow in importance. Herein we provide an overview of formal strategic and business planning, and offer a roadmap for an interventional radiology-specific plan that may be useful for organizations confronting competitive and financial threats. Copyright © 2012 SIR. Published by Elsevier Inc. All rights reserved.
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that were proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
Mahar, Alyson L.; Compton, Carolyn; McShane, Lisa M.; Halabi, Susan; Asamura, Hisao; Rami-Porta, Ramon; Groome, Patti A.
2015-01-01
Introduction Accurate, individualized prognostication for lung cancer patients requires the integration of standard patient and pathologic factors, biologic, genetic, and other molecular characteristics of the tumor. Clinical prognostic tools aim to aggregate information on an individual patient to predict disease outcomes such as overall survival, but little is known about their clinical utility and accuracy in lung cancer. Methods A systematic search of the scientific literature for clinical prognostic tools in lung cancer published Jan 1, 1996-Jan 27, 2015 was performed. In addition, web-based resources were searched. A priori criteria determined by the Molecular Modellers Working Group of the American Joint Committee on Cancer were used to investigate the quality and usefulness of tools. Criteria included clinical presentation, model development approaches, validation strategies, and performance metrics. Results Thirty-two prognostic tools were identified. Patients with metastases were the most frequently considered population in non-small cell lung cancer. All tools for small cell lung cancer covered that entire patient population. Included prognostic factors varied considerably across tools. Internal validity was not formally evaluated for most tools and only eleven were evaluated for external validity. Two key considerations were highlighted for tool development: identification of an explicit purpose related to a relevant clinical population and clear decision-points, and prioritized inclusion of established prognostic factors over emerging factors. Conclusions Prognostic tools will contribute more meaningfully to the practice of personalized medicine if better study design and analysis approaches are used in their development and validation. PMID:26313682
Guideline validation in multiple trauma care through business process modeling.
Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen
2003-07-01
Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools, which check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.
Development and validation of a new assessment tool for suturing skills in medical students.
Sundhagen, Henriette Pisani; Almeland, Stian Kreken; Hansson, Emma
2018-01-01
In recent years, emphasis has been placed on medical students demonstrating pre-practice/pre-registration core procedural skills to ensure patient safety. Nonetheless, the formal teaching and training of basic suturing skills to medical students has received relatively little attention, and there is no standard for what should be tested and how. The aim of this study was to develop and validate, using scientific methods, a tool for the assessment of medical students' suturing skills, measuring both micro- and macrosurgical qualities. A tool was constructed, and content, construct and concurrent validity, as well as inter-rater, inter-item and inter-test reliability, were tested. Three groups were included: students with no training in suturing skills, students who have had training, and plastic surgeons. The results show promising reliability and validity when assessing novice medical students' suturing skills. Further studies are needed on implementation of the instrument. Moreover, how the instrument can be used to give formative feedback, evaluate whether a required standard is met, and support curriculum development needs further investigation. Level of Evidence: Not ratable.
Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R.; Ehlers, Jan P.
2013-01-01
Introduction: Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether there is a so-called net generation amongst people under 30. Aims: To test the hypothesis that a net generation among students and young veterinarians exists. Methods: An online survey of students and veterinarians was conducted in the German-speaking countries which was advertised via online media and traditional print media. Results: 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and IM (55.9% vs. 24.5%). All tools were predominantly used passively and in private, to a lesser extent also professionally and for studying. Outlook: The use of Web 2.0 tools is useful, however, teaching information and media skills, preparing codes of conduct for the internet and verification of user generated content is essential. PMID:23467682
Promoting Adoption of the 3Rs through Regulatory Qualification.
Walker, Elizabeth Gribble; Baker, Amanda F; Sauer, John-Michael
2016-12-01
One mechanism to advance the application of novel safety assessment methodologies in drug development, including in silico or in vitro approaches that reduce the use of animals in toxicology studies, is regulatory qualification. Regulatory qualification, a formal process defined at the U.S. Food and Drug Administration and the European Medicines Agency, hinges on a central concept of stating an appropriate "context of use" for a novel drug development tool (DDT) that precisely defines how that DDT can be used to support decision making in a regulated drug development setting. When accumulating the data to support a particular "context of use," the concept of "fit for purpose" often guides assay validation, as well as the type and amount of data or evidence required to evaluate the tool. This paper will review pathways for regulatory acceptance of novel DDTs and discuss examples of safety projects considered for regulatory qualification. Key concepts to be considered when defining the evidence required to formally adopt, and potentially replace, animal-intensive traditional safety assessment methods using qualified DDTs are proposed. Presently, the use of qualified translational kidney safety biomarkers can refine and reduce the total numbers of animals used in drug development. We propose that the same conceptual regulatory framework will be appropriate for assessing the readiness of new technologies that may eventually replace whole animal models. © The Author 2016. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Initial implementation of a comparative data analysis ontology.
Prosdocimi, Francisco; Chisham, Brandon; Pontelli, Enrico; Thompson, Julie D; Stoltzfus, Arlin
2009-07-03
Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: "Operational Taxonomic Units" (OTUs), representing the entities to be compared; "character-state data" representing the observations compared among OTUs; "phylogenetic tree", representing the historical path of evolution among the entities; and "transitions", the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research.
Ontology and medical diagnosis.
Bertaud-Gounot, Valérie; Duvauferrier, Régis; Burgun, Anita
2012-03-01
Ontology and associated generic tools are appropriate for knowledge modeling and reasoning, but most of the time, disease definitions in existing description logic (DL) ontologies are not sufficient to classify a patient's characteristics under a particular disease because they do not formalize operational definitions of diseases (association of signs and symptoms = diagnostic criteria). The main objective of this study is to propose an ontological representation which takes into account the diagnostic criteria by which specific patient conditions may be classified under a specific disease. This method needs as a prerequisite a clear list of necessary and sufficient diagnostic criteria, as defined for many diseases by learned societies. It does not include probability/uncertainty, which Web Ontology Language (OWL 2.0) cannot handle. We illustrate it with spondyloarthritis (SpA). The ontology has been designed in Protégé 4.1 OWL-DL 2.0. Several kinds of criteria were formalized: (1) mandatory criteria, (2) picking two criteria among several diagnostic criteria, (3) numeric criteria. Thirty real patient cases were successfully classified with the reasoner. This study shows that it is possible to represent operational definitions of diseases with OWL and successfully classify real patient cases. Representing diagnostic criteria as descriptive knowledge (instead of rules in Semantic Web Rule Language or Prolog) allows us to take advantage of tools already available for OWL. While we focused on the Assessment of SpondyloArthritis international Society SpA criteria, we believe that many of the representation issues addressed here are relevant to using OWL-DL for the operational definition of other diseases in ontologies.
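Outside OWL, the flavor of such operational definitions ("mandatory criteria plus at least two of n") is easy to state; the criteria names below are invented placeholders, and the paper itself does this declaratively with a DL reasoner rather than in code.

```python
# Hypothetical sketch of an "n-of-m criteria" operational disease definition
# (criterion names invented; the paper encodes this in OWL, not Python).
OPTIONAL_CRITERIA = {"criterion_A", "criterion_B", "criterion_C", "criterion_D"}
MANDATORY_CRITERIA = {"criterion_M"}

def meets_definition(patient_findings, min_optional=2):
    if not MANDATORY_CRITERIA <= patient_findings:
        return False                      # a mandatory criterion is missing
    return len(OPTIONAL_CRITERIA & patient_findings) >= min_optional

print(meets_definition({"criterion_M", "criterion_A", "criterion_C"}))  # True
```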
Stautberg, Eugene F., III; Romero, Jose; Bender, Sean; DeHart, Marc
2018-04-11
Introduction Practice management and health policy have generally not been considered integral to orthopaedic resident education. Our objective was to evaluate residents' current experience and knowledge, formal training, and desire for further education in practice management and health policy. Methods We developed a 29-question survey that was divided into three sections: practice management, initial employment opportunity, and health policy. Within each section, questions were directed at a resident's current experience and knowledge, formal training, and interest in further education. The survey was distributed at the end of the academic year through an Internet-based survey tool (www.surveymonkey.com) to orthopaedic residents representing multiple programs and all postgraduate years. Results The survey was distributed to 121 residents representing eight residency programs. Of those, 87 residents responded, resulting in a 72% response rate. All postgraduate years were represented. Regarding practice management, 66% had "no confidence" or "some confidence" in coding clinical encounters. When asked if practice models, finance management, and coding should be taught in residency, 95%, 93%, and 97% responded "yes," respectively. When evaluating first employment opportunities, the three most important factors were location, operating room block time, and call. Regarding health policy, 28% were "moderately familiar" or "very familiar" with the Physician Payments Sunshine Act, and 72% were "not familiar" or "somewhat familiar" with bundled payments for arthroplasty. Finally, when asked if yearly lectures in political activities would enhance resident education, 90% responded "yes." Discussion and conclusion Regarding practice management, the survey suggests that current orthopaedic residents are not familiar with basic topics, do not receive formal training, and want further education. The survey suggests that residents also receive minimal training in health policy. Residents feel that health policy will be important in their careers, and they would benefit from formal training in residency.
Fischer, Heidi J; Vergara, Ximena P; Yost, Michael; Silva, Michael; Lombardi, David A; Kheifets, Leeka
2017-01-01
Job exposure matrices (JEMs) are tools used to classify exposures for job titles based on general job tasks in the absence of individual level data. However, exposure uncertainty due to variations in worker practices, job conditions, and the quality of data has never been quantified systematically in a JEM. We describe a methodology for creating a JEM which defines occupational exposures on a continuous scale and utilizes elicitation methods to quantify exposure uncertainty by assigning exposures probability distributions with parameters determined through expert involvement. Experts use their knowledge to develop mathematical models using related exposure surrogate data in the absence of available occupational level data and to adjust model output against other similar occupations. Formal expert elicitation methods provided a consistent, efficient process to incorporate expert judgment into a large, consensus-based JEM. A population-based electric shock JEM was created using these methods, allowing for transparent estimates of exposure.
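As a toy illustration of the core idea, the sketch below (Python, with entirely invented job titles and parameters) represents one JEM cell as an expert-elicited lognormal distribution rather than a point estimate, so exposure uncertainty can be propagated transparently.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical expert-elicited parameters per job title: a median annual
    # exposure and a geometric standard deviation capturing uncertainty
    # about worker practices, job conditions, and data quality.
    jem = {
        "electrician":  {"median": 2.0,  "gsd": 2.5},
        "office clerk": {"median": 0.01, "gsd": 1.5},
    }

    def sample_exposure(job, n=10000):
        """Draw plausible annual exposures for a job title from its
        elicited lognormal distribution."""
        p = jem[job]
        mu, sigma = np.log(p["median"]), np.log(p["gsd"])
        return rng.lognormal(mu, sigma, size=n)

    draws = sample_exposure("electrician")
    print(np.percentile(draws, [2.5, 50, 97.5]))  # transparent uncertainty bounds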
An Exact Formula for Calculating Inverse Radial Lens Distortions
Drap, Pierre; Lefèvre, Julien
2016-01-01
This article presents a new approach to calculating the inverse of radial distortions. Whereas reverse radial distortion is currently modeled by a polynomial expression, the method presented here proposes another polynomial expression whose new coefficients are a function of the original ones. After describing the state of the art, the proposed method is developed. It is based on a formal calculus involving a power series used to deduce a recursive formula for the new coefficients. We present several implementations of this method and describe the experiments conducted to assess the validity of the new approach. Such a non-iterative approach, using a second polynomial expression deducible from the first, is attractive in terms of performance, reuse of existing software, and bridging between existing software tools that do not consider distortion from the same point of view. PMID:27258288
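The flavor of the result can be sketched by reverting the distortion power series term by term. The two-coefficient formulas below follow from standard series reversion and are an illustration consistent with this approach, not the paper's published formula; the lens coefficients are made up.

    import numpy as np

    def inverse_coeffs(k1, k2):
        """First two coefficients of the inverse model
        r_u = r_d * (1 + b1*r_d**2 + b2*r_d**4), obtained by reverting
        r_d = r_u * (1 + k1*r_u**2 + k2*r_u**4) term by term:
        b1 = -k1 and b2 = 3*k1**2 - k2 (standard series reversion)."""
        return -k1, 3.0 * k1**2 - k2

    k1, k2 = 1e-1, 1e-2          # made-up lens coefficients
    b1, b2 = inverse_coeffs(k1, k2)
    r_u = 0.3
    r_d = r_u * (1 + k1 * r_u**2 + k2 * r_u**4)      # forward distortion
    r_back = r_d * (1 + b1 * r_d**2 + b2 * r_d**4)   # polynomial inverse
    print(abs(r_back - r_u))     # ~1e-6; shrinks as more terms are kept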
Genetic design automation: engineering fantasy or scientific renewal?
Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean
2012-02-01
The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.
Small scale sequence automation pays big dividends
NASA Technical Reports Server (NTRS)
Nelson, Bill
1994-01-01
Galileo sequence design and integration are supported by a suite of formal software tools. Sequence review, however, is largely a manual process with reviewers scanning hundreds of pages of cryptic computer printouts to verify sequence correctness. Beginning in 1990, a series of small, PC based sequence review tools evolved. Each tool performs a specific task but all have a common 'look and feel'. The narrow focus of each tool means simpler operation, and easier creation, testing, and maintenance. Benefits from these tools are (1) decreased review time by factors of 5 to 20 or more with a concomitant reduction in staffing, (2) increased review accuracy, and (3) excellent returns on time invested.
From Classical to Quantum: New Canonical Tools for the Dynamics of Gravity
NASA Astrophysics Data System (ADS)
Höhn, P. A.
2012-05-01
In a gravitational context, canonical methods offer an intuitive picture of the dynamics and simplify an identification of the degrees of freedom. Nevertheless, extracting dynamical information from background independent approaches to quantum gravity is a highly non-trivial challenge. In this thesis, the conundrum of (quantum) gravitational dynamics is approached from two different directions by means of new canonical tools. This thesis is accordingly divided into two parts: In the first part, a general canonical formalism for discrete systems featuring a variational action principle is developed which is equivalent to the covariant formulation following directly from the action. This formalism can handle evolving phase spaces and is thus appropriate for describing evolving lattices. Attention will be devoted to a characterization of the constraints, symmetries and degrees of freedom appearing in such discrete systems which, in the case of evolving phase spaces, is time-step dependent. The advantage of this formalism is that it does not depend on the particular discretization and, hence, is suitable for coarse graining procedures. This formalism is applicable to discrete mechanics, lattice field theories and discrete gravity models---underlying some approaches to quantum gravity---and, furthermore, may prove useful for numerical implementations. For concreteness, these new tools are employed to formulate Regge Calculus canonically as a theory of the dynamics of discrete hypersurfaces in discrete spacetimes, thereby removing a longstanding obstacle to connecting covariant simplicial gravity models with canonical frameworks. This result is interesting in view of several background independent approaches to quantum gravity. In addition, perturbative expansions around symmetric background solutions of Regge Calculus are studied up to second order. Background gauge modes generically become propagating at second order as a consequence of a symmetry breaking. In the second part of this thesis, the paradigm of relational dynamics is considered. Dynamical observables in gravity are relational. Unfortunately, their construction and evaluation is notoriously difficult, especially in the quantum theory. An effective canonical framework is devised which makes it possible to evaluate the semiclassical relational dynamics of constrained quantum systems by sidestepping technical problems associated with explicit constructions of physical Hilbert spaces. This effective approach is well-geared for addressing the concept of relational evolution in general quantum cosmological models since it (i) allows one to depart from idealized relational `clock references’ and, instead, to employ generic degrees of freedom as imperfect relational `clocks’, (ii) enables one to systematically switch between different such `clocks’ and (iii) yields a consistent (temporally) local time evolution with transient observables so long as semiclassicality holds. These techniques are illustrated by toy models and, finally, are applied to a non-integrable cosmological model. It is argued that relational evolution is generically only a transient and semiclassical phenomenon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messud, J.; Dinh, P. M.; Suraud, Eric
2009-10-15
We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent 'generalized SIC-OEP'. A straightforward approximation, using the spatial localization of one set of orbitals, leads to the 'generalized SIC-Slater' formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
NASA Astrophysics Data System (ADS)
Messud, J.; Dinh, P. M.; Reinhard, P.-G.; Suraud, Eric
2009-10-01
We propose a simplification of the time-dependent self-interaction correction (TD-SIC) method using two sets of orbitals, applying the optimized effective potential (OEP) method. The resulting scheme is called time-dependent “generalized SIC-OEP.” A straightforward approximation, using the spatial localization of one set of orbitals, leads to the “generalized SIC-Slater” formalism. We show that it represents a great improvement compared to the traditional SIC-Slater and Krieger-Li-Iafrate formalisms.
Continuous spectra of atomic hydrogen in a strong magnetic field
NASA Astrophysics Data System (ADS)
Zhao, L. B.; Zatsarinny, O.; Bartschat, K.
2016-09-01
We describe a theoretical method, developed in the coupled-channel formalism, to study photoionization of H atoms in a strong magnetic field of a size that is typical for magnetic white dwarfs. The coupled Schrödinger equations are solved numerically using the renormalized Numerov method proposed by Johnson [B. R. Johnson, J. Chem. Phys. 67, 4086 (1977), 10.1063/1.435384; B. R. Johnson, J. Chem. Phys. 69, 4678 (1978), 10.1063/1.436421]. The distinct advantage of this method is the fact that no overflow problems are encountered in the classically forbidden region, and hence the method exhibits excellent numerical stability. Photoionization cross sections are presented for magnetized H atoms in the ground and 2p excited states. The calculated results are compared with those obtained by other theories. The present method is particularly useful for explaining the complex features of continuous spectra in a strong magnetic field and hence provides an efficient tool for modeling photoionization spectra observed in the atmosphere of magnetic white dwarfs.
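A minimal single-channel sketch of the ratio-propagation idea behind the renormalized Numerov method follows; the paper's coupled-channel version propagates matrix ratios, so the scalar recursion below is only illustrative, and the sample potential is invented.

    import numpy as np

    def renormalized_numerov(f, x):
        """For u''(x) + f(x) u(x) = 0 on grid x, propagate the ratios
        R_n = F_{n+1}/F_n with F_n = (1 - T_n) u_n and T_n = -h**2 f(x_n)/12,
        so the wave-function amplitude itself never overflows in
        classically forbidden regions."""
        h = x[1] - x[0]
        T = -(h**2 / 12.0) * f(x)
        U = (2.0 + 10.0 * T) / (1.0 - T)      # local three-term coefficient
        R = np.empty(len(x) - 1)
        R[0] = U[0]                           # boundary u_{-1} = 0
        for n in range(1, len(x) - 1):
            R[n] = U[n] - 1.0 / R[n - 1]      # recursion on ratios only
        return R

    # Example with f(x) = E - V(x) for a toy potential V(x) = x**2, E = 1:
    # the ratios stay bounded even deep in the forbidden region x >> 1.
    x = np.linspace(1e-6, 10.0, 2001)
    R = renormalized_numerov(lambda t: 1.0 - t**2, x)
    print(R[-1])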
Visualizing, Approximating, and Understanding Black-Hole Binaries
NASA Astrophysics Data System (ADS)
Nichols, David A.
Numerical-relativity simulations of black-hole binaries and advancements in gravitational-wave detectors now make it possible to learn more about the collisions of compact astrophysical bodies. To be able to infer more about the dynamical behavior of these objects requires a fuller analysis of the connection between the dynamics of pairs of black holes and their emitted gravitational waves. The chapters of this thesis describe three approaches to learn more about the relationship between the dynamics of black-hole binaries and their gravitational waves: modeling momentum flow in binaries with the Landau-Lifshitz formalism, approximating binary dynamics near the time of merger with post-Newtonian and black-hole-perturbation theories, and visualizing spacetime curvature with tidal tendexes and frame-drag vortexes. In Chapters 2--4, my collaborators and I present a method to quantify the flow of momentum in black-hole binaries using the Landau-Lifshitz formalism. Chapter 2 reviews an intuitive version of the formalism in the first-post-Newtonian approximation that bears a strong resemblance to Maxwell's theory of electromagnetism. Chapter 3 applies this approximation to relate the simultaneous bobbing motion of rotating black holes in the superkick configuration---equal-mass black holes with their spins anti-aligned and in the orbital plane---to the flow of momentum in the spacetime, prior to the black holes' merger. Chapter 4 then uses the Landau-Lifshitz formalism to explain the dynamics of a head-on merger of spinning black holes, whose spins are anti-aligned and transverse to the infalling motion. Before they merge, the black holes move with a large, transverse, velocity, which we can explain using the post-Newtonian approximation; as the holes merge and form a single black hole, we can use the Landau-Lifshitz formalism without any approximations to connect the slowing of the final black hole to its absorbing momentum density during the merger. In Chapters 5--7, we discuss using analytical approximations, such as post-Newtonian and black-hole-perturbation theories, to gain further understanding into how gravitational waves are generated by black-hole binaries. Chapter 5 presents a way of combining post-Newtonian and black-hole-perturbation theories---which we call the hybrid method---for head-on mergers of black holes. It was able to produce gravitational waveforms and gravitational recoils that agreed well with comparable results from numerical-relativity simulations. Chapter 6 discusses a development of the hybrid model to include a radiation-reaction force, which is better suited for studying inspiralling black-hole binaries. The gravitational waveform from the hybrid method for inspiralling mergers agreed qualitatively with that from numerical-relativity simulations; when applied to the superkick configuration, it gave a simplified picture of the formation of the large black-hole kick. Chapter 7 describes an approximate method of calculating the frequencies of the ringdown gravitational waveforms of rotating black holes (quasinormal modes). The method generalizes a geometric interpretation of black-hole quasinormal modes and explains a degeneracy in the spectrum of these modes. In Chapters 8--11, we describe a new way of visualizing spacetime curvature using tools called tidal tendexes and frame-drag vortexes. 
This relies upon a time-space split of spacetime, which allows one to break the vacuum Riemann curvature tensor into electric and magnetic parts (symmetric, trace-free tensors that have simple physical interpretations). The regions where the eigenvalues of these tensors are large form the tendexes and vortexes of a spacetime, and the integral curves of their eigenvectors are its tendex and vortex lines, for the electric and magnetic parts, respectively. Chapter 8 provides an overview of these visualization tools and presents initial results from numerical-relativity simulations. Chapter 9 uses topological properties of vortex and tendex lines to classify properties of gravitational waves far from a source. Chapter 10 describes the formalism in more detail, and discusses the vortexes and tendexes of multipolar spacetimes in linearized gravity about flat space. The chapter helps to explain how near-zone vortexes and tendexes become gravitational waves far from a weakly gravitating, time-varying source. Chapter 11 is a detailed investigation of the vortexes and tendexes of stationary and perturbed black holes. It develops insight into how perturbations of (strongly gravitating) black holes extend from near the horizon to become gravitational waves.
A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies
Tang, Li
2014-01-01
Summary An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson’s as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation. PMID:24033125
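For contrast, here is a minimal sketch of the temporal-concatenation baseline described above: subjects are stacked on the time axis and decomposed with a single ICA, which yields one shared spatial map per component and hence no subject-specific spatial variability. The data are simulated and sklearn's FastICA is an illustrative stand-in, not the paper's hierarchical estimator.

    import numpy as np
    from sklearn.decomposition import FastICA

    n_subjects, n_time, n_voxels, n_components = 5, 120, 500, 10
    rng = np.random.default_rng(0)
    data = [rng.standard_normal((n_time, n_voxels)) for _ in range(n_subjects)]

    stacked = np.vstack(data)                  # (n_subjects * n_time, n_voxels)
    ica = FastICA(n_components=n_components, random_state=0)
    time_courses = ica.fit_transform(stacked)  # concatenated mixing time courses
    spatial_maps = ica.components_             # one shared map per component,
                                               # identical across all subjects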
ERIC Educational Resources Information Center
Freels, Jeffrey W.
2015-01-01
The emergence of social media technologies (SMT) as important features of life in the twenty-first century has aroused the curiosity of teachers and scholars in higher education and given rise to numerous experiments using SMT as tools of instruction in college and university classrooms. A body of research has emerged from those experiments which…
DARPA Initiative in Concurrent Engineering (DICE). Phase 2
1990-07-31
XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); Data... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: A prototype of the wrapper concepts... for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for...
Animals, Emperors, Senses: Exploring a Story-Based Learning Design in a Museum Setting
ERIC Educational Resources Information Center
Murmann, Mai; Avraamidou, Lucy
2014-01-01
The aim of this qualitative case study was to explore the use of stories as tools for learning within formal and informal learning environments. The design was based on three areas of interest: (a) the story as a tool for learning; (b) the student as subjects engaging with the story; and (c) the context in which the story learning activity takes…
When product designers use perceptually based color tools
NASA Astrophysics Data System (ADS)
Bender, Walter R.
1998-07-01
Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes for use in the production of the private-label merchandise of a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.
When product designers use perceptually based color tools
NASA Astrophysics Data System (ADS)
Bender, Walter R.
2001-01-01
Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes in the production of the private-label merchandise in a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.
Verification of NASA Emergent Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike
2004-01-01
NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time harder to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.
NASA Astrophysics Data System (ADS)
Kalthoff, Mona; Keim, Frederik; Krull, Holger; Uhrig, Götz S.
2017-05-01
The density matrix formalism and the equation of motion approach are two semi-analytical methods that can be used to compute the non-equilibrium dynamics of correlated systems. While for a bilinear Hamiltonian both formalisms yield the exact result, for any non-bilinear Hamiltonian a truncation is necessary. Because the commonly used truncation schemes differ for these two methods, the accuracy of the obtained results depends significantly on the chosen approach. In this paper, both formalisms are applied to the quantum Rabi model. This allows us to compare the approximate results with the exact dynamics of the system and enables us to discuss the accuracy of the approximations as well as the advantages and the disadvantages of both methods. It is shown to what extent the results fulfill physical requirements for the observables and which properties of the methods lead to unphysical results.
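Since the quantum Rabi model is numerically tractable, a truncated-Fock-space reference solution is easy to set up. The sketch below uses QuTiP with invented parameter values; it is the kind of exact benchmark against which truncated semi-analytical schemes can be compared, not the paper's own machinery.

    import numpy as np
    from qutip import destroy, qeye, sigmaz, sigmax, tensor, basis, sesolve

    N = 30                                    # Fock-space truncation
    a = tensor(destroy(N), qeye(2))
    sz = tensor(qeye(N), sigmaz())
    sx = tensor(qeye(N), sigmax())

    w, w0, g = 1.0, 1.0, 0.3                  # cavity, qubit, coupling (made up)
    H = w * a.dag() * a + 0.5 * w0 * sz + g * sx * (a + a.dag())

    psi0 = tensor(basis(N, 0), basis(2, 0))   # vacuum, qubit in ground state
    times = np.linspace(0.0, 50.0, 500)
    result = sesolve(H, psi0, times, e_ops=[sz])
    print(result.expect[0][-1])               # <sigma_z> at the final time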
2017-04-17
Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. ... Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed...
ENVIRONMENTAL SYSTEMS MANAGEMENT / POLLUTION PREVENTION RESEARCH
Goal 8.4 Improve Environmental Systems Management (Formerly Pollution Prevention and New Technology) Background The U.S. Environmental Protection Agency (EPA) has developed and evaluated tools and technologies to monitor, prevent, control, and clean up pollution through...
Environmental impact assessment of rail infrastructure.
DOT National Transportation Integrated Search
2016-01-29
This project resulted in three products: a comprehensive "Sustainable Rail Checklist," a rail planning GIS database, and a web GIS tool that integrates sustainability metrics and facilitates a rapid assessment before a formal NEPA process is implemen...
EAP: An Important Supervisory Tool.
ERIC Educational Resources Information Center
Wright, Jim
1984-01-01
Discusses elements of the Employee Assistance Program: why employees need it, their acceptance of the program, when to refer an employee to the program, counseling, formal referral, plan of action, and how the program helps the supervisor. (CT)
Dominant partition method. [based on a wave function formalism
NASA Technical Reports Server (NTRS)
Dixon, R. M.; Redish, E. F.
1979-01-01
By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM), is developed for obtaining few-body reductions of the many-body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many-body problem to a fewer-body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few-body reaction mechanism prevails.
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
Killings, duality and characteristic polynomials
NASA Astrophysics Data System (ADS)
Álvarez, Enrique; Borlaf, Javier; León, José H.
1998-03-01
In this paper the complete geometrical setting of (lowest order) abelian T-duality is explored with the help of some new geometrical tools (the reduced formalism). In particular, all invariant polynomials (the integrands of the characteristic classes) can be explicitly computed for the dual model in terms of quantities pertaining to the original one and with the help of the canonical connection whose intrinsic characterization is given. Using our formalism, the physically relevant, and T-duality invariant, result that top forms are zero when there is an isometry without fixed points is easily proved.
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
Development of a Software Safety Process and a Case Study of Its Use
NASA Technical Reports Server (NTRS)
Knight, J. C.
1997-01-01
Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.
NASA Astrophysics Data System (ADS)
Ayalon, Michal; Watson, Anne; Lerman, Steve
2016-09-01
This study examines expressions of reasoning by some higher achieving 11 to 18 year-old English students responding to a survey consisting of function tasks developed in collaboration with their teachers. We report on 70 students, 10 from each of English years 7-13. Iterative and comparative analysis identified capabilities and difficulties of students and suggested conjectures concerning links between the affordances of the tasks, the curriculum, and students' responses. The paper focuses on five of the survey tasks and highlights connections between informal and formal expressions of reasoning about variables in learning. We introduce the notion of `schooled' expressions of reasoning, neither formal nor informal, to emphasise the role of the formatting tools introduced in school that shape future understanding and reasoning.
NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Munoz, Cesar A.
2008-01-01
This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute for Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in December 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods. Here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations. Thus, in many cases it has been necessary to bring the full power of an advanced theorem prover to bear. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.
MDAS: an integrated system for metabonomic data analysis.
Liu, Juan; Li, Bo; Xiong, Jiang-Hui
2009-03-01
Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, and disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process this data to yield useful information about a biological system, e.g., the mechanism of diseases. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, some methods such as SVMs also show promise for use in metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis which can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool MDAS (Metabonomic Data Analysis System) for metabonomic data analysis which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and can then be processed with operations such as feature analysis and dimensionality reduction. The data with reduced dimensionalities can be used for training or testing through machine learning models. The system supplies proper visualization for data preprocessing, feature analysis, and classification, which can be a powerful function for users to extract knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.
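The analysis flow that such a system formalizes (preprocessing, then dimensionality reduction, then classification) can be sketched in a few lines; scikit-learn here is an illustrative stand-in for MDAS's own models, and the data are simulated.

    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 500))        # 80 spectra, 500 metabolite features
    y = rng.integers(0, 2, size=80)           # e.g., disease vs control labels

    pipeline = Pipeline([
        ("scale", StandardScaler()),          # preprocessing
        ("reduce", PCA(n_components=10)),     # dimensionality reduction
        ("classify", SVC(kernel="rbf")),      # SVM classifier
    ])
    print(cross_val_score(pipeline, X, y, cv=5))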
Caricato, Marco
2013-07-28
The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.
Microprocessor Simulation: A Training Technique.
ERIC Educational Resources Information Center
Oscarson, David J.
1982-01-01
Describes the design and application of a microprocessor simulation using BASIC for formal training of technicians and managers and as a management tool. Illustrates the utility of the modular approach for the instruction and practice of decision-making techniques. (SK)
Recent advances in applying decision science to managing national forests
Marcot, Bruce G.; Thompson, Matthew P.; Runge, Michael C.; Thompson, Frank R.; McNulty, Steven; Cleaves, David; Tomosy, Monica; Fisher, Larry A.; Andrew, Bliss
2012-01-01
Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem analysis (defining alternatives, evaluating likely consequences, identifying key uncertainties, and analyzing tradeoffs), decision point (identifying the preferred alternative), and implementation and monitoring the preferred alternative with adaptive management feedbacks. We list a wide array of models, techniques, and tools available for each stage, and provide three case studies of their selected use in National Forest land management and project plans. Successful use of SDM involves participation by decision-makers, analysts, scientists, and stakeholders. We suggest specific areas for training and instituting SDM to foster transparency, rigor, clarity, and inclusiveness in formal decision processes regarding management of national forests.
Best behaviour? Ontologies and the formal description of animal behaviour.
Gkoutos, Georgios V; Hoehndorf, Robert; Tsaprouni, Loukia; Schofield, Paul N
2015-10-01
The development of ontologies for describing animal behaviour has proved to be one of the most difficult of all scientific knowledge domains. Ranging from neurological processes to human emotions, the range and scope needed for such ontologies is highly challenging, but if data integration and computational tools such as automated reasoning are to be fully applied in this important area the underlying principles of these ontologies need to be better established and development needs detailed coordination. Whilst the state of scientific knowledge is always paramount in ontology and formal description framework design, this is a particular problem with neurobehavioural ontologies where our understanding of the relationship between behaviour and its underlying biophysical basis is currently in its infancy. In this commentary, we discuss some of the fundamental problems in designing and using behaviour ontologies, and present some of the best developed tools in this domain.
1991-10-01
Subject terms: engineering management; information systems; method formalization; information engineering; process modeling; information systems requirements definition methods; knowledge acquisition methods; systems engineering.
2014-01-01
Background Scientific publications are documentary representations of defeasible arguments, supported by data and repeatable methods. They are the essential mediating artifacts in the ecosystem of scientific communications. The institutional “goal” of science is publishing results. The linear document publication format, dating from 1665, has survived transition to the Web. Intractable publication volumes; the difficulty of verifying evidence; and observed problems in evidence and citation chains suggest a need for a web-friendly and machine-tractable model of scientific publications. This model should support: digital summarization, evidence examination, challenge, verification and remix, and incremental adoption. Such a model must be capable of expressing a broad spectrum of representational complexity, ranging from minimal to maximal forms. Results The micropublications semantic model of scientific argument and evidence provides these features. Micropublications support natural language statements; data; methods and materials specifications; discussion and commentary; challenge and disagreement; as well as allowing many kinds of statement formalization. The minimal form of a micropublication is a statement with its attribution. The maximal form is a statement with its complete supporting argument, consisting of all relevant evidence, interpretations, discussion and challenges brought forward in support of or opposition to it. Micropublications may be formalized and serialized in multiple ways, including in RDF. They may be added to publications as stand-off metadata. An OWL 2 vocabulary for micropublications is available at http://purl.org/mp. A discussion of this vocabulary along with RDF examples from the case studies, appears as OWL Vocabulary and RDF Examples in Additional file 1. Conclusion Micropublications, because they model evidence and allow qualified, nuanced assertions, can play essential roles in the scientific communications ecosystem in places where simpler, formalized and purely statement-based models, such as the nanopublications model, will not be sufficient. At the same time they will add significant value to, and are intentionally compatible with, statement-based formalizations. We suggest that micropublications, generated by useful software tools supporting such activities as writing, editing, reviewing, and discussion, will be of great value in improving the quality and tractability of biomedical communications. PMID:26261718
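A minimal micropublication, a statement plus its attribution, can be serialized as RDF in a few lines. The sketch below uses Python's rdflib; the mp: term names are illustrative guesses rather than the actual vocabulary published at http://purl.org/mp.

    from rdflib import Graph, Namespace, Literal

    MP = Namespace("http://purl.org/mp/")     # real vocabulary lives here
    EX = Namespace("http://example.org/")
    g = Graph()
    g.bind("mp", MP)
    g.bind("ex", EX)

    # Minimal form: one statement with its attribution (term names are
    # hypothetical placeholders, not the published mp: vocabulary).
    g.add((EX.claim1, MP.statement, Literal("Drug X inhibits kinase Y")))
    g.add((EX.claim1, MP.attribution, EX.author_jones))
    print(g.serialize(format="turtle"))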
Lfm2000: Fifth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler)
2000-01-01
This is the proceedings of Lfm2000: Fifth NASA Langley Formal Methods Workshop. The workshop was held June 13-15, 2000, in Williamsburg, Virginia. See the web site
The Hierarchical Structure of Formal Operational Tasks.
ERIC Educational Resources Information Center
Bart, William M.; Mertens, Donna M.
1979-01-01
The hierarchical structure of the formal operational period of Piaget's theory of cognitive development was explored through the application of ordering theoretical methods to a set of data that systematically utilized the various formal operational schemes. Results suggested a common structure underlying task performance. (Author/BH)
Supersymmetric symplectic quantum mechanics
NASA Astrophysics Data System (ADS)
de Menezes, Miralvo B.; Fernandes, M. C. B.; Martins, Maria das Graças R.; Santana, A. E.; Vianna, J. D. M.
2018-02-01
Symplectic Quantum Mechanics (SQM) considers a non-commutative algebra of functions on a phase space Γ and an associated Hilbert space HΓ to construct a unitary representation for the Galilei group. From this unitary representation the Schrödinger equation is rewritten in phase-space variables and the Wigner function can be derived without the use of the Liouville-von Neumann equation. In this article we extend the methods of supersymmetric quantum mechanics (SUSYQM) to SQM. With a view to applications in quantum systems, the factorization method of the quantum mechanical formalism is then set within supersymmetric SQM. A hierarchy of simpler Hamiltonians is generated, leading to new computational tools for solving the eigenvalue problem in SQM. We illustrate the results by computing the states and spectra of the problem of a charged particle in a homogeneous magnetic field, as well as the corresponding Wigner function.
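For orientation, the standard factorization step that SUSYQM contributes, written here in the usual configuration-space form with ħ = 2m = 1 (a generic illustration, not the phase-space operators specific to SQM), reads:

    \[
      A = \frac{d}{dx} + W(x), \qquad A^{\dagger} = -\frac{d}{dx} + W(x),
    \]
    \[
      H_{-} = A^{\dagger} A = -\frac{d^{2}}{dx^{2}} + W^{2}(x) - W'(x), \qquad
      H_{+} = A\,A^{\dagger} = -\frac{d^{2}}{dx^{2}} + W^{2}(x) + W'(x).
    \]

The partner Hamiltonians H- and H+ are isospectral except possibly for the ground state, which is what generates the hierarchy of simpler Hamiltonians used to solve the eigenvalue problem.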
Paul Weiss and the genesis of canonical quantization
NASA Astrophysics Data System (ADS)
Rickles, Dean; Blum, Alexander
2015-12-01
This paper describes the life and work of a figure who, we argue, was of primary importance during the early years of field quantisation and (albeit more indirectly) quantum gravity. A student of Dirac and Born, he was interned in Canada during the second world war as an enemy alien and after his release never seemed to regain a good foothold in physics, identifying thereafter as a mathematician. He developed a general method of quantizing (linear and non-linear) field theories based on the parameters labelling an arbitrary hypersurface. This method (the `parameter formalism' often attributed to Dirac), though later discarded, was employed (and viewed at the time as an extremely important tool) by the leading figures associated with canonical quantum gravity: Dirac, Pirani and Schild, Bergmann, DeWitt, and others. We argue that he deserves wider recognition for this and other innovations.
Böhme, Cathleen; von Osthoff, Marc Baron; Frey, Katrin; Hübner, Jutta
2017-08-17
Mobile apps are offered in large numbers and vary in quality. The aim of this article was to develop a rating tool based on formal and content-related criteria for the assessment of cancer apps and to test its applicability on apps. After a thorough analysis of the literature, we developed a specific rating tool for cancer apps based on the MARS (mobile app rating scale) and a rating tool for cancer websites. This instrument was applied to apps freely available in stores and focusing on some cancer topic. Ten apps were rated on the basis of 22 criteria. Sixty percent of the apps (6/10) were rated poor and insufficient. The rating by different scientists was homogeneous. The good apps had reliable sources, were regularly updated, and had a concrete intent/purpose in their app description. In contrast, the apps that were rated poor made no distinction between scientific content and advertisement. In some cases, there was no imprint to identify the provider. As apps of poor quality can give misinformation and lead to wrong treatment decisions, efforts have to be made to increase usage of high-quality apps. Certification would help cancer patients to identify reliable apps, yet acceptance of a certification system must be backed up.
A Formal Semantics for the WS-BPEL Recovery Framework
NASA Astrophysics Data System (ADS)
Dragoni, Nicola; Mazzara, Manuel
While current studies on Web services composition are mostly focused - from the technical viewpoint - on standards and protocols, this work investigates the adoption of formal methods for dependable composition. The Web Services Business Process Execution Language (WS-BPEL) - an OASIS standard widely adopted both in academic and industrial environments - is considered a touchstone for concrete composition languages, and an analysis of its ambiguous Recovery Framework specification is offered. In order to show the use of formal methods, a precise and unambiguous description of its (simplified) mechanisms is provided by means of a conservative extension of the π-calculus. This is intended as a well-known case study providing methodological arguments for the adoption of formal methods in software specification. Verification is not the main topic of the paper, but some hints are given.
The ReaxFF reactive force-field: Development, applications, and future directions
Senftle, Thomas; Hong, Sungwook; Islam, Md Mahbubul; ...
2016-03-04
The reactive force-field (ReaxFF) interatomic potential is a powerful computational tool for exploring, developing and optimizing material properties. Methods based on the principles of quantum mechanics (QM), while offering valuable theoretical guidance at the electronic level, are often too computationally intense for simulations that consider the full dynamic evolution of a system. Alternatively, empirical interatomic potentials that are based on classical principles require significantly fewer computational resources, which enables simulations to better describe dynamic processes over longer timeframes and on larger scales. Such methods, however, typically require a predefined connectivity between atoms, precluding simulations that involve reactive events. The ReaxFF method was developed to help bridge this gap. Approaching the gap from the classical side, ReaxFF casts the empirical interatomic potential within a bond-order formalism, thus implicitly describing chemical bonding without expensive QM calculations. This article provides an overview of the development, application, and future directions of the ReaxFF method.
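The bond-order formalism can be indicated schematically. In common ReaxFF notation, the sigma contribution to the raw bond order has the form below (the full expression adds analogous pi and double-pi terms, and the raw bond orders are subsequently corrected for over-coordination); this is a sketch for orientation, not the complete published functional form:

    \[
      \mathrm{BO}^{\sigma\prime}_{ij}
        = \exp\!\left[\, p_{bo,1}
          \left( \frac{r_{ij}}{r_{o}^{\sigma}} \right)^{p_{bo,2}} \right],
    \]

where r_ij is the interatomic distance, r_o^sigma the equilibrium sigma-bond length, and p_bo,1 (negative) and p_bo,2 are fitted parameters that make the bond order decay smoothly to zero as the bond stretches, so connectivity never has to be predefined.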
On making things the best - Aeronautical uses of optimization /Wright Bros. lecture/
NASA Technical Reports Server (NTRS)
Ashley, H.
1981-01-01
The paper's purpose is to summarize and evaluate the results of an investigation into the degree to which formal optimization methods have contributed practically to the design and operation of atmospheric flight vehicles. The nature of this technology is reviewed and illustrated with simple structural examples. A series of published successful applications is described, drawn from the fields of aerodynamics, structures, guidance and control, optimal trajectories, and vehicle configuration optimization. The corresponding improvements over conventional analysis are assessed. Speculations are offered as to why these tools have made so little headway toward acceptance by designers. The growing need for their use in the future is explained; they hold out an unparalleled opportunity for improved efficiencies.
The role of fMRI in cognitive neuroscience: where do we stand?
Poldrack, Russell A
2008-04-01
Functional magnetic resonance imaging (fMRI) has quickly become the most prominent tool in cognitive neuroscience. In this article, I outline some of the limits on the kinds of inferences that can be supported by fMRI, focusing particularly on reverse inference, in which the engagement of specific mental processes is inferred from patterns of brain activation. Although this form of inference is weak, newly developed methods from the field of machine learning offer the potential to formalize and strengthen reverse inferences. I conclude by discussing the increasing presence of fMRI results in the popular media and the ethical implications of the increasing predictive power of fMRI.
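Formalizing reverse inference along these lines amounts to decoding: train a classifier to predict the engaged mental process from activation patterns and quantify its out-of-sample accuracy. A minimal sketch with simulated data follows (scikit-learn is an illustrative choice, not a method from the article):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 1000))   # 200 scans x 1000 voxels (simulated)
    y = rng.integers(0, 2, size=200)       # which of two mental processes was engaged

    decoder = LogisticRegression(max_iter=1000)
    acc = cross_val_score(decoder, X, y, cv=5).mean()
    print(acc)   # ~0.5 here: unstructured data supports no reverse inference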
Formal Foundations for Hierarchical Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh; Whiteside, Iain
2015-01-01
Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.
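As a toy analogue (not the paper's actual definitions), a safety-case argument can be modeled as a tree of goals, strategies, and evidence, with a fold operation that abstracts a subtree into a single hierarchical node, which is the intuition behind using hierarchy to aid comprehension:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Node:
        kind: str                       # "goal", "strategy", or "evidence"
        text: str
        children: List["Node"] = field(default_factory=list)

    def size(n: Node) -> int:
        return 1 + sum(size(c) for c in n.children)

    def fold(n: Node, summary: str) -> Node:
        """Abstract a whole subtree into one node, recording what it hides."""
        return Node("goal", f"{summary} [abstracts {size(n)} nodes]")

    leaf = Node("evidence", "unit-test results")
    sub = Node("goal", "software partition is safe",
               [Node("strategy", "argue over identified hazards", [leaf])])
    top = Node("goal", "system is acceptably safe",
               [fold(sub, "SW partition safe")])
    print(top)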
Appraisal Tools for Clinical Practice Guidelines: A Systematic Review
Siering, Ulrich; Eikermann, Michaela; Hausner, Elke; Hoffmann-Eßer, Wiebke; Neugebauer, Edmund A.
2013-01-01
Introduction Clinical practice guidelines can improve healthcare processes and patient outcomes, but are often of low quality. Guideline appraisal tools aim to help potential guideline users in assessing guideline quality. We conducted a systematic review of publications describing guideline appraisal tools in order to identify and compare existing tools. Methods Among others we searched MEDLINE, EMBASE and the Cochrane Database of Systematic Reviews from 1995 to May 2011 for relevant primary and secondary publications. We also handsearched the reference lists of relevant publications. On the basis of the available literature we firstly generated 34 items to be used in the comparison of appraisal tools and grouped them into thirteen quality dimensions. We then extracted formal characteristics as well as questions and statements of the appraisal tools and assigned them to the items. Results We identified 40 different appraisal tools. They covered between three and thirteen of the thirteen possible quality dimensions and between three and 29 of the possible 34 items. The main focus of the appraisal tools were the quality dimensions “evaluation of evidence” (mentioned in 35 tools; 88%), “presentation of guideline content” (34 tools; 85%), “transferability” (33 tools; 83%), “independence” (32 tools; 80%), “scope” (30 tools; 75%), and “information retrieval” (29 tools; 73%). The quality dimensions “consideration of different perspectives” and “dissemination, implementation and evaluation of the guideline” were covered by only twenty (50%) and eighteen tools (45%) respectively. Conclusions Most guideline appraisal tools assess whether the literature search and the evaluation, synthesis and presentation of the evidence in guidelines follow the principles of evidence-based medicine. Although conflicts of interest and norms and values of guideline developers, as well as patient involvement, affect the trustworthiness of guidelines, they are currently insufficiently considered. Greater focus should be placed on these issues in the further development of guideline appraisal tools. PMID:24349397
Rumala, Bernice B; Cason, Frederick D
2007-09-01
Recruitment of more underrepresented minority students (black, Hispanic and Native American) to increase racial diversity in the physician workforce is on the agenda for medical schools around the nation. The benefits of having a racially diverse class are indisputable. Minority physicians are more likely to provide care to minority, underserved, disadvantaged and low-income populations. Therefore, medical schools would benefit from diversity through utilizing strategies for recruitment of underrepresented minority (URM) students. Numerous recruitment strategies have been employed to increase the number of underrepresented minority students. However, formal collaboration with minority medical student organizations is an underutilized tool in the recruitment process. Many medical schools have informally used minority medical students and members of various minority organizations on campus in the recruitment process, but a formal collaboration entailing a strategic approach to using minority medical student organizations has yet to be described in the literature. This paper discusses the innovative collaboration between the University of Toledo College of Medicine (UTCOM) chapter of the Student National Medical Association (SNMA) and the college of medicine's admissions office to develop a recruitment plan to increase the number of underrepresented minority students at the UTCOM. This paper suggests that minority medical student organizations, particularly the SNMA, can be used as a recruiting tool; admissions offices should not overlook the usefulness of formally involving minority medical student organizations in recruitment. This approach may also be applicable to residency programs and other graduate professional fields with a severe shortage of URM students.
Automatically Grading Customer Confidence in a Formal Specification.
ERIC Educational Resources Information Center
Shukur, Zarina; Burke, Edmund; Foxley, Eric
1999-01-01
Describes an automatic grading system for a formal methods computer science course that is able to evaluate a formal specification written in the Z language. Quality is measured by considering first, specification correctness (syntax, semantics, and satisfaction of customer requirements), and second, specification maintainability (comparison of…
Formal Solutions for Polarized Radiative Transfer. II. High-order Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch
When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.
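To make the trade-off concrete, here is a minimal sketch (Python/NumPy; the constant propagation matrix K and emission vector eps are hypothetical, not values from the paper) of one classical fourth-order Runge-Kutta step for the polarized transfer equation dI/ds = -K I + eps along a ray; a well-performing high-order step like this is what permits coarser spatial grids at the same accuracy:

    import numpy as np

    def rhs(I, K, eps):
        # Right-hand side of the polarized RTE: dI/ds = -K I + eps
        return -K @ I + eps

    def rk4_step(I, ds, K, eps):
        # Classical 4th-order Runge-Kutta step along the ray
        k1 = rhs(I, K, eps)
        k2 = rhs(I + 0.5 * ds * k1, K, eps)
        k3 = rhs(I + 0.5 * ds * k2, K, eps)
        k4 = rhs(I + ds * k3, K, eps)
        return I + (ds / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    # Hypothetical constant absorption matrix and emission vector
    K = np.diag([1.0, 0.8, 0.8, 0.9])
    eps = np.array([1.0, 0.1, 0.0, 0.05])
    I = np.zeros(4)  # Stokes vector (I, Q, U, V)
    for _ in range(100):
        I = rk4_step(I, 0.05, K, eps)
    print(I)

Real formal solvers must additionally handle depth-dependent coefficients and stiffness, which is where the stability analysis cited above comes in.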
Formalizing New Navigation Requirements for NASA's Space Shuttle
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Bridging the Gulf between Formal Calculus and Physical Reasoning.
ERIC Educational Resources Information Center
Van Der Meer, A.
1980-01-01
Some ways to link calculus instruction with the mathematical models used in physics courses are presented. The activity of modelling is presented as a major tool in synchronizing physics and mathematics instruction in undergraduate engineering programs. (MP)
Prioritising coastal zone management issues through fuzzy cognitive mapping approach.
Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi
2012-04-30
Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), in which the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
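As a minimal illustration of the underlying computation (Python/NumPy; the three concepts and weights are hypothetical, not taken from the North Lebanon maps), a fuzzy cognitive map propagates each concept's activation through the signed influence weights and a sigmoid squashing function until the state settles:

    import numpy as np

    def fcm_iterate(W, state, steps=50, lam=1.0):
        # W[i, j] is the signed influence of concept j on concept i
        for _ in range(steps):
            state = 1.0 / (1.0 + np.exp(-lam * (W @ state)))  # sigmoid squashing
        return state

    # Hypothetical 3-concept map: governance, infrastructure, environment
    W = np.array([[0.0, 0.4, 0.2],
                  [0.5, 0.0, -0.3],
                  [0.3, -0.2, 0.0]])
    print(fcm_iterate(W, np.array([0.5, 0.5, 0.5])))

The steady-state activations are what analysts then compare across stakeholder groups to expose shared priorities.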
Job Search Methods: Consequences for Gender-based Earnings Inequality.
ERIC Educational Resources Information Center
Huffman, Matt L.; Torres, Lisa
2001-01-01
Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large-scale morphogenesis that match published data in the limb regeneration field. Major barriers to an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user-friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
Deep first formal concept search.
Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu
2014-01-01
The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and the difficulty of visualizing the calculation process. Building on the basic idea of depth-first search, this paper presents a visualization algorithm based on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained through a visualized global search for formal concepts over the topology, degenerated to fixed start and end points, without repetition or omission. This method makes the calculation of formal concepts precise and easy to operate and reflects the integrity of the algorithm, which makes it suitable for visualization analysis.
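For readers unfamiliar with the objects being enumerated, the brute-force sketch below (Python; toy context, and deliberately the naive exponential baseline rather than the paper's attribute-topology algorithm) computes all formal concepts of a small context as the distinct pairs (extent, intent) closed under the usual derivation operators:

    from itertools import chain, combinations

    # Toy formal context: object -> set of attributes
    context = {"o1": {"a", "b"}, "o2": {"b", "c"}, "o3": {"a", "b", "c"}}
    attributes = set().union(*context.values())

    def extent(intent_set):
        # Objects possessing every attribute in the intent
        return {o for o, attrs in context.items() if intent_set <= attrs}

    def intent(objs):
        # Attributes shared by every object in the extent
        return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

    concepts = set()
    for subset in chain.from_iterable(combinations(sorted(attributes), r)
                                      for r in range(len(attributes) + 1)):
        e = extent(set(subset))
        concepts.add((frozenset(e), frozenset(intent(e))))

    for e, i in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(e), sorted(i))

The exponential blow-up visible here is exactly the challenge that motivates smarter search orders such as a depth-first traversal of the attribute topology.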
Formalisms for user interface specification and design
NASA Technical Reports Server (NTRS)
Auernheimer, Brent J.
1989-01-01
The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.
A STUDY OF FORMALLY ADVERTISED PROCUREMENT
As a method of procuring goods and services, formally advertised procurement offers a number of advantages. These include the prevention of fraud and...two-thirds of all contracts are let in these cases. This is done by examining over 2,300 contracts let under formal advertising procedures. A measure of
Geometry and Formal Linguistics.
ERIC Educational Resources Information Center
Huff, George A.
This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…
Entropy in bimolecular simulations: A comprehensive review of atomic fluctuations-based methods.
Kassem, Summer; Ahmed, Marawan; El-Sheikh, Salah; Barakat, Khaled H
2015-11-01
Entropy of binding constitutes a major, and in many cases a detrimental, component of the binding affinity in biomolecular interactions. While the enthalpic part of the binding free energy is easier to calculate, estimating the entropy of binding is far more complicated. A precise evaluation of entropy requires a comprehensive exploration of the complete phase space of the interacting entities. As this task is extremely hard to accomplish in the context of conventional molecular simulations, calculating entropy has involved many approximations. Most of these gold-standard methods have focused on developing a reliable estimation of the conformational part of the entropy. Here, we review these methods with a particular emphasis on the different techniques that extract entropy from atomic fluctuations. The theoretical formalism behind each method is explained, highlighting its strengths as well as its limitations, followed by a description of a number of case studies for each method. We hope that this brief, yet comprehensive, review provides a useful tool to understand these methods and realize the practical issues that may arise in such calculations. Copyright © 2015 Elsevier Inc. All rights reserved.
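As one concrete instance of an atomic-fluctuations-based estimator, the sketch below (Python/NumPy, SI units; the covariance and masses are hypothetical) evaluates Schlitter's upper bound on conformational entropy, S <= (kB/2) ln det[1 + (kB T e^2 / hbar^2) M^(1/2) sigma M^(1/2)], from the mass-weighted covariance of Cartesian fluctuations:

    import numpy as np

    KB = 1.380649e-23       # Boltzmann constant, J/K
    HBAR = 1.054571817e-34  # reduced Planck constant, J s

    def schlitter_entropy(cov, masses, T=300.0):
        # cov: 3N x 3N covariance of Cartesian fluctuations (m^2)
        # masses: length-3N array, the atomic mass for each coordinate (kg)
        m_sqrt = np.sqrt(masses)
        mw_cov = m_sqrt[:, None] * cov * m_sqrt[None, :]  # M^(1/2) sigma M^(1/2)
        arg = np.eye(len(masses)) + (KB * T * np.e ** 2 / HBAR ** 2) * mw_cov
        sign, logdet = np.linalg.slogdet(arg)
        return 0.5 * KB * logdet  # J/K, an upper bound on conformational entropy

    # Hypothetical isotropic fluctuations of a single carbon atom
    cov = np.diag([1e-21, 1e-21, 1e-21])   # m^2
    masses = np.full(3, 12 * 1.66054e-27)  # kg
    print(schlitter_entropy(cov, masses))

In practice the covariance comes from a molecular dynamics trajectory, and the quasiharmonic character of the bound is one of the limitations this class of methods carries.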
Liu, Nan; Xing, Huayi; Zhou, Mouwang; Biering-Sørensen, Fin
2018-03-29
Objective To investigate the use of functional outcome measurements after spinal cord injury (SCI) in current clinical practice and to explore the knowledge about the Spinal Cord Independence Measure (SCIM) among SCI physicians in China, and to find facilitators for a broader utilization of SCIM. Design A survey-based study. Setting SCI workshops at Peking University. Participants 125 Chinese SCI physicians attending annual workshops in two consecutive years. Interventions Not applicable. Outcome measures A questionnaire was administered. The following items were included: whether functional outcome measurement for SCI individuals was performed and with which assessment tool(s); what items should be included in the assessment; whether they knew about the SCIM, its latest version, the Chinese translation, and if so from what source; the possible reasons why SCIM was not implemented in clinical practice; and whether training before using the SCIM was needed, and the training method preferred. Results Among these physicians, 84.8% performed functional outcome measurement for individuals with SCI, but only 29.6% of attendees were aware of the SCIM and 20.8% had used it. Lack of training was the major reason why SCIM was not used in clinical practice. Furthermore, 74.4% of the physicians felt they needed formal training before using the SCIM. Conclusion The use of SCIM is limited in clinical practice in China, which is mainly attributed to lack of knowledge and training. Formal training on the use of the SCIM is essential for its dissemination and will improve functional SCI outcome measurement in China.
Stopping power of dense plasmas: The collisional method and limitations of the dielectric formalism.
Clauser, C F; Arista, N R
2018-02-01
We present a study of the stopping power of plasmas using two main approaches: the collisional (scattering theory) and the dielectric formalisms. In the former case, we use a semiclassical method based on quantum scattering theory. In the latter case, we use the full description given by the extension of the Lindhard dielectric function for plasmas of all degeneracies. We compare these two theories and show that the dielectric formalism has limitations when it is used for slow heavy ions or atoms in dense plasmas. We present a study of these limitations and show the regimes where the dielectric formalism can be used, with appropriate corrections to include the usual quantum and classical limits. On the other hand, the semiclassical method shows the correct behavior for all plasma conditions and projectile velocities and charges. We consider different models for the ion charge distributions, including bare and dressed ions as well as neutral atoms.
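For reference, the linear-response expression whose limitations are probed here is the standard dielectric (Lindhard-type) stopping integral, written in Gaussian units for a point projectile of charge $Z_1 e$ and velocity $v$:

\[
S(v) = \frac{2 Z_1^2 e^2}{\pi v^2} \int_0^{\infty} \frac{dk}{k} \int_0^{kv} \omega \, \mathrm{Im}\!\left[\frac{-1}{\epsilon(k,\omega)}\right] d\omega ,
\]

where $\epsilon(k,\omega)$ is the longitudinal dielectric function of the plasma. Because this kernel is first order in the projectile perturbation, it degrades for slow, strongly perturbing ions, which is where the scattering-theory (collisional) approach retains validity.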
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Yiying, E-mail: yiyingyan@sjtu.edu.cn; Lü, Zhiguo, E-mail: zglv@sjtu.edu.cn; Zheng, Hang, E-mail: hzheng@sjtu.edu.cn
We present a theoretical formalism for resonance fluorescence radiating from a two-level system (TLS) driven by any periodic driving and coupled to multiple reservoirs. The formalism is derived analytically based on the combination of Floquet theory and the Born–Markov master equation. The formalism allows us to calculate the spectrum when the Floquet states and quasienergies are analytically or numerically solved for simple or complicated driving fields. We can systematically explore the spectral features by implementing the present formalism. To exemplify this theory, we apply the unified formalism to comprehensively study a generic model in which a harmonically driven TLS is simultaneously coupled to a radiative reservoir and a dephasing reservoir. We demonstrate that the significant features of the fluorescence spectra, the driving-induced asymmetry and the dephasing-induced asymmetry, can be attributed to the violation of the detailed balance condition, and explained in terms of the driving-related transition quantities between Floquet states and their steady populations. In addition, we find distinguished features of the fluorescence spectra under biharmonic and multiharmonic driving fields in contrast with the harmonic driving case. In the case of biharmonic driving, we find that the spectra are significantly different from the result of the rotating-wave approximation (RWA) under multiple resonance conditions. Through three concrete applications, we illustrate that the present formalism provides a routine tool for comprehensively exploring the fluorescence spectrum of periodically and strongly driven TLSs.
Workshop on dimensional analysis for design, development, and research executives
NASA Technical Reports Server (NTRS)
Goodman, R. A.; Abernathy, W. J.
1971-01-01
The proceedings of a conference of research and development executives are presented. The purpose of the meeting was to develop an understanding of the conditions which are appropriate for the use of certain general management tools and those conditions which render these tools inappropriate. The verbatim statements of the participants are included to show the direction taken initially by the conference. Formal presentations of management techniques for research and development are developed.
Solar-Terrestrial Ontology Development
NASA Astrophysics Data System (ADS)
McGuinness, D.; Fox, P.; Middleton, D.; Garcia, J.; Cinquini, L.; West, P.; Darnell, J. A.; Benedict, J.
2005-12-01
The development of an interdisciplinary virtual observatory (the Virtual Solar-Terrestrial Observatory; VSTO) as a scalable environment for searching, integrating, and analyzing databases distributed over the Internet requires a higher level of semantic interoperability than heretofore required by most (if not all) distributed data systems or discipline-specific virtual observatories. The formalization of semantics using ontologies and their encodings for the Internet (e.g., OWL, the Web Ontology Language), as well as the use of accompanying tools such as reasoning, inference, and explanation, opens up both a substantial leap in options for interoperability and a need for formal development principles to guide ontology development and use within modern, multi-tiered network data environments. In this presentation, we outline the formal methodologies we utilize in the VSTO project, the currently developed use cases, and the ontologies and their relation to existing ontologies (such as SWEET).
Tools reference manual for a Requirements Specification Language (RSL), version 2.0
NASA Technical Reports Server (NTRS)
Fisher, Gene L.; Cohen, Gerald C.
1993-01-01
This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.
European Train Control System: A Case Study in Formal Verification
NASA Astrophysics Data System (ADS)
Platzer, André; Quesel, Jan-David
Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
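To give the flavor of the parameter constraints involved (a simplified version for illustration; the constraints derived in the paper are richer), a train at position $z$ with speed $v$ and maximum braking deceleration $b$ can stop before the end of its movement authority $m$ only if

\[
v^2 \le 2b\,(m - z),
\]

and a safe controller must re-establish this inequality at every control cycle, budgeting for the distance traveled during one reaction delay.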
NASA Technical Reports Server (NTRS)
Bickford, Mark; Srivas, Mandayam
1991-01-01
Presented here is a formal specification and verification of a property of a quadruplicately redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved are given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The verified property ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. The verification was performed using a computer-aided design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
Stochastic Formal Correctness of Numerical Algorithms
NASA Technical Reports Server (NTRS)
Daumas, Marc; Lester, David; Martin-Dorel, Erik; Truffert, Annick
2009-01-01
We provide a framework to bound the probability that accumulated errors remain below a given threshold in numerical algorithms. Such algorithms are used, for example, in aircraft and nuclear power plants. This report contains simple formulas based on Lévy's and Markov's inequalities, and it presents a formal theory of random variables with a special focus on producing concrete results. We selected four very common applications that fit in our framework and cover the common practices of systems that evolve for a long time. We compute the number of bits that remain continuously significant in the first two applications with a probability of failure around one out of a billion, where worst-case analysis considers that no significant bit remains. We use PVS, as such formal tools force explicit statement of all hypotheses and prevent incorrect use of theorems.
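For concreteness, the elementary bound at the base of such a framework is Markov's inequality: for a nonnegative accumulated error $E_n$ and any threshold $t > 0$,

\[
\Pr\left( E_n \ge t \right) \le \frac{\mathbb{E}[E_n]}{t},
\]

so a bound on the expected accumulated rounding error translates directly into a probabilistic guarantee that a given number of leading bits stays significant; Lévy-type inequalities strengthen this to control the running maximum of the error over a long-running computation.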
ERIC Educational Resources Information Center
Crestani, Fabio; Dominich, Sandor; Lalmas, Mounia; van Rijsbergen, Cornelis Joost
2003-01-01
Discusses the importance of research on the use of mathematical, logical, and formal methods in information retrieval to help enhance retrieval effectiveness and clarify underlying concepts of information retrieval. Highlights include logic; probability; spaces; and future research needs. (Author/LRW)
Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França
2017-01-01
This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.
Baker, Sarah E; Painter, Elizabeth E; Morgan, Brandon C; Kaus, Anna L; Petersen, Evan J; Allen, Christopher S; Deyle, Gail D; Jensen, Gail M
2017-01-01
Clinical reasoning is essential to physical therapist practice. Solid clinical reasoning processes may lead to greater understanding of the patient condition, early diagnostic hypothesis development, and well-tolerated examination and intervention strategies, as well as mitigate the risk of diagnostic error. However, the complex and often subconscious nature of clinical reasoning can impede the development of this skill. Protracted tools have been published to help guide self-reflection on clinical reasoning but might not be feasible in typical clinical settings. This case illustrates how the Systematic Clinical Reasoning in Physical Therapy (SCRIPT) tool can be used to guide the clinical reasoning process and prompt a physical therapist to search the literature to answer a clinical question and facilitate formal mentorship sessions in postprofessional physical therapist training programs. The SCRIPT tool enabled the mentee to generate appropriate hypotheses, plan the examination, query the literature to answer a clinical question, establish a physical therapist diagnosis, and design an effective treatment plan. The SCRIPT tool also facilitated the mentee's clinical reasoning and provided the mentor insight into the mentee's clinical reasoning. The reliability and validity of the SCRIPT tool have not been formally studied. Clinical mentorship is a cornerstone of postprofessional training programs and intended to develop advanced clinical reasoning skills. However, clinical reasoning is often subconscious and, therefore, a challenging skill to develop. The use of a tool such as the SCRIPT may facilitate developing clinical reasoning skills by providing a systematic approach to data gathering and making clinical judgments to bring clinical reasoning to the conscious level, facilitate self-reflection, and make a mentored physical therapist's thought processes explicit to his or her clinical mentor. © 2017 American Physical Therapy Association
Federal and tribal lands road safety audits : case studies
DOT National Transportation Integrated Search
2009-12-01
A road safety audit (RSA) is a formal safety performance examination by an independent, multidisciplinary team. RSAs are an effective tool for proactively improving the safety performance of a road project during the planning and design stages, and f...
Kankya, Clovice; Muleme, James; Akandinda, Ann; Sserugga, Joseph; Nantima, Noelina; Okori, Edward; Odoch, Terence
2017-01-01
Aim An evaluation exercise was carried out to assess the performance of Community Animal Health Workers (CAHWs) in the delivery of animal health care services in Karamoja region, identify capacity gaps and recommend remedial measures. Materials & methods Participatory methods were used to design data collection tools. Questionnaires were administered to 204 CAHWs, 215 farmers and 7 District Veterinary Officers (DVOs) to collect quantitative data. Seven DVOs and 1 Non-Governmental Organization (NGO) representative were interviewed as key informants and one focus group discussion was conducted with a farmer group in Nakapiripirit to collect qualitative data. Questionnaire data was analyzed using SPSS version 19. Key messages from interviews and the focus group discussion were recorded in a notebook and reported verbatim. Results 70% of the farmers revealed that CAHWs are the most readily available animal health care service providers in their respective villages. CAHWs were instrumental in treatment of sick animals, disease surveillance, control of external parasites, animal production, vaccination, reporting, animal identification, and performing minor surgeries. Regarding their overall performance, 88.8% (191/215) of the farmers said they were impressed. The main challenges faced by the CAHWs were inadequate facilitation, lack of tools and equipment, unwillingness of government to integrate them into the formal extension system, poor information flow, limited technical capacity to diagnose diseases, unwillingness of farmers to pay for services and sustainability issues. Conclusions and recommendations CAHWs remain the main source of animal health care services in Karamoja region and their services are largely satisfactory. The technical deficits identified require continuous capacity building programs, close supervision and technical backstopping. For sustainability of animal health care services in the region, continuous training and strategic deployment of paraprofessionals that are formally recognised by the traditional civil service to gradually replace CAHWs is recommended. PMID:28594945
Modeling and Analysis of Asynchronous Systems Using SAL and Hybrid SAL
NASA Technical Reports Server (NTRS)
Tiwari, Ashish; Dutertre, Bruno
2013-01-01
We present formal models and results of formal analysis of two different asynchronous systems. We first examine a mid-value select module that merges the signals coming from three different sensors that are each asynchronously sampling the same input signal. We then consider the phase locking protocol proposed by Daly, Hopkins, and McKenna. This protocol is designed to keep a set of non-faulty (asynchronous) clocks phase locked even in the presence of Byzantine-faulty clocks on the network. All models and verifications have been developed using the SAL model checking tools and the Hybrid SAL abstractor.
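The mid-value select function at the heart of the first model is simple to state outside any modeling language. A sketch in Python (names hypothetical) returns the median of the three asynchronously sampled values, so a single faulty sensor can never drive the output outside the range spanned by the two good ones:

    def mid_value_select(a: float, b: float, c: float) -> float:
        # Median of three: discards the single largest and smallest sample,
        # bounding the influence of any one faulty sensor.
        return sorted((a, b, c))[1]

    assert mid_value_select(1.0, 99.0, 2.0) == 2.0  # a spike at 99.0 is masked

The formal analysis in SAL then concerns the harder part: what this merge guarantees when the three samplers are not synchronized.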
IOOC Organizational Network (ION) Project
NASA Astrophysics Data System (ADS)
Dean, H.
2013-12-01
In order to meet the growing need for ocean information, research communities at the national and international levels have responded most recently by developing organizational frameworks that can help to integrate information across systems of existing networks and standardize methods of data gathering, management, and processing that facilitate integration. To address recommendations and identified challenges related to the need for a better understanding of ocean observing networks, members of the U.S. Interagency Ocean Observation Committee (IOOC) supported pursuing a project that came to be titled the IOOC Organizational Network (ION). The ION tool employs network mapping approaches which mirror approaches developed in academic literature aimed at understanding political networks. Researchers gathered data on the list of global ocean observing organizations included in the Framework for Ocean Observing (FOO), developed in 2012 by the international Task Team for an Integrated Framework for Sustained Ocean Observing. At the international scale, researchers reviewed organizational research plans and documents, websites, and formal international agreement documents. At the U.S. national scale, researchers analyzed legislation, formal inter-agency agreements, work plans, charters, and policy documents. Researchers based analysis of relationships among global organizations and national federal organizations on four broad relationship categories: Communications, Data, Infrastructure, and Human Resources. In addition to the four broad relationship categories, researchers also gathered data on relationship instrument types, strength of relationships, and (at the global level) ocean observing variables. Using network visualization software, researchers then developed a series of dynamic webpages. Researchers used the tool to address questions identified by the ocean observing community, including identifying gaps in global relationships and the types of tools used to develop networks at the U.S. national level. As the ION project goes through beta testing and is utilized to address specific questions posed by the ocean observing community, it will become more refined and more closely linked to user needs and interests.
Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.
2015-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods has played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.
A Formal Methods Approach to the Analysis of Mode Confusion
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal models and analyses can be used to help eliminate mode confusion from flight deck designs and at the same time increase our confidence in the safety of the implementation. The paper is based upon interim results from a new project involving NASA Langley and Rockwell Collins in applying formal methods to a realistic business jet Flight Guidance System (FGS).
Huesch, Marco D
2017-12-01
Surveillance of the safety of prescribed drugs after marketing approval has been secured remains fraught with complications. Formal ascertainment by providers and reporting to adverse-event registries, formal surveys by manufacturers, and mining of electronic medical records are all well-known approaches with varying degrees of difficulty, cost, and success. Novel approaches may be a useful adjunct, especially approaches that mine or sample internet-based sources such as online social networks. A novel commercial software-as-a-service data-mining product supplied by Sysomos from Datasift/Facebook was used to mine all mentions on Facebook of statins and statin-related side effects in the US in the 1-month period 9 January 2017 through 8 February 2017. A total of 4.3% of all 25,700 mentions of statins also mentioned typical statin-related side effects. Multiple methodological weaknesses stymie interpretation of this percentage, which is however not inconsistent with estimates that 5-20% of patients taking statins will experience typical side effects at some time. Future work on pharmacovigilance may be informed by this novel commercial tool, but the inability to mine the full text of a posting poses serious challenges to content categorization.
NASA Technical Reports Server (NTRS)
Barnes, Jeffrey M.
2011-01-01
All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.
Working the College System: Six Strategies for Building a Personal Powerbase
ERIC Educational Resources Information Center
Simplicio, Joseph S. C.
2008-01-01
Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…
Development of the Space Operations Incident Reporting Tool (SOIRT)
NASA Technical Reports Server (NTRS)
Minton, Jacquie
1997-01-01
The space operations incident reporting tool (SOIRT) is an instrument used to record information about an anomaly occurring during flight which may have been due to insufficient and/or inappropriate application of human factors knowledge. We originally developed the SOIRT form after researching other incident reporting systems of this type. We modified the form after performing several in-house reviews and a pilot test to assess usability. Finally, crew members from Space Shuttle flights participated in a usability test of the tool after their missions. Since the National Aeronautics and Space Administration (NASA) currently has no system for continuous collection of this type of information, the SOIRT was developed to report issues such as reach envelope constraints, control operation difficulties, and vision impairments. However, if the SOIRT were to become a formal NASA process, information from crew members could be collected in a database and made available to individuals responsible for improving in-flight safety and productivity. Potential benefits include documentation to justify the redesign or development of new equipment/systems, provide mission planners with a method for identifying past incidents, justify the development of timelines and mission scenarios, and support the creation of more appropriate work/rest cycles.
NASA Astrophysics Data System (ADS)
Hennell, Michael
This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.
Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method
ERIC Educational Resources Information Center
Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo
2012-01-01
This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
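A minimal numerical rendition of the idea behind such a presentation (Python/NumPy; free particle, piecewise-linear paths through one intermediate point, natural units with hbar = m = 1; a toy illustration, not the authors' software) sums the phases exp(iS) over paths and exhibits the dominance of paths near the classical straight line:

    import numpy as np

    def free_action(x0, x1, t):
        # Action of a straight segment for a free particle with m = 1
        return 0.5 * (x1 - x0) ** 2 / t

    # Paths from x=0 at t=0 to x=1 at t=2, routed through a midpoint at t=1
    mids = np.linspace(-5.0, 5.0, 2001)
    phases = np.exp(1j * (free_action(0.0, mids, 1.0) + free_action(mids, 1.0, 1.0)))
    amplitude = phases.sum() * (mids[1] - mids[0])
    print(abs(amplitude))  # dominated by paths near the classical midpoint x = 0.5

Plotting the running sum of the phases traces a Cornu-spiral-like pattern whose rapid cancellation away from the classical path is the visual point such simulations make for students.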
On the Need for Practical Formal Methods
1998-01-01
additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several examples ...either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented
A Vector Representation for Thermodynamic Relationships
ERIC Educational Resources Information Center
Pogliani, Lionello
2006-01-01
The existing vector formalism method for thermodynamic relationships maintains tractability and uses accessible mathematics, which can be seen as a diverting and entertaining step into the mathematical formalism of thermodynamics and as an elementary application of matrix algebra. The method is based on ideas and operations apt to improve the…
Creating More Effective Mentors: Mentoring the Mentor.
Gandhi, Monica; Johnson, Mallory
2016-09-01
Given the diversity of those affected by HIV, increasing diversity in the HIV biomedical research workforce is imperative. A growing body of empirical and experimental evidence supports the importance of strong mentorship in the development and success of trainees and early career investigators in academic research settings, especially for mentees of diversity. Often missing from this discussion is the need for robust mentoring training programs to ensure that mentors are trained in best practices on the tools and techniques of mentoring. Recent experimental evidence shows improvement in mentor and mentee perceptions of mentor competency after structured and formalized training on best practices in mentoring. We developed a 2-day "Mentoring the Mentors" workshop at UCSF to train mid-level and senior HIV researchers from around the country [recruited mainly from Centers for AIDS Research (CFARs)] on best practices, tools and techniques of effective mentoring. The workshop content was designed using principles of Social Cognitive Career Theory (SCCT) and included training specifically geared towards working with early career investigators from underrepresented groups, including sessions on unconscious bias, microaggressions, and diversity supplements. The workshop has been held three times (September 2012, October 2013 and May 2015), with plans for annual training. Mentoring competency was measured using a validated tool before and after each workshop. Mentoring competency skills in six domains of mentoring (effective communication, aligning expectations, assessing understanding, fostering independence, addressing diversity, and promoting development) all improved as assessed by the validated measurement tool administered to participants before and after the "Mentoring the Mentors" training workshops. Qualitative assessments indicated a greater awareness of the micro-insults and unconscious bias experienced by mentees of diversity and a commitment to improve awareness and mitigate these effects via the mentor-mentee relationship. Our "Mentoring the Mentors" workshop for HIV researchers/mentors offers a formal and structured curriculum on best practices, tools and techniques of effective mentoring, and methods to mitigate unconscious bias in the mentoring relationship. We found quantitative and qualitative improvements in mentoring skills as assessed by self-report by participants after each workshop and plan additional programs with longitudinal, longer-term assessments focused on objective mentee outcomes (grants, papers, academic retention). Mentoring training can improve mentoring skills and is likely to improve outcomes for optimally mentored mentees.
A Guide for Scientists Interested in Researching Student Outcomes
NASA Astrophysics Data System (ADS)
Buxner, Sanlyn R.; Anbar, Ariel; Semken, Steve; Mead, Chris; Horodyskyj, Lev; Perera, Viranga; Bruce, Geoffrey; Schönstein, David
2015-11-01
Scientists spend years training in their scientific discipline and are well versed in the literature, methods, and innovations in their own field. Many scientists also take on teaching responsibilities with little formal training in how to implement their courses or assess their students. There is a growing body of literature on what students know in space science courses and the types of innovations that can work to increase student learning, but scientists rarely have exposure to this body of literature. For scientists who are interested in more effectively understanding what their students know or investigating the impact their courses have on students, there is little guidance. Undertaking a more formal study of students poses additional complexities, including finding robust instruments and employing appropriate data analysis. Additionally, formal research with students involves issues of privacy and human subjects concerns, both regulated by federal laws. This poster details the important decisions and issues to consider for both course evaluation and more formal research, using a course developed, facilitated, evaluated and researched by a hybrid team of scientists and science education researchers. HabWorlds, designed and implemented by a team of scientists and faculty at Arizona State University, has been using student data to continually improve the course as well as conduct formal research on students’ knowledge and attitudes in science. This ongoing project has had external funding sources to allow robust assessment not available to most instructors. This is a case study for discussing issues that are applicable to designing and assessing all science courses. Over the course of several years, instructors have refined course outcomes and learning objectives that are shared with students as a roadmap of instruction. The team has searched for appropriate tools for assessing student learning and attitudes, tested them and decided which have worked, or not, for assessment in the course. Data from this assessment has led to many changes in the course to better meet the course goals. We will share challenges and lessons learned in our project to assist other instructors interested in doing research on student outcomes.
NASA Technical Reports Server (NTRS)
Weber, Doug; Jamsek, Damir
1994-01-01
The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.
d'Alquen, Daniela; De Boeck, Kris; Bradley, Judy; Vávrová, Věra; Dembski, Birgit; Wagner, Thomas O F; Pfalz, Annette; Hebestreit, Helge
2012-02-06
The European Centres of Reference Network for Cystic Fibrosis (ECORN-CF) established an Internet forum which provides the opportunity for CF patients and other interested people to ask experts questions about CF in their mother language. The objectives of this study were to: 1) develop a detailed quality assessment tool to analyze quality of expert answers, 2) evaluate the intra- and inter-rater agreement of this tool, and 3) explore changes in the quality of expert answers over the time frame of the project. The quality assessment tool was developed by an expert panel. Five experts within the ECORN-CF project used the quality assessment tool to analyze the quality of 108 expert answers published on ECORN-CF from six language zones. 25 expert answers were scored at two time points, one year apart. Quality of answers was also assessed at an early and later period of the project. Individual rater scores and group mean scores were analyzed for each expert answer. A scoring system and training manual were developed analyzing two quality categories of answers: content and formal quality. For content quality, the grades based on group mean scores for all raters showed substantial agreement between two time points, however this was not the case for the grades based on individual rater scores. For formal quality the grades based on group mean scores showed only slight agreement between two time points and there was also poor agreement between time points for the individual grades. The inter-rater agreement for content quality was fair (mean kappa value 0.232 ± 0.036, p < 0.001) while only slight agreement was observed for the grades of the formal quality (mean kappa value 0.105 ± 0.024, p < 0.001). The quality of expert answers was rated high (four language zones) or satisfactory (two language zones) and did not change over time. The quality assessment tool described in this study was feasible and reliable when content quality was assessed by a group of raters. Within ECORN-CF, the tool will help ensure that CF patients all over Europe have equal possibility of access to high quality expert advice on their illness. © 2012 d’Alquen et al; licensee BioMed Central Ltd.
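For readers unfamiliar with the agreement statistic quoted above, here is a minimal sketch (Python; the two rating sequences are hypothetical) of Cohen's kappa for two raters assigning categorical grades:

    from collections import Counter

    def cohens_kappa(r1, r2):
        # Observed agreement minus chance agreement, normalized
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n
        c1, c2 = Counter(r1), Counter(r2)
        pe = sum(c1[k] * c2[k] for k in c1) / n ** 2
        return (po - pe) / (1 - pe)

    print(cohens_kappa(["good", "good", "fair"], ["good", "fair", "fair"]))  # 0.4

Values near 0 indicate agreement no better than chance, which is why the formal-quality kappa of about 0.1 reported above counts as only slight agreement.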
Quality assurance software inspections at NASA Ames: Metrics for feedback and modification
NASA Technical Reports Server (NTRS)
Wenneson, G.
1985-01-01
Software inspections, a set of formal technical review procedures held at selected key points during software development to find defects in software documents, are described in terms of history, participants, tools, procedures, statistics, and database analysis.
What can formal methods offer to digital flight control systems design
NASA Technical Reports Server (NTRS)
Good, Donald I.
1990-01-01
Formal methods research is beginning to produce methods which will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling which are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.
A peer-led teaching initiative for foundation doctors.
Ramsden, Sophie; Abidogun, Abiola; Stringer, Emma; Mahgoub, Sara; Kastrissianakis, Artemis; Baker, Paul
2015-08-01
Peer teaching has been used informally throughout the history of medical education. Formal studies within the medical student and allied health care professional communities have found it to be a popular, and highly effective, method of teaching. Newly qualified doctors are currently an underused resource in terms of teaching one another. A committee, made up of newly qualified doctors and postgraduate education staff, was established. Using only a few resources, this committee organised regular, peer-led tutorials and used educational needs assessment tools, such as questionnaires, to make improvements to early postgraduate training. The result is a realistic and well-received intervention to improve the teaching of newly qualified doctors, one that is feasible in the modern, busy health care setting. Other institutions may find this method and its resources valuable. © 2015 John Wiley & Sons Ltd.
Extending Quantum Chemistry of Bound States to Electronic Resonances
NASA Astrophysics Data System (ADS)
Jagau, Thomas-C.; Bravaya, Ksenia B.; Krylov, Anna I.
2017-05-01
Electronic resonances are metastable states with finite lifetime embedded in the ionization or detachment continuum. They are ubiquitous in chemistry, physics, and biology. Resonances play a central role in processes as diverse as DNA radiolysis, plasmonic catalysis, and attosecond spectroscopy. This review describes novel equation-of-motion coupled-cluster (EOM-CC) methods designed to treat resonances and bound states on an equal footing. Built on complex-variable techniques such as complex scaling and complex absorbing potentials that allow resonances to be associated with a single eigenstate of the molecular Hamiltonian rather than several continuum eigenstates, these methods extend electronic-structure tools developed for bound states to electronic resonances. Selected examples emphasize the formal advantages as well as the numerical accuracy of EOM-CC in the treatment of electronic resonances. Connections to experimental observables such as spectra and cross sections, as well as practical aspects of implementing complex-valued approaches, are also discussed.
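The complex-variable idea the review builds on can be stated compactly: under the complex-scaling transformation $r \to r e^{i\theta}$, the Hamiltonian becomes non-Hermitian,

\[
H(\theta) = e^{-2i\theta}\,\hat{T} + V\!\left(r e^{i\theta}\right),
\]

and a resonance appears as a single discrete, square-integrable eigenstate with complex energy $E = E_R - \tfrac{i}{2}\Gamma$, where $E_R$ is the resonance position and $\Gamma$ its width (inverse lifetime). This is what allows bound-state machinery such as EOM-CC to be applied to metastable states.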
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
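Although the enhanced method targets a hardware verification language, the combinatorial core of classification-tree testing is easy to sketch (Python; the classifications and classes below are hypothetical, not from the article): each test case selects exactly one class from every classification, here enumerated as the full cartesian product:

    from itertools import product

    # Hypothetical classification tree for a cruise-control input domain
    classifications = {
        "speed": ["low", "nominal", "high"],
        "road_grade": ["downhill", "flat", "uphill"],
        "sensor_state": ["ok", "degraded"],
    }

    names = list(classifications)
    for combo in product(*classifications.values()):
        test_case = dict(zip(names, combo))  # one class per classification
        print(test_case)

CTM/ES additionally layers timing and analog-signal refinements on top of this combinatorial skeleton, and in practice the full product is pruned by coverage rules rather than enumerated exhaustively.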
Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells
2015-01-15
serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and...services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool...answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves
Offshore safety case approach and formal safety assessment of ships.
Wang, J
2002-01-01
Tragic marine and offshore accidents have caused serious consequences including loss of lives, loss of property, and damage of the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail with particular reference to the design aspects. The current practices and the latest development in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. The recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.
Semantically-Rigorous Systems Engineering Modeling Using SysML and OWL
NASA Technical Reports Server (NTRS)
Jenkins, J. Steven; Rouquette, Nicolas F.
2012-01-01
The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, along with auxiliary assets such as reasoners and application programming interface libraries. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (fUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, and conjunctive query answering. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.
NASA Astrophysics Data System (ADS)
Ayu Nurul Handayani, Hemas; Waspada, Indra
2018-05-01
Non-formal Early Childhood Education (non-formal ECE) serves children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the district government with the support of Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program to extend ECE to all villages in Indonesia, but the locations for constructing ECE schools in the coming years have not yet been determined. To support that program, a decision support system was built to recommend villages for ECE construction. The data are projected using Brown's double exponential smoothing, and the Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) generates the priority order. The system presents its recommendations as a map visualization, colored according to the priority level of each sub-district and village area. The system was tested with black-box testing, PROMETHEE testing, and usability testing. The results showed that the system functionality and the PROMETHEE algorithm worked properly, and that users were satisfied.
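For illustration, a minimal sketch of Brown's double exponential smoothing as used for the projection step (standard textbook recurrences; the series, alpha, and horizon below are invented placeholders, and the PROMETHEE ranking step is not shown):

    def brown_des(series, alpha, horizon):
        # Two exponential smoothing passes over the same series
        s1 = s2 = series[0]                      # a common initialization choice
        for x in series:
            s1 = alpha * x + (1 - alpha) * s1
            s2 = alpha * s1 + (1 - alpha) * s2
        level = 2 * s1 - s2                      # estimated level a_t
        trend = alpha / (1 - alpha) * (s1 - s2)  # estimated trend b_t
        return [level + m * trend for m in range(1, horizon + 1)]

    # e.g. yearly enrolment counts for one village, projected 3 years ahead
    print(brown_des([120, 132, 129, 141, 150], alpha=0.4, horizon=3))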
Generalizability of the Ordering among Five Formal Reasoning Tasks by an Ordering-Theoretic Method.
ERIC Educational Resources Information Center
Bart, William M.; And Others
1979-01-01
Five Inhelder-Piaget formal operations tasks were analyzed to determine the extent that the formal operational skills they assess were ordered into a stable hierarchy generalizable across samples of subjects. Subjects were 34 collegiate gymnasts (19 males, 15 females), and 22 students (1 male, 21 females) from a university nursing program.…
ERIC Educational Resources Information Center
Goldratt, Miri; Cohen, Eric H.
2016-01-01
This article explores encounters between formal, informal, and non-formal education and the role of mentor-educators in creating values education in which such encounters take place. Mixed-methods research was conducted in Israeli public schools participating in the Personal Education Model, which combines educational modes. Ethnographic and…
Formal Method of Description Supporting Portfolio Assessment
ERIC Educational Resources Information Center
Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou
2006-01-01
Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harber, K.S.
1993-05-01
This report contains the following papers: implications in vivid logic; a self-learning Bayesian expert system; a natural language generation system for a heterogeneous distributed database system; "competence-switching" managed by intelligent systems; strategy acquisition by an artificial neural network: experiments in learning to play a stochastic game; viewpoints and selective inheritance in object-oriented modeling; multivariate discretization of continuous attributes for machine learning; utilization of the case-based reasoning method to resolve dynamic problems; formalization of an ontology of ceramic science in CLASSIC; linguistic tools for intelligent systems; an application of rough sets in knowledge synthesis; and a relational model for imprecise queries. These papers have been indexed separately.
Logical Modeling and Dynamical Analysis of Cellular Networks
Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine
2016-01-01
The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of the mammalian cell cycle. PMID:27303434
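As a minimal, hedged illustration of one analysis this review surveys, determining the attractors of a synchronous logical model by exhaustive state-space traversal (the three-component update rules below are invented for the example and taken from no published model):

    from itertools import product

    def step(state):
        # Hypothetical synchronous update rules for a 3-node network
        x0, x1, x2 = state
        return (int(x1 and not x2), x0, int(x1 or x2))

    def attractors():
        found = set()
        for start in product((0, 1), repeat=3):
            seen, s = [], start
            while s not in seen:                 # iterate until a state repeats
                seen.append(s)
                s = step(s)
            cycle = tuple(seen[seen.index(s):])  # the recurrent part is an attractor
            k = cycle.index(min(cycle))          # rotate to a canonical form
            found.add(cycle[k:] + cycle[:k])
        return found

    print(attractors())   # fixed points appear as 1-cycles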
NASA Astrophysics Data System (ADS)
Fitzpatrick, Matthew R. C.; Kennett, Malcolm P.
2018-05-01
We develop a formalism that allows the study of correlations in space and time in both the superfluid and Mott insulating phases of the Bose-Hubbard model. Specifically, we obtain a two-particle irreducible effective action within the contour-time formalism that allows for both equilibrium and out-of-equilibrium phenomena. We derive equations of motion for both the superfluid order parameter and the two-point correlation functions. To assess the accuracy of this formalism, we study the equilibrium solution of the equations of motion and compare our results to existing strong-coupling methods as well as exact methods where possible. We discuss applications of this formalism to out-of-equilibrium situations.
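For context, the generic structure of a two-particle irreducible effective action (the standard Cornwall-Jackiw-Tomboulis form from general field-theory usage; the paper's contour-time version is a specialization, not quoted here):

    \Gamma[\phi, G] = S[\phi] + \frac{i}{2}\,\mathrm{Tr}\ln G^{-1}
                    + \frac{i}{2}\,\mathrm{Tr}\,G_0^{-1}[\phi]\,G + \Gamma_2[\phi, G],

where \Gamma_2 collects the two-particle irreducible vacuum diagrams; the stationarity conditions \delta\Gamma/\delta\phi = 0 and \delta\Gamma/\delta G = 0 then supply the coupled equations of motion for the order parameter and the two-point correlation functions.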
Overton, Edgar Turner; Kauwe, John S K; Paul, Robert; Tashima, Karen; Tate, David F; Patel, Pragna; Carpenter, Charles C J; Patty, David; Brooks, John T; Clifford, David B
2011-11-01
HIV-associated neurocognitive disorders remain prevalent but challenging to diagnose, particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. In a multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements from the brief computerized tool (P < 0.01). These data confirm previous correlation data with the computerized battery. Using the five significant parameters from the regression model in a receiver operating characteristic curve, 90% of persons were accurately classified as being cognitively impaired or not. The test battery requires additional evaluation, specifically for identifying persons with mild impairment, a stage at which interventions may be effective.
Mechanisms of Developmental Change in Infant Categorization
ERIC Educational Resources Information Center
Westermann, Gert; Mareschal, Denis
2012-01-01
Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…
Art to science: Tools for greater objectivity in resource monitoring
USDA-ARS?s Scientific Manuscript database
The earliest inventories of western US rangelands were “ocular” estimates. Now, objective data consistent with formal scientific inquiry is needed to support management decisions that sustain the resource while balancing numerous competing land uses and sometimes-vociferous stakeholders. Yet, the co...
Peer assisted learning as a formal instructional tool.
Naqi, Syed Asghar
2014-03-01
To explore the utility of peer-assisted learning (PAL) in medical schools as a formal instructional tool. Grounded theory approach. King Edward Medical University, Lahore, from July 2011 to December 2011. A study was designed using semi-structured in-depth interviews to collect data from final year medical students (n=6), residents (n=4) and faculty members (n=3), selected on the basis of non-probability purposive sampling. The qualitative data thus generated were first translated into English, transcribed, and organized into major categories using a coding framework. Participants were interviewed two more times to further explore their perceptions and experiences related to emergent categories. An iterative process was employed using grounded theory analysis to eventually generate theory. PAL was perceived as rewarding in terms of fostering higher-order thinking, effective teaching skills and improved self-efficacy among learners. PAL can offer learning opportunities to medical students, residents and faculty members, and can improve the depth of their knowledge and skills.
A knowledge based software engineering environment testbed
NASA Technical Reports Server (NTRS)
Gill, C.; Reedy, A.; Baker, L.
1985-01-01
The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.
Formal methods demonstration project for space applications
NASA Technical Reports Server (NTRS)
Divito, Ben L.
1995-01-01
The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CRs), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered for illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data are shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
Johansen, E; Moore, Z; van Etten, M; Strapp, H
2014-07-01
To explore similarities and differences in nurses' views on risk assessment practices and preventive care activities in a context where patients' risk of developing pressure ulcers is assessed using clinical judgment (Norway) and a context where it is assessed using a formal structured risk assessment combined with clinical judgment (Ireland). A descriptive, qualitative design was employed across two different care settings with a total of 14 health care workers, nine from Norway and five from Ireland. Regardless of whether risk assessment was undertaken using clinical judgment or formal structured risk assessment, the identified risk factors, at-risk patients and appropriate preventive initiatives discussed by participants were similar across care settings. Furthermore, risk assessment did not necessarily result in the planning and implementation of appropriate pressure ulcer prevention initiatives. Thus, in this instance, use of a formal risk assessment tool does not seem to make any difference to the planning, initiation and evaluation of pressure ulcer prevention strategies. Regardless of the method of risk assessment, patients at risk of developing pressure ulcers are detected, suggesting that the practice of risk assessment should be re-evaluated. Moreover, appropriate preventive interventions were described. However, the missing link between risk assessment and documented care planning is of concern, and barriers to appropriate pressure ulcer documentation should be explored further. This work is partly funded by a research grant from the Norwegian Nurses Organisation (NNO) (Norsk Sykepleierforbund NSF) in 2012. The authors have no conflict of interest to declare.
Lost in the chaos: Flawed literature should not generate new disorders.
Van Rooij, Antonius J; Kardefelt-Winther, Daniel
2017-06-01
The paper by Kuss, Griffiths, and Pontes (2016) titled "Chaos and confusion in DSM-5 diagnosis of Internet Gaming Disorder: Issues, concerns, and recommendations for clarity in the field" examines issues relating to the concept of Internet Gaming Disorder. We agree that there are serious issues and extend their arguments by suggesting that the field lacks basic theory, definitions, patient research, and properly validated and standardized assessment tools. As most studies derive data from survey research in functional populations, they exclude people with severe functional impairment and provide only limited information on the hypothesized disorder. Yet findings from such studies are widely used and often exaggerated, leading many to believe that we know more about the problem behavior than we do. We further argue that video game play is associated with several benefits and that formalizing this popular hobby as a psychiatric disorder is not without risks. It might undermine children's right to play or encourage repressive treatment programs, which ultimately threaten children's right to protection against violence. While Kuss et al. (2016) express support for the formal implementation of a disorder, we argue that before we have a proper evidence base, a sound theory, and validated assessment tools, it is irresponsible to support a formal category of disorder and doing so would solidify a confirmatory approach to research in this area.
Applications of a formal approach to decipher discrete genetic networks.
Corblin, Fabien; Fanchon, Eric; Trilling, Laurent
2010-07-20
A growing demand for tools to assist the building and analysis of biological networks exists in systems biology. We argue that the use of a formal approach is relevant and applicable to address questions raised by biologists about such networks. The behaviour of these systems being complex, it is essential to exploit efficiently every bit of experimental information. In our approach, both the evolution rules and the partial knowledge about the structure and the behaviour of the network are formalized using a common constraint-based language. In this article our formal and declarative approach is applied to three biological applications. The software environment that we developed allows each application to be specifically addressed through a new class of biologically relevant queries. We show that the partial knowledge about a genetic network can be described easily and in a formal manner. Moreover, we show that this environment, based on a constraint algorithmic approach, offers a wide variety of functionalities going beyond simple simulation, such as proof of consistency, model revision, prediction of properties, and search for models that are minimal with respect to specified criteria. The formal approach proposed here deeply changes the way genetic and biochemical networks are explored, first by avoiding the usual trial-and-error procedure, and second by placing the emphasis on sets of solutions rather than a single solution arbitrarily chosen among many others. Finally, the constraint approach promotes the integration of model and experimental data in a single framework.
Unmanned Aircraft Systems in the National Airspace System: A Formal Methods Perspective
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Dutle, Aaron; Narkawicz, Anthony; Upchurch, Jason
2016-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) have grown, so too have international efforts to integrate UAS into civil airspace. However, one of the major concerns that must be addressed in realizing this integration is that of safety. For example, UAS lack an on-board pilot to comply with the legal requirement that pilots see and avoid other aircraft. This requirement has motivated the development of a detect and avoid (DAA) capability for UAS that provides situational awareness and maneuver guidance to UAS operators to aid them in avoiding and remaining well clear of other aircraft in the airspace. The NASA Langley Research Center Formal Methods group has played a fundamental role in the development of this capability. This article gives a selected survey of the formal methods work conducted in support of the development of a DAA concept for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations.
Memory sparing, fast scattering formalism for rigorous diffraction modeling
NASA Astrophysics Data System (ADS)
Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.
2017-07-01
The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
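A hedged sketch of the generic idea behind reformulating a multiple-reflection series into an implicit equation solved iteratively (the operator K below is a random stand-in, not the paper's scattering operators): instead of summing x = b + Kb + K²b + ..., one solves (I - K)x = b with an iterative method, which the paper reports improves convergence.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6
    K = 0.1 * rng.standard_normal((n, n))   # stand-in "reflection" operator, spectral radius < 1
    b = rng.standard_normal(n)

    # Explicit multiple-reflection (Neumann) series: x = b + K b + K^2 b + ...
    x_series, term = np.zeros(n), b.copy()
    for _ in range(50):
        x_series += term
        term = K @ term

    # Implicit reformulation: solve (I - K) x = b, here by simple fixed-point iteration
    x = np.zeros(n)
    for _ in range(50):
        x = b + K @ x                        # x_{k+1} = b + K x_k

    print(np.allclose(x, x_series, atol=1e-10))
    print(np.allclose(x, np.linalg.solve(np.eye(n) - K, b)))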
Egidi, Franco; Sun, Shichao; Goings, Joshua J; Scalmani, Giovanni; Frisch, Michael J; Li, Xiaosong
2017-06-13
We present a linear response formalism for the description of the electronic excitations of a noncollinear reference defined via Kohn-Sham spin density functional methods. A set of auxiliary variables, defined using the density and noncollinear magnetization density vector, allows the generalization of spin density functional kernels commonly used in collinear DFT to noncollinear cases, including local density, GGA, meta-GGA and hybrid functionals. Working equations and derivations of functional second derivatives with respect to the noncollinear density, required in the linear response noncollinear TDDFT formalism, are presented in this work. This formalism takes all components of the spin magnetization into account independent of the type of reference state (open or closed shell). As a result, the method introduced here is able to afford a nonzero local xc torque on the spin magnetization while still satisfying the zero-torque theorem globally. The formalism is applied to a few test cases using the variational exact-two-component reference including spin-orbit coupling to illustrate the capabilities of the method.
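For orientation, the generic linear-response (Casida-type) eigenvalue problem that such a kernel enters; this is the standard textbook form rather than this paper's noncollinear working equations:

    \begin{pmatrix} A & B \\ B^* & A^* \end{pmatrix}
    \begin{pmatrix} X \\ Y \end{pmatrix}
    = \omega
    \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}
    \begin{pmatrix} X \\ Y \end{pmatrix},
    \qquad
    A_{ia,jb} = \delta_{ij}\delta_{ab}(\varepsilon_a - \varepsilon_i) + K_{ia,jb},
    \quad
    B_{ia,jb} = K_{ia,bj},

where the coupling matrix K contains the Coulomb term plus the exchange-correlation kernel, i.e. the functional second derivatives with respect to the density variables that the noncollinear formalism generalizes.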
A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation
NASA Astrophysics Data System (ADS)
Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui
Temporality and uncertainty are important features of many real-world systems. Solving problems in such systems requires formal mechanisms such as logic systems, statistical methods, or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism that manages both features concurrently, using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows user queries to be answered. A simple but realistic scenario in a smart home application is used to illustrate our work.
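A minimal, hedged illustration of backward reasoning of the kind such an algorithm performs (plain propositional backward chaining only; the truth-value lattice and temporal annotations of the actual formalism are omitted, and the rules and facts are invented):

    # rules: conclusion -> list of alternative premise sets (hypothetical example data)
    RULES = {
        "alarm": [["smoke", "night"], ["intruder"]],
        "smoke": [["sensor_s1"]],
    }
    FACTS = {"sensor_s1", "night"}

    def prove(goal, seen=frozenset()):
        """Backward chaining: a goal holds if it is a fact, or if all premises
        of some rule for it can themselves be proven (loops are cut via `seen`)."""
        if goal in FACTS:
            return True
        if goal in seen:
            return False
        return any(all(prove(p, seen | {goal}) for p in premises)
                   for premises in RULES.get(goal, []))

    print(prove("alarm"))   # True, via smoke & night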
Bespalova, Nadejda; Morgan, Juliet; Coverdale, John
2016-02-01
Because training residents and faculty to identify human trafficking victims is a major public health priority, the authors review existing assessment tools. PubMed and Google were searched using combinations of search terms including human, trafficking, sex, labor, screening, identification, and tool. Nine screening tools that met the inclusion criteria were found. They varied greatly in length, format, target demographic, supporting resources, and other parameters. Only two tools were designed specifically for healthcare providers. Only one tool was formally assessed to be valid and reliable in a pilot project in trafficking victim service organizations, although it has not been validated in the healthcare setting. This toolbox should facilitate the education of resident physicians and faculty in screening for trafficking victims, assist educators in assessing screening skills, and promote future research on the identification of trafficking victims.
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability analysis of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
Salvador-Carulla, L; Lukersmith, S; Sullivan, W
2017-04-01
Guideline methods to develop recommendations dedicate most effort to organising discovery and corroboration knowledge following the evidence-based medicine (EBM) framework. Guidelines typically use a single dimension of information, and generally discard contextual evidence, formal expert knowledge and consumers' experiences in the process. In recognition of the limitations of guidelines in complex cases, complex interventions and systems research, there has been significant effort to develop new tools, guides, resources and structures to use alongside EBM methods of guideline development. In addition to these advances, a new framework based on the philosophy of science is required. Guidelines should be defined as implementation decision support tools for improving the decision-making process in real-world practice, and not only as a procedure to optimise the knowledge base of scientific discovery and corroboration. A shift from the EBM pyramid of corroboration of evidence to a broader multi-domain perspective, graphically depicted as a 'Greek temple', could be considered. This model takes into account the different stages of scientific knowledge (discovery, corroboration and implementation); the sources of knowledge relevant to guideline development (experimental, observational, contextual, expert-based and experiential); their underlying inference mechanisms (deduction, induction, abduction, means-end inferences); and a more precise definition of evidence and related terms. The applicability of this broader approach is presented for the development of the Canadian Consensus Guidelines for the Primary Care of People with Developmental Disabilities.
The East Anglian specialist registrar assessment tool
Robinson, Susan; Boursicot, Katharine; Hayhurst, Catherine
2007-01-01
Background In our region, it was acknowledged that the process of assessment needed to be improved, but before developing a system for this, there was a need to define the “competent or satisfactory trainee”. Objective To outline the process by which a consensus was achieved on this standard, and how a system for formally assessing competency across a wide range of knowledge, skills and attitudes was subsequently agreed on, thus enabling increased opportunities for training and feedback and improving the accuracy of assessment in the region. Methods The opinions of trainees and trainers from across the region were collated, and a consensus was achieved with regard to the minimum acceptable standard for a trainee in emergency medicine, thus defining a competent trainee. The group that set the standard then focused on identifying the assessment methods most appropriate for the evaluation of the knowledge, skills and attitudes required of an emergency medicine trainee. The tool was subsequently trialled for a period of 6 months, and opinion was evaluated by use of a questionnaire. Results The use of the tool was reviewed from both the trainers' and trainees' perspectives. 42% (n = 11) of trainers and 31% (n = 8) of trainees responded to the questionnaire. In the region, there were 26 trainers and 26 trainees. Five trainees and nine trainers had used the tool. 93% (14/15) of respondents thought that the descriptors used to describe the satisfactory trainee were acceptable; 89% (8/9) of trainers thought that it helped them assess trainees more accurately. 60% (3/5) of trainees thought that, as a result, they had a better understanding of their weak areas. Conclusion We believe that we achieved a consensus across our region as to what defined a satisfactory trainee and set the standard against which all our trainees would subsequently be evaluated. The use of this tool to assess trainees during the pilot period was disappointing; however, we were encouraged that most of those using the tool thought that it allowed an objective assessment of trainees and feedback on areas requiring further work. Those who used the tool identified important reasons that may have hindered widespread use of the assessment tool. PMID:17351222
ERIC Educational Resources Information Center
Ugwu, Chinwe U.
2015-01-01
The National Commission for Mass Literacy, Adult and Non-Formal Education (NMEC) is the Federal Statutory Agency set up to co-ordinate all aspects of Non-Formal Education in Nigeria whether offered by government agencies or non-governmental organisations. This study looked at the existing Capacity Building Programme, the delivery methods, impact…
ERIC Educational Resources Information Center
Penning, Margaret J.
2002-01-01
Purpose: In response to concerns among policymakers and others that increases in the availability of publicly funded formal services will lead to reductions in self- and informal care, this study examines the relationship between the extent of formal in-home care received and levels of self- and informal care. Design and Methods: Two-stage least…
The Influence of Rural Location on Utilization of Formal Home Care: The Role of Medicaid
ERIC Educational Resources Information Center
McAuley, William J.; Spector, William D.; Van Nostrand, Joan; Shaffer, Tom
2004-01-01
Purpose: This research examines the impact of rural-urban residence on formal home-care utilization among older people and determines whether and how Medicaid coverage influences the association between, rural-urban location and risk of formal home-care use. Design and Methods: We combined data from the 1998 consolidated file of the Medical…
Phasic dopamine signals: from subjective reward value to formal economic utility
Schultz, Wolfram; Carelli, Regina M; Wightman, R Mark
2015-01-01
Although rewards are physical stimuli and objects, their value for survival and reproduction is subjective. The phasic, neurophysiological and voltammetric dopamine reward prediction error response signals subjective reward value. The signal incorporates crucial reward aspects such as amount, probability, type, risk, delay and effort. Differences in dopamine release dynamics with temporal delay and effort in rodents may derive from methodological issues and require further study. Recent designs using concepts and behavioral tools from experimental economics make it possible to formally characterize the subjective value signal as economic utility and thus to establish a neuronal value function. With these properties, the dopamine response constitutes a utility prediction error signal. PMID:26719853
Popplow, Marcus
2015-12-01
Recent critical approaches to what has conventionally been described as "scientific" and "technical" knowledge in early modern Europe have provided a wealth of new insights. So far, the various analytical concepts suggested by these studies have not yet been comprehensively discussed. The present essay argues that such comprehensive approaches might prove of special value for long-term and cross-cultural reflections on technology-related knowledge. As heuristic tools, the notions of "formalization" and "interaction" are proposed as part of alternative narratives to those highlighting the emergence of "science" as the most relevant development for technology-related knowledge in early modern Europe.
The Units Ontology: a tool for integrating units of measurement in science
Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert
2012-01-01
Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization caters for the reporting, exchange, processing, reproducibility and integration of quantitative measurements. Ontologies are means that facilitate the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurement. PMID:23060432
ERIC Educational Resources Information Center
Johnson, Christopher W.
1996-01-01
The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications for today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.
Griffin: A Tool for Symbolic Inference of Synchronous Boolean Molecular Networks.
Muñoz, Stalin; Carrillo, Miguel; Azpeitia, Eugenio; Rosenblueth, David A
2018-01-01
Boolean networks are important models of biochemical systems, located at the high end of the abstraction spectrum. A number of Boolean gene networks have been inferred following essentially the same method. Such a method first considers experimental data for a typically underdetermined "regulation" graph. Next, Boolean networks are inferred by using biological constraints to narrow the search space, such as a desired set of (fixed-point or cyclic) attractors. We describe Griffin, a computer tool enhancing this method. Griffin incorporates a number of well-established algorithms, such as Dubrova and Teslenko's algorithm for finding attractors in synchronous Boolean networks. In addition, a formal definition of regulation allows Griffin to employ "symbolic" techniques, able to represent both large sets of network states and Boolean constraints. We observe that when the set of attractors is required to be an exact set, prohibiting additional attractors, a naive Boolean coding of this constraint may be unfeasible. Such cases may be intractable even with symbolic methods, as the number of Boolean constraints may be astronomically large. To overcome this problem, we employ an Artificial Intelligence technique known as "clause learning", considerably increasing Griffin's scalability. Without clause learning only toy examples prohibiting additional attractors are solvable: only one out of seven queries reported here is answered. With clause learning, by contrast, all seven queries are answered. We illustrate Griffin with three case studies drawn from the Arabidopsis thaliana literature. Griffin is available at: http://turing.iimas.unam.mx/griffin.
Akins, Ralitsa B.; Handal, Gilbert A.
2009-01-01
Objective Although there is an expectation for outcomes-oriented training in residency programs, the reality is that few guidelines and examples exist as to how to provide this type of education and training. We aimed to improve patient care outcomes in our pediatric residency program by using quality improvement (QI) methods, tools, and approaches. Methods A series of QI projects were implemented over a 3-year period in a pediatric residency program to improve patient care outcomes and teach the residents how to use QI methods, tools, and approaches. Residents experienced practice-based learning and systems-based assessment through group projects and review of their own patient outcomes. Resident QI experiences were reviewed quarterly by the program director and were a mandatory part of resident training portfolios. Results Using QI methodology, we were able to improve management of children with obesity, to achieve high compliance with the national patient safety goals, improve the pediatric hotline service, and implement better patient flow in resident continuity clinic. Conclusion Based on our experiences, we conclude that to successfully implement QI projects in residency programs, QI techniques must be formally taught, the opportunities for resident participation must be multiple and diverse, and QI outcomes should be incorporated in resident training and assessment so that they experience the benefits of the QI intervention. The lessons learned from our experiences, as well as the projects we describe, can be easily deployed and implemented in other residency programs. PMID:21975995
Young, Helen T M; Carr, Norman J; Green, Bryan; Tilley, Charles; Bhargava, Vidhi; Pearce, Neil
2013-08-01
To compare the accuracy of eyeball estimates of the Ki-67 proliferation index (PI) with formal counting of 2000 cells as recommended by the Royal College of Pathologists. Sections from gastroenteropancreatic neuroendocrine tumours were immunostained for Ki-67. PI was calculated using three methods: (1) a manual tally count of 2000 cells from the area of highest nuclear labelling using a microscope eyepiece graticule; (2) eyeball estimates made by four pathologists within the same area of highest nuclear labelling; and (3) image analysis of microscope photographs taken from this area using the ImageJ 'cell counter' tool. ImageJ analysis was considered the gold standard for comparison. Levels of agreement between methods were evaluated using Bland-Altman plots. Agreement between the manual tally and ImageJ assessments was very high at low PIs. Agreement between eyeball assessments and ImageJ analysis varied between pathologists. Where data for low PIs alone were analysed, there was a moderate level of agreement between pathologists' estimates and the gold standard, but when all data were included, agreement was poor. Manual tally counts of 2000 cells exhibited similar levels of accuracy to the gold standard, especially at low PIs. Eyeball estimates were significantly less accurate than the gold standard. This suggests that tumour grades may be misclassified by eyeballing and that formal tally counting of positive cells produces more reliable results. Further studies are needed to identify accurate, clinically appropriate ways of calculating the Ki-67 PI.
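A hedged sketch of the agreement analysis named here (generic Bland-Altman bias and limits of agreement; the paired PI arrays are invented, not study data):

    import numpy as np

    # Paired PI measurements (%) from two methods -- hypothetical values
    eyeball = np.array([2.0, 5.0, 12.0, 20.0, 35.0])
    imagej  = np.array([1.5, 6.0, 10.0, 26.0, 30.0])

    diff = eyeball - imagej
    bias = diff.mean()                      # mean difference between methods
    loa  = 1.96 * diff.std(ddof=1)          # 95% limits of agreement around the bias
    print(f"bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")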
Practitioner review: the assessment of language pragmatics.
Adams, Catherine
2002-11-01
The assessment of pragmatics expressed in spoken language is a central issue in the evaluation of children with communication impairments and related disorders. A developmental approach to assessment has remained problematic due to the complex interaction of social, linguistic, cognitive and cultural influences on pragmatics. A selective review and critique of current formal and informal testing methods and pragmatic analytic procedures. Formal testing of pragmatics has limited potential to reveal the typical pragmatic abnormalities in interaction but has a significant role to play in the assessment of comprehension of pragmatic intent. Clinical assessment of pragmatics with the pre-school child should focus on elicitation of communicative intent via naturalistic methods as part of an overall assessment of social communication skills. Assessments for older children should include a comprehensive investigation of speech acts, conversational and narrative abilities, the understanding of implicature and intent as well as the child's ability to employ contextual cues to understanding. Practical recommendations are made regarding the choice of a core set of pragmatic assessments and elicitation techniques. The practitioner's attention is drawn to the lack of the usual safeguards of reliability and validity that have persisted in some language pragmatics assessments. A core set of pragmatic assessment tools can be identified from the proliferation of instruments in current use. Further research is required to establish clearer norms and ranges in the development of pragmatic ability, particularly with respect to the understanding of inference, topic management and coherence.
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
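A hedged sketch of the additive multi-attribute utility scoring at the core of such a framework (the criteria, weights, and scores below are invented placeholders, not the study's elicited values):

    # Each data stream scored 0-1 on each criterion (hypothetical numbers)
    streams = {
        "ER visit reports":  {"timeliness": 0.9, "coverage": 0.6, "cost": 0.4},
        "Lab confirmations": {"timeliness": 0.3, "coverage": 0.8, "cost": 0.7},
    }
    weights = {"timeliness": 0.5, "coverage": 0.3, "cost": 0.2}  # elicited from experts

    def utility(scores):
        # Additive MAUT: overall utility is the weighted sum of criterion scores
        return sum(weights[c] * s for c, s in scores.items())

    ranked = sorted(streams, key=lambda name: utility(streams[name]), reverse=True)
    print(ranked)   # data streams in descending order of overall utility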
The Consortium for Site Characterization Technology (CSCT) has established a formal program to accelerate acceptance and application of innovative monitoring and site characterization technologies that improve the way the nation manages its environmental problems. In 1995 the CS...
DOT National Transportation Integrated Search
2011-12-01
Several agencies are applying asset management principles as a business tool and paradigm to help them define goals and prioritize agency resources in decision making. Previously, transportation asset management (TAM) has focused more on big ticke...
Defense Partnerships: Documenting Trends and Emerging Topics for Action
2015-03-01
between the Air Force Research Lab and Antelope Valley College (AVC) results in increases in number of scientists, engineers, and technicians from...guiding document, tool, or resource should address best practices for project valuation, what types of formalized arrangements are acceptable, and
Technology Focus: Enhancing Conceptual Knowledge of Linear Programming with a Flash Tool
ERIC Educational Resources Information Center
Garofalo, Joe; Cory, Beth
2007-01-01
Mathematical knowledge can be categorized in different ways. One commonly used way is to distinguish between procedural mathematical knowledge and conceptual mathematical knowledge. Procedural knowledge of mathematics refers to formal language, symbols, algorithms, and rules. Conceptual knowledge is essential for meaningful understanding of…
Mapping benefits as a tool for natural resource management in estuarine watersheds
Natural resource managers are often called upon to justify the value of protecting or restoring natural capital based on its perceived benefit to stakeholders. This usually takes the form of formal valuation exercises (i.e., ancillary costs) of a resource without consideration f...
ERIC Educational Resources Information Center
Cannon, Kama
2018-01-01
Although formal papers are typical, sometimes posters or other visual presentations are more useful tools for sharing visual-spatial information. By incorporating creativity and technology into the study of geographical science, STEM (the study of Science, Technology Engineering, and Mathematics) is changed to STEAM (the A stands for ART)! The…
McCarter, Joe; Gavin, Michael C
2011-11-23
The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences.
NASA Astrophysics Data System (ADS)
Dupret, M.-A.; De Ridder, J.; De Cat, P.; Aerts, C.; Scuflaire, R.; Noels, A.; Thoul, A.
2003-02-01
We present an improved version of the method of photometric mode identification of Heynderickx et al. Our new version is based on the inclusion of precise non-adiabatic eigenfunctions determined in the outer stellar atmosphere according to the formalism recently proposed by Dupret et al. Our improved photometric mode identification technique is therefore no longer dependent on ad hoc parameters for the non-adiabatic effects. It contains the complete physical conditions of the outer atmosphere of the star, provided that rotation does not play a key role. We apply our method to the two slowly pulsating B stars HD 74560 and HD 138764 and to the beta Cephei star EN (16) Lac. Besides identifying the degree l of the pulsation modes, our method is also a tool for improving the knowledge of stellar interiors and atmospheres, by imposing constraints on parameters such as the metallicity and the mixing-length parameter alpha (a procedure we label non-adiabatic asteroseismology). The non-adiabatic eigenfunctions needed for the mode identification are available upon request from the authors.
Bugeza, James; Kankya, Clovice; Muleme, James; Akandinda, Ann; Sserugga, Joseph; Nantima, Noelina; Okori, Edward; Odoch, Terence
2017-01-01
An evaluation exercise was carried out to assess the performance of Community Animal Health Workers (CAHWs) in the delivery of animal health care services in Karamoja region, identify capacity gaps and recommend remedial measures. Participatory methods were used to design data collection tools. Questionnaires were administered to 204 CAHWs, 215 farmers and 7 District Veterinary Officers (DVOs) to collect quantitative data. Seven DVOs and one Non-Governmental Organization (NGO) representative were interviewed as key informants, and one focus group discussion was conducted with a farmer group in Nakapiripirit to collect qualitative data. Questionnaire data were analyzed using SPSS version 19. Key messages from interviews and the focus group discussion were recorded in a notebook and reported verbatim. 70% of the farmers revealed that CAHWs are the most readily available animal health care service providers in their respective villages. CAHWs were instrumental in treatment of sick animals, disease surveillance, control of external parasites, animal production, vaccination, reporting, animal identification, and performing minor surgeries. Regarding their overall performance, 88.8% (191/215) of the farmers said they were impressed. The main challenges faced by the CAHWs were inadequate facilitation, lack of tools and equipment, unwillingness of government to integrate them into the formal extension system, poor information flow, limited technical capacity to diagnose diseases, unwillingness of farmers to pay for services, and sustainability issues. CAHWs remain the main source of animal health care services in Karamoja region and their services are largely satisfactory. The technical deficits identified require continuous capacity building programs, close supervision and technical backstopping. For sustainability of animal health care services in the region, continuous training and the strategic deployment of paraprofessionals formally recognised by the traditional civil service to gradually replace CAHWs are recommended.
Miskowiak, K W; Burdick, K E; Martinez-Aran, A; Bonnin, C M; Bowie, C R; Carvalho, A F; Gallagher, P; Lafer, B; López-Jaramillo, C; Sumiyoshi, T; McIntyre, R S; Schaffer, A; Porter, R J; Purdon, S; Torres, I J; Yatham, L N; Young, A H; Kessing, L V; Vieta, E
2018-05-01
Cognition is a new treatment target to aid functional recovery and enhance quality of life for patients with bipolar disorder. The International Society for Bipolar Disorders (ISBD) Targeting Cognition Task Force aimed to develop consensus-based clinical recommendations on whether, when and how to assess and address cognitive impairment. The task force, consisting of 19 international experts from nine countries, discussed the challenges and recommendations in a face-to-face meeting, telephone conference call and email exchanges. Consensus-based recommendations were achieved through these exchanges with no need for formal consensus methods. The identified questions were: (I) Should cognitive screening assessments be routinely conducted in clinical settings? (II) What are the most feasible screening tools? (III) What are the implications if cognitive impairment is detected? (IV) What are the treatment perspectives? Key recommendations are that clinicians: (I) formally screen cognition in partially or fully remitted patients whenever possible, (II) use brief, easy-to-administer tools such as the Screen for Cognitive Impairment in Psychiatry and Cognitive Complaints in Bipolar Disorder Rating Assessment, and (III) evaluate the impact of medication and comorbidity, refer patients for comprehensive neuropsychological evaluation when clinically indicated, and encourage patients to build cognitive reserve. Regarding question (IV), there is limited evidence for current evidence-based treatments but intense research efforts are underway to identify new pharmacological and/or psychological cognition treatments. This task force paper provides the first consensus-based recommendations for clinicians on whether, when, and how to assess and address cognition, which may aid patients' functional recovery and improve their quality of life. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Patel, Ashish D; Tan, Mary K; Angaran, Paul; Bell, Alan D; Berall, Murray; Bucci, Claudia; Demchuk, Andrew M; Essebag, Vidal; Goldin, Lianne; Green, Martin S; Gregoire, Jean C; Gross, Peter L; Heilbron, Brett; Lin, Peter J; Ramanathan, Krishnan; Skanes, Allan; Wheeler, Bruce H; Goodman, Shaun G
2015-03-01
The objectives of this national chart audit (January to June 2013) of 6,346 patients with atrial fibrillation (AF; ≥18 years without a significant heart valve disorder) from 647 primary care physicians were to (1) describe the frequency of stroke and bleed risk assessments in patients with nonvalvular AF by primary care physicians, including the accuracy of these assessments relative to established predictive indexes; (2) outline contemporary methods of anticoagulation used; and (3) report the time in the therapeutic range among patients prescribed warfarin. An annual stroke risk assessment was not undertaken in 15% of patients and was estimated without a formal risk tool in 33%; agreement with the CHADS2 score estimation was seen in 87% of patients. A major bleeding risk assessment was not undertaken in 25% and was estimated without a formal risk tool in 47%; agreement with the HAS-BLED score estimation was observed in 64%, with physician overestimation in 26% of patients. Antithrombotic therapy included warfarin (58%), dabigatran (22%), rivaroxaban (14%), and apixaban (<1%). Among warfarin-treated patients, the median international normalized ratio was 2.4 and the time in therapeutic range (TTR) was 73%; however, the TTR was <50% in 845 (25%), 50% to 69% in 674 (20%), and ≥70% in 1,827 (55%) patients. In conclusion, we describe a contemporary real-world elderly population with AF at substantial risk for stroke. There is apparent overestimation of bleeding risk in many patients. Warfarin was the dominant stroke prevention treatment; however, the suggested TTR target was achieved in only 55% of these patients. Copyright © 2015 Elsevier Inc. All rights reserved.
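As context for the agreement figures above, both indexes are simple additive scores. The sketch below shows how they are conventionally tallied; the function and argument names are ours, not from the audit, and the weights follow the published CHADS2 and HAS-BLED definitions.

```python
# Illustrative tally of the two risk indexes named in the audit.
# Function/field names are invented for this sketch; weights follow the
# published CHADS2 and HAS-BLED schemes.

def chads2(chf, hypertension, age, diabetes, prior_stroke_tia):
    """CHADS2 stroke risk: 1 point each for CHF, hypertension, age >= 75,
    and diabetes; 2 points for prior stroke/TIA (range 0-6)."""
    return sum([chf, hypertension, age >= 75, diabetes]) + 2 * prior_stroke_tia

def has_bled(hypertension, abnormal_renal, abnormal_liver, stroke,
             bleeding_history, labile_inr, age, drugs, alcohol):
    """HAS-BLED bleeding risk: 1 point per factor present (range 0-9);
    'elderly' is age > 65, and the renal/liver and drugs/alcohol pairs
    can each contribute up to 2 points combined."""
    return sum([hypertension, abnormal_renal, abnormal_liver, stroke,
                bleeding_history, labile_inr, age > 65, drugs, alcohol])

# Example: a 78-year-old with hypertension and a prior TIA scores CHADS2 = 4.
print(chads2(chf=False, hypertension=True, age=78,
             diabetes=False, prior_stroke_tia=True))
```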
Staton, Lisa J; Kraemer, Suzanne M; Patel, Sangnya; Talente, Gregg M; Estrada, Carlos A
2007-07-27
The Accreditation Council on Graduate Medical Education (ACGME) supports chart audit as a method to track competency in Practice-Based Learning and Improvement. We examined whether peer chart audits performed by internal medicine residents were associated with improved documentation of foot care in patients with diabetes mellitus. A retrospective electronic chart review was performed on 347 patients with diabetes mellitus cared for by internal medicine residents in a university-based continuity clinic from May 2003 to September 2004. Residents abstracted information pertaining to documentation of foot examinations (neurological, vascular, and skin) from the charts of patients followed by their physician peers. No formal feedback or education was provided. Significant improvement in the documentation of foot exams was observed over the course of the study. The percentage of patients receiving neurological, vascular, and skin exams increased by 20 percentage points (from 13% to 33%; p = 0.001), 26 percentage points (from 45% to 71%; p < 0.001), and 18 percentage points (from 51% to 72%; p = 0.005), respectively. Similarly, the proportion of patients receiving a well-documented exam including all three components (neurological, vascular, and skin) increased over time (from 6% to 24%, p < 0.001). Peer chart audits performed by residents in the absence of formal feedback were associated with improved documentation of the foot exam in patients with diabetes mellitus. Although this study suggests that peer chart audits may be an effective tool to improve practice-based learning and documentation of foot care in diabetic patients, evaluating the actual performance of clinical care was beyond the scope of this study and would be better addressed by a randomized controlled trial.
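The abstract reports p-values for before/after comparisons of documentation rates. A standard way to test such a difference is a two-proportion z-test; the sketch below re-checks the first comparison under invented group sizes, since the abstract does not give the per-period denominators.

```python
# Hypothetical re-check of one comparison (neurological exam documentation
# rising from ~13% to ~33%). Counts below are assumed for illustration;
# the study audited 347 charts in total but the split is not reported here.
from statsmodels.stats.proportion import proportions_ztest

documented = [22, 58]    # charts with a documented neurological exam (assumed)
audited = [170, 177]     # charts audited in each period (assumed)
stat, pval = proportions_ztest(count=documented, nobs=audited)
print(round(pval, 4))    # a small p-value is consistent with the reported p = 0.001
```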
A review of creative and expressive writing as a pedagogical tool in medical education.
Cowen, Virginia S; Kaufman, Diane; Schoenherr, Lisa
2016-03-01
The act of writing offers an opportunity to foster self-expression and organisational abilities, along with observation and descriptive skills. These soft skills are relevant to clinical thinking and medical practice. Medical school curricula employ pedagogical approaches suitable for assessing medical and clinical knowledge, but teaching methods for soft skills in critical thinking, listening, and verbal expression, which are important in patient communication and engagement, may be less formal. Creative and expressive writing that is incorporated into medical school courses or clerkships offers a vehicle for medical students to develop soft skills. The aim of this review was to explore creative and expressive writing as a pedagogical tool in medical schools in relation to outcomes of medical education. This project employed a scoping review approach to gather, evaluate, and synthesise reports on the use of creative and expressive writing in US medical education. Ten databases were searched for scholarly articles reporting on creative or expressive writing during medical school. Limiting the results to activities associated with US medical schools produced 91 articles. A thematic analysis of the articles was conducted to identify how writing was incorporated into the curriculum. Enthusiasm for writing as a pedagogical tool was identified in 28 editorials and overviews. Quasi-experimental, mixed-methods, and qualitative studies primarily described writing activities aimed at helping students cognitively or emotionally process difficult challenges in medical education, develop a personal identity, or reflect on interpersonal skills. The programmes and interventions using creative or expressive writing were largely associated with elective courses or clerkships, not required courses. Writing was identified as a potentially relevant pedagogical tool, but not included as an essential component of medical school curricula. © 2016 John Wiley & Sons Ltd.
Yang, Nathan; Hosseini, Sarah; Mascarella, Marco A; Young, Meredith; Posel, Nancy; Fung, Kevin; Nguyen, Lily H P
2017-05-25
Learners often utilize online resources to supplement formalized curricula, and to appropriately support learning, these resources should be of high quality. Thus, the objectives of this study were to develop and provide validity evidence supporting an assessment tool designed to assess the quality of educational websites in Otolaryngology-Head & Neck Surgery (ORL-HNS), and to identify websites that could support effective web-based learning. After a literature review, the Modified Education in Otolaryngology Website (MEOW) assessment tool was designed by a panel of experts, based on a previously validated website assessment tool. A search strategy using a Google-based search engine was subsequently used to identify websites; those that were free of charge and in English were included. Websites were coded according to whether their content targeted medical students or residents. Using the MEOW assessment tool, two independent raters scored the websites. Inter-rater and intra-rater reliability were evaluated, and scores were compared with recommendations from a content expert. The MEOW assessment tool included a total of 20 items divided into 8 categories related to authorship, frequency of revision, content accuracy, interactivity, visual presentation, navigability, speed, and recommended hyperlinks. A total of 43 of the 334 websites identified by the search met the inclusion criteria. The scores generated by the tool appeared to differentiate higher-quality websites from lower-quality ones: websites that the expert "would recommend" scored 38.4 (out of 56; CI [34.4-42.4]) and "would not recommend" 27.0 (CI [23.2-30.9]). Inter-rater and intra-rater intraclass correlation coefficients were both greater than 0.7. Using the MEOW assessment tool, high-quality ORL-HNS educational websites were identified.
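The reliability figures quoted above are intraclass correlation coefficients. The abstract does not specify which ICC form was used; the sketch below implements one common choice for two independent raters, the two-way random-effects single-rater ICC(2,1) of Shrout and Fleiss, on invented scores.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random-effects, single-rater ICC(2,1), Shrout & Fleiss (1979).
    `ratings` is an (n_subjects, k_raters) array of scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # between websites
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_error = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)

# Invented example: two raters scoring five websites out of 56.
scores = [[38, 40], [27, 25], [45, 44], [30, 33], [52, 50]]
print(round(icc_2_1(scores), 3))   # -> 0.978, i.e. high agreement
```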
Changes in formal sex education: 1995-2002.
Lindberg, Laura Duberstein; Santelli, John S; Singh, Susheela
2006-12-01
Although comprehensive sex education is broadly supported by health professionals, funding for abstinence-only education has increased. Using data from the 1995 National Survey of Adolescent Males, the 1995 National Survey of Family Growth (NSFG), and the 2002 NSFG, changes in male and female adolescents' reports of the sex education they had received from formal sources were examined. Life-table methods were used to measure the timing of instruction, and t tests were used to assess changes over time. From 1995 to 2002, reports of formal instruction about birth control methods declined among both genders (males, from 81% to 66%; females, from 87% to 70%). This, combined with increases in reports of abstinence education among males (from 74% to 83%), resulted in a lower proportion of teenagers overall receiving formal instruction about both abstinence and birth control methods (males, 65% to 59%; females, 84% to 65%), and a higher proportion receiving instruction only about abstinence (males, 9% to 24%; females, 8% to 21%). Teenagers in 2002 had received abstinence education about two years earlier (median age, 11.4 for males, 11.8 for females) than they had received birth control instruction (median age, 13.5 for both males and females). Among sexually experienced adolescents, 62% of females and 54% of males had received instruction about birth control methods prior to first sex. A substantial retreat from formal instruction about birth control methods has left increasing proportions of adolescents receiving only abstinence education. Efforts are needed to expand teenagers' access to medically accurate and comprehensive reproductive health information.
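The "life-table methods" used for the timing of instruction treat age at first formal instruction as a time-to-event outcome, censoring respondents who had not yet received instruction at interview. A minimal sketch of the idea with invented data follows; the survey analysis itself used weighted life tables, not this library.

```python
# Kaplan-Meier estimate of the median age at first formal instruction.
# Ages and censoring flags below are invented for illustration.
from lifelines import KaplanMeierFitter

ages = [10.5, 11.0, 11.4, 12.0, 13.5, 14.0, 15.0, 16.0]  # age at instruction, or at interview if censored
received = [1, 1, 1, 1, 1, 0, 1, 0]                      # 0 = not yet instructed (censored)

kmf = KaplanMeierFitter()
kmf.fit(durations=ages, event_observed=received)
print(kmf.median_survival_time_)   # median age at first instruction
```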
Educational Software for First Order Logic Semantics in Introductory Logic Courses
ERIC Educational Resources Information Center
Mauco, María Virginia; Ferrante, Enzo; Felice, Laura
2014-01-01
Basic courses on logic are common in most computer science curricula. Students often have difficulties in handling formalisms and getting familiar with them. Educational software helps to motivate and improve the teaching-learning processes. Therefore, incorporating these kinds of tools becomes important, because they contribute to gaining…
A Tool Chain for the V and V of NASA Cryogenic Fuel Loading Health Management
2014-10-02
48 CFR 31.205-19 - Insurance and indemnification.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., and disappearance of small hand tools that occur in the ordinary course of business and that are not... general conduct of its business are allowable subject to the following limitations: (i) Types and extent... acquisition cost of the insured assets is allowable only when the contractor has a formal written policy...
EPA has released the draft document solely for the purpose of pre-dissemination peer review under applicable Information Quality Guidelines (IQGs). This document has not been formally disseminated by EPA. It does not represent and should not be construed to represent any Agenc...
Wikis for Building Content Knowledge in the Foreign Language Classroom
ERIC Educational Resources Information Center
Pellet, Stephanie H.
2012-01-01
Most pedagogical applications of wikis in foreign language education draw on this collaborative tool to improve (formal) writing skills or to develop target language cultural sensitivity, largely missing the opportunity to support student-developed L2 content knowledge. Seeking an alternative to traditional teacher-centered approaches, this…
Food Safety Posters for Safe Handling of Leafy Greens
ERIC Educational Resources Information Center
Rajagopal, Lakshman; Arendt, Susan W.; Shaw, Angela M.; Strohbehn, Catherine H.; Sauer, Kevin L.
2016-01-01
This article describes food safety educational tools depicting safe handling of leafy greens that are available as downloadable posters to Extension educators and practitioners (www.extension.iastate.edu). Nine visual-based minimal-text colored posters in English, Chinese, and Spanish were developed for use when formally or informally educating…
Polynomial Approximation of Functions: Historical Perspective and New Tools
ERIC Educational Resources Information Center
Kidron, Ivy
2003-01-01
This paper examines the effect of applying symbolic computation and graphics to enhance students' ability to move from a visual interpretation of mathematical concepts to formal reasoning. The mathematics topics involved, Approximation and Interpolation, were taught according to their historical development, and the students tried to follow the…
Constrained variational calculus for higher order classical field theories
NASA Astrophysics Data System (ADS)
Campos, Cédric M.; de León, Manuel; Martín de Diego, David
2010-11-01
We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.
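For readers unfamiliar with the formalism being generalized, the classical first-order Skinner-Rusk picture for mechanics can be summarized as follows; this is our sketch of the standard construction, not material from the abstract.

```latex
% Classical Skinner--Rusk unified formalism (first-order mechanics):
% work on the mixed velocity--momentum space
W = TQ \times_Q T^{*}Q, \qquad
\Omega = \mathrm{pr}_2^{\,*}\,\omega_{T^{*}Q}, \qquad
H(v_q,\alpha_q) = \langle \alpha_q, v_q \rangle - L(v_q).
% The dynamics is the presymplectic equation
i_X \Omega = \mathrm{d}H,
% whose consistency conditions enforce the Legendre constraint
% \alpha = \partial L / \partial v and yield the Euler--Lagrange and
% Hamilton equations simultaneously.
```

The higher-order field-theoretic version in the paper replaces TQ and T*Q by suitable higher-order jet and multimomentum bundles, but the structure is the same: one mixed space, one presymplectic equation, and constraints that recover the Legendre transform.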
Swimming with Sharks: A Physical Educator's Guide to Effective Crowdsourcing
ERIC Educational Resources Information Center
Bulger, Sean M.; Jones, Emily M.; Katz, Nicole; Shrewsbury, Gentry; Wood, Justin
2016-01-01
The reality-competition television series Shark Tank affords up-and-coming entrepreneurs the opportunity to make a formal business presentation to a panel of potential investors. Adopting a similar framework, entrepreneurial teachers have started using web-based collaborative fundraising or crowdsourcing as a tool to build program capacity with…
Basic Education and Policy Support Activity: Tools and Publications.
ERIC Educational Resources Information Center
Creative Associates International, Inc., Washington, DC.
The Basic Education and Policy Support (BEPS) Activity is a United States Agency for International Development (USAID)-sponsored, multi-year initiative designed to further improve the quality of, effectiveness of, and access to formal and nonformal basic education. This catalog is one element of the BEPS information dissemination process. The…
Dialogue as a Tool for Meaning Making
ERIC Educational Resources Information Center
Bruni, Angela Suzanne Dudley
2013-01-01
In order to empower citizens to analyze the effects, risk, and value of science, a knowledge of scientific concepts is necessary (Mejlgaard, 2009). The formal educational system plays a role in this endeavor (Gil-Perez & Vilches, 2005). One proposed constructivist practice is the use of social learning activities using verbalized, shared…
Engaging Pediatricians in Developmental Screening: The Effectiveness of Academic Detailing
ERIC Educational Resources Information Center
Honigfeld, Lisa; Chandhok, Laura; Spiegelman, Kenneth
2012-01-01
Use of formal developmental screening tools in the pediatric medical home improves early identification of children with developmental delays and disorders, including Autism Spectrum Disorders. A pilot study evaluated the impact of an academic detailing module in which trainers visited 43 pediatric primary care practices to provide education about…
Visualization of Learning Scenarios with UML4LD
ERIC Educational Resources Information Center
Laforcade, Pierre
2007-01-01
Present Educational Modelling Languages are used to formally specify abstract learning scenarios in a machine-interpretable format. Current tooling does not provide teachers/designers with graphical facilities to help them reuse existing scenarios. They need human-readable representations. This paper discusses the UML4LD experimental…
Tackling environmental, economic, and social sustainability issues with community stakeholders will often lead to choices that are costly, complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, consider t...