Sample records for formal analysis tools

  1. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.
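
    As a rough illustration of the style of model these tools analyze, the Python sketch below steps a toy two-sided display arbiter one synchronous tick at a time and exhaustively checks a safety property over all bounded input traces. The arbiter, its priority rule, and the property are all invented for illustration; this is not the ADGS-2100 logic, and a real model checker proves such properties for unbounded traces.

      from itertools import product

      def step(owner, left_req, right_req):
          # One synchronous tick of a toy display arbiter: grant the
          # shared display to a requesting side; the left side has priority.
          if owner is None:
              return "left" if left_req else ("right" if right_req else None)
          if owner == "left" and not left_req:
              return "right" if right_req else None
          if owner == "right" and not right_req:
              return "left" if left_req else None
          return owner

      # Bounded exhaustive check of the safety property "whenever some
      # side requests, the display has an owner" over all 5-tick traces.
      for seq in product([False, True], repeat=10):
          owner = None
          for left, right in zip(seq[0::2], seq[1::2]):
              owner = step(owner, left, right)
              assert owner is not None or not (left or right)
      print("property holds on all 1024 bounded traces")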

  2. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the safety case.

  3. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  4. The Profile Envision and Splicing Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

    Launch vehicle programs require vertically complete atmospheric profiles. Many systems at the ER make the necessary measurements, but all have different EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool to create a vertically complete profile from multiple inputs using Python. Forward work: finish formal testing (acceptance testing, end-to-end testing), then a formal release.

  5. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  6. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.
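
    The MC/DC criterion mentioned above requires demonstrating that each condition independently affects the decision outcome. A minimal, generic Python sketch of that obligation (the decision formula is invented; the report's SAL-based generation works over formal protocol models rather than truth tables):

      from itertools import product

      def mcdc_pairs(decision, n):
          # For each condition i, find two test vectors that differ only
          # in condition i yet flip the decision outcome: the MC/DC
          # "independent effect" obligation.
          pairs = {}
          for vec in product([False, True], repeat=n):
              for i in range(n):
                  flipped = vec[:i] + (not vec[i],) + vec[i + 1:]
                  if i not in pairs and decision(*vec) != decision(*flipped):
                      pairs[i] = (vec, flipped)
          return pairs

      # Toy decision with three conditions: a and (b or c)
      print(mcdc_pairs(lambda a, b, c: a and (b or c), 3))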

  7. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software specification and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  8. Formal Methods Tool Qualification

    NASA Technical Reports Server (NTRS)

    Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain

    2017-01-01

    Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.

  9. Formal Assurance Certifiable Tooling Formal Assurance Certifiable Tooling Strategy Final Report

    NASA Technical Reports Server (NTRS)

    Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael

    2017-01-01

    This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.

  10. Neuromorphic log-domain silicon synapse circuits obey Bernoulli dynamics: a unifying tutorial analysis

    PubMed Central

    Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579

  11. Neuromorphic log-domain silicon synapse circuits obey Bernoulli dynamics: a unifying tutorial analysis.

    PubMed

    Papadimitriou, Konstantinos I; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology.
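
    For readers unfamiliar with the Bernoulli dynamics invoked in the two records above, the generic textbook pattern is the Bernoulli ODE and its standard linearizing substitution; the papers' circuit-specific equations differ, so this is only the underlying mathematical shape:

      \frac{dy}{dt} + P(t)\,y = Q(t)\,y^{n}, \qquad n \neq 0,\ 1,

    which the change of variable u = y^{1-n} reduces to the linear first-order equation

      \frac{du}{dt} + (1-n)\,P(t)\,u = (1-n)\,Q(t).

    Linearizations of this kind, applied cell by cell to transistor-capacitor pairs, are what make log-domain circuits "externally linear" and amenable to systematic analysis.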

  12. Periodically-Scheduled Controller Analysis using Hybrid Systems Reachability and Continuization

    DTIC Science & Technology

    2015-12-01

    …tools to verify specifications for hybrid automata do not perform well on such periodically scheduled models. This is due to a combination of the large… an additive nondeterministic input. Reachability tools for hybrid automata can better handle such systems. We further improve the analysis by…

  13. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML, and the approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  14. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.

  15. Formal Analysis of the Remote Agent Before and After Flight

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.

    2000-01-01

    This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
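
    The kind of defect SPIN finds can be shown with a deliberately tiny explicit-state search in Python: two processes acquire two locks in opposite order, and exhaustive exploration of the interleavings reaches the stuck state that testing individual schedules easily misses. The model is the classic textbook deadlock, invented here; it is not the Remote Agent code.

      ORDER = {1: ("a", "b"), 2: ("b", "a")}   # each process's lock order

      def successors(state):
          # pc 0/1: acquire first/second lock; pc 2: release both; pc 3: done.
          pc1, pc2, locks = state
          out = []
          for pid, pc in ((1, pc1), (2, pc2)):
              new_locks = None
              if pc in (0, 1) and locks[ORDER[pid][pc]] is None:
                  new_locks = dict(locks, **{ORDER[pid][pc]: pid})
              elif pc == 2:                      # release everything held
                  new_locks = {k: (None if v == pid else v)
                               for k, v in locks.items()}
              if new_locks is not None:
                  out.append((pc1 + 1, pc2, new_locks) if pid == 1
                             else (pc1, pc2 + 1, new_locks))
          return out

      def find_deadlock():
          start = (0, 0, {"a": None, "b": None})
          key = lambda s: (s[0], s[1], tuple(sorted(s[2].items())))
          seen, stack = {key(start)}, [start]
          while stack:
              s = stack.pop()
              nxt = successors(s)
              if not nxt and (s[0], s[1]) != (3, 3):
                  return s                       # stuck before both finished
              for t in nxt:
                  if key(t) not in seen:
                      seen.add(key(t))
                      stack.append(t)

      print(find_deadlock())   # (1, 1, {'a': 1, 'b': 2}): each waits on the other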

  16. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  17. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  18. Peer Coaching as an Institutionalised Tool for Professional Development: The Perceptions of Tutors in a Nigerian College

    ERIC Educational Resources Information Center

    Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi

    2013-01-01

    Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…

  19. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  20. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
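
    A crude Python stand-in for the kind of estimate such a tool derives: first-order round-off propagation over an expression tree, where each IEEE-754 operation contributes fresh error of at most u times the magnitude of its result (u being the binary64 unit roundoff) and incoming errors flow through. PRECiSA's actual denotational semantics and proof certificates are far richer; the expression and values below are invented.

      U = 2.0 ** -53   # unit roundoff, IEEE-754 binary64

      def fadd(a, b):
          (va, ea), (vb, eb) = a, b
          v = va + vb
          return v, ea + eb + abs(v) * U           # propagated + fresh error

      def fmul(a, b):
          (va, ea), (vb, eb) = a, b
          v = va * vb                               # first order: drop ea*eb
          return v, abs(va) * eb + abs(vb) * ea + abs(v) * U

      x, y, z = (0.1, 0.0), (3.0, 0.0), (1e8, 0.0)  # (value, error bound)
      val, err = fadd(fmul(x, y), z)                # bound for x*y + z
      print(f"value ~ {val}, round-off bound ~ {err:.3e}")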

  1. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.

  2. Modeling and Analysis of Asynchronous Systems Using SAL and Hybrid SAL

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish; Dutertre, Bruno

    2013-01-01

    We present formal models and results of formal analysis of two different asynchronous systems. We first examine a mid-value select module that merges the signals coming from three different sensors that are each asynchronously sampling the same input signal. We then consider the phase locking protocol proposed by Daly, Hopkins, and McKenna. This protocol is designed to keep a set of non-faulty (asynchronous) clocks phase locked even in the presence of Byzantine-faulty clocks on the network. All models and verifications have been developed using the SAL model checking tools and the Hybrid SAL abstractor.
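
    A minimal Python sketch of the mid-value select at the heart of the first model, together with a randomized check of the property that a single arbitrarily faulty channel cannot pull the output outside the span of the two good channels. The sampling skews and ranges are invented; the report's SAL models establish such properties exhaustively.

      import random

      def mid_value_select(a, b, c):
          # Median of three redundant sensor samples.
          return sorted((a, b, c))[1]

      random.seed(0)
      for _ in range(10_000):
          truth = random.uniform(-1, 1)
          good1 = truth + random.uniform(-0.01, 0.01)   # async sampling skew
          good2 = truth + random.uniform(-0.01, 0.01)
          fault = random.uniform(-10, 10)               # arbitrary bad channel
          out = mid_value_select(good1, good2, fault)
          assert min(good1, good2) <= out <= max(good1, good2)
      print("output always within the span of the two good channels")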

  3. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
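
    The evidence-combination step of such a BBN tool reduces, node by node, to Bayesian updating. A minimal Python sketch with invented numbers (not EXPLORIS data), computing the posterior over volcanic state after a tremor observation:

      # Invented priors and likelihoods, purely illustrative.
      priors   = {"quiescent": 0.90, "unrest": 0.09, "eruptive": 0.01}
      p_tremor = {"quiescent": 0.02, "unrest": 0.40, "eruptive": 0.90}

      evidence  = sum(priors[s] * p_tremor[s] for s in priors)
      posterior = {s: priors[s] * p_tremor[s] / evidence for s in priors}
      print(posterior)   # mass shifts from quiescent toward unrest/eruptive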

  4. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

    Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time consuming process and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  5. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    …set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort… to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has…

  6. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple full-control mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.

  7. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1997-01-01

    Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.

  8. Different Strokes for Different Folks: Visual Presentation Design between Disciplines

    PubMed Central

    Gomez, Steven R.; Jianu, Radu; Ziemkiewicz, Caroline; Guo, Hua; Laidlaw, David H.

    2015-01-01

    We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard “chalk talks”. We found design differences in slideshows using two methods – coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant’s own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information. PMID:26357149

  9. Different Strokes for Different Folks: Visual Presentation Design between Disciplines.

    PubMed

    Gomez, S R; Jianu, R; Ziemkiewicz, C; Guo, Hua; Laidlaw, D H

    2012-12-01

    We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard "chalk talks". We found design differences in slideshows using two methods - coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant's own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information.

  10. Integration Toolkit and Methods (ITKM) Corporate Data Integration Tools (CDIT). Review of the State-of-the-Art with Respect to Integration Toolkits and Methods (ITKM)

    DTIC Science & Technology

    1992-06-01

    …system capabilities such as memory management and network communications are provided by a virtual machine-type operating environment. Various human… thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical… the form of identifying: the data entity itself; its aliases (including how the data is presented to programs or human users in the form of copy…

  11. Investigating Actuation Force Fight with Asynchronous and Synchronous Redundancy Management Techniques

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno

    2013-01-01

    Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
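
    The sine-wave agreement bound mentioned above can be illustrated numerically: two channels sampling the same sine command with relative time skew dt disagree by at most A·ω·dt, since |sin a − sin b| ≤ |a − b|. A Python check with invented amplitude, frequency, and skew:

      import math

      A, omega, dt = 1.0, 2 * math.pi * 2.0, 0.005   # 2 Hz command, 5 ms skew
      worst = max(abs(A * math.sin(omega * t) - A * math.sin(omega * (t + dt)))
                  for t in (i * 1e-4 for i in range(20_000)))  # sweep 2 s
      print(worst, "<=", A * omega * dt)              # Lipschitz bound holds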

  12. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  13. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections, a set of formal technical review procedures held at selected key points during software development to find defects in software documents, are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  14. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  15. Development of a Rating Tool for Mobile Cancer Apps: Information Analysis and Formal and Content-Related Evaluation of Selected Cancer Apps.

    PubMed

    Böhme, Cathleen; von Osthoff, Marc Baron; Frey, Katrin; Hübner, Jutta

    2017-08-17

    Mobile apps are offered in large numbers and have different qualities. The aim of this article was to develop a rating tool based on formal and content-related criteria for the assessment of cancer apps and to test its applicability on apps. After a thorough analysis of the literature, we developed a specific rating tool for cancer apps based on the MARS (Mobile App Rating Scale) and a rating tool for cancer websites. This instrument was applied to apps freely available in stores and focusing on some cancer topic. Ten apps were rated on the basis of 22 criteria. Sixty percent of the apps (6/10) were rated poor or insufficient. The rating by different scientists was homogeneous. The good apps had reliable sources, were regularly updated, and had a concrete intent/purpose in their app description. In contrast, the apps that were rated poor had no distinction between scientific content and advertisement. In some cases, there was no imprint to identify the provider. As apps of poor quality can give misinformation and lead to wrong treatment decisions, efforts have to be made to increase usage of high-quality apps. Certification would help cancer patients to identify reliable apps, yet acceptance of a certification system must be backed up.

  16. Learning in non-formal education: Is it "youthful" for youth in action?

    NASA Astrophysics Data System (ADS)

    Norqvist, Lars; Leffler, Eva

    2017-04-01

    This article offers insights into the practices of a non-formal education programme for youth provided by the European Union (EU). It takes a qualitative approach and is based on a case study of the European Voluntary Service (EVS). Data were collected during individual and focus group interviews with learners (the EVS volunteers), decision takers and trainers, with the aim of deriving an understanding of learning in non-formal education. The research questions concerned learning, the recognition of learning and perspectives of usefulness. The study also examined the Youthpass documentation tool as a key to understanding the recognition of learning and to determine whether the learning was useful for learners (the volunteers). The findings and analysis offer several interpretations of learning, and the recognition of learning, which take place in non-formal education. The findings also revealed that it is complicated to divide learning into formal and non-formal categories; instead, non-formal education is useful for individual learners when both formal and non-formal educational contexts are integrated. As a consequence, the division of formal and non-formal (and possibly even informal) learning creates a gap which works against the development of flexible and interconnected education with ubiquitous learning and mobility within and across formal and non-formal education. This development is not in the best interests of learners, especially when seeking useful learning and education for youth (what the authors term "youthful" for youth in action).

  17. When product designers use perceptually based color tools

    NASA Astrophysics Data System (ADS)

    Bender, Walter R.

    1998-07-01

    Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to give guidance to their selection of seasonal palettes for use in production of the private-label merchandise of a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.

  18. When product designers use perceptually based color tools

    NASA Astrophysics Data System (ADS)

    Bender, Walter R.

    2001-01-01

    Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes in the production of the private-label merchandise in a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.

  19. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
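
    A typestate is essentially a finite automaton over an object's method calls, with any call not enabled in the current state constituting a protocol violation. A toy Python checker for an invented file-handle protocol (Plural specifications layer access permissions on top of this idea):

      # (state, method) -> next state; anything else is a violation.
      TYPESTATE = {
          ("closed", "open"):  "open",
          ("open",   "read"):  "open",
          ("open",   "close"): "closed",
      }

      def check(trace, state="closed"):
          for call in trace:
              if (state, call) not in TYPESTATE:
                  return f"violation: {call}() in state '{state}'"
              state = TYPESTATE[(state, call)]
          return f"ok, final state '{state}'"

      print(check(["open", "read", "close"]))   # ok, final state 'closed'
      print(check(["open", "close", "read"]))   # violation: read() in 'closed'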

  20. Understanding visualization: a formal approach using category theory and semiotics.

    PubMed

    Vickers, Paul; Faith, Joe; Rossiter, Nick

    2013-06-01

    This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: Relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.

  1. A primer in macromolecular linguistics.

    PubMed

    Searls, David B

    2013-03-01

    Polymeric macromolecules, when viewed abstractly as strings of symbols, can be treated in terms of formal language theory, providing a mathematical foundation for characterizing such strings both as collections and in terms of their individual structures. In addition this approach offers a framework for analysis of macromolecules by tools and conventions widely used in computational linguistics. This article introduces the ways that linguistics can be and has been applied to molecular biology, covering the relevant formal language theory at a relatively nontechnical level. Analogies between macromolecules and human natural language are used to provide intuitive insights into the relevance of grammars, parsing, and analysis of language complexity to biology.
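
    The canonical example behind this line of work is that nested base pairing in RNA stems is context-free rather than regular. A tiny Python recognizer for the grammar S → x S y | ε, with (x, y) ranging over Watson-Crick pairs; the sequences are invented:

      from functools import lru_cache

      PAIRS = {("a", "u"), ("u", "a"), ("g", "c"), ("c", "g")}

      @lru_cache(maxsize=None)
      def stem(s):
          # Recognize S -> x S y | empty, pairing outermost bases inward.
          if s == "":
              return True
          return len(s) >= 2 and (s[0], s[-1]) in PAIRS and stem(s[1:-1])

      print(stem("gacguc"))   # True:  g-c, a-u, c-g pair inside-out
      print(stem("gacgua"))   # False: ends g/a do not pair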

  2. Workshop on dimensional analysis for design, development, and research executives

    NASA Technical Reports Server (NTRS)

    Goodman, R. A.; Abernathy, W. J.

    1971-01-01

    The proceedings of a conference of research and development executives are presented. The purpose of the meeting was to develop an understanding of the conditions which are appropriate for the use of certain general management tools and those conditions which render these tools inappropriate. The verbatim statements of the participants are included to show the direction taken initially by the conference. Formal presentations of management techniques for research and development are developed.

  3. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hinchey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  4. An ORCID based synchronization framework for a national CRIS ecosystem.

    PubMed

    Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno

    2015-01-01

    PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects on how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholders and essential to certify compliant services.

  5. Visual analysis of variance: a tool for quantitative assessment of fMRI data processing and analysis.

    PubMed

    McNamee, R L; Eddy, W F

    2001-12-01

    Analysis of variance (ANOVA) is widely used for the study of experimental data. Here, the reach of this tool is extended to cover the preprocessing of functional magnetic resonance imaging (fMRI) data. This technique, termed visual ANOVA (VANOVA), provides both numerical and pictorial information to aid the user in understanding the effects of various parts of the data analysis. Unlike a formal ANOVA, this method does not depend on the mathematics of orthogonal projections or strictly additive decompositions. An illustrative example is presented and the application of the method to a large number of fMRI experiments is discussed.
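
    The numerical core that a visual ANOVA renders pictorially is the familiar decomposition of total variation into between-group and within-group sums of squares. A one-way sketch in Python with invented data (the paper's method extends this idea across fMRI preprocessing stages):

      groups = [[4.1, 3.9, 4.3], [5.0, 5.2, 4.8], [3.0, 3.4, 3.2]]
      grand  = sum(x for g in groups for x in g) / sum(len(g) for g in groups)

      ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
      ss_within  = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
      ss_total   = sum((x - grand) ** 2 for g in groups for x in g)

      print(ss_between + ss_within, ss_total)   # equal up to rounding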

  6. Verifying the interactive convergence clock synchronization algorithm using the Boyer-Moore theorem prover

    NASA Technical Reports Server (NTRS)

    Young, William D.

    1992-01-01

    The application of formal methods to the analysis of computing systems promises to provide higher and higher levels of assurance as the sophistication of our tools and techniques increases. Improvements in tools and techniques come about as we pit the current state of the art against new and challenging problems. A promising area for the application of formal methods is in real-time and distributed computing. Some of the algorithms in this area are both subtle and important. In response to this challenge and as part of an ongoing attempt to verify an implementation of the Interactive Convergence Clock Synchronization Algorithm (ICCSA), we decided to undertake a proof of the correctness of the algorithm using the Boyer-Moore theorem prover. This paper describes our approach to proving the ICCSA using the Boyer-Moore prover.
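
    A simplified reading of one interactive-convergence round in Python: each clock averages its measured skews to all clocks, substituting 0 for any skew whose magnitude exceeds the threshold delta, i.e., a presumed-faulty reading. This is only the arithmetic skeleton with invented values, not the verified algorithm or its proof obligations.

      def icc_adjustment(skews, delta):
          # Egocentric fault-tolerant average: wild readings count as 0.
          return sum(s if abs(s) <= delta else 0.0 for s in skews) / len(skews)

      # Four clocks, one faulty; delta exceeds the good-clock skew bound.
      skews = [0.0, 0.004, -0.003, 5.0]          # seconds, relative; 5.0 faulty
      print(icc_adjustment(skews, delta=0.01))   # small correction, fault masked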

  7. DASEES: A decision analysis tool with Bayesian networks from the Environmental Protection Agency’s Sustainable and Healthy Communities Research Program

    EPA Science Inventory

    Tackling environmental, economic, and social sustainability issues with community stakeholders will often lead to choices that are costly, complex and uncertain. A formal process with proper guidance is needed to understand the issues, identify sources of disagreement, consider t...

  8. Multicriteria decision analysis: Overview and implications for environmental decision making

    USGS Publications Warehouse

    Hermans, Caroline M.; Erickson, Jon D.; Messner, Frank; Ring, Irene

    2007-01-01

    Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
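
    A compact Python sketch of the PROMETHEE II net-flow computation with the common linear preference function, in which preference grows with the criterion difference up to a threshold p. The alternatives, weights, and scores are invented:

      def pref(d, p=1.0):
          # Linear preference function: 0 at or below indifference, capped at 1.
          return 0.0 if d <= 0 else min(d / p, 1.0)

      alts    = {"A": [3.0, 0.2], "B": [2.5, 0.9], "C": [1.0, 0.5]}
      weights = [0.6, 0.4]                       # two criteria, both maximized

      def pi(a, b):                              # weighted preference of a over b
          return sum(w * pref(alts[a][j] - alts[b][j])
                     for j, w in enumerate(weights))

      n = len(alts)
      net = {a: sum(pi(a, b) - pi(b, a) for b in alts if b != a) / (n - 1)
             for a in alts}
      print(sorted(net.items(), key=lambda kv: -kv[1]))   # PROMETHEE II ranking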

  9. Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells

    DTIC Science & Technology

    2015-01-15

    …serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and… services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool… answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves…

  10. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, will be briefly described in this paper, along with a prototype tool that supports it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  11. Reasoning about variables in 11 to 18 year olds: informal, schooled and formal expression in learning about functions

    NASA Astrophysics Data System (ADS)

    Ayalon, Michal; Watson, Anne; Lerman, Steve

    2016-09-01

    This study examines expressions of reasoning by some higher achieving 11 to 18 year-old English students responding to a survey consisting of function tasks developed in collaboration with their teachers. We report on 70 students, 10 from each of English years 7-13. Iterative and comparative analysis identified capabilities and difficulties of students and suggested conjectures concerning links between the affordances of the tasks, the curriculum, and students' responses. The paper focuses on five of the survey tasks and highlights connections between informal and formal expressions of reasoning about variables in learning. We introduce the notion of 'schooled' expressions of reasoning, neither formal nor informal, to emphasise the role of the formatting tools introduced in school that shape future understanding and reasoning.

  12. Statechart Analysis with Symbolic PathFinder

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2012-01-01

    We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
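
    To illustrate, speculatively, what "pluggable" semantics can mean in such a framework, the sketch below runs one toy statechart intermediate representation under two interchangeable semantic modules. The IR shape, module names, and behavior are invented and are not Polyglot's actual interfaces.

    ```python
    # Speculative sketch of pluggable statechart semantics: one IR, two
    # interchangeable semantic modules. Everything here is invented.
    IR = {"on": {"tick": "off"}, "off": {"tick": "on"}}   # toy statechart

    class BaseSemantics:
        def step(self, state, event):
            # Unhandled events are silently ignored (state is kept).
            return IR[state].get(event, state)

    class StrictSemantics(BaseSemantics):
        def step(self, state, event):
            # Unhandled events are treated as errors instead.
            if event not in IR[state]:
                raise ValueError(f"unhandled event {event!r} in {state!r}")
            return IR[state][event]

    for semantics in (BaseSemantics(), StrictSemantics()):
        print(type(semantics).__name__, semantics.step("on", "tick"))
    ```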

  13. Recent advances in applying decision science to managing national forests

    USGS Publications Warehouse

    Marcot, Bruce G.; Thompson, Matthew P.; Runge, Michael C.; Thompson, Frank R.; McNulty, Steven; Cleaves, David; Tomosy, Monica; Fisher, Larry A.; Andrew, Bliss

    2012-01-01

    Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem analysis (defining alternatives, evaluating likely consequences, identifying key uncertainties, and analyzing tradeoffs), decision point (identifying the preferred alternative), and implementation and monitoring the preferred alternative with adaptive management feedbacks. We list a wide array of models, techniques, and tools available for each stage, and provide three case studies of their selected use in National Forest land management and project plans. Successful use of SDM involves participation by decision-makers, analysts, scientists, and stakeholders. We suggest specific areas for training and instituting SDM to foster transparency, rigor, clarity, and inclusiveness in formal decision processes regarding management of national forests.
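
    As a small, hypothetical illustration of the tradeoff-analysis step (one of many possible techniques, not one prescribed by the authors), the sketch below ranks management alternatives with a weighted-sum consequence table; alternatives, criteria, scores, and weights are all invented.

    ```python
    # Hedged sketch of one SDM "problem analysis" step: weighted-sum
    # tradeoff analysis over a consequence table. Values are hypothetical.
    alternatives = ["thin_only", "thin_and_burn", "no_action"]
    criteria = {"habitat": 0.5, "fire_risk": 0.3, "cost": 0.2}
    # scores[alt][criterion] on a common 0-1 scale (1 = best).
    scores = {
        "thin_only":     {"habitat": 0.6, "fire_risk": 0.7, "cost": 0.5},
        "thin_and_burn": {"habitat": 0.8, "fire_risk": 0.9, "cost": 0.3},
        "no_action":     {"habitat": 0.4, "fire_risk": 0.2, "cost": 1.0},
    }

    def rank(alternatives, criteria, scores):
        totals = {a: sum(w * scores[a][c] for c, w in criteria.items())
                  for a in alternatives}
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    for alt, total in rank(alternatives, criteria, scores):
        print(f"{alt}: {total:.2f}")
    ```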

  14. An Overview of SAL

    NASA Technical Reports Server (NTRS)

    Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.

    2000-01-01

    To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PAS, and some preliminary experience of their use.

  15. 2014-2015 Partnership accomplishments report on joint activities: National Gap Analysis Program and LANDFIRE

    USGS Publications Warehouse

    Davidson, Anne; McKerrow, Alexa; Long, Don; Earnhardt, Todd

    2015-01-01

    The intended target audience for this document is initially management, project technical specialists, and scientists involved in the Gap Analysis Program (GAP) and the Landscape Fire and Resource Management Planning Tools (LANDFIRE) program, to help communicate coordination activities to all involved parties. This document is also intended to give background information to other parts of the USGS and beyond, although some details are oriented primarily to management of the respective programs. Because the GAP and LANDFIRE programs both rely on characterizations of land cover using similar scales and resolutions, the programs have been coordinating their work to improve scientific consistency and efficiency of production. Initial discussions and informal sharing of ideas and work began in 2008. Although this collaboration was fruitful, there was no formal process for reporting results, plans, or outstanding issues, nor was there any formally defined coordinated management team that spanned the two programs. In 2012, leadership from the two programs agreed to strengthen the coordination of their respective work efforts. In 2013 the GAP and LANDFIRE programs developed an umbrella plan of objectives and components related to three mutual focus areas for the GAP and LANDFIRE collaboration for the years 2013 and 2014 (GAP/LANDFIRE 2013). The evolution of this partnership resulted in the drafting of an inter-program Memorandum of Understanding (MOU) in 2014. This MOU identified three coordination topics relevant to the two programs at this point in the MOU history: vegetation mapping, disturbance classes, and formal quality assessment.

  16. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
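
    A rough sketch of the two atomic operators named in the abstract, selection and aggregation, over a toy attribute-labeled graph; the dict-based representation and attribute names are ours, not the paper's formalism.

    ```python
    # Minimal selection and aggregation operators on a node-labeled graph.
    nodes = {
        "a": {"kind": "host", "load": 3},
        "b": {"kind": "host", "load": 9},
        "c": {"kind": "switch", "load": 1},
    }
    edges = {("a", "c"), ("b", "c")}

    def select(nodes, edges, pred):
        """Keep nodes satisfying pred, and edges between surviving nodes."""
        kept = {n: attrs for n, attrs in nodes.items() if pred(attrs)}
        return kept, {(u, v) for (u, v) in edges if u in kept and v in kept}

    def aggregate(nodes, key):
        """Group nodes by attribute `key`, summing loads per group."""
        groups = {}
        for attrs in nodes.values():
            g = attrs[key]
            groups[g] = groups.get(g, 0) + attrs["load"]
        return groups

    hosts, host_edges = select(nodes, edges, lambda a: a["kind"] == "host")
    print(sorted(hosts), host_edges)   # ['a', 'b'] set()
    print(aggregate(nodes, "kind"))    # {'host': 12, 'switch': 1}
    ```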

  17. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  18. Stochastic Formal Correctness of Numerical Algorithms

    NASA Technical Reports Server (NTRS)

    Daumas, Marc; Lester, David; Martin-Dorel, Erik; Truffert, Annick

    2009-01-01

    We provide a framework to bound the probability that accumulated errors were never above a given threshold in numerical algorithms. Such algorithms are used, for example, in aircraft and nuclear power plants. This report contains simple formulas based on Lévy's and Markov's inequalities, and it presents a formal theory of random variables with a special focus on producing concrete results. We selected four very common applications that fit in our framework and cover the common practices of systems that evolve for a long time. We compute the number of bits that remain continuously significant in the first two applications with a probability of failure around one out of a billion, where worst-case analysis considers that no significant bit remains. We use PVS because such formal tools force explicit statement of all hypotheses and prevent incorrect uses of theorems.
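
    As a back-of-the-envelope illustration of the style of bound involved (ours, not the paper's PVS development), the sketch below applies Markov's inequality to the squared sum of independent, zero-mean rounding errors; all parameters are hypothetical.

    ```python
    # Toy bound: P(|S_n| >= t) = P(S_n^2 >= t^2) <= E[S_n^2] / t^2,
    # i.e. Markov's inequality on the squared sum (Chebyshev's form).
    # Assumes independent, zero-mean errors uniform on [-u/2, u/2].
    import math

    n = 10**9                      # number of accumulated operations
    u = 2.0**-53                   # unit roundoff (hypothetical setting)
    var_sum = n * (u * u) / 12.0   # variance of the accumulated error S_n
    threshold = 1e-6

    bound = var_sum / threshold**2
    print(f"P(|error| >= {threshold}) <= {bound:.3e}")

    # Invert the bound: threshold exceeded with probability at most p.
    p = 1e-9
    t_p = math.sqrt(var_sum / p)
    print(f"with prob >= 1 - 1e-9, |error| < {t_p:.3e}")
    ```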

  19. Software Tool Issues

    NASA Astrophysics Data System (ADS)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  20. p-exponent and p-leaders, Part II: Multifractal analysis. Relations to detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Leonarduzzi, R.; Wendt, H.; Abry, P.; Jaffard, S.; Melot, C.; Roux, S. G.; Torres, M. E.

    2016-04-01

    Multifractal analysis studies signals, functions, images or fields via the fluctuations of their local regularity along time or space, which capture crucial features of their temporal/spatial dynamics. It has become a standard signal and image processing tool and is commonly used in numerous applications of different natures. In its common formulation, it relies on the Hölder exponent as a measure of local regularity, which is by nature restricted to positive values and can hence be used for locally bounded functions only. In this contribution, it is proposed to replace the Hölder exponent with a collection of novel exponents for measuring local regularity, the p-exponents. One of the major virtues of p-exponents is that they can potentially take negative values. The corresponding wavelet-based multiscale quantities, the p-leaders, are constructed and shown to permit the definition of a new multifractal formalism, yielding an accurate practical estimation of the multifractal properties of real-world data. Moreover, theoretical and practical connections to and comparisons against another multifractal formalism, referred to as multifractal detrended fluctuation analysis, are achieved. The performance of the proposed p-leader multifractal formalism is studied and compared to previous formalisms using synthetic multifractal signals and images, illustrating its theoretical and practical benefits. The present contribution is complemented by a companion article studying in depth the theoretical properties of p-exponents and the rich classification of local singularities it permits.
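
    For orientation, the p-exponent is usually defined through the local T^p_alpha condition of Calderón and Zygmund; the rendering below is ours and its conventions may differ from the companion Part I paper.

    ```latex
    % Our rendering of the local T^p_\alpha condition (Calderon-Zygmund),
    % with P a polynomial of degree less than \alpha; the p-exponent is
    % the supremum below. Conventions may differ from the paper.
    f \in T^{p}_{\alpha}(x_0) \iff
    \Big( r^{-d} \int_{B(x_0,r)} |f(x) - P(x - x_0)|^{p}\,\mathrm{d}x \Big)^{1/p}
    \le C\, r^{\alpha},
    \qquad
    h_{p}(x_0) = \sup\{\alpha : f \in T^{p}_{\alpha}(x_0)\}
    ```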

  1. The Digital electronic Guideline Library (DeGeL): a hybrid framework for representation and use of clinical guidelines.

    PubMed

    Shahar, Yuval; Young, Ohad; Shalom, Erez; Mayaffit, Alon; Moskovitch, Robert; Hessing, Alon; Galperin, Maya

    2004-01-01

    We propose to present a poster (and potentially also a demonstration of the implemented system) summarizing the current state of our work on a hybrid, multiple-format representation of clinical guidelines that facilitates conversion of guidelines from free text to a formal representation. We describe a distributed Web-based architecture (DeGeL) and a set of tools using the hybrid representation. The tools enable performing tasks such as guideline specification, semantic markup, search, retrieval, visualization, eligibility determination, runtime application and retrospective quality assessment. The representation includes four parallel formats: Free text (one or more original sources); semistructured text (labeled by the target guideline-ontology semantic labels); semiformal text (which includes some control specification); and a formal, machine-executable representation. The specification, indexing, search, retrieval, and browsing tools are essentially independent of the ontology chosen for guideline representation, but editing the semi-formal and formal formats requires ontology-specific tools, which we have developed in the case of the Asbru guideline-specification language. The four formats support increasingly sophisticated computational tasks. The hybrid guidelines are stored in a Web-based library. All tools, such as for runtime guideline application or retrospective quality assessment, are designed to operate on all representations. We demonstrate the hybrid framework by providing examples from the semantic markup and search tools.

  2. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  3. Using a formal requirements management tool for system engineering: first results at ESO

    NASA Astrophysics Data System (ADS)

    Zamparelli, Michele

    2006-06-01

    The attention given to proper requirement analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering effort and the usage of all available technology to keep project development under control. One such technology is a tool that helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility to do impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it appear a promising solution even for small-scale system development.

  4. Industry Strength Tool and Technology for Automated Synthesis of Safety-Critical Applications from Formal Specifications

    DTIC Science & Technology

    2015-11-01

    28 2.3.4 Input/Output Automata ... various other modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, multi-dimensional SDF, etc. are also used for designing ... [table excerpt] ... formal, ideally suited to model DSP applications; 3. Petri nets: graphical, formal, used for modeling distributed systems; 4. I/O Automata: both (graphical and textual), formal

  5. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  6. An Analysis of Pre-Service Elementary Teachers' Understanding of Inquiry-Based Science Teaching

    ERIC Educational Resources Information Center

    Lee, Carole K.; Shea, Marilyn

    2016-01-01

    This study examines how pre-service elementary teachers (PSETs) view inquiry-based science learning and teaching, and how the science methods course builds their confidence to teach inquiry science. Most PSETs think that inquiry is asking students questions rather than a formal set of pedagogical tools. In the present study, three groups of PSETs…

  7. Extraction and Analysis of Display Data

    NASA Technical Reports Server (NTRS)

    Land, Chris; Moye, Kathryn

    2008-01-01

    The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses and complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via test. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.

  8. Analysis of Strengths, Weaknesses, Opportunities, and Threats as a Tool for Translating Evidence into Individualized Medical Strategies (I-SWOT)

    PubMed Central

    von Kodolitsch, Yskert; Bernhardt, Alexander M.; Robinson, Peter N.; Kölbel, Tilo; Reichenspurner, Hermann; Debus, Sebastian; Detter, Christian

    2015-01-01

    Background It is the physicians’ task to translate evidence and guidelines into medical strategies for individual patients. Until today, however, there is no formal tool that is instrumental to perform this translation. Methods We introduce the analysis of strengths (S) and weaknesses (W) related to therapy with opportunities (O) and threats (T) related to individual patients as a tool to establish an individualized (I) medical strategy (I-SWOT). The I-SWOT matrix identifies four fundamental types of strategy. These comprise “SO” maximizing strengths and opportunities, “WT” minimizing weaknesses and threats, “WO” minimizing weaknesses and maximizing opportunities, and “ST” maximizing strengths and minimizing threats. Each distinct type of strategy may be considered for individualized medical strategies. Results We describe four steps of I-SWOT to establish an individualized medical strategy to treat aortic disease. In the first step, we define the goal of therapy and identify all evidence-based therapeutic options. In a second step, we assess strengths and weaknesses of each therapeutic option in a SW matrix form. In a third step, we assess opportunities and threats related to the individual patient, and in a final step, we use the I-SWOT matrix to establish an individualized medical strategy through matching “SW” with “OT”. As an example we present two 30-year-old patients with Marfan syndrome with identical medical history and aortic pathology. As a result of I-SWOT analysis of their individual opportunities and threats, we identified two distinct medical strategies in these patients. Conclusion I-SWOT is a formal but easy to use tool to translate medical evidence into individualized medical strategies. PMID:27069939

  9. Analysis of Strengths, Weaknesses, Opportunities, and Threats as a Tool for Translating Evidence into Individualized Medical Strategies (I-SWOT).

    PubMed

    von Kodolitsch, Yskert; Bernhardt, Alexander M; Robinson, Peter N; Kölbel, Tilo; Reichenspurner, Hermann; Debus, Sebastian; Detter, Christian

    2015-06-01

    It is the physicians' task to translate evidence and guidelines into medical strategies for individual patients. Until today, however, there is no formal tool that is instrumental to perform this translation. We introduce the analysis of strengths (S) and weaknesses (W) related to therapy with opportunities (O) and threats (T) related to individual patients as a tool to establish an individualized (I) medical strategy (I-SWOT). The I-SWOT matrix identifies four fundamental types of strategy. These comprise "SO" maximizing strengths and opportunities, "WT" minimizing weaknesses and threats, "WO" minimizing weaknesses and maximizing opportunities, and "ST" maximizing strengths and minimizing threats. Each distinct type of strategy may be considered for individualized medical strategies. We describe four steps of I-SWOT to establish an individualized medical strategy to treat aortic disease. In the first step, we define the goal of therapy and identify all evidence-based therapeutic options. In a second step, we assess strengths and weaknesses of each therapeutic option in a SW matrix form. In a third step, we assess opportunities and threats related to the individual patient, and in a final step, we use the I-SWOT matrix to establish an individualized medical strategy through matching "SW" with "OT". As an example we present two 30-year-old patients with Marfan syndrome with identical medical history and aortic pathology. As a result of I-SWOT analysis of their individual opportunities and threats, we identified two distinct medical strategies in these patients. I-SWOT is a formal but easy to use tool to translate medical evidence into individualized medical strategies.
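
    The matching step lends itself to a very small illustration. The sketch below is ours, with invented clinical entries: it pairs therapy-related strengths/weaknesses with patient-related opportunities/threats to enumerate the four fundamental strategy types.

    ```python
    # Hedged sketch of the I-SWOT matching step. Entries are hypothetical.
    therapy = {"S": ["durable repair"], "W": ["operative risk"]}
    patient = {"O": ["young, fit"], "T": ["aortic fragility"]}

    STRATEGY = {
        ("S", "O"): "SO: maximize strengths and opportunities",
        ("W", "T"): "WT: minimize weaknesses and threats",
        ("W", "O"): "WO: minimize weaknesses, maximize opportunities",
        ("S", "T"): "ST: maximize strengths, minimize threats",
    }

    for sw in ("S", "W"):
        for ot in ("O", "T"):
            print(STRATEGY[(sw, ot)], "->", therapy[sw], "x", patient[ot])
    ```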

  10. Quasipolynomial generalization of Lotka-Volterra mappings

    NASA Astrophysics Data System (ADS)

    Hernández-Bermejo, Benito; Brenig, Léon

    2002-07-01

    In recent years, it has been shown that Lotka-Volterra mappings constitute a valuable tool from both the theoretical and the applied points of view, with developments in very diverse fields such as physics, population dynamics, chemistry and economy. The purpose of this work is to demonstrate that many of the most important ideas and algebraic methods that constitute the basis of the quasipolynomial formalism (originally conceived for the analysis of ordinary differential equations) can be extended into the mapping domain. The extension of the formalism into the discrete-time context is remarkable as far as the quasipolynomial methodology had never been shown to be applicable beyond the differential case. It will be demonstrated that Lotka-Volterra mappings play a central role in the quasipolynomial formalism for the discrete-time case. Moreover, the extension of the formalism into the discrete-time domain allows a significant generalization of Lotka-Volterra mappings as well as a whole transfer of algebraic methods into the discrete-time context. The result is a novel and more general conceptual framework for the understanding of Lotka-Volterra mappings as well as a new range of possibilities that become open not only for the theoretical analysis of Lotka-Volterra mappings and their generalizations, but also for the development of new applications.
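
    For reference, Lotka-Volterra mappings are commonly written in the exponential form below; this is our rendering, and the paper's notation and conventions may differ.

    ```latex
    % Exponential form of a Lotka-Volterra mapping (our rendering; the
    % paper's notation and conventions may differ).
    x_i(n+1) = x_i(n)\,\exp\!\Big(\lambda_i + \sum_{j=1}^{m} A_{ij}\,x_j(n)\Big),
    \qquad i = 1,\dots,m
    ```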

  11. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem proving assistant.
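
    The phrase "exhaustive state exploration" can be made concrete with a toy model checker; the sketch below is ours and is unrelated to the PVS development in the paper. It performs breadth-first search over an invented discrete model of two aircraft and checks a mutual-exclusion safety property; the model deliberately violates the property so a counterexample is found.

    ```python
    # Toy exhaustive state exploration: BFS over all reachable states.
    from collections import deque

    def step(state):
        """Yield successor states; each aircraft may advance one phase."""
        for i in range(2):
            phases = list(state)
            if phases[i] < 3:             # phases: 0=enroute .. 3=landed
                phases[i] += 1
                yield tuple(phases)

    def safe(state):
        # Safety: both aircraft must not be on final approach (phase 2).
        return not (state[0] == 2 and state[1] == 2)

    def check(initial):
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            if not safe(s):
                return s                  # counterexample state
            for t in step(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return None                       # property holds in all states

    print(check((0, 0)))  # prints (2, 2): the toy model violates safety
    ```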

  12. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    NASA Astrophysics Data System (ADS)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

    We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and an access tool for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically, and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model, and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.

  13. Modeling Criminal Activity in Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona

    Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.

  14. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  15. Promoting Argumentation in Primary Science Contexts: An Analysis of Students' Interactions in Formal and Informal Learning Environments

    ERIC Educational Resources Information Center

    Simon, S.; Johnson, S; Cavell, S.; Parsons, T.

    2012-01-01

    The paper reports on the outcomes of a study that utilized a graphical tool, Digalo, to stimulate argumentative interactions in both school and informal learning settings. Digalo was developed in a European study to explore argumentation in a range of learning environments. The focus here is on the potential for using Digalo in promoting…

  16. Identification of physical habitats limiting the production of coho salmon in western Oregon and Washington.

    Treesearch

    G.H. Reeves; F.H. Everest; T.E. Nickelson

    1989-01-01

    Fishery managers are currently spending millions of dollars per year on habitat enhancement for anadromous salmonids but often do not have the tools needed to ensure success. An analysis of factors limiting production of salmonids in streams must be completed before any habitat-enhancement program is begun. This paper outlines the first formal procedure for identifying...

  17. Irena : tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
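
    Irena itself runs inside Igor Pro, but the Guinier model it fits is easy to sketch generically. The Python example below is ours: it fits I(q) = I0 exp(-(q Rg)^2 / 3) to synthetic data; all values are hypothetical.

    ```python
    # Generic Guinier fit on synthetic small-angle scattering data.
    import numpy as np
    from scipy.optimize import curve_fit

    def guinier(q, i0, rg):
        return i0 * np.exp(-(q * rg) ** 2 / 3.0)

    rng = np.random.default_rng(0)
    q = np.linspace(0.005, 0.05, 40)        # 1/Angstrom; keeps q*Rg small
    true_i0, true_rg = 100.0, 25.0          # hypothetical ground truth
    intensity = guinier(q, true_i0, true_rg) \
        * (1 + 0.02 * rng.standard_normal(q.size))   # 2% noise

    (i0_fit, rg_fit), _ = curve_fit(guinier, q, intensity, p0=(50.0, 10.0))
    print(f"I0 = {i0_fit:.1f}, Rg = {rg_fit:.1f} A")
    ```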

  18. Peer assisted learning as a formal instructional tool.

    PubMed

    Naqi, Syed Asghar

    2014-03-01

    To explore the utility of peer assisted learning (PAL) in medical schools as a formal instructional tool. Grounded theory approach. King Edward Medical University, Lahore, from July 2011 to December 2011. A study was designed using semi-structured in-depth interviews to collect data from final year medical students (n=6), residents (n=4) and faculty members (n=3), selected on the basis of non-probability purposive sampling. The qualitative data thus generated was first translated into English, transcribed, and organized into major categories by using a coding framework. Participants were interviewed two more times to further explore their perceptions and experiences related to emergent categories. An iterative process was employed using grounded theory analysis technique to eventually generate theory. PAL was perceived as rewarding in terms of fostering higher-order thinking, effective teaching skills and improving self-efficacy among learners. PAL can offer learning opportunities to medical students, residents and faculty members. It can improve the depth of their knowledge and skills.

  19. An analysis on intersectional collaboration on non-communicable chronic disease prevention and control in China: a cross-sectional survey on main officials of community health service institutions.

    PubMed

    Li, Xing-Ming; Rasooly, Alon; Peng, Bo; Wang, Jian; Xiong, Shu-Yu

    2017-11-10

    Our study aimed to design a tool for evaluating intersectional collaboration on non-communicable chronic disease (NCD) prevention and control, and to understand the current status of intersectional collaboration in community health service institutions in China. We surveyed 444 main officials of community health service institutions in the Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 using a questionnaire. A model of collaboration measurement, including the four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of the evaluation scale in NCD management procedures across community healthcare institutions and other ones. Reliability and validity of the evaluation tool on inter-organizational collaboration on NCD prevention and control were verified. The test of the tool evaluating inter-organizational collaboration in community NCD management revealed good reliability and validity (Cronbach's alpha = 0.89, split-half reliability = 0.84, variance contribution rate of the extracted principal component = 49.70%). The results for inter-organizational collaboration of different departments and management segments showed statistically significant differences in the formalization dimension for physical examination (p = 0.01). There were statistically significant differences in the governance dimension, the formalization dimension and the total score of the collaboration scale for the health record sector (p = 0.01, 0.00, 0.00). Statistical differences were found in the formalization dimension for the exercise and nutrition health education segment (p = 0.01). There were no statistically significant differences in the formalization dimension of medication guidance for psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). A multi-department collaboration mechanism for NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals are more active in participating in community NCD management, with better collaboration scores, whereas the CDC shows relatively poor collaboration in China. Xing-Ming Li and Alon Rasooly contributed equally to the paper and are listed as joint first authors.

  20. Non-Formal Education: Interest in Human Capital

    ERIC Educational Resources Information Center

    Ivanova, I. V.

    2016-01-01

    We define non-formal education as a part of general education, which gives students the required tools for cognition and creativity. It allows them to fully realize their self-potential and to set their own professional and personal goals. In this article, we outline the fundamental differences between general and non-formal education from the…

  1. An Educational Development Tool Based on Principles of Formal Ontology

    ERIC Educational Resources Information Center

    Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter

    2005-01-01

    Computer science provides virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, allowing one to develop interactive educational…

  2. Cognitive mapping tools: review and risk management needs.

    PubMed

    Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor

    2012-08-01

    Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.

  3. A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Grando, M Adela; Glasspool, David; Fox, John

    2012-01-01

    To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN-based pattern P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic, but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results to a comparable previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
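
    To make the Petri net machinery concrete, the sketch below (ours, not from the paper) implements the basic firing rule and a toy "parallel split" workflow pattern; place and transition names are invented.

    ```python
    # Minimal Petri net: places hold tokens, a transition fires when all
    # its input places are marked, consuming and producing tokens.
    marking = {"start": 1, "a": 0, "b": 0}

    transitions = {
        # name: (input places, output places)
        "split": (["start"], ["a", "b"]),   # parallel-split pattern
    }

    def enabled(t, marking):
        ins, _ = transitions[t]
        return all(marking[p] >= 1 for p in ins)

    def fire(t, marking):
        ins, outs = transitions[t]
        if not enabled(t, marking):
            raise ValueError(f"{t} not enabled")
        for p in ins:
            marking[p] -= 1
        for p in outs:
            marking[p] += 1

    fire("split", marking)
    print(marking)  # {'start': 0, 'a': 1, 'b': 1}
    ```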

  4. Initial implementation of a comparative data analysis ontology.

    PubMed

    Prosdocimi, Francisco; Chisham, Brandon; Pontelli, Enrico; Thompson, Julie D; Stoltzfus, Arlin

    2009-07-03

    Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: "Operational Taxonomic Units" (OTUs), representing the entities to be compared; "character-state data" representing the observations compared among OTUs; "phylogenetic tree", representing the historical path of evolution among the entities; and "transitions", the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research.

  5. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  6. The Second NASA Formal Methods Workshop 1992

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)

    1992-01-01

    The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.

  7. Formal methods technology transfer: Some lessons learned

    NASA Technical Reports Server (NTRS)

    Hamilton, David

    1992-01-01

    IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.

  8. From Informal Safety-Critical Requirements to Property-Driven Formal Validation

    NASA Technical Reports Server (NTRS)

    Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano

    2008-01-01

    Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in natural language processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics) and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, combining several complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling fault-tree analysis related to particular fault models; and verifying whether the specification is realizable.
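
    The first analysis listed, consistency checking, has a simple propositional analogue. The sketch below (ours, far simpler than the verification machinery the paper relies on) searches for a state satisfying three invented requirements by truth-table enumeration.

    ```python
    # Toy consistency check: requirements as propositional constraints,
    # checked by enumerating all assignments. Requirements are invented.
    from itertools import product

    VARS = ("brake", "throttle", "alarm")

    requirements = [
        lambda v: not (v["brake"] and v["throttle"]),   # R1: never both
        lambda v: v["alarm"] or not v["brake"],         # R2: braking sounds alarm
        lambda v: v["throttle"] or v["brake"],          # R3: always actuating
    ]

    def consistent(reqs):
        for values in product([False, True], repeat=len(VARS)):
            v = dict(zip(VARS, values))
            if all(r(v) for r in reqs):
                return v        # witness: the requirements are consistent
        return None             # inconsistent: no state satisfies them all

    print(consistent(requirements))
    ```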

  9. Analysis of Phase-Type Stochastic Petri Nets With Discrete and Continuous Timing

    NASA Technical Reports Server (NTRS)

    Jones, Robert L.; Goode, Plesent W. (Technical Monitor)

    2000-01-01

    The Petri net formalism is useful in studying many discrete-state, discrete-event systems exhibiting concurrency, synchronization, and other complex behavior. As a bipartite graph, the net can conveniently capture salient aspects of the system. As a mathematical tool, the net can specify an analyzable state space. Indeed, one can reason about certain qualitative properties (from state occupancies) and how they arise (the sequence of events leading there). By introducing deterministic or random delays, the model is forced to sojourn in states for some amount of time, giving rise to an underlying stochastic process, one that can be specified in a compact way and is capable of providing quantitative, probabilistic measures. We formalize a new non-Markovian extension to the Petri net that captures both discrete and continuous timing in the same model. The approach affords efficient stationary analysis in most cases and efficient transient analysis under certain restrictions. Moreover, this new formalism has an added benefit in modeling fidelity, stemming from the simultaneous capture of discrete- and continuous-time events (as opposed to capturing only one and approximating the other). We show how the underlying stochastic process, which is non-Markovian, can be resolved into simpler Markovian problems that enjoy efficient solutions. Solution algorithms are provided that can be easily programmed.

  10. Research Directions in Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    This report summarizes a survey of published research in real-time systems. Material is presented that provides an overview of the topic, focusing on...communications protocols and scheduling techniques. It is noted that real-time systems deserve special attention separate from other areas because of...formal tools for design and analysis of real-time systems. The early work on applications as well as notable theoretical advances are summarized

  11. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
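
    A minimal flavor of learning defects by example at line granularity: the sketch below is ours, not the system described, and trains a bag-of-tokens classifier on a tiny invented set of defective and clean lines; real use would need far more data and richer features.

    ```python
    # Hedged sketch: line-level defect classification by example.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    lines = [
        "if (x = 0) { reset(); }",                # assignment in condition
        "if (x == 0) { reset(); }",
        "strcpy(buf, input);",                    # unbounded copy
        "strncpy(buf, input, sizeof(buf));",
    ]
    labels = [1, 0, 1, 0]                         # 1 = defective line

    vec = CountVectorizer(token_pattern=r"\S+")   # keep punctuation tokens
    X = vec.fit_transform(lines)
    clf = LogisticRegression().fit(X, labels)

    candidate = ["while (p = next(p)) { }"]       # '=' inside a condition
    print(clf.predict_proba(vec.transform(candidate))[0][1])  # P(defect)
    ```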

  12. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou...requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements...methods and tools to support the integration of safety into the design solution. 2.1 SysML for Complex Engineered Systems: Traditional methods and tools

  13. Modified subaperture tool influence functions of a flat-pitch polisher with reverse-calculated material removal rate.

    PubMed

    Dong, Zhichao; Cheng, Haobo; Tam, Hon-Yuen

    2014-04-10

    Numerical simulation of subaperture tool influence functions (TIFs) is widely known as a critical procedure in computer-controlled optical surfacing. However, it may lack practicability in engineering because the emulation TIF (e-TIF) shows some discrepancy with the practical TIF (p-TIF), and the removal rate cannot be predicted by simulation. Prior to the polishing of a formal workpiece, opticians have to conduct TIF spot experiments on another sample to confirm the p-TIF with a quantitative removal rate, which is difficult and time-consuming for sequential polishing runs with different tools. This work is dedicated to applying these e-TIFs in practical engineering by making improvements in two respects: (1) it modifies the pressure distribution model of a flat-pitch polisher by finite element analysis and least-squares fitting to bring the removal shape of e-TIFs closer to p-TIFs (less than 5% relative deviation, validated by experiments); (2) it predicts the removal rate of e-TIFs by reverse-calculating the material removal volume of a pre-polishing run on the formal workpiece (relative deviations of peak and volume removal rate validated to be less than 5%). This can eliminate TIF spot experiments for the particular flat-pitch tool employed and promote the direct use of e-TIFs in the optimization of a dwell time map, which can largely save cost and increase fabrication efficiency.
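
    For background, computer-controlled optical surfacing conventionally relates removal to pressure and speed through Preston's law, and to dwell time through a convolution with the TIF; the rendering below is the standard textbook form, not formulas taken from the paper.

    ```latex
    % Standard CCOS background relations (not formulas from the paper):
    % Preston's law for the local removal rate, and the total removal R
    % as the convolution of the TIF with the dwell-time map D, where p
    % is contact pressure, v relative speed, k_p Preston's coefficient.
    \frac{\mathrm{d}z}{\mathrm{d}t} = k_{p}\, p(x,y)\, v(x,y),
    \qquad
    R(x,y) = \mathrm{TIF}(x,y) \ast D(x,y)
    ```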

  14. Exploring Formalized Elite Coach Mentoring Programmes in the UK: 'We've Had to Play the Game'

    ERIC Educational Resources Information Center

    Sawiuk, Rebecca; Taylor, William G.; Groom, Ryan

    2018-01-01

    Formalized mentoring programmes have been implemented increasingly by UK sporting institutions as a central coach development tool, yet claims supporting formal mentoring as an effective learning strategy are often speculative, scarce, ill-defined and accepted without verification. The aim of this study, therefore, was to explore some of the…

  15. Toward a mathematical formalism of performance, task difficulty, and activation

    NASA Technical Reports Server (NTRS)

    Samaras, George M.

    1988-01-01

    The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.

  16. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help users understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  17. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    PubMed

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
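
    A hedged sketch of the end product of such a pipeline: a quality indicator rendered as a SPARQL query over RDF patient data, here via Python's rdflib. The vocabulary and the indicator are invented, and this is not CLIF's or ArchMS's actual output.

    ```python
    # Invented quality indicator over a tiny in-memory RDF graph:
    # fraction of patients with an HbA1c test recorded.
    from rdflib import Graph, Namespace, Literal, RDF

    EX = Namespace("http://example.org/")
    g = Graph()
    g.add((EX.p1, RDF.type, EX.Patient))
    g.add((EX.p1, EX.hasHbA1cTest, Literal(True)))
    g.add((EX.p2, RDF.type, EX.Patient))
    g.add((EX.p2, EX.hasHbA1cTest, Literal(False)))

    query = """
    PREFIX ex: <http://example.org/>
    SELECT ?p WHERE { ?p a ex:Patient ; ex:hasHbA1cTest true . }
    """
    numerator = len(list(g.query(query)))
    denominator = len(list(g.subjects(RDF.type, EX.Patient)))
    print(f"indicator = {numerator}/{denominator}")
    ```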

  18. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
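
    The flavor of such a transformation can be suggested with a toy example. The sketch below is an illustration under invented assumptions (a one-pattern restricted grammar and a dictionary-encoded state machine); it is not the authors' method, which produces a provably equivalent formal model rather than an ad hoc table.

      # Illustrative sketch: compiling scenarios written in a restricted
      # "when EVENT in STATE go to STATE" form into a transition table that a
      # code generator or model checker could consume downstream.
      import re

      scenarios = [
          "when power_on in OFF go to IDLE",
          "when start in IDLE go to ACTIVE",
          "when fault in ACTIVE go to SAFE",
      ]

      rule = re.compile(r"when (\w+) in (\w+) go to (\w+)")
      transitions = {}
      for s in scenarios:
          event, src, dst = rule.match(s).groups()
          # Reject scenario sets that give one state/event pair two targets.
          assert (src, event) not in transitions, "conflicting scenarios"
          transitions[(src, event)] = dst

      print(transitions[("IDLE", "start")])   # -> ACTIVE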

  19. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  20. Helping System Engineers Bridge the Peaks

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen

    2014-01-01

    In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.

  1. Rewriting Logic Semantics of a Plan Execution Language

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar A.; Rocha, Camilo

    2009-01-01

    The Plan Execution Interchange Language (PLEXIL) is a synchronous language developed by NASA to support autonomous spacecraft operations. In this paper, we propose a rewriting logic semantics of PLEXIL in Maude, a high-performance logical engine. The rewriting logic semantics is by itself a formal interpreter of the language and can be used as a semantic benchmark for the implementation of PLEXIL executives. The implementation in Maude has the additional benefit of making available to PLEXIL designers and developers all the formal analysis and verification tools provided by Maude. The formalization of the PLEXIL semantics in rewriting logic poses an interesting challenge due to the synchronous nature of the language and the prioritized rules defining its semantics. To overcome this difficulty, we propose a general procedure for simulating synchronous set relations in rewriting logic that is sound and, for deterministic relations, complete. We also report on the finding of two issues at the design level of the original PLEXIL semantics that were identified with the help of the executable specification in Maude.
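
    The synchronous-step idea can be suggested in a few lines of Python (Maude itself is outside the scope of this sketch). This is a toy rendering under invented rule and state names, not the paper's sound-and-complete simulation procedure: every enabled rule reads the same pre-state, and priority decides which rule gets to write a contested variable.

      # Toy synchronous step: all enabled rules fire against the *current*
      # state; a lower priority number wins when two rules write one variable.
      def synchronous_step(state, rules):
          writes = {}
          for priority, guard, var, update in sorted(rules, key=lambda r: r[0]):
              if guard(state) and var not in writes:
                  writes[var] = update(state)   # updates read the old state only
          new_state = dict(state)
          new_state.update(writes)
          return new_state

      rules = [
          (1, lambda s: s["cmd"] == "go",         "node", lambda s: "EXECUTING"),
          (2, lambda s: True,                     "node", lambda s: s["node"]),
          (1, lambda s: s["node"] == "EXECUTING", "tick", lambda s: s["tick"] + 1),
      ]
      print(synchronous_step({"cmd": "go", "node": "WAITING", "tick": 0}, rules))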

  2. Practical Formal Verification of MPI and Thread Programs

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Ganesh; Kirby, Robert M.

    Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).
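
    The payoff of partial order reduction can be shown on a toy schedule space. The sketch below is a naive illustration of the underlying equivalence (schedules that differ only by swapping adjacent independent operations are the same trace), not the customized DPOR algorithms in ISP or Inspect; the operations and dependence rule are invented.

      # Count raw schedules vs. equivalence classes for a 3-operation example.
      from itertools import permutations

      ops = [("P0", "send_to_P1"), ("P0", "local"), ("P1", "recv_from_P0")]

      def dependent(a, b):
          # Hypothetical rule: same-process ops and matching send/recv pairs
          # are dependent; everything else commutes.
          return a[0] == b[0] or {a[1], b[1]} == {"send_to_P1", "recv_from_P0"}

      def valid(schedule):
          pos = {op: schedule.index(op) for op in ops}
          return (pos[ops[0]] < pos[ops[1]]          # P0 program order
                  and pos[ops[0]] < pos[ops[2]])     # recv after matching send

      def canonical(schedule):
          s, changed = list(schedule), True
          while changed:   # bubble independent neighbours into a fixed order
              changed = False
              for i in range(len(s) - 1):
                  if not dependent(s[i], s[i + 1]) and s[i] > s[i + 1]:
                      s[i], s[i + 1] = s[i + 1], s[i]
                      changed = True
          return tuple(s)

      schedules = [p for p in permutations(ops) if valid(p)]
      classes = {canonical(s) for s in schedules}
      print(len(schedules), "schedules,", len(classes), "equivalence class(es)")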

  3. Software Assurance: Five Essential Considerations for Acquisition Officials

    DTIC Science & Technology

    2007-05-01

    • address security concerns in the software development life cycle (SDLC)? • Are there formal software quality...What threat modeling process, if any, is used when designing the software? What analysis, design, and construction tools are used by your software design...commercial off-the-shelf (COTS), government off-the-shelf (GOTS), open-source, embedded, and legacy software. Attackers exploit unintentional vulnerabilities or

  4. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Introduction Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  5. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
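
    The core of the category-partition method is mechanical enough to sketch: take the cross product of the choices in each category, then discard combinations that violate declared constraints. The categories, choices, and constraint below are invented for illustration and do not reproduce the paper's test representation language.

      # Category-partition sketch: generate test frames, prune infeasible ones.
      from itertools import product

      categories = {
          "file_size":  ["empty", "small", "huge"],
          "permission": ["readable", "unreadable"],
          "format":     ["valid", "corrupt"],
      }

      def feasible(frame):
          # Declared constraint: an empty file cannot be corrupt.
          return not (frame["file_size"] == "empty" and frame["format"] == "corrupt")

      names = list(categories)
      frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
      frames = [f for f in frames if feasible(f)]
      print(len(frames), "test frames")   # 10 of the 12 raw combinations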

  6. Neurocognitive inefficacy of the strategy process.

    PubMed

    Klein, Harold E; D'Esposito, Mark

    2007-11-01

    The most widely used (and taught) protocols for strategic analysis, Strengths, Weaknesses, Opportunities, and Threats (SWOT) and Porter's (1980) Five Force Framework for industry analysis, have been found to be insufficient as stimuli for strategy creation or even as a basis for further strategy development. We approach this problem from a neurocognitive perspective. We see profound incompatibilities between the cognitive process (deductive reasoning) channeled into the collective mind of strategists within the formal planning process through its tools of strategic analysis (i.e., rational technologies) and the essentially inductive reasoning process actually needed to address ill-defined, complex strategic situations. Thus, strategic analysis protocols that may appear to be and, indeed, are entirely rational and logical are not interpretable as such at the neuronal substrate level where thinking takes place. The analytical structure (or propositional representation) of these tools results in a mental dead end, the phenomenon known in cognitive psychology as functional fixedness. The difficulty lies with the inability of the brain to make out meaningful (i.e., strategy-provoking) stimuli from the mental images (or depictive representations) generated by strategic analysis tools. We propose decreasing dependence on these tools and conducting further research employing brain imaging technology to explore complex data handling protocols with richer mental representation and greater potential for strategy creation.

  7. Using Petri Net Tools to Study Properties and Dynamics of Biological Systems

    PubMed Central

    Peleg, Mor; Rubin, Daniel; Altman, Russ B.

    2005-01-01

    Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
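
    The token-game semantics that makes Petri nets attractive for biology fits in a few lines. The following is a generic sketch (not one of the surveyed tools) on an invented enzyme-substrate binding example: a transition is enabled when its input places hold enough tokens, and firing it moves tokens from inputs to outputs.

      # Minimal place/transition Petri net: (consumes, produces) per transition.
      marking = {"E": 1, "S": 1, "ES": 0}
      transitions = {
          "bind":    ({"E": 1, "S": 1}, {"ES": 1}),
          "release": ({"ES": 1},        {"E": 1, "S": 1}),
      }

      def enabled(name):
          consumes, _ = transitions[name]
          return all(marking[p] >= n for p, n in consumes.items())

      def fire(name):
          assert enabled(name), name + " is not enabled"
          consumes, produces = transitions[name]
          for p, n in consumes.items():
              marking[p] -= n
          for p, n in produces.items():
              marking[p] += n

      fire("bind")
      print(marking)   # {'E': 0, 'S': 0, 'ES': 1}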

  8. Applications of a formal approach to decipher discrete genetic networks.

    PubMed

    Corblin, Fabien; Fanchon, Eric; Trilling, Laurent

    2010-07-20

    A growing demand for tools to assist the building and analysis of biological networks exists in systems biology. We argue that the use of a formal approach is relevant and applicable to address questions raised by biologists about such networks. The behaviour of these systems being complex, it is essential to exploit efficiently every bit of experimental information. In our approach, both the evolution rules and the partial knowledge about the structure and the behaviour of the network are formalized using a common constraint-based language. In this article our formal and declarative approach is applied to three biological applications. The software environment that we developed allows us to specifically address each application through a new class of biologically relevant queries. We show that we can describe easily and in a formal manner the partial knowledge about a genetic network. Moreover we show that this environment, based on a constraint algorithmic approach, offers a wide variety of functionalities, going beyond simple simulations, such as proof of consistency, model revision, prediction of properties, search for minimal models relatively to specified criteria. The formal approach proposed here deeply changes the way to proceed in the exploration of genetic and biochemical networks, first by avoiding the usual trial-and-error procedure, and second by placing the emphasis on sets of solutions, rather than a single solution arbitrarily chosen among many others. Lastly, the constraint approach promotes an integration of model and experimental data in a single framework.
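
    The emphasis on sets of solutions can be made concrete with a toy enumeration. The sketch below is an illustration only, far simpler than the constraint-based language in the paper: it lists every Boolean update rule for one gene with two regulators that remains consistent with two invented observations.

      # Enumerate Boolean rules g' = f(a, b) consistent with observations.
      from itertools import product

      observations = [((0, 1), 1), ((1, 1), 0)]   # ((a, b), next value of g)

      consistent = []
      for outputs in product([0, 1], repeat=4):    # all 16 functions of 2 inputs
          rule = dict(zip(product([0, 1], repeat=2), outputs))
          if all(rule[inp] == out for inp, out in observations):
              consistent.append(rule)

      print(len(consistent), "of 16 candidate rules remain")   # -> 4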

  9. Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL

    NASA Technical Reports Server (NTRS)

    Jenkins, J. Steven; Rouquette, Nicolas F.

    2012-01-01

    The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.

  10. SINEBase: a database and tool for SINE analysis.

    PubMed

    Vassetzky, Nikita S; Kramerov, Dmitri A

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis.

  11. SINEBase: a database and tool for SINE analysis

    PubMed Central

    Vassetzky, Nikita S.; Kramerov, Dmitri A.

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis. PMID:23203982

  12. Formal methods for modeling and analysis of hybrid systems

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)

    2009-01-01

    A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.
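
    A drastically simplified rendering of the idea: abstract each continuous state by the sign vector of a few chosen polynomials, so the abstraction is a finite discrete-transition system. The sketch below merely samples one simulated flow to collect transitions (the dynamics and polynomials are invented); the patented technique instead derives the abstract transitions soundly by quantifier elimination, with no simulation involved.

      # Sign-vector abstraction of a one-dimensional flow, by sampling.
      def sign(x):
          return (x > 1e-9) - (x < -1e-9)

      def abstract(x, polys):
          return tuple(sign(p(x)) for p in polys)

      polys = [lambda x: x, lambda x: x - 1.0]   # partition at x = 0 and x = 1
      flow = lambda x: -0.5 * x + 0.6            # toy mode dynamics dx/dt

      transitions, x, dt = set(), -0.5, 0.01
      state = abstract(x, polys)
      for _ in range(2000):
          x += flow(x) * dt                      # crude Euler integration
          nxt = abstract(x, polys)
          if nxt != state:
              transitions.add((state, nxt))
              state = nxt
      print(sorted(transitions))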

  13. Recent trends related to the use of formal methods in software engineering

    NASA Technical Reports Server (NTRS)

    Prehn, Soren

    1986-01-01

    An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.

  14. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    NASA Technical Reports Server (NTRS)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  16. Using task analysis to understand the Data System Operations Team

    NASA Technical Reports Server (NTRS)

    Holder, Barbara E.

    1994-01-01

    The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.

  17. Formal analysis of the surgical pathway and development of a new software tool to assist surgeons in the decision making in primary breast surgery.

    PubMed

    Catanuto, Giuseppe; Pappalardo, Francesco; Rocco, Nicola; Leotta, Marco; Ursino, Venera; Chiodini, Paolo; Buggi, Federico; Folli, Secondo; Catalano, Francesca; Nava, Maurizio B

    2016-10-01

    The increased complexity of the decisional process in breast cancer surgery is well documented. With this study we aimed to create a software tool able to assist patients and surgeons in making sound decisions. We hypothesized that the endpoints of breast cancer surgery could be addressed by combining a set of decisional drivers. We created a decision support system software tool (DSS) and an interactive decision tree. A formal analysis estimated the information gain derived from each feature in the process. We tested the DSS on 52 patients and analyzed the concordance of decisions obtained by different users and between the DSS suggestions and the actual surgery. We also tested the ability of the system to prevent post-breast-conservation deformities. The information gain revealed that patients' preferences are the root of our decision tree. Observed concordance was 0.98 when the DSS was used twice by an expert operator, and 0.88 between a newly trained operator and an expert one. The observed concordance between the DSS suggestion and the actual decision was 0.69. A significantly higher incidence of post-breast-conservation defects was reported among patients who did not follow the DSS decision (Type III of Fitoussi, N = 4; 33.3%, p = 0.004). The DSS decisions can be reproduced by operators with different experience. The concordance between suggestions and actual decisions is quite low; however, the DSS is able to prevent post-breast-conservation deformities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  19. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123

  20. Developing nursing leadership in social media.

    PubMed

    Moorley, Calvin; Chinn, Teresa

    2016-03-01

    A discussion on how nurse leaders are using social media and developing digital leadership in online communities. Social media is relatively new and how it is used by nurse leaders and nurses in a digital space is underexplored. Discussion paper. Searches used CINAHL, the Royal College of Nursing webpages, Wordpress (for blogs) and Twitter from 2000-2015. Search terms used were Nursing leadership + Nursing social media. Understanding the development and value of nursing leadership in social media is important for nurses in formal and informal (online) leadership positions. Nurses in formal leadership roles in organizations such as the National Health Service are beginning to leverage social media. Social media has the potential to become a tool for modern nurse leadership, as it is a space where you can listen on a micro level to each individual. In addition to listening, leadership can be achieved on a much larger scale through the use of social media monitoring tools and exploration of data and crowdsourcing. Through the use of data and social media listening tools nursing leaders can seek understanding and insight into a variety of issues. Social media also places nurse leaders in a visible and accessible position as role models. Social media and formal nursing leadership do not have to be at odds with each other, but can work in harmony as both formal and online leadership possess skills that are transferable. If used wisely, social media has the potential to become a tool for modern nurse leadership. © 2016 John Wiley & Sons Ltd.

  1. Stakeholder Perceptions of ICT Usage across Management Institutes

    ERIC Educational Resources Information Center

    Goyal, Ela; Purohit, Seema; Bhagat, Manju

    2013-01-01

    Information and communication technology (ICT) which includes radio, television and newer digital technology such as computers and the internet, are potentially powerful tools for extending educational opportunities, formal and non-formal, to one and all. It provides opportunities to deploy innovative teaching methodologies and interesting…

  2. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  3. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  4. Exploring Assessment Tools for Research and Evaluation in Astronomy Education and Outreach

    NASA Astrophysics Data System (ADS)

    Buxner, S. R.; Wenger, M. C.; Dokter, E. F. C.

    2011-09-01

    The ability to effectively measure knowledge, attitudes, and skills in formal and informal educational settings is an important aspect of astronomy education research and evaluation. Assessments may take the form of interviews, observations, surveys, exams, or other probes to help unpack people's understandings or beliefs. In this workshop, we discussed characteristics of a variety of tools that exist to assess understandings of different concepts in astronomy as well as attitudes towards science and science teaching; these include concept inventories, surveys, interview protocols, observation protocols, card sorting, reflection videos, and other methods currently being used in astronomy education research and EPO program evaluations. In addition, we discussed common questions in the selection of assessment tools including issues of reliability and validity, time to administer, format of implementation, analysis, and human subject concerns.

  5. In Search of Rationality: The Purposes behind the Use of Formal Analysis in Organizations.

    ERIC Educational Resources Information Center

    Langley, Ann

    1989-01-01

    Examines how formal analysis is actually practiced in 3 different organizations. Identifies 4 main groups of purposes for formal analysis and relates them to various hierarchical relationships. Formal analysis and social interaction seem inextricably linked in organizational decision-making. Different structural configurations may generate…

  6. Assessment of parametric uncertainty for groundwater reactive transport modeling

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of the formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
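
    To make the contrast concrete, the sketch below evaluates a simplified generalized log-likelihood: residuals are first decorrelated with an AR(1) term and then scaled by a heteroscedastic standard deviation before a Gaussian density is applied. All parameter values are invented, and the full Schoups-Vrugt formulation additionally accommodates skewed and heavy-tailed innovations.

      # Simplified generalized log-likelihood (AR(1) + heteroscedastic sigma).
      import math

      def generalized_loglik(residuals, predictions, phi, s0, s1):
          ll, prev = 0.0, 0.0
          for r, y in zip(residuals, predictions):
              innovation = r - phi * prev          # remove lag-1 correlation
              sigma = s0 + s1 * abs(y)             # spread grows with prediction
              ll += (-0.5 * math.log(2 * math.pi * sigma ** 2)
                     - 0.5 * (innovation / sigma) ** 2)
              prev = r
          return ll

      print(generalized_loglik([0.10, 0.20, -0.10], [1.0, 2.0, 1.5],
                               phi=0.5, s0=0.05, s1=0.10))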

  7. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.

  8. An analysis of texture, timbre, and rhythm in relation to form in Magnus Lindberg's "Gran Duo"

    NASA Astrophysics Data System (ADS)

    Wolfe, Brian Thomas

    Gran Duo (1999-2000) by Magnus Lindberg (b. 1958) is the result of a commission by Sir Simon Rattle, former conductor of the City of Birmingham (England) Symphony Orchestra, and the Royal Festival Hall to commemorate the third millennium. Composed for twenty-four woodwind and brass instruments, Gran Duo divides the two families into eight characters that serve as participants in an attentive twenty-minute conversation. The document includes biographical information to further illuminate the composition and Lindberg's writing style. The composer's use of computer-assisted composition techniques inspires an alternative structural analysis of Gran Duo. Spectral graphs provide a supplementary tool for score study, assisting with the verification of formal structural elements. A tempo chart allows the conductor to easily identify form and tempo relationships between each of the nineteen sections throughout the five-movement composition. The analysis of texture, timbre, and rhythm reveals character areas and their relation to the formal structure of the composition, which reflects a conversation between the brass and woodwinds in this setting for wind instruments.

  9. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to usability features, practical applications, and its lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be admitted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for the biometric recognition system, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
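
    The objective measurements such a tool reports typically include false accept and false reject rates as functions of a decision threshold. The sketch below computes both and locates their crossing (the equal error rate) on made-up match scores; it illustrates the generic metric, not the proposed evaluation model itself.

      # Sweep a threshold over match scores to find the equal error rate.
      genuine  = [0.91, 0.85, 0.78, 0.88, 0.60]   # same-person comparisons
      impostor = [0.30, 0.45, 0.52, 0.70, 0.15]   # different-person comparisons

      def rates(threshold):
          far = sum(s >= threshold for s in impostor) / len(impostor)
          frr = sum(s < threshold for s in genuine) / len(genuine)
          return far, frr

      best = min((abs(far - frr), t, far, frr)
                 for t in (i / 100 for i in range(101))
                 for far, frr in [rates(t)])
      print("EER threshold ~ %.2f (FAR=%.2f, FRR=%.2f)" % best[1:])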

  10. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    To present the method used to elaborate and formalize current scientific knowledge to provide physicians with tools available on the Internet that enable them to evaluate individual patient risk and give personalized preventive recommendations or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. The most current scientific evidence incorporated in a cyclical process includes several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.

  11. Formal analysis and automatic generation of user interfaces: approach, methodology, and an algorithm.

    PubMed

    Heymann, Michael; Degani, Asaf

    2007-04-01

    We present a formal approach and methodology for the analysis and generation of user interfaces, with special emphasis on human-automation interaction. A conceptual approach for modeling, analyzing, and verifying the information content of user interfaces is discussed. The proposed methodology is based on two criteria: First, the interface must be correct--that is, given the interface indications and all related information (user manuals, training material, etc.), the user must be able to successfully perform the specified tasks. Second, the interface and related information must be succinct--that is, the amount of information (mode indications, mode buttons, parameter settings, etc.) presented to the user must be reduced (abstracted) to the minimum necessary. A step-by-step procedure for generating the information content of the interface that is both correct and succinct is presented and then explained and illustrated via two examples. Every user interface is an abstract description of the underlying system. The correspondence between the abstracted information presented to the user and the underlying behavior of a given machine can be analyzed and addressed formally. The procedure for generating the information content of user interfaces can be automated, and a software tool for its implementation has been developed. Potential application areas include adaptive interface systems and customized/personalized interfaces.
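
    The correctness criterion lends itself to a mechanical check. The sketch below is a simplified rendering (on an invented machine, not the paper's procedure): if two machine states that the interface displays identically respond to the same user action with observably different results, the abstraction is inadequate.

      # Detect ambiguity introduced by an interface abstraction map.
      machine = {   # (machine_state, user_action) -> next machine_state
          ("A1", "press"): "B",
          ("A2", "press"): "C",
      }
      display = {"A1": "A", "A2": "A", "B": "B", "C": "C"}   # what the user sees

      seen = {}
      for (src, action), dst in machine.items():
          key = (display[src], action)
          if seen.setdefault(key, display[dst]) != display[dst]:
              print("ambiguous:", key, "can lead to", {seen[key], display[dst]})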

  12. Formalization of the Integral Calculus in the PVS Theorem Prover

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    2004-01-01

    The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Though fully equipped to support deduction in a very general logical framework, namely higher-order logic, the PVS prover must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science that is used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
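
    For reference, the culminating theorem in its textbook form (first part, in the epsilon-delta tradition the development follows) can be stated as:

      % Fundamental Theorem of Calculus: if f is continuous on [a,b] and F is
      % its integral from a, then F is differentiable with derivative f.
      f \in C[a,b],\quad F(x) = \int_a^x f(t)\,dt
      \;\Longrightarrow\;
      F'(x) = f(x) \quad \text{for all } x \in [a,b]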

  13. An enhanced biometric authentication scheme for telecare medicine information systems with nonce using chaotic hash function.

    PubMed

    Das, Ashok Kumar; Goswami, Adrijit

    2014-06-01

    Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that their scheme has several drawbacks: (1) an incorrect password change phase, (2) failure to preserve the user anonymity property, (3) failure to establish a secret session key between a legal user and the server, (4) failure to protect against strong replay attacks, and (5) lack of rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to remedy the security flaws found in Awasthi-Srivastava's scheme and enhance the features required of an ideal user authentication scheme. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks, including replay and man-in-the-middle attacks. Our scheme is also efficient compared to Awasthi-Srivastava's scheme.
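
    To fix intuition about the ingredients named above (one-way hashes, bitwise XOR, nonces), here is a toy registration/login exchange. It is emphatically not the proposed scheme or Awasthi-Srivastava's: all names and the message flow are invented, and SHA-256 stands in for the chaotic hash.

      # Toy masked-verifier handshake built from hash, XOR, and a nonce.
      import hashlib, os

      def h(*parts):                     # stand-in one-way hash
          return hashlib.sha256(b"|".join(parts)).digest()

      def xor(a, b):
          return bytes(x ^ y for x, y in zip(a, b))

      password, biometric, server_key = b"pw", b"template", b"k"

      # Registration: the server stores a masked verifier, not the secrets.
      verifier = xor(h(password, biometric), h(server_key))

      # Login: both sides derive the same nonce-bound proof.
      nonce = os.urandom(32)
      server_view = h(xor(verifier, h(server_key)), nonce)
      client_view = h(h(password, biometric), nonce)
      assert server_view == client_view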

  14. Security Risks: Management and Mitigation in the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.

    2004-01-01

    A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and integrating it with a Defect Detection and Prevention (DDP) risk management tool.

  15. Structured decision making: Chapter 5

    USGS Publications Warehouse

    Runge, Michael C.; Grand, James B.; Mitchell, Michael S.; Krausman, Paul R.; Cain, James W. III

    2013-01-01

    Wildlife management is a decision-focused discipline. It needs to integrate traditional wildlife science and social science to identify actions that are most likely to achieve the array of desires society has surrounding wildlife populations. Decision science, a vast field with roots in economics, operations research, and psychology, offers a rich set of tools to help wildlife managers frame, decompose, analyze, and synthesize their decisions. The nature of wildlife management as a decision science has been recognized since the inception of the field, but formal methods of decision analysis have been underused. There is tremendous potential for wildlife management to grow further through the use of formal decision analysis. First, the wildlife science and human dimensions of wildlife disciplines can be readily integrated. Second, decisions can become more efficient. Third, decision makers can communicate more clearly with stakeholders and the public. Fourth, good, intuitive wildlife managers, by explicitly examining how they make decisions, can translate their art into a science that is readily used by the next generation.

  16. Uniform, optimal signal processing of mapped deep-sequencing data.

    PubMed

    Kumar, Vibhor; Muratani, Masafumi; Rayan, Nirmala Arul; Kraus, Petra; Lufkin, Thomas; Ng, Huck Hui; Prabhakar, Shyam

    2013-07-01

    Despite their apparent diversity, many problems in the analysis of high-throughput sequencing data are merely special cases of two general problems, signal detection and signal estimation. Here we adapt formally optimal solutions from signal processing theory to analyze signals of DNA sequence reads mapped to a genome. We describe DFilter, a detection algorithm that identifies regulatory features in ChIP-seq, DNase-seq and FAIRE-seq data more accurately than assay-specific algorithms. We also describe EFilter, an estimation algorithm that accurately predicts mRNA levels from as few as 1-2 histone profiles (R ∼0.9). Notably, the presence of regulatory motifs in promoters correlates more with histone modifications than with mRNA levels, suggesting that histone profiles are more predictive of cis-regulatory mechanisms. We show by applying DFilter and EFilter to embryonic forebrain ChIP-seq data that regulatory protein identification and functional annotation are feasible despite tissue heterogeneity. The mathematical formalism underlying our tools facilitates integrative analysis of data from virtually any sequencing-based functional profile.
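
    Detection in this sense reduces to classic filtering. The sketch below runs a generic matched-filter pass over a synthetic read-coverage track (not DFilter's actual optimal filter): convolve with the expected peak shape and keep positions whose response clears a threshold.

      # Generic matched-filter peak calling on a synthetic coverage track.
      import random

      random.seed(0)
      coverage = [random.randint(0, 2) for _ in range(60)]
      for i in range(28, 33):              # implant a binding-site-like peak
          coverage[i] += 8

      kernel = [1, 2, 3, 2, 1]             # assumed peak shape
      half = len(kernel) // 2
      response = [
          sum(coverage[i + j - half] * kernel[j] for j in range(len(kernel)))
          for i in range(half, len(coverage) - half)
      ]
      threshold = 3 * sum(response) / len(response)
      peaks = [i + half for i, r in enumerate(response) if r > threshold]
      print("candidate peak positions:", peaks)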

  17. Education and Outreach with the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Lawton, Brandon L.; Eisenhamer, B.; Raddick, M. J.; Mattson, B. J.; Harris, J.

    2012-01-01

    The Virtual Observatory (VO) is an international effort to bring a large-scale electronic integration of astronomy data, tools, and services to the global community. The Virtual Astronomical Observatory (VAO) is the U.S. NSF- and NASA-funded VO effort that seeks to put efficient astronomical tools in the hands of U.S. astronomers, students, educators, and public outreach leaders. These tools will make use of data collected by the multitude of ground- and space-based missions over the previous decades. Many future missions will also be incorporated into the VAO tools when they launch. The Education and Public Outreach (E/PO) program for the VAO is led by the Space Telescope Science Institute in collaboration with the HEASARC E/PO program and Johns Hopkins University. VAO E/PO efforts seek to bring technology, real-world astronomical data, and the story of the development and infrastructure of the VAO to the general public, formal education, and informal education communities. Our E/PO efforts will be structured to provide uniform access to VAO information, enabling educational opportunities across multiple wavelengths and time-series data sets. The VAO team recognizes that many VO programs have built powerful tools for E/PO purposes, such as Microsoft's World Wide Telescope, SDSS Sky Server, Aladin, and a multitude of citizen-science tools available from Zooniverse. We are building partnerships with Microsoft, Zooniverse, and NASA's Night Sky Network to leverage the communities and tools that already exist to meet the needs of our audiences. Our formal education program is standards-based and aims to give teachers the tools to use real astronomical data to teach the STEM subjects. To determine which tools the VAO will incorporate into the formal education program, needs assessments will be conducted with educators across the U.S.

  18. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
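
    For reference, the standard upper linear fractional transformation that such tools construct, with the known interconnection matrix M partitioned conformally with the uncertainty block, is

      % Uncertain closed loop as the interconnection of a known matrix M with
      % a structured uncertainty block Delta:
      F_u(M,\Delta) = M_{22} + M_{21}\,\Delta\,(I - M_{11}\Delta)^{-1} M_{12},
      \qquad \Delta \in \boldsymbol{\Delta}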

  19. Developing an eLearning tool formalizing in YAWL the guidelines used in a transfusion medicine service.

    PubMed

    Russo, Paola; Piazza, Miriam; Leonardi, Giorgio; Roncoroni, Layla; Russo, Carlo; Spadaro, Salvatore; Quaglini, Silvana

    2012-01-01

    Blood transfusion is a complex activity subject to a high risk of potentially fatal errors. The development and application of computer-based systems could help reduce the error rate, playing a fundamental role in improving the quality of care. This poster presents an eLearning tool, currently under development, that formalizes the guidelines of the transfusion process. This system, implemented in YAWL (Yet Another Workflow Language), will be used to train personnel in order to improve the efficiency of care and to reduce errors.

  20. An object-oriented description method of EPMM process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Yang, Fan

    2017-06-01

    In order to use mature object-oriented tools and languages in software process modelling, and to bring the software process model more in line with industrial standards, it is necessary to study object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that Petri nets are primarily a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri-net-based EPMM models into object models based on object-oriented description.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, C.

    The Rocky Flats Environmental Technology Site (RFETS) has initiated a major work process improvement campaign using the tools of formalized benchmarking and streamlining. This paper provides insights into some of the process improvement activities performed at Rocky Flats from November 1995 through December 1996. It reviews the background, motivation, methodology, results, and lessons learned from this ongoing effort. The paper also presents important gains realized through process analysis and improvement including significant cost savings, productivity improvements, and an enhanced understanding of site work processes.

  2. "Transformative Looks": Practicing Citizenship through Photography

    ERIC Educational Resources Information Center

    Pereira, Sónia; Maiztegui-Oñate, Concha; Mata-Codesal, Diana

    2016-01-01

    Purpose: The article discusses the meanings of citizenship and citizenship education when formal citizenship is restricted, by exploring the potential of photography education and practice as a tool that promotes the exercise of citizenship in the context of non-formal critical adult education. By doing so, this text aims to enhance our…

  3. Improving Project Management Using Formal Models and Architectures

    NASA Technical Reports Server (NTRS)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architecture bring to project management. These emerging technologies have both great potential and challenges for improving the information available for decision-making. The presentation covers standards, tools, and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  4. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described, which provides a context in which formal verification can fit into the industrial development of Ada software.

  5. Social Argumentation in Online Synchronous Communication

    ERIC Educational Resources Information Center

    Alagoz, Esra

    2013-01-01

    The ability to argue well is a valuable skill for students in both formal and informal learning environments. While many studies have explored the argumentative practices in formal environments and some researchers have developed tools to enhance the argumentative skills, the social argumentation that is occurring in informal spaces has yet to be…

  6. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transformation (LFT) model of the transport aircraft longitudinal dynamics is developed over the flight envelope by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.
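
    The structured singular value itself is normally bounded rather than computed exactly. Below is a toy sketch of the standard upper bound, the minimum over scalings D of sigma_max(D M D^-1); it assumes a purely diagonal uncertainty structure and an invented matrix, and substitutes random search for the convex scaling optimization a production mu tool would use:

        import numpy as np

        def mu_upper_bound(M, trials=2000, seed=0):
            """Crude upper bound on mu for a diagonal uncertainty structure:
            mu(M) <= min over diagonal D > 0 of sigma_max(D M D^-1)."""
            rng = np.random.default_rng(seed)
            best = np.linalg.norm(M, 2)  # D = I gives sigma_max(M)
            for _ in range(trials):
                d = np.exp(rng.normal(size=M.shape[0]))  # positive diagonal scalings
                best = min(best, np.linalg.norm((d[:, None] * M) / d[None, :], 2))
            return best

        M = np.array([[0.5, 2.0],
                      [0.1, 0.3]])
        print(mu_upper_bound(M))  # never larger than sigma_max(M)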

  7. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    PubMed

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine deal with uncertainty and need to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines in order to recommend a treatment therapy. This research study focuses on the formalization of medical knowledge using a cognitive process, called Fuzzy Cognitive Maps (FCMs), and a semantic web approach. The FCM technique is capable of dealing with situations that include uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for modeling and knowledge integration of clinical practice guidelines. Semantic web tools were used to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined with the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings: 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results show that the suggested approach formalizes medical knowledge efficiently and gives a front-line decision on antibiotic suggestion for cystitis. Concluding, modeling medical knowledge/therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
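
    For readers unfamiliar with FCMs, the core iteration is simple: each concept's activation is squashed through a sigmoid of its own value plus the weighted influence of its neighbours. A minimal sketch (an invented three-concept map and weights, not the paper's 47-concept UTI model):

        import numpy as np

        def fcm_step(a, W, lam=1.0):
            """One FCM update: a_i <- sigmoid(a_i + sum_j W[j, i] * a_j),
            where W[j, i] is the causal weight from concept j to concept i."""
            return 1.0 / (1.0 + np.exp(-lam * (a + a @ W)))

        # Hypothetical 3-concept map: symptom -> infection -> therapy,
        # with therapy feeding back negatively on infection.
        W = np.array([[0.0,  0.8, 0.0],
                      [0.0,  0.0, 0.7],
                      [0.0, -0.4, 0.0]])
        a = np.array([1.0, 0.0, 0.0])   # activate the symptom concept
        for _ in range(10):
            a = fcm_step(a, W)
        print(np.round(a, 3))           # activations settle toward a fixed point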

  8. Screening for Alcohol Problems among 4-Year Colleges and Universities

    ERIC Educational Resources Information Center

    Winters, Ken C.; Toomey, Traci; Nelson, Toben F.; Erickson, Darin; Lenk, Kathleen; Miazga, Mark

    2011-01-01

    Objective: To assess the use of alcohol screening tools across US colleges. Participants: Directors of health services at 333 four-year colleges. Methods: An online survey was conducted regarding the use of alcohol screening tools. Schools reporting use of formal tools were further described in terms of 4 tools (AUDIT, CUGE, CAPS, and RAPS) that…

  9. Anatomy and histology as socially networked learning environments: some preliminary findings.

    PubMed

    Hafferty, Frederic W; Castellani, Brian; Hafferty, Philip K; Pawlina, Wojciech

    2013-09-01

    This was an exploratory study to better understand the "networked" life of the medical school as a learning environment. In a recent academic year, the authors gathered data during two six-week blocks of a sequential histology and anatomy course at a U.S. medical college. An eight-item questionnaire captured different dimensions of student interactions. The student cohort/network comprised 48 first-year medical students. Using social network analysis (SNA), the authors focused on (1) the initial structure and the evolution of informal class networks over time, (2) how informal class networks compare to formal in-class small-group assignments in influencing student information gathering, and (3) how peer assignment of professionalism role-model status is shaped more by informal than formal ties. In examining the latter two issues, the authors explored not only how formal group assignment persisted over time but also how it functioned to prevent the tendency for groupings based on gender or ethnicity. The study revealed an evolving dynamic between the formal small-group learning structure of the course blocks and the emergence of informal student networks. For example, whereas formal group membership did influence in-class questions and did prevent the formation of groups of like gender and ethnicity, outside-class questions and professionalism were influenced more by informal group ties, where gender and, to a much lesser extent, ethnicity influenced student information gathering. The richness of these preliminary findings suggests that SNA may be a useful tool in examining an array of medical student learning encounters.
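
    A hedged sketch of the kind of network analysis described here (hypothetical students and ties; requires the third-party networkx package) builds a directed "who asks whom" graph and compares within-group with between-group ties:

        import networkx as nx  # third-party package

        # Hypothetical information-seeking ties among six students; 'group'
        # mimics the formal small-group assignment in the course.
        group = {"s1": "A", "s2": "A", "s3": "A", "s4": "B", "s5": "B", "s6": "B"}
        G = nx.DiGraph()
        G.add_edges_from([("s1", "s2"), ("s2", "s1"), ("s3", "s1"),
                          ("s4", "s5"), ("s5", "s6"), ("s6", "s2")])

        within = sum(1 for u, v in G.edges if group[u] == group[v])
        print(f"within-group ties: {within} of {G.number_of_edges()}")
        print("in-degree (how often each student is consulted):", dict(G.in_degree()))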

  10. Concept similarity and related categories in information retrieval using formal concept analysis

    NASA Astrophysics Data System (ADS)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown to be useful, but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours, representing more general and more specific concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis, with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
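
    As a reminder of the underlying mathematics, the sketch below (a toy document-term context invented for illustration) enumerates the formal concepts of a context by intersecting object intents; the resulting (extent, intent) pairs are the lattice nodes that SearchSleuth-style interfaces navigate:

        from itertools import combinations

        # Toy formal context: documents (objects) x index terms (attributes).
        context = {
            "doc1": {"formal", "concept", "lattice"},
            "doc2": {"formal", "search"},
            "doc3": {"concept", "search"},
        }
        attributes = set().union(*context.values())

        def extent(b):
            return {g for g, attrs in context.items() if b <= attrs}

        def intent(objs):
            return attributes.copy() if not objs else set.intersection(*(context[g] for g in objs))

        # Every intersection of object intents is a closed intent, i.e. the
        # intent of a formal concept; the top intent pairs with the empty extent.
        intents = {frozenset(attributes)}
        for r in range(1, len(context) + 1):
            for objs in combinations(context, r):
                intents.add(frozenset(intent(set(objs))))

        for b in sorted(intents, key=len):
            ext = extent(set(b))
            assert intent(ext) == set(b)   # closure check
            print(sorted(ext), "|", sorted(b))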

  11. Why are Formal Methods Not Used More Widely?

    NASA Technical Reports Server (NTRS)

    Knight, John C.; DeJong, Colleen L.; Gibble, Matthew S.; Nakano, Luis G.

    1997-01-01

    Despite extensive development over many years and significant demonstrated benefits, formal methods remain poorly accepted by industrial practitioners. Many reasons have been suggested for this situation, such as claims that they extend the development cycle, that they require difficult mathematics, that inadequate tools exist, and that they are incompatible with other software packages. There is little empirical evidence that any of these reasons is valid. The research presented here addresses the question of why formal methods are not used more widely. The approach used was to develop a formal specification for a safety-critical application using several specification notations and to assess the results in a comprehensive evaluation framework. The results of the experiment suggest that there remain many impediments to the routine use of formal methods.

  12. Connecting Architecture and Implementation

    NASA Astrophysics Data System (ADS)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representations need to be continuously updated and synchronized with the system implementation. Existing approaches for architecture representation, like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs), provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc, as well as UML tools, tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  13. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. The GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol-level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques, providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians is assessing the medical value of such a framework and its related tools by using it in a practical application.

  14. Formal and Informal Registration as Marketing Tools: Do They Produce "Trapped" Executives?

    ERIC Educational Resources Information Center

    Apple, L. Eugene

    1993-01-01

    A marketing concept was applied to college registration procedures in an experiment, focusing on degree of "escalation" of effort of students who had failed twice to register in desired courses, type of registration used (formal or informal) on each of three tries, and student characteristics (time until graduation, major, gender). (MSE)

  15. The Personnel Effectiveness Grid (PEG): A New Tool for Estimating Personnel Department Effectiveness

    ERIC Educational Resources Information Center

    Petersen, Donald J.; Malone, Robert L.

    1975-01-01

    Examines the difficulties inherent in attempting a formal personnel evaluation system, the major formal methods currently used for evaluating personnel department accountabilities, some parameters that should be part of a valid evaluation program, and a model for conducting the evaluation. (Available from Office of Publications, Graduate School of…

  16. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  17. Line group techniques in description of the structural phase transitions in some superconductors

    NASA Technical Reports Server (NTRS)

    Meszaros, CS.; Balint, A.; Bankuti, J.

    1995-01-01

    The main features of the theory of line groups and their irreducible representations are briefly discussed, as well as their most important applications. A new approach to the general symmetry analysis of modulated systems is presented. It is shown that the line group formalism can be a very effective tool in the examination of structural phase transitions in high-temperature superconductors. As an example, the material YBa2Cu3O(7-x) is briefly discussed.

  18. An Independent and Coordinated Criterion for Kinematic Aircraft Maneuvers

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Hagen, George

    2014-01-01

    This paper proposes a mathematical definition of an aircraft-separation criterion for kinematic-based horizontal maneuvers. It has been formally proved that kinematic maneuvers that satisfy the new criterion are independent and coordinated for repulsiveness, i.e., the distance at the closest point of approach increases whether one or both aircraft maneuver according to the criterion. The proposed criterion is currently used in NASA's Airborne Coordinated Conflict Resolution and Detection (ACCoRD) set of tools for the design and analysis of separation assurance systems.
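
    The quantity at the heart of the repulsiveness property - the distance at closest point of approach - is easy to state computationally. The sketch below (a simple 2-D illustration with invented numbers, not ACCoRD's formally verified definitions) computes the time and distance of closest approach from a relative state:

        import numpy as np

        def cpa(s, v):
            """Time and distance at closest point of approach for relative
            position s and relative velocity v, assuming straight-line motion;
            the time is clamped to 'now or later'."""
            vv = float(np.dot(v, v))
            t = 0.0 if vv == 0.0 else max(0.0, -np.dot(s, v) / vv)
            return t, float(np.linalg.norm(s + t * v))

        s = np.array([5.0, 2.0])     # nmi, relative horizontal position
        v = np.array([-0.1, 0.02])   # nmi/s, relative velocity
        t, d = cpa(s, v)
        print(f"CPA in {t:.0f} s at {d:.2f} nmi")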

  19. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism for specializing reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, and semi-automatic code generation. The approach presented is also applicable for harmonizing different standard specifications.

  20. Safety Analysis of FMS/CTAS Interactions During Aircraft Arrivals

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1998-01-01

    This grant funded research on human-computer interaction design and analysis techniques, using future ATC environments as a testbed. The basic approach was to model the nominal behavior of both the automated and human procedures and then to apply safety analysis techniques to these models. Our previous modeling language, RSML, had been used to specify the system requirements for TCAS II for the FAA. Using the lessons learned from this experience, we designed a new modeling language that (among other things) incorporates features to assist in designing less error-prone human-computer interactions and interfaces and in detecting potential HCI problems, such as mode confusion. The new language, SpecTRM-RL, uses "intent" abstractions, based on Rasmussen's abstraction hierarchy, and includes both informal (English and graphical) specifications and formal, executable models for specifying various aspects of the system. One of the goals for our language was to highlight the system modes and mode changes to assist in identifying the potential for mode confusion. Three published papers resulted from this research. The first builds on the work of Degani on mode confusion to identify aspects of the system design that could lead to potential hazards. We defined and modeled modes differently than Degani and also defined design criteria for SpecTRM-RL models. Our design criteria include the Degani criteria but extend them to include more potential problems. In a second paper, Leveson and Palmer showed how the criteria for indirect mode transitions could be applied to a mode confusion problem found in several ASRS reports for the MD-88. In addition, we defined a visual task modeling language that can be used by system designers to model human-computer interaction. The visual models can be translated into SpecTRM-RL models, and then the SpecTRM-RL suite of analysis tools can be used to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system. We had hoped to be able to apply these modeling languages and analysis tools to a TAP air/ground trajectory negotiation scenario, but the development of the tools took more time than we anticipated.

  1. The quality management journey: the progress of health facilities in Australia.

    PubMed

    Carr, B J

    1994-12-01

    Many facilities in Australia have taken the Total Quality Management (TQM) step. The objective of this study was to examine the progress of adopted formal quality systems in health. Sixty per cent of the organizations surveyed have adopted formal systems. Of these, Deming adherents are the most common, followed by eclectic choices. Only 35% considered the transition to quality reasonably easy. No relationship between accreditation and formal quality systems was identified. The most common improvement techniques were flow charts, histograms, and cause-and-effect diagrams. Quality practitioners are happy to use several tools exceptionally well rather than have many tools at their disposal. The greatest impediment to the adoption of quality was the lack of top management support. This study did not support the view that clinicians fail to actively support quality initiatives. Total Quality Management is not a mature concept; however, Chief Executive Officers are assured that rewards will be realized over time.

  2. LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.

    PubMed

    Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat

    2009-08-01

    To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools providing enhanced support for archetypes is a clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes, and the conformance of data instances to archetypes. LinkEHR-Ed, a visual archetype editor based on this formalization with advanced processing capabilities, is developed; it supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts. LinkEHR-Ed is a useful tool for building, processing, and validating archetypes based on any reference model.

  3. Experience Using Formal Methods for Specifying a Multi-Agent System

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) are presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS, the development team decided to use formal methods to check for race conditions, deadlocks, and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes by describing an architecture of tools that would better support the future specification of agents and other concurrent systems.

  4. DRS: Derivational Reasoning System

    NASA Technical Reports Server (NTRS)

    Bose, Bhaskar

    1995-01-01

    The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phases of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.

  5. Towards an Automated Development Methodology for Dependable Systems with Application to Sensor Networks

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  6. Putting it all together: Exhumation histories from a formal combination of heat flow and a suite of thermochronometers

    USGS Publications Warehouse

    d'Alessio, M. A.; Williams, C.F.

    2007-01-01

    A suite of new techniques in thermochronometry allows analysis of the thermal history of a sample over a broad range of temperature sensitivities. New analysis tools must be developed that fully and formally integrate these techniques, allowing a single geologic interpretation of the rate and timing of exhumation and burial events consistent with all data. We integrate a thermal model of burial and exhumation, (U-Th)/He age modeling, and fission track age and length modeling. We then use a genetic algorithm to efficiently explore possible time-exhumation histories of a vertical sample profile (such as a borehole), simultaneously solving for exhumation and burial rates as well as changes in background heat flow. We formally combine all data in a rigorous statistical fashion. By parameterizing the model in terms of exhumation rather than time-temperature paths (as traditionally done in fission track modeling), we can ensure that exhumation histories result in a sedimentary basin whose thickness is consistent with the observed basin, a physically based constraint that eliminates otherwise acceptable thermal histories. We apply the technique to heat flow and thermochronometry data from the 2.1-km-deep San Andreas Fault Observatory at Depth pilot hole near the San Andreas fault, California. We find that the site experienced <1 km of exhumation or burial since the onset of San Andreas fault activity ~30 Ma.
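
    The search strategy is the easiest part to illustrate. Below is a deliberately toy genetic-algorithm sketch (an invented linear "forward model" and datum; the real method couples heat flow, (U-Th)/He, and fission-track models and searches over full time-exhumation histories) showing the select-and-mutate loop used to minimize data misfit:

        import random

        random.seed(0)
        observed_age = 10.0                     # Ma, invented datum

        def misfit(rate):
            # Toy stand-in for the coupled thermal/thermochronometer forward
            # model: faster exhumation -> younger predicted age.
            predicted_age = 20.0 - 25.0 * rate
            return (predicted_age - observed_age) ** 2

        pop = [random.uniform(0.0, 1.0) for _ in range(30)]   # rates, km/Myr
        for _ in range(50):
            pop.sort(key=misfit)                              # selection
            parents = pop[:10]
            children = [min(1.0, max(0.0, random.choice(parents) + random.gauss(0, 0.05)))
                        for _ in range(20)]                   # mutation
            pop = parents + children
        best = min(pop, key=misfit)
        print(f"best-fit exhumation rate: {best:.2f} km/Myr") # about 0.40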

  7. Formal Logic and Flowchart for Diagnosis Validity Verification and Inclusion in Clinical Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Sosa, M.; Grundel, L.; Simini, F.

    2016-04-01

    Logical reasoning has been part of medical practice since its origins. Modern medicine has incorporated information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in the Medical School prior to clinical internship, to foster medical practice. Two simple examples (acute myocardial infarction and diabetes mellitus) are given in terms of formal logic expressions and truth tables. Flowcharts of both diagnostic processes help in understanding the procedures and in validating them logically. A particularity of medical information is that it is often accompanied by “missing data”, which suggests adapting formal logic to a “three state” logic in the future. Medical education must include formal logic to understand complex protocols and best practices, which are prone to mutual interactions.
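
    A small sketch of what a truth-table treatment of a diagnostic rule looks like. The rule here is a hypothetical simplification invented for illustration, not the paper's exact criteria, and the final function shows how "missing data" could propagate in a three-valued (Kleene-style) extension:

        from itertools import product

        # Hypothetical simplification of an AMI rule: diagnose when troponin
        # is elevated AND (typical chest pain OR ischemic ECG changes).
        def ami(troponin, pain, ecg):
            return troponin and (pain or ecg)

        print("troponin pain  ecg   -> AMI")
        for t, p, e in product([False, True], repeat=3):
            print(f"{t!s:8} {p!s:5} {e!s:5} -> {ami(t, p, e)}")

        # "Three state" idea: None models missing data and propagates
        # unless the outcome is already decided.
        def and3(a, b):
            if a is False or b is False:
                return False
            if a is None or b is None:
                return None
            return True

        print(and3(True, None))  # None: undetermined while data is missing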

  8. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  9. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment in which three technologies (static analysis, runtime analysis, and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to conduct a controlled experiment comparing formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It confirmed our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  10. Growth Processes and Formal Logic. Comments on History and Mathematics Regarded as Combined Educational Tools

    ERIC Educational Resources Information Center

    Seltman, Muriel; Seltman, P. E. J.

    1978-01-01

    The authors stress the importance of bringing together the causal logic of history and the formal logic of mathematics in order to humanize mathematics and make it more accessible. An example of such treatment is given in a discussion of the centrality of Euclid and the Euclidean system to mathematics development. (MN)

  11. Telemedicine Platform Enhanced visiophony solution to operate a Robot-Companion

    NASA Astrophysics Data System (ADS)

    Simonnet, Th.; Couet, A.; Ezvan, P.; Givernaud, O.; Hillereau, P.

    Nowadays, one of the ways to reduce medical care costs is to reduce the length of patient hospitalization and reinforce home health support by formal (professional) and informal (family) caregivers. The aim is to design and operate a scalable and secure collaborative platform to handle specific tools for patients, their families and doctors.

  12. "Newbies" and "Celebrities": Detecting Social Roles in an Online Network of Teachers via Participation Patterns

    ERIC Educational Resources Information Center

    Smith Risser, H.; Bottoms, SueAnn

    2014-01-01

    The advent of social networking tools allows teachers to create online networks and share information. While some virtual networks have a formal structure and defined boundaries, many do not. These unstructured virtual networks are difficult to study because they lack defined boundaries and a formal structure governing leadership roles and the…

  13. KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.

    PubMed

    Mathew, Joseph L

    2011-04-01

    Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.

  14. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  15. A Generic Software Safety Document Generator

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Venkatesan, Ram Prasad

    2004-01-01

    Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.

  16. Designing an architectural style for Pervasive Healthcare systems.

    PubMed

    Rafe, Vahid; Hajvali, Masoumeh

    2013-04-01

    Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration; therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe architecture (pub/sub) is one of the convenient architectures to support such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, so using a proper formal language like graph transformation systems for developing these systems seems necessary. But even if software engineers use such high-level methodologies, errors may occur in the system under design. Hence, it should be investigated automatically and formally whether the model of the system satisfies all requirements or not. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.

  17. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    PubMed

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.

  18. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize its reliability and security. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool.

    Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
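
    For readers unfamiliar with the topology being verified: TMR triplicates logic and votes on the outputs. A minimal sketch (plain Python, purely illustrative; the actual insertion and verification operate on FPGA netlists) of the bitwise majority voter and the single-upset masking it provides:

        def vote(a, b, c):
            """Bitwise majority voter: each output bit follows at least
            two of the three replicas."""
            return (a & b) | (b & c) | (a & c)

        golden = 0b1011
        corrupted = golden ^ 0b0100      # single-event upset in one replica
        assert vote(golden, golden, corrupted) == golden
        print(bin(vote(golden, golden, corrupted)))  # 0b1011: the upset is masked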

  19. Concepts of formal concept analysis

    NASA Astrophysics Data System (ADS)

    Žáček, Martin; Homola, Dan; Miarka, Rostislav

    2017-07-01

    The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management, and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of "concept". Formal concept analysis explicitly formalizes the extension and intension of a concept and their mutual relationships. A distinguishing feature of FCA is the inherent integration of three components of conceptual processing of data and knowledge, namely the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.

  20. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists in considering a wider range of usage scenarios and in identifying which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Graphical Modeling Meets Systems Pharmacology.

    PubMed

    Lombardo, Rosario; Priami, Corrado

    2017-01-01

    A main source of failure in systems projects (including systems pharmacology) is poor communication and differing expectations among the stakeholders. A common, unambiguous language that is naturally comprehensible by all the players involved is a boost to success. We present bStyle, a modeling tool whose graphical language is close enough to cartoons to be a common medium for exchanging ideas and data, and at the same time formal enough to enable modeling, analysis, and dynamic simulation of a system. Data analysis and simulation integrated in the same application are fundamental to understanding the mechanisms of action of drugs: a core aspect of systems pharmacology.

  3. Current management for word finding difficulties by speech-language therapists in South African remedial schools.

    PubMed

    de Rauville, Ingrid; Chetty, Sandhya; Pahl, Jenny

    2006-01-01

    Word finding difficulties frequently found in learners with language learning difficulties (Casby, 1992) are an integral part of Speech-Language Therapists' management role when working with learning disabled children. This study investigated current management for word finding difficulties by 70 Speech-Language Therapists in South African remedial schools. A descriptive survey design using a quantitative and qualitative approach was used. A questionnaire and follow-up focus group discussion were used to collect data. Results highlighted the use of the Renfrew Word Finding Scale (Renfrew, 1972, 1995) as the most frequently used formal assessment tool. Language sample analysis and discourse analysis were the most frequently used informal assessment procedures. Formal intervention programmes were generally not used. Phonetic, phonemic or phonological cueing were the most frequently used therapeutic strategies. The authors note strengths and raise concerns about current management for word finding difficulties in South African remedial schools, particularly in terms of bilingualism. Opportunities are highlighted regarding the development of assessment and intervention measures relevant to the diverse learning disabled population in South Africa.

  4. Grammar Is a System That Characterizes Talk in Interaction

    PubMed Central

    Ginzburg, Jonathan; Poesio, Massimo

    2016-01-01

    Much of contemporary mainstream formal grammar theory is unable to provide analyses for language as it occurs in actual spoken interaction. Its analyses are developed for a cleaned up version of language which omits the disfluencies, non-sentential utterances, gestures, and many other phenomena that are ubiquitous in spoken language. Using evidence from linguistics, conversation analysis, multimodal communication, psychology, language acquisition, and neuroscience, we show these aspects of language use are rule governed in much the same way as phenomena captured by conventional grammars. Furthermore, we argue that over the past few years some of the tools required to provide a precise characterizations of such phenomena have begun to emerge in theoretical and computational linguistics; hence, there is no reason for treating them as “second class citizens” other than pre-theoretical assumptions about what should fall under the purview of grammar. Finally, we suggest that grammar formalisms covering such phenomena would provide a better foundation not just for linguistic analysis of face-to-face interaction, but also for sister disciplines, such as research on spoken dialogue systems and/or psychological work on language acquisition. PMID:28066279

  5. Detecting Mode Confusion Through Formal Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Potts, James N.

    1999-01-01

    Aircraft safety has improved steadily over the last few decades. While much of this improvement can be attributed to the introduction of advanced automation in the cockpit, the growing complexity of these systems also increases the potential for the pilots to become confused about what the automation is doing. This phenomenon, often referred to as mode confusion, has been involved in several accidents involving modern aircraft. This report describes an effort by Rockwell Collins and NASA Langley to identify potential sources of mode confusion through two complementary strategies. The first is to create a clear, executable model of the automation, connect it to a simulation of the flight deck, and use this combination to review the behavior of the automation and the man-machine interface with the designers, pilots, and experts in human factors. The second strategy is to conduct mathematical analyses of the model by translating it into a formal specification suitable for analysis with automated tools. The approach is illustrated by applying it to a hypothetical, but still realistic, example of the mode logic of a Flight Guidance System.

  6. Planetary data in education: tool development for access to the Planetary Data System

    NASA Technical Reports Server (NTRS)

    Atkinson, C. H.; Andres, P. M.; Liggett, P. K.; Lowes, L. L.; Sword, B. J.

    2003-01-01

    In this session we will describe and demonstrate the interface to the PDS access tools and functions developed for the scientific community, and discuss the potential for its utilization in K-14 formal and informal settings.

  7. Bedside Ultrasound in the Emergency Department to Detect Hydronephrosis for the Evaluation of Suspected Ureteric Colic.

    PubMed

    Shrestha, R; Shakya, R M; Khan, A A

    2016-01-01

    Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients older than 8 years with a provisional diagnosis of renal colic in whom both bedside ultrasound and formal ultrasound were performed were included. The presence of hydronephrosis in both ultrasounds, and the size and location of the ureteric stone if present in the formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, against formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5% and 85.7%, respectively. Both bedside and formal ultrasound detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p = .000). Conclusion Bedside ultrasound can potentially be used as an important tool for detecting clinically significant hydronephrosis in the emergency department when evaluating suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
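
    The reported accuracy measures all derive from a 2x2 agreement table. A short sketch of the standard formulas (the counts below are invented to be consistent with the percentages reported above for n = 111; the record does not publish its actual table):

        def diagnostic_metrics(tp, fp, fn, tn):
            """Standard 2x2 test-accuracy measures."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Hypothetical counts consistent with 90.8% / 78.3% / 85.5% / 85.7%.
        for name, value in diagnostic_metrics(tp=59, fp=10, fn=6, tn=36).items():
            print(f"{name}: {value:.1%}")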

  8. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems, developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, linear fractional transformation (LFT) models of the complex parameter-dependent system must be formulated, representing system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model, which covers the nonlinear dynamics. The robustness analysis results for the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime), over which the closed-loop system can achieve the desired performance of command tracking and failure detection, is also calculated from the robust performance analysis results.

  9. Tree-oriented interactive processing with an application to theorem-proving, appendix E

    NASA Technical Reports Server (NTRS)

    Hammerslag, David; Kamin, Samuel N.; Campbell, Roy H.

    1985-01-01

    The concept of unstructured structure editing and ted, an editor for unstructured trees, are described. Ted is used to manipulate hierarchies of information in an unrestricted manner. The tool was implemented and applied to the problem of organizing formal proofs. As a proof management tool, it maintains the validity of a proof and its constituent lemmas independently of the methods used to validate the proof. It includes an adaptable interface which may be used to invoke theorem provers and other aids to proof construction. Using ted, a user may construct, maintain, and verify formal proofs using a variety of theorem provers, proof checkers, and formatters.

  10. Formalizing procedures for operations automation, operator training and spacecraft autonomy

    NASA Technical Reports Server (NTRS)

    Lecouat, Francois; Desaintvincent, Arnaud

    1994-01-01

    The generation and validation of operations procedures is a key task of mission preparation that is quite complex and costly. This has motivated the development of software applications providing support for procedure preparation. Several such applications have been developed at MATRA MARCONI SPACE (MMS) over the last five years; they are presented in the first section of this paper. The main idea is that if procedures are represented in a formal language, they can be managed more easily with a computer tool and some automatic verifications can be performed. One difficulty is to define a formal language that is easy to use for operators and operations engineers. From the experience of the various procedure management tools developed in the last five years (including the POM, EOA, and CSS projects), MMS has derived OPSMAKER, a generic tool for procedure elaboration and validation. It has been applied to quite different types of missions: crew procedures (the PREVISE system), ground control centre management procedures (the PROCSU system), and - most relevant to the present paper - satellite operation procedures (PROCSAT, developed for CNES to support the preparation and verification of SPOT 4 operation procedures, and OPSAT for MMS telecom satellite operation procedures).

  11. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  12. Formal Lifelong E-Learning for Employability and Job Stability during Turbulent Times in Spain

    ERIC Educational Resources Information Center

    Martínez-Cerdá, Juan-Francisco; Torrent-Sellens, Joan

    2017-01-01

    In recent decades, international organizations have developed initiatives that incorporate lifelong learning as a tool to increase the employability of citizens. In this context, the goal of this research is to test the influence of formal e-learning on estimating employment status. The research made use of a sample of 595 citizens in 2007 and…

  13. Preservice Teachers' Beliefs about Using Maker Activities in Formal K-12 Educational Settings: A Multi-Institutional Study

    ERIC Educational Resources Information Center

    Jones, W. Monty; Smith, Shaunna; Cohen, Jonathan

    2017-01-01

    This qualitative study examined preservice teachers' beliefs about using maker activities in formal educational settings. Eighty-two preservice and early-career teachers at three different universities in the United States took part in one-time workshops designed to introduce them to various maker tools and activities applicable to K-12…

  14. Toward designing for trust in database automation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duez, P. P.; Jamieson, G. A.

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  15. Identifying and tracking attacks on networks: C3I displays and related technologies

    NASA Astrophysics Data System (ADS)

    Manes, Gavin W.; Dawkins, J.; Shenoi, Sujeet; Hale, John C.

    2003-09-01

    Converged network security is extremely challenging for several reasons: expanded system and technology perimeters, unexpected feature interaction, and complex interfaces all conspire to provide hackers with greater opportunities for compromising large networks. Preventive security services and architectures are essential, but in and of themselves do not eliminate all threat of compromise. Attack management systems mitigate this residual risk by facilitating incident detection, analysis and response. There is a wealth of attack detection and response tools for IP networks, but a dearth of such tools for wireless and public telephone networks. Moreover, methodologies and formalisms have yet to be identified that can yield a common model for vulnerabilities and attacks in converged networks. A comprehensive attack management system must coordinate detection tools for converged networks, derive fully-integrated attack and network models, perform vulnerability and multi-stage attack analysis, support large-scale attack visualization, and orchestrate strategic responses to cyber attacks that cross network boundaries. We present an architecture that embodies these principles for attack management. The attack management system described engages a suite of detection tools for various networking domains, feeding real-time attack data to a comprehensive modeling, analysis and visualization subsystem. The resulting early warning system not only provides network administrators with a heads-up cockpit display of their entire network, it also supports guided response and predictive capabilities for multi-stage attacks in converged networks.

  16. New tools for emergency managers: an assessment of obstacles to use and implementation.

    PubMed

    McCormick, Sabrina

    2016-04-01

    This paper focuses on the role of the formal response community's use of social media and crowdsourcing for emergency managers (EMs) in disaster planning, response and recovery in the United States. In-depth qualitative interviews with EMs on the Eastern seaboard at the local, state and federal level demonstrate that emergency management tools are in a state of transition--from formal, internally regulated tools for crisis response to an incorporation of new social media and crowdsourcing tools. The first set of findings provides insight into why many EMs are not using social media, and describes their concerns that result in fear, uncertainty and doubt. Second, this research demonstrates how internal functioning and staffing issues within these agencies present challenges. This research seeks to examine the dynamics of this transition and offer lessons for how to improve its outcomes--critical to millions of people across the United States. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  17. Logical Modeling and Dynamical Analysis of Cellular Networks

    PubMed Central

    Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine

    2016-01-01

    The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
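
    As a minimal illustration of the logical formalism, the sketch below (Python; the three-gene update rules are hypothetical, not one of the published T helper or cell-cycle models) enumerates the attractors of a small synchronous Boolean network by exhaustive simulation:

        from itertools import product

        # A toy 3-gene Boolean network under synchronous update.
        def step(state):
            a, b, c = state
            return (b and not c,   # gene A
                    a or c,        # gene B
                    not a)         # gene C

        def attractor(state):
            """Iterate until a state repeats; return the cycle, canonically rotated."""
            seen = []
            while state not in seen:
                seen.append(state)
                state = step(state)
            cyc = seen[seen.index(state):]
            j = cyc.index(min(cyc))           # same cycle counted once
            return tuple(cyc[j:] + cyc[:j])

        attractors = {attractor(s) for s in product([False, True], repeat=3)}
        for cyc in attractors:
            kind = "fixed point" if len(cyc) == 1 else f"cycle of length {len(cyc)}"
            print(kind, [tuple(int(x) for x in s) for s in cyc])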

  18. MOOC & B-Learning: Students' Barriers and Satisfaction in Formal and Non-Formal Learning Environments

    ERIC Educational Resources Information Center

    Gutiérrez-Santiuste, Elba; Gámiz-Sánchez, Vanesa-M.; Gutiérrez-Pérez, Jose

    2015-01-01

    The study presents a comparative analysis of two virtual learning formats: one non-formal through a Massive Open Online Course (MOOC) and the other formal through b-learning. We compare the communication barriers and the satisfaction perceived by the students (N = 249) by developing a qualitative analysis using semi-structured questionnaires and…

  19. Line group techniques in description of the structural phase transitions in some superconductors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meszaros, C.; Bankuti, J.; Balint, A.

    1994-12-31

    The main features of the theory of line groups and their irreducible representations are briefly discussed, as well as their most important applications. A new approach to the general symmetry analysis of modulated systems is presented. It is shown that the line group formalism can be a very effective tool in the examination of structural phase transitions in high-temperature superconductors. As an example, the material YBa2Cu3O7-x is briefly discussed.

  20. Multifractal analysis of macro- and microcerebral circulation in rats

    NASA Astrophysics Data System (ADS)

    Pavlov, Alexey N.; Sindeeva, Olga S.; Sindeev, Sergey S.; Pavlova, Olga N.; Abdurashitov, Arkady S.; Rybalova, Elena V.; Semyachkina-Glushkovskaya, Oxana V.

    2016-04-01

    Application of noninvasive optical coherent-domain methods and advanced data processing tools such as the wavelet-based multifractal formalism allows revealing effective markers of early stages of functional distortions in the dynamics of cerebral vessels. Based on experiments performed in rats we discuss a possibility to diagnose a hidden stage of the development of intracranial hemorrhage (ICH). We also consider responses of the cerebrovascular dynamics to a pharmacologically induced increase in the peripheral blood pressure. We report distinctions occurring at the levels of macro- and microcerebral circulation.

  1. Practical example of game theory application for production route selection

    NASA Astrophysics Data System (ADS)

    Olender, M.; Krenczyk, D.

    2017-08-01

    The dynamic market opens an opportunity to manufacturers, especially those from the small and medium-sized enterprise sector, that is associated with the concept of virtual organizations. The planning stage of such organizations can be supported in its decision-making tasks by tools and formalisms taken from game theory. The paper presents a model of a virtual manufacturing network, along with a practical example of a decision-making situation formulated as a two-person game, the resulting decision strategies, and an analysis of the calculation results.
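
    A minimal sketch of the kind of two-person game computation involved (Python with scipy; the payoff matrix is hypothetical): for a zero-sum game, the row player's optimal mixed strategy over production routes is found by linear programming:

        import numpy as np
        from scipy.optimize import linprog

        # Row player's payoff matrix: rows = candidate production routes,
        # columns = market scenarios (illustrative numbers only).
        A = np.array([[3.0, 1.0],
                      [0.0, 4.0]])
        m, n = A.shape

        # Variables: mixed strategy x (length m) and game value v.
        # Maximize v subject to A^T x >= v, sum(x) = 1, x >= 0.
        c = np.zeros(m + 1); c[-1] = -1.0              # minimize -v
        A_ub = np.hstack([-A.T, np.ones((n, 1))])      # v - (A^T x)_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.hstack([np.ones((1, m)), [[0.0]]])   # probabilities sum to 1
        b_eq = [1.0]
        bounds = [(0, None)] * m + [(None, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        x, v = res.x[:m], res.x[-1]
        print("mixed strategy:", x, "game value:", v)   # here x = [2/3, 1/3], v = 2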

  2. The Mediation of Tools in the Development of Formal Mathematical Concepts: The Compass and the Circle as an Example.

    ERIC Educational Resources Information Center

    Chassapis, Dimitris

    1999-01-01

    Focuses on the process by which children develop a formal mathematical concept of the circle by using various instruments to draw circles within the context of a goal-directed drawing task. Concludes that the use of the compass in circle drawing structures the circle-drawing operation in a radically different fashion than circle tracers and…

  3. Security Modeling and Correctness Proof Using Specware and Isabelle

    DTIC Science & Technology

    2008-12-01

    …although the actual proving requires substantial knowledge and experience in logical calculus. "…formal language and provides tools for proving those formulas in a logical calculus" [5]. We are demonstrating in this thesis that a specification in…

  4. RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks

    DTIC Science & Technology

    2016-10-09

    Robotic tasks are becoming increasingly complex, and with them the robotic systems. This requires new tools to manage this complexity and to … execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept…

  5. A Random Variable Approach to Nuclear Targeting and Survivability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Undem, Halvor A.

    We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
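
    The second problem's distance damage function is easy to reproduce in outline. A short sketch (Python with scipy; the range and spread parameters are hypothetical, not the paper's data) of a complementary cumulative lognormal damage function in the range variable:

        import numpy as np
        from scipy.stats import lognorm

        # Distance damage function as a complementary cumulative lognormal in range.
        # r50 is the range at which damage probability is 0.5; sigma is the
        # log-standard deviation. Both values here are hypothetical.
        r50, sigma = 1000.0, 0.3   # metres

        def damage_probability(r):
            """P(damage at range r) = 1 - F(r), with F the lognormal CDF."""
            return lognorm.sf(r, s=sigma, scale=r50)

        for r in (500.0, 1000.0, 1500.0, 2000.0):
            print(f"range {r:6.0f} m -> P(damage) = {damage_probability(r):.3f}")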

  6. Sustaining Teacher Control in a Blog-Based Personal Learning Environment

    ERIC Educational Resources Information Center

    Tomberg, Vladimir; Laanpere, Mart; Ley, Tobias; Normak, Peeter

    2013-01-01

    Various tools and services based on Web 2.0 (mainly blogs, wikis, social networking tools) are increasingly used in formal education to create personal learning environments, providing self-directed learners with more freedom, choice, and control over their learning. In such distributed and personalized learning environments, the traditional role…

  7. The Role of Education as a Tool in Transmitting Cultural Stereotypes Words (Formal's): The Case of "Kerem and Asli" Story

    ERIC Educational Resources Information Center

    Bulut, Mesut; Bars, Mehmet Emin

    2013-01-01

    Folk literature is an important educational tool for both the individual and society; it plays an important role in the transmission of culture between generations and is an important element of social culture. Folk tales, products of folk literature, are one of the major…

  8. Reliable and valid assessment of Lichtenstein hernia repair skills.

    PubMed

    Carlsen, C G; Lindorff-Larsen, K; Funch-Jensen, P; Lund, L; Charles, P; Konge, L

    2014-08-01

    Lichtenstein hernia repair is a common surgical procedure and one of the first procedures performed by a surgical trainee. However, formal assessment tools developed for this procedure are few and sparsely validated. The aim of this study was to determine the reliability and validity of an assessment tool designed to measure surgical skills in Lichtenstein hernia repair. Key issues were identified through a focus group interview. On this basis, an assessment tool with eight items was designed. Ten surgeons and surgical trainees were video recorded while performing Lichtenstein hernia repair (four experts, three intermediates, and three novices). The videos were blindly and individually assessed by three raters (surgical consultants) using the assessment tool. Based on these assessments, validity and reliability were explored. The internal consistency of the items was high (Cronbach's alpha = 0.97). The inter-rater reliability was very good, with an intra-class correlation coefficient (ICC) = 0.93. Generalizability analysis showed a coefficient above 0.8 even with one rater. The coefficient improved to 0.92 if three raters were used. One-way analysis of variance found a significant difference between the three groups, which indicates construct validity, p < 0.001. Lichtenstein hernia repair skills can be assessed blindly by a single rater in a reliable and valid fashion with the new procedure-specific assessment tool. We recommend this tool for future assessment of trainees performing Lichtenstein hernia repair to ensure that the objectives of competency-based surgical training are met.
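
    The internal-consistency statistic reported above is simple to compute. A minimal sketch (Python with numpy; the rating matrix is randomly generated, not the study's data) of Cronbach's alpha for an observations-by-items score matrix:

        import numpy as np

        def cronbach_alpha(ratings):
            """Cronbach's alpha for an (observations x items) score matrix."""
            ratings = np.asarray(ratings, dtype=float)
            k = ratings.shape[1]
            item_vars = ratings.var(axis=0, ddof=1)
            total_var = ratings.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # Hypothetical scores: 10 performances rated on 8 items (1-5 scale).
        rng = np.random.default_rng(0)
        base = rng.integers(1, 6, size=(10, 1))
        scores = np.clip(base + rng.integers(-1, 2, size=(10, 8)), 1, 5)
        print(f"alpha = {cronbach_alpha(scores):.2f}")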

  9. Modeling and performance analysis using extended fuzzy-timing Petri nets for networked virtual environments.

    PubMed

    Zhou, Y; Murata, T; Defanti, T A

    2000-01-01

    Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time, and networking features of these systems. Net-VEs place high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacking in formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE - NICE (narrative immersive constructionist/collaborative environment) - to predict the net-VE performance based on simulation, and to improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called CAVEs (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the transmission control protocol (TCP). We show a possibility analysis based on the EFTN model of the CAVE. Then, using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects, and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
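
    To make the modeling idea concrete, here is a minimal place/transition Petri net simulator (Python; the net below is a hypothetical client-server toy, and it omits the fuzzy-timing extensions of EFTN):

        from collections import Counter

        # Transitions fire when every input place holds a token.
        marking = Counter({"client_idle": 1, "server_idle": 1})
        transitions = {
            "send_update": {"in": ["client_idle"], "out": ["msg_in_transit"]},
            "receive":     {"in": ["msg_in_transit", "server_idle"],
                            "out": ["server_busy"]},
            "ack":         {"in": ["server_busy"],
                            "out": ["client_idle", "server_idle"]},
        }

        def enabled(t):
            return all(marking[p] > 0 for p in transitions[t]["in"])

        def fire(t):
            for p in transitions[t]["in"]:
                marking[p] -= 1
            for p in transitions[t]["out"]:
                marking[p] += 1

        for step in range(6):
            ready = [t for t in transitions if enabled(t)]
            if not ready:
                break
            fire(ready[0])
            print(step, ready[0], dict(marking))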

  10. MDAS: an integrated system for metabonomic data analysis.

    PubMed

    Liu, Juan; Li, Bo; Xiong, Jiang-Hui

    2009-03-01

    Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process this data to yield useful information about a biological system, e.g., the mechanism of diseases. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, some methods such as SVMs also show promise for use in metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis which can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool MDAS (Metabonomic Data Analysis System) for metabonomic data analysis which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and can then be processed with operations such as feature analysis and dimensionality reduction. The data with reduced dimensionalities can be used for training or testing through machine learning models. The system supplies proper visualization for data preprocessing, feature analysis, and classification which can be a powerful function for users to extract knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.

  11. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422

  12. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.

  13. Deep first formal concept search.

    PubMed

    Zhang, Tao; Li, Hui; Hong, Wenxue; Yuan, Xiamei; Wei, Xinyu

    2014-01-01

    The calculation of formal concepts is a very important part of the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and the difficulty of visualizing the calculation process. Building on the basic idea of depth-first search, this paper presents a visualization algorithm based on the attribute topology of the formal context. Constrained by the calculation rules, all concepts are obtained by a global visual search for formal concepts, based on the topology degenerated with fixed start and end points, without repetition or omission. This method makes the calculation of formal concepts precise and easy to operate and reflects the integrity of the algorithm, making it suitable for visualization analysis.
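
    The double-prime closure at the heart of FCA is compact enough to sketch. A naive enumeration of all formal concepts of a tiny context (Python; the context is hypothetical, and real FCA algorithms such as NextClosure avoid the exponential scan used here):

        from itertools import combinations

        # Toy formal context: objects -> attributes they possess.
        context = {
            "duck": {"flies", "swims", "lays_eggs"},
            "swan": {"flies", "swims", "lays_eggs"},
            "frog": {"swims", "lays_eggs"},
            "bat":  {"flies"},
        }
        attributes = set().union(*context.values())

        def extent(intent):
            """Objects sharing every attribute in the intent."""
            return {g for g, attrs in context.items() if intent <= attrs}

        def intent(objs):
            """Attributes common to every object in the extent."""
            return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

        # Close every attribute subset; duplicates collapse in the set.
        concepts = set()
        for r in range(len(attributes) + 1):
            for subset in combinations(sorted(attributes), r):
                ext = extent(set(subset))
                concepts.add((frozenset(ext), frozenset(intent(ext))))

        for ext, inte in sorted(concepts, key=lambda c: len(c[0])):
            print(sorted(ext), sorted(inte))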

  14. Bioastronautics Roadmap: A Risk Reduction Strategy for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The Bioastronautics Critical Path Roadmap (BCPR) is the framework used to identify and assess the risks to crews exposed to the hazardous environments of space. It guides the implementation of research strategies to prevent or reduce those risks. Although the BCPR identifies steps that must be taken to reduce the risks to health and performance that are associated with human space flight, the BCPR is not a "critical path" analysis in the strict engineering sense. The BCPR will evolve to accommodate new information and technology development and will enable NASA to conduct a formal critical path analysis in the future. As a management tool, the BCPR provides information for making informed decisions about research priorities and resource allocation. The outcome-driven nature of the BCPR makes it amenable for assessing the focus, progress and success of the Bioastronautics research and technology program. The BCPR is also a tool for communicating program priorities and progress to the research community and NASA management.

  15. The use of think-aloud and instant data analysis in evaluation research: Exemplar and lessons learned.

    PubMed

    Joe, Jonathan; Chaudhuri, Shomir; Le, Thai; Thompson, Hilaire; Demiris, George

    2015-08-01

    While health information technologies have become increasingly popular, many have not been formally tested to ascertain their usability. Traditional rigorous methods take significant amounts of time and manpower to evaluate the usability of a system. In this paper, we evaluate the use of instant data analysis (IDA) as developed by Kjeldskov et al. to perform usability testing on a tool designed for older adults and caregivers. The IDA method is attractive because it takes significantly less time and manpower than the traditional usability testing methods. In this paper we demonstrate how IDA was used to evaluate usability of a multifunctional wellness tool, discuss study results and lessons learned while using this method. We also present findings from an extension of the method which allows the grouping of similar usability problems in an efficient manner. We found that the IDA method is a quick, relatively easy approach to identifying and ranking usability issues among health information technologies. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. A matter of tradeoffs: reintroduction as a multiple objective decision

    USGS Publications Warehouse

    Converse, Sarah J.; Moore, Clinton T.; Folk, Martin J.; Runge, Michael C.

    2013-01-01

    Decision making in guidance of reintroduction efforts is made challenging by the substantial scientific uncertainty typically involved. However, a less recognized challenge is that the management objectives are often numerous and complex. Decision makers managing reintroduction efforts are often concerned with more than just how to maximize the probability of reintroduction success from a population perspective. Decision makers are also weighing other concerns such as budget limitations, public support and/or opposition, impacts on the ecosystem, and the need to consider not just a single reintroduction effort, but conservation of the entire species. Multiple objective decision analysis is a powerful tool for formal analysis of such complex decisions. We demonstrate the use of multiple objective decision analysis in the case of the Florida non-migratory whooping crane reintroduction effort. In this case, the State of Florida was considering whether to resume releases of captive-reared crane chicks into the non-migratory whooping crane population in that state. Management objectives under consideration included maximizing the probability of successful population establishment, minimizing costs, maximizing public relations benefits, maximizing the number of birds available for alternative reintroduction efforts, and maximizing learning about the demographic patterns of reintroduced whooping cranes. The State of Florida engaged in a collaborative process with their management partners, first, to evaluate and characterize important uncertainties about system behavior, and next, to formally evaluate the tradeoffs between objectives using the Simple Multi-Attribute Rating Technique (SMART). The recommendation resulting from this process, to continue releases of cranes at a moderate intensity, was adopted by the State of Florida in late 2008. Although continued releases did not receive support from the International Whooping Crane Recovery Team, this approach does provide a template for the formal, transparent consideration of multiple, potentially competing, objectives in reintroduction decision making.
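
    The tradeoff evaluation in SMART reduces to a weighted sum over criteria. A minimal sketch (Python with numpy; the weights and scores are illustrative, not those elicited in the Florida study):

        import numpy as np

        # SMART-style weighted scoring over management alternatives.
        alternatives = ["stop releases", "moderate releases", "intensive releases"]
        criteria = ["P(establishment)", "cost", "public relations", "learning"]
        weights = np.array([0.3, 0.3, 0.2, 0.2])   # must sum to 1

        # Scores rescaled to 0-100, higher = better (cost already inverted).
        scores = np.array([
            [10, 100, 20, 0],    # stop releases
            [60,  60, 70, 80],   # moderate releases
            [80,  10, 80, 100],  # intensive releases
        ])

        utility = scores @ weights
        for alt, u in sorted(zip(alternatives, utility), key=lambda p: -p[1]):
            print(f"{alt:20s} {u:6.1f}")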

  17. Who is excluded and how? An analysis of community spaces for maternal and child health in Pakistan.

    PubMed

    Aziz, Ayesha; Khan, Fazal Ali; Wood, Geof

    2015-11-25

    The maternal, newborn, and child health (MNCH) indicators of Pakistan depict the deplorable state of the poor and rural women and children. Many MNCH programmes stress the need to engage the poor in community spaces. However, caste and class based hierarchies and gendered social norms exclude the lower caste poor women from accessing healthcare. To find pathways for improving the lives of the excluded, this study considers the social system as a whole and describes the mechanisms of exclusion in the externally created formal community spaces and their interaction with the indigenous informal spaces. The study used a qualitative case study design to identify the formal and informal community spaces in three purposively selected villages of Thatta, Rajanpur, and Ghizer districts. Community perspectives were gathered by conducting 37 focus group discussions, based on participatory rural appraisal tools, with separate groups of women and men. Relevant documents of six MNCH programmes were reviewed and 25 key informant interviews were conducted with programme staff. We found that lower caste poor tenants and nomadic peasants were excluded from formal and informal spaces. The formal community spaces formed by MNCH programmes across Pakistan included fixed, small transitory, large transitory, and emerging institutional spaces. Programme guidelines mandated selection of community notables in groups/committees and used criteria that prevented registration of nomadic groups as eligible clients. The selection criteria and adverse attitude of healthcare workers, along with inadequacy of programmatic resources to sustain outreach activities also contributed to exclusion of the lower caste poor women from formal spaces. The informal community spaces were mostly gender segregated. Infrequently, MNCH information trickled down from the better-off to the lower caste poor women through transitory interactions in the informal domestic sphere. A revision of the purpose and implementation mechanisms for MNCH programmes is mandated to transform formal health spaces into sites of equitable healthcare.

  18. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's idea to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted human from consideration in system hazard analysis or have treated them rather superficially, for example, that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Training traditional birth attendants on the use of misoprostol and a blood measurement tool to prevent postpartum haemorrhage: lessons learnt from Bangladesh.

    PubMed

    Bell, Suzanne; Passano, Paige; Bohl, Daniel D; Islam, Arshadul; Prata, Ndola

    2014-03-01

    A consensus emerged in the late 1990s among leaders in global maternal health that traditional birth attendants (TBAs) should no longer be trained in delivery skills and should instead be trained as promoters of facility-based care. Many TBAs continue to be trained in places where home deliveries are the norm and the potential impacts of this training are important to understand. The primary objective of this study was to gain a more nuanced understanding of the full impact of training TBAs to use misoprostol and a blood measurement tool (mat) for the prevention of postpartum haemorrhage (PPH) at home deliveries through the perspective of those involved in the project. This qualitative study, conducted between July 2009 and July 2010 in Bangladesh, was nested within larger operations research, testing the feasibility and acceptability of scaling up community-based provision of misoprostol and a blood measurement tool for prevention of PPH. A total of 87 in-depth interviews (IDIs) were conducted with TBAs, community health workers (CHWs), managers, and government-employed family welfare visitors (FWVs) at three time points during the study. Computer-assisted thematic data analysis was conducted using ATLAS.ti (version 5.2). Four primary themes emerged during the data analysis, which all highlight changes that occurred following the training. The first theme describes the perceived direct changes linked to the two new interventions. The following three themes describe the indirect changes that interviewees perceived: strengthened linkages between TBAs and the formal healthcare system; strengthened linkages between TBAs and the communities they serve; and improved quality of services/service utilization. The data indicate that training TBAs and CHW supervisors resulted in perceived broader and more nuanced changes than simply improvements in TBAs' knowledge, attitudes, and practices. Acknowledging TBAs' important role in the community and in home deliveries and integrating them into the formal healthcare system has the potential to result in changes similar to those seen in this study.

  20. Research Costs Investigated: A Study Into the Budgets of Dutch Publicly Funded Drug-Related Research.

    PubMed

    van Asselt, Thea; Ramaekers, Bram; Corro Ramos, Isaac; Joore, Manuela; Al, Maiwenn; Lesman-Leegte, Ivonne; Postma, Maarten; Vemer, Pepijn; Feenstra, Talitha

    2018-01-01

    The costs of performing research are an important input in value of information (VOI) analyses but are difficult to assess. The aim of this study was to investigate the costs of research, serving two purposes: (1) estimating research costs for use in VOI analyses; and (2) developing a costing tool to support reviewers of grant proposals in assessing whether the proposed budget is realistic. For granted study proposals from the Netherlands Organization for Health Research and Development (ZonMw), type of study, potential cost drivers, proposed budget, and general characteristics were extracted. Regression analysis was conducted in an attempt to generate a 'predicted budget' for certain combinations of cost drivers, for implementation in the costing tool. Of 133 drug-related research grant proposals, 74 were included for complete data extraction. Because an association between cost drivers and budgets was not confirmed, we could not generate a predicted budget based on regression analysis, but only historic reference budgets given certain study characteristics. The costing tool was designed accordingly, i.e. with given selection criteria the tool returns the range of budgets in comparable studies. This range can be used in VOI analysis to estimate whether the expected net benefit of sampling will be positive to decide upon the net value of future research. The absence of association between study characteristics and budgets may indicate inconsistencies in the budgeting or granting process. Nonetheless, the tool generates useful information on historical budgets, and the option to formally relate VOI to budgets. To our knowledge, this is the first attempt at creating such a tool, which can be complemented with new studies being granted, enlarging the underlying database and keeping estimates up to date.

  1. Bioinformatic pipelines in Python with Leaf

    PubMed Central

    2013-01-01

    Background An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of a rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce the pipeline formality on top of a dynamical development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user’s Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315

  2. A Study of Technical Engineering Peer Reviews at NASA

    NASA Technical Reports Server (NTRS)

    Chao, Lawrence P.; Tumer, Irem Y.; Bell, David G.

    2003-01-01

    This report describes the state of practices of design reviews at NASA and research into what can be done to improve peer review practices. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing project formal reviews such as the Preliminary Design Review and Critical Design Review are a required part of every project and mission development. However, the technical, engineering peer reviews that support teams' work on such projects are informal, sometimes ad hoc, and inconsistent across the organization. The goal of this work is to identify best practices and lessons learned from NASA's experience, supported by academic research and methodologies to ultimately improve the process. This research has determined that the organization, composition, scope, and approach of the reviews impact their success. Failure Modes and Effects Analysis (FMEA) can identify key areas of concern before or in the reviews. Product definition tools like the Project Priority Matrix, engineering-focused Customer Value Chain Analysis (CVCA), and project or system-based Quality Function Deployment (QFD) help prioritize resources in reviews. The use of information technology and structured design methodologies can strengthen the engineering peer review process to help NASA work towards error-proofing the design process.

  3. THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...

    EPA Pesticide Factsheets

    CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process, which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions and biological inference…

  4. Performances on the CogState and Standard Neuropsychological Batteries Among HIV Patients Without Dementia

    PubMed Central

    Overton, Edgar Turner; Kauwe, John S.K.; Paul, Rob; Tashima, Karen; Tate, David F.; Patel, Pragna; Carpenter, Chuck; Patty, David; Brooks, John T.; Clifford, David B

    2013-01-01

    HIV-associated neurocognitive disorders (HAND) remain prevalent but challenging to diagnose particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. By multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements from the brief computerized tool (p<0.01). These data confirm previous correlation data with the computerized battery, yet illustrate remaining challenges for neurocognitive screening. PMID:21877204

  5. Secure anonymity-preserving password-based user authentication and session key agreement scheme for telecare medicine information systems.

    PubMed

    Sutrala, Anil Kumar; Das, Ashok Kumar; Odelu, Vanga; Wazid, Mohammad; Kumari, Saru

    2016-10-01

    Information and communication technology (ICT) has changed the entire paradigm of society. ICT facilitates people's use of medical services over the Internet, thereby reducing travel costs, hospitalization costs, and time to a great extent. Recent advancements in Telecare Medicine Information Systems (TMIS) allow users/patients to access medical services over the Internet and gain health monitoring facilities at home. Amin and Biswas recently proposed an RSA-based user authentication and session key agreement protocol usable for TMIS, which is an improvement over Giri et al.'s RSA-based user authentication scheme for TMIS. In this paper, we show that though Amin-Biswas's scheme considerably improves on the security drawbacks of Giri et al.'s scheme, it still has security weaknesses, as it suffers from attacks such as the privileged insider attack, user impersonation attack, replay attack, and offline password guessing attack. A new RSA-based user authentication scheme for TMIS is proposed, which overcomes the security pitfalls of Amin-Biswas's scheme and also preserves the user anonymity property. A careful formal security analysis is done using two widely accepted approaches: Burrows-Abadi-Needham (BAN) logic and the random oracle model. Moreover, an informal security analysis of the scheme is also done. These security analyses show the robustness of our new scheme against the various known attacks, as well as the attacks found in Amin-Biswas's scheme. The proposed scheme is also simulated using the widely accepted Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. We present a new user authentication and session key agreement scheme for TMIS, which fixes the security pitfalls found in Amin-Biswas's scheme, and we show through rigorous security analysis and the verification tool that the proposed scheme provides better security than other existing schemes. Furthermore, we present the formal security verification of our scheme using the widely accepted AVISPA tool. The high security and extra functionality features make our proposed scheme applicable to telecare medicine information systems used for e-health care medical applications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
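
    The RSA mechanics underlying such authentication schemes can be sketched in a few lines. A toy challenge-response round (Python; the textbook 61/53 key pair is deliberately tiny and unpadded RSA is insecure, so this illustrates only the arithmetic, not the proposed scheme):

        import secrets

        # Toy RSA key pair (classic textbook primes - NOT secure).
        p, q, e = 61, 53, 17
        n, phi = p * q, (p - 1) * (q - 1)
        d = pow(e, -1, phi)                      # private exponent (2753)

        # Server issues a random challenge; the client answers with an RSA
        # "signature" that only the private-key holder can produce.
        challenge = secrets.randbelow(n)
        response = pow(challenge, d, n)          # client side, uses d
        assert pow(response, e, n) == challenge  # server side, uses (e, n)
        print("challenge", challenge, "verified")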

  6. Modeling of tool path for the CNC sheet cutting machines

    NASA Astrophysics Data System (ADS)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as discrete optimization problems (generalized travelling salesman problems with additional constraints, GTSP). The formalization of some constraints for these tasks is described. For solving the GTSP we propose to use the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
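
    A simple heuristic stand-in for the tool path ordering problem (Python; the pierce-point coordinates are hypothetical, and the paper's GTSP model is exact where this nearest-neighbour pass is only approximate):

        import math

        # Order the pierce points of the cut contours by nearest neighbour.
        pierce_points = [(0, 0), (8, 1), (3, 7), (9, 6), (1, 4)]

        def tour_length(order):
            return sum(math.dist(order[i], order[i + 1]) for i in range(len(order) - 1))

        def nearest_neighbour(points, start=0):
            remaining = list(points)
            tour = [remaining.pop(start)]
            while remaining:
                nxt = min(remaining, key=lambda p: math.dist(tour[-1], p))
                remaining.remove(nxt)
                tour.append(nxt)
            return tour

        tour = nearest_neighbour(pierce_points)
        print(tour, f"length = {tour_length(tour):.2f}")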

  7. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
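
    To show what grammar-driven parsing buys the programmer, here is a tiny expression parser (Python; recursive descent rather than the report's table-driven generator, and the grammar is an illustrative arithmetic fragment):

        import re

        # Grammar: expr -> term (('+'|'-') term)*
        #          term -> factor (('*'|'/') factor)*
        #          factor -> NUMBER | '(' expr ')'
        TOKENS = re.compile(r"\s*(\d+|[-+*/()])")

        def parse(text):
            tokens = TOKENS.findall(text) + ["<eof>"]
            pos = 0

            def peek():
                return tokens[pos]

            def eat(expected=None):
                nonlocal pos
                tok = tokens[pos]
                if expected and tok != expected:
                    raise SyntaxError(f"expected {expected}, got {tok}")
                pos += 1
                return tok

            def factor():
                if peek() == "(":
                    eat("(")
                    value = expr()
                    eat(")")
                    return value
                return int(eat())

            def term():
                value = factor()
                while peek() in "*/":
                    value = value * factor() if eat() == "*" else value / factor()
                return value

            def expr():
                value = term()
                while peek() in "+-":
                    value = value + term() if eat() == "+" else value - term()
                return value

            result = expr()
            eat("<eof>")
            return result

        print(parse("2 + 3 * (4 - 1)"))   # 11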

  8. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
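
    The comparison step of the approach is easy to sketch. A minimal harness (Python; both functions below are hypothetical stand-ins for the PVSio-evaluated model and the software under test):

        import random

        # Evaluate the formal model and the implementation on random inputs
        # and check agreement to a tolerance.
        def model_output(speed, angle):
            return speed * angle / 2.0

        def implementation(speed, angle):
            return (speed * angle) * 0.5 + 1e-9   # tiny float discrepancy

        TOL = 1e-6
        random.seed(42)
        failures = 0
        for _ in range(1000):
            s, a = random.uniform(0, 500), random.uniform(-3.14, 3.14)
            if abs(model_output(s, a) - implementation(s, a)) > TOL:
                failures += 1
        print("failures:", failures)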

  9. Collaborative peer review process as an informal interprofessional learning tool: Findings from an exploratory study.

    PubMed

    Kwon, Jae Yung; Bulk, Laura Yvonne; Giannone, Zarina; Liva, Sarah; Chakraborty, Bubli; Brown, Helen

    2018-01-01

    Despite numerous studies on formal interprofessional education programes, less attention has been focused on informal interprofessional learning opportunities. To provide such an opportunity, a collaborative peer review process (CPRP) was created as part of a peer-reviewed journal. Replacing the traditional peer review process wherein two or more reviewers review the manuscript separately, the CPRP brings together students from different professions to collaboratively review a manuscript. The aim of this study was to assess whether the CPRP can be used as an informal interprofessional learning tool using an exploratory qualitative approach. Eight students from Counselling Psychology, Occupational and Physical Therapy, Nursing, and Rehabilitation Sciences were invited to participate in interprofessional focus groups. Data were analysed inductively using thematic analysis. Two key themes emerged, revealing that the CPRP created new opportunities for interprofessional learning and gave practice in negotiating feedback. The results reveal that the CPRP has the potential to be a valuable interprofessional learning tool that can also enhance reviewing and constructive feedback skills.

  10. Scriptwriting as a Tool for Learning Stylistic Variation

    ERIC Educational Resources Information Center

    Saugera, Valerie

    2011-01-01

    A film script is a useful tool for allowing students to experiment with language variation. Scripts of love stories comprise a range of language contexts, each triggering a different style on a formal-neutral-informal linguistic continuum: (1) technical cinematographic language in camera directions; (2) narrative language in exposition of scenes,…

  11. Tools and Traits for Highly Effective Science Teaching, K-8

    ERIC Educational Resources Information Center

    Vasquez, Jo Anne

    2007-01-01

    Even if the reader has little formal training or background knowledge in science, "Tools & Traits for Highly Effective Science Teaching, K-8" pulls together cognitive and educational research to present an indispensable framework for science in the elementary and middle grades. Readers will discover teaching that increases students' engagement and…

  12. ReACT!: An Interactive Educational Tool for AI Planning for Robotics

    ERIC Educational Resources Information Center

    Dogmus, Zeynep; Erdem, Esra; Patogulu, Volkan

    2015-01-01

    This paper presents ReAct!, an interactive educational tool for artificial intelligence (AI) planning for robotics. ReAct! enables students to describe robots' actions and change in dynamic domains without first having to know about the syntactic and semantic details of the underlying formalism, and to solve planning problems using…

  13. Integrating Technology into Peer Leader Responsibilities

    ERIC Educational Resources Information Center

    Johnson, Melissa L.

    2012-01-01

    Technology has become an integral part of landscape of higher education. Students are coming to college with an arsenal of technological tools at their disposal. These tools are being used for informal, everyday communication as well as for formal learning in the classroom. At the same time, higher education is experiencing an increase in peer…

  14. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    PubMed

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  15. An approach for access differentiation design in medical distributed applications built on databases.

    PubMed

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application posits new essential problems for software. Particularly protection tools, which are sufficient separately, become deficient during the integration due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools, if the object is stored as a record in DB tables. The solution of the problem should be found only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DMBS are suggested. Remote users connected via global networks are considered too.

  16. An Analysis of the Formal Features of "Reality-Based" Television Programs.

    ERIC Educational Resources Information Center

    Neapolitan, D. M.

    Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…

  17. INFORMATION: THEORY, BRAIN, AND BEHAVIOR

    PubMed Central

    Jensen, Greg; Ward, Ryan D.; Balsam, Peter D.

    2016-01-01

    In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified. PMID:24122456
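
    As a small worked example of the kind of quantity involved, the sketch below (Python with numpy; the count table is hypothetical) computes the mutual information between a stimulus and a response from joint counts:

        import numpy as np

        def mutual_information(joint):
            """Mutual information (in bits) from a joint count table."""
            p = joint / joint.sum()
            px = p.sum(axis=1, keepdims=True)
            py = p.sum(axis=0, keepdims=True)
            nz = p > 0
            return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

        # Hypothetical counts: rows = stimulus present/absent,
        # columns = response emitted/withheld.
        joint = np.array([[40, 10],
                          [ 5, 45]])
        print(f"I(stimulus; response) = {mutual_information(joint):.3f} bits")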

  18. Use of dirichlet distributions and orthogonal projection techniques for the fluctuation analysis of steady-state multivariate birth-death systems

    NASA Astrophysics Data System (ADS)

    Palombi, Filippo; Toti, Simona

    2015-05-01

    Approximate weak solutions of the Fokker-Planck equation represent a useful tool to analyze the equilibrium fluctuations of birth-death systems, as they provide a quantitative knowledge lying in between numerical simulations and exact analytic arguments. In this paper, we adapt the general mathematical formalism known as the Ritz-Galerkin method for partial differential equations to the Fokker-Planck equation with time-independent polynomial drift and diffusion coefficients on the simplex. Then, we show how the method works in two examples, namely the binary and multi-state voter models with zealots.
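
    For orientation, the generic shape of the method (standard textbook form, not the paper's specific equations) is to project the stationary Fokker-Planck operator onto a finite basis on the simplex:

```latex
% Stationary Fokker-Planck equation for the density p(x) with drift a_i
% and diffusion b_ij (generic textbook form):
0 = -\sum_i \partial_i \left[ a_i(x)\, p(x) \right]
    + \frac{1}{2} \sum_{i,j} \partial_i \partial_j \left[ b_{ij}(x)\, p(x) \right]
    \equiv \mathcal{L}[p]

% Ritz-Galerkin: expand p in a finite basis and require the residual to be
% orthogonal to every test function phi_m on the simplex Delta:
p(x) \approx \sum_k c_k \phi_k(x), \qquad
\int_{\Delta} \phi_m(x)\, \mathcal{L}\!\Big[\sum_k c_k \phi_k\Big]\, dx = 0
\;\;\Rightarrow\;\; \sum_k A_{mk}\, c_k = 0 .
```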

  19. The Zeldovich & Adhesion approximations and applications to the local universe

    NASA Astrophysics Data System (ADS)

    Hidding, Johan; van de Weygaert, Rien; Shandarin, Sergei

    2016-10-01

    The Zeldovich approximation (ZA) predicts the formation of a web of singularities. While these singularities may only exist in the most formal interpretation of the ZA, they provide a powerful tool for the analysis of initial conditions. We present a novel method to find the skeleton of the resulting cosmic web based on singularities in the primordial deformation tensor and its higher order derivatives. We show that the A3 lines predict the formation of filaments in a two-dimensional model. We continue with applications of the adhesion model to visualise structures in the local (z < 0.03) universe.
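
    For reference, the ZA mapping and the caustic condition it implies, in standard textbook form (not quoted from this paper):

```latex
% Zeldovich approximation: particles move ballistically from Lagrangian
% coordinate q with growth factor D(t) and displacement potential Phi(q):
\mathbf{x}(\mathbf{q}, t) = \mathbf{q} - D(t)\, \nabla_q \Phi(\mathbf{q})

% The density follows from the Jacobian of the mapping; lambda_i are the
% eigenvalues of the deformation tensor  partial^2 Phi / (dq_i dq_j):
\rho(\mathbf{x}, t) = \frac{\bar{\rho}}
  {\prod_{i} \left[ 1 - D(t)\, \lambda_i(\mathbf{q}) \right]}

% Caustics (the web's singularities) form where  1 - D(t) lambda_i = 0.
```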

  20. The gravity apple tree

    NASA Astrophysics Data System (ADS)

    Espinosa Aldama, Mariana

    2015-04-01

    The gravity apple tree is a genealogical tree of the gravitation theories developed during the past century. The graphic representation is full of information, such as guides to heuristic principles, names of main proponents, and dates and references for original articles (see under Supplementary Data for the graphic representation). This visual presentation and its particular classification allow a quick synthetic view of a plurality of theories, many of them well validated in the Solar System domain. Its diachronic structure organizes information in the shape of a tree, following similarities through a formal concept analysis. It can be used for educational purposes or as a tool for philosophical discussion.

  1. Spin coefficients and gauge fixing in the Newman-Penrose formalism

    NASA Astrophysics Data System (ADS)

    Nerozzi, Andrea

    2017-03-01

    Since its introduction in 1962, the Newman-Penrose formalism has been widely used in analytical and numerical studies of Einstein's equations, for example in the Teukolsky master equation or as a powerful wave-extraction tool in numerical relativity. Despite the many applications, Einstein's equations in the Newman-Penrose formalism appear complicated and not easily applicable to general studies of spacetimes, mainly because physical and gauge degrees of freedom are mixed in a nontrivial way. In this paper we approach the whole formalism with the goal of expressing the spin coefficients as functions of tetrad invariants once a particular tetrad is chosen. We show that it is possible to do so, and give for the first time a general recipe for the task, as well as an indication of the quantities and identities that are required.
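
    For orientation, three of the twelve spin coefficients in one common convention (standard definitions whose signs vary across the literature; not taken from this paper), built from the null tetrad (l, n, m, m-bar):

```latex
% With null tetrad (l, n, m, mbar) and covariant derivative nabla_b,
% three of the twelve Newman-Penrose spin coefficients (sign conventions
% differ between authors):
\kappa = -\, m^{a}\, l^{b}\, \nabla_{b}\, l_{a}, \qquad
\sigma = -\, m^{a}\, m^{b}\, \nabla_{b}\, l_{a}, \qquad
\rho   = -\, m^{a}\, \bar{m}^{b}\, \nabla_{b}\, l_{a}
```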

  2. Modeling biological pathway dynamics with timed automata.

    PubMed

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (Analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.
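
    A minimal sketch of the discretization idea described above (an illustration in Python with invented nodes and rates, not ANIMO's actual code or its Timed Automata semantics): each node carries a discrete activity level, and each interaction moves its target at a speed set by a single parameter:

```python
# Discretized, single-parameter kinetics on a toy signaling network.
MAX = 100  # activity levels are discretized to 0..100

# (source, target, effect, k): higher k means a faster reaction.
interactions = [
    ("EGF",  "ERK", +1, 0.8),   # activation
    ("DUSP", "ERK", -1, 0.5),   # inhibition
]

state = {"EGF": 100, "DUSP": 30, "ERK": 0}

def step(state, dt=1.0):
    """One synchronous update: each interaction pushes its target up or
    down at a rate proportional to the upstream node's activity."""
    new = dict(state)
    for src, tgt, effect, k in interactions:
        delta = effect * k * state[src] / MAX * dt * 10
        new[tgt] = min(MAX, max(0, new[tgt] + delta))
    return new

for t in range(5):
    print(t, {n: round(v) for n, v in state.items()})
    state = step(state)
```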

  3. Can mathematics explain the evolution of human language?

    PubMed

    Witzany, Guenther

    2011-09-01

    Investigation into the sequence structure of the genetic code by means of an informatic approach is a real success story. The features of human language are also the object of investigation within the realm of formal language theories. They focus on the common rules of a universal grammar that lies behind all languages and determine generation of syntactic structures. This universal grammar is a depiction of material reality, i.e., the hidden logical order of things and its relations determined by natural laws. Therefore mathematics is viewed not only as an appropriate tool to investigate human language and genetic code structures through computer science-based formal language theory but is itself a depiction of material reality. This confusion between language as a scientific tool to describe observations/experiences within cognitive constructed models and formal language as a direct depiction of material reality occurs not only in current approaches but was the central focus of the philosophy of science debate in the twentieth century, with rather unexpected results. This article recalls these results and their implications for more recent mathematical approaches that also attempt to explain the evolution of human language.

  4. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  5. Tacit Knowledge Capture and the Brain-Drain at Electrical Utilities

    NASA Astrophysics Data System (ADS)

    Perjanik, Nicholas Steven

    As a consequence of an aging workforce, electric utilities are at risk of losing their most experienced and knowledgeable electrical engineers. In this research, the problem was a lack of understanding of what electric utilities were doing to capture the tacit knowledge or know-how of these engineers. The purpose of this qualitative research study was to explore the tacit knowledge capture strategies currently used in the industry by conducting a case study of 7 U.S. electrical utilities that have demonstrated an industry commitment to improving operational standards. The research question addressed the implemented strategies to capture the tacit knowledge of retiring electrical engineers and technical personnel. The research methodology involved a qualitative embedded case study. The theories used in this study included knowledge creation theory, resource-based theory, and organizational learning theory. Data were collected through one time interviews of a senior electrical engineer or technician within each utility and a workforce planning or training professional within 2 of the 7 utilities. The analysis included the use of triangulation and content analysis strategies. Ten tacit knowledge capture strategies were identified: (a) formal and informal on-boarding mentorship and apprenticeship programs, (b) formal and informal off-boarding mentorship programs, (c) formal and informal training programs, (d) using lessons learned during training sessions, (e) communities of practice, (f) technology enabled tools, (g) storytelling, (h) exit interviews, (i) rehiring of retirees as consultants, and (j) knowledge risk assessments. This research contributes to social change by offering strategies to capture the know-how needed to ensure operational continuity in the delivery of safe, reliable, and sustainable power.

  6. Critical Analysis on Open Source LMSs Using FCA

    ERIC Educational Resources Information Center

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…
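
    Although the record is truncated, the FCA setup it describes is standard and easy to sketch (invented objects and attributes, not the paper's data): compute all formal concepts of a small LMS-by-feature context via the derivation operators:

```python
# Brute-force enumeration of the formal concepts of a tiny context with
# LMSs as objects and features as attributes (illustrative data only).
from itertools import combinations

objects = ["Moodle", "Sakai", "ILIAS"]
attributes = ["quizzes", "forums", "SCORM"]
incidence = {("Moodle", "quizzes"), ("Moodle", "forums"), ("Moodle", "SCORM"),
             ("Sakai", "forums"), ("ILIAS", "quizzes"), ("ILIAS", "SCORM")}

def common_attrs(objs):   # A' = attributes shared by every object in A
    return {m for m in attributes if all((g, m) in incidence for g in objs)}

def common_objs(attrs):   # B' = objects having every attribute in B
    return {g for g in objects if all((g, m) in incidence for m in attrs)}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        b = common_attrs(set(objs))
        a = common_objs(b)          # (A'', A') is always a formal concept
        concepts.add((frozenset(a), frozenset(b)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "<->", sorted(intent))
```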

  7. Agricultural land management options after the Chernobyl and Fukushima accidents: The articulation of science, technology, and society.

    PubMed

    Vandenhove, Hildegarde; Turcanu, Catrinel

    2016-10-01

    The options adopted for recovery of agricultural land after the Chernobyl and Fukushima accidents are compared by examining their technical and socio-economic aspects. The analysis highlights commonalities, such as the implementation of tillage and other types of countermeasures, and differences in approach, such as preferences for topsoil removal in Fukushima and the application of K fertilizers in Chernobyl. This analysis shows that the recovery approach needs to be context-specific to best suit the physical, social, and political environment. The complex nature of the decision problem calls for a formal process for engaging stakeholders and the development of adequate decision support tools. Integr Environ Assess Manag 2016;12:662-666. © 2016 SETAC.

  8. Modeling of Biometric Identification System Using the Colored Petri Nets

    NASA Astrophysics Data System (ADS)

    Petrosyan, G. R.; Ter-Vardanyan, L. A.; Gaboutchian, A. V.

    2015-05-01

    In this paper we present a model of a biometric identification system transformed into Petri Nets. Petri Nets, as a graphical and mathematical tool, provide a uniform environment for modelling, formal analysis, and design of discrete event systems. The main objective of this paper is to introduce the fundamental concepts of Petri Nets to researchers and practitioners who work on the modelling and analysis of biometric identification systems, as well as to those who may potentially become involved in these areas. In addition, the paper introduces high-level Petri Nets, namely Colored Petri Nets (CPN). The Colored Petri Net model describes the identification process much more simply.
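
    A toy illustration of the colored-token firing rule at the heart of such models (my sketch, not the paper's net): a transition fires only when suitably colored tokens, here user ids, are available in all of its input places:

```python
# Minimal colored-Petri-net style firing rule: tokens carry a color
# (a user id), and the transition needs matching colors in both inputs.
places = {
    "enrolled":   [("user42", "fingerprint_template")],
    "scanner_in": [("user42", "fingerprint_scan")],
    "identified": [],
}

def fire_identify(net):
    """Transition 'identify': consumes a scan whose color matches an
    enrolled template and produces a token in 'identified'."""
    for tmpl in net["enrolled"]:
        for scan in net["scanner_in"]:
            if tmpl[0] == scan[0]:          # colors (user ids) must match
                net["scanner_in"].remove(scan)
                net["identified"].append((tmpl[0], "match"))
                return True
    return False

print(fire_identify(places), places["identified"])
```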

  9. Occlusions in Camera Networks and Vision: The Bridge between Topological Recovery and Metric Reconstruction

    DTIC Science & Technology

    2009-05-18

    serves as a didactic tool to understand the information required for the approach to coordinate-free tracking and navigation problems. ... layout (left), and in the CN-Complex (right); these paths can be compared by using the algebraic topological tools covered in chapter 2. ... mathematical tools necessary to make our discussion formal; chapter 3 will present the construction of a simplicial representation called ...

  10. A New Measure of Text Formality: An Analysis of Discourse of Mao Zedong

    ERIC Educational Resources Information Center

    Li, Haiying; Graesser, Arthur C.; Conley, Mark; Cai, Zhiqiang; Pavlik, Philip I., Jr.; Pennebaker, James W.

    2016-01-01

    Formality has long been of interest in the study of discourse, with periodic discussions of the best measure of formality and the relationship between formality and text categories. In this research, we explored what features predict formality as humans perceive the construct. We categorized a corpus consisting of 1,158 discourse samples published…

  11. Expert2OWL: A Methodology for Pattern-Based Ontology Development.

    PubMed

    Tahar, Kais; Xu, Jie; Herre, Heinrich

    2017-01-01

    The formalization of expert knowledge enables a broad spectrum of applications employing ontologies as underlying technology. These include eLearning, Semantic Web and expert systems. However, the manual construction of such ontologies is time-consuming and thus expensive. Moreover, experts are often unfamiliar with the syntax and semantics of formal ontology languages such as OWL and usually have no experience in developing formal ontologies. To overcome these barriers, we developed a new method and tool, called Expert2OWL that provides efficient features to support the construction of OWL ontologies using GFO (General Formal Ontology) as a top-level ontology. This method allows a close and effective collaboration between ontologists and domain experts. Essentially, this tool integrates Excel spreadsheets as part of a pattern-based ontology development and refinement process. Expert2OWL enables us to expedite the development process and modularize the resulting ontologies. We applied this method in the field of Chinese Herbal Medicine (CHM) and used Expert2OWL to automatically generate an accurate Chinese Herbology ontology (CHO). The expressivity of CHO was tested and evaluated using ontology query languages SPARQL and DL. CHO shows promising results and can generate answers to important scientific questions such as which Chinese herbal formulas contain which substances, which substances treat which diseases, and which ones are the most frequently used in CHM.
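
    As a hedged sketch of the kind of competency question mentioned at the end (the namespace, class names, and triples below are invented, not CHO's actual vocabulary), a SPARQL query over an RDF graph can answer "which formulas contain substances that treat which diseases", here via the Python rdflib library:

```python
# Toy RDF graph and SPARQL query in the spirit of the CHO evaluation.
from rdflib import Graph, Namespace

CHM = Namespace("http://example.org/chm#")   # hypothetical namespace
g = Graph()
g.add((CHM.LiuWeiDiHuangWan, CHM.containsSubstance, CHM.Rehmannia))
g.add((CHM.Rehmannia, CHM.treats, CHM.KidneyYinDeficiency))

q = """
PREFIX chm: <http://example.org/chm#>
SELECT ?formula ?disease WHERE {
    ?formula chm:containsSubstance ?s .
    ?s chm:treats ?disease .
}"""
for row in g.query(q):
    print(row.formula, "->", row.disease)
```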

  12. Optics outreach evolves in southern California as OptoBotics begins to link informal to formal curriculum

    NASA Astrophysics Data System (ADS)

    Silberman, Donn M.

    2014-09-01

    For the July 2013 issue of SPIE Professional Magazine, I was invited to write, and published, an article related to this topic. This paper chronicles the progress made since that time and describes our direction towards bringing optics education from the informal programs we have provided for more than 10 years to incorporating optics and photonics instruction into formal class curricula. A major educational tool we are using was introduced at this conference two years ago and came to us from EYEST vzw. The Photonics Explorer Kit has been used as a foundation during some OptoBotics courses, and it has been provided, along with a teacher training session, to 10 local high school science teachers in Orange County, CA. The goal of this first phase is to obtain feedback from the teachers as they use the materials in their formal classroom settings and after-school activities, such as science classes and robotics club activities. Results of the teachers' initial feedback will be reviewed and future directions outlined. One clear direction is to understand the changes that will be required to the kits to formally gain acceptance as part of the California state high school science curriculum. Another is to use the Photonics Explorer kits (and other similar tools) to teach students in robotics clubs "how to give their robots eyes."

  13. EU Strategies of Integrating ICT into Initial Teacher Training

    ERIC Educational Resources Information Center

    Garapko, Vitaliya

    2013-01-01

    Education and learning are strongly linked with society and its evolution and knowledge. In the field of formal education, ICTs are increasingly deployed as tools to extend the learner's capacity to perceive, understand and communicate, as seen in the increase in online learning programs and the use of the computer as a learning support tool in…

  14. Experiences with a Requirements-Based Programming Approach to the Development of a NASA Autonomous Ground Control System

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Requirements-to-Design-to-Code (R2D2C) is an approach to the engineering of computer-based systems that embodies the idea of requirements-based programming in system development. It goes further, however, in that the approach offers not only an underlying formalism but full formal development from requirements capture through to the automatic generation of provably correct code. As such, the approach has direct application to the development of systems requiring autonomic properties. We describe a prototype tool to support the method, and illustrate its applicability to the development of LOGOS, a NASA autonomous ground control system which exhibits autonomic behavior. Finally, we briefly discuss other areas where the approach and prototype tool are being considered for application.

  15. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  16. Cost-Benefit Analysis of U.S. Copyright Formalities. Final Report.

    ERIC Educational Resources Information Center

    King Research, Inc., Rockville, MD.

    This study of the feasibility of conducting a cost-benefit analysis in the complex environment of the formalities used in the United States as part of its administration of the copyright law focused on the formalities of copyright notice, deposit, registration, and recordation. The U.S. system is also compared with the less centralized copyright…

  17. Genetic Design Automation: engineering fantasy or scientific renewal?

    PubMed Central

    Lux, Matthew W.; Bramlett, Brian W.; Ball, David A.; Peccoud, Jean

    2013-01-01

    Synthetic biology aims to make genetic systems more amenable to engineering, which has naturally led to the development of Computer-Aided Design (CAD) tools. Experimentalists still primarily rely on project-specific ad-hoc workflows instead of domain-specific tools, suggesting that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. PMID:22001068

  18. A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes

    NASA Astrophysics Data System (ADS)

    Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria

    In this paper we discuss the importance of ensuring that business processes are at the same time robust and agile. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through some self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in these processes. To this end, we propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post condition-post Event). This formalism allows translating a process into a graph of rules that is analyzed in terms of reliability and flexibility.
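
    A minimal sketch of an ECA-style rule in the spirit of the ECAPE formalism (the field names and the firing function are my illustrative guesses, not the authors' definition):

```python
# ECA rule extended with post-condition and post-event, plus a firing
# function that checks the post-condition after the action runs.
from dataclasses import dataclass
from typing import Callable

@dataclass
class EcapeRule:
    event: str                              # triggering event name
    condition: Callable[[dict], bool]       # guard over process state
    action: Callable[[dict], None]          # effect on process state
    post_condition: Callable[[dict], bool]  # must hold after the action
    post_event: str                         # event emitted on success

def fire(rule: EcapeRule, state: dict, event: str) -> list[str]:
    if event != rule.event or not rule.condition(state):
        return []
    rule.action(state)
    if not rule.post_condition(state):
        raise RuntimeError("post-condition violated: rule needs healing")
    return [rule.post_event]

approve = EcapeRule(
    event="order_received",
    condition=lambda s: s["amount"] < 1000,
    action=lambda s: s.update(status="approved"),
    post_condition=lambda s: s["status"] == "approved",
    post_event="order_approved",
)
print(fire(approve, {"amount": 250, "status": "new"}, "order_received"))
```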

  19. Simple methods of exploiting the underlying structure of rule-based systems

    NASA Technical Reports Server (NTRS)

    Hendler, James

    1986-01-01

    Much recent work in the field of expert systems research has aimed at exploiting the underlying structures of the rule base for reasons of analysis. Such techniques as Petri-nets and GAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and in examining the theoretical power of this analysis. In this paper we describe some early work in a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.

  20. Failure mode and effects analysis as a performance improvement tool in trauma.

    PubMed

    Day, Suzanne; Dalto, Joseph; Fox, Jolene; Turpin, Melinda

    2006-01-01

    Performance improvement (PI) in the multiple-systems-injured patient frequently highlights areas for improvement in overall hospital care processes. Failure mode and effects analysis (FMEA) is an effective tool to assess and prioritize areas of risk in clinical practice. FMEA is often initiated by a "near-miss" or concern for risk, as opposed to a root cause analysis, which is initiated solely after a sentinel event. In contrast to a root cause analysis, the FMEA looks more broadly at processes involved in the delivery of care. The purpose of this abstract was to demonstrate the usefulness of FMEA as a PI tool by describing an event and following the event through the healthcare delivery PI processes involved.

    During routine chart abstraction, a trauma registrar found that an elderly trauma patient admitted with a subdural hematoma inadvertently received heparin during the course of a dialysis treatment. Although heparin use was contraindicated in this patient, there were no sequelae as a result of the error. This case was reviewed by the trauma service PI committee and the quality improvement team, which initiated an FMEA of the inpatient dialysis process. The process included physician, nursing, and allied health representatives involved in dialysis. As part of the process, observations of dialysis treatments and staff interviews were conducted. Observation revealed that nurses generally left the patient's room and did not involve themselves in the dialysis process. A formal patient "pass-off" report was not done. Nurses did not review dialysis orders or reevaluate the treatment plan before treatment.

    We found that several areas of our current practice placed our patients at risk:
    1. The nephrology consult/dialysis communication process was inconsistent.
    2. Scheduling of treatments for chronic dialysis patients could occur without a formal consult or order.
    3. RNs were not consistently involved in dialysis scheduling, setup, or treatment.
    4. Dialysis technicians could exceed their scope of practice (taking telephone orders) when scheduling of treatment occurred before consult and written orders.

    Near-miss events may be overlooked as opportunities for improvement in cases where no harm has come to the patient. As a result of our FMEA investigation, the following recommendations were made to improve hospital care delivery for those trauma patients who require inpatient dialysis:
    1. Education of RNs about the dialysis process.
    2. Implementation of a formal reporting process between the RN and the dialysis technician before the procedure is initiated.
    3. RN supervision of dialysis treatments.
    4. Use of a preprinted inpatient dialysis form.
    5. Education of dialysis technicians regarding their scope of practice.
    6. Improved notification process for scheduling dialysis procedures between units and the dialysis coordinator (similar to x-ray scheduling).

    Our performance improvement focus has broadened to include all reported "near-miss" events in order to improve our healthcare delivery process before an event with sequelae occurs. We have found that using FMEA has greatly increased our ability to facilitate change across all services and departments within the hospital.

  1. Test-Case Generation using an Explicit State Model Checker Final Report

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Gao, Jimin

    2003-01-01

    In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.

  2. Planform: an application and database of graph-encoded planarian regenerative experiments.

    PubMed

    Lobo, Daniel; Malone, Taylor J; Levin, Michael

    2013-04-15

    Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic models exist yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there is no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.

  3. The Creative Power of Formal Analogies in Physics: The Case of Albert Einstein

    NASA Astrophysics Data System (ADS)

    Gingras, Yves

    2015-07-01

    In order to show how formal analogies between different physical systems play an important conceptual work in physics, this paper analyzes the evolution of Einstein's thoughts on the structure of radiation from the point of view of the formal analogies he used as "lenses" to "see" through the "black box" of Planck's blackbody radiation law. A comparison is also made with his 1925 paper on the quantum gas where he used the same formal methods. Changes of formal points of view are most of the time taken for granted or passed over in silence in studies on the mathematization of physics as if they had no special significance. Revisiting Einstein's classic papers on the nature of light and matter from the angle of the various theoretical tools he used, namely entropy and energy fluctuation calculations, helps explain why he was in a unique position to make visible the particle structure of radiation and the dual (particle and wave) nature of light and matter. Finally, this case study calls attention to the more general question of the surprising creative power of formal analogies and their frequent use in theoretical physics. This aspect of intellectual creation can be useful in the teaching of physics.

  4. Multivariate meta-analysis for non-linear and other multi-parameter associations

    PubMed Central

    Gasparrini, A; Armstrong, B; Kenward, M G

    2012-01-01

    In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043
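
    The standard two-stage model underlying this approach (textbook form; notation may differ from the paper) treats each study's vector of estimated coefficients as normally distributed around a study-specific mean, itself drawn from a between-study distribution:

```latex
% Stage 1 (within study i): estimated coefficient vector, e.g. spline
% coefficients of an exposure-response curve, with within-study
% covariance S_i:
\hat{\boldsymbol{\theta}}_i \sim N(\boldsymbol{\theta}_i, \mathbf{S}_i)

% Stage 2 (between studies): true coefficients vary around a mean that may
% depend on study-level covariates X_i, with between-study covariance Psi:
\boldsymbol{\theta}_i \sim N(\mathbf{X}_i \boldsymbol{\beta}, \boldsymbol{\Psi})
```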

  5. Maternal psychosocial well-being in Eritrea: application of participatory methods and tools of investigation and analysis in complex emergency settings.

    PubMed Central

    Almedom, Astier M.; Tesfamichael, Berhe; Yacob, Abdu; Debretsion, Zaïd; Teklehaimanot, Kidane; Beyene, Teshome; Kuhn, Kira; Alemu, Zemui

    2003-01-01

    OBJECTIVE: To establish the context in which maternal psychosocial well-being is understood in war-affected settings in Eritrea. METHOD: Pretested and validated participatory methods and tools of investigation and analysis were employed to allow participants to engage in processes of qualitative data collection, on-site analysis, and interpretation. FINDINGS: Maternal psychosocial well-being in Eritrea is maintained primarily by traditional systems of social support that are mostly outside the domain of statutory primary care. Traditional birth attendants provide a vital link between the two. Formal training and regular supplies of sterile delivery kits appear to be worthwhile options for health policy and practice in the face of the post-conflict challenges of ruined infrastructure and an overstretched and/or ill-mannered workforce in the maternity health service. CONCLUSION: Methodological advances in health research and the dearth of data on maternal psychosocial well-being in complex emergency settings call for scholars and practitioners to collaborate in creative searches for sound evidence on which to base maternity, mental health and social care policy and practice. Participatory methods facilitate the meaningful engagement of key stakeholders and enhance data quality, reliability and usability. PMID:12856054

  6. Formalization of an environmental model using formal concept analysis - FCA

    NASA Astrophysics Data System (ADS)

    Bourdon-García, Rubén D.; Burgos-Salcedo, Javier D.

    2016-08-01

    Nowadays, there is a pressing need to generate novel strategies for analyzing social-ecological systems in order to resolve global sustainability problems. The main purpose of this paper is to apply formal concept analysis to formalize the theory of Augusto Ángel Maya, who was without a doubt one of the most important environmental philosophers in South America; Ángel Maya proposed and established that Ecosystem-Culture relations, instead of Human-Nature ones, are determinant in our understanding and management of natural resources. Based on this, a concept lattice, formal concepts, subconcept-superconcept relations, partially ordered sets, the supremum and infimum of the lattice, and implications between attributes (the Duquenne-Guigues base) were determined for the ecosystem-culture relations.
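
    The derivation operators underlying these notions, in standard FCA notation (textbook definitions, not specific to this paper), for a context (G, M, I) with objects G, attributes M, and incidence relation I:

```latex
% Derivation (prime) operators of a formal context (G, M, I):
A' = \{\, m \in M \mid \forall g \in A : (g, m) \in I \,\}, \qquad
B' = \{\, g \in G \mid \forall m \in B : (g, m) \in I \,\}

% A formal concept is a pair (A, B) with A' = B and B' = A. Ordered by
% extent inclusion, (A_1, B_1) \le (A_2, B_2) \iff A_1 \subseteq A_2,
% the concepts form the concept lattice of the context.
```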

  7. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-11-30

    We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata... embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous... real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.

  8. JTSA: an open source framework for time series abstractions.

    PubMed

    Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana

    2015-10-01

    The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing both from the data storage and the abstraction computation perspective. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under the GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking this formalized structure into account, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC, where JTSA was used to extract relevant patterns from data related to the long-term monitoring of diabetic patients. The proof that JTSA is a versatile tool that can be adapted to different needs is given by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large dataset. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
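
    A minimal state-abstraction sketch in the spirit of TA (my illustration in Python with hypothetical glycemia thresholds; JTSA itself is a Java framework with its own API): map a numeric series to labeled episodes and merge consecutive samples with the same label:

```python
# State temporal abstraction: threshold each sample, then merge adjacent
# samples with the same label into (start, end, label) episodes.

def label(v):                       # hypothetical glycemia thresholds
    return "LOW" if v < 70 else "HIGH" if v > 180 else "NORMAL"

def state_abstraction(times, values):
    episodes = []
    for t, v in zip(times, values):
        lab = label(v)
        if episodes and episodes[-1][2] == lab:
            episodes[-1][1] = t     # extend the current episode
        else:
            episodes.append([t, t, lab])
    return [(start, end, lab) for start, end, lab in episodes]

print(state_abstraction([0, 1, 2, 3, 4], [65, 68, 120, 190, 200]))
# -> [(0, 1, 'LOW'), (2, 2, 'NORMAL'), (3, 4, 'HIGH')]
```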

  9. Parallel satellite orbital situational problems solver for space missions design and control

    NASA Astrophysics Data System (ADS)

    Atanassov, Atanas Marinov

    2016-11-01

    Solving different scientific problems for space applications demands the implementation of observations, measurements, or active experiments during time intervals in which specific geometric and physical conditions are fulfilled. Solving the situational problems that determine the time intervals in which satellite instruments work optimally is a very important part of all activities at every stage of the preparation and realization of space missions. The elaboration of a universal, flexible, and robust approach to situation analysis, easily portable to new satellite missions, is significant for reducing mission preparation times and costs. Every situational problem can be based on one or more situational conditions. Simultaneously solving different kinds of situational problems, based on different numbers and types of situational conditions, each satisfied on different segments of the satellite orbit, requires irregular calculations. Three formal approaches are presented. The first is related to the description of situational problems, which allows flexibility in assembling situational problems and representing them in computer memory. The second concerns the development of a situational-problem solver organized as a processor that executes specific code for every particular situational condition. The third is related to solver parallelization utilizing threads and dynamic scheduling based on a "pool of threads" abstraction, which ensures a good load balance. The developed situational-problems solver is intended for incorporation into multi-physics, multi-satellite space mission design and simulation tools.
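
    A compact sketch of the "pool of threads" idea (illustrative Python with invented conditions, not the author's solver): situational conditions are checked over orbit time segments, and a shared pool dynamically balances the irregular work:

```python
# Pool-of-threads scheduling of situational-condition checks over
# orbit time segments (toy conditions standing in for orbital geometry).
from concurrent.futures import ThreadPoolExecutor
import math

def sunlit(t):          # hypothetical condition: satellite in sunlight
    return math.sin(t / 90.0) > 0

def ground_contact(t):  # hypothetical condition: station visibility
    return math.cos(t / 45.0) > 0.7

def check_segment(args):
    """Return sub-intervals of [t0, t1) where the condition holds."""
    cond, t0, t1 = args
    return [(t, t + 1) for t in range(t0, t1) if cond(t)]

tasks = [(cond, t, t + 60)               # 60-second segments
         for cond in (sunlit, ground_contact)
         for t in range(0, 600, 60)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(check_segment, tasks))
print(sum(len(r) for r in results), "condition-true sub-intervals found")
```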

  10. In search of tools to aid logical thinking and communicating about medical decision making.

    PubMed

    Hunink, M G

    2001-01-01

    To have real-time impact on medical decision making, decision analysts need a wide variety of tools to aid logical thinking and communication. Decision models provide a formal framework to integrate evidence and values, but they are commonly perceived as complex and difficult to understand by those unfamiliar with the methods, especially in the context of clinical decision making. The theory of constraints, introduced by Eliyahu Goldratt in the business world, provides a set of tools for logical thinking and communication that could potentially be useful in medical decision making. The author used the concept of a conflict resolution diagram to analyze the decision to perform carotid endarterectomy prior to coronary artery bypass grafting in a patient with both symptomatic coronary and asymptomatic carotid artery disease. The method enabled clinicians to visualize and analyze the issues, identify and discuss the underlying assumptions, search for the best available evidence, and use the evidence to make a well-founded decision. The method also facilitated communication among those involved in the care of the patient. Techniques from fields other than decision analysis can potentially expand the repertoire of tools available to support medical decision making and to facilitate communication in decision consults.

  11. An exploration of student midwives' language to describe non-formal learning in professional practice.

    PubMed

    Finnerty, Gina; Pope, Rosemary

    2005-05-01

    The essence of non-formal learning in midwifery practice has not been previously explored. This paper provides an in-depth analysis of the language of a sample of student midwives' descriptions of their practice learning in a range of clinical settings. The students submitted audio-diaries as part of a national study (Pope, R., Graham, L., Finnerty, G., Magnusson, C., 2003. An investigation of the preparation and assessment for midwifery practice within a range of settings. Project Report. University of Surrey). Participants detailed their learning activities and the support obtained whilst working with their named mentors for approximately 10 days or shifts. The rich audio-diary data have been analysed using Discourse Analysis. A typology of non-formal learning (Eraut, M., 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136) has been used to provide a framework for the analysis. Non-formal learning is defined as any learning which does not take place within a formally organised learning programme (Eraut, M., 2000. Non-formal learning and implicit knowledge in professional work. British Journal of Educational Psychology 70, 113-136). Findings indicate that fear and ambiguity hindered students' learning. Recommendations include the protection of time by mentors within the clinical curriculum to guide and supervise students in both formal and non-formal elements of midwifery practice. This paper will explore the implications of the findings for practice-based education.

  12. Advanced Weather Awareness and Reporting Enhancements

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M. (Technical Monitor); Ruokangas, Corinne Clinton; Kelly, Wallace E., III

    2005-01-01

    AWARE (Aviation Weather Awareness and Reporting Enhancements) was a NASA Cooperative Research and Development program conducted jointly by Rockwell Scientific, Rockwell Collins, and NASA. The effort culminated in an enhanced weather briefing and reporting tool prototype designed to integrate graphical and text-based aviation weather data to provide clear situational awareness in the context of a specific pilot, flight and equipment profile. The initial implementation of AWARE was as a web-based preflight planning tool, specifically for general aviation pilots, who do not have access to support such as the dispatchers available for commercial airlines. Initial usability tests showed that for VFR (Visual Flight Rules) pilots, AWARE provided faster and more effective weather evaluation. In a subsequent formal usability test for IFR (Instrument Flight Rules) pilots, all users finished the AWARE tests faster than the parallel DUAT tests, and all subjects graded AWARE higher for effectiveness, efficiency, and usability. The decision analysis basis of AWARE differentiates it from other aviation safety programs, providing analysis of context-sensitive data in a personalized graphical format to aid pilots/dispatchers in their complex flight requirements.

  13. A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function

    PubMed Central

    Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and for mutual authentication purposes a login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using a smart card. Our scheme is efficient because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently supports the password change phase, which is always performed correctly and locally without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overheads, security, and the features it provides. PMID:24892078
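
    A generic sketch of the hash-and-XOR style of computation such schemes rely on (invented values and steps; this is not the paper's exact protocol): the card stores a password-masked secret, and the login proof binds it to a fresh nonce:

```python
# Toy masked-verifier flow using only a one-way hash and bitwise XOR.
import hashlib, os, secrets

def H(*parts: bytes) -> bytes:          # one-way hash function
    return hashlib.sha256(b"|".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:   # XOR of equal-length strings
    return bytes(x ^ y for x, y in zip(a, b))

# Registration: the card stores the password-masked secret, not the secret.
server_secret = os.urandom(32)
user_id, password = b"alice", b"correct horse"
card_value = xor(H(user_id, server_secret), H(user_id, password))

# Login: the card unmasks with the entered password and proves freshness.
entered = b"correct horse"
unmasked = xor(card_value, H(user_id, entered))   # = H(user_id, secret)
nonce = secrets.token_bytes(16)
login_proof = H(unmasked, nonce)

# Server recomputes the same proof from its own secret and the nonce.
assert login_proof == H(H(user_id, server_secret), nonce)
print("login accepted")
```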

  14. A robust and effective smart-card-based remote user authentication mechanism using hash function.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and for mutual authentication purposes a login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using a smart card. Our scheme is efficient because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently supports the password change phase, which is always performed correctly and locally without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overheads, security, and the features it provides.

  15. Contact Tools in Japanese Acupuncture: An Ethnography of Acupuncture Practitioners in Japan.

    PubMed

    Chant, Benjamin Cw; Madison, Jeanne; Coop, Paul; Dieberg, Gudrun

    2017-10-01

    This study aimed to identify procedural elements of Japanese acupuncture, describe these elements in detail, and explain them in terms of the key thematic category of treatment principles. Between August 2012 and December 2016, ethnographic fieldwork was conducted in Japan. In total, 38 participants were recruited by chain referral and emergent sampling. Data was collected through participant observation, interviews, and by analyzing documents. A total of 22 participants agreed to clinical observation; 221 treatments were observed with 172 patients. Seventeen consented to formal interviews and 28 to informal interviews. Thematic analysis was used to critically evaluate data. One especially interesting theme was interpreted from the data: a variety of contact tools were applied in treatment and these were manipulated by adjusting elements of form, speed, repetition, and pressure. Tapping, holding, pressing/pushing, and stroking were the most important ways contact tools were used on patients. Contact tools are noninvasive, painless, can be applied in almost any environment, and may be easily accepted by patients worldwide. Contact tool theory and practice may be successfully integrated into acupuncture curricula outside of Japan, used to inform clinical trials, and contribute to an expanded repertoire of methods for practitioners to benefit individual patients in international contexts. Copyright © 2017. Published by Elsevier B.V.

  16. A Formal Basis for Safety Case Patterns

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh

    2013-01-01

    By capturing common structures of successful arguments, safety case patterns provide an approach for reusing strategies for reasoning about safety. In the current state of the practice, patterns exist as descriptive specifications with informal semantics, which not only offer little opportunity for more sophisticated usage such as automated instantiation, composition, and manipulation, but also impede standardization efforts and tool interoperability. To address these concerns, this paper gives (i) a formal definition for safety case patterns, clarifying both restrictions on the usage of multiplicity and well-founded recursion in structural abstraction, (ii) formal semantics for patterns, and (iii) a generic data model and algorithm for pattern instantiation. We illustrate our contributions by application to a new pattern, the requirements breakdown pattern, which builds upon our previous work.
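
    A toy of what pattern instantiation can look like as data plus algorithm (my illustration, not the paper's formal data model): a pattern node with a multiplicity placeholder expands into one concrete claim per entry of an instantiation table:

```python
# Hypothetical pattern node expanded against an instantiation table.
from dataclasses import dataclass

@dataclass
class PatternNode:
    text: str            # contains a {placeholder} to be instantiated
    multiplicity: str    # name of the data column driving the expansion

def instantiate(node: PatternNode, data: dict[str, list[str]]) -> list[str]:
    """Produce one concrete claim per datum in the multiplicity column."""
    return [node.text.format(**{node.multiplicity: v})
            for v in data[node.multiplicity]]

pattern = PatternNode("Requirement {req} is satisfied", "req")
table = {"req": ["REQ-1", "REQ-2", "REQ-3"]}
print(instantiate(pattern, table))
```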

  17. The interventional radiology business plan.

    PubMed

    Beheshti, Michael V; Meek, Mary E; Kaufman, John A

    2012-09-01

    Strategic planning and business planning are processes commonly employed by organizations that exist in competitive environments. Although it is difficult to prove a causal relationship between formal strategic/business planning and positive organizational performance, there is broad agreement that formal strategic and business plans are components of successful organizations. The various elements of strategic plans and business plans are not common in the vernacular of practicing physicians. As health care becomes more competitive, familiarity with these tools may grow in importance. Herein we provide an overview of formal strategic and business planning, and offer a roadmap for an interventional radiology-specific plan that may be useful for organizations confronting competitive and financial threats. Copyright © 2012 SIR. Published by Elsevier Inc. All rights reserved.

  18. Exploring the focus and experiences of smartphone applications for addiction recovery.

    PubMed

    Savic, Michael; Best, David; Rodda, Simone; Lubman, Dan I

    2013-01-01

    Addiction recovery Smartphone applications (apps) (n = 87) identified on the Google Play store in 2012 were coded, along with app user reviews, to explore functions, foci, and user experiences. Content analysis revealed that apps typically provided information on recovery, as well as content to enhance motivation, promote social support and tools to monitor progress. App users commented that the apps helped to inform them, keep them focussed, inspire them, and connect them with other people and groups. Because few addiction recovery apps appear to have been formally evaluated, further research is needed to ascertain their effectiveness as stand-alone or adjunctive interventions.

  19. Causal premise semantics.

    PubMed

    Kaufmann, Stefan

    2013-08-01

    The rise of causality and the attendant graph-theoretic modeling tools in the study of counterfactual reasoning has had resounding effects in many areas of cognitive science, but it has thus far not permeated the mainstream in linguistic theory to a comparable degree. In this study I show that a version of the predominant framework for the formal semantic analysis of conditionals, Kratzer-style premise semantics, allows for a straightforward implementation of the crucial ideas and insights of Pearl-style causal networks. I spell out the details of such an implementation, focusing especially on the notions of intervention on a network and backtracking interpretations of counterfactuals. Copyright © 2013 Cognitive Science Society, Inc.
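
    A minimal sketch of the Pearl-style intervention the paper connects to premise semantics (toy variables, not the paper's formalism): intervening on a variable severs its own causal mechanism while upstream facts are kept, i.e., no backtracking:

```python
# Tiny causal network as structural functions; an intervention fixes a
# variable's value and skips its own mechanism.
def run(network, state=None):
    state = dict(state or {})
    for var, fn in network.items():     # assumes topological order
        if var not in state:
            state[var] = fn(state)
    return state

network = {
    "rain":      lambda s: True,
    "sprinkler": lambda s: not s["rain"],      # sprinkler off when raining
    "wet_grass": lambda s: s["rain"] or s["sprinkler"],
}

print(run(network))                            # factual: rain, wet grass
# Intervention do(sprinkler := True): sever the sprinkler's causal inputs,
# but keep upstream facts (no backtracking on 'rain').
print(run(network, {"sprinkler": True}))
```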

  20. Positronics of subnanometer atomistic imperfections in solids as a high-informative structure characterization tool.

    PubMed

    Shpotyuk, Oleh; Filipecki, Jacek; Ingram, Adam; Golovchak, Roman; Vakiv, Mykola; Klym, Halyna; Balitska, Valentyna; Shpotyuk, Mykhaylo; Kozdras, Andrzej

    2015-01-01

    Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to characterize different types of nanomaterials within a three-term fitting procedure are critically reconsidered. In contrast to conventional three-term analysis based on admixed positron- and positronium-trapping modes, the process of nanostructurization is considered as substitutional positron-positronium trapping within the same host matrix. The developed formalism allows estimating the interfacial void volumes responsible for positron trapping and the characteristic bulk positron lifetimes in nanoparticle-affected inhomogeneous media. This algorithm was validated using the example of thermally induced nanostructurization occurring in 80GeSe2-20Ga2Se3 glass.
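
    For reference, the three-term fit referred to here is the standard decomposition of the measured lifetime spectrum (generic form, before convolution with the instrument resolution function; not this paper's reinterpreted trapping model):

```latex
% Standard multi-component PAL spectrum: three lifetime components tau_i
% with relative intensities I_i:
N(t) = \sum_{i=1}^{3} \frac{I_i}{\tau_i}\, e^{-t/\tau_i},
\qquad \sum_{i=1}^{3} I_i = 1
```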

  1. Positronics of subnanometer atomistic imperfections in solids as a high-informative structure characterization tool

    NASA Astrophysics Data System (ADS)

    Shpotyuk, Oleh; Filipecki, Jacek; Ingram, Adam; Golovchak, Roman; Vakiv, Mykola; Klym, Halyna; Balitska, Valentyna; Shpotyuk, Mykhaylo; Kozdras, Andrzej

    2015-02-01

    Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to characterize different types of nanomaterials within a three-term fitting procedure are critically reconsidered. In contrast to conventional three-term analysis based on admixed positron- and positronium-trapping modes, the process of nanostructurization is considered as substitutional positron-positronium trapping within the same host matrix. The developed formalism allows estimating the interfacial void volumes responsible for positron trapping and the characteristic bulk positron lifetimes in nanoparticle-affected inhomogeneous media. This algorithm was validated using the example of thermally induced nanostructurization occurring in 80GeSe2-20Ga2Se3 glass.

  2. TRL - A FORMAL TEST REPRESENTATION LANGUAGE AND TOOL FOR FUNCTIONAL TEST DESIGNS

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1994-01-01

    A Formal Test Representation Language and Tool for Functional Test Designs (TRL) is an automatic tool and formal language used to implement the Category-Partition Method and produce the specification of test cases in the testing phase of software development. The Category-Partition Method is particularly useful in defining the inputs, outputs, and purpose of the test design phase, and it combines the benefits of choosing normal cases with error-exposing properties. Traceability can be maintained quite easily by creating a test design for each objective in the test plan. The effort to transform the test cases into procedures is simplified by using an automatic tool to create the cases based on the test design. The method allows the rapid elimination of undesired test cases from consideration, and easy review of test designs by peer groups.

    The first step in the Category-Partition Method is functional decomposition, in which the specification and/or requirements are decomposed into functional units that can be tested independently; a secondary purpose of this step is to identify the parameters that affect the behavior of the system for each functional unit. The second step, category analysis, carries this work further by determining the properties or sub-properties of the parameters that would make the system behave in different ways. The designer should analyze the requirements to determine the features or categories of each parameter and how the system may behave if the category were to vary its value. If the parameter undergoing refinement is a data item, its categories may be any of its attributes, such as type, size, value, units, frequency of change, or source. After all the categories for the parameters of the functional unit have been determined, the next step is to partition each category's range space into mutually exclusive values that the category can assume. In choosing partition values, all possible kinds of values should be included, especially the ones that will maximize error detection. The purpose of the final step, partition constraint analysis, is to refine the test design specification so that only the technically effective and economically feasible test cases are implied.

    TRL is written in C to be machine independent. It has been successfully implemented on an IBM PC compatible running MS-DOS, a Sun4 series computer running SunOS, an HP 9000/700 series workstation running HP-UX, a DECstation running DEC RISC ULTRIX, and a DEC VAX series computer running VMS. TRL requires 1Mb of disk space and a minimum of 84K of RAM. The documentation is available in electronic form in WordPerfect format. The standard distribution medium for TRL is a 5.25 inch 360K MS-DOS format diskette. Alternate distribution media and formats are available upon request. TRL was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
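
    A minimal category-partition sketch (illustrative Python, not TRL's language or syntax): categories with partitioned choices are combined into candidate test frames, and constraint analysis then prunes infeasible combinations:

```python
# Category-partition test frame generation with constraint filtering.
from itertools import product

categories = {                      # hypothetical 'file copy' function
    "source":      ["exists", "missing"],
    "size":        ["empty", "small", "huge"],
    "destination": ["writable", "read_only"],
}

def feasible(frame):
    """Constraint analysis: a missing source has no meaningful size."""
    if frame["source"] == "missing" and frame["size"] != "empty":
        return False
    return True

names = list(categories)
frames = [dict(zip(names, combo)) for combo in product(*categories.values())]
tests = [f for f in frames if feasible(f)]
print(len(frames), "raw frames ->", len(tests), "feasible test cases")
```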

  3. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
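    Since all of the handbook's assessment techniques rest on encoding subjective probability estimates from area experts, a small Monte Carlo roll-up illustrates the idea; the work elements, the triangular encoding, and the cost figures are assumptions for illustration, not taken from the handbook.

        # Monte Carlo roll-up of elicited cost risks (illustrative values, $M).
        import random

        # Expert-elicited (low, most likely, high) cost estimates.
        elements = {
            "structure": (10, 12, 18),
            "avionics": (20, 25, 40),
            "software": (15, 22, 30),
        }

        def total_cost():
            # random.triangular takes (low, high, mode).
            return sum(random.triangular(lo, hi, mode)
                       for lo, mode, hi in elements.values())

        samples = sorted(total_cost() for _ in range(100_000))
        print("median total:", round(samples[len(samples) // 2], 1))
        print("80th percentile:", round(samples[int(0.8 * len(samples))], 1))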

  4. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
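    As a rough illustration of the error model's ingredients, the sketch below evaluates a log-likelihood over lag-1 autocorrelated, heteroscedastic residuals; a Gaussian density stands in for the paper's SEP distribution, and the linear heteroscedasticity model is an assumption for illustration.

        # Likelihood sketch with AR(1), heteroscedastic errors (Gaussian
        # stand-in for the Skew Exponential Power density used in the paper).
        import numpy as np

        def log_likelihood(obs, sim, phi, sigma0, sigma1):
            resid = obs - sim
            # Whiten the lag-1 autocorrelation: eta_t = e_t - phi * e_{t-1}.
            eta = resid[1:] - phi * resid[:-1]
            # Assumed heteroscedasticity: error sd grows with simulated level.
            sigma = sigma0 + sigma1 * np.abs(sim[1:])
            return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                          - 0.5 * (eta / sigma) ** 2)

        obs = np.array([1.0, 1.4, 1.1, 0.9])
        sim = np.array([0.9, 1.3, 1.2, 1.0])
        print(log_likelihood(obs, sim, phi=0.5, sigma0=0.1, sigma1=0.2))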

  5. Guideline validation in multiple trauma care through business process modeling.

    PubMed

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery, a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow, and process optimization. A formal and logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected across seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step, the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools that check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization, and workflow management.

  6. Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems

    NASA Technical Reports Server (NTRS)

    Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)

    2000-01-01

    We have recently completed a pilot study of the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown, and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault-coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. NASA therefore commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from a wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
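    The flavor of that model extraction can be conveyed with a toy connectivity check; the record format and pin names below are hypothetical, and TEAMS multi-signal models carry far more than bare connectivity.

        # Toy extraction of a connectivity model from wiring records.
        from collections import defaultdict

        # (wire_id, end_a, end_b) rows as a wiring database might store them.
        segments = [("W1", "P1-A", "J2-3"),
                    ("W2", "J2-3", "J5-1"),
                    ("W3", "J5-1", "LRU7-IN")]

        graph = defaultdict(set)
        for _, a, b in segments:
            graph[a].add(b)
            graph[b].add(a)

        def continuity(start, goal):
            # Depth-first search: is there an unbroken path between two pins?
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                if node == goal:
                    return True
                if node not in seen:
                    seen.add(node)
                    stack.extend(graph[node] - seen)
            return False

        print(continuity("P1-A", "LRU7-IN"))  # True: the circuit is complete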

  7. 49 CFR 236.923 - Task analysis and basic requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...

  8. Refining prognosis in lung cancer: A report on the quality and relevance of clinical prognostic tools

    PubMed Central

    Mahar, Alyson L.; Compton, Carolyn; McShane, Lisa M.; Halabi, Susan; Asamura, Hisao; Rami-Porta, Ramon; Groome, Patti A.

    2015-01-01

    Introduction Accurate, individualized prognostication for lung cancer patients requires the integration of standard patient and pathologic factors, biologic, genetic, and other molecular characteristics of the tumor. Clinical prognostic tools aim to aggregate information on an individual patient to predict disease outcomes such as overall survival, but little is known about their clinical utility and accuracy in lung cancer. Methods A systematic search of the scientific literature for clinical prognostic tools in lung cancer published Jan 1, 1996-Jan 27, 2015 was performed. In addition, web-based resources were searched. A priori criteria determined by the Molecular Modellers Working Group of the American Joint Committee on Cancer were used to investigate the quality and usefulness of tools. Criteria included clinical presentation, model development approaches, validation strategies, and performance metrics. Results Thirty-two prognostic tools were identified. Patients with metastases were the most frequently considered population in non-small cell lung cancer. All tools for small cell lung cancer covered that entire patient population. Included prognostic factors varied considerably across tools. Internal validity was not formally evaluated for most tools and only eleven were evaluated for external validity. Two key considerations were highlighted for tool development: identification of an explicit purpose related to a relevant clinical population and clear decision-points, and prioritized inclusion of established prognostic factors over emerging factors. Conclusions Prognostic tools will contribute more meaningfully to the practice of personalized medicine if better study design and analysis approaches are used in their development and validation. PMID:26313682

  9. Understanding diagnosis and management of dementia and guideline implementation in general practice: a qualitative study using the theoretical domains framework.

    PubMed

    Murphy, Kerry; O'Connor, Denise A; Browning, Colette J; French, Simon D; Michie, Susan; Francis, Jill J; Russell, Grant M; Workman, Barbara; Flicker, Leon; Eccles, Martin P; Green, Sally E

    2014-03-03

    Dementia is a growing problem, causing substantial burden for patients, their families, and society. General practitioners (GPs) play an important role in diagnosing and managing dementia; however, there are gaps between recommended and current practice. The aim of this study was to explore GPs' reported practice in diagnosing and managing dementia and to describe, in theoretical terms, the proposed explanations for practice that was and was not consistent with evidence-based guidelines. Semi-structured interviews were conducted with GPs in Victoria, Australia. The Theoretical Domains Framework (TDF) guided data collection and analysis. Interviews explored the factors hindering and enabling achievement of 13 recommended behaviours. Data were analysed using content and thematic analysis. This paper presents an in-depth description of the factors influencing two behaviours, assessing co-morbid depression using a validated tool, and conducting a formal cognitive assessment using a validated scale. A total of 30 GPs were interviewed. Most GPs reported that they did not assess for co-morbid depression using a validated tool as per recommended guidance. Barriers included the belief that depression can be adequately assessed using general clinical indicators and that validated tools provide little additional information (theoretical domain of 'Beliefs about consequences'); discomfort in using validated tools ('Emotion'), possibly due to limited training and confidence ('Skills'; 'Beliefs about capabilities'); limited awareness of the need for, and forgetting to conduct, a depression assessment ('Knowledge'; 'Memory, attention and decision processes'). Most reported practising in a manner consistent with the recommendation that a formal cognitive assessment using a validated scale be undertaken. Key factors enabling this were having an awareness of the need to conduct a cognitive assessment ('Knowledge'); possessing the necessary skills and confidence ('Skills'; 'Beliefs about capabilities'); and having adequate time and resources ('Environmental context and resources'). This is the first study to our knowledge to use a theoretical approach to investigate the barriers and enablers to guideline-recommended diagnosis and management of dementia in general practice. It has identified key factors likely to explain GPs' uptake of the guidelines. The results have informed the design of an intervention aimed at supporting practice change in line with dementia guidelines, which is currently being evaluated in a cluster randomised trial.

  10. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for the validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the fact that FLCs cannot be verified to meet such requirements limits the applications of this technology. Therefore, alternative methods for the verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. Main research challenges include the specification of requirements for a complex system, the conversion of a traditional FLC to a piecewise polynomial representation, and the use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the FLC was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.
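    The negative-feedback result can be pictured with an SMT query: assert the negation of the property and ask the solver for a counterexample. The toy piecewise-polynomial controller below is an assumption for illustration (it is not the paper's FLC), using the Z3 solver's Python API.

        # SMT check that a toy piecewise-polynomial controller always
        # produces negative feedback for positive error (Z3 solver).
        from z3 import Real, Solver, And, Or, sat

        e, u = Real("e"), Real("u")
        s = Solver()
        s.add(e > 0)
        # Toy controller: u = -2e for e <= 1, u = -(e^2) - 1 for e > 1.
        s.add(Or(And(e <= 1, u == -2 * e),
                 And(e > 1, u == -e * e - 1)))
        s.add(u >= 0)  # negate the property and search for a violation
        print("counterexample" if s.check() == sat else "always negative feedback")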

  11. Collective three-flavor oscillations of supernova neutrinos

    NASA Astrophysics Data System (ADS)

    Dasgupta, Basudeb; Dighe, Amol

    2008-06-01

    Neutrinos and antineutrinos emitted from a core collapse supernova interact among themselves, giving rise to collective flavor conversion effects that are significant near the neutrinosphere. We develop a formalism to analyze these collective effects in the complete three-flavor framework. It naturally generalizes the spin-precession analogy to three flavors and is capable of analytically describing phenomena like vacuum/Mikheyev-Smirnov-Wolfenstein (MSW) oscillations, synchronized oscillations, bipolar oscillations, and spectral split. Using the formalism, we demonstrate that the flavor conversions may be “factorized” into two-flavor oscillations with hierarchical frequencies. We explicitly show how the three-flavor solution may be constructed by combining two-flavor solutions. For a typical supernova density profile, we identify an approximate separation of regions where distinctly different flavor conversion mechanisms operate, and demonstrate the interplay between collective and MSW effects. We pictorialize our results in terms of the “e3-e8 triangle” diagram, which is a tool that can be used to visualize three-neutrino flavor conversions in general, and offers insights into the analysis of the collective effects in particular.
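    For orientation, the spin-precession analogy mentioned above is conventionally written in the two-flavor Bloch-vector form below (standard in the collective-oscillation literature, not quoted from this paper); the paper's formalism promotes the polarization vector to an eight-component SU(3) object.

        % Two-flavor precession form that the three-flavor formalism generalizes
        \begin{equation}
          \partial_t \vec{P} =
            \left( \omega \vec{B} + \lambda \vec{L} + \mu \vec{D} \right) \times \vec{P}
        \end{equation}
        % \omega = \Delta m^2 / 2E : vacuum oscillation frequency
        % \lambda = \sqrt{2} G_F n_e : matter (MSW) term
        % \mu = \sqrt{2} G_F n_\nu : neutrino-neutrino coupling strength
        % \vec{D} = \vec{P} - \bar{\vec{P}} : neutrino minus antineutrino vector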

  12. Genetic design automation: engineering fantasy or scientific renewal?

    PubMed

    Lux, Matthew W; Bramlett, Brian W; Ball, David A; Peccoud, Jean

    2012-02-01

    The aim of synthetic biology is to make genetic systems more amenable to engineering, which has naturally led to the development of computer-aided design (CAD) tools. Experimentalists still primarily rely on project-specific ad hoc workflows instead of domain-specific tools, which suggests that CAD tools are lagging behind the front line of the field. Here, we discuss the scientific hurdles that have limited the productivity gains anticipated from existing tools. We argue that the real value of efforts to develop CAD tools is the formalization of genetic design rules that determine the complex relationships between genotype and phenotype. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Small scale sequence automation pays big dividends

    NASA Technical Reports Server (NTRS)

    Nelson, Bill

    1994-01-01

    Galileo sequence design and integration are supported by a suite of formal software tools. Sequence review, however, is largely a manual process with reviewers scanning hundreds of pages of cryptic computer printouts to verify sequence correctness. Beginning in 1990, a series of small, PC based sequence review tools evolved. Each tool performs a specific task but all have a common 'look and feel'. The narrow focus of each tool means simpler operation, and easier creation, testing, and maintenance. Benefits from these tools are (1) decreased review time by factors of 5 to 20 or more with a concomitant reduction in staffing, (2) increased review accuracy, and (3) excellent returns on time invested.

  14. A Secure and Robust User Authenticated Key Agreement Scheme for Hierarchical Multi-medical Server Environment in TMIS.

    PubMed

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2015-09-01

    The telecare medicine information system (TMIS) helps patients to obtain health monitoring at home and to access medical services over the Internet or mobile networks. Recently, Amin and Biswas presented a smart-card-based user authentication and key agreement security protocol for TMIS using a cryptographic one-way hash function and a biohashing function, and claimed that their scheme is secure against all possible attacks. Though their scheme is efficient due to its use of a one-way hash function, we show that it has several security pitfalls and design flaws: (1) it fails to protect against privileged-insider attacks, (2) it fails to protect against strong replay attacks, (3) it fails to protect against strong man-in-the-middle attacks, (4) it has a design flaw in the user registration phase, (5) it has a design flaw in the login phase, (6) it has a design flaw in the password change phase, (7) it lacks support for a biometric update phase, and (8) it has flaws in its formal security analysis. In order to withstand these security pitfalls and design flaws, we propose a secure and robust user authenticated key agreement scheme for the hierarchical multi-server environment suitable for TMIS, using a cryptographic one-way hash function and a fuzzy extractor. Through rigorous security analysis, including formal security analysis using the widely accepted Burrows-Abadi-Needham (BAN) logic, formal security analysis under the random oracle model, and informal security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme using the widely accepted Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. The simulation results show that our scheme is secure. Our scheme is also more efficient in computation and communication than Amin-Biswas's scheme and other related schemes, and it supports extra functionality features compared to other related schemes. As a result, our scheme is very appropriate for practical applications in TMIS.
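    To make the ingredients concrete, here is a generic sketch of the kind of hash-based, replay-resistant login message such schemes construct; the field names and derivation are illustrative assumptions, not the Amin-Biswas protocol or the proposed scheme.

        # Generic smart-card-style login request built from a one-way hash.
        import hashlib, os, time

        def h(*parts):
            # One-way hash over concatenated fields.
            return hashlib.sha256(b"|".join(parts)).digest()

        identity = b"patient42"
        # Card-stored long-term secret established at registration (assumed).
        secret = h(identity, b"server-master-key")

        def login_request():
            nonce = os.urandom(16)               # fresh randomness defeats replay
            ts = str(int(time.time())).encode()  # timestamp bounds message lifetime
            proof = h(secret, nonce, ts)         # proves knowledge of the secret
            return identity, nonce, ts, proof

        print(login_request())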

  15. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  16. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  17. Exploring the Utility of Microblogging as a Tool for Formal Content-Based Learning in the Community College History Classroom

    ERIC Educational Resources Information Center

    Freels, Jeffrey W.

    2015-01-01

    The emergence of social media technologies (SMT) as important features of life in the twenty-first century has aroused the curiosity of teachers and scholars in higher education and given rise to numerous experiments using SMT as tools of instruction in college and university classrooms. A body of research has emerged from those experiments which…

  18. DARPA Initiative in Concurrent Engineering (DICE). Phase 2

    DTIC Science & Technology

    1990-07-31

    XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); Data... shared global object space and object persistence. Technical Results, Module Development, XS Integration Environment: a prototype of the wrapper concepts... for a spreadsheet integration environment, using an X-Windows-based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for...

  19. Animals, Emperors, Senses: Exploring a Story-Based Learning Design in a Museum Setting

    ERIC Educational Resources Information Center

    Murmann, Mai; Avraamidou, Lucy

    2014-01-01

    The aim of this qualitative case study was to explore the use of stories as tools for learning within formal and informal learning environments. The design was based on three areas of interest: (a) the story as a tool for learning; (b) the student as subjects engaging with the story; and (c) the context in which the story learning activity takes…

  20. ENVIRONMENTAL SYSTEMS MANAGEMENT / POLLUTION PREVENTION RESEARCH

    EPA Science Inventory

    Goal 8.4 Improve Environmental Systems Management (Formerly Pollution Prevention and New Technology) Background The U.S. Environmental Protection Agency (EPA) has developed and evaluated tools and technologies to monitor, prevent, control, and clean up pollution through...

  1. Environmental impact assessment of rail infrastructure.

    DOT National Transportation Integrated Search

    2016-01-29

    This project resulted in three products: a comprehensive "Sustainable Rail Checklist," a rail planning GIS database, and a web GIS tool that integrates sustainability metrics and facilitates a rapid assessment before a formal NEPA process is implemen...

  2. EAP: An Important Supervisory Tool.

    ERIC Educational Resources Information Center

    Wright, Jim

    1984-01-01

    Discusses elements of the Employee Assistance Program: why employees need it, their acceptance of the program, when to refer an employee to the program, counseling, formal referral, plan of action, and how the program helps the supervisor. (CT)

  3. A secure user anonymity-preserving three-factor remote user authentication scheme for the telecare medicine information systems.

    PubMed

    Das, Ashok Kumar

    2015-03-01

    Recent advances in technology enable the telecare medicine information system (TMIS), through which patients can obtain health monitoring at home and access medical services over the Internet or mobile networks. Several remote user authentication schemes have been proposed in the literature for TMIS. However, most of them are either insecure against various known attacks or inefficient. Recently, Tan proposed an efficient user anonymity-preserving three-factor authentication scheme for TMIS. In this paper, we show that though Tan's scheme is efficient, it has several security drawbacks: (1) it fails to provide proper authentication during the login phase, (2) it fails to provide correct updating of a user's password and biometrics during the password and biometric update phase, and (3) it fails to protect against replay attacks. In addition, Tan's scheme lacks formal security analysis and verification. Later, Arshad and Nikooghadam also pointed out some security flaws in Tan's scheme and presented an improvement on it. However, we show that Arshad and Nikooghadam's scheme is still insecure against the privileged-insider attack mounted via the stolen smart-card attack, and it also lacks formal security analysis and verification. In order to withstand the security loopholes found in both schemes, we propose an effective and more secure three-factor remote user authentication scheme for TMIS. Our scheme provides the user anonymity property. Through rigorous informal and formal security analysis using the random oracle model and the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool, we show that our scheme is secure against various known attacks, including replay and man-in-the-middle attacks. Furthermore, our scheme is also efficient compared to other related schemes.

  4. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  5. Steinberg ``AUDIOMAPS'' Music Appreciation-Via-Understanding: Special-Relativity + Expectations ``Quantum-Theory'': a Quantum-ACOUSTO/MUSICO-Dynamics (QA/MD)

    NASA Astrophysics Data System (ADS)

    Fender, Lee; Steinberg, Russell; Siegel, Edward Carl-Ludwig

    2011-03-01

    Steinberg's wildly popular "AUDIOMAPS" music enjoyment/appreciation-via-understanding methodology (versus art), in which music-dynamics evolves, telling a story in (3+1) dimensions (trails, frames, timbres, plus dynamics amplitude vs. music-score time-series, a formal-inverse power-spectrum), surprisingly closely parallels (3+1)-dimensional Einstein (1905) special relativity "+" (with its enjoyment-expectations) a manifestation of quantum-theory expectation-values, together a music quantum-ACOUSTO/MUSICO-dynamics (QA/MD). Analysis via Derrida deconstruction enabled Siegel-Baez "Category-Semantics" "FUZZYICS" = "CATEGORYICS" ("TRIZ") Aristotle SoO DEduction, irrespective of the Boon-Klimontovich vs. Voss-Clark [PRL (77)] music power-spectrum analysis sampling-time/duration controversy (part versus whole), and shows that QA/MD reigns supreme as THE music appreciation-via-analysis tool for the listener in musicology!!! The connection to the Deutsch-Hartmann-Levitin [This Is Your Brain on Music (06)] brain/mind-barrier brain/mind-music connection is subtle/compelling/immediate!!!

  6. Strategic analysis of a water rights conflict in the south western United States.

    PubMed

    Philpot, Simone; Hipel, Keith; Johnson, Peter

    2016-09-15

    A strategic analysis of the ongoing conflict between Nevada and Utah over groundwater allocation at Snake Valley is carried out in order to investigate how this dispute might be resolved. More specifically, the Graph Model for Conflict Resolution is employed to formally model and analyze this conflict using the decision support system GMCR+. The conflict analysis findings indicate that the dispute endures because no party has the incentive or opportunity to move beyond the present circumstances. Continued negotiations are not likely to resolve this conflict. A substantial change in the preferences or options of the disputants, or new governance tools, will be required to move this conflict forward. This may hold lessons for future groundwater conflicts. It is, however, increasingly likely that the parties will require third-party intervention, such as equal apportionment by the US Supreme Court. Copyright © 2016 Elsevier Ltd. All rights reserved.
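    The stability notion driving that finding can be shown in miniature: in the Graph Model for Conflict Resolution, a state is Nash stable for a decision maker when no unilateral move reaches a state it strictly prefers. The two-state model below is a deliberately tiny, assumption-laden caricature, not the study's GMCR+ model.

        # Toy Nash-stability check in the Graph Model for Conflict Resolution.
        # States: 0 = status quo, 1 = negotiated reallocation (hypothetical).
        moves = {"Nevada": {0: [1]}, "Utah": {0: [1]}}   # unilateral moves
        prefs = {"Nevada": [0, 1], "Utah": [0, 1]}       # most preferred first

        def nash_stable(dm, state):
            rank = prefs[dm].index                       # lower rank = better
            return all(rank(s) >= rank(state) for s in moves[dm].get(state, []))

        for s in (0, 1):
            print(s, {dm: nash_stable(dm, s) for dm in prefs})
        # With these preferences both states are stable for both sides:
        # neither party has a unilateral incentive to move, the kind of
        # equilibrium that makes a dispute endure.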

  7. Methods and Practices of Investigators for Determining Participants’ Decisional Capacity and Comprehension of Protocols

    PubMed Central

    Kon, Alexander A.; Klug, Michael

    2010-01-01

    Ethicists recommend that investigators assess subjects' comprehension prior to accepting their consent as valid. Because children represent an at-risk population, ensuring adequate comprehension in pediatric research is vital. We surveyed all corresponding authors of research articles published over a six-month period in five leading adult and pediatric journals. Our goal was to assess how often subjects' comprehension or decisional capacity was assessed in the consent process, whether there was any difference between adult and pediatric research projects, and the rate at which investigators use formal or validated tools to assess capacity. Responses from 102 authors were analyzed (response rate 56%). Approximately two-thirds of respondents stated that they assessed comprehension or decisional capacity prior to accepting consent, and we found no difference between adult and pediatric researchers. Nine investigators used a formal questionnaire, and three used a validated tool. These findings suggest that fewer investigators than expected assess comprehension and decisional capacity, and that the use of standardized and validated tools is the exception rather than the rule. PMID:19385838

  8. Training Traditional Birth Attendants on the Use of Misoprostol and a Blood Measurement Tool to Prevent Postpartum Haemorrhage: Lessons Learnt from Bangladesh

    PubMed Central

    Passano, Paige; Bohl, Daniel D.; Islam, Arshadul; Prata, Ndola

    2014-01-01

    A consensus emerged in the late 1990s among leaders in global maternal health that traditional birth attendants (TBAs) should no longer be trained in delivery skills and should instead be trained as promoters of facility-based care. Many TBAs continue to be trained in places where home deliveries are the norm, and the potential impacts of this training are important to understand. The primary objective of this study was to gain a more nuanced understanding of the full impact of training TBAs to use misoprostol and a blood measurement tool (mat) for the prevention of postpartum haemorrhage (PPH) at home deliveries, through the perspective of those involved in the project. This qualitative study, conducted between July 2009 and July 2010 in Bangladesh, was nested within larger operations research testing the feasibility and acceptability of scaling up community-based provision of misoprostol and a blood measurement tool for the prevention of PPH. A total of 87 in-depth interviews (IDIs) were conducted with TBAs, community health workers (CHWs), managers, and government-employed family welfare visitors (FWVs) at three time points during the study. Computer-assisted thematic data analysis was conducted using ATLAS.ti (version 5.2). Four primary themes emerged during the data analysis, all highlighting changes that occurred following the training. The first theme describes the perceived direct changes linked to the two new interventions. The following three themes describe the indirect changes that interviewees perceived: strengthened linkages between TBAs and the formal healthcare system; strengthened linkages between TBAs and the communities they serve; and improved quality of services and service utilization. The data indicate that training TBAs and CHW supervisors resulted in perceived changes broader and more nuanced than simply improvements in TBAs' knowledge, attitudes, and practices. Acknowledging TBAs' important role in the community and in home deliveries, and integrating them into the formal healthcare system, has the potential to produce changes similar to those seen in this study. PMID:24847601

  9. Ontological analysis of SNOMED CT.

    PubMed

    Héja, Gergely; Surján, György; Varga, Péter

    2008-10-27

    SNOMED CT is the most comprehensive medical terminology. However, its use for intelligent services based on formal reasoning is questionable. This analysis of the structure of SNOMED CT is based on the formal top-level ontology DOLCE. The analysis revealed several ontological and knowledge-engineering errors, the most important of which are errors in the hierarchy (mostly from an ontological point of view, but also regarding medical aspects) and the mixing of subsumption relations with other relation types (mostly 'part of'). The errors found impede formal reasoning. The paper presents a possible way to correct these problems.

  10. Formal Analysis of Extended Well-Clear Boundaries for Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Narkawicz, Anthony

    2016-01-01

    This paper concerns the application of formal methods to the definition of a detect and avoid concept for unmanned aircraft systems (UAS). In particular, it illustrates how formal analysis was used to explain and correct unexpected behaviors of the logic that issues alerts when two aircraft are predicted not to be well clear from one another. As a result of this analysis, a recommendation was proposed to, and subsequently adopted by, the US standards organization that defines the minimum operational requirements for the UAS detect and avoid concept.
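    As background, well-clear logics of this kind are typically phrased as a violation predicate over the relative state of the two aircraft; the generic form below is illustrative only and is not the SC-228 definition the paper analyzes.

        % Generic shape of a well-clear violation predicate (illustrative):
        \begin{equation}
          \mathrm{WCV}(\vec{s}, \vec{v}) \iff
            \big( 0 \le t_{\mathrm{cpa}} \le T \;\wedge\; d_{\mathrm{cpa}} \le D \big)
            \;\wedge\; |s_z| \le H
        \end{equation}
        % t_cpa: time to horizontal closest point of approach
        % d_cpa: horizontal separation at that time
        % T, D, H: time, horizontal-distance, and vertical-distance thresholds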

  11. Formal Assurance Arguments: A Solution In Search of a Problem?

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.

    2015-01-01

    An assurance case comprises evidence and argument showing how that evidence supports assurance claims (e.g., about safety or security). It is unsurprising that some computer scientists have proposed formalizing assurance arguments: most associate formality with rigor. But while engineers can sometimes prove that source code refines a formal specification, it is not clear that formalization will improve assurance arguments or that this benefit is worth its cost. For example, formalization might reduce the benefits of argumentation by limiting the audience to people who can read formal logic. In this paper, we present (1) a systematic survey of the literature surrounding formal assurance arguments, (2) an analysis of errors that formalism can help to eliminate, (3) a discussion of existing evidence, and (4) suggestions for experimental work to definitively answer the question.

  12. Killings, duality and characteristic polynomials

    NASA Astrophysics Data System (ADS)

    Álvarez, Enrique; Borlaf, Javier; León, José H.

    1998-03-01

    In this paper the complete geometrical setting of (lowest-order) abelian T-duality is explored with the help of some new geometrical tools (the reduced formalism). In particular, all invariant polynomials (the integrands of the characteristic classes) can be explicitly computed for the dual model in terms of quantities pertaining to the original one, with the help of the canonical connection, whose intrinsic characterization is given. Using our formalism, the physically relevant and T-duality-invariant result that top forms vanish when there is an isometry without fixed points is easily proved.

  13. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  14. What constitutes a good hand offs in the emergency department: a patient's perspective.

    PubMed

    Downey, La Vonne; Zun, Leslie; Burke, Trena

    2013-01-01

    The aim is to determine, from the patient's perspective, what constitutes a good hand-off procedure in the emergency department (ED). The secondary purpose is to evaluate what impact a formalized hand-off had on patient knowledge, throughput, and customer service. This study used a randomized controlled clinical trial involving two distinct hand-off approaches and a convenience sample. The study alternated between the current hand-off process, which documented the process but not specific elements (referred to as the informal process), and one using the IPASS the BATON process (the formal process). Consenting patients completed a 12-question validated questionnaire on how the process was perceived by patients and on their understanding of why they waited in the ED. Statistical analysis using SPSS calculated descriptive frequencies and t-tests. In total 107 patients were enrolled: 50 in the informal and 57 in the formal group. Most patients had positive answers to the customer survey. There were significant differences between the formal and informal groups: recalling the oncoming and outgoing physicians coming to the patient's bed (p = 0.000), with more formal-group than informal-group patients recalling that; the oncoming physician introducing him/herself (p = 0.01), with more from the formal group answering yes; and the physician discussing tests and implications with formal-group patients (p = 0.02). This study was done at an urban inner-city ED, a fact that may have skewed its results; a comparison with suburban and rural EDs would make the results stronger. It also reflected a very high level of customer satisfaction within the ED. This lack of variance may mean that the correlation between customer service and hand-offs was missed or underrepresented. There was no codified observation of either those using the IPASS the BATON script or those using informal procedures, so no comparison of the level and types of information given between the two groups was done. There could also have been a bias from attending physicians who had internalized the IPASS the BATON procedures and used them even when assigned to the informal group. A hand-off from one physician to the next in the emergency department is best done using a formalized process. IPASS the BATON is a useful tool for hand-offs in the ED, in part because it involves the patient in the process. The formal hand-off increased communication between patient and doctor, as its use increased the patient's opportunity to ask and respond to questions. The researchers evaluated an ED physician-specific hand-off process and illustrate the value and impact of involving patients in the hand-off process.

  15. Critical Analysis of the Mathematical Formalism of Theoretical Physics. II. Foundations of Vector Calculus

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2014-03-01

    A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis, the unity of formal logic and of rational dialectics; (b) it does not contain correct definitions of "movement," "direction," and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and, therefore, it has no natural-scientific meaning; and (d) operations on "physical vectors" and the vector calculus propositions relating to "physical vectors" are contrary to formal logic.

  16. Microprocessor Simulation: A Training Technique.

    ERIC Educational Resources Information Center

    Oscarson, David J.

    1982-01-01

    Describes the design and application of a microprocessor simulation using BASIC for formal training of technicians and managers and as a management tool. Illustrates the utility of the modular approach for the instruction and practice of decision-making techniques. (SK)

  17. Best behaviour? Ontologies and the formal description of animal behaviour.

    PubMed

    Gkoutos, Georgios V; Hoehndorf, Robert; Tsaprouni, Loukia; Schofield, Paul N

    2015-10-01

    The development of ontologies for describing animal behaviour has proved to be one of the most difficult of all scientific knowledge domains. Ranging from neurological processes to human emotions, the range and scope needed for such ontologies is highly challenging; but if data integration and computational tools such as automated reasoning are to be fully applied in this important area, the underlying principles of these ontologies need to be better established and their development needs detailed coordination. Whilst the state of scientific knowledge is always paramount in ontology and formal description framework design, this is a particular problem for neurobehavioural ontologies, where our understanding of the relationship between behaviour and its underlying biophysical basis is currently in its infancy. In this commentary, we discuss some of the fundamental problems in designing and using behaviour ontologies, and present some of the best developed tools in this domain.

  18. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  19. Accuracy of visual assessments of proliferation indices in gastroenteropancreatic neuroendocrine tumours.

    PubMed

    Young, Helen T M; Carr, Norman J; Green, Bryan; Tilley, Charles; Bhargava, Vidhi; Pearce, Neil

    2013-08-01

    To compare the accuracy of eyeball estimates of the Ki-67 proliferation index (PI) with formal counting of 2000 cells as recommended by the Royal College of Pathologists. Sections from gastroenteropancreatic neuroendocrine tumours were immunostained for Ki-67. PI was calculated using three methods: (1) a manual tally count of 2000 cells from the area of highest nuclear labelling using a microscope eyepiece graticule; (2) eyeball estimates made by four pathologists within the same area of highest nuclear labelling; and (3) image analysis of microscope photographs taken from this area using the ImageJ 'cell counter' tool. ImageJ analysis was considered the gold standard for comparison. Levels of agreement between methods were evaluated using Bland-Altman plots. Agreement between the manual tally and ImageJ assessments was very high at low PIs. Agreement between eyeball assessments and ImageJ analysis varied between pathologists. Where data for low PIs alone were analysed, there was a moderate level of agreement between pathologists' estimates and the gold standard, but when all data were included, agreement was poor. Manual tally counts of 2000 cells exhibited similar levels of accuracy to the gold standard, especially at low PIs. Eyeball estimates were significantly less accurate than the gold standard. This suggests that tumour grades may be misclassified by eyeballing and that formal tally counting of positive cells produces more reliable results. Further studies are needed to identify accurate, clinically appropriate ways of calculating the PI.
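    The agreement statistics behind such Bland-Altman comparisons are simple to compute: the bias is the mean difference between methods and the 95% limits of agreement are the bias plus or minus 1.96 standard deviations of the differences. The values below are invented for illustration, not the study's data.

        # Bland-Altman bias and 95% limits of agreement (made-up PI values).
        import numpy as np

        eyeball = np.array([2.0, 5.0, 12.0, 30.0, 55.0])  # visual estimates (%)
        imagej = np.array([1.5, 4.0, 10.0, 36.0, 48.0])   # ImageJ counts (%)

        diff = eyeball - imagej
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        print(f"bias {bias:+.2f}; limits of agreement "
              f"{bias - loa:.2f} to {bias + loa:.2f}")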

  20. Persian version of frontal assessment battery: Correlations with formal measures of executive functioning and providing normative data for Persian population.

    PubMed

    Asaadi, Sina; Ashrafi, Farzad; Omidbeigi, Mahmoud; Nasiri, Zahra; Pakdaman, Hossein; Amini-Harandi, Ali

    2016-01-05

    Cognitive impairment in patients with Parkinson's disease (PD) mainly involves executive function (EF). The frontal assessment battery (FAB) is an efficient tool for the assessment of EFs. The aims of this study were to determine the validity and reliability of the psychometric properties of the Persian version of the FAB and to assess its correlation with formal measures of EFs, providing normative data for the Persian version of the FAB in patients with PD. The study recruited 149 healthy participants and 49 patients with idiopathic PD. In PD patients, FAB results were compared to their performance on EF tests. Reliability analysis involved test-retest reliability and internal consistency, whereas validity analysis involved a convergent validity approach. FAB scores were compared between normal controls and PD patients matched for age, education, and Mini-Mental State Examination (MMSE) score. In PD patients, FAB scores were significantly decreased compared to normal controls and correlated with the Stroop test and the Wisconsin Card Sorting Test (WCST). In healthy subjects, FAB scores varied according to age, education, and MMSE. In the FAB subtest analysis, the performance of PD patients was worse than that of the healthy participants on similarities, fluency tasks, and Luria's motor series. The Persian version of the FAB can be used as a reliable scale for the assessment of frontal lobe functions in Iranian patients with PD. Furthermore, the normative data provided for the Persian version of this test improve the accuracy and confidence of its clinical application.

  1. The production of audiovisual teaching tools in minimally invasive surgery.

    PubMed

    Tolerton, Sarah K; Hugh, Thomas J; Cosman, Peter H

    2012-01-01

    Audiovisual learning resources have become valuable adjuncts to formal teaching in surgical training. This report discusses the process and challenges of preparing an audiovisual teaching tool for laparoscopic cholecystectomy. The relative value in surgical education and training, for both the creator and the viewer, is addressed. This audiovisual teaching resource was prepared as part of the Master of Surgery program at the University of Sydney, Australia. The different methods of video production used to create operative teaching tools are discussed. Collating and editing material for an audiovisual teaching resource can be a time-consuming and technically challenging process. However, quality learning resources can now be produced even with limited prior video-editing experience. With minimal cost and suitable guidance to ensure clinically relevant content, most surgeons should be able to produce short, high-quality education videos of both open and minimally invasive surgery. Despite the challenges faced during the production of audiovisual teaching tools, these resources are now relatively easy to produce using readily available software. They are particularly attractive to surgical trainees when real-time operative footage is used, and they serve as valuable adjuncts to formal teaching, particularly in the setting of minimally invasive surgery. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  2. Java PathExplorer: A Runtime Verification Tool

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We describe recent work on designing an environment called Java PathExplorer for monitoring the execution of Java programs. This environment facilitates the testing of execution traces against high level specifications, including temporal logic formulae. In addition, it contains algorithms for detecting classical error patterns in concurrent programs, such as deadlocks and data races. An initial prototype of the tool has been applied to the executive module of the planetary Rover K9, developed at NASA Ames. In this paper we describe the background and motivation for the development of this tool, including comments on how it relates to formal methods tools as well as to traditional testing, and we then present the tool itself.
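    The core runtime-verification idea, checking an execution trace against a temporal property, fits in a few lines; the event names below are hypothetical, and JPaX itself instruments Java bytecode and handles much richer temporal logics.

        # Minimal trace monitor for "every acquire is eventually released".
        def eventually_released(trace):
            held = 0
            for event in trace:
                if event == "acquire":
                    held += 1
                elif event == "release":
                    held -= 1
            return held == 0  # False means some lock was never released

        print(eventually_released(["acquire", "work", "release"]))  # True
        print(eventually_released(["acquire", "work"]))             # False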

  3. Variability of ethics education in laboratory medicine training programs: results of an international survey.

    PubMed

    Bruns, David E; Burtis, Carl A; Gronowski, Ann M; McQueen, Matthew J; Newman, Anthony; Jonsson, Jon J

    2015-03-10

    Ethical considerations are increasingly important in medicine. We aimed to determine the mode and extent of the teaching of ethics in training programs in clinical chemistry and laboratory medicine. We developed an online survey of teaching in areas of ethics relevant to laboratory medicine. Responses were invited from directors of training programs, who were recruited via email to leaders of national organizations. The survey was completed by 80 directors from 24 countries who directed 113 programs. The largest numbers of respondents directed postdoctoral training of scientists (42%) or physicians (33%), post-master's degree programs (33%), and PhD programs (29%). Most programs (82%) were 2 years or longer in duration. Formal training was offered in research ethics by 39%, medical ethics by 31%, professional ethics by 24%, and business ethics by 9%. The number of reported hours of formal training varied widely, e.g., from 0 to >15 h/year for research ethics and from 0 to >15 h for medical ethics. Ethics training was required and/or tested in 75% of programs that offered training. A majority (54%) of respondents reported plans to add or enhance training in ethics; many indicated a desire for online resources related to ethics, especially resources with self-assessment tools. Formal teaching of ethics is absent from many training programs in clinical chemistry and laboratory medicine, with heterogeneity in the extent and methods of ethics training among the programs that provide it. A perceived need exists for online training tools, especially tools with self-assessment components. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Recruitment of underrepresented minority students to medical school: minority medical student organizations, an untapped resource.

    PubMed

    Rumala, Bernice B; Cason, Frederick D

    2007-09-01

    Recruitment of more underrepresented minority students (black, Hispanic, and Native American) to increase racial diversity in the physician workforce is on the agenda for medical schools around the nation. The benefits of having a racially diverse class are indisputable: minority physicians are more likely to provide care to minority, underserved, disadvantaged, and low-income populations. Therefore, medical schools would benefit from strategies for the recruitment of underrepresented minority (URM) students. Numerous recruitment strategies have been employed to increase the number of underrepresented minority students. However, formal collaboration with minority medical student organizations is an underutilized tool in the recruitment process. Many medical schools have informally used minority medical students and members of various minority organizations on campus in the recruitment process, but a formal collaboration entailing a strategic approach to using minority medical student organizations has yet to be described in the literature. This paper discusses the innovative collaboration between the University of Toledo College of Medicine (UTCOM) chapter of the Student National Medical Association (SNMA) and the college of medicine's admissions office to strategize a recruitment plan to increase the number of underrepresented minority students at the UTCOM. This paper suggests that minority medical student organizations, particularly the SNMA, can be used as a recruiting tool; hence, admissions offices should not overlook the usefulness of formal involvement of minority medical student organizations in recruiting. This approach may also be applicable to residency programs and other graduate professional fields with a severe shortage of URM students.

  5. Present-value analysis: A systems approach to public decisionmaking for cost effectiveness

    NASA Technical Reports Server (NTRS)

    Herbert, T. T.

    1971-01-01

    Decision makers within governmental agencies and Congress must evaluate competing (and sometimes conflicting) proposals that seek funding and implementation. Present-value analysis can be an effective decision-making tool, enabling the formal evaluation of the effects of competing proposals on efficient national resource utilization. A project's costs are not only its direct disbursements but its social costs as well: how much does it cost to have those funds diverted from their use and economic benefit by the private sector to the public project? Comparisons of competing projects' social costs allow decision makers to expand their decision bases by quantifying the projects' impacts upon the economy and the efficient utilization of the country's limited national resources. A conceptual model is established for choosing the appropriate discount rate to be used in such evaluation decisions.
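    A worked example makes the comparison concrete; the cash flows and the 7% discount rate below are assumptions for illustration.

        # Present-value comparison of two hypothetical projects.
        def present_value(cash_flows, rate):
            # Discount yearly costs (year 0 first) back to today.
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        project_a = [100, 20, 20, 20]   # heavy up-front disbursement
        project_b = [40, 55, 55, 20]    # costs spread over later years

        for name, flows in (("A", project_a), ("B", project_b)):
            print(name, round(present_value(flows, 0.07), 1))

    The project with the lower discounted social cost makes the lighter claim on national resources, which is exactly the comparison the technique formalizes.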

  6. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    NASA Technical Reports Server (NTRS)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.
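    The envisioned system's pairing of estimates with explanations can be sketched as rules that both adjust a base cost and record why they fired; the rule names, factors, and mission attributes below are hypothetical.

        # Sketch of rule-based FSW cost estimation with explanations.
        rules = [
            ("new processor architecture", lambda m: m["new_processor"], 1.30),
            ("high inheritance from prior mission", lambda m: m["reuse"] > 0.5, 0.80),
        ]

        def estimate(mission, base_cost):
            cost, rationale = base_cost, []
            for name, applies, factor in rules:
                if applies(mission):
                    cost *= factor
                    rationale.append(f"{name}: x{factor}")
            return cost, rationale

        print(estimate({"new_processor": True, "reuse": 0.6}, base_cost=10.0))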

  7. Bridging the Gulf between Formal Calculus and Physical Reasoning.

    ERIC Educational Resources Information Center

    Van Der Meer, A.

    1980-01-01

    Some ways to link calculus instruction with the mathematical models used in physics courses are presented. The activity of modelling is presented as a major tool in synchronizing physics and mathematics instruction in undergraduate engineering programs. (MP)

  8. Using NASA Data in the Classroom: Promoting STEM Learning in Formal Education using Real Space Science Data

    NASA Astrophysics Data System (ADS)

    Lawton, B.; Hemenway, M. K.; Mendez, B.; Odenwald, S.

    2013-04-01

    Among NASA's major education goals is the training of students in the Science, Technology, Engineering, and Math (STEM) disciplines. The use of real data, from some of the most sophisticated observatories in the world, provides formal educators the opportunity to teach their students real-world applications of the STEM subjects. Combining real space science data with lessons aimed at meeting state and national education standards provides a memorable educational experience that students can build upon throughout their academic careers. Many of our colleagues have adopted the use of real data in their education and public outreach (EPO) programs. There are challenges in creating resources using real data for classroom use that include, but are not limited to, accessibility to computers/Internet and proper instruction. Understanding and sharing these difficulties and best practices with the larger EPO community is critical to the development of future resources. In this session, we highlight three examples of how NASA data is being utilized in the classroom: the Galaxies and Cosmos Explorer Tool (GCET) that utilizes real Hubble Space Telescope data; the computer image-analysis resources utilized by the NASA WISE infrared mission; and the space science derived math applications from SpaceMath@NASA featuring the Chandra and Kepler space telescopes. Challenges and successes are highlighted for these projects. We also facilitate small-group discussions that focus on additional benefits and challenges of using real data in the formal education environment. The report-outs from those discussions are given here.

  9. [Formal quality assessment of informed consent documents in 9 hospitals].

    PubMed

    Calle-Urra, J E; Parra-Hidalgo, P; Saturno-Hernández, P J; Martínez-Martínez, M J; Navarro-Moya, F J

    2013-01-01

    Informed consent forms are very important in the process of medical information. The aim of this study is to design reliable formal quality criteria for these documents and to apply them in the evaluation of the forms used in the hospitals of a regional health service. Criteria were designed from an analysis of existing regulations, previous studies, and consultation with key experts. Interobserver concordance was assessed using the kappa index. The evaluation was performed on 1425 documents from 9 hospitals. A total of 19 criteria for evaluating the quality of informed consent forms were obtained. Kappa values were higher than 0.60 for 17 of them and higher than 0.52 for the other 2. The average number of defects per document was 7.6, with a high-low ratio among hospitals of 1.84. More than 90% of the documents had defects in the information on consequences and contraindications, and about 90% did not mention giving a copy to the patient. More than 60% failed to state the purpose of the procedure, a declaration of having understood and clarified doubts, and the treatment options. A tool has been obtained to reliably assess the formal quality of informed consent forms. The documents assessed have a wide margin for improvement, particularly in giving a copy to the patient and in some aspects of the specific information patients should receive. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.
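
    For reference, the kappa index used here to assess interobserver concordance can be computed as below; this is a generic Cohen's kappa sketch with invented ratings, not the study's data:

    ```python
    from collections import Counter

    def cohens_kappa(rater1, rater2):
        """Cohen's kappa for two raters over the same items."""
        assert len(rater1) == len(rater2)
        n = len(rater1)
        po = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
        c1, c2 = Counter(rater1), Counter(rater2)
        pe = sum(c1[k] * c2[k] for k in c1) / n**2             # chance agreement
        return (po - pe) / (1 - pe)

    # Hypothetical yes/no judgements of one formal criterion on ten documents
    r1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    r2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
    print(f"kappa = {cohens_kappa(r1, r2):.2f}")
    ```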

  10. On making things the best - Aeronautical uses of optimization /Wright Bros. lecture/

    NASA Technical Reports Server (NTRS)

    Ashley, H.

    1981-01-01

    The paper summarizes and evaluates the degree to which formal optimization methods have contributed practically to the design and operation of atmospheric flight vehicles. The nature of this technology is reviewed and illustrated with simple structural examples. A series of published successful applications is described, drawn from aerodynamics, structures, guidance and control, optimal trajectories, and vehicle configuration optimization, and the corresponding improvements over conventional analysis are assessed. Speculations are offered as to why these tools have made so little headway toward acceptance by designers. The growing need for their use in the future is explained; they hold out an unparalleled opportunity for improved efficiencies.

  11. A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration

    PubMed Central

    Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.

    2014-01-01

    Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite this rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large-scale morphogenesis that match the published data in the limb regeneration field. Major barriers preventing an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments, together with a user-friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585

  12. Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.

    PubMed

    Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor

    2011-09-01

    Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by diverse stakeholder views. Multi-criteria decision analysis (MCDA) has emerged as a formal methodology for combining available technical information with stakeholder values to support decisions in many fields, and it can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and decision or intervention type, as well as by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest significant growth in environmental applications of MCDA over the last decade across all application areas. Multiple MCDA tools have been used successfully for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of the few papers in which several methods were applied in parallel to the same problem indicates that the recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.
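
    As a minimal sketch of the simplest MCDA aggregation, a weighted-sum model (akin to multi-attribute utility theory with linear value functions; all scores and weights below are invented):

    ```python
    import numpy as np

    # Hypothetical decision matrix: rows = remediation alternatives,
    # columns = criteria (cost, ecological risk, social acceptance), all
    # already normalized so that larger is better.
    scores = np.array([
        [0.6, 0.8, 0.4],
        [0.9, 0.3, 0.7],
        [0.5, 0.6, 0.9],
    ])
    weights = np.array([0.5, 0.3, 0.2])     # stakeholder-elicited weights (sum to 1)

    overall = scores @ weights              # simple additive (weighted-sum) model
    for name, s in zip("ABC", overall):
        print(f"alternative {name}: {s:.2f}")
    print("recommended:", "ABC"[overall.argmax()])
    ```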

  13. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov test. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet, and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers in the use of such sophisticated tools and in matching these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and discussed in detail with the questioner. Summaries of questions and answers leading to new approaches were posted on the Web (www.stat.psu.edu/~mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997, and is of interest to scientists both within and outside astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES, and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing statistically oriented papers submitted to the Astrophysical Journal, giving talks at meetings (including Feigelson's talk to science journalists, "The reemergence of astrostatistics," at the American Association for the Advancement of Science meeting), and publishing papers of astrostatistical content.

  14. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services: the first provides tools for extracting spatiotemporal knowledge from image sets, and the second provides high-level knowledge management and reasoning services. We then present the Cellular Imaging Markup Language, an XML-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  15. Restorative Practices as Formal and Informal Education

    ERIC Educational Resources Information Center

    Carter, Candice C.

    2013-01-01

    This article reviews restorative practices (RP) as education in formal and informal contexts of learning that are fertile sites for cultivating peace. Formal practices involve instruction about response to conflict, while informal learning occurs beyond academic lessons. The research incorporated content analysis and a critical examination of the…

  16. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
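
    A toy illustration of the state machine analysis approach, using explicit breadth-first reachability over an invented transition relation rather than the symbolic techniques surveyed in the paper:

    ```python
    from collections import deque

    # Toy state-machine analysis: breadth-first reachability over an explicit
    # transition relation, checking that no "bad" state is reachable. The
    # machine itself is invented for illustration.

    TRANSITIONS = {            # state -> set of successor states
        "idle":  {"req"},
        "req":   {"grant", "idle"},
        "grant": {"idle"},
    }
    BAD = {"grant_conflict"}   # states that would violate the property

    def reachable(initial):
        seen, queue = {initial}, deque([initial])
        while queue:
            s = queue.popleft()
            for t in TRANSITIONS.get(s, ()):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return seen

    states = reachable("idle")
    print("verified" if not (states & BAD) else "property violated")
    ```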

  17. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built, ultimately resulting in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors and to formalize their coastal zone perceptions and knowledge, and we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), in which the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone; these were classified into five major categories: governance, infrastructure, environment, intersectoral interactions, and sectoral initiatives. Furthermore, common problems, expectations, and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perceptions and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
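
    A minimal sketch of how a fuzzy cognitive map is typically iterated (the concepts, weights, and sigmoid squashing function below are illustrative assumptions loosely echoing the paper's categories, not the elicited maps):

    ```python
    import numpy as np

    # Minimal fuzzy cognitive map: concepts are activation levels in [0, 1],
    # W[i, j] is the signed influence of concept i on concept j.
    concepts = ["governance", "infrastructure", "environment", "tourism"]
    W = np.array([
        [0.0,  0.6,  0.4,  0.2],
        [0.0,  0.0, -0.3,  0.5],
        [0.0,  0.0,  0.0,  0.6],
        [0.0,  0.0, -0.4,  0.0],
    ])

    def step(a, W, lam=1.0):
        return 1.0 / (1.0 + np.exp(-lam * (a + a @ W)))  # sigmoid squashing

    a = np.array([0.8, 0.2, 0.5, 0.3])      # initial stakeholder-assigned levels
    for _ in range(50):                      # iterate toward a fixed point
        a = step(a, W)
    print(dict(zip(concepts, a.round(2))))
    ```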

  18. draco: Analysis and simulation of drift scan radio data

    NASA Astrophysics Data System (ADS)

    Shaw, J. Richard

    2017-12-01

    draco analyzes transit radio data with the m-mode formalism. It is telescope agnostic, and is used as part of the analysis and simulation pipeline for the CHIME (Canadian Hydrogen Intensity Mapping Experiment) telescope. It can simulate time stream data from maps of the sky (using the m-mode formalism) and add gain fluctuations and correctly correlated instrumental noise (i.e. Wishart distributed). Further, it can perform various cuts on the data and make maps of the sky from data using the m-mode formalism.
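
    A toy sketch of the core m-mode idea, under the assumption that a drift-scan timestream is periodic over a sidereal day so its discrete Fourier transform over the day yields the m-modes (synthetic signal; not CHIME data or the draco API):

    ```python
    import numpy as np

    # Synthetic drift-scan timestream sampled over one sidereal day.
    n = 4096
    phi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)   # sidereal angle
    timestream = (np.cos(3 * phi) + 0.5 * np.sin(7 * phi)
                  + 0.05 * np.random.default_rng(0).normal(size=n))

    # v_m = (1/N) sum_t v(t) exp(-i m phi_t): one coefficient per m-mode.
    m_modes = np.fft.rfft(timestream) / n
    for m in (3, 7):
        print(f"|v_{m}| = {abs(m_modes[m]):.3f}")
    ```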

  19. Towards automated processing of clinical Finnish: sublanguage analysis and a rule-based parser.

    PubMed

    Laippala, Veronika; Ginter, Filip; Pyysalo, Sampo; Salakoski, Tapio

    2009-12-01

    In this paper, we present steps taken towards more efficient automated processing of clinical Finnish, focusing on daily nursing notes in a Finnish Intensive Care Unit (ICU). First, we analyze ICU Finnish as a sublanguage, identifying specific features that facilitate, for example, the development of a specialized syntactic analyzer. The identified features include frequent omission of finite verbs, limitations in allowed syntactic structures, and domain-specific vocabulary. Second, we develop a formal grammar and a parser for ICU Finnish, thus providing better tools for the development of further applications in the clinical domain. The grammar is implemented in the LKB system in a typed feature structure formalism. The lexicon is automatically generated from the output of the FinTWOL morphological analyzer adapted to the clinical domain. As an additional experiment, we study the effect of using Finnish constraint grammar to reduce the size of the lexicon. The parser construction thus makes efficient use of existing resources for Finnish. The grammar currently covers 76.6% of ICU Finnish sentences, producing highly accurate best-parse analyses with an F-score of 91.1%. We find that building a parser for this highly specialized domain sublanguage is not only feasible but also surprisingly efficient, given an existing morphological analyzer with broad vocabulary coverage. The resulting parser enables a deeper analysis of the text than was previously possible.

  20. Formalizing New Navigation Requirements for NASA's Space Shuttle

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Yiying, E-mail: yiyingyan@sjtu.edu.cn; Lü, Zhiguo, E-mail: zglv@sjtu.edu.cn; Zheng, Hang, E-mail: hzheng@sjtu.edu.cn

    We present a theoretical formalism for resonance fluorescence radiating from a two-level system (TLS) driven by any periodic driving and coupled to multiple reservoirs. The formalism is derived analytically by combining Floquet theory with the Born-Markov master equation, and it allows the spectrum to be calculated whenever the Floquet states and quasienergies can be solved, analytically or numerically, for simple or complicated driving fields. Implementing the formalism, we can systematically explore the spectral features. To exemplify the theory, we apply the unified formalism to a generic model in which a harmonically driven TLS is simultaneously coupled to a radiative reservoir and a dephasing reservoir. We demonstrate that the significant features of the fluorescence spectra, the driving-induced asymmetry and the dephasing-induced asymmetry, can be attributed to the violation of the detailed-balance condition and explained in terms of the driving-related transition quantities between Floquet states and their steady populations. In addition, we find distinctive features of the fluorescence spectra under biharmonic and multiharmonic driving fields, in contrast with the harmonic driving case. For biharmonic driving, we find that the spectra differ significantly from the result of the rotating-wave approximation (RWA) under multiple resonance conditions. Through these three concrete applications, we illustrate that the present formalism provides a routine tool for comprehensively exploring the fluorescence spectrum of periodically and strongly driven TLSs.
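
    For reference, the Floquet machinery invoked above rests on the standard theorem for time-periodic Hamiltonians (a textbook result, not this paper's specific derivation): solutions factor into a quasienergy phase and a periodic part,

    ```latex
    % Floquet theorem for H(t + T) = H(t):
    \[
      |\psi_\alpha(t)\rangle \;=\; e^{-i \varepsilon_\alpha t/\hbar}\, |u_\alpha(t)\rangle,
      \qquad
      |u_\alpha(t + T)\rangle \;=\; |u_\alpha(t)\rangle ,
    \]
    % with quasienergies and Floquet states determined by the eigenproblem
    \[
      \bigl( H(t) - i\hbar\,\partial_t \bigr)\, |u_\alpha(t)\rangle
      \;=\; \varepsilon_\alpha\, |u_\alpha(t)\rangle .
    \]
    ```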

  2. Solar-Terrestrial Ontology Development

    NASA Astrophysics Data System (ADS)

    McGuinness, D.; Fox, P.; Middleton, D.; Garcia, J.; Cinquni, L.; West, P.; Darnell, J. A.; Benedict, J.

    2005-12-01

    The development of an interdisciplinary virtual observatory (the Virtual Solar-Terrestrial Observatory; VSTO) as a scalable environment for searching, integrating, and analyzing databases distributed over the Internet requires a higher level of semantic interoperability than heretofore required by most (if not all) distributed data systems or discipline-specific virtual observatories. The formalization of semantics using ontologies and their encodings for the Internet (e.g., OWL, the Web Ontology Language), together with accompanying tools such as reasoning, inference, and explanation, opens up both a substantial leap in interoperability options and a need for formal development principles to guide ontology development and use within modern, multi-tiered network data environments. In this presentation, we outline the formal methodologies we utilize in the VSTO project, the currently developed use cases, and our ontologies and their relation to existing ontologies (such as SWEET).

  3. European Train Control System: A Case Study in Formal Verification

    NASA Astrophysics Data System (ADS)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
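
    A representative flavor of such a parameter constraint, written here in an assumed form adapted from the hybrid-systems literature rather than quoted from the paper: a train at position z moving at speed v can still meet its movement authority m at target speed d under maximum braking b only if

    ```latex
    % Assumed controllability-style constraint for illustration:
    \[
      v^2 - d^2 \;\le\; 2\,b\,(m - z), \qquad v \ge 0,\; d \ge 0 .
    \]
    ```

    Proving such inequalities both necessary and sufficient is what "sharp" means above: weakening them breaks safety, strengthening them breaks liveness.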

  4. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 2: Formal specification and correctness theorems

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1991-01-01

    Presented here is a formal specification and verification of a property of a quadruply redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved is given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. The verification was performed using a computer-aided design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
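
    A toy simulation of the interactive-consistency exchange among four processors, in the spirit of the Pease-Shostak-Lamport algorithm (the message format and fault model are simplified assumptions for illustration, not the verified design):

    ```python
    from collections import Counter

    N = 4
    values = {p: "A" for p in range(N)}   # each processor's private value
    faulty = {2}                          # processor 2 behaves arbitrarily

    def send(src, dst, value):
        return f"X{dst}" if src in faulty else value   # a traitor lies per receiver

    def majority(vals, default="?"):
        item, count = Counter(vals).most_common(1)[0]
        return item if count > len(vals) / 2 else default

    # Round 1: every processor broadcasts its private value.
    heard = {p: {q: send(q, p, values[q]) for q in range(N)} for p in range(N)}

    # Round 2: processors relay what they heard; each receiver majority-votes
    # the relayed copies for every source.
    vectors = {
        p: {src: majority([send(r, p, heard[r][src]) for r in range(N) if r != src])
            for src in range(N)}
        for p in range(N)
    }

    # Interactive consistency: all non-faulty processors hold identical vectors.
    good = [vectors[p] for p in range(N) if p not in faulty]
    print("all non-faulty vectors agree:", all(v == good[0] for v in good))
    ```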

  5. Developing an educational video on lung lobectomy for the general surgery resident.

    PubMed

    Hayden, Emily L; Seagull, F Jacob; Reddy, Rishindra M

    2015-06-15

    The educational resources available to general surgery residents preparing for complex thoracic surgeries vary greatly in content and target audience. We hypothesized that the preparatory resources could be improved in both efficiency of use and targeting. A formal needs analysis was performed to determine residents' knowledge gaps and desired format and/or content of an educational tool while preparing for their first lung resections. The results of the needs assessment then guided the creation of a 20-min video tool. The video was evaluated by a focus group of experts for appropriateness to the target audience, ease of use, and relevance. The needs assessment illustrated that residents feel there is a paucity of appropriate resources available to them while preparing for the lung resection procedure; 82% of respondents felt that easy-to-use and concise resources on the lobectomy procedure were either "not at all" or "somewhat" accessible. Residents reported that video was their preferred format for a learning tool overall and identified a broad spectrum of most challenging procedural aspects. These results were used to guide the creation of a 20-min video tool. A focus group validated the efficacy and appropriateness of the video. Targeted and efficient tools for residents preparing for complex subspecialty procedures are needed and valued. These results clearly encourage further work in the creation of focused educational tools for surgical residents, especially in the format of short video overviews. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Learning Needs Analysis of Collaborative E-Classes in Semi-Formal Settings: The REVIT Example

    ERIC Educational Resources Information Center

    Mavroudi, Anna; Hadzilacos, Thanasis

    2013-01-01

    Analysis, the first phase of the typical instructional design process, is often downplayed. This paper focuses on the analysis concerning a series of e-courses for collaborative adult education in semi-formal settings by reporting and generalizing results from the REVIT project. REVIT, an EU-funded research project, offered custom e-courses to…

  7. Systematic Clinical Reasoning in Physical Therapy (SCRIPT): Tool for the Purposeful Practice of Clinical Reasoning in Orthopedic Manual Physical Therapy.

    PubMed

    Baker, Sarah E; Painter, Elizabeth E; Morgan, Brandon C; Kaus, Anna L; Petersen, Evan J; Allen, Christopher S; Deyle, Gail D; Jensen, Gail M

    2017-01-01

    Clinical reasoning is essential to physical therapist practice. Solid clinical reasoning processes may lead to greater understanding of the patient condition, early diagnostic hypothesis development, and well-tolerated examination and intervention strategies, as well as mitigate the risk of diagnostic error. However, the complex and often subconscious nature of clinical reasoning can impede the development of this skill. Protracted tools have been published to help guide self-reflection on clinical reasoning but might not be feasible in typical clinical settings. This case illustrates how the Systematic Clinical Reasoning in Physical Therapy (SCRIPT) tool can be used to guide the clinical reasoning process and prompt a physical therapist to search the literature to answer a clinical question and facilitate formal mentorship sessions in postprofessional physical therapist training programs. The SCRIPT tool enabled the mentee to generate appropriate hypotheses, plan the examination, query the literature to answer a clinical question, establish a physical therapist diagnosis, and design an effective treatment plan. The SCRIPT tool also facilitated the mentee's clinical reasoning and provided the mentor insight into the mentee's clinical reasoning. The reliability and validity of the SCRIPT tool have not been formally studied. Clinical mentorship is a cornerstone of postprofessional training programs and intended to develop advanced clinical reasoning skills. However, clinical reasoning is often subconscious and, therefore, a challenging skill to develop. The use of a tool such as the SCRIPT may facilitate developing clinical reasoning skills by providing a systematic approach to data gathering and making clinical judgments to bring clinical reasoning to the conscious level, facilitate self-reflection, and make a mentored physical therapist's thought processes explicit to his or her clinical mentor. © 2017 American Physical Therapy Association

  8. Pedagogical Basis of DAS Formalism in Engineering Education

    ERIC Educational Resources Information Center

    Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.

    2011-01-01

    The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…

  9. Federal and tribal lands road safety audits : case studies

    DOT National Transportation Integrated Search

    2009-12-01

    A road safety audit (RSA) is a formal safety performance examination by an independent, multidisciplinary team. RSAs are an effective tool for proactively improving the safety performance of a road project during the planning and design stages, and f...

  10. Influence Diagrams as Decision-Making Tools for Pesticide Risk Management

    EPA Science Inventory

    The pesticide policy arena is filled with discussion of probabilistic approaches to assess ecological risk, however, similar discussions about implementing formal probabilistic methods in pesticide risk decision making are less common. An influence diagram approach is proposed f...

  11. Analyzing stakeholder preferences for managing elk and bison at the National Elk Refuge and Grand Teton National Park: An example of the disparate stakeholder management approach

    USGS Publications Warehouse

    Koontz, Lynne; Hoag, Dana L.

    2005-01-01

    Many programs and tools have been developed by different disciplines to facilitate group negotiation and decision making. Three examples are relevant here. First, decision analysis models such as the Analytical Hierarchy Process (AHP) are commonly used to prioritize the goals and objectives of stakeholders’ preferences for resource planning by formally structuring conflicts and assisting decision makers in developing a compromised solution (Forman, 1998). Second, institutional models such as the Legal Institutional Analysis Model (LIAM) have been used to describe the organizational rules of behavior and the institutional boundaries constraining management decisions (Lamb and others, 1998). Finally, public choice models have been used to predict the potential success of rent-seeking activity (spending additional time and money to exert political pressure) to change the political rules (Becker, 1983). While these tools have been successful at addressing various pieces of the natural resource decision making process, their use in isolation is not enough to fully depict the complexities of the physical and biological systems with the rules and constraints of the underlying economic and political systems. An approach is needed that combines natural sciences, economics, and politics.
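
    As a sketch of the AHP step mentioned above, priority weights can be derived from a pairwise-comparison matrix via its principal eigenvector (the judgements below are invented):

    ```python
    import numpy as np

    # Toy AHP: pairwise comparisons of three management objectives on
    # Saaty's 1-9 scale; A[i, j] is how much objective i dominates j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvector
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    print("priority weights:", w.round(3))

    # Saaty's consistency index: CI = (lambda_max - n) / (n - 1)
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    print(f"consistency index: {ci:.3f}")
    ```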

  12. The evolving field of prognostication and risk stratification in MDS: Recent developments and future directions.

    PubMed

    Lee, Eun-Ju; Podoltsev, Nikolai; Gore, Steven D; Zeidan, Amer M

    2016-01-01

    The clinical course of patients with myelodysplastic syndromes (MDS) is characterized by wide variability reflecting the underlying genetic and biological heterogeneity of the disease. Accurate prediction of outcomes for individual patients is an integral part of the evidence-based risk/benefit calculations that are necessary for tailoring the aggressiveness of therapeutic interventions. While several prognostication tools have been developed and validated for risk stratification, each of these systems has limitations. The recent progress in genomic sequencing techniques has led to discoveries of recurrent molecular mutations in MDS patients with independent impact on relevant clinical outcomes. Reliable assays of these mutations have already entered the clinic and efforts are currently ongoing to formally incorporate mutational analysis into the existing clinicopathologic risk stratification tools. Additionally, mutational analysis holds promise for going beyond prognostication to therapeutic selection and individualized treatment-specific prediction of outcomes; abilities that would revolutionize MDS patient care. Despite these exciting developments, the best way of incorporating molecular testing for use in prognostication and prediction of outcomes in clinical practice remains undefined and further research is warranted. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Expressions of cultural safety in public health nursing practice.

    PubMed

    Richardson, Anna; Yarwood, Judy; Richardson, Sandra

    2017-01-01

    Cultural safety is an essential concept within New Zealand nursing that is formally linked to registration and competency-based practice certification. Despite its centrality to New Zealand nursing philosophies and the stated expectation of cultural safety as a practice element, there is limited evidence of its application in the literature. This research presents insight into public health nurses' (PHN) experiences, demonstrating the integration of cultural safety principles into practice. These findings emerged from secondary analysis of data from a collaborative, educative research project in which PHNs explored the use of family assessment tools. In particular, the 15-minute interview tool was introduced and used by the PHNs when working with families. Critical analysis of transcribed data from PHN interviews, utilising a cultural safety lens, illuminated practical ways in which cultural safety concepts infused PHN practice with families. The themes that emerged reflected the interweaving of the principles of cultural safety with the application of the five components of the 15-minute interview. This highlights elements of PHN work with individuals and families not previously acknowledged. Examples of culturally safe nursing practice resonated throughout the PHN conversations as they grappled with the increasing complexity of working with a diverse range of families. © 2016 John Wiley & Sons Ltd.

  14. Developing an approach for teaching and learning about Lewis structures

    NASA Astrophysics Data System (ADS)

    Kaufmann, Ilana; Hamza, Karim M.; Rundgren, Carl-Johan; Eriksson, Lars

    2017-08-01

    This study explores first-year university students' reasoning as they learn to draw Lewis structures. We also present a theoretical account of the formal procedure commonly taught for drawing these structures. Students' discussions during problem-solving activities were video recorded and detailed analyses of the discussions were made through the use of practical epistemology analysis (PEA). Our results show that the formal procedure was central for drawing Lewis structures, but its use varied depending on situational aspects. Commonly, the use of individual steps of the formal procedure was contingent on experiences of chemical structures, and other information such as the characteristics of the problem given. The analysis revealed a number of patterns in how students constructed, checked and modified the structure in relation to the formal procedure and the situational aspects. We suggest that explicitly teaching the formal procedure as a process of constructing, checking and modifying might be helpful for students learning to draw Lewis structures. By doing so, the students may learn to check the accuracy of the generated structure not only in relation to the octet rule and formal charge, but also to other experiences that are not explicitly included in the formal procedure.
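
    One concrete check in the formal procedure, verifying formal charge, is easy to state in code; the formula (formal charge = valence electrons minus lone-pair electrons minus shared pairs) is standard chemistry, while the example molecule is merely illustrative:

    ```python
    # Formal charge check used when validating a drawn Lewis structure:
    # FC = valence electrons - nonbonding electrons - (bonding electrons)/2.
    VALENCE = {"H": 1, "C": 4, "N": 5, "O": 6, "S": 6, "Cl": 7}

    def formal_charge(element, lone_pair_electrons, bonds):
        # 'bonds' counts shared pairs, i.e. half the bonding electrons.
        return VALENCE[element] - lone_pair_electrons - bonds

    # Carbon monoxide drawn with a triple bond and one lone pair per atom:
    # C gives 4 - 2 - 3 = -1, O gives 6 - 2 - 3 = +1.
    print("C in CO:", formal_charge("C", 2, 3))
    print("O in CO:", formal_charge("O", 2, 3))
    ```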

  15. Sensitivity and specificity of a two-question screening tool for depression in a specialist palliative care unit.

    PubMed

    Payne, Ann; Barry, Sandra; Creedon, Brian; Stone, Carol; Sweeney, Catherine; O' Brien, Tony; O' Sullivan, Kathleen

    2007-04-01

    The primary objective of this study is to determine the sensitivity and specificity of a two-item screening interview for depression versus the formal psychiatric interview in the setting of a specialist palliative in-patient unit, so that individuals suffering from depressive disorder can be identified and their management optimised in this often-complex population. A prospective sample of consecutive admissions (n = 167) consented to take part in the study, and the screening interview was administered separately from the formal psychiatric interview. The two-item questionnaire achieved a sensitivity of 90.7% (95% CI 76.9-97.0) but a lower specificity of 67.7% (95% CI 58.7-75.7). The false positive rate was 32.3% (95% CI 24.3-41.3), but the false negative rate was a low 9.3% (95% CI 3.0-23.1). A subgroup analysis of individuals with a past experience of depressive illness (n = 95) revealed that a significantly larger proportion screened positive for depression: 55.2% (16/29), compared to 33.3% (22/66) of those with no background history of depression (P = 0.045). The high sensitivity and low false negative rate of the two-question screening tool will aid health professionals in identifying depression in the in-patient specialist palliative care unit. Individuals who admit to a previous experience of depressive illness are more likely to respond positively to the two-item questionnaire than those who report no prior history of depressive illness (P = 0.045).
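
    For illustration, the reported operating characteristics can be reproduced from a 2x2 table whose counts are back-calculated from the published rates (39/43 ≈ 90.7% sensitivity, 84/124 ≈ 67.7% specificity); the normal-approximation intervals below will not exactly match the paper's CIs:

    ```python
    import math

    def sens_spec(tp, fn, fp, tn, z=1.96):
        """Sensitivity and specificity with normal-approximation 95% CIs."""
        def prop_ci(k, n):
            p = k / n
            half = z * math.sqrt(p * (1 - p) / n)
            return p, max(0.0, p - half), min(1.0, p + half)
        return prop_ci(tp, tp + fn), prop_ci(tn, tn + fp)

    # Counts are illustrative back-calculations, not taken from the paper.
    (sens, s_lo, s_hi), (spec, p_lo, p_hi) = sens_spec(tp=39, fn=4, fp=40, tn=84)
    print(f"sensitivity {sens:.1%} (95% CI {s_lo:.1%}-{s_hi:.1%})")
    print(f"specificity {spec:.1%} (95% CI {p_lo:.1%}-{p_hi:.1%})")
    ```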

  16. Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.

    PubMed

    Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas

    2016-06-17

    Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.

  17. An advanced environment for hybrid modeling of biological systems based on modelica.

    PubMed

    Pross, Sabrina; Bachmann, Bernhard

    2011-01-20

    Biological systems are often very complex so that an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica-tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica-models can be connected to Simulink-models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica-tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary Cells.
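
    A minimal sketch of the discrete core of such a Petri net (places, tokens, and transition firing); the continuous and stochastic elements of the hybrid formalism, and the Modelica implementation itself, are omitted:

    ```python
    # Minimal discrete Petri net interpreter; the net below is a toy
    # reaction A + B -> C, invented for illustration.
    marking = {"A": 2, "B": 1, "C": 0}
    transitions = {
        "bind": {"in": {"A": 1, "B": 1}, "out": {"C": 1}},
    }

    def enabled(t):
        return all(marking[p] >= w for p, w in transitions[t]["in"].items())

    def fire(t):
        assert enabled(t)
        for p, w in transitions[t]["in"].items():
            marking[p] -= w
        for p, w in transitions[t]["out"].items():
            marking[p] += w

    while enabled("bind"):
        fire("bind")
    print(marking)   # -> {'A': 1, 'B': 0, 'C': 1}
    ```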

  18. A modified operational sequence methodology for zoo exhibit design and renovation: conceptualizing animals, staff, and visitors as interdependent coworkers.

    PubMed

    Kelling, Nicholas J; Gaalema, Diann E; Kelling, Angela S

    2014-01-01

    Human factors analyses have been used to improve efficiency and safety in various work environments. Although generally limited to humans, the universality of these analyses allows their formal application to a much broader domain. This paper outlines a model for the use of human factors to enhance zoo exhibits and optimize spaces for all user groups: zoo animals, zoo visitors, and zoo staff members. Zoo exhibits are multi-faceted, and each user group has a distinct set of requirements that can clash with or complement the others. Careful analysis, and a reframing of the three groups as interdependent coworkers, can enhance safety, efficiency, and experience for all user groups. This paper details the general creation of, and specific examples of the use of, the modified human factors tools of function allocation, operational sequence diagrams, and needs assessment. These tools allow for adaptability and ease of understanding in the design or renovation of exhibits. © 2014 Wiley Periodicals, Inc.

  19. How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA s Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use

  20. Formal Methods for Information Protection Technology. Task 1: Formal Grammar-Based Approach and Tool for Simulation Attacks against Computer Network. Part 1

    DTIC Science & Technology

    2004-02-01

    Excerpt (abbreviations of attack actions simulated by the tool): ISU – "Identifying SID with user2sid"; IAS – "… null sessions" (partially legible); FUE – "Finger Users Enumeration"; UTFTP – "Use of Trivial File Transfer Protocol for Unix enumerating by stealing /etc/passwd and (or) /etc/hosts.equiv and (or) ~/.rhosts"; POD – "Ping of Death"; UF – "UDP flooding"; IFS – "Storm of inquiries to FTP-server"; APF – "Access to Password File .passwd"; WDPF – "Writing of Data with..." (truncated).

  1. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey

    PubMed Central

    Vasconcelos, Hemerson Bruno da Silva; Woods, David John

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292

  2. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey.

    PubMed

    Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.

  3. Development of a Peer Teaching-Assessment Program and a Peer Observation and Evaluation Tool

    PubMed Central

    Trujillo, Jennifer M.; Barr, Judith; Gonyeau, Michael; Van Amburgh, Jenny A.; Matthews, S. James; Qualters, Donna

    2008-01-01

    Objectives To develop a formalized, comprehensive, peer-driven teaching assessment program and a valid and reliable assessment tool. Methods A volunteer taskforce was formed and a peer-assessment program was developed using a multistep, sequential approach and the Peer Observation and Evaluation Tool (POET). A pilot study was conducted to evaluate the efficiency and practicality of the process and to establish interrater reliability of the tool. Intra-class correlation coefficients (ICC) were calculated. Results ICCs for 8 separate lectures evaluated by 2-3 observers ranged from 0.66 to 0.97, indicating good interrater reliability of the tool. Conclusion Our peer assessment program for large classroom teaching, which includes a valid and reliable evaluation tool, is comprehensive, feasible, and can be adopted by other schools of pharmacy. PMID:19325963

  4. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws were discovered in the analysis given by Lamport and Melliar-Smith, even though their presentation is unusually precise and detailed; it seems these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects them but is also more precise and easier to follow. Some of the corrections require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.

  5. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  6. IOOC Organizational Network (ION) Project

    NASA Astrophysics Data System (ADS)

    Dean, H.

    2013-12-01

    In order to meet the growing need for ocean information, research communities at the national and international levels have responded most recently by developing organizational frameworks that can help to integrate information across systems of existing networks and standardize methods of data gathering, management, and processing that facilitate integration. To address recommendations and identified challenges related to the need for a better understanding of ocean observing networks, members of the U.S. Interagency Ocean Observation Committee (IOOC) supported a project that came to be titled the IOOC Organizational Network (ION). The ION tool employs network mapping approaches that mirror those developed in the academic literature on political networks. Researchers gathered data on the list of global ocean observing organizations included in the Framework for Ocean Observing (FOO), developed in 2012 by the international Task Team for an Integrated Framework for Sustained Ocean Observing. At the international scale, researchers reviewed organizational research plans and documents, websites, and formal international agreement documents; at the U.S. national scale, they analyzed legislation, formal inter-agency agreements, work plans, charters, and policy documents. Analysis of relationships among global organizations and national federal organizations was based on four broad relationship categories: Communications, Data, Infrastructure, and Human Resources. In addition, researchers gathered data on relationship instrument types, strength of relationships, and (at the global level) ocean observing variables. Using network visualization software, researchers then developed a series of dynamic webpages and used the tool to address questions identified by the ocean observing community, including identifying gaps in global relationships and the types of tools used to develop networks at the U.S. national level. As the ION project goes through beta testing and is used to address specific questions posed by the ocean observing community, it will become more refined and more closely linked to user needs and interests.
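
    A sketch of this style of relationship mapping using a general-purpose graph library (organization names, links, and attributes below are invented placeholders, not ION data):

    ```python
    import networkx as nx

    # Organizations are nodes; edges carry one of the four relationship
    # categories plus a strength rating.
    G = nx.Graph()
    G.add_edge("OrgA", "OrgB", category="Data", strength=2)
    G.add_edge("OrgA", "OrgC", category="Communications", strength=1)
    G.add_edge("OrgB", "OrgD", category="Infrastructure", strength=3)

    # Low-centrality nodes suggest potential gaps in the observing network.
    centrality = nx.degree_centrality(G)
    for org, c in sorted(centrality.items(), key=lambda kv: kv[1]):
        print(f"{org}: {c:.2f}")
    ```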

  7. Challenges in the estimation of Net SURvival: The CENSUR working survival group.

    PubMed

    Giorgi, R

    2016-10-01

    Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trends analysis and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet challenges and thus provide more adequate tools, to improve the quality and the comparability of cancer survival data, and to promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimations to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order to provide useful information for cancer control and cancer policy. A "team science" approach is necessary to address new challenges concerning the estimation of net survival. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
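
    For orientation, one classical quantity related to net survival is the relative-survival ratio, shown here as an assumed illustration rather than as the CENSUR group's recommended estimator:

    ```latex
    % Relative-survival ratio: observed all-cause survival of the patient
    % group divided by the expected survival of a comparable population,
    \[
      S_{R}(t) \;=\; \frac{S_{\mathrm{obs}}(t)}{S_{\mathrm{exp}}(t)} ,
    \]
    % with S_exp taken from life tables matched on age, sex, and calendar year.
    ```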

  8. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization.

    PubMed

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-03-15

    Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although the examples given use specific implementations, the proposed techniques can be applied to rule-based models in general. The annotation ontology for rule-based models can be found at http://purl.org/rbm/rbmo, and the krdf tool and associated executable examples are available at http://purl.org/rbm/rbmo/krdf. Contact: anil.wipat@newcastle.ac.uk or vdanos@inf.ed.ac.uk. © The Author 2015. Published by Oxford University Press.
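
    As a rough illustration of the kind of machine-readable annotation the paper describes (the namespaces, predicates, and rule name below are hypothetical stand-ins, not the actual rbmo vocabulary), RDF triples can be attached to a rule identifier with the rdflib library:

```python
from rdflib import Graph, Namespace, Literal, RDF, URIRef

# Hypothetical namespaces; the real vocabulary is the rbmo ontology
# at http://purl.org/rbm/rbmo (the terms below are illustrative).
RBMO = Namespace("http://example.org/rbmo#")
MODEL = Namespace("http://example.org/model#")

g = Graph()
rule = MODEL["binding_rule_1"]  # stand-in for a Kappa/BioNetGen rule
g.add((rule, RDF.type, RBMO["Rule"]))
g.add((rule, RBMO["description"], Literal("A binds B at site s")))
# Cross-reference to an external database entry (hypothetical ID).
g.add((rule, RBMO["isDescribedBy"],
       URIRef("http://identifiers.org/uniprot/P12345")))

print(g.serialize(format="turtle"))  # uniform, queryable representation
```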

  9. Quality assessment of expert answers to lay questions about cystic fibrosis from various language zones in Europe: the ECORN-CF project.

    PubMed

    d'Alquen, Daniela; De Boeck, Kris; Bradley, Judy; Vávrová, Věra; Dembski, Birgit; Wagner, Thomas O F; Pfalz, Annette; Hebestreit, Helge

    2012-02-06

    The European Centres of Reference Network for Cystic Fibrosis (ECORN-CF) established an Internet forum which provides the opportunity for CF patients and other interested people to ask experts questions about CF in their mother language. The objectives of this study were to: 1) develop a detailed quality assessment tool to analyze quality of expert answers, 2) evaluate the intra- and inter-rater agreement of this tool, and 3) explore changes in the quality of expert answers over the time frame of the project. The quality assessment tool was developed by an expert panel. Five experts within the ECORN-CF project used the quality assessment tool to analyze the quality of 108 expert answers published on ECORN-CF from six language zones. Twenty-five expert answers were scored at two time points, one year apart. Quality of answers was also assessed at an early and a later period of the project. Individual rater scores and group mean scores were analyzed for each expert answer. A scoring system and training manual were developed analyzing two quality categories of answers: content and formal quality. For content quality, the grades based on group mean scores for all raters showed substantial agreement between the two time points; however, this was not the case for the grades based on individual rater scores. For formal quality, the grades based on group mean scores showed only slight agreement between the two time points, and there was also poor agreement between time points for the individual grades. The inter-rater agreement for content quality was fair (mean kappa value 0.232 ± 0.036, p < 0.001) while only slight agreement was observed for the grades of the formal quality (mean kappa value 0.105 ± 0.024, p < 0.001). The quality of expert answers was rated high (four language zones) or satisfactory (two language zones) and did not change over time. The quality assessment tool described in this study was feasible and reliable when content quality was assessed by a group of raters. Within ECORN-CF, the tool will help ensure that CF patients all over Europe have equal access to high quality expert advice on their illness. © 2012 d’Alquen et al; licensee BioMed Central Ltd.
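
    For readers unfamiliar with the agreement statistic reported above, the following minimal sketch (with made-up ratings, not ECORN-CF data) shows how an inter-rater Cohen's kappa can be computed:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical quality grades from two raters for ten expert answers
# (1 = poor ... 4 = excellent); values are invented for illustration.
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 2, 3, 1, 4, 3]

# Kappa corrects raw agreement for agreement expected by chance:
# 0 = chance-level, 1 = perfect agreement.
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.3f}")  # ≈ 0.58 for these made-up ratings
```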

  10. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography.

    PubMed

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. Careful investigation in this paper shows that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We prove, using Burrows-Abadi-Needham (BAN) logic, that the proposed protocol achieves mutual authentication. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
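
    The elliptic-curve primitive underlying key-agreement protocols of this kind can be sketched with the `cryptography` package. This is a bare ECDH exchange for illustration only, not the authors' full multi-server protocol (which additionally involves biometrics, smart cards, and mutual authentication):

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each party generates an ephemeral key pair on the P-256 curve.
server_key = ec.generate_private_key(ec.SECP256R1())
user_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key;
# both arrive at the same shared secret (the ECDH property).
shared_user = user_key.exchange(ec.ECDH(), server_key.public_key())
shared_server = server_key.exchange(ec.ECDH(), user_key.public_key())
assert shared_user == shared_server

# Derive a fixed-length session key from the raw shared secret.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"session",
).derive(shared_user)
print(session_key.hex())
```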

  11. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
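
    AUTOBAYES itself emits logical annotations that feed a verification condition generator and a theorem prover; as a loose, runtime-checked analogue (an invented toy, not the tool's output), the idea of machine-generated preconditions, loop invariants, and postconditions can be pictured like this:

```python
def safe_mean(xs):
    """Toy analogue of an annotated generated program: the assertions
    mirror the kinds of annotations a synthesis tool would emit as
    proof obligations for a prover (here they are merely checked at
    runtime)."""
    assert len(xs) > 0  # precondition: operator safety (no divide-by-zero)
    total = 0.0
    for i, x in enumerate(xs, start=1):
        total += x
        # loop invariant: after i items, total equals sum of the prefix
        assert abs(total - sum(xs[:i])) < 1e-9
    result = total / len(xs)  # division is safe given the precondition
    assert min(xs) <= result <= max(xs)  # postcondition: mean is bounded
    return result

print(safe_mean([1.0, 2.0, 3.0]))  # 2.0
```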

  12. Reduced discretization error in HZETRN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaba, Tony C., E-mail: Tony.C.Slaba@nasa.gov; Blattnig, Steve R., E-mail: Steve.R.Blattnig@nasa.gov; Tweed, John, E-mail: jtweed@odu.edu

    2013-02-01

    The deterministic particle transport code HZETRN is an efficient analysis tool for studying the effects of space radiation on humans, electronics, and shielding materials. In a previous work, numerical methods in the code were reviewed, and new methods were developed that further improved efficiency and reduced overall discretization error. It was also shown that the remaining discretization error could be attributed to low energy light ions (A < 4) with residual ranges smaller than the physical step-size taken by the code. Accurately resolving the spectrum of low energy light particles is important in assessing risk associated with astronaut radiation exposure. In this work, modifications to the light particle transport formalism are presented that accurately resolve the spectrum of low energy light ion target fragments. The modified formalism is shown to significantly reduce overall discretization error and allows a physical approximation to be removed. For typical step-sizes and energy grids used in HZETRN, discretization errors for the revised light particle transport algorithms are shown to be less than 4% for aluminum and water shielding thicknesses as large as 100 g/cm² exposed to both solar particle event and galactic cosmic ray environments.
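
    The step-size dependence of discretization error that the paper targets can be illustrated generically. The sketch below is a simple forward-Euler integration of exponential decay, chosen only to show error shrinking with step size; it is unrelated to HZETRN's transport formalism:

```python
import numpy as np

# Forward-Euler solution of dy/dx = -y, y(0) = 1, integrated to x = 1.
# For this first-order method, halving the step size roughly halves
# the discretization error against the exact solution exp(-1).
def euler_error(step):
    x, y = 0.0, 1.0
    while x < 1.0 - 1e-12:
        y += step * (-y)
        x += step
    return abs(y - np.exp(-1.0))

for h in (0.1, 0.05, 0.025):
    print(f"step {h:>6}: error {euler_error(h):.5f}")
```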

  13. Reduced quantum dynamics with arbitrary bath spectral densities: hierarchical equations of motion based on several different bath decomposition schemes.

    PubMed

    Liu, Hao; Zhu, Lili; Bai, Shuming; Shi, Qiang

    2014-04-07

    We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.
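
    Scheme (1), fitting the bath spectral density with a sum of Lorentzian modes, can be sketched with scipy. The target density and the two-mode form below are invented for illustration; a real application would fit the model's actual spectral density and feed the recovered parameters into the HEOM hierarchy:

```python
import numpy as np
from scipy.optimize import curve_fit

# Sum of two Lorentzian modes:
# J(w) = c1*g1*w / (w^2 + g1^2) + c2*g2*w / (w^2 + g2^2)
def lorentzians(w, c1, g1, c2, g2):
    return c1 * g1 * w / (w**2 + g1**2) + c2 * g2 * w / (w**2 + g2**2)

# Hypothetical "measured" spectral density with a little noise.
w = np.linspace(0.01, 10, 200)
rng = np.random.default_rng(0)
target = lorentzians(w, 1.0, 0.5, 0.4, 3.0) + 0.005 * rng.normal(size=w.size)

# Recover the mode parameters by nonlinear least squares.
params, _ = curve_fit(lorentzians, w, target, p0=[1, 1, 1, 1])
print(params)  # fitted (c1, g1, c2, g2) for use in the HEOM treatment
```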

  14. Reduced quantum dynamics with arbitrary bath spectral densities: Hierarchical equations of motion based on several different bath decomposition schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Hao; Zhu, Lili; Bai, Shuming

    2014-04-07

    We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.

  15. Characterizing Pedagogical Practices of University Physics Students in Informal Learning Environments

    NASA Astrophysics Data System (ADS)

    Hinko, Kathleen

    2016-03-01

    University educators (UEs) have a long history of teaching physics not only in formal classroom settings but also in informal outreach environments. The pedagogical practices of UEs in informal physics teaching have not been widely studied, and they may provide insight into formal practices and preparation. We investigate the interactions between UEs and children in an afterschool physics program facilitated by university physics students from the University of Colorado Boulder. In this program, physics undergraduates, graduate students and post-doctoral researchers work with K-8 children on hands-on physics activities on a weekly basis over the course of a semester. We use an Activity Theoretic framework as a tool to examine situational aspects of individuals' behavior in the complex structure of the afterschool program. Using this framework, we analyze video of UE-child interactions and identify three main pedagogical modalities that UEs display during activities: Instruction, Consultation and Participation modes. These modes are characterized by certain language, physical location, and objectives that establish differences in UE-child roles and division of labor. Based on this analysis, we discuss implications for promoting pedagogical strategies through purposeful curriculum development and university educator preparation.

  16. Visualizing Matrix Multiplication

    ERIC Educational Resources Information Center

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…
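
    A minimal numpy sketch of the block-matrix view (our own toy illustration, not the authors' visualization) shows that multiplying blockwise mirrors the scalar rule C[i][j] = sum over k of A[i][k]·B[k][j]:

```python
import numpy as np

# Multiply two 4x4 matrices by viewing each as a 2x2 grid of 2x2 blocks.
rng = np.random.default_rng(1)
A, B = rng.integers(0, 5, (4, 4)), rng.integers(0, 5, (4, 4))

def blocks(M, s=2):
    # Split a matrix into an (rows/s) x (cols/s) grid of s x s blocks.
    return [[M[r*s:(r+1)*s, c*s:(c+1)*s] for c in range(M.shape[1] // s)]
            for r in range(M.shape[0] // s)]

Ab, Bb = blocks(A), blocks(B)
# Blockwise product: each entry is a sum of block products.
Cb = [[sum(Ab[i][k] @ Bb[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
C = np.block(Cb)
assert np.array_equal(C, A @ B)  # matches the ordinary matrix product
```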

  17. Nurse manager succession planning: A cost-benefit analysis.

    PubMed

    Phillips, Tracy; Evans, Jennifer L; Tooley, Stephanie; Shirey, Maria R

    2018-03-01

    This commentary presents a cost-benefit analysis to advocate for the use of succession planning to mitigate the problems ensuing from nurse manager turnover. An estimated 75% of nurse managers will leave the workforce by 2020. Many benefits are associated with proactively identifying and developing internal candidates. Fewer than 7% of health care organisations have implemented formal leadership succession planning programmes. A cost-benefit analysis of a formal succession-planning programme from one hospital illustrates the benefits of the programme in their organisation and can be replicated easily. Assumptions of nursing manager succession planning cost-benefit analysis are identified and discussed. The succession planning exemplar demonstrates the integration of cost-benefit analysis principles. Comparing the costs of a formal nurse manager succession planning strategy with the status quo results in a positive cost-benefit ratio. The implementation of a formal nurse manager succession planning programme effectively reduces replacement costs and time to transition into the new role. This programme provides an internal pipeline of future leaders who will be more successful than external candidates. Using an actual cost-benefit analysis equips nurse managers with valuable evidence depicting succession planning as a viable business strategy. © 2017 John Wiley & Sons Ltd.
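
    As a purely hypothetical numeric sketch of the comparison the authors describe (these figures are invented placeholders, not the hospital's data), the cost-benefit logic reduces to a few lines:

```python
# Invented figures for illustration; replace with institutional data.
external_recruit_cost = 90_000       # search firm, vacancy coverage, onboarding
programme_cost_per_candidate = 25_000  # formal succession-planning programme
internal_transition_cost = 30_000    # backfill and shortened ramp-up

# Benefit of promoting a prepared internal candidate over external hire.
benefit = external_recruit_cost - internal_transition_cost
ratio = benefit / programme_cost_per_candidate
print(f"cost-benefit ratio: {ratio:.2f}")  # > 1 favours succession planning
```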

  18. A Guide for Scientists Interested in Researching Student Outcomes

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn R.; Anbar, Ariel; Semken, Steve; Mead, Chris; Horodyskyj, Lev; Perera, Viranga; Bruce, Geoffrey; Schönstein, David

    2015-11-01

    Scientists spend years training in their scientific discipline and are well versed in the literature, methods, and innovations in their own field. Many scientists also take on teaching responsibilities with little formal training in how to implement their courses or assess their students. There is a growing body of literature on what students know in space science courses and the types of innovations that can increase student learning, but scientists rarely have exposure to this body of literature. For scientists who are interested in more effectively understanding what their students know or investigating the impact their courses have on students, there is little guidance. Undertaking a more formal study of students poses more complexities, including finding robust instruments and employing appropriate data analysis. Additionally, formal research with students involves issues of privacy and human subjects concerns, both regulated by federal laws. This poster details the important decisions and issues to consider for both course evaluation and more formal research using a course developed, facilitated, evaluated and researched by a hybrid team of scientists and science education researchers. HabWorlds, designed and implemented by a team of scientists and faculty at Arizona State University, has been using student data to continually improve the course as well as conduct formal research on students’ knowledge and attitudes in science. This ongoing project has had external funding sources to allow robust assessment not available to most instructors. This is a case study for discussing issues that are applicable to designing and assessing all science courses. Over the course of several years, instructors have refined course outcomes and learning objectives that are shared with students as a roadmap of instruction. The team has searched for appropriate tools for assessing student learning and attitudes, tested them and decided which have worked, or not, for assessment in the course. Data from this assessment has led to many changes in the course to better meet the course goals. We will share challenges and lessons learned in our project to assist other instructors interested in doing research on student outcomes.

  19. EFL Teachers' Formal Assessment Practices Based on Exam Papers

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2016-01-01

    This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…

  20. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller-sized ontologies with fewer lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at the lexical and structural levels, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
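
    The core FCA operation, deriving formal concepts (closed extent/intent pairs) from a cross-table, can be sketched in plain Python. The classes and tokens below are invented stand-ins for the token-based context that FCA-Map builds first:

```python
from itertools import chain, combinations

# Token-based formal context (invented): classes x lexical tokens.
context = {
    "Heart_Valve":   {"heart", "valve"},
    "Mitral_Valve":  {"mitral", "valve"},
    "Cardiac_Valve": {"heart", "valve"},  # shares tokens with Heart_Valve
}

def extent(tokens):
    # Objects possessing all the given tokens.
    return {o for o, t in context.items() if tokens <= t}

def intent(objects):
    # Tokens shared by all the given objects (all tokens if none given).
    sets = [context[o] for o in objects]
    return set.intersection(*sets) if sets else set(chain(*context.values()))

# Enumerate formal concepts as closed (extent, intent) pairs.
concepts = set()
for r in range(len(context) + 1):
    for objs in combinations(context, r):
        A = extent(intent(set(objs)))  # closure of the object set
        concepts.add((frozenset(A), frozenset(intent(A))))

for ext, inten in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(ext), "<->", sorted(inten))
```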

  1. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN against the results obtained at design time and detect sudden, unexpected failures in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  2. Offshore safety case approach and formal safety assessment of ships.

    PubMed

    Wang, J

    2002-01-01

    Tragic marine and offshore accidents have caused serious consequences including loss of lives, loss of property, and damage to the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail with particular reference to the design aspects. The current practices and the latest development in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. Recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.

  3. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
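
    As a concrete instance of the kind of thermostat scheme under discussion, here is a standard one-particle Langevin update written independently of the paper's proposed principle (parameters and the harmonic force are illustrative); the friction and random kicks balance so that velocities sample the canonical distribution at temperature kT:

```python
import numpy as np

# Langevin thermostat for a harmonic oscillator (Euler-Maruyama update):
# dv = (F/m - gamma*v) dt + sqrt(2*gamma*kT/m) dW, stationary <v^2> = kT/m.
rng = np.random.default_rng(0)
kT, gamma, dt, m = 1.0, 1.0, 0.01, 1.0
x, v = 1.0, 0.0
samples = []
for step in range(200_000):
    force = -x  # harmonic force with spring constant k = 1
    v += (force / m - gamma * v) * dt \
         + np.sqrt(2 * gamma * kT * dt / m) * rng.normal()
    x += v * dt
    if step > 50_000:  # discard equilibration steps
        samples.append(v)

print(f"<v^2> = {np.var(samples):.3f}  (target kT/m = {kT / m})")  # ≈ 1.0
```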

  4. Defining Earth Smarts: A Construct Analysis for Socioecological Literacy Based on Justly Maintaining Quality of Life

    NASA Astrophysics Data System (ADS)

    Nichols, Bryan H.

    This paper describes the creation and validation of a new educational construct. Socioecological literacy, or earth smarts, describes the qualities we need to justly maintain or improve our quality of life in a changing world. It was created using construct analysis techniques and systems tools, drawing on an extensive, transdisciplinary body of literature. Concepts related to environmental, ecological and scientific literacy, sustainability and citizenship were combined with educational frameworks, new research in science education, and modern cognitive psychology. After the initial formulation, the results were considered by a variety of experts and professionals from the fields of ecology, environmental science and education, using surveys, conference presentations and interviews. The resulting qualitative and quantitative feedback was used to refine and validate the framework. Four domains emerged from the analysis: concepts, competencies, sense of place, and values. The first two are common in formal education, although many of the more specific components that emerged are not adequately addressed. The second two domains are unlikely to be achieved solely in traditional educational settings, although they emerged as equally important. Sense of place includes affective components such as self-efficacy, while the values domain includes moral development, respect, and justice as fairness. To make culturally and ecologically appropriate localization as accessible as possible, the earth smarts framework (www.earthsmarts.info) is deliberately nonpartisan and was designed using free and open-source software. It can help educators, policy makers, and researchers interested in more resilient, just and adaptable communities to coordinate their efforts, particularly in the nexus between formal and informal education, which have different strengths and weaknesses.

  5. High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM

    NASA Astrophysics Data System (ADS)

    Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.

    System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.

  6. Random-Phase Approximation Methods

    NASA Astrophysics Data System (ADS)

    Chen, Guo P.; Voora, Vamsee K.; Agee, Matthew M.; Balasubramani, Sree Ganesh; Furche, Filipp

    2017-05-01

    Random-phase approximation (RPA) methods are rapidly emerging as cost-effective validation tools for semilocal density functional computations. We present the theoretical background of RPA in an intuitive rather than formal fashion, focusing on the physical picture of screening and simple diagrammatic analysis. A new decomposition of the RPA correlation energy into plasmonic modes leads to an appealing visualization of electron correlation in terms of charge density fluctuations. Recent developments in the areas of beyond-RPA methods, RPA correlation potentials, and efficient algorithms for RPA energy and property calculations are reviewed. The ability of RPA to approximately capture static correlation in molecules is quantified by an analysis of RPA natural occupation numbers. We illustrate the use of RPA methods in applications to small-gap systems such as open-shell d- and f-element compounds, radicals, and weakly bound complexes, where semilocal density functional results exhibit strong functional dependence.

  7. A computer program for multiple decrement life table analyses.

    PubMed

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice in analyzing the distribution of "survival" times when a parametric form for the survival curve could not be reasonably assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well-documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual, which supplements the contents of this paper with a discussion of the formulas used and the program listing, is available at printing cost.
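
    The basic life table computation being generalized here can be sketched in a few lines; this is the simple single-decrement case with invented interval death probabilities, not the RTI program (the multiple-decrement extension additionally partitions each interval's deaths among competing causes):

```python
import numpy as np

# Invented conditional probabilities of death q_i within each interval.
q = np.array([0.05, 0.08, 0.12, 0.20])

p = 1.0 - q        # probability of surviving each interval
S = np.cumprod(p)  # cumulative survival: P(alive at end of interval i)
for i, (qi, si) in enumerate(zip(q, S)):
    print(f"interval {i}: q = {qi:.2f}, cumulative survival = {si:.3f}")
```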

  8. Performances on the CogState and standard neuropsychological batteries among HIV patients without dementia.

    PubMed

    Overton, Edgar Turner; Kauwe, John S K; Paul, Robert; Tashima, Karen; Tate, David F; Patel, Pragna; Carpenter, Charles C J; Patty, David; Brooks, John T; Clifford, David B

    2011-11-01

    HIV-associated neurocognitive disorders remain prevalent but challenging to diagnose, particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. In a multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements from the brief computerized tool (P < 0.01). These data confirm previous correlation data with the computerized battery. Using the five significant parameters from the regression model in a Receiver Operating Characteristic curve, 90% of persons were accurately classified as being cognitively impaired or not. The test battery requires additional evaluation, specifically for identifying persons with mild impairment, a state upon which interventions may be effective.

  9. Mechanisms of Developmental Change in Infant Categorization

    ERIC Educational Resources Information Center

    Westermann, Gert; Mareschal, Denis

    2012-01-01

    Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…

  10. Art to science: Tools for greater objectivity in resource monitoring

    USDA-ARS?s Scientific Manuscript database

    The earliest inventories of western US rangelands were “ocular” estimates. Now, objective data consistent with formal scientific inquiry is needed to support management decisions that sustain the resource while balancing numerous competing land uses and sometimes-vociferous stakeholders. Yet, the co...

  11. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  12. Formal reasoning about systems biology using theorem proving

    PubMed Central

    Hasan, Osman; Siddique, Umair; Tahar, Sofiène

    2017-01-01

    Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems-biology-based models of various diseases has been carried out by paper-and-pencil based proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 Phosphorylation, the pathway leading to the death of cancer stem cells and the tumor growth based on cancer stem cells, which is used for the prognosis and future drug designs to treat cancer patients. PMID:28671950
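
    The reaction-kinetics notion being formalized can be illustrated numerically; the toy mass-action simulation below (rate constant and concentrations invented) shows the kind of model the HOL Light formalization reasons about symbolically:

```python
# Toy mass-action kinetics for the reaction A + B -> C with rate k:
# d[A]/dt = d[B]/dt = -k[A][B], d[C]/dt = +k[A][B].
k, dt = 0.5, 0.01
a, b, c = 1.0, 0.8, 0.0  # invented initial concentrations
for _ in range(1000):
    rate = k * a * b
    a, b, c = a - rate * dt, b - rate * dt, c + rate * dt

# Mass conservation: [A]+[C] and [B]+[C] stay constant throughout.
print(f"[A]={a:.3f} [B]={b:.3f} [C]={c:.3f}")
```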

  13. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
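
    For intuition about the algorithm being verified, here is a compact, simplified rendering of the classic Oral Messages recursion OM(m) (our own sketch, not the verified specification; traitors here simply flip relayed values):

```python
from collections import Counter

TRAITORS = {2}  # hypothetical faulty node ids

def send(sender, value):
    # A traitorous sender flips the value it relays; loyal senders don't.
    return 1 - value if sender in TRAITORS else value

def om(commander, value, lieutenants, m):
    """Return each lieutenant's decided value after OM(m)."""
    received = {l: send(commander, value) for l in lieutenants}
    if m == 0:
        return received
    decided = {}
    for l in lieutenants:
        votes = [received[l]]
        for o in lieutenants:
            if o == l:
                continue
            # o acts as commander in OM(m-1), relaying what it received.
            sub = om(o, received[o], [x for x in lieutenants if x != o], m - 1)
            votes.append(sub[l])
        decided[l] = Counter(votes).most_common(1)[0][0]  # majority vote
    return decided

# n = 4 nodes (commander 0 plus three lieutenants), one traitor, m = 1:
# the loyal lieutenants must agree on the loyal commander's value.
print(om(commander=0, value=1, lieutenants=[1, 2, 3], m=1))
```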

  14. Lost in the chaos: Flawed literature should not generate new disorders.

    PubMed

    Van Rooij, Antonius J; Kardefelt-Winther, Daniel

    2017-06-01

    The paper by Kuss, Griffiths, and Pontes (2016) titled "Chaos and confusion in DSM-5 diagnosis of Internet Gaming Disorder: Issues, concerns, and recommendations for clarity in the field" examines issues relating to the concept of Internet Gaming Disorder. We agree that there are serious issues and extend their arguments by suggesting that the field lacks basic theory, definitions, patient research, and properly validated and standardized assessment tools. As most studies derive data from survey research in functional populations, they exclude people with severe functional impairment and provide only limited information on the hypothesized disorder. Yet findings from such studies are widely used and often exaggerated, leading many to believe that we know more about the problem behavior than we do. We further argue that video game play is associated with several benefits and that formalizing this popular hobby as a psychiatric disorder is not without risks. It might undermine children's right to play or encourage repressive treatment programs, which ultimately threaten children's right to protection against violence. While Kuss et al. (2016) express support for the formal implementation of a disorder, we argue that before we have a proper evidence base, a sound theory, and validated assessment tools, it is irresponsible to support a formal category of disorder and doing so would solidify a confirmatory approach to research in this area.

  15. Lost in the chaos: Flawed literature should not generate new disorders

    PubMed Central

    Van Rooij, Antonius J.; Kardefelt-Winther, Daniel

    2017-01-01

    The paper by Kuss, Griffiths, and Pontes (2016) titled “Chaos and confusion in DSM-5 diagnosis of Internet Gaming Disorder: Issues, concerns, and recommendations for clarity in the field” examines issues relating to the concept of Internet Gaming Disorder. We agree that there are serious issues and extend their arguments by suggesting that the field lacks basic theory, definitions, patient research, and properly validated and standardized assessment tools. As most studies derive data from survey research in functional populations, they exclude people with severe functional impairment and provide only limited information on the hypothesized disorder. Yet findings from such studies are widely used and often exaggerated, leading many to believe that we know more about the problem behavior than we do. We further argue that video game play is associated with several benefits and that formalizing this popular hobby as a psychiatric disorder is not without risks. It might undermine children’s right to play or encourage repressive treatment programs, which ultimately threaten children’s right to protection against violence. While Kuss et al. (2016) express support for the formal implementation of a disorder, we argue that before we have a proper evidence base, a sound theory, and validated assessment tools, it is irresponsible to support a formal category of disorder and doing so would solidify a confirmatory approach to research in this area. PMID:28301968

  16. Steinberg ``AUDIOMAPS" Music Appreciation-Via-Understanding: Special-Relativity + Expectations "Quantum-Theory": a Quantum-ACOUSTO/MUSICO-Dynamics (QA/MD)

    NASA Astrophysics Data System (ADS)

    Steinberg, R.; Siegel, E.

    2010-03-01

    ``AUDIOMAPS'' music enjoyment/appreciation-via-understanding methodology, versus art, music-dynamics evolves, telling a story in (3+1)-dimensions: trails, frames, timbres, + dynamics amplitude vs. music-score time-series (formal-inverse power- spectrum) surprisingly closely parallels (3+1)-dimensional Einstein(1905) special-relativity ``+'' (with its enjoyment- expectations) a manifestation of quantum-theory expectation- values, together a music quantum-ACOUSTO/MUSICO-dynamics (QA/MD). Analysis via Derrida deconstruction enabled Siegel- Baez ``Category-Semantics'' ``FUZZYICS''=``CATEGORYICS (``SON of 'TRIZ") classic Aristotle ``Square-of-Opposition" (SoO) DEduction-logic, irrespective of Boon-Klimontovich versus Voss- Clark[PRL(77)] music power-spectrum analysis sampling- time/duration controversy: part versus whole, shows that ``AUDIOMAPS" QA/MD reigns supreme as THE music appreciation-via- analysis tool for the listener in musicology!!! Connection to Deutsch-Hartmann-Levitin[This is Your Brain on Music,(2006)] brain/mind-barrier brain/mind-music connection is both subtle and compelling and immediate!!!

  17. The Long Range Reconnaissance and Observation System (LORROS) with the Kollsman, Inc. Model LH-40, Infrared (Erbium) Laser Rangefinder hazard analysis and safety assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustoni, Arnold L.

    A laser hazard analysis and safety assessment was performed for the LH-40 IR Laser Rangefinder based on the 2000 version of the American National Standards Institute's Standard Z136.1, for the Safe Use of Lasers, and Z136.6, for the Safe Use of Lasers Outdoors. The LH-40 IR Laser is central to the Long Range Reconnaissance and Observation System (LORROS). The LORROS is being evaluated by the Department 4149 Group to determine its capability as a long-range assessment tool. The manufacturer lists the laser rangefinder as 'eye safe' (Class 1 laser classified under the CDRH Compliance Guide for Laser Products and 21 CFR 1040 Laser Product Performance Standard). It was necessary that SNL validate this prior to its use involving the general public. A formal laser hazard analysis is presented for the typical mode of operation.

  18. Aspects of electron transport in zigzag graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Bhalla, Pankaj; Pratap, Surender

    2018-05-01

    In this paper, we investigate aspects of electron transport in zigzag graphene nanoribbons (ZGNRs) using the nonequilibrium Green’s function (NEGF) formalism, an esoteric tool in mesoscopic physics. It is used to perform an analysis of ZGNRs by considering a potential well. Within this potential, the dependence of the transmission coefficient, local density of states (LDOS) and electron transport properties on the number of atoms per unit cell is discussed. It is observed that the electronic and thermal conductance increase with the number of atoms. The same dependence is also studied for the figure of merit. The results indicate that the contribution of electrons to enhancing the figure of merit is important above the crossover temperature.

  19. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    NASA Astrophysics Data System (ADS)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvements in yield, reliability and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross-functional teams and a defect engineering group. The strategy requires the fastest detection, identification and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. The payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.

  20. Animal Social Network Theory Can Help Wildlife Conservation.

    PubMed

    Snijders, Lysanne; Blumstein, Daniel T; Stanley, Christina R; Franks, Daniel W

    2017-08-01

    Many animals preferentially associate with certain other individuals. This social structuring can influence how populations respond to changes to their environment, thus making network analysis a promising technique for understanding, predicting, and potentially manipulating population dynamics. Various network statistics can correlate with individual fitness components and key population-level processes, yet the logical role and formal application of animal social network theory for conservation and management have not been well articulated. We outline how understanding of direct and indirect relationships between animals can be profitably applied by wildlife managers and conservationists. By doing so, we aim to stimulate the development and implementation of practical tools for wildlife conservation and management and to inspire novel behavioral research in this field. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Surgical competence.

    PubMed

    Patil, Nivritti G; Cheng, Stephen W K; Wong, John

    2003-08-01

    Recent high-profile cases have heightened the need for a formal structure to monitor achievement and maintenance of surgical competence. Logbooks, morbidity and mortality meetings, videos and direct observation of operations using a checklist, motion analysis devices, and virtual reality simulators are effective tools for teaching and evaluating surgical skills. As the operating theater is also a place for training, there must be protocols and guidelines, including mandatory standards for supervision, to ensure that patient care is not compromised. Patients appreciate frank communication and honesty from surgeons regarding their expertise and level of competence. To ensure that surgical competence is maintained and keeps pace with technologic advances, professional registration bodies have been promoting programs for recertification. They evaluate performance in practice, professional standing, and commitment to ongoing education.

  2. A Pathway to Freedom: An Evaluation of Screening Tools for the Identification of Trafficking Victims.

    PubMed

    Bespalova, Nadejda; Morgan, Juliet; Coverdale, John

    2016-02-01

    Because training residents and faculty to identify human trafficking victims is a major public health priority, the authors review existing assessment tools. PubMed and Google were searched using combinations of search terms including human, trafficking, sex, labor, screening, identification, and tool. Nine screening tools that met the inclusion criteria were found. They varied greatly in length, format, target demographic, supporting resources, and other parameters. Only two tools were designed specifically for healthcare providers. Only one tool was formally assessed to be valid and reliable in a pilot project in trafficking victim service organizations, although it has not been validated in the healthcare setting. This toolbox should facilitate the education of resident physicians and faculty in screening for trafficking victims, assist educators in assessing screening skills, and promote future research on the identification of trafficking victims.

  3. Shared Governance in the Community College: An Analysis of Formal Authority in Collective Bargaining Agreements

    ERIC Educational Resources Information Center

    McDermott, Linda A.

    2012-01-01

    This qualitative study examines shared governance in Washington State's community and technical colleges and provides an analysis of faculty participation in governance based on formal authority in collective bargaining agreements. Contracts from Washington's thirty community and technical college districts were reviewed in order to identify in…

  4. Formalization and Analysis of Reasoning by Assumption

    ERIC Educational Resources Information Center

    Bosse, Tibor; Jonker, Catholijn M.; Treur, Jan

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically…

  5. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  6. A Neural-Network-Based Semi-Automated Geospatial Classification Tool

    NASA Astrophysics Data System (ADS)

    Hale, R. G.; Herzfeld, U. C.

    2014-12-01

    North America's largest glacier system, the Bering Bagley Glacier System (BBGS) in Alaska, surged in 2011-2013, as shown by rapid mass transfer, elevation change, and heavy crevassing. Little is known about the physics controlling surge glaciers' semi-cyclic patterns; therefore, it is crucial to collect and analyze as much data as possible so that predictive models can be made. In addition, physical signs frozen in ice in the form of crevasses may help serve as a warning for future surges. The BBGS surge provided an opportunity to develop an automated tool for crevasse classification based on imagery collected from small aircraft. The classification allows one to link image classes to geophysical processes associated with ice deformation. The tool uses an approach that employs geostatistical functions and a feed-forward perceptron with error back-propagation. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network (NN) can recognize. In an application to airborne videographic data from the surge of the BBGS, an NN was able to distinguish 18 different crevasse classes with 95 percent or higher accuracy for over 3,000 images. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we designed the tool's semi-automated pre-training algorithm to be adaptable. The tool can be optimized to specific settings and variables of image analysis: (airborne and satellite imagery, different camera types, observation altitude, number and types of classes, and resolution). The generalization of the classification tool brings three important advantages: (1) multiple types of problems in geophysics can be studied, (2) the training process is sufficiently formalized to allow non-experts in neural nets to perform the training process, and (3) the time required to manually pre-sort imagery into classes is greatly reduced.
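
    The experimental variogram that parameterizes each image for the network can be sketched for the one-dimensional case as follows (synthetic data, not BBGS imagery; image rows or columns would play the role of the transect, giving the directional variograms mentioned above):

```python
import numpy as np

# Empirical (experimental) variogram of a 1-D signal: for each lag h,
# gamma(h) = mean of 0.5 * (z[i+h] - z[i])^2 over all valid pairs.
rng = np.random.default_rng(2)
z = np.cumsum(rng.normal(size=500))  # synthetic transect (random walk)

def variogram(z, max_lag):
    return [0.5 * np.mean((z[h:] - z[:-h]) ** 2)
            for h in range(1, max_lag + 1)]

for h, g in enumerate(variogram(z, 5), start=1):
    print(f"lag {h}: gamma = {g:.2f}")  # grows ~linearly for a random walk
```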

  7. Evaluation of consent for peer physical examination: students reflect on their clinical skills learning experience.

    PubMed

    Wearn, Andy; Bhoopatkar, Harsh

    2006-10-01

    Early clinical skills teaching often requires students to learn through examining one another. This model should acknowledge ethical, practical and individual issues, disclosure and identification of abnormalities. Consent to peer physical examination (PPE) is usually expected rather than discussed and sought. We sought to evaluate a formal written consent process for PPE and to explore students' views of this approach. A survey tool was designed and distributed to all years 2 and 3 students in the Auckland University medical programme (2004). Results were analysed using univariate statistics and thematic analysis. The response rate was 57% (146/258). Most students had read the participant information sheet prior to signing, with 78% giving consent. They had not felt coerced and the in-course experience matched the 'promise'. Comments included: PPE gave insights into the 'patient's world', encouraged peer learning and raised some professional issues. More than 95% of students took the examination role at least once (less likely if female, P = 0.002). Some European, Maori and Pacific students never took the role; all Asian students did at least once. Students preferred PPE in groups consisting of 'friends'. The task influenced group composition by sex (P < 0.0001) but not ethnicity. Students accept and support a formal consent process. PPE participation rates are similar to predictions. The experience must match the promises made. Formal preparation alone might have produced similar student outcomes. Female students are more selective about tasks undertaken. The influence of ethnicity and the effect on future behaviour and attitudes needs further exploration.

  8. Self-organizing ontology of biochemically relevant small molecules

    PubMed Central

    2012-01-01

    Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publicly release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology can ease the burden of chemical data annotators and dramatically increase their productivity. We anticipate that the use of formal logic in our proposed framework will make chemical classification criteria more transparent to humans and machines alike and will thus facilitate predictive and integrative bioactivity model development. PMID:22221313

  9. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might be used to support mishap analysis.

  10. Phasic dopamine signals: from subjective reward value to formal economic utility

    PubMed Central

    Schultz, Wolfram; Carelli, Regina M; Wightman, R Mark

    2015-01-01

    Although rewards are physical stimuli and objects, their value for survival and reproduction is subjective. The phasic, neurophysiological and voltammetric dopamine reward prediction error response signals subjective reward value. The signal incorporates crucial reward aspects such as amount, probability, type, risk, delay and effort. Differences in dopamine release dynamics with temporal delay and effort in rodents may derive from methodological issues and require further study. Recent designs using concepts and behavioral tools from experimental economics make it possible to formally characterize the subjective value signal as economic utility and thus to establish a neuronal value function. With these properties, the dopamine response constitutes a utility prediction error signal. PMID:26719853
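
    In reinforcement-learning notation, a utility prediction error of the kind described can be rendered schematically (our gloss, not the authors' formal definition) as

      \delta_t \;=\; u(r_t) \;-\; \mathbb{E}\big[\,u(r)\,\big],

    where u is the (risk-sensitive) utility function and the expectation is taken over the predicted reward distribution.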

  11. Standardized terminology for clinical trial protocols based on top-level ontological categories.

    PubMed

    Heller, B; Herre, H; Lippoldt, K; Loeffler, M

    2004-01-01

    This paper describes a new method for the ontologically based standardization of concepts with regard to the quality assurance of clinical trial protocols. We developed a data dictionary for medical and trial-specific terms in which concepts and relations are defined context-dependently. The data dictionary is provided to different medical research networks by means of the software tool Onto-Builder via the internet. The data dictionary is based on domain-specific ontologies and the top-level ontology of GOL. The concepts and relations described in the data dictionary are represented in natural language, semi-formally or formally according to their use.

  12. Formalization and Interaction: Toward a Comprehensive History of Technology-Related Knowledge in Early Modern Europe.

    PubMed

    Popplow, Marcus

    2015-12-01

    Recent critical approaches to what has conventionally been described as "scientific" and "technical" knowledge in early modern Europe have provided a wealth of new insights. So far, the various analytical concepts suggested by these studies have not yet been comprehensively discussed. The present essay argues that such comprehensive approaches might prove of special value for long-term and cross-cultural reflections on technology-related knowledge. As heuristic tools, the notions of "formalization" and "interaction" are proposed as part of alternative narratives to those highlighting the emergence of "science" as the most relevant development for technology-related knowledge in early modern Europe.

  13. Formal Methods for Biological Systems: Languages, Algorithms, and Applications

    DTIC Science & Technology

    2016-09-01

  14. The Units Ontology: a tool for integrating units of measurement in science

    PubMed Central

    Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert

    2012-01-01

    Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization caters for the report, exchange, process, reproducibility and integration of quantitative measurements. Ontologies are means that facilitate the integration of data and knowledge allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurements. PMID:23060432

  15. What's it worth? A general manager's guide to valuation.

    PubMed

    Luehrman, T A

    1997-01-01

    Behind every major resource-allocation decision a company makes lies some calculation of what that move is worth. So it is not surprising that valuation is the financial analytical skill general managers want to learn more than any other. Managers whose formal training is more than a few years old, however, are likely to have learned approaches that are becoming obsolete. What do generalists need in an updated valuation tool kit? In the 1970s, discounted-cash-flow analysis (DCF) emerged as best practice for valuing corporate assets. And one version of DCF-using the weighted-average cost of capital (WACC)-became the standard. Over the years, WACC has been used by most companies as a one-size-fits-all valuation tool. Today the WACC standard is insufficient. Improvements in computers and new theoretical insights have given rise to tools that outperform WACC in the three basic types of valuation problems managers face. Timothy Luehrman presents an overview of the three tools, explaining how they work and when to use them. For valuing operations, the DCF methodology of adjusted present value allows managers to break a problem into pieces that make managerial sense. For valuing opportunities, option pricing captures the contingent nature of investments in areas such as R&D and marketing. And for valuing ownership claims, the tool of equity cash flows helps managers value their company's stake in a joint venture, a strategic alliance, or an investment that uses project financing.
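
    For orientation, the arithmetic behind the adjusted-present-value method Luehrman describes can be sketched in a few lines; the cash flows, discount rates, and the split into base case and financing side effects below are all invented for illustration.

      # Toy adjusted-present-value (APV) sketch: value operations as an
      # all-equity base case, then add financing side effects separately.
      # All figures are invented.
      def npv(rate, cashflows):
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

      free_cash_flows = [120.0, 130.0, 140.0]  # base-case operating cash flows
      tax_shields = [8.0, 7.0, 6.0]            # interest tax shields from debt

      base_case = npv(0.10, free_cash_flows)   # unlevered cost of equity
      side_effects = npv(0.06, tax_shields)    # cost of debt
      print(f"APV = {base_case + side_effects:.1f}")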

  16. THE RAPID OPTICAL TOOL (TM) LASER-INDUCED FLUORESCENCE SYSTEM FOR SCREENING OF PETROLEUM HYDROCARBONS IN SUBSURFACE SOILS

    EPA Science Inventory

    The Consortium for Site Characterization Technology (CSCT) has established a formal program to accelerate acceptance and application of innovative monitoring and site characterization technologies that improve the way the nation manages its environmental problems. In 1995 the CS...

  17. Comprehensive transportation asset management : making a business case and prioritizing assets for inclusion in formal asset management programs.

    DOT National Transportation Integrated Search

    2011-12-01

    Several agencies are applying asset management principles as a business tool and paradigm to help them define goals and prioritize agency resources in decision making. Previously, transportation asset management (TAM) has focused more on big ticke...

  18. Defense Partnerships: Documenting Trends and Emerging Topics for Action

    DTIC Science & Technology

    2015-03-01

    between the Air Force Research Lab and Antelope Valley College (AVC) results in increases in number of scientists, engineers, and technicians from...guiding document, tool, or resource should address best practices for project valuation, what types of formalized arrangements are acceptable, and

  19. Technology Focus: Enhancing Conceptual Knowledge of Linear Programming with a Flash Tool

    ERIC Educational Resources Information Center

    Garofalo, Joe; Cory, Beth

    2007-01-01

    Mathematical knowledge can be categorized in different ways. One commonly used way is to distinguish between procedural mathematical knowledge and conceptual mathematical knowledge. Procedural knowledge of mathematics refers to formal language, symbols, algorithms, and rules. Conceptual knowledge is essential for meaningful understanding of…

  20. Mapping benefits as a tool for natural resource management in estuarine watersheds

    EPA Science Inventory

    Natural resource managers are often called upon to justify the value of protecting or restoring natural capital based on its perceived benefit to stakeholders. This usually takes the form of formal valuation exercises (i.e., ancillary costs) of a resource without consideration f...

  1. Window to the World

    ERIC Educational Resources Information Center

    Cannon, Kama

    2018-01-01

    Although formal papers are typical, sometimes posters or other visual presentations are more useful tools for sharing visual-spatial information. By incorporating creativity and technology into the study of geographical science, STEM (the study of Science, Technology Engineering, and Mathematics) is changed to STEAM (the A stands for ART)! The…

  2. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu

    PubMed Central

    2011-01-01

    Background The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Results Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. Conclusions TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences. PMID:22112326

  3. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu.

    PubMed

    McCarter, Joe; Gavin, Michael C

    2011-11-23

    The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences.

  4. [The workplace-based learning: a main paradigm of an effective continuing medical education].

    PubMed

    Lelli, Maria Barbara

    2010-01-01

    Drawing on an analysis of the literature and the experience of the Emilia-Romagna Region, we offer a reflection on workplace-based learning that goes beyond the analysis of the effectiveness of specific didactic methodologies and aspects related to Continuing Medical Education. Health education and training are viewed from a wider perspective, one that integrates the three learning dimensions (formal, non-formal and informal). In such a perspective, workplace-based learning becomes an essential paradigm for reshaping the explicit knowledge conveyed in formal contexts and for emphasizing the informal contexts where innovation is generated.

  5. The Alignment of the Informal and Formal Organizational Supports for Reform: Implications for Improving Teaching in Schools

    ERIC Educational Resources Information Center

    Penuel, William R.; Riel, Margaret; Joshi, Aasha; Pearlman, Leslie; Kim, Chong Min; Frank, Kenneth A.

    2010-01-01

    Previous qualitative studies show that when the formal organization of a school and patterns of informal interaction are aligned, faculty and leaders in a school are better able to coordinate instructional change. This article combines social network analysis with interview data to analyze how well the formal and informal aspects of a school's…

  6. English Language Education in Formal and Cram School Contexts: An Analysis of Listening Strategy and Learning Style

    ERIC Educational Resources Information Center

    Chou, Mu-hsuan

    2017-01-01

    Formal English language education in Taiwan now starts at Year 3 in primary school, with an emphasis on communicative proficiency. In addition to formal education, attending English cram schools after regular school has become a common phenomenon for Taiwanese students. The main purpose of gaining additional reinforcement in English cram schools…

  7. Using perinatal morbidity scoring tools as a primary study outcome.

    PubMed

    Hutcheon, Jennifer A; Bodnar, Lisa M; Platt, Robert W

    2017-11-01

    Perinatal morbidity scores are tools that score or weight different adverse events according to their relative severity. Perinatal morbidity scores are appealing for maternal-infant health researchers because they provide a way to capture a broad range of adverse events to mother and newborn while recognising that some events are considered more serious than others. However, they have proved difficult to implement as a primary outcome in applied research studies because of challenges in testing whether the scores are significantly different between two or more study groups. We outline these challenges and describe a solution, based on Poisson regression, that allows differences in perinatal morbidity scores to be formally evaluated. The approach is illustrated using an existing maternal-neonatal scoring tool, the Adverse Outcome Index, to evaluate the safety of labour and delivery before and after the closure of obstetrical services in small rural communities. Applying the proposed Poisson regression to the case study showed a protective risk ratio for adverse outcomes following closures, as compared with the original analysis, where no difference was found. This approach opens the door for considerably broader use of perinatal morbidity scoring tools as a primary outcome in applied population and clinical maternal-infant health research studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
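
    A minimal sketch of the kind of comparison the authors propose, treating each delivery's morbidity score as a count outcome in a Poisson regression; the data, group sizes, and rates below are simulated, not the study's.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "score": np.concatenate([rng.poisson(1.2, 500),    # before closure
                                   rng.poisson(0.9, 500)]),  # after closure
          "after": np.repeat([0, 1], 500),
      })

      # Rate ratio for morbidity scores after vs before the closure.
      fit = smf.glm("score ~ after", data=df, family=sm.families.Poisson()).fit()
      print(np.exp(fit.params["after"]))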

  8. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    PubMed

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
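
    The core of a Multi-Attribute Utility Theory ranking is a weighted sum of per-criterion utilities; the toy sketch below uses invented criteria, weights, and data-stream scores rather than those of the published framework.

      # Each candidate data stream gets per-criterion utilities in [0, 1];
      # weights sum to 1. All names and numbers are illustrative.
      WEIGHTS = {"timeliness": 0.40, "coverage": 0.35, "cost": 0.25}

      streams = {
          "ER visit reports":    {"timeliness": 0.8, "coverage": 0.6, "cost": 0.4},
          "news media scraping": {"timeliness": 0.9, "coverage": 0.5, "cost": 0.9},
      }

      def utility(scores):
          return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

      for name in sorted(streams, key=lambda s: -utility(streams[s])):
          print(f"{name}: {utility(streams[name]):.2f}")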

  9. Measurement invariance of TGMD-3 in children with and without mental and behavioral disorders.

    PubMed

    Magistro, Daniele; Piumatti, Giovanni; Carlevaro, Fabio; Sherar, Lauren B; Esliger, Dale W; Bardaglio, Giulia; Magno, Francesca; Zecca, Massimiliano; Musella, Giovanni

    2018-05-24

    This study evaluated whether the Test of Gross Motor Development 3 (TGMD-3) is a reliable tool to compare children with and without mental and behavioral disorders across gross motor skill domains. A total of 1,075 children (aged 3-11 years), 98 with mental and behavioral disorders and 977 without (typically developing), were included in the analyses. The TGMD-3 evaluates fundamental gross motor skills of children across two domains: locomotor skills and ball skills. Two independent testers simultaneously observed children's performances (agreement over 95%). Each child completed one practice and then two formal trials. Scores were recorded only during the two formal trials. Multigroup confirmatory factor analysis tested the assumption of TGMD-3 measurement invariance across disability groups. According to the magnitude of changes in root mean square error of approximation and comparative fit index between nested models, the assumption of measurement invariance across groups was valid. Loadings of the manifest indicators on locomotor and ball skills were significant (p < .001) in both groups. Item response theory analysis showed good reliability across the full latent traits for locomotor and ball skills. The present study confirmed the factorial structure of the TGMD-3 and demonstrated its feasibility across typically developing children and children with mental and behavioral disorders. These findings provide new opportunities for understanding the effect of specific intervention strategies on this population. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
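
    The invariance decision described rests on comparing fit indices between nested models; below is a sketch of that decision rule using widely cited conventional cutoffs (ΔCFI ≤ .01, ΔRMSEA ≤ .015), which are assumptions here rather than values taken from the article.

      # Compare a constrained (invariant) model against the configural model.
      def invariance_holds(configural, constrained,
                           max_cfi_drop=0.01, max_rmsea_rise=0.015):
          cfi_drop = configural["cfi"] - constrained["cfi"]
          rmsea_rise = constrained["rmsea"] - configural["rmsea"]
          return cfi_drop <= max_cfi_drop and rmsea_rise <= max_rmsea_rise

      print(invariance_holds({"cfi": 0.950, "rmsea": 0.045},
                             {"cfi": 0.945, "rmsea": 0.050}))  # True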

  10. The quantitation of buffering action II. Applications of the formal & general approach.

    PubMed

    Schmitt, Bernhard M

    2005-03-16

    The paradigm of "buffering" originated in acid-base physiology, but was subsequently extended to other fields and is now used for a wide and diverse set of phenomena. In the preceding article, we have presented a formal and general approach to the quantitation of buffering action. Here, we use that buffering concept for a systematic treatment of selected classical and other buffering phenomena. H+ buffering by weak acids and "self-buffering" in pure water represent "conservative buffered systems" whose analysis reveals buffering properties that contrast in important aspects from classical textbook descriptions. The buffering of organ perfusion in the face of variable perfusion pressure (also termed "autoregulation") can be treated in terms of "non-conservative buffered systems", the general form of the concept. For the analysis of cytoplasmic Ca++ concentration transients (also termed "muffling"), we develop a related unit that is able to faithfully reflect the time-dependent quantitative aspect of buffering during the pre-steady state period. Steady-state buffering is shown to represent the limiting case of time-dependent muffling, namely for infinitely long time intervals and infinitely small perturbations. Finally, our buffering concept provides a stringent definition of "buffering" on the level of systems and control theory, resulting in four absolute ratio scales for control performance that are suited to measure disturbance rejection and setpoint tracking, and both their static and dynamic aspects. Our concept of buffering provides a powerful mathematical tool for the quantitation of buffering action in all its appearances.

  11. Why Engineers Should Consider Formal Methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1997-01-01

    This paper presents a logical analysis of a typical argument favoring the use of formal methods for software development, and suggests an alternative argument that is simpler and stronger than the typical one.

  12. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  13. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  14. Utility of learning plans in general practice vocational training: a mixed-methods national study of registrar, supervisor, and educator perspectives.

    PubMed

    Garth, Belinda; Kirby, Catherine; Silberberg, Peter; Brown, James

    2016-08-19

    Learning plans are a compulsory component of the training and assessment requirements of general practice (GP) registrars in Australia. There is a small but growing number of studies reporting that learning plans are not well accepted or utilised in general practice training. There is a lack of research examining this apparent contradiction. The aim of this study was to examine the use and perceived utility of formal learning plans in GP vocational training. This mixed-method Australian national research project utilised online learning plan usage data from 208 GP registrars and semi-structured focus groups and telephone interviews with 35 GP registrars, 12 recently fellowed GPs, 16 supervisors and 17 medical educators across three Regional Training Providers (RTPs). Qualitative data were analysed thematically using template analysis. Learning plans were used mostly as a log of activities rather than as a planning tool. Most learning needs were entered and ticked off as complete on the same day. Learning plans were perceived as having little value for registrars in their journey to becoming a competent GP, and as a bureaucratic hurdle serving as a distraction rather than an aid to learning. The process of learning planning was valued more than the documentation of learning planning. This study provides credible evidence that mandated learning plans are broadly considered by users to be a bureaucratic impediment with little value as a learning tool. It is more important to support registrars in planning their learning than to enforce documentation of this process in a learning plan. If learning planning is to be an assessed competence, methods of assessment other than the submission of a formal learning plan should be explored.

  15. Data-driven Ontology Development: A Case Study at NASA's Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Hertz, J.; Huffer, E.; Kusterer, J.

    2012-12-01

    Well-founded ontologies are key to enabling transformative semantic technologies and accelerating scientific research. One example is semantically enabled search and discovery, making scientific data accessible and more understandable by accurately modeling a complex domain. The ontology creation process remains a challenge for many who are eager to pursue semantic technologies. The key may be that the creation process -- whether formal, community-based, automated or semi-automated -- should encompass not only a foundational core and supplemental resources but also a focus on the purpose or mission the ontology is created to support. Are there tools or processes to de-mystify, assess or enhance the resulting ontology? We suggest that comparison and analysis of a domain-focused ontology can be made using text engineering tools for information extraction, tokenizers, named entity transducers and others. The results are analyzed to ensure the ontology reflects the core purpose of the domain's mission and that the ontology integrates and describes the supporting data in the language of the domain - how the science is analyzed and discussed among all users of the data. Commonalities and relationships among domain resources describing the Clouds and the Earth's Radiant Energy System (CERES) Bi-Directional Scan (BDS) datasets from NASA's Atmospheric Science Data Center are compared. The domain resources include: a formal ontology created for CERES; scientific works such as papers, conference proceedings and notes; information extracted from the datasets (i.e., header metadata); and BDS scientific documentation (Algorithm Theoretical Basis Documents, collection guides, data quality summaries and others). These resources are analyzed using the open source software General Architecture for Text Engineering (GATE), a mature framework for computational tasks involving human language.
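
    One step in such a workflow, comparing the terms used in ontology labels against the terms actually used in the domain corpus, can be sketched with deliberately crude tokenization (GATE performed this far more carefully in the study; the label and corpus strings below are invented).

      from collections import Counter
      import re

      def terms(text):
          return Counter(re.findall(r"[a-z]{3,}", text.lower()))

      ontology_labels = "radiant energy flux bidirectional scan detector"
      corpus = ("The BDS dataset records detector voltages for each "
                "bidirectional scan; radiant energy is reported as radiance.")

      corpus_terms = terms(corpus)
      covered = set(terms(ontology_labels)) & set(corpus_terms)
      coverage = sum(corpus_terms[t] for t in covered) / sum(corpus_terms.values())
      print(f"label coverage of corpus terms: {coverage:.0%}")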

  16. Approximate Micromechanics Treatise of Composite Impact

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Handler, Louis M.

    2005-01-01

    A formalism is described for the micromechanics of composite impact. The formalism consists of numerous equations that describe all aspects of impact, from impactor and composite conditions to impact contact, damage progression, and penetration or containment. It is based on a simulation of through-the-thickness displacement increments, which makes it convenient to track local damage in terms of microfailure modes and their respective characteristics. A flow chart is provided to cast the formalism (numerous equations) into a computer code for embedment in composite mechanics codes and/or finite element composite structural analysis.

  17. Supporting family caregivers to identify their own needs in end-of-life care: Qualitative findings from a stepped wedge cluster trial.

    PubMed

    Aoun, Samar; Deas, Kathleen; Toye, Chris; Ewing, Gail; Grande, Gunn; Stajduhar, Kelli

    2015-06-01

    The Carer Support Needs Assessment Tool encompasses the physical, psychological, social, practical, financial, and spiritual support needs that government policies in many countries emphasize should be assessed and addressed for family caregivers during end-of-life care. To describe the experience of family caregivers of terminally ill people of the Carer Support Needs Assessment Tool intervention in home-based palliative care. This study was conducted during 2012-2014 in Silver Chain Hospice Care Service in Western Australia. This article reports on one part of a three-part evaluation of a stepped wedge cluster trial. All 233 family caregivers receiving the Carer Support Needs Assessment Tool intervention provided feedback on their experiences via brief end-of-trial semi-structured telephone interviews. Data were subjected to a thematic analysis. The overwhelming majority reported finding the Carer Support Needs Assessment Tool assessment process straightforward and easy. Four key themes were identified: (1) the practicality and usefulness of the systematic assessment; (2) emotional responses to caregiver reflection; (3) validation, reassurance, and empowerment; and (4) accessing support and how this was experienced. Family caregivers appreciated the value of the Carer Support Needs Assessment Tool intervention in engaging them in conversations about their needs, priorities, and solutions. The Carer Support Needs Assessment Tool presented a simple, yet potentially effective intervention to help palliative care providers systematically assess and address family caregivers' needs. The Carer Support Needs Assessment Tool provided a formal structure to facilitate discussions with family caregivers to enable needs to be addressed. Such discussions can also inform an evidence base for the ongoing development of services for family caregivers, ensuring that new or improved services are designed to meet the explicit needs of family caregivers. © The Author(s) 2015.

  18. Educational Software for First Order Logic Semantics in Introductory Logic Courses

    ERIC Educational Resources Information Center

    Mauco, María Virginia; Ferrante, Enzo; Felice, Laura

    2014-01-01

    Basic courses on logic are common in most computer science curricula. Students often have difficulties in handling formalisms and getting familiar with them. Educational software helps to motivate and improve the teaching-learning processes. Therefore, incorporating these kinds of tools becomes important, because they contribute to gaining…

  19. A Tool Chain for the V and V of NASA Cryogenic Fuel Loading Health Management

    DTIC Science & Technology

    2014-10-02

  20. 48 CFR 31.205-19 - Insurance and indemnification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., and disappearance of small hand tools that occur in the ordinary course of business and that are not... general conduct of its business are allowable subject to the following limitations: (i) Types and extent... acquisition cost of the insured assets is allowable only when the contractor has a formal written policy...

  1. Basins 4.0 Climate Assessment Tool (Cat): Supporting Documentation and User Manual (External Review Draft)

    EPA Science Inventory

    EPA has released the draft document solely for the purpose of pre-dissemination peer review under applicable Information Quality Guidelines (IQGs). This document has not been formally disseminated by EPA. It does not represent and should not be construed to represent any Agenc...

  2. Wikis for Building Content Knowledge in the Foreign Language Classroom

    ERIC Educational Resources Information Center

    Pellet, Stephanie H.

    2012-01-01

    Most pedagogical applications of wikis in foreign language education draw on this collaborative tool to improve (formal) writing skills or to develop target language cultural sensitivity, largely missing the opportunity to support student-developed L2 content knowledge. Seeking an alternative to traditional teacher-centered approaches, this…

  3. Food Safety Posters for Safe Handling of Leafy Greens

    ERIC Educational Resources Information Center

    Rajagopal, Lakshman; Arendt, Susan W.; Shaw, Angela M.; Strohbehn, Catherine H.; Sauer, Kevin L.

    2016-01-01

    This article describes food safety educational tools depicting safe handling of leafy greens that are available as downloadable posters to Extension educators and practitioners (www.extension.iastate.edu). Nine visual-based minimal-text colored posters in English, Chinese, and Spanish were developed for use when formally or informally educating…

  4. Polynomial Approximation of Functions: Historical Perspective and New Tools

    ERIC Educational Resources Information Center

    Kidron, Ivy

    2003-01-01

    This paper examines the effect of applying symbolic computation and graphics to enhance students' ability to move from a visual interpretation of mathematical concepts to formal reasoning. The mathematics topics involved, Approximation and Interpolation, were taught according to their historical development, and the students tried to follow the…

  5. Constrained variational calculus for higher order classical field theories

    NASA Astrophysics Data System (ADS)

    Campos, Cédric M.; de León, Manuel; Martín de Diego, David

    2010-11-01

    We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.

  6. Swimming with Sharks: A Physical Educator's Guide to Effective Crowdsourcing

    ERIC Educational Resources Information Center

    Bulger, Sean M.; Jones, Emily M.; Katz, Nicole; Shrewsbury, Gentry; Wood, Justin

    2016-01-01

    The reality-competition television series Shark Tank affords up-and-coming entrepreneurs the opportunity to make a formal business presentation to a panel of potential investors. Adopting a similar framework, entrepreneurial teachers have started using web-based collaborative fundraising or crowdsourcing as a tool to build program capacity with…

  7. Basic Education and Policy Support Activity: Tools and Publications.

    ERIC Educational Resources Information Center

    Creative Associates International, Inc., Washington, DC.

    The Basic Education and Policy Support (BEPS) Activity is a United States Agency for International Development (USAID)-sponsored, multi-year initiative designed to further improve the quality of, effectiveness of, and access to formal and nonformal basic education. This catalog is one element of the BEPS information dissemination process. The…

  8. Dialogue as a Tool for Meaning Making

    ERIC Educational Resources Information Center

    Bruni, Angela Suzanne Dudley

    2013-01-01

    In order to empower citizens to analyze the effects, risk, and value of science, a knowledge of scientific concepts is necessary (Mejlgaard, 2009). The formal educational system plays a role in this endeavor (Gil-Perez & Vilches, 2005). One proposed constructivist practice is the use of social learning activities using verbalized, shared…

  9. Engaging Pediatricians in Developmental Screening: The Effectiveness of Academic Detailing

    ERIC Educational Resources Information Center

    Honigfeld, Lisa; Chandhok, Laura; Spiegelman, Kenneth

    2012-01-01

    Use of formal developmental screening tools in the pediatric medical home improves early identification of children with developmental delays and disorders, including Autism Spectrum Disorders. A pilot study evaluated the impact of an academic detailing module in which trainers visited 43 pediatric primary care practices to provide education about…

  10. Visualization of Learning Scenarios with UML4LD

    ERIC Educational Resources Information Center

    Laforcade, Pierre

    2007-01-01

    Present Educational Modelling Languages are used to formally specify abstract learning scenarios in a machine-interpretable format. Current tooling does not provide teachers/designers with graphical facilities to help them reuse existing scenarios. They need human-readable representations. This paper discusses the UML4LD experimental…

  11. An Interactive Diagnosis Approach for Supporting Clinical Nursing Courses

    ERIC Educational Resources Information Center

    Wei, Chun-Wang; Lin, Yi-Chun; Lin, Yen-Ting

    2016-01-01

    Clinical resources in nursing schools are always insufficient for satisfying the practice requirements of each student at the same time during a formal course session. Although several studies have applied information and communication technology to develop computer-based learning tools for addressing this problem, most of these developments lack…

  12. Restorative Practices as a Tool for Organizational Change

    ERIC Educational Resources Information Center

    Boulton, John; Mirsky, Laura

    2006-01-01

    Restorative practices focus on repairing the harm to relationships rather than piling on more punishment for violations. Originally popularized in formal conferences between a victim and offender in the justice system, restorative practices have been extended to educational and treatment settings. This article describes how the adversarial climate…

  13. A marketing matrix for health care organizations.

    PubMed

    Weaver, F J; Gombeski, W R; Fay, G W; Eversman, J J; Cowan-Gascoigne, C

    1986-06-01

    Irrespective of the formal marketing structure, successful marketing for health care organizations requires the input of many people. Detailed here is the Marketing Matrix used at the Cleveland Clinic Foundation in Cleveland, Ohio. This Matrix is both a philosophy and a tool for clarifying and focusing the organization's marketing activities.

  14. Risk Prioritization Tool to Identify the Public Health Risks of Wildlife Trade: The Case of Rodents from Latin America.

    PubMed

    Bueno, I; Smith, K M; Sampedro, F; Machalaba, C C; Karesh, W B; Travis, D A

    2016-06-01

    Wildlife trade (both formal and informal) is a potential driver of disease introduction and emergence. Legislative proposals aim to prevent these risks by banning wildlife imports, and creating 'white lists' of species that are cleared for importation. These approaches pose economic harm to the pet industry, and place substantial burden on importers and/or federal agencies to provide proof of low risk for importation of individual species. As a feasibility study, a risk prioritization tool was developed to rank the pathogens found in rodent species imported from Latin America into the United States with the highest risk of zoonotic consequence in the United States. Four formally traded species and 16 zoonotic pathogens were identified. Risk scores were based on the likelihood of pathogen release and human exposure, and the severity of the disease (consequences). Based on the methodology applied, three pathogens (Mycobacterium microti, Giardia spp. and Francisella tularensis) in one species (Cavia porcellus) were ranked as highest concern. The goal of this study was to present a methodological approach by which preliminary management resources can be allocated to the identified high-concern pathogen-species combinations when warranted. This tool can be expanded to other taxa and geographic locations to inform policy surrounding the wildlife trade. © 2015 Blackwell Verlag GmbH.
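
    The ranking logic reduces to scoring each species-pathogen pair on release likelihood, exposure likelihood, and consequence severity, then ordering by a combined score; the sketch below pairs the pathogens named in the record with invented scores and a simple product rule, not the study's elicited values.

      pairs = {  # (species, pathogen) -> (release, exposure, severity) in [0, 1]
          ("Cavia porcellus", "Francisella tularensis"): (0.6, 0.7, 0.9),
          ("Cavia porcellus", "Giardia spp."):           (0.8, 0.8, 0.5),
          ("Cavia porcellus", "Mycobacterium microti"):  (0.5, 0.6, 0.8),
      }

      def risk(scores):
          release, exposure, severity = scores
          return release * exposure * severity

      for pair, scores in sorted(pairs.items(), key=lambda kv: -risk(kv[1])):
          print(f"{pair[0]} / {pair[1]}: {risk(scores):.2f}")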

  15. Expert consensus for performing right heart catheterisation for suspected pulmonary arterial hypertension in systemic sclerosis: a Delphi consensus study with cluster analysis.

    PubMed

    Avouac, Jérôme; Huscher, Dörte; Furst, Daniel E; Opitz, Christian F; Distler, Oliver; Allanore, Yannick

    2014-01-01

    To establish an expert consensus on which criteria are the most appropriate in clinical practice to refer patients with systemic sclerosis (SSc) for right heart catheterisation (RHC) when pulmonary hypertension (PH) is suspected. A three-stage, internet-based Delphi consensus exercise involving worldwide PH experts was designed. In the first stage, a comprehensive list of domains and items combining evidence-based indications and expert opinions was obtained. In the second and third stages, experts were asked to rate each item selected in the list. After each of stages 2 and 3, the number of items and criteria was reduced according to a cluster analysis. A literature search and the opinions of 47 experts participating in Delphi stage 1 provided a list of seven domains containing 142 criteria. After stages 2 and 3, these domains and tools were reduced to three domains containing eight tools: clinical (progressive dyspnoea over the past 3 months, unexplained dyspnoea, worsening of WHO dyspnoea functional class, any finding on physical examination suggestive of elevated right heart pressures and any sign of right heart failure), echocardiography (systolic pulmonary artery pressure >45 mm Hg and right ventricle dilation) and pulmonary function tests (diffusion lung capacity for carbon monoxide <50% without pulmonary fibrosis). Among experts in pulmonary arterial hypertension-SSc, a core set of criteria for clinical practice to refer SSc patients for RHC has been defined by Delphi consensus methods. Although these indications are recommended by this expert group to be used as an interim tool, it will be necessary to formally validate the present tools in further studies.
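
    The item-reduction step can be pictured as clustering criteria by the similarity of their expert ratings and keeping representatives per cluster; the sketch below runs Ward clustering on a simulated rating matrix, since the study's data are not reproduced here and its exact clustering procedure is not specified in the record.

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      rng = np.random.default_rng(1)
      ratings = rng.integers(1, 10, size=(20, 8))  # 20 criteria x 8 experts

      labels = fcluster(linkage(ratings, method="ward"), t=3, criterion="maxclust")
      for c in np.unique(labels):
          print(f"cluster {c}: criteria {np.flatnonzero(labels == c).tolist()}")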

  16. Development of an online, publicly accessible naive Bayesian decision support tool for mammographic mass lesions based on the American College of Radiology (ACR) BI-RADS lexicon.

    PubMed

    Benndorf, Matthias; Kotter, Elmar; Langer, Mathias; Herda, Christoph; Wu, Yirong; Burnside, Elizabeth S

    2015-06-01

    To develop and validate a decision support tool for mammographic mass lesions based on a standardized descriptor terminology (BI-RADS lexicon) to reduce variability of practice. We used separate training data (1,276 lesions, 138 malignant) and validation data (1,177 lesions, 175 malignant). We created naïve Bayes (NB) classifiers from the training data with tenfold cross-validation. Our "inclusive model" comprised BI-RADS categories, BI-RADS descriptors, and age as predictive variables; our "descriptor model" comprised BI-RADS descriptors and age. The resulting NB classifiers were applied to the validation data. We evaluated and compared classifier performance with ROC-analysis. In the training data, the inclusive model yields an AUC of 0.959; the descriptor model yields an AUC of 0.910 (P < 0.001). The inclusive model is superior to the clinical performance (BI-RADS categories alone, P < 0.001); the descriptor model performs similarly. When applied to the validation data, the inclusive model yields an AUC of 0.935; the descriptor model yields an AUC of 0.876 (P < 0.001). Again, the inclusive model is superior to the clinical performance (P < 0.001); the descriptor model performs similarly. We consider our classifier a step towards a more uniform interpretation of combinations of BI-RADS descriptors. We provide our classifier at www.ebm-radiology.com/nbmm/index.html . • We provide a decision support tool for mammographic masses at www.ebm-radiology.com/nbmm/index.html . • Our tool may reduce variability of practice in BI-RADS category assignment. • A formal analysis of BI-RADS descriptors may enhance radiologists' diagnostic performance.
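
    In the spirit of the paper's "descriptor model", a naive Bayes classifier over categorical BI-RADS descriptors plus a binned age can be sketched with scikit-learn; the feature encoding, label rule, and data below are simulated, and the published tool itself is the one at www.ebm-radiology.com/nbmm/index.html.

      import numpy as np
      from sklearn.naive_bayes import CategoricalNB

      rng = np.random.default_rng(2)
      # Columns: margin (0-3), shape (0-2), density (0-2), age bin (0-4).
      X = rng.integers(0, [4, 3, 3, 5], size=(1276, 4))
      y = (X[:, 0] == 3) & (rng.random(1276) < 0.7)  # toy malignancy rule

      model = CategoricalNB().fit(X, y)
      # Posterior probability of malignancy for one descriptor combination.
      print(model.predict_proba([[3, 1, 2, 4]])[0, 1])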

  17. Development, validation and psychometric properties of a diagnostic/prognostic tool for breakthrough pain in mixed chronic-pain patients.

    PubMed

    Samolsky Dekel, Boaz Gedaliahu; Remondini, Francesca; Gori, Alberto; Vasarri, Alessio; Di Nino, GianFranco; Melotti, Rita Maria

    2016-02-01

    Breakthrough pain (BTP) shows variable prevalence in different clinical contexts of cancer and non-cancer patients. BTP diagnostic tools with demonstrated reliability, validation and prognostic capability are lacking. We report the development, psychometric and validation properties of a diagnostic/prognostic tool, the IQ-BTP, for BTP recognition, its likelihood and clinical features among chronic-pain (CP) patients. n=120 consecutive mixed cancer/non-cancer CP in/outpatients. Development, psychometric analyses and formal validation included: face/content validity (by experts' opinion and by assessing the relationship between the IQ-BTP classes and criteria derived from the BTP operational case definition); construct validity, by Principal Component Analysis (PCA) and the strength of the Spearman correlation between IQ-BTP classes and Brief Pain Inventory (BPI) items; and reliability, by Cronbach's alpha statistics. Associations with clinical/demographic moderators were assessed applying χ(2) analysis. Potential BTP was found in 36.7% of patients (38.4% of non-cancer and 32.4% of cancer patients). Among these, the likelihood for BTP diagnosis was 'high' in 25%, 'intermediate' in 41% and 'low' in 34% of patients. Analyses showed significant differences between IQ-BTP classes and between the latter's BPI pain-item scores. Correlation between IQ-BTP classes and BPI items was moderate. PCA and the scree test identified 3 components accounting for 62.3% of the variance. Cronbach's alpha was 0.71. The IQ-BTP showed satisfactory psychometric and validation properties. With adequate feasibility, it enabled the allocation of cancer/non-cancer CP patients into three prognostic classes. These results are sufficient to warrant a subsequent impact study of the IQ-BTP as a prognostic model and screening tool for BTP in both CP populations. Copyright © 2016 Elsevier B.V. All rights reserved.
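
    Cronbach's alpha, used in the validation, is simple to compute from an item-response matrix; the responses below are simulated, not the IQ-BTP sample.

      import numpy as np

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(3)
      latent = rng.normal(size=(120, 1))
      responses = latent + rng.normal(scale=0.8, size=(120, 6))  # 6 items
      print(round(cronbach_alpha(responses), 2))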

  18. Reverse Engineering a Signaling Network Using Alternative Inputs

    PubMed Central

    Tanaka, Hiromasa; Yi, Tau-Mu

    2009-01-01

    One of the goals of systems biology is to reverse engineer in a comprehensive fashion the arrow diagrams of signal transduction systems. An important tool for ordering pathway components is genetic epistasis analysis, and here we present a strategy termed Alternative Inputs (AIs) to perform systematic epistasis analysis. An alternative input is defined as any genetic manipulation that can activate the signaling pathway instead of the natural input. We introduced the concept of an “AIs-Deletions matrix” that summarizes the outputs of all combinations of alternative inputs and deletions. We developed the theory and algorithms to construct a pairwise relationship graph from the AIs-Deletions matrix capturing both functional ordering (upstream, downstream) and logical relationships (AND, OR), and then interpreting these relationships into a standard arrow diagram. As a proof-of-principle, we applied this methodology to a subset of genes involved in yeast mating signaling. This experimental pilot study highlights the robustness of the approach and important technical challenges. In summary, this research formalizes and extends classical epistasis analysis from linear pathways to more complex networks, facilitating computational analysis and reconstruction of signaling arrow diagrams. PMID:19898612
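
    The inference behind an AIs-Deletions matrix can be caricatured for a linear pathway: if the pathway is activated via an alternative input at gene A and output persists when gene G is deleted, then G acts upstream of (or parallel to) A; if output is lost, G acts downstream of A. The gene names below are real yeast mating-pathway genes used for flavor, but the matrix entries are invented, not the paper's data.

      # AIs-Deletions matrix sketch: OUTPUT[(alternative_input, deletion)]
      # records whether pathway output is still observed.
      OUTPUT = {
          ("STE11-act", "ste4"): True,   # deletion tolerated -> ste4 upstream
          ("STE11-act", "fus3"): False,  # deletion blocks    -> fus3 downstream
      }

      for (ai, deletion), on in OUTPUT.items():
          relation = "upstream of (or parallel to)" if on else "downstream of"
          print(f"{deletion} acts {relation} {ai}")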

  19. Formalization of the engineering science discipline - knowledge engineering

    NASA Astrophysics Data System (ADS)

    Peng, Xiao

    Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. The subsequent engineers were not aware of the previous developments, because these projects were documented in ways that prevented the next generation of engineers from benefiting from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from experienced engineers to the next generation. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities has yet to be developed. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. Propose and develop an innovative Knowledge-Based System (KBS), AVD KBS, forming a systematic approach facilitating knowledge management. 4. Demonstrate the efficiency advantages of AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, AVD KBS. Formalizing knowledge proves to significantly impact the effectiveness of aerospace knowledge retention and utilization.

  20. Introduction of formal debate into a postgraduate specialty track education programme in periodontics in Japan.

    PubMed

    Saito, A; Fujinami, K

    2011-02-01

    To evaluate the formal debate as an active learning strategy within a postgraduate specialty track education programme in periodontics. A formal debate was implemented as an active learning strategy in the programme. The participants were full-time faculty, residents and dentists attending special courses at a teaching hospital in Japan. They were grouped into two evenly matched opposing teams, judges and an audience. As preparation for the debate, the participants attended a lecture on critical thinking. At the time of the debate, each team provided a theme report with a list of references. Performances and contents of the debate were evaluated by the course instructors and the audience. Pre- and post-debate testing was used to assess the participants' objective knowledge of clinical periodontology. Evaluation of the debate by the participants revealed that scores for criteria such as presentation performance, response with logic and rebuttal effectiveness were relatively low. Thirty-eight per cent of the participants demonstrated higher test scores after the debate, although there was no statistically significant difference in the mean scores between pre- and post-tests. At the end of the debate, the vast majority of participants recognised the significance and importance of the formal debate in the programme. It was suggested that the incorporation of the formal debate could serve as an educational tool for the postgraduate specialty track programme. © 2011 John Wiley & Sons A/S.

  1. Syndromic surveillance of influenza activity in Sweden: an evaluation of three tools.

    PubMed

    Ma, T; Englund, H; Bjelkmar, P; Wallensten, A; Hulth, A

    2015-08-01

    An evaluation was conducted to determine which syndromic surveillance tools complement traditional surveillance by serving as earlier indicators of influenza activity in Sweden. Web queries, medical hotline statistics, and school absenteeism data were evaluated against two traditional surveillance tools. Cross-correlation calculations utilized aggregated weekly data for all-age, nationwide activity for four influenza seasons, from 2009/2010 to 2012/2013. The surveillance tool indicative of earlier influenza activity, by way of statistical and visual evidence, was identified. The web query algorithm and medical hotline statistics performed equally well as each other and as the traditional surveillance tools. School absenteeism data were not a reliable resource for influenza surveillance. Overall, the syndromic surveillance tools did not perform consistently enough, either in lead time over the season or in earlier timing of the peak week, to be considered early indicators. They do, however, capture incident cases before they have formally entered the primary healthcare system.
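
    The lead/lag comparison at the heart of such an evaluation can be sketched by correlating a candidate indicator with the sentinel series at weekly shifts; the series below are simulated so that the indicator leads by one week.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(4)
      sentinel = pd.Series(np.convolve(rng.poisson(5, 210), np.ones(5), "same"))
      indicator = sentinel.shift(-1) + rng.normal(0, 2, 210)  # leads by 1 week

      # A positive best lag means the indicator leads the sentinel series.
      corrs = {lag: sentinel.corr(indicator.shift(lag)) for lag in range(-4, 5)}
      best = max(corrs, key=corrs.get)
      print(f"best lag: {best} week(s), r = {corrs[best]:.2f}")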

  2. Formal Specification and Validation of a Hybrid Connectivity Restoration Algorithm for Wireless Sensor and Actor Networks †

    PubMed Central

    Imran, Muhammad; Zafar, Nazir Ahmad

    2012-01-01

    Maintaining inter-actor connectivity is extremely crucial in mission-critical applications of Wireless Sensor and Actor Networks (WSANs), as actors have to quickly plan optimal coordinated responses to detected events. Failure of a critical actor partitions the inter-actor network into disjoint segments, besides leaving a coverage hole, and thus hinders the network operation. This paper presents a Partitioning detection and Connectivity Restoration (PCR) algorithm to tolerate critical actor failure. As part of pre-failure planning, PCR determines critical/non-critical actors based on localized information and designates for each critical node an appropriate backup (preferably non-critical). The pre-designated backup detects the failure of its primary actor and initiates a post-failure recovery process that may involve coordinated multi-actor relocation. To prove its correctness, we construct a formal specification of PCR using Z notation: we model the WSAN topology as a dynamic graph and transform PCR into a corresponding formal specification, which is analyzed and validated using the Z/EVES tool. Moreover, we simulate the specification to quantitatively analyze the efficiency of PCR. Simulation results confirm the effectiveness of PCR and show that it outperforms contemporary schemes found in the literature.
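
    In graph terms, a "critical actor" is a cut vertex of the inter-actor topology. PCR itself decides criticality from localized neighbourhood information, but the global notion it approximates can be illustrated with a standard depth-first search for articulation points (a sketch, not the paper's algorithm):

      from collections import defaultdict

      def critical_actors(edges):
          """Articulation points of an undirected actor graph: actors whose
          failure disconnects at least one pair of surviving actors."""
          graph = defaultdict(set)
          for u, v in edges:
              graph[u].add(v)
              graph[v].add(u)
          disc, low, critical, counter = {}, {}, set(), [0]

          def dfs(u, parent):
              disc[u] = low[u] = counter[0]
              counter[0] += 1
              children = 0
              for v in graph[u]:
                  if v == parent:
                      continue
                  if v not in disc:
                      children += 1
                      dfs(v, u)
                      low[u] = min(low[u], low[v])
                      if parent is not None and low[v] >= disc[u]:
                          critical.add(u)   # no back-edge bypasses u
                  else:
                      low[u] = min(low[u], disc[v])
              if parent is None and children > 1:
                  critical.add(u)           # root with several DFS subtrees

          for node in list(graph):
              if node not in disc:
                  dfs(node, None)
          return critical

      print(critical_actors([("A", "B"), ("B", "C"), ("B", "D"), ("D", "E")]))
      # -> {'B', 'D'}: losing either one partitions the survivors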

  3. Learning Competences in Open Mobile Environments: A Comparative Analysis between Formal and Non-Formal Spaces

    ERIC Educational Resources Information Center

    Figaredo, Daniel Domínguez; Miravalles, Paz Trillo

    2014-01-01

    As a result of the increasing use of mobile devices in education, new approaches to define the learning competences in the field of digitally mediated learning have emerged. This paper examines these approaches, using data obtained from empirical research with a group of Spanish university students. The analysis is focused on the experiences of…

  4. The jmzQuantML programming interface and validator for the mzQuantML data standard.

    PubMed

    Qi, Da; Krishna, Ritesh; Jones, Andrew R

    2014-03-01

    The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
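
    The two levels of such a validation are worth separating: structural checks are plain XML Schema validation, while semantic checks need rules a schema cannot express, such as cross-references that must resolve. A generic Python sketch with lxml (the file names are placeholders, and this illustrates the idea rather than the jmzQuantML validator's actual API):

      from lxml import etree

      schema = etree.XMLSchema(etree.parse("mzQuantML_1_0_0.xsd"))  # placeholder
      doc = etree.parse("experiment.mzq")                           # placeholder

      # Level 1: structural validation against the XSD.
      if not schema.validate(doc):
          for err in schema.error_log:
              print("structure:", err.message)

      # Level 2: a semantic rule, e.g. every ref attribute must resolve to an id.
      ids = {el.get("id") for el in doc.iter() if el.get("id")}
      for el in doc.iter():
          ref = el.get("ref")
          if ref is not None and ref not in ids:
              print("semantics: dangling reference", ref)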

  5. Model-Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions, and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds on the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention of representing in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
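
    For intuition about the encoding: an additive edge-valued diagram stores an arithmetic function as offsets on edges, so the value of f at an assignment is the sum of offsets along a single root-to-terminal path, and one shared subgraph can serve many parents. A toy evaluator (an illustration of the idea, not the paper's library):

      class Node:
          def __init__(self, var, edges):
              self.var = var      # index of the variable this node tests
              self.edges = edges  # one (offset, child) pair per variable value

      TERMINAL = None             # single terminal, conventionally valued 0

      def evaluate(root_offset, root, assignment):
          total, node = root_offset, root
          while node is not TERMINAL:
              offset, node = node.edges[assignment[node.var]]
              total += offset
          return total

      # f(x0, x1) = 2*x0 + x1 over {0, 1, 2}; the x1 node is shared by all
      # three edges of the root, which is where EVMDDs save space.
      leaf = Node(1, [(0, TERMINAL), (1, TERMINAL), (2, TERMINAL)])
      root = Node(0, [(0, leaf), (2, leaf), (4, leaf)])
      print(evaluate(0, root, [2, 1]))   # -> 5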

  6. Enantioselective Brønsted Acid Catalysis as a Tool for the Synthesis of Natural Products and Pharmaceuticals.

    PubMed

    Merad, Jérémy; Lalli, Claudia; Bernadat, Guillaume; Maury, Julien; Masson, Géraldine

    2018-03-15

    Synthesis of biologically active molecules (whether at laboratory or industrial scale) remains a highly appealing area of modern organic chemistry. Nowadays, the need to access original bioactive scaffolds goes together with the desire to improve synthetic efficiency, while reducing the environmental footprint of chemical activities. Long neglected in the field of total synthesis, enantioselective organocatalysis has recently emerged as an environmentally friendly and indispensable tool for the construction of relevant bioactive molecules. Notably, enantioselective Brønsted acid catalysis has offered new opportunities in terms of both retrosynthetic disconnections and controlling stereoselectivity. The present report attempts to provide an overview of enantioselective total or formal syntheses designed around Brønsted acid-catalyzed transformations. To demonstrate the versatility of the reactions promoted and the diversity of the accessible motifs, this Minireview draws a systematic parallel between methods and retrosynthetic analysis. The manuscript is organized according to the main reaction types and the nature of newly-formed bonds. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Historical perspective on risk assessment in the federal government.

    PubMed

    Graham, J D

    1995-09-01

    This article traces the evolution of risk assessment as an essential analytical tool in the federal government. In many programs and agencies, decisions cannot be made without the benefit of information from risk assessment. Although this analytical tool influences important public health and economic decisions, there is widespread dissatisfaction with the day-to-day practice of risk assessment. The article describes the sources of dissatisfaction that have been voiced by scientists, regulators, interest groups and ordinary citizens. Problems include the use of arbitrary exposure scenarios, the misuse of the 'carcinogen' label, the excessive reliance on animal cancer tests, the lack of formal uncertainty analysis, the low priority assigned to noncancer endpoints, the poor communication of risk estimates and the neglect of inequities in the distribution of risk. Despite these limitations, the article argues that more danger rests in efforts to make decisions without any risk assessment. Recent Congressional and Administration interest in risk assessment is encouraging because it offers promise to learn from past mistakes and set in motion steps to enhance the risk assessment process.

  8. The methodological quality of three foundational law enforcement Drug Influence Evaluation validation studies.

    PubMed

    Kane, Greg

    2013-11-04

    A Drug Influence Evaluation (DIE) is a formal assessment of an impaired driving suspect, performed by a trained law enforcement officer who uses circumstantial facts, questioning, searching, and a physical exam to form an unstandardized opinion as to whether a suspect's driving was impaired by drugs. This paper first identifies the scientific studies commonly cited in American criminal trials as evidence of DIE accuracy, and second, uses the QUADAS tool to investigate whether the methodologies used by these studies allow them to correctly quantify the diagnostic accuracy of the DIEs currently administered by US law enforcement. Three studies were selected for analysis. For each study, the QUADAS tool identified biases that distorted reported accuracies. The studies were subject to spectrum bias, selection bias, misclassification bias, verification bias, differential verification bias, incorporation bias, and review bias. The studies quantified DIE performance with prevalence-dependent accuracy statistics that are internally but not externally valid. The accuracies reported by these studies do not quantify the accuracy of the DIE process now used by US law enforcement. These studies do not validate current DIE practice.
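
    "Prevalence-dependent" is the crux: sensitivity and specificity can transfer across settings, but predictive values cannot. A worked example with illustrative numbers (not figures from the three studies):

      def predictive_values(sensitivity, specificity, prevalence):
          # Positive and negative predictive value via Bayes' rule.
          tp = sensitivity * prevalence
          fp = (1 - specificity) * (1 - prevalence)
          fn = (1 - sensitivity) * prevalence
          tn = specificity * (1 - prevalence)
          return tp / (tp + fp), tn / (tn + fn)

      # A hypothetical evaluation with 80% sensitivity and 90% specificity
      # looks far better in a drug-rich study sample than at the roadside.
      for prev in (0.9, 0.5, 0.1):
          ppv, npv = predictive_values(0.80, 0.90, prev)
          print(f"prevalence={prev:.0%}  PPV={ppv:.2f}  NPV={npv:.2f}")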

  9. Multiagent Work Practice Simulation: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)

    2001-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  10. The minimalist grammar of action

    PubMed Central

    Pastra, Katerina; Aloimonos, Yiannis

    2012-01-01

    Language and action have been found to share a common neural basis and in particular a common ‘syntax’, an analogous hierarchical and compositional organization. While language structure analysis has led to the formulation of different grammatical formalisms and associated discriminative or generative computational models, the structure of action is still elusive and so are the related computational models. However, structuring action has important implications on action learning and generalization, in both human cognition research and computation. In this study, we present a biologically inspired generative grammar of action, which employs the structure-building operations and principles of Chomsky's Minimalist Programme as a reference model. In this grammar, action terminals combine hierarchically into temporal sequences of actions of increasing complexity; the actions are bound with the involved tools and affected objects and are governed by certain goals. We show how the tool role and the affected-object role of an entity within an action drive the derivation of the action syntax in this grammar and control recursion, merge and move, the latter being mechanisms that manifest themselves not only in human language, but in human action too. PMID:22106430

  11. Multiagent Work Practice Simulation: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten

    2002-01-01

    Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).

  12. Blended near-optimal tools for flexible water resources decision making

    NASA Astrophysics Data System (ADS)

    Rosenberg, David

    2015-04-01

    State-of-the-art systems analysis techniques focus on efficiently finding optimal solutions. Yet an optimal solution is optimal only for the issues as modelled, and managers often seek near-optimal alternatives that address un-modelled or changing objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as performance within a tolerable deviation from the optimal objective function value and identified a few maximally different alternatives that addressed select un-modelled issues. This paper presents new stratified Markov chain Monte Carlo sampling and parallel coordinate plotting tools that generate and communicate the structure and full extent of the near-optimal region of an optimization problem. Plot controls allow users to interactively explore the region features of most interest. Controls also streamline the process of eliciting un-modelled issues and updating the model formulation in response. Use for a single-objective water quality management problem at Echo Reservoir, Utah, identifies numerous and flexible practices to reduce the phosphorus load to the reservoir while maintaining close-to-optimal performance. Compared to MGA, the new blended tools generate more numerous alternatives faster, show the near-optimal region more fully, help elicit a larger set of un-modelled issues, and offer managers greater flexibility to cope in a changing world.
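
    The MGA notion of near-optimal is easy to state operationally: keep the original constraints, cap the objective at a tolerable deviation from the optimum, and look for solutions as different as possible from those already seen. A toy sketch of the classic Hop-Skip-Jump move on a small LP (illustrative numbers, not the Echo Reservoir model):

      import numpy as np
      from scipy.optimize import linprog

      c = np.array([2.0, 3.0, 1.0])            # unit costs of three practices
      A_ub = np.array([[-1.0, -1.0, -1.0]])    # total effort must reach 10
      b_ub = np.array([-10.0])
      bounds = [(0, 8)] * 3

      base = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

      # Near-optimal region: cost within 10% of optimal. Hop-Skip-Jump then
      # minimizes the use of variables active in the previous solution.
      A_no = np.vstack([A_ub, c])
      b_no = np.append(b_ub, 1.10 * base.fun)
      penalty = (base.x > 1e-9).astype(float)
      alt = linprog(penalty, A_ub=A_no, b_ub=b_no, bounds=bounds)
      print("optimal:", base.x, " alternative:", alt.x)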

  13. Multi-Agent Modeling and Simulation Approach for Design and Analysis of MER Mission Operations

    NASA Technical Reports Server (NTRS)

    Seah, Chin; Sierhuis, Maarten; Clancey, William J.

    2005-01-01

    A space mission operations system is a complex network of human organizations, information and deep-space network systems, and spacecraft hardware. As in other organizations, one of the problems in mission operations is managing the relationship between the mission information systems and how people actually work (practices). Brahms, a multi-agent modeling and simulation tool, was used to model and simulate NASA's Mars Exploration Rover (MER) mission work practice. The objective was to investigate the value of work practice modeling for mission operations design. From spring 2002 until winter 2003, a Brahms modeler participated in mission systems design sessions and operations testing for the MER mission held at the Jet Propulsion Laboratory (JPL). He observed how designers interacted with the Brahms tool. This paper discusses mission system designers' reactions to the simulation output during model validation and the presentation of generated work procedures. This project spurred JPL's interest in the Brahms model, but it was never included as part of the formal mission design process; we discuss why this occurred. Subsequently, we used the MER model to develop a future mission operations concept. Team members were reluctant to use the MER model, even though it appeared to be highly relevant to their effort. We describe some of the tool issues we encountered.

  14. Risk assessment as standard work in design.

    PubMed

    Morrill, Patricia W

    2013-01-01

    This case study article examines a formal risk assessment as part of the decision-making process for design solutions in high-risk areas. An overview of the Failure Modes and Effects Analysis (FMEA) tool, with examples of its application in hospital building projects, demonstrates the benefit of those structured conversations. This article illustrates how two hospitals used FMEA when integrating operational processes with building projects: (1) an adjacency decision for an Intensive Care Unit (ICU); and (2) a distance concern for the handling of specimens from Surgery to Lab. Both case studies involved interviews that exposed facility solution concerns. Just-in-time studies using the FMEA followed the same risk assessment process, with the same workshop facilitator leading structured conversations to analyze risks. In both cases, participants uncovered key areas of risk, enabling them to take the necessary next steps. While the focus of this article is not the actual design solution, it is apparent that the risk assessment brought clarity to the situations, resulting in prompt decision making about facility solutions. Hospitals are inherently risky environments; therefore, use of the formal risk assessment process, FMEA, is an opportunity for design professionals to apply more rigor to design decision making when facility solutions impact operations in high-risk areas. Keywords: case study, decision making, hospital, infection control, strategy, work environment.
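
    An FMEA workshop typically scores each failure mode on severity, occurrence, and detection (each rated 1-10) and ranks by the risk priority number, their product. A sketch with hypothetical scores, not the case studies' actual values:

      # (failure mode, severity, occurrence, detection), each rated 1-10
      failure_modes = [
          ("Specimen delayed in transit from Surgery to Lab", 8, 6, 4),
          ("ICU transport route blocked during adjacency change", 9, 3, 3),
          ("Specimen mislabeled at hand-off", 10, 2, 5),
      ]

      # Risk priority number = severity * occurrence * detection.
      for name, s, o, d in sorted(failure_modes,
                                  key=lambda m: m[1] * m[2] * m[3],
                                  reverse=True):
          print(f"RPN={s * o * d:3d}  {name}")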

  15. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    PubMed Central

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

    Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation in this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy, and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. This paper therefore proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architectures based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols. PMID:27163786
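
    The elliptic-curve primitive underneath such schemes is a Diffie-Hellman exchange followed by key derivation. A minimal sketch with the Python cryptography package, showing generic ECDH only, not the paper's full biometric, smart-card and mutual-authentication protocol:

      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import ec
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      # Each side generates an ephemeral key pair on the P-256 curve.
      server_key = ec.generate_private_key(ec.SECP256R1())
      user_key = ec.generate_private_key(ec.SECP256R1())

      # Exchanging public keys lets both sides derive the same secret,
      # which is then stretched into a fixed-length session key.
      shared = user_key.exchange(ec.ECDH(), server_key.public_key())
      session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                         salt=None, info=b"session").derive(shared)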

  16. Modeling Of Object- And Scene-Prototypes With Hierarchically Structured Classes

    NASA Astrophysics Data System (ADS)

    Ren, Z.; Jensch, P.; Ameling, W.

    1989-03-01

    The success of knowledge-based image analysis methodology and implementation tools depends largely on an appropriately and efficiently built model wherein the domain-specific context information about, and the inherent structure of, the observed image scene has been encoded. For identifying an object in an application environment, a computer vision system needs to know, first, the description of the object to be found in an image or an image sequence and, second, the corresponding relationships between object descriptions within the image sequence. This paper presents models of image objects and scenes by means of hierarchically structured classes. Using the topovisual formalism of graphs and higraphs, we are currently studying principally the relational aspect and data abstraction of the modeling, in order to visualize the structural nature resident in image objects and scenes and to formalize their descriptions. The goal is to expose the structure of the image scene and the correspondence of image objects in the low-level image interpretation process. The object-based system design approach has been applied to build the model base. We utilize the object-oriented programming language C++ for designing, testing and implementing the abstracted entity classes and the operation structures which have been modeled topovisually. The reference images used for modeling prototypes of objects and scenes are from industrial environments as well as medical applications.

  17. Characterizing pedagogical practices of university physics students in informal learning environments

    NASA Astrophysics Data System (ADS)

    Hinko, Kathleen A.; Madigan, Peter; Miller, Eric; Finkelstein, Noah D.

    2016-06-01

    [This paper is part of the Focused Collection on Preparing and Supporting University Physics Educators.] University educators (UEs) have a long history of teaching physics not only in formal classroom settings but also in informal outreach environments. The pedagogical practices of UEs in informal physics teaching have not been widely studied, and they may provide insight into formal practices and preparation. We investigate the interactions between UEs and children in an afterschool physics program facilitated by university physics students from the University of Colorado Boulder. In this program, physics undergraduates, graduate students, and postdoctoral researchers work with K-8 children on hands-on physics activities on a weekly basis over the course of a semester. We use an activity theoretic framework as a tool to examine situational aspects of individuals' behavior in the complex structure of the afterschool program. Using this framework, we analyze video of UE-child interactions and identify three main pedagogical modalities that UEs display during activities: instruction, consultation, and participation modes. These modes are characterized by certain language, physical location, and objectives that establish differences in UE-child roles and division of labor. Based on this analysis, we discuss implications for promoting pedagogical strategies through purposeful curriculum development and university educator preparation.

  18. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  19. Note-taking and Handouts in The Digital Age.

    PubMed

    Stacy, Elizabeth Moore; Cain, Jeff

    2015-09-25

    Most educators consider note-taking a critical component of formal classroom learning. Advancements in technology such as tablet computers, mobile applications, and recorded lectures are altering classroom dynamics and affecting the way students compose and review class notes. These tools may improve a student's ability to take notes, but they also may hinder learning. In an era of dynamic technology developments, it is important for educators to routinely examine and evaluate influences on formal and informal learning environments. This paper discusses key background literature on student note-taking, identifies recent trends and potential implications of mobile technologies on classroom note-taking and student learning, and discusses future directions for note-taking in the context of digitally enabled lifelong learning.

  20. Incompleteness of Bluetooth protocol conformance test cases

    NASA Astrophysics Data System (ADS)

    Wu, Peng; Gao, Qiang

    2001-10-01

    This paper describes a formal method to verify the completeness of conformance testing, in which not only the Implementation Under Test (IUT) but also the conformance tester is formalized in SDL, so that conformance testing can be performed in the simulator provided with a CASE tool. The protocol set considered is Bluetooth, an open wireless communication technology. Our research results show that the Bluetooth conformance test specification is not complete, in that it has only limited coverage and many important capabilities defined in the Bluetooth core specification are not tested. We also give a detailed report on the missing test cases against the Bluetooth core specification, and provide guidance for further test case generation.

  1. Structured representation for requirements and specifications

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Fisher, Gene; Frincke, Deborah; Wolber, Dave

    1991-01-01

    This document was generated in support of NASA contract NAS1-18586, Design and Validation of Digital Flight Control Systems suitable for Fly-By-Wire Applications, Task Assignment 2. Task 2 is associated with a formal representation of requirements and specifications. In particular, this document contains results associated with the development of a Wide-Spectrum Requirements Specification Language (WSRSL) that can be used to express system requirements and specifications in both stylized and formal forms. Included with this development are prototype tools to support the specification language. In addition a preliminary requirements specification methodology based on the WSRSL has been developed. Lastly, the methodology has been applied to an Advanced Subsonic Civil Transport Flight Control System.

  2. Gulf Coast Clean Energy Application Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillingham, Gavin

    The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region, with a realization of more efficient energy generation, reduced emissions and a more resilient infrastructure. Specific to research, we did not formally investigate any techniques with a formal research design or methodology.

  3. Hidden symmetry and nonlinear paraxial atom optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Impens, Francois

    2009-12-15

    A hidden symmetry of the nonlinear wave equation is exploited to analyze the propagation of paraxial and uniform atom-laser beams in time-independent and quadratic transverse potentials with cylindrical symmetry. The quality factor and the paraxial ABCD formalism are generalized to account exactly for mean-field interaction effects in such beams. Using an approach based on moments, these theoretical tools provide a simple yet exact picture of the interacting beam profile evolution. Guided atom laser experiments are discussed. This treatment addresses simultaneously optical and atomic beams in a unified manner, exploiting the formal analogy between nonlinear optics, nonlinear paraxial atom optics, and the physics of two-dimensional Bose-Einstein condensates.
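
    For reference, the linear ABCD formalism that the paper generalizes propagates a Gaussian beam's complex parameter q (with 1/q = 1/R - i*lambda/(pi*w^2)) through an element as q' = (Aq + B)/(Cq + D). A standard non-interacting optical sketch:

      import numpy as np

      def propagate(q, abcd):
          (A, B), (C, D) = abcd
          return (A * q + B) / (C * q + D)

      wavelength = 633e-9                    # He-Ne laser, metres
      w0 = 1e-3                              # 1 mm waist
      q0 = 1j * np.pi * w0**2 / wavelength   # q at the waist (R infinite)

      free_space = ((1.0, 0.5), (0.0, 1.0))          # 0.5 m of propagation
      thin_lens = ((1.0, 0.0), (-1.0 / 0.25, 1.0))   # f = 0.25 m lens

      q = propagate(propagate(q0, free_space), thin_lens)
      w = np.sqrt(-wavelength / (np.pi * (1 / q).imag))
      print(f"beam radius after the lens: {w * 1e3:.3f} mm")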

  4. Strategic Planning in Population Health and Public Health Practice: A Call to Action for Higher Education

    PubMed Central

    PHELPS, CHARLES; RAPPUOLI, RINO; LEVIN, SCOTT; SHORTLIFFE, EDWARD; COLWELL, RITA

    2016-01-01

    Policy Points: Scarce resources, especially in population health and public health practice, underlie the importance of strategic planning. Public health agencies' current planning and priority setting efforts are often narrow, at times opaque, and focused on single metrics such as cost-effectiveness. As demonstrated by SMART Vaccines, a decision support software system developed by the Institute of Medicine and the National Academy of Engineering, new approaches to strategic planning allow the formal incorporation of multiple stakeholder views and multicriteria decision making that surpass even those sophisticated cost-effectiveness analyses widely recommended and used for public health planning. Institutions of higher education can and should respond by building on modern strategic planning tools as they teach their students how to improve population health and public health practice. Context: Strategic planning in population health and public health practice often uses single indicators of success or, when using multiple indicators, provides no mechanism for coherently combining the assessments. Cost-effectiveness analysis, the most complex strategic planning tool commonly applied in public health, uses only a single metric to evaluate programmatic choices, even though other factors often influence actual decisions. Methods: Our work employed a multicriteria systems analysis approach--specifically, multiattribute utility theory--to assist in strategic planning and priority setting in a particular area of health care (vaccines), thereby moving beyond the traditional cost-effectiveness analysis approach. Findings: (1) Multicriteria systems analysis provides more flexibility, transparency, and clarity in decision support for public health issues compared with cost-effectiveness analysis. (2) More sophisticated systems-level analyses will become increasingly important to public health as disease burdens increase and the resources to deal with them become scarcer. Conclusions: The teaching of strategic planning in public health must be expanded in order to fill a void in the profession's planning capabilities. Public health training should actively incorporate model building, promote the interactive use of software tools, and explore planning approaches that transcend restrictive assumptions of cost-effectiveness analysis. The Strategic Multi-Attribute Ranking Tool for Vaccines (SMART Vaccines), which was recently developed by the Institute of Medicine and the National Academy of Engineering to help prioritize new vaccine development, is a working example of systems analysis as a basis for decision support. PMID:26994711
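
    The arithmetic at the heart of a multiattribute utility ranking is simple even though the elicitation is not: score each option on every criterion, weight the criteria, and rank by weighted utility rather than by a single cost-effectiveness metric. A sketch with hypothetical weights and scores (not SMART Vaccines' data):

      import numpy as np

      criteria = ["health benefit", "cost", "equity", "feasibility"]
      weights = np.array([0.40, 0.25, 0.20, 0.15])      # stakeholder-elicited

      candidates = {
          "Vaccine A": np.array([0.9, 0.3, 0.7, 0.8]),  # 0-1 utility scores
          "Vaccine B": np.array([0.6, 0.8, 0.5, 0.9]),
          "Vaccine C": np.array([0.7, 0.6, 0.9, 0.4]),
      }

      # Additive multiattribute utility: U = sum_i w_i * u_i(option).
      for name, scores in sorted(candidates.items(),
                                 key=lambda kv: weights @ kv[1], reverse=True):
          print(f"{name}: utility = {weights @ scores:.3f}")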

  5. Strategic Planning in Population Health and Public Health Practice: A Call to Action for Higher Education.

    PubMed

    Phelps, Charles; Madhavan, Guruprasad; Rappuoli, Rino; Levin, Scott; Shortliffe, Edward; Colwell, Rita

    2016-03-01

    Scarce resources, especially in population health and public health practice, underlie the importance of strategic planning. Public health agencies' current planning and priority setting efforts are often narrow, at times opaque, and focused on single metrics such as cost-effectiveness. As demonstrated by SMART Vaccines, a decision support software system developed by the Institute of Medicine and the National Academy of Engineering, new approaches to strategic planning allow the formal incorporation of multiple stakeholder views and multicriteria decision making that surpass even those sophisticated cost-effectiveness analyses widely recommended and used for public health planning. Institutions of higher education can and should respond by building on modern strategic planning tools as they teach their students how to improve population health and public health practice. Strategic planning in population health and public health practice often uses single indicators of success or, when using multiple indicators, provides no mechanism for coherently combining the assessments. Cost-effectiveness analysis, the most complex strategic planning tool commonly applied in public health, uses only a single metric to evaluate programmatic choices, even though other factors often influence actual decisions. Our work employed a multicriteria systems analysis approach--specifically, multiattribute utility theory--to assist in strategic planning and priority setting in a particular area of health care (vaccines), thereby moving beyond the traditional cost-effectiveness analysis approach. (1) Multicriteria systems analysis provides more flexibility, transparency, and clarity in decision support for public health issues compared with cost-effectiveness analysis. (2) More sophisticated systems-level analyses will become increasingly important to public health as disease burdens increase and the resources to deal with them become scarcer. The teaching of strategic planning in public health must be expanded in order to fill a void in the profession's planning capabilities. Public health training should actively incorporate model building, promote the interactive use of software tools, and explore planning approaches that transcend restrictive assumptions of cost-effectiveness analysis. The Strategic Multi-Attribute Ranking Tool for Vaccines (SMART Vaccines), which was recently developed by the Institute of Medicine and the National Academy of Engineering to help prioritize new vaccine development, is a working example of systems analysis as a basis for decision support. © 2016 Milbank Memorial Fund.

  6. How do physicians learn to provide palliative care?

    PubMed

    Schulman-Green, Dena

    2003-01-01

    Medical interns, residents, and fellows are heavily involved in caring for dying patients and interacting with their families. Due to a lack of formal medical education in the area, these house staff often have a limited knowledge of palliative care. The purpose of this study was to determine how, given inadequate formal education, house staff learn to provide palliative care. Specifically, this study sought to explore the extent to which physicians learn to provide palliative care through formal medical education, from physicians and other hospital staff, and by on-the-job learning. Twenty physicians were interviewed about their medical education and other learning experiences in palliative care. ATLAS/ti software was used for data coding and analysis. Analysis of transcripts indicated that house staff learn little to nothing through formal education, to varying degrees from attending physicians and hospital staff, and mostly on the job and by making mistakes.

  7. Dynamical analysis of uterine cell electrical activity model.

    PubMed

    Rihana, S; Santos, J; Mondie, S; Marque, C

    2006-01-01

    The uterus is a physiological system consisting of a large number of interacting smooth muscle cells. Uterine excitability changes remarkably with time: generally quiescent during pregnancy, the uterus exhibits forceful synchronized contractions at term, leading to expulsion of the fetus. These changes thus characterize a dynamical system that can be studied with formal mathematical tools. Multiple physiological factors are involved in the regulation of this complex system. Our aim is to relate these physiological factors to the dynamic behaviors of the uterine cell. Building on previous work, in which the electrical activity of a uterine cell is described by a set of ordinary differential equations, we analyze the impact of physiological parameters on the response of the model and identify the main subsystems generating the complex uterine electrical activity, with respect to physiological data.
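
    To make the idea of analyzing parameter impact on an excitable-cell ODE model concrete, here is a minimal sketch using the generic FitzHugh-Nagumo equations rather than the paper's uterine model; sweeping the stimulus current I moves the cell between quiescence and repetitive firing:

      import numpy as np
      from scipy.integrate import solve_ivp

      def fhn(t, y, I=0.5, eps=0.08, a=0.7, b=0.8):
          # v: fast membrane potential, w: slow recovery variable.
          v, w = y
          return [v - v**3 / 3 - w + I, eps * (v + a - b * w)]

      sol = solve_ivp(fhn, (0.0, 200.0), [-1.0, 1.0], max_step=0.1)
      spikes = np.sum((sol.y[0][1:] > 1.0) & (sol.y[0][:-1] <= 1.0))
      print(f"action potentials in 200 time units: {spikes}")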

  8. Design and implementation of a control structure for quality products in a crude oil atmospheric distillation column.

    PubMed

    Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis

    2017-11-01

    In recent years, interest in petrochemical processes has been increasing, especially in the refinement area. However, the high variability of the dynamic characteristics present in the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new design of a control strategy for a conventional crude oil distillation plant, defined using formal interaction analysis tools. The process dynamics and its control are simulated in the Aspen HYSYS® dynamic environment under real operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
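
    A representative formal interaction-analysis tool for such a design is the relative gain array, computed elementwise as RGA = G * (G^-1)^T from the steady-state gain matrix and used to pick controlled/manipulated variable pairings. A sketch with a hypothetical 2x2 gain matrix (not gains from the paper's column):

      import numpy as np

      def rga(G):
          # Elementwise product of G with the transposed inverse of G.
          return G * np.linalg.inv(G).T

      G = np.array([[2.0, -0.5],
                    [0.8,  1.5]])   # rows: outputs, columns: inputs
      print(rga(G))                 # rows sum to 1; pair on entries near 1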

  9. A Collaborative Reasoning Maintenance System for a Reliable Application of Legislations

    NASA Astrophysics Data System (ADS)

    Tamisier, Thomas; Didry, Yoann; Parisot, Olivier; Feltz, Fernand

    Decision support systems are nowadays used to disentangle all kinds of intricate situations and to perform sophisticated analysis. Moreover, they are applied in areas where the knowledge can be heterogeneous, partially un-formalized, implicit, or diffuse. The representation and management of this knowledge become the key point for ensuring the proper functioning of the system and keeping an intuitive view of its expected behavior. This paper presents a generic architecture for implementing knowledge-based systems used in collaborative business, where the knowledge is organized into different databases according to the usage, persistence and quality of the information. This approach is illustrated with Cadral, a customizable automated tool built on this architecture and used for processing family benefits applications at the National Family Benefits Fund of the Grand-Duchy of Luxembourg.

  10. Financial Data Analysis by means of Coupled Continuous-Time Random Walk in Rachev-Rűschendorf Model

    NASA Astrophysics Data System (ADS)

    Jurlewicz, A.; Wyłomańska, A.; Żebrowski, P.

    2008-09-01

    We adapt the continuous-time random walk formalism to describe asset price evolution. We expand the idea proposed by Rachev and Rűschendorf, who analyzed the binomial pricing model in discrete time with randomization of the number of price changes. As a result, in the framework of the proposed model we obtain a mixture of the Gaussian and a generalized arcsine law as the limiting distribution of log-returns. Moreover, we derive a European call option price that is an extension of the Black-Scholes formula. We apply the obtained theoretical results to model actual financial data and try to show that the continuous-time random walk offers alternative tools to deal with several complex issues of financial markets.
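
    For reference, the classical Black-Scholes call price that the derived formula extends (illustrative parameter values):

      from math import exp, log, sqrt
      from scipy.stats import norm

      def call_price(S, K, r, sigma, T):
          # Standard Black-Scholes formula for a European call.
          d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
          d2 = d1 - sigma * sqrt(T)
          return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

      print(call_price(S=100, K=105, r=0.05, sigma=0.2, T=1.0))  # about 8.02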

  11. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  12. The inner formal structure of the H-T-P drawings: an exploratory study.

    PubMed

    Vass, Z

    1998-08-01

    The study describes some interrelated patterns of traits in House-Tree-Person (H-T-P) drawings using hierarchical cluster analysis. First, 17 formal or structural aspects of the projective drawings were collected according to the literature, after which a detailed manual for coding was compiled. Second, the interrater reliability and consistency of this manual were tested. Third, the hierarchical cluster structure of the reliable and consistent formal aspects was analysed. The results are: (a) a psychometrically tested coding manual for the investigated formal-structural aspects, each of them illustrated with drawings that showed the highest interrater agreement; and (b) the hierarchical cluster structure of the formal aspects of the H-T-P drawings of "normal" adults.
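
    The method itself is straightforward to reproduce: code each drawing on the 17 formal aspects and cluster the resulting vectors. A sketch on synthetic binary codings (not the study's data):

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      rng = np.random.default_rng(0)
      codings = rng.integers(0, 2, size=(40, 17)).astype(float)  # drawings x aspects

      Z = linkage(codings.T, method="ward")           # cluster the 17 aspects
      clusters = fcluster(Z, t=4, criterion="maxclust")
      print(clusters)                                 # cluster label per aspect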

  13. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    …to an XML file, code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a… References cited in this excerpt: C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13-C10P07R05-075, 2013; [21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS.

  14. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, quality improvement through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, a standard Computer-Aided Software Environment (CASE) tool notation, and a standard CASE tool repository has limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as assemble new tools on demand, from existing tools and architecture design repositories.

  15. Mathematical formalisms based on approximated kinetic representations for modeling genetic and metabolic pathways.

    PubMed

    Alves, Rui; Vilaprinyo, Ester; Hernádez-Bermejo, Benito; Sorribas, Albert

    2008-01-01

    There is a renewed interest in obtaining a systemic understanding of metabolism, gene expression and signal transduction processes, driven by the recent research focus on Systems Biology. From a biotechnological point of view, such a systemic understanding of how a biological system is designed to work can facilitate the rational manipulation of specific pathways in different cell types to achieve specific goals. Due to the intrinsic complexity of biological systems, mathematical models are a central tool for understanding and predicting the integrative behavior of those systems. Particularly, models are essential for a rational development of biotechnological applications and in understanding system's design from an evolutionary point of view. Mathematical models can be obtained using many different strategies. In each case, their utility will depend upon the properties of the mathematical representation and on the possibility of obtaining meaningful parameters from available data. In practice, there are several issues at stake when one has to decide which mathematical model is more appropriate for the study of a given problem. First, one needs a model that can represent the aspects of the system one wishes to study. Second, one must choose a mathematical representation that allows an accurate analysis of the system with respect to different aspects of interest (for example, robustness of the system, dynamical behavior, optimization of the system with respect to some production goal, parameter value determination, etc). Third, before choosing between alternative and equally appropriate mathematical representations for the system, one should compare representations with respect to easiness of automation for model set-up, simulation, and analysis of results. Fourth, one should also consider how to facilitate model transference and re-usability by other researchers and for distinct purposes. Finally, one factor that is important for all four aspects is the regularity in the mathematical structure of the equations because it facilitates computational manipulation. This regularity is a mark of kinetic representations based on approximation theory. The use of approximation theory to derive mathematical representations with regular structure for modeling purposes has a long tradition in science. In most applied fields, such as engineering and physics, those approximations are often required to obtain practical solutions to complex problems. In this paper we review some of the more popular mathematical representations that have been derived using approximation theory and are used for modeling in molecular systems biology. We will focus on formalisms that are theoretically supported by the Taylor Theorem. These include the Power-law formalism, the recently proposed (log)linear and Lin-log formalisms as well as some closely related alternatives. We will analyze the similarities and differences between these formalisms, discuss the advantages and limitations of each representation, and provide a tentative "road map" for their potential utilization for different problems.
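
    As a concrete instance of a Taylor-supported representation, the power-law (S-system) formalism mentioned above writes each pool's balance as a difference of two products of power laws, obtained from a first-order Taylor expansion in logarithmic space:

      \frac{dX_i}{dt} = \alpha_i \prod_{j=1}^{n} X_j^{g_{ij}} - \beta_i \prod_{j=1}^{n} X_j^{h_{ij}}, \qquad i = 1, \dots, n

    where the rate constants \alpha_i, \beta_i are non-negative and the kinetic orders g_{ij}, h_{ij} are real-valued; the regular structure of these equations is precisely what eases their computational manipulation.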

  16. Seeking high reliability in primary care: Leadership, tools, and organization.

    PubMed

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose and which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliable-seeking culture across an organization. Progress toward a reliability-seeking, system-oriented approach to care remains ongoing, and movement in that direction requires deliberate and sustained effort by committed leaders in health care.

  17. NoteCards: A Multimedia Idea Processing Environment.

    ERIC Educational Resources Information Center

    Halasz, Frank G.

    1986-01-01

    Notecards is a computer environment designed to help people work with ideas by providing a set of tools for a variety of specific activities, which can range from sketching on the back of an envelope to formally representing knowledge. The basic framework of this hypermedia system is a semantic network of electronic notecards connected by…

  18. Coordinating Formal and Informal Aspects of Mathematics in a Computer Based Learning Environment

    ERIC Educational Resources Information Center

    Skouras, A. S.

    2006-01-01

    The introduction of educational technology to school classes promises--through the students' active engagement with mathematical concepts--the creation of teaching and learning opportunities in mathematics. However, the way technological tools are used in the teaching practice as a means of human thought and action remains an unsettled matter as…

  19. Evaluating the Effectiveness of a Large-Scale Professional Development Programme

    ERIC Educational Resources Information Center

    Main, Katherine; Pendergast, Donna

    2017-01-01

    An evaluation of the effectiveness of a large-scale professional development (PD) programme delivered to 258 schools in Queensland, Australia is presented. Formal evaluations were conducted at two stages during the programme using a tool developed from Desimone's five core features of effective PD. Descriptive statistics of 38 questions and…

  20. Self-Assessment of Employability Skill Outcomes among Undergraduates and Alignment with Academic Ratings

    ERIC Educational Resources Information Center

    Jackson, Denise

    2014-01-01

    Despite acknowledgement of the benefits of self-assessment in higher education, disparity between student and academic assessments, with associated trends in overrating and underrating, plagues its meaningful use, particularly as a tool for formal assessment. This study examines self-assessment of capabilities in certain employability skills in…
